gyigit committed on
Commit
06eb413
·
1 Parent(s): 2a7ca7c
.gitattributes DELETED
@@ -1,36 +0,0 @@
- *.7z filter=lfs diff=lfs merge=lfs -text
- *.arrow filter=lfs diff=lfs merge=lfs -text
- *.bin filter=lfs diff=lfs merge=lfs -text
- *.bz2 filter=lfs diff=lfs merge=lfs -text
- *.ckpt filter=lfs diff=lfs merge=lfs -text
- *.ftz filter=lfs diff=lfs merge=lfs -text
- *.gz filter=lfs diff=lfs merge=lfs -text
- *.h5 filter=lfs diff=lfs merge=lfs -text
- *.joblib filter=lfs diff=lfs merge=lfs -text
- *.lfs.* filter=lfs diff=lfs merge=lfs -text
- *.mlmodel filter=lfs diff=lfs merge=lfs -text
- *.model filter=lfs diff=lfs merge=lfs -text
- *.msgpack filter=lfs diff=lfs merge=lfs -text
- *.npy filter=lfs diff=lfs merge=lfs -text
- *.npz filter=lfs diff=lfs merge=lfs -text
- *.onnx filter=lfs diff=lfs merge=lfs -text
- *.ot filter=lfs diff=lfs merge=lfs -text
- *.parquet filter=lfs diff=lfs merge=lfs -text
- *.pb filter=lfs diff=lfs merge=lfs -text
- *.pickle filter=lfs diff=lfs merge=lfs -text
- *.pkl filter=lfs diff=lfs merge=lfs -text
- *.pt filter=lfs diff=lfs merge=lfs -text
- *.pth filter=lfs diff=lfs merge=lfs -text
- *.rar filter=lfs diff=lfs merge=lfs -text
- *.safetensors filter=lfs diff=lfs merge=lfs -text
- saved_model/**/* filter=lfs diff=lfs merge=lfs -text
- *.tar.* filter=lfs diff=lfs merge=lfs -text
- *.tflite filter=lfs diff=lfs merge=lfs -text
- *.tgz filter=lfs diff=lfs merge=lfs -text
- *.wasm filter=lfs diff=lfs merge=lfs -text
- *.xz filter=lfs diff=lfs merge=lfs -text
- *.zip filter=lfs diff=lfs merge=lfs -text
- *.zst filter=lfs diff=lfs merge=lfs -text
- *tfevents* filter=lfs diff=lfs merge=lfs -text
- assets/ filter=lfs diff=lfs merge=lfs -text
- data/chembl_train.smi filter=lfs diff=lfs merge=lfs -text
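Each deleted line pairs a path pattern with Git attributes that route matching files through Git LFS (`-text` unsets the `text` attribute). A minimal parser sketch, purely illustrative and not part of the repository:

```python
def parse_gitattributes(text: str) -> list[tuple[str, dict[str, str]]]:
    """Parse .gitattributes lines into (pattern, {attribute: value}) pairs."""
    entries = []
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        pattern, *attrs = line.split()
        parsed = {}
        for attr in attrs:
            if attr.startswith("-"):        # e.g. "-text" unsets the attribute
                parsed[attr[1:]] = "unset"
            elif "=" in attr:               # e.g. "filter=lfs"
                key, value = attr.split("=", 1)
                parsed[key] = value
            else:                           # bare attribute, e.g. "text"
                parsed[attr] = "set"
        entries.append((pattern, parsed))
    return entries

entries = parse_gitattributes("*.pt filter=lfs diff=lfs merge=lfs -text")
```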
 
.gitignore DELETED
@@ -1,2 +0,0 @@
- .venv/
- *.pyc
 
 
 
README.md DELETED
@@ -1,315 +0,0 @@
- ---
- title: Druggen
- sdk: gradio
- app_file: gradio_app.py
- emoji: 💊
- colorFrom: red
- colorTo: green
- ---
- # DrugGEN: Target Centric De Novo Design of Drug Candidate Molecules with Graph Generative Deep Adversarial Networks
-
- <p align="center">
- <a href="https://github.com/HUBioDataLab/DrugGEN/files/10828402/2302.07868.pdf"><img src="https://img.shields.io/badge/paper-report-red"/></a>
- <a href="http://www.gnu.org/licenses/"><img src="https://img.shields.io/badge/License-GPLv3-blue.svg"/></a>
- </p>
-
- <!--PUT HERE SOME QUALITATIVE RESULTS IN THE ASSETS FOLDER-->
- <!--YOU CAN ALSO PUT THEM IN GIF OR PNG FORMAT -->
- <!--<p float="center">
- <img src="assets/sample1.png" width="49%" />
- <img src="assets/sample2.png" width="49%" />
- </p>-->
-
- ## Updated Pre-print!
-
- **Please see our most up-to-date document (pre-print) from 15.02.2023 here:** [2302.07868.pdf](https://github.com/HUBioDataLab/DrugGEN/files/10828402/2302.07868.pdf), [arXiv link](https://arxiv.org/abs/2302.07868)
-
- &nbsp;
- &nbsp;
-
- ## Abstract
-
- Discovering novel drug candidate molecules is one of the most fundamental and critical steps in drug development. Generative deep learning models, which create synthetic data given a probability distribution, have been developed with the purpose of picking completely new samples from a partially known space. Generative models offer high potential for designing de novo molecules; however, in order for them to be useful in real-life drug development pipelines, these models should be able to design target-specific molecules, which is the next step in this field. In this study, we propose DrugGEN for the de novo design of drug candidate molecules that interact with selected target proteins. The proposed system represents compounds and protein structures as graphs and processes them via two serially connected generative adversarial networks comprising graph transformers. DrugGEN is trained using a large dataset of compounds from ChEMBL and target-specific bioactive molecules to design effective and specific inhibitory molecules against the AKT1 protein, which has critical importance for developing treatments against various types of cancer. On fundamental benchmarks, DrugGEN models perform competitively with or better than other methods. To assess the target-specific generation performance, we conducted further in silico analysis with molecular docking and deep learning-based bioactivity prediction. Results indicate that the de novo molecules have high potential for interacting with the AKT1 protein structure at the level of its native ligand. DrugGEN can be used to design completely novel and effective target-specific drug candidate molecules for any druggable protein, given target features and a dataset of experimental bioactivities. The code base, datasets, results and trained models of DrugGEN are available in this repository.
-
- Our up-to-date pre-print is shared [here](https://github.com/HUBioDataLab/DrugGEN/files/10828402/2302.07868.pdf).
-
- <!--Check out our paper below for more details
-
- > [**DrugGEN: Target Centric De Novo Design of Drug Candidate Molecules with Graph Generative Deep Adversarial Networks**](link here),
- > [Atabey Ünlü](https://tr.linkedin.com/in/atabeyunlu), [Elif Çevrim](https://www.linkedin.com/in/elifcevrim/?locale=en_US), [Ahmet Sarıgün](https://asarigun.github.io/), [Heval Ataş](https://www.linkedin.com/in/heval-atas/), [Altay Koyaş](https://www.linkedin.com/in/altay-koya%C5%9F-8a6118a1/?originalSubdomain=tr), [Hayriye Çelikbilek](https://www.linkedin.com/in/hayriye-celikbilek/?originalSubdomain=tr), [Deniz Cansen Kahraman](https://www.linkedin.com/in/deniz-cansen-kahraman-6153894b/?originalSubdomain=tr), [Abdurrahman Olğaç](https://www.linkedin.com/in/aolgac/?originalSubdomain=tr), [Ahmet S. Rifaioğlu](https://saezlab.org/person/ahmet-sureyya-rifaioglu/), [Tunca Doğan](https://yunus.hacettepe.edu.tr/~tuncadogan/)
- > *Arxiv, 2020* -->
-
- &nbsp;
- &nbsp;
-
- <!--PUT THE ANIMATED GIF VERSION OF THE DRUGGEN MODEL (Figure 1)-->
- <p float="center">
- <img src="assets/DrugGEN_Figure1_final_v1.gif" width="100%" />
- </p>
-
- **Fig. 1.** **(A)** The generator (*G1*) of GAN1 consists of an MLP and a graph transformer encoder module. The generator encodes the given input into a new representation; **(B)** the MLP-based discriminator (*D1*) of GAN1 compares the generated de novo molecules to the real ones in the training dataset, scoring them for their assignment to the classes of “real” and “fake” molecules; **(C)** the generator (*G2*) of GAN2 makes use of the transformer decoder architecture to process target protein features and GAN1-generated de novo molecules together. The output of the second generator (*G2*) is the modified molecules, based on the given protein features; **(D)** the second discriminator (*D2*) takes the modified de novo molecules and known inhibitors of the given target protein and scores them for their assignment to the classes of “real” and “fake” inhibitors.
-
- &nbsp;
- &nbsp;
-
- ## Transformer Modules
-
- Given a random noise *z*, **the first generator** *G1* (below, on the left side) creates the annotation and adjacency matrices of a supposed molecule. *G1* processes the input by passing it through a multi-layer perceptron (MLP). The input is then fed to the transformer encoder module [Vaswani et al. (2017)](https://arxiv.org/abs/1706.03762), which has a depth of 8 encoder layers with 8 multi-head attention heads each. In the graph transformer setting, *Q*, *K* and *V* are the variables representing the annotation matrix of the molecule. After the final products are created in the attention mechanism, both the annotation and adjacency matrices are forwarded to layer normalization and then summed with the initial matrices to create a residual connection. These matrices are fed to separate feedforward layers, and finally, given to the discriminator network *D1* together with real molecules.
-
- **The second generator** *G2* (below, on the right side) modifies molecules that were previously generated by *G1*, with the aim of generating binders for the given target protein. The *G2* module utilizes the transformer decoder architecture, with a depth of 8 decoder layers and 8 multi-head attention heads each. *G2* takes both *G1(z)*, the data generated by *G1*, and the protein features as input. Interactions between molecules and proteins are processed inside the multi-head attention module by taking their scaled dot product, and thus new molecular graphs are created. Apart from the attention mechanism, further processing of the molecular matrices follows the same workflow as in the transformer encoder. The output of this module is the final product of the DrugGEN model and is forwarded to *D2*.
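The attention step described above is standard scaled dot-product attention, here applied as self-attention over the annotation matrix (*Q* = *K* = *V*). A minimal NumPy sketch, with illustrative shapes rather than the model's actual configuration:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Scaled dot-product attention (Vaswani et al., 2017)."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                      # pairwise similarity scores
    scores = scores - scores.max(axis=-1, keepdims=True) # numerically stable softmax
    weights = np.exp(scores)
    weights = weights / weights.sum(axis=-1, keepdims=True)
    return weights @ V                                   # attention-weighted values

# Self-attention over a toy "annotation matrix": Q = K = V
rng = np.random.default_rng(0)
X = rng.normal(size=(45, 128))  # 45 atoms, 128-dim features (illustrative)
out = scaled_dot_product_attention(X, X, X)
```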
-
- <!--PUT HERE 1-2 SENTENCES FOR THE METHOD, WHICH SHOULD BE SHORT. Please refer to our [arXiv report](link here) for further details.-->
-
- <!-- - supports both CPU and GPU inference (though GPU is way faster), -->
- <!-- ADD HERE SOME FEATURES FOR DRUGGEN & SUMMARIES & BULLET POINTS -->
-
- <!-- ADD THE ANIMATED GIF VERSION OF THE GAN1 AND GAN2 -->
- | First Generator | Second Generator |
- |-------------------------------------------|--------------------------------------------|
- | ![FirstGAN](assets/DrugGEN_G1_final2.gif) | ![SecondGAN](assets/DrugGEN_G2_final2.gif) |
-
- &nbsp;
- &nbsp;
-
- ## Model Variations
-
- - **DrugGEN-Prot** (the default model) is composed of two GANs. It incorporates protein features into the transformer decoder module of GAN2 (together with the de novo molecules generated by GAN1) to direct the target-centric molecule design. The information provided above belongs to this model.
- - **DrugGEN-CrossLoss** is composed of only one GAN. The input of the GAN1 generator is the real molecules (ChEMBL) dataset (to ease the learning process), and the GAN1 discriminator compares the generated molecules with the real inhibitors of the given target protein.
- - **DrugGEN-Ligand** is composed of two GANs. It incorporates AKT1 inhibitor molecule features as the input of the GAN2 generator’s transformer decoder, instead of the protein features used in the default model.
- - **DrugGEN-RL** utilizes the same architecture as the DrugGEN-Ligand model. It uses reinforcement learning (RL) to avoid using molecular scaffolds that are already present in the training set.
- - **DrugGEN-NoTarget** is composed of only one GAN. This model only focuses on learning the chemical properties from the ChEMBL training dataset; as a result, there is no target-specific generation.
-
- &nbsp;
- &nbsp;
-
- ## Files & Folders
-
- We provide the implementation of DrugGEN, along with scripts based on the PyTorch Geometric framework, to generate and run the model. The repository is organised as follows:
-
- ```data``` contains:
- - **Raw dataset files**, which should be text files containing SMILES strings only. Raw datasets preferably should not contain stereoisomeric SMILES, to prevent hydrogen atoms from being included in the final graph data.
- - Constructed **graph datasets** (.pt) will be saved in this folder along with atom and bond encoder/decoder files (.pk).
-
- ```experiments``` contains:
- - ```logs``` folder. Model loss and performance metrics will be saved in this directory in separate files for each model.
- - ```tboard_output``` folder. TensorBoard files will be saved here if TensorBoard is used.
- - ```models``` folder. Models will be saved in this directory at the final or user-specified training steps.
- - ```samples``` folder. Molecule samples will be saved in this folder.
- - ```inference``` folder. Molecules generated in inference mode will be saved in this folder.
-
- **Python scripts:**
-
- - ```layers.py``` contains the **transformer encoder** and **transformer decoder** implementations.
- - ```main.py``` contains the arguments and is the file used to run the model.
- - ```models.py``` has the implementation of the **Generators** and **Discriminators** used in GAN1 and GAN2.
- - ```new_dataloader.py``` constructs the graph dataset from the given raw data, using PyG-based data classes.
- - ```trainer.py``` is the training and testing file for the model; the workflow is constructed in this file.
- - ```utils.py``` contains performance metrics from several other papers and some unique implementations (De Cao et al., 2018; Polykovskiy et al., 2020).
-
- &nbsp;
- &nbsp;
-
- ## Datasets
-
- Three different data types (i.e., compound, protein, and bioactivity) were retrieved from various data sources to train our deep generative models. The GAN1 module requires only compound data, while GAN2 requires all three data types: compound, protein, and bioactivity.
- - **Compound data** includes atomic, physicochemical, and structural properties of real drug and drug candidate molecules. The [ChEMBL v29 compound dataset](data/dataset_download.sh) was used for the GAN1 module. It consists of 1,588,865 stable organic molecules with a maximum of 45 atoms, containing C, O, N, F, Ca, K, Br, B, S, P, Cl, and As heavy atoms.
- - **Protein data** was retrieved from the Protein Data Bank (PDB) in biological assembly format, and the coordinates of protein-ligand complexes were used to construct the binding sites of proteins from the bioassembly data. The atoms of protein residues within a maximum distance of 9 Å from all ligand atoms were recorded as binding sites. GAN2 was trained to generate compounds specific to the target protein AKT1, which is a member of the serine/threonine-protein kinases and is involved in many cancer-associated cellular processes, including metabolism, proliferation, cell survival, growth and angiogenesis. The binding site of the human AKT1 protein was generated from the kinase domain (PDB: 4GV1).
- - **Bioactivity data** of the AKT target protein was retrieved from the large-scale ChEMBL bioactivity database. It contains ligand interactions of the human AKT1 (CHEMBL4282) protein with a pChEMBL value equal to or greater than 6 (IC50 <= 1 µM), as well as the SMILES information of these ligands. The dataset was extended by including drug molecules from the DrugBank database known to interact with human AKT proteins. Thus, a total of [1,600 bioactivity data](data/filtered_akt_inhibitors.smi) points were obtained for training the AKT-specific generative model.
- <!-- To enhance the size of the bioactivity dataset, we also obtained two alternative versions by incorporating ligand interactions of protein members in non-specific serine/threonine kinase (STK) and kinase families. -->
-
- More details on the construction of datasets can be found in our paper referenced above.
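The pChEMBL cutoff above maps directly onto the stated IC50 threshold, since pChEMBL = -log10(activity in molar units). A quick check (helper names are ours, for illustration):

```python
import math

def pchembl_to_ic50_uM(pchembl: float) -> float:
    """Convert a pChEMBL value to an IC50 in micromolar (pChEMBL = -log10(M))."""
    return 10 ** (-pchembl) * 1e6  # molar -> micromolar

def ic50_uM_to_pchembl(ic50_uM: float) -> float:
    """Inverse conversion: IC50 in micromolar to pChEMBL."""
    return -math.log10(ic50_uM * 1e-6)

# pChEMBL >= 6 is the same cutoff as IC50 <= 1 uM
cutoff_uM = pchembl_to_ic50_uM(6)
```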
-
- <!-- ADD SOME INFO HERE -->
-
- &nbsp;
- &nbsp;
-
- ## Getting Started
- DrugGEN has been implemented and tested on Ubuntu 18.04 with Python >= 3.9. It supports both GPU and CPU inference.
-
- Clone the repo:
- ```bash
- git clone https://github.com/HUBioDataLab/DrugGEN.git
- ```
-
- <!--## Running the Demo
- You could try Google Colab if you don't already have a suitable environment for running this project.
- It enables cost-free project execution in the cloud. You can use the provided notebook to try out our Colab demo:
- [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](Give a link here)-->
-
- &nbsp;
- &nbsp;
-
- ## Training
-
- ### Setting up environment
-
- You can set up the environment using either conda or pip.
-
- Here is how with conda:
-
- ```bash
- # set up the environment (installs the requirements):
- conda env create -f DrugGEN/dependencies.yml
-
- # activate the environment:
- conda activate druggen
- ```
-
- Here is how with pip, using a virtual environment:
-
- ```bash
- python -m venv DrugGEN/.venv
- source DrugGEN/.venv/bin/activate
- pip install -r DrugGEN/requirements.txt
- ```
-
- ### Starting the training
-
- ```bash
- # Download input files:
- cd DrugGEN/data
- bash dataset_download.sh
- cd ../..
-
- # DrugGEN can be trained with the one-liner:
- python DrugGEN/main.py --submodel="CrossLoss" --mode="train" --raw_file="DrugGEN/data/chembl_train.smi" --dataset_file="chembl45_train.pt" --drug_raw_file="DrugGEN/data/akt_train.smi" --drug_dataset_file="drugs_train.pt" --max_atom=45
- ```
-
- Explanations of the arguments can be found below:
-
- ```bash
- Model arguments:
-   --submodel SUBMODEL       Choose the submodel for training
-   --act ACT                 Activation function for the model
-   --z_dim Z_DIM             Prior noise dimension for the first GAN
-   --max_atom MAX_ATOM       Maximum atom number for molecules (must be specified)
-   --lambda_gp LAMBDA_GP     Gradient penalty lambda multiplier for the first GAN
-   --dim DIM                 Dimension of the Transformer models for both GANs
-   --depth DEPTH             Depth of the Transformer model of the first GAN
-   --heads HEADS             Number of heads for the MultiHeadAttention module of the first GAN
-   --dec_depth DEC_DEPTH     Depth of the Transformer model of the second GAN
-   --dec_heads DEC_HEADS     Number of heads for the MultiHeadAttention module of the second GAN
-   --mlp_ratio MLP_RATIO     MLP ratio for the Transformers
-   --dis_select DIS_SELECT   Select the discriminator for the first and second GAN
-   --init_type INIT_TYPE     Initialization type for the model
-   --dropout DROPOUT         Dropout rate for the encoder
-   --dec_dropout DEC_DROPOUT Dropout rate for the decoder
- Training arguments:
-   --batch_size BATCH_SIZE   Batch size for training
-   --epoch EPOCH             Number of epochs for training
-   --warm_up_steps           Warm-up steps for the first GAN
-   --g_lr G_LR               Learning rate for G
-   --g2_lr G2_LR             Learning rate for G2
-   --d_lr D_LR               Learning rate for D
-   --d2_lr D2_LR             Learning rate for D2
-   --n_critic N_CRITIC       Number of D updates per each G update
-   --beta1 BETA1             Beta1 for the Adam optimizer
-   --beta2 BETA2             Beta2 for the Adam optimizer
-   --clipping_value          Clipping value for the gradient clipping process
-   --resume_iters            Resume training from this step for fine-tuning, if desired
- Dataset arguments:
-   --features FEATURES       Additional node features (Boolean) (please check new_dataloader.py Line 102)
- ```
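The flag list above suggests a standard `argparse` setup in `main.py`; a hedged sketch covering a few of the flags (defaults and help strings are illustrative, not taken from the repository):

```python
import argparse

def build_parser() -> argparse.ArgumentParser:
    """Sketch of a DrugGEN-style argument parser (subset of flags, illustrative defaults)."""
    parser = argparse.ArgumentParser(description="DrugGEN argument sketch")
    # Model arguments (subset)
    parser.add_argument("--submodel", default="CrossLoss", help="Choose the submodel for training")
    parser.add_argument("--max_atom", type=int, default=45, help="Maximum atom number for molecules")
    parser.add_argument("--mode", choices=["train", "inference"], default="train")
    # Training arguments (subset)
    parser.add_argument("--batch_size", type=int, default=128, help="Batch size for training")
    parser.add_argument("--g_lr", type=float, default=1e-5, help="Learning rate for G")
    return parser

# Parse an explicit argument list instead of sys.argv, for demonstration
args = build_parser().parse_args(["--submodel", "Prot", "--max_atom", "45"])
```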
-
- <!--ADD HERE TRAINING COMMANDS WITH EXPLANATIONS-->
-
- &nbsp;
- &nbsp;
-
- ## Molecule Generation Using Trained DrugGEN Models in the Inference Mode
-
- - First, please download the model weights of a trained model, e.g., [DrugGEN-Prot](https://drive.google.com/drive/folders/19knQAtpieSamaxB4L5ft8bFiCVikBFDS?usp=share_link), and place them in the folder: "DrugGEN/experiments/models/".
- - After that, please run the code below:
-
- ```bash
- python DrugGEN/main.py --submodel="Prot" --mode="inference" --inference_model="DrugGEN/experiments/models/{Chosen model name}"
- ```
-
- - The SMILES representations of the generated molecules will be saved to the file: "DrugGEN/experiments/inference/{Chosen submodel name}/denovo_molecules.txt".
-
- &nbsp;
- &nbsp;
-
- ## Results (De Novo Generated Molecules of DrugGEN Models)
-
- - The SMILES notations of 50,000 de novo generated molecules from the DrugGEN models (10,000 from each) can be downloaded from [here](results/generated_molecules).
- - We first filtered the 50,000 de novo generated molecules by applying the Lipinski, Veber and PAINS filters; 43,000 of them remained in our dataset after this operation ([SMILES notations of filtered de novo molecules](results/generated_molecules/filtered_all_generated_molecules.smi)).
- - We ran our deep learning-based drug/compound-target protein interaction prediction system [DEEPScreen](https://pubs.rsc.org/en/content/articlehtml/2020/sc/c9sc03414e) on the 43,000 filtered molecules. DEEPScreen predicted 18,000 of them as active against AKT1, 301 of which received high confidence scores (> 80%) ([SMILES notations of DEEPScreen-predicted actives](results/deepscreen)).
- - At the same time, we performed a molecular docking analysis on these 43,000 filtered de novo molecules against the crystal structure of [AKT1](https://www.rcsb.org/structure/4gv1), and found that 118 of them had sufficiently low binding free energies (< -9 kcal/mol) ([SMILES notations of de novo molecules with low binding free energies](results/docking/Molecules_th9_docking.smi)).
- - Finally, de novo molecules to effectively target the AKT1 protein were selected via expert curation from the dataset of molecules with binding free energies lower than -9 kcal/mol. The structural representations of the selected molecules are shown in the figure below ([SMILES notations of the expert-selected de novo AKT1 inhibitor molecules](results/docking/Selected_denovo_AKT1_inhibitors.smi)).
-
- ![structures](assets/Selected_denovo_AKT1_inhibitors.png)
- Fig. 2. Promising de novo molecules to effectively target the AKT1 protein (generated by DrugGEN models), selected via expert curation from the dataset of molecules with sufficiently low binding free energies (< -9 kcal/mol) in the molecular docking experiment.
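The docking-based filtering step described above boils down to a threshold comparison on binding free energies; a toy sketch with made-up scores (the SMILES strings and values are illustrative, not taken from the study):

```python
# Hypothetical (SMILES, docking score in kcal/mol) pairs -- illustrative only
docking_scores = [
    ("CCO", -4.2),
    ("c1ccccc1", -6.1),
    ("CC(=O)Nc1ccc(O)cc1", -9.4),
    ("CN1CCC[C@H]1c1cccnc1", -10.2),
]

THRESHOLD = -9.0  # kcal/mol; the cutoff used in the analysis above

# Lower (more negative) binding free energy means stronger predicted binding
hits = [smiles for smiles, dg in docking_scores if dg < THRESHOLD]
```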
-
- &nbsp;
- &nbsp;
-
- ## Updates
-
- - 15/02/2023: Our pre-print is shared [here](https://github.com/HUBioDataLab/DrugGEN/files/10828402/2302.07868.pdf).
- - 01/01/2023: Five different DrugGEN models are released.
-
- &nbsp;
- &nbsp;
-
- ## Citation
- ```bibtex
- @misc{nl2023target,
-   doi = {10.48550/ARXIV.2302.07868},
-   title={Target Specific De Novo Design of Drug Candidate Molecules with Graph Transformer-based Generative Adversarial Networks},
-   author={Atabey Ünlü and Elif Çevrim and Ahmet Sarıgün and Hayriye Çelikbilek and Heval Ataş Güvenilir and Altay Koyaş and Deniz Cansen Kahraman and Abdurrahman Olğaç and Ahmet Rifaioğlu and Tunca Doğan},
-   year={2023},
-   eprint={2302.07868},
-   archivePrefix={arXiv},
-   primaryClass={cs.LG}
- }
- ```
-
- Ünlü, A., Çevrim, E., Sarıgün, A., Çelikbilek, H., Güvenilir, H.A., Koyaş, A., Kahraman, D.C., Olğaç, A., Rifaioğlu, A., Doğan, T. (2023). Target Specific De Novo Design of Drug Candidate Molecules with Graph Transformer-based Generative Adversarial Networks. *arXiv preprint* arXiv:2302.07868.
-
- &nbsp;
- &nbsp;
-
- ## References/Resources
-
- In each file, we indicate whether a function or script is imported from another source. Here are some excellent sources from which we benefited:
- <!--ADD THE REFERENCES THAT WE USED DURING THE IMPLEMENTATION-->
- - The molecule generation GAN schematic was inspired by [MolGAN](https://github.com/yongqyu/MolGAN-pytorch).
- - [MOSES](https://github.com/molecularsets/moses) was used for performance calculation (the MOSES scripts are directly embedded in our code due to current installation issues with the MOSES repo).
- - [PyG](https://github.com/pyg-team/pytorch_geometric) was used to construct the custom dataset.
- - The Transformer architecture was taken from [Vaswani et al. (2017)](https://arxiv.org/abs/1706.03762).
- - The Graph Transformer Encoder architecture was taken from [Dwivedi & Bresson (2021)](https://arxiv.org/abs/2012.09699) and [Vignac et al. (2022)](https://github.com/cvignac/DiGress) and modified.
-
- Our initial project repository was [this one](https://github.com/asarigun/DrugGEN).
-
- &nbsp;
- &nbsp;
-
- ## License
- Copyright (C) 2023 HUBioDataLab
-
- This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
-
- This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details.
-
- You should have received a copy of the GNU General Public License along with this program. If not, see http://www.gnu.org/licenses/.
 
app.py DELETED
@@ -1,189 +0,0 @@
- import streamlit as st
- import streamlit_ext as ste
-
- from trainer import Trainer
- import random
- from rdkit.Chem import Draw
- from rdkit import Chem
- from rdkit.Chem.Draw import IPythonConsole
- import io
- from PIL import Image
-
- class DrugGENConfig:
-     submodel='CrossLoss'
-     act='relu'
-     z_dim=16
-     max_atom=45
-     lambda_gp=1
-     dim=128
-     depth=1
-     heads=8
-     dec_depth=1
-     dec_heads=8
-     dec_dim=128
-     mlp_ratio=3
-     warm_up_steps=0
-     dis_select='mlp'
-     init_type='normal'
-     batch_size=128
-     epoch=50
-     g_lr=0.00001
-     d_lr=0.00001
-     g2_lr=0.00001
-     d2_lr=0.00001
-     dropout=0.
-     dec_dropout=0.
-     n_critic=1
-     beta1=0.9
-     beta2=0.999
-     resume_iters=None
-     clipping_value=2
-     features=False
-     test_iters=10_000
-     num_test_epoch=30_000
-     inference_sample_num=1000
-     num_workers=1
-     mode="inference"
-     inference_iterations=100
-     inf_batch_size=1
-     protein_data_dir='data/akt'
-     drug_index='data/drug_smiles.index'
-     drug_data_dir='data/akt'
-     mol_data_dir='data'
-     log_dir='experiments/logs'
-     model_save_dir='experiments/models'
-     # inference_model=""
-     sample_dir='experiments/samples'
-     result_dir="experiments/tboard_output"
-     dataset_file="chembl45_train.pt"
-     drug_dataset_file="akt_train.pt"
-     raw_file='data/chembl_train.smi'
-     drug_raw_file="data/akt_train.smi"
-     inf_dataset_file="chembl45_test.pt"
-     inf_drug_dataset_file='akt_test.pt'
-     inf_raw_file='data/chembl_test.smi'
-     inf_drug_raw_file="data/akt_test.smi"
-     log_sample_step=1000
-     set_seed=True
-     seed=1
-     resume=False
-     resume_epoch=None
-     resume_iter=None
-     resume_directory=None
-
- class ProtConfig(DrugGENConfig):
-     submodel="Prot"
-     inference_model="experiments/models/Prot"
-
- class CrossLossConfig(DrugGENConfig):
-     submodel="CrossLoss"
-     inference_model="experiments/models/CrossLoss"
-
- class NoTargetConfig(DrugGENConfig):
-     submodel="NoTarget"
-     inference_model="experiments/models/NoTarget"
-
-
- model_configs = {
-     "Prot": ProtConfig(),
-     "CrossLoss": CrossLossConfig(),
-     "NoTarget": NoTargetConfig(),
- }
-
-
- with st.sidebar:
-     st.title("DrugGEN: Target Centric De Novo Design of Drug Candidate Molecules with Graph Generative Deep Adversarial Networks")
-     st.write("[![arXiv](https://img.shields.io/badge/arXiv-2302.07868-b31b1b.svg)](https://arxiv.org/abs/2302.07868) [![github-repository](https://img.shields.io/badge/GitHub-black?logo=github)](https://github.com/HUBioDataLab/DrugGEN)")
-
-     with st.expander("Expand to display information about models"):
-         st.write("""
-         ### Model Variations
-         - **DrugGEN-Prot**: composed of two GANs, incorporates protein features to the transformer decoder module of GAN2 (together with the de novo molecules generated by GAN1) to direct the target centric molecule design.
-         - **DrugGEN-CrossLoss**: composed of one GAN, the input of the GAN1 generator is the real molecules dataset and the GAN1 discriminator compares the generated molecules with the real inhibitors of the given target.
-         - **DrugGEN-NoTarget**: composed of one GAN, focuses on learning the chemical properties from the ChEMBL training dataset, no target-specific generation.
-         """)
-
-     with st.form("model_selection_from"):
-         model_name = st.radio(
-             'Select a model to make inference (DrugGEN-Prot and DrugGEN-CrossLoss models design molecules to target the AKT1 protein)',
-             ('DrugGEN-Prot', 'DrugGEN-CrossLoss', 'DrugGEN-NoTarget')
-         )
-
-         model_name = model_name.replace("DrugGEN-", "")
-
-         molecule_num_input = st.number_input('Number of molecules to generate', min_value=1, max_value=100_000, value=1000, step=1)
-
-         seed_input = st.number_input("RNG seed value (can be used for reproducibility):", min_value=0, value=42, step=1)
-
-         submitted = st.form_submit_button("Start Computing")
-
-
122
-
123
- if submitted:
124
- # if submitted or ("submitted" in st.session_state):
125
- # st.session_state["submitted"] = True
126
- config = model_configs[model_name]
127
-
128
- config.inference_sample_num = molecule_num_input
129
- config.seed = seed_input
130
-
131
- with st.spinner(f'Creating the trainer class instance for {model_name}...'):
132
- trainer = Trainer(config)
133
-     with st.spinner(f'Running inference function of {model_name} (this may take a while) ...'):
-         results = trainer.inference()
-     st.success(f"Inference of {model_name} took {results['runtime']:.2f} seconds.")
-
-     with st.expander("Expand to see the generation performance scores"):
-         st.write("### Generation performance scores (novelty is calculated in comparison to the training dataset)")
-         st.success(f"Validity: {results['fraction_valid']}")
-         st.success(f"Uniqueness: {results['uniqueness']}")
-         st.success(f"Novelty: {results['novelty']}")
-
-     with open(f'experiments/inference/{model_name}/inference_drugs.txt') as f:
-         inference_drugs = f.read()
-     ste.download_button(label="Click to download generated molecules", data=inference_drugs, file_name=f'DrugGEN-{model_name}_denovo_mols.smi', mime="text/plain")
-
-     st.write("Structures of 12 randomly selected de novo molecules from the inference set:")
-     # Drop empty lines, then sample without replacement so no molecule is shown twice
-     generated_molecule_list = [smi for smi in inference_drugs.split("\n") if smi]
-     selected_molecules = random.sample(generated_molecule_list, k=min(12, len(generated_molecule_list)))
-     selected_molecules = [Chem.MolFromSmiles(mol) for mol in selected_molecules]
-
-     drawOptions = Draw.rdMolDraw2D.MolDrawOptions()
-     drawOptions.prepareMolsBeforeDrawing = False
-     drawOptions.bondLineWidth = 1.0
-
-     molecule_image = Draw.MolsToGridImage(
-         selected_molecules,
-         molsPerRow=3,
-         subImgSize=(250, 250),
-         maxMols=len(selected_molecules),
-         returnPNG=False,
-         drawOptions=drawOptions,
-         highlightAtomLists=None,
-         highlightBondLists=None,
-     )
-     molecule_image.save("result_grid.png")
-     st.image(molecule_image)
-
- else:
-     st.warning("Please select a model to run inference.")
-
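The deleted app reports uniqueness and novelty scores from the trainer's `results` dict. As a minimal sketch of what these set-based metrics compute (the function names and signatures here are assumptions for illustration, not the DrugGEN API; novelty is measured against the training set, as the app's caption states):

```python
def uniqueness(generated):
    """Fraction of generated SMILES that are distinct (blank lines ignored)."""
    mols = [s for s in generated if s]
    return len(set(mols)) / len(mols) if mols else 0.0

def novelty(generated, training):
    """Fraction of distinct generated SMILES not present in the training set."""
    gen = {s for s in generated if s}
    return len(gen - set(training)) / len(gen) if gen else 0.0

print(uniqueness(["CCO", "CCO", "CCN"]))   # 2 distinct out of 3
print(novelty(["CCO", "CCN"], ["CCO"]))    # only "CCN" is unseen in training
```

Both metrics are string-level comparisons; a production pipeline would typically canonicalize SMILES first so that different spellings of the same molecule are not counted as distinct or novel.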
assets/DrugGEN_Figure1.gif DELETED
Binary file (338 kB)
 
assets/DrugGEN_Figure1_1.gif DELETED
Binary file (151 kB)
 
assets/DrugGEN_Figure1_2.gif DELETED
Binary file (166 kB)
 
assets/DrugGEN_G1_4.gif DELETED
Binary file (386 kB)
 
assets/DrugGEN_G1_final2.gif DELETED
Binary file (362 kB)
 
assets/DrugGEN_G2_3.gif DELETED
Binary file (556 kB)
 
assets/DrugGEN_G2_final2.gif DELETED
Binary file (588 kB)
 
assets/Selected_denovo_AKT1_inhibitors.png DELETED
Binary file (507 kB)
 
assets/generator_1_mod.gif DELETED
Binary file (472 kB)
 
assets/generator_2_mod.gif DELETED
Binary file (622 kB)
 
assets/molecule_1.png DELETED
Binary file (44.8 kB)
 
assets/molecule_2.png DELETED
Binary file (53.5 kB)
 
data/akt/2x39_X39_BS_adj.csv DELETED
The diff for this file is too large to render. See raw diff
 
data/akt/2x39_X39_BS_adj_euc.csv DELETED
The diff for this file is too large to render. See raw diff
 
data/akt/2x39_X39_BS_annot.csv DELETED
@@ -1,499 +0,0 @@
- A,C,HD,N,NA,OA,SA
- [498 one-hot atom-type annotation rows removed; diff too large to render inline, see raw diff]
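Each data row in these `*_BS_annot.csv` files one-hot encodes a binding-site atom's type over the header columns (which appear to follow AutoDock-style atom typing, e.g. `HD` for polar hydrogen, `OA` for acceptor oxygen); trailing all-zero rows pad the file to a fixed length. A minimal sketch of decoding a row back to its label (`decode_row` is a hypothetical helper, not part of the repository):

```python
ATOM_TYPES = ["A", "C", "HD", "N", "NA", "OA", "SA"]  # columns from the CSV header

def decode_row(row):
    """Map a one-hot CSV row to its atom-type label; all-zero padding rows give None."""
    values = [float(x) for x in row.split(",")]
    return ATOM_TYPES[values.index(1.0)] if 1.0 in values else None

print(decode_row("0.0,1.0,0.0,0.0,0.0,0.0,0.0"))  # C
print(decode_row("0.0,0.0,0.0,0.0,0.0,0.0,0.0"))  # None (padding row)
```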
data/akt/4gv1_0XZ_BS_adj.csv DELETED
The diff for this file is too large to render. See raw diff
 
data/akt/4gv1_0XZ_BS_adj_euc.csv DELETED
The diff for this file is too large to render. See raw diff
 
data/akt/4gv1_0XZ_BS_annot.csv DELETED
@@ -1,499 +0,0 @@
1
- A,C,HD,N,NA,OA,SA
2
- 0.0,0.0,0.0,0.0,0.0,1.0,0.0
3
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
4
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
5
- 0.0,0.0,0.0,0.0,0.0,1.0,0.0
6
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
7
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
8
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
9
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
10
- 0.0,0.0,0.0,1.0,0.0,0.0,0.0
11
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
12
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
13
- 0.0,0.0,0.0,0.0,0.0,1.0,0.0
14
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
15
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
16
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
17
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
18
- 0.0,0.0,1.0,0.0,0.0,0.0,0.0
19
- 0.0,0.0,0.0,1.0,0.0,0.0,0.0
20
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
21
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
22
- 0.0,0.0,0.0,0.0,0.0,1.0,0.0
23
- 0.0,0.0,1.0,0.0,0.0,0.0,0.0
24
- 0.0,0.0,0.0,1.0,0.0,0.0,0.0
25
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
26
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
27
- 0.0,0.0,0.0,0.0,0.0,1.0,0.0
28
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
29
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
30
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
31
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
32
- 0.0,0.0,0.0,1.0,0.0,0.0,0.0
33
- 0.0,0.0,1.0,0.0,0.0,0.0,0.0
34
- 0.0,0.0,1.0,0.0,0.0,0.0,0.0
35
- 0.0,0.0,1.0,0.0,0.0,0.0,0.0
36
- 0.0,0.0,0.0,1.0,0.0,0.0,0.0
37
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
38
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
39
- 0.0,0.0,0.0,0.0,0.0,1.0,0.0
40
- 0.0,0.0,1.0,0.0,0.0,0.0,0.0
41
- 0.0,0.0,0.0,1.0,0.0,0.0,0.0
42
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
43
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
44
- 0.0,0.0,0.0,0.0,0.0,1.0,0.0
45
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
46
- 0.0,0.0,0.0,0.0,0.0,1.0,0.0
47
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
48
- 0.0,0.0,1.0,0.0,0.0,0.0,0.0
49
- 0.0,0.0,1.0,0.0,0.0,0.0,0.0
50
- 0.0,0.0,0.0,1.0,0.0,0.0,0.0
51
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
52
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
53
- 0.0,0.0,0.0,0.0,0.0,1.0,0.0
54
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
55
- 1.0,0.0,0.0,0.0,0.0,0.0,0.0
56
- 1.0,0.0,0.0,0.0,0.0,0.0,0.0
57
- 1.0,0.0,0.0,0.0,0.0,0.0,0.0
58
- 1.0,0.0,0.0,0.0,0.0,0.0,0.0
59
- 1.0,0.0,0.0,0.0,0.0,0.0,0.0
60
- 1.0,0.0,0.0,0.0,0.0,0.0,0.0
61
- 0.0,0.0,1.0,0.0,0.0,0.0,0.0
62
- 0.0,0.0,0.0,1.0,0.0,0.0,0.0
63
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
64
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
65
- 0.0,0.0,0.0,0.0,0.0,1.0,0.0
66
- 0.0,0.0,1.0,0.0,0.0,0.0,0.0
67
- 0.0,0.0,0.0,1.0,0.0,0.0,0.0
68
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
69
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
70
- 0.0,0.0,0.0,0.0,0.0,1.0,0.0
71
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
72
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
73
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
74
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
75
- 0.0,0.0,1.0,0.0,0.0,0.0,0.0
76
- 0.0,0.0,0.0,1.0,0.0,0.0,0.0
77
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
78
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
79
- 0.0,0.0,0.0,0.0,0.0,1.0,0.0
80
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
81
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
82
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
83
- 0.0,0.0,1.0,0.0,0.0,0.0,0.0
84
- 0.0,0.0,0.0,1.0,0.0,0.0,0.0
85
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
86
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
87
- 0.0,0.0,0.0,0.0,0.0,1.0,0.0
88
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
89
- 0.0,0.0,1.0,0.0,0.0,0.0,0.0
90
- 0.0,0.0,0.0,1.0,0.0,0.0,0.0
91
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
92
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
93
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
94
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
95
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
96
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
97
- 0.0,0.0,1.0,0.0,0.0,0.0,0.0
98
- 0.0,0.0,1.0,0.0,0.0,0.0,0.0
99
- 0.0,0.0,1.0,0.0,0.0,0.0,0.0
100
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
101
- 0.0,0.0,0.0,0.0,0.0,1.0,0.0
102
- 0.0,0.0,0.0,1.0,0.0,0.0,0.0
103
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
104
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
105
- 0.0,0.0,0.0,0.0,0.0,1.0,0.0
106
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
107
- 1.0,0.0,0.0,0.0,0.0,0.0,0.0
108
- 1.0,0.0,0.0,0.0,0.0,0.0,0.0
109
- 0.0,0.0,1.0,0.0,0.0,0.0,0.0
110
- 0.0,0.0,0.0,1.0,0.0,0.0,0.0
111
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
112
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
113
- 0.0,0.0,0.0,0.0,0.0,1.0,0.0
114
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
115
- 0.0,0.0,1.0,0.0,0.0,0.0,0.0
116
- 0.0,0.0,0.0,1.0,0.0,0.0,0.0
117
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
118
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
119
- 0.0,0.0,0.0,0.0,0.0,1.0,0.0
120
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
121
- 0.0,0.0,1.0,0.0,0.0,0.0,0.0
122
- 0.0,0.0,1.0,0.0,0.0,0.0,0.0
123
- 0.0,0.0,0.0,1.0,0.0,0.0,0.0
124
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
125
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
126
- 0.0,0.0,0.0,0.0,0.0,1.0,0.0
127
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
128
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
129
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
130
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
131
- 0.0,0.0,0.0,1.0,0.0,0.0,0.0
132
- 0.0,0.0,1.0,0.0,0.0,0.0,0.0
133
- 0.0,0.0,1.0,0.0,0.0,0.0,0.0
134
- 0.0,0.0,1.0,0.0,0.0,0.0,0.0
135
- 0.0,0.0,1.0,0.0,0.0,0.0,0.0
136
- 0.0,0.0,0.0,1.0,0.0,0.0,0.0
137
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
138
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
139
- 0.0,0.0,0.0,0.0,0.0,1.0,0.0
140
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
141
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
142
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
143
- 0.0,0.0,1.0,0.0,0.0,0.0,0.0
144
- 0.0,0.0,1.0,0.0,0.0,0.0,0.0
145
- 0.0,0.0,0.0,1.0,0.0,0.0,0.0
146
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
147
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
148
- 0.0,0.0,0.0,0.0,0.0,1.0,0.0
149
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
150
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
151
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
152
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
153
- 0.0,0.0,1.0,0.0,0.0,0.0,0.0
154
- 0.0,0.0,0.0,0.0,1.0,0.0,0.0
155
- 0.0,0.0,1.0,0.0,0.0,0.0,0.0
156
- 0.0,0.0,1.0,0.0,0.0,0.0,0.0
157
- 0.0,0.0,1.0,0.0,0.0,0.0,0.0
158
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
159
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
160
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
161
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
162
- 0.0,0.0,0.0,0.0,0.0,1.0,0.0
163
- 0.0,0.0,0.0,0.0,0.0,1.0,0.0
164
- 0.0,0.0,1.0,0.0,0.0,0.0,0.0
165
- 0.0,0.0,1.0,0.0,0.0,0.0,0.0
166
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
167
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
168
- 0.0,0.0,0.0,0.0,0.0,1.0,0.0
169
- 0.0,0.0,0.0,0.0,0.0,1.0,0.0
170
- 0.0,0.0,1.0,0.0,0.0,0.0,0.0
171
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
172
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
173
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
174
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
175
- 0.0,0.0,0.0,0.0,0.0,1.0,0.0
176
- 0.0,0.0,0.0,1.0,0.0,0.0,0.0
177
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
178
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
179
- 0.0,0.0,0.0,0.0,0.0,1.0,0.0
180
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
181
- 0.0,0.0,0.0,0.0,0.0,1.0,0.0
182
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
183
- 0.0,0.0,1.0,0.0,0.0,0.0,0.0
184
- 0.0,0.0,1.0,0.0,0.0,0.0,0.0
185
- 0.0,0.0,0.0,1.0,0.0,0.0,0.0
186
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
187
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
188
- 0.0,0.0,0.0,0.0,0.0,1.0,0.0
189
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
190
- 0.0,0.0,1.0,0.0,0.0,0.0,0.0
191
- 0.0,0.0,0.0,1.0,0.0,0.0,0.0
192
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
193
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
194
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
195
- 0.0,0.0,1.0,0.0,0.0,0.0,0.0
196
- 0.0,0.0,1.0,0.0,0.0,0.0,0.0
197
- 0.0,0.0,0.0,0.0,0.0,1.0,0.0
198
- 0.0,0.0,1.0,0.0,0.0,0.0,0.0
199
- 0.0,0.0,1.0,0.0,0.0,0.0,0.0
200
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
201
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
202
- 0.0,0.0,0.0,0.0,0.0,1.0,0.0
203
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
204
- 1.0,0.0,0.0,0.0,0.0,0.0,0.0
205
- 1.0,0.0,0.0,0.0,0.0,0.0,0.0
206
- 1.0,0.0,0.0,0.0,0.0,0.0,0.0
207
- 0.0,0.0,1.0,0.0,0.0,0.0,0.0
208
- 0.0,0.0,1.0,0.0,0.0,0.0,0.0
209
- 0.0,0.0,0.0,1.0,0.0,0.0,0.0
210
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
211
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
212
- 0.0,0.0,0.0,0.0,0.0,1.0,0.0
213
- 0.0,0.0,1.0,0.0,0.0,0.0,0.0
214
- 0.0,0.0,1.0,0.0,0.0,0.0,0.0
215
- 0.0,0.0,1.0,0.0,0.0,0.0,0.0
216
- 0.0,0.0,0.0,1.0,0.0,0.0,0.0
217
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
218
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
219
- 0.0,0.0,0.0,0.0,0.0,1.0,0.0
220
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
221
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
222
- 0.0,0.0,0.0,0.0,0.0,0.0,1.0
223
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
224
- 0.0,0.0,1.0,0.0,0.0,0.0,0.0
225
- 0.0,0.0,0.0,1.0,0.0,0.0,0.0
226
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
227
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
228
- 0.0,0.0,0.0,0.0,0.0,1.0,0.0
229
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
230
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
231
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
232
- 0.0,0.0,0.0,0.0,0.0,1.0,0.0
233
- 0.0,0.0,1.0,0.0,0.0,0.0,0.0
234
- 0.0,0.0,0.0,1.0,0.0,0.0,0.0
235
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
236
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
237
- 0.0,0.0,0.0,0.0,0.0,1.0,0.0
238
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
239
- 1.0,0.0,0.0,0.0,0.0,0.0,0.0
240
- 1.0,0.0,0.0,0.0,0.0,0.0,0.0
241
- 1.0,0.0,0.0,0.0,0.0,0.0,0.0
242
- 1.0,0.0,0.0,0.0,0.0,0.0,0.0
243
- 1.0,0.0,0.0,0.0,0.0,0.0,0.0
244
- 1.0,0.0,0.0,0.0,0.0,0.0,0.0
245
- 0.0,0.0,0.0,0.0,0.0,1.0,0.0
246
- 0.0,0.0,1.0,0.0,0.0,0.0,0.0
247
- 0.0,0.0,1.0,0.0,0.0,0.0,0.0
248
- 0.0,0.0,0.0,1.0,0.0,0.0,0.0
249
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
250
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
251
- 0.0,0.0,0.0,0.0,0.0,1.0,0.0
252
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
253
- 0.0,0.0,1.0,0.0,0.0,0.0,0.0
254
- 0.0,0.0,0.0,1.0,0.0,0.0,0.0
255
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
256
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
257
- 0.0,0.0,0.0,0.0,0.0,1.0,0.0
258
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
259
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
260
- 0.0,0.0,0.0,0.0,0.0,1.0,0.0
261
- 0.0,0.0,0.0,1.0,0.0,0.0,0.0
262
- 0.0,0.0,1.0,0.0,0.0,0.0,0.0
263
- 0.0,0.0,1.0,0.0,0.0,0.0,0.0
264
- 0.0,0.0,1.0,0.0,0.0,0.0,0.0
265
- 0.0,0.0,0.0,1.0,0.0,0.0,0.0
266
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
267
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
268
- 0.0,0.0,0.0,0.0,0.0,1.0,0.0
269
- 0.0,0.0,1.0,0.0,0.0,0.0,0.0
270
- 0.0,0.0,0.0,1.0,0.0,0.0,0.0
271
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
272
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
273
- 0.0,0.0,0.0,0.0,0.0,1.0,0.0
274
- 0.0,0.0,1.0,0.0,0.0,0.0,0.0
275
- 0.0,0.0,0.0,1.0,0.0,0.0,0.0
276
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
277
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
278
- 0.0,0.0,0.0,0.0,0.0,1.0,0.0
279
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
280
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
281
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
282
- 0.0,0.0,0.0,0.0,0.0,1.0,0.0
283
- 0.0,0.0,0.0,0.0,0.0,1.0,0.0
284
- 0.0,0.0,1.0,0.0,0.0,0.0,0.0
285
- 0.0,0.0,0.0,1.0,0.0,0.0,0.0
286
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
287
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
288
- 0.0,0.0,1.0,0.0,0.0,0.0,0.0
289
- 0.0,0.0,1.0,0.0,0.0,0.0,0.0
290
- 0.0,0.0,0.0,1.0,0.0,0.0,0.0
291
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
292
- 0.0,1.0,0.0,0.0,0.0,0.0,0.0
293
- 1.0,0.0,0.0,0.0,0.0,0.0,0.0
294
- 1.0,0.0,0.0,0.0,0.0,0.0,0.0
295
- 1.0,0.0,0.0,0.0,0.0,0.0,0.0
296
- 1.0,0.0,0.0,0.0,0.0,0.0,0.0
297
- 1.0,0.0,0.0,0.0,0.0,0.0,0.0
298
- 1.0,0.0,0.0,0.0,0.0,0.0,0.0
299
- … (remaining one-hot atom-type annotation rows, through row 499, omitted; diff too large to render, see raw diff)
data/akt/akt3_0XZ_BS_adj.csv DELETED
The diff for this file is too large to render. See raw diff
 
data/akt/akt3_0XZ_BS_adj_euc.csv DELETED
The diff for this file is too large to render. See raw diff
 
data/akt/akt3_0XZ_BS_annot.csv DELETED
@@ -1,499 +0,0 @@
1
- A,C,HD,N,NA,OA,SA
- … (498 one-hot atom-type annotation rows omitted; diff too large to render, see raw diff)
data/akt_inhibitors.smi DELETED
The diff for this file is too large to render. See raw diff
 
data/akt_test.smi DELETED
@@ -1,320 +0,0 @@
1
- NC1(c2ccc(-c3nc4ccc(-c5ncc[nH]5)cn4c3-c3ccccc3)cc2)CCC1
- … (remaining SMILES rows, of 320 total, omitted; diff too large to render, see raw diff)
158
- O=C(C(CNC1CCCCC1)c1ccc(Cl)cc1)N1CCN(c2ncnc3sc4c(c23)CCC4)CC1
159
- Nc1ncnc2nc(-c3ccc(CN4CCC(c5nc6ccc(F)cc6[nH]5)CC4)cc3)c(-c3ccccc3)cc12
160
- CC1CC(O)c2ncnc(N3CCN(C(=O)C(CN4CCC(F)CC4)c4ccc(Cl)cc4)CC3)c21
161
- CCc1cc2cc(-c3cncc(OCC(N)Cc4c[nH]c5ccccc45)c3)ccc2cn1
162
- COc1ccc(C2(C(=O)N3CCN(c4ncnc5[nH]ccc45)CC3)CCNCC2)cc1
163
- NC(CNc1ncc(-c2ccc3cnccc3c2)s1)Cc1ccc(C(F)(F)F)cc1
164
- Cn1ncc(Cl)c1-c1cc(C(=O)NC(CN)Cc2cccc(F)c2)sc1Cl
165
- Cn1nccc1-c1csc(C(=O)NC2CNCCC2c2ccc(Cl)cc2)c1
166
- Cc1ccc(-c2ccc3nn4cc(-c5ccccc5)c(-c5ccc(C6(N)CCC6)cc5)nc4c3c2)cc1
167
- CCCC(NC(=O)C(CCCNC(=N)N)NC(=O)C1CCCN1C(=O)C(N)CCCNC(=N)N)C(=O)NC(Cc1ccc(O)cc1)C(=O)NC(CS)C(=O)NC(CCC(C)C)C(N)=O
168
- NC1(c2ccc(-c3nc4cc(Cl)ccn4c3-c3ccccc3)cc2)CCC1
169
- NC1(c2ccc(-c3ncc4cccn4c3-c3ccccc3)cc2)CCC1
170
- Fc1ccc(-c2cn3c(Cl)cnc3nc2-c2ccc(CN3CCC(c4n[nH]c(-c5ccccn5)n4)CC3)cc2)cc1
171
- COC(=O)COc1ccn2c(-c3ccccc3)c(-c3ccc(C4(N)CCC4)cc3)nc2c1
172
- CC(=O)Nc1ccc(-c2cncc(OCC(N)Cc3c[nH]c4ccccc34)c2)cc1
173
- COc1c(F)ccc(NC(=O)Cc2nc(N3CCOCC3)cc(=O)[nH]2)c1F
174
- NC1(C(=O)NCc2ccc(Cl)cc2)CCN(c2ccnc3[nH]ccc23)CC1
175
- Nc1ncnc2nc(-c3ccc(CN4CCC(n5c(=O)[nH]c6ccccc65)CC4)cc3)c(-c3ccccc3)cc12
176
- NC1(c2ccc(-c3nc4c(-c5cn[nH]c5)cccn4c3-c3ccccc3)cc2)CCC1
177
- c1ccc(-c2cc(-c3nn[nH]n3)cnc2-c2ccc(CNCc3ccc(-c4csnn4)cc3)cc2)cc1
178
- NCC(NCc1ccc(-c2ccnc3[nH]ccc23)s1)c1ccccc1
179
- CC(=O)Nc1nc2ccc(-c3ccnc(N(C)S(=O)(=O)c4ccccc4F)n3)cc2s1
180
- N#Cc1c(N)ncnc1N1CCC(c2nc(-c3cc[nH]c(=O)c3)cn2CCN2CCC2)CC1
181
- NCC(Cc1cccc(F)c1)NC(=O)c1cc(Br)c(-c2ccnc3[nH]ccc23)s1
182
- O=C(N1CCN(c2ncnc3[nH]nc(Cl)c23)CC1)C1(c2ccc(Cl)c(Cl)c2)CCNCC1
183
- Cc1cc(C(N)=O)cn2c(-c3ccccc3)c(-c3ccc(C4(N)CCC4)cc3)nc12
184
- O=C(NC(c1cccc(Cl)c1)C1CCNCC1)c1ccc2cnccc2c1
185
- NC1(c2ccc(-c3nc4ccc(-c5cnc[nH]5)cn4c3-c3ccccc3)cc2)CCC1
186
- NC1(c2ccc(-n3c(-c4ccccc4)nc4ccc(-c5ccccc5)nc43)cc2)CCC1
187
- CC1(O)CC(O)(c2ccc(-c3nc4n(c3-c3ccccc3)COc3cc(C(=O)O)ccc3-4)cc2)C1
188
- CCOCCN(CC(O)CN1CCCC2(CCN(c3ncnc(N)c3C3CC3)C2)C1)S(=O)(=O)c1c(C)cccc1C
189
- NC1(Cc2cccc(OC(F)(F)F)c2)CCN(c2ncnc3[nH]ccc23)CC1
190
- c1ccc(-c2cc3cccnc3nc2-c2ccc(CN3CCC(c4cc(-c5ccccn5)[nH]n4)CC3)cc2)cc1
191
- CCCCCCCCCCCCCCCC(=O)OCC(COP(=O)(O)OC1C(O)C(OP(=O)(O)O)C(OP(=O)(O)O)C(OP(=O)(O)O)C1O)OC(=O)CCCCCCCCCCCCCCC
192
- Cc1noc(C)c1S(=O)(=O)N(CCOC(C)C)CC(O)CN1CCCC2(CC(=O)c3cc(O)ccc3O2)C1
193
- O=C(Cc1ccc(Cl)cc1)N1CCN(c2ncnc3[nH]cc(Br)c23)CC1
194
- Cl.NCc1ccc(-n2c(-c3cccnc3N)nc3ccc(-c4ccccc4)nc32)cc1
195
- CS(=O)(=O)c1ccc(-c2cnc3cc(-c4ccccc4)c(-c4ccc(C5(N)CCC5)cc4)nn23)cc1
196
- Cc1c[nH]c2ncnc(N3CCC(NC(=O)c4ccccc4)C3)c12
197
- CNC(=O)C1CCN(c2cnc(C(=O)Nc3csc(-c4nncn4C(C)C(F)(F)F)n3)cc2-n2cnc(C3CC3)c2)CC1
198
- NCC(NC(=O)c1cc(-c2ccccc2)c(-c2ccnc3[nH]ccc23)s1)c1ccccc1
199
- CC1Cc2c(Br)cccc2N1C(=O)Cc1nc(N2CCOCC2)c(F)c(=O)n1C
200
- Nc1ccc(-c2nnc(C3CCN(Cc4ccc(-c5nc6cc[nH]c(=O)c6cc5-c5ccccc5)cc4)CC3)[nH]2)cn1
201
- NC1(c2ccc(-c3nc4c5ccc(Br)cc5nn4cc3-c3ccccc3)cc2)CCC1
202
- N#Cc1c(N)ncnc1N1CCC(c2nc(-c3ccc(F)c(Cl)c3)cn2CCN2CCCC2)CC1
203
- [C-]#[N+]c1cccc(C(=O)Nc2ccc(-c3nnc4n3-c3cccnc3Nc3ccccc3-4)cc2)c1
204
- CCc1cnn(C)c1-c1ccc(C(=O)NC2CNCCC2c2cccc(F)c2)s1.Cl
205
- Cc1cc(O)cc2c1NC(C)(CCCC(C)C)CC2
206
- O=c1ccc(-c2cc(C3CCN(Cc4ccc(-c5nc6ncccc6cc5-c5ccccc5)cc4)CC3)n[nH]2)c[nH]1
207
- O=S(=O)(NC1(c2ccc(-c3nnc4n3-c3cccnc3Nc3ccccc3-4)cc2)CCC1)c1cccc(F)c1
208
- CC1(O)CC(N)(c2ccc(-c3nc4n(c3-c3ccccc3)COc3ccccc3-4)cc2)C1
209
- Cn1c(CC(=O)N2CCc3ccc(F)cc32)nc(N2CCOCC2)cc1=O
210
- N=C(c1ccccc1)n1c(=N)ccc2nc(-c3ccc(C4(N)CC(F)(F)C4)cc3)c(-c3ccccc3)cc21
211
- NC1(c2ccc(-c3nc4c5cccc(-c6cn[nH]c6)c5nn4cc3-c3ccccc3)cc2)CCC1
212
- N#Cc1cccc(-c2ccc3nn4cc(-c5ccccc5)c(-c5ccc(C6(N)CCC6)cc5)nc4c3c2)c1
213
- CCn1c(-c2nonc2N)nc2c(-c3ccoc3)ncc(OCCCN)c21
214
- CN(C)CCn1cc(-c2ccc(F)c(C(F)(F)F)c2)nc1C1CCN(c2ncnc(N)c2-c2cnc(N)nc2)CC1
215
- Cc1n[nH]c2ccc(-c3nnc(NCC(N)Cc4ccc(Cl)cc4)s3)cc12
216
- NC1(c2ccc(-c3nc4ccc(-c5cn[nH]c5)cn4c3-c3ccccc3)cc2)CCC1
217
- CC1OC2CC(=O)OC2C2=C1C(=O)c1ccccc1C2=O
218
- CC1(O)CC(N)(c2ccc(-c3nc4n(c3-c3ccsc3)COc3cccc(F)c3-4)cc2)C1
219
- CC(=O)Nc1cccc(-c2ccc3nc(-c4cccnc4N)n(-c4ccc(CC(=O)Nc5cccc(F)c5)cc4)c3n2)c1
220
- CNC1CC2OC(C)(C1OC)n1c3ccccc3c3c4c(c5c6ccccc6n2c5c31)C(=O)NC4
221
- CCOC(=O)c1c(C)nc(NNC(=O)c2cccc3c(=O)c4ccccc4[nH]c23)nc1-c1ccc(OC)c(OC)c1
222
- c1ccc(-c2cc3cnc(-n4ccnc4)nc3nc2-c2ccc(CN3CCC(c4nnc(-c5ccccn5)[nH]4)CC3)cc2)cc1
223
- Cc1nnc2c3cc(-c4ccccc4)c(-c4ccc(CN5CCC(c6n[nH]c(-c7ccccn7)n6)CC5)cc4)nc3ccn12
224
- CN(C)c1ccc(C(=O)NCc2ccc(-c3nnc4n3-c3cccnc3Nc3ccccc3-4)cc2)cc1
225
- NCC(Cc1ccc(C(F)(F)F)cc1)NC(=O)c1cc(Br)c(-c2ccnc3[nH]ccc23)s1
226
- CC1Cc2c(ccc(F)c2Cl)N1C(=O)Cc1nc(N2CCOCC2)cc(=O)[nH]1
227
- C=Cc1ncc(OCC(N)Cc2c[nH]c3ccccc23)cc1-c1ccc2cnccc2c1
228
- Cc1cc(F)ccc1S(=O)(=O)NCC(O)CN1CCCC2(CCN(c3ncnc(N)c3C3CC3)C2)C1
229
- NC1(C(=O)NC(CCO)c2ccc(Cl)cc2)CCN(c2ncnc3[nH]ccc23)CC1
230
- NC(=O)COc1ccn2c(-c3ccccc3)c(-c3ccc(C4(N)CCC4)cc3)nc2c1
231
- Cc1cn2c(-c3ccccc3)c(-c3ccc(C4(N)CCC4)cc3)nc2c(C)n1
232
- CCc1c(N)ncnc1N1CCC(c2nc(-c3ccc(F)cc3)cn2CCN2CC(F)C2)CC1
233
- CC1Cc2ccccc2N1C(=O)Cc1nc(N2CCOCC2)c(Cl)c(=O)[nH]1
234
- O=C(Cc1nc(N2CCOCC2)cc(=O)[nH]1)N1CCc2c(Cl)cccc21
235
- CCc1ncc(OCC(N)Cc2c[nH]c3ccccc23)cc1-c1ccc2cnccc2c1
236
- Cc1cc(-c2cccnc2)c2nc(-c3ccc(C4(N)CCC4)cc3)c(-c3ccccc3)n2c1
237
- CNC1CC2OC(C)(C1OC)n1c3ccccc3c3c4c(c5c6ccccc6n2c5c31)C(=O)NC4
238
- CCn1c(-c2nonc2N)nc2c(C#CC(C)(C)O)ncc(OCCCCN)c21
239
- Cc1n[nH]c2ccc(-c3cncc(OCC(N)Cc4cccc(C(F)(F)F)c4)c3)cc12
240
- C=Cc1c(N)ncnc1N1CCC(c2nc(-c3cccc(F)c3)cn2CCN2CCCC2)CC1
241
- Cc1cc(NC(=O)Cc2nc(N3CCOCC3)cc(=O)[nH]2)ccc1F
242
- Nc1ncccc1-c1nc2ccc(Nc3ccc(N4CCOCC4)cc3)nc2n1-c1ccc(C2(N)CCC2)cc1
243
- CCc1c[nH]c2ncnc(N3CCC(N)(CNC(=O)c4ccc(F)cc4F)C3)c12
244
- NC1(c2ccc(-c3nc4c5cc(F)ccc5nn4cc3-c3ccccc3)cc2)CCC1
245
- O=C(Cc1nc(N2CCOCC2)cc(=O)[nH]1)Nc1cccc(Cl)c1
246
- CC1CC(O)c2ncnc(N3CCN(C(=O)C(CNCC4CC4)c4ccc(Cl)cc4)CC3)c21
247
- Nc1ncccc1-c1nc2ccc(-c3cccc(N4CCC(C(=O)N5CCOCC5)CC4)c3)nc2n1-c1ccc(C2(N)CCC2)cc1
248
- CC1Cc2ccccc2N1C(=O)Cc1nc(N2CCOC(CF)C2)cc(=O)[nH]1
249
- Cc1ccn2c(-c3ccccc3)c(-c3ccc(C4(N)CCC4)cc3)nc2c1
250
- NC1(c2ccc(-c3ncc4cnccc4c3-c3ccccc3)cc2)CCC1
251
- COc1ncc(-c2cc3c(C)nc(N)nc3n(C3CCC(OCC(N)=O)CC3)c2=O)cn1
252
- CCn1c(-c2nonc2N)nc2c(C#CC(C)(C)O)nc(OCCNC)cc21
253
- Cc1cccc(-c2nc(C3CCN(Cc4ccc(-c5nc6nccn6cc5-c5ccc(F)cc5)cc4)CC3)n[nH]2)n1
254
- NC1(c2ccc(-c3nc4c5cc(-c6ccc(CO)cc6)ccc5nn4cc3-c3ccccc3)cc2)CCC1
255
- NC1(c2ccc(-c3nc4ncc(-c5ccccc5)cn4c3-c3ccccc3)cc2)CCC1
256
- CC1Cc2cc(F)c(F)cc2N1C(=O)Cc1nc(N2CCOCC2)c(F)c(=O)n1C
257
- NC1(c2ccc(-n3c(-c4ccccc4)nc4ccc(NCc5ccccc5)nc43)cc2)CCC1
258
- OCCNC(c1ccc(Cl)cc1)c1ccc(-c2cn[nH]c2)cc1
259
- Cc1cc(-c2cn(CCNCC(C)C)c(C3CCN(c4ncnc(N)c4C(N)=O)CC3)n2)ccc1F
260
- CCNc1nc(-c2ccoc2)c(-c2cnc3[nH]nc(C)c3n2)cc1OCC(N)Cc1ccccc1
261
- Cn1c(CC(=O)N2CCc3c(F)cccc32)nc(N2CCOCC2)cc1=O
262
- Cc1c(-c2ccn[nH]2)cn2c(-c3ccccc3)c(-c3ccc(C4(N)CCC4)cc3)nc2c1C
263
- NC(COc1cncc(-c2ccc3[nH]nc(-c4ccc[nH]4)c3c2)c1)Cc1c[nH]c2ccccc12
264
- Cc1n[nH]c2ccc(-c3nnc(NCC(N)Cc4ccc(F)c(F)c4)s3)cc12
265
- O=S(=O)(NCCNCC=Cc1ccc(Br)cc1)c1cccc2cnccc12
266
- COC(=O)c1c(C)nc(NNC(=O)c2cccc3c(=O)c4ccccc4[nH]c23)nc1-c1ccc(OC)cc1
267
- CCn1c(-c2nonc2N)nc2c(C#CC(C)(C)O)nc(OC(CN)c3ccccc3)cc21
268
- CC(C)NCCn1cc(-c2ccc(F)c(C(F)(F)F)c2)nc1C1CCN(c2ncnc(N)c2C(N)=O)CC1
269
- CC(=O)Nc1nc2ccc(-c3cnc(Cl)c(NS(=O)(=O)c4ccc(F)cc4)c3)cc2[nH]1
270
- Nc1ncnc(N2CCC(c3nc(-c4cccc(F)c4)cn3CCN3CCCC3)CC2)c1Br
271
- O=C(NC(c1ccc(Cl)cc1)C1CCNCC1)c1ccc2cnccc2c1
272
- Cc1cc(-c2cn(CCN(C)C)c(C3CCN(c4ncnc(N)c4C#N)CC3)n2)ccc1F
273
- CNc1nccc(-c2ccc(C(=O)NCC(C)c3ccc(Cl)cc3Cl)s2)n1
274
- CC1Cc2cc(F)ccc2N1C(=O)Cc1nc(N2CCOCC2)c(F)c(=O)[nH]1
275
- Cn1ncc(Cl)c1-c1cc(C(=O)NC2CNCCC2c2ccc(Cl)c(Cl)c2)oc1Cl
276
- COc1ccccc1C(=O)N1CCN(Cc2ccc(-c3nnc4n3-c3cccnc3Nc3ccccc3-4)cc2)CC1
277
- Nc1cc2cc(-c3cnc(NCC(N)Cc4ccc(C(F)(F)F)cc4)s3)ccc2cn1
278
- CSc1nc2nc(-c3ccc(CN4CCC(c5n[nH]c(-c6cccc(C)n6)n5)CC4)cc3)c(-c3ccccc3)cn2n1
279
- Nc1ncnc(N2CCC(c3nc(-c4ccnc(C(F)(F)F)c4)cn3CCNC3CC3)CC2)c1Cl
280
- NCC(Cc1ccccc1)NC(=O)c1cc(Br)c(-c2ccnc3[nH]ccc23)s1
281
- COc1ccc(COc2ccn3c(-c4ccccc4)c(-c4ccc(C5(N)CCC5)cc4)nc3n2)cn1
282
- Cc1n[nH]c2ccc(-c3cc(OCC(N)Cc4c[nH]c5ccccc45)cnc3-c3ccoc3)nc12
283
- Cc1cnn(C)c1-c1ccc(C(=O)NC2CNCCC2c2cccc(F)c2)s1.Cl
284
- N=C(c1ccccc1)n1c(=N)ccc2nc(-c3ccc(C4(N)CC(F)(F)C4)cc3)c(-c3ccccc3)cc21
285
- CC(C)Nc1c(-c2ccccc2)c(-c2ccc(CN3CC(c4n[nH]c(-c5ccccn5)n4)C3)cc2)nc2nc(-c3ccccn3)nn12
286
- CSc1ncc2cc(-c3ccccc3)c(-c3ccc(CNCCc4nccs4)cc3)nc2n1
287
- NC1(c2ccc(-n3c(-c4cccc(Cl)c4)nc4ccc(-c5cccc(N6CCOCC6)c5)nc43)cc2)CCC1
288
- NC1(C(=O)NC(CCN2CCCCC2)c2ccc(Cl)cc2)CCN(c2ncnc3[nH]ccc23)CC1
289
- NC1(c2ccc(-c3nc4c(F)cccn4c3-c3ccccc3)cc2)CCC1
290
- Cc1n[nH]c2ccc(-c3cncc(OCC(N)Cc4csc5ccccc45)c3)cc12
291
- NC(=O)c1c(N)ncnc1N1CCC(c2nc(-c3ccc(F)c(Cl)c3)cn2CCN2CCC2)CC1
292
- Cc1cc(-c2cn(CCNC3CC3)c(C3CCN(c4ncnc(N)c4-c4cn[nH]c4)CC3)n2)ccc1F
293
- NC(=O)c1cccn2c(-c3ccccc3)c(-c3ccc(C4(N)CCC4)cc3)nc12
294
- Cc1cc(-c2cn(CC3CNC3)c(C3CCN(c4ncnc(N)c4C#N)CC3)n2)ccc1F
295
- NC1(c2ccc(-c3nc4c(-c5ccn[nH]5)cccn4c3-c3ccccc3)cc2)CCC1
296
- Cc1cc(-c2cn(CCNCC(C)C)c(C3CCN(c4ncnc(N)c4C#N)CC3)n2)ccc1F
297
- Cc1c[nH]c2ncnc(Nc3ccccc3-c3nnc[nH]3)c12
298
- CCn1c(-c2nonc2N)nc2c(C#CCCO)ncc(OCCCN)c21
299
- CC(C)NCC(Cc1ccc(Cl)c(F)c1)C(=O)N1CCN(c2ncnc3c2C(C)OC3)CC1
300
- O=C(Cc1nc(N2CCOCC2)cc(=O)[nH]1)N1CCc2c(F)cccc21
301
- CN1CC(C(NC(=O)c2ccc3cnccc3c2)c2ccc(Cl)c(Cl)c2)C1
302
- CCCC(NC(=O)C(CCCNC(=N)N)NC(=O)CN(CCCCCCN)C(=O)C(N)CCCNC(=N)N)C(=O)NC(Cc1ccc(O)cc1)C(=O)NC(CN)C(=O)NC(CCC(C)C)C(=O)N(CCN)CC(N)=O
303
- Cc1c(NCCN2CCCC2)cc(C(=O)CCC(F)(F)F)cc1N1CCN(c2ncnc3[nH]nc(Br)c23)CC1
304
- NCC(Cc1cccc(C(F)(F)F)c1)NC(=O)c1cc(Br)c(-c2ccnc3[nH]ccc23)s1
305
- CSc1nc2nc(-c3ccc(CN4CC(c5n[nH]c(-c6cccc(C)n6)n5)C4)cc3)c(-c3ccccc3)cn2n1
306
- NC(COc1cncc(-c2ccc3c(c2)C(c2cccs2)C(=O)N3)c1)Cc1c[nH]c2ccccc12
307
- Cn1nnnc1-c1cnc(-c2ccc(CN3CCC(n4c(=O)[nH]c5ccccc54)CC3)cc2)c(-c2ccccc2)c1
308
- CC(C)c1cccc(NC(=O)Cc2nc(N3CCOCC3)cc(=O)[nH]2)c1
309
- CC1CC(O)c2ncnc(N3CCN(C(=O)C(CNCC4CC4)c4ccc(C(F)(F)F)c(F)c4)CC3)c21
310
- Cc1c[nH]c2ncnc(N3CCC(N)(CNC(=O)c4ccc(Cl)cc4)C3)c12
311
- NC1(c2ccc(-n3c(-c4ccccc4O)nc4ccc(-c5ccccc5)nc43)cc2)CCC1
312
- CC(C)(Cc1ccccc1)C1C(=O)Nc2ccc(-c3cncc(OCC(N)Cc4c[nH]c5ccccc45)c3)cc21
313
- Cc1ccc(CC(N)C(=O)N2CCN(c3ncnc4ccccc34)CC2)cc1
314
- Nc1ncccc1-c1nc2ccc(-c3cccnc3)nc2n1-c1ccc(CC(=O)Nc2ccccc2)cc1
315
- CSc1ncc2cc(-c3ccccc3)c(-c3ccc(CN4CCC(n5ncc6c(N)ncnc65)CC4)cc3)nc2n1
316
- NC1(c2ccc(-c3nn4c(-c5ccn[nH]5)cnc4cc3-c3ccccc3)cc2)CCC1
317
- Cl.NCc1ccc(-n2c(-c3cccnc3N)nc3ccc(-c4cn[nH]c4)nc32)cc1
318
- COC1(C)CN(c2cnc(C(=O)Nc3csc(-c4nncn4C4CC4)n3)cc2-n2cnc(C3CC3)c2)C1
319
- N#Cc1ncc(OCC(N)Cc2c[nH]c3ccccc23)cc1-c1ccc2cnccc2c1
320
- NC(COc1cncc(-c2ccc3c(c2)C(c2ccccc2)C(=O)N3)c1)Cc1c[nH]c2ccccc12
 
data/akt_train.smi DELETED
The diff for this file is too large to render. See raw diff
 
data/chembl_test.smi DELETED
The diff for this file is too large to render. See raw diff
 
data/dataset_download.sh DELETED
@@ -1,6 +0,0 @@
1
- #!/bin/sh
2
- pip install gdown
3
-
4
- gdown --fuzzy "https://drive.google.com/file/d/1kDpTm36X3ugpr6Ooo4Fg_dkNRZhQ5EMC/view?usp=share_link"
5
-
6
- gdown --fuzzy "https://drive.google.com/file/d/13h465yaIbrAp5tcGbIwhxriorejr6Fsz/view?usp=share_link"
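The script above fetches the dataset archives with gdown. As a minimal sketch (assuming the downloaded files are plain .smi lists with one SMILES string per line, like data/filtered_akt_inhibitors.smi below), the files can be loaded like this:

```python
# Sketch only: assumes one SMILES string per line, blank lines ignored.
from pathlib import Path


def load_smiles(path):
    """Read a .smi file and return its non-empty SMILES strings."""
    return [line.strip() for line in Path(path).read_text().splitlines() if line.strip()]


# Tiny self-contained demo using a temporary file.
import os
import tempfile

tmp = tempfile.NamedTemporaryFile("w", suffix=".smi", delete=False)
tmp.write("c1ccccc1\nCCO\n\n")  # benzene, ethanol, trailing blank line
tmp.close()
smiles = load_smiles(tmp.name)
print(smiles)  # ['c1ccccc1', 'CCO']
os.unlink(tmp.name)
```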
 
data/decoders/__init__.txt DELETED
File without changes
data/encoders/__init__.txt DELETED
File without changes
data/filtered_akt_inhibitors.smi DELETED
@@ -1,1600 +0,0 @@
1
- Cc1c(N)ncnc1N1CCC(c2nc(-c3ccc(F)c(C(F)(F)F)c3)cn2CCN2CCCC2)CC1
2
- COC(=O)c1ccc(-c2c(N)ncnc2N2CCC(c3nc(-c4ccc(F)c(C(F)(F)F)c4)cn3CCN3CCC3)CC2)cc1
3
- Cl.Nc1ncccc1-c1nc2ccc(-c3cccnc3)nc2n1-c1ccc(C2(N)CCC2)cc1
4
- NCC(NC(=O)c1ccc(-c2c[nH]c3ncccc23)s1)c1ccccc1
5
- CCc1c(N)ncnc1N1CCC(c2nc(-c3ccnc(OC)c3)cn2CCN2CCC2)CC1
6
- NC1(c2ccc(-c3nc4cc(Cl)c(Cl)cn4c3-c3ccccc3)cc2)CCC1
7
- c1ccc(-c2cn3ccnc3nc2-c2ccc(CN3CCC(c4nc5cccnc5[nH]4)CC3)cc2)cc1
8
- NC1CCCN(c2ncnc3[nH]cc(Cl)c23)C1
9
- CNCCn1cc(-c2ccnc(C(F)(F)F)c2)nc1C1CCN(c2ncnc(N)c2C2C=NOC2)CC1
10
- CNCCn1cc(-c2ccc(F)c(C)c2)nc1C1CCN(c2ncnc(N)c2C#N)CC1
11
- COc1cccc2c1-c1nc(-c3ccc(C4(N)CC(O)(C5CC5)C4)cc3)c(-c3ccccc3)n1CO2
12
- NC1(C(=O)N2CCc3ccccc3C2)CCN(c2ncnc3[nH]ccc23)CC1
13
- Cc1n[nH]c2ccc(-c3nnc(NCC(N)Cc4ccc(C(F)(F)F)cc4)s3)cc12
14
- Nc1ccc(-c2nnc(C3CCN(Cc4ccc(-c5nc6nc(N7CCN(CCO)CC7)ncc6cc5-c5ccccc5)cc4)CC3)[nH]2)cn1
15
- Cc1ccc(-c2nc3c(C)nc(N)nc3n(C3CCC(O)CC3)c2=O)cn1
16
- NC1(c2ccc(-c3nc4n(c3-c3ccccc3)COc3nccnc3-4)cc2)CC(O)(C2CC2)C1
17
- Cn1cc(-c2cnc3c(-c4csc(C(=O)NC5CCCCC5N)c4)cnn3c2)cn1
18
- CC1Cc2c(F)cccc2N1C(=O)Cc1nc(N2CCOCC2)cc(=O)n1C
19
- CC(C)(C)NCCn1cc(-c2ccc(F)c(C(F)(F)F)c2)nc1C1CCN(c2ncnc(N)c2C(N)=O)CC1
20
- C=Cc1ccc2nc(-c3ccc(C4(N)CCC4)cc3)c(-c3ccccc3)n2c1
21
- NC(=O)c1ccc(NC2CNCCC2c2ccc(F)c(Cl)c2)c2cncnc12
22
- NC1(c2ccc(-c3nc4nc(O)ccn4c3-c3ccccc3)cc2)CCC1
23
- COC(=O)c1cnn2cc(-c3c(F)cccc3F)c(-c3ccc(CN4CCC(c5n[nH]c(-c6ccccn6)n5)CC4)cc3)nc12
24
- c1ccc(-c2cn3ccnc3nc2-c2ccc(CN3CCC(c4cnc5ccccc5n4)CC3)cc2)cc1
25
- Cc1c[nH]c2ncnc(Nc3ccccc3-c3nnc[nH]3)c12
26
- NC1(c2ccc(-c3nc4n(c3-c3ccccc3)COc3ccccc3-4)cc2)CCC1
27
- Cc1c(NCCN2CCCC2)cc(CCC(C)(C)C)cc1N1CCN(c2ncnc3[nH]nc(Br)c23)CC1
28
- N#Cc1ccc2nc(C3CCN(Cc4ccc(-c5nc6ccnn6cc5-c5ccccc5)cc4)CC3)[nH]c2c1
29
- NC1(C(=O)NC(Cc2ccccc2)c2ccc(Cl)cc2)CCN(c2ncnc3[nH]ccc23)CC1
30
- NC(=O)Nc1nnc2c3cc(-c4ccccc4)c(-c4ccc(CN5CCC(c6n[nH]c(-c7ccccn7)n6)CC5)cc4)nc3ccn12
31
- NC(=O)Nc1ccc2c(c1)C(=Cc1cc(-c3cccc(F)c3)c[nH]1)C(=O)N2
32
- CC(C)NCC(C(=O)N1CCN(c2ncnc3c2C(C)CC3F)CC1)c1ccc(Cl)cc1
33
- CS(=O)(=O)c1ccc(-c2cnc3cc(-c4ccccc4)c(-c4ccc(C5(N)CCC5)cc4)nn23)cc1
34
- CC(=O)Nc1nc2ccc(-c3ccnc(N(C)S(=O)(=O)c4ccc(C)cc4)n3)cc2s1
35
- CCOCCN(CC(O)CN1CCCC2(CC(=O)c3cc(O)ccc3O2)C1)S(=O)(=O)c1c(C)noc1C
36
- COCCNCCn1cc(-c2ccc(F)c(C(F)(F)F)c2)nc1C1CCN(c2ncnc(N)c2C#N)CC1
37
- Cc1cn2c(NC(C)C)c(-c3ccccc3)c(-c3ccc(CN4CC(c5n[nH]c(-c6ccccn6)n5)C4)cc3)nc2n1
38
- COc1nn2cc(-c3c(F)cccc3F)c(-c3ccc(CN4CCC(c5n[nH]c(-c6ccccn6)n5)CC4)cc3)nc2c1CO
39
- Cc1cccc(-c2nc(C3CN(Cc4ccc(-c5nc6nc(C)nn6cc5-c5ccc(F)cc5F)cc4)C3)n[nH]2)n1
40
- CC(C)NCC(Cc1ccccc1)NC(=O)c1cc(Br)c(-c2ccnc3[nH]ccc23)s1
41
- Cc1cccc(C)c1S(=O)(=O)N1CCCC1C(O)CN1CCCC2(CCN(c3ncnc(N)c3C3CC3)C2)C1
42
- Oc1nnc2c3cc(-c4ccccc4)c(-c4ccc(CN5CCC(c6n[nH]c(-c7ccccn7)n6)CC5)cc4)nc3ccn12
43
- Cc1cc(-c2cn(CCN3CCC3)c(C3CCN(c4ncnc(N)c4OC(C)C)CC3)n2)ccc1F
44
- CCc1c[nH]c2ncnc(N3CCC(N)(CNC(=O)c4ccc(F)cc4F)C3)c12
45
- Cc1cc(C)c(CC(N)COc2cncc(-c3ccc4[nH]nc(C)c4c3)c2)c(C)c1
46
- Nc1nccc(-c2ccc(C(=O)NCCc3ccc(Cl)cc3Cl)s2)n1
47
- NC(=O)Nc1nnc2c3cc(-c4ccccc4)c(-c4ccc(CN5CCC(c6n[nH]c(-c7ccccn7)n6)CC5)cc4)nc3ccn12
48
- CC1Cc2c(Cl)cccc2N1C(=O)Cc1nc(N2CCOCC2)cc(=O)[nH]1
49
- CC1(O)CC(N)(c2ccc(-c3nc4n(c3-c3ccccc3)COc3cccnc3-4)cc2)C1
50
- CCOC(=O)c1c(C)nc(NNC(=O)c2cccc3c(=O)c4ccccc4[nH]c23)nc1-c1ccc(F)cc1
51
- Cc1nc2ccccc2n1C1CCN(Cc2ccc(-c3nc4ncnc(N)c4cc3-c3ccccc3)cc2)CC1
52
- NC1(C(=O)NCc2ccc(Cl)cc2)CCN(c2ncnc3[nH]c(=O)[nH]c23)CC1
53
- Cc1nc2nc(-c3ccc(CN4CCC(c5n[nH]c(-c6ccccn6)n5)CC4)cc3)c(-c3cccc(F)c3)cn2n1
54
- CC(C)(C)c1ccc(CC2(N)CCN(c3ncnc4[nH]ccc34)CC2)cc1
55
- O=C(N1CCN(c2ncnc3[nH]nc(Br)c23)CC1)C1(c2ccc(Cl)cc2)CCNCC1
56
- NC1(COCc2ccc(Cl)cc2)CCN(c2ncnc3[nH]ccc23)CC1
57
- CC1CC(O)c2ncnc(N3CCN(C(=O)C(CNC4CCOCC4)c4ccc(Cl)c(F)c4)CC3)c21
58
- CN(C)C(=O)N1CCN(c2ncc3cc(-c4ccccc4)c(-c4ccc(CN5CCC(c6nnc(-c7ccccn7)[nH]6)CC5)cc4)nc3n2)CC1
59
- CSc1ncc2cc(-c3ccccc3)c(-c3ccc(CN4CCC(c5nc(-c6cccnc6)no5)CC4)cc3)nc2n1
60
- CC1Cc2c(cccc2C(F)(F)F)N1C(=O)Cc1nc(N2CCOCC2)cc(=O)[nH]1
61
- Nc1ncccc1-c1nc2ccc(-c3ccccc3)nc2n1-c1ccc(CNC(=O)c2ccccc2)cc1
62
- Cc1cc(-c2cn(CCNC(C)C)c(C3CCN(c4ncnc(N)c4C#N)CC3)n2)ccc1F
63
- NC1(c2ccc(-c3nc4c5cccc(Br)c5nn4cc3-c3ccccc3)cc2)CCC1
64
- NC1(c2ccc(-c3nn4c(-c5ccc(O)nc5)cnc4cc3-c3ccccc3)cc2)CCC1
65
- NC(COc1cncc(-c2ccc3c(c2)C(c2c[nH]c4ccccc24)C(=O)N3)c1)Cc1c[nH]c2ccccc12
66
- c1nc(N2CCc3[nH]cnc3C2)c2cc[nH]c2n1
67
- NCC(Cc1ccc(C(F)(F)F)cc1)NC(=O)c1cc(Br)c(-c2ccnc3[nH]ccc23)s1
68
- NC(Cc1ccc(Cl)cc1)C(=O)N1CCN(c2ccnc3ccccc23)CC1
69
- CCNc1nc(-c2ccoc2)c(-c2cnc3[nH]nc(C)c3n2)cc1OCC(N)Cc1ccccc1
70
- Cc1[nH]c(C=C2C(=O)Nc3ccc(NC(N)=O)cc32)c(C)c1CCC(=O)O
71
- NC(COc1cncc(-c2ccc3c(c2)CC(=O)N3)c1)Cc1c[nH]c2ccccc12
72
- CC(C)(Cc1cccs1)C1C(=O)Nc2ccc(-c3cncc(OCC(N)Cc4c[nH]c5ccccc45)c3)cc21
73
- Cn1ncc(Cl)c1-c1ccc(C(=O)NC2CNCCC2c2cccc(F)c2)cc1
74
- CC(=O)Nc1cccc(-c2ccc3nc(-c4cccnc4N)n(-c4ccc(CC(=O)Nc5cccc(F)c5)cc4)c3n2)c1
75
- Clc1ccc(C2(c3ccc(-c4ncnc5[nH]cnc45)cc3)CCNCC2)cc1
76
- NCC(Cc1cccc(F)c1)NC(=O)c1cc(Br)c(-c2ccnc3[nH]ccc23)s1
77
- NC(COc1cncc(C=Cc2ccncc2)c1)Cc1cccc2ccccc12
78
- CC1SCc2ncnc(N3CCN(C(=O)C(N)Cc4ccc(Cl)cc4)CC3)c21
79
- CCOc1cccc2c1-c1nc(-c3ccc(C4(N)CC(O)(C5CC5)C4)cc3)c(-c3ccccc3)n1CO2
80
- N#Cc1ncc(OCC(N)Cc2c[nH]c3ccccc23)cc1-c1ccc2cnccc2c1
81
- NC1(c2ccc(-c3nn4cccc4cc3-c3ccccc3)cc2)CCC1
82
- O=C(Cc1nc(N2CCOCC2)cc(=O)[nH]1)Nc1cccc2ocnc12
83
- N=C(c1ccccc1)n1c(=N)ccc2nc(-c3ccc(C4(N)CCC4)cc3)c(-c3ccccc3)cc21
84
- NC1(c2ccc(-c3nc4ccc(C(=O)O)cn4c3-c3ccccc3)cc2)CCC1
85
- CCOCCN(CC(O)CN1CCCC2(CCc3cc(C#N)ccc3O2)C1)S(=O)(=O)c1c(C)cccc1C
86
- COC(=O)c1cccn2c(-c3ccccc3)c(-c3ccc(C4(N)CCC4)cc3)nc12
87
- Cc1c(NCCN2CCCC2)cc(OCC(C)C)cc1N1CCN(c2ncnc3[nH]nc(Br)c23)CC1
88
- Cl.Cn1ncc(Br)c1-c1ccc(C(=O)NC2CNCCC2c2cccc(F)c2)s1
89
- O=C(Nc1csc(-c2nncn2C2CC2)n1)c1cc(-n2cnc(C3CC3)c2)c(N2CCC(C(F)(F)F)CC2)cn1
90
- CC(C)Oc1c(N)ncnc1N1CCC(c2nc(-c3ccc(F)c(C(F)(F)F)c3)cn2CC2CNC2)CC1
91
- Cn1ncc(Cl)c1-c1cc(C(=O)NC2CNCCC2c2ccc(F)c(F)c2)oc1Cl.O=C(O)C(O)C(O)C(=O)O
92
- Cc1ccn2c(-c3ccccc3)c(-c3ccc(C4(N)CCC4)cc3)nc2c1Cl
93
- Cn1cc(-c2cnc3c(-c4csc(C(=O)NC5CCCCC5N)c4)cnn3c2)cn1
94
- CNC(=O)CC1CC(c2ccc(F)c(F)c2)C(NC(=O)c2cc(-c3c(Cl)cnn3C)c(Cl)o2)CN1
95
- Cc1c[nH]c2ncnc(N3CCC(N)(CNC(=O)c4ccccc4)C3)c12
96
- Cc1n[nH]c2ccc(-c3cncc(OCC(N)Cc4ccc(C(F)(F)F)c(F)c4)c3)cc12
97
- COc1cccn2c(-c3ccccc3)c(-c3ccc(C4(N)CCC4)cc3)nc12
98
- Cc1cc(-c2cn(CCNC3CCCC3)c(C3CCN(c4ncnc(N)c4C(N)=O)CC3)n2)ccc1F
99
- CC(=O)NCCC1CC(c2ccc(F)c(F)c2)C(NC(=O)c2cc(-c3c(Cl)cnn3C)c(Cl)o2)CN1
100
- CC1(O)CC(N)(c2ccc(-c3nc4n(c3-c3ccccc3)COc3ccncc3-4)cc2)C1
101
- N#Cc1c(N)ncnc1N1CCC(c2nc(-c3cccc(F)c3)cn2CCN2CCCC2)CC1
102
- Cc1ccc(-c2cnc3cc(-c4ccccc4)c(-c4ccc(C5(N)CCC5)cc4)nn23)cc1
103
- NC1(c2ccc(-c3nc4cc(C(=O)O)ccn4c3-c3ccccc3)cc2)CCC1
104
- Cc1n[nH]c2ccc(-c3cncc(OCC(N)Cc4c[nH]c5ccccc45)c3)cc12
105
- O=C(Cc1nc(N2CCOCC2)cc(=O)[nH]1)Nc1ccc(F)cc1OCC1CCNCC1
106
- Cl.Cn1ncc(CO)c1-c1ccc(C(=O)NC2CNCCC2c2cccc(F)c2)s1
107
- CCc1c(N)ncnc1N1CCC(c2nc(-c3ccc(C(F)(F)F)cc3)cn2CCN2CCCC2)CC1
108
- CCN(CC)CCn1cc(-c2ccc(F)c(C)c2)nc1C1CCN(c2ncnc(N)c2OC(C)C)CC1
109
- COC1(C)CCN(c2cnc(C(=O)Nc3csc(-c4nncn4C4CC4)n3)cc2-n2cnc(C3CC3)c2)CC1
110
- CCn1c(-c2nonc2N)nc2c(C#CC(C)(C)O)ncc(OCCCNCCc3ccc(OC)cc3)c21
111
- CCCC(NC(=O)C(CCCNC(=N)N)NC(=O)CN(CCCN)C(=O)C(N)CCCNC(=N)N)C(=O)NC(Cc1ccc(O)cc1)C(=O)NC(CN)C(=O)NC(CCC(C)C)C(=O)N(CCCN)CC(N)=O
112
- N#Cc1ccn2c(-c3ccccc3)c(-c3ccc(C4(N)CCC4)cc3)nc2c1
113
- N#Cc1cc(Cl)c2nc(-c3ccc(C4(N)CCC4)cc3)c(-c3ccccc3)n2c1
114
- CCn1c(-c2nonc2N)nc2c(C#CC(C)(C)O)nc(OCC(N)c3ccccc3)cc21
115
- CN(C)CCN1CCN(c2ncc3cc(-c4ccccc4)c(-c4ccc(CN5CCC(c6nnc(-c7ccc(=O)[nH]c7)[nH]6)CC5)cc4)nc3n2)CC1
116
- Cc1n[nH]c2ccc(-c3nnc(NCC(N)Cc4cccc(F)c4)s3)cc12
117
- O=C(NC(c1ccccc1)C1CCNCC1)c1ccc2cnccc2c1
118
- O=C(Cc1nc(N2CCOCC2)cc(=O)[nH]1)Nc1ccccc1OCc1cccnc1
119
- CSc1ncc2cc(-c3ccccc3)c(-c3ccc(CN4CCC(n5ncc6c(N)ncnc65)CC4)cc3)nc2n1
120
- Cc1c[nH]c2ncnc(N3CCC(N)(CNC(=O)c4ccc(F)cc4)C3)c12
121
- CC1CN(c2cc(=O)[nH]c(CC(=O)N3CCc4c(F)cccc43)n2)CCO1
122
- CSc1nc2nc(-c3ccc(CN4CCC(c5n[nH]c(-c6ncccn6)n5)CC4)cc3)c(-c3ccccc3)cn2n1
123
- CCc1c(N)ncnc1N1CCC(c2nc(-c3ccc(F)c(C(F)(F)F)c3)cn2CC2CCN2)CC1
124
- NC(COc1cncc(-c2ccc3c(c2)CC(=O)N3)c1)Cc1ccccc1
125
- NC1(Cc2ccc3ccccc3c2)CCN(c2ncnc3[nH]ccc23)CC1
126
- Nc1ncnc(N2CCC(c3nc(-c4ccnc(C(F)(F)F)c4)cn3CCN3CCC3)CC2)c1Cl
127
- Cn1ncc(Cl)c1-c1cc(C(=O)NC2CNC(CC(=O)NC3CC3)CC2c2ccc(Cl)c(Cl)c2)oc1Cl
128
- O=c1[nH]c(-c2ccccc2)cn1C1CCN(Cc2ccc(-c3nnc4n3-c3cccnc3Nc3ccccc3-4)cc2)CC1
129
- CC(C)c1c(N)ncnc1N1CCC(c2nc(-c3ccc(F)c(C(F)(F)F)c3)cn2CCN2CCC2)CC1
130
- CN1CC(Cn2cc(-c3ccc(F)c(C(F)(F)F)c3)nc2C2CCN(c3ncnc(N)c3C(N)=O)CC2)C1
131
- Cn1cc(C(CN)c2cncc(C=Cc3ccncc3)c2)c2ccccc21
132
- Nc1ncnc(N2CCC(c3nc(-c4ccnc(C(F)(F)F)c4)cn3CCNCCCCl)CC2)c1Br
133
- N#Cc1c(N)ncnc1N1CCC(c2nc(-c3ccsc3)cn2CCN2CCC2)CC1
134
- CC1CN(c2ncc3cc(-c4ccccc4)c(-c4ccc(CN5CCC(c6nnc(-c7ccccn7)[nH]6)CC5)cc4)nc3n2)CC(C)O1
135
- CCC1(O)CC(N)(c2ccc(-c3nc4n(c3-c3ccccc3)COc3ccncc3-4)cc2)C1
136
- NC(Cc1ccc(C(F)(F)F)cc1)C(=O)N1CCN(c2ncnc3ccccc23)CC1
137
- Cn1ncc(Cl)c1-c1cc(C(=O)NC2(c3ccc(Cl)c(Cl)c3)CCNCC2)oc1Cl
138
- CC1CC(O)c2ncnc(N3CCN(C(=O)C(CNC(C)(C)CO)c4ccc(Cl)cc4)CC3)c21
139
- Cn1c(CC(=O)Nc2cccc(C3CC3)c2)nc(N2CCOCC2)cc1=O
140
- Nc1ncccc1-c1nc2cc(-c3cccnc3)cnc2n1-c1ccc(CNC(=O)c2cccc(F)c2)cc1
141
- NC1(C(=O)NC(c2ccccc2)c2ccc(Cl)cc2)CCN(c2ncnc3[nH]ccc23)CC1
142
- CC1Cc2ccccc2N1C(=O)Cc1nc(N2CCOCC2)cc(=O)n1C
143
- Cc1cc(O)cc2c1OC(C)(CCCC(C)C)CC2
144
- CC(C)CNC(c1ccc(Cl)cc1)c1ccc(-c2cn[nH]c2)cc1
145
- COc1ccc(COc2ccn3c(-c4ccccc4)c(-c4ccc(C5(N)CCC5)cc4)nc3n2)cn1
146
- Cc1n[nH]c2ccc(-c3cncc(OCC(N)Cc4c(F)cc(F)cc4F)c3)cc12
147
- CS(=O)(=O)N1CCN(Cc2cc3nc(-c4cccc5[nH]ncc45)nc(N4CCOCC4)c3s2)CC1
148
- NC1(c2ccc(-c3nc4ccc(Cl)cn4c3-c3ccccc3)cc2)CCC1
149
- O=C(Nc1csc(-c2nncn2C2CC2)n1)c1cc(-n2cnc(C3CC3)c2)c(N2CCN(CC(F)(F)F)CC2)cn1
150
- CN1CCC2(CC1)CN(c1ncnc3[nH]ccc13)c1ccccc12
151
- NC1(CNC(=O)c2ccc(Cl)cc2)CCN(c2ncnc3[nH]cc(Cl)c23)C1
152
- CCC(C)(C)NCCn1cc(-c2ccc(F)c(C(F)(F)F)c2)nc1C1CCN(c2ncnc(N)c2C(N)=O)CC1
153
- Fc1ccc(-c2cn3ccnc3nc2-c2ccc(CN3CCC(c4n[nH]c(-c5ccccn5)n4)CC3)cc2)cc1
154
- NCC1(c2ccc(Cl)cc2)CCN(c2ccnc3[nH]ccc23)CC1
155
- CC(C)Nc1nc2nc(-c3ccc(CN4CC(c5n[nH]c(-c6ccccn6)n5)C4)cc3)c(-c3ccccc3)cn2n1
156
- CC1CN(c2ncc3cc(-c4ccccc4)c(-c4ccc(CN5CCC(c6nnc(-c7ccccn7)[nH]6)CC5)cc4)nc3n2)CCN1
157
- CCCC(NC(=O)C(CCCNC(=N)N)NC(=O)CN(CCCN)C(=O)C(N)CCCNC(=N)N)C(=O)NC(Cc1ccc(O)cc1)C(=O)NC(CN)C(=O)NC(CCC(C)C)C(=O)N(CCCCN)CC(N)=O
158
- CC1CC(O)c2ncnc(N3CCN(C(=O)C(c4ccc(Cl)cc4)C4CCCCN4)CC3)c21
159
- Cc1c(OCC(N)Cc2c[nH]c3ccccc23)cncc1-c1ccc2cnccc2c1
160
- O=C(Nc1cccc(C(F)(F)F)c1)NC1CCN(Cc2ccc(-c3nnc4n3-c3cccnc3Nc3ccccc3-4)cc2)CC1
161
- NC1(c2ccc(-c3nn4c(Br)cnc4cc3-c3ccccc3)cc2)CCC1
162
- NC1(c2ccc(-c3nc4c5cc(-c6ccc(CO)cc6)ccc5nn4cc3-c3ccccc3)cc2)CCC1
163
- Cc1nc2nc(-c3ccc(CN4CCC(n5ncc6c(N)ncnc65)CC4)cc3)c(-c3ccccc3)cn2n1
164
- Cc1cc2cc(-c3nnc(NCC(N)Cc4ccc(C(F)(F)F)cc4)s3)ccc2cn1
165
- Cc1occc1-c1nc(N)c(OCC(N)Cc2c(Cl)[nH]c3ccccc23)cc1-c1cnc2[nH]nc(C)c2n1
166
- O=C(Cc1nc(N2CCOCC2)cc(=O)[nH]1)N1CCc2c1ccc(F)c2F
167
- OCCN1CCN(c2ccc3nc(-c4ccccc4)c(-c4ccc(CN5CCC(c6nnc(-c7ccccn7)[nH]6)CC5)cc4)nc3n2)CC1
168
- Cc1cc(-c2cn(CCNCC3CC3)c(C3CCN(c4ncnc(N)c4C#N)CC3)n2)ccc1F
169
- CSc1nc2nc(-c3ccc(CN4CCC(c5n[nH]c(-c6cnccn6)n5)CC4)cc3)c(-c3ccccc3)cn2n1
170
- CC(=O)Nc1cccc(-c2ccc3nc(-c4cccnc4N)n(-c4ccc(CC(=O)Nc5ccccc5)cc4)c3n2)c1
171
- O=C(Nc1cccc(F)c1)NC1CCN(Cc2ccc(-c3nnc4n3-c3cccnc3Nc3ccccc3-4)cc2)CC1
172
- NC(COc1cncc(-c2ccc3cnccc3c2)c1)Cc1c[nH]c2ccccc12
173
- Cc1cc(-c2ccc3nn4cc(-c5ccccc5)c(-c5ccc(C6(N)CCC6)cc5)nc4c3c2)[nH]n1
174
- CC(=O)Nc1cccc(-c2ccc3nc(-c4cccnc4N)n(-c4ccc(CN)cc4)c3n2)c1.Cl
175
- CCn1c(-c2nonc2N)nc2c(-c3ccccc3)ncc(OCCCN)c21
176
- Clc1ccc(C(NC2CC2)c2ccc(-c3cn[nH]c3)cc2)cc1
177
- COc1ccc(COc2ccn3c(-c4ccccc4)c(-c4ccc(C5(N)CCC5)cc4)nc3n2)cn1
178
- Cc1cc(-c2cccnc2)cn2c(-c3ccccc3)c(-c3ccc(C4(N)CCC4)cc3)nc12
179
- CC(NC(=O)C1(N)CCN(c2ncnc3[nH]ccc23)CC1)c1ccc(Cl)cc1
180
- Nc1cc(N2CCC(c3nc(-c4ccc(F)c(F)c4)cn3CCN3CCCC3)CC2)ncn1
181
- CNC1CC2OC(C)(C1OC)n1c3ccccc3c3c4c(c5c6ccccc6n2c5c31)C(=O)NC4
182
- COC(=O)c1cccc2c1nn1cc(-c3ccccc3)c(-c3ccc(C4(N)CCC4)cc3)nc21
183
- CN(C)C1CCN(C(=O)c2c[nH]c(C=C3C(=O)Nc4ccc(NC(N)=O)cc43)c2)C1
184
- CN1CCN(c2ncc3cc(-c4ccccc4)c(-c4ccc(CN5CCC(c6nnc(-c7ncccn7)[nH]6)CC5)cc4)nc3n2)CC1
185
- COC(=O)c1ccc(-c2cnc3cc(-c4ccccc4)c(-c4ccc(C5(N)CCC5)cc4)nn23)cn1
186
- NC(=O)c1c(N)ncnc1N1CCC(c2nc(-c3cnoc3)cn2CCN2CCC2)CC1
187
- NC1(Cc2cccc3ccccc23)CCN(c2ncnc3[nH]ccc23)CC1
188
- O=C(Cc1nc(N2CCOCC2)cc(=O)[nH]1)Nc1cccc2c1NCCC2
189
- CCCOc1ccn2c(-c3ccccc3)c(-c3ccc(C4(N)CCC4)cc3)nc2n1
190
- CCNC(=O)c1ccc2nc(-c3ccc(C4(N)CCC4)cc3)c(-c3ccccc3)n2c1
191
- CCOC(=O)c1cncc(-c2cnc3cc(-c4ccccc4)c(-c4ccc(C5(N)CCC5)cc4)nn23)c1
192
- O=C(Nc1ccc(-c2nnc3n2-c2cccnc2Nc2ccccc2-3)cc1)c1ccc(F)cc1
193
- CC1Cc2ccccc2N1C(=O)Cc1nc(N2CCOCC2)c(F)c(=O)n1C
194
- CC(C)C(C(=O)Nc1ccc(F)cc1)c1nc(N2CCOCC2)cc(=O)[nH]1
195
- Cn1c(CC(=O)N2CC(C)(C)c3c(Cl)cccc32)nc(N2CCOCC2)cc1=O
196
- Nc1ncnc(N2CCC(c3nc(-c4ccc(F)c(C(F)(F)F)c4)cn3CCN3CCCC3)CC2)c1-c1ccccc1F
197
- Cc1cc(-c2cn(CCNC(C)(C)C)c(C3CCN(c4ncnc(N)c4-c4cn[nH]c4)CC3)n2)ccc1F
198
- Cc1cc(C)c2nc(-c3ccc(C4(N)CCC4)cc3)c(-c3ccccc3)n2c1
199
- Cc1n[nH]c2cnc(-c3cncc(OCC(N)Cc4cccc(OCCN5CCOCC5)c4)c3)cc12
200
- O=C(Nc1csc(-c2nncn2C2CC2)n1)c1cc(-n2cnc(C3CC3)c2)c(N2CC(F)(F)C2)cn1
201
- CCc1cn2c(-c3ccccc3)c(-c3ccc(C4(N)CCC4)cc3)nc2c(OC)n1
202
- CNC(=O)Nc1ccc(CNc2c(C(=O)Nc3ccc(SC(F)(F)F)cc3)cnn2C)cn1
203
- Cc1occc1-c1nc(N)c(OCC(N)Cc2c[nH]c3ccccc23)cc1-c1ccc2[nH]nc(C)c2c1
204
- COC(=O)c1cnn2cc(-c3c(F)cccc3F)c(-c3ccc(CN4CC(c5n[nH]c(-c6cccc(C)n6)n5)C4)cc3)nc12
205
- Cc1c[nH]c2ncnc(N3CCN(C(=O)C(N)Cc4ccc(Cl)cc4)CC3)c12
206
- NC(COc1cncc(-c2ccc3c(c2)C(=Cc2ccc[nH]2)C(=O)N3)c1)Cc1ccccc1
207
- O=C(Nc1ccc(-c2nnc3n2-c2cccnc2Nc2ccccc2-3)cc1)c1cccnc1
208
- CCn1c(-c2nonc2N)nc2cncc(CNC3CCNCC3)c21
209
- NC1(c2ccc(-c3nc4c(F)cccn4c3-c3ccccc3)cc2)CCC1
210
- NC1(c2ccc(-c3nnc4n3-c3cccnc3Nc3ccccc3-4)cc2)CCCCC1
211
- COc1ncc(-c2cc3c(C)nc(N)nc3n(C3CCC(OCO)CC3)c2=O)cc1F
212
- NC1(Cc2c(Cl)cccc2Cl)CCN(c2ncnc3[nH]ccc23)CC1
213
- NC(COc1cncc(C=Cc2ccncc2)c1)Cc1ccc2ccccc2c1
214
- O=C(Cc1nc(N2CCOCC2)cc(=O)[nH]1)N1CCc2c(OC(F)F)cccc21
215
- O=C(Nc1ccc(-c2nnc3n2-c2cccnc2Nc2ccccc2-3)cc1)c1ccc(C(F)(F)F)cc1
216
- O=c1ccc(-c2nnc(C3CCN(Cc4ccc(-c5nc6nc(N7CCN(CCO)CC7)ncc6cc5-c5ccccc5)cc4)CC3)[nH]2)c[nH]1
217
- Cc1cc(-c2cn(CCN(C)C)c(C3CCN(c4ncnc(N)c4-c4cn[nH]c4)CC3)n2)ccc1F
218
- COC1CCN(c2cnc(C(=O)Nc3csc(-c4nncn4C(C)C(F)(F)F)n3)cc2-n2cnc(C3CC3)c2)C1
219
- Cc1c(O)cc2c(c1C)OC(C)(CCCC(C)C)CC2
220
- CC(=O)Nc1cccc(-c2ccc3nc(-c4cccnc4N)n(-c4ccc(CC(=O)Nc5ccccc5)cc4)c3n2)c1
221
- CCN(CCn1cc(-c2ccc(F)c(C)c2)nc1C1CCN(c2ncnc(N)c2OC(C)C)CC1)C(C)C
222
- COc1cc(C(N)=O)cn2c(-c3ccccc3)c(-c3ccc(C4(N)CCC4)cc3)nc12
223
- NC1(c2ccc(-c3nc4n(c3-c3ccsc3)COc3cccc(F)c3-4)cc2)CC(O)(C2CC2)C1
224
- CC1(O)CC(N)(c2ccc(-c3nc4n(c3-c3ccccc3)COc3ccc(C(N)=O)cc3-4)cc2)C1
225
- Cc1n[nH]c2ccc(-c3cc(OCC(N)Cc4c[nH]c5cccnc45)cnc3-c3ccoc3)cc12
226
- O=C(Cc1nc(N2CCOC(CF)C2)cc(=O)[nH]1)N1CCc2c(F)cccc21
227
- CCONC(=O)c1ccc2c(c1)OCn1c-2nc(-c2ccc(C3(N)CC(C)(O)C3)cc2)c1-c1ccccc1
228
- O=c1[nH]c2ccccc2n1C1CCN(Cc2ccc(-c3nc4cc5[nH]cnc5cc4nc3-c3ccccc3)cc2)CC1
229
- NC(CNc1nnc(-c2ccc3[nH]ncc3c2)s1)Cc1ccc(Cl)cc1Cl
230
- CC(C)Oc1c(N)ncnc1N1CCC(c2nc(-c3ccc(F)c(C(F)(F)F)c3)cn2CCN(C)C)CC1
231
- O=C(Cc1nc(N2CCOCC2)cc(=O)[nH]1)Nc1cccc(Br)c1
232
- COC(=O)COc1ccn2c(-c3ccccc3)c(-c3ccc(C4(N)CCC4)cc3)nc2c1
233
- COc1nn2cc(-c3c(F)cccc3F)c(-c3ccc(CN4CC(c5n[nH]c(-c6ccccn6)n5)C4)cc3)nc2c1CO
234
- NC1CCN(c2ncnc3[nH]ncc23)CC1
235
- CC(C)(Cc1ncc[nH]1)C1C(=O)Nc2ccc(-c3cncc(OCC(N)Cc4c[nH]c5ccccc45)c3)cc21
236
- NC(=O)Nc1ccc2c(c1)C(=Cc1cc(-c3ccccc3)c[nH]1)C(=O)N2
237
- CC1CN(C(=O)Cc2nc(N3CCOCC3)cc(=O)[nH]2)c2ccccc21
238
- NCC1(Cc2ccc(Cl)cc2)CCN(c2ncnc3[nH]cnc23)CC1
239
- CC1SCc2ncnc(N3CCN(C(=O)C(N)Cc4ccc(Cl)cc4)CC3)c21
240
- CN1CCN(c2ncc3cc(-c4ccccc4)c(-c4ccc(CN5CCC(c6nnc(-c7ccccn7)[nH]6)CC5)cc4)nc3n2)CC1
241
- COc1cc2ncnc3c2cc1OCCCCCN(C)Cc1ccc(Br)cc1N3
242
- Cc1c(C(=O)O)c[nH]c1C=C1C(=O)Nc2ccc(NC(N)=O)cc21
243
- Cc1noc(C)c1S(=O)(=O)NCC(O)CN1CCCC2(CCN(c3ncnc(N)c3C3CC3)C2)C1
244
- O=c1[nH]ccc2nc(-c3ccc(CN4CCC(c5nnc(-c6ncccn6)[nH]5)CC4)cc3)c(-c3ccccc3)cc12
245
- NC1(c2ccc(-c3nc4c(Br)cccn4c3-c3ccccc3)cc2)CCC1
246
- Nc1ccc(C2OC3CC(=O)OC3C3=C2C(=O)c2ccccc2C3=O)cc1
247
- Nc1ncccc1-c1nc2cccnc2n1-c1ccc(CNC(=O)Cc2ccccc2)cc1
248
- CC1CC(O)c2ncnc(N3CCN(C(=O)C(c4ccc(Cl)cc4)C4CCCCN4)CC3)c21
249
- Cc1occc1-c1nc(N)c(OCC(N)Cc2c(C(N)=O)[nH]c3ccccc23)cc1-c1cnc2[nH]nc(C)c2n1
250
- Cc1n[nH]c2ccc(-c3cncc(OCC(N)Cc4cc(Cl)cc(Cl)c4)c3)cc12
251
- CC(C)(C)c1ccc(CNC(=O)C2(N)CCN(c3ncnc4[nH]ccc34)CC2)cc1
252
- CCCC(NC(=O)C(CCCNC(=N)N)NC(=O)CN(CCCCN)C(=O)C(N)CCCNC(=N)N)C(=O)NC(Cc1ccc(O)cc1)C(=O)NC(CN)C(=O)NC(CCC(C)C)C(=O)N(CCN)CC(N)=O
253
- Cc1n[nH]c2ccc(-c3cc(OCC(N)Cc4ccccc4)cnc3-c3cc(F)ccc3O)cc12
254
- Cc1[nH]nc2ccc(-c3cc(OCC(N)Cc4c[nH]c5ccccc45)cnc3-c3ccoc3)cc12
255
- Cn1nccc1-c1ccc(C(=O)NC2CNCCC2c2ccc(F)c(F)c2)cc1
256
- Nc1ccccc1-c1nc2ccc(-c3ccccc3)nc2n1-c1ccc(C2(N)CCC2)cc1
257
- CCOCCN(CC(O)CN1CCCC2(CCc3cc(F)ccc3O2)C1)S(=O)(=O)c1c(C)cccc1C
258
- CC(C)(Cc1ccc(Cl)cc1Cl)C1C(=O)Nc2ccc(-c3cncc(OCC(N)Cc4c[nH]c5ccccc45)c3)cc21
259
- O=c1[nH]c2ccccc2n1C1CCN(Cc2ccc(-c3ncc(-c4nnn[nH]4)cc3-c3ccccc3)cc2)CC1
260
- Cl.NCc1ccc(-n2c(-c3cccnc3N)nc3cc(Br)cnc32)cc1
261
- N#Cc1c(N)ncnc1N1CCC(c2nc(-c3ccc(F)cc3)cn2CCN2CCCC2)CC1
262
- Cc1n[nH]c2ccc(-c3cncc(OCC(N)Cc4ccccc4)c3)cc12
263
- Cc1noc(C2CCCN(Cc3ccc(-c4nnc5n4-c4cccnc4Nc4ccccc4-5)cc3)C2)n1
264
- NC1(C(=O)NC(CCCO)c2ccc(Cl)cc2)CCN(c2ncnc3[nH]ccc23)CC1
265
- CC(C)NCC(C(=O)N1CCN(c2ncnc3c2C(C)CC3)CC1)c1ccc(Cl)cc1
266
- O=c1[nH]c2ccccc2n1C1CCN(Cc2ccc(-c3nc4cc5cn[nH]c5cc4nc3-c3ccccc3)cc2)CC1
267
- O=C(Cc1nc(N2CCOCC2)cc(=O)[nH]1)Nc1cccc(OC(F)(F)F)c1
268
- O=C(Cc1nc(N2CCOCC2)cc(=O)[nH]1)Nc1cccc(C(F)(F)F)c1O
269
- Cc1n[nH]c2ccc(-c3cncc(OCC(N)Cc4ccc(Cl)c(Cl)c4)c3)cc12
270
- CCNc1nc(-c2ccoc2)c(-c2cnc3[nH]nc(C)c3n2)cc1OCC(N)Cc1ccccc1
271
- NCC(Cc1ccccc1)NC(=O)c1cc(Br)c(-c2ccnc3[nH]ccc23)s1
272
- CCn1c(-c2nonc2N)nc2c(C#CC(C)(C)O)nc(CCN)cc21
273
- Cc1cccc(-c2nc(C3CCN(Cc4ccc(-c5nc6nccn6cc5-c5ccc(F)cc5)cc4)CC3)n[nH]2)n1
274
- Sc1nnc2c3cc(-c4ccccc4)c(-c4ccc(CN5CCC(c6n[nH]c(-c7ccccn7)n6)CC5)cc4)nc3ccn12
275
- O=C1Cc2cc(NC(COc3cncc(-c4ccc5c(c4)CC(=O)N5)c3)Cc3c[nH]c4ccccc34)ccc2N1
276
- Cc1c(Cl)cn2c(-c3ccccc3)c(-c3ccc(C4(N)CCC4)cc3)nc2c1Br
277
- c1ccc(OC2CCCN(Cc3ccc(-c4nnc5n4-c4cccnc4Nc4ccccc4-5)cc3)C2)cc1
278
- O=C(Cc1nc(N2CCOCC2)cc(=O)[nH]1)Nc1ccc(F)c(Br)c1
279
- CCCC(NC(=O)C(CCCNC(=N)N)NC(=O)CN(CCCN)C(=O)C(N)CCCNC(=N)N)C(=O)NC(Cc1ccc(O)cc1)C(=O)NC(CN)C(=O)NC(CCC(C)C)C(=O)N(CCN)CC(N)=O
280
- C=Cc1c(N)ncnc1N1CCC(c2nc(-c3ccc(C(F)(F)F)cc3)cn2CCN2CCCC2)CC1
281
- NC1(C(=O)NC(CCO)c2ccc(Cl)cc2)CCN(c2ncnc3[nH]ccc23)CC1
282
- CC1CC(O)c2ncnc(N3CCN(C(=O)C(c4ccc(Cl)cc4)C4COCCN4)CC3)c21
283
- Cn1ncc(Cl)c1-c1cc(C(=O)NC2CNC(CCN3CCCC3)CC2c2ccc(Cl)c(Cl)c2)oc1Cl
284
- CNCCn1cc(-c2ccc(F)c(C)c2)nc1C1CCN(c2ncnc(N)c2-c2cn[nH]c2)CC1
285
- Cc1n[nH]c2ccc(-c3cncc(OCC(N)Cc4ccccc4)c3)cc12
286
- NC1(c2ccc(-c3nc4cc(-c5cn[nH]c5)ccn4c3-c3ccccc3)cc2)CCC1
287
- CC(C)NCC(C(=O)N1CCN(c2ncnc3c2C(C)CC3)CC1)c1ccc(Cl)cc1
288
- CCNCC(Cc1ccccc1)NC(=O)c1cc(Br)c(-c2ccnc3[nH]ccc23)s1
289
- N#Cc1c(N)ncnc1N1CCC(c2nc(-c3ccc(F)c(C(F)(F)F)c3)cn2CCO)CC1
290
- CCN(CCn1cc(-c2ccc(F)c(C)c2)nc1C1CCN(c2ncnc(N)c2C#N)CC1)C(C)C
291
- Cn1ncc(Cl)c1-c1cc(C(=O)NC(CN)Cc2cccc(F)c2)oc1Cl
292
- CC1Cc2c(ccc(F)c2Cl)N1C(=O)Cc1nc(N2CCOCC2)cc(=O)[nH]1
293
- CC(C)CCCC1(C)CCc2ccc(O)c(O)c2O1
294
- O=C(Nc1csc(-c2nncn2C2CC2)n1)c1cc(-n2cnc(C3CC3)c2)c(N2CCCOCC2)cn1
295
- Cn1cc(-c2c(Cl)cnn2C)cc1C(=O)NC1CNCCC1c1ccc(Cl)cc1
296
- CCc1c(N)ncnc1N1CCC(c2nc(-c3ccc(C(F)(F)F)c(F)c3)cn2CCN2CCCC2)CC1
297
- O=C(Nc1csc(-c2nncn2C2CC2)n1)c1cc(-n2cnc(C3CC3)c2)c(N2CCN(C3CC3)CC2)cn1
298
- Nc1ccc(-n2c(-c3cccnc3N)nc3cccnc32)cc1
299
- CC1(O)CC(O)(c2ccc(-c3nc4n(c3-c3ccccc3)COc3cc(C(=O)O)ccc3-4)cc2)C1
300
- NC1(C(=O)NC(CCO)c2ccc(Cl)cc2)CCN(c2ncnc3[nH]ccc23)CC1
301
- COC(=O)c1ccc(-c2cnc3cc(-c4ccccc4)c(-c4ccc(C5(N)CCC5)cc4)nn23)cn1
302
- NC1(c2ccc(-n3c(-c4ccnnc4)nc4ccc(-c5cccc(N6CCOCC6)c5)nc43)cc2)CCC1
303
- Cc1n[nH]c2ccc(-c3cncc(OCC(N)CC4CCCCC4)c3)cc12
304
- Nc1nnc2c3cc(-c4ccccc4)c(-c4ccc(CN5CCC(n6cnc(-c7ccccn7)c6)CC5)cc4)nc3ccn12
305
- NC1CCN(c2ncnc3[nH]cc(Cl)c23)C1
306
- CCNCCn1cc(-c2ccc(F)c(C(F)(F)F)c2)nc1C1CCN(c2ncnc(N)c2OC(C)C)CC1
307
- CCc1c(N)ncnc1N1CCC(c2nc(-c3ccc(F)c(C(F)(F)F)c3)cn2CCN2CC(F)C2)CC1
308
- Cc1n[nH]c2ccc(-c3nnc(NCC(N)Cc4ccc(Cl)cc4Cl)s3)cc12
309
- Cc1n[nH]c2ccc(-c3cncc(OCCCC(N)Cc4ccccc4)c3)cc12
310
- Cn1cccc1CC(C)(C)C1C(=O)Nc2ccc(-c3cncc(OCC(N)Cc4c[nH]c5ccccc45)c3)cc21
311
- NCC(c1ccccc1)c1cncc(C=Cc2ccncc2)c1
312
- O=c1[nH]ccc2nc(-c3ccc(CN4CCC(c5nnc(-c6cnccn6)[nH]5)CC4)cc3)c(-c3ccccc3)cc12
313
- O=C(Cc1nc(N2CCOCC2)cc(=O)[nH]1)N1CCOc2c(Cl)cccc21
314
- CCn1c(-c2nonc2N)nc2c(C#CC(C)(C)O)ncc(OCC3CCNC3)c21
315
- C=Cc1ccn2c(-c3ccccc3)c(-c3ccc(C4(N)CCC4)cc3)nc2c1
316
- CN(C)CCn1cc(-c2ccc(F)c(C(F)(F)F)c2)nc1C1CCN(c2ncnc(N)c2CCC2CC2)CC1
317
- Cn1ncc(Br)c1-c1ccc(C(=O)NC2CNCCC2c2ccc(Cl)cc2)o1
318
- NC1(c2ccc(-c3nc4ccc(C(=O)NCCF)cn4c3-c3ccccc3)cc2)CCC1
319
- CN(Cc1ccc(Cl)cc1)C(=O)C1(N)CCN(c2ncnc3[nH]ccc23)CC1
320
- CC1Cc2c(ccc(F)c2F)N1C(=O)Cc1nc(N2CCOCC2)cc(=O)[nH]1
321
- CC(=O)Nc1nc2ccc(-c3cnc(Cl)c(NS(=O)(=O)c4ccccc4Cl)c3)cc2s1
322
- COc1cc(C(N)=O)cn2c(-c3ccccc3)c(-c3ccc(C4(N)CCC4)cc3)nc12
323
- Nc1ncccc1-c1nc2ccc(-c3cccnc3)nc2n1-c1ccc(CC(=O)Nc2ccccc2)cc1
324
- Cn1ccc(S(=O)(=O)NCc2ccc(-c3nnc4n3-c3cccnc3Nc3ccccc3-4)cc2)n1
325
- CCn1c(-c2nonc2N)nc2c(C#CC(C)(C)O)ncc(OCC3CCNCC3)c21.O=C(O)C(F)(F)F
326
- CCONC(=O)c1ccc2c(c1)OCn1c-2nc(-c2ccc(C3(N)CC(C)(O)C3)cc2)c1-c1ccccc1
327
- NC(COc1cncc(-c2ccc3cnc(-c4ccccc4)cc3c2)c1)Cc1c[nH]c2ccccc12
328
- CSc1ncc2cc(-c3ccccc3)c(-c3ccc(CN4CCC(c5nnc(-c6ccccn6)[nH]5)CC4)cc3)nc2n1
329
- Nc1ncnc(N2CCC(c3nc(-c4ccnc(C(F)(F)F)c4)cn3CCNC3CC3)CC2)c1Cl
330
- Cl.NCc1ccc(-n2c(-c3cccnc3N)nc3ccc(-c4cn[nH]c4)nc32)cc1
331
- Cn1c(=O)[nH]c2ccc(-c3cnc(NCC(N)Cc4ccc(C(F)(F)F)cc4)s3)cc21
332
- Fc1ccc(-c2cn3ccnc3nc2-c2ccc(CN3CC(c4n[nH]c(-c5ccccn5)n4)C3)cc2)cc1
333
- Cc1cc(CC(N)COc2cncc(-c3cc4c(C)n[nH]c4cn3)c2)ccc1F
334
- CN(C)CCn1cc(-c2ccc(F)c(C(F)(F)F)c2)nc1C1CCN(c2ncnc(N)c2C(N)=O)CC1
335
- Cc1n[nH]c2ccc(-c3cncc(OCC(N)Cc4cccc(Br)c4)c3)cc12
336
- Cc1ccn2c(-c3ccccc3)c(-c3ccc(C4(N)CCC4)cc3)nc2c1
337
- O=C(C(CNC1CCCC1)c1ccc(Cl)cc1)N1CCN(c2ncnc3sc4c(c23)CCC4)CC1
338
- CC(n1cnnc1-c1nc(NC(=O)c2cc(-n3cnc(C4CC4)c3)c(N3CCN(CC4CC4)CC3)cn2)cs1)C(F)(F)F
339
- CC(O)CNC(=O)c1cccc(-c2ccc3nc(-c4cccnc4N)n(-c4ccc(C5(N)CCC5)cc4)c3n2)c1
340
- CC(=C1C(=O)Nc2ccc(NC(N)=O)cc21)c1cc(CNC(=O)C2CCNCC2)c[nH]1
341
- Cc1n[nH]c2ccc(-c3nnc(NCC(N)Cc4ccc(F)c(F)c4)s3)cc12
342
- NC1(c2ccc(-c3nc4ccc(C(=O)NC5CC5)cn4c3-c3ccccc3)cc2)CCC1
343
- CS(=O)(=O)NCCC(NC(=O)C1(N)CCN(c2ncnc3[nH]ccc23)CC1)c1ccc(Cl)cc1
344
- NC(COc1cnc(Cl)c(-c2ccc3cnccc3c2)c1)Cc1c[nH]c2ccccc12
345
- CC1CC(O)c2ncnc(N3CCN(C(=O)C(CNCC4CC4)c4ccc(Cl)cc4)CC3)c21
346
- NC(CNc1cncc(Oc2cccc3cnccc23)c1)Cc1c[nH]c2ccccc12
347
- CC1CC(O)c2ncnc(N3CCN(C(=O)C(CNCC4CCOCC4)c4ccc(Cl)cc4)CC3)c21
348
- Nc1ncnc(N2CCC(c3nc(-c4cccc(F)c4)cn3CCN3CCCC3)CC2)c1Cl
349
- NC(=O)c1c(N)ncnc1N1CCC(c2nc(-c3cccc(C(F)(F)F)c3)cn2CCN2CCCC2)CC1
350
- NC(COc1cnc2ccc(-c3ccncc3)cc2c1)Cc1c[nH]c2ccccc12
351
- NC(=O)c1c(N)ncnc1N1CCC(c2nc(-c3cccc(F)c3)cn2CCN2CCCC2)CC1
352
- NC(=O)c1cn2c(-c3ccccc3)c(-c3ccc(C4(N)CCC4)cc3)nc2cn1
353
- NC1(C(=O)NC(CCN2CCCC2)c2ccc(Cl)cc2)CCN(c2ncnc3[nH]ccc23)CC1
354
- Cc1cnc2c(c1)OCn1c-2nc(-c2ccc(C3(N)CC(C)(O)C3)cc2)c1-c1ccccc1
355
- NC(=O)C=Cc1ccn2c(-c3ccccc3)c(-c3ccc(C4(N)CCC4)cc3)nc2c1
356
- NC(CNc1nnc(-c2ccc3cnccc3c2)s1)Cc1ccc(C(F)(F)F)cc1
357
- CCn1c(-c2nonc2N)nc2cncc(OC3CCNCC3)c21
358
- NC(COc1cncc(-c2ccc3[nH]nc(N4CCOCC4)c3c2)c1)Cc1c[nH]c2ccccc12
359
- Cc1nc2nc(-c3ccc(CN4CC(c5n[nH]c(-c6ccccn6)n5)C4)cc3)c(-c3ccccc3)cn2c1Br
360
- CCn1c(-c2nonc2N)nc2c(C#CC(C)(C)O)nc(OC(CCN)c3ccccc3)cc21
361
- CC(=O)N1CCN(c2ncc3cc(-c4ccccc4)c(-c4ccc(CN5CCC(c6nnc(-c7ccccn7)[nH]6)CC5)cc4)nc3n2)CC1
362
- c1ccc(-c2cc(-c3nn[nH]n3)cnc2-c2ccc(CNCc3ccc(-c4csnn4)cc3)cc2)cc1
363
- COc1ccc(-c2nnc(C3CCN(Cc4ccc(-c5nc6cc[nH]c(=O)c6cc5-c5ccccc5)cc4)CC3)[nH]2)cn1
364
- CCCC1NC(=O)C(CCCNC(=N)N)NC(=O)CN(C(=O)C(N)CCCNC(=N)N)CCCCCCNC(=O)NCCCCN(CC(N)=O)C(=O)C(CCC(C)C)NC(=O)C(CN)NC(=O)C(Cc2ccc(O)cc2)NC1=O
365
- CN(C)CCN1CCN(c2ncc3cc(-c4ccccc4)c(-c4ccc(CN5CCC(c6nnc(-c7cc[n+]([O-])cc7)[nH]6)CC5)cc4)nc3n2)CC1
366
- Nc1ncnc(N2CCC(c3nc(-c4ccc(F)c(C(F)(F)F)c4)cn3CCN3CCCC3)CC2)c1-c1ccc(F)cc1
367
- Cc1cccc(-c2nc(C3CN(Cc4ccc(-c5nc6nc(C7CC7)nn6cc5-c5ccccc5)cc4)C3)n[nH]2)n1
368
- Cc1occc1-c1nc(N)c(OCC(N)Cc2c[nH]c3ccccc23)cc1-c1cc2c(C)n[nH]c2cn1
369
- CC(C)NCC(C(=O)N1CCN(c2ncnc3c2C(C)CC3F)CC1)c1ccc(Cl)cc1
370
- CC(C)(C)c1ccc(CC2(N)CCN(c3ncnc4[nH]c(=O)[nH]c34)CC2)cc1
371
- Cc1n[nH]c2ccc(-c3cncc(OCC(N)Cc4cccc(OCCC5CCNCC5)c4)c3)cc12
372
- Cc1n[nH]c2cnc(-c3cncc(OCC(N)Cc4cccc(OCC5CCNCC5)c4)c3)cc12
373
- COC(=O)c1cccc(-c2cnc3cc(-c4ccccc4)c(-c4ccc(C5(N)CCC5)cc4)nn23)c1
374
- Cc1c[nH]c2ncnc(N3CCC(N)(CNC(=O)c4ccc(F)cc4)C3)c12
375
- NC1(c2ccc(-c3nn4c(-c5ccc(F)c(CO)c5)cnc4cc3-c3ccccc3)cc2)CCC1
376
- Cc1n[nH]c2ccc(-c3nnc(NCC(N)Cc4ccccc4Cl)s3)cc12
377
- NC1(c2ccc(-c3nc4ccn5c(=O)[nH]nc5c4cc3-c3ccccc3)cc2)CCC1
378
- Nc1ncccc1-c1nc2ccc(-c3cccnc3)nc2n1-c1ccc(CNC(=O)c2ccccc2)cc1
379
- CCOCCN(CC(O)CN1CCCC2(CCN(c3ncnc(N)c3C3CC3)C2)C1)S(=O)(=O)c1c(C)cccc1C
380
- O=C(NC(c1ccc(Cl)c(F)c1)C1CCNCC1)c1ccc2cnccc2c1
381
- O=C(Cc1nc(N2CCOCC2)cc(=O)[nH]1)Nc1ccccc1OCCN1CCCC1
382
- Cc1cccn2c(-c3ccccc3)c(-c3ccc(C4(N)CCC4)cc3)nc12
383
- Nc1ncccc1-c1nc2ccc(Nc3ccccc3)nc2n1-c1ccc(C2(N)CCC2)cc1
384
- NC(=O)COc1ccn2c(-c3ccccc3)c(-c3ccc(C4(N)CCC4)cc3)nc2c1
385
- CCc1c(N)ncnc1N1CCC(c2nc(-c3ccc(C(F)(F)F)c(F)c3)cn2CCN2CCCC(F)(F)C2)CC1
386
- CCCC1OC2CC(=O)OC2C2=C1C(=O)c1c(OC)cccc1C2=O
387
- CC(NC(=O)C1(N)CCN(c2ncnc3[nH]ccc23)CC1)c1ccc(Cl)cc1
388
- CC(C)NCC(C(=O)N1CCN(c2ncnc3c2C(C)CC3O)CC1)c1ccc(Cl)cc1
389
- Cc1c(CCC(=O)O)c[nH]c1C=C1C(=O)Nc2ccc(NC(N)=O)cc21
390
- NC(COc1cncc(-c2ccc3cnccc3c2)c1)Cc1cnc2ccccc2c1
391
- O=C(Cc1nc(N2CCOCC2)cc(=O)[nH]1)N1CCc2c(-c3ccccc3)cccc21
392
- NC1(c2ccc(-c3nc4c5ccc(-c6ccc(O)nc6)cc5nn4cc3-c3ccccc3)cc2)CCC1
393
- NC1(c2ccc(-c3nc4ccc5nnc(C6NCNN6)n5c4cc3-c3ccccc3)cc2)CC(O)(C2CC2)C1
394
- CC(NC(=O)C1(N)CCN(c2ncnc3[nH]ccc23)CC1)c1ccccc1
395
- NC(COc1cncc(-c2cc3c(Cl)n[nH]c3cn2)c1)Cc1cccc(C(F)(F)F)c1
396
- Cc1cccc(-c2nc(C3CN(Cc4ccc(-c5nc6ncc(C)n6cc5-c5ccccc5)cc4)C3)n[nH]2)n1
397
- NC1(C(=O)NC(CCCN2CCCCC2)c2ccc(Cl)cc2)CCN(c2ncnc3[nH]ccc23)CC1
398
- COc1ccc(-c2nc3c(C)nc(N)nc3n(C3CCC(O)CC3)c2=O)cn1
399
- O=C1NCN(c2ccccc2)C12CCN(Cc1ccc(-c3nnc4n3-c3cccnc3Nc3ccccc3-4)cc1)CC2
400
- O=C1Nc2ccc(NC(COc3cncc(-c4ccc5c(c4)C(=O)C(=O)N5)c3)Cc3c[nH]c4ccccc34)cc2C1=O
401
- CC(C)CCCC1(C)CCc2cc(N)ccc2O1
402
- CN(C)CCCC(NC(=O)C1(N)CCN(c2ncnc3[nH]ccc23)CC1)c1ccc(Cl)cc1
403
- O=c1ccc(-c2nnc(C3CCN(Cc4ccc(-c5nc6cc[nH]c(=O)c6cc5-c5ccccc5)cc4)CC3)[nH]2)c[nH]1
404
- Cn1ncc(Cl)c1-c1cc(C(=O)NC2CNC(CCO)CC2c2ccc(Cl)c(Cl)c2)oc1Cl
405
- CC(C)NCC(Cc1ccc(Cl)cc1)C(=O)N1CCN(c2ncnc3c2C(C)SC3)CC1
406
- COc1cc(-c2cn(CCN3CCC3)c(C3CCN(c4ncnc(N)c4C#N)CC3)n2)ccc1F
407
- c1ccc(-c2cc3cnccc3nc2-c2ccc(CN3CCC(c4n[nH]c(-c5ccccn5)n4)CC3)cc2)cc1
408
- O=C(Cc1nc(N2CCOCC2)cc(=O)[nH]1)N1CCOc2ccccc21
409
- Cc1c[nH]c2ncnc(N3CC4(CCNCC4)c4ccccc43)c12
410
- Cn1ncc(Cl)c1-c1cc(C(=O)NC2CNCCC2c2ccc(Cl)c(C(F)(F)F)c2)sc1Cl
411
- Cn1c(CC(=O)Nc2ccc(F)c(C(F)F)c2)nc(N2CCOCC2)cc1=O
412
- C=CC1CCc2ncnc(N3CCN(C(=O)C(CNC(C)C)c4ccc(Cl)cc4)CC3)c21
413
- COc1ccc(CC(N)C(=O)N2CCN(c3ncnc4ccccc34)CC2)cc1
414
- Nc1ccc2nc(-c3ccc(C4(N)CCC4)cc3)c(-c3ccccc3)n2c1
415
- CC1=NN(C(=O)c2ccc(Cl)cc2)C(=O)C1N=Nc1ccc(S(=O)(=O)Nc2ncccn2)cc1
416
- CC(=O)Nc1cccc(-c2ccc3nc(-c4cccnc4N)n(-c4ccc(CC(=O)Nc5cccc(F)c5)cc4)c3n2)c1
417
- CCOCCN(CC(O)CN1CCCC2(CCN(c3ncnc4ccccc34)C2)C1)S(=O)(=O)c1c(C)cccc1C
418
- Cc1cc(-c2ccc3nc(-c4ccc(C5(N)CCC5)cc4)c(-c4ccccc4)n3c2)[nH]n1
419
- c1ccc(-c2cc3c(ccn4cnnc34)nc2-c2ccc(CN3CCC(c4n[nH]c(-c5ccccn5)n4)CC3)cc2)cc1
420
- Cc1ccnc2c1nc(-c1cccnc1N)n2-c1ccc(CN)cc1.Cl
421
- O=c1[nH]c2ccccc2n1C1CCN(Cc2ccc(-c3nc4ccc(-n5cnnn5)cc4nc3-c3ccccc3)cc2)CC1
422
- CCCCCCCCCCCCCCCC(=O)OCC(COP(=O)(O)OC1C(O)C(OP(=O)(O)O)C(OP(=O)(O)O)C(OP(=O)(O)O)C1O)OC(=O)CCCCCCCCCCCCCCC
423
- COc1ncc(-c2cc3c(C)nc(N)nc3n(C3CCC(OCC(N)=O)CC3)c2=O)cc1F
424
- CC(C)CCCC1(C)CCc2cc(S(N)(=O)=O)cc(F)c2O1
425
- CCc1cnn(C)c1-c1ccc(C(=O)NC2CNCCC2c2cccc(F)c2)s1.Cl
426
- Cl.Nc1cccc(-c2ccc3nc(-c4cccnc4N)n(-c4ccc(C5(N)CCC5)cc4)c3n2)c1
427
- CC(C)NCC(C(=O)N1CCN(c2ncnc3c2C(C)CC3O)CC1)c1ccc(Cl)cc1
428
- CCn1c(-c2nonc2N)nc2c(C#CC(C)(C)O)ncc(OCC3CCNCC3)c21.O=C(O)C(F)(F)F
429
- Cc1c[nH]c2ncnc(N3CCC(N)(CNC(=O)c4cccc(F)c4)C3)c12
430
- CNC(=O)CC1CC(c2ccc(Cl)c(Cl)c2)C(NC(=O)c2cc(-c3c(Cl)cnn3C)c(Cl)o2)CN1
431
- NC(COc1cncc(-c2ccc3cnccc3c2)c1)Cc1c[nH]c2ccccc12
432
- COc1ncc(-c2cc3c(C)nc(N)nc3n(C3CCC(OCC(N)=O)CC3)c2=O)cc1F
433
- Cc1cccc(-c2nc(C3CN(Cc4ccc(-c5nc6nc(C)c(Br)n6cc5-c5ccccc5)cc4)C3)n[nH]2)n1
434
- Cc1cccc(-c2nc(C3CN(Cc4ccc(-c5nc6nc(C)nn6c(C)c5-c5ccccc5)cc4)C3)n[nH]2)n1
435
- Nc1ncnc(N2CCC(c3nc(-c4ccnc(C(F)(F)F)c4)cn3CCN3CCC3)CC2)c1Br
436
- Cc1n[nH]c2ncc(-c3cc(OCC(N)Cc4c[nH]c5ccccc45)c(C#N)nc3-c3ccoc3)nc12
437
- [C-]#[N+]c1ccc(C(=O)N2CCN(Cc3ccc(-c4nnc5n4-c4cccnc4Nc4ccccc4-5)cc3)CC2)cc1
438
- Cc1cnc2cc(-c3ccccc3)c(-c3ccc(C4(N)CCC4)cc3)nn12
439
- N=C(c1ccccc1)n1c(=N)ccc2nc(-c3ccc(C4(N)CCC4)cc3)c(-c3ccccc3)cc21
440
- c1ccc(-c2nnc(C3CCN(Cc4ccc(-c5nnc6n5-c5cccnc5Nc5ccccc5-6)cc4)CC3)o2)cc1
441
- NC1(c2ccc(-c3nc4cc(C(=O)NO)ccn4c3-c3ccccc3)cc2)CCC1
442
- OCCN1CCN(c2ncc3cc(-c4ccccc4)c(-c4ccc(CN5CCC(c6nnc(-c7ccccn7)[nH]6)CC5)cc4)nc3n2)CC1
443
- CCc1c(N)ncnc1N1CCC(c2nc(-c3ccnc(C)c3)cn2CCN2CCC2)CC1
444
- Nc1ncccc1-c1nc2ccc(-c3cccnc3)nc2n1-c1ccc(CC(=O)Nc2ccccc2)cc1
445
- CCc1c(N)ncnc1N1CCC(c2nc(-c3ccc(F)c(F)c3)cn2CCN2CC(F)C2)CC1
446
- CC(C)CNC(c1ccc(Cl)cc1)c1ccc(-c2ncnc3[nH]cnc23)cc1
447
- CCc1cc(OC)c2nc(-c3ccc(C4(N)CCC4)cc3)c(-c3ccccc3)n2c1
448
- Cl.Nc1ncccc1-c1nc2ccc(-c3ccccc3)nc2n1-c1ccc(C2(N)CCC2)cc1
449
- CCc1c(N)ncnc1N1CCC(c2nc(-c3ccnc(C(F)(F)F)c3)cn2CCN2CCC2)CC1
450
- Nc1ncnc(N2CCC(c3nc(-c4ccc(F)c(C(F)(F)F)c4)cn3CCN3CCCCC3)CC2)c1-c1ccccc1F
451
- CC(=O)Nc1nc2ccc(-c3cnc(Cl)c(NS(=O)(=O)c4cccc(Cl)c4)c3)cc2s1
452
- N#Cc1c(N)ncnc1N1CCC(c2nc(-c3ccc(OC(F)F)cc3)cn2CCN2CCC2)CC1
453
- CCOC(=O)c1cccn2c(-c3ccccc3)c(-c3ccc(C4(N)CCC4)cc3)nc12
454
- NCC(Cc1ccc(Cl)cc1)C(=O)N1CCN(c2ncnc3[nH]ccc23)CC1
455
- COc1nc(C(N)=O)cn2c(-c3ccccc3)c(-c3ccc(C4(N)CCC4)cc3)nc12
456
- O=S(=O)(NCCNCCOCc1ccc(Cl)cc1)c1cccc2cnccc12
457
- Cn1ncc(Cl)c1-c1cc(C(=O)NC(CN)Cc2cccc(F)c2)oc1Cl
458
- Cn1c(CC(=O)N2CCc3c(F)cccc32)nc(N2CCOCC2)cc1=O
459
- COC(=O)COc1ccc2nc(-c3ccc(C4(N)CCC4)cc3)c(-c3ccccc3)n2c1
460
- CSc1ncc2cc(-c3ccccc3)c(-c3ccc(CNCCc4nnc(N)s4)cc3)nc2n1
461
- Cc1nc2nc(-c3ccc(CN4CCC(c5n[nH]c(-c6cc(Cl)ccn6)n5)CC4)cc3)c(-c3ccc(F)cc3F)cn2n1
462
- CC(NC(=O)C1(N)CCN(c2ncnc3[nH]ccc23)CC1)c1ccc(S(C)(=O)=O)cc1
463
- CCc1n[nH]c2ncnc(N3CCN(c4cc(Cl)cc(NCCN5CCCC5)c4C)CC3)c12
464
- Cc1n[nH]c2cnc(-c3cc(OCC(N)Cc4c[nH]c5ccccc45)cnc3-c3ccoc3)cc12
465
- CCn1c(-c2nonc2N)nc2c(C#CC(C)(C)O)ncc(OC3CCNCC3)c21.O=C(O)C(F)(F)F
466
- COc1ccccc1C(=O)N1CCN(Cc2ccc(-c3nnc4n3-c3cccnc3Nc3ccccc3-4)cc2)CC1
467
- Cn1ncc(Cl)c1-c1cc(C(=O)NC2CNCCC2c2ccccc2)oc1Cl.O=C(O)C(O)C(O)C(=O)O
468
- Cc1nc2nc(-c3ccc(CN4CCC(c5n[nH]c(-c6ccccn6)n5)C4)cc3)c(-c3ccccc3)cn2n1
469
- O=C(Nc1csc(-c2nncn2C2CC2)n1)c1cc(-n2cnc(C3CC3)c2)c(N2CCC(F)CC2)cn1
470
- Cl.Nc1cccc(-c2ccc3nc(-c4cccnc4N)n(-c4ccc(C5(N)CCC5)cc4)c3n2)c1
471
- NC1(c2ccc(-c3nc4n(c3-c3ccccc3)COc3cccc(F)c3-4)cc2)CCC1
472
- Cc1n[nH]c2ccc(-c3cc(OCC(N)Cc4c[nH]c5cc(F)ccc45)cnc3-c3ccoc3)cc12
473
- COC(=O)c1cc(Cl)c2nc(-c3ccc(C4(N)CCC4)cc3)c(-c3ccccc3)n2c1
474
- Nc1ncccc1-c1nc2cc(C3CCCC3)cnc2n1-c1ccc(CNC(=O)c2ccccc2)cc1
475
- Cc1n[nH]c2ccc(-c3cncc(OCC(N)Cc4ccc(Cl)c(F)c4)c3)cc12
476
- Cn1c(CC(=O)Nc2ccc(F)c(C3CC3)c2)nc(N2CCOCC2)cc1=O
477
- NCC1(Cc2ccc(Cl)cc2)CCN(c2ncnc3[nH]ccc23)CC1
478
- NC1(c2ccc(-c3nn4c(-c5ccc(S(N)(=O)=O)cc5)cnc4cc3-c3ccccc3)cc2)CCC1
479
- CN(C)C(=O)c1cccc(-c2ccc3nc(-c4ccccc4)n(-c4ccc(C5(N)CCC5)cc4)c3n2)c1
480
- CSc1ncc2cc(-c3ccccc3)c(-c3ccc(CN4CCC(c5nnc(N)s5)CC4)cc3)nc2n1
481
- NC(Cc1ccc(F)c(F)c1)C(=O)N1CCN(c2ncnc3ccccc23)CC1
482
- CC1CC(O)c2ncnc(N3CCN(C(=O)C(c4ccc(Cl)cc4)C4COCCN4)CC3)c21
483
- CCC1(O)CC(N)(c2ccc(-c3nc4n(c3-c3ccccc3)COc3cccnc3-4)cc2)C1
484
- CC(NC(=O)C1(N)CCN(c2ncnc3[nH]ccc23)CC1)c1ccc(Cl)cc1
485
- Cc1ccc(-c2nc3c(C)nc(N)nc3n(C3CCOCC3)c2=O)cn1
486
- CN(C)CCN1CCN(c2ncc3cc(-c4ccccc4)c(-c4ccc(CN5CCC(c6nnc(-c7ncccn7)[nH]6)CC5)cc4)nc3n2)CC1
487
- Nc1cc2cc(-c3cncc(OCC(N)Cc4c[nH]c5ccccc45)c3)ccc2cn1
488
- CC1CN(c2ncc3cc(-c4ccccc4)c(-c4ccc(CN5CCC(c6nnc(-c7ccccn7)[nH]6)CC5)cc4)nc3n2)CC(C)O1
489
- O=C(Cc1nc(N2CCOCC2)cc(=O)[nH]1)Nc1ccccc1F
490
- Cc1cc(O)cc2c1NC(C)(CCCC(C)C)CC2
491
- CCn1c(-c2nonc2N)nc2c(C#CC(C)(C)O)nc(OCC(C)N)cc21
492
- CC1Cc2c(Br)cccc2N1C(=O)Cc1nc(N2CCOCC2)c(F)c(=O)n1C
493
- Cc1n[nH]c2cnc(-c3cncc(OCC(N)Cc4ccc(F)c(F)c4F)c3)cc12
494
- Nc1ncnc(N2CCC(c3nc(-c4cccc(F)c4)cn3CCN3CCCC3)CC2)c1Br
495
- CCOC(=O)c1cncc(-c2cnc3cc(-c4ccccc4)c(-c4ccc(C5(N)CCC5)cc4)nn23)c1
496
- CC(C)CCCC1(C)CCc2cc(O)cc(Br)c2O1
497
- CC1CC(O)c2ncnc(N3CCN(C(=O)C(CNCC4CCOCC4)c4ccc(Cl)cc4)CC3)c21
498
- CC(C)NCC(Cc1ccc(Cl)c(F)c1)C(=O)N1CCN(c2ncnc3c2C(C)SC3)CC1
499
- CCOC(=O)c1cccn2c(-c3ccccc3)c(-c3ccc(C4(N)CCC4)cc3)nc12
500
- COc1cc2ncc3c(N)nc(-c4cncc(OCC(N)Cc5ccccc5)c4)cc3c2cc1OC
501
- Nc1ncnc(N2CCC(c3nc(-c4ccc(F)c(C(F)(F)F)c4)cn3CCN3CCC3)CC2)c1Cl
502
- COc1c(N)ncnc1N1CCC(c2nc(-c3ccc(F)c(C)c3)cn2CC2CNC2)CC1
503
- Cn1nccc1-c1oc(C(=O)NC2CNCCC2c2ccc(Cl)cc2)cc1Br
504
- NC(=O)Nc1ccc2nc(-c3ccc(C4(N)CCC4)cc3)c(-c3ccccc3)n2c1
505
- Cc1ccc(S(=O)(=O)NC2(c3ccc(-c4nnc5n4-c4cccnc4Nc4ccccc4-5)cc3)CCC2)cc1
506
- NC(=O)C1Cc2ccccc2N1C(=O)Cc1nc(N2CCOCC2)cc(=O)[nH]1
507
- NC1(c2ccc(-c3nc4nc(-n5ccccc5=O)ccn4c3-c3ccccc3)cc2)CCC1
508
- c1ccc(-c2cc3cccnc3nc2-c2ccc(CN3CCC(c4n[nH]c(-c5ccccn5)n4)CC3)cc2)cc1
509
- NC1(c2ccc(-c3nc4n(c3-c3ccccc3)COc3ncccc3-4)cc2)CCC1
510
- NC1(c2ccc(-c3nc4cc(-c5ccncc5)ccn4c3-c3ccccc3)cc2)CCC1
511
- CNc1nccc(-c2ccc(C(=O)NC(CN)Cc3ccc(Cl)cc3Cl)s2)n1
512
- CCn1c(-c2nonc2N)nc2c(C#CC(C)(C)O)nc(OCCN)cc21
513
- CC1Cc2cc(F)c(F)cc2N1C(=O)Cc1nc(N2CCOCC2)cc(=O)n1C
514
- O=c1[nH]c2ccccc2n1C1CCN(Cc2ccc(-c3nc4cc(-n5cnnn5)ccc4nc3-c3ccccc3)cc2)CC1
515
- COc1cc(COc2ccn3c(-c4ccccc4)c(-c4ccc(C5(N)CCC5)cc4)nc3n2)ccn1
516
- O=S(=O)(NCc1ccc(-c2nnc3n2-c2cccnc2Nc2ccccc2-3)cc1)c1cccnc1
517
- NC1(c2ccc(-c3nc4ccc(-c5cnc[nH]5)cn4c3-c3ccccc3)cc2)CCC1
518
- C#Cc1cc(NC(=O)Cc2nc(N3CCOCC3)cc(=O)[nH]2)ccc1F
519
- CCc1c[nH]c2ncnc(N3CCC(N)(CNC(=O)c4ccc(F)cc4F)C3)c12
520
- COC(=O)COc1ccc2nc(-c3ccc(C4(N)CCC4)cc3)c(-c3ccccc3)n2c1
521
- CC1OC2OC(=O)OC2C2=C1C(=O)c1c(O)cccc1C2=O
522
- Cc1n[nH]c2ccc(-c3cncc(OCC(N)Cc4ccc5c(c4)OCO5)c3)cc12
523
- Nc1cc(N2CCC(c3nc(-c4ccc(F)c(C(F)(F)F)c4)cn3CCN3CCC3)CC2)ncn1
524
- Nc1cc2cc(-c3cncc(OCC(N)Cc4c[nH]c5ccccc45)c3)ccc2cn1
525
- CCc1c(N)ncnc1N1CCC(c2nc(-c3ccc(F)c(C(F)(F)F)c3)cn2CCN)CC1
526
- CNc1c(-c2ccccc2)c(-c2ccc(CN3CC(c4n[nH]c(-c5ccccn5)n4)C3)cc2)nc2nc(C)cn12
527
- Cc1n[nH]c2ccc(-c3cc(OCC(N)Cc4ccccc4)cnc3-c3ccccc3O)cc12
528
- NC(=O)c1cc(Br)cn2c(-c3ccccc3)c(-c3ccc(C4(N)CCC4)cc3)nc12
529
- N#Cc1c(N)ncnc1N1CCC(c2nc(-c3ccccc3)cn2CCN2CCC2)CC1
530
- C=CC1CCc2ncnc(N3CCN(C(=O)C(CNC(C)C)c4ccc(Cl)cc4)CC3)c21
531
- N#CCc1ccc(-n2cnc3cnc4ccc(C#Cc5cccnc5)cc4c32)cc1
532
- CN(C)CC(NC(=O)c1cc(Br)c(-c2ccnc3[nH]ccc23)s1)c1ccccc1
533
- CCn1c(-c2nonc2N)nc2c(C#CC3CC3)ncc(OCCCN)c21
534
- CC1Cc2cc(F)c(F)cc2N1C(=O)Cc1nc(N2CCOCC2)c(F)c(=O)[nH]1
535
- CCn1c(-c2nonc2N)nc2c(C#CC(C)(C)O)nc(CN3CCCC3)cc21
536
- NC(COc1cncc(-c2ccc3cnccc3c2)c1)Cc1c[nH]c2ccccc12
537
- Cc1n[nH]c2ccc(-c3cc(OCC(N)Cc4ccccc4)cnc3-c3ccccc3)cc12
538
- O=C(Cc1nc(N2CCOCC2)cc(=O)[nH]1)Nc1cccc(OC(F)F)c1
539
- CCC1CN(c2cc(=O)[nH]c(CC(=O)Nc3ccc(F)cc3)n2)CCO1
540
- CC1(O)CC(N)(c2ccc(-c3nc4n(c3-c3ccccc3)COc3ccc(-c5ccn[nH]5)cc3-4)cc2)C1
541
- NC1(c2ccc(-n3c(-c4ncccn4)nc4ccc(-c5cccc(N6CCOCC6)c5)nc43)cc2)CCC1
542
- NC1(c2ccc(-c3nn4c(-c5ccc(F)cc5)cnc4cc3-c3ccccc3)cc2)CCC1
543
- CC(=O)Nc1nc2ccc(-c3ccnc(N(C)S(=O)(=O)c4ccccc4F)n3)cc2s1
544
- CNc1nc2nc(-c3ccc(CN4CC(c5n[nH]c(-c6ccccn6)n5)C4)cc3)c(-c3ccccc3)cn2n1
545
- N#Cc1c(N)ncnc1N1CCC(c2nc(-c3cccc(C(F)(F)F)c3)cn2CCN2CCCC2)CC1
546
- Cc1cccc(NC(=O)Cc2nc(N3CCOCC3)cc(=O)[nH]2)c1F
547
- Cn1c(CC(=O)Nc2ccc(F)c(Br)c2)nc(N2CCOCC2)cc1=O
548
- Cn1cnc(S(=O)(=O)NCc2ccc(-c3nnc4n3-c3cccnc3Nc3ccccc3-4)cc2)c1
549
- O=C(Cc1nc(N2CCOCC2)cc(=O)[nH]1)N1CCc2cc(Cl)ccc21
550
- Cc1cc(-c2cn[nH]c2)cn2c(-c3ccccc3)c(-c3ccc(C4(N)CCC4)cc3)nc12
551
- Nc1ncccc1-c1nc2ccc(-c3cccnc3)nc2n1-c1ccc(CNC(=O)c2ccccc2)cc1
552
- Nc1ncccc1-c1nc2ccc(Sc3ccccc3)nc2n1-c1ccc(C2(N)CCC2)cc1
553
- CSc1nc2nc(-c3ccc(CN4CC(c5n[nH]c(-c6ccccn6)n5)C4)cc3)c(-c3ccccc3)cn2n1
554
- CC(=O)Nc1cccc(-c2ccc3nc(-c4cccnc4N)n(-c4ccc(C5(N)CCC5)cc4)c3n2)c1.Cl
555
- NC1(C(=O)NCc2ccc(F)cc2)CCN(c2ncnc3[nH]ccc23)CC1
556
- NCC1CCN(c2ncnc3[nH]ccc23)CC1
557
- O=S(=O)(NCc1ccc(-c2nnc3n2-c2cccnc2Nc2ccccc2-3)cc1)c1ccc(Cl)cc1
558
- Cn1ncc(Cl)c1-c1cc(C(=O)NC2CNC(CCN3CCC(O)CC3)CC2c2ccc(Cl)c(Cl)c2)oc1Cl
559
- CCc1nc2cnc3ccc(C#Cc4cccnc4)cc3c2n1-c1ccc(CC#N)cc1
560
- Cc1occc1-c1nc(N)c(OCC(N)Cc2c(C#N)[nH]c3ccccc23)cc1-c1cnc2[nH]nc(C)c2n1
561
- COc1cccc(-c2c[nH]c(C=C3C(=O)Nc4ccc(NC(N)=O)cc43)c2)c1
562
- Nc1ncnc(N2CCC(c3nc(-c4ccc(F)c(Cl)c4)cn3CCN3CCCC3)CC2)c1C1CCC1
563
- O=C(N1CCN(c2ncnc3[nH]cc(Cl)c23)CC1)C1(c2ccc(Cl)c(Cl)c2)CCNCC1
564
- NC(=O)c1cccc(-c2cnc3cc(-c4ccccc4)c(-c4ccc(C5(N)CCC5)cc4)nn23)c1
565
- CC1(O)CC(N)(c2ccc(-c3nc4n(c3-c3ccccc3)COc3ccccc3-4)cc2)C1
566
- CCn1c(-c2nonc2N)nc2cncc(CNC3CCNCC3)c21
567
- O=C(Cc1nc(N2CCOCC2)cc(=O)[nH]1)N1CCc2cc(F)ccc21
568
- Cn1ncc(Cl)c1-c1cc(C(=O)NC2CNCCC2c2cccc(C(F)(F)F)c2)oc1Cl
569
- CCOCCN(CC(O)CN1CCCC2(CCN(c3ncnc4[nH]nc(C)c34)C2)C1)S(=O)(=O)c1c(C)cccc1C
570
- CCc1cc(-c2cn(CCN3CCC3)c(C3CCN(c4ncnc(N)c4C#N)CC3)n2)ccn1
571
- Nc1ncnc(Cl)c1-c1nc2ccc(-c3ccccc3)nc2n1-c1ccc(C2(N)CCC2)cc1
572
- NC1(c2ccc(-c3nc4n(c3-c3ccccc3)COc3ncccc3-4)cc2)CCC1
573
- NC1(c2ccc(-c3nc4ccccc4cc3-c3ccccc3)cc2)CCC1
574
- CNc1c(-c2ccccc2)c(-c2ccc(CN3CC(c4n[nH]c(-c5ccccn5)n4)C3)cc2)nc2nc(C)cn12
575
- COC1CCN(c2cnc(C(=O)Nc3csc(-c4nncn4C4CC4)n3)cc2-n2cnc(C3CC3)c2)C1
576
- NC(Cc1ccc(Cl)cc1)C(=O)N1CCN(c2ncnc3ccc(-c4ccccc4)cc23)CC1
577
- NC1(c2ccc(-c3nn4c(-c5ccc(CO)cc5)cnc4cc3-c3ccccc3)cc2)CCC1
578
- N#Cc1c(N)ncnc1N1CCC(c2nc(-c3ccnc(C(F)(F)F)c3)cn2CCN2CCC2)CC1
579
- COC(=O)CCc1ccn2c(-c3ccccc3)c(-c3ccc(C4(N)CCC4)cc3)nc2c1
580
- NC(=O)c1c(N)ncnc1N1CCC(c2nc(-c3ccc(F)c(C(F)(F)F)c3)cn2CCNC2CCCC2)CC1
581
- CCn1c(-c2nonc2N)nc2c(C#CC(C)(C)O)nc(CNC)cc21
582
- N#CC1CN(c2cnc(C(=O)Nc3csc(-c4nncn4C4CC4)n3)cc2-n2cnc(C3CC3)c2)C1
583
- CCn1c(-c2nonc2N)nc2c(C#CC(C)(C)O)nc(OCCCN)cc21
584
- Cn1ncc(Cl)c1-c1cc(C(=O)NC2CNC(CCNS(C)(=O)=O)CC2c2ccc(F)c(F)c2)oc1Cl
585
- Cc1n[nH]c2cnc(-c3cncc(OCC(N)Cc4c[nH]c5ccccc45)c3)cc12
586
- CSc1ncc2cc(-c3ccccc3)c(-c3ccc(CNCCc4nc[nH]n4)cc3)nc2n1
587
- CCC1C(=O)Nc2ccc(-c3cncc(OCC(N)Cc4c[nH]c5ccccc45)c3)cc21
588
- Nc1nc(O)nc2nc(-c3ccc(CN4CCC(n5c(=O)[nH]c6ccccc65)CC4)cc3)c(-c3ccccc3)cc12
589
- CN(C)C(=O)c1ccc2c(c1)OCn1c-2nc(-c2ccc(C3(N)CC(C)(O)C3)cc2)c1-c1ccccc1
590
- CC(=O)c1ccn2c(-c3ccccc3)c(-c3ccc(C4(N)CCC4)cc3)nc2c1
591
- NC1(c2ccc(-c3nc4cc(-c5cc[nH]n5)ccn4c3-c3ccccc3)cc2)CCC1
592
- CC1SCc2ncnc(N3CCN(C(=O)C(N)Cc4ccc(F)cc4)CC3)c21
593
- CCc1ccn2c(-c3ccccc3)c(-c3ccc(C4(N)CCC4)cc3)nc2c1
594
- O=C(NC(c1cccc(Cl)c1Cl)C1CCNCC1)c1ccc2cnccc2c1
595
- Nc1ncnc(N2CCC(c3nc(-c4ccc(F)c(C(F)(F)F)c4)c[nH]3)CC2)c1Br
596
- Cc1nnc(-c2ccccc2Nc2ncnc3[nH]ccc23)[nH]1
597
- O=C(NCc1ccc(-c2nnc3n2-c2cccnc2Nc2ccccc2-3)cc1)c1ccccc1
598
- C=CC(=O)C(Cc1ccccc1)NC(=O)OC(C)(C)C
599
- Cc1ccc(-c2ccc3nn4cc(-c5ccccc5)c(-c5ccc(C6(N)CCC6)cc5)nc4c3c2)cc1
600
- NC1(C(=O)NC(c2ccc(Cl)cc2)C2CC2)CCN(c2ncnc3[nH]ccc23)CC1
601
- Cn1c(CC(=O)N2CCc3c(Cl)cccc32)nc(N2CCOCC2)cc1=O
602
- CC(C)(Cc1ccccn1)C1C(=O)Nc2ccc(-c3cncc(OCC(N)Cc4c[nH]c5ccccc45)c3)cc21
603
- COc1ncc(-c2cc3c(C)nc(N)nc3n(C3CCC(OCO)CC3)c2=O)cn1
604
- COCCC(NC(=O)C1(N)CCN(c2ncnc3[nH]ccc23)CC1)c1ccc(Cl)cc1
605
- CCOC(=O)c1c(C)nc(NNC(=O)c2cccc3c(=O)c4ccccc4[nH]c23)nc1-c1ccc(OC)cc1
606
- CCOCCN(CC(O)CN1CCCC2(CC(=O)c3cc(O)ccc3O2)C1)S(=O)(=O)c1ccccc1Cl
607
- NC1(C(=O)NC(c2ccccc2)c2ccc(Cl)cc2)CCN(c2ncnc3[nH]ccc23)CC1
608
- CC(C)NCC(C(=O)N1CCN(c2ncnc3sc4c(c23)CCC4)CC1)c1ccc(Br)cc1
609
- CC(C)(Cc1ccccc1)C1C(=O)Nc2ccc(-c3cncc(OCC(N)Cc4c[nH]c5ccccc45)c3)cc21
610
- CSc1ncc2cc(-c3ccccc3)c(-c3ccc(CN4CCC(NC(=O)CNC(C)=O)CC4)cc3)nc2n1
611
- Cc1n[nH]c2cnc(-c3cc(OCC(N)Cc4c[nH]c5ccccc45)cnc3-c3ccoc3)nc12
612
- NC(COc1cncc(-c2ccc3[nH]ncc3c2)c1)Cc1c[nH]c2ccccc12
613
- NC1(Cc2ccc(Cl)cc2Cl)CCN(c2ncnc3[nH]ccc23)CC1
614
- CCc1c(N)ncnc1N1CCC(c2nc(-c3ccc(C(F)(F)F)c(F)c3)cn2CCN2CCC(F)CC2)CC1
615
- NCC(Cc1ccccc1)NC(=O)c1cc(Br)c(-c2ccnc3[nH]ccc23)s1
616
- CCn1c(-c2nonc2N)nc2c(C#CC(C)(C)O)nc(CCN)cc21
617
- Cc1ccc2c(c1)Nc1ncccc1-n1c(-c3ccc(C4(N)CCC4)cc3)nnc1-2
618
- CC(C)NCC(C(=O)N1CCN(c2ncnc3c2C(CF)CC3)CC1)c1ccc(Cl)cc1
619
- O=C(Cc1nc(N2CCOCC2)cc(=O)n1C1CC1)N1CCc2c(Cl)cccc21
620
- CC1Cc2ccccc2N1C(=O)Cc1nc(N2CCOC(CF)C2)cc(=O)[nH]1
621
- NC1(c2ccc(-c3nc4nc(Oc5ccccc5)ccn4c3-c3ccccc3)cc2)CCC1
622
- NC(=O)c1c(N)ncnc1N1CCC(c2nc(-c3ccc(F)c(C(F)(F)F)c3)cn2CCN2CCCC2)CC1
623
- CCNC(=O)Nc1ccc(CNc2ncsc2C(=O)Nc2ccc3c(c2)OC(F)(F)O3)cn1
624
- COc1nn2cc(-c3c(F)cccc3F)c(-c3ccc(CN4CC(c5n[nH]c(-c6cccc(C)n6)n5)C4)cc3)nc2c1CO
625
- Cc1cnc2nc(-c3ccc(C4(N)CCC4)cc3)c(-c3ccccc3)n2c1
626
- CC1Cc2ccccc2N1C(=O)Cc1nc(N2CCOCC2)cc(=S)[nH]1
627
- CN1CCN(c2nccc3nc(-c4ccc(CN5CCC(c6nnc(-c7ccccn7)[nH]6)CC5)cc4)c(-c4ccccc4)cc23)CC1
628
- CC(C)(C)c1cccc(NC(=O)Cc2nc(N3CCOCC3)cc(=O)[nH]2)c1
629
- CCn1c(-c2nonc2N)nc2c(C#CC(C)(C)O)nc(OCC(N)Cc3ccccc3)cc21
630
- CC(C)CNCCn1cc(-c2ccc(F)c(C(F)(F)F)c2)nc1C1CCN(c2ncnc(N)c2C#N)CC1
631
- O=C(O)c1ccc2nc(-c3ccccc3)c(-c3ccc(CN4CCC(n5c(=O)[nH]c6ccccc65)CC4)cc3)nc2c1
632
- CC1COCCN1c1nc(N2CCOCC2C)c2ccc(-c3ccc4[nH]nc(N)c4c3)nc2n1
633
- CCn1c(-c2nonc2N)nc2c(-c3ccoc3)ncc(OCCCN)c21
634
- Cc1occc1-c1nc(N)c(OCC(N)Cc2c(Cl)[nH]c3ccccc23)cc1-c1cnc2[nH]nc(C)c2n1
635
- Cn1c(CC(=O)N2CCc3ccccc32)nc(N2CCOCC2)cc1=O
636
- CC1SCc2ncnc(N3CCN(C(=O)C(N)Cc4ccc(F)c(F)c4)CC3)c21
637
- CC(=O)Nc1nc2ccc(-c3cnc(Cl)c(NS(=O)(=O)c4ccc(C(C)(C)C)cc4)c3)cc2s1
638
- NC1(CNC(=O)c2ccc(Cl)cc2)CCN(c2ncnc3[nH]ccc23)CC1
639
- NC(=O)C1(c2ccc(-n3c(-c4cccnc4N)nc4ccc(-c5ccccc5)nc43)cc2)CCC1
640
- Cl.Nc1ncccc1-c1nc2ccc(-c3ccccc3)nc2n1-c1ccc(C2(N)CCC2)cc1
641
- Cc1cc(-c2ccc3c(c2)nn2cc(-c4ccccc4)c(-c4ccc(C5(N)CCC5)cc4)nc32)[nH]n1
642
- NC(COc1cncc(-c2ccc3nnccc3c2)c1)Cc1c[nH]c2ccccc12
643
- Cc1[nH]nc2ccc(-c3cc(OCC(N)Cc4c[nH]c5ccccc45)cnc3-c3ccoc3)cc12
644
- NC1(c2ccc(-c3nc4c(-c5cn[nH]c5)cccn4c3-c3ccccc3)cc2)CCC1
645
- CC(C)Oc1cccc(NC(=O)Cc2nc(N3CCOCC3)cc(=O)[nH]2)c1
646
- CCC(N)COc1cncc(-c2cc3c(cnc4cc(OC)c(OC)cc43)c(N)n2)c1
647
- Nc1nnc2c3cc(-c4ccccc4)c(-c4ccc(CN5CCC(c6n[nH]c(-c7ccccn7)n6)CC5)cc4)nc3ccn12
648
- O=S(=O)(NCc1ccc(-c2nnc3n2-c2cccnc2Nc2ccccc2-3)cc1)c1ccccc1F
649
- NC1(c2ccc(-c3nc4ncc(-c5ccccc5)cn4c3-c3ccccc3)cc2)CCC1
650
- CCOC(=O)c1cn2c(-c3ccccc3)c(-c3ccc(C4(N)CCC4)cc3)nc2cn1
651
- O=C(Cc1nc(N2CCOCC2)c(F)c(=O)[nH]1)N1c2ccccc2CC1CO
652
- CCC(NC(=O)C1(N)CCN(c2ncnc3[nH]ccc23)CC1)c1ccc(Cl)cc1
653
- CC1CC(O)c2ncnc(N3CCN(C(=O)C(CNC4CCCCC4)c4ccc(Cl)cc4)CC3)c21
654
- CCc1c(N)ncnc1N1CCC(c2nc(-c3cncc(Cl)c3)cn2CCN2CCC2)CC1
655
- NC(=O)c1ncc2cc(-c3ccccc3)c(-c3ccc(CN4CCC(c5nnc(-c6ccccn6)[nH]5)CC4)cc3)nc2n1
656
- N#CCn1c(O)nc2ccc(NC(COc3cncc(-c4ccc5nc(O)n(CC#N)c5c4)c3)Cc3c[nH]c4ccccc34)cc21
657
- Cn1ncc(Cl)c1-c1cc(C(=O)NC2CNCC2c2ccc(C(F)(F)F)cc2)oc1Cl
658
- CCOc1c(N)ncnc1N1CCC(c2nc(-c3ccc(F)c(C)c3)cn2CCN(CC)CC)CC1
659
- Cc1n[nH]c2ccc(-c3cc(OCC(N)Cc4ccccc4)cnc3-c3cccc(O)c3)cc12
660
- Nc1ncnc(N2CCC(c3nc(-c4ccc(O)c(C(F)(F)F)c4)cn3CCN3CCCC3)CC2)c1Cl
661
- Cl.Nc1ncccc1-c1nc2ccc(-c3cccnc3)nc2n1-c1ccc(C2(N)CCC2)cc1
662
- CC1(C)CN(C(=O)Cc2nc(N3CCOCC3)cc(=O)[nH]2)c2cccc(F)c21
663
- NCC(O)(c1ccc(Cl)cc1)c1ccc(-c2cn[nH]c2)cc1
664
- NCC(Cc1ccccc1)NC(=O)c1cc(Br)c(-c2ccnc3[nH]ccc23)s1
665
- Cc1n[nH]c2ccc(-c3nnc(NCC(N)Cc4ccccc4)s3)cc12
666
- NC(CNc1cncc(Nc2cccc3cnccc23)c1)Cc1c[nH]c2ccccc12
667
- c1ccc(-c2cc3ncccc3nc2-c2ccc(CN3CCC(c4n[nH]c(-c5ccccn5)n4)CC3)cc2)cc1
668
- CCCC(NC(=O)C(CCCNC(=N)N)NC(=O)C1CCCN1C(=O)C(CCCNC(=N)N)NC)C(=O)NC(Cc1ccc(O)cc1)C(=O)NC(CN)C(=O)NC(CCC(C)C)C(N)=O
669
- NC(=O)c1c(N)ncnc1N1CCC(c2nc(-c3ccnc(C(F)(F)F)c3)cn2CCN2CCC2)CC1
670
- Nc1ncnc(N2CCC(c3nc(-c4ccc(C(F)(F)F)c(F)c4)cn3CCN3CCCC3)CC2)c1Cl
671
- CCOc1ccn2c(-c3ccccc3)c(-c3ccc(C4(N)CCC4)cc3)nc2n1
672
- CN1CCN(c2ccc3nc(-c4ccccc4)c(-c4ccc(CN5CCC(c6nnc(-c7ncccn7)[nH]6)CC5)cc4)nc3n2)CC1
673
- Nc1ncnc(N2CCC(c3nc(-c4ccc(F)c(C(F)(F)F)c4)cn3CCN3CCCCC3)CC2)c1-c1ccc(F)cc1
674
- NC1(c2ccc(-c3ncc4cccnc4c3-c3ccccc3)cc2)CCC1
675
- CC(C)Cc1nc(-c2ccccc2)c(-c2ccc(CN3CCC(n4c(O)nc5ccccc54)CC3)cc2)nc1O
676
- COC(=O)C=Cc1ccn2c(-c3ccccc3)c(-c3ccc(C4(N)CCC4)cc3)nc2c1
677
- COC(=O)c1cccn2c(-c3ccccc3)c(-c3ccc(C4(N)CCC4)cc3)nc12
678
- CC(NC(=O)C1(N)CCN(c2ncnc3[nH]ccc23)CC1)c1ccc(F)cc1
679
- CC1CC(O)c2ncnc(N3CCN(C(=O)C(CNC(C)(C)CO)c4ccc(Cl)cc4)CC3)c21
680
- NC(c1ccc(Cl)cc1)c1ccc(-c2ncnc3[nH]cnc23)cc1
681
- Cc1ccc(S(=O)(=O)NCc2ccc(-c3nnc4n3-c3cccnc3Nc3ccccc3-4)cc2)cc1
682
- NC1(c2ccc(-c3nc4c(Cl)cccn4c3-c3ccccc3)cc2)CCC1
683
- CCCC(NC(=O)C(CCCNC(=N)N)NC(=O)CN(CCCCCCN)C(=O)C(N)CCCNC(=N)N)C(=O)NC(Cc1ccc(O)cc1)C(=O)NC(CN)C(=O)NC(CCC(C)C)C(=O)N(CCCN)CC(N)=O
684
- NCC(Cc1ccccc1)NC(=O)c1cc(Br)c(-c2ccnc3[nH]ccc23)s1
685
- CC(NC(=O)C1(N)CCN(c2ncnc3[nH]ccc23)CC1)c1ccc(F)cc1
686
- CC1(O)CC(N)(c2ccc(-c3nc4n(c3-c3ccccc3)COc3cc(CO)ccc3-4)cc2)C1
687
- C=Cc1cnc2cc(-c3ccccc3)c(-c3ccc(C4(N)CCC4)cc3)nn12
688
- N=c1ccc2nc(-c3ccc(C4(NC(=O)Cc5cccnc5)CCC4)cc3)c(-c3ccccc3)cc2n1C(N)=O
689
- COc1cc2ncc3c(N)nc(-c4cncc(OCC(C)N)c4)cc3c2cc1OC
690
- COc1ncc(-c2cc3c(C)nc(N)nc3n(C3CCC(OCC(N)=O)CC3)c2=O)cn1
691
- NC(=O)c1c(N)ncnc1N1CCC(c2nc(-c3ccc(F)cc3)cn2CCN2CCCC2)CC1
692
- Nc1ncnc(N2CCC(c3nc(-c4ccc(F)c(Cl)c4)cn3CCN3CCCC3)CC2)c1Br
693
- O=C(N1CCN(c2ncnc3[nH]ccc23)CC1)C1(c2ccc(Br)cc2)CCNCC1
694
- CC(=O)NCC1(N)CCN(c2ncnc3[nH]cc(C)c23)C1
695
- O=C1CC2OC(c3ccsc3)C3=C(C(=O)c4ccccc4C3=O)C2O1
696
- Nc1ccn2c(-c3ccccc3)c(-c3ccc(C4(N)CCC4)cc3)nc2c1
697
- Cl.NCc1ccc(-n2c(-c3cccnc3N)nc3ccc(-c4ccccc4)nc32)cc1
698
- COC1CCC(NCC(C(=O)N2CCN(c3ncnc4c3C(C)CC4O)CC2)c2ccc(Cl)cc2)CC1
699
- Fc1ccc(-c2cn3c(Cl)cnc3nc2-c2ccc(CN3CCC(c4n[nH]c(-c5ccccn5)n4)CC3)cc2)cc1
700
- NC(COc1cncc(-c2ccc3c(c2)C(c2cccnc2)C(=O)N3)c1)Cc1c[nH]c2ccccc12
701
- CC(C)(C)c1cc(-c2cn(CCN3CCC3)c(C3CCN(c4ncnc(N)c4C(N)=O)CC3)n2)ccn1
702
- Cc1cc(-c2cn(CC3CNC3)c(C3CCN(c4ncnc(N)c4C#N)CC3)n2)ccc1F
703
- N#Cc1ccc2nc(-c3ccc(C4(N)CCC4)cc3)c(-c3ccccc3)n2c1
704
- Cc1n[nH]c2ccc(-c3cncc(OCC(N)Cc4ccc(C(F)(F)F)cc4)c3)cc12
705
- NC1(c2ccc(-n3c(-c4ccccc4O)nc4ccc(-c5ccccc5)nc43)cc2)CCC1
706
- Cn1cc(CC(N)COc2cncc(-c3ccc4cnccc4c3)c2)c2ccccc21
707
- O=C(Nc1csc(-c2nncn2C2CC2)n1)c1cc(-n2cnc(C3CC3)c2)c(N2CC3(CCOCC3)C2)cn1
708
- O=C(NCCc1ccc(-c2nnc3n2-c2cccnc2Nc2ccccc2-3)cc1)c1ccccc1
709
- O=C(Cc1nc(N2CCOCC2)cc(=O)[nH]1)N1c2ccccc2CC1CO
710
- CNCCn1cc(-c2ccc(F)c(C)c2)nc1C1CCN(c2ncnc(N)c2C(N)=O)CC1
711
- CC(C)(C)n1nc(Cc2cccc(I)c2)c2c(N)ncnc21
712
- c1ccc2c(CC(COc3cncc(-c4cc5cnccc5s4)c3)Nc3cc4cnccc4s3)c[nH]c2c1
713
- CN1CCN(c2cccc3c2CCN3C(=O)Cc2nc(N3CCOCC3)cc(=O)[nH]2)CC1
714
- CC(C)CCCC1(C)CCc2c(O)cccc2O1
715
- CCCC(NC(=O)C(CCCNC(=N)N)NC(=O)C1CCCN1C(=O)C(N)CCCNC(=N)N)C(=O)NC(Cc1ccc(O)cc1)C(=O)NC(CN)C(=O)NC(CCC(C)C)C(N)=O
716
- NC(COc1cncc(-c2ccc3c(c2)C(c2ccccn2)C(=O)N3)c1)Cc1c[nH]c2ccccc12
717
- Cn1ncc(Cl)c1-c1csc(C(=O)NC2(c3ccc(F)cc3)CCNCC2)c1
718
- CC(C)Nc1c(-c2ccccc2)c(-c2ccc(CN3CC(c4n[nH]c(-c5ccccn5)n4)C3)cc2)nc2nc(-c3ccccn3)nn12
719
- CC1CN(C(=O)Cc2nc(N3CCOCC3)cc(=O)[nH]2)c2cccc(O)c21
720
- Cc1cc(C(=O)NC(CN)c2ccccc2)sc1-c1ccnc2[nH]ccc12
721
- COc1cncc(-c2cnc3cc(-c4ccccc4)c(-c4ccc(C5(N)CCC5)cc4)nn23)c1
722
- Cn1nnnc1-c1cnc(-c2ccc(CN3CCC(n4c(=O)[nH]c5ccccc54)CC3)cc2)c(-c2ccccc2)c1
723
- COc1cc(NS(=O)(=O)c2ccc(NS(=O)(=O)c3c(C)noc3C)cc2)nc(OC)n1
724
- O=c1ccc(-c2cc(C3CCN(Cc4ccc(-c5nc6ncccc6cc5-c5ccccc5)cc4)CC3)n[nH]2)c[nH]1
725
- NC1(c2ccc(-c3nc4cc(-c5ncc[nH]5)ccn4c3-c3ccccc3)cc2)CCC1
726
- NC1(c2ccc(-c3nc4ccc(O)cn4c3-c3ccccc3)cc2)CCC1
727
- O=C(Cc1nc(N2CCOCC2)cc(=O)[nH]1)N1c2ccccc2CC1CF
728
- CNCCn1cc(-c2ccnc(C(F)(F)F)c2)nc1C1CCN(c2ncnc(N)c2Cl)CC1
729
- N#Cc1c(N)ncnc1N1CCC(c2nc(-c3ccc(F)c(C(F)(F)F)c3)cn2C2CNC2)CC1
730
- Nc1ncnc(N2CCC(c3nc(-c4ccc(F)c(C(F)(F)F)c4)cn3CCN3CCC3)CC2)c1-c1cnoc1
731
- CCc1c(N)ncnc1N1CCC(c2nc(-c3ccc(F)c(Cl)c3)cn2CCN2CCC2)CC1
732
- Cn1ncc(Cl)c1-c1ccc(C(=O)NC2CNCCC2c2ccc(F)c(F)c2)cn1
733
- Cc1c(NCCN2CCCC2)cc(Cl)cc1N1CCN(c2ncnc3[nH]nc(Br)c23)CC1
734
- CN(C)CCn1cc(-c2ccc(F)c(C(F)(F)F)c2)nc1C1CCN(c2ncnc(N)c2-c2ccc(F)cc2)CC1
735
- NCC(NC(=O)c1cc(-c2ccccc2)c(-c2ccnc3[nH]ccc23)s1)c1ccccc1
736
- CCc1c(N)ncnc1N1CCC(c2nc(-c3cccc(Cl)c3)cn2CCN2CCC2)CC1
737
- NC1(c2ccc(-c3nnc4n3-c3cccnc3Nc3ccccc3-4)cc2)CCC1
738
- N#Cc1c[nH]c2ncnc(N3CCC(N)(CNC(=O)c4ccc(F)cc4F)C3)c12
739
- NC(COc1cnc(-c2ccco2)c(-c2ccc3cnccc3c2)c1)Cc1c[nH]c2ccccc12
740
- Cc1c(Cl)cn2c(-c3ccccc3)c(-c3ccc(C4(N)CCC4)cc3)nc2c1-c1ccn[nH]1
741
- Nc1ccc(-c2nnc(C3CCN(Cc4ccc(-c5nc6cc[nH]c(=O)c6cc5-c5ccccc5)cc4)CC3)[nH]2)cn1
742
- COc1ccc(O)c(C(=O)c2ccc(C=CC3CCCNCC3NC(=O)c3ccncc3)cc2)c1F.Cl
743
- N#CC1CCN(c2cnc(C(=O)Nc3csc(-c4nncn4C4CC4)n3)cc2-n2cnc(C3CC3)c2)CC1
744
- O=C(Nc1csc(-c2nncn2C2CC2)n1)c1cc(-n2cnc(C3CC3)c2)c(N2CC3CC3C2)cn1
745
- CC1CN(c2ncc3cc(-c4ccccc4)c(-c4ccc(CN5CCC(c6nnc(-c7ccccn7)[nH]6)CC5)cc4)nc3n2)CC(C)N1
746
- CNc1nccc(-c2ccc(C(=O)NC(CO)Cc3ccc(Cl)cc3Cl)s2)n1
747
- Cn1ncc(Cl)c1-c1cc(C(=O)NC2CNC(CCO)CC2c2ccc(F)c(F)c2)oc1Cl
748
- C=Cc1c(N)ncnc1N1CCC(c2nc(-c3ccc(F)c(C(F)(F)F)c3)cn2CCN2CCCCC2)CC1
749
- Cc1cc(-c2cccnc2)cn2c(-c3ccccc3)c(-c3ccc(C4(N)CCC4)cc3)nc12
750
- N=C(N)NCCCC(NC(=O)C(CCCNC(=N)N)NC(=O)CCCCCNC(=O)C(CCCCN)NC(=O)CCCCCNC(=O)C1OC(n2cnc3c(N)ncnc32)C(O)C1O)C(N)=O
751
- NC1(c2ccc(-c3nc4cc(C(=O)N5CCCC5)ccn4c3-c3ccccc3)cc2)CCC1
752
- NCC(Cc1ccccc1)NC(=O)c1cc(Br)c(-c2ccnc3[nH]ccc23)s1
753
- NC1(c2ccc(-c3nn4c(-c5ccc(F)cc5)c(Cl)nc4cc3-c3ccccc3)cc2)CCC1
754
- NC1(C(=O)NCc2ccc(F)cc2Cl)CCN(c2ncnc3[nH]ccc23)CC1
755
- OC1CN(c2ncc3cc(-c4ccccc4)c(-c4ccc(CN5CCC(c6nnc(-c7ccccn7)[nH]6)CC5)cc4)nc3n2)CCN1
756
- CN(C)CCCC(NC(=O)C1(N)CCN(c2ncnc3[nH]ccc23)CC1)c1ccc(Cl)cc1
757
- CCOc1c(N)ncnc1N1CCC(c2nc(-c3ccc(F)c(C)c3)cn2CCN2CCCC2)CC1
758
- CC(=O)Nc1c(N)ncnc1N1CCC(c2nc(-c3ccc(F)c(C(F)(F)F)c3)cn2CCN2CCC2)CC1
759
- CC1CC(O)c2ncnc(N3CCN(C(=O)C(CNC(C)(C)C)c4ccc(C(F)(F)F)c(F)c4)CC3)c21
760
- Cc1nc2cnc3ccc(C#Cc4cccnc4)cc3c2n1-c1ccc(C(C)C#N)cc1
761
- CSc1ncc2cc(-c3ccccc3)c(-c3ccc(CNCCc4n[nH]c(-c5ccccc5)n4)cc3)nc2n1
762
- CC(=C1C(=O)Nc2ccc(NC(N)=O)cc21)c1cc(CNC(=O)C2CCN(C)CC2)c[nH]1
763
- Cn1cncc1-c1ccc2nc(-c3ccc(C4(N)CCC4)cc3)c(-c3ccccc3)n2c1
764
- CCn1c(-c2nonc2N)nc2c(C#CC(C)(C)O)ncc(OCC3CCCNC3)c21
765
- NC(CNc1cnc(Cl)c(C=Cc2ccncc2)c1)Cc1c[nH]c2ccccc12
766
- Cn1ncc(Cl)c1-c1cc(C(=O)NC(CN)Cc2cccc(F)c2)sc1Cl
767
- NC1CCN(c2ccnc3[nH]ccc23)CC1
768
- CC(C)NCC(C(=O)N1CCN(c2ncnc3c2C(CO)CC3)CC1)c1ccc(Cl)cc1
769
- Cc1cc(-c2cn(CCN3CCC3)c(C3CCN(c4ncnc(N)c4Br)CC3)n2)ccc1F
770
- NC(CNc1ncc(-c2ccc3cn[nH]c3c2)s1)Cc1ccc(C(F)(F)F)cc1
771
- O=C(Cc1nc(N2CCOCC2)cc(=O)[nH]1)Nc1ccc(F)cc1F
772
- Cc1cc(-c2cn(CCNCC(C)C)c(C3CCN(c4ncnc(N)c4C#N)CC3)n2)ccc1F
773
- c1ccc(-c2nnc[nH]2)c(Nc2ncnc3[nH]ccc23)c1
774
- Clc1c[nH]c2ncnc(N3CCc4[nH]cnc4C3)c12
775
- Cc1ccc(CC(C)(C)C2C(=O)Nc3ccc(-c4cncc(OCC(N)Cc5c[nH]c6ccccc56)c4)cc32)o1
776
- N#Cc1ccc2nc(-c3ccc(C4(N)CCC4)cc3)c(-c3ccccc3)n2c1
777
- COC(=O)c1sccc1S(=O)(=O)NC1CCN(Cc2ccc(-c3nnc4n3-c3cccnc3Nc3ccccc3-4)cc2)CC1
- NC1(c2ccc(-c3nc4c5cc(Br)ccc5nn4cc3-c3ccccc3)cc2)CCC1
- COc1ccc(-c2cn(CCN3CCC3)c(C3CCN(c4ncnc(N)c4C(N)=O)CC3)n2)cc1Cl
- CC(C)(C)C(=O)Nc1ccc(-c2nnc3n2-c2cccnc2Nc2ccccc2-3)cc1
- Cc1occc1-c1nc(N)c(OCC(N)Cc2c(C#N)[nH]c3ccccc23)cc1-c1cnc2[nH]nc(C)c2n1
- COc1cc(OC)c2nc(-c3ccc(C4(N)CCC4)cc3)c(-c3ccccc3)n2c1
- COc1cc(Cl)cn2c(-c3ccccc3)c(-c3ccc(C4(N)CCC4)cc3)nc12
- NC1(c2ccc(-c3nc4cc(C(=O)NO)ccn4c3-c3ccccc3)cc2)CCC1
- NC1(c2ccc(-n3c(C4CC4)nc4ccc(-c5cccc(N6CCOCC6)c5)nc43)cc2)CCC1
- CN1CCN(c2ccc3nc(-c4ccccc4)c(-c4ccc(CN5CCC(c6nnc(-c7ccccn7)[nH]6)CC5)cc4)nc3n2)CC1
- COC(=O)c1ccc(-c2c(N)ncnc2N2CCC(c3nc(-c4ccc(F)c(C(F)(F)F)c4)cn3CCN3CCCC3)CC2)cc1
- COc1cncc(-c2cnc3cc(-c4ccccc4)c(-c4ccc(C5(N)CCC5)cc4)nn23)c1
- O=c1[nH]ccc2nc(-c3ccc(CN4CCC(c5nnc(-c6cnccn6)[nH]5)CC4)cc3)c(-c3ccccc3)cc12
- CCOC(=O)c1ccc2nc(-c3ccc(C4(N)CCC4)cc3)c(-c3ccccc3)n2c1
- Cn1ncc(Cl)c1-c1cc(C(=O)NC2CNC(CCO)CC2c2ccc(F)c(F)c2)oc1Cl
- O=C(Cc1nc(N2CCOCC2)cc(=O)[nH]1)N1CCc2ccccc21
- COc1cc2ncc3c(N)nc(-c4cncc(OCC(N)Cc5cccc(C(F)(F)F)c5)c4)cc3c2cc1OC
- Cn1ncc(Br)c1-c1ccc(C(=O)NC2CNCCC2c2cccc(F)c2)nc1
- CCOc1cccn2c(-c3ccccc3)c(-c3ccc(C4(N)CCC4)cc3)nc12
- CCn1c(-c2nonc2N)nc2c(C#CC(C)(C)O)nc(OC(CN)c3ccccc3)cc21
- Cc1ccc(-c2cn(CCN3CCC3)c(C3CCN(c4ncnc(N)c4C(N)=O)CC3)n2)cc1Cl
- CC(C)(C)NCCn1cc(-c2ccc(F)c(C(F)(F)F)c2)nc1C1CCN(c2ncnc(N)c2C#N)CC1
- CC1(O)CC(N)(c2ccc(-c3nc4n(c3-c3ccsc3)COc3cccc(F)c3-4)cc2)C1
- COc1ccc(O)c(C(=O)c2ccc(C(=O)NC3CCCNCC3NC(=O)c3ccncc3)cc2)c1F.Cl.Cl
- NC(Cc1ccc(C(F)(F)F)cc1)C(=O)N1CCN(c2ncnc3ccccc23)CC1
- CSc1ncc2cc(-c3ccccc3)c(-c3ccc(CNCCC(=O)c4ccccc4N)cc3)nc2n1
- Nc1nc2ccc(-c3cnc(Cl)c(NS(=O)(=O)c4ccc(F)cc4)c3)cc2s1
- NC1(c2ccc(-c3nc4cc(-c5ccc(F)cc5)ccn4c3-c3ccccc3)cc2)CCC1
- CCn1c(-c2nonc2N)nc2c(C#CC(C)(C)O)ncc(OCCCNCCc3ccc(OC)cc3)c21
- NC1(c2ccc(-c3nc4c5cc(-c6ccc(S(N)(=O)=O)cc6)ccc5nn4cc3-c3ccccc3)cc2)CCC1
- CCCCNCC(C(=O)N1CCN(c2ncnc3sc4c(c23)CCC4)CC1)c1ccc(Cl)cc1
- CC1Cc2cc(F)c(F)cc2N1C(=O)Cc1nc(N2CCOCC2)cc(=O)[nH]1
- Cn1c(CC(=O)N2CCc3cc(F)c(F)cc32)nc(N2CCOCC2)cc1=O
- CC(C)NCC(Cc1ccc(F)c(F)c1)C(=O)N1CCN(c2ncnc3c2C(C)SC3)CC1
- CC(=O)Nc1cccc(-c2ccc3nc(-c4cccnc4N)n(-c4ccc(CN)cc4)c3n2)c1.Cl
- Cc1cccc(-c2nc(C3CCN(Cc4ccc(-c5nc6nccn6cc5-c5ccccc5F)cc4)CC3)n[nH]2)n1
- CN(C)CCn1cc(-c2ccc(F)c(C(F)(F)F)c2)nc1C1CCN(c2ncnc(N)c2-c2ccc(N)nc2)CC1
- Cc1ccc2c(c1)OCn1c-2nc(-c2ccc(C3(N)CC(O)(C4CC4)C3)cc2)c1-c1ccccc1
- CCOc1c(N)ncnc1N1CCC(c2nc(-c3ccc(F)c(C(F)(F)F)c3)cn2CC2CNC2)CC1
- NC1(C(=O)N2CCCC2c2ccccc2)CCN(c2ncnc3[nH]ccc23)CC1
- CN1CC(N2CCN(c3ncc4cc(-c5ccccc5)c(-c5ccc(CN6CCC(c7nnc(-c8ccccn8)[nH]7)CC6)cc5)nc4n3)CC2)C1
- O=C(Cc1nc(N2CCOCC2)cc(=O)[nH]1)Nc1ccccc1OCCN1CCCCC1
- CCc1cnc2nc(-c3ccc(C4(N)CCC4)cc3)c(-c3ccccc3)n2c1
- CS(=O)(=O)c1cccc(-c2ccc3c(c2)nn2cc(-c4ccccc4)c(-c4ccc(C5(N)CCC5)cc4)nc32)c1
- C=Cc1c(N)ncnc1N1CCC(c2nc(-c3ccc(F)cc3)cn2CCN2CCCC2)CC1
- Cn1ncc(Cl)c1-c1cc(C(=O)NC(CN)Cc2ccc(F)c(F)c2)oc1Cl
- Cc1n[nH]c2ccc(-c3cncc(OCC(N)Cc4ccc(F)c(C(F)(F)F)c4)c3)cc12
- CCc1ccc2nc(-c3ccc(C4(N)CCC4)cc3)c(-c3ccccc3)n2c1
- CCn1c(-c2nonc2N)nc2c(C#CC(C)(C)O)ncc(OCC3CCCNC3)c21
- NC(CCc1cnc(Cl)c(C=Cc2ccncc2)c1)Cc1c[nH]c2ccccc12
- CC(=O)Nc1nc2ccc(-c3cnc(Cl)c(NS(=O)(=O)c4ccccc4C(F)(F)F)c3)cc2s1
- Cc1n[nH]c2ccc(-c3cncc(OCC(N)Cc4cccc(-c5ccccc5)c4)c3)cc12
- Cc1occc1-c1nc(N)c(OCC(N)Cc2c[nH]c3ccccc23)cc1-c1cnc2[nH]nc(C)c2c1
- NC1(C(=O)NC(CCCN2CCCC2)c2ccc(Cl)cc2)CCN(c2ncnc3[nH]ccc23)CC1
- O=C(Nc1ccc(-c2nnc3n2-c2cccnc2Nc2ccccc2-3)cc1)C1CCC1
- CCN(CC)CCn1cc(-c2ccc(F)c(C)c2)nc1C1CCN(c2ncnc(N)c2C(N)=O)CC1
- CC(=C1C(=O)Nc2ccc(NC(N)=O)cc21)c1cc(CNC(=O)CCN2CCCCC2)c[nH]1
- CC(C)c1cc(-c2cn(CCN3CCC3)c(C3CCN(c4ncnc(N)c4C(N)=O)CC3)n2)ccn1
- Fc1ccc2[nH]c(C3CCN(Cc4ccc(-c5ncc(-c6nn[nH]n6)cc5-c5ccccc5)cc4)CC3)nc2c1
- O=S(=O)(NC1(c2ccc(-c3nnc4n3-c3cccnc3Nc3ccccc3-4)cc2)CCC1)c1ccc(F)cc1
- CC(=O)Nc1ccc2ncnc(N3CCN(C(=O)C(N)Cc4ccc(Cl)cc4)CC3)c2c1
- Nc1ncccc1-c1nc2cccnc2n1-c1ccc(CC(=O)Nc2ccccc2)cc1
- NC1(c2ccc(-c3nc4n(c3-c3ccccc3)COc3cccc(F)c3-4)cc2)CC(O)(C2CC2)C1
- Cn1nccc1-c1ccc(C(=O)NC2CNCCC2c2cccc(F)c2)nc1
- NC1(C(=O)NC(CCN2CCCC2)c2ccc(Cl)cc2)CCN(c2ncnc3[nH]ccc23)CC1
- Cc1nc(-c2ccc(CN3CCC(n4c(=O)[nH]c5ccccc54)CC3)cc2)c(-c2ccccc2)[nH]c1=O
- NC(COc1cncc(-c2ccc3c(c2)C(c2ccccc2)C(=O)N3)c1)Cc1c[nH]c2ccccc12
- Cn1ncc(Br)c1-c1ccc(C(=O)NC2CNCCC2c2ccc(F)c(F)c2)cc1
- Cc1n[nH]c2ccc(-c3cc(OCC(N)Cc4c[nH]c5ccccc45)cnc3-c3ccoc3)nc12
- COc1c(N)ncnc1N1CCC(c2nc(-c3ccnc(C(F)(F)F)c3)cn2CCN2CCC2)CC1
- O=C(N1CCN(c2ncnc3[nH]cc(Br)c23)CC1)C1(c2cccc(Br)c2)CCNCC1
- CC1Cc2ccccc2N1C(=O)Cc1nc(N2CCOCC2)cc(=O)[nH]1
- Nc1ncnc2nc(-c3ccc(CN4CCC(c5cc(-c6ccncc6)[nH]n5)CC4)cc3)c(-c3ccccc3)cc12
- O=C1CC2OC3(CCCC3)C3=C(C(=O)c4ccccc4C3=O)C2O1
- Cc1nc2nc(-c3ccc(CN4CC(c5n[nH]c(-c6ccccn6)n5)C4)cc3)c(-c3ccccc3)c(C)n2n1
- CC1CC(O)c2ncnc(N3CCN(C(=O)C(CNC4CCOCC4)c4ccc(Cl)cc4)CC3)c21
- N#Cc1c[nH]c2ncnc(N3CC4(CCNCC4)c4ccccc43)c12
- COc1nc(Br)cn2c(-c3ccccc3)c(-c3ccc(C4(N)CCC4)cc3)nc12
- CCc1c(C)[nH]c(CC(C)(C)C2C(=O)Nc3ccc(-c4cncc(OCC(N)Cc5c[nH]c6ccccc56)c4)cc32)c1C
- CNc1nccc(-c2ccc(C(=O)NC(CN)Cc3ccc(Cl)cc3Cl)s2)n1
- COc1ccc(S(=O)(=O)Nc2cc(-c3ccc4nc(NC(C)=O)sc4c3)ccn2)cc1
- O=C(Nc1ccc(-c2nnc3n2-c2cccnc2Nc2ccccc2-3)cc1)c1ccccc1
- N#Cc1c(N)ncnc1N1CCC(c2nc(-c3ccc(F)c(C(F)(F)F)c3)cn2CC2CCN2)CC1
- CN(C)C(=O)N1CCN(c2ncc3cc(-c4ccccc4)c(-c4ccc(CN5CCC(c6nnc(-c7ccccn7)[nH]6)CC5)cc4)nc3n2)CC1
- CC(=O)NC1CCN(c2cccc(-c3ccc4nc(-c5cccnc5N)n(-c5ccc(C6(N)CCC6)cc5)c4n3)c2)CC1
- CCOC(=O)c1c(C)nc(NNC(=O)c2cccc3c(=O)c4ccccc4[nH]c23)nc1-c1cccc(OC)c1
- Cc1cc(O)cc2c1OC(C)(CCCC(C)C)CC2=O
- NC1(c2ccc(-c3nc4n(c3-c3ccccc3)COc3cccnc3-4)cc2)CCC1
- COc1ccccc1NC(=O)Cc1nc(N2CCOCC2)cc(=O)[nH]1
- NC(COc1cncc(-c2ccc3cnccc3c2)c1)Cc1csc2ccccc12
- O=C(Cc1nc(N2CCOCC2)cc(=O)[nH]1)N1CCc2c1cccc2C(F)(F)F
- NC1(c2ccc(-c3nn4c(-c5cccc(CO)c5)cnc4cc3-c3ccccc3)cc2)CCC1
- CCn1c(-c2nonc2N)nc2c(C#CC(C)(C)O)nc(CCCN)cc21
- NC1(c2ccc(-c3nc4c5cc(F)ccc5nn4c(NC4CC4)c3-c3ccccc3)cc2)CCC1
- COc1cc(-c2cnc[nH]2)cn2c(-c3ccccc3)c(-c3ccc(C4(N)CCC4)cc3)nc12
- CC1CC(O)c2ncnc(N3CCN(C(=O)C(CN)c4ccc(C(F)(F)F)cc4)CC3)c21
- Cn1ncc(Br)c1-c1coc(C(=O)NC2(c3ccc(Cl)c(Cl)c3)CCNCC2)c1
- CCn1c(-c2nonc2N)nc2c(C#CC(C)(C)O)nc(OCCCCN)cc21
- Cl.NCc1ccc(-n2c(-c3cccnc3N)nc3ccc(-c4cn[nH]c4)nc32)cc1
- NC1(C(=O)NC(CCCO)c2ccc(Cl)cc2)CCN(c2ncnc3[nH]ccc23)CC1
- CNc1c(-c2ccccc2)c(-c2ccc(CN3CC(c4n[nH]c(-c5ccccn5)n4)C3)cc2)nc2nccn12
- CCn1c(-c2nonc2N)nc2cncc(C(=O)N3CCC(N)C3)c21
- NC(=O)c1ccc(NC2CNCCC2c2ccc(F)c(Cl)c2)c2cncnc12
- NC(=O)c1c(N)ncnc1N1CCC(c2nc(-c3ccc(F)c(C(F)(F)F)c3)cn2CCO)CC1
- Nc1cc2[nH]nc(Cl)c2cc1-c1cncc(OCC(N)Cc2c[nH]c3ccccc23)c1
- NC1(C(=O)NC(CCO)c2ccc(Cl)cc2)CCN(c2ncnc3[nH]ccc23)CC1
- CSc1ncc2cc(-c3ccccc3)c(-c3ccc(CN4CCC(C(=O)NCCC(N)=O)CC4)cc3)nc2n1
- NC(CNc1nnc(-c2ccc3c(c2)CC(=O)N3)s1)Cc1ccc(C(F)(F)F)cc1
- O=C(Cc1nc(N2CCOC(CO)C2)cc(=O)[nH]1)N1CCc2c(Cl)cccc21
- CC(C)Cc1nc(-c2ccc(CN3CCC(n4c(O)nc5ccccc54)CC3)cc2)c(-c2ccccc2)nc1O
- [O-][n+]1ccc(-c2nnc(C3CCN(Cc4ccc(-c5nc6nc(N7CCN(CCO)CC7)ncc6cc5-c5ccccc5)cc4)CC3)[nH]2)cc1
- Nc1nc2ccc(-c3cnc(Cl)c(NS(=O)(=O)c4ccc(F)cc4)c3)cc2o1
- O=c1[nH]c2ccccc2n1C1CCN(Cc2ccc(-c3nnc4n3-c3cccnc3Nc3ccccc3-4)cc2)CC1
- COC(=O)c1cc(OC)c2nc(-c3ccc(C4(N)CCC4)cc3)c(-c3ccccc3)n2c1
- Nc1ncnc(N2CCC(c3nc(-c4ccc(C(F)(F)F)cc4)cn3CCN3CCCC3)CC2)c1C1CCC1
- Nc1cn2nc(-c3cnc(Cl)c(NS(=O)(=O)c4ccc(F)cc4)c3)ccc2n1
- NCC(Cc1ccccc1)NC(=O)c1cc(Br)c(-c2ccnc3[nH]ccc23)s1
- Cc1cc(CC(N)COc2cncc(-c3ccc4[nH]nc(C)c4c3)c2)ccc1F
- OCc1cnn2cc(-c3c(F)cccc3F)c(-c3ccc(CN4CC(c5n[nH]c(-c6ccccn6)n5)C4)cc3)nc12
- CC(=O)N1CCN(c2ncc3cc(-c4ccccc4)c(-c4ccc(CN5CCC(c6nnc(-c7ccccn7)[nH]6)CC5)cc4)nc3n2)CC1
- COc1ccc(-c2cn(CCN3CCC3)c(C3CCN(c4ncnc(N)c4C#N)CC3)n2)cc1F
- Cc1n[nH]c2ccc(-c3cncc(OCC(N)Cc4ccccc4Cl)c3)cc12
- COc1ccc(-c2cc3c(C)nc(N)nc3n(C3CCC(OCC(N)=O)CC3)c2=O)cn1
- c1ccc(-c2cc3cnc(N4CCn5cnnc5C4)nc3nc2-c2ccc(CN3CCC(c4nnc(-c5ccccn5)[nH]4)CC3)cc2)cc1
- COc1cc(OC)c2nc(-c3ccc(C4(N)CCC4)cc3)c(-c3ccccc3)n2c1
- N#Cc1ccc(-c2cnc3cc(-c4ccccc4)c(-c4ccc(C5(N)CCC5)cc4)nn23)cc1
- CN(C)CCNc1ncc2cc(-c3ccccc3)c(-c3ccc(CN4CCC(c5nnc(-c6ccccn6)[nH]5)CC4)cc3)nc2n1
- CCCC1OC(CC(=O)O)CC2=C1C(=O)c1c(O)cccc1C2=O
- CC1Cc2ccccc2N1C(=O)Cc1nc(N2CCOCC2)c(F)c(=O)n1C
- COC(=O)c1cc(OC)c2nc(-c3ccc(C4(N)CCC4)cc3)c(-c3ccccc3)n2c1
- c1ccc(-c2cc3c(ccn4cnnc34)nc2-c2ccc(CN3CCC(c4n[nH]c(-c5ccccn5)n4)CC3)cc2)cc1
- Cc1nc(-c2ccccc2)c(-c2ccc(CN3CCC(n4c(=O)[nH]c5ccccc54)CC3)cc2)[nH]c1=O
- CNc1nccc(-c2ccc(C(=O)NC(Cc3ccc(Cl)cc3Cl)CN(C)C)s2)n1
- CS(=O)(=O)c1ccc(-c2ccc3c(c2)nn2cc(-c4ccccc4)c(-c4ccc(C5(N)CCC5)cc4)nc32)cc1
- CCn1c(-c2nonc2N)nc2cncc(CNC3CCNCC3)c21
- Cc1n[nH]c2cnc(-c3cncc(OCC(N)Cc4cccc(C(F)(F)F)c4)c3)cc12
- CC1COCCN1c1nc(N2CCOCC2C)c2ccc(-c3cccc(N)c3)nc2n1
- COC1C(N(C)C(=O)c2ccccc2)CC2OC1(C)n1c3ccccc3c3c4c(c5c6ccccc6n2c5c31)C(=O)NC4
- NC1(c2ccc(-c3nc4nc(O)ccn4c3-c3ccccc3)cc2)CCC1
- c1ccc(-c2cc3cccnc3nc2-c2ccc(CN3CCC(c4cc(-c5cccnc5)[nH]n4)CC3)cc2)cc1
- O=C(Cc1nc(N2CCOCC2)cc(=O)[nH]1)Nc1cc(F)c(F)c(F)c1
- NC(Cc1ccc(Cl)cc1)C(=O)N1CCN(c2ncnc3[nH]cc(Cl)c23)CC1
- CC1Cc2ccccc2N1C(=O)Cc1nc(N2CCOCC2)c(Cl)c(=O)[nH]1
- Cn1cc(C(CN)Oc2cncc(C=Cc3ccncc3)c2)c2ccccc21
- CCn1c(-c2nonc2N)nc2c(C#CC(C)(C)O)nc(CNC)cc21
- Cn1nc(N)c2cc(-c3ccccc3)c(-c3ccc(CN4CCC(n5c(=O)[nH]c6ccccc65)CC4)cc3)nc21
- CC(C)NC(=O)c1ccc2nc(-c3ccc(C4(N)CCC4)cc3)c(-c3ccccc3)n2c1
- CC(C)NCC(C(=O)N1CCN(c2ncnc3c2C(C)CC3F)CC1)c1ccc(Cl)cc1
- CC1(C#N)CN(c2cnc(C(=O)Nc3csc(-c4nncn4C4CC4)n3)cc2-n2cnc(C3CC3)c2)C1
- COc1ccc2c(c1)-c1nc(-c3ccc(C4(N)CC(C)(O)C4)cc3)c(-c3ccccc3)n1CO2
- Cc1cccc(NC(=O)NCc2ccc(-c3nnc4n3-c3cccnc3Nc3ccccc3-4)cc2)c1
- Cn1ncc(Cl)c1-c1cc(C(=O)NC2CNCCC2c2cccc(F)c2)sc1Cl
- NC1(c2ccc(-c3nc4n(c3-c3ccccc3)COc3c(F)cccc3-4)cc2)CC(O)(C2CC2)C1
- Nc1ncnc(N2CCC(c3nc(-c4ccc(F)cc4)cn3CCN3CCCC3)CC2)c1Br
- Cc1n[nH]c2cnc(-c3cncc(OCC(N)Cc4cccc(F)c4)c3)cc12
- c1ccc(-c2cn3ccnc3nc2-c2ccc(CN3CCC(c4cnc5ccccc5n4)CC3)cc2)cc1
- Cn1ncc(Cl)c1-c1cc(C(=O)NC2CNC(CCO)CC2c2ccc(F)c(F)c2)oc1Cl
- CC1(O)CC(N)(c2ccc(-c3nc4n(c3-c3ccccc3)COc3ccc(-c5cn[nH]c5)cc3-4)cc2)C1
- c1ccc(-c2cc3cccnc3nc2-c2ccc(CN3CCC(c4n[nH]c(-c5ccccn5)n4)CC3)cc2)cc1
- CN(C)CCn1cc(-c2ccc(F)c(C(F)(F)F)c2)nc1C1CCN(c2ncnc(N)c2Br)CC1
- CC(=O)Nc1nc2ccc(-c3cnc(Cl)c(NS(=O)(=O)c4cccc(C(C)(C)C)c4)c3)cc2s1
- Cn1ncc(Br)c1-c1oc(C(=O)NC2CNCCC2c2ccc(Cl)cc2)cc1Br
- CCc1c(N)ncnc1N1CCC(c2nc(-c3ccc(Cl)cc3)cn2CCN2CCCC2)CC1
- Cc1n[nH]c2ccc(-c3cncc(OCC(N)Cc4ccc(Br)cc4)c3)cc12
- COc1cc2ncc3c(N)nc(-c4cncc(OCC(N)Cc5cccc(Cl)c5)c4)cc3c2cc1OC
- N#Cc1cc2cc(-c3cncc(OCC(N)Cc4c[nH]c5ccccc45)c3)ccc2cn1
- NC1(c2ccc(-c3nn4c(-c5ccn[nH]5)cnc4cc3-c3ccccc3)cc2)CCC1
- COC(=O)c1cccc(-c2ccc3nn4cc(-c5ccccc5)c(-c5ccc(C6(N)CCC6)cc5)nc4c3c2)c1
- NC1(c2ccc(-c3nc4ccc(-c5cc[nH]n5)cn4c3-c3ccccc3)cc2)CCC1
- Sc1nnc2c3cc(-c4ccccc4)c(-c4ccc(CN5CCC(c6n[nH]c(-c7ccccn7)n6)CC5)cc4)nc3ccn12
- NC(=O)C=Cc1ccc2nn3cc(-c4ccccc4)c(-c4ccc(C5(N)CCC5)cc4)nc3c2c1
- Nc1ncccc1-c1nc2cc(Br)cnc2n1-c1ccc(CNC(=O)c2ccccc2)cc1
- NC1(Cc2ccc(Cl)cc2)CCN(c2ncnc3[nH]ccc23)CC1
- Nc1ncnc(N2CCC(c3nc(-c4ccc(F)c(C(F)(F)F)c4)cn3CCN3CCCC3)CC2)c1-c1ccc(F)c(F)c1
- Cn1cc(-c2ccc(F)c(C(F)(F)F)c2)nc1C1CCN(c2ncnc(N)c2C=O)CC1
- CC1CN(c2ncc3cc(-c4ccccc4)c(-c4ccc(CN5CCC(c6nnc(-c7ccccn7)[nH]6)CC5)cc4)nc3n2)CCN1
- CCC1(O)CC(N)(c2ccc(-c3nc4n(c3-c3ccccc3)COc3cccnc3-4)cc2)C1
- Cn1nccc1-c1csc(C(=O)NC2(c3ccc(F)cc3)CCNCC2)c1
- Cc1n[nH]c2ncc(-c3cc(OCC(N)Cc4c[nH]c5ccccc45)cnc3-c3ccoc3)cc12
- CC(=O)OC(C)(C)C(=O)Nc1cccc(-c2ccc3nc(-c4cccnc4N)n(-c4ccc(C5(N)CCC5)cc4)c3n2)c1
- CC(=O)Nc1nc2ccc(-c3cnc(Cl)c(NS(=O)(=O)c4ccccc4F)c3)cc2s1
- O=C(Cc1nc(N2CCOCC2)cc(=O)[nH]1)N1CCc2c(-c3ccccc3Cl)cccc21
- Nc1ccn2c(-c3ccccc3)c(-c3ccc(C4(N)CCC4)cc3)nc2c1
- O=C(Cc1nc(N2CCOCC2)cc(=O)[nH]1)Nc1ccc(F)c(C(F)(F)F)c1
- Cc1cc(-c2cn(CCNCC3CC3)c(C3CCN(c4ncnc(N)c4C(N)=O)CC3)n2)ccc1F
- O=C(NC(c1ccc(Cl)c(Cl)c1)C1CCNCC1)c1ccc2cnc(Cl)cc2c1
- CC(C)=C1C(=O)Nc2ccc(NC(COc3cncc(-c4ccc5c(c4)C(=C(C)C)C(=O)N5)c3)Cc3c[nH]c4ccccc34)cc21
- CN(C)CCC1CC(c2ccc(Cl)c(Cl)c2)C(NC(=O)c2cc(-c3c(Cl)cnn3C)c(Cl)o2)CN1
- Cn1nccc1-c1coc(C(=O)NC2CNCCC2c2ccccc2)c1.O=C(O)C(O)C(O)C(=O)O
- NC1(c2ccc(-c3nc4c(Br)cccn4c3-c3ccccc3)cc2)CCC1
- CC(C)N(C)CCn1cc(-c2ccc(F)c(C(F)(F)F)c2)nc1C1CCN(c2ncnc(N)c2C#N)CC1
- NCC(Cc1ccc(C(F)(F)F)cc1)NC(=O)c1cc(Br)c(-c2ccnc3[nH]ccc23)s1
- O=C(Cc1nc(N2CCOCC2)cc(=O)[nH]1)Nc1cc(F)c(F)cc1F
- Cc1nc2cnc3ccc(C#Cc4cccnc4)cc3c2n1-c1ccc(CC#N)cc1
- NC1(c2ccc(-c3ncc4cnccc4c3-c3ccccc3)cc2)CCC1
- Cc1c(-c2ccn[nH]2)cn2c(-c3ccccc3)c(-c3ccc(C4(N)CCC4)cc3)nc2c1C
- NC(=O)c1cc(Br)c2nc(-c3ccc(C4(N)CCC4)cc3)c(-c3ccccc3)n2c1
- O=c1ccc(-c2nnc(C3CCN(Cc4ccc(-c5nc6cc[nH]c(=O)c6cc5-c5ccccc5)cc4)CC3)[nH]2)c[nH]1
- CCc1c(N)ncnc1N1CCC(c2nc(-c3ccc(F)c(C)c3)cn2CCN(CC)C(C)C)CC1
- CC(C)c1cc(-c2cn(CCN3CCC3)c(C3CCN(c4ncnc(N)c4Cl)CC3)n2)ccn1
- NC1(c2ccc(-c3ncc4cccn4c3-c3ccccc3)cc2)CCC1
- Cc1occc1-c1nc(N)c(OCC(N)Cc2c[nH]c3ccccc23)cc1-c1cnc2[nH]nc(C)c2n1
- CC(=O)Nc1ccc(S(=O)(=O)NCc2ccc(-c3nnc4n3-c3cccnc3Nc3ccccc3-4)cc2)cc1
- CCCOc1ccn2c(-c3ccccc3)c(-c3ccc(C4(N)CCC4)cc3)nc2n1
- Cn1ncc(Cl)c1-c1cc(C(=O)NC2CNCCC2c2ccc(Cl)cc2)n(C)c1Cl
- NC1(c2ccc(-c3nc4cc(-c5ncc[nH]5)ccn4c3-c3ccccc3)cc2)CCC1
- COc1cccc(CNC(=O)NC2CCN(Cc3ccc(-c4nnc5n4-c4cccnc4Nc4ccccc4-5)cc3)CC2)c1
- O=C(Cc1nc(N2CCOCC2)cc(=O)[nH]1)Nc1cccc(Br)c1O
- O=C(NC(c1ccc(Cl)c(-c2ccccc2)c1)C1CCNCC1)c1ccc2cnccc2c1
- Cc1n[nH]c2ccc(-c3cncc(OCC(N)Cc4c(F)cc(F)cc4Br)c3)cc12
- CCc1c(N)ncnc1N1CCC(c2nc(-c3ccc(F)c(C(F)(F)F)c3)cn2CCN2CC(C)C2)CC1
- Cc1cccc(-c2nc(C3CN(Cc4ccc(-c5nc6nc(-c7ccccn7)nn6c(NC(C)C)c5-c5ccccc5)cc4)C3)n[nH]2)n1
- NC1(Cc2ccc(Cl)c(Cl)c2)CCN(c2ncnc3[nH]ccc23)CC1
- CC(C)(C)c1ccc(CC2(N)CCN(c3ccnc4[nH]ccc34)CC2)cc1
- NC(=O)c1ccc(-c2cnc3cc(-c4ccccc4)c(-c4ccc(C5(N)CCC5)cc4)nn23)cc1
- Cc1cccc(-c2nc(C3CCN(Cc4ccc(-c5nc6nccn6cc5-c5ccc(F)cc5)cc4)CC3)n[nH]2)n1
- COC(=O)c1ccn2c(-c3ccccc3)c(-c3ccc(C4(N)CCC4)cc3)nc2c1
- CC(=O)Nc1nc2ccc(-c3cnc(Cl)c(NS(=O)(=O)c4ccc(F)cc4)c3)cc2[nH]1
- CSc1ncc2cc(-c3ccccc3)c(-c3ccc(CN4CCC(n5ncc6c(N)ncnc65)CC4)cc3)nc2n1
- CC(=O)Nc1nc2ccc(-c3cnc(Cl)c(NS(=O)(=O)c4ccccc4)c3)cc2s1
- Nc1nc(N)c2cc(-c3ccccc3)c(-c3ccc(CN4CCC(n5c(=O)[nH]c6ccccc65)CC4)cc3)nc2n1
- Cc1cc(-c2cn(CCN3CCCC3)c(C3CCN(c4ncnc(N)c4C(N)=O)CC3)n2)ccc1F
- Cc1ccc(CC(CNC(C)C)C(=O)N2CCN(c3ncnc4c3C(C)SC4)CC2)cc1
- CCn1c(-c2nonc2N)nc2c(C#CC(C)(C)O)nc(CCCN)cc21
- CC(C)NCC(C(=O)N1CCN(c2ncnc3c2C(C)CC3)CC1)c1ccc(Cl)cc1
- COc1cccc2c1CCN2C(=O)Cc1nc(N2CCOCC2)cc(=O)[nH]1
- COCC(NC(=O)C(Cc1ccc(O)cc1)NC(=O)C(C(C)C)N(C)C(=O)C(CCCNC(=N)N)NC(=O)C1CCCN1C(=O)C(N)CCCNC(=N)N)C(=O)NCCCC(=O)O
- CCOc1cccn2c(-c3ccccc3)c(-c3ccc(C4(N)CCC4)cc3)nc12
- CCn1c(-c2nonc2N)nc2c(C#CC(C)(C)O)nc(OCCN)cc21
- O=C(Cc1nc(N2CCOCC2)cc(=O)[nH]1)Nc1cccc2sccc12
- Cn1cncc1-c1ccn2c(-c3ccccc3)c(-c3ccc(C4(N)CCC4)cc3)nc2c1
- Nc1ncnc(N2CCC(c3nc(-c4ccc(F)c(C(F)(F)F)c4)cn3CCN3CCC3)CC2)c1-c1cn[nH]c1
- O=C(NC(c1ccc(Cl)c(Cl)c1)C1CCCNC1)c1ccc2cnccc2c1
- CCCC(NC(=O)C(CCCNC(=N)N)NC(=O)C(CN)NC(=O)C(N)CCCNC(=N)N)C(=O)NC(Cc1ccc(O)cc1)C(=O)NC(CN)C(=O)NC(CCC(C)C)C(N)=O
- CC1Cc2ccccc2N1C(=O)Cc1nc(N2CCOCC2)cc(=O)[nH]1
- CCn1c(-c2nonc2N)nc2c(C#CC(C)(C)O)nc(OCC(N)Cc3ccccc3)cc21
- CC1CC(O)c2ncnc(N3CCN(C(=O)C(CN)c4ccc(Cl)cc4)CC3)c21
- O=C1C(=Cc2ccccn2)CNCC1=Cc1ccccn1
- Nn1c(CC(=O)N2CCc3ccccc32)nc(N2CCOCC2)cc1=O
- CCCC1OC2CC(=O)OC2C2=C1C(=O)c1ccccc1C2=O
- CN1CCN(c2ccc3nc(-c4ccc(CN5CCC(c6nnc(-c7ccccn7)[nH]6)CC5)cc4)c(-c4ccccc4)nc3n2)CC1
- CC1(O)CC(N)(c2ccc(-c3nc4n(c3-c3ccccc3)COc3cc(C(N)=O)ccc3-4)cc2)C1
- CC(=O)Nc1nc2ccc(-c3cnc(Cl)c(NS(=O)(=O)c4ccc(Cl)cc4)c3)cc2s1
- CCC(C)(C)NCCn1cc(-c2ccc(F)c(C(F)(F)F)c2)nc1C1CCN(c2ncnc(N)c2C#N)CC1
- COc1ccc2c(c1)OCn1c-2nc(-c2ccc(C3(N)CC(O)(C4CC4)C3)cc2)c1-c1ccccc1
- CCC1SCc2ncnc(N3CCN(C(=O)C(N)Cc4ccc(Cl)cc4)CC3)c21
- OCCN1CCN(c2ccc3nc(-c4ccccc4)c(-c4ccc(CN5CCC(c6nnc(-c7ncccn7)[nH]6)CC5)cc4)nc3n2)CC1
- O=C1CC2OC(COCc3ccccc3)C3=C(C(=O)c4ccccc4C3=O)C2O1
- CCOc1ccc2nc(-c3ccc(C4(N)CCC4)cc3)c(-c3ccccc3)n2c1
- CC1COCCN1c1nc(N2CCOCC2)nc2nc(-c3ccc(N)nc3)ccc12
- NC(=O)Nc1ccn2c(-c3ccccc3)c(-c3ccc(C4(N)CCC4)cc3)nc2c1
- NC(=O)C(NC(=O)C1(N)CCN(c2ncnc3[nH]ccc23)CC1)c1ccc(Cl)cc1
- CCOc1c(N)ncnc1N1CCC(c2nc(-c3ccc(F)c(C(F)(F)F)c3)cn2CCN(C)C)CC1
- NC(=O)Nc1ccc2c(c1)C(=Cc1cc(-c3cccc(C(=O)NCCN4CCCCC4)c3)c[nH]1)C(=O)N2
- NC(=O)c1cc(Cl)c2nc(-c3ccc(C4(N)CCC4)cc3)c(-c3ccccc3)n2c1
- Cc1cc(-c2cn(CC3CNC3)c(C3CCC(c4ncnc(N)c4C(C)C)CC3)n2)ccc1F
- Cc1cccc(-c2nc(C3CN(Cc4ccc(-c5nc6nc(C)cn6cc5-c5ccccc5)cc4)C3)n[nH]2)n1
- CC1(O)CC(N)(c2ccc(-c3nc4n(c3-c3ccccc3)COc3ccc(-c5cn[nH]c5)cc3-4)cc2)C1
- NC(Cc1ccccc1)c1ccc(-c2ncnc3[nH]cnc23)cc1
- NC(COc1cncc(-c2ccc3cnccc3c2)c1)Cc1ccccc1
- N#Cc1cc(-c2ccccc2)c(-c2ccc(CN3CCC(n4c(=O)[nH]c5ccccc54)CC3)cc2)nc1Cl
- N=c1ccc2nc(-c3ccc(C4(NC(=O)Cc5cccnc5)CCC4)cc3)c(-c3ccccc3)cc2n1C(N)=O
- NC(COc1cncc(-c2ccc3c(c2)C(c2ccco2)C(=O)N3)c1)Cc1c[nH]c2ccccc12
- CNc1nccc(-c2ccc(C(=O)NC(CN)Cc3ccc(Cl)cc3Cl)s2)n1
- NC(CNc1ncc(-c2ccc3cncnc3c2)s1)Cc1ccc(C(F)(F)F)cc1
- NC1(c2ccc(-c3nc4cc(F)ccn4c3-c3ccccc3)cc2)CCC1
- NC(=O)c1c(N)ncnc1N1CCC(c2nc(-c3ccc(F)c(C(F)(F)F)c3)cn2C2CCNCC2)CC1
- COc1nn2cc(-c3c(F)cccc3F)c(-c3ccc(CN4CC(c5n[nH]c(-c6cccc(C)n6)n5)C4)cc3)nc2c1CO
- CC(=C1C(=O)Nc2ccc(NC(N)=O)cc21)c1cc(CN)c[nH]1
- Cc1n[nH]c2ccc(-c3cncc(OCC(N)Cc4ccccc4OC(F)(F)F)c3)cc12
- CN(C)CCn1cc(-c2ccc(F)c(C(F)(F)F)c2)nc1C1CCN(c2ncnc(N)c2C2=CCCC2)CC1
- O=C(NCCc1ccc(Cl)cc1)NC1CCN(Cc2ccc(-c3nnc4n3-c3cccnc3Nc3ccccc3-4)cc2)CC1
- Cc1cc(-c2cn(CC3CN(CO)C3)c(C3CCN(c4ncnc(N)c4C(C)C)CC3)n2)ccc1F
- Cn1ncc(Cl)c1-c1cc(C(=O)NC2CNC(CCN3CCCCC3)CC2c2ccc(Cl)c(Cl)c2)oc1Cl
- O=C(Cc1nc(N2CCOCC2)cc(=O)[nH]1)N1CCc2c(F)cc(F)cc21
- N#Cc1cccn2c(-c3ccccc3)c(-c3ccc(C4(N)CCC4)cc3)nc12
- Cc1nc(-c2ccoc2)c(-c2cnc3[nH]nc(C)c3n2)cc1OCC(N)Cc1c[nH]c2ccccc12
- CC1=NN(C(=O)c2ccc(N)cc2)C(=O)C1N=Nc1ccc(S(=O)(=O)Nc2ncccn2)cc1
- CC(O)c1ccc2nc(-c3ccc(C4(N)CCC4)cc3)c(-c3ccccc3)n2c1
- CC(C)(O)C(=O)Nc1cccc(-c2ccc3nc(-c4cccnc4N)n(-c4ccc(C5(N)CCC5)cc4)c3n2)c1
- CCc1c(N)ncnc1N1CCC(c2nc(-c3ccnc(C(C)C)c3)cn2CCN2CCC2)CC1
- NCC(C(=O)N1CCN(c2ncnc3[nH]cc(Cl)c23)CC1)c1ccc(Cl)c(Cl)c1
- CC1Cc2c(Br)cccc2N1C(=O)Cc1nc(N2CCOCC2)cc(=O)n1C
- CCn1c(-c2nonc2N)nc2c(C#CC(C)(C)O)nc(OC(CN)c3ccccc3)cc21
- CC(C)NC(c1ccc(Cl)cc1)c1ccc(-c2ccncc2)cc1
- CN1CCN(c2ncc3cc(-c4ccccc4)c(-c4ccc(CN5CCC(c6nnc(-c7ccc(N)nc7)[nH]6)CC5)cc4)nc3n2)CC1
- CCCc1cc(-c2cncc(OCC(N)Cc3c[nH]c4ccccc34)c2)cc2c(C)n[nH]c12
- COc1cccc2c1-c1nc(-c3ccc(C4(N)CC(O)(C5CC5)C4)cc3)c(-c3ccccc3)n1CO2
- O=C1NCCc2nc(-c3ccc(CN4CCC(c5nnc(-c6ccccn6)[nH]5)CC4)cc3)c(-c3ccccc3)cc21
- COc1ccc(-c2nc3ccc(-c4cccc(N5CCOCC5)c4)nc3n2-c2ccc(C3(N)CCC3)cc2)cc1
- Nc1n[nH]c2ccc(-c3cncc(OCC(N)Cc4c[nH]c5ccccc45)c3)cc12
- NC1(c2ccc(-c3nc4ccc(O)cn4c3-c3ccccc3)cc2)CCC1
- C=Cc1cnc2cc(-c3ccccc3)c(-c3ccc(C4(N)CCC4)cc3)nn12
- O=C(Cc1nc(N2CCOCC2)cc(=O)[nH]1)N1CCc2c(Br)cccc21
- NC(=O)Nc1ccc2c(c1)C(=Cc1cc(-c3cccnc3)c[nH]1)C(=O)N2
- CNc1cc(C=Cc2cncc(OCC(N)Cc3c[nH]c4ccccc34)c2)ccn1
- COC(=O)c1ccc2nc(-c3ccc(C4(N)CCC4)cc3)c(-c3ccccc3)n2c1
- NC1(c2ccc(-c3nc4ccc(-c5ccncc5)cn4c3-c3ccccc3)cc2)CCC1
- CC(C)NCC(C(=O)N1CCN(c2ncnc3c2C(C)CC3O)CC1)c1ccc(Cl)cc1
- CC(C)(N)c1ccc(-n2c(-c3cccnc3N)nc3ccc(-c4ccccc4)nc32)cc1
- NC1(Cc2ccccc2OC(F)(F)F)CCN(c2ncnc3[nH]ccc23)CC1
- O=C(Cc1nc(-c2ccncc2)cc(=O)[nH]1)N1CCc2c(F)cccc21
- Cl.Cn1ncc(C(=O)O)c1-c1ccc(C(=O)NC2CNCCC2c2cccc(F)c2)s1
- c1ccc(-c2cn3nc(C4CC4)nc3nc2-c2ccc(CN3CCC(c4cnc5ccccc5n4)CC3)cc2)cc1
- O=C(Nc1ccc(-c2nnc3n2-c2cccnc2Nc2ccccc2-3)cc1)c1ccccc1F
- CC(n1cnnc1-c1nc(NC(=O)c2cc(-n3cnc(C4CC4)c3)c(N3CC(C(F)F)C3)cn2)cs1)C(F)(F)F
- Nc1cc(N2CCC(c3nc(-c4cccc(F)c4)cn3CCN3CCCC3)CC2)ncn1
- Cc1cc(F)ccc1S(=O)(=O)NCC(O)CN1CCCC2(CCN(c3ncnc(N)c3C3CC3)C2)C1
- CS(=O)(=O)OCc1ccc(-c2nnc3n2-c2cccnc2Nc2ccccc2-3)cc1
- NC1(C(=O)NC(CCCN2CCCC2)c2ccc(Cl)cc2)CCN(c2ncnc3[nH]ccc23)CC1
- O=C(Cc1nc(N2CCOCC2)cc(=O)[nH]1)N1c2ccccc2CC1c1ccccc1
- CC(C)CC(=O)Nc1ccc(-c2nnc3n2-c2cccnc2Nc2ccccc2-3)cc1
- CC1CC(O)c2ncnc(N3CCN(C(=O)C(CNCCF)c4ccc(Cl)cc4)CC3)c21
- Cc1nc(N)nc2c1cc(-c1cnc3ccccc3c1)c(=O)n2C1CCC(OCC(N)=O)CC1
- O=C(Cc1nc(N2CCOCC2)cc(=O)[nH]1)N1CCc2c(-c3ccccc3F)cccc21
- CCn1c(-c2nonc2N)nc2c(C#CC(C)(C)O)nc(OCC(N)c3ccccc3)cc21
- CSc1ncc2cc(-c3ccccc3)c(-c3ccc(CNCCC(=O)c4ccccc4N)cc3)nc2n1
- NC1(c2ccc(-c3nc4cc(-n5cccn5)ccn4c3-c3ccccc3)cc2)CCC1
- Clc1ccc2nc(CCNCc3ccc(-c4nnc5n4-c4cccnc4Nc4ccccc4-5)cc3)[nH]c2c1
- O=C(N1CCN(c2ncnc3[nH]ccc23)CC1)C1(c2ccc(Cl)cc2)CCNCC1
- CC1Cc2c(O)cccc2N1C(=O)Cc1nc(N2CCOCC2)cc(=O)[nH]1
- CCn1c(-c2nonc2N)nc2c(C#CC(C)(C)O)nc(CN(C)C)cc21
- Cc1c(NCCN(C)C)cc(Cl)cc1N1CCN(c2ncnc3[nH]nc(Br)c23)CC1
- NC(=O)Nc1ccc2c(c1)C(=Cc1cc(-c3cccc(C(N)=O)c3)c[nH]1)C(=O)N2
- NC1(c2ccc(-c3nc4c5cccc(-c6cn[nH]c6)c5nn4cc3-c3ccccc3)cc2)CCC1
- CC1Cc2c(ccc(F)c2F)N1C(=O)Cc1nc(N2CCOCC2)cc(=O)n1C
- O=C(NCc1ccc(-c2nnc3n2-c2cccnc2Nc2ccccc2-3)cc1)c1cccnc1
- CCc1c(N)ncnc1N1CCC(c2nc(-c3ccc(C(F)(F)F)c(F)c3)cn2CCN2CCC(F)(F)CC2)CC1
- Oc1nc2cc(NC(COc3cncc(-c4ccc5[nH]c(O)nc5c4)c3)Cc3c[nH]c4ccccc34)ccc2[nH]1
- O=C(Cc1nc(N2CCOCC2)cc(=O)[nH]1)N1c2ccccc2CC1C1CC1
- CSc1ncc2cc(-c3ccccc3)c(-c3ccc(CNCCc4n[nH]c(-c5ccccc5)n4)cc3)nc2n1
- CC1CC(O)c2ncnc(N3CCN(C(=O)C(CNC(C)(C)CO)c4ccc(Cl)cc4)CC3)c21
- CCc1cn2c(-c3ccccc3)c(-c3ccc(C4(N)CCC4)cc3)nc2c(OC)n1
- CC(=C1C(=O)Nc2ccc(NC(N)=O)cc21)c1cc(CNC(=O)C2CNCCN2)c[nH]1
- NC1(c2ccc(-c3nc4n(c3-c3ccncc3)COc3cccc(F)c3-4)cc2)CC(O)(C2CC2)C1
- CCN(CC)CCNC(=O)c1ccc2nc(-c3ccc(CN4CCC(n5c(=O)[nH]c6ccccc65)CC4)cc3)c(-c3ccccc3)nc2c1
- Cn1c(CC(=O)Nc2cccc3sccc23)nc(N2CCOCC2)cc1=O
- CN(C)C1CCN(c2ncc3cc(-c4ccccc4)c(-c4ccc(CN5CCC(c6nnc(-c7ccccn7)[nH]6)CC5)cc4)nc3n2)CC1
- Oc1nnc2c3cc(-c4ccccc4)c(-c4ccc(CN5CCC(c6n[nH]c(-c7ccccn7)n6)CC5)cc4)nc3ccn12
- CC(C)c1ccc(CC(CN)C(=O)N2CCN(c3ncnc4[nH]ccc34)CC2)cc1
- COc1ccc(CC(CN)NC(=O)c2cc(Br)c(-c3ccnc4[nH]ccc34)s2)cc1
- CCn1c(-c2nonc2N)nc2c(C#CC(C)(C)O)ncc(OCCCNCCc3ccc(OC)cc3)c21
- Cc1c[nH]c2ncnc(N3CCC(N)C3)c12
- Cc1n[nH]c2ccc(-c3cc(OCC(N)Cc4ccccc4)cnc3-c3ccsc3)cc12
- Nc1ncnc2nc(-c3ccc(CN4CCC(n5cnc6c(N)ncnc65)CC4)cc3)c(-c3ccccc3)cc12
- Nc1ccc(-c2nnc(C3CCN(Cc4ccc(-c5nc6nc(N7CCN(CCO)CC7)ccc6nc5-c5ccccc5)cc4)CC3)[nH]2)cn1
- Nc1ncccc1-c1nc2cc(-c3cccnc3)cnc2n1-c1ccc(CNC(=O)c2cccc(F)c2)cc1
- Cn1cc(-c2ccc(F)c(C(F)(F)F)c2)nc1C1CCN(c2ncnc(N)c2-c2cccc(C#N)c2)CC1
- NC1(c2ccc(-c3nc4cc(-c5cnc[nH]5)ccn4c3-c3ccccc3)cc2)CCC1
- COC(=O)c1cnn2cc(-c3c(F)cccc3F)c(-c3ccc(CN4CCC(c5n[nH]c(-c6ccccn6)n5)CC4)cc3)nc12
- NC1(c2ccc(-c3nc4ncc(-c5ccccc5)cn4c3-c3ccccc3)cc2)CCC1
- CC(C)(Cc1ccco1)C1C(=O)Nc2ccc(-c3cncc(OCC(N)Cc4c[nH]c5ccccc45)c3)cc21
- NC1(C(=O)NCc2ccc(OC(F)(F)F)cc2)CCN(c2ncnc3[nH]ccc23)CC1
- NC1(c2ccc(-c3nc4n(c3-c3ccccc3)COc3cnccc3-4)cc2)CCC1
- Nc1ncccc1-c1nc2cccnc2n1-c1ccc(CNC(=O)C2CCCCC2)cc1
- NC(COc1cncc(-c2ccc3cnccc3c2)c1)Cc1ccc2ccccc2c1
- NC1(c2ccc(-c3nc4nc(C5CC5)ccn4c3-c3ccccc3)cc2)CCC1
- COc1cc(Cl)cn2c(-c3ccccc3)c(-c3ccc(C4(N)CCC4)cc3)nc12
- Cc1n[nH]c2ccc(-c3cncc(OCC(N)Cc4ccc(Br)c(F)c4)c3)cc12
- Cc1cc(-c2cn(CC3CNC3)c(C3CCN(c4ncnc(N)c4OC(C)C)CC3)n2)ccc1F
- CSc1ncc2cc(-c3ccccc3)c(-c3ccc(CNCCC(N)=O)cc3)nc2n1
- Cc1cccc(C)c1S(=O)(=O)N1CCCC1C(O)CN1CCCC2(CCN(c3ncnc(N)c3C3CC3)C2)C1
- Cc1ccc2[nH]c(C3CCN(Cc4ccc(-c5ncc(-c6nn[nH]n6)cc5-c5ccccc5)cc4)CC3)nc2c1
- CC1Cc2c(F)cccc2N1C(=O)Cc1nc(N2CCOCC2)c(F)c(=O)[nH]1
- CC(C)(C)NC(c1ccc(Cl)cc1)c1ccc(-c2ccncc2)cc1
- CC1Cc2ccccc2N1C(=O)Cc1nc(N2CCOCC2)cc(=S)n1C
- NC(CNc1cnc(-c2ccc3cnccc3c2)s1)Cc1ccc(C(F)(F)F)cc1
- O=C1N=CC=C2N=C(c3ccc(CN4CCC(c5n[nH]c(-c6ccccn6)n5)CC4)cc3)C(c3ccccc3)=CC12
- Cc1nc2nc(-c3ccc(CN4CCC(c5nc6ccc(C#N)cc6[nH]5)CC4)cc3)c(-c3ccccc3)cn2n1
- CC(C)NCC(C(=O)N1CCN(c2ncnc3c2C(C)CC3F)CC1)c1ccc(Cl)cc1
- NC1(c2ccc(-c3nc4ccc(C(=O)NCC5CC5)cn4c3-c3ccccc3)cc2)CCC1
- Brc1cnc2nc(-c3ccc(CN4CC(c5n[nH]c(-c6ccccn6)n5)C4)cc3)c(-c3ccccc3)cn12
- CCc1c(N)ncnc1N1CCC(c2nc(-c3ccnc(C(F)(F)F)c3)cn2CCNC)CC1
- C#Cc1ncc(OCC(N)Cc2c[nH]c3ccccc23)cc1-c1ccc2cnccc2c1
- NC(=O)c1c(N)ncnc1N1CCC(c2nc(-c3ccc(F)c(C(F)(F)F)c3)cn2CCN2CCC2)CC1
- CCc1c(N)ncnc1N1CCC(c2nc(-c3ccc(F)c(F)c3)cn2CCN2CCCC2)CC1
- [C-]#[N+]COc1ccn2c(-c3ccccc3)c(-c3ccc(C4(N)CCC4)cc3)nc2c1
- Cc1noc(-c2cc(NCCN3CCCC3)c(C)c(N3CCN(c4ncnc5[nH]nc(Br)c45)CC3)c2)n1
- CC(C)NCC(Cc1ccc(Cl)c(F)c1)C(=O)N1CCN(c2ncnc3c2C(C)OC3)CC1
- Fc1ccc2c(c1)C1(CCNCC1)CN2c1ncnc2[nH]ccc12
- Cc1n[nH]c2ccc(-c3nnc(NCC(N)Cc4cccc(C(F)(F)F)c4)s3)cc12
- Cc1cccc(-c2nc(C3CN(Cc4ccc(-c5nc6nc(C(C)C)nn6cc5-c5ccccc5)cc4)C3)n[nH]2)n1
- NC1(C(=O)NC(CCCN2CCOCC2)c2ccc(Cl)cc2)CCN(c2ncnc3[nH]ccc23)CC1
- CC(n1cnnc1-c1nc(NC(=O)c2cc(-n3cnc(C4CC4)c3)c(N3CC4CC4C3)cn2)cs1)C(F)(F)F
- CCN(CC)CCNC(=O)c1ccc2nc(-c3ccccc3)c(-c3ccc(CN4CCC(n5c(=O)[nH]c6ccccc65)CC4)cc3)nc2c1
- Cc1cc(-c2cn(CCO)c(C3CCN(c4ncnc(N)c4C(N)=O)CC3)n2)ccc1F
- CCn1c(-c2nonc2N)nc2c(C#CC(C)(C)O)ncc(OCC3CCCNC3)c21
- Cc1n[nH]c2ccc(-c3cncc(OCC(N)Cc4csc5ccccc45)c3)cc12
- Cc1ccc(S(=O)(=O)NCC(O)CN2CCCC3(CCN(c4ncnc(N)c4C4CC4)C3)C2)c(C)c1
- NC1(C(=O)NC(c2ccc(Cl)cc2)C2CC2)CCN(c2ncnc3[nH]ccc23)CC1
- NC(CNC(=O)c1cc(Br)c(-c2ccnc3[nH]ccc23)s1)c1ccccc1
- CON(C)C(=O)c1ccn2c(-c3ccccc3)c(-c3ccc(C4(N)CCC4)cc3)nc2c1
- c1ccc(-c2cn3nc(C4CC4)nc3nc2-c2ccc(CN3CC(c4n[nH]c(-c5ccccn5)n4)C3)cc2)cc1
- CC(NC(=O)C1(N)CCN(c2ncnc3[nH]ccc23)CC1)c1ccc(Cl)cc1
- N#Cc1ccc(CC(N)C(=O)N2CCN(c3ncnc4ccccc34)CC2)cc1
- O=C(NCCc1cccs1)NC1CCN(Cc2ccc(-c3nnc4n3-c3cccnc3Nc3ccccc3-4)cc2)CC1
- COC(=O)c1c(C)nc(NNC(=O)c2cccc3c(=O)c4ccccc4[nH]c23)nc1-c1ccccc1
- COc1cc(-c2ccc3c(c2)nn2cc(-c4ccccc4)c(-c4ccc(C5(N)CCC5)cc4)nc32)ccc1F
- Cc1cc(-c2cncc(OCC(N)Cc3c[nH]c4ccccc34)c2)cc2ccncc12
- NCC(C(=O)Nc1ccc(-c2ccnc3[nH]ccc23)s1)c1ccccc1
- Cc1n[nH]c2ccc(-c3cncc(SCC(N)Cc4c[nH]c5ccccc45)c3)cc12
- NC1(CNC(=O)c2ccc(F)cc2F)CCN(c2ncnc3[nH]cc(Cl)c23)C1
- COc1nn2cc(-c3c(F)cccc3F)c(-c3ccc(CN4CCC(c5n[nH]c(-c6ccccn6)n5)CC4)cc3)nc2c1CO
- Nc1ncnc(N2CCC(c3nc(-c4ccc(C(F)(F)F)c(F)c4)cn3CCN3CCCC3)CC2)c1C1CCC1
- CC1CC(O)c2ncnc(N3CCN(C(=O)C(CNCC4CC4)c4ccc(Cl)cc4)CC3)c21
- CC1CC(O)c2ncnc(N3CCN(C(=O)C(CNC4CCOCC4)c4ccc(Cl)cc4)CC3)c21
- Cn1c(CC(=O)N2CCc3ccc(F)cc32)nc(N2CCOCC2)cc1=O
- O=C(Cc1nc(N2CCOCC2)cc(=O)[nH]1)N1CCc2cc(F)c(F)cc21
- Nc1ncnc2c1cnn2C1CCN(Cc2ccc(-c3nc4ccnn4cc3-c3ccccc3)cc2)CC1
- NC1(c2ccc(-c3nc4ccc(-c5ccncc5)cn4c3-c3ccccc3)cc2)CCC1
- CNC(=O)c1ccc2nc(-c3ccc(C4(N)CCC4)cc3)c(-c3ccccc3)n2c1
- Nc1ncccc1-c1nc2ccc(-c3cccnc3)nc2n1-c1ccc(CNC(=O)c2ccccc2)cc1
- Cc1n[nH]c2ccc(-c3cc(OCC(N)Cc4c[nH]c5ccncc45)cnc3-c3ccoc3)cc12
- Nc1ncnc(N2CCC(c3nc(-c4ccc(F)c(C(F)(F)F)c4)cn3CCN3CCC3)CC2)c1N
- CCn1c(-c2nonc2N)nc2cncc(OCC3CCNCC3)c21
- NC(=O)Nc1ccc2c(c1)C(=Cc1cc(C(=O)O)c[nH]1)C(=O)N2
- NCC(NC(=O)c1cc(Br)c(-c2ccnc3[nH]ccc23)s1)c1ccccc1
- Cc1nc(-c2ccoc2)c(-c2cnc3[nH]nc(C)c3n2)cc1OCC(N)Cc1c[nH]c2ccccc12
- NC1(c2ccc(-c3nc4cc(-c5ccccn5)ccn4c3-c3ccccc3)cc2)CCC1
- NC1(c2ccc(-c3nn4c(-c5ccc(F)c(CO)c5)cnc4cc3-c3ccccc3)cc2)CCC1
- CNC1CC2OC(C)(C1OC)n1c3ccccc3c3c4c(c5c6ccccc6n2c5c31)C(=O)NC4
- CC(=N)n1c(=N)ccc2nc(-c3ccc(C4(N)CCC4)cc3)c(-c3ccccc3)cc21
- NC1(c2ccc(-c3nc4nc(-c5ccccc5)ccn4c3-c3ccccc3)cc2)CCC1
- NC1CCN(c2ncnc3[nH]ccc23)CC1
- NC1(c2ccc(-c3nc4cc(CO)ccn4c3-c3ccccc3)cc2)CCC1
- CC(NC(=O)C1(N)CCN(c2ncnc3[nH]ccc23)CC1)c1ccc(S(C)(=O)=O)cc1
- Cc1cc(-c2cn(CCN(C)C(C)C)c(C3CCN(c4ncnc(N)c4C#N)CC3)n2)ccc1F
- Cn1ncc(Cl)c1-c1cc(C(=O)NC2CNC(CC(=O)NCC(O)CO)CC2c2ccc(Cl)c(Cl)c2)oc1Cl
- NC(CNc1nnc(-c2ccc3[nH]nc(C4CC4)c3c2)s1)Cc1ccc(C(F)(F)F)cc1
- CNc1nccc(-c2ccc(C(=O)NC(CN)Cc3ccc(Cl)cc3Cl)s2)n1
- CC1Cc2c(Br)cccc2N1C(=O)Cc1nc(N2CCOCC2)c(F)c(=O)[nH]1
- Cc1cc(F)ccc1CC(N)COc1cncc(-c2ccc3[nH]nc(C)c3c2)c1
- CCn1c(-c2nonc2N)nc2c(C#CC(C)(C)O)nc(OCCCCN)cc21
- Cc1ccc(C(=O)Nc2ccc(-c3nnc4n3-c3cccnc3Nc3ccccc3-4)cc2)cc1
- Cc1n[nH]c2cnc(-c3cncc(OCC(N)Cc4cccc(F)c4F)c3)cc12
- CCN(CC)C(=O)C1CCN(c2cccc(-c3ccc4nc(-c5cccnc5N)n(-c5ccc(C6(N)CCC6)cc5)c4n3)c2)CC1
- NCCn1cc(-c2ccc(F)c(C(F)(F)F)c2)nc1C1CCN(c2ncnc(N)c2C(N)=O)CC1
- O=C(NC(c1ccc(Cl)c(Cl)c1)C1CNC1)c1ccc2cnccc2c1
- Nc1ncccc1-c1nc2cccnc2n1-c1ccc(CNCc2ccccc2)cc1
- CC1(O)CC(N)(c2ccc(-c3nc4n(c3-c3ccccc3)COc3ccc(-c5ccn[nH]5)cc3-4)cc2)C1
- CC1CC(O)c2ncnc(N3CCN(C(=O)C(c4ccc(Cl)cc4)C4CCCN4)CC3)c21
- CNCCn1cc(-c2ccc(F)c(C(F)(F)F)c2)nc1C1CCN(c2ncnc(N)c2C#N)CC1
- CN(C)CCC(NC(=O)C1(N)CCN(c2ncnc3[nH]ccc23)CC1)c1ccc(Cl)cc1
- NC(COc1cncc(-c2ccc3[nH]nc(C(F)(F)F)c3c2)c1)Cc1c[nH]c2ccccc12
- c1ccc(-c2cc3cnc(N4CCOCC4)nc3nc2-c2ccc(CN3CCC(c4nnc(-c5ccccn5)[nH]4)CC3)cc2)cc1
- O=C(Cc1nc(N2CCOCC2)cc(=O)[nH]1)N1CCc2c(-c3ccccn3)cccc21
- CC1(O)CC(N)(c2ccc(-c3nc4n(c3-c3ccccc3)COc3ccccc3-4)cc2)C1
- Cn1nnnc1-c1cnc(-c2ccc(CN3CCC(n4c(=O)[nH]c5ccccc54)CC3)cc2)c(-c2ccccc2)c1
- CC1SCc2ncnc(N3CCN(C(=O)C(Cc4ccc(Cl)c(F)c4)CC4(N)CC4)CC3)c21
- COc1ccc(CC(CN)NC(=O)c2cc(Br)c(-c3ccnc4[nH]ccc34)s2)cc1
- Nc1ncccc1-c1nc2cc(-c3ccccc3)cnc2n1-c1ccc(CNC(=O)c2ccccc2)cc1
- Cn1ncc(Cl)c1-c1cc(C(=O)NC(CN)Cc2cccc(F)c2)oc1Cl
- N#Cc1c(N)ncnc1N1CCC(c2nc(-c3ccc(F)c(C(F)(F)F)c3)cn2C2CCNCC2)CC1
- NC1(c2ccc(-c3nn4c(-c5ccc(S(N)(=O)=O)cc5)cnc4cc3-c3ccccc3)cc2)CCC1
- CCn1c(-c2nonc2N)nc2c(C#CC(C)(C)O)nc(OCCN)cc21
- Nc1ncnc(N2CCC(c3nc(-c4ccc(Cl)cc4)cn3CCN3CCCC3)CC2)c1C1CCC1
- NCC(CC1CCCCC1)NC(=O)c1cc(Br)c(-c2ccnc3[nH]ccc23)s1
- NC1(c2ccc(-c3nn4c(-c5ccc(F)cc5)c(Cl)nc4cc3-c3ccccc3)cc2)CCC1
- Cc1nc2nc(-c3ccc(CN4CCC(c5nc6cccnc6[nH]5)CC4)cc3)c(-c3ccccc3)cn2n1
- CC1CC(O)c2ncnc(N3CCN(C(=O)C(CNCC4CC4)c4ccc(C(F)(F)F)c(F)c4)CC3)c21
- Cc1n[nH]c2ccc(-c3nnc(NCC(N)Cc4cccc(Cl)c4)s3)cc12
- NC(Cc1ccccc1)C(=O)N1CCN(c2ncnc3ccccc23)CC1
- NC1(c2ccc(-c3nc4ccc(C(=O)NC5CC5)cn4c3-c3ccccc3)cc2)CCC1
- Fc1ccc(-c2cn3c(Cl)cnc3nc2-c2ccc(CN3CCC(c4n[nH]c(-c5ccccn5)n4)CC3)cc2)cc1
- Cc1n[nH]c2ccc(-c3cncc(OCC(N)Cc4ccc(F)cc4F)c3)cc12
- NC(Cc1ccc(Cl)cc1)C(=O)N1CCN(c2ncnc3ccccc23)CC1
- CCC(=O)N1CCN(c2cccc(-c3ccc4nc(-c5cccnc5N)n(-c5ccc(C6(N)CCC6)cc5)c4n3)c2)CC1
- NCC(O)(c1ccc(Cl)cc1)c1ccc(-c2cn[nH]c2)cc1
- Nc1ccc2nc(-c3ccc(C4(N)CCC4)cc3)c(-c3ccccc3)n2c1
- CC(n1cnnc1-c1nc(NC(=O)c2cc(-n3cnc(C4CC4)c3)c(N3CCN(C4CC4)CC3)cn2)cs1)C(F)(F)F
- Cc1c(NCCN2CCCC2)cc(C(=O)CCC(F)(F)F)cc1N1CCN(c2ncnc3[nH]nc(Br)c23)CC1
- CC1Cc2ccccc2N1C(=O)Cc1nc(N2CCOCC2)c(Br)c(=O)[nH]1
- NC1(c2ccc(-c3nc4ccc(F)cn4c3-c3ccccc3)cc2)CCC1
- COc1cc(-c2cnc[nH]2)cn2c(-c3ccccc3)c(-c3ccc(C4(N)CCC4)cc3)nc12
- NC1(c2ccc(-c3nc4ccc(Cl)cn4c3-c3ccccc3)cc2)CCC1
- CC1(O)CC(N)(c2ccc(-c3nc4n(c3-c3ccccc3)COc3cc(F)ccc3-4)cc2)C1
- Cl.Cn1nccc1-c1ccc(C(=O)NC2CNCCC2c2ccc(Cl)cc2)s1
- NC1(c2ccc(-c3nc4c(-c5ccc(F)cc5)cccn4c3-c3ccccc3)cc2)CCC1
- CC1(O)CC(N)(c2ccc(-c3nc4n(c3-c3ccsc3)COc3cccc(F)c3-4)cc2)C1
- Nc1ncnc(N2CCC(c3nc(-c4ccc(F)cc4)cn3CCN3CCCC3)CC2)c1Cl
- Nc1ncnc(N2CCC(c3nc(-c4ccc(F)c(C(F)(F)F)c4)cn3CCN3CCCC3)CC2)c1-c1ccc(C(=O)O)cc1
- Cc1n[nH]c2ccc(-c3cc(OCC(N)Cc4ccccc4)cnc3-c3ccoc3)cc12
- COc1ccc(-c2cc3c(C)nc(N)nc3n(C3CCC(OCC(N)=O)CC3)c2=O)cn1
- Cl.NCc1ccc(-n2c(-c3cccnc3N)nc3ccc(-c4ccccc4)nc32)cc1
- CC1Cc2cc(F)c(F)cc2N1C(=O)Cc1nc(N2CCOCC2)c(F)c(=O)n1C
- NC1(c2ccc(-n3c(-c4ccc(Cl)cc4)nc4ccc(-c5cccc(N6CCOCC6)c5)nc43)cc2)CCC1
- CC(C)NCCn1cc(-c2ccc(F)c(C(F)(F)F)c2)nc1C1CCN(c2ncnc(N)c2C#N)CC1
- CNC(=O)C1CCN(c2nc(N3CCOCC3C)c3ccc(-c4ccc(N)nc4)nc3n2)CC1
- COc1ccn2c(-c3ccccc3)c(-c3ccc(C4(N)CCC4)cc3)nc2c1
- Cn1nccc1-c1coc(C(=O)NC2CNCCC2c2ccc(F)c(F)c2)c1.O=C(O)C(O)C(O)C(=O)O
1267
- c1ccc(-c2nnc(C3CCN(Cc4ccc(-c5nnc6n5-c5ccccc5Nc5ccccc5-6)cc4)CC3)[nH]2)nc1
1268
- NCc1cccc(-c2c[nH]c(C=C3C(=O)Nc4ccc(NC(N)=O)cc43)c2)c1
1269
- Cc1occc1-c1nc(N)c(OCC(N)Cc2ccccc2)cc1-c1cnc2[nH]nc(C)c2n1
1270
- COC(=O)c1cnn2cc(-c3ccc(F)cc3F)c(-c3ccc(CN4CC(c5n[nH]c(-c6ccccn6)n5)C4)cc3)nc12
1271
- NC1(c2ccc(-c3nc4c(C5CC5)cccn4c3-c3ccccc3)cc2)CCC1
1272
- Clc1ccc(C(NC2CCCCC2)c2ccc(-c3ncnc4[nH]cnc34)cc2)cc1
1273
- CC(C)Oc1c(N)ncnc1N1CCC(c2nc(-c3ccc(F)c(C(F)(F)F)c3)cn2CCN2CCC2)CC1
1274
- Cn1c(CC(=O)N2CCc3c2cccc3C(F)(F)F)nc(N2CCOCC2)cc1=O
1275
- Cc1ncc(-c2c(N)ncnc2N2CCC(c3nc(-c4ccc(F)c(C(F)(F)F)c4)cn3C)CC2)s1
1276
- CC(=O)Nc1ccc(-c2ccc3nc(-c4cccnc4N)n(-c4ccc(CN)cc4)c3n2)cc1.Cl
1277
- NC(Cc1ccc(Cl)cc1)C(=O)N1CCN(c2ncnc3cc(F)ccc23)CC1
1278
- NC(COc1cc(C=Cc2ccncc2)cnc1Cl)Cc1c[nH]c2ccccc12
1279
- Nc1ncccc1-c1nc2ccc(-c3cccnc3)nc2n1-c1ccc(CC(=O)Nc2ccccc2)cc1
1280
- Cn1c(CC(=O)Nc2ccc(F)c(F)c2)nc(N2CCOCC2)cc1=O
1281
- CS(=O)(=O)N1CCN(c2ncc3cc(-c4ccccc4)c(-c4ccc(CN5CCC(c6nnc(-c7ccccn7)[nH]6)CC5)cc4)nc3n2)CC1
1282
- COc1cccc2c1CCN2C(=O)Cc1nc(N2CCOCC2)cc(=O)n1C
1283
- NC1(C(=O)NC(CO)c2ccc(Cl)cc2)CCN(c2ncnc3[nH]ccc23)CC1
1284
- CCNC(=O)c1ccc2c(c1)-c1nc(-c3ccc(C4(N)CC(C)(O)C4)cc3)c(-c3ccccc3)n1CO2
1285
- CC(C)NCC(C(=O)N1CCN(c2ncnc3c2C(C)CC3O)CC1)c1ccc(Cl)cc1
1286
- CC(C)n1cc(C(=O)c2cncc(NC3CNCC3c3ccc(F)cc3)n2)c2c(N)ncnc21
1287
- CC1CN(c2cccc(-c3ccc4nc(-c5cccnc5N)n(-c5ccc(C6(N)CCC6)cc5)c4n3)c2)CC(C)O1
1288
- NCC(Cc1ccc(F)cc1)NC(=O)c1cc(Br)c(-c2ccnc3[nH]ccc23)s1
1289
- Nc1ncnc(N2CCC(c3nc(-c4ccc(F)c(C(F)(F)F)c4)cn3CCN3CCCC3)CC2)c1C(=O)O
1290
- Nc1ncnc(N2CCC(c3nc(-c4ccnc(C(F)(F)F)c4)cn3CCN3CCC3)CC2)c1OCC(F)(F)F
1291
- CC(=O)Nc1ccc(-c2ccc3nc(-c4cccnc4N)n(-c4ccc(CN)cc4)c3n2)cc1.Cl
1292
- COCc1ccc2nc(-c3ccc(C4(N)CCC4)cc3)c(-c3ccccc3)n2c1
1293
- O=S(=O)(NCCNCC=Cc1ccc(Br)cc1)c1cccc2cnccc12
1294
- NCC(CC1CCCCC1)NC(=O)c1cc(Br)c(-c2ccnc3[nH]ccc23)s1
1295
- CNS(=O)(=O)c1ccc2ncnc(N3CCN(C(=O)C(N)Cc4ccc(Cl)cc4)CC3)c2c1
1296
- CC(=O)OCC1OC(NC(=O)CCCn2[se]c3ccccc3c2=O)C(OC(C)=O)C(OC(C)=O)C1OC(C)=O
1297
- Cc1occc1-c1nc(N)c(OCC(N)Cc2ccccc2)cc1-c1cnc2[nH]nc(C)c2n1
1298
- C=CCC1CC(c2ccc(Cl)c(Cl)c2)C(NC(=O)c2cc(-c3c(Cl)cnn3C)c(Cl)o2)CN1
1299
- CCn1c(-c2nonc2N)nc2c(C#CC(C)(C)O)nc(OCC(N)Cc3ccccc3)cc21
1300
- Nc1ncccc1-c1nc2cccnc2n1-c1ccc(CC(=O)Nc2ccccc2)cc1
1301
- CCc1c(N)ncnc1N1CCC(c2nc(-c3ccc(F)c(C)c3)cn2CCN(CC)CC)CC1
1302
- CCN(CC)CCn1cc(-c2ccc(F)c(C)c2)nc1C1CCN(c2ncnc(N)c2C#N)CC1
1303
- CN(C)CCCC(NC(=O)C1(N)CCN(c2ncnc3[nH]ccc23)CC1)c1ccc(Cl)cc1
1304
- CN(C)C(=O)COc1ccn2c(-c3ccccc3)c(-c3ccc(C4(N)CCC4)cc3)nc2c1
1305
- CCn1c(-c2nonc2N)nc2c(C#CC(C)(C)O)nc(OCC(N)Cc3ccccc3)cc21
1306
- O=C(Nc1csc(-c2nncn2C2CC2)n1)c1cc(-n2cnc(C3CC3)c2)c(N2CCC3(CC2)COC3)cn1
1307
- CN(C)c1n[nH]c2ccc(-c3cncc(OCC(N)Cc4c[nH]c5ccccc45)c3)cc12
1308
- N#Cc1cc(-c2ccccc2)c(-c2ccc(CN3CCC(n4c(=O)[nH]c5ccccc54)CC3)cc2)nc1N
1309
- COCCNCC(C(=O)N1CCN(c2ncnc3c2C(C)CC3O)CC1)c1ccc(Cl)cc1
1310
- CCOC1CN(c2cc(N)ncn2)CCC1c1nc(-c2ccc(F)c(C)c2)cn1CCN(CC)C(C)C
1311
- NC1(c2ccc(-c3nn4c(-c5ccc(CO)cc5)cnc4cc3-c3ccccc3)cc2)CCC1
1312
- CC(C)C(=O)N1CCN(c2cccc(-c3ccc4nc(-c5cccnc5N)n(-c5ccc(C6(N)CCC6)cc5)c4n3)c2)CC1
1313
- CC(C)C1Cc2ccccc2N1C(=O)Cc1nc(N2CCOCC2)cc(=O)[nH]1
1314
- NCC(c1ccc(Cl)cc1)c1ccc(-c2cn[nH]c2)cc1
1315
- COC(=O)c1c(C)nc(NNC(=O)c2cccc3c(=O)c4ccccc4[nH]c23)nc1-c1ccccc1F
1316
- CC(C)c1nc2nc(-c3ccc(CN4CC(c5n[nH]c(-c6ccccn6)n5)C4)cc3)c(-c3ccccc3)cn2n1
1317
- Cc1n[nH]c2ccc(-c3nnc(NCC(N)Cc4ccccc4C(F)(F)F)s3)cc12
1318
- CC(NC(=O)C1(N)CCN(c2ncnc3[nH]ccc23)CC1)c1ccc(S(C)(=O)=O)cc1
1319
- CC(=O)Nc1cn2cc(-c3cnc(Cl)c(NS(=O)(=O)c4ccc(F)cc4)c3)ccc2n1
1320
- COCC1Cc2ccccc2N1C(=O)Cc1nc(N2CCOCC2)cc(=O)[nH]1
1321
- CC1CN(c2cc(=O)[nH]c(CC(=O)N3c4ccccc4CC3C)n2)CCO1
1322
- COc1cc(CC(N)C(=O)N2CCN(c3ncnc4ccccc34)CC2)cc(OC)c1
1323
- CCn1c(-c2nonc2N)nc2c(C#CC(C)(C)O)nc(OCC(C)N)cc21
1324
- Cc1cc(C)c(CC(N)COc2cncc(-c3cc4c(C)n[nH]c4cn3)c2)c(C)c1
1325
- O=S(=O)(Nc1cc(-c2ccc3nccn3n2)cnc1Cl)c1ccc(F)cc1
1326
- Cc1c[nH]c2ncnc(N3CCC(N)(CNC(=O)Cl)C3)c12
1327
- CCCC(NC(=O)C(CCCNC(=N)N)NC(=O)CN(CCCCCCN)C(=O)C(N)CCCNC(=N)N)C(=O)NC(Cc1ccc(O)cc1)C(=O)NC(CN)C(=O)NC(CCC(C)C)C(=O)N(CCCCN)CC(N)=O
1328
- Nc1nccnc1-c1nc2ccc(-c3ccccc3)nc2n1-c1ccc(C2(N)CCC2)cc1
1329
- COc1cc2c(cc1Nc1nc(Nc3cccc4c3C(=O)NC4)c3cc[nH]c3n1)N(C(=O)CN(C)C)CC2
1330
- CCOC(=O)c1cc(Br)c2nc(-c3ccc(C4(N)CCC4)cc3)c(-c3ccccc3)n2c1
1331
- Cc1c[nH]c2ncnc(N3CCC(NS(=O)(=O)c4ccccc4)C3)c12
1332
- COc1cc(NC(=O)Cc2nc(N3CCOCC3)cc(=O)[nH]2)ccc1F
1333
- O=C(N1CCN(c2ncnc3[nH]cc(Cl)c23)CC1)C1(c2ccc(Br)cc2)CCNCC1
1334
- Cc1nc(N)nc2c1cc(-c1cnn(C)c1)c(=O)n2C1CCC(OCC(N)=O)CC1
1335
- OC1CCN(c2ncc3cc(-c4ccccc4)c(-c4ccc(CN5CCC(c6nnc(-c7ccccn7)[nH]6)CC5)cc4)nc3n2)CC1
1336
- O=C(Cc1nc(N2CCOCC2)cc(=O)[nH]1)N1CCc2c(-c3cccnc3)cccc21
1337
- CCn1c(-c2nonc2N)nc2c(-c3ccc[nH]3)ncc(OCCCN)c21
1338
- O=C(Cc1nc(N2CCOCC2)cc(=O)[nH]1)Nc1cccc(I)c1
1339
- CC1(O)CC(N)(c2ccc(-c3nc4n(c3-c3ccccc3)COc3ccc(F)cc3-4)cc2)C1
1340
- CS(=O)(=O)Nc1ccc2nc(-c3ccc(C4(N)CCC4)cc3)c(-c3ccccc3)n2c1
1341
- COCCNCC(C(=O)N1CCN(c2ncnc3c2C(C)CC3O)CC1)c1ccc(Cl)cc1
1342
- N#Cc1cc(C(NC(=O)c2ccc3cnccc3c2)C2CCNCC2)ccc1Cl
1343
- Cc1c(-c2ccn[nH]2)cn2c(-c3ccccc3)c(-c3ccc(C4(N)CCC4)cc3)nc2c1C
1344
- [C-]#[N+]COc1ccc2nc(-c3ccc(C4(N)CCC4)cc3)c(-c3ccccc3)n2c1
1345
- CCCC1NC(=O)C(CCCNC(=N)N)NC(=O)CN(C(=O)C(N)CCCNC(=N)N)CCCCCCNC(=O)NCCCN(CC(N)=O)C(=O)C(CCC(C)C)NC(=O)C(CN)NC(=O)C(Cc2ccc(O)cc2)NC1=O
1346
- NC(CNc1ncc(-c2ccc3c(c2)CC(=O)N3)s1)Cc1ccc(C(F)(F)F)cc1
1347
- Cc1n[nH]c2ccc(-c3cncc(OCC(N)Cc4ccc(F)c(F)c4F)c3)cc12
1348
- O=C(Cc1nc(N2CCOCC2)cc(=O)[nH]1)Nc1ccc(F)cn1
1349
- NC(COc1cncc(-c2ccc3cnccc3c2)c1)Cc1c[nH]c2ccccc12
1350
- CCc1c(N)ncnc1N1CCC(c2nc(-c3ccc(F)cc3)cn2CCN2CC3(COC3)C2)CC1
1351
- COc1cc2ncc3c(N)nc(-c4cncc(OCC(N)CC(C)C)c4)cc3c2cc1OC
1352
- Nc1ncccc1-c1nc2cc(-c3cccnc3)cnc2n1-c1ccc(CNC(=O)c2ccccc2)cc1
1353
- CC(C)CNCCn1cc(-c2ccc(F)c(C(F)(F)F)c2)nc1C1CCN(c2ncnc(N)c2C(N)=O)CC1
1354
- CC1CN(c2ncc3cc(-c4ccccc4)c(-c4ccc(CN5CCC(c6nnc(-c7ccccn7)[nH]6)CC5)cc4)nc3n2)CCN1
1355
- O=C(Cc1nc(N2CCOCC2)cc(=O)[nH]1)Nc1ccc(F)cc1Br
1356
- NC1(c2ccc(-c3nc4ccc(-c5cnc[nH]5)cn4c3-c3ccccc3)cc2)CCC1
1357
- Nc1ncccc1-c1nc2ccc(-c3cccnc3)nc2n1-c1ccc(CNC(=O)c2cccc(F)c2)cc1
1358
- Cn1ncc(Cl)c1-c1oc(C(=O)NC2CNCCC2c2ccc(Cl)cc2)cc1Br
1359
- NC1(C(=O)NC(CCCN2CCCC2)c2ccc(Cl)cc2)CCN(c2ncnc3[nH]ccc23)CC1
1360
- Cn1cc(-c2ccc(F)c(C(F)(F)F)c2)nc1C1CCN(c2ncnc(N)c2-c2cncnc2)CC1
1361
- NC1(c2ccc(-c3nc4c(-c5ccn[nH]5)cccn4c3-c3ccccc3)cc2)CCC1
1362
- COc1cncc(-c2cccn3c(-c4ccccc4)c(-c4ccc(C5(N)CCC5)cc4)nc23)c1
1363
- N#Cc1cnc(-c2ccc(CN3CCC(n4c(=O)[nH]c5ccccc54)CC3)cc2)c(-c2ccccc2)c1
1364
- Cc1ccccc1-c1ccn2c(-c3ccccc3)c(-c3ccc(C4(N)CCC4)cc3)nc2n1
1365
- CCn1c(-c2nonc2N)nc2c(C#CC(C)(C)O)ncc(OCCCNCC(O)CO)c21
1366
- Nc1ncnc2nc(-c3ccc(CN4CCC(n5ncc6c(N)ncnc65)CC4)cc3)c(-c3ccccc3)cc12
1367
- NC1(c2ccc(-c3nc4c(-c5ccc6cn[nH]c6c5)cccn4c3-c3ccccc3)cc2)CCC1
1368
- NC(=O)c1cccc(-c2ccc3c(c2)nn2cc(-c4ccccc4)c(-c4ccc(C5(N)CCC5)cc4)nc32)c1
1369
- CN(C)C(=O)COc1ccn2c(-c3ccccc3)c(-c3ccc(C4(N)CCC4)cc3)nc2c1
1370
- COCCOc1cc(F)ccc1NC(=O)Cc1nc(N2CCOCC2)cc(=O)[nH]1
1371
- O=C(Cc1nc(N2CCOCC2)cc(=O)[nH]1)Nc1ccc(F)cc1
1372
- O=C(Cc1nc(N2CCOCC2)cc(=O)[nH]1)Nc1ccc(F)c(Cl)c1
1373
- CC1(O)CC(N)(c2ccc(-c3nc4n(c3-c3ccccc3)COc3cc(F)ccc3-4)cc2)C1
1374
- N#Cc1c(N)ncnc1N1CCC(c2nc(-c3cnc(F)c(Cl)c3)cn2CCN2CCC2)CC1
1375
- Cc1c[nH]c2ncnc(N3CCC(NC(=O)Nc4ccccc4)C3)c12
1376
- Nc1ccc(S(=O)(=O)Nc2nncs2)cc1
1377
- c1ccc(-c2cn3nccc3nc2-c2ccc(CN3CCC(c4nc5cccnc5[nH]4)CC3)cc2)cc1
1378
- Cn1ncc2ccc(-c3cnc(NCC(N)Cc4ccc(C(F)(F)F)cc4)s3)cc21
1379
- NC(Cc1ccc(Cl)cc1)C(=O)N1CCN(c2ncnc3c2CS(=O)(=O)C3)CC1
1380
- CC1(O)CC(N)(c2ccc(-c3nc4n(c3-c3ccccc3)COc3ccc(CO)cc3-4)cc2)C1
1381
- Cn1c(CC(=O)N2CCc3c(O)cccc32)nc(N2CCOCC2)cc1=O
1382
- Cn1ncc(Cl)c1-c1cc(C(=O)NC2CNCCC2c2ccc(F)c(F)c2)oc1Cl.O=C(O)C(O)C(O)C(=O)O
1383
- Cl.Cn1ncc(Cl)c1-c1ccc(C(=O)NC2CNCCC2c2cccc(F)c2)s1
1384
- O=C1CC2OCC3=C(C(=O)c4ccccc4C3=O)C2O1
1385
- COc1ccccc1NC(=O)NC1CCN(Cc2ccc(-c3nnc4n3-c3cccnc3Nc3ccccc3-4)cc2)CC1
1386
- NC1(c2ccc(-c3nc4ccc(-c5ncc[nH]5)cn4c3-c3ccccc3)cc2)CCC1
1387
- Nc1ncnc(N2CCC(c3nc(-c4ccc(F)c(F)c4)cn3CCN3CCCC3)CC2)c1C1CCC1
1388
- COc1ccc(C2(C(=O)N3CCN(c4ncnc5[nH]cc(Br)c45)CC3)CCNCC2)cc1
1389
- CS(=O)(=O)c1ccc(-c2cnc3cc(-c4ccccc4)c(-c4ccc(C5(N)CCC5)cc4)nn23)cc1
1390
- Cc1nc(N)nc2c1cc(-c1cnc3ccccc3c1)c(=O)n2C1CCC(OCO)CC1
1391
- CN(C)CCn1cc(-c2ccc(F)c(C(F)(F)F)c2)nc1C1CCN(c2ncnc(N)c2-c2ccc(C(=O)O)cc2)CC1
1392
- COc1ccc(C2(C(=O)N3CCN(c4ncnc5[nH]ccc45)CC3)CCNCC2)cc1
1393
- O=c1c(-c2ccc(O)cc2)coc2cc(O)cc(O)c12
1394
- CCOCCN(CC(O)CN1CCCC2(CCc3cc4c(cc3O2)CNC4=O)C1)S(=O)(=O)c1c(C)cccc1C
1395
- COc1cc(-c2cn[nH]c2)cn2c(-c3ccccc3)c(-c3ccc(C4(N)CCC4)cc3)nc12
1396
- Nc1ncnc2nc(-c3ccc(CN4CCC(n5ncc6c(N)ncnc65)CC4)cc3)c(-c3ccccc3)cc12
1397
- CCn1c(-c2nonc2N)nc2c(C#CC(C)(C)O)nc(OCC(C)N)cc21
1398
- COc1cc2ncc3c(N)nc(-c4cncc(OCC(N)Cc5ccc(Cl)c(Cl)c5)c4)cc3c2cc1OC
1399
- O=C(Cc1nc(N2CCOCC2)cc(=O)[nH]1)N1CCc2ncccc21
1400
- Cc1n[nH]c2ccc(-c3cc(OCC(N)Cc4ccccc4)cnc3-c3ccco3)cc12
1401
- Cc1n[nH]c2ccc(-c3cncc(OCC(N)Cc4c(F)cccc4F)c3)cc12
1402
- c1ccc(-c2cc3cnc(-n4ccnc4)nc3nc2-c2ccc(CN3CCC(c4nnc(-c5ccccn5)[nH]4)CC3)cc2)cc1
1403
- NC(CNc1ncc(-c2ccc3[nH]c(=O)oc3c2)s1)Cc1ccc(C(F)(F)F)cc1
1404
- N#Cc1c(N)ncnc1N1CCC(c2nc(-c3cccc(Cl)c3)cn2CCN2CCCC2)CC1
1405
- CC(F)(C(=O)N1CCc2c(F)cccc21)c1nc(N2CCOCC2)cc(=O)[nH]1
1406
- C=Cc1cccn2c(-c3ccccc3)c(-c3ccc(C4(N)CCC4)cc3)nc12
1407
- CCn1c(-c2nonc2N)nc2c(C#CC(C)(C)O)nc(CN3CCCC3)cc21
1408
- N#Cc1cccc2c1nn1cc(-c3ccccc3)c(-c3ccc(C4(N)CCC4)cc3)nc21
1409
- Nc1nn(CCc2c[nH]cn2)c2nc(-c3ccc(CN4CCC(n5c(=O)[nH]c6ccccc65)CC4)cc3)c(-c3ccccc3)cc12
1410
- O=C(Nc1ccc(-c2nnc3n2-c2cccnc2Nc2ccccc2-3)cc1)c1cccc2ccccc12
1411
- NC(COc1cncc(-c2ccc3c(c2)C(F)(F)C(=O)N3)c1)Cc1c[nH]c2ccccc12
1412
- NC(COc1cncc(-c2ccc3c(F)nccc3c2)c1)Cc1c[nH]c2ccccc12
1413
- C#Cc1cccc(NC(=O)Cc2nc(N3CCOCC3)cc(=O)[nH]2)c1
1414
- NC1(C(=O)NCc2ccc(Cl)cc2)CCN(c2ncnc3[nH]ccc23)CC1
1415
- N#Cc1ccc2nc(C3CCN(Cc4ccc(-c5nc6nccn6cc5-c5ccccc5)cc4)CC3)[nH]c2c1
1416
- CC(NC(=O)CCCCCNC(=O)C1CC(O)C(n2cnc3c(N)ncnc32)C1)C(=O)NCCCCCC(=O)NC(CCCNC(=N)N)C(=O)NC(CCCNC(=N)N)C(N)=O
1417
- NC1CCN(c2ncnc3[nH]cnc23)CC1
1418
- NC(=O)Nc1ccc2c(c1)C(=Cc1ccc[nH]1)C(=O)N2
1419
- NC(CNc1nnc(-c2ccc3[nH]nc(-c4ccccc4)c3c2)s1)Cc1cccc(C(F)(F)F)c1
1420
- CC(=O)Nc1ccc2nc(-c3ccc(C4(N)CCC4)cc3)c(-c3ccccc3)n2c1
1421
- CC(=C1C(=O)Nc2ccc(NC(N)=O)cc21)c1cc(CNC(=O)c2ccncc2)c[nH]1
1422
- CC(=C1C(=O)Nc2ccc(NC(N)=O)cc21)c1ccc[nH]1
1423
- CC(=O)Nc1nc2ccc(-c3cnc(Cl)c(NS(=O)(=O)c4cccc(C(F)(F)F)c4)c3)cc2s1
1424
- NC1(c2ccc(-c3nc4n(c3-c3ccccc3)COc3ncccc3-4)cc2)CC(O)(C2CC2)C1
1425
- Cc1ccc(F)cc1CC(N)COc1cncc(-c2ccc3[nH]nc(C)c3c2)c1
1426
- NC(c1ccc(Cl)cc1)C1CCN(c2ccnc3[nH]ccc23)CC1
1427
- CCONC(=O)c1ccc2c(c1)-c1nc(-c3ccc(C4(N)CC(C)(O)C4)cc3)c(-c3ccccc3)n1CO2
1428
- COC(=O)c1cnn2cc(-c3c(F)cccc3F)c(-c3ccc(CN4CC(c5n[nH]c(-c6ccccn6)n5)C4)cc3)nc12
1429
- NC1(c2ccc(-c3nc4ccc(Br)cn4c3-c3ccccc3)cc2)CCC1
1430
- COCCN(C)CCn1cc(-c2ccc(F)c(C(F)(F)F)c2)nc1C1CCN(c2ncnc(N)c2C(N)=O)CC1
1431
- c1ccc(-c2cc3ccncc3nc2-c2ccc(CN3CCC(c4n[nH]c(-c5ccccn5)n4)CC3)cc2)cc1
1432
- Cc1ccn2c(-c3ccccc3)c(-c3ccc(C4(N)CCC4)cc3)nc2n1
1433
- NC(COc1cncc(C=Cc2ccncc2)c1)Cc1cccc(O)c1
1434
- NC1(c2ccc(-c3nc4n(c3-c3ccccc3)COc3cccc(F)c3-4)cc2)CCC1
1435
- CNc1nccc(-c2ccc(C(=O)NC(CCN)Cc3ccc(Cl)cc3Cl)s2)n1
1436
- Cn1c(CC(=O)Nc2cccc(Br)c2)nc(N2CCOCC2)cc1=O
1437
- Cc1ccc(-n2c(-c3cccnc3N)nc3cccnc32)cc1
1438
- Cc1occc1-c1nc(N)c(OCC(N)Cc2c(C(N)=O)[nH]c3ccccc23)cc1-c1cnc2[nH]nc(C)c2n1
1439
- CCCC1NC(=O)C(CCCNC(=N)N)NC(=O)CN(C(=O)C(N)CCCNC(=N)N)CCNC(=O)NCCCCCCN(CC(N)=O)C(=O)C(CCC(C)C)NC(=O)C(CN)NC(=O)C(Cc2ccc(O)cc2)NC1=O
1440
- c1ccc(-n2cc(COCc3ccc(-c4nnc5n4-c4cccnc4Nc4ccccc4-5)cc3)nn2)cc1
1441
- NC1(c2ccc(-c3nc4c5ccc(Br)cc5nn4cc3-c3ccccc3)cc2)CCC1
1442
- c1ccc2c(CC(COc3cncc(-c4ccc5nonc5c4)c3)Nc3ccc4nonc4c3)c[nH]c2c1
1443
- NC(=O)c1c(N)ncnc1N1CCC(c2nc(-c3ccc(F)c(F)c3)cn2CCN2CCCC2)CC1
1444
- NC(COc1cncc(-c2ccc3[nH]c(=O)sc3c2)c1)Cc1c[nH]c2ccccc12
1445
- CCn1c(-c2nonc2N)nc2cncc(OC3CCNCC3)c21
1446
- NC1(c2ccc(-c3nc4c(F)cc(F)cn4c3-c3ccccc3)cc2)CCC1
1447
- NC(COc1cncc(-c2ccc3cnccc3c2)c1)CC1CCCCC1
1448
- CC1(O)CC(N)(c2ccc(-c3nc4n(c3-c3ccccc3)COc3ncccc3-4)cc2)C1
1449
- CC(C)NCC(C(=O)N1CCN(c2ncnc3c2C(C)CC3O)CC1)c1ccc(Cl)cc1
1450
- CNC(=O)Nc1ccn2c(-c3ccccc3)c(-c3ccc(C4(N)CCC4)cc3)nc2c1
1451
- CCn1c(-c2nonc2N)nc2c(C#CC(C)(C)O)nc(OC(CN)c3ccccc3)cc21
1452
- CC(NC(=O)C1(N)CCN(c2ncnc3[nH]ccc23)CC1)c1ccccc1
1453
- O=C(Cc1nc(N2CCOCC2)cc(=O)[nH]1)Nc1cccc(OC(F)(F)C(F)F)c1
1454
- COc1ccc(S(=O)(=O)Nc2cc(-c3ccc4nc(NC(C)=O)sc4c3)cnc2Cl)cc1
1455
- CC(NC(=O)C1(N)CCN(c2ncnc3[nH]ccc23)CC1)c1ccccc1
1456
- NC(CNc1cncc(C=Cc2ccncc2)c1)Cc1c[nH]c2ccccc12
1457
- CC1Cc2ccccc2N1C(=S)Cc1nc(N2CCOCC2)cc(=S)[nH]1
1458
- CCc1c(N)ncnc1N1CCC(c2nc(-c3ccc(F)cc3)cn2CCN2CCCC2)CC1
1459
- NC1(c2ccc(-c3nc4cc(-c5nn[nH]n5)ccn4c3-c3ccccc3)cc2)CCC1
1460
- CN(C)CCNC(=O)c1c[nH]c(C=C2C(=O)Nc3ccc(NC(N)=O)cc32)c1
1461
- COc1cc(-c2ncc[nH]2)cn2c(-c3ccccc3)c(-c3ccc(C4(N)CCC4)cc3)nc12
1462
- CC(n1cnnc1-c1nc(NC(=O)c2cc(-n3cnc(C4CC4)c3)c(N3CC4CN(C)CC4C3)cn2)cs1)C(F)(F)F
1463
- Nc1ncccc1-c1nc2cccnc2n1-c1ccc(CNC(=O)c2ccccc2)cc1
1464
- O=C(NC(c1ccc(Cl)c(Cl)c1)C1CCNCC1)c1ccc2cnc(F)cc2c1
1465
- Nc1ccc2ncnc(N3CCN(C(=O)C(N)Cc4ccc(Cl)cc4)CC3)c2c1
1466
- CCCC(NC(=O)C(CCCNC(=N)N)NC(=O)C(CC1CCCCC1)NC(=O)C(N)CCCNC(=N)N)C(=O)NC(Cc1ccc(O)cc1)C(=O)NC(CN)C(=O)NC(CCC(C)C)C(N)=O
1467
- NC1(c2ccc(-c3nc4cc(-n5cccn5)ccn4c3-c3ccccc3)cc2)CCC1
1468
- CC(C)CCCC1(C)CCc2cc(S(N)(=O)=O)cc(Br)c2O1
1469
- NC(Cc1ccc(Cl)cc1)C(=O)N1CCN(c2ncnc3[nH]cc(C4CC4)c23)CC1
1470
- Cn1ncc(Cl)c1-c1cc(C(=O)NC2CNC(CC(=O)O)CC2c2ccc(Cl)c(Cl)c2)oc1Cl
1471
- NC(CNc1ncc(-c2ccc3cnccc3c2)s1)Cc1ccc(C(F)(F)F)cc1
1472
- Cn1cncc1-c1ccn2c(-c3ccccc3)c(-c3ccc(C4(N)CCC4)cc3)nc2c1
1473
- COCCNCCn1cc(-c2ccc(F)c(C(F)(F)F)c2)nc1C1CCN(c2ncnc(N)c2C(N)=O)CC1
1474
- CC(=O)Nc1ccc(-c2cncc(OCC(N)Cc3c[nH]c4ccccc34)c2)cc1
1475
- NC(Cc1cccc2ccccc12)C(=O)Nc1cncc(C=Cc2ccncc2)c1
1476
- Cc1occc1-c1nc(N)c(OCC(N)Cc2c[nH]c3ccccc23)cc1-c1cc2c(C)n[nH]c2cn1
1477
- CC(n1cnnc1-c1nc(NC(=O)c2cc(-n3cnc(C4CC4)c3)c(N3CCC4(CC3)COC4)cn2)cs1)C(F)(F)F
1478
- Nc1ncnc(N2CCC(c3nc(-c4ccc(F)c(C(F)(F)F)c4)cn3CC3CCN3)CC2)c1Cl
1479
- C=Cc1c(N)ncnc1N1CCC(c2nc(-c3ccc(F)c(F)c3)cn2CCN2CCCC2)CC1
1480
- Cn1ncc(Cl)c1-c1cc(C(=O)NC2CNC(CC(N)=O)CC2c2ccc(Cl)c(Cl)c2)oc1Cl
1481
- NC1(c2ccc(-c3nc4n(c3-c3ccccc3)COc3ccncc3-4)cc2)CCC1
1482
- CCc1c(N)ncnc1N1CCC(c2nc(-c3ccnc(Cl)c3)cn2CCN2CCC2)CC1
1483
- COC(=O)c1cnn2cc(-c3c(F)cccc3F)c(-c3ccc(CN4CCC(c5n[nH]c(-c6cccc(C)n6)n5)CC4)cc3)nc12
1484
- CCN(c1nccc(-c2ccc3nc(NC(C)=O)sc3c2)n1)S(=O)(=O)c1ccc(OC)cc1
1485
- CCc1c(N)ncnc1N1CCC(c2nc(-c3ccc(F)c(C(N)=O)c3)cn2CCN2CCC2)CC1
1486
- NC(COc1cncc(C=Cc2ccnc(NCc3ccccc3)c2)c1)Cc1c[nH]c2ccccc12
1487
- Cc1occc1-c1nc(N)c(OCC(N)Cc2c[nH]c3ccccc23)cc1-c1cnc2[nH]nc(C)c2c1
1488
- CC(C)NCC(Cc1ccc(Cl)cc1)C(=O)N1CCN(c2ncnc3c2C(C)OC3)CC1
1489
- NC1(c2ccc(-c3nc4nc(-c5ccccc5)ccn4c3-c3ccccc3)cc2)CCC1
1490
- Cc1cnn(C)c1-c1ccc(C(=O)NC2CNCCC2c2cccc(F)c2)s1.Cl
1491
- O=c1[nH]c2ccccc2n1C1CCN(Cc2ccc(-c3nc4ccc(-c5nn[nH]n5)cc4nc3-c3ccccc3)cc2)CC1
1492
- O=C(Cc1nc(N2CCOCC2)cc(=O)[nH]1)Nc1ccc(F)c(F)c1F
1493
- c1ccc(-c2cc3cccnc3nc2-c2ccc(CN3CCC(c4cc(-c5ccncc5)[nH]n4)CC3)cc2)cc1
1494
- Cn1ncc(Cl)c1-c1cc(C(=O)NC(CN)Cc2ccc(F)c(F)c2)oc1Cl
1495
- CC(C)Nc1c(-c2ccccc2)c(-c2ccc(CN3CC(c4n[nH]c(-c5ccccn5)n4)C3)cc2)nc2nc(-c3ccccn3)nn12
1496
- Cc1c[nH]c2ncnc(N3CCC(N)(CNC(=O)c4ccccc4F)C3)c12
1497
- CCN(CC)CC1CN(C(=O)Cc2nc(N3CCOCC3)cc(=O)[nH]2)c2ccccc21
1498
- NC1(c2ccc(-c3nn4c(Br)cnc4cc3-c3ccccc3)cc2)CCC1
1499
- CCNC(=O)NC1CCN(Cc2ccc(-c3nc4nc(SC)ncc4cc3-c3ccccc3)cc2)CC1
1500
- NC(=O)c1c(N)ncnc1N1CCC(c2nc(-c3ccc(F)c(C(F)(F)F)c3)cn2CCN2CCCCC2)CC1
1501
- Cc1noc(C)c1S(=O)(=O)N(CCOC(C)C)CC(O)CN1CCCC2(CC(=O)c3cc(O)ccc3O2)C1
1502
- Cc1cccc(-c2nc(C3CCN(Cc4ccc(-c5nc6nc(C)nn6cc5-c5cccc(F)c5)cc4)CC3)n[nH]2)n1
1503
- CCc1cc2cc(-c3cncc(OCC(N)Cc4c[nH]c5ccccc45)c3)ccc2cn1
1504
- CN(C)CC1CN(C(=O)Cc2nc(N3CCOCC3)cc(=O)[nH]2)c2ccccc21
1505
- CCOCCc1c(N)ncnc1N1CCC(c2nc(-c3ccc(F)c(C(F)(F)F)c3)cn2CCN(C)C)CC1
1506
- Cc1ccc(CC(N)COc2cncc(-c3ccc4[nH]nc(C)c4c3)c2)cc1F
1507
- NC1(c2ccc(-c3nc4n(c3-c3ccsc3)COc3cccc(F)c3-4)cc2)CC(O)(C2CC2)C1
1508
- Clc1c[nH]c2ncnc(N3CC4(CCNCC4)c4ccccc43)c12
1509
- CCn1c(-c2nonc2N)nc2c(C#CC(C)(C)O)nc(OCC(N)c3ccccc3)cc21
1510
- NC(CNc1ncc(-c2ccc3c(c2)CNC3=O)s1)Cc1ccc(C(F)(F)F)cc1
1511
- COc1cc2ncc3c(N)nc(-c4cncc(OCC(N)Cc5ccccc5)c4)cc3c2cc1OC
1512
- COc1ccn2c(-c3ccccc3)c(-c3ccc(C4(N)CCC4)cc3)nc2c1C#N
1513
- COc1nc(Br)cn2c(-c3ccccc3)c(-c3ccc(C4(N)CCC4)cc3)nc12
1514
- Cn1c(CC(=O)N2CCc3c2ccc(F)c3F)nc(N2CCOCC2)cc1=O
1515
- NC(=O)c1c(N)ncnc1N1CCC(c2nc(-c3ccc(F)c(C(F)(F)F)c3)cn2CC2CCN2)CC1
1516
- c1ccc(-c2cc3cnc(N4CCn5cnnc5C4)nc3nc2-c2ccc(CN3CCC(c4nnc(-c5ccccn5)[nH]4)CC3)cc2)cc1
1517
- N#Cc1c(N)ncnc1N1CCC(c2nc(-c3ccc(F)c(C(F)(F)F)c3)cn2CCN2CCCCC2)CC1
1518
- Nc1ncnc(N2CCN(C(=O)C(N)Cc3ccc(Cl)cc3)CC2)c1Br
1519
- Cl.Cn1ncc(Cl)c1-c1csc(C(=O)NC2CNCCC2c2ccc(Cl)c(C(F)(F)F)c2)c1
1520
- NC(COc1cncc(C=Cc2ccncc2)c1)Cc1c[nH]c2ccccc12
1521
- Cn1c(CC(=O)N2CCc3c(Cl)cccc32)nc(N2CCOCC2)cc1=S
1522
- O=C(N1CCN(c2ncnc3[nH]nc(Br)c23)CC1)C1(c2ccc(Cl)c(Cl)c2)CCNCC1
1523
- COc1cccc(C(=O)Nc2ccc(-c3nnc4n3-c3cccnc3Nc3ccccc3-4)cc2)c1
1524
- CC(NC(=O)C1(N)CCN(c2ncnc3[nH]ccc23)CC1)c1ccc(Cl)cc1
1525
- N#Cc1c(N)ncnc1N1CCC(c2nc(-c3cnoc3)cn2CCN2CCC2)CC1
1526
- CC(C)(C)C(=O)N1CCN(c2cccc(-c3ccc4nc(-c5cccnc5N)n(-c5ccc(C6(N)CCC6)cc5)c4n3)c2)CC1
1527
- CNC(=O)Nc1ccc(CNc2ncsc2C(=O)Nc2ccc3c(c2)OC(F)(F)O3)cn1
1528
- Cc1n[nH]c2ccc(-c3cncc(OCC(N)c4ccccc4)c3)cc12
1529
- Cl.NCc1ccc(-n2c(-c3cccnc3N)nc3ccc(-c4cn[nH]c4)nc32)cc1
1530
- Cn1ncc(Cl)c1-c1cc(C(=O)NC(CN)Cc2cccc(F)c2)sc1Cl
1531
- Cc1cc(C(N)=O)c2nc(-c3ccc(C4(N)CCC4)cc3)c(-c3ccccc3)n2c1
1532
- Nc1ncccc1-c1nc2ccc(Nc3ccc(N4CCOCC4)cc3)nc2n1-c1ccc(C2(N)CCC2)cc1
1533
- CCn1c(-c2nonc2N)nc2c(C#CC(C)(C)O)nc(OCC(N)c3ccccc3)cc21
1534
- O=C(Nc1ccc(-c2nnc3n2-c2cccnc2Nc2ccccc2-3)cc1)c1cc2ccccc2s1
1535
- COCCC(NC(=O)C1(N)CCN(c2ncnc3[nH]ccc23)CC1)c1ccc(Cl)cc1
1536
- Nc1cc2cc(-c3cnc(NCC(N)Cc4ccc(C(F)(F)F)cc4)s3)ccc2cn1
1537
- NC1(c2ccc(-c3nn4c(-c5ccn[nH]5)cnc4cc3-c3ccccc3)cc2)CCC1
1538
- NC1(C(=O)NC(CCCN2CCOCC2)c2ccc(Cl)cc2)CCN(c2ncnc3[nH]ccc23)CC1
1539
- CCc1c(N)ncnc1N1CCC(c2nc(-c3ccc(F)c(C(F)(F)F)c3)cn2CCN(C)C)CC1
1540
- ClCc1nnc2c3cc(-c4ccccc4)c(-c4ccc(CN5CCC(c6n[nH]c(-c7ccccn7)n6)CC5)cc4)nc3ccn12
1541
- Cc1cccc(-c2nc(C3CN(Cc4ccc(-c5nc6nc(C)c(Br)n6cc5-c5ccccc5)cc4)C3)n[nH]2)n1
1542
- CCOc1cccc2c1-c1nc(-c3ccc(C4(N)CC(C)(O)C4)cc3)c(-c3ccccc3)n1CO2
1543
- CCOc1cccc2c1-c1nc(-c3ccc(C4(N)CC(C)(O)C4)cc3)c(-c3ccccc3)n1CO2
1544
- Cc1ccc2c(c1)-c1nnc(-c3ccc(C4(N)CCC4)cc3)n1-c1cccnc1N2
1545
- COC(=O)c1cn2cc(-c3ccccc3)c(-c3ccc(CN4CC(c5n[nH]c(-c6ccccn6)n5)C4)cc3)nc2n1
1546
- Nc1noc2ccc(-c3cnc(NCC(N)Cc4ccc(C(F)(F)F)cc4)s3)cc12
1547
- CCn1c(CC(=O)Nc2ccc(F)cc2)nc(N2CCOCC2)cc1=O
1548
- COC(=O)c1cn2c(-c3ccccc3)c(-c3ccc(C4(N)CCC4)cc3)nc2c(OC)n1
1549
- Cc1n[nH]c2ccc(-c3cncc(OCC(N)Cc4cccc(F)c4F)c3)cc12
1550
- CC(NC(=O)C1(N)CCN(c2ncnc3[nH]ccc23)CC1)c1ccc(Cl)cc1
1551
- Cc1n[nH]c2cnc(-c3cc(OCC(N)Cc4c[nH]c5ccccc45)cnc3-c3ccoc3)nc12
1552
- COc1cc(F)ccc1NC(=O)Cc1nc(N2CCOCC2)cc(=O)[nH]1
1553
- Nc1ncnc2nc(-c3ccc(CN4CCC(c5nc6ccc(F)cc6[nH]5)CC4)cc3)c(-c3ccccc3)cc12
1554
- CC(n1cnnc1-c1nc(NC(=O)c2cc(-n3cnc(C4CC4)c3)c(N3CC(C)(C#N)C3)cn2)cs1)C(F)(F)F
1555
- CN(C)CCNC(=O)c1cccc(-c2c[nH]c(C=C3C(=O)Nc4ccc(NC(N)=O)cc43)c2)c1
1556
- NC1(c2ccc(-c3nc4n(c3-c3ccsc3)COc3ccccc3-4)cc2)CCC1
1557
- Cc1c[nH]c2ncnc(N3CCC(N)(CNC(=O)c4ccc(F)cc4F)C3)c12
1558
- Cc1cc(Cl)cn2c(-c3ccccc3)c(-c3ccc(C4(N)CCC4)cc3)nc12
1559
- CNc1c(-c2ccccc2)c(-c2ccc(CN3CC(c4n[nH]c(-c5ccccn5)n4)C3)cc2)nc2ncc(Br)n12
1560
- Nc1ncccc1-c1nc2ccc(-c3ccccc3)nc2n1-c1ccc(CNC(=O)c2ccccc2)cc1
1561
- Cn1c(CC(=O)N2CCc3c(OC(F)F)cccc32)nc(N2CCOCC2)cc1=O
1562
- NC1(c2ccc(-c3nc4c5cc(-c6ccc(F)c(CO)c6)ccc5nn4cc3-c3ccccc3)cc2)CCC1
1563
- Cc1c(NCCN(C)C)cc(Cl)cc1N1CCN(c2ncnc3[nH]nc(Br)c23)CC1
1564
- CC1SCc2ncnc(N3CCN(C(=O)C(Cc4ccc(Cl)cc4)C4(N)CC4)CC3)c21
1565
- NC(=O)c1c(N)ncnc1N1CCC(c2nc(-c3ccc(F)c(C(F)(F)F)c3)cn2CC2CNC2)CC1
1566
- Nc1ncccc1-c1nc2ccc(-c3cccc(N4CCC(C(=O)N5CCOCC5)CC4)c3)nc2n1-c1ccc(C2(N)CCC2)cc1
1567
- O=C(Nc1ccc(-c2nnc3n2-c2cccnc2Nc2ccccc2-3)cc1)c1ccc(F)c(F)c1
1568
- NC1(c2ccc(-c3nc4c5cc(-c6cn[nH]c6)ccc5nn4cc3-c3ccccc3)cc2)CCC1
1569
- CCOc1ccc2nc(-c3ccc(C4(N)CCC4)cc3)c(-c3ccccc3)n2c1
1570
- CC(=O)Nc1ccc2nc(-c3ccc(C4(N)CCC4)cc3)c(-c3ccccc3)n2c1
1571
- Cc1c[nH]c2ncnc(N3CCN(C(=O)C(N)Cc4ccc(F)cc4)CC3)c12
1572
- Cc1cccc(NC(=O)Cc2nc(N3CCOCC3)cc(=O)[nH]2)c1O
1573
- NC1(c2ccc(-c3nc4cc(-c5ccncc5)ccn4c3-c3ccccc3)cc2)CCC1
1574
- COC(=O)c1cnn2cc(-c3c(F)cccc3F)c(-c3ccc(CN4CCC(c5n[nH]c(-c6cccc(C)n6)n5)CC4)cc3)nc12
1575
- NC(c1ccc(Cl)cc1)C1CCN(c2ncnc3[nH]cnc23)CC1
1576
- Nc1ncnc(N2CCC(c3nc(-c4ccnc(C(F)(F)F)c4)cn3CCN3CCC3)CC2)c1-c1cnoc1
1577
- Cc1nc(N)nc2c1cc(-c1cn[nH]c1)c(=O)n2C1CCC(OCC(N)=O)CC1
1578
- COc1ccc(S(=O)(=O)NCc2ccc(-c3nnc4n3-c3cccnc3Nc3ccccc3-4)cc2)cc1
1579
- CC(C)NC(c1ccc(Cl)cc1)c1ccc(-c2ncnc3[nH]cnc23)cc1
1580
- Cc1cccc2c1CCN2C(=O)Cc1nc(N2CCOCC2)cc(=O)n1C
1581
- Cc1n[nH]c2ccc(-c3cc(OCC(N)Cc4ccccc4)cnc3-c3ccc[nH]3)cc12
1582
- O=C1Nc2ccccc2C1=Cc1c[nH]nc1-c1ccccc1[N+](=O)[O-]
1583
- O=C(Nc1ccc(-c2nnc3n2-c2cccnc2Nc2ccccc2-3)cc1)c1cnc2ccccc2n1
1584
- CC(n1cnnc1-c1nc(NC(=O)c2cc(-n3cnc(C4CC4)c3)c(N3CC(N4CCOCC4)C3)cn2)cs1)C(F)(F)F
1585
- NC(=O)c1cc(Cl)c2nc(-c3ccc(C4(N)CCC4)cc3)c(-c3ccccc3)n2c1
1586
- CCOc1cccc2c1-c1nc(-c3ccc(C4(N)CC(O)(C5CC5)C4)cc3)c(-c3ccccc3)n1CO2
1587
- CC1(O)CC(N)(c2ccc(-c3nc4n(c3-c3ccccc3)COc3ncccc3-4)cc2)C1
1588
- Clc1cnc2nc(-c3ccc(CN4CC(c5nnc(-c6ccccn6)[nH]5)C4)cc3)c(-c3ccccc3)cn12
1589
- O=C(NC(c1ccc(Cl)c(Cl)c1)C1CCNC1)c1ccc2cnccc2c1
1590
- CC(C)CCCC1(C)CCc2cc(O)cc(F)c2O1
1591
- COCCOc1ccn2c(-c3ccccc3)c(-c3ccc(C4(N)CCC4)cc3)nc2n1
1592
- CCCC(NC(=O)C(CCCNC(=N)N)NC(=O)CN(CCN)C(=O)C(N)CCCNC(=N)N)C(=O)NC(Cc1ccc(O)cc1)C(=O)NC(CN)C(=O)NC(CCC(C)C)C(=O)N(CCCCN)CC(N)=O
1593
- CCn1c(-c2nonc2N)nc2c(C#CC(C)(C)O)ncc(OCCCN)c21
1594
- COc1cc(Br)cn2c(-c3ccccc3)c(-c3ccc(C4(N)CCC4)cc3)nc12
1595
- NCCC(NC(=O)c1cc(Br)c(-c2ccnc3[nH]ccc23)s1)c1ccccc1
1596
- NC(Cc1ccc(Br)cc1)C(=O)N1CCN(c2ncnc3ccccc23)CC1
1597
- O=C(NC(c1ccc(Cl)c(Cl)c1)C1CCNCC1)c1ccc2cnccc2c1
1598
- [C-]#[N+]c1ccc(C(=O)Nc2ccc(-c3nnc4n3-c3cccnc3Nc3ccccc3-4)cc2)cc1
1599
- CN(C)CCn1cc(-c2ccc(F)c(C(F)(F)F)c2)nc1C1CCN(c2ncnc(N)c2C2CC2)CC1
1600
- Cc1ccc(CC(N)C(=O)N2CCN(c3ncnc4ccccc34)CC2)cc1
 
 
experiments/.gitignore DELETED
@@ -1 +0,0 @@
1
- inference/
 
 
gradio_app.py DELETED
@@ -1,208 +0,0 @@
1
- import gradio as gr
2
- from trainer import Trainer
3
- import PIL
4
- from PIL import Image
5
- import pandas as pd
6
- import random
7
- from rdkit import Chem
8
- from rdkit.Chem import Draw
9
- from rdkit.Chem.Draw import IPythonConsole
10
- import shutil
11
-
12
- class DrugGENConfig:
13
- submodel='CrossLoss'
14
- act='relu'
15
- z_dim=16
16
- max_atom=45
17
- lambda_gp=1
18
- dim=128
19
- depth=1
20
- heads=8
21
- dec_depth=1
22
- dec_heads=8
23
- dec_dim=128
24
- mlp_ratio=3
25
- warm_up_steps=0
26
- dis_select='mlp'
27
- init_type='normal'
28
- batch_size=128
29
- epoch=50
30
- g_lr=0.00001
31
- d_lr=0.00001
32
- g2_lr=0.00001
33
- d2_lr=0.00001
34
- dropout=0.
35
- dec_dropout=0.
36
- n_critic=1
37
- beta1=0.9
38
- beta2=0.999
39
- resume_iters=None
40
- clipping_value=2
41
- features=False
42
- test_iters=10_000
43
- num_test_epoch=30_000
44
- inference_sample_num=1000
45
- num_workers=1
46
- mode="inference"
47
- inference_iterations=100
48
- inf_batch_size=1
49
- protein_data_dir='data/akt'
50
- drug_index='data/drug_smiles.index'
51
- drug_data_dir='data/akt'
52
- mol_data_dir='data'
53
- log_dir='experiments/logs'
54
- model_save_dir='experiments/models'
55
- # inference_model=""
56
- sample_dir='experiments/samples'
57
- result_dir="experiments/tboard_output"
58
- dataset_file="chembl45_train.pt"
59
- drug_dataset_file="akt_train.pt"
60
- raw_file='data/chembl_train.smi'
61
- drug_raw_file="data/akt_train.smi"
62
- inf_dataset_file="chembl45_test.pt"
63
- inf_drug_dataset_file='akt_test.pt'
64
- inf_raw_file='data/chembl_test.smi'
65
- inf_drug_raw_file="data/akt_test.smi"
66
- log_sample_step=1000
67
- set_seed=True
68
- seed=1
69
- resume=False
70
- resume_epoch=None
71
- resume_iter=None
72
- resume_directory=None
73
-
74
- class ProtConfig(DrugGENConfig):
75
- submodel="Prot"
76
- inference_model="experiments/models/Prot"
77
-
78
- class CrossLossConfig(DrugGENConfig):
79
- submodel="CrossLoss"
80
- inference_model="experiments/models/CrossLoss"
81
-
82
- class NoTargetConfig(DrugGENConfig):
83
- submodel="NoTarget"
84
- inference_model="experiments/models/NoTarget"
85
-
86
-
87
- model_configs = {
88
- "Prot": ProtConfig(),
89
- "DrugGEN": CrossLossConfig(),
90
- "DrugGEN-NoTarget": NoTargetConfig(),
91
- }
92
-
93
-
94
-
95
- def function(model_name: str, mol_num: int, seed: int) -> tuple[PIL.Image, pd.DataFrame, str]:
96
- '''
97
- Returns:
98
- image, score_df, file path
99
- '''
100
-
101
- config = model_configs[model_name]
102
- config.inference_sample_num = mol_num
103
- config.seed = seed
104
-
105
- trainer = Trainer(config)
106
- scores = trainer.inference() # create scores_df out of this
107
-
108
- score_df = pd.DataFrame(scores, index=[0])
109
- old_model_names = {
110
- "DrugGEN": "CrossLoss",
111
- "DrugGEN-NoTarget": "NoTarget",
112
- }
113
- output_file_path = f'experiments/inference/{old_model_names[model_name]}/inference_drugs.txt'
114
-
115
- import os
116
- new_path = f'{model_name}_denovo_mols.smi'
117
- os.rename(output_file_path, new_path)
118
-
119
- with open(new_path) as f:
120
- inference_drugs = f.read()
121
-
122
- generated_molecule_list = inference_drugs.split("\n")
123
-
124
- rng = random.Random(seed)
125
-
126
- selected_molecules = rng.choices(generated_molecule_list,k=12)
127
- selected_molecules = [Chem.MolFromSmiles(mol) for mol in selected_molecules]
128
-
129
- drawOptions = Draw.rdMolDraw2D.MolDrawOptions()
130
- drawOptions.prepareMolsBeforeDrawing = False
131
- drawOptions.bondLineWidth = 0.5
132
-
133
- molecule_image = Draw.MolsToGridImage(
134
- selected_molecules,
135
- molsPerRow=3,
136
- subImgSize=(400, 400),
137
- maxMols=len(selected_molecules),
138
- # legends=None,
139
- returnPNG=False,
140
- drawOptions=drawOptions,
141
- highlightAtomLists=None,
142
- highlightBondLists=None,
143
- )
144
-
145
-
146
- return molecule_image, score_df, new_path
147
-
148
-
149
-
150
- with gr.Blocks() as demo:
151
- with gr.Row():
152
- with gr.Column(scale=1):
153
- gr.Markdown("# DrugGEN: Target Centric De Novo Design of Drug Candidate Molecules with Graph Generative Deep Adversarial Networks")
154
- with gr.Row():
155
- gr.Markdown("[![arXiv](https://img.shields.io/badge/arXiv-2302.07868-b31b1b.svg)](https://arxiv.org/abs/2302.07868)")
156
- gr.Markdown("[![github-repository](https://img.shields.io/badge/GitHub-black?logo=github)](https://github.com/HUBioDataLab/DrugGEN)")
157
-
158
- with gr.Accordion("Expand to display information about models", open=False):
159
- gr.Markdown("""
160
- ### Model Variations
161
- - **DrugGEN**: composed of one GAN, the input of the GAN1 generator is the real molecules dataset and the GAN1 discriminator compares the generated molecules with the real inhibitors of the given target.
162
- - **DrugGEN-NoTarget**: composed of one GAN, focuses on learning the chemical properties from the ChEMBL training dataset, no target-specific generation.
163
- """)
164
- model_name = gr.Radio(
165
- choices=("DrugGEN", "DrugGEN-NoTarget"),
166
- value="DrugGEN",
167
- label="Select a model to make inference",
168
- info=" DrugGEN-Prot and DrugGEN-CrossLoss models design molecules to target the AKT1 protein"
169
- )
170
-
171
- num_molecules = gr.Number(
172
- label="Number of molecules to generate",
173
- precision=0, # integer input
174
- minimum=1,
175
- value=1000,
176
- maximum=10_000,
177
- )
178
- seed_num = gr.Number(
179
- label="RNG seed value (can be used for reproducibility):",
180
- precision=0, # integer input
181
- minimum=0,
182
- value=42,
183
- )
184
-
185
- submit_button = gr.Button(
186
- value="Start Generating"
187
- )
188
-
189
- with gr.Column(scale=2):
190
- scores_df = gr.Dataframe(
191
- label="Scores",
192
- headers=["Runtime (seconds)", "Validity", "Uniqueness", "Novelty (Train)", "Novelty (Inference)"],
193
- )
194
- file_download = gr.File(
195
- label="Click to download generated molecules",
196
- )
197
- image_output = gr.Image(
198
- label="Structures of randomly selected 12 de novo molecules from the inference set:"
199
- )
200
- # ).style(
201
- # height=200*4,
202
- # width=200*3,
203
- # )
204
-
205
- submit_button.click(function, inputs=[model_name, num_molecules, seed_num], outputs=[image_output, scores_df, file_download], api_name="inference")
206
-
207
- demo.queue(concurrency_count=1)
208
- demo.launch()
 
 
layers.py DELETED
@@ -1,265 +0,0 @@
- import torch
- import torch.nn as nn
- from torch.nn import functional as F
- import math
-
- class MLP(nn.Module):
-     def __init__(self, in_feat, hid_feat=None, out_feat=None,
-                  dropout=0.):
-         super().__init__()
-         if not hid_feat:
-             hid_feat = in_feat
-         if not out_feat:
-             out_feat = in_feat
-         self.fc1 = nn.Linear(in_feat, hid_feat)
-         self.act = torch.nn.ReLU()
-         self.fc2 = nn.Linear(hid_feat, out_feat)
-         self.droprateout = nn.Dropout(dropout)
-
-     def forward(self, x):
-         x = self.fc1(x)
-         x = self.act(x)
-         x = self.fc2(x)
-         return self.droprateout(x)
-
- class Attention_new(nn.Module):
-     def __init__(self, dim, heads, attention_dropout=0.):
-         super().__init__()
-         assert dim % heads == 0
-         self.heads = heads
-         self.scale = 1. / dim**0.5
-
-         self.q = nn.Linear(dim, dim)
-         self.k = nn.Linear(dim, dim)
-         self.v = nn.Linear(dim, dim)
-         self.e = nn.Linear(dim, dim)
-         #self.attention_dropout = nn.Dropout(attention_dropout)
-
-         self.d_k = dim // heads
-         self.out_e = nn.Linear(dim, dim)
-         self.out_n = nn.Linear(dim, dim)
-
-     def forward(self, node, edge):
-         b, n, c = node.shape
-
-         q_embed = self.q(node).view(-1, n, self.heads, c // self.heads)
-         k_embed = self.k(node).view(-1, n, self.heads, c // self.heads)
-         v_embed = self.v(node).view(-1, n, self.heads, c // self.heads)
-
-         e_embed = self.e(edge).view(-1, n, n, self.heads, c // self.heads)
-
-         q_embed = q_embed.unsqueeze(2)
-         k_embed = k_embed.unsqueeze(1)
-
-         attn = q_embed * k_embed
-
-         attn = attn / math.sqrt(self.d_k)
-
-         attn = attn * (e_embed + 1) * e_embed
-
-         edge = self.out_e(attn.flatten(3))
-
-         attn = F.softmax(attn, dim=2)
-
-         v_embed = v_embed.unsqueeze(1)
-
-         v_embed = attn * v_embed
-
-         v_embed = v_embed.sum(dim=2).flatten(2)
-
-         node = self.out_n(v_embed)
-
-         return node, edge
-
- class Encoder_Block(nn.Module):
-     def __init__(self, dim, heads, act, mlp_ratio=4, drop_rate=0.):
-         super().__init__()
-         self.ln1 = nn.LayerNorm(dim)
-
-         self.attn = Attention_new(dim, heads, drop_rate)
-         self.ln3 = nn.LayerNorm(dim)
-         self.ln4 = nn.LayerNorm(dim)
-         self.mlp = MLP(dim, dim * mlp_ratio, dim, dropout=drop_rate)
-         self.mlp2 = MLP(dim, dim * mlp_ratio, dim, dropout=drop_rate)
-         self.ln5 = nn.LayerNorm(dim)
-         self.ln6 = nn.LayerNorm(dim)
-
-     def forward(self, x, y):
-         x1 = self.ln1(x)
-         x2, y1 = self.attn(x1, y)
-         x2 = x1 + x2
-         y2 = y1 + y
-         x2 = self.ln3(x2)
-         y2 = self.ln4(y2)
-
-         x = self.ln5(x2 + self.mlp(x2))
-         y = self.ln6(y2 + self.mlp2(y2))
-         return x, y
-
- class TransformerEncoder(nn.Module):
-     def __init__(self, dim, depth, heads, act, mlp_ratio=4, drop_rate=0.1):
-         super().__init__()
-
-         self.Encoder_Blocks = nn.ModuleList([
-             Encoder_Block(dim, heads, act, mlp_ratio, drop_rate)
-             for i in range(depth)])
-
-     def forward(self, x, y):
-         for Encoder_Block in self.Encoder_Blocks:
-             x, y = Encoder_Block(x, y)
-
-         return x, y
-
- class enc_dec_attention(nn.Module):
-     def __init__(self, dim, heads, attention_dropout=0., proj_dropout=0.):
-         super().__init__()
-         self.dim = dim
-         self.heads = heads
-         self.scale = 1. / dim**0.5
-
-         "query is molecules"
-         "key is prot"
-         "values is again molecule"
-         self.q_mx = nn.Linear(dim, dim)
-         self.k_px = nn.Linear(dim, dim)
-         self.v_mx = nn.Linear(dim, dim)
-
-         self.k_pa = nn.Linear(dim, dim)
-         self.v_ma = nn.Linear(dim, dim)
-
-         #self.dropout_dec = nn.Dropout(proj_dropout)
-         self.out_nd = nn.Linear(dim, dim)
-         self.out_ed = nn.Linear(dim, dim)
-
-     def forward(self, mol_annot, prot_annot, mol_adj, prot_adj):
-         b, n, c = mol_annot.shape
-         _, m, _ = prot_annot.shape
-
-         query_mol_annot = self.q_mx(mol_annot).view(-1, m, self.heads, c // self.heads)
-         key_prot_annot = self.k_px(prot_annot).view(-1, n, self.heads, c // self.heads)
-         value_mol_annot = self.v_mx(mol_annot).view(-1, m, self.heads, c // self.heads)
-
-         mol_e = self.v_ma(mol_adj).view(-1, m, m, self.heads, c // self.heads)
-         prot_e = self.k_pa(prot_adj).view(-1, m, m, self.heads, c // self.heads)
-
-         query_mol_annot = query_mol_annot.unsqueeze(2)
-         key_prot_annot = key_prot_annot.unsqueeze(1)
-
-         #attn = torch.einsum('bnchd,bmahd->bnahd', query_mol_annot, key_prot_annot)
-
-         attn = query_mol_annot * key_prot_annot
-
-         attn = attn / math.sqrt(self.dim)
-
-         attn = attn * (prot_e + 1) * mol_e
-
-         mol_e_new = attn.flatten(3)
-
-         mol_adj = self.out_ed(mol_e_new)
-
-         attn = F.softmax(attn, dim=2)
-
-         value_mol_annot = value_mol_annot.unsqueeze(1)
-
-         value_mol_annot = attn * value_mol_annot
-
-         value_mol_annot = value_mol_annot.sum(dim=2).flatten(2)
-
-         mol_annot = self.out_nd(value_mol_annot)
-
-         return mol_annot, prot_annot, mol_adj, prot_adj
-
- class Decoder_Block(nn.Module):
-     def __init__(self, dim, heads, mlp_ratio=4, drop_rate=0.):
-         super().__init__()
-
-         self.ln1_ma = nn.LayerNorm(dim)
-         self.ln1_pa = nn.LayerNorm(dim)
-         self.ln1_mx = nn.LayerNorm(dim)
-         self.ln1_px = nn.LayerNorm(dim)
-
-         self.attn2 = Attention_new(dim, heads, drop_rate)
-
-         self.ln2_pa = nn.LayerNorm(dim)
-         self.ln2_px = nn.LayerNorm(dim)
-
-         self.dec_attn = enc_dec_attention(dim, heads, drop_rate, drop_rate)
-
-         self.ln3_ma = nn.LayerNorm(dim)
-         self.ln3_mx = nn.LayerNorm(dim)
-
-         self.mlp_ma = MLP(dim, dim, dropout=drop_rate)
-         self.mlp_mx = MLP(dim, dim, dropout=drop_rate)
-
-         self.ln4_ma = nn.LayerNorm(dim)
-         self.ln4_mx = nn.LayerNorm(dim)
-
-     def forward(self, mol_annot, prot_annot, mol_adj, prot_adj):
-         mol_annot = self.ln1_mx(mol_annot)
-         mol_adj = self.ln1_ma(mol_adj)
-
-         prot_annot = self.ln1_px(prot_annot)
-         prot_adj = self.ln1_pa(prot_adj)
-
-         px1, pa1 = self.attn2(prot_annot, prot_adj)
-
-         prot_annot = prot_annot + px1
-         prot_adj = prot_adj + pa1
-
-         prot_annot = self.ln2_px(prot_annot)
-         prot_adj = self.ln2_pa(prot_adj)
-
-         mx1, prot_annot, ma1, prot_adj = self.dec_attn(mol_annot, prot_annot, mol_adj, prot_adj)
-
-         ma1 = mol_adj + ma1
-         mx1 = mol_annot + mx1
-
-         ma2 = self.ln3_ma(ma1)
-         mx2 = self.ln3_mx(mx1)
-
-         ma3 = self.mlp_ma(ma2)
-         mx3 = self.mlp_mx(mx2)
-
-         ma = ma3 + ma2
-         mx = mx3 + mx2
-
-         mol_adj = self.ln4_ma(ma)
-         mol_annot = self.ln4_mx(mx)
-
-         return mol_annot, prot_annot, mol_adj, prot_adj
-
- class TransformerDecoder(nn.Module):
-     def __init__(self, dim, depth, heads, mlp_ratio=4, drop_rate=0.):
-         super().__init__()
-
-         self.Decoder_Blocks = nn.ModuleList([
-             Decoder_Block(dim, heads, mlp_ratio, drop_rate)
-             for i in range(depth)])
-
-     def forward(self, mol_annot, prot_annot, mol_adj, prot_adj):
-         for Decoder_Block in self.Decoder_Blocks:
-             mol_annot, prot_annot, mol_adj, prot_adj = Decoder_Block(mol_annot, prot_annot, mol_adj, prot_adj)
-
-         return mol_annot, prot_annot, mol_adj, prot_adj
-
 
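The removed `Attention_new` computes an elementwise query-key product whose broadcast shape matches the edge embedding, so edge features can gate the node attention. A shape-tracing sketch of that broadcast (plain Python; the helper is illustrative, not from the repository):

```python
def attn_shapes(b, n, dim, heads):
    # Mirrors Attention_new.forward: split dim across heads,
    # then broadcast q * k over the pairwise (n x n) node grid.
    assert dim % heads == 0
    d_k = dim // heads
    q = (b, n, 1, heads, d_k)          # q_embed after unsqueeze(2)
    k = (b, 1, n, heads, d_k)          # k_embed after unsqueeze(1)
    e = (b, n, n, heads, d_k)          # e_embed from the edge tensor
    attn = tuple(max(a, c) for a, c in zip(q, k))  # broadcast result
    assert attn == e                   # attention grid lines up with edges
    node_out = (b, n, dim)             # sum over dim=2, then flatten heads
    edge_out = (b, n, n, dim)          # attn.flatten(3) before out_e
    return attn, node_out, edge_out
```

This is why the softmax runs over `dim=2`: that axis indexes the "other" node of each pair.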
loss.py DELETED
@@ -1,158 +0,0 @@
-
- import torch
-
- def discriminator_loss(generator, discriminator, mol_graph, adj, annot, batch_size, device, grad_pen, lambda_gp, z_edge, z_node):
-
-     # Compute loss with real molecules.
-     logits_real_disc = discriminator(mol_graph)
-     prediction_real = - torch.mean(logits_real_disc)
-
-     # Compute loss with fake molecules.
-     node, edge, node_sample, edge_sample = generator(z_edge, z_node)
-     graph = torch.cat((node_sample.view(batch_size, -1), edge_sample.view(batch_size, -1)), dim=-1)
-     logits_fake_disc = discriminator(graph.detach())
-     prediction_fake = torch.mean(logits_fake_disc)
-
-     # Compute gradient penalty.
-     eps = torch.rand(mol_graph.size(0), 1).to(device)
-     x_int0 = (eps * mol_graph + (1. - eps) * graph).requires_grad_(True)
-     grad0 = discriminator(x_int0)
-     d_loss_gp = grad_pen(grad0, x_int0)
-
-     # Calculate total loss.
-     d_loss = prediction_fake + prediction_real + d_loss_gp * lambda_gp
-
-     return node, edge, d_loss
-
-
- def generator_loss(generator, discriminator, v, adj, annot, batch_size, penalty, matrices2mol, fps_r, submodel, dataset_name):
-
-     # Compute loss with fake molecules.
-     node, edge, node_sample, edge_sample = generator(adj, annot)
-     graph = torch.cat((node_sample.view(batch_size, -1), edge_sample.view(batch_size, -1)), dim=-1)
-     logits_fake_disc = discriminator(graph)
-     prediction_fake = - torch.mean(logits_fake_disc)
-
-     # Produce molecules.
-     g_edges_hat_sample = torch.max(edge_sample, -1)[1]
-     g_nodes_hat_sample = torch.max(node_sample, -1)[1]
-     fake_mol = [matrices2mol(n_.data.cpu().numpy(), e_.data.cpu().numpy(), strict=True, file_name=dataset_name)
-                 for e_, n_ in zip(g_edges_hat_sample, g_nodes_hat_sample)]
-     g_loss = prediction_fake
-
-     # Compute penalty loss.
-     if submodel == "RL":
-         reward = penalty(fake_mol, fps_r)
-
-         # Reinforcement loss.
-         rew_fake = v(graph)
-         reward_loss = torch.mean(rew_fake) ** 2 + reward
-
-         # Calculate total loss.
-         g_loss = prediction_fake + reward_loss * 1
-
-     return g_loss, fake_mol, g_edges_hat_sample, g_nodes_hat_sample, node, edge
-
-
- def discriminator2_loss(generator, discriminator, mol_graph, adj, annot, batch_size, device, grad_pen, lambda_gp, akt1_adj, akt1_annot):
-
-     # Generate molecules.
-     dr_edges, dr_nodes = generator(adj,
-                                    annot,
-                                    akt1_adj,
-                                    akt1_annot)
-     dr_edges_hat = dr_edges.view(batch_size, -1)
-     dr_nodes_hat = dr_nodes.view(batch_size, -1)
-     dr_graph = torch.cat((dr_nodes_hat, dr_edges_hat), dim=-1)
-
-     # Compute loss with fake molecules.
-     dr_logits_fake = discriminator(dr_graph.detach())
-     d2_loss_fake = torch.mean(dr_logits_fake)
-
-     # Compute loss with real molecules.
-     dr_logits_real2 = discriminator(mol_graph)
-     d2_loss_real = - torch.mean(dr_logits_real2)
-
-     # Compute gradient penalty.
-     eps_dr = torch.rand(mol_graph.size(0), 1).to(device)
-     x_int0_dr = (eps_dr * mol_graph + (1. - eps_dr) * dr_graph).requires_grad_(True)
-     grad0_dr = discriminator(x_int0_dr)
-     d2_loss_gp = grad_pen(grad0_dr, x_int0_dr)
-
-     # Compute total loss.
-     d2_loss = d2_loss_fake + d2_loss_real + d2_loss_gp * lambda_gp
-
-     return d2_loss
-
-
- def generator2_loss(generator, discriminator, v, adj, annot, batch_size, penalty, matrices2mol, fps_r, ak1_adj, akt1_annot, submodel, drugs_name):
-
-     # Generate molecules.
-     dr_edges_g, dr_nodes_g = generator(adj,
-                                        annot,
-                                        ak1_adj,
-                                        akt1_annot)
-     dr_edges_hat_g = dr_edges_g.view(batch_size, -1)
-     dr_nodes_hat_g = dr_nodes_g.view(batch_size, -1)
-     dr_graph_g = torch.cat((dr_nodes_hat_g, dr_edges_hat_g), dim=-1)
-
-     # Compute loss with fake molecules.
-     dr_g_edges_hat_sample, dr_g_nodes_hat_sample = torch.max(dr_edges_g, -1)[1], torch.max(dr_nodes_g, -1)[1]
-     g_tra_logits_fake2 = discriminator(dr_graph_g)
-     g2_loss_fake = - torch.mean(g_tra_logits_fake2)
-
-     # Reward.
-     fake_mol_g = [matrices2mol(n_.data.cpu().numpy(), e_.data.cpu().numpy(), strict=True, file_name=drugs_name)
-                   for e_, n_ in zip(dr_g_edges_hat_sample, dr_g_nodes_hat_sample)]
-     g2_loss = g2_loss_fake
-     if submodel == "RL":
-         reward2 = penalty(fake_mol_g, fps_r)
-
-         # Reinforcement loss.
-         rew_fake2 = v(dr_graph_g)
-         reward_loss2 = torch.mean(rew_fake2) ** 2 + reward2
-
-         # Calculate total loss.
-         g2_loss = g2_loss_fake + reward_loss2 * 10
-
-     return g2_loss, fake_mol_g, dr_g_edges_hat_sample, dr_g_nodes_hat_sample  # , reward2
 
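The removed `discriminator_loss` is a WGAN-style critic objective: E[D(fake)] - E[D(real)] plus a gradient penalty scaled by `lambda_gp`. Reduced to plain scalars (a hedged sketch; the helper name is illustrative, not from the repository):

```python
def wgan_critic_loss(logits_real, logits_fake, grad_penalty, lambda_gp=10.0):
    """Scalar form of the critic loss in loss.py (sketch).

    prediction_real = -mean(D(real)), prediction_fake = mean(D(fake)),
    total = prediction_fake + prediction_real + lambda_gp * gradient penalty.
    """
    mean = lambda xs: sum(xs) / len(xs)
    return mean(logits_fake) - mean(logits_real) + lambda_gp * grad_penalty
```

In the real code the penalty term comes from `grad_pen` evaluated at a random interpolation `eps * real + (1 - eps) * fake`, which is what `x_int0` constructs.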
models.py DELETED
@@ -1,210 +0,0 @@
- import torch
- import torch.nn as nn
- import torch.nn.functional as F
- from layers import TransformerEncoder, TransformerDecoder
-
- class Generator(nn.Module):
-     """Generator network."""
-     def __init__(self, z_dim, act, vertexes, edges, nodes, dropout, dim, depth, heads, mlp_ratio, submodel):
-         super(Generator, self).__init__()
-
-         self.submodel = submodel
-         self.vertexes = vertexes
-         self.edges = edges
-         self.nodes = nodes
-         self.depth = depth
-         self.dim = dim
-         self.heads = heads
-         self.mlp_ratio = mlp_ratio
-         self.dropout = dropout
-         self.z_dim = z_dim
-
-         if act == "relu":
-             act = nn.ReLU()
-         elif act == "leaky":
-             act = nn.LeakyReLU()
-         elif act == "sigmoid":
-             act = nn.Sigmoid()
-         elif act == "tanh":
-             act = nn.Tanh()
-         self.features = vertexes * vertexes * edges + vertexes * nodes
-         self.transformer_dim = vertexes * vertexes * dim + vertexes * dim
-         self.pos_enc_dim = 5
-         #self.pos_enc = nn.Linear(self.pos_enc_dim, self.dim)
-
-         self.node_layers = nn.Sequential(nn.Linear(nodes, 64), act, nn.Linear(64, dim), act, nn.Dropout(self.dropout))
-         self.edge_layers = nn.Sequential(nn.Linear(edges, 64), act, nn.Linear(64, dim), act, nn.Dropout(self.dropout))
-
-         self.TransformerEncoder = TransformerEncoder(dim=self.dim, depth=self.depth, heads=self.heads, act=act,
-                                                      mlp_ratio=self.mlp_ratio, drop_rate=self.dropout)
-
-         self.readout_e = nn.Linear(self.dim, edges)
-         self.readout_n = nn.Linear(self.dim, nodes)
-         self.softmax = nn.Softmax(dim=-1)
-
-     def _generate_square_subsequent_mask(self, sz):
-         mask = (torch.triu(torch.ones(sz, sz)) == 1).transpose(0, 1)
-         mask = mask.float().masked_fill(mask == 0, float('-inf')).masked_fill(mask == 1, float(0.0))
-         return mask
-
-     def laplacian_positional_enc(self, adj):
-         A = adj
-         D = torch.diag(torch.count_nonzero(A, dim=-1))
-         L = torch.eye(A.shape[0], device=A.device) - D * A * D
-
-         EigVal, EigVec = torch.linalg.eig(L)
-
-         idx = torch.argsort(torch.real(EigVal))
-         EigVal, EigVec = EigVal[idx], torch.real(EigVec[:, idx])
-         pos_enc = EigVec[:, 1:self.pos_enc_dim + 1]
-
-         return pos_enc
-
-     def forward(self, z_e, z_n):
-         b, n, c = z_n.shape
-         _, _, _, d = z_e.shape
-         #random_mask_e = torch.randint(low=0, high=2, size=(b, n, n, d)).to(z_e.device).float()
-         #random_mask_n = torch.randint(low=0, high=2, size=(b, n, c)).to(z_n.device).float()
-         #z_e = F.relu(z_e - random_mask_e)
-         #z_n = F.relu(z_n - random_mask_n)
-         #mask = self._generate_square_subsequent_mask(self.vertexes).to(z_e.device)
-
-         node = self.node_layers(z_n)
-         edge = self.edge_layers(z_e)
-         edge = (edge + edge.permute(0, 2, 1, 3)) / 2
-
-         #lap = [self.laplacian_positional_enc(torch.max(x, -1)[1]) for x in edge]
-         #lap = torch.stack(lap).to(node.device)
-         #pos_enc = self.pos_enc(lap)
-         #node = node + pos_enc
-
-         node, edge = self.TransformerEncoder(node, edge)
-
-         node_sample = self.softmax(self.readout_n(node))
-         edge_sample = self.softmax(self.readout_e(edge))
-
-         return node, edge, node_sample, edge_sample
-
-
- class Generator2(nn.Module):
-     def __init__(self, dim, dec_dim, depth, heads, mlp_ratio, drop_rate, drugs_m_dim, drugs_b_dim, submodel):
-         super().__init__()
-         self.submodel = submodel
-         self.depth = depth
-         self.dim = dim
-         self.mlp_ratio = mlp_ratio
-         self.heads = heads
-         self.dropout_rate = drop_rate
-         self.drugs_m_dim = drugs_m_dim
-         self.drugs_b_dim = drugs_b_dim
-
-         self.pos_enc_dim = 5
-
-         if self.submodel == "Prot":
-             self.prot_n = torch.nn.Linear(3822, 45)      ## exact dimension of protein features
-             self.prot_e = torch.nn.Linear(298116, 2025)  ## exact dimension of protein features
-             self.protn_dim = torch.nn.Linear(1, dec_dim)
-             self.prote_dim = torch.nn.Linear(1, dec_dim)
-
-         self.mol_nodes = nn.Linear(dim, dec_dim)
-         self.mol_edges = nn.Linear(dim, dec_dim)
-
-         self.drug_nodes = nn.Linear(self.drugs_m_dim, dec_dim)
-         self.drug_edges = nn.Linear(self.drugs_b_dim, dec_dim)
-
-         self.TransformerDecoder = TransformerDecoder(dec_dim, depth, heads, mlp_ratio, drop_rate=self.dropout_rate)
-
-         self.nodes_output_layer = nn.Linear(dec_dim, self.drugs_m_dim)
-         self.edges_output_layer = nn.Linear(dec_dim, self.drugs_b_dim)
-         self.softmax = nn.Softmax(dim=-1)
-
-     def laplacian_positional_enc(self, adj):
-         A = adj
-         D = torch.diag(torch.count_nonzero(A, dim=-1))
-         L = torch.eye(A.shape[0], device=A.device) - D * A * D
-
-         EigVal, EigVec = torch.linalg.eig(L)
-
-         idx = torch.argsort(torch.real(EigVal))
-         EigVal, EigVec = EigVal[idx], torch.real(EigVec[:, idx])
-         pos_enc = EigVec[:, 1:self.pos_enc_dim + 1]
-
-         return pos_enc
-
-     def _generate_square_subsequent_mask(self, sz):
-         mask = (torch.triu(torch.ones(sz, sz)) == 1).transpose(0, 1)
-         mask = mask.float().masked_fill(mask == 0, float('-inf')).masked_fill(mask == 1, float(0.0))
-         return mask
-
-     def forward(self, edges_logits, nodes_logits, akt1_adj, akt1_annot):
-         edges_logits = self.mol_edges(edges_logits)
-         nodes_logits = self.mol_nodes(nodes_logits)
-
-         if self.submodel != "Prot":
-             akt1_annot = self.drug_nodes(akt1_annot)
-             akt1_adj = self.drug_edges(akt1_adj)
-         else:
-             akt1_adj = self.prote_dim(self.prot_e(akt1_adj).view(1, 45, 45, 1))
-             akt1_annot = self.protn_dim(self.prot_n(akt1_annot).view(1, 45, 1))
-
-         #lap = [self.laplacian_positional_enc(torch.max(x, -1)[1]) for x in drug_e]
-         #lap = torch.stack(lap).to(drug_e.device)
-         #pos_enc = self.pos_enc(lap)
-         #drug_n = drug_n + pos_enc
-
-         if self.submodel == "Ligand" or self.submodel == "RL":
-             nodes_logits, akt1_annot, edges_logits, akt1_adj = self.TransformerDecoder(akt1_annot, nodes_logits, akt1_adj, edges_logits)
-         else:
-             nodes_logits, akt1_annot, edges_logits, akt1_adj = self.TransformerDecoder(nodes_logits, akt1_annot, edges_logits, akt1_adj)
-
-         edges_logits = self.edges_output_layer(edges_logits)
-         nodes_logits = self.nodes_output_layer(nodes_logits)
-
-         edges_logits = self.softmax(edges_logits)
-         nodes_logits = self.softmax(nodes_logits)
-
-         return edges_logits, nodes_logits
-
-
- class simple_disc(nn.Module):
-     def __init__(self, act, m_dim, vertexes, b_dim):
-         super().__init__()
-         if act == "relu":
-             act = nn.ReLU()
-         elif act == "leaky":
-             act = nn.LeakyReLU()
-         elif act == "sigmoid":
-             act = nn.Sigmoid()
-         elif act == "tanh":
-             act = nn.Tanh()
-         features = vertexes * m_dim + vertexes * vertexes * b_dim
-
-         self.predictor = nn.Sequential(nn.Linear(features, 256), act, nn.Linear(256, 128), act, nn.Linear(128, 64), act,
-                                        nn.Linear(64, 32), act, nn.Linear(32, 16), act,
-                                        nn.Linear(16, 1))
-
-     def forward(self, x):
-         prediction = self.predictor(x)
-         #prediction = F.softmax(prediction, dim=-1)
-         return prediction
 
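The removed `simple_disc` scores a whole graph flattened into a single vector of length `vertexes * m_dim + vertexes * vertexes * b_dim` (node one-hots followed by adjacency one-hots). A plain-Python sketch of that flattening (illustrative names; nested lists stand in for tensors):

```python
def flatten_graph(annot, adj):
    """Flatten one graph into the vector simple_disc consumes (sketch).

    annot: vertexes x m_dim node one-hots;
    adj:   vertexes x vertexes x b_dim bond one-hots.
    Mirrors torch.cat((nodes.view(b, -1), edges.view(b, -1)), dim=-1).
    """
    nodes = [v for row in annot for v in row]
    edges = [v for row in adj for cell in row for v in cell]
    return nodes + edges

def disc_feature_dim(vertexes, m_dim, b_dim):
    # Input width of simple_disc's first Linear layer.
    return vertexes * m_dim + vertexes * vertexes * b_dim
```

Flattening the adjacency one-hots rather than using message passing is a design choice: the discriminator sees every possible bond slot, at the cost of an input that grows quadratically with `vertexes`.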
new_dataloader.py DELETED
@@ -1,293 +0,0 @@
- import pickle
- import numpy as np
- import torch
- from rdkit import Chem
- from torch_geometric.data import (Data, InMemoryDataset)
- import os.path as osp
- from tqdm import tqdm
- import re
- from rdkit import RDLogger
- RDLogger.DisableLog('rdApp.*')
-
- class DruggenDataset(InMemoryDataset):
-
-     def __init__(self, root, dataset_file, raw_files, max_atom, features, transform=None, pre_transform=None, pre_filter=None):
-         self.dataset_name = dataset_file.split(".")[0]
-         self.dataset_file = dataset_file
-         self.raw_files = raw_files
-         self.max_atom = max_atom
-         self.features = features
-         super().__init__(root, transform, pre_transform, pre_filter)
-         path = osp.join(self.processed_dir, dataset_file)
-         self.data, self.slices = torch.load(path)
-         self.root = root
-
-     @property
-     def processed_dir(self):
-         return self.root
-
-     @property
-     def raw_file_names(self):
-         return self.raw_files
-
-     @property
-     def processed_file_names(self):
-         return self.dataset_file
-
-     def _generate_encoders_decoders(self, data):
-         self.data = data
-         print('Creating atoms encoder and decoder..')
-         atom_labels = sorted(set([atom.GetAtomicNum() for mol in self.data for atom in mol.GetAtoms()] + [0]))
-         self.atom_encoder_m = {l: i for i, l in enumerate(atom_labels)}
-         self.atom_decoder_m = {i: l for i, l in enumerate(atom_labels)}
-         self.atom_num_types = len(atom_labels)
-         print('Created atoms encoder and decoder with {} atom types and 1 PAD symbol!'.format(
-             self.atom_num_types - 1))
-         print("atom_labels", atom_labels)
-         print('Creating bonds encoder and decoder..')
-         bond_labels = [Chem.rdchem.BondType.ZERO] + list(sorted(set(bond.GetBondType()
-                                                                     for mol in self.data
-                                                                     for bond in mol.GetBonds())))
-         print("bond labels", bond_labels)
-         self.bond_encoder_m = {l: i for i, l in enumerate(bond_labels)}
-         self.bond_decoder_m = {i: l for i, l in enumerate(bond_labels)}
-         self.bond_num_types = len(bond_labels)
-         print('Created bonds encoder and decoder with {} bond types and 1 PAD symbol!'.format(
-             self.bond_num_types - 1))
-         #dataset_names = str(self.dataset_name)
-         with open("data/encoders/" + "atom_" + self.dataset_name + ".pkl", "wb") as atom_encoders:
-             pickle.dump(self.atom_encoder_m, atom_encoders)
-
-         with open("data/decoders/" + "atom_" + self.dataset_name + ".pkl", "wb") as atom_decoders:
-             pickle.dump(self.atom_decoder_m, atom_decoders)
-
-         with open("data/encoders/" + "bond_" + self.dataset_name + ".pkl", "wb") as bond_encoders:
-             pickle.dump(self.bond_encoder_m, bond_encoders)
-
-         with open("data/decoders/" + "bond_" + self.dataset_name + ".pkl", "wb") as bond_decoders:
-             pickle.dump(self.bond_decoder_m, bond_decoders)
-
-     def _genA(self, mol, connected=True, max_length=None):
-         max_length = max_length if max_length is not None else mol.GetNumAtoms()
-
-         A = np.zeros(shape=(max_length, max_length))
-
-         begin, end = [b.GetBeginAtomIdx() for b in mol.GetBonds()], [b.GetEndAtomIdx() for b in mol.GetBonds()]
-         bond_type = [self.bond_encoder_m[b.GetBondType()] for b in mol.GetBonds()]
-
-         A[begin, end] = bond_type
-         A[end, begin] = bond_type
-
-         degree = np.sum(A[:mol.GetNumAtoms(), :mol.GetNumAtoms()], axis=-1)
-
-         return A if connected and (degree > 0).all() else None
-
-     def _genX(self, mol, max_length=None):
-         max_length = max_length if max_length is not None else mol.GetNumAtoms()
-
-         return np.array([self.atom_encoder_m[atom.GetAtomicNum()] for atom in mol.GetAtoms()] + [0] * (
-             max_length - mol.GetNumAtoms()))
-
-     def _genF(self, mol, max_length=None):
-         max_length = max_length if max_length is not None else mol.GetNumAtoms()
-
-         features = np.array([[*[a.GetDegree() == i for i in range(5)],
-                               *[a.GetExplicitValence() == i for i in range(9)],
-                               *[int(a.GetHybridization()) == i for i in range(1, 7)],
-                               *[a.GetImplicitValence() == i for i in range(9)],
-                               a.GetIsAromatic(),
-                               a.GetNoImplicit(),
-                               *[a.GetNumExplicitHs() == i for i in range(5)],
-                               *[a.GetNumImplicitHs() == i for i in range(5)],
-                               *[a.GetNumRadicalElectrons() == i for i in range(5)],
-                               a.IsInRing(),
-                               *[a.IsInRingSize(i) for i in range(2, 9)]] for a in mol.GetAtoms()], dtype=np.int32)
-
-         return np.vstack((features, np.zeros((max_length - features.shape[0], features.shape[1]))))
-
-     def decoder_load(self, dictionary_name, file):
-         with open("data/decoders/" + dictionary_name + "_" + file + '.pkl', 'rb') as f:
-             return pickle.load(f)
-
-     def drugs_decoder_load(self, dictionary_name):
-         with open("data/decoders/" + dictionary_name + '.pkl', 'rb') as f:
-             return pickle.load(f)
-
-     def matrices2mol(self, node_labels, edge_labels, strict=True, file_name=None):
-         mol = Chem.RWMol()
-         RDLogger.DisableLog('rdApp.*')
-         atom_decoders = self.decoder_load("atom", file_name)
-         bond_decoders = self.decoder_load("bond", file_name)
-
-         for node_label in node_labels:
-             mol.AddAtom(Chem.Atom(atom_decoders[node_label]))
-
-         for start, end in zip(*np.nonzero(edge_labels)):
-             if start > end:
-                 mol.AddBond(int(start), int(end), bond_decoders[edge_labels[start, end]])
-         #mol = self.correct_mol(mol)
-         if strict:
-             try:
-                 Chem.SanitizeMol(mol)
-             except:
-                 mol = None
-
-         return mol
-
-     def drug_decoder_load(self, dictionary_name, file):
-         ''' Loading the atom and bond decoders '''
-         with open("data/decoders/" + dictionary_name + "_" + file + '.pkl', 'rb') as f:
-             return pickle.load(f)
-
-     def matrices2mol_drugs(self, node_labels, edge_labels, strict=True, file_name=None):
-         mol = Chem.RWMol()
-         RDLogger.DisableLog('rdApp.*')
-         atom_decoders = self.drug_decoder_load("atom", file_name)
-         bond_decoders = self.drug_decoder_load("bond", file_name)
-
-         for node_label in node_labels:
-             mol.AddAtom(Chem.Atom(atom_decoders[node_label]))
-
-         for start, end in zip(*np.nonzero(edge_labels)):
-             if start > end:
-                 mol.AddBond(int(start), int(end), bond_decoders[edge_labels[start, end]])
-         #mol = self.correct_mol(mol)
-         if strict:
-             try:
-                 Chem.SanitizeMol(mol)
-             except:
-                 mol = None
-
-         return mol
-
-     def check_valency(self, mol):
-         """
-         Checks that no atoms in the mol have exceeded their possible
-         valency
-         :return: True if no valency issues, False otherwise
-         """
-         try:
-             Chem.SanitizeMol(mol, sanitizeOps=Chem.SanitizeFlags.SANITIZE_PROPERTIES)
-             return True, None
-         except ValueError as e:
-             e = str(e)
-             p = e.find('#')
-             e_sub = e[p:]
-             atomid_valence = list(map(int, re.findall(r'\d+', e_sub)))
-             return False, atomid_valence
-
-     def correct_mol(self, x):
-         xsm = Chem.MolToSmiles(x, isomericSmiles=True)
-         mol = x
-         while True:
-             flag, atomid_valence = self.check_valency(mol)
-             if flag:
-                 break
-             else:
-                 assert len(atomid_valence) == 2
-                 idx = atomid_valence[0]
-                 v = atomid_valence[1]
-                 queue = []
-                 for b in mol.GetAtomWithIdx(idx).GetBonds():
-                     queue.append(
-                         (b.GetIdx(), int(b.GetBondType()), b.GetBeginAtomIdx(), b.GetEndAtomIdx())
-                     )
-                 queue.sort(key=lambda tup: tup[1], reverse=True)
-                 if len(queue) > 0:
-                     start = queue[0][2]
-                     end = queue[0][3]
-                     t = queue[0][1] - 1
-                     mol.RemoveBond(start, end)
-
-                     #if t >= 1:
-                     #    mol.AddBond(start, end, self.decoder_load('bond_decoders')[t])
-                     # if '.' in Chem.MolToSmiles(mol, isomericSmiles=True):
-                     #     mol.AddBond(start, end, self.decoder_load('bond_decoders')[t])
-                     #     print(tt)
-                     #     print(Chem.MolToSmiles(mol, isomericSmiles=True))
-
-         return mol
-
-     def label2onehot(self, labels, dim):
-         """Convert label indices to one-hot vectors."""
-         out = torch.zeros(list(labels.size()) + [dim])
-         out.scatter_(len(out.size()) - 1, labels.unsqueeze(-1), 1.)
-
-         return out.float()
-
-     def process(self, size=None):
-         mols = [Chem.MolFromSmiles(line) for line in open(self.raw_files, 'r').readlines()]
-         mols = list(filter(lambda x: x.GetNumAtoms() <= self.max_atom, mols))
-         mols = mols[:size]
-         indices = range(len(mols))
-
-         self._generate_encoders_decoders(mols)
-
-         pbar = tqdm(total=len(indices))
-         pbar.set_description(f'Processing chembl dataset')
-         max_length = max(mol.GetNumAtoms() for mol in mols)
-         data_list = []
-
-         self.m_dim = len(self.atom_decoder_m)
-         for idx in indices:
-             mol = mols[idx]
-             A = self._genA(mol, connected=True, max_length=max_length)
-             if A is not None:
-                 x = torch.from_numpy(self._genX(mol, max_length=max_length)).to(torch.long).view(1, -1)
-                 x = self.label2onehot(x, self.m_dim).squeeze()
-                 if self.features:
-                     f = torch.from_numpy(self._genF(mol, max_length=max_length)).to(torch.long).view(x.shape[0], -1)
-                     x = torch.concat((x, f), dim=-1)
-
-                 adjacency = torch.from_numpy(A)
-
-                 edge_index = adjacency.nonzero(as_tuple=False).t().contiguous()
-                 edge_attr = adjacency[edge_index[0], edge_index[1]].to(torch.long)
-
-                 data = Data(x=x, edge_index=edge_index, edge_attr=edge_attr)
-
-                 if self.pre_filter is not None and not self.pre_filter(data):
-                     continue
-
-                 if self.pre_transform is not None:
-                     data = self.pre_transform(data)
-
-                 data_list.append(data)
-                 pbar.update(1)
-
-         pbar.close()
-
-         torch.save(self.collate(data_list), osp.join(self.processed_dir, self.dataset_file))
-
-
- if __name__ == '__main__':
-     data = DruggenDataset("data")
-
 
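The removed `check_valency` recovers the offending atom index and valence by parsing RDKit's sanitize error message from the `#` onward. The same parsing, isolated as a stdlib-only sketch (the message in the usage note is an assumed example of RDKit's error format; the function name is illustrative):

```python
import re

def parse_valence_error(message):
    """Extract [atom index, valence] from a sanitize error, as check_valency does."""
    # Keep everything from '#' on, then pull out the integers:
    # the first is the atom index, the second the offending valence.
    tail = message[message.find('#'):]
    return list(map(int, re.findall(r'\d+', tail)))
```

For a message like "Explicit valence for atom # 4 N, 5, is greater than permitted" this yields `[4, 5]`, which `correct_mol` then uses to pick a bond on atom 4 to remove.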
packages.txt DELETED
@@ -1 +0,0 @@
- libcairo2-dev
 
 
requirements.txt DELETED
@@ -1,12 +0,0 @@
- torch
- rdkit-pypi
- tqdm
- numpy
- seaborn
- matplotlib
- pandas
- torch_geometric
- # demo related installs
- streamlit
- ipython
- streamlit-ext
 
trainer.py DELETED
@@ -1,927 +0,0 @@
- import os
- import time
- import torch.nn
- import torch
-
- from utils import *
- from models import Generator, Generator2, simple_disc
- import torch_geometric.utils as geoutils
- #import wandb
- import re
- from torch_geometric.loader import DataLoader
- from new_dataloader import DruggenDataset
- import torch.utils.data
- from rdkit import RDLogger
- import pickle
- from rdkit.Chem.Scaffolds import MurckoScaffold
- torch.set_num_threads(5)
- RDLogger.DisableLog('rdApp.*')
- from loss import discriminator_loss, generator_loss, discriminator2_loss, generator2_loss
- from training_data import load_data
- import random
- from tqdm import tqdm
-
- class Trainer(object):
-
-     """Trainer for training and testing DrugGEN."""
-
-     def __init__(self, config):
-
-         if config.set_seed:
-             np.random.seed(config.seed)
-             random.seed(config.seed)
-             torch.manual_seed(config.seed)
-             torch.cuda.manual_seed(config.seed)
-
-             torch.backends.cudnn.deterministic = True
-             torch.backends.cudnn.benchmark = False
-
-             os.environ["PYTHONHASHSEED"] = str(config.seed)
-
-             print(f'Using seed {config.seed}')
-
-         self.device = torch.device("cuda" if torch.cuda.is_available() else 'cpu')
-         """Initialize configurations."""
-         self.submodel = config.submodel
-         self.inference_model = config.inference_model
-
-         # Data loader.
-         self.raw_file = config.raw_file  # SMILES containing text file for first dataset.
-         # Write the full path to file.
-
-         self.drug_raw_file = config.drug_raw_file  # SMILES containing text file for second dataset.
-         # Write the full path to file.
-
-         self.dataset_file = config.dataset_file  # Dataset file name for the first GAN.
-         # Contains large number of molecules.
-
-         self.drugs_dataset_file = config.drug_dataset_file  # Drug dataset file name for the second GAN.
-         # Contains drug molecules only. (In this case AKT1 inhibitors.)
-
-         self.inf_raw_file = config.inf_raw_file  # SMILES containing text file for first dataset.
-         # Write the full path to file.
-
-         self.inf_drug_raw_file = config.inf_drug_raw_file  # SMILES containing text file for second dataset.
-         # Write the full path to file.
-
-         self.inf_dataset_file = config.inf_dataset_file  # Dataset file name for the first GAN.
-         # Contains large number of molecules.
-
-         self.inf_drugs_dataset_file = config.inf_drug_dataset_file  # Drug dataset file name for the second GAN.
-         # Contains drug molecules only. (In this case AKT1 inhibitors.)
-         self.inference_iterations = config.inference_iterations
-
-         self.inf_batch_size = config.inf_batch_size
-
-         self.mol_data_dir = config.mol_data_dir  # Directory where the dataset files are stored.
-
-         self.drug_data_dir = config.drug_data_dir  # Directory where the drug dataset files are stored.
-
-         self.dataset_name = self.dataset_file.split(".")[0]
-         self.drugs_name = self.drugs_dataset_file.split(".")[0]
-
-         self.max_atom = config.max_atom  # Model is based on one-shot generation.
-         # Max atom number for molecules must be specified.
-
-         self.features = config.features  # Small model uses atom types as node features. (Boolean, False uses atom types only.)
-         # Additional node features can be added. Please check new_dataloader.py Line 102.
-
-         self.batch_size = config.batch_size  # Batch size for training.
-
-         self.dataset = DruggenDataset(self.mol_data_dir,
-                                       self.dataset_file,
-                                       self.raw_file,
-                                       self.max_atom,
-                                       self.features)  # Dataset for the first GAN. Custom dataset class from PyG parent class.
-         # Can create any molecular graph dataset given smiles string.
-         # Nonisomeric SMILES are suggested but not necessary.
-         # Uses sparse matrix representation for graphs,
-         # For computational and speed efficiency.
-
-         self.loader = DataLoader(self.dataset,
-                                  shuffle=True,
-                                  batch_size=self.batch_size,
-                                  drop_last=True)  # PyG dataloader for the first GAN.
-
-         self.drugs = DruggenDataset(self.drug_data_dir,
-                                     self.drugs_dataset_file,
-                                     self.drug_raw_file,
-                                     self.max_atom,
-                                     self.features)  # Dataset for the second GAN. Custom dataset class from PyG parent class.
-         # Can create any molecular graph dataset given smiles string.
-         # Nonisomeric SMILES are suggested but not necessary.
-         # Uses sparse matrix representation for graphs,
-         # For computational and speed efficiency.
-
-         self.drugs_loader = DataLoader(self.drugs,
-                                        shuffle=True,
-                                        batch_size=self.batch_size,
-                                        drop_last=True)  # PyG dataloader for the second GAN.
-
-         # Atom and bond type dimensions for the construction of the model.
-
-         self.atom_decoders = self.decoder_load("atom")  # Atom type decoders for first GAN.
-         # eg. 0:0, 1:6 (C), 2:7 (N), 3:8 (O), 4:9 (F)
-
-         self.bond_decoders = self.decoder_load("bond")  # Bond type decoders for first GAN.
-         # eg. 0: (no-bond), 1: (single), 2: (double), 3: (triple), 4: (aromatic)
-
-         self.m_dim = len(self.atom_decoders) if not self.features else int(self.loader.dataset[0].x.shape[1])  # Atom type dimension.
-
-         self.b_dim = len(self.bond_decoders)  # Bond type dimension.
-
-         self.vertexes = int(self.loader.dataset[0].x.shape[0])  # Number of nodes in the graph.
-
-         self.drugs_atom_decoders = self.drug_decoder_load("atom")  # Atom type decoders for second GAN.
-         # eg. 0:0, 1:6 (C), 2:7 (N), 3:8 (O), 4:9 (F)
-
-         self.drugs_bond_decoders = self.drug_decoder_load("bond")  # Bond type decoders for second GAN.
-         # eg. 0: (no-bond), 1: (single), 2: (double), 3: (triple), 4: (aromatic)
-
-         self.drugs_m_dim = len(self.drugs_atom_decoders) if not self.features else int(self.drugs_loader.dataset[0].x.shape[1])  # Atom type dimension.
-
-         self.drugs_b_dim = len(self.drugs_bond_decoders)  # Bond type dimension.
-
-         self.drug_vertexes = int(self.drugs_loader.dataset[0].x.shape[0])  # Number of nodes in the graph.
-
-         # Transformer and Convolution configurations.
-
-         self.act = config.act
-         self.z_dim = config.z_dim
-         self.lambda_gp = config.lambda_gp
-         self.dim = config.dim
-         self.depth = config.depth
-         self.heads = config.heads
-         self.mlp_ratio = config.mlp_ratio
-         self.dec_depth = config.dec_depth
-         self.dec_heads = config.dec_heads
-         self.dec_dim = config.dec_dim
-         self.dis_select = config.dis_select
-
-         """self.la = config.la
-         self.la2 = config.la2
-         self.gcn_depth = config.gcn_depth
-         self.g_conv_dim = config.g_conv_dim
-         self.d_conv_dim = config.d_conv_dim"""
-         """# PNA config
-
-         self.agg = config.aggregators
-         self.sca = config.scalers
-         self.pna_in_ch = config.pna_in_ch
-         self.pna_out_ch = config.pna_out_ch
-         self.edge_dim = config.edge_dim
-         self.towers = config.towers
-         self.pre_lay = config.pre_lay
-         self.post_lay = config.post_lay
-         self.pna_layer_num = config.pna_layer_num
-         self.graph_add = config.graph_add"""
-
-         # Training configurations.
-
-         self.epoch = config.epoch
-         self.g_lr = config.g_lr
-         self.d_lr = config.d_lr
-         self.g2_lr = config.g2_lr
-         self.d2_lr = config.d2_lr
-         self.dropout = config.dropout
-         self.dec_dropout = config.dec_dropout
-         self.n_critic = config.n_critic
-         self.beta1 = config.beta1
-         self.beta2 = config.beta2
-         self.resume_iters = config.resume_iters
-         self.warm_up_steps = config.warm_up_steps
-
-         # Test configurations.
-
-         self.num_test_epoch = config.num_test_epoch
-         self.test_iters = config.test_iters
-         self.inference_sample_num = config.inference_sample_num
-
-         # Directories.
-
-         self.log_dir = config.log_dir
-         self.sample_dir = config.sample_dir
-         self.model_save_dir = config.model_save_dir
-         self.result_dir = config.result_dir
-
-         # Step size.
-
-         self.log_step = config.log_sample_step
-         self.clipping_value = config.clipping_value
-
-         # Miscellaneous.
-
-         # resume training
-         self.resume = config.resume
-         self.resume_epoch = config.resume_epoch
-         self.resume_iter = config.resume_iter
-         self.resume_directory = config.resume_directory
-
-         self.mode = config.mode
-
-         self.noise_strength_0 = torch.nn.Parameter(torch.zeros([]))
-         self.noise_strength_1 = torch.nn.Parameter(torch.zeros([]))
-         self.noise_strength_2 = torch.nn.Parameter(torch.zeros([]))
-         self.noise_strength_3 = torch.nn.Parameter(torch.zeros([]))
-
-         self.init_type = config.init_type
-         self.build_model()
-
-     def build_model(self):
-         """Create generators and discriminators."""
-
-         ''' Generator is based on Transformer Encoder:
-
-             @ g_conv_dim: Dimensions for first MLP layers before Transformer Encoder
-             @ vertexes: maximum length of generated molecules (atom length)
-             @ b_dim: number of bond types
-             @ m_dim: number of atom types (or number of features used)
-             @ dropout: dropout possibility
-             @ dim: Hidden dimension of Transformer Encoder
-             @ depth: Transformer layer number
-             @ heads: Number of multihead-attention heads
-             @ mlp_ratio: Read-out layer dimension of Transformer
-             @ drop_rate: deprecated
-             @ tra_conv: Whether module creates output for TransformerConv discriminator
-         '''
-
-         self.G = Generator(self.z_dim,
-                            self.act,
-                            self.vertexes,
-                            self.b_dim,
-                            self.m_dim,
-                            self.dropout,
-                            dim=self.dim,
-                            depth=self.depth,
-                            heads=self.heads,
-                            mlp_ratio=self.mlp_ratio,
-                            submodel=self.submodel)
-
-         self.G2 = Generator2(self.dim,
-                              self.dec_dim,
-                              self.dec_depth,
-                              self.dec_heads,
-                              self.mlp_ratio,
-                              self.dec_dropout,
-                              self.drugs_m_dim,
-                              self.drugs_b_dim,
-                              self.submodel)
-
-         ''' Discriminator implementation with PNA:
-
-             @ deg: Degree distribution based on used data. (Created with _genDegree() function)
-             @ agg: aggregators used in PNA
-             @ sca: scalers used in PNA
-             @ pna_in_ch: First PNA hidden dimension
-             @ pna_out_ch: Last PNA hidden dimension
-             @ edge_dim: Edge hidden dimension
-             @ towers: Number of towers (Splitting the hidden dimension to multiple parallel processes)
-             @ pre_lay: Pre-transformation layer
-             @ post_lay: Post-transformation layer
-             @ pna_layer_num: number of PNA layers
-             @ graph_add: global pooling layer selection
-         '''
-
-         ''' Discriminator implementation with Graph Convolution:
-
-             @ d_conv_dim: convolution dimensions for GCN
-             @ m_dim: number of atom types (or number of features used)
-             @ b_dim: number of bond types
-             @ dropout: dropout possibility
-         '''
-
-         ''' Discriminator implementation with MLP:
-
-             @ act: Activation function for MLP
-             @ m_dim: number of atom types (or number of features used)
-             @ b_dim: number of bond types
-             @ dropout: dropout possibility
-             @ vertexes: maximum length of generated molecules (molecule length)
-         '''
-
-         #self.D = Discriminator_old(self.d_conv_dim, self.m_dim, self.b_dim, self.dropout, self.gcn_depth)
-         self.D2 = simple_disc("tanh", self.drugs_m_dim, self.drug_vertexes, self.drugs_b_dim)
-         self.D = simple_disc("tanh", self.m_dim, self.vertexes, self.b_dim)
-         self.V = simple_disc("tanh", self.m_dim, self.vertexes, self.b_dim)
-         self.V2 = simple_disc("tanh", self.drugs_m_dim, self.drug_vertexes, self.drugs_b_dim)
-
-         ''' Optimizers for G1, G2, D1, and D2:
-
-             Adam Optimizer is used and different beta1 and beta2s are used for GAN1 and GAN2
-         '''
-
-         self.g_optimizer = torch.optim.AdamW(self.G.parameters(), self.g_lr, [self.beta1, self.beta2])
-         self.g2_optimizer = torch.optim.AdamW(self.G2.parameters(), self.g2_lr, [self.beta1, self.beta2])
-
-         self.d_optimizer = torch.optim.AdamW(self.D.parameters(), self.d_lr, [self.beta1, self.beta2])
-         self.d2_optimizer = torch.optim.AdamW(self.D2.parameters(), self.d2_lr, [self.beta1, self.beta2])
-
-         self.v_optimizer = torch.optim.AdamW(self.V.parameters(), self.d_lr, [self.beta1, self.beta2])
-         self.v2_optimizer = torch.optim.AdamW(self.V2.parameters(), self.d2_lr, [self.beta1, self.beta2])
-
-         ''' Learning rate scheduler:
-
-             Changes learning rate based on loss.
-         '''
-
-         #self.scheduler_g = ReduceLROnPlateau(self.g_optimizer, mode='min', factor=0.5, patience=10, min_lr=0.00001)
-         #self.scheduler_d = ReduceLROnPlateau(self.d_optimizer, mode='min', factor=0.5, patience=10, min_lr=0.00001)
-         #self.scheduler_v = ReduceLROnPlateau(self.v_optimizer, mode='min', factor=0.5, patience=10, min_lr=0.00001)
-         #self.scheduler_g2 = ReduceLROnPlateau(self.g2_optimizer, mode='min', factor=0.5, patience=10, min_lr=0.00001)
-         #self.scheduler_d2 = ReduceLROnPlateau(self.d2_optimizer, mode='min', factor=0.5, patience=10, min_lr=0.00001)
-         #self.scheduler_v2 = ReduceLROnPlateau(self.v2_optimizer, mode='min', factor=0.5, patience=10, min_lr=0.00001)
-         self.print_network(self.G, 'G')
-         self.print_network(self.D, 'D')
-
-         self.print_network(self.G2, 'G2')
-         self.print_network(self.D2, 'D2')
-
-         self.G.to(self.device)
-         self.D.to(self.device)
-
-         self.V.to(self.device)
-         self.V2.to(self.device)
-         self.G2.to(self.device)
-         self.D2.to(self.device)
-
-         #self.modules_of_the_model = (self.G, self.D, self.G2, self.D2)
-         """for p in self.G.parameters():
-             if p.dim() > 1:
-                 if self.init_type == 'uniform':
-                     torch.nn.init.xavier_uniform_(p)
-                 elif self.init_type == 'normal':
-                     torch.nn.init.xavier_normal_(p)
-                 elif self.init_type == 'random_normal':
-                     torch.nn.init.normal_(p, 0.0, 0.02)
-         for p in self.G2.parameters():
-             if p.dim() > 1:
-                 if self.init_type == 'uniform':
-                     torch.nn.init.xavier_uniform_(p)
-                 elif self.init_type == 'normal':
-                     torch.nn.init.xavier_normal_(p)
-                 elif self.init_type == 'random_normal':
-                     torch.nn.init.normal_(p, 0.0, 0.02)
-         if self.dis_select == "conv":
-             for p in self.D.parameters():
-                 if p.dim() > 1:
-                     if self.init_type == 'uniform':
-                         torch.nn.init.xavier_uniform_(p)
-                     elif self.init_type == 'normal':
-                         torch.nn.init.xavier_normal_(p)
-                     elif self.init_type == 'random_normal':
-                         torch.nn.init.normal_(p, 0.0, 0.02)
-
-         if self.dis_select == "conv":
-             for p in self.D2.parameters():
-                 if p.dim() > 1:
-                     if self.init_type == 'uniform':
-                         torch.nn.init.xavier_uniform_(p)
-                     elif self.init_type == 'normal':
-                         torch.nn.init.xavier_normal_(p)
-                     elif self.init_type == 'random_normal':
-                         torch.nn.init.normal_(p, 0.0, 0.02)"""
-
-     def decoder_load(self, dictionary_name):
-         ''' Loading the atom and bond decoders '''
-         with open("data/decoders/" + dictionary_name + "_" + self.dataset_name + '.pkl', 'rb') as f:
-             return pickle.load(f)
-
-     def drug_decoder_load(self, dictionary_name):
-         ''' Loading the atom and bond decoders '''
-         with open("data/decoders/" + dictionary_name + "_" + self.drugs_name + '.pkl', 'rb') as f:
-             return pickle.load(f)
-
-     def print_network(self, model, name):
-         """Print out the network information."""
-         num_params = 0
-         for p in model.parameters():
-             num_params += p.numel()
-         print(model)
-         print(name)
-         print("The number of parameters: {}".format(num_params))
-
-     def restore_model(self, epoch, iteration, model_directory):
-         """Restore the trained generator and discriminator."""
-         print('Loading the trained models from epoch / iteration {}-{}...'.format(epoch, iteration))
-
-         G_path = os.path.join(model_directory, '{}-{}-G.ckpt'.format(epoch, iteration))
-         D_path = os.path.join(model_directory, '{}-{}-D.ckpt'.format(epoch, iteration))
-
-         self.G.load_state_dict(torch.load(G_path, map_location=lambda storage, loc: storage))
-         self.D.load_state_dict(torch.load(D_path, map_location=lambda storage, loc: storage))
-
-         G2_path = os.path.join(model_directory, '{}-{}-G2.ckpt'.format(epoch, iteration))
-         D2_path = os.path.join(model_directory, '{}-{}-D2.ckpt'.format(epoch, iteration))
-
-         self.G2.load_state_dict(torch.load(G2_path, map_location=lambda storage, loc: storage))
-         self.D2.load_state_dict(torch.load(D2_path, map_location=lambda storage, loc: storage))
-
-     def save_model(self, model_directory, idx, i):
-         G_path = os.path.join(model_directory, '{}-{}-G.ckpt'.format(idx+1, i+1))
-         D_path = os.path.join(model_directory, '{}-{}-D.ckpt'.format(idx+1, i+1))
-         torch.save(self.G.state_dict(), G_path)
-         torch.save(self.D.state_dict(), D_path)
-
-         if self.submodel != "NoTarget" and self.submodel != "CrossLoss":
-             G2_path = os.path.join(model_directory, '{}-{}-G2.ckpt'.format(idx+1, i+1))
-             D2_path = os.path.join(model_directory, '{}-{}-D2.ckpt'.format(idx+1, i+1))
-
-             torch.save(self.G2.state_dict(), G2_path)
-             torch.save(self.D2.state_dict(), D2_path)
-
-     def reset_grad(self):
-         """Reset the gradient buffers."""
-         self.g_optimizer.zero_grad()
-         self.v_optimizer.zero_grad()
-         self.g2_optimizer.zero_grad()
-         self.v2_optimizer.zero_grad()
-
-         self.d_optimizer.zero_grad()
-         self.d2_optimizer.zero_grad()
-
-     def gradient_penalty(self, y, x):
-         """Compute gradient penalty: (L2_norm(dy/dx) - 1)**2."""
-         weight = torch.ones(y.size(), requires_grad=False).to(self.device)
-         dydx = torch.autograd.grad(outputs=y,
-                                    inputs=x,
-                                    grad_outputs=weight,
-                                    retain_graph=True,
-                                    create_graph=True,
-                                    only_inputs=True)[0]
-
-         dydx = dydx.view(dydx.size(0), -1)
-         gradient_penalty = ((dydx.norm(2, dim=1) - 1) ** 2).mean()
-
-         return gradient_penalty
-
-     def train(self):
-         ''' Training Script starts from here'''
-
-         #wandb.config = {'beta2': 0.999}
-         #wandb.init(project="DrugGEN2", entity="atabeyunlu")
-
-         # Defining sampling paths and creating logger
-         self.arguments = "{}_glr{}_dlr{}_g2lr{}_d2lr{}_dim{}_depth{}_heads{}_decdepth{}_decheads{}_ncritic{}_batch{}_epoch{}_warmup{}_dataset{}_dropout{}".format(self.submodel, self.g_lr, self.d_lr, self.g2_lr, self.d2_lr, self.dim, self.depth, self.heads, self.dec_depth, self.dec_heads, self.n_critic, self.batch_size, self.epoch, self.warm_up_steps, self.dataset_name, self.dropout)
-
-         self.model_directory = os.path.join(self.model_save_dir, self.arguments)
-         self.sample_directory = os.path.join(self.sample_dir, self.arguments)
-         self.log_path = os.path.join(self.log_dir, "{}.txt".format(self.arguments))
-         if not os.path.exists(self.model_directory):
-             os.makedirs(self.model_directory)
-         if not os.path.exists(self.sample_directory):
-             os.makedirs(self.sample_directory)
-
-         # Learning rate cache for decaying.
-
-         # protein data
-         full_smiles = [line for line in open("data/chembl_train.smi", 'r').read().splitlines()]
-         drug_smiles = [line for line in open("data/akt_train.smi", 'r').read().splitlines()]
-
-         drug_mols = [Chem.MolFromSmiles(smi) for smi in drug_smiles]
-         drug_scaf = [MurckoScaffold.GetScaffoldForMol(x) for x in drug_mols]
-         fps_r = [Chem.RDKFingerprint(x) for x in drug_scaf]
-
-         akt1_human_adj = torch.load("data/akt/AKT1_human_adj.pt").reshape(1, -1).to(self.device).float()
-         akt1_human_annot = torch.load("data/akt/AKT1_human_annot.pt").reshape(1, -1).to(self.device).float()
-
-         if self.resume:
-             self.restore_model(self.resume_epoch, self.resume_iter, self.resume_directory)
-
-         # Start training.
-         print('Start training...')
-         self.start_time = time.time()
-         for idx in range(self.epoch):
-
-             # =================================================================================== #
-             #                             1. Preprocess input data                                #
-             # =================================================================================== #
-
-             # Load the data
-             dataloader_iterator = iter(self.drugs_loader)
-
-             for i, data in enumerate(self.loader):
-                 try:
-                     drugs = next(dataloader_iterator)
-                 except StopIteration:
-                     dataloader_iterator = iter(self.drugs_loader)
-                     drugs = next(dataloader_iterator)
-
-                 # Preprocess both dataset
-                 bulk_data = load_data(data,
-                                       drugs,
-                                       self.batch_size,
-                                       self.device,
-                                       self.b_dim,
-                                       self.m_dim,
-                                       self.drugs_b_dim,
-                                       self.drugs_m_dim,
-                                       self.z_dim,
-                                       self.vertexes)
-
-                 drug_graphs, real_graphs, a_tensor, x_tensor, drugs_a_tensor, drugs_x_tensor, z, z_edge, z_node = bulk_data
-
-                 if self.submodel == "CrossLoss":
-                     GAN1_input_e = a_tensor
-                     GAN1_input_x = x_tensor
-                     GAN1_disc_e = drugs_a_tensor
-                     GAN1_disc_x = drugs_x_tensor
-                 elif self.submodel == "Ligand":
-                     GAN1_input_e = a_tensor
-                     GAN1_input_x = x_tensor
-                     GAN1_disc_e = a_tensor
-                     GAN1_disc_x = x_tensor
-                     GAN2_input_e = drugs_a_tensor
-                     GAN2_input_x = drugs_x_tensor
-                     GAN2_disc_e = drugs_a_tensor
-                     GAN2_disc_x = drugs_x_tensor
-                 elif self.submodel == "Prot":
-                     GAN1_input_e = a_tensor
-                     GAN1_input_x = x_tensor
-                     GAN1_disc_e = a_tensor
-                     GAN1_disc_x = x_tensor
-                     GAN2_input_e = akt1_human_adj
-                     GAN2_input_x = akt1_human_annot
-                     GAN2_disc_e = drugs_a_tensor
-                     GAN2_disc_x = drugs_x_tensor
-                 elif self.submodel == "RL":
-                     GAN1_input_e = a_tensor
-                     GAN1_input_x = x_tensor
-                     GAN1_disc_e = a_tensor
-                     GAN1_disc_x = x_tensor
-                     GAN2_input_e = drugs_a_tensor
-                     GAN2_input_x = drugs_x_tensor
-                     GAN2_disc_e = drugs_a_tensor
-                     GAN2_disc_x = drugs_x_tensor
-                 elif self.submodel == "NoTarget":
-                     GAN1_input_e = a_tensor
-                     GAN1_input_x = x_tensor
-                     GAN1_disc_e = a_tensor
-                     GAN1_disc_x = x_tensor
-
-                 # =================================================================================== #
-                 #                             2. Train the discriminator                              #
-                 # =================================================================================== #
-                 loss = {}
-                 self.reset_grad()
-
-                 # Compute discriminator loss.
-                 node, edge, d_loss = discriminator_loss(self.G,
-                                                         self.D,
-                                                         real_graphs,
-                                                         GAN1_disc_e,
-                                                         GAN1_disc_x,
-                                                         self.batch_size,
-                                                         self.device,
-                                                         self.gradient_penalty,
-                                                         self.lambda_gp,
-                                                         GAN1_input_e,
-                                                         GAN1_input_x)
-
-                 d_total = d_loss
-                 if self.submodel != "NoTarget" and self.submodel != "CrossLoss":
-                     d2_loss = discriminator2_loss(self.G2,
-                                                   self.D2,
-                                                   drug_graphs,
-                                                   edge,
-                                                   node,
-                                                   self.batch_size,
-                                                   self.device,
-                                                   self.gradient_penalty,
-                                                   self.lambda_gp,
-                                                   GAN2_input_e,
-                                                   GAN2_input_x)
-                     d_total = d_loss + d2_loss
-
-                 loss["d_total"] = d_total.item()
-                 d_total.backward()
-                 self.d_optimizer.step()
-                 if self.submodel != "NoTarget" and self.submodel != "CrossLoss":
-                     self.d2_optimizer.step()
-                 self.reset_grad()
-                 generator_output = generator_loss(self.G,
-                                                   self.D,
-                                                   self.V,
-                                                   GAN1_input_e,
-                                                   GAN1_input_x,
-                                                   self.batch_size,
-                                                   sim_reward,
-                                                   self.dataset.matrices2mol,
-                                                   fps_r,
-                                                   self.submodel,
-                                                   self.dataset_name)
-
-                 g_loss, fake_mol, g_edges_hat_sample, g_nodes_hat_sample, node, edge = generator_output
-
-                 self.reset_grad()
-                 g_total = g_loss
-                 if self.submodel != "NoTarget" and self.submodel != "CrossLoss":
-                     output = generator2_loss(self.G2,
-                                              self.D2,
-                                              self.V2,
-                                              edge,
-                                              node,
-                                              self.batch_size,
-                                              sim_reward,
-                                              self.dataset.matrices2mol_drugs,
-                                              fps_r,
-                                              GAN2_input_e,
-                                              GAN2_input_x,
-                                              self.submodel,
-                                              self.drugs_name)
-
-                     g2_loss, fake_mol_g, dr_g_edges_hat_sample, dr_g_nodes_hat_sample = output
-
-                     g_total = g_loss + g2_loss
-
-                 loss["g_total"] = g_total.item()
-                 g_total.backward()
-                 self.g_optimizer.step()
-                 if self.submodel != "NoTarget" and self.submodel != "CrossLoss":
-                     self.g2_optimizer.step()
-
-                 if self.submodel == "RL":
-                     self.v_optimizer.step()
-                     self.v2_optimizer.step()
-
-                 if (i+1) % self.log_step == 0:
-                     logging(self.log_path, self.start_time, fake_mol, full_smiles, i, idx, loss, 1, self.sample_directory)
-                     mol_sample(self.sample_directory, "GAN1", fake_mol, g_edges_hat_sample.detach(), g_nodes_hat_sample.detach(), idx, i)
-                     if self.submodel != "NoTarget" and self.submodel != "CrossLoss":
-                         logging(self.log_path, self.start_time, fake_mol_g, drug_smiles, i, idx, loss, 2, self.sample_directory)
-                         mol_sample(self.sample_directory, "GAN2", fake_mol_g, dr_g_edges_hat_sample.detach(), dr_g_nodes_hat_sample.detach(), idx, i)
-
-             if (idx+1) % 10 == 0:
-                 self.save_model(self.model_directory, idx, i)
-                 print("model saved at epoch {} and iteration {}".format(idx, i))
-
- def inference(self):
724
-
725
- # Load the trained generator.
726
- self.G.to(self.device)
727
- self.G2.to(self.device)
728
-
729
- G_path = os.path.join(self.inference_model, '{}-G.ckpt'.format(self.submodel))
730
- self.G.load_state_dict(torch.load(G_path, map_location=lambda storage, loc: storage))
731
- if self.submodel != "NoTarget" and self.submodel != "CrossLoss":
732
- G2_path = os.path.join(self.inference_model, '{}-G2.ckpt'.format(self.submodel))
733
- self.G2.load_state_dict(torch.load(G2_path, map_location=lambda storage, loc: storage))
734
-
735
-
736
- smiles_test = [line for line in open("data/chembl_test.smi", 'r').read().splitlines()]
737
- if self.submodel == "NoTarget":
738
- smiles_train = [line for line in open("data/chembl_train.smi", 'r').read().splitlines()]
739
- else:
740
- smiles_train = [line for line in open("data/akt_train.smi", 'r').read().splitlines()]
741
-
742
- if self.submodel == "RL":
743
- drug_mols = [Chem.MolFromSmiles(smi) for smi in drug_smiles]
744
- drug_scaf = [MurckoScaffold.GetScaffoldForMol(x) for x in drug_mols]
745
- fps_r = [Chem.RDKFingerprint(x) for x in drug_scaf]
746
- else:
747
- fps_r = None
748
- akt1_human_adj = torch.load("data/akt/AKT1_human_adj.pt").reshape(1,-1).to(self.device).float()
749
- akt1_human_annot = torch.load("data/akt/AKT1_human_annot.pt").reshape(1,-1).to(self.device).float()
750
-
751
- self.G.eval()
752
- #self.D.eval()
753
- self.G2.eval()
754
- #self.D2.eval()
755
-
756
- step = self.inference_iterations
757
-
758
- self.inf_dataset = DruggenDataset(self.mol_data_dir,
759
- self.inf_dataset_file,
760
- self.inf_raw_file,
761
- self.max_atom,
762
- self.features) # Dataset for the first GAN. Custom dataset class from PyG parent class.
763
- # Can create any molecular graph dataset given smiles string.
764
- # Nonisomeric SMILES are suggested but not necessary.
765
- # Uses sparse matrix representation for graphs,
766
- # For computational and speed efficiency.
767
-
768
- self.inf_loader = DataLoader(self.inf_dataset,
769
- shuffle=True,
770
- batch_size=self.inf_batch_size,
771
- drop_last=True) # PyG dataloader for the first GAN.
772
-
773
- self.inf_drugs = DruggenDataset(self.drug_data_dir,
774
- self.inf_drugs_dataset_file,
775
- self.inf_drug_raw_file,
776
- self.max_atom,
777
- self.features) # Dataset for the second GAN. Custom dataset class from PyG parent class.
778
- # Can create any molecular graph dataset given smiles string.
779
- # Nonisomeric SMILES are suggested but not necessary.
780
- # Uses sparse matrix representation for graphs,
781
- # For computational and speed efficiency.
782
-
783
- self.inf_drugs_loader = DataLoader(self.inf_drugs,
784
- shuffle=True,
785
- batch_size=self.inf_batch_size,
786
- drop_last=True) # PyG dataloader for the second GAN.
787
- start_time = time.time()
788
- #metric_calc_mol = []
789
- metric_calc_dr = []
790
- date = time.time()
791
- if not os.path.exists("experiments/inference/{}".format(self.submodel)):
792
- os.makedirs("experiments/inference/{}".format(self.submodel))
793
- with torch.inference_mode():
794
-
795
- dataloader_iterator = iter(self.inf_drugs_loader)
796
- pbar = tqdm(range(self.inference_sample_num))
797
- pbar.set_description('Inference mode for {} model started'.format(self.submodel))
798
- for i, data in enumerate(self.inf_loader):
799
- try:
800
- drugs = next(dataloader_iterator)
801
- except StopIteration:
802
- dataloader_iterator = iter(self.inf_drugs_loader)
803
- drugs = next(dataloader_iterator)
804
-
805
- # Preprocess both dataset
806
-
807
- bulk_data = load_data(data,
808
- drugs,
809
- self.inf_batch_size,
810
- self.device,
811
- self.b_dim,
812
- self.m_dim,
813
- self.drugs_b_dim,
814
- self.drugs_m_dim,
815
- self.z_dim,
816
- self.vertexes)
817
-
818
- drug_graphs, real_graphs, a_tensor, x_tensor, drugs_a_tensor, drugs_x_tensor, z, z_edge, z_node = bulk_data
819
-
820
- if self.submodel == "CrossLoss":
821
- GAN1_input_e = a_tensor
822
- GAN1_input_x = x_tensor
823
- GAN1_disc_e = drugs_a_tensor
824
- GAN1_disc_x = drugs_x_tensor
825
- elif self.submodel == "Ligand":
826
- GAN1_input_e = a_tensor
827
- GAN1_input_x = x_tensor
828
- GAN1_disc_e = a_tensor
829
- 	                GAN1_disc_x = x_tensor
- 	                GAN2_input_e = drugs_a_tensor
- 	                GAN2_input_x = drugs_x_tensor
- 	                GAN2_disc_e = drugs_a_tensor
- 	                GAN2_disc_x = drugs_x_tensor
- 	            elif self.submodel == "Prot":
- 	                GAN1_input_e = a_tensor
- 	                GAN1_input_x = x_tensor
- 	                GAN1_disc_e = a_tensor
- 	                GAN1_disc_x = x_tensor
- 	                GAN2_input_e = akt1_human_adj
- 	                GAN2_input_x = akt1_human_annot
- 	                GAN2_disc_e = drugs_a_tensor
- 	                GAN2_disc_x = drugs_x_tensor
- 	            elif self.submodel == "RL":
- 	                GAN1_input_e = a_tensor
- 	                GAN1_input_x = x_tensor
- 	                GAN1_disc_e = a_tensor
- 	                GAN1_disc_x = x_tensor
- 	                GAN2_input_e = drugs_a_tensor
- 	                GAN2_input_x = drugs_x_tensor
- 	                GAN2_disc_e = drugs_a_tensor
- 	                GAN2_disc_x = drugs_x_tensor
- 	            elif self.submodel == "NoTarget":
- 	                GAN1_input_e = a_tensor
- 	                GAN1_input_x = x_tensor
- 	                GAN1_disc_e = a_tensor
- 	                GAN1_disc_x = x_tensor
-
- 	            # =================================================================================== #
- 	            #                                2. GAN1 Inference                                     #
- 	            # =================================================================================== #
- 	            generator_output = generator_loss(self.G,
- 	                                              self.D,
- 	                                              self.V,
- 	                                              GAN1_input_e,
- 	                                              GAN1_input_x,
- 	                                              self.inf_batch_size,
- 	                                              sim_reward,
- 	                                              self.dataset.matrices2mol,
- 	                                              fps_r,
- 	                                              self.submodel,
- 	                                              self.dataset_name)
-
- 	            _, fake_mol_g, _, _, node, edge = generator_output
-
- 	            # =================================================================================== #
- 	            #                                3. GAN2 Inference                                     #
- 	            # =================================================================================== #
- 	            if self.submodel != "NoTarget" and self.submodel != "CrossLoss":
- 	                output = generator2_loss(self.G2,
- 	                                         self.D2,
- 	                                         self.V2,
- 	                                         edge,
- 	                                         node,
- 	                                         self.inf_batch_size,
- 	                                         sim_reward,
- 	                                         self.dataset.matrices2mol_drugs,
- 	                                         fps_r,
- 	                                         GAN2_input_e,
- 	                                         GAN2_input_x,
- 	                                         self.submodel,
- 	                                         self.drugs_name)
-
- 	                _, fake_mol_g, edges, nodes = output
-
- 	            # Keep only the largest fragment of each generated SMILES.
- 	            inference_drugs = [Chem.MolToSmiles(line) for line in fake_mol_g if line is not None]
- 	            inference_drugs = [max(x.split('.'), key=len) for x in inference_drugs]
-
- 	            with open("experiments/inference/{}/inference_drugs.txt".format(self.submodel), "a") as f:
- 	                for molecules in inference_drugs:
- 	                    f.write(molecules)
- 	                    f.write("\n")
- 	                    metric_calc_dr.append(molecules)
-
- 	            if len(inference_drugs) > 0:
- 	                pbar.update(1)
-
- 	            if len(metric_calc_dr) == self.inference_sample_num:
- 	                break
-
- 	        et = time.time() - start_time
- 	        print("Inference took {:.2f} seconds.".format(et))
-
- 	        return {
- 	            "Runtime (seconds)": round(et, 2),
- 	            "Validity": f"{fraction_valid(metric_calc_dr)*100:.2f}%",
- 	            "Uniqueness": f"{fraction_unique(metric_calc_dr)*100:.2f}%",
- 	            "Novelty (Train)": f"{novelty(metric_calc_dr, smiles_train)*100:.2f}%",
- 	            "Novelty (Inference)": f"{novelty(metric_calc_dr, smiles_test)*100:.2f}%"
- 	        }
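The Uniqueness and Novelty entries returned above reduce to simple set arithmetic over canonical SMILES strings. A minimal RDKit-free sketch of that arithmetic (the function names mirror the MOSES-style helpers in `utils.py`; the SMILES below are made up, and real inputs would first be canonicalized):

```python
# Simplified stand-ins for the metric helpers used above; they assume the
# SMILES are already canonical, so no RDKit parsing is needed here.
def fraction_unique(smiles):
    valid = [s for s in smiles if s is not None]
    return 0.0 if not valid else len(set(valid)) / len(valid)

def novelty(gen, train):
    gen_set = {s for s in gen if s is not None}
    return 0.0 if not gen_set else len(gen_set - set(train)) / len(gen_set)

gen = ["CCO", "CCO", "c1ccccc1", None]  # two duplicates, one failed generation
print(fraction_unique(gen))             # 2 unique out of 3 valid
print(novelty(gen, ["CCO"]))            # only c1ccccc1 is unseen in training
```

Note this sketch divides Uniqueness by the number of valid molecules, whereas the original `fraction_unique` divides by `len(gen)`; the difference only matters when invalid molecules are present.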
training_data.py DELETED
@@ -1,50 +0,0 @@
- import torch
- import torch_geometric.utils as geoutils
- from utils import *
-
-
- def load_data(data, drugs, batch_size, device, b_dim, m_dim, drugs_b_dim, drugs_m_dim, z_dim, vertexes):
-
-     z = sample_z(batch_size, z_dim)  # (batch_size, z_dim)
-     z = torch.from_numpy(z).to(device).float().requires_grad_(True)
-     data = data.to(device)
-     drugs = drugs.to(device)
-     z_e = sample_z_edge(batch_size, vertexes, b_dim)  # (batch_size, vertexes, vertexes, b_dim)
-     z_n = sample_z_node(batch_size, vertexes, m_dim)  # (batch_size, vertexes, m_dim)
-     z_edge = torch.from_numpy(z_e).to(device).float().requires_grad_(True)  # Edge noise.
-     z_node = torch.from_numpy(z_n).to(device).float().requires_grad_(True)  # Node noise.
-     a = geoutils.to_dense_adj(edge_index=data.edge_index, batch=data.batch, edge_attr=data.edge_attr, max_num_nodes=int(data.batch.shape[0] / batch_size))
-     x = data.x.view(batch_size, int(data.batch.shape[0] / batch_size), -1)
-
-     a_tensor = label2onehot(a, b_dim, device)
-     #x_tensor = label2onehot(x, m_dim)
-     x_tensor = x
-
-     a_tensor = a_tensor  #+ torch.randn([a_tensor.size(0), a_tensor.size(1), a_tensor.size(2), 1], device=a_tensor.device) * noise_strength_0
-     x_tensor = x_tensor  #+ torch.randn([x_tensor.size(0), x_tensor.size(1), 1], device=x_tensor.device) * noise_strength_1
-
-     drugs_a = geoutils.to_dense_adj(edge_index=drugs.edge_index, batch=drugs.batch, edge_attr=drugs.edge_attr, max_num_nodes=int(drugs.batch.shape[0] / batch_size))
-     drugs_x = drugs.x.view(batch_size, int(drugs.batch.shape[0] / batch_size), -1)
-
-     drugs_a = drugs_a.to(device).long()
-     drugs_x = drugs_x.to(device)
-     drugs_a_tensor = label2onehot(drugs_a, drugs_b_dim, device).float()
-     drugs_x_tensor = drugs_x
-
-     drugs_a_tensor = drugs_a_tensor  #+ torch.randn([drugs_a_tensor.size(0), drugs_a_tensor.size(1), drugs_a_tensor.size(2), 1], device=drugs_a_tensor.device) * noise_strength_2
-     drugs_x_tensor = drugs_x_tensor  #+ torch.randn([drugs_x_tensor.size(0), drugs_x_tensor.size(1), 1], device=drugs_x_tensor.device) * noise_strength_3
-     #prot_n = akt1_human_annot[None,:].to(device).float()
-     #prot_e = akt1_human_adj[None,None,:].view(1,546,546,1).to(device).float()
-
-     a_tensor_vec = a_tensor.reshape(batch_size, -1)
-     x_tensor_vec = x_tensor.reshape(batch_size, -1)
-     real_graphs = torch.concat((x_tensor_vec, a_tensor_vec), dim=-1)
-
-     a_drug_vec = drugs_a_tensor.reshape(batch_size, -1)
-     x_drug_vec = drugs_x_tensor.reshape(batch_size, -1)
-     drug_graphs = torch.concat((x_drug_vec, a_drug_vec), dim=-1)
-
-     return drug_graphs, real_graphs, a_tensor, x_tensor, drugs_a_tensor, drugs_x_tensor, z, z_edge, z_node
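The `label2onehot` call above (defined in `utils.py`) expands an integer-coded adjacency matrix of bond types into a tensor with a trailing one-hot dimension. A torch-free sketch of the same mapping on nested lists, for illustration only:

```python
def label2onehot(labels, dim):
    # Recursively replace each integer class index with a one-hot vector,
    # so the output has the input's shape plus a trailing [dim] axis.
    if isinstance(labels, int):
        out = [0.0] * dim
        out[labels] = 1.0
        return out
    return [label2onehot(inner, dim) for inner in labels]

adj = [[0, 1],
       [1, 0]]              # bond-type indices for a 2-atom graph
onehot = label2onehot(adj, 3)
print(onehot[0][1])         # bond type 1 of 3, one-hot encoded
```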
utils.py DELETED
@@ -1,442 +0,0 @@
- from statistics import mean
- from rdkit import DataStructs
- from rdkit import Chem
- from rdkit.Chem import AllChem
- from rdkit.Chem import Draw
- import os
- import numpy as np
- #import seaborn as sns
- import matplotlib.pyplot as plt
- from matplotlib.lines import Line2D
- from rdkit import RDLogger
- import torch
- from rdkit.Chem.Scaffolds import MurckoScaffold
- import math
- import time
- import datetime
- import re
- RDLogger.DisableLog('rdApp.*')
- import warnings
- from multiprocessing import Pool
-
-
- class Metrics(object):
-
-     @staticmethod
-     def valid(x):
-         return x is not None and Chem.MolToSmiles(x) != ''
-
-     @staticmethod
-     def tanimoto_sim_1v2(data1, data2):
-         # Compare only as many pairs as the shorter input allows.
-         min_len = min(data1.size, data2.size)
-         sims = []
-         for i in range(min_len):
-             sim = DataStructs.FingerprintSimilarity(data1[i], data2[i])
-             sims.append(sim)
-         mean_sim = mean(sims)
-         return mean_sim
-
-     @staticmethod
-     def mol_length(x):
-         if x is not None:
-             # Count atoms in the largest fragment of the SMILES.
-             return len([char for char in max(Chem.MolToSmiles(x).split("."), key=len).upper() if char.isalpha()])
-         else:
-             return 0
-
-     @staticmethod
-     def max_component(data, max_len):
-         return (np.array(list(map(Metrics.mol_length, data)), dtype=np.float32) / max_len).mean()
-
-
- def sim_reward(mol_gen, fps_r):
-
-     gen_scaf = []
-     for x in mol_gen:
-         if x is not None:
-             try:
-                 gen_scaf.append(MurckoScaffold.GetScaffoldForMol(x))
-             except Exception:
-                 pass
-
-     if len(gen_scaf) == 0:
-         rew = 1
-     else:
-         fps = [Chem.RDKFingerprint(x) for x in gen_scaf]
-         fps = np.array(fps)
-         fps_r = np.array(fps_r)
-
-         # average_agg_tanimoto returns a scalar mean, so no indexing is needed.
-         rew = average_agg_tanimoto(fps_r, fps)
-         if math.isnan(rew):
-             rew = 1
-
-     return rew  ## TODO: change this to a penalty
-
- ##########################################
- ##########################################
- ##########################################
-
- def mols2grid_image(mols, path):
-     mols = [e if e is not None else Chem.RWMol() for e in mols]
-
-     for i in range(len(mols)):
-         if Metrics.valid(mols[i]):
-             AllChem.Compute2DCoords(mols[i])
-             Draw.MolToFile(mols[i], os.path.join(path, "{}.png".format(i + 1)), size=(1200, 1200))
-         else:
-             continue
-
-
- def save_smiles_matrices(mols, edges_hard, nodes_hard, path, data_source=None):
-     mols = [e if e is not None else Chem.RWMol() for e in mols]
-
-     for i in range(len(mols)):
-         if Metrics.valid(mols[i]):
-             #m0 = all_scores_for_print(mols[i], data_source, norm=False)
-             save_path = os.path.join(path, "{}.txt".format(i + 1))
-             with open(save_path, "a") as f:
-                 np.savetxt(f, edges_hard[i].cpu().numpy(), header="edge matrix:\n", fmt='%1.2f')
-                 f.write("\n")
-                 np.savetxt(f, nodes_hard[i].cpu().numpy(), header="node matrix:\n", footer="\nsmiles:", fmt='%1.2f')
-                 f.write("\n")
-                 #f.write(m0)
-
-             print(Chem.MolToSmiles(mols[i]), file=open(save_path, "a"))
-         else:
-             continue
-
- ##########################################
- ##########################################
- ##########################################
-
- def dense_to_sparse_with_attr(adj):
-     assert adj.dim() >= 2 and adj.dim() <= 3
-     assert adj.size(-1) == adj.size(-2)
-
-     index = adj.nonzero(as_tuple=True)
-     edge_attr = adj[index]
-
-     if len(index) == 3:
-         batch = index[0] * adj.size(-1)
-         index = (batch + index[1], batch + index[2])
-     return index, edge_attr
-
-
- def label2onehot(labels, dim, device):
-     """Convert label indices to one-hot vectors."""
-     out = torch.zeros(list(labels.size()) + [dim]).to(device)
-     out.scatter_(len(out.size()) - 1, labels.unsqueeze(-1), 1.)
-     return out.float()
-
-
- def sample_z_node(batch_size, vertexes, nodes):
-     ''' Random noise for node logits. '''
-     return np.random.normal(0, 1, size=(batch_size, vertexes, nodes))  # e.g. (128, 9, 5)
-
-
- def sample_z_edge(batch_size, vertexes, edges):
-     ''' Random noise for edge logits. '''
-     return np.random.normal(0, 1, size=(batch_size, vertexes, vertexes, edges))  # e.g. (128, 9, 9, 5)
-
-
- def sample_z(batch_size, z_dim):
-     ''' Random noise vector. '''
-     return np.random.normal(0, 1, size=(batch_size, z_dim))  # (batch_size, z_dim)
-
-
- def mol_sample(sample_directory, model_name, mol, edges, nodes, idx, i):
-     sample_path = os.path.join(sample_directory, "{}-{}_{}-epoch_iteration".format(model_name, idx + 1, i + 1))
-
-     if not os.path.exists(sample_path):
-         os.makedirs(sample_path)
-
-     mols2grid_image(mol, sample_path)
-     save_smiles_matrices(mol, edges.detach(), nodes.detach(), sample_path)
-
-     if len(os.listdir(sample_path)) == 0:
-         os.rmdir(sample_path)
-
-     print("Valid molecules are saved.")
-     print("Valid matrices and smiles are saved.")
-
-
- def logging(log_path, start_time, mols, train_smiles, i, idx, loss, model_num, save_path, get_maxlen=False):
-
-     gen_smiles = []
-     for line in mols:
-         if line is not None:
-             gen_smiles.append(Chem.MolToSmiles(line))
-         else:
-             gen_smiles.append(None)
-
-     # Keep only the largest fragment of each SMILES.
-     gen_smiles_saves = [None if x is None else max(x.split('.'), key=len) for x in gen_smiles]
-
-     sample_save_dir = os.path.join(save_path, "samples-GAN{}.txt".format(model_num))
-     with open(sample_save_dir, "a") as f:
-         for idxs in range(len(gen_smiles_saves)):
-             if gen_smiles_saves[idxs] is not None:
-                 f.write(gen_smiles_saves[idxs])
-                 f.write("\n")
-
-     k = len(set(gen_smiles_saves) - {None})
-
-     et = time.time() - start_time
-     et = str(datetime.timedelta(seconds=et))[:-7]
-     log = "Elapsed [{}], Epoch/Iteration [{}/{}] for GAN{}".format(et, idx, i + 1, model_num)
-
-     # Log update
-     #m0 = get_all_metrics(gen=gen_smiles, train=train_smiles, batch_size=batch_size, k=valid_mol_num, device=self.device)
-     valid = fraction_valid(gen_smiles_saves)
-     unique = fraction_unique(gen_smiles_saves, k, check_validity=False)
-     novel = novelty(gen_smiles_saves, train_smiles)
-
-     #qed = [QED(mol) for mol in mols if mol is not None]
-     #sa = [SA(mol) for mol in mols if mol is not None]
-     #logp = [logP(mol) for mol in mols if mol is not None]
-     #IntDiv = internal_diversity(gen_smiles)
-     #m0 = all_scores_val(fake_mol, mols, full_mols, full_smiles, vert, norm=True)  # 'mols' is output of Fake Reward
-     #m1 = all_scores_chem(fake_mol, mols, vert, norm=True)
-     #m0.update(m1)
-
-     if get_maxlen:
-         maxlen = Metrics.max_component(mols, 45)
-         loss.update({"MaxLen": maxlen})
-
-     #m0 = {k: np.array(v).mean() for k, v in m0.items()}
-     #loss.update(m0)
-     loss.update({'Valid': valid})
-     loss.update({'Unique': unique})
-     loss.update({'Novel': novel})
-     #loss.update({'QED': statistics.mean(qed)})
-     #loss.update({'SA': statistics.mean(sa)})
-     #loss.update({'LogP': statistics.mean(logp)})
-     #loss.update({'IntDiv': IntDiv})
-
-     for tag, value in loss.items():
-         log += ", {}: {:.4f}".format(tag, value)
-
-     with open(log_path, "a") as f:
-         f.write(log)
-         f.write("\n")
-     print(log)
-     print("\n")
-
-
- #def plot_attn(dataset_name, heads, attn_w, model, iter, epoch):
- #
- #    cols = 4
- #    rows = int(heads/cols)
- #
- #    fig, axes = plt.subplots(rows, cols, figsize=(30, 14))
- #    axes = axes.flat
- #    attentions_pos = attn_w[0]
- #    attentions_pos = attentions_pos.cpu().detach().numpy()
- #    for i, att in enumerate(attentions_pos):
- #
- #        #im = axes[i].imshow(att, cmap='gray')
- #        sns.heatmap(att, vmin=0, vmax=1, ax=axes[i])
- #        axes[i].set_title(f'head - {i} ')
- #        axes[i].set_ylabel('layers')
- #    pltsavedir = "/home/atabey/attn/second"
- #    plt.savefig(os.path.join(pltsavedir, "attn" + model + "_" + dataset_name + "_" + str(iter) + "_" + str(epoch) + ".png"), dpi=500, bbox_inches='tight')
-
- def plot_grad_flow(named_parameters, model, iter, epoch):
-     # Based on https://discuss.pytorch.org/t/check-gradient-flow-in-network/15063/10
-     '''Plots the gradients flowing through different layers in the net during training.
-     Can be used for checking for possible gradient vanishing / exploding problems.
-
-     Usage: plug this function into the Trainer class after loss.backward() as
-     "plot_grad_flow(self.model.named_parameters())" to visualize the gradient flow.'''
-     ave_grads = []
-     max_grads = []
-     layers = []
-     for n, p in named_parameters:
-         if p.requires_grad and ("bias" not in n):
-             layers.append(n)
-             ave_grads.append(p.grad.abs().mean().cpu())
-             max_grads.append(p.grad.abs().max().cpu())
-     plt.bar(np.arange(len(max_grads)), max_grads, alpha=0.1, lw=1, color="c")
-     plt.bar(np.arange(len(max_grads)), ave_grads, alpha=0.1, lw=1, color="b")
-     plt.hlines(0, 0, len(ave_grads) + 1, lw=2, color="k")
-     plt.xticks(range(0, len(ave_grads), 1), layers, rotation="vertical")
-     plt.xlim(left=0, right=len(ave_grads))
-     plt.ylim(bottom=-0.001, top=1)  # zoom in on the lower gradient regions
-     plt.xlabel("Layers")
-     plt.ylabel("average gradient")
-     plt.title("Gradient flow")
-     plt.grid(True)
-     plt.legend([Line2D([0], [0], color="c", lw=4),
-                 Line2D([0], [0], color="b", lw=4),
-                 Line2D([0], [0], color="k", lw=4)], ['max-gradient', 'mean-gradient', 'zero-gradient'])
-     pltsavedir = "/home/atabey/gradients/tryout"
-     plt.savefig(os.path.join(pltsavedir, "weights_" + model + "_" + str(iter) + "_" + str(epoch) + ".png"), dpi=500, bbox_inches='tight')
-
-
- def get_mol(smiles_or_mol):
-     '''
-     Loads SMILES/molecule into RDKit's object
-     '''
-     if isinstance(smiles_or_mol, str):
-         if len(smiles_or_mol) == 0:
-             return None
-         mol = Chem.MolFromSmiles(smiles_or_mol)
-         if mol is None:
-             return None
-         try:
-             Chem.SanitizeMol(mol)
-         except ValueError:
-             return None
-         return mol
-     return smiles_or_mol
-
-
- def mapper(n_jobs):
-     '''
-     Returns function for map call.
-     If n_jobs == 1, will use standard map
-     If n_jobs > 1, will use multiprocessing pool
-     If n_jobs is a pool object, will return its map function
-     '''
-     if n_jobs == 1:
-         def _mapper(*args, **kwargs):
-             return list(map(*args, **kwargs))
-
-         return _mapper
-     if isinstance(n_jobs, int):
-         pool = Pool(n_jobs)
-
-         def _mapper(*args, **kwargs):
-             try:
-                 result = pool.map(*args, **kwargs)
-             finally:
-                 pool.terminate()
-             return result
-
-         return _mapper
-     return n_jobs.map
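The `mapper` dispatch described in the docstring above can be exercised standalone; a self-contained sketch (reproducing the same logic so it runs on its own, with `str.upper` as a stand-in payload):

```python
from multiprocessing import Pool

def mapper(n_jobs):
    # Same three-way dispatch as utils.mapper: sequential map for n_jobs == 1,
    # a single-use process pool for larger ints, or a pool object's own map.
    if n_jobs == 1:
        return lambda *args, **kwargs: list(map(*args, **kwargs))
    if isinstance(n_jobs, int):
        pool = Pool(n_jobs)
        def _mapper(*args, **kwargs):
            try:
                result = pool.map(*args, **kwargs)
            finally:
                pool.terminate()  # pool is torn down after one call
            return result
        return _mapper
    return n_jobs.map

print(mapper(1)(str.upper, ["cco", "ccn"]))  # sequential path
```

Because the pool is terminated in the `finally` block, the returned `_mapper` is effectively single-use in the multiprocess case.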
-
-
- def remove_invalid(gen, canonize=True, n_jobs=1):
-     """
-     Removes invalid molecules from the dataset
-     """
-     if not canonize:
-         mols = mapper(n_jobs)(get_mol, gen)
-         return [gen_ for gen_, mol in zip(gen, mols) if mol is not None]
-     return [x for x in mapper(n_jobs)(canonic_smiles, gen) if x is not None]
-
-
- def fraction_valid(gen, n_jobs=1):
-     """
-     Computes the fraction of valid molecules
-     Parameters:
-         gen: list of SMILES
-         n_jobs: number of threads for calculation
-     """
-     gen = mapper(n_jobs)(get_mol, gen)
-     return 1 - gen.count(None) / len(gen)
-
-
- def canonic_smiles(smiles_or_mol):
-     mol = get_mol(smiles_or_mol)
-     if mol is None:
-         return None
-     return Chem.MolToSmiles(mol)
-
-
- def fraction_unique(gen, k=None, n_jobs=1, check_validity=True):
-     """
-     Computes the fraction of unique molecules
-     Parameters:
-         gen: list of SMILES
-         k: compute unique@k
-         n_jobs: number of threads for calculation
-         check_validity: raises ValueError if invalid molecules are present
-     """
-     if k is not None:
-         if len(gen) < k:
-             warnings.warn(
-                 "Can't compute unique@{}. ".format(k) +
-                 "gen contains only {} molecules".format(len(gen))
-             )
-         gen = gen[:k]
-     canonic = set(mapper(n_jobs)(canonic_smiles, gen))
-     if None in canonic and check_validity:
-         canonic = [i for i in canonic if i is not None]
-         #raise ValueError("Invalid molecule passed to unique@k")
-     return 0 if len(gen) == 0 else len(canonic) / len(gen)
-
-
- def novelty(gen, train, n_jobs=1):
-     gen_smiles = mapper(n_jobs)(canonic_smiles, gen)
-     gen_smiles_set = set(gen_smiles) - {None}
-     train_set = set(train)
-     return 0 if len(gen_smiles_set) == 0 else len(gen_smiles_set - train_set) / len(gen_smiles_set)
-
-
- def average_agg_tanimoto(stock_vecs, gen_vecs,
-                          batch_size=5000, agg='max',
-                          device='cpu', p=1):
-     """
-     For each molecule in gen_vecs finds the closest molecule in stock_vecs.
-     Returns the average Tanimoto score between these molecules
-
-     Parameters:
-         stock_vecs: numpy array <n_vectors x dim>
-         gen_vecs: numpy array <n_vectors' x dim>
-         agg: max or mean
-         p: power for averaging: (mean x^p)^(1/p)
-     """
-     assert agg in ['max', 'mean'], "Can aggregate only max or mean"
-     agg_tanimoto = np.zeros(len(gen_vecs))
-     total = np.zeros(len(gen_vecs))
-     for j in range(0, stock_vecs.shape[0], batch_size):
-         x_stock = torch.tensor(stock_vecs[j:j + batch_size]).to(device).float()
-         for i in range(0, gen_vecs.shape[0], batch_size):
-             y_gen = torch.tensor(gen_vecs[i:i + batch_size]).to(device).float()
-             y_gen = y_gen.transpose(0, 1)
-             tp = torch.mm(x_stock, y_gen)
-             jac = (tp / (x_stock.sum(1, keepdim=True) +
-                          y_gen.sum(0, keepdim=True) - tp)).cpu().numpy()
-             jac[np.isnan(jac)] = 1
-             if p != 1:
-                 jac = jac**p
-             if agg == 'max':
-                 agg_tanimoto[i:i + y_gen.shape[1]] = np.maximum(
-                     agg_tanimoto[i:i + y_gen.shape[1]], jac.max(0))
-             elif agg == 'mean':
-                 agg_tanimoto[i:i + y_gen.shape[1]] += jac.sum(0)
-                 total[i:i + y_gen.shape[1]] += jac.shape[0]
-     if agg == 'mean':
-         agg_tanimoto /= total
-     if p != 1:
-         agg_tanimoto = (agg_tanimoto)**(1/p)
-     return np.mean(agg_tanimoto)
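The Jaccard/Tanimoto expression inside `average_agg_tanimoto` (`tp / (sum_x + sum_y - tp)`) can be checked on a single pair of binary fingerprints. A dependency-free sketch with made-up fingerprints:

```python
def tanimoto(a, b):
    # Tanimoto/Jaccard similarity of two binary fingerprints:
    # |intersection| / |union|, matching tp / (sum_x + sum_y - tp) above.
    tp = sum(x & y for x, y in zip(a, b))
    union = sum(a) + sum(b) - tp
    # Two all-zero fingerprints give 0/0; return 1, mirroring
    # the jac[np.isnan(jac)] = 1 handling above.
    return 1.0 if union == 0 else tp / union

fp1 = [1, 1, 0, 1]
fp2 = [1, 0, 0, 1]
print(tanimoto(fp1, fp2))  # the two share 2 of the 3 bits set overall
```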