Duke-de-Artois committed on
Commit 549ba4c · verified · 1 Parent(s): 3f45874

Upload folder using huggingface_hub

Files changed (44)
  1. .gitattributes +3 -0
  2. OpenMolIns/large/train.csv +3 -0
  3. OpenMolIns/light/train.csv +0 -0
  4. OpenMolIns/medium/train.csv +0 -0
  5. OpenMolIns/readme.md +6 -0
  6. OpenMolIns/small/train.csv +0 -0
  7. benchmarks/open_generation/MolCustom/AtomNum/test.csv +0 -0
  8. benchmarks/open_generation/MolCustom/BasicProp/test.csv +0 -0
  9. benchmarks/open_generation/MolCustom/BondNum/test.csv +0 -0
  10. benchmarks/open_generation/MolCustom/FunctionalGroup/test.csv +0 -0
  11. benchmarks/open_generation/MolCustom/readme.md +52 -0
  12. benchmarks/open_generation/MolEdit/AddComponent/test.csv +0 -0
  13. benchmarks/open_generation/MolEdit/AddComponent/test_raw.csv +0 -0
  14. benchmarks/open_generation/MolEdit/DelComponent/test.csv +0 -0
  15. benchmarks/open_generation/MolEdit/DelComponent/test_raw.csv +0 -0
  16. benchmarks/open_generation/MolEdit/SubComponent/test.csv +0 -0
  17. benchmarks/open_generation/MolEdit/SubComponent/test_raw.csv +0 -0
  18. benchmarks/open_generation/MolEdit/readme.md +40 -0
  19. benchmarks/open_generation/MolOpt/LogP/test.csv +0 -0
  20. benchmarks/open_generation/MolOpt/LogP/test_raw.csv +0 -0
  21. benchmarks/open_generation/MolOpt/MR/test.csv +0 -0
  22. benchmarks/open_generation/MolOpt/MR/test_raw.csv +0 -0
  23. benchmarks/open_generation/MolOpt/QED/test.csv +0 -0
  24. benchmarks/open_generation/MolOpt/QED/test_raw.csv +0 -0
  25. benchmarks/open_generation/MolOpt/readme.md +40 -0
  26. benchmarks/targeted_generation/CapMol/ChEBI-20/test.txt +0 -0
  27. benchmarks/targeted_generation/CapMol/ChEBI-20/train.txt +0 -0
  28. benchmarks/targeted_generation/CapMol/ChEBI-20/validation.txt +0 -0
  29. benchmarks/targeted_generation/CapMol/PubChem/test.txt +0 -0
  30. benchmarks/targeted_generation/CapMol/PubChem/train.txt +0 -0
  31. benchmarks/targeted_generation/CapMol/PubChem/validation.txt +0 -0
  32. benchmarks/targeted_generation/IUPAC/PubChem/test.txt +0 -0
  33. benchmarks/targeted_generation/IUPAC/PubChem/train.txt +0 -0
  34. benchmarks/targeted_generation/IUPAC/PubChem/validation.txt +0 -0
  35. benchmarks/targeted_generation/MolDebug/Bracket/generation.py +7 -0
  36. benchmarks/targeted_generation/MolDebug/InvalidAtom/generation.py +7 -0
  37. benchmarks/targeted_generation/MolDebug/InvalidRingNo/generation.py +1 -0
  38. benchmarks/targeted_generation/MolDebug/InvalidSymbol/generation.py +1 -0
  39. benchmarks/targeted_generation/MolDebug/plot.ipynb +50 -0
  40. benchmarks/targeted_generation/MolDebug/readme.md +25 -0
  41. benchmarks/targeted_generation/MolDebug/sample.py +1 -0
  42. benchmarks/targeted_generation/Reaction/012020_txt_reactions_wbib.json +3 -0
  43. benchmarks/targeted_generation/Reaction/retro50k.csv +0 -0
  44. sources/zinc250k/zinc250k_selfies.csv +3 -0
.gitattributes CHANGED
@@ -57,3 +57,6 @@ saved_model/**/* filter=lfs diff=lfs merge=lfs -text
  # Video files - compressed
  *.mp4 filter=lfs diff=lfs merge=lfs -text
  *.webm filter=lfs diff=lfs merge=lfs -text
+ OpenMolIns/large/train.csv filter=lfs diff=lfs merge=lfs -text
+ benchmarks/targeted_generation/Reaction/012020_txt_reactions_wbib.json filter=lfs diff=lfs merge=lfs -text
+ sources/zinc250k/zinc250k_selfies.csv filter=lfs diff=lfs merge=lfs -text
OpenMolIns/large/train.csv ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:80e0631dae2c6e3ac5dff7fd83458bb7bbac8d54d7e8b182d65eac7f27509727
+ size 13360913
OpenMolIns/light/train.csv ADDED
The diff for this file is too large to render. See raw diff
 
OpenMolIns/medium/train.csv ADDED
The diff for this file is too large to render. See raw diff
 
OpenMolIns/readme.md ADDED
@@ -0,0 +1,6 @@
+ ## The Instruction Tuning Dataset
+ We provide four versions of the instruction tuning dataset:
+ - light: 4500 instructions
+ - small: 18000 instructions
+ - medium: 45000 instructions
+ - large: 90000 instructions
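The four `train.csv` splits are plain CSV files (the `large` split is stored via Git LFS). A minimal loading sketch, assuming the files have a header row; `<repo_id>` is a placeholder for this dataset's Hub id, not something given in the diff:

```python
# Sketch only: <repo_id> is a hypothetical placeholder, and we assume
# train.csv is a standard comma-separated file with a header row.
import pandas as pd
from huggingface_hub import hf_hub_download

path = hf_hub_download(
    repo_id="<repo_id>",                      # placeholder
    filename="OpenMolIns/light/train.csv",
    repo_type="dataset",
)
df = pd.read_csv(path)
print(len(df), "instructions")
print(df.head())
```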
OpenMolIns/small/train.csv ADDED
The diff for this file is too large to render. See raw diff
 
benchmarks/open_generation/MolCustom/AtomNum/test.csv ADDED
The diff for this file is too large to render. See raw diff
 
benchmarks/open_generation/MolCustom/BasicProp/test.csv ADDED
The diff for this file is too large to render. See raw diff
 
benchmarks/open_generation/MolCustom/BondNum/test.csv ADDED
The diff for this file is too large to render. See raw diff
 
benchmarks/open_generation/MolCustom/FunctionalGroup/test.csv ADDED
The diff for this file is too large to render. See raw diff
 
benchmarks/open_generation/MolCustom/readme.md ADDED
@@ -0,0 +1,52 @@
+ ## MolCustom
+ Target: Let the LLM generate a customized molecule. A sample counts as a match if the generated molecule satisfies the stated requirements.
+
+ ### Subtasks
+
+ #### AtomNum
+ - **Description**: The atom counts in the generated molecule should equal the given numbers.
+ - **Input**: The instruction that specifies the number of atoms in the generated molecule.
+ - **Output**: The molecule SMILES.
+ - **Example**:
+   - Input: `Please generate a molecule with 8 carbon atoms, 1 nitrogen atoms, and 2 oxygen atoms.`
+   - Output: `CCCCC(C)NCC(=O)O`
+ - **Evaluation Metrics**:
+   - **Success Rate (MAIN)**: The percentage of generated molecules that meet the requirements.
+   - **Molecule Novelty**: The percentage of generated molecules that are novel.
+   - **Molecule Validity**: The percentage of generated molecules that are valid.
+
+ #### BasicProp
+ - **Description**: The generated molecule should exhibit the requested basic properties, such as toxicity, solubility, etc.
+ - **Input**: The instruction that specifies the basic properties of the generated molecule.
+ - **Output**: The molecule SMILES.
+ - **Example**:
+   - Input: `Please generate a molecule with low toxicity.`
+   - Output: `c1ccccc1O`
+ - **Evaluation Metrics**:
+   - **Success Rate (MAIN)**: The percentage of generated molecules that meet the requirements. GNN models trained on datasets with toxicity labels can be used to predict the toxicity of the generated molecules.
+   - **Molecule Novelty**: The percentage of generated molecules that are novel.
+   - **Molecule Validity**: The percentage of generated molecules that are valid.
+
+ #### FunctionalGroup
+ - **Description**: The generated molecule should contain the specified functional groups.
+ - **Input**: The instruction that specifies the numbers of the functional groups in the generated molecule.
+ - **Output**: The molecule SMILES.
+ - **Example**:
+   - Input: `Please generate a molecule with 2 hydroxyl groups.`
+   - Output: `OCCCCO`
+ - **Evaluation Metrics**:
+   - **Success Rate (MAIN)**: The percentage of generated molecules that meet the requirements.
+   - **Molecule Novelty**: The percentage of generated molecules that are novel.
+   - **Molecule Validity**: The percentage of generated molecules that are valid.
+
+ #### BondNum
+ - **Description**: The generated molecule should contain the specified number of bonds.
+ - **Input**: The instruction that specifies the number of bonds in the generated molecule.
+ - **Output**: The molecule SMILES.
+ - **Example**:
+   - Input: `Please generate a molecule with 1 single bond.`
+   - Output: `CC`
+ - **Evaluation Metrics**:
+   - **Success Rate (MAIN)**: The percentage of generated molecules that meet the requirements.
+   - **Molecule Novelty**: The percentage of generated molecules that are novel.
+   - **Molecule Validity**: The percentage of generated molecules that are valid.
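The Success Rate and Validity metrics in the readme above can be checked directly with RDKit. A minimal sketch of an AtomNum-style check; this is an assumed recipe, not the shipped evaluator, and hydrogens are treated as implicit and ignored:

```python
# Sketch of a success/validity check for MolCustom (assumed logic).
from rdkit import Chem

def check_atom_counts(smiles: str, required: dict) -> tuple:
    """Return (is_valid, meets_requirements) for heavy-atom counts."""
    mol = Chem.MolFromSmiles(smiles)
    if mol is None:                       # invalid SMILES
        return False, False
    counts = {}
    for atom in mol.GetAtoms():           # hydrogens are implicit and not counted
        counts[atom.GetSymbol()] = counts.get(atom.GetSymbol(), 0) + 1
    ok = all(counts.get(sym, 0) == n for sym, n in required.items())
    return True, ok

# Example from the readme: 8 C, 1 N, 2 O
print(check_atom_counts("CCCCC(C)NCC(=O)O", {"C": 8, "N": 1, "O": 2}))
```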
benchmarks/open_generation/MolEdit/AddComponent/test.csv ADDED
The diff for this file is too large to render. See raw diff
 
benchmarks/open_generation/MolEdit/AddComponent/test_raw.csv ADDED
The diff for this file is too large to render. See raw diff
 
benchmarks/open_generation/MolEdit/DelComponent/test.csv ADDED
The diff for this file is too large to render. See raw diff
 
benchmarks/open_generation/MolEdit/DelComponent/test_raw.csv ADDED
The diff for this file is too large to render. See raw diff
 
benchmarks/open_generation/MolEdit/SubComponent/test.csv ADDED
The diff for this file is too large to render. See raw diff
 
benchmarks/open_generation/MolEdit/SubComponent/test_raw.csv ADDED
The diff for this file is too large to render. See raw diff
 
benchmarks/open_generation/MolEdit/readme.md ADDED
@@ -0,0 +1,40 @@
+ ## MolEdit
+ Target: Let the LLM edit a given molecule. A sample counts as a match if the edited molecule satisfies the stated requirements.
+
+ ### Subtasks
+
+ #### AddComponent
+ - **Description**: Add the specified functional groups to the molecule.
+ - **Input**: The instruction that specifies the functional groups to be added to the molecule.
+ - **Output**: The edited molecule SMILES.
+ - **Example**:
+   - Input: `Please add a hydroxyl group to the molecule CC.`
+   - Output: `CCO`
+ - **Evaluation Metrics**:
+   - **Success Rate (MAIN)**: The percentage of generated molecules that meet the requirements.
+   - **Molecule Similarity**: The edited molecule should change the original molecule as little as possible.
+   - **Molecule Validity**: The percentage of generated molecules that are valid.
+
+ #### RemoveComponent (Is this open? Could be, if the instruction is not specific about the number/group.)
+ - **Description**: Remove the specified functional groups from the molecule.
+ - **Input**: The instruction that specifies the functional groups to be removed from the molecule.
+ - **Output**: The edited molecule SMILES.
+ - **Example**:
+   - Input: `Please remove a hydroxyl group from the molecule CCO.`
+   - Output: `CC`
+ - **Evaluation Metrics**:
+   - **Success Rate (MAIN)**: The percentage of generated molecules that meet the requirements.
+   - **Molecule Similarity**: The edited molecule should change the original molecule as little as possible.
+   - **Molecule Validity**: The percentage of generated molecules that are valid.
+
+ #### SubstituteComponent (Is this open? Could be, if we do not specify which functional group to substitute.)
+ - **Description**: Substitute the specified functional groups in the molecule.
+ - **Input**: The instruction that specifies the functional groups to be substituted in the molecule.
+ - **Output**: The edited molecule SMILES.
+ - **Example**:
+   - Input: `Please substitute a hydroxyl group in the molecule CCO with a carboxyl group.`
+   - Output: `CCC(=O)O`
+ - **Evaluation Metrics**:
+   - **Success Rate (MAIN)**: The percentage of generated molecules that meet the requirements.
+   - **Molecule Similarity**: The edited molecule should change the original molecule as little as possible.
+   - **Molecule Validity**: The percentage of generated molecules that are valid.
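The Molecule Similarity metric described above is naturally computed as a fingerprint similarity between the original and edited molecules. A minimal RDKit sketch; the fingerprint type and radius are assumptions, since the readme does not fix them:

```python
# Sketch of the similarity term for MolEdit (assumed choice: Morgan fingerprint, radius 2).
from rdkit import Chem, DataStructs
from rdkit.Chem import AllChem

def edit_similarity(original_smiles: str, edited_smiles: str) -> float:
    """Tanimoto similarity between original and edited molecules; 0.0 if either is invalid."""
    m1 = Chem.MolFromSmiles(original_smiles)
    m2 = Chem.MolFromSmiles(edited_smiles)
    if m1 is None or m2 is None:
        return 0.0
    fp1 = AllChem.GetMorganFingerprintAsBitVect(m1, 2, nBits=2048)
    fp2 = AllChem.GetMorganFingerprintAsBitVect(m2, 2, nBits=2048)
    return DataStructs.TanimotoSimilarity(fp1, fp2)

print(edit_similarity("CC", "CCO"))   # AddComponent example from the readme
```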
benchmarks/open_generation/MolOpt/LogP/test.csv ADDED
The diff for this file is too large to render. See raw diff
 
benchmarks/open_generation/MolOpt/LogP/test_raw.csv ADDED
The diff for this file is too large to render. See raw diff
 
benchmarks/open_generation/MolOpt/MR/test.csv ADDED
The diff for this file is too large to render. See raw diff
 
benchmarks/open_generation/MolOpt/MR/test_raw.csv ADDED
The diff for this file is too large to render. See raw diff
 
benchmarks/open_generation/MolOpt/QED/test.csv ADDED
The diff for this file is too large to render. See raw diff
 
benchmarks/open_generation/MolOpt/QED/test_raw.csv ADDED
The diff for this file is too large to render. See raw diff
 
benchmarks/open_generation/MolOpt/readme.md ADDED
@@ -0,0 +1,40 @@
+ ## MolOpt
+ Target: Let the LLM edit a given molecule to optimize a specific property. A sample counts as a match if the optimized molecule satisfies the stated requirement.
+
+ ### Subtasks
+
+ #### LogP
+ - **Description**: Optimize the molecule toward a higher or lower LogP value.
+ - **Input**: The instruction that specifies the target LogP direction.
+ - **Output**: The optimized molecule SMILES.
+ - **Example**:
+   - Input: `Please optimize the molecule CCO to have a higher LogP value.`
+   - Output: `CCCCO`
+ - **Evaluation Metrics**:
+   - **Success Rate (MAIN)**: The percentage of generated molecules that meet the requirements.
+   - **Molecule Similarity**: The edited molecule should change the original molecule as little as possible.
+   - **Molecule Validity**: The percentage of generated molecules that are valid.
+
+ #### MR
+ - **Description**: Optimize the molecule toward a higher or lower MR (molar refractivity) value.
+ - **Input**: The instruction that specifies the target MR direction.
+ - **Output**: The optimized molecule SMILES.
+ - **Example**:
+   - Input: `Please optimize the molecule CCO to have a higher MR value.`
+   - Output: `CCCC`
+ - **Evaluation Metrics**:
+   - **Success Rate (MAIN)**: The percentage of generated molecules that meet the requirements.
+   - **Molecule Similarity**: The edited molecule should change the original molecule as little as possible.
+   - **Molecule Validity**: The percentage of generated molecules that are valid.
+
+ #### Toxicity
+ - **Description**: Optimize the molecule toward higher or lower toxicity.
+ - **Input**: The instruction that specifies the target toxicity direction.
+ - **Output**: The optimized molecule SMILES.
+ - **Example**:
+   - Input: `Please optimize the molecule CCO to have a higher toxicity.`
+   - Output: `CCN`
+ - **Evaluation Metrics**:
+   - **Success Rate (MAIN)**: The percentage of generated molecules that meet the requirements.
+   - **Molecule Similarity**: The edited molecule should change the original molecule as little as possible.
+   - **Molecule Validity**: The percentage of generated molecules that are valid.
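A directional success check for the LogP and MR subtasks above can be sketched with RDKit's Crippen descriptors; this is an assumed scoring recipe, since the upload ships only the test CSVs and not a scorer:

```python
# Sketch of a directional property check for MolOpt (Crippen LogP / MR; assumed scorer).
from rdkit import Chem
from rdkit.Chem import Crippen

def optimized_in_direction(src: str, out: str, prop: str = "logp", higher: bool = True) -> bool:
    """True if `out` moves the chosen property in the requested direction relative to `src`."""
    m_src, m_out = Chem.MolFromSmiles(src), Chem.MolFromSmiles(out)
    if m_src is None or m_out is None:
        return False
    fn = Crippen.MolLogP if prop == "logp" else Crippen.MolMR
    delta = fn(m_out) - fn(m_src)
    return delta > 0 if higher else delta < 0

print(optimized_in_direction("CCO", "CCCC", prop="mr", higher=True))   # MR example above
```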
benchmarks/targeted_generation/CapMol/ChEBI-20/test.txt ADDED
The diff for this file is too large to render. See raw diff
 
benchmarks/targeted_generation/CapMol/ChEBI-20/train.txt ADDED
The diff for this file is too large to render. See raw diff
 
benchmarks/targeted_generation/CapMol/ChEBI-20/validation.txt ADDED
The diff for this file is too large to render. See raw diff
 
benchmarks/targeted_generation/CapMol/PubChem/test.txt ADDED
The diff for this file is too large to render. See raw diff
 
benchmarks/targeted_generation/CapMol/PubChem/train.txt ADDED
The diff for this file is too large to render. See raw diff
 
benchmarks/targeted_generation/CapMol/PubChem/validation.txt ADDED
The diff for this file is too large to render. See raw diff
 
benchmarks/targeted_generation/IUPAC/PubChem/test.txt ADDED
The diff for this file is too large to render. See raw diff
 
benchmarks/targeted_generation/IUPAC/PubChem/train.txt ADDED
The diff for this file is too large to render. See raw diff
 
benchmarks/targeted_generation/IUPAC/PubChem/validation.txt ADDED
The diff for this file is too large to render. See raw diff
 
benchmarks/targeted_generation/MolDebug/Bracket/generation.py ADDED
@@ -0,0 +1,7 @@
+ import random
+
+ random.choice(['(', ')'])
+
+ if "(" in molecule:
+     # randomly removes a bracket from the molecule
+     pass
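The script above is only a stub (it references an undefined `molecule` and never uses the sampled bracket). A minimal sketch of the corruption it appears to describe; the function name and sampling details are assumptions:

```python
# Sketch: drop one randomly chosen parenthesis from a SMILES string to make it invalid.
import random

def remove_random_bracket(molecule: str) -> str:
    bracket = random.choice(['(', ')'])
    positions = [i for i, ch in enumerate(molecule) if ch == bracket]
    if not positions:                     # no such bracket; return unchanged
        return molecule
    i = random.choice(positions)
    return molecule[:i] + molecule[i + 1:]

print(remove_random_bracket("CCCCC(C)NCC(=O)O"))
```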
benchmarks/targeted_generation/MolDebug/InvalidAtom/generation.py ADDED
@@ -0,0 +1,7 @@
+ import random
+
+ invalid_characters = []
+
+ if "(" in molecule:
+     # randomly changes a C atom to an invalid character
+     pass
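This stub likewise leaves `invalid_characters` empty and tests for `(` rather than a carbon atom. A sketch of the described corruption; the invalid symbols are placeholders chosen here, not taken from the upload:

```python
# Sketch: replace one carbon atom with a symbol that is not a valid SMILES atom.
import random

INVALID_CHARACTERS = ["X", "J", "?", "!"]   # placeholder invalid symbols

def corrupt_random_atom(molecule: str) -> str:
    positions = [i for i, ch in enumerate(molecule) if ch == "C"]
    if not positions:
        return molecule
    i = random.choice(positions)
    return molecule[:i] + random.choice(INVALID_CHARACTERS) + molecule[i + 1:]

print(corrupt_random_atom("CCO"))
```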
benchmarks/targeted_generation/MolDebug/InvalidRingNo/generation.py ADDED
@@ -0,0 +1 @@
+ # randomly labels incorrect ring numbers
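The file is a single comment; one plausible reading (an assumption, sketched here for clarity) is to rewrite a ring-closure digit so it no longer matches its partner:

```python
# Sketch: change one ring-closure digit so the ring bond is left unmatched.
import random

def corrupt_ring_number(molecule: str) -> str:
    digit_positions = [i for i, ch in enumerate(molecule) if ch.isdigit()]
    if not digit_positions:
        return molecule
    i = random.choice(digit_positions)
    wrong = random.choice([d for d in "123456789" if d != molecule[i]])
    return molecule[:i] + wrong + molecule[i + 1:]

print(corrupt_ring_number("C1C=CC=C1C#C"))   # molecule used in plot.ipynb below
```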
benchmarks/targeted_generation/MolDebug/InvalidSymbol/generation.py ADDED
@@ -0,0 +1 @@
+ # randomly introduce invalid symbols to the molecule SMILES
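Again a single comment; a sketch of the likely behaviour, with the inserted symbol set being an assumption:

```python
# Sketch: insert a character that is never legal in SMILES at a random position.
import random

def insert_invalid_symbol(molecule: str, symbols: str = "?!&_") -> str:
    i = random.randrange(len(molecule) + 1)
    return molecule[:i] + random.choice(symbols) + molecule[i:]

print(insert_invalid_symbol("c1ccccc1O"))
```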
benchmarks/targeted_generation/MolDebug/plot.ipynb ADDED
@@ -0,0 +1,50 @@
+ {
+  "cells": [
+   {
+    "cell_type": "code",
+    "execution_count": 24,
+    "metadata": {},
+    "outputs": [
+     {
+      "data": {
+       "image/png": "<base64-encoded PNG of the rendered molecule, omitted here>",
+       "text/plain": [
+        "<rdkit.Chem.rdchem.Mol at 0x7e706e94b680>"
+       ]
+      },
+      "execution_count": 24,
+      "metadata": {},
+      "output_type": "execute_result"
+     }
+    ],
+    "source": [
+     "from rdkit import Chem\n",
+     "\n",
+     "mol = Chem.MolFromSmiles('C1C=CC=C1C#C')\n",
+     "\n",
+     "mol"
+    ]
+   }
+  ],
+  "metadata": {
+   "kernelspec": {
+    "display_name": "base",
+    "language": "python",
+    "name": "python3"
+   },
+   "language_info": {
+    "codemirror_mode": {
+     "name": "ipython",
+     "version": 3
+    },
+    "file_extension": ".py",
+    "mimetype": "text/x-python",
+    "name": "python",
+    "nbconvert_exporter": "python",
+    "pygments_lexer": "ipython3",
+    "version": "3.12.2"
+   }
+  },
+  "nbformat": 4,
+  "nbformat_minor": 2
+ }
benchmarks/targeted_generation/MolDebug/readme.md ADDED
@@ -0,0 +1,25 @@
+ SMILES (Simplified Molecular Input Line Entry System) is a compact notation for describing chemical structures. When a SMILES string is judged invalid, the error usually falls into one of the following categories:
+ 1. **Syntax errors**:
+    - **Unbalanced brackets**: the numbers of opening and closing brackets do not match.
+    - **Invalid atom symbols**: an illegal atom symbol is used, or the connectivity after an atom symbol is not specified correctly.
+    - **Missing connection symbols**: the bond symbol between atoms, or between an atom and a ring, is missing.
+    - **Extraneous characters**: characters or symbols appear that should not be there.
+    - **Wrong ring specification**: ring numbers are incorrect or missing.
+ 2. **Structural errors**:
+    - **Unreasonable atom connectivity**: for example, a hydrogen atom bonded to two other atoms.
+    - **Unreasonable ring structures**: the ring size is implausible, or the ring connectivity could not exist in reality.
+    - **Incorrect stereochemistry**: @ and / are used incorrectly, or an impossible stereochemistry is specified.
+ 3. **Charge and isotope errors**:
+    - **Charge imbalance**: the algebraic sum of the positive and negative charges in the molecule is not zero.
+    - **Incorrect isotope labels**: an illegal isotope label is used, or the label is placed incorrectly.
+ 4. **Ring-closure errors**:
+    - **Ring closed to the wrong atom**: a ring number points to the wrong atom.
+    - **Unclosed rings**: a ring is opened but never closed.
+ 5. **Branching errors**:
+    - **Ambiguous branch points**: there is no clear atom or attachment point before a branch.
+    - **Incorrectly nested branches**: branch structures are nested improperly.
+ 6. **Fragment errors**:
+    - **Isolated fragments**: the molecule contains fragments that are not connected to the rest of the structure.
+ 7. **Other rule violations**:
+    - **Violations of specific SMILES rules**: certain ring structures or stereochemistry descriptions have dedicated representations in the SMILES specification, and violating these rules also makes a SMILES string invalid.
+ Identifying and handling these errors usually requires cheminformatics tools or software, which can report detailed error messages and help the user correct the SMILES string.
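A first-pass way to detect (though not classify) these failures is to try parsing the string with RDKit: anything the parser rejects is counted as invalid. A minimal sketch; the category-specific diagnostics described above would need additional checks:

```python
# Sketch: RDKit-based validity check used as a first pass over corrupted SMILES.
from rdkit import Chem

def is_valid_smiles(smiles: str) -> bool:
    """Return True if RDKit can parse and sanitize the SMILES string."""
    return Chem.MolFromSmiles(smiles) is not None

print(is_valid_smiles("c1ccccc1O"))    # True
print(is_valid_smiles("c1ccccc1O)"))   # False: unbalanced bracket
```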
benchmarks/targeted_generation/MolDebug/sample.py ADDED
@@ -0,0 +1 @@
+ # sample molecules from the ZINC and PubChem datasets
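A sketch of what this sampler might do with the `sources/zinc250k/zinc250k_selfies.csv` file included in this upload; the column name and sample size are assumptions, and PubChem sampling would need a separate source:

```python
# Sketch: draw a random subset of molecules from the ZINC250k CSV shipped in this repo.
import random
import pandas as pd

df = pd.read_csv("sources/zinc250k/zinc250k_selfies.csv")
column = "smiles" if "smiles" in df.columns else df.columns[0]  # column name is an assumption
sample = random.sample(list(df[column]), k=100)
print(sample[:3])
```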
benchmarks/targeted_generation/Reaction/012020_txt_reactions_wbib.json ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:43c4398862ea937a4eeaec97b307638a8bf02cb2930fc36ce0190880b38d903d
+ size 24278278
benchmarks/targeted_generation/Reaction/retro50k.csv ADDED
File without changes
sources/zinc250k/zinc250k_selfies.csv ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:93ef9e3f8f2bb09f2b2d80c9aefeca2a207d0cf292ac37556ae89cd765ebf8ac
+ size 69323598