DunnBC22 committed
Commit af31ff2 · Parent: e4813b2

Update README.md
Files changed (1)
  1. README.md +11 -7
README.md CHANGED
@@ -7,11 +7,13 @@ metrics:
 model-index:
 - name: codet5-small-Generate_Docstrings_for_Python
   results: []
+datasets:
+- kejian/codesearchnet-python-raw
+language:
+- en
+pipeline_tag: text-generation
 ---
 
-<!-- This model card has been generated automatically according to the information the Trainer had access to. You
-should probably proofread and complete it, then remove this comment. -->
-
 # codet5-small-Generate_Docstrings_for_Python
 
 This model is a fine-tuned version of [Salesforce/codet5-small](https://huggingface.co/Salesforce/codet5-small) on the None dataset.
@@ -25,15 +27,17 @@ It achieves the following results on the evaluation set:
 
 ## Model description
 
-More information needed
+This model is trained to provide the docstring for functions.
+
+For more information on how it was created, check out the following link: https://github.com/DunnBC22/NLP_Projects/blob/main/Generate%20Docstrings/Code_T5_Project.ipynb
 
 ## Intended uses & limitations
 
-More information needed
+This model is intended to demonstrate my ability to solve a complex problem using technology.
 
 ## Training and evaluation data
 
-More information needed
+Dataset Source: kejian/codesearchnet-python-raw (from HuggingFace Datasets; https://huggingface.co/datasets/kejian/codesearchnet-python-raw)
 
 ## Training procedure
 
@@ -60,4 +64,4 @@ The following hyperparameters were used during training:
 - Transformers 4.27.3
 - Pytorch 1.13.1+cu116
 - Datasets 2.10.1
-- Tokenizers 0.13.2
+- Tokenizers 0.13.2
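The updated card describes a text-generation model that produces docstrings for Python functions. A minimal sketch of how one might prepare a function's source as input and invoke the model; the repo id `DunnBC22/codet5-small-Generate_Docstrings_for_Python` and the `text2text-generation` pipeline call are assumptions, not stated in this commit:

```python
import textwrap

def build_model_input(source: str) -> str:
    # Dedent the function source so the model sees top-level Python code.
    return textwrap.dedent(source).strip()

# Hypothetical function whose docstring we want generated.
func = """
    def add(a, b):
        return a + b
"""
prompt = build_model_input(func)

# The actual generation step (assumed usage; requires `pip install transformers`):
#   from transformers import pipeline
#   gen = pipeline("text2text-generation",
#                  model="DunnBC22/codet5-small-Generate_Docstrings_for_Python")
#   print(gen(prompt)[0]["generated_text"])
print(prompt)
```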