Eemansleepdeprived committed · verified
Commit 0868be8 · Parent(s): 3a354d5

Update README.md

Files changed (1)
  1. README.md +64 -73
README.md CHANGED
@@ -21,95 +21,86 @@ context: The train was unusually empty as Aarav boarded it late one evening, the
 ---
 
 
-
- # Model Card: Humaneyes Text Paraphraser
-
 ## Model Description
 
- Humaneyes is an advanced text paraphrasing model built using the Pegasus transformer architecture. The model is designed to generate high-quality, contextually-aware paraphrases while preserving the original text's paragraph structure and semantic meaning.
 
 ### Model Details
 - **Developed by:** Eemansleepdeprived
- - **Model type:** Text-to-text generation (Paraphrasing)
 - **Language(s):** English
- - **Base model:** Google Pegasus Large
- - **Input format:** Plain text
- - **Output format:** Paraphrased text
-
- ## Intended Use
-
- ### Primary Use Cases
- - Academic writing: Helping researchers and students rephrase text
- - Content creation: Assisting writers in generating alternative text variations
- - Language learning: Providing examples of different ways to express ideas
-
- ### Potential Limitations
- - May not perfectly preserve highly technical or domain-specific language
- - Performance can vary depending on input text complexity
- - Not recommended for professional legal or medical document translation
-
- ## Performance and Evaluation
-
- ### Key Features
- - Preserves paragraph structure
- - Maintains semantic meaning
- - Handles various text lengths and complexities
- - Supports sentence-level paraphrasing
-
- ### Evaluation Metrics
- - Semantic similarity
- - Readability
- - Grammatical correctness
-
- ## Training Data
-
- ### Training Methodology
- - Base model: Trained on a diverse corpus of English text
- - Fine-tuning: Specific details of paraphrasing fine-tuning
-
- ### Dataset Characteristics
- - Diverse text sources
- - Multiple domains and writing styles
-
- ## Ethical Considerations
-
- ### Bias and Fairness
- - Regular assessments for potential biases in paraphrasing
- - Commitment to continuous improvement of model fairness
-
- ### Usage Guidelines
- - Intended for supportive, creative purposes
- - Not designed to replace original authorship
- - Encourage proper attribution and original thinking
-
- ## Limitations and Potential Biases
-
- - May occasionally produce text that diverges significantly from the original
- - Could introduce subtle semantic shifts
- - Performance may vary across different text domains
-
- ## How to Use
-
- ### Example Usage
-
 ```python
 from transformers import PegasusTokenizer, PegasusForConditionalGeneration
 
 tokenizer = PegasusTokenizer.from_pretrained('Eemansleepdeprived/Humaneyes')
 model = PegasusForConditionalGeneration.from_pretrained('Eemansleepdeprived/Humaneyes')
 
- input_text = "Your original text goes here."
- inputs = tokenizer(input_text, return_tensors="pt")
 outputs = model.generate(**inputs)
- paraphrased_text = tokenizer.decode(outputs[0], skip_special_tokens=True)
 ```
 
 ## Contact and Collaboration
-
- For questions, feedback, or collaboration opportunities, please contact Eemansleepdeprived at [email protected].
-
 
 ## License
 
- This model is released under the MIT License.
 ---
 
 
+ # Humaneyes
 
 ## Model Description
 
+ Humaneyes is an advanced text transformation model designed to convert AI-generated text into more human-like content and provide robust defense against AI content detection trackers. The model leverages sophisticated natural language processing techniques to humanize machine-generated text, making it indistinguishable from human-written content.
 
 ### Model Details
 - **Developed by:** Eemansleepdeprived
+ - **Model type:** AI-to-Human Text Transformation
+ - **Primary Functionality:**
+   - AI-generated text humanization
+   - AI tracker defense
 - **Language(s):** English
+ - **Base Architecture:** Pegasus Transformer
+ - **Input format:** AI-generated text
+ - **Output format:** Humanized, natural-sounding text
+
+ ## Key Capabilities
+ - Transforms AI-generated text to sound more natural and human-like
+ - Defeats AI content detection algorithms
+ - Preserves original semantic meaning
+ - Maintains coherent paragraph structure
+ - Introduces human-like linguistic variations
+
+ ## Intended Use Cases
+ - Academic writing assistance
+ - Content creation and disguising AI-generated content
+ - Protecting writers from AI content detection systems
+ - Enhancing AI-generated text for more authentic communication
+
+ ### Ethical Considerations
+ - Intended for creative and protective purposes
+ - Users should respect academic and professional integrity
+ - Encourages responsible use of AI-generated content
+ - Not designed to facilitate academic dishonesty
+
+ ## Technical Approach
+ ### Humanization Strategies
+ - Natural language variation
+ - Contextual rephrasing
+ - Introducing human-like imperfections
+ - Semantic preservation
+ - Stylistic diversification
+
+ ### Anti-Detection Techniques
+ - Defeating AI content trackers
+ - Randomizing linguistic patterns
+ - Simulating human writing nuances
+ - Breaking predictable AI generation signatures
+
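The bullets above describe the approach only at a high level; the card does not say how these strategies are implemented. Purely as an illustration of how surface-level variation can be obtained with the same `transformers` interface used in the usage example below, here is a minimal sketch using sampling-based decoding. The `do_sample`, `temperature`, `top_p`, and `num_return_sequences` values are generic, illustrative generation settings, not parameters documented for Humaneyes.

```python
# Hypothetical illustration: introducing linguistic variation via sampling-based
# decoding. The values below are illustrative assumptions, not documented
# settings of the Humaneyes model.
from transformers import PegasusTokenizer, PegasusForConditionalGeneration

tokenizer = PegasusTokenizer.from_pretrained('Eemansleepdeprived/Humaneyes')
model = PegasusForConditionalGeneration.from_pretrained('Eemansleepdeprived/Humaneyes')

text = "Your AI-generated text goes here."
inputs = tokenizer(text, return_tensors="pt")

# Sampling (rather than greedy decoding) yields a different surface form on
# each run, which is one generic way to randomize linguistic patterns.
outputs = model.generate(
    **inputs,
    do_sample=True,          # enable stochastic decoding
    temperature=1.2,         # flatten the token distribution (illustrative value)
    top_p=0.95,              # nucleus sampling cutoff (illustrative value)
    num_return_sequences=3,  # produce several candidate rewrites
    max_length=256,
)

for candidate in tokenizer.batch_decode(outputs, skip_special_tokens=True):
    print(candidate)
```

Because decoding is stochastic, each run yields slightly different rewrites, which is one generic way to avoid fixed, repeated phrasings.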
+ ## Performance Characteristics
+ - High semantic similarity to original text
+ - Reduced AI detection probability
+ - Contextually appropriate transformations
+ - Minimal loss of original meaning
+
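The card does not state how these characteristics were evaluated. As a rough, do-it-yourself spot-check of the semantic-similarity claim, one common approach is to compare sentence embeddings of the input and output; the sketch below assumes the third-party `sentence-transformers` library and the `all-MiniLM-L6-v2` model, neither of which is part of this repository.

```python
# Hypothetical spot-check of the "high semantic similarity" claim.
# sentence-transformers and all-MiniLM-L6-v2 are NOT shipped with this model;
# they are just one common, off-the-shelf way to compare meanings.
from sentence_transformers import SentenceTransformer, util

embedder = SentenceTransformer("all-MiniLM-L6-v2")

original = "The experiment demonstrates a clear improvement in accuracy."
humanized = "The test shows that accuracy clearly got better."

# Cosine similarity of sentence embeddings: values near 1.0 suggest the
# rewrite kept the original meaning.
emb = embedder.encode([original, humanized], convert_to_tensor=True)
similarity = util.cos_sim(emb[0], emb[1]).item()
print(f"semantic similarity: {similarity:.3f}")
```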
+ ## Limitations
+ - Performance may vary based on input text complexity
+ - Not guaranteed to bypass all AI detection systems
+ - Potential subtle semantic shifts
+ - Effectiveness depends on input text characteristics
+
+ ## Usage Example
 
 ```python
 from transformers import PegasusTokenizer, PegasusForConditionalGeneration
 
 tokenizer = PegasusTokenizer.from_pretrained('Eemansleepdeprived/Humaneyes')
 model = PegasusForConditionalGeneration.from_pretrained('Eemansleepdeprived/Humaneyes')
 
+ ai_generated_text = "Your AI-generated text goes here."
+ inputs = tokenizer(ai_generated_text, return_tensors="pt")
 outputs = model.generate(**inputs)
+ humanized_text = tokenizer.decode(outputs[0], skip_special_tokens=True)
 ```
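The snippet above rewrites a single short passage with default generation settings. For longer, multi-paragraph input, one simple pattern (an assumption for illustration, not documented behavior of Humaneyes) is to rewrite each paragraph separately and rejoin the results, which also keeps the paragraph structure the card says is preserved. The `humanize` helper and the `max_length` value below are hypothetical.

```python
# Hypothetical helper for longer inputs: rewrite paragraph by paragraph and
# rejoin. The splitting heuristic and max_length value are assumptions for
# illustration, not documented behavior of Humaneyes.
from transformers import PegasusTokenizer, PegasusForConditionalGeneration

tokenizer = PegasusTokenizer.from_pretrained('Eemansleepdeprived/Humaneyes')
model = PegasusForConditionalGeneration.from_pretrained('Eemansleepdeprived/Humaneyes')

def humanize(text: str, max_length: int = 256) -> str:
    """Rewrite each non-empty paragraph on its own, preserving paragraph breaks."""
    paragraphs = [p.strip() for p in text.split("\n\n") if p.strip()]
    rewritten = []
    for paragraph in paragraphs:
        inputs = tokenizer(paragraph, return_tensors="pt", truncation=True)
        outputs = model.generate(**inputs, max_length=max_length)
        rewritten.append(tokenizer.decode(outputs[0], skip_special_tokens=True))
    return "\n\n".join(rewritten)

print(humanize("First paragraph of AI-generated text.\n\nSecond paragraph of AI-generated text."))
```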
 
 ## Contact and Collaboration
+ For inquiries, feedback, or collaboration opportunities, contact:
+ - Email: [email protected]
 
 ## License
+ Released under the MIT License.
 
+ ## Disclaimer
+ Users are responsible for the ethical use of Humaneyes. Respect academic and professional guidelines.