---

# Humaneyes

## Model Description

Humaneyes is a text transformation model designed to convert AI-generated text into more human-like content and to reduce the chance of it being flagged by AI content detectors. The model applies natural language processing techniques to humanize machine-generated text so that it reads much more like human-written content.

### Model Details

- **Developed by:** Eemansleepdeprived
- **Model type:** AI-to-human text transformation
- **Primary functionality:**
  - AI-generated text humanization
  - Defense against AI content detectors
- **Language(s):** English
- **Base architecture:** Pegasus transformer
- **Input format:** AI-generated text
- **Output format:** Humanized, natural-sounding text

## Key Capabilities

- Transforms AI-generated text to sound more natural and human-like
- Reduces the likelihood that text is flagged by AI content detection algorithms
- Preserves original semantic meaning
- Maintains coherent paragraph structure
- Introduces human-like linguistic variations

## Intended Use Cases

- Academic writing assistance
- Content creation and disguising AI-generated content
- Protecting writers from AI content detection systems
- Enhancing AI-generated text for more authentic communication

### Ethical Considerations

- Intended for creative and protective purposes
- Users should respect academic and professional integrity
- Encourages responsible use of AI-generated content
- Not designed to facilitate academic dishonesty

## Technical Approach

### Humanization Strategies

- Natural language variation
- Contextual rephrasing
- Introducing human-like imperfections
- Semantic preservation
- Stylistic diversification

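The checkpoint itself does not fix any decoding settings, so the snippet below is only a sketch of how the variation and stylistic diversification listed above might be introduced at generation time, using the standard `transformers` sampling parameters (`do_sample`, `top_p`, `temperature`, `num_return_sequences`). The specific values are illustrative assumptions, not tuned recommendations from this release.

```python
from transformers import PegasusTokenizer, PegasusForConditionalGeneration

# Load the released checkpoint (same as in the usage example below).
tokenizer = PegasusTokenizer.from_pretrained('Eemansleepdeprived/Humaneyes')
model = PegasusForConditionalGeneration.from_pretrained('Eemansleepdeprived/Humaneyes')

text = "Your AI-generated text goes here."
inputs = tokenizer(text, return_tensors="pt", truncation=True)

# Sampled decoding returns several differently phrased candidates instead of
# a single deterministic output; the parameter values below are illustrative only.
outputs = model.generate(
    **inputs,
    do_sample=True,
    top_p=0.95,
    temperature=1.2,
    num_return_sequences=3,
    max_length=256,
)

candidates = [tokenizer.decode(o, skip_special_tokens=True) for o in outputs]
for i, candidate in enumerate(candidates, 1):
    print(f"Variant {i}: {candidate}")
```

Beam search (`num_beams`) gives more deterministic, higher-fidelity rewrites; sampling trades some of that fidelity for variety.
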
### Anti-Detection Techniques

- Evading AI content detectors
- Randomizing linguistic patterns
- Simulating human writing nuances
- Breaking predictable AI generation signatures

## Performance Characteristics

- High semantic similarity to original text (a quick way to check this is sketched below)
- Reduced AI detection probability
- Contextually appropriate transformations
- Minimal loss of original meaning

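This card does not state how semantic similarity is evaluated. As a rough, assumption-based check, the original and rewritten texts can be embedded with a general-purpose sentence encoder and compared by cosine similarity. The `sentence-transformers` package and the `all-MiniLM-L6-v2` checkpoint used below are illustrative choices and are not part of the Humaneyes release.

```python
from sentence_transformers import SentenceTransformer, util

# Illustrative check: embed both texts and compare cosine similarity.
# The embedding model here is an assumption, not part of this release.
embedder = SentenceTransformer("all-MiniLM-L6-v2")

original = "Your AI-generated text goes here."
humanized = "Your rewritten text goes here."

embeddings = embedder.encode([original, humanized], convert_to_tensor=True)
similarity = util.cos_sim(embeddings[0], embeddings[1]).item()
print(f"Semantic similarity: {similarity:.3f}")  # closer to 1.0 means meaning is better preserved
```
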
## Limitations

- Performance may vary based on input text complexity
- Not guaranteed to bypass all AI detection systems
- Potential subtle semantic shifts
- Effectiveness depends on input text characteristics

## Usage Example

```python
from transformers import PegasusTokenizer, PegasusForConditionalGeneration

# Load the Humaneyes tokenizer and model from the Hugging Face Hub.
tokenizer = PegasusTokenizer.from_pretrained('Eemansleepdeprived/Humaneyes')
model = PegasusForConditionalGeneration.from_pretrained('Eemansleepdeprived/Humaneyes')

# Rewrite a piece of AI-generated text.
ai_generated_text = "Your AI-generated text goes here."
inputs = tokenizer(ai_generated_text, return_tensors="pt")
outputs = model.generate(**inputs)
humanized_text = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(humanized_text)
```
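Pegasus-based models accept only a limited number of input tokens (the exact limit depends on the checkpoint configuration), so long documents may need to be split before rewriting. The helper below is a sketch only, reusing the `tokenizer` and `model` loaded in the example above; the function name, the paragraph-splitting heuristic, and the 512-token budget are assumptions for illustration, not part of the release.

```python
def humanize_long_text(text: str, max_input_tokens: int = 512) -> str:
    """Illustrative helper (not part of the release): rewrite a long document
    paragraph by paragraph so each chunk fits within the model's input limit."""
    paragraphs = [p for p in text.split("\n\n") if p.strip()]
    rewritten = []
    for paragraph in paragraphs:
        inputs = tokenizer(
            paragraph,
            return_tensors="pt",
            truncation=True,
            max_length=max_input_tokens,
        )
        outputs = model.generate(**inputs, max_length=max_input_tokens)
        rewritten.append(tokenizer.decode(outputs[0], skip_special_tokens=True))
    return "\n\n".join(rewritten)
```
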
## Contact and Collaboration

For inquiries, feedback, or collaboration opportunities, contact:

- Email: [email protected]

## License

Released under the MIT License.

## Disclaimer

Users are responsible for the ethical use of the Humaneyes Text Humanizer. Respect academic and professional guidelines.