---
license: mit
language:
- en
library_name: transformers
pipeline_tag: text2text-generation
tags:
- LCARS
- Star-Trek
- 128k-Context
- mistral
- chemistry
- biology
- finance
- legal
- art
- code
- medical
- text-generation-inference
---
|
If anybody has Star Trek data, please send it: this starship computer database archive needs it!
|
|
|
Then I can correctly theme this model for its role as a starship computer.

As well as any space data from NASA, I have collected some MUFON files, for which I am still framing the correct prompts for recall and interrogation.
|
I shall also be adding a lot of biblical and historical data from sacred texts. The aim is generated discussions in which philosophers discuss ancient history and how to solve the problems of the past which they encountered in their lives, using historical and factual data, as well as playing their roles after a biography and character role is generated for each model to play. They should also be amazed by each other's achievements, depending on their periods.

We need multiple roles and characters for these discussions, as well as as many historical facts and histories as possible, to enhance this model's ability to discern whether "ancient aliens" claims are true or false. (So we need astrological, astronomical, seismological, and ecological data for the periods of history we know, as well as the unfounded suppositions from YouTube subtitles, another useful source of themed data!)
|
|
|
|
|
This model is a collection of models merged via various merge methods, reclaiming previous models which would otherwise be orphaned by their parent models.

As a model of models, it may not remember some tasks, or it may in fact remember them all and perform highly!

There were some very bad NSFW merges, from role play to erotica, as well as various characters and roles downloaded into the model.

So those models were merged into other models which had been specifically trained on maths, medical data, coding operations, or even translation.
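As a sketch of what such a merge recipe can look like, here is a hypothetical mergekit configuration (mergekit is a commonly used tool for merging transformer checkpoints). The model names, layer ranges, and interpolation value below are placeholders for illustration, not the actual recipe used for this model:

```yaml
# Hypothetical SLERP merge of two specialised parents into one model.
# Model identifiers and layer_range are assumptions, not the real parents.
slices:
  - sources:
      - model: example/math-tuned-7b
        layer_range: [0, 32]
      - model: example/roleplay-tuned-7b
        layer_range: [0, 32]
merge_method: slerp
base_model: example/math-tuned-7b
parameters:
  t: 0.5          # interpolation factor between the two parents
dtype: bfloat16
```

Other merge methods (e.g. linear or TIES) follow the same configuration shape, differing mainly in `merge_method` and its `parameters`.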
|
|
|
The models were heavily DPO-trained, and various newer methodologies were installed. The Deep Mind series is a special series which contains self-correction, recall, visuo-spatial reasoning, and step-by-step thinking.
|
|
|
So the multi-merge often fixes these errors between models, as well as training gaps. Hopefully they all took and merged well!

It may even perform unknown and unprogrammed tasks.