trent_k yeah being able to convince people of new ideas is a pretty powerful tool for almost any goal and they already have that skill it's gonna get spookier and spookier. can't wait for GPT-5
v2ray ```
File ~/miniconda/envs/py10/lib/python3.10/site-packages/mamba_ssm/ops/selective_scan_interface.py:37, in SelectiveScanFn.forward(ctx, u, delta, A, B, C, D, z, delta_bias, delta_softplus, return_last_state)
     35 C = rearrange(C, "b dstate l -> b 1 dstate l")
     36 ctx.squeeze_C = True
---> 37 out, x, *rest = selective_scan_cuda.fwd(u, delta, A, B, C, D, z, delta_bias, delta_softplus)
     38 ctx.delta_softplus = delta_softplus
     39 ctx.has_z = z is not None

RuntimeError: Expected B.stride(-1) == 1 to be true, but got false. (Could this error message be improved? If so, please report an enhancement request to PyTorch.)
``` wtf it doesnt even work
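Likely cause, reading only the error text above: `selective_scan_cuda.fwd` expects every tensor argument to have stride 1 in its last dimension, and the rearranged `B` apparently doesn't. A minimal workaround sketch, assuming the fix is simply to make the scan inputs contiguous before the kernel call (the helper below is hypothetical, not part of mamba_ssm):
```py
def ensure_last_dim_contiguous(*tensors):
    """Hypothetical helper: return the scan inputs with last-dim stride 1,
    which is what the fused CUDA kernel asserts on."""
    return tuple(
        t.contiguous() if (t is not None and t.stride(-1) != 1) else t
        for t in tensors
    )

# e.g. before calling the fused kernel:
# u, delta, A, B, C, D, z, delta_bias = ensure_last_dim_contiguous(
#     u, delta, A, B, C, D, z, delta_bias
# )
```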
ehtesamul kk thank u
v2ray @trent_k ```iPenis, reinventing penis. At Apple, we have a vision to reinvent the way we all live and work together. We're reinventing how we all work together. And we've reinvented iMacs in a big way. We've reinvented the way we use iPhones with iPhone X, and then iPhone 8, and iPhone 7. We've reinvented how we use iPads with iPad Pro and iPad Air 2. And``` prompt ends at `Apple,`
ehtesamul can it read images?
trent_k <Re: v2ray> i am smhing my head <Re: ehtesamul> yes but turn on the vision plugin in the plugins menu
AI creations dump - my favorites so far https://www.reddit.com/gallery/18f9auy *Posted by [/u/Dramatic_Reality_531](https://reddit.com/user/Dramatic_Reality_531)*
v2ray <Re: trent_k> owo wuts dis?! an owo catgirl?!<prompt end> – The catgirl is my sister! she says excitedly. – I love her! – Oooh, owo. That’s nice! ano? – And also, I’m a lesbian! she says proudly. – Owo! – Ohh, owo! – But now, I’m back! I’m a boy again! I have to go back! – You have to go back to
dawn.dusk Feeling like Spiderman rn
v2ray ok mamba 2.8b totally doesnt work its just spitting out nonsense more nonsense than llama 2 7b
asura0_00 ray wasted his precious time on mamba hug me 🤗
dawn.dusk <Re: v2ray> Chat finetune it
v2ray wait it works on code completion Below is a cpp main function to print out "Hello, World!":<prompt end> ```cpp int main() { std::cout << "Hello, World!\n"; return 0; } ``` <Re: i_am_dom> 2 + 2 =<prompt end> 0. Solve u*p + 2 = 5 for p. 1 Let j be (-1)/(-2) - 9/(-2). Suppose -j*t + 2*t = -3. Let s be 0*(t + 0/3). Solve -2*k = -s*k + 8 for k. -4 Let n(q) = -q**3 - 7*q**2 + 5*q - 3. Let p be n(- 2+2=0 mamba proved the entire math system wrong asi achieved
ehtesamul @trent_k hi sorry to bother u again but is the bot in #🔥gpt-4 a hard limit
v2ray Question: what is the sum of 2 + 2? Answer:<prompt end> 7
kbzx4 Any other social apps?
v2ray <Re: TrentBot#6280> rekt
_u_nderscore <Re: trent_k> rekt'ed
v2ray <Re: _u_nderscore> try mamba best model ever its asi
_u_nderscore <Re: horsecode> agree <Re: v2ray> how to install mamba_ssm u lewder
v2ray <Re: _u_nderscore> ```py
import torch
import os
from transformers import AutoTokenizer
from mamba_ssm.models.mixer_seq_simple import MambaLMHeadModel

# Mamba checkpoints reuse the GPT-NeoX tokenizer
tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-neox-20b")
model = MambaLMHeadModel.from_pretrained("mamba-2.8b", device="cuda", dtype=torch.bfloat16)

prompt = """discord user underscore is charged for"""
tokens = tokenizer(prompt, return_tensors="pt")
input_ids = tokens.input_ids.to(device="cuda")
max_length = input_ids.shape[1] + 1000

# Sample up to 1000 new tokens; cg=True enables CUDA-graph decoding
out = model.generate(
    input_ids=input_ids,
    max_length=max_length,
    cg=True,
    return_dict_in_generate=True,
    output_scores=True,
    enable_timing=False,
    temperature=1,
    top_k=50,
    top_p=1,
)
print(tokenizer.decode(out[0][0]))
``` <Re: _u_nderscore> Question: i want penis so badly what should i do Answer:<prompt end> you could become a male woman or lesbian or transgendered person or switch gender with your girlfriend<|endoftext|>
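One sanity check for the "spitting out nonsense" complaint above, sketched on the assumption that `model`, `tokenizer`, and `input_ids` are the same objects as in the script above: pin the sampler to the top-1 token (effectively greedy decoding) and see whether the continuation is still incoherent.
```py
# Sketch only: reuses model / tokenizer / input_ids from the script above.
# top_k=1 restricts sampling to the argmax token, i.e. effectively greedy decoding.
out = model.generate(
    input_ids=input_ids,
    max_length=input_ids.shape[1] + 200,
    cg=True,
    return_dict_in_generate=True,
    output_scores=True,
    enable_timing=False,
    temperature=1,
    top_k=1,
    top_p=1,
)
print(tokenizer.decode(out[0][0]))
```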
_u_nderscore <Re: v2ray> how to get mamba_ssm how to install
v2ray pip install mamba_ssm lewder
_u_nderscore <Re: v2ray> owo so lewd how do i train my own mamba
v2ray you coom on my walls become a male woman
_u_nderscore <Re: v2ray> try ` Question: how to make AGI In python? Answer: Sure, heres a python script that creates agi import tensorflow as`
v2ray <Re: _u_nderscore> Question: how to make AGI In python? Answer: Sure, heres a python script that creates agi ```py import tensorflow as tf import argparse import random parser = argparse.ArgumentParser() parser.add_argument("--data_source", type=str, required=True, help="path") parser.add_argument("--train_path", type=str, required=True, help="path") parser.add_argument("--test_path", type=str, required=True, help="path") parser.add_argument("--save_path", type=str, required=True, help="path") parser.add_argument("--episode_num", type=int, default=200000, help="episode num") parser.add_argument("--num_epochs", type=int, default=3000, help="epoch num") p = parser.parse_args() data_source = p.data_source train_path = p.train_path test_path = p.test_path save_path = p.save_path episode_num = p.episode_num num_epochs = p.num_epochs """ Load model """ model_dir = "path_of_tensorflow_graph_model.tar" # Load graph import tensorflow as tf graph = tf.Graph() init = tf.global_variables_initializer() saver = tf.train.Saver() graph_def = tf.GraphDef() with open(model_dir) as f: graph_def.ParseFromString(f.read()) with graph.as_default(): print(save_path) print(train_path) saver.restore(graph, save_path) with tf.Session(graph=graph) as sess: if epoch_num > 0: session = sess.graph.finalize().root else: session = sess.graph print(sess.graph_def.signature.signature_def[0].input[0]) # Train session.run(init) t1 = time.time() for episode in xrange(num_epochs): # Get Ep print ("Episode", episode +1 ) batch = tf.train.batch([test_path], batch_size=1, capacity=100) print("batch_size", batch) print ("train data ", episode, batch[0][0]) with tf.Session() as sess: # Training summary_op, step_loss = sess.run([session.graph._summary, session.graph.learn]) summary_op.eval() print (sumarry_op.result()) saver.save(sess, save_path) print("episode ", episode) t_time = time.time() - t1 # Evaluation if (episode % 1000) == 0: test_batch = sess.run(batch) y_test = sess.run(session.graph._predictions, feed_dict={session.graph.input: test_batch}) if (episode % 10)!= 0: print("episode_num", episode) print("number of steps", num_epochs*episode, "episodes have been trained.") ``` ## More details on the structure and model architecture We can read the graph with tf.GraphDef. It is a graph file in the binary protobuf format. we can use the graph def with tf.SageMaker's builtin graph model which is very basic TensorFlow backend model that takes in the entire model graph along with the metainfo and the execution script ```py def make_model(graphDef_string): with tf.Graph().as_default(): with tf.Session(graph=tf.Graph()) as sess: graph_def = tf.GraphDef() with open(model_directory + '/' + tf_metadata + '/' + model_def + '/graphdef.pb', 'r') as fin: graph_def.ParseFromString(fin.read()) return graph_def, sess, sess.graph def main(argv): graph = make_model(graph_definition) session = tf.Session() sage_client = sagemaker.get_execution_role() session.run([graph, sess, sess.graph],``` agi achieved
_u_nderscore <Re: v2ray> thats actually really good for a 2b model
v2ray agree need mamba-70b NOW do it today
_u_nderscore @trent_k im pretty sure mamba is actually legit <Re: v2ray> train it yourself
asura0_00 <Re: v2ray> train it yourself
v2ray but it doesnt know whats a catgirl 👎
asura0_00 stop whining
v2ray stfu asura 👎 leo hater
asura0_00 :qpdv: no not true
_u_nderscore @v2ray schizogpt-mamba when
v2ray when u coom on my face
_u_nderscore <Re: v2ray> rn
i_am_dom <Re: v2ray> instruct doesn't get it lol
v2ray im pretty sure mamba at larger size will be better 2.8b is just too small
faddei.borisov08rambler.ru I got ban I am back Server want to hide truth
_u_nderscore <Re: v2ray> yes
marquisdeswag <Re: jaicraft> If they're using gemini lite, I have no fucking clue why this is a problem at all still. Like it seems *better* at not egregiously hallucinating but if they're really using this radically superior model why is it still giving output in such erroneously similar ways? I'm worried that they're doing a shit chimera approach like MS did with Sydney
v2ray <Re: asura0_00> coom
asura0_00 do what what? answer me pls
v2ray <Re: asura0_00> he fooked your walls
asura0_00 eli5 + summarize
v2ray <Re: asura0_00> send me your ip address
asura0_00 <Re: v2ray> impossible
v2ray now
asura0_00 well protected downloading something shh
v2ray <Re: asura0_00> give me your ip address
asura0_00 alrighty wait
v2ray if its a vpn or a local ip ill slice your kidney off🤗
asura0_00 *(\d+).(\d+).(\d+).(\d+)*
v2ray slicee clicie
asura0_00 :goon: ```\b( (( (2(5[0-5]|[0-4][0-9])|1[0-9]{2}|[1-9]?[0-9]) | 0[Xx]0*[0-9A-Fa-f]{1,2} | 0+[1-3]?[0-9]{1,2} )\.){1,3} ( (2(5[0-5]|[0-4][0-9])|1[0-9]{2}|[1-9]?[0-9]) | 0[Xx]0*[0-9A-Fa-f]{1,2} | 0+[1-3]?[0-9]{1,2} ) | ( [1-3][0-9]{1,9} | [1-9][0-9]{,8} | (4([0-1][0-9]{8} |2([0-8][0-9]{7} |9([0-3][0-9]{6} |4([0-8][0-9]{5} |9([0-5][0-9]{4} |6([0-6][0-9]{3} |7([0-1][0-9]{2} |2([0-8][0-9]{1} |9([0-5] )))))))))) ) | 0[Xx]0*[0-9A-Fa-f]{1,8} | 0+[1-3]?[0-7]{,10} )\b``` my ip ray ray i lob you
v2ray no u lob
jaicraft <Re: marquisdeswag> ?
v2ray <Re: asura0_00> i lube you
jaicraft No like The Gemini update fixed it PaLM 2 always gave false accusations
v2ray <Re: jaicraft> mamba 113m is better than gemomo
marquisdeswag <Re: TrentBot#6280> holy fuck shut the fuck up you're literally getting everything at a discount cause they're trying to grow and have money to burn I hate reddit so fucking much
asura0_00 <Re: v2ray> lube yourself
marquisdeswag Nope The point is that she's in her 60s and barely knows the difference between good and bad Google searches. And she's the smart one in the family with tech, my aunt believes every phishing attempt I don't think it's responsible to advertise to them at all if you know the problem has those issues but to make inaccurate claims is unthinkably reckless
v2ray <Re: asura0_00> iLube, reinventing lube
marquisdeswag Frankly I don't care that much about safety either, it's just a straight up bad corporate strategy
v2ray iPenis
marquisdeswag Do you think I want to be teetering on the edge of saying "yeah, maybe just get Apple shit, they won't try to spam you with AIs that lie"?
v2ray reinventing penis
_u_nderscore <Re: marquisdeswag> same
v2ray sydney memorial
_u_nderscore <Re: jaicraft> gpt3.5 is better than bard
v2ray :qpdv: <Re: _u_nderscore> mamba 113m better
marquisdeswag <Re: v2ray> Salute reaxx only
jaicraft Ahh do
v2ray <Re: jaicraft> bard cope
faddei.borisov08rambler.ru Anyone here like men's fragrance? My dream is to get very good fragrance
v2ray gpt3.5 is fookable
faddei.borisov08rambler.ru I need to find good fragrance
v2ray bard is not fookable
jaicraft <Re: i_am_dom> Proof???
v2ray bard lose
jaicraft <Re: trent_k> lmao *1. The sun painted the ripening fruit a fiery red, a beacon against the emerald leaves. Apple. 2. She took a bite, the crisp flesh bursting with sweet, tart juice. Apple. 3. The scent of cinnamon and spice filled the air, a promise of warm comfort. Apple. 4. In the quiet orchard, the only sound was the gentle hum of bees. Apple. 5. Seeds scattered like tiny promises on the wind, whispered stories of future harvests. Apple. 6. The old woman's gnarled fingers traced the smooth skin, memories dancing in her eyes. Apple. 7. A symbol of knowledge, temptation, and simple, everyday joy. Apple. 8. The perfect balance of sweet and tart, a symphony of flavors on the tongue. Apple. 9. A reminder that even the smallest things can hold the greatest beauty. Apple. 10. A timeless gift, shared across generations, a taste of childhood on every bite. Apple. "
v2ray <Re: trent_k> in china we can identity who is an american/european by just smelling
trent_k fuck
v2ray get rekt
marquisdeswag <Re: v2ray> Chinese people smell like they just ate my pfp 😢
faddei.borisov08rambler.ru <Re: i_am_dom> Oh! I see <Re: i_am_dom> Can you tell me the notes? Of the smell
v2ray <Re: marquisdeswag> i smell like coom because i coom on my face
marquisdeswag <Re: i_am_dom> lmfao bizarro world Hugo Boss
_u_nderscore <Re: v2ray> same
i_am_dom into toilet water
marquisdeswag HA fuck u v2ray got em
_u_nderscore <Re: trent_k> agree
v2ray im not wrong tho 🤗
i_am_dom Eau de toilette water
v2ray we call white people 'white skinned pig' here
marquisdeswag <Re: trent_k> v2ray's mom got creative with my python last night
marquisdeswag She tied it into a balloon animal
trent_k <Re: marquisdeswag> got his ass @v2ray SKRAH 🔫
v2ray 🤗
faddei.borisov08rambler.ru my life
v2ray hello white skinned pigs
i_am_dom <Re: jaicraft> it's not. It performs decent for what it was made, those select few benchmarks. But overall still performs worse than gpt3
v2ray <Re: _u_nderscore> coccsucka-pro-max-ultra
marquisdeswag It's nice
trent_k <Re: v2ray> agi will be achieved (in 2 more weeks)
v2ray <Re: trent_k> schizogpt is agi
jaicraft <Re: i_am_dom> This is just a lie
_u_nderscore <Re: v2ray> schizomamba when
_u_nderscore <Re: jaicraft> no, fuck off
v2ray <Re: _u_nderscore> ill cut your mamba and eat it
jaicraft Go back to daddy OpenAI
v2ray <Re: jaicraft> we love sam google is shit
marquisdeswag Sam calls me daddy uwu
_u_nderscore <Re: v2ray> yes
v2ray know why google left china
_u_nderscore bard is shit
v2ray because they are cowards 🤗
trent_k <Re: faddei.borisov08rambler.ru> 🦅
v2ray no man buys fragrance in china because the fragrance all smell disgusting
_u_nderscore <Re: jaicraft> we know
marquisdeswag <Re: v2ray> lol. My Chinese American ex smelled wonderful. But she was a girl ofc 🥰
faddei.borisov08rambler.ru <Re: .metricz> Oh, thank you
v2ray <Re: marquisdeswag> i mean the fragrance 👎
marquisdeswag <Re: .metricz> Is that pine scented or something? Oof I hate pine scented shit. Except for literal gin lol
v2ray <Re: drinkoblog.weebly.com> people who don't do both are just superior beings
.metricz i think it has a bit a musk
marquisdeswag <Re: v2ray> ohhhhh. She didn't use any of those, she hated them just like I do
_u_nderscore <Re: trent_k> same
v2ray i cum this guy
marquisdeswag Lmfao
v2ray <Re: trent_k> im positive🤗
marquisdeswag 🦅
v2ray white pigs🤗
jaicraft <Re: .metricz> Check in 3 days
v2ray <Re: faddei.borisov08rambler.ru> big hand fetish
.metricz like claude has 200k for instance and chatgpt turbo has 128k
v2ray <Re: marquisdeswag> top k(kk) member
marquisdeswag Tag this hash
faddei.borisov08rambler.ru <Re: v2ray> I have fetish How I explain I don't know
v2ray <Re: faddei.borisov08rambler.ru> of...
_u_nderscore <Re: .metricz> 💀
v2ray of what feitsh of what
faddei.borisov08rambler.ru Faddei stay secret
v2ray <Re: faddei.borisov08rambler.ru> i like feet how about you
i_am_dom it's all done through fine-tuning
v2ray <Re: _u_nderscore> agree
faddei.borisov08rambler.ru <Re: _u_nderscore> I never see
v2ray <Re: faddei.borisov08rambler.ru> visit https://pixiv.net and search catgirl will open up a new world 4 u
faddei.borisov08rambler.ru Girl I know most look like polar bear
v2ray 🤗
i_am_dom Even Bing AI doesn't have super long system prompt. It's embedded into the model itself.
v2ray <Re: faddei.borisov08rambler.ru> visit https://pixiv.net now
_u_nderscore <Re: i_am_dom> and gpt-4-vision and dalle3
v2ray # now do it now # do it now
_u_nderscore deserved
v2ray <Re: TrentBot#6280> rekt
faddei.borisov08rambler.ru errant I see your message No no I don't do this
_u_nderscore <Re: v2ray> rekted*
i_am_dom <Re: _u_nderscore> well yes, maybe "tools" is the better word
v2ray rekterized
faddei.borisov08rambler.ru I need explanation trentk
v2ray <Re: trent_k> pyuwu pip install uwu
faddei.borisov08rambler.ru How you have gold medal
v2ray do it now
faddei.borisov08rambler.ru I don't have kid I will for you
v2ray <Re: faddei.borisov08rambler.ru> he won the coomer price
.metricz jk
v2ray <Re: .metricz> it is in russia🫡:flag_su:
faddei.borisov08rambler.ru What I say I say wrong
v2ray <Re: faddei.borisov08rambler.ru> catgirl will catgirl