modelId | tags | pipeline_tag | config | downloads | first_commit | card
---|---|---|---|---|---|---
BigSalmon/InformalToFormalLincoln24
|
[
"pytorch",
"gpt2",
"text-generation",
"transformers",
"has_space"
] |
text-generation
|
{
"architectures": [
"GPT2LMHeadModel"
],
"model_type": "gpt2",
"task_specific_params": {
"conversational": {
"max_length": null
},
"summarization": {
"early_stopping": null,
"length_penalty": null,
"max_length": null,
"min_length": null,
"no_repeat_ngram_size": null,
"num_beams": null,
"prefix": null
},
"text-generation": {
"do_sample": true,
"max_length": 50
},
"translation_en_to_de": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_fr": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_ro": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
}
}
}
| 5 | null |
---
license: apache-2.0
language:
- en
- es
- it
- fr
metrics:
- f1
---
# Federated Learning Based Multilingual Emoji Prediction
This repository contains code for training and evaluating transformer-based models for unilingual and multilingual emoji prediction in clean and attack scenarios using federated learning. This work is described in the paper "Federated Learning-Based Multilingual Emoji Prediction in Clean and Attack Scenarios."
# Abstract
Federated learning is a growing field in the machine learning community due to its decentralized and private design. Model training in federated learning is distributed over multiple clients, giving access to large amounts of client data while maintaining privacy. A server then aggregates the training done on these clients without accessing their data, which may include emojis, widely used in social media services and instant messaging platforms to express users' sentiments. This paper proposes federated learning-based multilingual emoji prediction in both clean and attack scenarios. Emoji prediction data were crawled from both Twitter and the SemEval emoji dataset. These data are used to train and evaluate transformer models of different sizes, including a sparsely activated transformer, under either the assumption of clean data on all clients or data poisoned via a label-flipping attack on some clients. Experimental results show that federated learning in both clean and attack scenarios performs similarly to centralized training for multilingual emoji prediction on seen and unseen languages, under different data sources and distributions. Our trained transformers also outperform other techniques on the SemEval emoji dataset, in addition to offering the privacy and distribution benefits of federated learning.
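The server aggregation described above (combining client updates without seeing client data) is commonly done FedAvg-style, as a size-weighted average of client parameters. A minimal NumPy sketch under that assumption, not the paper's actual implementation:

```python
import numpy as np

def fedavg(client_weights, client_sizes):
    # Size-weighted average of client parameter vectors (FedAvg-style).
    sizes = np.asarray(client_sizes, dtype=float)
    coeffs = sizes / sizes.sum()  # each client's aggregation weight
    stacked = np.stack([np.asarray(w, dtype=float) for w in client_weights])
    return (coeffs[:, None] * stacked).sum(axis=0)

# Two clients: one trained on 1 sample, one on 3 samples.
global_params = fedavg([[0.0, 0.0], [4.0, 4.0]], [1, 3])
```

A label-flipping attack in this setting poisons some clients' training labels before their updates are sent; the aggregation step itself is unchanged.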
# Performance
> * Acc: 47.566 %
> * Mac-F1: 34.753 %
> * Also see our [GitHub Repo](https://github.com/kareemgamalmahmoud/FEDERATED-LEARNING-BASED-MULTILINGUAL-EMOJI-PREDICTION-IN-CLEAN-AND-ATTACK-SCENARIOS)
# Dependencies
> * Python 3.6+
> * PyTorch 1.7.0+
> * Transformers 4.0.0+
# Usage
> To use the model, first install the `transformers` package from Hugging Face:
```bash
pip install transformers
```
> Then, you can load the model and tokenizer using the following code:
```python
from transformers import AutoModelForSequenceClassification, AutoTokenizer
import numpy as np
import urllib.request
import csv
```
```python
MODEL = "Karim-Gamal/XLM-Roberta-finetuned-emojis-1-client-toxic-FedAvg-IID-Fed"
tokenizer = AutoTokenizer.from_pretrained(MODEL)
model = AutoModelForSequenceClassification.from_pretrained(MODEL)
```
> Once you have the tokenizer and model, you can preprocess your text and pass it to the model for prediction:
```python
# Preprocess text (replace usernames and links with placeholders)
def preprocess(text):
    new_text = []
    for t in text.split(" "):
        t = '@user' if t.startswith('@') and len(t) > 1 else t
        t = 'http' if t.startswith('http') else t
        new_text.append(t)
    return " ".join(new_text)

text = "Hello world"
text = preprocess(text)
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
scores = output[0][0].detach().numpy()
```
> The `scores` variable contains the model's raw logits for each of the possible emoji labels (apply a softmax to turn them into probabilities). To get the top k predictions, you can use the following code:
```python
# Download the label mapping (index -> emoji)
labels = []
mapping_link = "https://raw.githubusercontent.com/cardiffnlp/tweeteval/main/datasets/emoji/mapping.txt"
with urllib.request.urlopen(mapping_link) as f:
    html = f.read().decode('utf-8').split("\n")
    csvreader = csv.reader(html, delimiter='\t')
    labels = [row[1] for row in csvreader if len(row) > 1]

k = 3  # number of top predictions to show
ranking = np.argsort(scores)
ranking = ranking[::-1]
for i in range(k):
    l = labels[ranking[i]]
    s = scores[ranking[i]]
    print(f"{i+1}) {l} {np.round(float(s), 4)}")
```
## Note: this code is adapted from [cardiffnlp/twitter-roberta-base-emoji](https://huggingface.co/cardiffnlp/twitter-roberta-base-emoji)
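The raw `scores` returned by the model are logits; a softmax converts them into probabilities over the emoji labels. A minimal NumPy sketch, independent of the model itself:

```python
import numpy as np

def softmax(logits):
    # Shift by the max before exponentiating, for numerical stability.
    z = logits - np.max(logits)
    e = np.exp(z)
    return e / e.sum()

probs = softmax(np.array([2.0, 1.0, 0.1]))  # e.g. three emoji logits
```

The ranking produced by `np.argsort` is the same for logits and probabilities, so the top-k code works either way.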
|
BigSalmon/Points
|
[
"pytorch",
"tensorboard",
"gpt2",
"text-generation",
"transformers",
"has_space"
] |
text-generation
|
{
"architectures": [
"GPT2LMHeadModel"
],
"model_type": "gpt2",
"task_specific_params": {
"conversational": {
"max_length": null
},
"summarization": {
"early_stopping": null,
"length_penalty": null,
"max_length": null,
"min_length": null,
"no_repeat_ngram_size": null,
"num_beams": null,
"prefix": null
},
"text-generation": {
"do_sample": true,
"max_length": 50
},
"translation_en_to_de": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_fr": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_ro": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
}
}
}
| 13 | null |
---
license: apache-2.0
tags:
- generated_from_keras_callback
model-index:
- name: t5-base-finetuned-arxiv2
results: []
pipeline_tag: summarization
---
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# t5-base-finetuned-arxiv2
This model is a fine-tuned version of [t5-base](https://huggingface.co/t5-base) on an unknown dataset.
It achieves the following results after the final epoch:
- Train Loss: 2.1012
- Validation Loss: 2.0674
- Epoch: 2
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'Adam', 'weight_decay': None, 'clipnorm': None, 'global_clipnorm': 1.0, 'clipvalue': None, 'use_ema': False, 'ema_momentum': 0.99, 'ema_overwrite_frequency': None, 'jit_compile': True, 'is_legacy_optimizer': False, 'learning_rate': {'class_name': 'PolynomialDecay', 'config': {'initial_learning_rate': 5e-05, 'decay_steps': 24894, 'end_learning_rate': 0.0, 'power': 1.0, 'cycle': False, 'name': None}}, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False}
- training_precision: mixed_float16
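The `PolynomialDecay` entry in the optimizer config above anneals the learning rate from 5e-05 to 0 over 24894 steps; with `power=1.0` and `cycle=False` this is straight linear decay. A pure-Python sketch of the schedule's math (the function name is illustrative, not the Keras API):

```python
def polynomial_decay_lr(step, initial_lr=5e-5, end_lr=0.0,
                        decay_steps=24894, power=1.0):
    # Keras-style PolynomialDecay with cycle=False: clamp, then interpolate.
    step = min(step, decay_steps)
    remaining = 1.0 - step / decay_steps
    return (initial_lr - end_lr) * (remaining ** power) + end_lr

lr_mid = polynomial_decay_lr(12447)  # halfway through training
```

Halfway through the 24894 decay steps the learning rate is exactly half the initial value, and it stays at `end_lr` once the decay steps are exhausted.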
### Training results
| Train Loss | Validation Loss | Epoch |
|:----------:|:---------------:|:-----:|
| 2.3695 | 2.1248 | 0 |
| 2.1762 | 2.0810 | 1 |
| 2.1012 | 2.0674 | 2 |
### Framework versions
- Transformers 4.26.1
- TensorFlow 2.11.0
- Datasets 2.9.0
- Tokenizers 0.13.2
|
Bimal/my_bot_model
|
[
"pytorch",
"gpt2",
"text-generation",
"transformers",
"conversational"
] |
conversational
|
{
"architectures": [
"GPT2LMHeadModel"
],
"model_type": "gpt2",
"task_specific_params": {
"conversational": {
"max_length": 1000
},
"summarization": {
"early_stopping": null,
"length_penalty": null,
"max_length": null,
"min_length": null,
"no_repeat_ngram_size": null,
"num_beams": null,
"prefix": null
},
"text-generation": {
"do_sample": null,
"max_length": null
},
"translation_en_to_de": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_fr": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_ro": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
}
}
}
| 10 | null |
Flowers
Miley Cyrus
We were good, we were gold
Kind of dream that can't be sold
We were right, 'til we weren't
Built a home and watched it burn
Mm, I didn't wanna leave you, I didn't wanna lie
Started to cry, but then remembered, I
I can buy myself flowers
Write my name in the sand
Talk to myself for hours
Say things you don't understand
I can take myself dancing
And I can hold my own hand
Yeah, I can love me better than you can
Can love me better, I can love me better, baby
Can love me better, I can love me better, baby
Paint my nails, cherry-red
Match the roses that you left
No remorse, no regret
I forgive every word you said
Ooh, I didn't wanna leave you, baby, I didn't wanna fight
Started to cry, but then remembered, I
I can buy myself flowers
Write my name in the sand
Talk to myself for hours, yeah
Say things you don't understand
I can take myself dancing, yeah
I can hold my own hand
Yeah, I can love me better than you can
Can love me better, I can love me better, baby
Can love me better, I can love me better, baby
Can love me better, I can love me better, baby
Can love me better, I (ooh, I)
I didn't wanna leave you, I didn't wanna fight
Started to cry, but then remembered, I
I can buy myself flowers, uh-huh
Write my name in the sand
Talk to myself for hours, yeah
Say things you don't understand (better than you)
I can take myself dancing, yeah
I can hold my own hand
Yeah, I can love me better than
Yeah, I can love me better than you can
Can love me better, I can love me better, baby (uh-huh)
Can love me better, I can love me better, baby (than you can)
Can love me better, I can love me better, baby
Can love me better, I
Kill Bill
SZA
I'm still a fan even though I was salty
Hate to see you with some other broad, know you happy
Hate to see you happy if I'm not the one driving
I'm so mature, I'm so mature
I'm so mature, I got me a therapist to tell me there's other men
I don't want none, I just want you
If I can't have you, no one should
I might
I might kill my ex, not the best idea
His new girlfriend's next, how'd I get here?
I might kill my ex, I still love him though
Rather be in jail than alone
I get the sense that it's a lost cause
I get the sense that you might really love her
The text gon' be evidence, this text is evidence
I tried to ration with you, no murders, no crimes of passion, but damn
You was out of reach
You was at the farmer's market with your perfect peach
Now I'm in amazement, playing on my patience
Now you laying face-down, got me singing over a beat
I'm so mature, I'm so mature
I'm so mature, I got me a therapist to tell me there's other men
I don't want none, I just want you
If I can't have you, no one will
(I might)
I might kill my ex, not the best idea
His new girlfriend's next, how'd I get here?
I might kill my ex, I still love him though
Rather be in jail than alone
I did it all for love (love)
I did it all on no drugs (drugs)
I did all of this sober
I did it all for us, oh
I did it all for love (love)
I did it all of this on no drugs (drugs)
I did all of this sober
Don't you know I did it all for us? (I'll kill your ass tonight)
Uh, I just killed my ex (my ex)
Not the best idea (idea)
Killed his girlfriend next, how'd I get here?
I just killed my ex (my ex)
I still love him, though (I do)
Rather be in Hell than alone
Boy’s A Liar, Pt. 2
PinkPantheress & Ice Spice
Take a look inside your heart
Is there any room for me?
I won't have to hold my breath
'Til you get down on one knee
Because you only want to hold me
When I'm looking good enough
Did you ever feel me?
Would you ever picture us?
Every time I pull my hair
Was only out of fear
That you'll find me ugly
And one day you'll disappear because
What's the point of crying?
It was never even love
Did you ever want me?
Was I ever good enough?
The, the boy's a liar
The boy's a liar
He doesn't see ya
You're not lookin' at me, boy
The boy's a liar
The boy's a liar
He doesn't see ya
You're not lookin' at me, boy
Good eno-o-ough
Good eno-o-ough
Good eno-o-ough
Good eno-o-ough
Good eno-o-ough
Good eno-o-ough
Good eno-o-ough
Good eno-o-ough
He say that I'm good enough, grabbin' my duh-duh-duh
Think about shit that I shouldn't have (huh)
So I tell him it's one of me, he makin' fun of me (ha-ha)
His girl is a bum to me (grrah)
Like that boy is a cap
Sayin' he home, but I know where he at, like
Bet he blowin' her back
Thinkin' 'bout me 'cause he know that ass fat (damn)
And it been what it been (huh)
Callin' his phone like, "Yo, send me your pin"
Duckin' my shit, 'cause he know what I'm on (grrah)
But when he hit me, I'm not gon' respond (grrah)
But I don't sleep enough without you
And I can't eat enough without you (huh)
If you don't speak, does that mean we're through? (Huh)
Don't like sneaky shit that you do (grrah)
The, the boy's a liar
The boy's a liar
He doesn't see ya
You're not lookin' at me, boy
The boy's a liar
The boy's a liar
He doesn't see ya
You're not lookin' at me, boy
Good eno-o-ough
Good eno-o-ough
Good eno-o-ough
Good eno-o-ough
Good eno-o-ough
Good eno-o-ough
Good eno-o-ough
Good eno-o-ough
Creepin'
Metro Boomin, The Weeknd & 21 Savage
Oooh
Just can't believe this man
(Metro Boomin want some more nigga)
Somebody said they saw you
The person you were kissing wasn't me
And I would never ask you, I just kept it to myself
I don't wanna know, if you're playing me
Keep it on the low
Cause my heart can't take it anymore
And if you creeping, please don't let it show
Oh baby, I don't wanna know
I think about it when I hold you
When looking in your eyes, I can't believe
And I don't need to know the truth
But baby keep it to yourself
I don't wanna know, if you're playing me
Keep it on the low
Cause my heart can't take it anymore
And if you creeping, please don't let it show
Oh baby, I don't wanna know
Did he touch you better than me? (touch you better than me?)
Did he watch you fall asleep (watch you fall asleep?)
Did you show him all those things that you used to do to me?
If you're better off that way (better off that way)
Baby all that I can say (all that I can say)
If you're gonna do your thing, then don't come back to me
Ooh
Woah, woah, woah
21
Had me crushing, I was cuffing like the precinct
How you go from housewife to a sneaky link
Got you ridin round in all types of benz's and rovers
Girl you used to ride in a rinky dink
I'm the one put you in Eliante (on God)
Fashion Nova model, I put you on the runway (on God)
You was rocking coach bags, got you chanaynay
Side bitch in frisco, I call her my bae bae (21)
I got a girl but I still feel alone
If you playing me that mean my home aint home
Having nightmares of going through your phone (21)
Can't even record you got me out my zone
I don't wanna know, if you're playing me
Keep it on the low
Cause my heart can't take it anymore
And if you creeping, please don't let it show
Oh baby
I don't wanna know, if you're playing me
Keep it on the low
Cause my heart can't take it anymore
And if you creeping, please don't let it show
Oh baby I don't wanna know
If you creeping just don't let me find out (on God)
Get a hotel never bring him to the house (on God)
If you're better off that way (better off that way)
Baby all that I can say (all that I can say)
If you're gonna do your thing, then don't come back to me
Last Night
Morgan Wallen
Last night we let the liquor talk
I can't remember everything we said but we said it all
You told me that you wish I was somebody you never met
But baby, baby somethin's tellin' me this ain't over yet
No way it was our last night
I kiss your lips
Make you grip the sheets with your fingertips
Last bottle of Jack we split a fifth
Just talk about life goin' sip for sip
You, you know you love to fight
And I say shit I don't mean
But I'm still gon' wake up wantin' you and me
I know that last night we let the liquor talk
I can't remember everything we said but we said it all
You told me that you wish I was somebody you never met
But baby, baby somethin's tellin' me this ain't over yet
No way it was our last night
No way it was our last night
No way it was the last night that we break up
I see your tail lights in the dust
You call your momma, I call your bluff
In the middle of the night, pull it right back up
Yeah my, my friends say let her go
Your friends say what the hell
I wouldn't trade your kind of love for nothin' else
Oh baby, last night we let the liquor talk
I can't remember everything we said but we said it all
You told me that you wish I was somebody you never met
But baby, baby somethin's tellin' me this ain't over yet
No way it was our last night, we said we'd had enough
I can't remember everything we said but we said too much
I know you packed your shit and slammed the door right before you left
But baby, baby somethin's tellin' me this ain't over yet
No way it was our last night
No way it was our last night
I know you said this time you really weren't coming back again
But baby, baby somethin's tellin' me this ain't over yet
No way it was our last night
No way it was our last night
Die For You
The Weeknd & Ariana Grande
I'm findin' ways to articulate the feelin' I'm goin' through
I just can't say I don't love you
'Cause I love you, yeah
It's hard for me to communicate the thoughts that I hold
But tonight, I'm gon' let you know
Let me tell the truth
Baby, let me tell the truth, yeah
You know what I'm thinkin', see it in your eyes
You hate that you want me, hate it when you cry
You're scared to be lonely, 'specially in the night
I'm scared that I'll miss you, happens every time
I don't want this feelin', I can't afford love
I try to find a reason to pull us apart
It ain't workin', 'cause you're perfect, and I know that you're worth it
I can't walk away, oh
Even though we're goin' through it
And it makes you feel alone
Just know that I would die for you
Baby, I would die for you, yeah
The distance and the time between us
It'll never change my mind
'Cause baby, I would die for you
Baby, I would die for you, yeah
I'm findin' ways to manipulate the feelin' you're goin' through
But, baby girl, I'm not blamin' you
Just don't blame me, too, yeah
'Cause I can't take this pain forever
And you won't find no one that's better
'Cause I'm right for you, babe
I think I'm right for you, babe
You know what I'm thinkin', see it in your eyes
You hate that you want me, hate it when you cry
It ain't workin', 'cause you're perfect, and I know that you're worth it
I can't walk away, oh
Even though we're goin' through it
And it makes you feel alone
Just know that I would die for you
Baby, I would die for you, yeah
The distance and the time between us
It'll never change my mind
'Cause baby, I would die for you, uh
Baby, I would die for you, yeah
I would die for you, I would lie for you
Keep it real with you, I would kill for you
My baby
I'm just sayin', yeah
I would die for you, I would lie for you
Keep it real with you, I would kill for you
My baby
Na-na-na, na-na-na, na-na, ooh
Even though we're goin' through it
And it makes you feel alone
Just know that I would die for you
Baby, I would die for you, yeah
The distance and the time between us
It'll never change my mind
'Cause baby, I would die for you
Baby, I would die for you, yeah (oh, babe)
Unholy
Sam Smith & Kim Petras
Mummy don't know daddy's getting hot
At the body shop, doing something unholy
He lucky, lucky, yeah (ooh)
He lucky, lucky, yeah (ye-yeah)
He lucky, lucky, yeah
He lucky, lucky, yeah
A lucky, lucky girl
She got married to a boy like you
She'd kick you out if she ever, ever knew
'Bout all the - you tell me that you do
Dirty, dirty boy
You know everyone is talking on the scene
I hear them whispering 'bout the places that you've been
And how you don't know how to keep your business clean
Mummy don't know daddy's getting hot
At the body shop, doing something unholy
He's sat back while she's dropping it, she be popping it
Yeah, she put it down slowly
Oh-ee-oh-ee-oh, he left his kids at
Ho-ee-oh-ee-ome, so he can get that
Mummy don't know daddy's getting hot
At the body shop, doing something unholy (woo)
Mmm, daddy, daddy, if you want it, drop the add'y (yuh)
Give me love, give me Fendi, my Balenciaga daddy
You gon' need to bag it up, 'cause I'm spending on Rodeo (woo)
You can watch me back it up, I'll be gone in the a.m
And he, he get me Prada, get me Miu Miu like Rihanna (ah)
He always call me 'cause I never cause no drama
And when you want it, baby, I know I got you covered
And when you need it, baby, just jump under the covers
Mummy don't know daddy's getting hot
At the body shop, doin' somethin' unholy
He's sat back while she's dropping it, she be popping it
Yeah, she put it down slowly
Oh-ee-oh-ee-oh, he left his kids at
Ho-ee-oh-ee-ome, so he can get that
Mummy don't know daddy's getting hot
At the body shop, doin' something unholy
Anti-Hero
Taylor Swift
I have this thing where I get older but just never wiser
Midnights become my afternoons
When my depression works the graveyard shift
All of the people I've ghosted stand there in the room
I should not be left to my own devices
They come with prices and vices
I end up in crisis (tale as old as time)
I wake up screaming from dreaming
One day I'll watch as you're leaving
'Cause you got tired of my scheming
(For the last time)
It's me, hi, I'm the problem, it's me
At tea time, everybody agrees
I'll stare directly at the sun but never in the mirror
It must be exhausting always rooting for the anti-hero
Sometimes I feel like everybody is a sexy baby
And I'm a monster on the hill
Too big to hang out, slowly lurching toward your favorite city
Pierced through the heart, but never killed
Did you hear my covert narcissism I disguise as altruism
Like some kind of congressman? (Tale as old as time)
I wake up screaming from dreaming
One day I'll watch as you're leaving
And life will lose all its meaning
(For the last time)
It's me, hi, I'm the problem, it's me (I'm the problem, it's me)
At tea time, everybody agrees
I'll stare directly at the sun but never in the mirror
It must be exhausting always rooting for the anti-hero
I have this dream my daughter in-law kills me for the money
She thinks I left them in the will
The family gathers 'round and reads it and then someone screams out
"She's laughing up at us from hell"
It's me, hi, I'm the problem, it's me
It's me, hi, I'm the problem, it's me
It's me, hi, everybody agrees, everybody agrees
It's me, hi (hi), I'm the problem, it's me (I'm the problem, it's me)
At tea (tea) time (time), everybody agrees (everybody agrees)
I'll stare directly at the sun but never in the mirror
It must be exhausting always rooting for the anti-hero
Cuff it
Beyonce
I wanna go missin'
I need a prescription
I wanna go higher, can I sit on top of you? (Ooh-la-la-la, la-la-la)
I wanna go where nobody's been (wanna go where nobody's been)
Have you ever had fun like this? (Have you ever had fun? Yeah)
We gon' fuck up the night, black lights
Spaceships fly (spaceships fly)
Yeah, unapologetic when we fuck up the night
Fuck up the night
We gettin' fucked up tonight
We gon' fuck up the night
Bet you you'll see far
Bet you you'll see stars
Bet you you'll elevate
Bet you you'll meet God
'Cause I feel like fallin' in love
I'm in the mood to fuck somethin' up
'Cause we gon' fuck up the night
What's in these four walls?
You sexy, my love (turn it up)
Don't miss this roll-call
Is you here or what? (Roll it up)
Yeah, show up, show up (show up, show up)
Po' up, po' up (po' up, po' up)
Uh, you Mr. Nasty, I'll clean it up
Go where nobody's been (wanna go where nobody's been)
Have you ever had fun like this? (Have you ever had fun? Yeah)
I wanna go missin'
I need a prescription
I wanna go higher, can I sit on top of you?
We gon' fuck up the night (funk it up, funk it up)
Black lights
Spaceships fly (spaceships fly)
Yeah, unapologetic when we fuck up the night (funk it up, funk it up)
Fuck up tonight
We gettin' fucked up tonight
We gon' fuck up the night
Bet you you'll see far
Bet you you'll see stars
I feel like fallin' in love (fallin' love)
I'm in the mood to fuck somethin' up (tonight, I'm fuckin' somethin' up, baby)
I need some drink in my cup (I need a drink)
Hey (pour me a drink)
I'm in the mood to fuck somethin' up (I'm in the mood to fuck somethin' up)
Bet you you'll elevate
Bet you you'll meet God
'Cause I feel like fallin' in love
I'm in the mood to fuck somethin' up
We gon' fuck up the night
Hypersonic, sex erotic
On my body, boy, you got it
Hit them 'draulics, while I ride it
Got me actin' hella thotty
So excited, so exotic
I'm a seasoned professional
Squeeze it, don't let it go
Tease it, no self control
I got time today (I got time today, I got time)
Oh, I (I got time today, I got time)
I can't wait to come out and play
Ooh, yeah you
Come and cuff it, cuff it, cuff it, cuff it, baby
While I buss it, buss it, buss it, for you baby, ayy
Oh, baby
Anywhere, any time
I don't mind, I don't mind
Yeah (I don't mind)
For you (all for you)
I'm backin' the truck up, huh (back that truck up)
For you (all for you, for you)
A bitch'll get fucked up, huh (I fucked her up)
For you (all for you)
I'm puttin' my cup up, huh (put my cup up, yeah)
For you (all for you, you)
'Cause we gon' fuck up the night
Take flight (woo), blindin' lights (yeah)
Fuck it up, fuck it up, fuck it up
(Unapologetic when we fuck up the night)
Bet you you'll see stars (night)
Bet you you'll go far (night)
Bet you you'll levitate (night)
Bet you you'll meet God (party people, roll up)
Ooh (yeah)
We gon' fuck up the night (huh, yeah)
Spaceships fly
Fuck it up, fuck it up
Just Wanna Rock
Lil Uzi Vert
Ah, ah, ah, ah
I just wanna rock
Body-ody, yeah (shake it down)
Damn
Damn
whoa!
This ain't what you want (Project, Project X)
This ain't what you want
This ain't what you want
Ha! 1600 block, I just wanna rock (shake it down)
I just wanna, ah, ah, ah, ah, ah, ah, ah
I just wanna rock, body-ody, yeah (shake it down)
Shawty got that body-ody, ah, ah, ah (shake it down)
Hit her once, no ties (shake it-shake it)
How the fuck you gon' kill my vibe? (Shake it down)
Stand on my money, don't know my size (shake it-shake it)
Pick them sides, and you better choose wisely (shake it-shake it down-down)
That's my high, one, two, three, four, throw up the five (shake it-shake it)
That's my high
Damn
Damn (one, two, three, four, throw up the five)
whoa!
This ain't what you want (Project, Project X)
This ain't what you want
This ain't what you want
Buh, buh, buh, buh (down, down)
Buh, buh, buh, buh (down, down)
Buh, buh, buh, buh (down, down)
Buh, buh, buh, buh (down, shake it down)
Buh, buh, buh, buh (down, down)
Buh, buh, buh, buh (down, down)
Buh, buh, buh, buh (down, down)
Buh, buh, buh, damn (down, shake it down)
Shake it down-down
Shake it-shake it down-down
Shake it down-down
Shake it-shake it down-down
Shake it down-down
Shake it-shake it down-down
Shake it down-down
Shake it-shake it-shake
As It Was
Harry Styles
Holdin' me back
Gravity's holdin' me back
I want you to hold out the palm of your hand
Why don't we leave it at that?
Nothin' to say
When everything gets in the way
Seems you cannot be replaced
And I'm the one who will stay, oh
In this world, it's just us
You know it's not the same as it was
In this world, it's just us
You know it's not the same as it was
As it was, as it was
You know it's not the same
Answer the phone
"Harry, you're no good alone
Why are you sittin' at home on the floor?
What kind of pills are you on?"
Ringin' the bell
And nobody's comin' to help
Your daddy lives by himself
He just wants to know that you're well, oh
In this world, it's just us
You know it's not the same as it was
In this world, it's just us
You know it's not the same as it was
As it was, as it was
You know it's not the same
Go home, get ahead, light-speed internet
I don't wanna talk about the way that it was
Leave America, two kids follow her
I don't wanna talk about who's doin' it first
As it was
You know it's not the same as it was
As it was, as it was
Thank God
Kane Brown With Katelyn Brown
I was lost
You found a way to bring me back
Needed forgiveness
You always gave me that
Girl, I'm a witness of your love
'Cause you don't be giving up
And it's crazy
How you saved me
Hand on the Bible
Don't know how I got you
But I couldn't ask for more
Girl, what we got's worth thanking God for
So, thank God
I get to wake up by your side
And thank God
Your hand fits perfectly in mine
And thank God
You loved me when you didn't have to
But you did and you do, and he knew
Thank God for giving me you
Thank God
Thank God for giving me you
Never thought I'd find an angel undercover
Who made a change to everything
From my heart to my last name
Hey, hard to tell
When he fell
That boy was Heaven sent
And every night
When I close my eyes
Before I say amen
I thank God
I get to wake up by your side
And thank God
Your hand fits perfectly in mine
And thank God
You loved me when you didn't have to
But you did and you do, and he knew
Thank God for giving me you
Thank God
Thank God, (yeah, yeah)
Thank God, (oh)
Hand on the bible
Don't know how I got you
I couldn't ask for more
Girl, what we got's worth thanking God for
So, thank God
I get to wake up by your side
And thank God
Your hand fits perfectly in mine
And thank God
You loved me when you didn't have to
But you did and you do and, he knew
Thank God for giving me you
Thank God, thank God
Thank God, yeah, yeah
Thank God
Thank God for giving me you
Thank God (oh, oh)
Yeah, thank God
Oh, thank God
Thank God for giving me you
Rich Flex
Drake & 21 Savage
Go buy a zip of w-, hit the club
Pay for like ten n- to get in, we crunk, lit in this b-, yeah
Know we walk around the world
Steppin' not givin a damn 'bout where our feet land at
Yeah, get your a- mushed, smooshed (6ix)
Yeah, 21, the biggest
Put a n- in the chicken wing
21, can you do somethin' for me? (21)
Can you hit a lil' rich flex for me? (21)
And 21, can you do somethin' for me? (21, 21)
Drop some bars to my - ex for me
And 21 (21), can you do somethin' for me? (Yeah)
Can you talk to the opps necks for me? (Okay)
21, do your thing, 21, do your thing (21)
Do your thing, 21, do your thing
Yellow diamonds in the watch, this sh- cost a lot
Never send a b- your dot, that's how you get shot
I DM in Vanish Mode, I do that sh- a lot
Took her panties off and this b- thicker than a plot
All my exes ain't nothin', them h- busted
If my opps ain't rappin', they ass duckin'
You ain't ready to pull the trigger, don't clutch it
I know you on your -, baby, can you -?
I'm a savage (21)
Smack her booty in Magic (21)
I'll slap a - with a ratchet
I might slap a tracker on his whip and get to addin'
Don't call me on Christmas Eve, b-, call your daddy (21)
Call your uncle (21), b-, don't call me (21)
Always in my ear, your h- a flea
Why my opps be posting guns and only use they feet? (21)
Paid like an athlete, I got-
All you h-
All of you h- need to remember who y'all talking to
It's the Slaughter Gang CEO
I got d- for you if I'm not working, girl
If I'm busy, then, f- no
You need to find you someone else to call
When your bank account get low
You need to find you someone
Ayy, ayy, ayy, ayy, ayy
I'm on that Slaughter Gang sh-, ayy, murder gang sh-
Ayy, Slaughter Gang sh-, ayy, murder gang sh-
Ayy, sticks and stones, chrome on chrome
That's just what a n- on
Internet clones, got 'em kissin' through the phone
Clickin' up so they don't feel alone, ayy
Nan' n- seein' me, I'm Young Money CMB
I used to roll with CMG, the house is not a BNB
The bad b- waitin' on a n- like I'm PnD
I'm steady pushing P, you n- pushing PTSD
I told her a- to kiss me in the club, f- a TMZ
I used to want a GMC, when Woe was doing BNE
We revvin' up and goin' on a run like the DMC
I layup with her for a couple days, then its BRB
You rappers like askin' if I f-, when you know we did
When you know we did
She came in heels, but she left out on her cozy sh-
Ayy, I'm livin every 24 like Kobe did
Shoutout to the 6ix, R.I.P the 8
Swear this sh- is getting ate, I'm on ten for the cake
Get a lot of love from 12, but I don't reciprocate
51 division stay patrolling when it's late
21 my addy, so the knife is on the gate
All the dawgs eating off a Baccarat plate
See Drake and they underestimate
Take it from a vet, that's a rookie a- mistake, ayy
Ah, what, what
Slaughter Gang sh-, ayy, murder gang sh-, ayy
Slaughter Gang sh-, ayy, murder gang sh-, ayy
(Slaughter gang sh-, ayy, murder gang sh-, ayy)
(And you got 'em)
Boy, look, you the m- man
Boy, you, ooh, you is the man, you hear me?
Thought You Should Know
Morgan Wallen
What's goin' on, mama?
Something just dawned on me
I ain't been home in some months
Been chasin' songs and women
Makin' some bad decisions
God knows I'm drinkin' too much
Yeah, I know you've been worrying 'bout me
You've been losin' sleep since '93
I thought you should know
That all those prayers you thought you wasted on me
Must've finally made their way on through
I thought you should know
I got me a new girl down there in Jefferson City, and
She lets me fish whenever I want to
Yeah, I'm still proud of where I came from
Still your only damn son
Can you believe I'm on the radio?
Just thought you should know, thought you should know, thought you should know
Oh, by the way, mama, didn't mean to ramble on ya
How's everything back at home?
Yeah, how's that garden comin'?
Is dad still doing dumb shit?
And how'd he keep you this long?
Yeah, I'm sorry that I called you so late
I just miss you, but anyways
I thought you should know
That all those prayers you thought you wasted on me
Must've finally made their way on through
I thought you should know
I got me a new girl down there in Jefferson City, and
She lets me fish whenever I want to
Yeah, I'm still proud of where I came from
Still your only damn son
Can you believe I'm on the radio?
Just thought you should know, thought you should know, thought you should know
Yeah, I know you've been worrying 'bout me
You've been losing sleep since '93
I thought you should know
That all those prayers you thought you wasted on me
Must've finally made their way on through
I thought you should know
That I really like this girl down in Jefferson City, and
Turns out she's a lot like you
Yeah, I'm still proud of where I came from
Still your only damn son
The bus is leavin' so I gotta roll
Just thought you should know, thought you should know, thought you should know
I thought you should know, thought you should know
I thought you should know, thought you should know, thought you should know
Rock And A Hard Place
Bailey Zimmerman
We've been swingin' and missin'
It ain't broke yet, but damn, it needs fixin'
Been a while since your kiss felt like kissin'
It's just different
We've been talkin' 'bout forever since we've been together
Somethin' 'bout a ring makes you think we're better off with
All this but we're caught in
Between a rock and a hard place
Red wine and mistakes
Tears rollin' down your face
When I walked out that door
And that's when I lost it
A midnight in Austin
Damn, I'm exhausted
What the hell's this all for?
Is this where it mends or it breaks?
Between a rock and a hard place
For the record, shit
Throwin' in the towel takes some effort
So I'd rather ride it out for better weather
Together
Between a rock and a hard place
Red wine and mistakes
Tears rollin' down your face
When I walked out that door
And that's when I lost it
A midnight in Austin
Damn, I'm exhausted
What the hell's this all for?
Is this where it mends or it breaks?
Between a rock and a hard place
We've been talkin' 'bout forever since we've been together
Something 'bout a ring makes you think we're better off with
All this but we're caught in
Between a rock and a hard place
Tears rollin' down your face
As I walked out that door
And that's when I lost it
Midnight in Austin
Damn, I'm exhausted
What the hell's this all for?
Is this where it mends or it breaks?
How much more of this can we take?
Players
Coi Leray
'Cause girls is players too, uh
Yeah, yeah
'Cause girls is players too (keep it player, baby)
'Cause girls is players too
Bitches gettin' money all around the world
'Cause girls is players too
What you know 'bout livin' on the top
Penthouse suites, lookin' down on the opps?
Took him for a test drive, left him on the lot
Time is money so I spent it on a watch, hol' on
Lil' titties showin' through the white tee
You can see the thong bussin' out my tight jeans (okay)
Rocks on my fingers like a nigga wifed me
Got another shorty? She ain't nothin' like me (yeah)
'Bout to catch another flight (yeah)
The apple bottom make 'em wanna bite (yeah)
I just wanna have a good night
I just wanna have a good night (hold up)
If you don't know now you know
If he broke then you gotta let him go
You could have anybody, eeny, miny, moe
'Cause when you a boss, you could do what you want
Yeah
'Cause girls is players too, uh
Yeah, yeah
'Cause girls is players too (keep it player, baby)
'Cause girls is players too
Bitches gettin' money all around the world
'Cause girls is players too
I go, on and on and on again
He blowin' up my phone but I'm ignorin' him
He thinkin' he the one, I got like four of him
Yeah, I'm sittin' first class like Valedictorian, uh
Came a long way from rag to riches
Five-star bitch, yeah, I taste so delicious
Let him lick the plate, yeah, I make him do the dishes
Now he on news talk 'cause a bitch went missin', sheesh
'Bout to catch another flight (yeah)
The apple bottom make 'em wanna bite (yeah)
I just wanna have a good night
I just wanna have a good night (hold up)
If you don't know now you know
If he broke then you gotta let him go
You could have anybody, eeny, miny, moe
'Cause when you a boss, you could do what you want
Yeah
'Cause girls is players too, uh
(And it's time that we let 'em know that)
'Cause girls is players too (keep it player, baby)
'Cause girls is players too
Bitches gettin' money all around the world
'Cause girls is players too
Under The Influence
Chris Brown
Get up, get up
Kiddominant on the beat, better run it back
Fuckin' Robitussin
I don't know why this shit got me lazy right now, yeah
Can't do Percocets or Molly
I'm turnin' one, tryna live it up here right, right, right
Baby, you can
Ride it, ooh, yeah
Bring it over to my place
And you be like
"Baby, who cares?"
But I know you care
Bring it over to my place
You don't know what you did, did to me
Your body lightweight speaks to me
I don't know what you did, did to me
Your body lightweight speaks to me
I can make it hurricane on it
Hunnid bands, make it rain on it
Tie it up, put a chain on it
Make you tattoo my name on it, oh
Make you cry like a baby, yeah
Let's GoPro and make a video, yeah
Make you cry like a baby, yeah
Let's GoPro and make a video
Oh, yeah, yeah, yeah, yeah
Baby, you can
Ride it, ooh, yeah
Bring it over to my place
And you be like
"Baby, who cares?"
But I know you care
Bring it over to my place
You don't know what you did, did to me
Your body lightweight speaks to me
I don't know what you did, did to me
Your body lightweight speaks to me
Baby, you can
Ride it, ooh, yeah
And you be like
"Baby, who cares?"
But I know you care
Calm Down
Rema & Selena Gomez
Baby, calm down, calm down
Girl, this your body e put my heart for lockdown
For lockdown, oh, lockdown
Girl, you sweet like Fanta o
Fanta o
If I tell you say I love you
No dey form yanga o, oh, yanga o
No tell me no, no, no, no
Whoa, whoa, whoa, whoa
Oh-oh-oh-oh-oh-oh-oh-oh-oh-oh-oh
Baby, come gimme your lo-lo-lo-lo-lo-lo-lo-lo-lo-lo-lo-lo-love
You got me like whoa-whoa-whoa-whoa-whoa-whoa-whoa-whoa-whoa
Shawty, come gimme your lo-lo-lo-lo-lo-lo-lo-lo-lo-lo-lo-lo-love, hmm
I see this fine girl, for my party, she wear yellow
Every other girl they dey do too much, but this girl mellow
Naim I dey find situation I go use take tell am hello
Finally, I find way to talk to the girl but she no wan' follow
Who you come dey form for? (Uhum)
Why you no wan' conform? (Uhum)
Then I start to feel her bum-bum, whoa (uhum)
But she dey gimme small-small, whoa
I know say she sabi pass that one (uhum)
But she feeling insecure
'Cause her friends go dey gum her like chewing gum (uhum)
Go dey gum her like chewing gum
Baby, calm down, calm down
Girl, this your body e put my heart for lockdown
For lockdown, oh, lockdown
Girl, you sweet like Fanta o
Fanta o
If I tell you say I love you
No dey form yanga o, oh, yanga o
No tell me no, no, no, no
Whoa, whoa, whoa, whoa
Oh-oh-oh-oh-oh-oh-oh-oh-oh-oh-oh
Baby, come gimme your lo-lo-lo-lo-lo-lo-lo-lo-lo-lo-lo-lo-love
You got me like whoa-whoa-whoa-whoa-whoa-whoa-whoa-whoa-whoa
Shawty, come gimme your lo-lo-lo-lo-lo-lo-lo-lo-lo-lo-lo-lo-love, hmm
As I reach my house I say make I rest small (make I rest small)
As me I wake up na she dey my mind day o (na she dey my mind day o)
Day one, day two, I no fit focus (I no fit focus)
Na so me I call am, say make we link up (say make we link up)
As I start to dey tell her how I feel, all my heart dey race
Baby girl, if you leave me, I no go love again
Because e get many girls wey put my heart for pain
Shebi, you feel my pain
Baby, calm down, calm down
Girl, this your body e put my heart for lockdown
For lockdown, oh, lockdown
Girl, you sweet like Fanta o
Fanta o
If I tell you say I love you
No dey form yanga o, oh, yanga o
No tell me no, no, no, no
Whoa, whoa, whoa, whoa
Oh-oh-oh-oh-oh-oh-oh-oh-oh-oh-oh
Baby, come gimme your lo-lo-lo-lo-lo-lo-lo-lo-lo-lo-lo-lo-love
You got me like whoa-whoa-whoa-whoa-whoa-whoa-whoa-whoa-whoa
Shawty, come gimme your lo-lo-lo-lo-lo-lo-lo-lo-lo-lo-lo-lo-love, hmm
You Proof
Morgan Wallen
Yeah, I've been throwin' down the whiskey
I oughta get my money back
And someone said it drowns a memory
Ah, but it ain't doing jack
Yeah, I've been sippin', I've been buzzin'
Shootin' doubles like it's nothin'
Ah, but nothin' makes you go away
I need something you proof
Somethin' stronger than I'm used to
Yeah, I've been pourin' ninety to a hundred
Feel like nothing's gonna cut it, that's the hard truth
Yeah, I need something you proof
Oh, I need something you proof
Poured 'em up 'til they're shuttin' 'em down, yeah
You never ain't not around, yeah
Don't matter what time, what town
I can't get you gone
Turn the bar, yeah, upside down
Just looking for somethin' that does it
I give 'em all my money
Ain't nobody sellin' nothing you proof
Somethin' stronger than I'm used to
Yeah, I've been pourin' ninety to a hundred
Feel like nothing's gonna cut it, that's the hard truth
Yeah, I need something you proof
Oh, I need something you proof
Hey, I've been mixing liquors tryin' to get you gone
Ah, but I must be doin' somethin' wrong
'Cause I've been working hard to fade your memory
Baby, but the only thing faded is me
I need something you proof
Somethin' stronger than I'm used to
Yeah, I've been pourin' ninety to a hundred
Feel like nothing's gonna cut it, that's the hard truth
I need something you proof (Poured 'em up 'til they're shuttin' 'em down, yeah)
Ah, I need something you proof (you never ain't not around)
(Don't matter what time, what town)
(I can't get you gone)
(Turn the bar, yeah, upside down)
(Just looking for somethin' that does it)
(I'll give 'em all my money)
(Ain't nobody selling nothing you proof)
Made You Look
Meghan Trainor
I could have my Gucci on
I could wear my Louis Vuitton
But even with nothin' on
Bet I made you look (I made you look)
I'll make you double take soon as I walk away
Call up your chiropractor just in case your neck break
Ooh, tell me what you, what you, what you gon' do, ooh
'Cause I'm 'bout to make a scene, double up that sunscreen
I'm 'bout to turn the heat up, gonna make your glasses steam
Ooh, tell me what you, what you, what you gon' do, ooh
When I do my walk, walk (oh)
I can guarantee your jaw will drop, drop (oh)
'Cause they don't make a lot of what I got, got (ah, ah)
Ladies if you feel me, this your bop, bop (bop-bop-bop)
I could have my Gucci on (Gucci on)
I could wear my Louis Vuitton
But even with nothin' on
Bet I made you look (I made you look)
Yeah, I look good in my Versace dress (take it off)
But I'm hotter when my morning hair's a mess
'Cause even with my hoodie on
Bet I made you look (I made you look)
Mhm-hm-hm
And once you get a taste (woo), you'll never be the same
This ain't that ordinary, this that 14 karat cake
Ooh, tell me what you, what you, what you gon' do, ooh (what you gon' do, ooh, ooh)
When I do my walk, walk (oh)
I can guarantee your jaw will drop, drop (oh) (I guarantee your jaw will drop, drop)
'Cause they don't make a lot of what I got, got (ah, ah)
Ladies if you feel me, this your bop, bop (bop-bop-bop)
Oh, I could have my Gucci on (Gucci on)
I could wear my Louis Vuitton
But even with nothin' on
Bet I made you look (said, I made you look)
Yeah, I look good in my Versace dress (take it off, baby)
But I'm hotter when my morning hair's a mess
'Cause even with my hoodie on
Bet I made you look (said, I made you look)
Lavender Haze
Taylor Swift
Meet me at midnight
(Oh, ooh, oh-oh, oh, ooh, oh-oh, oh, ooh, oh-oh, whoa, whoa, whoa, whoa, whoa)
Staring at the ceiling with you
Oh, you don't ever say too much
And you don't really read into
My melancholia
I've been under scrutiny (yeah, oh, yeah)
You handle it beautifully (yeah, oh, yeah)
All this shit is new to me (yeah, oh, yeah)
I feel the lavender haze creeping up on me
Surreal, I'm damned if I do give a damn what people say
No deal, the 1950s shit they want from me
I just wanna stay in that lavender haze
(Oh, ooh, oh-oh, oh, ooh, oh-oh, whoa, whoa, whoa, whoa, whoa)
All they keep asking me (all they keep asking me)
Is if I'm gonna be your bride
The only kind of girl they see (only kind of girl they see)
Is a one night or a wife
I find it dizzying (yeah, oh, yeah)
They're bringing up my history (yeah, oh, yeah)
But you aren't even listening (yeah, oh, yeah)
I feel the lavender haze creeping up on me
Surreal, I'm damned if I do give a damn what people say
No deal, the 1950s shit they want from me
I just wanna stay in that lavender haze (oh, ooh, oh-oh, oh, ooh, oh-oh, whoa, whoa, whoa, whoa, whoa)
That lavender haze
Talk your talk and go viral
I just need this love spiral
Get it off your chest
Get it off my desk (get it off my desk)
Talk your talk and go viral
I just need this love spiral
Get it off your chest
Get it off my desk
I feel (I feel) the lavender haze creeping up on me
Surreal, I'm damned if I do give a damn what people say
No deal (no deal), the 1950s shit they want from me
I just wanna stay in that lavender haze
(Oh, ooh, oh-oh, oh, ooh, oh-oh, whoa, whoa, whoa, whoa, whoa)
Get it off your chest
Get it off my desk
That lavender haze
I just wanna stay
I just wanna stay
In that lavender haze
Escapism
RAYE Featuring 070 Shake
Sleazin' and teasin', I'm sittin' on him
All of my diamonds are drippin' on him
I met him at the bar, it was 12 or somethin'
I ordered two more wines, 'cause tonight, I want him
A little context if you care to listen
I find myself in a shit position
The man that I love sat me down last night
And he told me that it's over, dumb decision
And I don't wanna feel how my heart is rippin'
In fact, I don't wanna feel, so I stick to sippin'
And I'm out on the town with a simple mission
In my little black dress, and this shit is sittin'
Just a heart broke bitch, high heels, six inch
In the back of the nightclub, sippin' champagne
I don't trust any of these bitches I'm with
In the back of the taxi, sniffin' cocaine
Drunk calls, drunk texts, drunk tears, drunk sex
I was lookin' for a man who was on the same page
Now it's back to the intro, back to the bar
To the Bentley, to the hotel, to my old ways
'Cause I don't wanna feel how I did last night
I don't wanna feel how I did last night
Doctor, doctor, anything, please
Doctor, doctor, have mercy on me, take this pain away
You're asking me my symptoms, doctor, I don't wanna feel
Toke this joint how I'm blowin' this steam
Back to my ways like 2019
Not 24 hours since my ex did dead it
I got a new man on me, it's about to get sweaty
Last night really was the cherry on the cake
Been some dark days lately and I'm finding it crippling
Excuse my state, I'm as high as your hopes
That you'll make it to my bed, get me hot and sizzling
If I take a step back to see the glass half full
At least it's the Prada two-piece that I'm trippin' in
And I'm already actin' like a dick, know what I mean?
So you might as well stick it in
Just a heart broke bitch, high heels, six inch
In the back of the nightclub, sippin' champagne
I don't trust any of these bitches I'm with
In the back of the taxi, sniffin' cocaine
Drunk calls, drunk texts, drunk tears, drunk sex
I was lookin' for a man who was on the same page
Now it's back to the intro, back to the bar
To the Bentley, to the hotel, to my old ways
'Cause I don't wanna feel how I did last night
I don't wanna feel how I did last night
Doctor, doctor, anything, please
Doctor, doctor, have mercy on me, take this pain away
You're asking me my symptoms, doctor, I don't wanna feel, mm (what?)
'Cause I don't wanna feel like I felt last night
I don't wanna feel like I felt last night
Be at peace with the things you can't change (last night)
I'll be naked when I leave and I was naked when I came, yeah
Out of reach, out of touch, too numb, I don't feel no way
Toast up, so what? Street small, but it go both ways
So you'll run, yeah, but you'll never escape
Sunset in the maze (you're asking me my symptoms, doctor, I don't wanna feel)
I don't wanna feel how I did last night
I don't wanna feel how I did last night, oh
Doctor, doctor, anything, please
Doctor, doctor, have mercy on me
You're asking me my symptoms, doctor, I don't wanna feel
I don't wanna feel how I did last night
I don't wanna feel how I did last night
I don't wanna feel how I did last night
Mm, lipstick smudged like modern art
I don't know where the fuck I am or who's drivin' the fuckin' car
Speedin' down the highway, sippin'
Mixin' pills with the liquor 'cah fuck these feelings
I left everyone I love on read (uh-huh)
Spilling secrets to the stranger in my bed (uh-huh)
I remember nothing, so there's nothing to regret (uh-huh)
Other than this 4-4 kick drum poundin' in my head
Going, Going, Gone
Luke Combs
Some things in life are meant to fly
And others, they were born to run
You can't tie them up and leavin'
Like the changing of the seasons
Good things, they come and then they go
Like a runaway Southbound train
Like an Arizona desert rain
Like lightning in the sky
Like fireworks in July
Like a left field homerun ball
Like a whiskey shot at last call
It's like she was made for moving on
That girl is going, going, gone
I can say it wasn't meant to be
But maybe meant to be is misunderstood
I can't hold on to letting go
Change the way the river flows
Lovin' her's like roping in the wind
Like a runaway Southbound train
Like an Arizona desert rain
Like lightning in the sky
Like fireworks in July
Like a left field homerun ball
Like a whiskey shot at last call
It's like she was made for moving on
That girl is going, going, gone
She ain't got one bit of stick around
There's no sense in tryin' to slow her down
Like a runaway Southbound train
Like an Arizona desert rain
Like lightning in the sky
Like fireworks in July
Like a left field homerun ball
Like a whiskey shot at last call
It's like she was made for moving on
That girl is going, going, gone
Going, going, gone
Going, going, gone
Superhero (Heroes & Villains)
Metro Boomin, Future & Chris Brown
Drankin' dope turned me to a superhero, yeah, yeah
Hit that pill, turned me to a superhero, yeah, yeah
Boominati turned me to a superhero, yeah, yeah (Metro)
(If Young Metro don't trust you, I'm gon' shoot you)
I'm on that dope again, I'm in that flow again
Switch up the flow again, yeah, yeah
Flyer than a parachute, gripping that pole again
I'm on that oil again, yeah, yeah
Candy in the cup, gotta get paid
King in the streets, young nigga made
Sprayin' up the crowd, take it to the grave
Ain't havin' problems, I'm sippin' the bar
Shoutout to Dallas, my bitch is a star
Nigga get rich, ready take you to war
Piss on your casket, shoot at your broad
Do you something nasty, roll you in a 'gar
Bitch get graphic, fuck me in a car
I get you a brand new Rollie tomorrow
I put that brand new Rollie on your arm
Ain't movin' slow but I'm still on oil
Tennis bracelets and they came with the frost
Cuban links all the way up to your jaw
Step up the swag when I step on a broad
Two dollar half, ooh, that's the cheapest one
Stackin' these hundreds up, like coupons
Told you from the beginning, upper echelon
And I get to stackin' up, I'm untouchable
I get to represent, money multiple
I'm at the top of the charts, unapproachable
Bread by the loaf, turbo the motor
Tic-tac-toe, killing all the vultures
Selling elbows, bitch do yoga
I deserve awards, serving these boulders
A hundred grand large when I shop, that's the total
Fill up the garage, bitch, I'm a mogul
Ain't no facadin', ain't no fugazi
I drop it off, I get paid
Drop top Royce, I'm going crazy
I push off, smoking on haze
Not tryna floss, Cartier shades
Candy in the cup, gotta get paid
King in the streets, young nigga made
Sprayin' up the crowd, take it to the grave
Ain't havin' problems, I'm sippin' the bar
Shoutout to Dallas, my bitch is a star
Nigga get rich, ready take you to war
Piss on your casket, shoot at your broad
Do you something nasty, roll you in a 'gar
Bitch get graphic, fuck me in a car
I get you a brand new Rollie tomorrow
I put that brand new Rollie on your arm
Ain't movin' slow but I'm still on oil
Tennis bracelets and they came with the frost
Cuban links all the way up to your jaw
Step up the swag when I step on a broad
Dark Knight feeling, die or be a hero
Or live long enough to see yourself become a villain
Soon as you up, these niggas wanna bring you down
The weight of the world sit on my shoulders, hold the crown
I ain't got a cape so I can't save you now
Niggas wanna hate, rather see you drown (yeah)
And the world keep spinnin' (yeah)
Like I'm the only one in it (am I the only one?) Why?
They don't wanna see you winnin' (no, no, no, no)
So who's really the villain? (Yeah)
(Whooo, who's the villain? Who's the villain?)
(Live long enough to see yourself become a villain)
Something In The Orange
Zach Bryan
It'll be fine by dusk light I'm telling you, baby
These things eat at your bones and drive your young mind crazy
But when you place your head between my collar and jaw
I don't know much but there's no weight at all
And I'm damned if I do and I'm damned if I don't
'Cause if I say I miss you I know that you won't
But I miss you in the mornings when I see the sun
Something in the orange tells me we're not done
To you I'm just a man, to me you're all I am
Where the hell am I supposed to go?
I poisoned myself again
Something in the orange tells me you're never coming home
I need to hear you say you've been waitin' all night
There's orange dancing in your eyes from bulb light
Your voice only trembles when you try to speak
Take me back to us dancing, this wood used to creak
To you I'm just a man, to me you're all I am
Where the hell am I supposed to go?
I poisoned myself again
Something in the orange tells me you're never coming home
To you I'm just a man, to me you're all I am
Where the hell am I supposed to go?
I poisoned myself again
Something in the orange tells me you're never coming home
If you leave today, I'll just stare at the way
The orange touches all things around
The grass, trees and dew, how I just hate you
Please turn those headlights around
Please turn those headlights around
Golden Hour
JVKE
It was just two lovers
Sittin' in the car, listening to Blonde
Fallin' for each other
Pink and orange skies, feelin' super childish
No Donald Glover
Missed call from my mother
Like, "Where you at tonight?" Got no alibi
I was all alone with the love of my life
She's got glitter for skin
My radiant beam in the night
I don't need no light to see you
Shine
It's your golden hour (oh)
You slow down time
In your golden hour (oh)
We were just two lovers
Feet up on the dash, drivin' nowhere fast
Burnin' through the summer
Radio on blast, make the moment last
She got solar power
Minutes feel like hours
She knew she was the baddest, can you even imagine
Fallin' like I did?
For the love of my life
She's got glow on her face
A glorious look in her eyes
My angel of light
I was all alone with the love of my life
She's got glitter for skin
My radiant beam in the night
I don't need no light to see you
Shine
It's your golden hour (oh)
You slow down time
In your golden hour (oh)
Sure Thing
Miguel
Love you like a brother
Treat you like a friend
Respect you like a lover
Oh, oh, oh, oh, oh, oh
You could bet that, never gotta sweat that (oh, oh, oh, oh, oh)
You could bet that, never gotta sweat that (yeah, yeah, yeah)
You could bet that, never gotta sweat that
You could bet that, never gotta sweat that (yeah)
If you be the cash
I'll be the rubber band
You be the match
I will be a fuse, boom
Painter, baby, you could be the muse
I'm the reporter, baby, you could be the news
'Cause you're the cigarette and I'm the smoker
We raise a bet 'cause you're the joker
Checked off, you are the chalk
And I can be the blackboard
You can be the talk
And I can be the walk, yeah
Even when the sky comes falling
Even when the sun don't shine
I got faith in you and I
So put your pretty little hand in mine
Even when we're down to the wire, babe
Even when it's do or die
We could do it, baby, simple and plain
'Cause this love is a sure thing
You could bet that, never gotta sweat that (yeah, yeah, yeah)
You could bet that, never gotta sweat that
You could bet that, never gotta sweat that
You could bet that, never gotta sweat that
You could be the lover, I'll be the fighter, babe
If I'm the blunt (uh), you could be the lighter, babe
Fire it up
Writer, baby, you could be the quote, yeah (uh)
If I'm the lyric, baby, you could be the note (uh), record that
Saint I'm a sinner (uh), prize I'm a winner (uh)
And it's you, what did I do to deserve that?
Paper, baby, I'll be the pen
Say that I'm the one 'cause you are a ten
Real and not pretend
Even when the sky comes falling (yeah, yeah, yeah, yeah, yeah)
Even when the sun don't shine (yeah)
I got faith in you and I
So put your pretty little hand in mine
Even when we're (you could bet that, never gotta sweat that)
Down to the wire, baby
Even when it's do or die (you could bet that, never gotta sweat that)
(You could bet that, never gotta sweat that)
We could do it baby, simple and plain
(You could bet that, never gotta sweat that)
'Cause this love is a sure thing
Uh, now rock with me, baby
Let me hold you in my arms
Talk with me babe, yeah, yeah
Uh, now rock with me baby
Let me hold you in my arms
Talk with me babe, yeah, yeah
This love between you and I is simple as pie, baby
Yeah, it's such a sure thing (it's such a sure thing)
Oh, it such a sure thing (it's such a sure thing)
Even when (you could bet that, never gotta sweat that)
The sky comes falling
Even when (you could bet that, never gotta sweat that)
The sun don't shine
(You could bet that, never gotta sweat that)
I got faith in you and I
So put your pretty little hand in mine (you could bet that, never gotta sweat that)
Even when (you could bet that, never gotta sweat that)
We're down to the wire, babe
Even when (you could bet that, never gotta sweat that)
It's do or die (you could bet that, never gotta sweat that)
We could do it, baby, simple and plain
(You could bet that, never gotta sweat that)
'Cause this love is a sure thing
Love you like a brother (you could bet that, never gotta sweat that)
Treat you like a friend (you could bet that, never gotta sweat that)
Respect you like a lover (you could bet that, never gotta sweat that)
Oh, oh, oh, oh, oh, oh
Heart Like A Truck
Lainey Wilson
I never stay in one place too long
A dirt road's singing me a siren song
I gotta find a field
I need to spin my wheels
I got a hankering for four wide tires
And I can't help it it's the way I'm wired
'Fore you get too close
Boy you need to know
I got a heart like a truck
It's been drug through the mud
Runs on dreams and gasoline
And that ole highway holds the key
It's got a lead foot down when it's leaving
Lord knows it's taken a hell of a beating
A little bit of love is all that it's needing
But it's good as it is tough
I got a heart like a truck
There ain't no breaking when I throw it in drive
Don't always keep it in between the lines
If you're ready for a ride pedal down state of mind
Boy I tell you what
You better buckle up
I got a heart like a truck
It's been drug through the mud
Runs on dreams and gasoline
And that ole highway holds the key
It's got a lead foot down when it's leaving
Lord knows it's taken a hell of a beating
A little bit of love is all that it's needing
But it's good as it is tough
I got a heart like a truck
Go on and see if you can knock off the dust yea
Shine it up revv it up and let it run yea
It gets a high riding off into the sun yea
I got a heart like a truck
It's been drug through the mud
Runs on dreams and gasoline
And that ole highway holds the key
It's got a lead foot down when it's leaving
Lord knows it's taken a hell of a beating
A little bit of love is all that it's needing
But it's good as it is tough
I got a heart like a truck
Go on and see if you can knock off the dust yea
Shine it up revv it up and let it run yea
It gets a high riding off into the sun yea
It gets a high riding off into the sun
Bad Habit
Steve Lacy
I wish I knew, I wish I knew you wanted me
I wish I knew, I wish I knew you wanted me
I wish I knew, I wish I knew you wanted me
What you, ooh, uh, what you do?
Made a move, coulda made a move
If I knew I'd be with you
Is it too late to pursue?
I bite my tongue, it's a bad habit
Kinda mad that I didn't take a stab at it
Thought you were too good for me, my dear
Never gave me time of day, my dear
It's okay, things happen for
Reasons that I think are sure, yeah
I wish I knew, I wish I knew you wanted me
I wish I knew (oh), I wish I knew you wanted me
I wish I knew (yeah), I wish I knew you wanted me (oh)
I wish I knew, I wish I knew you wanted me
Say to me (please just say to me)
If you still want it
I wish you wouldn't play with me
I wanna know (oh no)
Uh, can I bite your tongue like my bad habit?
Would you mind if I tried to make a pass at it?
Were you not too good for me, my dear?
Funny you come back to me, my dear
It's okay, things happen for
Reasons that I can't ignore, yeah
I wish I knew, I wish I knew you wanted me
I wish I knew (wish I knew), I wish I knew you wanted me (oh)
You can't surprise a Gemini
I'm everywhere, I'm cross-eyed, and
Now that you're back, I can't decide
If I decide if you're invited
You always knew the way to wow me
Fuck around, get tongue-tied, and
I turn it on, I make it rowdy
Then carry on, but I'm not hidin'
You grabbin' me hard 'cause you know what you found
Is biscuits, is gravy, babe, ah-ah
You can't surprise a Gemini
But you know it's biscuits, is gravy, babe
I knew you'd come back around
'Cause you know it's biscuits, it's gravy, babe
Let's fuck in the back of the mall, lose control
Go stupid, go crazy, babe
I know I'll be in your heart 'til the end
You'll miss me, don't beg me, babe
Until I Found You
Stephen Sanchez
Georgia, wrap me up in all your-
I want you in my arms
Oh, let me hold you
I'll never let you go again like I did
Oh, I used to say
"I would never fall in love again until I found her"
I said, "I would never fall unless it's you I fall into"
I was lost within the darkness, but then I found her
I found you
Heaven, when I held you again
How could we ever just be friends?
I would rather die than let you go
Juliet to your Romeo, how I heard you say
"I would never fall in love again until I found her"
I said, "I would never fall unless it's you I fall into"
I was lost within the darkness, but then I found her
I found you
I would never fall in love again until I found her
I said, "I would never fall unless it's you I fall into"
I was lost within the darkness, but then I found her
I found you
Shirt
SZA
Kiss me, dangerous
Been so lost without you all around me
Get anxious
Lead me, don't look back, it's all about you
In the dark right now
Feeling lost, but I like it
Comfort in my sins, and all about me
All I got right now
Feel the taste of resentment
Simmer in my skin, it's all about
Blood stain on my shirt, new bitch on my nerves
Old nigga got curved, going back on my word
Damn, bitch, you so thirsty
Still don't know my worth, still stressing perfection
Let you all in my mental, got me looking too desperate
Damn, you ain't deserve
Broad day, sunshine, I'll find a way to fuck it up still
Can't cry about the shit that I can't change
Just my mind, gotta get outta here
Tough crowd, I hate it, can't stay
In the dark right now
Feeling lost, but I like it
Comfort in my sins, and all about me
All I got right now
Feel the taste of resentment
Simmer in my skin, it's all about
Blood stain on my shirt, new bitch on my nerves
Old nigga got curved, going back on my word
Damn, bitch, you so thirsty
Still don't know my worth, still stressing perfection
Let you all in my mental, got me looking too desperate
Damn
It's what you say and how you do me
How I'm 'posed to trust, baby? 'Posed to love?
It ain't 'posed to hurt this way, all I need is the best of you
How I got to say it? Give me all of you
In the dark right now
Feeling lost, but I like it
Comfort in my sins, and all about me
All I got right now
Feel the taste of resentment
Simmer in my skin, it's all about
Blood stain on my shirt, new bitch on my nerves
Old nigga got curved, going back on my word
Damn, bitch, you so thirsty
Still don't know my worth, still stressing perfection
Let you all in my mental, got me looking too desperate
Damn, you ain't deserve
Snooze
SZA
I'll touch that fire for you
I do that three four times again, I testify for you
I told that lie, I'd kill that bitch
I do what all of them around you scared to do, I'm not
Long as you juggin' out here for me, I got it
Mobbin', schemin', lootin', hide your bodies
Long as you dreamin' 'bout me, ain't no problem
I don't got nobody, just with you right now
Tell the truth, I look better under you
I can't lose when I'm with you
How can I snooze and miss the moment?
You just too important
Nobody do body like you do
I can't lose when I'm with you
I can't just snooze and miss the moment
You just too important
Nobody do body like you do, you do
In the droptop ride with you, I feel like Scarface (Scarface)
Like that white bitch with the bob, I'll be your main one (main one)
Let's take this argument back up to place
Sex remind you, I'm nonviolent, I'm your day one
We ain't had shit, yeah, it was magic, yeah
Smash and grab shit, yeah
Nasty habits take a hold when you not here
Ain't a home when you not here
Hard to grow when you not here, I'm sayin'
I can't lose when I'm with you
How can I snooze and miss the moment?
You just too important
Nobody do body like you do
I can't lose when I'm with you
How can I snooze and miss the moment?
You just too important
Nobody do body like you do, you do
Main one ridin'
How you frontin' on me and I'm the main one tryin'?
How you blame it on me and you the main one lyin'?
How you threatenin' to leave and I'm the main one cryin'?
Just tryna be your everything
Main one ridin'
How you frontin' on me and I'm the main one tryin'?
How you blame it on me and you the main one lyin'?
How you threatenin' to leave and I'm the main one cryin'?
I can't lose when I'm with you, ooh
How can I snooze and miss the moment?
You just too important
Nobody do body like you do
I can't lose when I'm with you
How can I snooze and miss the moment?
You just too important
Nobody do body like you do, you do
Nah, nah, nah, nah
I think I know, whoa-oh
See, no, I can't lose
I think I know, ooh-whoa, ooh-whoa-oh
Painting Pictures
Superstar Pride
Holding back, no
Go without you
I don't wanna be with you
Uh, uh (ayy, 40, what that do?)
Lil Wooski ain't your average teen, he see the opps, gon' bang it out
He know exactly how to hit they block like that's his favorite route
Ayy, lil' nigga, if you send some shots, you better make it count
RPs touch his brain cells, rearrange his scalp
They killed Lamp
He took three with him, they all know what your name about
Ain't see you in so long, it's like your voice is slowly fadin' out
Love my niggas right or wrong, no, I would never trade 'em out
Ain't thuggin' 'cause he mean it, these lil' niggas in a race for clout
Block died, we threw up X's, TimTim died, we threw up T's
Twin 'nem died, we bangin' L's, I wish that they ain't never leave
For Killer, K's to the brain, I'm scorin' if I catch the G's
Forever stuntin' like I'm Durb, I keep a few tricks up my sleeve
God, I know I'm nothin' like them
I'm just different, I'm built different
I feel as if we all got a purpose and we all special in some way
But me, my potential is unmatched
I'm the chosen one
But only time'll tell, long live the guys
Uh
Say, "Fuck the opposites, " but deep down, we really all alike
Since the elementary, our elders gave us small advice
This shit could lead to death and jail, it's crazy how they all was right
I was playin' AAU with Ed, it was ball is life
Wish I could've warned him in advance 'fore he lost his life
Game time, call troopers for backup before we call for Christ
Was real to him whole time while he sharpened his knife
He put it in my back for racks, ain't know his heart had a price
Everything done in the dark gon' come to light
Tryna choose your thoughts over your feelings, that's the hardest fight
Had to prove 'em wrong, they said my dreams was out of sight
Now it's 50K up in my jeans on chartered flights
And we rock the hardest ice
Holding back, no
Go without you
I don't wanna be with you, yeah
Wait For U
Future Featuring Drake & Terms
Early in the morning, late at night (I will wait for you)
It don't even matter what time it is (I will wait for you)
Presidential Rollie already on the way (higher, sayin', "Aye, yi, yi, yi")
Whenever I find time, it's okay (ayy)
(ATL Jacob, ATL Jacob)
You pray for my demons, girl, I got you
Every time I sip on codeine, I get vulnerable
I'm knowin' the sounds of the storm when it come
She understand I can't take her everywhere a nigga going
I been in the field like the children of the corn
I can hear your tears when they drop over the phone
Get mad at yourself 'cause you can't leave me alone
Gossip, bein' messy, that ain't what we doing (world was ending)
Travel around the world (would you cry, or would you try to get me?)
Over the phone, dropping tears (tell me now, I want you be clear, yeah)
I get more vulnerable when I do drugs (tell me now, I need you to be clear, yeah)
When you drunk, you tell me exactly how you feel (I will wait for you, for you)
When I'm loaded, I keep it real (I will wait for you, I will wait for you)
Please tell a real one exactly what it is (I will wait, will wait, for you, for you)
Don't say it 'cause you know that's what I wanna hear (I will wait for you, I will wait for you)
Yeah, I been trapping 'round the world
I sit on my balcony and wonder how you feeling
I got a career that takes my time away from women
I cannot convince you that I love you for a living (will wait for you, for you)
I be on your line, feelings flowing like a river
You be texting back you at Kiki on the river (I will wait for you)
Message say "Delivered" (I will wait for you), but I know that you 'on't get it
Why you introduce us if you knew that you was with him? (I will wait for you, for you, for you)
Made me shake his hand when y'all been fucking for a minute (I will wait for you, for you)
Walk me off the plank because you know that I'm a swimmer (I will wait for you)
Supposed to be your dawg, but you done put me in a kennel
Girl, put a muzzle on it, all that barking over dinner
I was fucking with you when you had the tiny Presidential
You got better when you met me and that ain't coincidental
Tried to bring the best out you, guess I'm not that influential
Guess I'm not the one that's meant for you
I can hear your tears when they drop over the phone
Get mad at myself 'cause I can't leave you alone
Gossip, being messy, that ain't what we doing, yeah (world was ending)
Trapping around the world (would you cry, or would you try to get me?)
Over the phone, dropping tears (tell me now, I want you to be clear)
I get more vulnerable when I do pills (tell me now, I need you to be clear, yeah)
When you drunk, you tell me exactly how you feel (I will wait for you, for you)
When I'm loaded, I keep it real (I will wait for you, I will wait for you)
Please tell a real one exactly what it is (I will wait, will wait, for you, for you)
Don't say it 'cause you know that's what I wanna hear (I will wait for you, I will wait for you)
Early in the morning, late at night
It don't even matter what time it is
(World was ending, would you cry, or would you try to get me?)
(Tell me now, I want you to be clear, yeah)
(Tell me now)
The Kind Of Love We Make
Luke Combs
We've been burnin' both ends
Keepin' the lights on
So I've been thinkin' we need
A little time alone
So whatcha say we cancel our plans?
Tonight, I'm only gonna be your man
Let's get some candles burnin'
And some records turnin'
All the lights down low
Take it nice and slow
The way your body's movin'
Keep doin' what you're doin'
To me all night long
Writin' our love song
Girl, I want it, gotta have it
Let the passion take us to a higher place
Makin' the kind of love we make
Well, there ain't no way, baby
To get me out this house
When you look this good
What could I even think about? Oh
Besides turnin' round and lockin' the door
Watchin' your red dress fall to the floor
Let's get some candles burnin'
And some records turnin'
All the lights down low
Take it nice and slow
The way your body's movin'
Keep doin' what you're doin'
To me all night long
Writin' our love song
Girl, I want it, gotta have it
Let the passion take us to a higher place
Makin' the kind of love we make
Kind of love we make
So whatcha say we cancel our plans?
Tonight, I'm only gonna be your man
Let's get some candles burnin'
Some records turnin'
All the lights down low
Take it nice and slow
The way your body's movin'
Keep doin' what you're doin'
To me all night long
Writin' our love song
Girl, I want it, gotta have it
Let the passion take us to a higher place
Girl, I want it, gotta have it
Let the passion take us to a higher place
Makin' the kind of love we make
Kind of love we make
Makin' the kind of love we make
I Like You (A Happier Song)
Post Malone Featuring Doja Cat
Love You Anyway
Luke Combs
The parts of me I gave away
Was honestly a huge mistake
An apology from you'd be great
But nothing between us will ever be the same
Do you wanna see inside my heart?
Inside my heart
It's what you took from me at the start
To breaking it apart
Now I have to love you from far away
But I want you to stay in my arms
Why don't you wanna put down the blade?
I can't keep on hiding these scars
That you made, that you gave
That's why I love you from far away
That you gave, that you made
That's why I love you from far away
My family still brings up your name
So I threw our pictures in the fireplace
And watched them burn right through your face
While I cried my tears 'til no more came
Do you wanna see inside my heart?
Inside my heart
It's what you took from me at the start
To breaking it apart
Now I have to love you from far away
I know I told you I'd stay when it's hard
But you haven't put down the blade
I'm getting tired of hiding these scars
We said we're meant to be
But we've come to this
When I love to see what we could've been
But we won't know, oh
Oh, we'll never know
Now I have to love you from far away
But I want you to stay in my arms
Since you couldn't put down the blade
I'm done hiding all of these scars
Bebe Dame
Fuerza Regida X Grupo Frontera
Bzrp Music Sessions, Vol. 53
Bizarrap & Shakira
Wait In The Truck
HARDY Featuring Lainey Wilson
I got turned around in some little town
I'd never been to before
Working my way through a middle-of-June
Midnight thunderstorm
There was something in the headlights
It stopped me on a dime
Well, she was scared to death, so I said
"Climb in, " and then she climbed
Oh, yeah
Well, she was bruised and broke from head to toe
With a tear in her blood-stained shirt
She didn't tell the whole truth, but she didn't have to
I knew what had happened to her
I didn't load her down with questions
That girl had been through enough
I just threw it in drive, looked in those eyes
And I asked her where he was
I don't know if he's an angel
'Cause angels don't do what he did
He was hellbent to find the man behind
All the whiskey scars I hid
I never thought my day of justice
Would come from a judge under a seat
But I knew right then I'd never get hit again
When he said to me
"Wait in the truck
Just wait in the truck"
Well, I knocked and knocked and no one came
So I kicked in his double-wide door
I let the hammer drop before he got
To that 12 he was reaching for
I didn't try to hide my pistol
I didn't even try to run
I just sat on the porch, smoking one of his cigarettes
And waited for the cops to come
I don't know if he's an angel
'Cause angels don't do what he did
He was hellbent to find the man behind
All the whiskey scars I hid
I never thought my day of justice
Would come from a judge under a seat
But I knew right then I'd never get hit again
When he said to me
"Wait in the truck
Just wait in the truck"
Whoa (whoa)
Have mercy on me, Lord
Have mercy on me
Have mercy on me (hey), Lord
It's been 60 months and she still comes
To see me from time to time
It was worth the price, to see a brighter side
Of the girl I picked up that night
And I might be here forever
It ain't paradise, that's true
But it's a whole hell of a lot better
Than the place I sent him to, yeah
Wait in the truck (have mercy on me)
Just wait in the truck
(Have mercy, have mercy, have mercy on me)
Wait in the truck (Lord, have mercy)
Just wait in the truck
Have mercy on me, Lord
Have mercy on me
(Have mercy, have mercy, have mercy on me) have mercy on me, Lord
Have mercy on me
(Have mercy, have mercy, have mercy on me) wait in the truck
Just wait in the truck
(Have mercy, have mercy, have mercy on me) wait in the truck
Just wait in the truck (please have mercy on me)
She Had Me At Heads Carolina
Cole Swindell
I was out with the boys, catchin' up at a neon light
Didn't know 'til we walked in, it was karaoke night
She was in a circle of girls, chasin' a shot with a lime
She was laughin', they were darin' her to get on the mic
One of 'em walked up and turned in her name
Next thing I knew, man, she was up on the stage, singin'
"Heads Carolina, tails California"
Maybe she'd fall for a boy from South Georgia
She's got the bar in the palm of her hand
And she's a '90s country fan like I am
Hey, I got a Chevy, she can flip a quarter
I'd drive her anywhere from here to California
When this song is over, I gotta find her
'Cause she had me at "Heads Carolina"
Yeah, she knew every word by heart, didn't need no screen, no
I was raisin' my glass up for her, I saw her smilin' at me, yeah
She had me down in the front by the end of verse two
Like there wasn't no one else in the room, we were singin'
"Heads Carolina, tails California"
Maybe she'd fall for a boy from South Georgia
She's got the bar in the palm of her hand
And she's a '90s country fan like I am
Hey, I got a Chevy, she can flip a quarter
I'd drive her anywhere from here to California
When this song is over, I gotta find her
'Cause she had me at "Heads Carolina"
Yeah, I bought her a round, and we talked 'til the lights came on
I still see that girl every time I hear that song
"Heads Carolina, tails California"
Maybe she'd fall for a boy from South Georgia
She's got the bar in the palm of her hand
And she's a '90s country fan like I am
Hey, I got a Chevy, she can flip a quarter
I'd drive her anywhere from here to California
When this song is over, I gotta find her
'Cause she had me at "Heads Carolina"
Yeah, she had me at "Heads Carolina" (somewhere greener, somewhere warmer)
(Heads Carolina, tails California)
Yeah (somewhere greener, somewhere warmer)
(Heads Carolina)
(Somewhere together, I've got a quarter)
(Heads Carolina, tails California)
Dawns
Zach Bryan Featuring Maggie Rogers
Wake me up when the season's gone
'Cause I've wasted all my dawns on you
So what do I do
Oh, what do I do
I get fucked up just 'cause I'm scared
Love's just another drug I have grown a victim to
So what do I do
Oh, what do I do
All is fair in love and war
So what the hell are we even fightin' for
I'm on your front porch, beggin' for my dawns back
Give my goddamn records and my clothes back
'Cause I'm through
Oh, how I'm through
And by the time she wakes
I'll be halfway to my mama's home
It just dawned on me
Life is as fleeting as the passin' dawn
And it was my mistake
'Cause she never said a thing about Jesus
I miss my mother's southern drawl
And her prayin' through the walls in the evenin'
Give me my dawns back
Everything that dies makes its way back
I lost her last July in a heart attack
I need one small victory, mmm
Give me my dawns back
'Cause everything that dies makes its way on back
I lost her last July in a heart attack
I need one small victory
Wake me up when the season's gone
'Cause I've wasted all my dawns on you
So what do I do
Oh, what do I do
And by the time he wakes
I'll be halfway to my best friend's home
It just dawned on me
Life is as fleeting as the passin' dawn
And I shoulda told him twice
I believe in something bigger than both of us
I miss goin' out to bars, shootin' stars
Not worryin' 'bout what's left of us, mmm
Give me my dawns back
Everything that dies makes its way on back
I lost her last July in a heart attack
I need one small victory, mmm
Give me my dawns back
Everything that dies makes its way on back
I lost her last July in a heart attack
I need one small victory
Ooh
Ooh, ooh
I got fucked up just 'cause I'm scared
Love's just another drug I have grown a victim to
What do I do
Oh, what do I do
What My World Spins Around
Jordan Davis
I love a first cast when the water's glass and the line starts to run
Or that first sip of a cold beer when the working week's done
I love the twilight in the morning 'fore the day wakes up
With the windows down on the first ride in a paid up truck
And I love a slow down in a beach town with an ocean view
And I love a first fall Saturday trip down to Baton Rouge
And I love a six string with the stars out and the campfire glow
But girl, that don't even come close
To the way that it feels when you lean in and kiss me
The way that you dance when you get kinda tipsy
I'm wrapped 'round your finger like this ring I'm wearing
That look in your eye, girl, when you catch me staring
And I don't even know what it is but now that I found it
I can't imagine me living without this
Back forty view on our piece of ground
Watching you watch the sun going down
That's what my world spins around
Well, I finally get it now, when they say you know you know
And yeah, girl, you had me from that first hello
And the only thing better is Heaven above
But until I get there, I'll never get enough of
The way that it feels when you lean in and kiss me
The way that you dance when you get kinda tipsy
I'm wrapped 'round your finger like this ring I'm wearing
That look in your eye, girl, when you catch me staring and
I don't even know what it is but now that I found it
I can't imagine me living without this
Back forty view on our piece of ground
Watching you watch the sun going down
That's what my world spins around
What my world spins around
Oh, yeah
The way that it feels when you lean in and kiss me
The way that you dance when you get kinda tipsy
And I'm wrapped 'round your finger like this ring I'm wearing
And that look in your eye, girl, when you catch me staring and
I don't even know what it is but now that I found it
I can't imagine me living without this
Back forty view on our piece of ground
Watching you watch the sun going down
That's what my world spins around
You're what my world spins around
What my world spins around
About Damn Time
Lizzo
It's bad bitch o'clock, yeah, it's thick-thirty
I've been through a lot but I'm still flirty (okay)
Is everybody back up in the buildin'?
It's been a minute, tell me how you're healin'
'Cause I'm about to get into my feelings
How you feelin'? How you feel right now?
Oh, I've been so down and under pressure
I'm way too fine to be this stressed, yeah
Oh, I'm not the girl I was or used to be
Uh, bitch, I might be better
Turn up the music, turn down the lights
I got a feelin' I'm gon' be alright
Okay (okay), alright
It's about damn time (time)
Turn up the music, let's celebrate (alright)
I got a feelin' I'm gon' be okay
Okay (okay), alright
It's about damn time
In a minute, I'ma need a sentimental
Man or woman to pump me up
Feeling fussy, walkin' in my Balenci-ussy's
Tryna bring out the fabulous
'Cause I give a fuck way too much
I'ma need like two shots in my cup
One to get up, one to get down
Mm, that's how I feel right now
Oh, I've been so down and under pressure
I'm way too fine to be this stressed, yeah
Oh, I'm not the girl I was or used to be
Uh, bitch, I might be better
Turn up the music, turn down the lights
I got a feelin' I'm gon' be alright
Okay (okay), alright
It's about damn time (time)
Turn up the music, let's celebrate (alright)
I got a feelin' I'm gon' be okay
Okay (okay), alright
It's about damn time
Bitch
'Cause, uh, you know that time it is, uh
I'm comin' out tonight, I'm comin' out tonight (uh-huh)
I'm comin' out tonight, I'm comin' out tonight (woo)
I'm comin' out tonight, I'm comin' out tonight
Okay (okay), alright (alright)
It's about damn time
I'm comin' out tonight, (let's go) I'm comin' out tonight (comin' out tonight)
I'm comin' out tonight, I'm comin' out tonight (woo)
I'm comin' out tonight, I'm comin' out tonight (comin' out tonight)
Okay (okay), alright
It's about damn time
Oh
Bitch
Yeah, yeah
It's about damn time
Tomorrow 2
GloRilla & Cardi B
They say they don't fuck wit' me (Cheese)
But I say they can't fuck wit' me
Just like the air, I'm everywhere
How you say it's up wit' me?
Pop-poppin' shit, you would think I went to school for chiropractin' (poppin')
Lookin' good as hell today, just sent my nigga five attachments (look at this)
Why did you confront me 'bout a nigga? Man, you bitches backwards (stupid ass)
They come at me 'bout niggas who I don't even find attractive (ugh)
I don't know the nigga, I just seened him on the town before
I can't be up in her face, I took her nigga down before (nah)
When I lose a nigga, I just pop out and go find some mo' (easy)
Soon as I feel like my time get wasted, then it's time to go (deuces)
They say they don't fuck wit' me
But I say they can't fuck wit' me (on gang)
Just like the air, I'm everywhere
How you say it's up wit' me? (Huh?)
Them bitches should've stayed down
They could've been up wit' me (too bad)
But all they doin' is talkin' down
'Cause they can't get up wit' me (lame ass)
My ex fuckin' on my old friend, both they ass some fuckin' clowns (haha)
Thinkin' that she got one up on me, she got my hand-me-downs (lame ass ho)
He thought wasn't gon' have to stand on shit, like he was handicap (thought it was)
Make that nigga stand on that, now his ass can't stand me now
High as fuck, I'm lit, yeah, I don't smoke no Swishers (nope)
Slidin' wit' my gang and them, look at them like sisters (that's gang)
These bitches be lovin' to go out sad about these niggas (ugh)
I don't wanna hang wit' them, they don't handle business (they can't hang wit' us)
They be goin' for anything, but I can't go for none of that (none of that)
Why would I go chase you? If I know you gon' come runnin' back (fuckin' dumb)
Cut everybody off, lately been feelin' like the lumberjack (fuck 'em)
They really got me fucked up, and I wasn't goin' for none of that (none of that)
She the type, the nigga make her mad she go and tweet somethin' (ugh)
Me, I'm kinda ratchet still so I'm the type to beat somethin' (beat 'em up)
I can't love you, baby, like yo' bitch do, so don't leave her (keep that bitch)
He gon' choose her every time 'cause it's cheaper to keep her (hahaha)
Can't say yo' name up in my songs, might not fuck wit' you tomorrow (nah)
Can get my feelings hurt today, I won't give a fuck tomorrow (that's just me)
Ain't fucked up 'bout no credit score, I might be rich as fuck tomorrow (duh)
Every day the sun won't shine, but that's why I love tomorrows
Ridin' with my twin and 'nem (skrrt), and we all look good as fuck (gang)
She say she my opp but I don't know her, had to look her up (fuck is you?)
I know that I'm rich, but I can't help it, bitch, I'm hood as fuck (woo)
I've been on these bitches neck so long, sometimes my foot get stuck (ah)
I can't put you in my business (no), you might wish me dead tomorrow (yeah)
Bitches be on dick today, sing every word of "Up" tomorrow (Up)
Bitch, I still got cases opened, keep your mouth shut tomorrow (shh)
Play with me today then get some sleep, you know it's up tomorrow (woo)
Fake bitch, that's why my friend fucked on your nigga (ah-ha)
Both you bitches pussy, I think y'all should scissor (ah, ha, ha)
She bought a chain, I bought the same one, even bigger (bitch, it's bigger)
She throwin' shots, that's how I know I got her triggered (ah)
I don't speak dog, ho (woof), I don't care what no bitch say (no)
I stay on her mind, I got condos in that bitch head (ah)
She say she don't fuck with me (who?), Who said that you can, ho? (No)
That nigga a munch and he gon' eat me like a mango
Long ass weave, it be ticklin' my ass crack (ah)
Wonder what I'll do tomorrow that these hoes will be mad at (huh?)
All y'all bitches sweet, and I always get my lick, boo (facts)
I, I fight for my bitches and I'm fightin' over dick too (that, that, Cardi, yup)
Can't say yo' name up in my songs, might not fuck wit' you tomorrow (nah)
Can get my feelings hurt today, I won't give a fuck tomorrow
Ain't fucked up 'bout no credit score, I might be rich as fuck tomorrow (duh)
Every day the sun won't shine, but that's why I love tomorrows
Can't say yo' name up in my songs, might not fuck wit' you tomorrow (nah)
Can get my feelings hurt today, I won't give a fuck tomorrow (that's just me)
Ain't fucked up 'bout no credit score, I might be rich as fuck tomorrow (duh)
Every day the sun won't shine, but that's why I love tomorrows
Bloody Mary
Lady Gaga
Money
Oh
Love is just a history that they may prove
And when you're gone
I'll tell them my religion's you
When Pontius comes to kill the king upon his throne
I'm ready for their stones
I'll dance, dance, dance
With my hands, hands, hands
Above my head, head, head
Like Jesus said
I'm gonna dance, dance, dance
With my hands, hands, hands above my head
Hands together, forgive him before he's dead, because
I won't cry for you
I won't crucify the things you do
I won't cry for you
See, when you're gone, I'll still be Bloody Mary
Love
We are not just art for Michelangelo to carve
He can't rewrite the aggro of my furied heart
I'll wait on mountain tops in Paris, cold
J'veux pas mourir toute seule
I'll dance, dance, dance
With my hands, hands, hands
Above my head, head, head
Like Jesus said
I'm gonna dance, dance, dance
With my hands, hands, hands above my head
Hands together, forgive him before he's dead, because
I won't cry for you
I won't crucify the things you do
I won't cry for you
See, when you're gone, I'll still be Bloody Mary
Love
Gaga, Gaga
Gaga, Gaga
Gaga, Gaga
Gaga, Gaga
Gaga, Gaga
Gaga, Gaga
Gaga, Gaga
Gaga, Gaga
Dum dum, da-di-da
Dum dum, da-di-da-dadda-da-di-da
Dum dum, da-di-da
Dum dum, da-di-da
Dum dum, da-di-da-dadda-da-di-da
Dum dum, da-di-da
I won't cry for you
I won't crucify the things you do, do, do
I won't cry for you
See, when you're gone, I'll still be Bloody Mary
Oh-oh-oh-oh-oh
Oh-oh-oh-oh-oh
Oh-oh-oh-oh-oh
Oh-oh-oh-oh-oh
Oh-oh-oh-oh-oh
Oh-oh-oh-oh-oh
Oh-oh-oh-oh-oh
Oh-oh-oh-oh-oh
Líberate, mi amor
Nobody Gets Me
SZA
Took a long vacation, no makeup, just Jay-Z
You were balls deep, now we beefin', had me butt-naked at the MGM
So wasted screamin', "Fuck that, " love me now, but I'm anythin'
Hurry now, baby, stick it in 'fore the memories get to kickin' in
It's too late, I don't wanna lose what's left of you
How am I supposed to tell ya?
I don't wanna see you with anyone but me
Nobody gets me like you
How am I supposed to let you go?
Only like myself when I'm with you
Nobody gets me, you do (do)
You do
Nobody gets me, you do (do)
You do
Nobody gets me, you do
You do, nobody gets me, you do
Took me out to the ballet
You proposed, I went on the road
You was feelin' empty, so you left me
Now I'm stuck dealin' with a deadbeat
If I'm real, I deserve less
If I was you, I wouldn't take me back
I pretend when I'm with a man, it's you
And I know that it's too late
I don't wanna lose what's left of you
How am I supposed to tell ya?
I don't wanna see you with anyone but me
Nobody gets me like you
How am I supposed to let you go?
Only like myself when I'm with you
Nobody gets me, you do (do)
You do
Nobody gets me, you do (do)
You do
Nobody gets me, you do (do, ooh)
You do, nobody gets me, you do (do, ooh)
Nobody gets me, you do
Hope
NF
Yeah
Rest in peace to all the kids that lost their lives in the Parkland shooting
This song is dedicated to you
Okay, she keep cryin', she keep cryin' every single night
Day and night, on my mind, please don't kill the vibe
Oh no, I swear to God, I be in my mind
Swear I wanna die, yeah, when you cross my-
Said I wanna die, yuh
No, I'm not alright, yuh
I might start a riot
I'm so fuckin' tired, yuh
So what's up? What you say?
Feelin' good, I'm feelin' great
Tired of the fuckin' hate
Stackin' cheese all on my plate
So outside of my misery, I think I'll find
A way of envisioning a better life
For the rest of us, the rest of us
There's hope for the rest of us, the rest of us
Okay, she keep cryin', she keep cryin' every single night
Day and night, on my mind, please don't kill the vibe
Oh, no, I swear to God, I be in my mind
Swear I wanna die, yeah, when you cross my-
Said I wanna die, yuh
No, I'm not alright, yuh
I might start a riot
I'm so fuckin' tired, yuh
So what's up? What you say?
Feelin' good, I'm feelin' great
Tired of the fuckin' hate
Stackin' cheese all on my plate
Unstoppable
Sia
All smiles, I know what it takes to fool this town
I'll do it 'til the sun goes down
And all through the nighttime
Oh, yeah
Oh, yeah, I'll tell you what you wanna hear
Leave my sunglasses on while I shed a tear
It's never the right time
Yeah, yeah
I put my armor on, show you how strong I am
I put my armor on, I'll show you that I am
I'm unstoppable
I'm a Porsche with no brakes
I'm invincible
Yeah, I win every single game
I'm so powerful
I don't need batteries to play
I'm so confident
Yeah, I'm unstoppable today
Unstoppable today
Unstoppable today
Unstoppable today
I'm unstoppable today
Break down, only alone I will cry out loud
You'll never see what's hiding out
Hiding out deep down
Yeah, yeah
I know, I've heard that to let your feelings show
Is the only way to make friendships grow
But I'm too afraid now
Yeah, yeah
I put my armor on, show you how strong I am
I put my armor on, I'll show you that I am
I'm unstoppable
I'm a Porsche with no brakes
I'm invincible
Yeah, I win every single game
I'm so powerful
I don't need batteries to play
I'm so confident
Yeah, I'm unstoppable today
Unstoppable today
Unstoppable today
Unstoppable today
I'm unstoppable today
Unstoppable today
Unstoppable today
Unstoppable today
I'm unstoppable today
I put my armor on, show you how strong I am
I put my armor on, I'll show you that I am
I'm unstoppable
I'm a Porsche with no brakes
I'm invincible
Yeah, I win every single game
I'm so powerful
I don't need batteries to play
I'm so confident
Yeah, I'm unstoppable today
Unstoppable today
Unstoppable today
Unstoppable today
I'm unstoppable today
Unstoppable today
Unstoppable today
Unstoppable today
I'm unstoppable today
Favorite Song
Toosii
Yeah, I'm on the stage right now
Singin' your favorite song
Look in the crowd
And you're nowhere to be found as they sing along
I say you look good without no makeup
No lashes, even better when you wake up
Uh-uh-uh
I see the look on your face
I see you hidin' the hate
I see you lookin' for someone to scoop you right off of your feet
You wanna ride in a Wraith
You wanna go out on dates
You want somebody to come bring you flowers
Someone to talk to for hours
Wash your back while y'all sit in the shower, yeah
Someone to tell you, "You're beautiful"
Someone to tell you and mean it
Someone to tell you, "I love you" every day
And don't got a reason
You want someone to bring you peace, uh
Someone to help you sleep, yeah
Someone to pick you up when you're feelin' down, feelin' lonely
Need somebody who can make it better
Somebody who can open up those gates
Open up those gates to your heart
Only if you'll let me
I'm on the stage right now
Singin' your favorite song
Look in the crowd
And you're nowhere to be found as they sing along
I say you look good without no makeup
No lashes even better when you wake up
Uh-uh-uh
I see the look on your face
I see you lookin' for peace
I see you tired of the hurt
Tired of the pain
Tired of the nights where you can't get no sleep
I see you're tired thinkin' 'bout if he cheat
See you're tired thinkin' 'bout if you leavin'
See you're tired of bein' so tired
And you damn sure ain't gettin' even
Need somebody
Who can make it better
Somebody
Who can open up those gates
Open up those gates to your heart
Only if you'll let me
I'm on the stage right now
Singin' your favorite song
Look in the crowd
And you're nowhere to be found as they sing along
I say you look good without no makeup
No lashes even better when you wake up
Uh-uh-uh
Lift Me Up
Rihanna
Lift me up
Hold me down
Keep me close
Safe and sound
Burning in a hopeless dream
Hold me when you go to sleep
Keep me in the warmth of your love
When you depart, keep me safe
Safe and sound
Lift me up
Hold me down
Keep me close
Safe and sound
Drowning in an endless sea
Take some time and stay with me
Keep me in the strength of your arms
Keep me safe
Safe and sound
Lift me up
Hold me down
Keep me safe
Safe and sound
Burning in a hopeless dream
Hold me when you go to sleep
Keep me safe
We need light, we need love
lift me up in your arms
(Hold me down) I need love, I need love, I need love
(Keep me close) hold me, hold me
(Safe and sound) hold me, hold me, hold me, hold me
(Lift me up) hold me, hold me, hold me, hold me
(Hold me down) hold me, hold me
(Keep me safe) we need light, we need love
What He Didn’t Do
Carly Pearce
Everybody's asking what the hell happened
Wonderin' why it all went wrong
Mama always said, "If you can't say something nice
Then don't say anything at all"
And I've got my side of the story and he's got his side, too
So I ain't gonna go and tell you what he did
But I'll tell you what he didn't do
Treat me right, put me first, be a man of his word
Stay home 'cause he wanted to
Always fight for my love, hold on tight like it's something
That he couldn't stand to lose
The devil's in the details, I won't tell the hell that he put me through
All I know is in the end, it wasn't what he did
No, it was what he didn't do
I'm already halfway over him and I ain't taking time to turn around
So I'ma take the high road, even though we both know
I could run him out of this town
That's just dirty laundry, I don't need to wear the truth
So I ain't gonna tell you everything he did
But I'll tell you what he didn't do
Treat me right, put me first, be a man of his word
Stay home 'cause he wanted to
Always fight for my love, hold on tight like it's something
That he couldn't stand to lose
The devil's in the details, I won't tell the hell that he put me through
All I know is in the end, it wasn't what he did
No, it was what he didn't do
I ain't met the right one yet but I know when I do
He'll treat me right, put me first, be a man of his word
Stay home 'cause he wanted to
Always fight for my love, hold on tight like it's something
That he can't stand to lose
The devil's in the details, I won't tell the hell that he put me through
All I know is in the end, it wasn't what he did
No, it was what he didn't do
And all I know is in the end, it wasn't what he did
No, it was what he didn't do
Love Again
The Kid LAROI
I never thought that I would find a way out
I never thought I'd hear my heartbeat so loud
I can't believe there's something left in my chest anymore
But goddamn, you got me in love again
I used to think that I was made out of stone
I used to spend so many nights on my own
I never knew I had it in me to dance anymore
But goddamn, you got me in love again
Show me that heaven's right here, baby
Touch me, so I know I'm not crazy
Never have I ever met somebody like you
Used to be afraid of love and what it might do
But goddamn, you got me in love again
You got me in love again
You got me in love again
You got me in love again
Again
So many nights, my tears fell harder than rain
Scared I would take my broken heart to the grave
I'd rather die than have to live in a storm like before
But goddamn, you got me in love again
Show me that heaven's right here, baby
Touch me, so I know I'm not crazy
Never have I ever met somebody like you
Used to be afraid of love, and what it might do
But goddamn, you got me in love again
You got me in love again
You got me in love again
You got me in love again
Again
I can't believe, I can't believe
I finally found someone
I'll sink my teeth in disbelief
'Cause you're the one that I want
I can't believe, I can't believe
I'm not afraid anymore
Goddamn, you got me in love again
La-la-la, la-la-la
La-la-la, la-la-la
I never thought that I would find a way out
I never thought I'd hear my heartbeat so loud
I can't believe there's something left in my chest anymore
Oh, goddamn, you got me in love again
I can't believe, I can't believe
I finally found someone
I'll sink my teeth in disbelief
'Cause you're the one that I want
I can't believe there's something left inside my chest anymore
But goddamn, you got me in love again
You got me in love again
You got me in love again
You got me in love again (again and again)
Again (and again and again and again)
Special
Lizzo Featuring SZA
Special
Woke up this mornin' to somebody judgin' me
No surprise they're judgin' me, don't know who I'm 'posed to be
I'm just actin' up, I'm rash as fuck, and never sayin' sorry
Found out it in the end that I can only do it for me
You call it sensitive and I call it superpower
You just lack empathy 'cause you think it gives you power
All I know is only God can judge me
I don't hide my heart, I wear it on me
I'm used to feeling alone, oh (feeling alone)
So, I thought that I'd let you know
In case nobody told you today
You're special
In case nobody made you believe (nobody, no, no)
You're special
Well, I will always love you the same
You're special
I'm so glad that you're still with us (so glad, so glad, so glad)
Broken, but damn, you're still perfect (you're perfect)
Special (ba-ba-ba, ba-ba-ba)
(Ba-ba-ba, ba-ba-ba)
Could you imagine a world where everybody's the same?
And you could cancel a girl 'cause she just wanted to change?
How could you throw fucking stones if you ain't been through her pain?
That's why we feel so alone, that's why we feel so much shame, hmm
I'm used to feeling alone, oh
So, I thought that I'd let you know (oh)
In case nobody told you today
(I wait around for you)
You're special
In case nobody made you believe (nobody made you believe)
You're special
Well, I will always love you the same (I'll always love you)
You're special
I'm so glad that you're still with us
Broken, but damn, you're still perfect (perfect)
I know that I'm not alone, oh (no, you're not alone)
So, I thought that I'd let you know
Uh, yeah, yeah, yeah, yeah, yeah (woo)
In case nobody told you today (nobody told you, nobody)
You're special (yeah)
In case nobody made you believe (oh-oh)
You're special (yeah)
Well, I will always love you the same
(I'll always love you, I'll always love you)
You're special
I'm so glad that you're still with us (I'm so glad, so glad, so glad)
Broken, but damn, you're still perfect (I'm so glad, so glad, so glad)
Special
You are, you are, you are so
Special
Yeah, yeah, yeah, yeah, yeah
Yeah, yeah, yeah, yeah, yeah (mm-mm)
Hey
Que Vuelvas
Carin Leon X Grupo Frontera
Spin Bout You
Drake & 21 Savage
You gotta motherfuckin' feel this shit, boy
(BanBwoi)
Woah
I got feelings for you
Hope you ain't lovin' the crew
How many bodies you got?
Pray it ain't more than a few
Know that you dealt with some lames
When you was young and in school
He had to pop your cherry
But I got it wet like a pool
She got a new G-Wag'
She wanna hit Highlight Room and show it off
Got a new body, girl, show it off
This a Brazilian, I know it's soft
Toned up and she got a six-pack
Look like she used to play volleyball
American Express, you can have it all
Code to the safe, you can have it all
Fuck your main page, what's your Finsta? I wanna know the real you
You started dancin' to pay your tuition, girl, I wanna know what you been through
You want a boutique or you wanna sell hair, just let me know what you into
If you out in public, and he want your number, just tell him, "My nigga'll spin you"
The way you make me feel these days
Somethin' gettin' dry for you, baby girl
Smoke a nigga top for you, baby girl
Burn somebody block for you
The way you make me feel these days
Comin' out my body for you, baby girl
Wipe him like he snotty for you, baby girl
Comin' out my body for you
Damn, just turned on the news and seen that men who never got pussy in school
Are makin' laws about what women can do
I gotta protect ya, I'm a made man, tied in, all the way, baby
So I gotta respect ya
Niggas put hands on you in the past, insecure because your body is pressure
Four words when I think about them is crusty, musty, dusty, rusty
Eight words when I think about us is fuck me, fuck me, fuck me, fuck me
Disrespect ya and I'll smack 'em
The texts that you send in captions
The videos we got ever leak, we goin' viral or goin' platinum
Don't worry 'bout your friend's story when I had her alone
She gon' try and put some extras on it, take you out of your zone
You know how it goes when they can't get a reservation up in Carbone
They gon' tell you it's a chill night, tell you how they'd rather stay home, yeah
Jealous-ass hoes, yeah
And I know what I said 'bout bein' in Vogue
But just like that R&B group from the '90s
Girl, one call, I'll get you in Vogue
One call, you in runway shows
One call, I'm sittin' front row
One wrong call from your ex nigga sayin' dumb shit'll get him sent home
One call and my niggas ten toes
Down to go wherever I say go
Even if we gotta travel 'cross the globe
Down to take you to the end of the road, for real
The way you make me feel these days
Somethin' gettin' dry for you, baby girl
Smoke a nigga top for you, baby girl
Burn somebody block for you
The way you make me feel these days
Comin' out my body for you, baby girl
Wipe him like he snotty for you, baby girl
Comin' out my body for you
Want ya, I want-want ya
Oh, your lovin' so deep (feelin' so deep)
Want ya, I want-want ya
Give me your lovin' (feelin' so deep)
Handle On You
Parker McCollum
I went and bought the biggest bottle they got 'cause you're gone
Drop the needle on a vinyl and cry to an old Haggard song
Sittin' at the table, baby, breaking the seal
Gonna see how much of this pain I can kill
I went and bought the biggest bottle they got 'cause you're gone
Tennessee and Kentucky 'cause you ain't here to love me
I drink now that there's nothing to lose
I've been fightin' with your memory, I hate the way it hits me
I wake up every day, black and blue
After all this back and forth, a fifth won't do
Yeah, I finally got a handle, finally got a handle on you
I tell myself that I should quit but I don't listen to drunks
I keep on sippin' 'til, "I miss you" don't roll off my tongue
Since you poured our love down the sink
I think I'll just stay here and drink
I tell myself that I should quit but I don't listen to drunks
Tennessee and Kentucky 'cause you ain't here to love me
I drink now that there's nothing to lose
I've been fightin' with your memory, I hate the way it hits me
I wake up every day, black and blue
After all this back and forth, a fifth won't do
Well, I finally got a handle on you
After all this back and forth, a fifth won't do
Yeah, I finally got a handle, finally got a handle on you
Handle on you
Yeah, I finally got a handle on you
Wild As Her
Corey Kent
She never wanted to be white picket fenced in
Her heart's like a feather in a Tulsa wind
Seaside breeze'll bring her to life
And all them other boys say she's a goodbye girl
She'll wreck your world
And leave before the mornin' sun
But here she is, free, layin' next to me
'Cause I ain't tryna tame her love
I keep the windows down and the wind in her hair
Keep her heart hangin' on 'round every turn
She ain't scared to get tied down, scared to get burned
Just looking for somebody as wild as her
Wild, wild
Saw that highway unwind in her deep brown eyes
She saw a long stretch of dirt road dreamin' in mine
She ain't living for a diamond ring
Just living like the rock'n'roll song she sings out loud
We're burnin' it down
Blazin' up a trail of smoke
Wherever we are, wherever we go
Yeah, that's where she calls home
I keep the windows down and the wind in her hair
Keep her heart hangin' on 'round every turn
She ain't scared to get tied down, scared to get burned
Just looking for somebody as wild as her
Wild, wild
Yeah, man
Yeah, I ain't tryna fix her
I just wanna kiss her
Fuel a little fire in her soul
But we don't say forever, but when we're together
Swear that we ain't ever lettin' go
'Cause she knows
I keep the windows down and the wind in her hair
Keep her heart hanging on 'round every turn
She ain't scared to get tied down, scared to get burned
Just looking for somebody as wild as her
Wild, wild
Yeah
One Thing At A Time
Morgan Wallen
Somebody hand me a cigarette
I know I ain't had one in over a week
Somebody pour me a double shot
Been gettin' better by the day, but tonight I drink
You say I gotta get over you and get sober too
I got a lot of habits I gotta kick
Weigh out all your options and take your pick
I can either burn the bar down
Or I can take your number out my phone
I can give you up right now
And never want you back long as I'm half-stoned
If you want me to quit you, want me to get you
Outta my heart and, baby, off my mind
I hate to tell you, girl, but I'm only quittin' one thing at a time
I know I got me some problems
About a thousand memories I gotta forget
But if I'm gonna solve them
Baby, I'll take all the help I can get
If you ain't gonna kiss me
Then I'll take some whiskey, some Grizzly
Nicotine, amphetamines too
You want me to stop some of that
Or you want me to stop loving you?
Hey, what you want me to do?
I can either burn the bar down
Or I can take your number out my phone
I can give you up right now
And never want you back long as I'm half-stoned
If you want me to quit you, want me to get you
Outta my heart and, baby, off my mind
I hate to tell you, girl, but I'm only quittin' one thing at a time
Aw yeah, I hate to tell you
Aw yeah, I hate to tell you
I ain't no Superman, I'm just the way I am
If I'm gonna move on, then I need me something in my hand
Ain't nothing wrong with that
And if you ain't comin' back
I can either burn the bar down
Or I can take your number out my phone
I can give you up right now
And never want you back long as I'm half-stoned
If you want me to quit you, want me to get you
Outta my heart and, baby, off my mind
I hate to tell you, girl, but I'm only quittin' one thing at a time
Aw yeah, I hate to tell you
Aw yeah, I hate to tell you
Aw yeah, I hate to tell you
I Wrote The Book
Morgan Wallen
When it comes to
Hitchin' the boat up
Backin' down the ramp
In my old truck
To find a bunch of logs
To catch a bunch of hogs
Yeah, I wrote the book
Yeah, I wrote the book
If you wanna learn
To throw a curve right
To catch a clean up
Lookin' on a third strike
Talk a little smack
While he's walkin' back
Yeah, I wrote the book
But there's one that lays by the lamp on the nightstand
One that says don't cuss and don't fight
Or let the bottle turn you into a different man
But, damn, if I don't do it every Friday night
Those get you into Heaven letters in red
Ain't gettin' read enough to keep me on a straight line
I'm a Jack of all trades
But man I gotta say
That's one book I didn't write
I met a good girl
She had her life straight
She said she loved that I was good at everythin'
One day she left me in a cloud of dust
'Cause I never was too good at pickin' up
The one that lays by the lamp on the nightstand
One that says don't cuss and don't fight
Or let the bottle turn you into a different man
But, damn, if I don't do it every Friday night
Those get you into Heaven letters in red
Ain't gettin' read enough to keep me on a straight line
I'm a jack of all trades
But man I gotta say
That's one book I didn't write
Yeah the good Lord knows I need it
I didn't write it but I probably oughta read it
The one that lays by the lamp on the nightstand
One that says don't cuss and don't fight
Or let the bottle turn you into a different man
But, damn, if I don't do it every Friday night
Those get you into Heaven letters in red
Ain't gettin' read enough to keep me on a straight line
I'm a jack of all trades
But man I gotta say
That's one book I didn't write
That's one book I didn't write
That's one book I didn't write
Watch This (ARIZONATEARS Pluggnb Remix)
Lil Uzi Vert
Jumped in a whip that you never seen (ARIZONATEARS)
It got bulletproof glass and a ceiling screen (ARIZONATEARS)
"Let me guess, it cost you like a hundred thousand?"
No, little bitch, it cost seven beans (vyoom)
I'm doin' what you do in your dreams
I fucked your bitch pussy, Vaseline (yeah)
These bitches always wan' capture me
Hatin' ass nigga turn me to meme (yeah)
Fucked all y'all bitches like ten of me
On the real it was just me and Mean (yeah)
We don't know how tall your girlfriend is
When we saw her, she was on her knees (yeah)
I cannot get tired of money
But this money, it will turn people right into greed (one, two, yeah)
I know he sick of my money (one, two, three)
'Cause every time he call you (let's go), he wish that he called for me (yeah)
Benjamins sittin' in my left, right pocket, watch this
These niggas broke boys, watch this, watch this (let's go)
These niggas broke boys, watch this, watch this (watch)
I can make the whole crowd mosh pit, watch this (woo)
I can make your bitch wanna stop and watch this (watch)
I can make your bitch wanna stop then pop it (pop)
I ain't even drop no songs, still poppin'
I stay with a whole lotta (woo)
Benjamins sittin' in my left, right pocket, watch this
These niggas broke boys, watch this, watch this (let's go)
These niggas broke boys, watch this, watch this (watch)
I can make the whole crowd mosh pit, watch this (let's go)
I can make your bitch wanna stop and watch this (watch)
I can make your bitch wanna stop then pop it (whoa)
I ain't even drop no songs, still poppin'
I stay with a whole lotta (baow)
Bustas, Lil Uzi stay on some mob shit, stop it
All these drugs keep fucking with my conscience
How you from my hood, but you still on some opp shit?
Time gon' tell, fuck niggas get popped quick
Grabbin' on Lil Uzi, yeah, that's a lose on lose (let's go)
I keep changing my clothes dependin' on my mood, yeah
I keep changing my hoes, same way I'm changin' my shows, yeah
I keep changing my poles, but I'm not changin' my shows
Niggas, they goin' outside like a hall, 'cause they not following rules (at all)
No matter in the world where I'm gon' go, I never tuck in my jewels (jewels, yeah)
Benjamins sittin' in my left, right pocket, watch this (let's go)
These niggas broke boys (let's go), watch this, watch this (yeah)
These niggas broke boys, watch this, watch this
I can make the whole crowd mosh pit, watch this (let's go)
I can make your bitch wanna stop and watch this
I can make your bitch wanna stop then pop it (whoa)
I ain't even drop no songs, still poppin'
I stay with a whole lotta (baow)
Benjamins sittin' in my left, right pocket, watch this
These niggas broke boys (let's go), watch this, watch this
These niggas broke boys, watch this, watch this
I can make the whole crowd mosh pit, watch this (woo)
I can make your bitch wanna stop and watch this (let's go)
I can make your bitch wanna stop then pop it (whoa)
I ain't even drop no songs, still poppin'
I stay with a whole lotta (Lil Uzi Vert)
Dead men on me, yeah, dead men on me, yeah (yeah)
She been feelin' me, 'cause I walked in with her rent on me (yeah)
In the sky, Percocet on me, yeah, you can feel on me, yeah
I'm not worried 'bout them, 'cause I got two big FN's on me (yeah)
Can you pop that thing 'til the mornin'?
When you pop that, smell no aroma
Don't want her like that, I don't own it
That's your little bitch, please don't disown her
I'ma keep throwing this money, bee
Heaven
Niall Horan
Baby, you're all that I want
When you're lyin' here in my arms,
I'm findin' it hard to believe
We're in heaven
Oh, thinkin' about all our younger years,
There was only you and me,
We were young and wild and free
Now nothin' can take you away from me
We've been down that road before
But that's over now,
You keep me comin' back for more
Baby, you're all that I want
When you're lyin' here in my arms,
I'm findin' it hard to believe
We're in heaven
And love is all that I need,
And I found it there in your heart
It isn't too hard to see
We're in heaven
(We're in heaven)
Now nothin' could change what you mean to me uh,
there's a lot that I could say
But just hold me now,
'Cause our love will light the way
Baby, you're all that I want
When you're lyin' here in my arms,
I'm findin' it hard to believe
We're in heaven
And love is all that I need,
And I found it there in your heart
It isn't too hard to see
We're in heaven
heaven
Now our dreams are comin' true,
Through the good times and the bad,
I'll be standin' there by you
And love is all that I need,
And I found it there in your heart
It isn't too hard to see
We're in heaven
We're in heaven
Freestyle
Lil Baby
Da Vinci (Da Vinci, Da Vinci)
Yeah
Shoutout the whole Oakland City, man
You know what I'm sayin'? The whole 4PF
You know what I'm sayin'? I put this up
Shoutout my label, that's me
I'm in this bitch with TB
I'm in this bitch with 4 Trey
I just poured up me a eight
Real nigga all in my face
Five hunnid racks in my safe
Five hunnid racks to the plug
What you know 'bout showin' love?
What you know 'bout pullin' up, in Bentley trucks?
Make these bitches fall in love
All of my niggas on go
None of my niggas no ho
All of my niggas want smoke
All of my niggas together
We came from the bottom
We used to wear each other clothes
None of my niggas gon' fold
Couple pussy niggas told
They ain't my niggas no mo'
Hold it down for the 4
In the 9 with the woes
Marlo my dawg that's fo' sho'
We won't fall out about shit
Specially not 'bout no bitch
We ain't gon' fall out 'bout hoes
Me and Ced get them loads
We let 'em go for the low
I got my hood in control
I got my left wrist on froze
I got my right wrist on froze
I got my necklace on froze
Both of my ears on froze
I been gettin' faded, I'm sippin' on maple
If she won't fuck, I won't make her
I don't like bitches with makeup
If she want titties I pay for 'em
Get outta there when I wake up
I pass the ball I don't layup
I'm a big boss, I got say so
They'll wipe you down if I say so
Dracos, on Dracos, on Dracos, on Dracos
.40s, on .40s, on .40s
I just bought me some new water
Wetter than Katrina, shoutout New Orleans
I made a promise my niggas gon' ball
Hard in the paint, change my name to John Wall
Geekin' off trees like a leaf in the fall
Find a new plug then we takin' 'em all
Pull up in a brand new Benz Truck
Hop out fresher than a menthol
Lil' nigga, but I'm big dawg
All I gotta make is one call
Hit a nigga block, took off
Cross a nigga up, Hot Sauce
Ooh, I got 'em mad, my fault
Talking bout the shit that I bought
Poppin' these Percs, I done turned to a savage
Hunnid racks stuffed in the mattress
Hunnid racks stuffed in the attic
Hunnid racks stuffed in the sofa
These niggas play gangsta, but they won't approach me
I know they'll never approach me
They know that they'll catch a bullet
I rock the game to the fullest
I run with some real ones, I don't hang with no pussies
I ain't no killer don't push me
I see how you niggas be lookin'
I hope you don't think you no bully
I'm livin' the life, I should star in a movie
Ridin' in a 'Vert with a uzi
12 get behind me I lose 'em
They tryna guess what I'm doin'
They tryna guess who I'm screwin'
That ain't even they business
They ain't wanna fuck with me
Now they see a nigga drippin'
Now they wanna fuck with me
They can't get in touch with me
Hardly ever in the city
They just know I'm gettin' bigger
They just know a nigga busy
I been runnin' up them digits, yeah
Low
SZA
Tell 'em to shoot
I'm out the loop, I'm outta range
Oh-yeah, I stay out the way-ay-ay!
Got another side of me, I like to get it poppin'
But these bitches in my business got me outchea choosin' violence
If you see me out in public, you don't know me, keep it silent
In the bedroom, I be screamin', but outside, I keep it quiet
Keep it on the lowski, I'm the lowest of the lowest
Wanna see if you can keep it like nobody knows shit
I need you to get the fuck out of my space (yeah)
Replacement's on the way, please don't play
That pussy's feelin' like a great escape (oh, oh)
Don't need no trick, old dogs don't change
I'm fuckin', I ain't makin' love no more (pussy)
You got a new bitch, what the fuck you cryin' for?
I'm movin' selfish, callin' all my favorite hoes
You know how to reach me every time and it plays in your mind
With a rush that feels like, we committin' a crime
You know where you belong, I'm gon' save you a spot
But we can't be outside 'cause the block is too hot
And I'm all on your mind
Wherever you are, don't call me! (Let's go)
Got another side of me, I like to get it poppin' (alright)
But these bitches in my business got me outchea choosin' violence (yeah, yeah)
If you see me out in public, you don't know me, keep it silent (okay)
In the bedroom, I be screamin', but outside, I keep it quiet
Keep it on the lowski, I'm the lowest of the lowest
Wanna see if you can keep it like nobody knows shit (let's–)
Keep it on the lowski, I'm the lowest of the lowest
Wanna see if you can keep it like nobody know shit (yeah-yeah)
I need total confidential private shit (yeah)
Don't want no one thinkin' I'm a groupie (it's lit)
Time zones change, now we on a first-class trip (straight up)
Don't work my nerves, you know I get moody
We fuckin', we ain't makin' love no more
You talk that talk, but it don't match it with some stroke
Wherever you are
Whatever you need
Don't call me
Don't call me!
Got another side of me, I like to get it poppin'
But these bitches in my business got me outchea choosin' violence (alright)
If you see me out in public, you don't know me, keep it silent
In the bedroom, I be screamin', but outside, I keep it quiet
Keep it on the lowski, I'm the lowest of the lowest
Wanna see if you can keep it like nobody know shit
Keep it on the lowski, I'm the lowest of the lowest
Wanna see if you can keep it like nobody knows shit (let's–)
Fuck you, real shit
I wasn't even on, fuck you
But you tryna make me look stupid
I'll slap the dog shit out of you, stop playin' with you
Tennessee Orange
Megan Moroney
Mama, I'm callin', I've got some news
Don't ya tell daddy, he'll blow a fuse
Don't worry, I'm doin' okay
I know you raised me to know right from wrong
It ain't what you think and I'm still writin' songs
Just never thought I'd see the day
I've never felt this way
I met somebody and he's got blue eyes
He opens the door and he don't make me cry
He ain't from where we're from
But he feels like home, yeah
He's got me doin' things I've never done
In Georgia, they call it a sin
I'm wearing Tennessee orange for him
Took me to Knoxville last Saturday
And I wore the hat on his dash to the game
It sure wasn't Athens but I
Fell for him under those Neyland lights
I met somebody and he's got blue eyes
He opens the door and he don't make me cry
He ain't from where we're from
But he feels like home, yeah
He's got me doin' things I've never done
In Georgia they call it a sin
I'm wearing Tennessee orange for him
Mama, forgive me, I like him a lot
Hell, I'm learning the words to Old Rocky Top
And he's got a smile that makes me forget
I've always looked better in red
But I met somebody and he's got blue eyes
He opens the door and he don't make me cry
He ain't from where we're from
But he feels like home, yeah
He's got me doin' things I've never done
I met somebody and he's got blue eyes
He opens the door and he don't make me cry
He ain't from where we're from
But he feels like home, yeah
He's got me doin' things I've never done
In Georgia, they call it a sin
And I still want the Dawgs to win
But I'm wearing Tennessee orange for him
I'm wearing Tennessee orange for him
Next Thing You Know
Jordan Davis
You swear that you're staying single
Next thing you know
You meet a girl at a bar
And next thing you know
You get her laughing
It's 2 a.m.
You're telling your buddies
Three months in
That she ain't moving in
The next thing you know
There's a U-Haul trailer
Next thing you know
Your old apartment
Is y'all's new place
There goes the carpet
But the deer head stays
Next thing you know
You're saving money like never before, just to
Spend it all at a jewelry store, getting
Down on one knee on her mama's porch
Just praying she don't say no
Next thing you know
Your best man gives a half-drunk speech and you're
Sunburnt on a honeymoon beach and you're
Left hand's getting used to that ring
And there the next two or three years go
Next thing you know
You weren't really trying
Next thing you know
There's a test on the counter
Next thing you know
She's standing there crying
Nodding her head yes
You're half excited
Half scared to death
'Cause next thing you know
You're wearing scrubs and a funny white hat and the
Doctor's saying, how you doing there dad and
Nobody's ever called you that
And you take that drive home slow
Next thing you know
It's first steps, first dates, first car, it's
11:01 wondering where they are
You're saying that USC's too far
It's amazing how fast 17 years go
Next thing you know
Next thing you know
Next thing you know
You get to know your wife again and you're
More in love than you've ever been with a
Lot of years of remember when's
And still some down the road
'Cause next thing you know
You got a yard full of your kid's kids and ya
Take them to church, teach them to fish, and ya
Tell them stories every chance you get
About how fast this life down here can go
Next thing you know
(Next thing you know)
Next thing you know
(Next thing you know)
In Ha Mood
Ice Spice
Where were you last week
When you stopped comin' by?
(Stop playin' with 'em, RIOT)
Like, damn, she in her mood (grrah)
Like, damn, she in her mood (boom)
Like, damn, she in her mood (in her mood, she in her mood)
Like, damn, she in her mood (she in her mood)
She lit, get money too (like)
Like, damn, she in her mood (she in her mood, damn)
In the mirror, I'm doin' my dance (like)
And he packin', I know by his pants (grrah)
He a rapper, but don't got a chance
Stuck in my ways so I'm lovin' my bands
Like a million views in a day (like)
It's so many ways to get paid (grrah)
I tried dippin', he begged me to stay
Bae, I'm not stayin', I just wanna play (just wanna play)
In the party, he just wanna rump (rump)
Big boobs and the butt stay plump (stay plump)
She a baddie, she know she a ten (baddie, ten)
She a baddie with her baddie friend (damn, friend)
They like, "Ice, how you always stay hot?" (Hot)
Oh, they mad 'cause I keep makin' bops (bops)
Oh, she mad 'cause I'm takin' her spot
If I was bitches, I'd hate me a lot (grrah)
Like, damn, she in her mood (grrah)
Like, damn, she in her mood (boom)
Like, damn, she in her mood (in her mood, she in her mood)
Like, damn, she in her mood (she in her mood)
She lit, get money too (like)
Like, damn, she in her mood (she in her mood, damn)
No friends, I don't fuck with the fakes (grrah)
Sayin' they love me, but wantin' my place (like)
Step in the party, I'm lookin' the baddest
So the paparazzi in my face (grrah)
Pretty bitch, but I came from the gutter
Said I'd be lit by the end of the summer (like)
And I'm proud that I'm still gettin' bigger (damn)
Goin' viral is gettin' 'em sicker
Like, what? Let's keep it a buck (huh)
Bitches too borin', got 'em stuck in a rut (damn)
Lamborghini roarin' when I hop out the truck (huh)
Pretty bitch like Lauren with a big ass butt, yup
Pretty face and the waist all gone (huh)
And I'm makin' them wait, hold on (hold on)
And I'm makin' them wait, hold on (hold on)
Wait, hold on (grrah, hold on)
Like, damn, she in her mood (grrah)
Like, damn, she in her mood (boom)
Like, damn, she in her mood (in her mood, she in her mood)
Like, damn, she in her mood (she in her mood)
She lit, get money too (like)
Like, damn, she in her mood (she in her mood, damn)
Like, damn, she in her mood (grrah)
Like, damn, she in her mood (boom)
Like, damn, she in her mood (in her mood, she in her mood)
Like, damn, she in her mood (she in her mood)
She lit, get money too (like)
Like, damn, she in her mood (she in her mood, damn)
No Time Wasted
Polo G & Future
Uh, uh-uh, uh, uh
Uh, uh-uh, uh
Uh, I know you waitin'
Gettin' fed up, you runnin' out of patience
Just keep your head up, I told you we would make it
Can't give my meds up, I see too many faces, uh
In them foreigns racin'
Best make it count, make sure it's no time wasted
I beat the trenches for my team, that was a great win
Lose when you gave it everything, that's hard to take in (uh-uh-uh)
Ayy, look, I do this shit for all my guys that got put under
Fifty shots, I'm a drummer, them guns spit thunder
One shot and he dead, call that bitch the one-hit wonder
No, we don't aim for legs, tryna make it hot all summer
As a small kid, I used to want a H2 Hummer
My uncle did a bid, they could've gave him football numbers
I went from Mandrake daily tryna work on my jumper
Now it's gang-gang, we stay on top of shit like a plunger, uh
Old heads can't trick us, drop rakes and we flick up
Blow his face, get zipped up, ambulance pick up
Uh, red cups full of liquor, let lead bust for my niggas
Come through and tear it up with them blickers
I really fed all of my killers
Uh, I know you waitin'
Gettin' fed up, you runnin' out of patience
Just keep your head up, I told you we would make it
Can't give my meds up, I see too many faces, uh
In them foreigns racin'
Best make it count, make sure it's no time wasted
I beat the trenches for my team, that was a great win
Lose when you gave it everything, that's hard to take in (uh-uh-uh)
Ayy, I'm gettin' fast money 'cause I ain't got no patience
I'm with my team 'cause they cook niggas like bacon
I did my thing trappin' out the spot, it was vacant
Lil' nigga in jail stabbin' shit up like he Jason, uh, uh, uh
Ridin' in a Porsche with some horses like I'm racin'
Bitch so bad look photoshopped when she naked
Made it out the jungle where it's scorchin' and blazin'
My dog get money and I get money, it's contagious
Went from a trapper to livin' like a popstar
I done got so rich I'm feedin' my lil' kids caviar
My bitch don't want for shit, she get whatever she want
My shooter walk you down, stand over you, make sure it's done
Uh, I know you waitin'
Gettin' fed up, you runnin' out of patience
Just keep your head up, I told you we would make it
Can't give my meds up, I see too many faces, uh
In them foreigns racin'
Best make it count, make sure it's no time wasted
I beat the trenches for my team, that was a great win
Lose when you gave it everything, that's hard to take in (uh-uh-uh)
AMG
Gabito Ballesteros, Peso Pluma & Natanael Cano
You Didn’t
Brett Young
I could never hate you even if I tried
Girl, I love you too much and I've loved you too long to fight
Mm
It ain't always a bad thing saying goodbye
And the last thing that I wanna be is a waste of your time
There's nothing I could say
To make you wanna stay
Your heart made up its mind
I don't want you to lie one more minute
You ain't done nothing wrong
I'm not where you belong
Don't let one teardrop fall
Girl, you think it's your fault
But it isn't
I fell in love and you didn't
I hope you find someone
To be the person for you
Mm
That you were to me, what I wanted to be for you
I, I ain't ever asked you to settle or compromise
Girl, you deserve everything in this world
Even if it means you can't be mine
There's nothing I could say
To make you wanna stay
Your heart made up its mind
I don't want you to lie one more minute
You ain't done nothing wrong
I'm not where you belong
Don't let one teardrop fall
Girl, you think it's your fault
But it isn't
I fell in love and you didn't
I fell in love and you didn't
Feel like I did
I gave you everything I had to give
You didn't need more
You just needed different
And there's nothing I could say
To make you wanna stay
Your heart made up its mind
I don't want you to lie one more minute
You ain't done nothing wrong
I'm not where you belong
Don't let one teardrop fall
Girl, you think it's your fault
But it isn't
I fell in love and you didn't
I fell in love and you didn't
Nonsense
Sabrina Carpenter
No
(La-la, la-la) da-ah-ah, ah
(Ah-ah, uh, uh, uh, yeah)
Think I only want one number in my phone
I might change your contact to "don't leave me alone"
You said you like my eyes and you like to make 'em roll
Treat me like a queen, now you got me feelin' thrown, oh
But I can't help myself
When you get close to me
Baby, my tongue goes numb
Sounds like bleh, blah, blee
I don't want no one else (don't want)
Baby, I'm in too deep
Here's a lil' song I wrote (a song I wrote)
It's about you and me (me)
I'll be honest
Lookin' at you got me thinkin' nonsense
Cartwheels in my stomach when you walk in
And when you got your arms around me
Ooh, it feels so good I had to jump the octave
I think I got an ex but I forgot him
And I can't find my chill, I must have lost it
I don't even know I'm talkin' nonsense
I'm talkin', I'm talkin' (ah)
I'm talkin' all around the clock
I'm talkin' hope nobody knocks
I'm talkin' opposite of soft
I'm talkin' wild, wild thoughts
You gotta keep up with me
I got some young energy
I caught the L-O-V-E
How do you do this to me?
But I can't help myself
When you get close to me
Baby, my tongue goes numb
Sounds like blah, blah, bleh, blee
I don't want no one else (don't want)
Baby, I'm in too deep (too deep)
Here's a lil' song I wrote (a song I wrote)
It's about you and me
I'll be honest (honest)
Lookin' at you got me thinkin' nonsense
Cartwheels in my stomach when you walk in (when you walk in)
When you got your arms around me
Ooh, it feels so good I had to hit the octave
I think I got an ex but I forgot him
And I can't find my chill, I must have lost it
I don't even know I'm talkin' nonsense (oh)
I'm talkin', I'm talkin', I'm talkin'
I'm talkin', I'm talkin'
Blah, blah, blah, blah, uh-uh-uh-uh
Ah
I don't even know anymore
(Oh)
This song catchier than chickenpox is
I bet your house is where my other sock is
Woke up this morning, thought I'd write a pop hit
How quickly can you take your clothes off pop quiz?
That one's not gonna make it
Most of these aren't gonna make
10:35
Tiesto Featuring Tate McRae
All I know it's 10:35
And I can feel your arms around me
Let 'em drown me
All I know it's 10:35
And I'm thanking, thanking God you found me
That you found me
Everyday, I go places in my head
Darker thoughts are harder now
They look like monsters under my bed
And every time, it's like a rocket through my chest
The TV make you think the whole world's about to end
I don't know where this night is goin' (goin')
But I know that you and me got somethin' (somethin')
So many things that I'm afraid of (afraid of)
But right now I ain't scared of nothin'
('Cause all I know is)
(All I know is, all I, all I)
'Cause all I know it's 10:35
And I can feel your arms around me
Let 'em drown me
All I know it's 10:35
And I'm thanking, thanking God you found me
That you found me
So don't you worry
About tomorrow
Don't you worry
Just pass the bottle
All I know it's 10:35
And I can feel your arms around me
Let 'em drown me
Every night, I go places in my dreams
So many never-ending alleyways
I don't know what it means
But this is it
I know the sun will wake me up
Tell me I'd be dumb to not get what I want
I don't know where this night is goin' (goin')
But I know that you and me got somethin'
So many things that I'm afraid of
But right now I ain't scared of nothin'
('Cause all I know is)
(All I know is, all I-, all I-)
'Cause all I know it's 10:35
And I can feel your arms around me
Let 'em drown me
All I know it's 10:35
And I'm thanking, thanking God you found me
That you found me
So don't you worry
About tomorrow
Don't you worry
Just pass the bottle
All I know it's 10:35
And I can feel your arms around me
Let 'em drown me (uh-oh, oh)
It's 10:35 (ooh-oh)
10:35 (uh-oh, oh)
Know it's 10:35 (uh-oh, oh)
Know it's 10:35 (uh-oh, oh)
Lost
Linkin Park
Just a scar somewhere down inside of me
Something I cannot repair
Even though it will always be
I pretend it isn't there (this is how I feel)
I'm trapped in yesterday (just stay out of my way)
Where the pain is all I know (this is all I know)
And I'll never break away (can't break free)
'Cause when I'm alone
I'm lost in these memories
Living behind my own illusion
Lost all my dignity
Living inside my own confusion
But I'm tired, I will always be afraid
Of the damage I've received
Broken promises they made
And how blindly I believed (this is all I know)
I will never break away (can't break free)
'Cause when I'm alone
I'm lost in these memories
Living behind my own illusion
Lost all my dignity
Living inside my own confusion
I try to keep this pain inside but I will never be alright
I try to keep this pain inside but I will never be alright
(I'm lost) I try to keep this pain inside but I will never be alright
(I'm lost) I try to keep this pain inside but I will never be alright
I'm lost in these memories
Living behind my own illusion
Lost all my dignity
Living inside my own confusion
Ceilings
Lizzy McAlpine
Ceilings, plaster
Can't you just make it move faster?
Lovely to be sitting here with you
You're kinda cute but it's raining harder
My shoes are now full of water
Lovely to be rained on with you
It's kinda cute but it's so short
Then you're drivin' me home
And I don't wanna leave
But I have to go
You kiss me in your car
And it feels like the start of a movie I've seen before
Before
Bedsheets, no clothes
Touch me like nobody else does
Lovely to just lay here with you
You're kinda cute and I would say all of this
But I don't wanna ruin the moment
Lovely to sit between comfort and chaos
But it's over
Then you're drivin' me home
And it kinda comes out as I get up to go
You kiss me in your car
And it feels like the start of a movie I've seen before
But it's not real
And you don't exist
And I can't recall the last time I was kissed
It hits me in the car
And it feels like the end of a movie I've seen before
Before
Trance
Metro Boomin, Travis Scott & Young Thug
Did you forget?
Do it for life
Chicago that time
All bullshit aside
Wonderful vibe
Wonderful night
Did it with Trav
All the kids, you and I
Off in this club
Bumpin and grindin
Who made it flood
You see the signs (signs)
We pulled out the feathers for this type of weather
She pulled to club, to buss up a dub
She came with her man, I called in a sub
She givin' out hugs, we know 'bout them hugs
She put in my hand, don't know what it was
She know some of the fam, but don't know enough
My trust is In God We Trust
Sippin' on wok, don't do tuss
She got her own fans, she need her a bus
Might give her a chance, it's givin' her
Out in a trance it's givin' her
Not on them Xans is givin her
Nigga with bands is givin her
A nigga with plans is givin her
Still in the gym, ain't did the implants
I like that for real, ain't givin up
Like they know that you real, they give it up
Like if you got the steel, they give it up
Takin these M's they givin us
Then run in the field like it's ten of us
I'm cleanin shit out like an enema
I make that shit look like a cinema
Take off the top, baby let's ride
I'm with my dogs, I picked a side
She want the boss, the one own the tribe
(I own the tribe yeah)
Arm out the window just throw it when we ride
I bent the corner scraped the wheels and the tires
Put twenty hoes on a boat til they tired
Everybody on
You know you need me my nigga, just keep the shit real
Don't you crab with your song
Who else fuck up the city like us?
When it rains, it's a thunderstorm
I party at Shabba, in New York and LA
Where they keep on going till the dawn (shabba)
$200k what I'm on
She licking all down my chest
I told her I ain't slime bae call me sex
It ain't no dope where I put these racks at
If you my hoe I call you sexy
Got LA shit so bad it's dangerous
I backed out of the knot, she tried to tangle up
She got Paris manners and they so dangerous
I'm in a trance, it's givin' her
I'm in a trance, it's givin' her
I'm in a trance, it's givin' her
I'm in a trance, it's givin' her
I'm in a trance, it's givin' her
I'm in a trance, it's givin' her
I'm in a trance, it's givin' her
I'm in a trance, it's givin' her
I move so far in time
I move so far in time
I've been
I've been whipped up in whip
With my fist up in drive
I've been fucked up in time
I've been fucked up in night
I've been working in time
I've been... inside
I've been inside
I've been just like
It's this life
It's this life
It's this life
That’s What Tequila Does
Jason Aldean
It don't take too much
To get the wheels turning 'round and 'round in my head
Top shelf or Cuervo
I know it's gonna stir up every memory she left
Still burns, I can't lie
When I think about the good before goodbye
Guess that's what I'm drinking on tonight
It'll make you think that
You got a shot at the one that got away when it goes down
She's gonna tell you, she's gonna come back
And that's what makes you stay for another round
Keep you stickin' around, keep pourin' out
Until she's all you're thinkin' 'bout
It'll keep you hung up, keep you drunk on what it was
Man, that's what tequila does
It sneaks up on you
Going out for one turns into 2 AM
You know what's next, get you looking for a text
Get you checking your phone again
Make you stay all night, it'll give you a million reasons why
It'll make you think that
You got a shot at the one that got away when it goes down
She's gonna tell you, she's gonna come back
And that's what makes you stay for another round
Keep you stickin' around, keep pourin' out
Until she's all you're thinkin' 'bout
It'll keep you hung up, keep you drunk on what it was
Man, that's what tequila does
Whoa
Yeah, that's what tequila does
Whoa
It still burns (still burns), I can't lie (I can't lie)
Oh, guess that's what I'm drinking on tonight (whoa)
It'll make you think that
You got a shot at the one that got away when it goes down
She's gonna tell you, she's gonna come back
And that's what makes you stay for another round
Keep you stickin' around, keep pourin' out
Until she's all you're thinkin' 'bout
It'll keep you hung up, keep you drunk on what it was
Man, that's what tequila does
Whoa
Yeah, that's what tequila does
Whoa
Here With Me
D4vd
Can I tell you something just between you and me?
When I hear your voice, I know I'm finally free
Every single word is perfect as it can be
And I need you here with me
When you lift me up, I know that I'll never fall
I can speak to you by saying nothing at all
Every single time, I find it harder to breathe
'Cause I need you here with me
Every day
You're saying the words that I want you to say
There's a pain in my heart and it won't go away
Now I know I'm falling in deep
'Cause I need you here with me
Every day
You're saying the words that I want you to say
There's a pain in my heart and it won't go away
Now I know I'm falling in deep
'Cause I need you here with me
I think I see your face in every place that I go
I try to hide it, but I know that it's gonna show
Every single night, I find it harder to sleep
'Cause I need you here with me
Every day
You're saying the words that I want you to say
There's a pain in my heart and it won't go away
Now I know I'm falling in deep
'Cause I need you here with me
Every day
You're saying the words that I want you to say
There's a pain in my heart and it won't go away
Now I know I'm falling in deep
'Cause I need you here with me
Can I tell you something just between you and me?
When I hear your voice, I know I'm finally free
Every single word is perfect as it can be
'Cause I need you here with me
PRC
Peso Pluma X Natanael Cano
Everything I Love
Morgan Wallen
I wish I woulda met you anywhere but where I did
Some ol' high rise town that I won't ever go again
Wish we woulda rolled around in some old cab and chased them city lights
And hit bars I don't like
We were listenin' to One More Silver Dollar
Hanging out my Silverado
Down a road I love to ride
Wish I woulda known that by now, you'd be good and gone
And you'd leave us in a cloud of dust
And can't you see what you're doing, girl?
You ruined damn near everything I love
I don't care how much they're bitin'
I won't even crank the boat
Soon as that bobber hits the water
Girl, your memory starts to float
Baby, why'd I ever take the bait
And take you places that I love to go?
Hell, I'll never know
I even took you to my hometown to meet my mama
Now, I'm gonna see you every time I see that welcome sign
Wish I would've known that by now, you'd be good and gone
And you'd leave us in a cloud of dust
And can't you see what you're doing, girl?
You ruined damn near everything I love
I don't wanna hear One More Silver Dollar
I can't take my Silverado down them roads we used to ride
Wish I would've known that by now, you'd be good and gone
And you'd leave us in a cloud of dust
Can't you see what you're doing, girl?
You ruined damn near everything I love
I can't go nowhere near the whiskey
'Cause you used to drink it with me
In the bed of my truck
And now I can't get drunk
Can't you see what you're doing, girl?
You ruined damn near everything I love
Can't you see what you're doing, girl?
You ruined damn near everything I love
Forever
Lil Baby Featuring Fridayy
It may not mean nothing to y'all,
Understand nothing was done for me,
So I don't plan on stopping at all,
I want this sh-t forever man, ever man, ever man,
I'm shutting sh-t down in the mall,
And telling every girl she the one for me,
And I ain't even planning to call,
I want this sh-t forever man, ever man, ever man
Last name ever,
First name greatest,
Like a sprained ankle boy I ain't nuttin to play with,
Started off local, but thanks to all the haters,
I know G4 pilots on a first name basis,
And your city faded off the brown, Nino,
She insists she got more class, we know!
Swimming in the money come and find me, Nemo,
If I was at the club you know I ball'd, chemo,
Drop the mixtape that sh-t sounded like an album
Who'd have thought a country wide tour would be the outcome
Labels want my name beside the X like Malcolm
Everybody got a deal, I did it without one,
Yeah n-gga I'm about my business,
Killing all these rappers you would swear I had a hit list,
Everyone who doubted me is asking for forgiveness,
If you aint been a part of it at least you got to witness,
B-tches,
It may not mean nothing to y'all,
Understand nothing was done for me,
So I don't plan on stopping at all,
I want this sh-t forever man, ever man, ever man,
I'm shutting sh-t down in the mall,
And telling every girl she the one for me,
And I ain't even planning to call,
I want this sh-t forever man, ever man, ever man
Ever ever, Mr West is in the Building,
Aint no question who about to kill em,
I used to have hood dreams,
Big fame, big chains,
I stuck my d-ck inside this life until that b-tch came,
I went hard all Fall like the ball teams,
Just so I can make it rain all spring,
Y'all seen my story my glory,
I had raped the game young,
You can call it statutory,
When a n-gga blow up they gon build statues of me
Old money Benjamin Button, whaat, nuttin,
Now superbad chicks giving me McLovin,
You would think I ran the world like Michelle's husband,
You would think these n-ggas would know me when they really doesn't
Like they was down with the old me, no you f-cking wasn't,
You're such a f-cking loser,
He didn't even go to class Bueller,
Trade the Grammy plaques just to have my granny back,
Remember she had that bad hip like a fanny pack,
Chasing that stardom would turn you into a maniac,
All the way in Hollywood and I can't even act,
They pull their cameras out and God damn he snapped,
I used to want this thing forever y'all can have it back,
It may not mean nothing to y'all,
Understand nothing was done for me,
So I don't plan on stopping at all,
I want this sh-t forever man, ever man, ever man,
I'm shutting sh-t down in the mall,
And telling every girl she the one for me,
And I ain't even planning to call,
I want this sh-t forever man, ever man, ever man
Ok, hello its da martian,
Space jam Jordan's,
I want this sh-t forever wake up and smell the Garden,
Fresher than the harvest
Step up to the target,
If I had one guess then I guess I'm just New Orleans,
And I will never stop like i'm running from the cops,
Hopped up in my car and told my chauffeur "to the top",
Life is such a f-cking roller coaster then it drops,
But what should I scream for this is my theme park,
My minds shine even when my thoughts seem dark,
Pistol on my side you don't wanna hear that thing talk,
Let the King talk, check the price and pay attention,
Lil Wayne thats what they got to say or mention,
Im like Nevada in the middle of the summer,
I'm resting in the lead I need a pillow and a cover,
Ssshhh, my foot's sleeping on the gas,
No brake pads no such thing as last- huh,
It may not mean nothing to y'all,
Understand nothing was done for me,
So I don't plan on stopping at all,
I want this sh-t forever man, ever man, ever man,
I'm shutting sh-t down in the mall,
And telling every girl she the one for me,
And I ain't even planning to call,
I want this sh-t forever man, ever man, ever man
There they go, packin' stadiums
As Shady spits his flow,
Nuts they go, macadamian they go so ballistic whoa,
We can make them look like bozo's,
He's wondering if he should spit this slow,
F-ck no go for broke,
His cup just runneth over oh no
He aint had a buzz like this since the last time that he overdosed,
They've been waiting patiently for Pinnochio to poke his nose,
Back into the game and they know,
Rap will never be the same as before,
Bashin' in the brains of these hoes,
And establishing a name as he goes,
The passion and the flame is ignited,
You can't put it out once we light it,
This sh-t is exactly what the f-ck that I'm talking about when we riot,
You dealin with a few true villians
Who stand inside of the booth truth spillin,
And spit true feelings, until our tooth fillings come flying up out of our mouths
Now rewind it
Payback muthaf-cka for the way that you doubted me so how's it taste?
When I slap the taste out your mouth with the bass so loud that it shakes the place,
I'm Hannibal Lecter so just in case you're thinking of saving face,
You ain't gonna have no face to save by the time I'm through with this place,
So Drake...
It may not mean nothing to y'all,
Understand nothing was done for me,
So I don't plan on stopping at all,
I want this sh-t forever man, ever man, ever man,
I'm shutting sh-t down in the mall,
And telling every girl she the one for me,
And I ain't even planning to call,
I want this sh-t forever man, ever man, ever man
No Se Va
Grupo Frontera
Lottery
Latto Featuring LU KALA
Wait, (Renegade), go, go, go, go
Go, go, go, go
Go, go, go, go
Go, go, go, go, let's go
Cash on me, like I hit the lottery (the lottery)
Hoes will trip, watch them how they follow me (wait)
Hunnids blue, yeah I got them all on me
Go, go, go, go, let's go
Prada shoes, yeah I keep a style on me (style on me)
Pretty freaks, make them bitches pile on me (I swear)
Rack party, I got thirty thou' on me (right now)
Go, go, go, go, let's go
Too much cake, all these bitches want a piece (I want a piece)
Hunnid racks, I threw ten on top of my teeth (racks)
Diamond choker, VVS, I can't breathe (I can't breathe)
Go, go, go, go, let's go
Saint Laurent, Marmier, Givenchy (Givenchy)
Gucci goggles, Gucci buckles, Gucci skis
You gon' cuff that ho, we know that ho a freak
Go, go, go, go, let's go (can't breathe)
Cash on me, like I hit the lottery (the lottery)
Hoes will trip, watch them how they follow me (wait)
Hunnids blue, yeah I got them all on me
Go, go, go, go, let's go
Prada shoes, yeah I keep a style on me (style on me)
Pretty freaks, make them bitches pile on me (I swear)
Rack party, I got thirty thou' on me (right now)
Go, go, go, go, let's go
I know shooters, in DC, Bradley Bill
Shawty thick, she say, "I don't miss no meals" (no meals)
Buddy broke, you know you can't front the bills (you broke)
Go, go, go, go, let's go
Shrimp and lobsters, his and hers, it's a date (yum)
We gon' eat, she gon' eat, it's a plate (it's a plate)
Make them racks, spend them racks, give or take (give or take)
Go, go, go, go, let's go
Cash on me, like I hit the lottery (the lottery) (racks)
Hoes will trip, watch them how they follow me (wait)
Hunnids blue, yeah I got them all on me
Go, go, go, go, let's go (let's go)
Prada shoes, yeah I keep a style on me (style on me)
Pretty freaks, make them bitches pile on me (I swear)
Rack party, I got thirty thou' on me (right now)
Go, go, go, go, let's go
Wait, wait, wait, wait
Wait, wait, wait, uh
Wait, wait, wait, wait
Wait, wait, wait, uh
The Color Violet
Tory Lanez
I took my drugs and took my lovin' when I left out the spot
I left the party with a Barbie, markin' X on the dot
She calls my phone up but I told her, "I'm a loner" (uh)
But she likes my watch and my droptop and my persona (uh)
We hit the highway, 1-5-5, with my whole foot on the dash
She's in my ear, she's got no fear, she could care less if we crash
But on my radar, I've got some nerve to play hard
I've waited for my chance, but playboys we don't dance, dance, dance
I gave my heart (uh)
Speedin' car goin' ninety in the rain
She took my heart, filled it with nothin' but pain
This beat in my hands is not for romance
I wanna stay but, playboys, we don't dance, dance, dance
So I won't dance again, oh, baby
No, I won't dance again, ooh, yeah (uh)
No, I won't dance again
No, I won't dance again
Pretty baby, ooh
Oh, face in the daylight, wastin' time on the stars in the sky
She's got my pager, play games of love all on my eyes
Then I'm reminded, love don't come 'til you find it
I just hope that it's workin', I'm yearnin', I'm searchin', uh
The afterparty was on Wilson and 73rd
You got the notion that somebody else was with me first
But on my radar, you had some nerve to play hard
You took away my chance, but playboys we don't dance, dance, dance
I gave my heart (uh)
Speedin' car goin' ninety in the rain
She took my heart, filled it with nothin' but pain
This beat in my hands is not for romance
I wanna stay but, playboys, we don't dance, dance, dance
So I won't dance again, oh, baby
No, I won't dance again, ooh, yeah (uh)
No, I won't dance again
No, I won't dance again
Pretty baby, ooh
Human
Cody Johnson
I'm only human
I'm only, I'm only
I'm only human, human
Maybe I'm foolish
Maybe I'm blind
Thinking I can see through this
And see what's behind
Got no way to prove it
So maybe I'm blind
But I'm only human after all
I'm only human after all
Don't put your blame on me
Don't put your blame on me
Take a look in the mirror
And what do you see
Do you see it clearer
Or are you deceived
In what you believe
'Cause I'm only human after all
You're only human after all
Don't put the blame on me
Don't put your blame on me
Some people got the real problems
Some people out of luck
Some people think I can solve them
Lord heavens above
I'm only human after all
I'm only human after all
Don't put the blame on me
Don't put the blame on me
Don't ask my opinion
Don't ask me to lie
Then beg for forgiveness
For making you cry
Making you cry
'Cause I'm only human after all
I'm only human after all
Don't put your blame on me
Don't put the blame on me
Oh, some people got the real problems
Some people out of luck
Some people think I can solve them
Lord heavens above
I'm only human after all
I'm only human after all
Don't put the blame on me
Don't put the blame on me
I'm only human
I make mistakes
I'm only human
That's all it takes
To put the blame on me
Don't put the blame on me
I'm no prophet or Messiah
Should go looking somewhere higher
I'm only human after all
I'm only human after all
Don't put the blame on me
Don't put the blame on me
I'm only human
I do what I can
I'm just a man
I do what I can
Don't put the blame on me
Don't put your blame on me
Love You Better
Future
Could this thing be more?
You can never-, this could be yours
If it's really meant to-, baby, baby
'Cause I'm interested, baby, baby
ATL Jacob, ATL Jacob
You tellin' me you fallin' out of love with me
Hope you can find someone to love you better than I did
Takin' our memories on love and treatin' it like nothin'
Takin' our memories on love and treatin' it like gossip
It's my love from my grandmother make me gentle when I care for you
Tell me you fallin' out of love, it's breakin' my heart in two
I just don't wanna sit and pray, baby
I almost like it didn't happen to make you happy
You tellin' me you fallin' out of love with me
Hope you can find someone to love you better than I did
You tellin' me you fallin' out of love with me
(You tellin' me you fallin' out of love with me)
Hope you can find someone to love you better than I did
(Hope you can find someone to love you better than I did)
Could this thing be more?
You tellin' me you fallin' out of love with me
It's my love from my grandmother make me gentle when I care for you
(Hope you can find someone to love you better than I did)
If it's really meant to-, baby, baby
'Cause I'm interested, baby, baby
Dancin’ In The Country
Tyler Hubbard
Them neon lights
Look good on you, so good on you
But we ain't got
No room to move, no room to move
We need some space and I know a place
Outside of town where it don't ever close down
No, we never close down
Yeah, we can two step
Put your boots on, baby, we can do that
Girl, cut a little loose, you can move that
Move that, ah yeah
I'll take you dancin' in the country
Levi's in them low beams
Spin you in some red dirt
Sweep you off of both feet
Out here where the sun sets
Silverado backbeat
You'll never wanna go home
And never wanna not be
Dancin' in the country
Dancin' in the country
Yeah, it goes left, right, left
You makin' me wanna sway
You makin' me wanna play
Some Alabama and Jackson
Got you 'round my neck
You makin' me wanna stay
You makin' me wanna lay you down here in the pasture
Yeah, we can two step
Put your boots on, baby, we can do that
Girl, cut a little loose, you can move that
Move that, ah, yeah
I'll take you dancin' in the country
Levi's in them low beams
Spin you in some red dirt
Sweep you off of both feet
Out here where the sun sets
Silverado backbeat
You'll never wanna go home
And never wanna not be
Dancin' in the country
Dancin' in the country
Oh, yeah
I know you feel that heat
Watermelon summer
Get them Luccheses
Stompin' like the thunder
We can two step
Put your boots on, baby, we can do that
Girl, cut a little loose, you can move that (woo)
Move that, ah, yeah
I'll take you dancin' in the country
Levi's in them low beams
Spin you in some red dirt
Sweep you off of both feet (ah, yeah)
Out here where the sun sets
Silverado backbeat
You'll never wanna go home
And never wanna not be
Dancin' in the country (don't stop, don't stop, keep it movin')
Oh, I'm dancin' in the country (don't stop, don't stop, keep it movin')
Dancin' in the country (don't stop, don't stop, keep it movin') (oh, yeah)
Dancin' in the country (don't stop, don't stop, keep it movin')
Oh, I'm dancin' in the country
Let's go
Hey Mor
Ozuna Featuring Feid
OMG
NewJeans
Yandel 150
Yandel & Feid
X Si Volvemos
Karol G x Romeo Santos
Too Many Nights
Metro Boomin Featuring Don Toliver & Future
Keep the bitch jump, uh-uh
Keep it on jump, uh-uh (jump)
Keep the bitch ju-u-ump
I caught it cool, for a ten
The bitch get loose, she tryna win
I beat her by the house, I beat her in
There's forty in the couch, I let her spend
When the car's lit, better call in
She done popped all out, she done called twin
I done went too spazzed out, I put the raw in
I done hit the strip club and spent a tall ten
Lil' shawty off the Clicquot
She been comin' hot just like a heat stroke (heat stroke)
I could see you lurkin' through the peephole
I'm stackin' different money, type of C notes (C notes)
I'm talkin' C notes, nigga, hit C notes
You spend what you want and you get what you want
I guess you got what you wanted
You're hittin' the pole and you give it your all
Now, you keepin' it honest (Yeah)
It's too many nights I went nameless
It's too many nights I went famous
It's too many nights I went brainless
Sayin', "Uh-uh-uh-uh" (yeah)
Let's get drunk, uh-uh
Keep the bitch jump, uh-uh
Keep the bitch jump, uh-uh (jump)
Keep the
I caught it cool, for a ten
The bitch get loose, she tryna win
I beat her by the house, I beat her in
There's forty in the couch, I let her spend
You made a hundred and you fall back
Need you on a call back
Knowin' that you're all that, bae
Oh, it's two-hundred on your dashboard
Stampin' out your passport
Ask me if I'm really okay
You get what you want, you want, you want
You get what you want, you want, you want
You get what you want, you want, you want
You get what you want, you want, you want
You spend what you want and you get what you want
I guess you got what you wanted
You're hittin' the pole and you give it your all
Now, you keepin' it honest (yeah)
It's too many nights I went nameless
It's too many nights I went famous
It's too many nights I went brainless
Sayin', "Uh-uh-uh-uh" (yeah)
Let's get drunk, uh-uh
Keep it on jump, uh-uh (jump)
Keep it on jump, uh-uh
Ooh-ooh, ooh-ooh (Keep it on jump, uh-uh)
Ooh-ooh, ooh-ooh (Keep it on jump, uh-uh)
Ooh-ooh, ooh-ooh (Keep it on jump, uh-uh)
Ooh-ooh (haha)
Bottega Veneta whenever you ride with me
It ain't like I'm askin' you to ride for free
From trappin' to rappin', need to be proud of me (proud of me)
Pack out the studio and throw parties (throw parties)
Money comin' too fast, I can't slow it (I can't slow it)
Feel like I'm runnin' from my past, I can't slow down
Too many nights, 'bout to crash (skrrt)
Now I'm buyin' the foreigns, all cash
I can't slow down
Blind
SZA
Niggas want me to get ratchet
Niggas want me to attack it
Put the hood on, now they callin' me Cassius
Raunchy like Bob Saget
Greedy, I can't pass it
Eatin' everything, nigga, no fasting
I don't care how much you knew me in the past tense
I ain't no Julia Stiles, this ain't no last dance, way past it
Way
Fuckin' on my ex 'cause he validate me
Fuckin' up a check, I don't want no receipt
Might get possessed, let my spirit speak freely
Hey, my past can't escape me
My pussy precedes me
My, my, how the times change
I'm still playin' the victim
And you still playin' the pick-me
It's so embarrassing
All of the things I need living inside of me
I can't see it
It's so embarrassing
All of the love I seek living inside of me
I can't see, I'm blind
Blind, blind
Blind, blind
Blind, blind
Blind, blind
Blind, blind
You ain't getting your bitch back
Calm down, shit could be worse, never say that
I don't wanna pipe down, rather get payback
Mama told me, "Never shit where you lay at"
I don't want righteousness
I hurt too much, I lost too much, I lust too much
I hit my clutch and vroom
Third day, pop out the tomb
I like when you pull your gun at the red light
I like all that violence, give me dysfunction
I like when you come, never stay the whole night
Better when you high, never tell me I'm wrong
'Cause my past can't escape me
My pussy precedes me
My, my, how the times change
You still talkin' 'bout babies
And I'm still taking the Plan B
It's so embarrassing
All of the things I need living inside of me
I can't see it
It's so embarrassing
All of the love I seek living inside of me
I can't see, I'm blind
Blind, blind
Blind, blind
Blind, blind
Blind, blind
Blind, blind
Trustfall
P!nk
Picture a place where it all doesn't hurt
Where everything's safe and it doesn't get worse
Oh my
We see through bloodshot eyes
Picture a place, somewhere else far away
Where you know what they mean and they mean what they say
To us
And would that be enough?
Are we runnin' out of time?
Are we hidin' from the light?
Are we just too scared to fight
For what we want tonight?
Close your eyes and leave it all behind
Go where love is on our side
It's a trust fall, baby
It's a trust fall, baby
You and I and everyone, alive
We can run into the fire
It's a trust fall, baby
Yeah, it's a trust fall, baby
Jump with me, come with me, burn like the sun
We'll talk, then we'll cry, then we'll laugh 'til we're done
Oh my
It's like we're out our minds
We've been runnin' for our lives
We've been hidin' from the light
We've been far too scared to fight
For what we want tonight
Close your eyes and leave it all behind
Go where love is on our side
It's a trust fall, baby
It's a trust fall, baby
You and I and everyone, alive
We can run into the fire
It's a trust fall, baby
Yeah, it's a trust fall, baby
What if we just fall?
I'm not goin' without you
(And you're not goin' alone)
I fell so far 'til I found you
(But you know what you know, when you know)
So I'm not goin' without you
(And you're not goin' alone)
'Cause you know when you know
Close your eyes and leave it all behind
Go where love is on our side
It's a trust fall, baby
It's a trust fall, baby
What if we just fall?
What if we just fall?
What if we just fall?
What if we just fall?
What if we just fall?
What if we just fall?
What if we just-
Do It Again
NLE Choppa & 2Rare
Walking around these walls
I thought by now they'd fall
But You have never failed me yet
Waiting for change to come
Knowing the battle's won
For You have never failed me yet
Your promise still stands
Great is Your faithfulness, faithfulness
I'm still in Your hands
This is my confidence
You've never failed me yet
I know the night won't last
Your word will come to pass
My heart will sing Your praise again
Jesus You're still enough
Keep me within Your love, oh
My heart will sing Your praise again
(Oh, yes, it will)
Your promise still stands
Great is Your faithfulness, faithfulness
I'm still in Your hands
This is my confidence
You've never failed
Your promise still stands
Great is Your faithfulness, faithfulness
I'm still in Your hands
This is my confidence
You've never failed me yet
Never failed me yet
Oh, oh-oh
I've seen You move, You move the mountains
And I believe I'll see You do it again
You made a way, where there was no way
And I believe I'll see You do it again
I've seen You move, You move the mountains
And I believe I'll see You do it again
You made a way, where there was no way
And I believe I'll see You do it again
I've seen You move, You move the mountains
And I believe I'll see You do it again
You made a way, where there was no way
And I believe I'll see You do it again
I'll see You do it again
Your promise still stands
Great is Your faithfulness, faithfulness
I'm still in Your hands
This is my confidence
You've never failed
Your promise still stands
Great is Your faithfulness, faithfulness
I'm still in Your hands
This is my confidence
You've never failed me yet
Oh, You've never failed me yet
And I never will forget
You've never failed me yet
And I never will forget
Slut Me Out
NLE Choppa
Yeah, ETB
Why you being weird to me?
Ayy, rip off my shirt if you love me (love me)
Spit in my face when you fuck me (fuck me)
Play with my gooch, while you suck me (suck me)
Eat the dick like you was ugly
I mean, hold on, wait
Where your friend? Bring your buddy (your buddy)
I don't think that you enoughie (enoughie)
Her favorite thing to say is, "Cuff me"
Slut me out (out)
Slut me out (out)
Slut (slut), slut (slut)
Slut me out
Rip off my shirt if you love me (sexy)
Spit in my face when you fuck me (come sex me)
Play with my gooch, while you suck me (don't text me)
Eat the dick like you was ugly (don't text me)
Big dick energy, I give it (I give it)
Don't believe me, then come feel it (come feel it)
Gon' put this here in your kidney, please (please)
And hush it like some kidney beans
Suck my balls, come chickpea me
Why you being weird to me? (Weird to me)
Put your ass in my face 'til I get pink eye
Fuck you anywhere, I'm that type guy (that type guy)
At the church, on the plane, at the basketball game
I don't care, I'ma bust my nut 'til I die ('til I die)
What position do I like? All of 'em, baby (Baby)
Put it on camera, masturbate to it later (to it later)
Ever sucked a vegan dick? Baby, come taste me
Promise that my nut taste like sugar gravy
Don't cum quick, I control my bladder (control my bladder)
Dick real big, come climb my ladder (my ladder)
Fat coochies, little coochies, all coochies matter (they matter)
Ass real fat, I can make it get fatter (fatter)
Wanna see a magic trick? Bend over backwards (Over backwards)
Meat to meat, wall to wall
Coochie to my balls, dawg
Ayy, rip off my shirt if you love me (love me)
Spit in my face when you fuck me (fuck me)
Play with my gooch, while you suck me (suck me)
Eat the dick like you was ugly
I mean, hold on, wait
Where your friend? Bring your buddy (your buddy)
I don't think that you enoughie (enoughie)
Her favorite thing to say is, "Cuff me"
Slut me out (out)
Slut me out (out)
Slut (slut), slut (slut)
Slut me out
La Jumpa
Arcangel & Bad Bunny
Shut Up My Moms Calling
Hotel Ugly
I just wanna rewind
I haven't seen you in a long time
You got me feelin' so lonely
Even when you come through
I can tell that it isn't you
So baby, bring it in closely
Hate the way I love you, but you're so sweet
I always find a way to say the wrong things
I wish that we were layin' in the same sheets
But lately, you've been actin' like you hardly know me
Baby, come home
(So baby, won't you say somethin'?)
Home
Baby, come home
(So baby, won't you say somethin'?)
Home
Baby, come home
(So baby, won't you say somethin'?)
Home
Baby, come home
(So baby, won't you say somethin'?)
Home
I just wanna rewind
I haven't seen you in a long time
You got me feelin' so lonely
Even when you come through
I can tell that it isn't you
So baby, bring it in closely
Hate the way I love you, but you're so sweet
I always find a way to say the wrong things
I wish that we were layin' in the same sheets
But lately, you've been actin' like you hardly know me
I've only recently began to fall
I feel the need to go and waste it all
I tried to numb away the pain
I hope someone is watchin' me, watchin' me, watchin' me
Baby, come home
(So baby, won't you say somethin'?)
Home
Baby, come home
(So baby, won't you say somethin'?)
Home
Baby, come home
(So baby, won't you say somethin'?)
Home
Baby, come home
(So baby, won't you say somethin'?)
Home
Gold
Dierks Bentley
Thank you for comin' home
Sorry that the chairs are all worn
I left them here, I could have sworn
These are my salad days
Slowly being eaten away
Just another play for today
Oh, but I'm proud of you, but I'm proud of you
Nothing left to make me feel small
Luck has left me standing so tall
Gold
Always believe in your soul
You've got the power to know
You're indestructible
Always believe in
'Cause you are gold
I'm glad that you're bound to return
Something I could have learned
You're indestructible
Always believe in
After the rush has gone
I hope you find a little more time
Remember we were partners in crime
It's only two years ago
The man with the suit and the face
You knew that he was there on the case
Now he's in love with you, he's in love with you
Love is like a high prison wall
But you don't leave me standing so tall
Gold
Always believe in your soul
You've got the power to know
You're indestructible
Always believe in, 'cause you are
Gold
I'm glad that you're bound to return
There's something I could have learned
You're indestructible
Always believe in
And love is like a high prison wall
And you could leave me standing so tall
Gold
Always believe in your soul
You've got the power to know
You're indestructible
Always believe in
'Cause you are gold
I'm glad that you're bound to return
Something I could have learned
You're indestructible
Always believe in, oh
Always believe in your soul
Soul, soul
Soul, soul, soul
Back End
Finesse2Tymes
Finesse, two of 'em
It's cool when they do it (it's cool, huh?), it's a problem when I do it
Fuck 'em (fuck 'em)
Birds of a feather (what?), they flock together
That make you a sucker (a sucker)
I don't fuck with nobody (none of 'em), if you ain't mob ties
Then it's fuck you (fuck you)
I don't want nothin' but some money (some money)
But if they get with this shit, I'ma flush 'em
I'm in South Carolina (what you doing?), I'm looking for Renny
She know I'm some pimpin' though (yeah, she know that)
She gotta be thick as hell, big player
I can't do nothin' with no skinny ho (oh, nah)
Bought a Glock 47, that bitch look like a D.E
I hit it with both hands
I sign my own check (I'm CEO), better stay in your lane
Your lil' bitty broke ass (you broke)
They hate that they left me (they hate it), they regret what they said (they see)
But I don't accept it (I don't)
They claimin' he bought it (huh?), the whole time it was somebody else's (that's crazy)
Finesse bringing real back (the real), he be speaking his mind, that boy with the shit (he lit)
These rap niggas pitiful (I'm so sick of 'em), they be ready to kill you over a bitch
Hold up, I come around, niggas go put they ho up (go put they ho up)
Quit asking questions, bitch, shut up and roll up (roll up)
She eat the dick 'til she throw up, ugh
He tried to diss me to blow up (boom)
I'm VVS to the floor up, ooh
Pinkie ring look like a doughnut
I do this shit for them niggas who solid and free all them real ones who jailin' in Beaumont
Damn, they can't do nothin' with 'em
Beefin' with who? It ain't nothin' with 'em (none)
I'm hearin' the slick shit (slick shit), tote nothin' but big shit
You can get slumped with 'em (slumped with 'em)
The Draco is under me (yeah), that bitch in my lap now
Come and get 'em (lap now, come and get 'em)
You think I ain't on point (you stupid), run up if you want
Shoot this bitch through the window (bah, bah, bah)
They like, "Ooh, he violent" (shh), they ain't dropping nothin', ooh, they silent (shh)
They making up rumors, don't nobody believe 'em, that shit don't surprise me (it don't)
My side bitch main homie (on God), she play her position, ain't doing no snitchin' (shh)
Remember I was broke (remember), now they be like, "Hell nah, look at Ricky" (lil' Deebo)
Don't act like you know a nigga
All them years I did, you ain't wrote a nigga
Don't be speakin' on me if you owe a nigga
I'll catch him in traffic and ho the nigga (ho the nigga)
I put that on my grandmammy (I put that on my grandmammy)
I was gone, but I'm back at it (now I'm back at it)
I ain't cool, I just act happy (cool, I just act happy)
Mini Drac', I can backpack it (bitch, I can tote it)
Say they got a bounty on who? (On who?)
They say they got prices on who head? (On who head?)
32 shots in this Glock 17, and bitch, it'll be thirty-two dead, on me
I hit that bitch with my foot in her neck (foot in her neck)
Sleepin' on me, now a crook in your neck (crook in your neck)
$30K for a 20 minute set
Put some baguettes in the Cartier specs, I'm back in (I'm back)
They ain't fuck with me back then (they was sleeping)
Now they hollerin' 'bout, "Tap in" (hollerin' 'bout, "Tap in")
I pulled up with a MAC-10 (pulled up with a-, rrrt)
If she give you some pussy and you tell the business
That mean you a rat then (then you'll tell somethin')
I don't want your bottles, I don't want your weed, just give me my backends
I'm goin' back in (I'm goin' back in)
|
Blabla/Pipipopo
|
[] | null |
{
"architectures": null,
"model_type": null,
"task_specific_params": {
"conversational": {
"max_length": null
},
"summarization": {
"early_stopping": null,
"length_penalty": null,
"max_length": null,
"min_length": null,
"no_repeat_ngram_size": null,
"num_beams": null,
"prefix": null
},
"text-generation": {
"do_sample": null,
"max_length": null
},
"translation_en_to_de": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_fr": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_ro": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
}
}
}
| 0 | null |
---
tags:
- Taxi-v3
- q-learning
- reinforcement-learning
- custom-implementation
model-index:
- name: q-Taxi_v3-version2
results:
- task:
type: reinforcement-learning
name: reinforcement-learning
dataset:
name: Taxi-v3
type: Taxi-v3
metrics:
- type: mean_reward
value: 7.56 +/- 2.71
name: mean_reward
verified: false
---
# **Q-Learning** Agent playing **Taxi-v3**
This is a trained model of a **Q-Learning** agent playing **Taxi-v3**.
## Usage
```python
import gym  # `load_from_hub` is the helper defined in the Deep RL Course notebook (Unit 2)

model = load_from_hub(repo_id="TalesLF/q-Taxi_v3-version2", filename="q-learning.pkl")

# Don't forget to check if you need to add additional attributes (is_slippery=False etc.)
env = gym.make(model["env_id"])
```
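Once the Q-table is loaded, the greedy policy just takes the arg-max action at each state. A minimal, library-agnostic sketch (it assumes the course's saved dict exposes the table under `model["qtable"]` and an env with the classic `gym` `reset()`/`step()` signatures):

```python
import numpy as np

# Hedged sketch of a greedy rollout with a tabular Q-function. Works with any
# env exposing the classic gym API: reset() -> state, step(a) -> (s, r, done, info).
def greedy_rollout(env, qtable, max_steps=99):
    state = env.reset()
    total_reward = 0.0
    for _ in range(max_steps):
        action = int(np.argmax(qtable[state]))  # exploit only, no epsilon
        state, reward, done, _ = env.step(action)
        total_reward += reward
        if done:
            break
    return total_reward
```

With a real env this would be `greedy_rollout(gym.make(model["env_id"]), model["qtable"])`; note that newer `gymnasium` releases change the `reset`/`step` return signatures.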
|
Blaine-Mason/hackMIT-finetuned-sst2
|
[
"pytorch",
"tensorboard",
"bert",
"text-classification",
"dataset:glue",
"transformers",
"generated_from_trainer"
] |
text-classification
|
{
"architectures": [
"BertForSequenceClassification"
],
"model_type": "bert",
"task_specific_params": {
"conversational": {
"max_length": null
},
"summarization": {
"early_stopping": null,
"length_penalty": null,
"max_length": null,
"min_length": null,
"no_repeat_ngram_size": null,
"num_beams": null,
"prefix": null
},
"text-generation": {
"do_sample": null,
"max_length": null
},
"translation_en_to_de": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_fr": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_ro": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
}
}
}
| 36 | null |
---
library_name: stable-baselines3
tags:
- LunarLander-v2
- deep-reinforcement-learning
- reinforcement-learning
- stable-baselines3
model-index:
- name: PPO
results:
- task:
type: reinforcement-learning
name: reinforcement-learning
dataset:
name: LunarLander-v2
type: LunarLander-v2
metrics:
- type: mean_reward
value: 247.41 +/- 23.82
name: mean_reward
verified: false
---
# **PPO** Agent playing **LunarLander-v2**
This is a trained model of a **PPO** agent playing **LunarLander-v2**
using the [stable-baselines3 library](https://github.com/DLR-RM/stable-baselines3).
## Usage (with Stable-baselines3)
```python
import gym
from huggingface_sb3 import load_from_hub, package_to_hub, push_to_hub
from huggingface_hub import notebook_login # To log to our Hugging Face account to be able to upload models to the Hub.
from stable_baselines3 import PPO
from stable_baselines3.common.evaluation import evaluate_policy
from stable_baselines3.common.env_util import make_vec_env
```
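The reported metric (247.41 +/- 23.82) is the mean and standard deviation of per-episode returns over the evaluation episodes; SB3's `evaluate_policy` computes it essentially as in this sketch (the returns below are illustrative, not this model's actual episodes):

```python
import statistics

# Hypothetical per-episode returns; evaluate_policy reports their mean and
# (population) standard deviation as mean_reward +/- std_reward.
episode_returns = [250.0, 220.0, 270.0, 240.0, 260.0]
mean_reward = statistics.mean(episode_returns)
std_reward = statistics.pstdev(episode_returns)  # np.std defaults to population std
print(f"mean_reward={mean_reward:.2f} +/- {std_reward:.2f}")  # mean_reward=248.00 +/- 17.20
```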
|
BlindMan820/Sarcastic-News-Headlines
|
[
"pytorch",
"distilbert",
"text-classification",
"English",
"dataset:Kaggle Dataset",
"transformers",
"Text",
"Sequence-Classification",
"Sarcasm",
"DistilBert"
] |
text-classification
|
{
"architectures": [
"DistilBertForSequenceClassification"
],
"model_type": "distilbert",
"task_specific_params": {
"conversational": {
"max_length": null
},
"summarization": {
"early_stopping": null,
"length_penalty": null,
"max_length": null,
"min_length": null,
"no_repeat_ngram_size": null,
"num_beams": null,
"prefix": null
},
"text-generation": {
"do_sample": null,
"max_length": null
},
"translation_en_to_de": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_fr": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_ro": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
}
}
}
| 28 | null |
---
tags:
- generated_from_trainer
model-index:
- name: CV11_finetuning1
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# CV11_finetuning1
This model is a fine-tuned version of [Roshana/Wav2Vec1_CV](https://huggingface.co/Roshana/Wav2Vec1_CV) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.7162
- Wer: 0.3625
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0003
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 16
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 10
- mixed_precision_training: Native AMP
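The effective batch size and schedule above can be sanity-checked with a little arithmetic (a sketch; the dataset-size figure is an approximation inferred from the results table, not stated in the card):

```python
# Effective batch size = per-device batch * gradient accumulation steps.
train_batch_size = 8
gradient_accumulation_steps = 2
effective_batch = train_batch_size * gradient_accumulation_steps
assert effective_batch == 16  # matches the card's total_train_batch_size

# From the results table, 400 optimizer steps correspond to ~0.86 epochs, so:
steps_per_epoch = 400 / 0.86                               # ~465 steps per epoch
approx_train_examples = steps_per_epoch * effective_batch  # ~7440 training examples
print(round(steps_per_epoch), round(approx_train_examples))
```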
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:-----:|:----:|:---------------:|:------:|
| 0.5067 | 0.86 | 400 | 0.6193 | 0.4492 |
| 0.4448 | 1.72 | 800 | 0.6325 | 0.4384 |
| 0.3781 | 2.59 | 1200 | 0.6248 | 0.4197 |
| 0.3172 | 3.45 | 1600 | 0.6408 | 0.4343 |
| 0.2556 | 4.31 | 2000 | 0.6593 | 0.4230 |
| 0.2148 | 5.17 | 2400 | 0.6742 | 0.3987 |
| 0.1779 | 6.03 | 2800 | 0.6658 | 0.3929 |
| 0.1446 | 6.9 | 3200 | 0.6768 | 0.3846 |
| 0.1248 | 7.76 | 3600 | 0.6809 | 0.3804 |
| 0.108 | 8.62 | 4000 | 0.7214 | 0.3683 |
| 0.0938 | 9.48 | 4400 | 0.7162 | 0.3625 |
### Framework versions
- Transformers 4.22.2
- Pytorch 1.12.1+cu113
- Datasets 2.5.1
- Tokenizers 0.12.1
|
Bloodwarrior/Chikfalay
|
[] | null |
{
"architectures": null,
"model_type": null,
"task_specific_params": {
"conversational": {
"max_length": null
},
"summarization": {
"early_stopping": null,
"length_penalty": null,
"max_length": null,
"min_length": null,
"no_repeat_ngram_size": null,
"num_beams": null,
"prefix": null
},
"text-generation": {
"do_sample": null,
"max_length": null
},
"translation_en_to_de": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_fr": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_ro": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
}
}
}
| 0 | null |
---
language:
- en
pipeline_tag: text-generation
---
GPT-J-6B model fine-tuned on a dataset of summarized clinical notes.
Input: clinical notes / discharge summary -> Output: summary with pertinent information
|
BlueGamerBeast/DialoGPT-small-Morgana
|
[
"pytorch",
"gpt2",
"text-generation",
"transformers",
"conversational"
] |
conversational
|
{
"architectures": [
"GPT2LMHeadModel"
],
"model_type": "gpt2",
"task_specific_params": {
"conversational": {
"max_length": 1000
},
"summarization": {
"early_stopping": null,
"length_penalty": null,
"max_length": null,
"min_length": null,
"no_repeat_ngram_size": null,
"num_beams": null,
"prefix": null
},
"text-generation": {
"do_sample": null,
"max_length": null
},
"translation_en_to_de": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_fr": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_ro": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
}
}
}
| 12 | null |
---
tags:
- Pixelcopter-PLE-v0
- reinforce
- reinforcement-learning
- custom-implementation
- deep-rl-class
model-index:
- name: Reinforce-pixel-copter
results:
- task:
type: reinforcement-learning
name: reinforcement-learning
dataset:
name: Pixelcopter-PLE-v0
type: Pixelcopter-PLE-v0
metrics:
- type: mean_reward
value: 49.80 +/- 41.70
name: mean_reward
verified: false
---
# **Reinforce** Agent playing **Pixelcopter-PLE-v0**
This is a trained model of a **Reinforce** agent playing **Pixelcopter-PLE-v0**.
To learn to use this model and train yours, check Unit 4 of the Deep Reinforcement Learning Course: https://huggingface.co/deep-rl-course/unit4/introduction
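Unit 4's Reinforce agent scales each step's log-probability by the discounted return from that step onward; a minimal sketch of that return computation (plain Python with a standard discount factor `gamma`, not the course's exact code):

```python
def discounted_returns(rewards, gamma=0.99):
    """G_t = r_t + gamma * G_{t+1}, computed backwards over one episode."""
    returns = []
    g = 0.0
    for r in reversed(rewards):
        g = r + gamma * g
        returns.append(g)
    returns.reverse()
    return returns

# e.g. three unit rewards with gamma=0.5:
print(discounted_returns([1.0, 1.0, 1.0], gamma=0.5))  # [1.75, 1.5, 1.0]
```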
|
Broadus20/DialoGPT-small-joshua
|
[
"pytorch",
"gpt2",
"text-generation",
"transformers",
"conversational"
] |
conversational
|
{
"architectures": [
"GPT2LMHeadModel"
],
"model_type": "gpt2",
"task_specific_params": {
"conversational": {
"max_length": 1000
},
"summarization": {
"early_stopping": null,
"length_penalty": null,
"max_length": null,
"min_length": null,
"no_repeat_ngram_size": null,
"num_beams": null,
"prefix": null
},
"text-generation": {
"do_sample": null,
"max_length": null
},
"translation_en_to_de": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_fr": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_ro": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
}
}
}
| 12 | null |
---
tags:
- Taxi-v3
- q-learning
- reinforcement-learning
- custom-implementation
model-index:
- name: Taxi-v3
results:
- task:
type: reinforcement-learning
name: reinforcement-learning
dataset:
name: Taxi-v3
type: Taxi-v3
metrics:
- type: mean_reward
value: 7.56 +/- 2.71
name: mean_reward
verified: false
---
# **Q-Learning** Agent playing **Taxi-v3**
This is a trained model of a **Q-Learning** agent playing **Taxi-v3**.
## Usage
```python
import gym  # `load_from_hub` is the helper defined in the Deep RL Course notebook (Unit 2)

model = load_from_hub(repo_id="Vorlde/Taxi-v3", filename="q-learning.pkl")

# Don't forget to check if you need to add additional attributes (is_slippery=False etc.)
env = gym.make(model["env_id"])
```
|
Brykee/BrykeeBot
|
[] | null |
{
"architectures": null,
"model_type": null,
"task_specific_params": {
"conversational": {
"max_length": null
},
"summarization": {
"early_stopping": null,
"length_penalty": null,
"max_length": null,
"min_length": null,
"no_repeat_ngram_size": null,
"num_beams": null,
"prefix": null
},
"text-generation": {
"do_sample": null,
"max_length": null
},
"translation_en_to_de": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_fr": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_ro": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
}
}
}
| 0 | null |
---
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: flan-t5-large-da-multiwoz2.0_80-new
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# flan-t5-large-da-multiwoz2.0_80-new
This model was trained from scratch on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.8043
- Accuracy: 35.9874
- Num: 3690
- Gen Len: 15.7702
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 24
- seed: 1799
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
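The `linear` scheduler listed above decays the learning rate from its 2e-05 peak to zero over training (no warmup steps are listed for this run); a minimal sketch of that rule, not the exact 🤗 implementation:

```python
def linear_lr(step, total_steps, peak_lr=2e-05):
    """Linear decay from peak_lr at step 0 down to 0 at total_steps."""
    return peak_lr * max(0.0, 1.0 - step / total_steps)

# Start, midpoint, and end of a hypothetical 1000-step run:
print(linear_lr(0, 1000), linear_lr(500, 1000), linear_lr(1000, 1000))  # 2e-05 1e-05 0.0
```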
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Num | Gen Len |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:----:|:-------:|
| 0.1444 | 5.48 | 400 | 0.7043 | 35.5433 | 3690 | 15.6268 |
### Framework versions
- Transformers 4.18.0
- Pytorch 1.10.0+cu111
- Datasets 2.5.1
- Tokenizers 0.12.1
|
BumBelDumBel/ZORK_AI_SCIFI
|
[
"pytorch",
"tensorboard",
"gpt2",
"text-generation",
"transformers",
"generated_from_trainer"
] |
text-generation
|
{
"architectures": [
"GPT2LMHeadModel"
],
"model_type": "gpt2",
"task_specific_params": {
"conversational": {
"max_length": null
},
"summarization": {
"early_stopping": null,
"length_penalty": null,
"max_length": null,
"min_length": null,
"no_repeat_ngram_size": null,
"num_beams": null,
"prefix": null
},
"text-generation": {
"do_sample": true,
"max_length": 50
},
"translation_en_to_de": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_fr": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_ro": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
}
}
}
| 14 | 2023-03-07T06:15:55Z |
---
tags:
- unity-ml-agents
- ml-agents
- deep-reinforcement-learning
- reinforcement-learning
- ML-Agents-SnowballTarget
library_name: ml-agents
---
# **ppo** Agent playing **SnowballTarget**
This is a trained model of a **ppo** agent playing **SnowballTarget** using the [Unity ML-Agents Library](https://github.com/Unity-Technologies/ml-agents).
## Usage (with ML-Agents)
The Documentation: https://github.com/huggingface/ml-agents#get-started
We wrote a complete tutorial to learn to train your first agent using ML-Agents and publish it to the Hub.
### Resume the training
```
mlagents-learn <your_configuration_file_path.yaml> --run-id=<run_id> --resume
```
### Watch your Agent play
You can watch your agent **playing directly in your browser**:
1. Go to https://huggingface.co/spaces/unity/ML-Agents-SnowballTarget
2. Write your model_id: Nelsonlin0321/ppo-SnowballTarget
3. Select your *.nn /*.onnx file
4. Click on Watch the agent play 👀
|
Buntan/bert-finetuned-ner
|
[
"pytorch",
"tensorboard",
"bert",
"token-classification",
"dataset:conll2003",
"transformers",
"generated_from_trainer",
"license:apache-2.0",
"model-index",
"autotrain_compatible"
] |
token-classification
|
{
"architectures": [
"BertForTokenClassification"
],
"model_type": "bert",
"task_specific_params": {
"conversational": {
"max_length": null
},
"summarization": {
"early_stopping": null,
"length_penalty": null,
"max_length": null,
"min_length": null,
"no_repeat_ngram_size": null,
"num_beams": null,
"prefix": null
},
"text-generation": {
"do_sample": null,
"max_length": null
},
"translation_en_to_de": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_fr": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_ro": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
}
}
}
| 8 | null |
---
library_name: stable-baselines3
tags:
- LunarLander-v2
- deep-reinforcement-learning
- reinforcement-learning
- stable-baselines3
model-index:
- name: PPO
results:
- task:
type: reinforcement-learning
name: reinforcement-learning
dataset:
name: LunarLander-v2
type: LunarLander-v2
metrics:
- type: mean_reward
value: 249.48 +/- 22.23
name: mean_reward
verified: false
---
# **PPO** Agent playing **LunarLander-v2**
This is a trained model of a **PPO** agent playing **LunarLander-v2**
using the [stable-baselines3 library](https://github.com/DLR-RM/stable-baselines3).
## Usage (with Stable-baselines3)
TODO: Add your code
```python
from stable_baselines3 import ...
from huggingface_sb3 import load_from_hub
...
```
|
Buntan/xlm-roberta-base-finetuned-marc-en
|
[] | null |
{
"architectures": null,
"model_type": null,
"task_specific_params": {
"conversational": {
"max_length": null
},
"summarization": {
"early_stopping": null,
"length_penalty": null,
"max_length": null,
"min_length": null,
"no_repeat_ngram_size": null,
"num_beams": null,
"prefix": null
},
"text-generation": {
"do_sample": null,
"max_length": null
},
"translation_en_to_de": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_fr": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_ro": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
}
}
}
| 0 | 2023-03-07T06:20:08Z |
---
tags:
- Taxi-v3
- q-learning
- reinforcement-learning
- custom-implementation
model-index:
- name: Taxi-v3-001
results:
- task:
type: reinforcement-learning
name: reinforcement-learning
dataset:
name: Taxi-v3
type: Taxi-v3
metrics:
- type: mean_reward
value: 7.44 +/- 2.68
name: mean_reward
verified: false
---
# **Q-Learning** Agent playing **Taxi-v3**
This is a trained model of a **Q-Learning** agent playing **Taxi-v3**.
## Usage
```python
model = load_from_hub(repo_id="alvarez/Taxi-v3-001", filename="q-learning.pkl")
# Don't forget to check if you need to add additional attributes (is_slippery=False etc)
env = gym.make(model["env_id"])
```
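Once downloaded, the Q-table in the pickle is just a 2-D array indexed by discrete state. A minimal sketch of greedy action selection (the array shape and the `model["qtable"]` layout are assumptions for illustration, not taken from the actual file):

```python
import numpy as np

# Hypothetical stand-in for model["qtable"]: Taxi-v3 has 500 discrete
# states and 6 discrete actions.
rng = np.random.default_rng(0)
qtable = rng.random((500, 6))

def greedy_action(qtable, state):
    """Return the index of the highest-valued action for the given state."""
    return int(np.argmax(qtable[state]))

print(greedy_action(qtable, 0))  # an integer in [0, 6)
```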
|
Bwehfuk/Ron
|
[] | null |
{
"architectures": null,
"model_type": null,
"task_specific_params": {
"conversational": {
"max_length": null
},
"summarization": {
"early_stopping": null,
"length_penalty": null,
"max_length": null,
"min_length": null,
"no_repeat_ngram_size": null,
"num_beams": null,
"prefix": null
},
"text-generation": {
"do_sample": null,
"max_length": null
},
"translation_en_to_de": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_fr": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_ro": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
}
}
}
| 0 | null |
---
library_name: stable-baselines3
tags:
- LunarLander-v2
- deep-reinforcement-learning
- reinforcement-learning
- stable-baselines3
model-index:
- name: PPO
results:
- task:
type: reinforcement-learning
name: reinforcement-learning
dataset:
name: LunarLander-v2
type: LunarLander-v2
metrics:
- type: mean_reward
value: 263.74 +/- 21.46
name: mean_reward
verified: false
---
# **PPO** Agent playing **LunarLander-v2**
This is a trained model of a **PPO** agent playing **LunarLander-v2**
using the [stable-baselines3 library](https://github.com/DLR-RM/stable-baselines3).
## Usage (with Stable-baselines3)
TODO: Add your code
```python
from stable_baselines3 import ...
from huggingface_sb3 import load_from_hub
...
```
|
CALM/backup
|
[
"lean_albert",
"transformers"
] | null |
{
"architectures": [
"LeanAlbertForPretraining",
"LeanAlbertForTokenClassification",
"LeanAlbertForSequenceClassification"
],
"model_type": "lean_albert",
"task_specific_params": {
"conversational": {
"max_length": null
},
"summarization": {
"early_stopping": null,
"length_penalty": null,
"max_length": null,
"min_length": null,
"no_repeat_ngram_size": null,
"num_beams": null,
"prefix": null
},
"text-generation": {
"do_sample": null,
"max_length": null
},
"translation_en_to_de": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_fr": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_ro": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
}
}
}
| 4 | null |
---
library_name: stable-baselines3
tags:
- LunarLander-v2
- deep-reinforcement-learning
- reinforcement-learning
- stable-baselines3
model-index:
- name: PPO
results:
- task:
type: reinforcement-learning
name: reinforcement-learning
dataset:
name: LunarLander-v2
type: LunarLander-v2
metrics:
- type: mean_reward
value: 261.82 +/- 20.05
name: mean_reward
verified: false
---
# **PPO** Agent playing **LunarLander-v2**
This is a trained model of a **PPO** agent playing **LunarLander-v2**
using the [stable-baselines3 library](https://github.com/DLR-RM/stable-baselines3).
## Usage (with Stable-baselines3)
TODO: Add your code
```python
from stable_baselines3 import ...
from huggingface_sb3 import load_from_hub
...
```
|
CAMeL-Lab/bert-base-arabic-camelbert-ca-ner
|
[
"pytorch",
"tf",
"bert",
"token-classification",
"ar",
"arxiv:2103.06678",
"transformers",
"license:apache-2.0",
"autotrain_compatible"
] |
token-classification
|
{
"architectures": [
"BertForTokenClassification"
],
"model_type": "bert",
"task_specific_params": {
"conversational": {
"max_length": null
},
"summarization": {
"early_stopping": null,
"length_penalty": null,
"max_length": null,
"min_length": null,
"no_repeat_ngram_size": null,
"num_beams": null,
"prefix": null
},
"text-generation": {
"do_sample": null,
"max_length": null
},
"translation_en_to_de": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_fr": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_ro": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
}
}
}
| 85 | 2023-03-07T06:27:58Z |
---
license: openrail
datasets:
- stanfordnlp/SHP
- Nerfgun3/bad_prompt
language:
- en
- ga
metrics:
- bertscore
library_name: allennlp
pipeline_tag: text-classification
---
|
CAMeL-Lab/bert-base-arabic-camelbert-ca-poetry
|
[
"pytorch",
"tf",
"bert",
"text-classification",
"ar",
"arxiv:1905.05700",
"arxiv:2103.06678",
"transformers",
"license:apache-2.0"
] |
text-classification
|
{
"architectures": [
"BertForSequenceClassification"
],
"model_type": "bert",
"task_specific_params": {
"conversational": {
"max_length": null
},
"summarization": {
"early_stopping": null,
"length_penalty": null,
"max_length": null,
"min_length": null,
"no_repeat_ngram_size": null,
"num_beams": null,
"prefix": null
},
"text-generation": {
"do_sample": null,
"max_length": null
},
"translation_en_to_de": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_fr": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_ro": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
}
}
}
| 42 | null |
---
license: apache-2.0
datasets:
- competitions/aiornot
metrics:
- accuracy
---
|
CAMeL-Lab/bert-base-arabic-camelbert-ca-sentiment
|
[
"pytorch",
"tf",
"bert",
"text-classification",
"ar",
"arxiv:2103.06678",
"transformers",
"license:apache-2.0"
] |
text-classification
|
{
"architectures": [
"BertForSequenceClassification"
],
"model_type": "bert",
"task_specific_params": {
"conversational": {
"max_length": null
},
"summarization": {
"early_stopping": null,
"length_penalty": null,
"max_length": null,
"min_length": null,
"no_repeat_ngram_size": null,
"num_beams": null,
"prefix": null
},
"text-generation": {
"do_sample": null,
"max_length": null
},
"translation_en_to_de": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_fr": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_ro": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
}
}
}
| 73 | null |
---
datasets:
- IlyaGusev/habr
- Den4ikAI/russian_instructions
- wiki_qa
inference:
parameters:
max_new_tokens: 32
temperature: 1
top_k: 50
top_p: 0.7
do_sample: true
license: apache-2.0
language:
- en
pipeline_tag: text-generation
widget:
- text: Чем отличается лось от ежа?
example_title: Question Answering
- text: Как выпросить повышение?
example_title: Logical reasoning
- text: Какая температура закипания азота?
example_title: Scientific knowledge
library_name: transformers
tags:
- finance
- code
---
<h1 style="font-size: 42px">Instructions ruGPT Small v0.1a</h1>
# Model Summary
> I fine-tuned small ruGPT on a dataset of instructions, Habr posts, QA, and code
# Quick Start
```python
from transformers import pipeline
pipe = pipeline(model='AlexWortega/instruct_rugptSmall')
pipe('''Как собрать питон код?''')
```
or
```python
from transformers import AutoTokenizer, AutoModelForCausalLM
tokenizer = AutoTokenizer.from_pretrained("AlexWortega/instruct_rugptSmall")
model = AutoModelForCausalLM.from_pretrained("AlexWortega/instruct_rugptSmall")
```
# License
The weights of Instructions ruGPT Small v0.1a are licensed under version 2.0 of the Apache License.
## Hyperparameters
I used Novograd with a learning rate of 2e-5 and a global batch size of 6 (3 for each data-parallel worker).
I used both data parallelism and pipeline parallelism to conduct training.
During training, we truncate the input sequence to 1024 tokens; for input sequences containing fewer than 1024 tokens, we concatenate multiple sequences into one long sequence to improve data efficiency.
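The truncate-and-pack step described above can be sketched as follows (a simplified illustration; the token IDs and the greedy packing strategy are assumptions, not the actual training code):

```python
def pack_sequences(sequences, max_len=1024):
    """Greedily concatenate token-ID sequences into chunks of at most
    max_len tokens, truncating any single sequence longer than max_len."""
    chunks, current = [], []
    for seq in sequences:
        seq = seq[:max_len]  # truncate overly long inputs
        if current and len(current) + len(seq) > max_len:
            chunks.append(current)
            current = []
        current = current + list(seq)
    if current:
        chunks.append(current)
    return chunks

packed = pack_sequences([[1] * 600, [2] * 300, [3] * 200], max_len=1024)
print([len(c) for c in packed])  # → [900, 200]
```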
# References
# Metrics
SOON
## BibTeX entry and citation info
```bibtex
@article{instruct_rugpt_small,
title={GPT2xl is underrated task solver},
author={Nickolich Aleksandr, Karina Romanova, Arseniy Shahmatov, Maksim Gersimenko},
year={2023}
}
```
|
CAMeL-Lab/bert-base-arabic-camelbert-ca
|
[
"pytorch",
"tf",
"jax",
"bert",
"fill-mask",
"ar",
"arxiv:2103.06678",
"transformers",
"license:apache-2.0",
"autotrain_compatible"
] |
fill-mask
|
{
"architectures": [
"BertForMaskedLM"
],
"model_type": "bert",
"task_specific_params": {
"conversational": {
"max_length": null
},
"summarization": {
"early_stopping": null,
"length_penalty": null,
"max_length": null,
"min_length": null,
"no_repeat_ngram_size": null,
"num_beams": null,
"prefix": null
},
"text-generation": {
"do_sample": null,
"max_length": null
},
"translation_en_to_de": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_fr": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_ro": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
}
}
}
| 580 | 2023-03-07T06:49:20Z |
---
tags:
- Pixelcopter-PLE-v0
- reinforce
- reinforcement-learning
- custom-implementation
- deep-rl-class
model-index:
- name: Reinforce-Pixelcopter-PLE-v0
results:
- task:
type: reinforcement-learning
name: reinforcement-learning
dataset:
name: Pixelcopter-PLE-v0
type: Pixelcopter-PLE-v0
metrics:
- type: mean_reward
value: 50.40 +/- 26.36
name: mean_reward
verified: false
---
# **Reinforce** Agent playing **Pixelcopter-PLE-v0**
This is a trained model of a **Reinforce** agent playing **Pixelcopter-PLE-v0**.
To learn to use this model and train yours, check Unit 4 of the Deep Reinforcement Learning Course: https://huggingface.co/deep-rl-course/unit4/introduction
|
CAMeL-Lab/bert-base-arabic-camelbert-mix-did-madar-corpus26
|
[
"pytorch",
"tf",
"bert",
"text-classification",
"ar",
"arxiv:2103.06678",
"transformers",
"license:apache-2.0"
] |
text-classification
|
{
"architectures": [
"BertForSequenceClassification"
],
"model_type": "bert",
"task_specific_params": {
"conversational": {
"max_length": null
},
"summarization": {
"early_stopping": null,
"length_penalty": null,
"max_length": null,
"min_length": null,
"no_repeat_ngram_size": null,
"num_beams": null,
"prefix": null
},
"text-generation": {
"do_sample": null,
"max_length": null
},
"translation_en_to_de": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_fr": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_ro": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
}
}
}
| 45 | 2023-03-07T07:03:09Z |
## A new version of this model was released:
**[The_Owl_Characters_V2](https://huggingface.co/Jartemio/The_Owl_Characters_V2)**
## License
This model is open access and available to all, with a CreativeML OpenRAIL-M license further specifying rights and usage.
The CreativeML OpenRAIL License specifies:
1. You can't use the model to deliberately produce or share illegal or harmful outputs or content
2. The authors claim no rights over the outputs you generate; you are free to use them and are accountable for their use, which must not go against the provisions set in the license
3. You may re-distribute the weights and use the model commercially and/or as a service. If you do, please be aware you have to include the same use restrictions as the ones in the license and share a copy of the CreativeML OpenRAIL-M to all your users (please read the license entirely and carefully)
[Please read the full license here](https://huggingface.co/spaces/CompVis/stable-diffusion-license)
|
CAMeL-Lab/bert-base-arabic-camelbert-mix-did-madar-corpus6
|
[
"pytorch",
"tf",
"bert",
"text-classification",
"ar",
"arxiv:2103.06678",
"transformers",
"license:apache-2.0"
] |
text-classification
|
{
"architectures": [
"BertForSequenceClassification"
],
"model_type": "bert",
"task_specific_params": {
"conversational": {
"max_length": null
},
"summarization": {
"early_stopping": null,
"length_penalty": null,
"max_length": null,
"min_length": null,
"no_repeat_ngram_size": null,
"num_beams": null,
"prefix": null
},
"text-generation": {
"do_sample": null,
"max_length": null
},
"translation_en_to_de": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_fr": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_ro": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
}
}
}
| 34 | 2023-03-07T07:03:53Z |
---
tags:
- Pixelcopter-PLE-v0
- reinforce
- reinforcement-learning
- custom-implementation
- deep-rl-class
model-index:
- name: Reinforce-Pixelcopter-PLE-v0
results:
- task:
type: reinforcement-learning
name: reinforcement-learning
dataset:
name: Pixelcopter-PLE-v0
type: Pixelcopter-PLE-v0
metrics:
- type: mean_reward
value: 43.60 +/- 32.62
name: mean_reward
verified: false
---
# **Reinforce** Agent playing **Pixelcopter-PLE-v0**
This is a trained model of a **Reinforce** agent playing **Pixelcopter-PLE-v0**.
To learn to use this model and train yours, check Unit 4 of the Deep Reinforcement Learning Course: https://huggingface.co/deep-rl-course/unit4/introduction
|
CAMeL-Lab/bert-base-arabic-camelbert-mix-sentiment
|
[
"pytorch",
"tf",
"bert",
"text-classification",
"ar",
"arxiv:2103.06678",
"transformers",
"license:apache-2.0"
] |
text-classification
|
{
"architectures": [
"BertForSequenceClassification"
],
"model_type": "bert",
"task_specific_params": {
"conversational": {
"max_length": null
},
"summarization": {
"early_stopping": null,
"length_penalty": null,
"max_length": null,
"min_length": null,
"no_repeat_ngram_size": null,
"num_beams": null,
"prefix": null
},
"text-generation": {
"do_sample": null,
"max_length": null
},
"translation_en_to_de": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_fr": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_ro": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
}
}
}
| 855 | null |
---
license: mit
---
### ahx-model-14 on Stable Diffusion
This is the `<ahx-model-14>` concept taught to Stable Diffusion via Textual Inversion. You can load this concept into the [Stable Conceptualizer](https://colab.research.google.com/github/huggingface/notebooks/blob/main/diffusers/stable_conceptualizer_inference.ipynb) notebook. You can also train your own concepts and load them into the concept libraries using [this notebook](https://colab.research.google.com/github/huggingface/notebooks/blob/main/diffusers/sd_textual_inversion_training.ipynb).
Here is the new concept you will be able to use as a `style`:








|
CAMeL-Lab/bert-base-arabic-camelbert-msa-did-madar-twitter5
|
[
"pytorch",
"tf",
"bert",
"text-classification",
"ar",
"arxiv:2103.06678",
"transformers",
"license:apache-2.0"
] |
text-classification
|
{
"architectures": [
"BertForSequenceClassification"
],
"model_type": "bert",
"task_specific_params": {
"conversational": {
"max_length": null
},
"summarization": {
"early_stopping": null,
"length_penalty": null,
"max_length": null,
"min_length": null,
"no_repeat_ngram_size": null,
"num_beams": null,
"prefix": null
},
"text-generation": {
"do_sample": null,
"max_length": null
},
"translation_en_to_de": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_fr": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_ro": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
}
}
}
| 75 | null |
---
datasets:
- glue
language:
- ja
---
A text classification model built on `bert-base-cased`.
|
CAMeL-Lab/bert-base-arabic-camelbert-msa-eighth
|
[
"pytorch",
"tf",
"jax",
"bert",
"fill-mask",
"ar",
"arxiv:2103.06678",
"transformers",
"license:apache-2.0",
"autotrain_compatible"
] |
fill-mask
|
{
"architectures": [
"BertForMaskedLM"
],
"model_type": "bert",
"task_specific_params": {
"conversational": {
"max_length": null
},
"summarization": {
"early_stopping": null,
"length_penalty": null,
"max_length": null,
"min_length": null,
"no_repeat_ngram_size": null,
"num_beams": null,
"prefix": null
},
"text-generation": {
"do_sample": null,
"max_length": null
},
"translation_en_to_de": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_fr": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_ro": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
}
}
}
| 21 | null |
---
tags:
- unity-ml-agents
- ml-agents
- deep-reinforcement-learning
- reinforcement-learning
- ML-Agents-Pyramids
library_name: ml-agents
---
# **ppo** Agent playing **Pyramids**
This is a trained model of a **ppo** agent playing **Pyramids** using the [Unity ML-Agents Library](https://github.com/Unity-Technologies/ml-agents).
## Usage (with ML-Agents)
The Documentation: https://github.com/huggingface/ml-agents#get-started
We wrote a complete tutorial to learn to train your first agent using ML-Agents and publish it to the Hub:
### Resume the training
```
mlagents-learn <your_configuration_file_path.yaml> --run-id=<run_id> --resume
```
### Watch your Agent play
You can watch your agent **playing directly in your browser**:
1. Go to https://huggingface.co/spaces/unity/ML-Agents-Pyramids
2. Write your model_id: Nelsonlin0321/Pyramids
3. Select your *.nn /*.onnx file
4. Click on Watch the agent play 👀
|
CAMeL-Lab/bert-base-arabic-camelbert-msa-half
|
[
"pytorch",
"tf",
"jax",
"bert",
"fill-mask",
"ar",
"arxiv:2103.06678",
"transformers",
"license:apache-2.0",
"autotrain_compatible"
] |
fill-mask
|
{
"architectures": [
"BertForMaskedLM"
],
"model_type": "bert",
"task_specific_params": {
"conversational": {
"max_length": null
},
"summarization": {
"early_stopping": null,
"length_penalty": null,
"max_length": null,
"min_length": null,
"no_repeat_ngram_size": null,
"num_beams": null,
"prefix": null
},
"text-generation": {
"do_sample": null,
"max_length": null
},
"translation_en_to_de": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_fr": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_ro": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
}
}
}
| 16 | null |
---
library_name: stable-baselines3
tags:
- LunarLander-v2
- deep-reinforcement-learning
- reinforcement-learning
- stable-baselines3
model-index:
- name: ppo
results:
- task:
type: reinforcement-learning
name: reinforcement-learning
dataset:
name: LunarLander-v2
type: LunarLander-v2
metrics:
- type: mean_reward
value: 264.67 +/- 13.99
name: mean_reward
verified: false
---
# **ppo** Agent playing **LunarLander-v2**
This is a trained model of a **ppo** agent playing **LunarLander-v2**
using the [stable-baselines3 library](https://github.com/DLR-RM/stable-baselines3).
## Usage (with Stable-baselines3)
TODO: Add your code
```python
from stable_baselines3 import ...
from huggingface_sb3 import load_from_hub
...
```
|
CAMeL-Lab/bert-base-arabic-camelbert-msa-pos-msa
|
[
"pytorch",
"tf",
"bert",
"token-classification",
"ar",
"arxiv:2103.06678",
"transformers",
"license:apache-2.0",
"autotrain_compatible"
] |
token-classification
|
{
"architectures": [
"BertForTokenClassification"
],
"model_type": "bert",
"task_specific_params": {
"conversational": {
"max_length": null
},
"summarization": {
"early_stopping": null,
"length_penalty": null,
"max_length": null,
"min_length": null,
"no_repeat_ngram_size": null,
"num_beams": null,
"prefix": null
},
"text-generation": {
"do_sample": null,
"max_length": null
},
"translation_en_to_de": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_fr": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_ro": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
}
}
}
| 133 | null |
---
library_name: stable-baselines3
tags:
- LunarLander-v2
- deep-reinforcement-learning
- reinforcement-learning
- stable-baselines3
model-index:
- name: PPO
results:
- task:
type: reinforcement-learning
name: reinforcement-learning
dataset:
name: LunarLander-v2
type: LunarLander-v2
metrics:
- type: mean_reward
value: 255.07 +/- 18.22
name: mean_reward
verified: false
---
# **PPO** Agent playing **LunarLander-v2**
This is a trained model of a **PPO** agent playing **LunarLander-v2**
using the [stable-baselines3 library](https://github.com/DLR-RM/stable-baselines3).
## Usage (with Stable-baselines3)
TODO: Add your code
```python
from stable_baselines3 import ...
from huggingface_sb3 import load_from_hub
...
```
|
CLAck/indo-mixed
|
[
"pytorch",
"marian",
"text2text-generation",
"en",
"id",
"dataset:ALT",
"transformers",
"translation",
"license:apache-2.0",
"autotrain_compatible"
] |
translation
|
{
"architectures": [
"MarianMTModel"
],
"model_type": "marian",
"task_specific_params": {
"conversational": {
"max_length": null
},
"summarization": {
"early_stopping": null,
"length_penalty": null,
"max_length": null,
"min_length": null,
"no_repeat_ngram_size": null,
"num_beams": null,
"prefix": null
},
"text-generation": {
"do_sample": null,
"max_length": null
},
"translation_en_to_de": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_fr": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_ro": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
}
}
}
| 15 | null |
---
library_name: stable-baselines3
tags:
- SpaceInvadersNoFrameskip-v4
- deep-reinforcement-learning
- reinforcement-learning
- stable-baselines3
model-index:
- name: DQN
results:
- task:
type: reinforcement-learning
name: reinforcement-learning
dataset:
name: SpaceInvadersNoFrameskip-v4
type: SpaceInvadersNoFrameskip-v4
metrics:
- type: mean_reward
value: 536.00 +/- 167.57
name: mean_reward
verified: false
---
# **DQN** Agent playing **SpaceInvadersNoFrameskip-v4**
This is a trained model of a **DQN** agent playing **SpaceInvadersNoFrameskip-v4**
using the [stable-baselines3 library](https://github.com/DLR-RM/stable-baselines3)
and the [RL Zoo](https://github.com/DLR-RM/rl-baselines3-zoo).
The RL Zoo is a training framework for Stable Baselines3
reinforcement learning agents,
with hyperparameter optimization and pre-trained agents included.
## Usage (with SB3 RL Zoo)
RL Zoo: https://github.com/DLR-RM/rl-baselines3-zoo<br/>
SB3: https://github.com/DLR-RM/stable-baselines3<br/>
SB3 Contrib: https://github.com/Stable-Baselines-Team/stable-baselines3-contrib
Install the RL Zoo (with SB3 and SB3-Contrib):
```bash
pip install rl_zoo3
```
```
# Download model and save it into the logs/ folder
python -m rl_zoo3.load_from_hub --algo dqn --env SpaceInvadersNoFrameskip-v4 -orga sptrodon -f logs/
python -m rl_zoo3.enjoy --algo dqn --env SpaceInvadersNoFrameskip-v4 -f logs/
```
If you installed the RL Zoo3 via pip (`pip install rl_zoo3`), from anywhere you can do:
```
python -m rl_zoo3.load_from_hub --algo dqn --env SpaceInvadersNoFrameskip-v4 -orga sptrodon -f logs/
python -m rl_zoo3.enjoy --algo dqn --env SpaceInvadersNoFrameskip-v4 -f logs/
```
## Training (with the RL Zoo)
```
python -m rl_zoo3.train --algo dqn --env SpaceInvadersNoFrameskip-v4 -f logs/
# Upload the model and generate video (when possible)
python -m rl_zoo3.push_to_hub --algo dqn --env SpaceInvadersNoFrameskip-v4 -f logs/ -orga sptrodon
```
## Hyperparameters
```python
OrderedDict([('batch_size', 64),
('buffer_size', 100000),
('env_wrapper',
['stable_baselines3.common.atari_wrappers.AtariWrapper']),
('exploration_final_eps', 0.01),
('exploration_fraction', 0.1),
('frame_stack', 4),
('gradient_steps', 1),
('learning_rate', 0.0001),
('learning_starts', 100000),
('n_timesteps', 10000000.0),
('optimize_memory_usage', False),
('policy', 'CnnPolicy'),
('target_update_interval', 1000),
('train_freq', 4),
('normalize', False)])
```
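As a reading aid, the linear exploration schedule implied by `exploration_fraction` and `exploration_final_eps` above can be sketched as follows (this mirrors SB3's linear annealing in spirit, not its actual implementation):

```python
def epsilon_at(step, total_steps=10_000_000, fraction=0.1,
               final_eps=0.01, initial_eps=1.0):
    """Linearly anneal epsilon from initial_eps to final_eps over the
    first `fraction` of training, then hold it constant."""
    progress = min(step / (fraction * total_steps), 1.0)
    return initial_eps + progress * (final_eps - initial_eps)

print(epsilon_at(0))          # 1.0
print(epsilon_at(500_000))    # ≈ 0.505 (halfway through the annealing phase)
print(epsilon_at(5_000_000))  # ≈ 0.01 (annealing finished)
```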
|
CLAck/indo-pure
|
[
"pytorch",
"marian",
"text2text-generation",
"en",
"id",
"dataset:ALT",
"transformers",
"translation",
"license:apache-2.0",
"autotrain_compatible"
] |
translation
|
{
"architectures": [
"MarianMTModel"
],
"model_type": "marian",
"task_specific_params": {
"conversational": {
"max_length": null
},
"summarization": {
"early_stopping": null,
"length_penalty": null,
"max_length": null,
"min_length": null,
"no_repeat_ngram_size": null,
"num_beams": null,
"prefix": null
},
"text-generation": {
"do_sample": null,
"max_length": null
},
"translation_en_to_de": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_fr": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_ro": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
}
}
}
| 4 | null |
---
license: mit
tags:
- generated_from_trainer
datasets:
- lg-ner
metrics:
- precision
- recall
- f1
- accuracy
model-index:
- name: luganda-ner-v2
results:
- task:
name: Token Classification
type: token-classification
dataset:
name: lg-ner
type: lg-ner
config: lug
split: test
args: lug
metrics:
- name: Precision
type: precision
value: 0.7704421562689279
- name: Recall
type: recall
value: 0.7695099818511797
- name: F1
type: f1
value: 0.7699757869249395
- name: Accuracy
type: accuracy
value: 0.9434371807967313
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# luganda-ner-v2
This model is a fine-tuned version of [roberta-base](https://huggingface.co/roberta-base) on the lg-ner dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2829
- Precision: 0.7704
- Recall: 0.7695
- F1: 0.7700
- Accuracy: 0.9434
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
### Training results
| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:|:--------:|
| No log | 1.0 | 261 | 0.4835 | 0.5191 | 0.3037 | 0.3832 | 0.8719 |
| 0.5738 | 2.0 | 522 | 0.3454 | 0.7288 | 0.5203 | 0.6071 | 0.9117 |
| 0.5738 | 3.0 | 783 | 0.2956 | 0.7752 | 0.6612 | 0.7137 | 0.9235 |
| 0.2549 | 4.0 | 1044 | 0.2791 | 0.7537 | 0.6848 | 0.7176 | 0.9258 |
| 0.2549 | 5.0 | 1305 | 0.2801 | 0.7530 | 0.7211 | 0.7367 | 0.9335 |
| 0.1566 | 6.0 | 1566 | 0.2675 | 0.7956 | 0.7229 | 0.7575 | 0.9393 |
| 0.1566 | 7.0 | 1827 | 0.2610 | 0.7744 | 0.7350 | 0.7542 | 0.9423 |
| 0.1054 | 8.0 | 2088 | 0.2731 | 0.7614 | 0.7586 | 0.7600 | 0.9423 |
| 0.1054 | 9.0 | 2349 | 0.2763 | 0.7794 | 0.7526 | 0.7658 | 0.9434 |
| 0.0771 | 10.0 | 2610 | 0.2829 | 0.7704 | 0.7695 | 0.7700 | 0.9434 |
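The reported F1 is, by definition, the harmonic mean of the precision and recall above; a quick sanity check using the full-precision values from the model-index metadata:

```python
# Full-precision values from the model-index metadata above.
precision = 0.7704421562689279
recall = 0.7695099818511797

# F1 is the harmonic mean of precision and recall: 2PR / (P + R)
f1 = 2 * precision * recall / (precision + recall)
print(round(f1, 4))  # → 0.77
```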
### Framework versions
- Transformers 4.27.4
- Pytorch 1.13.1+cu116
- Datasets 2.11.0
- Tokenizers 0.13.2
|
CLAck/vi-en
|
[
"pytorch",
"marian",
"text2text-generation",
"en",
"vi",
"dataset:ALT",
"transformers",
"translation",
"license:apache-2.0",
"autotrain_compatible"
] |
translation
|
{
"architectures": [
"MarianMTModel"
],
"model_type": "marian",
"task_specific_params": {
"conversational": {
"max_length": null
},
"summarization": {
"early_stopping": null,
"length_penalty": null,
"max_length": null,
"min_length": null,
"no_repeat_ngram_size": null,
"num_beams": null,
"prefix": null
},
"text-generation": {
"do_sample": null,
"max_length": null
},
"translation_en_to_de": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_fr": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_ro": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
}
}
}
| 6 | null |
---
license: cc-by-4.0
metrics:
- bleu4
- meteor
- rouge-l
- bertscore
- moverscore
language: ja
datasets:
- lmqg/qg_jaquad
pipeline_tag: text2text-generation
tags:
- question answering
widget:
- text: "question: 新型車両として6000系が構想されたのは、製造費用のほか、どんな費用を抑えるためだったの?, context: 三多摩地区開発による沿線人口の増加、相模原線延伸による多摩ニュータウン乗り入れ、都営地下鉄10号線(現都営地下鉄新宿線、以下新宿線と表記する)乗入構想により、京王線の利用客増加が見込まれ、相当数の車両を準備する必要に迫られるなか、製造費用、保守費用を抑えた新型車両として6000系が構想された。新宿線建設に際してはすでに1号線(後の浅草線)を1,435mm軌間で開業させていた東京都は京成電鉄と1号線との乗り入れにあたり京成電鉄の路線を1,372mmから1,435mmに改軌させた事例や、1,372mm軌間の特殊性から運輸省(当時、2001年から国土交通省)と共に京王にも改軌を求めたが、改軌工事中の輸送力確保が困難なことを理由に改軌しないことで決着している。"
example_title: "Question Answering Example 1"
- text: "question: 1968年に開催されたオリンピックの名前は何ですか?, context: オリンピックが世界的大イベントに成長するに従って政治に左右されるようになると、1968年のメキシコシティ大会では黒人差別を訴える場と化し、1972年のミュンヘン大会ではアラブのゲリラによるイスラエル選手に対するテロ事件まで起きた(ミュンヘンオリンピック事件)。1976年のモントリオール大会になると、ニュージーランドのラグビーチームの南アフリカ遠征に反対してアフリカの諸国22ヶ国がボイコットを行った。そして、1980年のモスクワ大会ではソ連のアフガニスタン侵攻に反発したアメリカ・西ドイツ・日本などの西側諸国が相次いでボイコットを行った。1984年ロサンゼルス大会ではソ連と東側諸国が報復ボイコットを行ない、参加したのはソ連と対立していた中国とルーマニアだけだった。中でも、イラン革命後のイラン・イスラム共和国はモスクワとロサンゼルス双方のオリンピックをボイコットしている。オリンピックが巨大化するに従って財政負担の増大が大きな問題となり、1976年の夏季大会では大幅な赤字を出し、その後夏季・冬季とも立候補都市が1〜2都市だけという状態が続いた。"
example_title: "Question Answering Example 2"
model-index:
- name: vocabtrimmer/mt5-small-trimmed-ja-jaquad-qa
results:
- task:
name: Text2text Generation
type: text2text-generation
dataset:
name: lmqg/qg_jaquad
type: default
args: default
metrics:
- name: BLEU4 (Question Answering)
type: bleu4_question_answering
value: 0.0
- name: ROUGE-L (Question Answering)
type: rouge_l_question_answering
value: 61.11
- name: METEOR (Question Answering)
type: meteor_question_answering
value: 48.39
- name: BERTScore (Question Answering)
type: bertscore_question_answering
value: 96.01
- name: MoverScore (Question Answering)
type: moverscore_question_answering
value: 87.95
- name: AnswerF1Score (Question Answering)
type: answer_f1_score__question_answering
value: 63.04
- name: AnswerExactMatch (Question Answering)
type: answer_exact_match_question_answering
value: 63.04
---
# Model Card of `vocabtrimmer/mt5-small-trimmed-ja-jaquad-qa`
This model is a fine-tuned version of [vocabtrimmer/mt5-small-trimmed-ja](https://huggingface.co/vocabtrimmer/mt5-small-trimmed-ja) for the question answering task on the [lmqg/qg_jaquad](https://huggingface.co/datasets/lmqg/qg_jaquad) (dataset_name: default) via [`lmqg`](https://github.com/asahi417/lm-question-generation).
### Overview
- **Language model:** [vocabtrimmer/mt5-small-trimmed-ja](https://huggingface.co/vocabtrimmer/mt5-small-trimmed-ja)
- **Language:** ja
- **Training data:** [lmqg/qg_jaquad](https://huggingface.co/datasets/lmqg/qg_jaquad) (default)
- **Online Demo:** [https://autoqg.net/](https://autoqg.net/)
- **Repository:** [https://github.com/asahi417/lm-question-generation](https://github.com/asahi417/lm-question-generation)
- **Paper:** [https://arxiv.org/abs/2210.03992](https://arxiv.org/abs/2210.03992)
### Usage
- With [`lmqg`](https://github.com/asahi417/lm-question-generation#lmqg-language-model-for-question-generation-)
```python
from lmqg import TransformersQG
# initialize model
model = TransformersQG(language="ja", model="vocabtrimmer/mt5-small-trimmed-ja-jaquad-qa")
# model prediction
answers = model.answer_q(list_question="新型車両として6000系が構想されたのは、製造費用のほか、どんな費用を抑えるためだったの?", list_context=" 三多摩地区開発による沿線人口の増加、相模原線延伸による多摩ニュータウン乗り入れ、都営地下鉄10号線(現都営地下鉄新宿線、以下新宿線と表記する)乗入構想により、京王線の利用客増加が見込まれ、相当数の車両を準備する必要に迫られるなか、製造費用、保守費用を抑えた新型車両として6000系が構想された。新宿線建設に際してはすでに1号線(後の浅草線)を1,435mm軌間で開業させていた東京都は京成電鉄と1号線との乗り入れにあたり京成電鉄の路線を1,372mmから1,435mmに改軌させた事例や、1,372mm軌間の特殊性から運輸省(当時、2001年から国土交通省)と共に京王にも改軌を求めたが、改軌工事中の輸送力確保が困難なことを理由に改軌しないことで決着している。")
```
- With `transformers`
```python
from transformers import pipeline
pipe = pipeline("text2text-generation", "vocabtrimmer/mt5-small-trimmed-ja-jaquad-qa")
output = pipe("question: 新型車両として6000系が構想されたのは、製造費用のほか、どんな費用を抑えるためだったの?, context: 三多摩地区開発による沿線人口の増加、相模原線延伸による多摩ニュータウン乗り入れ、都営地下鉄10号線(現都営地下鉄新宿線、以下新宿線と表記する)乗入構想により、京王線の利用客増加が見込まれ、相当数の車両を準備する必要に迫られるなか、製造費用、保守費用を抑えた新型車両として6000系が構想された。新宿線建設に際してはすでに1号線(後の浅草線)を1,435mm軌間で開業させていた東京都は京成電鉄と1号線との乗り入れにあたり京成電鉄の路線を1,372mmから1,435mmに改軌させた事例や、1,372mm軌間の特殊性から運輸省(当時、2001年から国土交通省)と共に京王にも改軌を求めたが、改軌工事中の輸送力確保が困難なことを理由に改軌しないことで決着している。")
```
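Note the fixed input format the pipeline expects: question and context concatenated into a single string. A small helper (hypothetical, not part of `lmqg` or `transformers`) that builds it:

```python
def build_qa_input(question: str, context: str) -> str:
    """Concatenate question and context in the format the model was trained on."""
    return f"question: {question}, context: {context}"

prompt = build_qa_input("Q?", "Some context.")
print(prompt)
```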
## Evaluation
- ***Metric (Question Answering)***: [raw metric file](https://huggingface.co/vocabtrimmer/mt5-small-trimmed-ja-jaquad-qa/raw/main/eval/metric.first.answer.paragraph_question.answer.lmqg_qg_jaquad.default.json)
| | Score | Type | Dataset |
|:-----------------|--------:|:--------|:-----------------------------------------------------------------|
| AnswerExactMatch | 63.04 | default | [lmqg/qg_jaquad](https://huggingface.co/datasets/lmqg/qg_jaquad) |
| AnswerF1Score | 63.04 | default | [lmqg/qg_jaquad](https://huggingface.co/datasets/lmqg/qg_jaquad) |
| BERTScore | 96.01 | default | [lmqg/qg_jaquad](https://huggingface.co/datasets/lmqg/qg_jaquad) |
| Bleu_1 | 58.82 | default | [lmqg/qg_jaquad](https://huggingface.co/datasets/lmqg/qg_jaquad) |
| Bleu_2 | 0 | default | [lmqg/qg_jaquad](https://huggingface.co/datasets/lmqg/qg_jaquad) |
| Bleu_3 | 0 | default | [lmqg/qg_jaquad](https://huggingface.co/datasets/lmqg/qg_jaquad) |
| Bleu_4 | 0 | default | [lmqg/qg_jaquad](https://huggingface.co/datasets/lmqg/qg_jaquad) |
| METEOR | 48.39 | default | [lmqg/qg_jaquad](https://huggingface.co/datasets/lmqg/qg_jaquad) |
| MoverScore | 87.95 | default | [lmqg/qg_jaquad](https://huggingface.co/datasets/lmqg/qg_jaquad) |
| ROUGE_L | 61.11 | default | [lmqg/qg_jaquad](https://huggingface.co/datasets/lmqg/qg_jaquad) |
## Training hyperparameters
The following hyperparameters were used during fine-tuning:
- dataset_path: lmqg/qg_jaquad
- dataset_name: default
- input_types: ['paragraph_question']
- output_types: ['answer']
- prefix_types: None
- model: vocabtrimmer/mt5-small-trimmed-ja
- max_length: 512
- max_length_output: 32
- epoch: 14
- batch: 16
- lr: 0.0006
- fp16: False
- random_seed: 1
- gradient_accumulation_steps: 4
- label_smoothing: 0.15
The full configuration can be found at [fine-tuning config file](https://huggingface.co/vocabtrimmer/mt5-small-trimmed-ja-jaquad-qa/raw/main/trainer_config.json).
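With `batch: 16` and `gradient_accumulation_steps: 4`, the effective batch size per optimizer update works out as follows (a simple arithmetic check, not part of the training code):

```python
batch = 16
gradient_accumulation_steps = 4

# Gradients are accumulated over 4 micro-batches before each optimizer step.
effective_batch_size = batch * gradient_accumulation_steps
print(effective_batch_size)  # → 64
```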
## Citation
```
@inproceedings{ushio-etal-2022-generative,
title = "{G}enerative {L}anguage {M}odels for {P}aragraph-{L}evel {Q}uestion {G}eneration",
author = "Ushio, Asahi and
Alva-Manchego, Fernando and
Camacho-Collados, Jose",
booktitle = "Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing",
month = dec,
year = "2022",
address = "Abu Dhabi, U.A.E.",
publisher = "Association for Computational Linguistics",
}
```
|
CLTL/icf-levels-enr
|
[
"pytorch",
"roberta",
"text-classification",
"nl",
"transformers",
"license:mit"
] |
text-classification
|
{
"architectures": [
"RobertaForSequenceClassification"
],
"model_type": "roberta",
"task_specific_params": {
"conversational": {
"max_length": null
},
"summarization": {
"early_stopping": null,
"length_penalty": null,
"max_length": null,
"min_length": null,
"no_repeat_ngram_size": null,
"num_beams": null,
"prefix": null
},
"text-generation": {
"do_sample": null,
"max_length": null
},
"translation_en_to_de": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_fr": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_ro": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
}
}
}
| 30 | null |
---
tags:
- Taxi-v3
- q-learning
- reinforcement-learning
- custom-implementation
model-index:
- name: Taxi-v3_v1
results:
- task:
type: reinforcement-learning
name: reinforcement-learning
dataset:
name: Taxi-v3
type: Taxi-v3
metrics:
- type: mean_reward
value: 7.52 +/- 2.74
name: mean_reward
verified: false
---
# **Q-Learning** Agent playing **Taxi-v3**
This is a trained model of a **Q-Learning** agent playing **Taxi-v3** .
## Usage
```python
# `load_from_hub` here is the helper from the Hugging Face Deep RL course notebook
# (it downloads the pickled Q-table with `hf_hub_download` and unpickles it).
import gym

model = load_from_hub(repo_id="Farzad-ES/Taxi-v3_v1", filename="q-learning.pkl")
# Don't forget to check if you need to add additional attributes (is_slippery=False etc)
env = gym.make(model["env_id"])
```
|
Cameron/BERT-eec-emotion
|
[
"pytorch",
"jax",
"bert",
"text-classification",
"transformers"
] |
text-classification
|
{
"architectures": [
"BertForSequenceClassification"
],
"model_type": "bert",
"task_specific_params": {
"conversational": {
"max_length": null
},
"summarization": {
"early_stopping": null,
"length_penalty": null,
"max_length": null,
"min_length": null,
"no_repeat_ngram_size": null,
"num_beams": null,
"prefix": null
},
"text-generation": {
"do_sample": null,
"max_length": null
},
"translation_en_to_de": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_fr": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_ro": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
}
}
}
| 36 | null |
---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- conll2003
metrics:
- precision
- recall
- f1
- accuracy
model-index:
- name: distilbert-base-uncased-finetuned-ner
results:
- task:
name: Token Classification
type: token-classification
dataset:
name: conll2003
type: conll2003
config: conll2003
split: validation
args: conll2003
metrics:
- name: Precision
type: precision
value: 0.9271559836156316
- name: Recall
type: recall
value: 0.9369056941492337
- name: F1
type: f1
value: 0.9320053416425551
- name: Accuracy
type: accuracy
value: 0.9837482326401575
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-uncased-finetuned-ner
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the conll2003 dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0603
- Precision: 0.9272
- Recall: 0.9369
- F1: 0.9320
- Accuracy: 0.9837
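The reported F1 is the harmonic mean of the precision and recall above, which can be verified directly (a quick consistency check, not part of the evaluation code):

```python
precision, recall = 0.9272, 0.9369

# F1 = harmonic mean of precision and recall
f1 = 2 * precision * recall / (precision + recall)
print(round(f1, 4))
```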
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3
### Training results
| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:|:--------:|
| 0.2437 | 1.0 | 878 | 0.0672 | 0.9145 | 0.9203 | 0.9174 | 0.9813 |
| 0.053 | 2.0 | 1756 | 0.0597 | 0.9229 | 0.9350 | 0.9289 | 0.9832 |
| 0.0301 | 3.0 | 2634 | 0.0603 | 0.9272 | 0.9369 | 0.9320 | 0.9837 |
### Framework versions
- Transformers 4.26.1
- Pytorch 1.13.1+cu116
- Datasets 2.10.1
- Tokenizers 0.13.2
|
Canadiancaleb/DialoGPT-small-jesse
|
[
"pytorch",
"gpt2",
"text-generation",
"transformers",
"conversational"
] |
conversational
|
{
"architectures": [
"GPT2LMHeadModel"
],
"model_type": "gpt2",
"task_specific_params": {
"conversational": {
"max_length": 1000
},
"summarization": {
"early_stopping": null,
"length_penalty": null,
"max_length": null,
"min_length": null,
"no_repeat_ngram_size": null,
"num_beams": null,
"prefix": null
},
"text-generation": {
"do_sample": null,
"max_length": null
},
"translation_en_to_de": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_fr": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_ro": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
}
}
}
| 9 | null |
---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- emotion
metrics:
- accuracy
- f1
model-index:
- name: distilbert-base-uncased-finetuned-emotion
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: emotion
type: emotion
config: split
split: validation
args: split
metrics:
- name: Accuracy
type: accuracy
value: 0.9415
- name: F1
type: f1
value: 0.9415982247067175
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-uncased-finetuned-emotion
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the emotion dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1484
- Accuracy: 0.9415
- F1: 0.9416
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:------:|
| 0.2501 | 1.0 | 1000 | 0.2103 | 0.927 | 0.9268 |
| 0.1599 | 2.0 | 2000 | 0.1605 | 0.939 | 0.9393 |
| 0.0969 | 3.0 | 3000 | 0.1484 | 0.9415 | 0.9416 |
### Framework versions
- Transformers 4.26.1
- Pytorch 1.13.1+cu116
- Datasets 2.10.1
- Tokenizers 0.13.2
|
Canadiancaleb/DialoGPT-small-walter
|
[
"pytorch",
"gpt2",
"text-generation",
"transformers",
"conversational"
] |
conversational
|
{
"architectures": [
"GPT2LMHeadModel"
],
"model_type": "gpt2",
"task_specific_params": {
"conversational": {
"max_length": 1000
},
"summarization": {
"early_stopping": null,
"length_penalty": null,
"max_length": null,
"min_length": null,
"no_repeat_ngram_size": null,
"num_beams": null,
"prefix": null
},
"text-generation": {
"do_sample": null,
"max_length": null
},
"translation_en_to_de": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_fr": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_ro": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
}
}
}
| 13 | null |
---
library_name: stable-baselines3
tags:
- AntBulletEnv-v0
- deep-reinforcement-learning
- reinforcement-learning
- stable-baselines3
model-index:
- name: A2C
results:
- task:
type: reinforcement-learning
name: reinforcement-learning
dataset:
name: AntBulletEnv-v0
type: AntBulletEnv-v0
metrics:
- type: mean_reward
value: 1518.86 +/- 58.27
name: mean_reward
verified: false
---
# **A2C** Agent playing **AntBulletEnv-v0**
This is a trained model of an **A2C** agent playing **AntBulletEnv-v0**
using the [stable-baselines3 library](https://github.com/DLR-RM/stable-baselines3).
## Usage (with Stable-baselines3)
A minimal loading sketch (the repo id and checkpoint filename are placeholders — check this repository's files):
```python
from huggingface_sb3 import load_from_hub
from stable_baselines3 import A2C

# Download the checkpoint from the Hub and load it into an A2C model.
checkpoint = load_from_hub(repo_id="<repo-id>", filename="<checkpoint>.zip")
model = A2C.load(checkpoint)
```
|
Canadiancaleb/jessebot
|
[] | null |
{
"architectures": null,
"model_type": null,
"task_specific_params": {
"conversational": {
"max_length": null
},
"summarization": {
"early_stopping": null,
"length_penalty": null,
"max_length": null,
"min_length": null,
"no_repeat_ngram_size": null,
"num_beams": null,
"prefix": null
},
"text-generation": {
"do_sample": null,
"max_length": null
},
"translation_en_to_de": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_fr": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_ro": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
}
}
}
| 0 | null |
---
license: cc
datasets:
- animelover/danbooru2022
- KBlueLeaf/Danbooru2021-SQLite
language:
- en
metrics:
- accuracy
- character
pipeline_tag: image-classification
tags:
- danbooru
- anime
---
# Deep Danbooru Models
This repository contains the Deep Danbooru models, trained to tag images from [Danbooru](https://danbooru.donmai.us).
https://github.com/libwaifu/deep-danbooru
https://huggingface.co/oovm/deep-danbooru
|
Capreolus/birch-bert-large-mb
|
[
"pytorch",
"tf",
"jax",
"bert",
"next-sentence-prediction",
"transformers"
] | null |
{
"architectures": [
"BertForNextSentencePrediction"
],
"model_type": "bert",
"task_specific_params": {
"conversational": {
"max_length": null
},
"summarization": {
"early_stopping": null,
"length_penalty": null,
"max_length": null,
"min_length": null,
"no_repeat_ngram_size": null,
"num_beams": null,
"prefix": null
},
"text-generation": {
"do_sample": null,
"max_length": null
},
"translation_en_to_de": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_fr": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_ro": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
}
}
}
| 1 | null |
---
license: apache-2.0
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: distilbert-base-uncased-finetuned-qnli
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-uncased-finetuned-qnli
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 1.7416
- Accuracy: 0.2781
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 1.738 | 1.0 | 635 | 1.7130 | 0.2656 |
| 1.671 | 2.0 | 1270 | 1.7190 | 0.2445 |
| 1.5906 | 3.0 | 1905 | 1.7416 | 0.2781 |
| 1.3229 | 4.0 | 2540 | 1.8047 | 0.2781 |
| 1.217 | 5.0 | 3175 | 1.8409 | 0.2773 |
### Framework versions
- Transformers 4.26.1
- Pytorch 1.13.1+cu116
- Datasets 2.10.1
- Tokenizers 0.13.2
|
Captain272/lstm
|
[] | null |
{
"architectures": null,
"model_type": null,
"task_specific_params": {
"conversational": {
"max_length": null
},
"summarization": {
"early_stopping": null,
"length_penalty": null,
"max_length": null,
"min_length": null,
"no_repeat_ngram_size": null,
"num_beams": null,
"prefix": null
},
"text-generation": {
"do_sample": null,
"max_length": null
},
"translation_en_to_de": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_fr": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_ro": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
}
}
}
| 0 | null |
---
license: mit
tags:
- simplification
- generated_from_trainer
metrics:
- bleu
model-index:
- name: mbart-simplification
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# mbart-simplification
This model is a fine-tuned version of [facebook/mbart-large-50](https://huggingface.co/facebook/mbart-large-50) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6308
- Bleu: 68.7458
- Gen Len: 31.1582
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5.6e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 2
### Training results
| Training Loss | Epoch | Step | Validation Loss | Bleu | Gen Len |
|:-------------:|:-----:|:----:|:---------------:|:-------:|:-------:|
| No log | 1.0 | 168 | 0.6290 | 66.5469 | 29.3433 |
| No log | 2.0 | 336 | 0.6308 | 68.7458 | 31.1582 |
### Framework versions
- Transformers 4.26.1
- Pytorch 1.13.1+cu116
- Datasets 2.10.1
- Tokenizers 0.13.2
|
CarlosPR/mt5-spanish-memmories-analysis
|
[
"pytorch",
"mt5",
"text2text-generation",
"transformers",
"autotrain_compatible"
] |
text2text-generation
|
{
"architectures": [
"MT5ForConditionalGeneration"
],
"model_type": "mt5",
"task_specific_params": {
"conversational": {
"max_length": null
},
"summarization": {
"early_stopping": null,
"length_penalty": null,
"max_length": null,
"min_length": null,
"no_repeat_ngram_size": null,
"num_beams": null,
"prefix": null
},
"text-generation": {
"do_sample": null,
"max_length": null
},
"translation_en_to_de": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_fr": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_ro": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
}
}
}
| 7 | null |
---
license: creativeml-openrail-m
base_model: /home/mobile360/data/lucien/model/diffusers/stable-diffusion-v1-5
tags:
- stable-diffusion
- stable-diffusion-diffusers
- text-to-image
- diffusers
- lora
inference: true
---
# LoRA text2image fine-tuning - https://huggingface.co/lluu/pokemon-lora
These are LoRA adaptation weights for /home/mobile360/data/lucien/model/diffusers/stable-diffusion-v1-5. The weights were fine-tuned on the lambdalabs/pokemon-blip-captions dataset. Some example images are shown below.




|
dccuchile/albert-base-spanish-finetuned-pos
|
[
"pytorch",
"albert",
"token-classification",
"transformers",
"autotrain_compatible"
] |
token-classification
|
{
"architectures": [
"AlbertForTokenClassification"
],
"model_type": "albert",
"task_specific_params": {
"conversational": {
"max_length": null
},
"summarization": {
"early_stopping": null,
"length_penalty": null,
"max_length": null,
"min_length": null,
"no_repeat_ngram_size": null,
"num_beams": null,
"prefix": null
},
"text-generation": {
"do_sample": null,
"max_length": null
},
"translation_en_to_de": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_fr": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_ro": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
}
}
}
| 5 | null |
---
tags:
- unity-ml-agents
- ml-agents
- deep-reinforcement-learning
- reinforcement-learning
- ML-Agents-SoccerTwos
library_name: ml-agents
---
# **poca** Agent playing **SoccerTwos**
This is a trained model of a **poca** agent playing **SoccerTwos** using the [Unity ML-Agents Library](https://github.com/Unity-Technologies/ml-agents).
## Usage (with ML-Agents)
The Documentation: https://github.com/huggingface/ml-agents#get-started
We wrote a complete tutorial on training your first agent with ML-Agents and publishing it to the Hub.
### Resume the training
```
mlagents-learn <your_configuration_file_path.yaml> --run-id=<run_id> --resume
```
### Watch your Agent play
You can watch your agent **playing directly in your browser**:
1. Go to https://huggingface.co/spaces/unity/ML-Agents-SoccerTwos
2. Write your model_id: FlavienDeseure/poca-SoccerTwos
3. Select your *.nn /*.onnx file
4. Click on Watch the agent play 👀
|
dccuchile/albert-base-spanish-finetuned-xnli
|
[
"pytorch",
"albert",
"text-classification",
"transformers"
] |
text-classification
|
{
"architectures": [
"AlbertForSequenceClassification"
],
"model_type": "albert",
"task_specific_params": {
"conversational": {
"max_length": null
},
"summarization": {
"early_stopping": null,
"length_penalty": null,
"max_length": null,
"min_length": null,
"no_repeat_ngram_size": null,
"num_beams": null,
"prefix": null
},
"text-generation": {
"do_sample": null,
"max_length": null
},
"translation_en_to_de": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_fr": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_ro": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
}
}
}
| 28 | null |
Access to model multimodalart/test5 is restricted and you are not in the authorized list. Visit https://huggingface.co/multimodalart/test5 to ask for access.
|
dccuchile/albert-tiny-spanish-finetuned-mldoc
|
[
"pytorch",
"albert",
"text-classification",
"transformers"
] |
text-classification
|
{
"architectures": [
"AlbertForSequenceClassification"
],
"model_type": "albert",
"task_specific_params": {
"conversational": {
"max_length": null
},
"summarization": {
"early_stopping": null,
"length_penalty": null,
"max_length": null,
"min_length": null,
"no_repeat_ngram_size": null,
"num_beams": null,
"prefix": null
},
"text-generation": {
"do_sample": null,
"max_length": null
},
"translation_en_to_de": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_fr": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_ro": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
}
}
}
| 32 | null |
---
license: other
tags:
- generated_from_trainer
model-index:
- name: segment_test
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# segment_test
This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1260
- Mean Iou: 0.8447
- Mean Accuracy: 0.9060
- Overall Accuracy: 0.9643
- Per Category Iou: [0.8581731584290526, 0.9316708017224354, 0.6429457606853357, 0.9461543989839276]
- Per Category Accuracy: [0.9425328146084153, 0.9716290009653072, 0.7414471782174548, 0.9684613946705924]
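The reported Mean IoU and Mean Accuracy are the unweighted means of the per-category values above (a quick consistency check, not part of the evaluation code):

```python
per_category_iou = [0.8581731584290526, 0.9316708017224354, 0.6429457606853357, 0.9461543989839276]
per_category_acc = [0.9425328146084153, 0.9716290009653072, 0.7414471782174548, 0.9684613946705924]

# Unweighted mean over the four segmentation categories.
mean_iou = sum(per_category_iou) / len(per_category_iou)
mean_acc = sum(per_category_acc) / len(per_category_acc)
print(round(mean_iou, 4), round(mean_acc, 4))
```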
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 6e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Per Category Iou | Per Category Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:-----------------------------------------------------------------------------------:|:-----------------------------------------------------------------------------------:|
| 1.0598 | 0.12 | 20 | 1.0566 | 0.4321 | 0.5515 | 0.8279 | [0.006337257377741027, 0.7300264416674316, 0.23785776348718268, 0.7540601698014678] | [0.008091933477034156, 0.9660296098187956, 0.4518657444815921, 0.779822911077032] |
| 0.7539 | 0.25 | 40 | 0.6128 | 0.4529 | 0.5206 | 0.8749 | [0.00885438037222906, 0.7887343476637405, 0.18630294846181414, 0.827538656414587] | [0.009217881990081138, 0.9747549910409742, 0.23250084632472676, 0.8657848800171862] |
| 0.8122 | 0.38 | 60 | 0.4928 | 0.4920 | 0.5507 | 0.8982 | [0.02270113673871408, 0.8263637783329851, 0.2554015068180403, 0.8635887381305902] | [0.023653681019224364, 0.9630036206967801, 0.30407150146655104, 0.9122355221296055] |
| 0.5643 | 0.5 | 80 | 0.4269 | 0.5063 | 0.5663 | 0.9129 | [0.007592876375505398, 0.8532772644930355, 0.27466317854807065, 0.8898355351518092] | [0.007684489073480188, 0.9441805272844986, 0.3632055074118278, 0.9501073674832299] |
| 0.4774 | 0.62 | 100 | 0.3648 | 0.5490 | 0.6034 | 0.9218 | [0.054267747505144744, 0.8633878614911209, 0.37650159164096025, 0.9020084353341656] | [0.055975413139862956, 0.9428365409577416, 0.45329399467742054, 0.961363980883101] |
| 0.6967 | 0.75 | 120 | 0.3155 | 0.6158 | 0.6710 | 0.9325 | [0.19913477919641662, 0.8733583185910088, 0.47153849968591927, 0.9190236996330312] | [0.20509568371799589, 0.9502280021743253, 0.5638919570734214, 0.9648268960892941] |
| 0.4543 | 0.88 | 140 | 0.2964 | 0.6272 | 0.7016 | 0.9310 | [0.2268179599804783, 0.8694887954672872, 0.4961026114602021, 0.9165896188501754] | [0.2321184480311235, 0.9344494411072529, 0.6716976656263668, 0.9680582434884342] |
| 0.2924 | 1.0 | 160 | 0.2722 | 0.6820 | 0.7592 | 0.9384 | [0.40849985639476094, 0.8843766461027478, 0.5120505100078259, 0.9231923075849184] | [0.4548831992709812, 0.9611082383832005, 0.667739089824088, 0.9532322308907735] |
| 0.3467 | 1.12 | 180 | 0.2517 | 0.6815 | 0.7430 | 0.9414 | [0.40527347045732925, 0.8910632241964158, 0.5029203059015287, 0.9266932255262942] | [0.447198710197501, 0.9557005046935547, 0.6038508392823166, 0.9651288642296557] |
| 0.5212 | 1.25 | 200 | 0.2396 | 0.6752 | 0.7408 | 0.9427 | [0.3668495492266549, 0.8948194469733646, 0.5099936563424119, 0.9292469056689828] | [0.3789189141825701, 0.965308543339751, 0.6575510698285852, 0.9614078764674099] |
| 0.3599 | 1.38 | 220 | 0.2108 | 0.6717 | 0.7162 | 0.9446 | [0.3204229923658836, 0.891874970369172, 0.5386909667961237, 0.9357227289520356] | [0.32437612813907435, 0.9667832593900233, 0.605627503453252, 0.9680877242092517] |
| 0.7198 | 1.5 | 240 | 0.2018 | 0.7692 | 0.8618 | 0.9527 | [0.6678205541462868, 0.9118050318864169, 0.5610132418656683, 0.9363469668607539] | [0.8350594956451641, 0.9497174012785288, 0.6927260551877853, 0.9696466684460499] |
| 0.3682 | 1.62 | 260 | 0.1931 | 0.7812 | 0.8479 | 0.9538 | [0.6890403931363558, 0.9109837638365132, 0.5887742209415688, 0.93605613049878] | [0.7620327532726986, 0.9497927662356516, 0.7058595323345466, 0.9739980042388998] |
| 0.3358 | 1.75 | 280 | 0.1947 | 0.7916 | 0.8837 | 0.9550 | [0.7285153761152414, 0.9167544021441286, 0.5844378827851311, 0.9366799980170186] | [0.8390747945253492, 0.9631461386121828, 0.7712946420849488, 0.9611668227501886] |
| 0.2334 | 1.88 | 300 | 0.2014 | 0.7843 | 0.8673 | 0.9522 | [0.7115100993908304, 0.9111127309647635, 0.5831612483285399, 0.9312333415206432] | [0.850726390130207, 0.9751160540868734, 0.6929311499390893, 0.9502407282203037] |
| 0.2996 | 2.0 | 320 | 0.1701 | 0.8027 | 0.8896 | 0.9568 | [0.7526564111480359, 0.9206156591940219, 0.5997533073741746, 0.9377288799677301] | [0.8344833780207841, 0.964968540097336, 0.7965929561072522, 0.9622910891033862] |
| 0.4622 | 2.12 | 340 | 0.1897 | 0.7614 | 0.8286 | 0.9509 | [0.5700844122547072, 0.9050700733042791, 0.6361100555452245, 0.9344921716585569] | [0.6064108967281776, 0.975059629707727, 0.777640224269875, 0.9551298779200561] |
| 0.338 | 2.25 | 360 | 0.1713 | 0.8093 | 0.8947 | 0.9579 | [0.7434863865211815, 0.9179570569529378, 0.6345544665942896, 0.9411654222120547] | [0.9167652419255911, 0.9513808608505936, 0.7380297560360868, 0.9725677707832107] |
| 0.1828 | 2.38 | 380 | 0.1613 | 0.8102 | 0.8854 | 0.9583 | [0.7563041981028708, 0.923073161283024, 0.6226932638283688, 0.9387242789473116] | [0.8467351874244257, 0.9575329724998702, 0.7662636431252486, 0.9708901596701879] |
| 0.1848 | 2.5 | 400 | 0.1523 | 0.8332 | 0.8882 | 0.9617 | [0.8272877802882616, 0.9256725856927198, 0.6369852470922355, 0.9430524306296677] | [0.8886931987452464, 0.9707900525673887, 0.7260329485953481, 0.9672511041320438] |
| 0.2053 | 2.62 | 420 | 0.1450 | 0.8382 | 0.9129 | 0.9626 | [0.8318187789064231, 0.9285519737454524, 0.6480460535396757, 0.9442850494628351] | [0.9293894467518357, 0.9631609731907377, 0.7888883123172988, 0.970120219961517] |
| 0.1596 | 2.75 | 440 | 0.1376 | 0.8327 | 0.8953 | 0.9624 | [0.8243205167317957, 0.928415735271021, 0.6344793824796972, 0.9436346449664342] | [0.9292602036345793, 0.9591271598881472, 0.717386845766276, 0.9752671632199643] |
| 0.137 | 2.88 | 460 | 0.1472 | 0.8260 | 0.9081 | 0.9616 | [0.7858947726139123, 0.9279875195406946, 0.6464381724674384, 0.9437068326674424] | [0.9422546133221177, 0.9608167124242775, 0.7587838622553998, 0.9706571968763456] |
| 0.1679 | 3.0 | 480 | 0.1471 | 0.8331 | 0.9126 | 0.9624 | [0.8265760388185004, 0.9295336697547381, 0.6321998556258659, 0.9441039056088667] | [0.9403510155442231, 0.9637791247810575, 0.7769013889609604, 0.9693229385117723] |
| 0.1736 | 3.12 | 500 | 0.1322 | 0.8402 | 0.8960 | 0.9634 | [0.8539363341788129, 0.9291877911278092, 0.632669557547822, 0.9451061440414477] | [0.9248133641764366, 0.9646743651065282, 0.721404726074956, 0.9731695680841902] |
| 0.234 | 3.25 | 520 | 0.1355 | 0.8448 | 0.9076 | 0.9632 | [0.8526502893205608, 0.9289085404109714, 0.6534740415947964, 0.944330630190735] | [0.9257553055394915, 0.9662036512136275, 0.7685765188748946, 0.9699754203327694] |
| 0.1183 | 3.38 | 540 | 0.1256 | 0.8402 | 0.8954 | 0.9637 | [0.8579969655846054, 0.9294856863546612, 0.6276158375419669, 0.9457592749372996] | [0.9389994392163048, 0.9661690813475308, 0.7035145333106, 0.9727224283186038] |
| 0.2085 | 3.5 | 560 | 0.1279 | 0.8480 | 0.9079 | 0.9642 | [0.8653295827282605, 0.9310186709909459, 0.6502631789997088, 0.9455883129741319] | [0.9333061703730964, 0.9630336872086728, 0.7613438401150507, 0.9739094690773278] |
| 0.3477 | 3.62 | 580 | 0.1419 | 0.8392 | 0.9182 | 0.9627 | [0.8356838761929862, 0.9295750569194587, 0.6477355195833494, 0.9437915808167578] | [0.9515710705711231, 0.9679427406460035, 0.7873562792352684, 0.9660274217203163] |
| 0.4279 | 3.75 | 600 | 0.1241 | 0.8453 | 0.8990 | 0.9639 | [0.8543157504970579, 0.9297143070002641, 0.6517228913538011, 0.9456141839889123] | [0.9299589926922874, 0.955372819449404, 0.7305771563983385, 0.9801914665868114] |
| 0.1622 | 3.88 | 620 | 0.1255 | 0.8438 | 0.9206 | 0.9638 | [0.8381374979278546, 0.9308920069470188, 0.6604220534808366, 0.9458239210474509] | [0.9524867251984649, 0.95692011897332, 0.7977592780664754, 0.9753784831657644] |
| 0.1982 | 4.0 | 640 | 0.1229 | 0.8484 | 0.9091 | 0.9647 | [0.8568106859830862, 0.9316640237424656, 0.6584440596013207, 0.9466997405108393] | [0.9452293955803235, 0.9595748462766798, 0.754422510013813, 0.9770213126081696] |
| 0.2273 | 4.12 | 660 | 0.1237 | 0.8444 | 0.8986 | 0.9640 | [0.8592317269149002, 0.9306231098333563, 0.6421928363458029, 0.9456102319021257] | [0.9358165536337031, 0.9691394407999653, 0.7183357178686949, 0.970921035377795] |
| 0.2371 | 4.25 | 680 | 0.1216 | 0.8460 | 0.9096 | 0.9645 | [0.8583380957371707, 0.9321090417297068, 0.6472432759450742, 0.9464390728899327] | [0.9508328514098453, 0.961341618092252, 0.7507901089967407, 0.9753970829896241] |
| 0.2474 | 4.38 | 700 | 0.1283 | 0.8411 | 0.9093 | 0.9635 | [0.8497493402399663, 0.9309167522466655, 0.6387399097291454, 0.9448643127623003] | [0.9486861013266039, 0.9710814460747176, 0.7504342819583336, 0.9668204252105732] |
| 0.2696 | 4.5 | 720 | 0.1273 | 0.8430 | 0.9049 | 0.9633 | [0.8574838830140721, 0.929602014003094, 0.6403298755318946, 0.9444960387864841] | [0.9355734013283564, 0.9743545898715855, 0.7448102379346218, 0.9649164542411783] |
| 0.1506 | 4.62 | 740 | 0.1255 | 0.8479 | 0.9182 | 0.9646 | [0.8525455460809918, 0.9327217587877741, 0.6600164613831294, 0.9464177410883513] | [0.9496433766188248, 0.9639090597950073, 0.7866569802639545, 0.972458775815393] |
| 0.2033 | 4.75 | 760 | 0.1209 | 0.8501 | 0.9107 | 0.9650 | [0.8658943552605436, 0.9327154832962604, 0.6546122598692601, 0.9470460950988947] | [0.935571210767047, 0.9689885784341254, 0.76733359526157, 0.9708881136895634] |
| 0.1636 | 4.88 | 780 | 0.1203 | 0.8483 | 0.9157 | 0.9648 | [0.8518065574480739, 0.9329685900413733, 0.661433964801116, 0.9468031896359194] | [0.9475010076582023, 0.9648201943117868, 0.7779515729284813, 0.9726341721543897] |
| 0.1047 | 5.0 | 800 | 0.1260 | 0.8447 | 0.9060 | 0.9643 | [0.8581731584290526, 0.9316708017224354, 0.6429457606853357, 0.9461543989839276] | [0.9425328146084153, 0.9716290009653072, 0.7414471782174548, 0.9684613946705924] |
### Framework versions
- Transformers 4.26.1
- Pytorch 1.13.0
- Datasets 2.10.1
- Tokenizers 0.13.2
|
dccuchile/albert-tiny-spanish-finetuned-ner
|
[
"pytorch",
"albert",
"token-classification",
"transformers",
"autotrain_compatible"
] |
token-classification
|
{
"architectures": [
"AlbertForTokenClassification"
],
"model_type": "albert",
"task_specific_params": {
"conversational": {
"max_length": null
},
"summarization": {
"early_stopping": null,
"length_penalty": null,
"max_length": null,
"min_length": null,
"no_repeat_ngram_size": null,
"num_beams": null,
"prefix": null
},
"text-generation": {
"do_sample": null,
"max_length": null
},
"translation_en_to_de": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_fr": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_ro": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
}
}
}
| 8 | null |
---
tags:
- Taxi-v3
- q-learning
- reinforcement-learning
- custom-implementation
model-index:
- name: q-Taxi-v3
results:
- task:
type: reinforcement-learning
name: reinforcement-learning
dataset:
name: Taxi-v3
type: Taxi-v3
metrics:
- type: mean_reward
value: 7.56 +/- 2.71
name: mean_reward
verified: false
---
# **Q-Learning** Agent playing **Taxi-v3**
This is a trained model of a **Q-Learning** agent playing **Taxi-v3** .
## Usage
```python
import gym

# `load_from_hub` is the helper defined in the Deep RL Course notebook
model = load_from_hub(repo_id="MarkieMark1/q-Taxi-v3", filename="q-learning.pkl")

# Don't forget to check if you need to add additional attributes (is_slippery=False etc.)
env = gym.make(model["env_id"])
```
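Once loaded, acting with the Q-table is just an argmax over the current state's action values. A pure-Python sketch (the Q-table layout — one row of action values per state — is an assumption about the pickled model):

```python
# Greedy policy sketch: pick the best action for one state's Q-values.
# `q_row` stands in for one row of the loaded Q-table (layout assumed).
def greedy_action(q_row):
    # Exploit: return the index of the highest-valued action
    return max(range(len(q_row)), key=q_row.__getitem__)

action = greedy_action([0.1, 0.5, 0.2])
```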
|
dccuchile/albert-tiny-spanish-finetuned-pos
|
[
"pytorch",
"albert",
"token-classification",
"transformers",
"autotrain_compatible"
] |
token-classification
|
{
"architectures": [
"AlbertForTokenClassification"
],
"model_type": "albert",
"task_specific_params": {
"conversational": {
"max_length": null
},
"summarization": {
"early_stopping": null,
"length_penalty": null,
"max_length": null,
"min_length": null,
"no_repeat_ngram_size": null,
"num_beams": null,
"prefix": null
},
"text-generation": {
"do_sample": null,
"max_length": null
},
"translation_en_to_de": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_fr": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_ro": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
}
}
}
| 5 | null |
---
tags:
- autotrain
- text-classification
language:
- en
widget:
- text: "I love AutoTrain 🤗"
datasets:
- lucafrost/autotrain-data-claimcoherence-lf
co2_eq_emissions:
emissions: 0.5905299701991715
---
# Model Trained Using AutoTrain
- Problem type: Binary Classification
- Model ID: 39443102994
- CO2 Emissions (in grams): 0.5905
## Validation Metrics
- Loss: 0.396
- Accuracy: 0.820
- Precision: 0.913
- Recall: 0.750
- AUC: 0.907
- F1: 0.824
## Usage
You can use cURL to access this model:
```bash
$ curl -X POST -H "Authorization: Bearer YOUR_API_KEY" -H "Content-Type: application/json" -d '{"inputs": "I love AutoTrain"}' https://api-inference.huggingface.co/models/lucafrost/autotrain-claimcoherence-lf-39443102994
```
Or Python API:
```python
from transformers import AutoModelForSequenceClassification, AutoTokenizer
model = AutoModelForSequenceClassification.from_pretrained("lucafrost/autotrain-claimcoherence-lf-39443102994", use_auth_token=True)
tokenizer = AutoTokenizer.from_pretrained("lucafrost/autotrain-claimcoherence-lf-39443102994", use_auth_token=True)
inputs = tokenizer("I love AutoTrain", return_tensors="pt")
outputs = model(**inputs)
```
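The `outputs` above contain raw logits; turning them into class probabilities is a softmax over the two classes. A dependency-free sketch (the example logits are made up):

```python
import math

def softmax(logits):
    # Numerically stable softmax over raw classifier scores
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

# hypothetical logits for the two classes of this binary classifier
probs = softmax([-1.2, 2.3])
predicted_class = max(range(len(probs)), key=probs.__getitem__)
```

In practice you would read the logits from `outputs.logits[0]` and map `predicted_class` through the model's `id2label` config.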
|
dccuchile/albert-xlarge-spanish-finetuned-mldoc
|
[
"pytorch",
"albert",
"text-classification",
"transformers"
] |
text-classification
|
{
"architectures": [
"AlbertForSequenceClassification"
],
"model_type": "albert",
"task_specific_params": {
"conversational": {
"max_length": null
},
"summarization": {
"early_stopping": null,
"length_penalty": null,
"max_length": null,
"min_length": null,
"no_repeat_ngram_size": null,
"num_beams": null,
"prefix": null
},
"text-generation": {
"do_sample": null,
"max_length": null
},
"translation_en_to_de": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_fr": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_ro": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
}
}
}
| 26 | null |
---
license: creativeml-openrail-m
tags:
- text-to-image
- stable-diffusion
---
Comparison:

Sample pictures of this concept:
More detailed samples, along with comparisons between different models and combinations, will be uploaded this month in my free time.


|
dccuchile/albert-xlarge-spanish-finetuned-pawsx
|
[
"pytorch",
"albert",
"text-classification",
"transformers"
] |
text-classification
|
{
"architectures": [
"AlbertForSequenceClassification"
],
"model_type": "albert",
"task_specific_params": {
"conversational": {
"max_length": null
},
"summarization": {
"early_stopping": null,
"length_penalty": null,
"max_length": null,
"min_length": null,
"no_repeat_ngram_size": null,
"num_beams": null,
"prefix": null
},
"text-generation": {
"do_sample": null,
"max_length": null
},
"translation_en_to_de": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_fr": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_ro": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
}
}
}
| 24 | null |
---
library_name: stable-baselines3
tags:
- SpaceInvadersNoFrameskip-v4
- deep-reinforcement-learning
- reinforcement-learning
- stable-baselines3
model-index:
- name: DQN
results:
- task:
type: reinforcement-learning
name: reinforcement-learning
dataset:
name: SpaceInvadersNoFrameskip-v4
type: SpaceInvadersNoFrameskip-v4
metrics:
- type: mean_reward
value: 541.50 +/- 153.00
name: mean_reward
verified: false
---
# **DQN** Agent playing **SpaceInvadersNoFrameskip-v4**
This is a trained model of a **DQN** agent playing **SpaceInvadersNoFrameskip-v4**
using the [stable-baselines3 library](https://github.com/DLR-RM/stable-baselines3)
and the [RL Zoo](https://github.com/DLR-RM/rl-baselines3-zoo).
The RL Zoo is a training framework for Stable Baselines3
reinforcement learning agents,
with hyperparameter optimization and pre-trained agents included.
## Usage (with SB3 RL Zoo)
RL Zoo: https://github.com/DLR-RM/rl-baselines3-zoo<br/>
SB3: https://github.com/DLR-RM/stable-baselines3<br/>
SB3 Contrib: https://github.com/Stable-Baselines-Team/stable-baselines3-contrib
Install the RL Zoo (with SB3 and SB3-Contrib):
```bash
pip install rl_zoo3
```
```bash
# Download model and save it into the logs/ folder
python -m rl_zoo3.load_from_hub --algo dqn --env SpaceInvadersNoFrameskip-v4 -orga Ibtisam -f logs/
python -m rl_zoo3.enjoy --algo dqn --env SpaceInvadersNoFrameskip-v4 -f logs/
```
If you installed the RL Zoo3 via pip (`pip install rl_zoo3`), from anywhere you can do:
```bash
python -m rl_zoo3.load_from_hub --algo dqn --env SpaceInvadersNoFrameskip-v4 -orga Ibtisam -f logs/
python -m rl_zoo3.enjoy --algo dqn --env SpaceInvadersNoFrameskip-v4 -f logs/
```
## Training (with the RL Zoo)
```bash
python -m rl_zoo3.train --algo dqn --env SpaceInvadersNoFrameskip-v4 -f logs/
# Upload the model and generate video (when possible)
python -m rl_zoo3.push_to_hub --algo dqn --env SpaceInvadersNoFrameskip-v4 -f logs/ -orga Ibtisam
```
## Hyperparameters
```python
OrderedDict([('batch_size', 64),
('buffer_size', 100000),
('env_wrapper',
['stable_baselines3.common.atari_wrappers.AtariWrapper']),
('exploration_final_eps', 0.02),
('exploration_fraction', 0.1),
('frame_stack', 4),
('gradient_steps', 1),
('learning_rate', 0.00025),
('learning_starts', 10000),
('n_timesteps', 1000000),
('optimize_memory_usage', False),
('policy', 'CnnPolicy'),
('target_update_interval', 1000),
('train_freq', 4),
('normalize', False)])
```
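The exploration settings above imply a concrete epsilon schedule: epsilon decays linearly to `exploration_final_eps` (0.02) over the first `exploration_fraction` (10%) of the 1M training steps, then stays flat. A sketch (the initial epsilon of 1.0 is assumed, matching SB3's default):

```python
def epsilon_at(step,
               total_timesteps=1_000_000,
               exploration_fraction=0.1,
               initial_eps=1.0,
               final_eps=0.02):
    # Linear decay over the first 100k steps, then constant at final_eps
    decay_steps = exploration_fraction * total_timesteps
    progress = min(step / decay_steps, 1.0)
    return initial_eps + progress * (final_eps - initial_eps)
```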
|
dccuchile/albert-base-spanish
|
[
"pytorch",
"tf",
"albert",
"pretraining",
"es",
"dataset:large_spanish_corpus",
"transformers",
"spanish",
"OpenCENIA"
] | null |
{
"architectures": [
"AlbertForPreTraining"
],
"model_type": "albert",
"task_specific_params": {
"conversational": {
"max_length": null
},
"summarization": {
"early_stopping": null,
"length_penalty": null,
"max_length": null,
"min_length": null,
"no_repeat_ngram_size": null,
"num_beams": null,
"prefix": null
},
"text-generation": {
"do_sample": null,
"max_length": null
},
"translation_en_to_de": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_fr": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_ro": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
}
}
}
| 586 | 2023-03-07T11:14:44Z |
---
library_name: stable-baselines3
tags:
- PandaReachDense-v2
- deep-reinforcement-learning
- reinforcement-learning
- stable-baselines3
model-index:
- name: A2C
results:
- task:
type: reinforcement-learning
name: reinforcement-learning
dataset:
name: PandaReachDense-v2
type: PandaReachDense-v2
metrics:
- type: mean_reward
value: -0.28 +/- 0.14
name: mean_reward
verified: false
---
# **A2C** Agent playing **PandaReachDense-v2**
This is a trained model of a **A2C** agent playing **PandaReachDense-v2**
using the [stable-baselines3 library](https://github.com/DLR-RM/stable-baselines3).
## Usage (with Stable-baselines3)
A minimal loading sketch (the repo id and checkpoint filename are placeholders; check the repository's files):
```python
from huggingface_sb3 import load_from_hub
from stable_baselines3 import A2C

# Download the checkpoint from the Hub and load it
checkpoint = load_from_hub(repo_id="<user>/<this-repo>", filename="<checkpoint>.zip")
model = A2C.load(checkpoint)
```
|
dccuchile/albert-tiny-spanish
|
[
"pytorch",
"tf",
"albert",
"pretraining",
"es",
"dataset:large_spanish_corpus",
"transformers",
"spanish",
"OpenCENIA"
] | null |
{
"architectures": [
"AlbertForPreTraining"
],
"model_type": "albert",
"task_specific_params": {
"conversational": {
"max_length": null
},
"summarization": {
"early_stopping": null,
"length_penalty": null,
"max_length": null,
"min_length": null,
"no_repeat_ngram_size": null,
"num_beams": null,
"prefix": null
},
"text-generation": {
"do_sample": null,
"max_length": null
},
"translation_en_to_de": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_fr": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_ro": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
}
}
}
| 393 | null |
---
license: apache-2.0
tags:
- summarization
- generated_from_trainer
metrics:
- rouge
model-index:
- name: mt5-small-finetuned-new3
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# mt5-small-finetuned-new3
This model is a fine-tuned version of [google/mt5-small](https://huggingface.co/google/mt5-small) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 2.3994
- Rouge1: 20.61
- Rouge2: 6.06
- Rougel: 20.22
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 9
- eval_batch_size: 9
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 40
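With `lr_scheduler_type: linear` and no warmup steps listed, the learning rate presumably decays linearly from 1e-4 to 0 over training; a sketch of that schedule (zero warmup is an assumption):

```python
def linear_lr(step, total_steps, base_lr=1e-4):
    # Linear decay from base_lr to 0 across training, assuming no warmup
    return base_lr * max(0.0, 1.0 - step / total_steps)
```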
### Training results
| Training Loss | Epoch | Step | Validation Loss | Rouge1 | Rouge2 | Rougel |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|
| 4.7204 | 1.45 | 500 | 2.6053 | 16.92 | 4.91 | 16.76 |
| 3.1289 | 2.9 | 1000 | 2.4878 | 18.01 | 5.24 | 17.83 |
| 2.8862 | 4.35 | 1500 | 2.4109 | 17.48 | 5.07 | 17.11 |
| 2.7669 | 5.8 | 2000 | 2.4006 | 18.57 | 5.26 | 18.22 |
| 2.6433 | 7.25 | 2500 | 2.4017 | 18.77 | 5.68 | 18.56 |
| 2.5514 | 8.7 | 3000 | 2.3917 | 19.38 | 5.9 | 19.11 |
| 2.4947 | 10.14 | 3500 | 2.3994 | 20.61 | 6.06 | 20.22 |
| 2.3995 | 11.59 | 4000 | 2.3608 | 20.13 | 6.5 | 19.78 |
| 2.3798 | 13.04 | 4500 | 2.3251 | 20.03 | 6.24 | 19.72 |
| 2.3029 | 14.49 | 5000 | 2.3387 | 19.69 | 6.13 | 19.44 |
| 2.2563 | 15.94 | 5500 | 2.3372 | 20.17 | 6.35 | 19.77 |
| 2.2109 | 17.39 | 6000 | 2.3410 | 20.58 | 6.36 | 20.1 |
### Framework versions
- Transformers 4.26.1
- Pytorch 1.13.1+cu116
- Datasets 2.10.1
- Tokenizers 0.13.2
|
dccuchile/albert-xxlarge-spanish
|
[
"pytorch",
"tf",
"albert",
"pretraining",
"es",
"dataset:large_spanish_corpus",
"transformers",
"spanish",
"OpenCENIA"
] | null |
{
"architectures": [
"AlbertForPreTraining"
],
"model_type": "albert",
"task_specific_params": {
"conversational": {
"max_length": null
},
"summarization": {
"early_stopping": null,
"length_penalty": null,
"max_length": null,
"min_length": null,
"no_repeat_ngram_size": null,
"num_beams": null,
"prefix": null
},
"text-generation": {
"do_sample": null,
"max_length": null
},
"translation_en_to_de": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_fr": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_ro": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
}
}
}
| 42 | null |
---
tags:
- CartPole-v1
- reinforce
- reinforcement-learning
- custom-implementation
- deep-rl-class
model-index:
- name: Reinforce-CartPole-v1
results:
- task:
type: reinforcement-learning
name: reinforcement-learning
dataset:
name: CartPole-v1
type: CartPole-v1
metrics:
- type: mean_reward
value: 500.00 +/- 0.00
name: mean_reward
verified: false
---
# **Reinforce** Agent playing **CartPole-v1**
This is a trained model of a **Reinforce** agent playing **CartPole-v1** .
To learn to use this model and train yours check Unit 4 of the Deep Reinforcement Learning Course: https://huggingface.co/deep-rl-course/unit4/introduction
|
dccuchile/bert-base-spanish-wwm-cased-finetuned-ner
|
[
"pytorch",
"bert",
"token-classification",
"transformers",
"autotrain_compatible"
] |
token-classification
|
{
"architectures": [
"BertForTokenClassification"
],
"model_type": "bert",
"task_specific_params": {
"conversational": {
"max_length": null
},
"summarization": {
"early_stopping": null,
"length_penalty": null,
"max_length": null,
"min_length": null,
"no_repeat_ngram_size": null,
"num_beams": null,
"prefix": null
},
"text-generation": {
"do_sample": null,
"max_length": null
},
"translation_en_to_de": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_fr": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_ro": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
}
}
}
| 81 | 2023-03-07T11:22:51Z |
---
license: mit
tags:
- generated_from_trainer
datasets:
- lg-ner
metrics:
- precision
- recall
- f1
- accuracy
model-index:
- name: luganda-ner-v3
results:
- task:
name: Token Classification
type: token-classification
dataset:
name: lg-ner
type: lg-ner
config: lug
split: test
args: lug
metrics:
- name: Precision
type: precision
value: 0.8141289437585734
- name: Recall
type: recall
value: 0.7971793149764943
- name: F1
type: f1
value: 0.8055649813369528
- name: Accuracy
type: accuracy
value: 0.952700740525628
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# luganda-ner-v3
This model is a fine-tuned version of [xlm-roberta-large](https://huggingface.co/xlm-roberta-large) on the lg-ner dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2295
- Precision: 0.8141
- Recall: 0.7972
- F1: 0.8056
- Accuracy: 0.9527
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 6
### Training results
| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:|:--------:|
| No log | 1.0 | 261 | 0.4226 | 0.6273 | 0.3606 | 0.4580 | 0.8928 |
| 0.5572 | 2.0 | 522 | 0.2835 | 0.7720 | 0.6185 | 0.6868 | 0.9219 |
| 0.5572 | 3.0 | 783 | 0.2740 | 0.7579 | 0.7401 | 0.7489 | 0.9311 |
| 0.1745 | 4.0 | 1044 | 0.2423 | 0.7895 | 0.7683 | 0.7788 | 0.9399 |
| 0.1745 | 5.0 | 1305 | 0.2273 | 0.8048 | 0.7945 | 0.7996 | 0.9498 |
| 0.086 | 6.0 | 1566 | 0.2295 | 0.8141 | 0.7972 | 0.8056 | 0.9527 |
### Framework versions
- Transformers 4.27.4
- Pytorch 1.13.1+cu116
- Datasets 2.11.0
- Tokenizers 0.13.2
|
dccuchile/bert-base-spanish-wwm-cased-finetuned-pos
|
[
"pytorch",
"bert",
"token-classification",
"transformers",
"autotrain_compatible"
] |
token-classification
|
{
"architectures": [
"BertForTokenClassification"
],
"model_type": "bert",
"task_specific_params": {
"conversational": {
"max_length": null
},
"summarization": {
"early_stopping": null,
"length_penalty": null,
"max_length": null,
"min_length": null,
"no_repeat_ngram_size": null,
"num_beams": null,
"prefix": null
},
"text-generation": {
"do_sample": null,
"max_length": null
},
"translation_en_to_de": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_fr": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_ro": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
}
}
}
| 1 | null |
---
tags:
- unity-ml-agents
- ml-agents
- deep-reinforcement-learning
- reinforcement-learning
- ML-Agents-Huggy
library_name: ml-agents
---
# **ppo** Agent playing **Huggy**
This is a trained model of a **ppo** agent playing **Huggy** using the [Unity ML-Agents Library](https://github.com/Unity-Technologies/ml-agents).
## Usage (with ML-Agents)
The Documentation: https://github.com/huggingface/ml-agents#get-started
We wrote a complete tutorial to learn to train your first agent using ML-Agents and publish it to the Hub:
### Resume the training
```
mlagents-learn <your_configuration_file_path.yaml> --run-id=<run_id> --resume
```
### Watch your Agent play
You can watch your agent **playing directly in your browser**:
1. Go to https://huggingface.co/spaces/unity/ML-Agents-Huggy
2. Step 1: Write your model_id: Y-T-G/ppo-Huggy
3. Step 2: Select your *.nn /*.onnx file
4. Click on Watch the agent play 👀
|
dccuchile/bert-base-spanish-wwm-uncased-finetuned-mldoc
|
[
"pytorch",
"bert",
"text-classification",
"transformers"
] |
text-classification
|
{
"architectures": [
"BertForSequenceClassification"
],
"model_type": "bert",
"task_specific_params": {
"conversational": {
"max_length": null
},
"summarization": {
"early_stopping": null,
"length_penalty": null,
"max_length": null,
"min_length": null,
"no_repeat_ngram_size": null,
"num_beams": null,
"prefix": null
},
"text-generation": {
"do_sample": null,
"max_length": null
},
"translation_en_to_de": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_fr": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_ro": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
}
}
}
| 39 | null |
---
tags:
- CartPole-v1
- reinforce
- reinforcement-learning
- custom-implementation
- deep-rl-class
model-index:
- name: Reinforce-cartpole
results:
- task:
type: reinforcement-learning
name: reinforcement-learning
dataset:
name: CartPole-v1
type: CartPole-v1
metrics:
- type: mean_reward
value: 360.80 +/- 40.06
name: mean_reward
verified: false
---
# **Reinforce** Agent playing **CartPole-v1**
This is a trained model of a **Reinforce** agent playing **CartPole-v1** .
To learn to use this model and train yours check Unit 4 of the Deep Reinforcement Learning Course: https://huggingface.co/deep-rl-course/unit4/introduction
|
dccuchile/bert-base-spanish-wwm-uncased-finetuned-pos
|
[
"pytorch",
"bert",
"token-classification",
"transformers",
"autotrain_compatible"
] |
token-classification
|
{
"architectures": [
"BertForTokenClassification"
],
"model_type": "bert",
"task_specific_params": {
"conversational": {
"max_length": null
},
"summarization": {
"early_stopping": null,
"length_penalty": null,
"max_length": null,
"min_length": null,
"no_repeat_ngram_size": null,
"num_beams": null,
"prefix": null
},
"text-generation": {
"do_sample": null,
"max_length": null
},
"translation_en_to_de": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_fr": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_ro": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
}
}
}
| 5 | null |
---
library_name: stable-baselines3
tags:
- LunarLander-v2
- deep-reinforcement-learning
- reinforcement-learning
- stable-baselines3
model-index:
- name: PPO
results:
- task:
type: reinforcement-learning
name: reinforcement-learning
dataset:
name: LunarLander-v2
type: LunarLander-v2
metrics:
- type: mean_reward
value: 277.65 +/- 18.91
name: mean_reward
verified: false
---
# **PPO** Agent playing **LunarLander-v2**
This is a trained model of a **PPO** agent playing **LunarLander-v2**
using the [stable-baselines3 library](https://github.com/DLR-RM/stable-baselines3).
## Usage (with Stable-baselines3)
A minimal loading sketch (the repo id and checkpoint filename are placeholders; check the repository's files):
```python
from huggingface_sb3 import load_from_hub
from stable_baselines3 import PPO

# Download the checkpoint from the Hub and load it
checkpoint = load_from_hub(repo_id="<user>/<this-repo>", filename="<checkpoint>.zip")
model = PPO.load(checkpoint)
```
|
dccuchile/distilbert-base-spanish-uncased-finetuned-mldoc
|
[
"pytorch",
"distilbert",
"text-classification",
"transformers"
] |
text-classification
|
{
"architectures": [
"DistilBertForSequenceClassification"
],
"model_type": "distilbert",
"task_specific_params": {
"conversational": {
"max_length": null
},
"summarization": {
"early_stopping": null,
"length_penalty": null,
"max_length": null,
"min_length": null,
"no_repeat_ngram_size": null,
"num_beams": null,
"prefix": null
},
"text-generation": {
"do_sample": null,
"max_length": null
},
"translation_en_to_de": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_fr": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_ro": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
}
}
}
| 27 | null |
---
tags:
- FrozenLake-v1-4x4-no_slippery
- q-learning
- reinforcement-learning
- custom-implementation
model-index:
- name: q-FrozenLake-v1-4x4-noSlippery
results:
- task:
type: reinforcement-learning
name: reinforcement-learning
dataset:
name: FrozenLake-v1-4x4-no_slippery
type: FrozenLake-v1-4x4-no_slippery
metrics:
- type: mean_reward
value: 1.00 +/- 0.00
name: mean_reward
verified: false
---
# **Q-Learning** Agent playing **FrozenLake-v1**
This is a trained model of a **Q-Learning** agent playing **FrozenLake-v1** .
## Usage
```python
import gym

# `load_from_hub` is the helper defined in the Deep RL Course notebook
model = load_from_hub(repo_id="Yureeh/q-FrozenLake-v1-4x4-noSlippery", filename="q-learning.pkl")

# Don't forget to check if you need to add additional attributes (is_slippery=False etc.)
env = gym.make(model["env_id"])
```
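For reference, the tabular Q-learning update this agent was trained with can be sketched in a few lines (the alpha and gamma values here are illustrative, not necessarily the ones used for this run):

```python
def q_update(q, state, action, reward, next_state, alpha=0.1, gamma=0.99):
    # Q(s, a) <- Q(s, a) + alpha * (r + gamma * max_a' Q(s', a') - Q(s, a))
    best_next = max(q[next_state])
    q[state][action] += alpha * (reward + gamma * best_next - q[state][action])
    return q

# toy 2-state, 2-action table
q = [[0.0, 0.0], [1.0, 0.0]]
q = q_update(q, state=0, action=0, reward=1.0, next_state=1)
```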
|
dccuchile/distilbert-base-spanish-uncased-finetuned-ner
|
[
"pytorch",
"distilbert",
"token-classification",
"transformers",
"autotrain_compatible"
] |
token-classification
|
{
"architectures": [
"DistilBertForTokenClassification"
],
"model_type": "distilbert",
"task_specific_params": {
"conversational": {
"max_length": null
},
"summarization": {
"early_stopping": null,
"length_penalty": null,
"max_length": null,
"min_length": null,
"no_repeat_ngram_size": null,
"num_beams": null,
"prefix": null
},
"text-generation": {
"do_sample": null,
"max_length": null
},
"translation_en_to_de": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_fr": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_ro": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
}
}
}
| 28 | null |
---
license: creativeml-openrail-m
tags:
- text-to-image
- stable-diffusion
---
### wasma-agha Dreambooth model trained by Falah with [TheLastBen's fast-DreamBooth](https://colab.research.google.com/github/TheLastBen/fast-stable-diffusion/blob/main/fast-DreamBooth.ipynb) notebook
Test the concept via A1111 Colab [fast-Colab-A1111](https://colab.research.google.com/github/TheLastBen/fast-stable-diffusion/blob/main/fast_stable_diffusion_AUTOMATIC1111.ipynb)
Sample pictures of this concept:


|
dccuchile/distilbert-base-spanish-uncased-finetuned-pos
|
[
"pytorch",
"distilbert",
"token-classification",
"transformers",
"autotrain_compatible"
] |
token-classification
|
{
"architectures": [
"DistilBertForTokenClassification"
],
"model_type": "distilbert",
"task_specific_params": {
"conversational": {
"max_length": null
},
"summarization": {
"early_stopping": null,
"length_penalty": null,
"max_length": null,
"min_length": null,
"no_repeat_ngram_size": null,
"num_beams": null,
"prefix": null
},
"text-generation": {
"do_sample": null,
"max_length": null
},
"translation_en_to_de": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_fr": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_ro": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
}
}
}
| 3 | 2023-03-07T11:50:48Z |
---
license: creativeml-openrail-m
pipeline_tag: text-to-image
library_name: diffusers
tags:
- stable-diffusion
- stable-diffusion-diffusers
---
This is the model card for "GuchaGucha_Shiki" (ぐちゃぐちゃ式).
GuchaGucha_Shiki (GGS) is a merge model that can be used with the Stable Diffusion web UI (AUTOMATIC1111) and others.
Using [EasyNegative](https://huggingface.co/datasets/gsdf/EasyNegative) gives better outputs; try it.
## License
This model is open access and available to all, with a CreativeML OpenRAIL-M license further specifying rights and usage. The CreativeML OpenRAIL License specifies:
- You can't use the model to deliberately produce nor share illegal or harmful outputs or content.
- The authors claim no rights on the outputs you generate; you are free to use them and are accountable for their use, which must not go against the provisions set in the license.
- You may re-distribute the weights and use the model commercially and/or as a service. If you do, please be aware you have to include the same use restrictions as the ones in the license and share a copy of the CreativeML OpenRAIL-M to all your users (please read the license entirely and carefully) Please read [the full license here](https://huggingface.co/spaces/CompVis/stable-diffusion-license).
## Example

```
1girl,
Negative prompt: EasyNegative
Steps: 20, Sampler: DPM++ SDE Karras, CFG scale: 7, Seed: 3263107773, Size: 512x768, Model hash: cf1348dc14, Model: GuchaGucha_Shiki
```

```
Aerial perspective, spring, 1girl, solo, looking afar, sunny, morning, on road, serafuku, shirts, skirt, ribbon, long hair, long skirt, long sleeves, animal ears, standing, from above, wolf ears, red eyes, black hair, dark blue inner color hair, black tail, blue serafuku, blue shirts, white ribbon, blue skirt, flat chest, depth of field, glare,
Negative prompt: EasyNegative
Steps: 20, Sampler: DPM++ SDE Karras, CFG scale: 7, Seed: 1154925912, Size: 512x768, Model hash: cf1348dc14, Model: GuchaGucha_Shiki
```
## Model Detail & Merge Recipes
### GuchaGucha_Shiki
This is the main model, but the "Original" version may be better.
ACertainThing-half.ckpt * 0.4 + CounterfeitV2.0fp16.safetensors * 0.6 = GGS
### GuchaGucha_Shiki_Original
ACertainThing.ckpt * 0.4 + CounterfeitV2.0.safetensors * 0.6 = GGS_Original
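Both recipes are plain weighted sums of the two checkpoints' weights, key by key. A sketch of that merge (toy scalars stand in for the actual weight tensors):

```python
def merge_state_dicts(sd_a, sd_b, alpha=0.4):
    # out[k] = alpha * A[k] + (1 - alpha) * B[k], over the keys both share
    return {k: alpha * sd_a[k] + (1.0 - alpha) * sd_b[k]
            for k in sd_a.keys() & sd_b.keys()}

merged = merge_state_dicts({"w": 1.0}, {"w": 0.0}, alpha=0.4)
```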
|
dccuchile/distilbert-base-spanish-uncased-finetuned-qa-mlqa
|
[
"pytorch",
"distilbert",
"question-answering",
"transformers",
"autotrain_compatible"
] |
question-answering
|
{
"architectures": [
"DistilBertForQuestionAnswering"
],
"model_type": "distilbert",
"task_specific_params": {
"conversational": {
"max_length": null
},
"summarization": {
"early_stopping": null,
"length_penalty": null,
"max_length": null,
"min_length": null,
"no_repeat_ngram_size": null,
"num_beams": null,
"prefix": null
},
"text-generation": {
"do_sample": null,
"max_length": null
},
"translation_en_to_de": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_fr": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_ro": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
}
}
}
| 5 | null |
---
tags:
- CartPole-v1
- reinforce
- reinforcement-learning
- custom-implementation
- deep-rl-class
model-index:
- name: Reinforce-cartpole_2
results:
- task:
type: reinforcement-learning
name: reinforcement-learning
dataset:
name: CartPole-v1
type: CartPole-v1
metrics:
- type: mean_reward
value: 500.00 +/- 0.00
name: mean_reward
verified: false
---
# **Reinforce** Agent playing **CartPole-v1**
This is a trained model of a **Reinforce** agent playing **CartPole-v1** .
To learn to use this model and train yours check Unit 4 of the Deep Reinforcement Learning Course: https://huggingface.co/deep-rl-course/unit4/introduction
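Reinforce updates the policy with Monte-Carlo returns collected over full episodes. As a minimal illustration of one ingredient (not the course's exact code), the discounted return at each timestep can be computed backwards over an episode:

```python
def discounted_returns(rewards, gamma=0.99):
    """G_t = r_t + gamma * G_{t+1}, computed backwards over an episode."""
    returns, g = [], 0.0
    for r in reversed(rewards):
        g = r + gamma * g
        returns.append(g)
    return returns[::-1]

# With gamma=1.0 the return is just the reward-to-go:
print(discounted_returns([1.0, 1.0, 1.0], gamma=1.0))  # [3.0, 2.0, 1.0]
```

These per-step returns weight the log-probability gradient of each action taken during the episode.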
|
dccuchile/distilbert-base-spanish-uncased
|
[
"pytorch",
"distilbert",
"fill-mask",
"es",
"dataset:large_spanish_corpus",
"transformers",
"spanish",
"OpenCENIA",
"autotrain_compatible"
] |
fill-mask
|
{
"architectures": [
"DistilBertForMaskedLM"
],
"model_type": "distilbert",
"task_specific_params": {
"conversational": {
"max_length": null
},
"summarization": {
"early_stopping": null,
"length_penalty": null,
"max_length": null,
"min_length": null,
"no_repeat_ngram_size": null,
"num_beams": null,
"prefix": null
},
"text-generation": {
"do_sample": null,
"max_length": null
},
"translation_en_to_de": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_fr": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_ro": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
}
}
}
| 670 | null |
---
license: mit
---
### mipizzadel on Stable Diffusion
This is the `<pizzadelx>` concept taught to Stable Diffusion via Textual Inversion. You can load this concept into the [Stable Conceptualizer](https://colab.research.google.com/github/huggingface/notebooks/blob/main/diffusers/stable_conceptualizer_inference.ipynb) notebook. You can also train your own concepts and load them into the concept libraries using [this notebook](https://colab.research.google.com/github/huggingface/notebooks/blob/main/diffusers/sd_textual_inversion_training.ipynb).
Here is the new concept you will be able to use as an `object`:














|
CennetOguz/distilbert-base-uncased-finetuned-recipe-accelerate
|
[
"pytorch",
"distilbert",
"fill-mask",
"transformers",
"autotrain_compatible"
] |
fill-mask
|
{
"architectures": [
"DistilBertForMaskedLM"
],
"model_type": "distilbert",
"task_specific_params": {
"conversational": {
"max_length": null
},
"summarization": {
"early_stopping": null,
"length_penalty": null,
"max_length": null,
"min_length": null,
"no_repeat_ngram_size": null,
"num_beams": null,
"prefix": null
},
"text-generation": {
"do_sample": null,
"max_length": null
},
"translation_en_to_de": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_fr": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_ro": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
}
}
}
| 7 | null |
---
license: cc-by-4.0
language: eu
tags:
- bert
- basque
- euskara
---
# ElhBERTeu-medium
This is the medium-size version of [ElhBERTeu](https://huggingface.co/orai-nlp/ElhBERTeu) model, the BERT-base for Basque introduced in [BasqueGLUE: A Natural Language Understanding Benchmark for Basque](https://aclanthology.org/2022.lrec-1.172/).
ElhBERTeu-medium was trained on the same corpus as ElhBERTeu, for which we employed corpora from several domains: updated (2021) national and local news sources, the Basque Wikipedia, as well as novel news sources and texts from other domains, such as science (both academic and popular), literature and subtitles. More details about the corpora used and their sizes are shown in the following table. Texts from news sources were oversampled (duplicated), as was done during the training of BERTeus. In total, 575M tokens were used for pre-training ElhBERTeu.
|Domain | Size |
|-----------|----------|
|News | 2 x 224M |
|Wikipedia | 40M |
|Science | 58M |
|Literature | 24M |
|Others | 7M |
|Total | 575M |
ElhBERTeu-medium is a medium-sized (L=8, H=512), cased, monolingual BERT model for Basque with a 50K vocabulary. It has 51M parameters in total and was trained with the same settings as ElhBERTeu (steps=1M, batch_size=256).
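As a rough sanity check on the 51M figure, a back-of-the-envelope BERT parameter count from L=8, H=512 and a 50K vocabulary lands in the same range. The formula below is an approximation (it ignores the MLM head and uses generic BERT defaults for positions and segments):

```python
def bert_param_estimate(num_layers=8, hidden=512, vocab=50_000, max_pos=512):
    """Approximate encoder parameter count for a BERT-style model."""
    # Token + position + segment (2) embedding tables:
    embeddings = (vocab + max_pos + 2) * hidden
    # Per layer: self-attention (4 H^2) + feed-forward (8 H^2) + biases/LayerNorms (~13 H):
    per_layer = 12 * hidden * hidden + 13 * hidden
    return embeddings + num_layers * per_layer

print(f"{bert_param_estimate() / 1e6:.0f}M")  # prints "51M"
```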
If you use this model, please cite the following paper:
- G. Urbizu, I. San Vicente, X. Saralegi, R. Agerri, A. Soroa. BasqueGLUE: A Natural Language Understanding Benchmark for Basque. In proceedings of the 13th Language Resources and Evaluation Conference (LREC 2022). June 2022. Marseille, France
```
@InProceedings{urbizu2022basqueglue,
author = {Urbizu, Gorka and San Vicente, Iñaki and Saralegi, Xabier and Agerri, Rodrigo and Soroa, Aitor},
title = {BasqueGLUE: A Natural Language Understanding Benchmark for Basque},
booktitle = {Proceedings of the Language Resources and Evaluation Conference},
month = {June},
year = {2022},
address = {Marseille, France},
publisher = {European Language Resources Association},
pages = {1603--1612},
abstract = {Natural Language Understanding (NLU) technology has improved significantly over the last few years and multitask benchmarks such as GLUE are key to evaluate this improvement in a robust and general way. These benchmarks take into account a wide and diverse set of NLU tasks that require some form of language understanding, beyond the detection of superficial, textual clues. However, they are costly to develop and language-dependent, and therefore they are only available for a small number of languages. In this paper, we present BasqueGLUE, the first NLU benchmark for Basque, a less-resourced language, which has been elaborated from previously existing datasets and following similar criteria to those used for the construction of GLUE and SuperGLUE. We also report the evaluation of two state-of-the-art language models for Basque on BasqueGLUE, thus providing a strong baseline to compare upon. BasqueGLUE is freely available under an open license.},
url = {https://aclanthology.org/2022.lrec-1.172}
}
```
License:
CC BY 4.0
|
Chaddmckay/Cdm
|
[] | null |
{
"architectures": null,
"model_type": null,
"task_specific_params": {
"conversational": {
"max_length": null
},
"summarization": {
"early_stopping": null,
"length_penalty": null,
"max_length": null,
"min_length": null,
"no_repeat_ngram_size": null,
"num_beams": null,
"prefix": null
},
"text-generation": {
"do_sample": null,
"max_length": null
},
"translation_en_to_de": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_fr": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_ro": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
}
}
}
| 0 | null |
---
license: apache-2.0
tags:
- generated_from_trainer
metrics:
- accuracy
- recall
- precision
model-index:
- name: norbert2_sentiment_norec_12
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# norbert2_sentiment_norec_12
This model is a fine-tuned version of [bert-large-uncased](https://huggingface.co/bert-large-uncased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4914
- Accuracy: 0.8
- Balanced Accuracy: 0.5
- F1 Score: 0.8889
- Recall: 1.0
- Precision: 0.8
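The reported numbers are mutually consistent with a binary classifier that predicts the positive class for every example on a 4:1 positive/negative split. A quick check (the counts below are illustrative, not the actual evaluation data):

```python
def binary_metrics(tp, fp, tn, fn):
    """Standard binary classification metrics from confusion-matrix counts."""
    accuracy = (tp + tn) / (tp + fp + tn + fn)
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)
    specificity = tn / (tn + fp)
    balanced_accuracy = (recall + specificity) / 2
    return accuracy, balanced_accuracy, f1, recall, precision

# Every example predicted positive; the single negative is misclassified:
acc, bal_acc, f1, rec, prec = binary_metrics(tp=4, fp=1, tn=0, fn=0)
```

This reproduces accuracy 0.8, balanced accuracy 0.5, F1 0.8889, recall 1.0 and precision 0.8, which is the degenerate behaviour a balanced-accuracy of 0.5 usually signals.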
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 1
- eval_batch_size: 1
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 2
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Balanced Accuracy | F1 Score | Recall | Precision |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:-----------------:|:--------:|:------:|:---------:|
| 0.4908        | 1.0   | 5    | 0.4910          | 0.8      | 0.5               | 0.8889   | 1.0    | 0.8       |
| 0.3751        | 2.0   | 10   | 0.4914          | 0.8      | 0.5               | 0.8889   | 1.0    | 0.8       |
### Framework versions
- Transformers 4.26.0
- Pytorch 1.13.1+cu117
- Datasets 2.9.0
- Tokenizers 0.13.2
|
Chaewon/mmnt_decoder_en
|
[
"pytorch",
"gpt2",
"text-generation",
"transformers"
] |
text-generation
|
{
"architectures": [
"GPT2LMHeadModel"
],
"model_type": "gpt2",
"task_specific_params": {
"conversational": {
"max_length": null
},
"summarization": {
"early_stopping": null,
"length_penalty": null,
"max_length": null,
"min_length": null,
"no_repeat_ngram_size": null,
"num_beams": null,
"prefix": null
},
"text-generation": {
"do_sample": true,
"max_length": 50
},
"translation_en_to_de": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_fr": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_ro": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
}
}
}
| 12 | null |
---
library_name: stable-baselines3
tags:
- SpaceInvadersNoFrameskip-v4
- deep-reinforcement-learning
- reinforcement-learning
- stable-baselines3
model-index:
- name: DQN
results:
- task:
type: reinforcement-learning
name: reinforcement-learning
dataset:
name: SpaceInvadersNoFrameskip-v4
type: SpaceInvadersNoFrameskip-v4
metrics:
- type: mean_reward
value: 618.50 +/- 198.02
name: mean_reward
verified: false
---
# **DQN** Agent playing **SpaceInvadersNoFrameskip-v4**
This is a trained model of a **DQN** agent playing **SpaceInvadersNoFrameskip-v4**
using the [stable-baselines3 library](https://github.com/DLR-RM/stable-baselines3)
and the [RL Zoo](https://github.com/DLR-RM/rl-baselines3-zoo).
The RL Zoo is a training framework for Stable Baselines3
reinforcement learning agents,
with hyperparameter optimization and pre-trained agents included.
## Usage (with SB3 RL Zoo)
RL Zoo: https://github.com/DLR-RM/rl-baselines3-zoo<br/>
SB3: https://github.com/DLR-RM/stable-baselines3<br/>
SB3 Contrib: https://github.com/Stable-Baselines-Team/stable-baselines3-contrib
Install the RL Zoo (with SB3 and SB3-Contrib):
```bash
pip install rl_zoo3
```
```
# Download model and save it into the logs/ folder
python -m rl_zoo3.load_from_hub --algo dqn --env SpaceInvadersNoFrameskip-v4 -orga issajatt -f logs/
python -m rl_zoo3.enjoy --algo dqn --env SpaceInvadersNoFrameskip-v4 -f logs/
```
If you installed the RL Zoo3 via pip (`pip install rl_zoo3`), from anywhere you can do:
```
python -m rl_zoo3.load_from_hub --algo dqn --env SpaceInvadersNoFrameskip-v4 -orga issajatt -f logs/
python -m rl_zoo3.enjoy --algo dqn --env SpaceInvadersNoFrameskip-v4 -f logs/
```
## Training (with the RL Zoo)
```
python -m rl_zoo3.train --algo dqn --env SpaceInvadersNoFrameskip-v4 -f logs/
# Upload the model and generate video (when possible)
python -m rl_zoo3.push_to_hub --algo dqn --env SpaceInvadersNoFrameskip-v4 -f logs/ -orga issajatt
```
## Hyperparameters
```python
OrderedDict([('batch_size', 32),
('buffer_size', 100000),
('env_wrapper',
['stable_baselines3.common.atari_wrappers.AtariWrapper']),
('exploration_final_eps', 0.01),
('exploration_fraction', 0.1),
('frame_stack', 4),
('gradient_steps', 1),
('learning_rate', 0.0001),
('learning_starts', 100000),
('n_timesteps', 1000000.0),
('optimize_memory_usage', False),
('policy', 'CnnPolicy'),
('target_update_interval', 1000),
('train_freq', 4),
('normalize', False)])
```
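The exploration settings above imply a linear epsilon-greedy schedule: epsilon decays from 1.0 to `exploration_final_eps` (0.01) over the first `exploration_fraction` (10%) of the 1M training steps, then holds. A minimal sketch mirroring that schedule (not SB3's actual code):

```python
def epsilon(step, n_timesteps=1_000_000, fraction=0.1, eps_start=1.0, eps_final=0.01):
    """Linearly anneal epsilon over the first `fraction` of training, then hold."""
    progress = min(1.0, step / (fraction * n_timesteps))
    return eps_start + progress * (eps_final - eps_start)
```

So with these hyperparameters the agent acts almost fully at random at the start, and takes random actions only 1% of the time after step 100,000.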
|
chainyo/speaker-recognition-meetup
|
[] | null |
{
"architectures": null,
"model_type": null,
"task_specific_params": {
"conversational": {
"max_length": null
},
"summarization": {
"early_stopping": null,
"length_penalty": null,
"max_length": null,
"min_length": null,
"no_repeat_ngram_size": null,
"num_beams": null,
"prefix": null
},
"text-generation": {
"do_sample": null,
"max_length": null
},
"translation_en_to_de": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_fr": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_ro": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
}
}
}
| 1 | null |
---
tags:
- fastai
---
# Amazing!
🥳 Congratulations on hosting your fastai model on the Hugging Face Hub!
# Some next steps
1. Fill out this model card with more information (see the template below and the [documentation here](https://huggingface.co/docs/hub/model-repos))!
2. Create a demo in Gradio or Streamlit using 🤗 Spaces ([documentation here](https://huggingface.co/docs/hub/spaces)).
3. Join the fastai community on the [Fastai Discord](https://discord.com/invite/YKrxeNn)!
Greetings fellow fastlearner 🤝! Don't forget to delete this content from your model card.
---
# Model card
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
|
Chakita/KNUBert
|
[
"pytorch",
"tensorboard",
"roberta",
"fill-mask",
"transformers",
"autotrain_compatible"
] |
fill-mask
|
{
"architectures": [
"RobertaForMaskedLM"
],
"model_type": "roberta",
"task_specific_params": {
"conversational": {
"max_length": null
},
"summarization": {
"early_stopping": null,
"length_penalty": null,
"max_length": null,
"min_length": null,
"no_repeat_ngram_size": null,
"num_beams": null,
"prefix": null
},
"text-generation": {
"do_sample": null,
"max_length": null
},
"translation_en_to_de": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_fr": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_ro": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
}
}
}
| 20 | null |
---
tags:
- unity-ml-agents
- ml-agents
- deep-reinforcement-learning
- reinforcement-learning
- ML-Agents-Huggy
library_name: ml-agents
---
# **ppo** Agent playing **Huggy**
This is a trained model of a **ppo** agent playing **Huggy** using the [Unity ML-Agents Library](https://github.com/Unity-Technologies/ml-agents).
## Usage (with ML-Agents)
The Documentation: https://github.com/huggingface/ml-agents#get-started
We wrote a complete tutorial on training your first agent with ML-Agents and publishing it to the Hub.
### Resume the training
```
mlagents-learn <your_configuration_file_path.yaml> --run-id=<run_id> --resume
```
### Watch your Agent play
You can watch your agent **playing directly in your browser**:
1. Go to https://huggingface.co/spaces/unity/ML-Agents-Huggy
2. Step 1: Write your model_id: AjayD53/Huggy_Dog_RL
3. Step 2: Select your *.nn /*.onnx file
4. Click on Watch the agent play 👀
|
Chakita/KannadaBERT
|
[
"pytorch",
"roberta",
"fill-mask",
"transformers",
"masked-lm",
"fill-in-the-blanks",
"autotrain_compatible"
] |
fill-mask
|
{
"architectures": [
"RobertaForMaskedLM"
],
"model_type": "roberta",
"task_specific_params": {
"conversational": {
"max_length": null
},
"summarization": {
"early_stopping": null,
"length_penalty": null,
"max_length": null,
"min_length": null,
"no_repeat_ngram_size": null,
"num_beams": null,
"prefix": null
},
"text-generation": {
"do_sample": null,
"max_length": null
},
"translation_en_to_de": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_fr": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_ro": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
}
}
}
| 5 | null |
---
tags:
- unity-ml-agents
- ml-agents
- deep-reinforcement-learning
- reinforcement-learning
- ML-Agents-Huggy
library_name: ml-agents
---
# **ppo** Agent playing **Huggy**
This is a trained model of a **ppo** agent playing **Huggy** using the [Unity ML-Agents Library](https://github.com/Unity-Technologies/ml-agents).
## Usage (with ML-Agents)
The Documentation: https://github.com/huggingface/ml-agents#get-started
We wrote a complete tutorial on training your first agent with ML-Agents and publishing it to the Hub.
### Resume the training
```
mlagents-learn <your_configuration_file_path.yaml> --run-id=<run_id> --resume
```
### Watch your Agent play
You can watch your agent **playing directly in your browser**:
1. Go to https://huggingface.co/spaces/unity/ML-Agents-Huggy
2. Step 1: Write your model_id: Mahmoud22/ppo-Huggy
3. Step 2: Select your *.nn /*.onnx file
4. Click on Watch the agent play 👀
|
Cheatham/xlm-roberta-large-finetuned4
|
[
"pytorch",
"xlm-roberta",
"text-classification",
"transformers"
] |
text-classification
|
{
"architectures": [
"XLMRobertaForSequenceClassification"
],
"model_type": "xlm-roberta",
"task_specific_params": {
"conversational": {
"max_length": null
},
"summarization": {
"early_stopping": null,
"length_penalty": null,
"max_length": null,
"min_length": null,
"no_repeat_ngram_size": null,
"num_beams": null,
"prefix": null
},
"text-generation": {
"do_sample": null,
"max_length": null
},
"translation_en_to_de": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_fr": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_ro": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
}
}
}
| 20 | null |
---
license: mit
widget:
- বহুল আলোচিত দশম জাতীয় সংসদ
- গাজীপুরের কালিয়াকৈর উপজেলার তেলিরচালা
---
This Bangla GPT2 model was trained on a Bangla newspaper dataset: roughly 250 MB of Prothom Alo text, with a vocabulary size of 50k.
Github link : https://github.com/saiful9379/Bangla_GPT2
```py
from transformers import TFGPT2LMHeadModel, GPT2Tokenizer
tokenizer = GPT2Tokenizer.from_pretrained("saiful9379/Bangla_GPT2")
model = TFGPT2LMHeadModel.from_pretrained("saiful9379/Bangla_GPT2")
text = "বহুল আলোচিত দশম জাতীয় সংসদ"
input_ids = tokenizer.encode(text, return_tensors='tf')
print(input_ids)
output = model.generate(
input_ids,
max_length=175,
num_beams=10,
temperature=0.7,
no_repeat_ngram_size=2,
num_return_sequences=5
)
predicted_text = tokenizer.decode(output[0], skip_special_tokens=True)
print(predicted_text)
```
Here is the basic configuration of the Bangla GPT2 model:
```
vocab_size = 50000
block_size = 200
learning_rate=3e-5
num_epoch = 100
batch_size = 12
buffer_size = 1000
```
|
Check/vaw2tmp
|
[
"tensorboard"
] | null |
{
"architectures": null,
"model_type": null,
"task_specific_params": {
"conversational": {
"max_length": null
},
"summarization": {
"early_stopping": null,
"length_penalty": null,
"max_length": null,
"min_length": null,
"no_repeat_ngram_size": null,
"num_beams": null,
"prefix": null
},
"text-generation": {
"do_sample": null,
"max_length": null
},
"translation_en_to_de": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_fr": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_ro": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
}
}
}
| 0 | null |
---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- cppe-5
model-index:
- name: detr-resnet-50_finetuned_cppe5
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# detr-resnet-50_finetuned_cppe5
This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on the cppe-5 dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3
- mixed_precision_training: Native AMP
### Training results
### Framework versions
- Transformers 4.26.1
- Pytorch 1.13.1+cu116
- Datasets 2.10.1
- Tokenizers 0.13.2
|
Chester/traffic-rec
|
[] | null |
{
"architectures": null,
"model_type": null,
"task_specific_params": {
"conversational": {
"max_length": null
},
"summarization": {
"early_stopping": null,
"length_penalty": null,
"max_length": null,
"min_length": null,
"no_repeat_ngram_size": null,
"num_beams": null,
"prefix": null
},
"text-generation": {
"do_sample": null,
"max_length": null
},
"translation_en_to_de": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_fr": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_ro": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
}
}
}
| 0 | null |
---
tags:
- unity-ml-agents
- ml-agents
- deep-reinforcement-learning
- reinforcement-learning
- ML-Agents-Pyramids
library_name: ml-agents
---
# **ppo** Agent playing **Pyramids**
This is a trained model of a **ppo** agent playing **Pyramids** using the [Unity ML-Agents Library](https://github.com/Unity-Technologies/ml-agents).
## Usage (with ML-Agents)
The Documentation: https://github.com/huggingface/ml-agents#get-started
We wrote a complete tutorial on training your first agent with ML-Agents and publishing it to the Hub.
### Resume the training
```
mlagents-learn <your_configuration_file_path.yaml> --run-id=<run_id> --resume
```
### Watch your Agent play
You can watch your agent **playing directly in your browser**:
1. Go to https://huggingface.co/spaces/unity/ML-Agents-Pyramids
2. Step 1: Write your model_id: chandc/PyramidsTraining
3. Step 2: Select your *.nn /*.onnx file
4. Click on Watch the agent play 👀
|
Chikita1/www_stash_stock
|
[
"license:bsd-3-clause-clear"
] | null |
{
"architectures": null,
"model_type": null,
"task_specific_params": {
"conversational": {
"max_length": null
},
"summarization": {
"early_stopping": null,
"length_penalty": null,
"max_length": null,
"min_length": null,
"no_repeat_ngram_size": null,
"num_beams": null,
"prefix": null
},
"text-generation": {
"do_sample": null,
"max_length": null
},
"translation_en_to_de": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_fr": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_ro": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
}
}
}
| 0 | null |
---
tags:
- generated_from_trainer
datasets:
- training_corpus
model-index:
- name: gpt2large-lhm-08
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# gpt2large-lhm-08
This model was trained from scratch on the training_corpus dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1.8e-05
- train_batch_size: 40
- eval_batch_size: 40
- seed: 42
- gradient_accumulation_steps: 100
- total_train_batch_size: 4000
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 100
- num_epochs: 1
- mixed_precision_training: Native AMP
### Training results
### Framework versions
- Transformers 4.26.1
- Pytorch 1.13.1
- Datasets 2.8.0
- Tokenizers 0.13.2
|
Ching/negation_detector
|
[
"pytorch",
"roberta",
"question-answering",
"transformers",
"autotrain_compatible"
] |
question-answering
|
{
"architectures": [
"RobertaForQuestionAnswering"
],
"model_type": "roberta",
"task_specific_params": {
"conversational": {
"max_length": null
},
"summarization": {
"early_stopping": null,
"length_penalty": null,
"max_length": null,
"min_length": null,
"no_repeat_ngram_size": null,
"num_beams": null,
"prefix": null
},
"text-generation": {
"do_sample": null,
"max_length": null
},
"translation_en_to_de": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_fr": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_ro": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
}
}
}
| 9 | 2023-03-07T13:08:13Z |
---
license: apache-2.0
language:
- en
metrics:
- f1
---
# Federated Learning Based Multilingual Emoji Prediction
This repository contains code for training and evaluating transformer-based models for Uni/multilingual emoji prediction in clean and attack scenarios using Federated Learning. This work is described in the paper "Federated Learning-Based Multilingual Emoji Prediction in Clean and Attack Scenarios."
# Abstract
Federated learning is a growing field in the machine learning community due to its decentralized and private design. Model training in federated learning is distributed over multiple clients giving access to lots of client data while maintaining privacy. Then, a server aggregates the training done on these multiple clients without access to their data, which could be emojis widely used in any social media service and instant messaging platforms to express users' sentiments. This paper proposes federated learning-based multilingual emoji prediction in both clean and attack scenarios. Emoji prediction data have been crawled from both Twitter and SemEval emoji datasets. This data is used to train and evaluate different transformer model sizes including a sparsely activated transformer with either the assumption of clean data in all clients or poisoned data via label flipping attack in some clients. Experimental results on these models show that federated learning in either clean or attacked scenarios performs similarly to centralized training in multilingual emoji prediction on seen and unseen languages under different data sources and distributions. Our trained transformers perform better than other techniques on the SemEval emoji dataset in addition to the privacy as well as distributed benefits of federated learning.
# Performance
> * Acc : 50.234 %
> * Mac-F1 : 36.960 %
> * Also see our [GitHub Repo](https://github.com/kareemgamalmahmoud/FEDERATED-LEARNING-BASED-MULTILINGUAL-EMOJI-PREDICTION-IN-CLEAN-AND-ATTACK-SCENARIOS)
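The label-flipping attack mentioned in the abstract can be sketched as a poisoned client remapping each true label to some other class before training on it. The helper below is only an illustration (the paper's exact flipping rule may differ):

```python
import random

def flip_labels(labels, num_classes, seed=0):
    """Replace every label with a uniformly chosen *different* class."""
    rng = random.Random(seed)
    return [rng.choice([c for c in range(num_classes) if c != y]) for y in labels]

# A poisoned client corrupts its local emoji labels (20 classes in SemEval-2018):
poisoned = flip_labels([0, 1, 2, 3], num_classes=20)
```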
# Dependencies
> * Python 3.6+
> * PyTorch 1.7.0+
> * Transformers 4.0.0+
# Usage
> To use the model, first install the `transformers` package from Hugging Face:
```bash
pip install transformers
```
> Then, you can load the model and tokenizer using the following code:
```python
from transformers import AutoModelForSequenceClassification, AutoTokenizer
import numpy as np
import urllib.request
import csv
```
```python
MODEL = "Karim-Gamal/BERT-base-finetuned-SemEval-2018-emojis-cen-1"
tokenizer = AutoTokenizer.from_pretrained(MODEL)
model = AutoModelForSequenceClassification.from_pretrained(MODEL)
```
> Once you have the tokenizer and model, you can preprocess your text and pass it to the model for prediction:
```python
# Preprocess text (username and link placeholders)
def preprocess(text):
new_text = []
for t in text.split(" "):
t = '@user' if t.startswith('@') and len(t) > 1 else t
t = 'http' if t.startswith('http') else t
new_text.append(t)
return " ".join(new_text)
text = "Hello world"
text = preprocess(text)
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
scores = output[0][0].detach().numpy()
```
> The `scores` variable contains the model's raw output scores (logits) for each possible emoji label; apply a softmax if you need probabilities. To get the top k predictions, you can use the following code:
```python
# download label mapping
labels=[]
mapping_link = "https://raw.githubusercontent.com/cardiffnlp/tweeteval/main/datasets/emoji/mapping.txt"
with urllib.request.urlopen(mapping_link) as f:
html = f.read().decode('utf-8').split("\n")
csvreader = csv.reader(html, delimiter='\t')
labels = [row[1] for row in csvreader if len(row) > 1]
k = 3 # number of top predictions to show
ranking = np.argsort(scores)
ranking = ranking[::-1]
for i in range(k):
l = labels[ranking[i]]
s = scores[ranking[i]]
print(f"{i+1}) {l} {np.round(float(s), 4)}")
```
## Note: the code above is adapted from [cardiffnlp/twitter-roberta-base-emoji](https://huggingface.co/cardiffnlp/twitter-roberta-base-emoji)
|
Chiuchiyin/DialoGPT-small-Donald
|
[
"pytorch",
"gpt2",
"text-generation",
"transformers",
"conversational"
] |
conversational
|
{
"architectures": [
"GPT2LMHeadModel"
],
"model_type": "gpt2",
"task_specific_params": {
"conversational": {
"max_length": 1000
},
"summarization": {
"early_stopping": null,
"length_penalty": null,
"max_length": null,
"min_length": null,
"no_repeat_ngram_size": null,
"num_beams": null,
"prefix": null
},
"text-generation": {
"do_sample": null,
"max_length": null
},
"translation_en_to_de": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_fr": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_ro": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
}
}
}
| 7 | null |
---
license: apache-2.0
language:
- en
metrics:
- f1
---
# Federated Learning Based Multilingual Emoji Prediction
This repository contains code for training and evaluating transformer-based models for Uni/multilingual emoji prediction in clean and attack scenarios using Federated Learning. This work is described in the paper "Federated Learning-Based Multilingual Emoji Prediction in Clean and Attack Scenarios."
# Abstract
Federated learning is a growing field in the machine learning community due to its decentralized and private design. Model training in federated learning is distributed over multiple clients giving access to lots of client data while maintaining privacy. Then, a server aggregates the training done on these multiple clients without access to their data, which could be emojis widely used in any social media service and instant messaging platforms to express users' sentiments. This paper proposes federated learning-based multilingual emoji prediction in both clean and attack scenarios. Emoji prediction data have been crawled from both Twitter and SemEval emoji datasets. This data is used to train and evaluate different transformer model sizes including a sparsely activated transformer with either the assumption of clean data in all clients or poisoned data via label flipping attack in some clients. Experimental results on these models show that federated learning in either clean or attacked scenarios performs similarly to centralized training in multilingual emoji prediction on seen and unseen languages under different data sources and distributions. Our trained transformers perform better than other techniques on the SemEval emoji dataset in addition to the privacy as well as distributed benefits of federated learning.
# Performance
> * Acc : 50.074 %
> * Mac-F1 : 38.133 %
> * Also see our [GitHub Repo](https://github.com/kareemgamalmahmoud/FEDERATED-LEARNING-BASED-MULTILINGUAL-EMOJI-PREDICTION-IN-CLEAN-AND-ATTACK-SCENARIOS)
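The server-side aggregation described in the abstract is, in its simplest form, federated averaging (FedAvg): a coordinate-wise mean of client parameters, computed without the server ever seeing client data. A minimal illustration with flat dicts (the paper's aggregation details may differ):

```python
def fed_avg(client_states):
    """Uniform FedAvg: average each parameter across all client updates."""
    n = len(client_states)
    return {k: sum(sd[k] for sd in client_states) / n for k in client_states[0]}

global_state = fed_avg([{"w": 1.0}, {"w": 3.0}])  # -> {'w': 2.0}
```

In practice the average is usually weighted by each client's number of training examples, and it is applied tensor-by-tensor over the full model state dict.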
# Dependencies
> * Python 3.6+
> * PyTorch 1.7.0+
> * Transformers 4.0.0+
# Usage
> To use the model, first install the `transformers` package from Hugging Face:
```bash
pip install transformers
```
> Then, you can load the model and tokenizer using the following code:
```python
from transformers import AutoModelForSequenceClassification, AutoTokenizer
import numpy as np
import urllib.request
import csv
```
```python
MODEL = "Karim-Gamal/BERT-base-finetuned-SemEval-2018-emojis-cen-2"
tokenizer = AutoTokenizer.from_pretrained(MODEL)
model = AutoModelForSequenceClassification.from_pretrained(MODEL)
```
> Once you have the tokenizer and model, you can preprocess your text and pass it to the model for prediction:
```python
# Preprocess text (username and link placeholders)
def preprocess(text):
new_text = []
for t in text.split(" "):
t = '@user' if t.startswith('@') and len(t) > 1 else t
t = 'http' if t.startswith('http') else t
new_text.append(t)
return " ".join(new_text)
text = "Hello world"
text = preprocess(text)
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
scores = output[0][0].detach().numpy()
```
> The `scores` variable contains the model's raw logits for each of the possible emoji labels; apply a softmax to turn them into probabilities. To get the top k predictions, you can use the following code:
```python
# download label mapping
labels=[]
mapping_link = "https://raw.githubusercontent.com/cardiffnlp/tweeteval/main/datasets/emoji/mapping.txt"
with urllib.request.urlopen(mapping_link) as f:
html = f.read().decode('utf-8').split("\n")
csvreader = csv.reader(html, delimiter='\t')
labels = [row[1] for row in csvreader if len(row) > 1]
k = 3 # number of top predictions to show
ranking = np.argsort(scores)
ranking = ranking[::-1]
for i in range(k):
l = labels[ranking[i]]
s = scores[ranking[i]]
print(f"{i+1}) {l} {np.round(float(s), 4)}")
```
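Note that the raw model outputs are logits; a softmax maps them to normalized probabilities without changing the `np.argsort` ranking (softmax is monotonic). A minimal pure-Python sketch, using illustrative scores rather than real model output:

```python
import math

def softmax(scores):
    # Subtract the max score for numerical stability before exponentiating.
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

# Illustrative logits only -- not real model output.
probs = softmax([2.0, 1.0, 0.1])
print(probs)  # largest logit gets the largest probability; values sum to 1
```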
## Note: this code is adapted from [Link](https://huggingface.co/cardiffnlp/twitter-roberta-base-emoji)
|
Chiuchiyin/Donald
|
[] | null |
{
"architectures": null,
"model_type": null,
"task_specific_params": {
"conversational": {
"max_length": null
},
"summarization": {
"early_stopping": null,
"length_penalty": null,
"max_length": null,
"min_length": null,
"no_repeat_ngram_size": null,
"num_beams": null,
"prefix": null
},
"text-generation": {
"do_sample": null,
"max_length": null
},
"translation_en_to_de": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_fr": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_ro": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
}
}
}
| 0 | null |
---
license: apache-2.0
language:
- en
metrics:
- f1
---
# Federated Learning Based Multilingual Emoji Prediction
This repository contains code for training and evaluating transformer-based models for Uni/multilingual emoji prediction in clean and attack scenarios using Federated Learning. This work is described in the paper "Federated Learning-Based Multilingual Emoji Prediction in Clean and Attack Scenarios."
# Abstract
Federated learning is a growing field in the machine learning community due to its decentralized and private design. Model training in federated learning is distributed over multiple clients, giving access to lots of client data while maintaining privacy. A server then aggregates the training done on these multiple clients without access to their data; such data could include emojis, which are widely used in social media services and instant messaging platforms to express users' sentiments. This paper proposes federated learning-based multilingual emoji prediction in both clean and attack scenarios. Emoji prediction data have been crawled from both Twitter and SemEval emoji datasets. This data is used to train and evaluate different transformer model sizes, including a sparsely activated transformer, with either the assumption of clean data in all clients or data poisoned via a label-flipping attack in some clients. Experimental results on these models show that federated learning in either clean or attacked scenarios performs similarly to centralized training in multilingual emoji prediction on seen and unseen languages under different data sources and distributions. Our trained transformers perform better than other techniques on the SemEval emoji dataset, in addition to offering the privacy and distributed benefits of federated learning.
# Performance
> * Acc : 50.872 %
> * Mac-F1 : 37.475 %
> * Also see our [GitHub Repo](https://github.com/kareemgamalmahmoud/FEDERATED-LEARNING-BASED-MULTILINGUAL-EMOJI-PREDICTION-IN-CLEAN-AND-ATTACK-SCENARIOS)
# Dependencies
> * Python 3.6+
> * PyTorch 1.7.0+
> * Transformers 4.0.0+
# Usage
> To use the model, first install the `transformers` package from Hugging Face:
```bash
pip install transformers
```
> Then, you can load the model and tokenizer using the following code:
```python
from transformers import AutoModelForSequenceClassification, AutoTokenizer
import numpy as np
import urllib.request
import csv
```
```python
MODEL = "Karim-Gamal/BERT-base-finetuned-SemEval-2018-emojis-IID-Fed"
tokenizer = AutoTokenizer.from_pretrained(MODEL)
model = AutoModelForSequenceClassification.from_pretrained(MODEL)
```
> Once you have the tokenizer and model, you can preprocess your text and pass it to the model for prediction:
```python
# Preprocess text (username and link placeholders)
def preprocess(text):
new_text = []
for t in text.split(" "):
t = '@user' if t.startswith('@') and len(t) > 1 else t
t = 'http' if t.startswith('http') else t
new_text.append(t)
return " ".join(new_text)
text = "Hello world"
text = preprocess(text)
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
scores = output[0][0].detach().numpy()
```
> The `scores` variable contains the model's raw logits for each of the possible emoji labels; apply a softmax to turn them into probabilities. To get the top k predictions, you can use the following code:
```python
# download label mapping
labels=[]
mapping_link = "https://raw.githubusercontent.com/cardiffnlp/tweeteval/main/datasets/emoji/mapping.txt"
with urllib.request.urlopen(mapping_link) as f:
html = f.read().decode('utf-8').split("\n")
csvreader = csv.reader(html, delimiter='\t')
labels = [row[1] for row in csvreader if len(row) > 1]
k = 3 # number of top predictions to show
ranking = np.argsort(scores)
ranking = ranking[::-1]
for i in range(k):
l = labels[ranking[i]]
s = scores[ranking[i]]
print(f"{i+1}) {l} {np.round(float(s), 4)}")
```
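Note that the raw model outputs are logits; a softmax maps them to normalized probabilities without changing the `np.argsort` ranking (softmax is monotonic). A minimal pure-Python sketch, using illustrative scores rather than real model output:

```python
import math

def softmax(scores):
    # Subtract the max score for numerical stability before exponentiating.
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

# Illustrative logits only -- not real model output.
probs = softmax([2.0, 1.0, 0.1])
print(probs)  # largest logit gets the largest probability; values sum to 1
```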
## Note: this code is adapted from [Link](https://huggingface.co/cardiffnlp/twitter-roberta-base-emoji)
|
ChoboAvenger/DialoGPT-small-DocBot
|
[] | null |
{
"architectures": null,
"model_type": null,
"task_specific_params": {
"conversational": {
"max_length": null
},
"summarization": {
"early_stopping": null,
"length_penalty": null,
"max_length": null,
"min_length": null,
"no_repeat_ngram_size": null,
"num_beams": null,
"prefix": null
},
"text-generation": {
"do_sample": null,
"max_length": null
},
"translation_en_to_de": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_fr": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_ro": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
}
}
}
| 0 | null |
---
tags:
- CartPole-v1
- reinforce
- reinforcement-learning
- custom-implementation
- deep-rl-class
model-index:
- name: Reinforce-CartPole-v1_v1
results:
- task:
type: reinforcement-learning
name: reinforcement-learning
dataset:
name: CartPole-v1
type: CartPole-v1
metrics:
- type: mean_reward
value: 500.00 +/- 0.00
name: mean_reward
verified: false
---
# **Reinforce** Agent playing **CartPole-v1**
This is a trained model of a **Reinforce** agent playing **CartPole-v1** .
To learn to use this model and train yours, check Unit 4 of the Deep Reinforcement Learning Course: https://huggingface.co/deep-rl-course/unit4/introduction
|
ChrisP/xlm-roberta-base-finetuned-marc-en
|
[] | null |
{
"architectures": null,
"model_type": null,
"task_specific_params": {
"conversational": {
"max_length": null
},
"summarization": {
"early_stopping": null,
"length_penalty": null,
"max_length": null,
"min_length": null,
"no_repeat_ngram_size": null,
"num_beams": null,
"prefix": null
},
"text-generation": {
"do_sample": null,
"max_length": null
},
"translation_en_to_de": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_fr": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_ro": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
}
}
}
| 0 | null |
---
tags:
- autotrain
- text-classification
language:
- en
widget:
- text: "I love AutoTrain 🤗"
datasets:
- lucafrost/autotrain-data-claimcoherence-n500
co2_eq_emissions:
emissions: 0.002315234498119706
---
# Model Trained Using AutoTrain
- Problem type: Binary Classification
- Model ID: 39483103025
- CO2 Emissions (in grams): 0.0023
## Validation Metrics
- Loss: 0.517
- Accuracy: 0.780
- Precision: 0.804
- Recall: 0.804
- AUC: 0.817
- F1: 0.804
## Usage
You can use cURL to access this model:
```bash
$ curl -X POST -H "Authorization: Bearer YOUR_API_KEY" -H "Content-Type: application/json" -d '{"inputs": "I love AutoTrain"}' https://api-inference.huggingface.co/models/lucafrost/autotrain-claimcoherence-n500-39483103025
```
Or Python API:
```python
from transformers import AutoModelForSequenceClassification, AutoTokenizer
model = AutoModelForSequenceClassification.from_pretrained("lucafrost/autotrain-claimcoherence-n500-39483103025", use_auth_token=True)
tokenizer = AutoTokenizer.from_pretrained("lucafrost/autotrain-claimcoherence-n500-39483103025", use_auth_token=True)
inputs = tokenizer("I love AutoTrain", return_tensors="pt")
outputs = model(**inputs)
```
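The `outputs` object above holds raw logits for this binary classifier; a minimal plain-Python sketch of reading off the predicted class (the logits here are illustrative, not real model output):

```python
# Illustrative logits for a binary classifier -- not real model output.
logits = [0.3, 1.7]

# The predicted class is the index of the largest logit.
predicted_class = max(range(len(logits)), key=lambda i: logits[i])
print(predicted_class)  # -> 1
```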
|
ChrisVCB/DialoGPT-medium-cmjs
|
[
"pytorch",
"gpt2",
"text-generation",
"transformers",
"conversational"
] |
conversational
|
{
"architectures": [
"GPT2LMHeadModel"
],
"model_type": "gpt2",
"task_specific_params": {
"conversational": {
"max_length": 1000
},
"summarization": {
"early_stopping": null,
"length_penalty": null,
"max_length": null,
"min_length": null,
"no_repeat_ngram_size": null,
"num_beams": null,
"prefix": null
},
"text-generation": {
"do_sample": null,
"max_length": null
},
"translation_en_to_de": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_fr": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_ro": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
}
}
}
| 7 | null |
---
license: creativeml-openrail-m
tags:
- text-to-image
widget:
- text:
---
### ntrdttt Dreambooth model trained by DreamyFrog with [Hugging Face Dreambooth Training Space](https://huggingface.co/spaces/multimodalart/dreambooth-training) with the v2-1-512 base model
You can run your new concept via `diffusers` with the [Colab Notebook for Inference](https://colab.research.google.com/github/huggingface/notebooks/blob/main/diffusers/sd_dreambooth_inference.ipynb). Don't forget to use the concept prompts!
Sample pictures of:
|
ChristianOrr/madnet_keras
|
[
"tensorboard",
"dataset:flyingthings-3d",
"dataset:kitti",
"arxiv:1810.05424",
"vision",
"deep-stereo",
"depth-estimation",
"Tensorflow2",
"Keras",
"license:apache-2.0"
] |
depth-estimation
|
{
"architectures": null,
"model_type": null,
"task_specific_params": {
"conversational": {
"max_length": null
},
"summarization": {
"early_stopping": null,
"length_penalty": null,
"max_length": null,
"min_length": null,
"no_repeat_ngram_size": null,
"num_beams": null,
"prefix": null
},
"text-generation": {
"do_sample": null,
"max_length": null
},
"translation_en_to_de": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_fr": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_ro": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
}
}
}
| 0 | null |
---
license: apache-2.0
language:
- en
- es
- it
- fr
metrics:
- f1
---
# Federated Learning Based Multilingual Emoji Prediction
This repository contains code for training and evaluating transformer-based models for Uni/multilingual emoji prediction in clean and attack scenarios using Federated Learning. This work is described in the paper "Federated Learning-Based Multilingual Emoji Prediction in Clean and Attack Scenarios."
# Abstract
Federated learning is a growing field in the machine learning community due to its decentralized and private design. Model training in federated learning is distributed over multiple clients, giving access to lots of client data while maintaining privacy. A server then aggregates the training done on these multiple clients without access to their data; such data could include emojis, which are widely used in social media services and instant messaging platforms to express users' sentiments. This paper proposes federated learning-based multilingual emoji prediction in both clean and attack scenarios. Emoji prediction data have been crawled from both Twitter and SemEval emoji datasets. This data is used to train and evaluate different transformer model sizes, including a sparsely activated transformer, with either the assumption of clean data in all clients or data poisoned via a label-flipping attack in some clients. Experimental results on these models show that federated learning in either clean or attacked scenarios performs similarly to centralized training in multilingual emoji prediction on seen and unseen languages under different data sources and distributions. Our trained transformers perform better than other techniques on the SemEval emoji dataset, in addition to offering the privacy and distributed benefits of federated learning.
# Performance
> * Acc : 48.000 %
> * Mac-F1 : 35.321 %
> * Also see our [GitHub Repo](https://github.com/kareemgamalmahmoud/FEDERATED-LEARNING-BASED-MULTILINGUAL-EMOJI-PREDICTION-IN-CLEAN-AND-ATTACK-SCENARIOS)
# Dependencies
> * Python 3.6+
> * PyTorch 1.7.0+
> * Transformers 4.0.0+
# Usage
> To use the model, first install the `transformers` package from Hugging Face:
```bash
pip install transformers
```
> Then, you can load the model and tokenizer using the following code:
```python
from transformers import AutoModelForSequenceClassification, AutoTokenizer
import numpy as np
import urllib.request
import csv
```
```python
MODEL = "Karim-Gamal/BERT-base-finetuned-emojis-cen-2"
tokenizer = AutoTokenizer.from_pretrained(MODEL)
model = AutoModelForSequenceClassification.from_pretrained(MODEL)
```
> Once you have the tokenizer and model, you can preprocess your text and pass it to the model for prediction:
```python
# Preprocess text (username and link placeholders)
def preprocess(text):
new_text = []
for t in text.split(" "):
t = '@user' if t.startswith('@') and len(t) > 1 else t
t = 'http' if t.startswith('http') else t
new_text.append(t)
return " ".join(new_text)
text = "Hello world"
text = preprocess(text)
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
scores = output[0][0].detach().numpy()
```
> The `scores` variable contains the model's raw logits for each of the possible emoji labels; apply a softmax to turn them into probabilities. To get the top k predictions, you can use the following code:
```python
# download label mapping
labels=[]
mapping_link = "https://raw.githubusercontent.com/cardiffnlp/tweeteval/main/datasets/emoji/mapping.txt"
with urllib.request.urlopen(mapping_link) as f:
html = f.read().decode('utf-8').split("\n")
csvreader = csv.reader(html, delimiter='\t')
labels = [row[1] for row in csvreader if len(row) > 1]
k = 3 # number of top predictions to show
ranking = np.argsort(scores)
ranking = ranking[::-1]
for i in range(k):
l = labels[ranking[i]]
s = scores[ranking[i]]
print(f"{i+1}) {l} {np.round(float(s), 4)}")
```
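Note that the raw model outputs are logits; a softmax maps them to normalized probabilities without changing the `np.argsort` ranking (softmax is monotonic). A minimal pure-Python sketch, using illustrative scores rather than real model output:

```python
import math

def softmax(scores):
    # Subtract the max score for numerical stability before exponentiating.
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

# Illustrative logits only -- not real model output.
probs = softmax([2.0, 1.0, 0.1])
print(probs)  # largest logit gets the largest probability; values sum to 1
```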
## Note: this code is adapted from [Link](https://huggingface.co/cardiffnlp/twitter-roberta-base-emoji)
|
ChukSamuels/DialoGPT-small-Dr.FauciBot
|
[
"pytorch",
"gpt2",
"text-generation",
"transformers",
"conversational"
] |
conversational
|
{
"architectures": [
"GPT2LMHeadModel"
],
"model_type": "gpt2",
"task_specific_params": {
"conversational": {
"max_length": 1000
},
"summarization": {
"early_stopping": null,
"length_penalty": null,
"max_length": null,
"min_length": null,
"no_repeat_ngram_size": null,
"num_beams": null,
"prefix": null
},
"text-generation": {
"do_sample": null,
"max_length": null
},
"translation_en_to_de": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_fr": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_ro": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
}
}
}
| 13 | 2023-03-07T13:21:02Z |
---
pipeline_tag: sentence-similarity
tags:
- sentence-transformers
- feature-extraction
- sentence-similarity
- transformers
---
# {MODEL_NAME}
This is a [sentence-transformers](https://www.SBERT.net) model: It maps sentences & paragraphs to a 384-dimensional dense vector space and can be used for tasks like clustering or semantic search.
<!--- Describe your model here -->
## Usage (Sentence-Transformers)
Using this model becomes easy when you have [sentence-transformers](https://www.SBERT.net) installed:
```
pip install -U sentence-transformers
```
Then you can use the model like this:
```python
from sentence_transformers import SentenceTransformer
sentences = ["This is an example sentence", "Each sentence is converted"]
model = SentenceTransformer('{MODEL_NAME}')
embeddings = model.encode(sentences)
print(embeddings)
```
## Usage (HuggingFace Transformers)
Without [sentence-transformers](https://www.SBERT.net), you can use the model like this: First, you pass your input through the transformer model, then you have to apply the right pooling operation on top of the contextualized word embeddings.
```python
from transformers import AutoTokenizer, AutoModel
import torch
#Mean Pooling - Take attention mask into account for correct averaging
def mean_pooling(model_output, attention_mask):
token_embeddings = model_output[0] #First element of model_output contains all token embeddings
input_mask_expanded = attention_mask.unsqueeze(-1).expand(token_embeddings.size()).float()
return torch.sum(token_embeddings * input_mask_expanded, 1) / torch.clamp(input_mask_expanded.sum(1), min=1e-9)
# Sentences we want sentence embeddings for
sentences = ['This is an example sentence', 'Each sentence is converted']
# Load model from HuggingFace Hub
tokenizer = AutoTokenizer.from_pretrained('{MODEL_NAME}')
model = AutoModel.from_pretrained('{MODEL_NAME}')
# Tokenize sentences
encoded_input = tokenizer(sentences, padding=True, truncation=True, return_tensors='pt')
# Compute token embeddings
with torch.no_grad():
model_output = model(**encoded_input)
# Perform pooling. In this case, mean pooling.
sentence_embeddings = mean_pooling(model_output, encoded_input['attention_mask'])
print("Sentence embeddings:")
print(sentence_embeddings)
```
## Evaluation Results
<!--- Describe how your model was evaluated -->
For an automated evaluation of this model, see the *Sentence Embeddings Benchmark*: [https://seb.sbert.net](https://seb.sbert.net?model_name={MODEL_NAME})
## Training
The model was trained with the parameters:
**DataLoader**:
`torch.utils.data.dataloader.DataLoader` of length 148 with parameters:
```
{'batch_size': 32, 'sampler': 'torch.utils.data.sampler.RandomSampler', 'batch_sampler': 'torch.utils.data.sampler.BatchSampler'}
```
**Loss**:
`sentence_transformers.losses.CosineSimilarityLoss.CosineSimilarityLoss`
Parameters of the fit()-Method:
```
{
"epochs": 1,
"evaluation_steps": 0,
"evaluator": "NoneType",
"max_grad_norm": 1,
"optimizer_class": "<class 'torch.optim.adamw.AdamW'>",
"optimizer_params": {
"lr": 2e-05
},
"scheduler": "WarmupLinear",
"steps_per_epoch": 148,
"warmup_steps": 15,
"weight_decay": 0.01
}
```
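`CosineSimilarityLoss` fits the model so that the cosine similarity between two sentence embeddings matches a gold score. A minimal pure-Python sketch of the cosine similarity itself (the vectors are illustrative, not real embeddings):

```python
import math

def cosine_similarity(a, b):
    # Dot product of the vectors divided by the product of their norms.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Identical directions give similarity 1.0 (up to floating point).
print(cosine_similarity([1.0, 2.0], [1.0, 2.0]))
```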
## Full Model Architecture
```
SentenceTransformer(
(0): Transformer({'max_seq_length': 128, 'do_lower_case': False}) with Transformer model: BertModel
(1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False})
)
```
## Citing & Authors
<!--- Describe where people can find more information -->
|
Chun/DialoGPT-medium-dailydialog
|
[
"pytorch",
"gpt2",
"text-generation",
"transformers"
] |
text-generation
|
{
"architectures": [
"GPT2LMHeadModel"
],
"model_type": "gpt2",
"task_specific_params": {
"conversational": {
"max_length": 1000
},
"summarization": {
"early_stopping": null,
"length_penalty": null,
"max_length": null,
"min_length": null,
"no_repeat_ngram_size": null,
"num_beams": null,
"prefix": null
},
"text-generation": {
"do_sample": null,
"max_length": null
},
"translation_en_to_de": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_fr": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_ro": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
}
}
}
| 15 | null |
---
license: apache-2.0
tags:
- generated_from_trainer
metrics:
- wer
model-index:
- name: my_awesome_asr_mind_model
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# my_awesome_asr_mind_model
This model is a fine-tuned version of [facebook/wav2vec2-base](https://huggingface.co/facebook/wav2vec2-base) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 3.4146
- Wer: 1.0
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 16
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 30
- training_steps: 300
- mixed_precision_training: Native AMP
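The total train batch size listed above is the product of the per-device batch size and the gradient accumulation steps; a quick check of that arithmetic:

```python
train_batch_size = 8
gradient_accumulation_steps = 2

# Effective batch size per optimizer step.
total_train_batch_size = train_batch_size * gradient_accumulation_steps
print(total_train_batch_size)  # -> 16, matching the value in the list above
```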
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:-----:|:----:|:---------------:|:---:|
| 50.2402 | 6.0 | 30 | 35.6862 | 1.0 |
| 30.6841 | 12.0 | 60 | 6.6236 | 1.0 |
| 8.3238 | 18.0 | 90 | 4.0080 | 1.0 |
| 4.663 | 24.0 | 120 | 3.7875 | 1.0 |
| 3.986 | 30.0 | 150 | 3.6613 | 1.0 |
| 3.9205 | 36.0 | 180 | 3.5573 | 1.0 |
| 3.7835 | 42.0 | 210 | 3.4962 | 1.0 |
| 3.7003 | 48.0 | 240 | 3.4505 | 1.0 |
| 3.6903 | 54.0 | 270 | 3.4273 | 1.0 |
| 3.6331 | 60.0 | 300 | 3.4146 | 1.0 |
### Framework versions
- Transformers 4.26.1
- Pytorch 1.13.1+cu116
- Datasets 2.10.1
- Tokenizers 0.13.2
|
Chun/DialoGPT-small-dailydialog
|
[
"pytorch",
"gpt2",
"text-generation",
"transformers"
] |
text-generation
|
{
"architectures": [
"GPT2LMHeadModel"
],
"model_type": "gpt2",
"task_specific_params": {
"conversational": {
"max_length": 1000
},
"summarization": {
"early_stopping": null,
"length_penalty": null,
"max_length": null,
"min_length": null,
"no_repeat_ngram_size": null,
"num_beams": null,
"prefix": null
},
"text-generation": {
"do_sample": null,
"max_length": null
},
"translation_en_to_de": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_fr": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_ro": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
}
}
}
| 10 | null |
---
pipeline_tag: sentence-similarity
tags:
- sentence-transformers
- feature-extraction
- sentence-similarity
- transformers
---
# {MODEL_NAME}
This is a [sentence-transformers](https://www.SBERT.net) model: It maps sentences & paragraphs to a 384-dimensional dense vector space and can be used for tasks like clustering or semantic search.
<!--- Describe your model here -->
## Usage (Sentence-Transformers)
Using this model becomes easy when you have [sentence-transformers](https://www.SBERT.net) installed:
```
pip install -U sentence-transformers
```
Then you can use the model like this:
```python
from sentence_transformers import SentenceTransformer
sentences = ["This is an example sentence", "Each sentence is converted"]
model = SentenceTransformer('{MODEL_NAME}')
embeddings = model.encode(sentences)
print(embeddings)
```
## Usage (HuggingFace Transformers)
Without [sentence-transformers](https://www.SBERT.net), you can use the model like this: First, you pass your input through the transformer model, then you have to apply the right pooling operation on top of the contextualized word embeddings.
```python
from transformers import AutoTokenizer, AutoModel
import torch
#Mean Pooling - Take attention mask into account for correct averaging
def mean_pooling(model_output, attention_mask):
token_embeddings = model_output[0] #First element of model_output contains all token embeddings
input_mask_expanded = attention_mask.unsqueeze(-1).expand(token_embeddings.size()).float()
return torch.sum(token_embeddings * input_mask_expanded, 1) / torch.clamp(input_mask_expanded.sum(1), min=1e-9)
# Sentences we want sentence embeddings for
sentences = ['This is an example sentence', 'Each sentence is converted']
# Load model from HuggingFace Hub
tokenizer = AutoTokenizer.from_pretrained('{MODEL_NAME}')
model = AutoModel.from_pretrained('{MODEL_NAME}')
# Tokenize sentences
encoded_input = tokenizer(sentences, padding=True, truncation=True, return_tensors='pt')
# Compute token embeddings
with torch.no_grad():
model_output = model(**encoded_input)
# Perform pooling. In this case, mean pooling.
sentence_embeddings = mean_pooling(model_output, encoded_input['attention_mask'])
print("Sentence embeddings:")
print(sentence_embeddings)
```
## Evaluation Results
<!--- Describe how your model was evaluated -->
For an automated evaluation of this model, see the *Sentence Embeddings Benchmark*: [https://seb.sbert.net](https://seb.sbert.net?model_name={MODEL_NAME})
## Training
The model was trained with the parameters:
**DataLoader**:
`torch.utils.data.dataloader.DataLoader` of length 148 with parameters:
```
{'batch_size': 32, 'sampler': 'torch.utils.data.sampler.RandomSampler', 'batch_sampler': 'torch.utils.data.sampler.BatchSampler'}
```
**Loss**:
`sentence_transformers.losses.CosineSimilarityLoss.CosineSimilarityLoss`
Parameters of the fit()-Method:
```
{
"epochs": 1,
"evaluation_steps": 0,
"evaluator": "NoneType",
"max_grad_norm": 1,
"optimizer_class": "<class 'torch.optim.adamw.AdamW'>",
"optimizer_params": {
"lr": 2e-05
},
"scheduler": "WarmupLinear",
"steps_per_epoch": 148,
"warmup_steps": 15,
"weight_decay": 0.01
}
```
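`CosineSimilarityLoss` fits the model so that the cosine similarity between two sentence embeddings matches a gold score. A minimal pure-Python sketch of the cosine similarity itself (the vectors are illustrative, not real embeddings):

```python
import math

def cosine_similarity(a, b):
    # Dot product of the vectors divided by the product of their norms.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Identical directions give similarity 1.0 (up to floating point).
print(cosine_similarity([1.0, 2.0], [1.0, 2.0]))
```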
## Full Model Architecture
```
SentenceTransformer(
(0): Transformer({'max_seq_length': 128, 'do_lower_case': False}) with Transformer model: BertModel
(1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False})
)
```
## Citing & Authors
<!--- Describe where people can find more information -->
|
Chun/w-en2zh-hsk
|
[
"pytorch",
"marian",
"text2text-generation",
"transformers",
"autotrain_compatible"
] |
text2text-generation
|
{
"architectures": [
"MarianMTModel"
],
"model_type": "marian",
"task_specific_params": {
"conversational": {
"max_length": null
},
"summarization": {
"early_stopping": null,
"length_penalty": null,
"max_length": null,
"min_length": null,
"no_repeat_ngram_size": null,
"num_beams": null,
"prefix": null
},
"text-generation": {
"do_sample": null,
"max_length": null
},
"translation_en_to_de": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_fr": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_ro": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
}
}
}
| 1 | null |
---
tags:
- FrozenLake-v1-4x4-no_slippery
- q-learning
- reinforcement-learning
- custom-implementation
model-index:
- name: q-FrozenLake-v1-4x4-noSlippery
results:
- task:
type: reinforcement-learning
name: reinforcement-learning
dataset:
name: FrozenLake-v1-4x4-no_slippery
type: FrozenLake-v1-4x4-no_slippery
metrics:
- type: mean_reward
value: 1.00 +/- 0.00
name: mean_reward
verified: false
---
# **Q-Learning** Agent playing **FrozenLake-v1**
This is a trained model of a **Q-Learning** agent playing **FrozenLake-v1** .
## Usage
```python
# `load_from_hub` is a helper defined in the Deep RL Course notebooks;
# it downloads and unpickles the saved Q-table dictionary from the Hub.
model = load_from_hub(repo_id="ljones/q-FrozenLake-v1-4x4-noSlippery", filename="q-learning.pkl")

# Don't forget to check if you need to add additional attributes (is_slippery=False etc)
env = gym.make(model["env_id"])
```
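At evaluation time the agent acts greedily with respect to the loaded Q-table; a minimal sketch with a toy Q-table (the values are illustrative, not from this model):

```python
# Toy Q-table: one row per state, one column per action (illustrative values).
qtable = [
    [0.1, 0.5, 0.2, 0.0],
    [0.0, 0.0, 0.9, 0.3],
]

def greedy_action(qtable, state):
    # Pick the action with the highest Q-value for this state.
    row = qtable[state]
    return max(range(len(row)), key=lambda a: row[a])

print(greedy_action(qtable, 0))  # -> 1, since action 1 has the highest Q-value
```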
|
Chun/w-en2zh-mtm
|
[
"pytorch",
"mbart",
"text2text-generation",
"transformers",
"autotrain_compatible"
] |
text2text-generation
|
{
"architectures": [
"MBartForConditionalGeneration"
],
"model_type": "mbart",
"task_specific_params": {
"conversational": {
"max_length": null
},
"summarization": {
"early_stopping": null,
"length_penalty": null,
"max_length": null,
"min_length": null,
"no_repeat_ngram_size": null,
"num_beams": null,
"prefix": null
},
"text-generation": {
"do_sample": null,
"max_length": null
},
"translation_en_to_de": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_fr": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_ro": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
}
}
}
| 7 | null |
---
license: apache-2.0
language:
- en
- es
- it
- fr
metrics:
- f1
---
# Federated Learning Based Multilingual Emoji Prediction
This repository contains code for training and evaluating transformer-based models for Uni/multilingual emoji prediction in clean and attack scenarios using Federated Learning. This work is described in the paper "Federated Learning-Based Multilingual Emoji Prediction in Clean and Attack Scenarios."
# Abstract
Federated learning is a growing field in the machine learning community due to its decentralized and private design. Model training in federated learning is distributed over multiple clients, giving access to lots of client data while maintaining privacy. A server then aggregates the training done on these multiple clients without access to their data; such data could include emojis, which are widely used in social media services and instant messaging platforms to express users' sentiments. This paper proposes federated learning-based multilingual emoji prediction in both clean and attack scenarios. Emoji prediction data have been crawled from both Twitter and SemEval emoji datasets. This data is used to train and evaluate different transformer model sizes, including a sparsely activated transformer, with either the assumption of clean data in all clients or data poisoned via a label-flipping attack in some clients. Experimental results on these models show that federated learning in either clean or attacked scenarios performs similarly to centralized training in multilingual emoji prediction on seen and unseen languages under different data sources and distributions. Our trained transformers perform better than other techniques on the SemEval emoji dataset, in addition to offering the privacy and distributed benefits of federated learning.
# Performance
> * Acc : 49.488 %
> * Mac-F1 : 35.730 %
> * Also see our [GitHub Repo](https://github.com/kareemgamalmahmoud/FEDERATED-LEARNING-BASED-MULTILINGUAL-EMOJI-PREDICTION-IN-CLEAN-AND-ATTACK-SCENARIOS)
# Dependencies
> * Python 3.6+
> * PyTorch 1.7.0+
> * Transformers 4.0.0+
# Usage
> To use the model, first install the `transformers` package from Hugging Face:
```shell
pip install transformers
```
> Then, you can load the model and tokenizer using the following code:
```python
from transformers import AutoModelForSequenceClassification, AutoTokenizer
import numpy as np
import urllib.request
import csv
```
```python
MODEL = "Karim-Gamal/BERT-base-finetuned-emojis-cen-2"
tokenizer = AutoTokenizer.from_pretrained(MODEL)
model = AutoModelForSequenceClassification.from_pretrained(MODEL)
```
> Once you have the tokenizer and model, you can preprocess your text and pass it to the model for prediction:
```python
# Preprocess text (username and link placeholders)
def preprocess(text):
new_text = []
for t in text.split(" "):
t = '@user' if t.startswith('@') and len(t) > 1 else t
t = 'http' if t.startswith('http') else t
new_text.append(t)
return " ".join(new_text)
text = "Hello world"
text = preprocess(text)
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
scores = output[0][0].detach().numpy()
```
> The `scores` variable contains the raw classification logits for each of the possible emoji labels (apply a softmax if you need probabilities). To get the top k predictions, you can use the following code:
```python
# download label mapping
labels=[]
mapping_link = "https://raw.githubusercontent.com/cardiffnlp/tweeteval/main/datasets/emoji/mapping.txt"
with urllib.request.urlopen(mapping_link) as f:
html = f.read().decode('utf-8').split("\n")
csvreader = csv.reader(html, delimiter='\t')
labels = [row[1] for row in csvreader if len(row) > 1]
k = 3 # number of top predictions to show
ranking = np.argsort(scores)
ranking = ranking[::-1]
for i in range(k):
l = labels[ranking[i]]
s = scores[ranking[i]]
print(f"{i+1}) {l} {np.round(float(s), 4)}")
```
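Since the model outputs raw logits rather than probabilities, a softmax can be applied when calibrated probabilities are needed. A minimal sketch using only NumPy (the `logits` values here are made up for illustration, standing in for the `scores` array returned by the model above):

```python
import numpy as np

def softmax(logits):
    # Subtract the max for numerical stability before exponentiating.
    shifted = logits - np.max(logits)
    exps = np.exp(shifted)
    return exps / exps.sum()

# Stand-in for the `scores` array produced by the model above.
logits = np.array([2.0, 1.0, 0.1])
probs = softmax(logits)

print(np.round(probs, 3))  # the entries sum to 1 and preserve the ranking
```

The ranking from `np.argsort` is unchanged by the softmax, so this step only matters when the magnitudes themselves are reported.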
## Note: the code above is adapted from [cardiffnlp/twitter-roberta-base-emoji](https://huggingface.co/cardiffnlp/twitter-roberta-base-emoji)
|
Chun/w-en2zh-otm
|
[
"pytorch",
"mbart",
"text2text-generation",
"transformers",
"autotrain_compatible"
] |
text2text-generation
|
{
"architectures": [
"MBartForConditionalGeneration"
],
"model_type": "mbart",
"task_specific_params": {
"conversational": {
"max_length": null
},
"summarization": {
"early_stopping": null,
"length_penalty": null,
"max_length": null,
"min_length": null,
"no_repeat_ngram_size": null,
"num_beams": null,
"prefix": null
},
"text-generation": {
"do_sample": null,
"max_length": null
},
"translation_en_to_de": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_fr": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_ro": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
}
}
}
| 7 | null |
---
license: apache-2.0
language:
- en
- es
- it
- fr
metrics:
- f1
---
# Federated Learning Based Multilingual Emoji Prediction
This repository contains code for training and evaluating transformer-based models for Uni/multilingual emoji prediction in clean and attack scenarios using Federated Learning. This work is described in the paper "Federated Learning-Based Multilingual Emoji Prediction in Clean and Attack Scenarios."
# Abstract
Federated learning is a growing field in the machine learning community due to its decentralized and private design. Model training in federated learning is distributed over multiple clients giving access to lots of client data while maintaining privacy. Then, a server aggregates the training done on these multiple clients without access to their data, which could be emojis widely used in any social media service and instant messaging platforms to express users' sentiments. This paper proposes federated learning-based multilingual emoji prediction in both clean and attack scenarios. Emoji prediction data have been crawled from both Twitter and SemEval emoji datasets. This data is used to train and evaluate different transformer model sizes including a sparsely activated transformer with either the assumption of clean data in all clients or poisoned data via label flipping attack in some clients. Experimental results on these models show that federated learning in either clean or attacked scenarios performs similarly to centralized training in multilingual emoji prediction on seen and unseen languages under different data sources and distributions. Our trained transformers perform better than other techniques on the SemEval emoji dataset in addition to the privacy as well as distributed benefits of federated learning.
# Performance
> * Acc : 50.344 %
> * Mac-F1 : 36.849 %
> * Also see our [GitHub Repo](https://github.com/kareemgamalmahmoud/FEDERATED-LEARNING-BASED-MULTILINGUAL-EMOJI-PREDICTION-IN-CLEAN-AND-ATTACK-SCENARIOS)
# Dependencies
> * Python 3.6+
> * PyTorch 1.7.0+
> * Transformers 4.0.0+
# Usage
> To use the model, first install the `transformers` package from Hugging Face:
```shell
pip install transformers
```
> Then, you can load the model and tokenizer using the following code:
```python
from transformers import AutoModelForSequenceClassification, AutoTokenizer
import numpy as np
import urllib.request
import csv
```
```python
MODEL = "Karim-Gamal/BERT-base-finetuned-emojis-IID-Fed"
tokenizer = AutoTokenizer.from_pretrained(MODEL)
model = AutoModelForSequenceClassification.from_pretrained(MODEL)
```
> Once you have the tokenizer and model, you can preprocess your text and pass it to the model for prediction:
```python
# Preprocess text (username and link placeholders)
def preprocess(text):
new_text = []
for t in text.split(" "):
t = '@user' if t.startswith('@') and len(t) > 1 else t
t = 'http' if t.startswith('http') else t
new_text.append(t)
return " ".join(new_text)
text = "Hello world"
text = preprocess(text)
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
scores = output[0][0].detach().numpy()
```
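A quick sanity check of the placeholder preprocessing above (the example tweet is invented; the helper is redefined here so the snippet is self-contained):

```python
# Same placeholder logic as the preprocess() helper above: usernames
# become '@user' and links become 'http'.
def preprocess(text):
    new_text = []
    for t in text.split(" "):
        t = '@user' if t.startswith('@') and len(t) > 1 else t
        t = 'http' if t.startswith('http') else t
        new_text.append(t)
    return " ".join(new_text)

print(preprocess("@john check https://example.com out"))
# -> "@user check http out"
```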
> The `scores` variable contains the raw classification logits for each of the possible emoji labels (apply a softmax if you need probabilities). To get the top k predictions, you can use the following code:
```python
# download label mapping
labels=[]
mapping_link = "https://raw.githubusercontent.com/cardiffnlp/tweeteval/main/datasets/emoji/mapping.txt"
with urllib.request.urlopen(mapping_link) as f:
html = f.read().decode('utf-8').split("\n")
csvreader = csv.reader(html, delimiter='\t')
labels = [row[1] for row in csvreader if len(row) > 1]
k = 3 # number of top predictions to show
ranking = np.argsort(scores)
ranking = ranking[::-1]
for i in range(k):
l = labels[ranking[i]]
s = scores[ranking[i]]
print(f"{i+1}) {l} {np.round(float(s), 4)}")
```
## Note: the code above is adapted from [cardiffnlp/twitter-roberta-base-emoji](https://huggingface.co/cardiffnlp/twitter-roberta-base-emoji)
|
Chun/w-zh2en-hsk
|
[
"pytorch",
"marian",
"text2text-generation",
"transformers",
"autotrain_compatible"
] |
text2text-generation
|
{
"architectures": [
"MarianMTModel"
],
"model_type": "marian",
"task_specific_params": {
"conversational": {
"max_length": null
},
"summarization": {
"early_stopping": null,
"length_penalty": null,
"max_length": null,
"min_length": null,
"no_repeat_ngram_size": null,
"num_beams": null,
"prefix": null
},
"text-generation": {
"do_sample": null,
"max_length": null
},
"translation_en_to_de": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_fr": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_ro": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
}
}
}
| 3 | null |
---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- squad
model-index:
- name: bert-finetuned-squad
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-finetuned-squad
This model is a fine-tuned version of [bert-base-cased](https://huggingface.co/bert-base-cased) on the squad dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3
- mixed_precision_training: Native AMP
### Training results
### Framework versions
- Transformers 4.26.1
- Pytorch 1.13.1+cu116
- Datasets 2.10.1
- Tokenizers 0.13.2
|
Chun/w-zh2en-mtm
|
[
"pytorch",
"mbart",
"text2text-generation",
"transformers",
"autotrain_compatible"
] |
text2text-generation
|
{
"architectures": [
"MBartForConditionalGeneration"
],
"model_type": "mbart",
"task_specific_params": {
"conversational": {
"max_length": null
},
"summarization": {
"early_stopping": null,
"length_penalty": null,
"max_length": null,
"min_length": null,
"no_repeat_ngram_size": null,
"num_beams": null,
"prefix": null
},
"text-generation": {
"do_sample": null,
"max_length": null
},
"translation_en_to_de": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_fr": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_ro": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
}
}
}
| 8 | null |
---
license: apache-2.0
language:
- en
- es
- it
- fr
metrics:
- f1
---
# Federated Learning Based Multilingual Emoji Prediction
This repository contains code for training and evaluating transformer-based models for Uni/multilingual emoji prediction in clean and attack scenarios using Federated Learning. This work is described in the paper "Federated Learning-Based Multilingual Emoji Prediction in Clean and Attack Scenarios."
# Abstract
Federated learning is a growing field in the machine learning community due to its decentralized and private design. Model training in federated learning is distributed over multiple clients giving access to lots of client data while maintaining privacy. Then, a server aggregates the training done on these multiple clients without access to their data, which could be emojis widely used in any social media service and instant messaging platforms to express users' sentiments. This paper proposes federated learning-based multilingual emoji prediction in both clean and attack scenarios. Emoji prediction data have been crawled from both Twitter and SemEval emoji datasets. This data is used to train and evaluate different transformer model sizes including a sparsely activated transformer with either the assumption of clean data in all clients or poisoned data via label flipping attack in some clients. Experimental results on these models show that federated learning in either clean or attacked scenarios performs similarly to centralized training in multilingual emoji prediction on seen and unseen languages under different data sources and distributions. Our trained transformers perform better than other techniques on the SemEval emoji dataset in addition to the privacy as well as distributed benefits of federated learning.
# Performance
> * Acc : 49.582 %
> * Mac-F1 : 36.317 %
> * Also see our [GitHub Repo](https://github.com/kareemgamalmahmoud/FEDERATED-LEARNING-BASED-MULTILINGUAL-EMOJI-PREDICTION-IN-CLEAN-AND-ATTACK-SCENARIOS)
# Dependencies
> * Python 3.6+
> * PyTorch 1.7.0+
> * Transformers 4.0.0+
# Usage
> To use the model, first install the `transformers` package from Hugging Face:
```shell
pip install transformers
```
> Then, you can load the model and tokenizer using the following code:
```python
from transformers import AutoModelForSequenceClassification, AutoTokenizer
import numpy as np
import urllib.request
import csv
```
```python
MODEL = "Karim-Gamal/BERT-base-finetuned-emojis-non-IID-Fed"
tokenizer = AutoTokenizer.from_pretrained(MODEL)
model = AutoModelForSequenceClassification.from_pretrained(MODEL)
```
> Once you have the tokenizer and model, you can preprocess your text and pass it to the model for prediction:
```python
# Preprocess text (username and link placeholders)
def preprocess(text):
new_text = []
for t in text.split(" "):
t = '@user' if t.startswith('@') and len(t) > 1 else t
t = 'http' if t.startswith('http') else t
new_text.append(t)
return " ".join(new_text)
text = "Hello world"
text = preprocess(text)
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
scores = output[0][0].detach().numpy()
```
> The `scores` variable contains the raw classification logits for each of the possible emoji labels (apply a softmax if you need probabilities). To get the top k predictions, you can use the following code:
```python
# download label mapping
labels=[]
mapping_link = "https://raw.githubusercontent.com/cardiffnlp/tweeteval/main/datasets/emoji/mapping.txt"
with urllib.request.urlopen(mapping_link) as f:
html = f.read().decode('utf-8').split("\n")
csvreader = csv.reader(html, delimiter='\t')
labels = [row[1] for row in csvreader if len(row) > 1]
k = 3 # number of top predictions to show
ranking = np.argsort(scores)
ranking = ranking[::-1]
for i in range(k):
l = labels[ranking[i]]
s = scores[ranking[i]]
print(f"{i+1}) {l} {np.round(float(s), 4)}")
```
## Note: the code above is adapted from [cardiffnlp/twitter-roberta-base-emoji](https://huggingface.co/cardiffnlp/twitter-roberta-base-emoji)
|
Chun/w-zh2en-mto
|
[
"pytorch",
"mbart",
"text2text-generation",
"transformers",
"autotrain_compatible"
] |
text2text-generation
|
{
"architectures": [
"MBartForConditionalGeneration"
],
"model_type": "mbart",
"task_specific_params": {
"conversational": {
"max_length": null
},
"summarization": {
"early_stopping": null,
"length_penalty": null,
"max_length": null,
"min_length": null,
"no_repeat_ngram_size": null,
"num_beams": null,
"prefix": null
},
"text-generation": {
"do_sample": null,
"max_length": null
},
"translation_en_to_de": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_fr": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_ro": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
}
}
}
| 7 | null |
---
tags:
- conversational
---
# Harry Potter DialoGPT Model
|
Chungu424/repo
|
[] | null |
{
"architectures": null,
"model_type": null,
"task_specific_params": {
"conversational": {
"max_length": null
},
"summarization": {
"early_stopping": null,
"length_penalty": null,
"max_length": null,
"min_length": null,
"no_repeat_ngram_size": null,
"num_beams": null,
"prefix": null
},
"text-generation": {
"do_sample": null,
"max_length": null
},
"translation_en_to_de": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_fr": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_ro": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
}
}
}
| 0 | null |
---
license: apache-2.0
language:
- en
- es
- it
- fr
metrics:
- f1
---
# Federated Learning Based Multilingual Emoji Prediction
This repository contains code for training and evaluating transformer-based models for Uni/multilingual emoji prediction in clean and attack scenarios using Federated Learning. This work is described in the paper "Federated Learning-Based Multilingual Emoji Prediction in Clean and Attack Scenarios."
# Abstract
Federated learning is a growing field in the machine learning community due to its decentralized and private design. Model training in federated learning is distributed over multiple clients giving access to lots of client data while maintaining privacy. Then, a server aggregates the training done on these multiple clients without access to their data, which could be emojis widely used in any social media service and instant messaging platforms to express users' sentiments. This paper proposes federated learning-based multilingual emoji prediction in both clean and attack scenarios. Emoji prediction data have been crawled from both Twitter and SemEval emoji datasets. This data is used to train and evaluate different transformer model sizes including a sparsely activated transformer with either the assumption of clean data in all clients or poisoned data via label flipping attack in some clients. Experimental results on these models show that federated learning in either clean or attacked scenarios performs similarly to centralized training in multilingual emoji prediction on seen and unseen languages under different data sources and distributions. Our trained transformers perform better than other techniques on the SemEval emoji dataset in addition to the privacy as well as distributed benefits of federated learning.
# Performance
> * Acc : 49.246 %
> * Mac-F1 : 35.649 %
> * Also see our [GitHub Repo](https://github.com/kareemgamalmahmoud/FEDERATED-LEARNING-BASED-MULTILINGUAL-EMOJI-PREDICTION-IN-CLEAN-AND-ATTACK-SCENARIOS)
# Dependencies
> * Python 3.6+
> * PyTorch 1.7.0+
> * Transformers 4.0.0+
# Usage
> To use the model, first install the `transformers` package from Hugging Face:
```shell
pip install transformers
```
> Then, you can load the model and tokenizer using the following code:
```python
from transformers import AutoModelForSequenceClassification, AutoTokenizer
import numpy as np
import urllib.request
import csv
```
```python
MODEL = "Karim-Gamal/BERT-base-finetuned-emojis-1-client-toxic-cen-2"
tokenizer = AutoTokenizer.from_pretrained(MODEL)
model = AutoModelForSequenceClassification.from_pretrained(MODEL)
```
> Once you have the tokenizer and model, you can preprocess your text and pass it to the model for prediction:
```python
# Preprocess text (username and link placeholders)
def preprocess(text):
new_text = []
for t in text.split(" "):
t = '@user' if t.startswith('@') and len(t) > 1 else t
t = 'http' if t.startswith('http') else t
new_text.append(t)
return " ".join(new_text)
text = "Hello world"
text = preprocess(text)
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
scores = output[0][0].detach().numpy()
```
> The `scores` variable contains the raw classification logits for each of the possible emoji labels (apply a softmax if you need probabilities). To get the top k predictions, you can use the following code:
```python
# download label mapping
labels=[]
mapping_link = "https://raw.githubusercontent.com/cardiffnlp/tweeteval/main/datasets/emoji/mapping.txt"
with urllib.request.urlopen(mapping_link) as f:
html = f.read().decode('utf-8').split("\n")
csvreader = csv.reader(html, delimiter='\t')
labels = [row[1] for row in csvreader if len(row) > 1]
k = 3 # number of top predictions to show
ranking = np.argsort(scores)
ranking = ranking[::-1]
for i in range(k):
l = labels[ranking[i]]
s = scores[ranking[i]]
print(f"{i+1}) {l} {np.round(float(s), 4)}")
```
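The downloaded mapping file is a tab-separated index-to-emoji list. A small offline sketch of the parsing step, with two made-up lines standing in for the downloaded file:

```python
import csv

# Two invented lines in the same tab-separated format as mapping.txt,
# plus a trailing empty line as produced by split("\n").
html = ["0\t\u2764", "1\t\U0001F60D", ""]
csvreader = csv.reader(html, delimiter='\t')
# Keep only well-formed rows (index, emoji) and extract the emoji column.
labels = [row[1] for row in csvreader if len(row) > 1]

print(labels)  # -> ['\u2764', '\U0001F60D']
```

The `len(row) > 1` guard skips the empty trailing line that `split("\n")` leaves behind.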
## Note: the code above is adapted from [cardiffnlp/twitter-roberta-base-emoji](https://huggingface.co/cardiffnlp/twitter-roberta-base-emoji)
|
Chungu424/repodata
|
[] | null |
{
"architectures": null,
"model_type": null,
"task_specific_params": {
"conversational": {
"max_length": null
},
"summarization": {
"early_stopping": null,
"length_penalty": null,
"max_length": null,
"min_length": null,
"no_repeat_ngram_size": null,
"num_beams": null,
"prefix": null
},
"text-generation": {
"do_sample": null,
"max_length": null
},
"translation_en_to_de": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_fr": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_ro": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
}
}
}
| 0 | null |
---
license: other
tags:
- generated_from_trainer
model-index:
- name: segformer-b0-finetuned-segments-sidewalk-oct-22
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# segformer-b0-finetuned-segments-sidewalk-oct-22
This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on a sidewalk semantic-segmentation dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 6e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 50
### Framework versions
- Transformers 4.26.1
- Pytorch 1.13.0+cu116
- Datasets 2.10.1
- Tokenizers 0.13.2
|
Ci/Pai
|
[] | null |
{
"architectures": null,
"model_type": null,
"task_specific_params": {
"conversational": {
"max_length": null
},
"summarization": {
"early_stopping": null,
"length_penalty": null,
"max_length": null,
"min_length": null,
"no_repeat_ngram_size": null,
"num_beams": null,
"prefix": null
},
"text-generation": {
"do_sample": null,
"max_length": null
},
"translation_en_to_de": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_fr": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_ro": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
}
}
}
| 0 | null |
---
license: apache-2.0
language:
- en
- es
- it
- fr
metrics:
- f1
---
# Federated Learning Based Multilingual Emoji Prediction
This repository contains code for training and evaluating transformer-based models for Uni/multilingual emoji prediction in clean and attack scenarios using Federated Learning. This work is described in the paper "Federated Learning-Based Multilingual Emoji Prediction in Clean and Attack Scenarios."
# Abstract
Federated learning is a growing field in the machine learning community due to its decentralized and private design. Model training in federated learning is distributed over multiple clients giving access to lots of client data while maintaining privacy. Then, a server aggregates the training done on these multiple clients without access to their data, which could be emojis widely used in any social media service and instant messaging platforms to express users' sentiments. This paper proposes federated learning-based multilingual emoji prediction in both clean and attack scenarios. Emoji prediction data have been crawled from both Twitter and SemEval emoji datasets. This data is used to train and evaluate different transformer model sizes including a sparsely activated transformer with either the assumption of clean data in all clients or poisoned data via label flipping attack in some clients. Experimental results on these models show that federated learning in either clean or attacked scenarios performs similarly to centralized training in multilingual emoji prediction on seen and unseen languages under different data sources and distributions. Our trained transformers perform better than other techniques on the SemEval emoji dataset in addition to the privacy as well as distributed benefits of federated learning.
# Performance
> * Acc : 48.196 %
> * Mac-F1 : 36.357 %
> * Also see our [GitHub Repo](https://github.com/kareemgamalmahmoud/FEDERATED-LEARNING-BASED-MULTILINGUAL-EMOJI-PREDICTION-IN-CLEAN-AND-ATTACK-SCENARIOS)
# Dependencies
> * Python 3.6+
> * PyTorch 1.7.0+
> * Transformers 4.0.0+
# Usage
> To use the model, first install the `transformers` package from Hugging Face:
```shell
pip install transformers
```
> Then, you can load the model and tokenizer using the following code:
```python
from transformers import AutoModelForSequenceClassification, AutoTokenizer
import numpy as np
import urllib.request
import csv
```
```python
MODEL = "Karim-Gamal/BERT-base-finetuned-emojis-1-client-toxic-FedAvg-IID-Fed"
tokenizer = AutoTokenizer.from_pretrained(MODEL)
model = AutoModelForSequenceClassification.from_pretrained(MODEL)
```
> Once you have the tokenizer and model, you can preprocess your text and pass it to the model for prediction:
```python
# Preprocess text (username and link placeholders)
def preprocess(text):
new_text = []
for t in text.split(" "):
t = '@user' if t.startswith('@') and len(t) > 1 else t
t = 'http' if t.startswith('http') else t
new_text.append(t)
return " ".join(new_text)
text = "Hello world"
text = preprocess(text)
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
scores = output[0][0].detach().numpy()
```
> The `scores` variable contains the raw classification logits for each of the possible emoji labels (apply a softmax if you need probabilities). To get the top k predictions, you can use the following code:
```python
# download label mapping
labels=[]
mapping_link = "https://raw.githubusercontent.com/cardiffnlp/tweeteval/main/datasets/emoji/mapping.txt"
with urllib.request.urlopen(mapping_link) as f:
html = f.read().decode('utf-8').split("\n")
csvreader = csv.reader(html, delimiter='\t')
labels = [row[1] for row in csvreader if len(row) > 1]
k = 3 # number of top predictions to show
ranking = np.argsort(scores)
ranking = ranking[::-1]
for i in range(k):
l = labels[ranking[i]]
s = scores[ranking[i]]
print(f"{i+1}) {l} {np.round(float(s), 4)}")
```
## Note: the code above is adapted from [cardiffnlp/twitter-roberta-base-emoji](https://huggingface.co/cardiffnlp/twitter-roberta-base-emoji)
|
CodeNinja1126/xlm-roberta-large-kor-mrc
|
[
"pytorch",
"xlm-roberta",
"question-answering",
"transformers",
"autotrain_compatible"
] |
question-answering
|
{
"architectures": [
"XLMRobertaForQuestionAnswering"
],
"model_type": "xlm-roberta",
"task_specific_params": {
"conversational": {
"max_length": null
},
"summarization": {
"early_stopping": null,
"length_penalty": null,
"max_length": null,
"min_length": null,
"no_repeat_ngram_size": null,
"num_beams": null,
"prefix": null
},
"text-generation": {
"do_sample": null,
"max_length": null
},
"translation_en_to_de": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_fr": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_ro": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
}
}
}
| 8 | 2023-03-07T14:17:24Z |
---
license: cc
---
- Runtime: https://github.com/libwaifu/waifu-2x
- Models: https://huggingface.co/oovm/waifu-2x
|
ComCom/gpt2-medium
|
[
"pytorch",
"gpt2",
"feature-extraction",
"transformers"
] |
feature-extraction
|
{
"architectures": [
"GPT2Model"
],
"model_type": "gpt2",
"task_specific_params": {
"conversational": {
"max_length": null
},
"summarization": {
"early_stopping": null,
"length_penalty": null,
"max_length": null,
"min_length": null,
"no_repeat_ngram_size": null,
"num_beams": null,
"prefix": null
},
"text-generation": {
"do_sample": true,
"max_length": 50
},
"translation_en_to_de": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_fr": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_ro": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
}
}
}
| 5 | null |
---
license: mit
language:
- de
metrics:
- cer
library_name: transformers
tags:
- kurrent
- ocr
- htr
- 16th century
- 17th century
- 18th century
- trocr
---
# TrOCR Kurrent-Model 16th to 18th century
Base model: **dh-unibe/trocr-kurrent**
Epochs: 19.85 / 20
Eval CER: 0.05673
Test CER: 0.05416
This model is based on an extensive training set of roughly 1,579,200 words and evaluated against the same hands in an evaluation and test set (automatic split).
The data consists of German Kurrent scripts written in the 16th to 18th centuries.
The ground truth stems from different projects and partners and is biased toward Swiss documents.
It is based on documents from a variety of archives and projects.
Among others, these include the State Archives of Zürich (Stillstandsprotokolle, Ratsmanuale, Findmittel), the scholarly edition project Königsfelden (Universities of Zurich and Bern: www.koenigsfelden.uzh.ch), and transcriptions from Einsiedeln.
Further contributions come from the university archives of Greifswald: https://rechtsprechung-im-ostseeraum.archiv.uni-greifswald.de/.
The public Transkribus model (based on PyLaia) can be found here: https://readcoop.eu/model/german-kurrent-16th-18th/
Extensive testing of the model has yet to be carried out.
This is only a first attempt, but it might help as a starting point for fine-tuning tasks.
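The CER values reported above are character error rates: the character-level edit distance between prediction and reference, divided by the reference length. A minimal, dependency-free sketch (the strings are invented examples; in practice a library such as `evaluate` or `jiwer` would normally be used):

```python
def cer(reference: str, hypothesis: str) -> float:
    """Character error rate: Levenshtein distance / len(reference)."""
    m, n = len(reference), len(hypothesis)
    # prev[j] holds the edit distance between reference[:i-1] and hypothesis[:j].
    prev = list(range(n + 1))
    for i in range(1, m + 1):
        cur = [i] + [0] * n
        for j in range(1, n + 1):
            cost = 0 if reference[i - 1] == hypothesis[j - 1] else 1
            cur[j] = min(prev[j] + 1,        # deletion
                         cur[j - 1] + 1,     # insertion
                         prev[j - 1] + cost) # substitution
        prev = cur
    return prev[n] / m

print(cer("kurrent", "kurent"))  # one deletion over 7 chars, roughly 0.143
```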
|
Cometasonmi451/Mine
|
[] | null |
{
"architectures": null,
"model_type": null,
"task_specific_params": {
"conversational": {
"max_length": null
},
"summarization": {
"early_stopping": null,
"length_penalty": null,
"max_length": null,
"min_length": null,
"no_repeat_ngram_size": null,
"num_beams": null,
"prefix": null
},
"text-generation": {
"do_sample": null,
"max_length": null
},
"translation_en_to_de": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_fr": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_ro": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
}
}
}
| 0 | null |
---
tags:
- unity-ml-agents
- ml-agents
- deep-reinforcement-learning
- reinforcement-learning
- ML-Agents-SnowballTarget
library_name: ml-agents
---
# **ppo** Agent playing **SnowballTarget**
This is a trained model of a **ppo** agent playing **SnowballTarget** using the [Unity ML-Agents Library](https://github.com/Unity-Technologies/ml-agents).
## Usage (with ML-Agents)
The Documentation: https://github.com/huggingface/ml-agents#get-started
We wrote a complete tutorial on training your first agent with ML-Agents and publishing it to the Hub.
### Resume the training
```
mlagents-learn <your_configuration_file_path.yaml> --run-id=<run_id> --resume
```
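For reference, a hypothetical minimal configuration file for this environment might look as follows (all hyperparameter values are illustrative placeholders, not the ones used for this run):

```yaml
behaviors:
  SnowballTarget:
    trainer_type: ppo
    hyperparameters:
      batch_size: 128        # illustrative value
      buffer_size: 2048      # illustrative value
      learning_rate: 3.0e-4  # illustrative value
    network_settings:
      hidden_units: 256
      num_layers: 2
    reward_signals:
      extrinsic:
        gamma: 0.99
        strength: 1.0
    max_steps: 200000
```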
### Watch your Agent play
You can watch your agent **playing directly in your browser**:
1. Go to https://huggingface.co/spaces/unity/ML-Agents-SnowballTarget
2. Write your model_id: slopezay/ppo-SnowballTarget1
3. Select your `.nn` or `.onnx` file
4. Click on Watch the agent play 👀
|
CrayonShinchan/bart_fine_tune_test
|
[] | null |
{
"architectures": null,
"model_type": null,
"task_specific_params": {
"conversational": {
"max_length": null
},
"summarization": {
"early_stopping": null,
"length_penalty": null,
"max_length": null,
"min_length": null,
"no_repeat_ngram_size": null,
"num_beams": null,
"prefix": null
},
"text-generation": {
"do_sample": null,
"max_length": null
},
"translation_en_to_de": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_fr": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_ro": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
}
}
}
| 0 | null |
---
tags:
- unity-ml-agents
- ml-agents
- deep-reinforcement-learning
- reinforcement-learning
- ML-Agents-Huggy
library_name: ml-agents
---
# **ppo** Agent playing **Huggy**
This is a trained model of a **ppo** agent playing **Huggy** using the [Unity ML-Agents Library](https://github.com/Unity-Technologies/ml-agents).
## Usage (with ML-Agents)
The Documentation: https://github.com/huggingface/ml-agents#get-started
We wrote a complete tutorial on training your first agent with ML-Agents and publishing it to the Hub.
### Resume the training
```
mlagents-learn <your_configuration_file_path.yaml> --run-id=<run_id> --resume
```
### Watch your Agent play
You can watch your agent **playing directly in your browser**:
1. Go to https://huggingface.co/spaces/unity/ML-Agents-Huggy
2. Write your model_id: shreyansjain/ppo-Huggy
3. Select your `.nn` or `.onnx` file
4. Click on Watch the agent play 👀
|
CurtisBowser/DialoGPT-medium-sora-two
|
[
"pytorch",
"conversational"
] |
conversational
|
{
"architectures": null,
"model_type": null,
"task_specific_params": {
"conversational": {
"max_length": null
},
"summarization": {
"early_stopping": null,
"length_penalty": null,
"max_length": null,
"min_length": null,
"no_repeat_ngram_size": null,
"num_beams": null,
"prefix": null
},
"text-generation": {
"do_sample": null,
"max_length": null
},
"translation_en_to_de": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_fr": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_ro": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
}
}
}
| 0 | null |
---
license: apache-2.0
---
# OFA-tiny
## Introduction
This is the **tiny** version of the OFA pretrained model, fine-tuned on VQAv2.
The directory includes 4 files: `config.json` (model configuration), `vocab.json` and `merge.txt` (for the OFA tokenizer), and `pytorch_model.bin` (model weights).
## How to use
Download the models as shown below.
```bash
git clone https://github.com/sohananisetty/OFA_VQA.git
git clone https://huggingface.co/SohanAnisetty/ofa-vqa-tiny
```
Afterwards, set `ckpt_dir` to the path of `ofa-vqa-tiny`, and prepare an image for the test example below.
```python
>>> from PIL import Image
>>> from torchvision import transforms
>>> from transformers import OFATokenizer, OFAModelForVQA
>>> mean, std = [0.5, 0.5, 0.5], [0.5, 0.5, 0.5]
>>> resolution = 256
>>> patch_resize_transform = transforms.Compose([
lambda image: image.convert("RGB"),
transforms.Resize((resolution, resolution), interpolation=Image.BICUBIC),
transforms.ToTensor(),
transforms.Normalize(mean=mean, std=std)
])
>>> tokenizer = OFATokenizer.from_pretrained(ckpt_dir)
>>> txt = " what does the image describe?"
>>> inputs = tokenizer([txt], return_tensors="pt").input_ids
>>> img = Image.open(path_to_image)
>>> patch_img = patch_resize_transform(img).unsqueeze(0)
>>> model = OFAModelForVQA.from_pretrained(ckpt_dir, use_cache=False)
>>> gen = model.generate(inputs, patch_images=patch_img, num_beams=5, no_repeat_ngram_size=3)
>>> print(tokenizer.batch_decode(gen, skip_special_tokens=True))
```
|
D3xter1922/distilbert-base-uncased-finetuned-cola
|
[] | null |
{
"architectures": null,
"model_type": null,
"task_specific_params": {
"conversational": {
"max_length": null
},
"summarization": {
"early_stopping": null,
"length_penalty": null,
"max_length": null,
"min_length": null,
"no_repeat_ngram_size": null,
"num_beams": null,
"prefix": null
},
"text-generation": {
"do_sample": null,
"max_length": null
},
"translation_en_to_de": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_fr": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_ro": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
}
}
}
| 0 | 2023-03-07T16:28:24Z |
---
tags:
- fastai
---
# Amazing!
🥳 Congratulations on hosting your fastai model on the Hugging Face Hub!
# Some next steps
1. Fill out this model card with more information (see the template below and the [documentation here](https://huggingface.co/docs/hub/model-repos))!
2. Create a demo in Gradio or Streamlit using 🤗 Spaces ([documentation here](https://huggingface.co/docs/hub/spaces)).
3. Join the fastai community on the [Fastai Discord](https://discord.com/invite/YKrxeNn)!
Greetings fellow fastlearner 🤝! Don't forget to delete this content from your model card.
---
# Model card
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
|
DARKVIP3R/DialoGPT-medium-Anakin
|
[
"pytorch",
"gpt2",
"text-generation",
"transformers",
"conversational"
] |
conversational
|
{
"architectures": [
"GPT2LMHeadModel"
],
"model_type": "gpt2",
"task_specific_params": {
"conversational": {
"max_length": 1000
},
"summarization": {
"early_stopping": null,
"length_penalty": null,
"max_length": null,
"min_length": null,
"no_repeat_ngram_size": null,
"num_beams": null,
"prefix": null
},
"text-generation": {
"do_sample": null,
"max_length": null
},
"translation_en_to_de": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_fr": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_ro": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
}
}
}
| 13 | 2023-03-07T16:38:27Z |
---
datasets:
- justinian336/salvadoran-news
language:
- es
metrics:
- rouge
pipeline_tag: summarization
---
|
DSI/personal_sentiment
|
[
"pytorch",
"bert",
"text-classification",
"transformers"
] |
text-classification
|
{
"architectures": [
"BertForSequenceClassification"
],
"model_type": "bert",
"task_specific_params": {
"conversational": {
"max_length": null
},
"summarization": {
"early_stopping": null,
"length_penalty": null,
"max_length": null,
"min_length": null,
"no_repeat_ngram_size": null,
"num_beams": null,
"prefix": null
},
"text-generation": {
"do_sample": null,
"max_length": null
},
"translation_en_to_de": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_fr": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_ro": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
}
}
}
| 25 | 2023-03-07T16:58:29Z |
---
language: en
thumbnail: http://www.huggingtweets.com/ipsd204/1678219365994/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/378800000330457629/d797c028b4269e9acdb5c8acc92aba30_400x400.png')">
</div>
<div
style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('')">
</div>
<div
style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Indian Prairie 204</div>
<div style="text-align: center; font-size: 14px;">@ipsd204</div>
</div>
I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).
Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!
## How does it work?
The model uses the following pipeline.

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).
## Training data
The model was trained on tweets from Indian Prairie 204.
| Data | Indian Prairie 204 |
| --- | --- |
| Tweets downloaded | 3246 |
| Retweets | 1069 |
| Short tweets | 20 |
| Tweets kept | 2157 |
[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/wkxgljgz/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @ipsd204's tweets.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/n8v7sa8x) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/n8v7sa8x/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingtweets/ipsd204')
generator("My dream is", num_return_sequences=5)
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Boris Dayma*
[](https://twitter.com/intent/follow?screen_name=borisdayma)
For more details, visit the project repository.
[](https://github.com/borisdayma/huggingtweets)
|
DTAI-KULeuven/robbertje-1-gb-bort
|
[
"pytorch",
"roberta",
"fill-mask",
"nl",
"dataset:oscar",
"dataset:oscar (NL)",
"dataset:dbrd",
"dataset:lassy-ud",
"dataset:europarl-mono",
"dataset:conll2002",
"arxiv:2101.05716",
"transformers",
"Dutch",
"Flemish",
"RoBERTa",
"RobBERT",
"RobBERTje",
"license:mit",
"autotrain_compatible"
] |
fill-mask
|
{
"architectures": [
"RobertaForMaskedLM"
],
"model_type": "roberta",
"task_specific_params": {
"conversational": {
"max_length": null
},
"summarization": {
"early_stopping": null,
"length_penalty": null,
"max_length": null,
"min_length": null,
"no_repeat_ngram_size": null,
"num_beams": null,
"prefix": null
},
"text-generation": {
"do_sample": null,
"max_length": null
},
"translation_en_to_de": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_fr": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_ro": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
}
}
}
| 6 | null |
---
tags:
- DemonAttack-v5
- deep-reinforcement-learning
- reinforcement-learning
- custom-implementation
library_name: cleanrl
model-index:
- name: PPO
results:
- task:
type: reinforcement-learning
name: reinforcement-learning
dataset:
name: DemonAttack-v5
type: DemonAttack-v5
metrics:
- type: mean_reward
value: 132494.50 +/- 471.53
name: mean_reward
verified: false
---
# (CleanRL) **PPO** Agent Playing **DemonAttack-v5**
This is a trained model of a PPO agent playing DemonAttack-v5.
The model was trained by using [CleanRL](https://github.com/vwxyzjn/cleanrl) and the most up-to-date training code can be
found [here](https://github.com/vwxyzjn/cleanrl/blob/master/cleanrl/cleanba_ppo_envpool_impala_atari_wrapper.py).
## Get Started
To use this model, please install the `cleanrl` package and run the evaluation script with the following commands:
```
pip install "cleanrl[jax,envpool,atari]"
python -m cleanrl_utils.enjoy --exp-name cleanba_ppo_envpool_impala_atari_wrapper --env-id DemonAttack-v5
```
Please refer to the [documentation](https://docs.cleanrl.dev/get-started/zoo/) for more detail.
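The `mean_reward` reported above (`132494.50 +/- 471.53`) is the mean and spread of episodic returns over evaluation episodes. A minimal sketch of that computation over hypothetical returns (the values are illustrative only, and whether the spread is a population or sample standard deviation depends on the evaluation script):

```python
import statistics

# Hypothetical episodic returns from an evaluation run (illustrative values only).
episodic_returns = [132000.0, 132500.0, 133000.0, 132400.0, 132600.0]

mean_reward = statistics.mean(episodic_returns)
std_reward = statistics.pstdev(episodic_returns)  # population standard deviation

print(f"{mean_reward:.2f} +/- {std_reward:.2f}")
```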
## Command to reproduce the training
```bash
curl -OL https://huggingface.co/cleanrl/DemonAttack-v5-cleanba_ppo_envpool_impala_atari_wrapper-seed2/raw/main/cleanba_ppo_envpool_impala_atari_wrapper.py
curl -OL https://huggingface.co/cleanrl/DemonAttack-v5-cleanba_ppo_envpool_impala_atari_wrapper-seed2/raw/main/pyproject.toml
curl -OL https://huggingface.co/cleanrl/DemonAttack-v5-cleanba_ppo_envpool_impala_atari_wrapper-seed2/raw/main/poetry.lock
poetry install --all-extras
python cleanba_ppo_envpool_impala_atari_wrapper.py --distributed --learner-device-ids 1 2 3 --track --wandb-project-name cleanba --save-model --upload-model --hf-entity cleanrl --env-id DemonAttack-v5 --seed 2
```
# Hyperparameters
```python
{'actor_device_ids': [0],
'actor_devices': ['gpu:0'],
'anneal_lr': True,
'async_batch_size': 20,
'async_update': 3,
'batch_size': 15360,
'capture_video': False,
'clip_coef': 0.1,
'concurrency': True,
'cuda': True,
'distributed': True,
'ent_coef': 0.01,
'env_id': 'DemonAttack-v5',
'exp_name': 'cleanba_ppo_envpool_impala_atari_wrapper',
'gae_lambda': 0.95,
'gamma': 0.99,
'global_learner_decices': ['gpu:1',
'gpu:2',
'gpu:3',
'gpu:5',
'gpu:6',
'gpu:7'],
'hf_entity': 'cleanrl',
'learner_device_ids': [1, 2, 3],
'learner_devices': ['gpu:1', 'gpu:2', 'gpu:3'],
'learning_rate': 0.00025,
'local_batch_size': 7680,
'local_minibatch_size': 1920,
'local_num_envs': 60,
'local_rank': 0,
'max_grad_norm': 0.5,
'minibatch_size': 3840,
'norm_adv': True,
'num_envs': 120,
'num_minibatches': 4,
'num_steps': 128,
'num_updates': 3255,
'profile': False,
'save_model': True,
'seed': 2,
'target_kl': None,
'test_actor_learner_throughput': False,
'torch_deterministic': True,
'total_timesteps': 50000000,
'track': True,
'update_epochs': 4,
'upload_model': True,
'vf_coef': 0.5,
'wandb_entity': None,
'wandb_project_name': 'cleanba',
'world_size': 2}
```
|
DTAI-KULeuven/robbertje-1-gb-non-shuffled
|
[
"pytorch",
"roberta",
"fill-mask",
"nl",
"dataset:oscar",
"dataset:dbrd",
"dataset:lassy-ud",
"dataset:europarl-mono",
"dataset:conll2002",
"arxiv:2101.05716",
"transformers",
"Dutch",
"Flemish",
"RoBERTa",
"RobBERT",
"RobBERTje",
"license:mit",
"autotrain_compatible"
] |
fill-mask
|
{
"architectures": [
"RobertaForMaskedLM"
],
"model_type": "roberta",
"task_specific_params": {
"conversational": {
"max_length": null
},
"summarization": {
"early_stopping": null,
"length_penalty": null,
"max_length": null,
"min_length": null,
"no_repeat_ngram_size": null,
"num_beams": null,
"prefix": null
},
"text-generation": {
"do_sample": null,
"max_length": null
},
"translation_en_to_de": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_fr": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_ro": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
}
}
}
| 53 | null |
---
tags:
- Asterix-v5
- deep-reinforcement-learning
- reinforcement-learning
- custom-implementation
library_name: cleanrl
model-index:
- name: PPO
results:
- task:
type: reinforcement-learning
name: reinforcement-learning
dataset:
name: Asterix-v5
type: Asterix-v5
metrics:
- type: mean_reward
value: 359500.00 +/- 436291.02
name: mean_reward
verified: false
---
# (CleanRL) **PPO** Agent Playing **Asterix-v5**
This is a trained model of a PPO agent playing Asterix-v5.
The model was trained by using [CleanRL](https://github.com/vwxyzjn/cleanrl) and the most up-to-date training code can be
found [here](https://github.com/vwxyzjn/cleanrl/blob/master/cleanrl/cleanba_ppo_envpool_impala_atari_wrapper.py).
## Get Started
To use this model, please install the `cleanrl` package and run the evaluation script with the following commands:
```
pip install "cleanrl[jax,envpool,atari]"
python -m cleanrl_utils.enjoy --exp-name cleanba_ppo_envpool_impala_atari_wrapper --env-id Asterix-v5
```
Please refer to the [documentation](https://docs.cleanrl.dev/get-started/zoo/) for more detail.
## Command to reproduce the training
```bash
curl -OL https://huggingface.co/cleanrl/Asterix-v5-cleanba_ppo_envpool_impala_atari_wrapper-seed3/raw/main/cleanba_ppo_envpool_impala_atari_wrapper.py
curl -OL https://huggingface.co/cleanrl/Asterix-v5-cleanba_ppo_envpool_impala_atari_wrapper-seed3/raw/main/pyproject.toml
curl -OL https://huggingface.co/cleanrl/Asterix-v5-cleanba_ppo_envpool_impala_atari_wrapper-seed3/raw/main/poetry.lock
poetry install --all-extras
python cleanba_ppo_envpool_impala_atari_wrapper.py --distributed --learner-device-ids 1 2 3 --track --wandb-project-name cleanba --save-model --upload-model --hf-entity cleanrl --env-id Asterix-v5 --seed 3
```
# Hyperparameters
```python
{'actor_device_ids': [0],
'actor_devices': ['gpu:0'],
'anneal_lr': True,
'async_batch_size': 20,
'async_update': 3,
'batch_size': 15360,
'capture_video': False,
'clip_coef': 0.1,
'concurrency': True,
'cuda': True,
'distributed': True,
'ent_coef': 0.01,
'env_id': 'Asterix-v5',
'exp_name': 'cleanba_ppo_envpool_impala_atari_wrapper',
'gae_lambda': 0.95,
'gamma': 0.99,
'global_learner_decices': ['gpu:1',
'gpu:2',
'gpu:3',
'gpu:5',
'gpu:6',
'gpu:7'],
'hf_entity': 'cleanrl',
'learner_device_ids': [1, 2, 3],
'learner_devices': ['gpu:1', 'gpu:2', 'gpu:3'],
'learning_rate': 0.00025,
'local_batch_size': 7680,
'local_minibatch_size': 1920,
'local_num_envs': 60,
'local_rank': 0,
'max_grad_norm': 0.5,
'minibatch_size': 3840,
'norm_adv': True,
'num_envs': 120,
'num_minibatches': 4,
'num_steps': 128,
'num_updates': 3255,
'profile': False,
'save_model': True,
'seed': 3,
'target_kl': None,
'test_actor_learner_throughput': False,
'torch_deterministic': True,
'total_timesteps': 50000000,
'track': True,
'update_epochs': 4,
'upload_model': True,
'vf_coef': 0.5,
'wandb_entity': None,
'wandb_project_name': 'cleanba',
'world_size': 2}
```
|
alexandrainst/da-binary-emotion-classification-base
|
[
"pytorch",
"tf",
"safetensors",
"bert",
"text-classification",
"da",
"transformers",
"license:cc-by-sa-4.0"
] |
text-classification
|
{
"architectures": [
"BertForSequenceClassification"
],
"model_type": "bert",
"task_specific_params": {
"conversational": {
"max_length": null
},
"summarization": {
"early_stopping": null,
"length_penalty": null,
"max_length": null,
"min_length": null,
"no_repeat_ngram_size": null,
"num_beams": null,
"prefix": null
},
"text-generation": {
"do_sample": null,
"max_length": null
},
"translation_en_to_de": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_fr": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_ro": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
}
}
}
| 1,066 | null |
---
tags:
- Asterix-v5
- deep-reinforcement-learning
- reinforcement-learning
- custom-implementation
library_name: cleanrl
model-index:
- name: PPO
results:
- task:
type: reinforcement-learning
name: reinforcement-learning
dataset:
name: Asterix-v5
type: Asterix-v5
metrics:
- type: mean_reward
value: 120150.00 +/- 16997.13
name: mean_reward
verified: false
---
# (CleanRL) **PPO** Agent Playing **Asterix-v5**
This is a trained model of a PPO agent playing Asterix-v5.
The model was trained by using [CleanRL](https://github.com/vwxyzjn/cleanrl) and the most up-to-date training code can be
found [here](https://github.com/vwxyzjn/cleanrl/blob/master/cleanrl/cleanba_ppo_envpool_impala_atari_wrapper.py).
## Get Started
To use this model, please install the `cleanrl` package and run the evaluation script with the following commands:
```
pip install "cleanrl[jax,envpool,atari]"
python -m cleanrl_utils.enjoy --exp-name cleanba_ppo_envpool_impala_atari_wrapper --env-id Asterix-v5
```
Please refer to the [documentation](https://docs.cleanrl.dev/get-started/zoo/) for more detail.
## Command to reproduce the training
```bash
curl -OL https://huggingface.co/cleanrl/Asterix-v5-cleanba_ppo_envpool_impala_atari_wrapper-seed1/raw/main/cleanba_ppo_envpool_impala_atari_wrapper.py
curl -OL https://huggingface.co/cleanrl/Asterix-v5-cleanba_ppo_envpool_impala_atari_wrapper-seed1/raw/main/pyproject.toml
curl -OL https://huggingface.co/cleanrl/Asterix-v5-cleanba_ppo_envpool_impala_atari_wrapper-seed1/raw/main/poetry.lock
poetry install --all-extras
python cleanba_ppo_envpool_impala_atari_wrapper.py --distributed --learner-device-ids 1 2 3 --track --wandb-project-name cleanba --save-model --upload-model --hf-entity cleanrl --env-id Asterix-v5 --seed 1
```
# Hyperparameters
```python
{'actor_device_ids': [0],
'actor_devices': ['gpu:0'],
'anneal_lr': True,
'async_batch_size': 20,
'async_update': 3,
'batch_size': 15360,
'capture_video': False,
'clip_coef': 0.1,
'concurrency': True,
'cuda': True,
'distributed': True,
'ent_coef': 0.01,
'env_id': 'Asterix-v5',
'exp_name': 'cleanba_ppo_envpool_impala_atari_wrapper',
'gae_lambda': 0.95,
'gamma': 0.99,
'global_learner_decices': ['gpu:1',
'gpu:2',
'gpu:3',
'gpu:5',
'gpu:6',
'gpu:7'],
'hf_entity': 'cleanrl',
'learner_device_ids': [1, 2, 3],
'learner_devices': ['gpu:1', 'gpu:2', 'gpu:3'],
'learning_rate': 0.00025,
'local_batch_size': 7680,
'local_minibatch_size': 1920,
'local_num_envs': 60,
'local_rank': 0,
'max_grad_norm': 0.5,
'minibatch_size': 3840,
'norm_adv': True,
'num_envs': 120,
'num_minibatches': 4,
'num_steps': 128,
'num_updates': 3255,
'profile': False,
'save_model': True,
'seed': 1,
'target_kl': None,
'test_actor_learner_throughput': False,
'torch_deterministic': True,
'total_timesteps': 50000000,
'track': True,
'update_epochs': 4,
'upload_model': True,
'vf_coef': 0.5,
'wandb_entity': None,
'wandb_project_name': 'cleanba',
'world_size': 2}
```
|
alexandrainst/da-hatespeech-classification-base
|
[
"pytorch",
"tf",
"safetensors",
"bert",
"text-classification",
"da",
"transformers",
"license:cc-by-sa-4.0"
] |
text-classification
|
{
"architectures": [
"BertForSequenceClassification"
],
"model_type": "bert",
"task_specific_params": {
"conversational": {
"max_length": null
},
"summarization": {
"early_stopping": null,
"length_penalty": null,
"max_length": null,
"min_length": null,
"no_repeat_ngram_size": null,
"num_beams": null,
"prefix": null
},
"text-generation": {
"do_sample": null,
"max_length": null
},
"translation_en_to_de": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_fr": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_ro": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
}
}
}
| 866 | null |
---
tags:
- DoubleDunk-v5
- deep-reinforcement-learning
- reinforcement-learning
- custom-implementation
library_name: cleanrl
model-index:
- name: PPO
results:
- task:
type: reinforcement-learning
name: reinforcement-learning
dataset:
name: DoubleDunk-v5
type: DoubleDunk-v5
metrics:
- type: mean_reward
value: 0.60 +/- 0.92
name: mean_reward
verified: false
---
# (CleanRL) **PPO** Agent Playing **DoubleDunk-v5**
This is a trained model of a PPO agent playing DoubleDunk-v5.
The model was trained by using [CleanRL](https://github.com/vwxyzjn/cleanrl) and the most up-to-date training code can be
found [here](https://github.com/vwxyzjn/cleanrl/blob/master/cleanrl/cleanba_ppo_envpool_impala_atari_wrapper.py).
## Get Started
To use this model, please install the `cleanrl` package and run the evaluation script with the following commands:
```
pip install "cleanrl[jax,envpool,atari]"
python -m cleanrl_utils.enjoy --exp-name cleanba_ppo_envpool_impala_atari_wrapper --env-id DoubleDunk-v5
```
Please refer to the [documentation](https://docs.cleanrl.dev/get-started/zoo/) for more detail.
## Command to reproduce the training
```bash
curl -OL https://huggingface.co/cleanrl/DoubleDunk-v5-cleanba_ppo_envpool_impala_atari_wrapper-seed2/raw/main/cleanba_ppo_envpool_impala_atari_wrapper.py
curl -OL https://huggingface.co/cleanrl/DoubleDunk-v5-cleanba_ppo_envpool_impala_atari_wrapper-seed2/raw/main/pyproject.toml
curl -OL https://huggingface.co/cleanrl/DoubleDunk-v5-cleanba_ppo_envpool_impala_atari_wrapper-seed2/raw/main/poetry.lock
poetry install --all-extras
python cleanba_ppo_envpool_impala_atari_wrapper.py --distributed --learner-device-ids 1 2 3 --track --wandb-project-name cleanba --save-model --upload-model --hf-entity cleanrl --env-id DoubleDunk-v5 --seed 2
```
# Hyperparameters
```python
{'actor_device_ids': [0],
'actor_devices': ['gpu:0'],
'anneal_lr': True,
'async_batch_size': 20,
'async_update': 3,
'batch_size': 15360,
'capture_video': False,
'clip_coef': 0.1,
'concurrency': True,
'cuda': True,
'distributed': True,
'ent_coef': 0.01,
'env_id': 'DoubleDunk-v5',
'exp_name': 'cleanba_ppo_envpool_impala_atari_wrapper',
'gae_lambda': 0.95,
'gamma': 0.99,
'global_learner_decices': ['gpu:1',
'gpu:2',
'gpu:3',
'gpu:5',
'gpu:6',
'gpu:7'],
'hf_entity': 'cleanrl',
'learner_device_ids': [1, 2, 3],
'learner_devices': ['gpu:1', 'gpu:2', 'gpu:3'],
'learning_rate': 0.00025,
'local_batch_size': 7680,
'local_minibatch_size': 1920,
'local_num_envs': 60,
'local_rank': 0,
'max_grad_norm': 0.5,
'minibatch_size': 3840,
'norm_adv': True,
'num_envs': 120,
'num_minibatches': 4,
'num_steps': 128,
'num_updates': 3255,
'profile': False,
'save_model': True,
'seed': 2,
'target_kl': None,
'test_actor_learner_throughput': False,
'torch_deterministic': True,
'total_timesteps': 50000000,
'track': True,
'update_epochs': 4,
'upload_model': True,
'vf_coef': 0.5,
'wandb_entity': None,
'wandb_project_name': 'cleanba',
'world_size': 2}
```
|
alexandrainst/da-ner-base
|
[
"pytorch",
"tf",
"bert",
"token-classification",
"da",
"dataset:dane",
"transformers",
"license:cc-by-sa-4.0",
"autotrain_compatible"
] |
token-classification
|
{
"architectures": [
"BertForTokenClassification"
],
"model_type": "bert",
"task_specific_params": {
"conversational": {
"max_length": null
},
"summarization": {
"early_stopping": null,
"length_penalty": null,
"max_length": null,
"min_length": null,
"no_repeat_ngram_size": null,
"num_beams": null,
"prefix": null
},
"text-generation": {
"do_sample": null,
"max_length": null
},
"translation_en_to_de": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_fr": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_ro": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
}
}
}
| 78 | null |
---
tags:
- Atlantis-v5
- deep-reinforcement-learning
- reinforcement-learning
- custom-implementation
library_name: cleanrl
model-index:
- name: PPO
results:
- task:
type: reinforcement-learning
name: reinforcement-learning
dataset:
name: Atlantis-v5
type: Atlantis-v5
metrics:
- type: mean_reward
value: 807920.00 +/- 18495.93
name: mean_reward
verified: false
---
# (CleanRL) **PPO** Agent Playing **Atlantis-v5**
This is a trained model of a PPO agent playing Atlantis-v5.
The model was trained by using [CleanRL](https://github.com/vwxyzjn/cleanrl) and the most up-to-date training code can be
found [here](https://github.com/vwxyzjn/cleanrl/blob/master/cleanrl/cleanba_ppo_envpool_impala_atari_wrapper.py).
## Get Started
To use this model, please install the `cleanrl` package and run the evaluation script with the following commands:
```
pip install "cleanrl[jax,envpool,atari]"
python -m cleanrl_utils.enjoy --exp-name cleanba_ppo_envpool_impala_atari_wrapper --env-id Atlantis-v5
```
Please refer to the [documentation](https://docs.cleanrl.dev/get-started/zoo/) for more detail.
## Command to reproduce the training
```bash
curl -OL https://huggingface.co/cleanrl/Atlantis-v5-cleanba_ppo_envpool_impala_atari_wrapper-seed3/raw/main/cleanba_ppo_envpool_impala_atari_wrapper.py
curl -OL https://huggingface.co/cleanrl/Atlantis-v5-cleanba_ppo_envpool_impala_atari_wrapper-seed3/raw/main/pyproject.toml
curl -OL https://huggingface.co/cleanrl/Atlantis-v5-cleanba_ppo_envpool_impala_atari_wrapper-seed3/raw/main/poetry.lock
poetry install --all-extras
python cleanba_ppo_envpool_impala_atari_wrapper.py --distributed --learner-device-ids 1 2 3 --track --wandb-project-name cleanba --save-model --upload-model --hf-entity cleanrl --env-id Atlantis-v5 --seed 3
```
# Hyperparameters
```python
{'actor_device_ids': [0],
'actor_devices': ['gpu:0'],
'anneal_lr': True,
'async_batch_size': 20,
'async_update': 3,
'batch_size': 15360,
'capture_video': False,
'clip_coef': 0.1,
'concurrency': True,
'cuda': True,
'distributed': True,
'ent_coef': 0.01,
'env_id': 'Atlantis-v5',
'exp_name': 'cleanba_ppo_envpool_impala_atari_wrapper',
'gae_lambda': 0.95,
'gamma': 0.99,
'global_learner_decices': ['gpu:1',
'gpu:2',
'gpu:3',
'gpu:5',
'gpu:6',
'gpu:7'],
'hf_entity': 'cleanrl',
'learner_device_ids': [1, 2, 3],
'learner_devices': ['gpu:1', 'gpu:2', 'gpu:3'],
'learning_rate': 0.00025,
'local_batch_size': 7680,
'local_minibatch_size': 1920,
'local_num_envs': 60,
'local_rank': 0,
'max_grad_norm': 0.5,
'minibatch_size': 3840,
'norm_adv': True,
'num_envs': 120,
'num_minibatches': 4,
'num_steps': 128,
'num_updates': 3255,
'profile': False,
'save_model': True,
'seed': 3,
'target_kl': None,
'test_actor_learner_throughput': False,
'torch_deterministic': True,
'total_timesteps': 50000000,
'track': True,
'update_epochs': 4,
'upload_model': True,
'vf_coef': 0.5,
'wandb_entity': None,
'wandb_project_name': 'cleanba',
'world_size': 2}
```
|
alexandrainst/da-subjectivivity-classification-base
|
[
"pytorch",
"tf",
"safetensors",
"bert",
"text-classification",
"da",
"dataset:DDSC/twitter-sent",
"dataset:DDSC/europarl",
"transformers",
"license:cc-by-sa-4.0"
] |
text-classification
|
{
"architectures": [
"BertForSequenceClassification"
],
"model_type": "bert",
"task_specific_params": {
"conversational": {
"max_length": null
},
"summarization": {
"early_stopping": null,
"length_penalty": null,
"max_length": null,
"min_length": null,
"no_repeat_ngram_size": null,
"num_beams": null,
"prefix": null
},
"text-generation": {
"do_sample": null,
"max_length": null
},
"translation_en_to_de": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_fr": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_ro": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
}
}
}
| 846 | null |
---
tags:
- Atlantis-v5
- deep-reinforcement-learning
- reinforcement-learning
- custom-implementation
library_name: cleanrl
model-index:
- name: PPO
results:
- task:
type: reinforcement-learning
name: reinforcement-learning
dataset:
name: Atlantis-v5
type: Atlantis-v5
metrics:
- type: mean_reward
value: 888770.00 +/- 35799.41
name: mean_reward
verified: false
---
# (CleanRL) **PPO** Agent Playing **Atlantis-v5**
This is a trained model of a PPO agent playing Atlantis-v5.
The model was trained by using [CleanRL](https://github.com/vwxyzjn/cleanrl) and the most up-to-date training code can be
found [here](https://github.com/vwxyzjn/cleanrl/blob/master/cleanrl/cleanba_ppo_envpool_impala_atari_wrapper.py).
## Get Started
To use this model, please install the `cleanrl` package and run the evaluation script with the following commands:
```
pip install "cleanrl[jax,envpool,atari]"
python -m cleanrl_utils.enjoy --exp-name cleanba_ppo_envpool_impala_atari_wrapper --env-id Atlantis-v5
```
Please refer to the [documentation](https://docs.cleanrl.dev/get-started/zoo/) for more detail.
## Command to reproduce the training
```bash
curl -OL https://huggingface.co/cleanrl/Atlantis-v5-cleanba_ppo_envpool_impala_atari_wrapper-seed2/raw/main/cleanba_ppo_envpool_impala_atari_wrapper.py
curl -OL https://huggingface.co/cleanrl/Atlantis-v5-cleanba_ppo_envpool_impala_atari_wrapper-seed2/raw/main/pyproject.toml
curl -OL https://huggingface.co/cleanrl/Atlantis-v5-cleanba_ppo_envpool_impala_atari_wrapper-seed2/raw/main/poetry.lock
poetry install --all-extras
python cleanba_ppo_envpool_impala_atari_wrapper.py --distributed --learner-device-ids 1 2 3 --track --wandb-project-name cleanba --save-model --upload-model --hf-entity cleanrl --env-id Atlantis-v5 --seed 2
```
# Hyperparameters
```python
{'actor_device_ids': [0],
'actor_devices': ['gpu:0'],
'anneal_lr': True,
'async_batch_size': 20,
'async_update': 3,
'batch_size': 15360,
'capture_video': False,
'clip_coef': 0.1,
'concurrency': True,
'cuda': True,
'distributed': True,
'ent_coef': 0.01,
'env_id': 'Atlantis-v5',
'exp_name': 'cleanba_ppo_envpool_impala_atari_wrapper',
'gae_lambda': 0.95,
'gamma': 0.99,
'global_learner_decices': ['gpu:1',
'gpu:2',
'gpu:3',
'gpu:5',
'gpu:6',
'gpu:7'],
'hf_entity': 'cleanrl',
'learner_device_ids': [1, 2, 3],
'learner_devices': ['gpu:1', 'gpu:2', 'gpu:3'],
'learning_rate': 0.00025,
'local_batch_size': 7680,
'local_minibatch_size': 1920,
'local_num_envs': 60,
'local_rank': 0,
'max_grad_norm': 0.5,
'minibatch_size': 3840,
'norm_adv': True,
'num_envs': 120,
'num_minibatches': 4,
'num_steps': 128,
'num_updates': 3255,
'profile': False,
'save_model': True,
'seed': 2,
'target_kl': None,
'test_actor_learner_throughput': False,
'torch_deterministic': True,
'total_timesteps': 50000000,
'track': True,
'update_epochs': 4,
'upload_model': True,
'vf_coef': 0.5,
'wandb_entity': None,
'wandb_project_name': 'cleanba',
'world_size': 2}
```
|
alexandrainst/da-hatespeech-detection-small
|
[
"pytorch",
"electra",
"text-classification",
"da",
"transformers",
"license:cc-by-4.0"
] |
text-classification
|
{
"architectures": [
"ElectraForSequenceClassification"
],
"model_type": "electra",
"task_specific_params": {
"conversational": {
"max_length": null
},
"summarization": {
"early_stopping": null,
"length_penalty": null,
"max_length": null,
"min_length": null,
"no_repeat_ngram_size": null,
"num_beams": null,
"prefix": null
},
"text-generation": {
"do_sample": null,
"max_length": null
},
"translation_en_to_de": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_fr": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
},
"translation_en_to_ro": {
"early_stopping": null,
"max_length": null,
"num_beams": null,
"prefix": null
}
}
}
| 1,506 | null |
---
tags:
- DoubleDunk-v5
- deep-reinforcement-learning
- reinforcement-learning
- custom-implementation
library_name: cleanrl
model-index:
- name: PPO
results:
- task:
type: reinforcement-learning
name: reinforcement-learning
dataset:
name: DoubleDunk-v5
type: DoubleDunk-v5
metrics:
- type: mean_reward
value: -0.40 +/- 2.15
name: mean_reward
verified: false
---
# (CleanRL) **PPO** Agent Playing **DoubleDunk-v5**
This is a trained model of a PPO agent playing DoubleDunk-v5.
The model was trained by using [CleanRL](https://github.com/vwxyzjn/cleanrl) and the most up-to-date training code can be
found [here](https://github.com/vwxyzjn/cleanrl/blob/master/cleanrl/cleanba_ppo_envpool_impala_atari_wrapper.py).
## Get Started
To use this model, please install the `cleanrl` package with the following command:
```bash
pip install "cleanrl[jax,envpool,atari]"
python -m cleanrl_utils.enjoy --exp-name cleanba_ppo_envpool_impala_atari_wrapper --env-id DoubleDunk-v5
```
Please refer to the [documentation](https://docs.cleanrl.dev/get-started/zoo/) for more detail.
## Command to reproduce the training
```bash
curl -OL https://huggingface.co/cleanrl/DoubleDunk-v5-cleanba_ppo_envpool_impala_atari_wrapper-seed1/raw/main/cleanba_ppo_envpool_impala_atari_wrapper.py
curl -OL https://huggingface.co/cleanrl/DoubleDunk-v5-cleanba_ppo_envpool_impala_atari_wrapper-seed1/raw/main/pyproject.toml
curl -OL https://huggingface.co/cleanrl/DoubleDunk-v5-cleanba_ppo_envpool_impala_atari_wrapper-seed1/raw/main/poetry.lock
poetry install --all-extras
python cleanba_ppo_envpool_impala_atari_wrapper.py --distributed --learner-device-ids 1 2 3 --track --wandb-project-name cleanba --save-model --upload-model --hf-entity cleanrl --env-id DoubleDunk-v5 --seed 1
```
# Hyperparameters
```python
{'actor_device_ids': [0],
'actor_devices': ['gpu:0'],
'anneal_lr': True,
'async_batch_size': 20,
'async_update': 3,
'batch_size': 15360,
'capture_video': False,
'clip_coef': 0.1,
'concurrency': True,
'cuda': True,
'distributed': True,
'ent_coef': 0.01,
'env_id': 'DoubleDunk-v5',
'exp_name': 'cleanba_ppo_envpool_impala_atari_wrapper',
'gae_lambda': 0.95,
'gamma': 0.99,
'global_learner_decices': ['gpu:1',
'gpu:2',
'gpu:3',
'gpu:5',
'gpu:6',
'gpu:7'],
'hf_entity': 'cleanrl',
'learner_device_ids': [1, 2, 3],
'learner_devices': ['gpu:1', 'gpu:2', 'gpu:3'],
'learning_rate': 0.00025,
'local_batch_size': 7680,
'local_minibatch_size': 1920,
'local_num_envs': 60,
'local_rank': 0,
'max_grad_norm': 0.5,
'minibatch_size': 3840,
'norm_adv': True,
'num_envs': 120,
'num_minibatches': 4,
'num_steps': 128,
'num_updates': 3255,
'profile': False,
'save_model': True,
'seed': 1,
'target_kl': None,
'test_actor_learner_throughput': False,
'torch_deterministic': True,
'total_timesteps': 50000000,
'track': True,
'update_epochs': 4,
'upload_model': True,
'vf_coef': 0.5,
'wandb_entity': None,
'wandb_project_name': 'cleanba',
'world_size': 2}
```
|