How to Get in the Perfect Mood for Coding
Improve your productivity with these emotional tips
Photo by Fabian Møller on Unsplash
Sitting in front of a computer to “just code” is not always that easy. We are humans too, with worried days and tired mornings. Programming is an emotional game to win. You have to control yourself to unlock your concentration abilities and get the job done.
Think about everything you’re asking of yourself when you’re doing it:
Study a problem and ideate a solution for it.
Apply DRY principles while balancing maintainability, scalability, and simplicity.
Try not to get distracted when you have Google ready to answer any dumb questions you might have at that moment.
Work under the pressure of an incoming deadline.
Coding is a mental game too, and you should care about being in an appropriate mood for it so that your days can be productive.
Here’s a list of my advice for reaching such a mood and getting the most out of your days.

https://medium.com/better-programming/how-to-get-in-the-perfect-mood-for-coding-21173dd084d | ['Piero Borrelli'] | 2020-10-28 10:47:44.715000+00:00 | ['Mental Health', 'Work', 'Programming', 'Startup', 'Technology']
9 Writing Lessons I Learned From Drafting My First Fiction Novel
Tips to consider if you’re interested in crafting a new story
Photo by Ben Karpinski on Unsplash
The idea first started over the summer.
I had just finished reading a book by a musician named Andrew Peterson where he talked about the creative process and why it was important for adults to retain the imaginations of their youth. He quoted two of my favorite authors, C.S. Lewis and Tolkien. That day in early June, in my daily journal, I wrote down a question that was starting to creep into the background of my brain:
What would it look like to try to write a fiction novel?
I’m an avid reader and have been writing for years now. I’ve written drafts of non-fiction books and worked in a myriad of the stages of publishing other people’s books, but I had never attempted to write my own fiction story.
Like most ideas, I sat on it and didn’t do anything about it. A few months later, a friend invited me to participate in a creative writing group focused on completing a specific writing project of our choice throughout the fall. I threw out that I’d write a series of short stories, still partially afraid to commit to the idea of an entire novel, and started working in my spare time.
As most writers can attest, once I got going, I kept going. I had a story I liked and characters that were starting to take shape. When I found myself thinking about the idea and the events about to unfold within the novel even when I wasn’t writing, that’s when I knew I was hooked.
By the end of October, I was sitting on nearly 60,000 words and a partial story. I had never really participated in National Novel Writing Month (NaNoWriMo) but thought this would be a good moment, considering I was already at work on a novel. Over the first 21 days of November, I wrote the remaining 75,000 words and finished the draft of my first fiction novel.
While writing, I learned so much about the ideas that go into getting a fiction novel down on paper. After combing back through the process and noting the speed bumps, stalls, and sticky spots, I’ve come up with nine core ideas to consider if you want to write a fiction novel as well.

https://medium.com/better-marketing/9-writing-lessons-i-learned-from-drafting-my-first-fiction-novel-bef878d564f0 | ['Jake Daghe'] | 2020-12-18 14:01:00.503000+00:00 | ['Fiction Writing', 'Storytelling', 'Fiction', 'Advice', 'Writing']
Understanding gender detransition

Gender detransition refers to reversing a gender transition, and reidentifying as the gender assigned at birth. I recently reached out to a detransitioner, Peter, to talk about something that I struggled to understand. Detransition has been used interchangeably with ‘sex change regret’, which is not the same thing. As a transwoman, I wished to better understand detransition, and have a respectful dialogue about it with Peter. It was not an easy thing to do. Peter detransitioned after questioning his political views, and started attending church once he started his detransition.
On the other hand, I transitioned nearly a decade ago, and during that time my political views went from progressive to conservative, and I recently became a regular church-goer. Except that I’m not detransitioning, nor interested in doing so. I caved in to detransitioning once due to social pressure, but it quickly felt very wrong and distressing. I’m never taking such a morally repugnant action again. So I asked Peter, who went from male to female then back to male, what changes in political and religious views have to do with gender identity and dysphoria, or lack thereof:
“I discovered that I was indoctrinated into leftist thinking, and when I started questioning the trans issue I came to the opinion that it also is an agenda, and I no longer believe in the whole concept that people can change their ‘gender’, that biological sex is it, and doesn’t change.
I experienced gender confusion from age 5, which became a regular feature of my life. I found out that I was a DES baby. My mum had taken the potent estrogen when I was in the womb, so I put my confusion down to the effects of the drug. When I found that out I wanted to detransition even more.”
DES refers to diethylstilbestrol, a drug that was prescribed to prevent miscarriages during the 1940–70s, albeit ineffectively. Peter referred to this video to elaborate: https://www.youtube.com/watch?v=3fjmnyq0n2s. I contested that gender confusion is not the same as gender dysphoria, but he clarified that he now calls it confusion, after having called it dysphoria post-transition. The conversation continued:
Dana: “Why do you think you called it dysphoria at the time, instead of confusion? Was it because you were indoctrinated into leftist thinking? If so, please elaborate.”
Peter: “Because it seemed to be an apt description at the time, now I call it confusion because I believe it’s more accurate. I would have benefited more from some kind of biological sex affirming treatment instead. When I found out about the category ‘transgender’ I thought that was me because of the confusion I felt. Now I think giving transgender affirming treatment is wrong, people should be helped to come to terms with their biological sex.”
Dana: “Do you think that because you believe that your experience is applicable to the experiences of others, including mine? Because my experience was dysphoria, not confusion. I’m sorry that the healthcare profession failed by not vetting you enough for transition candidature.”
Peter: “To me that is just semantics, I saw my experience as dysphoria before, now I judge it as confusion. The health profession doesn’t really vet people, it just relies on people’s self-assessment. At least that was my experience. I don’t think people should be given affirming treatment, instead they should be helped by biological sex affirming treatment, because I think the entire concept is wrong, that you can’t really change gender/sex. When I talk about detransitioning I often get accused of not being a real trans, but that wasn’t the case. I was trans for 20 years before I stopped believing the validity of the concept.”
Dana: “It isn’t semantics. Confusion refers to uncertainty about what is happening, intended, or required. Dysphoria refers to a state of unease or generalised dissatisfaction with life. When you said, “now I judge it as confusion”, it implies that you misjudged your experience. In other words, your experience wasn’t dysphoria. It seems to me that said healthcare professionals didn’t do their jobs properly, and for that I am sorry that they failed you.”
Peter: “Yes, but the way I’m using the words to describe my experience is entirely subjective to me, so at the time dysphoria was an accurate description, now looking back I prefer the word confusion. It seems to me that I was treated no differently from any candidate for treatment, in fact now it is even easier than ever, and affirming treatment is being pushed as the only option.”
Dana: “If it’s subjective, then what’s the objective description? What evidence or research do you know of that supports alternative options?”
Peter: “It’s difficult to be objective when we’re talking about people’s feelings. There has been research by different people, for example Zucker’s clinic in Canada had a high success rate with biological sex affirming treatment. Unfortunately that kind of approach has been shut down and research suppressed.”
I referred Peter to my gender transition memoir: https://link.medium.com/FWlVa3MyYX. I wanted to understand from his perspective, how biological sex affirming treatment would’ve helped me as an alternative. I also pointed out to him that at present, the literature points in the direction that being trans is likely to be innate, that gender identity is usually known by ages 3–5. Even Kenneth Zucker, an American-Canadian psychologist and sexologist who’s (in)famously fallen out of favour with the trans community, has agreed that at age 3, children begin to self-label and form their gender identity.
Zucker further elaborated in a 2015 CAMH Gender Identity Clinic for Children Review that “at age 15, the gender dysphoric child’s dysphoria will most likely persist, 70%-80% to be specific”. As an authority in North America on this subject matter, he was known to prescribe puberty blockers and later HRT for trans adolescents. I put to Peter that I think he’s misunderstood Zucker’s research. Zucker’s sacking from the clinic is not proof alone that “that kind of approach has been shut down and research suppressed”. Peter disagreed with my assessment:
“Biological sex affirming treatment means helping people feel comfortable with their biological sex. I’ve heard differently about Zucker. His research found that 80–95% of children with gender issues naturally came to accept their biological sex without any treatment, so that implies gender identity isn’t innate and can change over time. It’s not just about Zucker’s sacking and his clinic being shut down. Any biological sex affirming approach or research into it is routinely squashed.
I read your memoir. Interesting. My mum was more supportive, but I wish she had retained a traditional Catholic belief like your parents. Looking back I would’ve preferred the approach of your parents, though I would’ve hated it at the time.”
I referred Peter to research supporting his case that hasn’t been quashed, which includes https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0202330 and https://www.tandfonline.com/doi/full/10.1080/00332925.2017.1350804. But I argued that trans children should be allowed to socially transition genders, and trans adolescents should be allowed to hormonally transition genders in a careful and medically appropriate manner. I then elaborated:
“Despite attempts to prove that trans children and adolescents can grow out of being transgender, the proof of that, which has been thrown around in public discourse, is flawed. As I’ve pointed out to you Peter, some studies seem to show that lots of young trans children change their mind. What these studies do is that they randomly take a group of children from gender clinics and follow them, only to seemingly find that most aren’t trans when they grow up. But what does that mean?
It means that a lot of these studies are just studying children, at random, that attend these gender clinics, without differentiating between those who have a gender dysphoria diagnosis, those who identify as trans, with or without diagnosis, and those who don’t identify as trans at all. All these children attended these gender clinics for a wide range of reasons, not just for gender dysphoria diagnosis. So the next time you hear the argument that “60–90% of children will naturally grow out of it”, it’s because that 60–90% weren’t trans to begin with. In fact, many of these 60–90% are LGB(-T)QIA in some way, shape or form, just not T.
The 10–40% don’t deserve to be forgotten — they deserve gender identity presentation alignment as appropriate, not denial of transition treatment. It’s worth noting that during the child’s formative years, the most rapid cognitive and emotional growth occurs. We now know that children’s physical and emotional environments dramatically impact the development of their nervous system. This is especially true of the brain and has profound implications for their psychological health as adults. Let’s get it right for the 10–40%.”
Despite that, Peter couldn’t agree, claiming that from his reading and reasoning, transgenderism is an agenda that’s being pushed, with the science behind it questionable. Specifically, most studies he’s looked at are allegedly biased towards affirming treatment, and that dissenting views are not given adequate consideration. Admittedly, I don’t believe detransitioners have been served well, given that WPATH’s (World Professional Association of Transgender Health) Standards of Care don’t offer detransition guidelines, even though a majority of WPATH surgeons want to see such guidelines included. I did find it interesting that Peter proceeded to accuse me of entrenching my views and showing no interest in learning about his views. I then clarified:
“To be fair, it appears that both of us have entrenched views to varying degrees, whether right or wrong. Hence this conversation. I’ve been hearing your views, but that doesn’t mean you’re free from any challenges from me. Otherwise why would I ask to begin with? A good learner doesn’t rest on their laurels, that’s for sure. If you haven’t already, be prepared to be challenged, because I’ve been prepared for your challenges.
Transgenderism could be an agenda that’s pushed, but not necessarily. Please elaborate on why the science is questionable. Is the bias towards affirming treatment because there’s an agenda, or is it because the evidence for non-affirming treatment consistently unconvincing? I’ve provided you with references to recent dissenting views which have been given adequate consideration, I think. Personally, I find them unconvincing, but I’m happy to hear you argue otherwise.”
And so he did. Peter argued that the entire medical field has bought the affirming paradigm, and that doctors who dissent are consistently silenced or fired:
“The dissenting science is often shut down before it can do any studies. Meanwhile, the affirming camp refuse to look into detransition or any dissenting views, saying they are too dangerous to be given air time, as they could cause trans people to be suicidal. But this is just one small aspect of the sexual revolution that has been raging for decades. This video is a doozy, looking at the subject from a social and political philosophical point of view: https://www.youtube.com/watch?v=QPVNxYkawao.”
I watched the YouTube video, which was a presentation by Rebecca Reilly-Cooper critically examining the doctrine of gender identity. Rebecca appeared ignorant of the neuroscience behind gender identity, and of the growing genetic research on the matter. I put to Peter that preliminary neuroscientific research has emerged over the years, and more recently, indicating that the brains of trans adults and children resemble their gender identity, not their apparent ‘biological sex’. If the brain acts as a sex organ, which it does, perhaps trans people are indeed intersex.
If this sounds incomprehensible, it’s because we’re currently in the middle of an explosion of brain research, which has greatly enhanced our understanding of the human mind. Stay tuned for more to come. There is also a recent piece of preliminary genetic research which indicates that “certain ‘versions’ of 12 different genes were significantly overrepresented in transgender women”. One study published a few years ago looked at identical twins and found that when one twin is transgender, the other twin is too 40% of the time, which is genetically significant. There are even case reports of twins raised apart and both coming out as trans.
Of note, the Royal Children’s Hospital in Melbourne, Australia, has seen more than 700 children diagnosed with gender dysphoria, and only 4% of those children ‘grow out of it’. 96% of those diagnosed as trans as children remained so at late adolescence. On that alone, it appears that the medical field hasn’t bought the affirming paradigm completely, as they do acknowledge the 4%. So the real question is, what do we do about the 4%? I put to Peter that I think he has the answer. I also put to him that:
“I do agree that dissenting doctors should not be silenced or fired. But the silencing or firing that has happened is not necessarily an indicator of a cover-up. The dissenting science is given consideration in ongoing longitudinal studies. If the longitudinal studies will support dissenting views, then it will show.”
Peter reiterated that it’s clear that dissenting studies are strongly discouraged, and that amongst gender affirmers, such as myself, there is a bias against dissenting views and a bias towards the affirming paradigm, ensuring their studies have questionable scientific value. I disagreed, and he continued:
“I have noticed the presence of bias in the scientific world frequently. In the field of transgender studies there’s a glut of light and fluffy affirming studies and no or very little scientific dissent apparent, which is necessary for good science. There are accusations of self-peer reviewing and bias in peer reviews. I wouldn’t trust their conclusions when they seem desperate to prove ‘born that way’. The subjectivity of gender as discussed in the Rebecca Reilly-Cooper video exposes the trans movement as ideologically based, and one that is currently elevating the subjective feeling of gender identity over the reality of biological sex, changing our society. This is typical of the leftist thinking that also produces ‘science’ that denies the existence of race, and shuts down studies in the area by calling it ‘racist’.
Dissenting science in the trans field is simply called ‘transphobic’. You may claim I must not have been a real trans (and that is the reason I detransitioned), but I can assure you I had the dysphoria we spoke of earlier, transitioned, and lived happily for 20 years before questioning the entire concept. My experience is evidence that it is possible to find a way to end the dysphoria and accept our biological sex, and I can assure you that I wasn’t ‘born that way’, and I bring the message that others can also affirm their biology over their gender identity, that it can be overcome.
Overcoming dysphoria is being able to feel comfortable with the biology you were born with, in my case taking testosterone has helped with this, which leads me to think that low levels of testosterone contributed to my hatred of being male. Which also suggests people who are given opposite sex hormones would contribute towards strengthening their trans identity. Overcoming the need to go against your innate biology would be different for each person, because there are many causes, including psychological/trauma that contribute towards gender dysphoria. Reparative therapy is a good model to use, as well as the hormonal side. Such therapy should be encouraged, in fact there is no reason why biological sex affirming treatment shouldn’t be tried first and studied for effectiveness, and studies of this sort are very few: there is a huge ideological bias against this treatment.”
I have no idea what my testosterone levels were by early adulthood. Regardless, wouldn’t you think then that my male puberty would’ve reduced my gender dysphoria, not increased it? It is my position that increasing testosterone levels only made my dysphoria worse. It is the psych’s role to address comorbidity issues, which they do. Of course, the not-so-good psychs won’t address comorbidity issues, but there are bad apples around wherever you go. I did not experience any psychological trauma growing up that contributed towards my dysphoria. Rather, it was the other way around: the neglectful decision not to treat my dysphoria in childhood by means of transition exacerbated my dysphoria further than needed. I wanted to know from Peter what effective reparative therapy looks like:
“Everyone’s different. I’m not here to judge or dispute the experience that is real for you, but to use my experience and logical conclusions to say there is another way to look at it, if they so choose. I know of many cases of successful reparative therapy and heaps more ex-homosexuals who have reclaimed their lives through Jesus, and I know many detransitioners who have decided to leave transgenderism, and feel much better for it, including me.”
I tried detransition once due to social pressure, and I am never trying it ever again. I can’t see this matter any other way other than transition for myself. My final word: detransition stories can be far more complicated than tabloid headlines would have you believe, sometimes distorted and abused. Peter is a pseudonym, and we agreed to disagree.

https://danapham-au.medium.com/understanding-gender-detransition-98768223a800 | ['Dana Pham'] | 2019-07-30 21:05:02.715000+00:00 | ['LGBTQ', 'Mental Health', 'Psychology', 'Transgender', 'Detransition']
What Nobody Tells You About Your Worthiness

Let’s talk about inherent worthiness for a sec.
“You are inherently worthy.”
What does that statement make you feel? Like, really feel?
For years, this statement made me feel…kind of good? (I guess). But there was always a question quick on the heels:
Then why don’t I feel like it?
The answer: Because I didn’t actually believe it.
How could I?
I’ve spent my life believing my worthiness is something to be proved through good performance. And even deeper, I had a hunch there was something inherently wrong with me — literally the opposite of inherent worthiness.
All the more reason to constantly prove my worthiness so folks won’t be tempted to pull back the curtain to see the wounded child for the wizard I made them think I was.
Someone telling me I am inherently worthy wasn’t enough to rewrite a lifetime of the opposite belief.
No matter how often I hear it and no matter how hard I try to believe their words, I never do.
Who can relate? We are legion, those of us with hearts convinced of our unworthiness.
But rest assured, this is just a chapter of the story of us coming back to our light. There’s a way to feel worthy again. It both requires nothing and everything from you.
What blocks you from experiencing your worthiness.
Whenever I sink into the feeling of my inherent unworthiness, I go to the darkest place inside of me. A dragon lives in this place and it’s terrifying.
Recently, though, I realized this dragon was the manifestation of my power turned inward on me because I didn’t believe it was safe to experience my power in the outer world.
Once I realized this, I got curious about it and started listening to what it had to say. My mistake, because it began asking me some very hard questions. As you read them, know the dragon inside of you is asking you the same ones:
How many more moments of your one life do you want to spend feeling unworthy?
How many more moments do you want to spend comparing yourself to others?
How many more moments do you want to spend proving your worth instead of living your life?
None. None more moments.
“So are you going to count yourself worthy, or not?” asked the dragon, its voice echoing deep into the dark caverns of my heart.
After much consideration, I replied, “I’ll have to get back to you on that.”
Wah-wah.
But at least I was honest.
There were too many reasons not to count myself worthy. For instance…
What about the terrible things I’ve done? If I count myself worthy despite those, isn’t that letting myself off too easy? Doesn’t that give me more precedence to do terrible things if I’m worthy no matter what?
What about that broken part of me I can never seem to fix? Don’t I need to do more work to fix that broken part? How can I be worthy and broken? Counting myself worthy seems like spiritual bypassing or living in a fantasy.
What if people don’t like who I am after I count myself worthy? What if I end up even more lonely after counting myself unconditionally worthy? If I count myself worthy, it means I don’t have to play society’s worthiness games — the games that make up the foundation of our culture. Who am I if I am not trying to prove my worth or make myself more likable and palatable to others? What if people don’t like the new me — the me after I’ve stepped out of the matrix? What if I find myself more lonely than before?
The dragon answered these rebuttals with the following:
What about the terrible things I’ve done?
A big part of your transition from surviving to thriving is forgiving yourself for what you’ve had to do in survival mode. Forgive yourself for what you’ve done when you believed you were unworthy. People who believe they are unworthy do things unworthy of their true character. Count yourself worthy, so you can break the cycle and do things that are worthy of your true character.
What about that broken part of me I can never seem to fix?
The only way to experience your wholeness is to count yourself worthy even in your imperfection. And at any rate — realizing you are already whole is what triggers your parts to begin to healing themselves. Counting yourself worthy even though you feel broken is the point: You do not have to be fixed in order to be worthy. You just have to be you.
What if people don’t like who I am after I count myself worthy?
Ah, here we are. The heart of it. You’ve been taught it isn’t okay to be you, and the more you you are, the more abandoned you’ll be. The real tragedy is that you had to believe them in order to survive, escape from crippling shame, get your needs met, and experience what sliver of love you could. It’s natural you are afraid of counting yourself worthy in a world that has told you it isn’t safe to do so. But in order to thrive, you must be brave enough to count yourself worthy, otherwise you’ll stay stuck surviving for the rest of your life. It’s a risk to count yourself worthy no matter what, but the biggest thing that blocks you from thriving is your commitment to the perceived safety of survival mode. Count yourself worthy, and thrive.
Experiencing your worthiness requires everything and nothing.
Experiencing my worthiness requires nothing from me in the sense that there is nothing I need to prove. There is nothing I need to “keep” up. I can just be my natural self in each moment.
Yet experiencing my worthiness requires everything from me in the sense that I must step out of the home I’ve made for myself in survival mode. I must take the risk to drop the armor of my defenses and learn what it means to embody my essence. I must be brave enough to count myself worthy, just because I am me, and say goodbye to everything that does not serve this truth. I must be brave enough to thrive.
How much more comforting it would be to stay in survival mode, playing the same old worthiness games, safe behind the armor of my defense mechanisms.
But we have one life to live.
How many more moments will you live in survival mode when you could count yourself worthy and thrive right now?

https://medium.com/real-1-0/what-nobody-tells-you-about-your-worthiness-4a090e923d43 | ['Jordin James'] | 2020-11-26 17:22:56.972000+00:00 | ['Self', 'Spirituality', 'Psychology', 'Mental Health', 'Advice']
3 Levels of Prototyping and When to Use Them

Prototypes are created to save time, communicate better and focus on how the product will actually work.
Prototypes are often created early on and used for user testing, or done through code to understand the feasibility of a technology. Both are extremely important parts of the product development process. They help with understanding user flows, feeling out interactions, communicating the desired experience with the broader team, raising money, and more.
“If a picture is worth a thousand words, a prototype is worth a thousand meetings” — IDEO
Level 1: Click Through Prototypes
The prototype shown above is composed of around 25 images that are linked together from invisible buttons that you can tap on. You can see that some screens slide in from the side or bottom and you can scroll with fixed navigation. These are basic functionalities that help mimic the feel of a truly mobile experience.
Many design programs such as Sketch, Figma, and XD allow you to build click-through prototypes right in their apps. InVision is a popular online tool that allows you to create and share these prototypes with the world, and there is even a tool called POP that allows you to make prototypes from drawings on paper.
Pros
— Very quick
— Easy to create
— Easily shared
— Free tools available
Cons
— Limited interactions
— Static images only
— Can’t access device inputs like camera and keyboard
— No logic
— No gestures
— Can become hard to maintain
Why create a click through prototype?
Even though click through prototypes have limitations, they still serve a vital role within the design process. I like to use click through prototypes early on in the design process to find answers to the early questions. I will often prototype out several takes of an experience to see how the content fits and feels on a mobile device or how I can break it up into steps or screens. These prototypes are great for exploring early concepts, user testing, getting buy in from team members, and communicating an overall strategy.
Even at this low fidelity state, a hands on experience is far easier to understand than written notes or several slides in pitch deck.
With click-through prototypes, it is common to build them at various levels of fidelity. Anything from pen and paper through to high-fidelity screens is fine to use. The main purpose is to understand the flow and get a feel for how the mobile application connects all the parts.

https://medium.com/swiftkickmobile/3-levels-of-prototyping-and-when-to-use-them-735f17bf84e2 | ['Andrew Acree'] | 2020-08-10 16:22:25.703000+00:00 | ['Mobile App Development', 'Mobile Apps', 'Prototyping', 'Design', 'UX']
Why Pandas itertuples() Is Faster Than iterrows() and How To Make It Even Faster

Introduction
In this article, I will explain why pandas’ itertuples() function is faster than iterrows(). More importantly, I will share the tools and techniques I used to uncover the source of the bottleneck in iterrows(). By the end of this article, you will be equipped with the basic tools to profile and optimize your Python code.
The code to reproduce the results described in this article is available here. I assume the reader has a decent amount of experience writing Python code for production use.
Motivation
Imagine you are in this scenario:
You are a data scientist tasked with building a web API to classify whether a picture contains a cat given a batch of images. You decide to use Django to build the API component and, to keep things simple, embed the image classifier code in the same codebase too. You spend a couple of weeks working on this project only to find that your web app is too slow for production use. You consult your colleague, who is a software engineer, for advice. That colleague tells you that Python is slow and that for anything API-related, Go is the tool of choice.
Do you rewrite everything in Go (including learning a new web framework) or do you try to systematically identify what is causing your Python code to run slowly?
I’ve seen many data scientists who favour the former option despite a very steep learning curve because they do not know how to troubleshoot their code’s running time. I hope this article will change that and will stop people from needlessly abandoning Python.
Problem Statement
To make things more concrete, we will use this scenario as a running example for the rest of this article:
You’d like to populate the content of a container based on the content of a dataframe. For simplicity, let the container be a dictionary keeping track of the count of observations in the dataframe. For example, if this is the dataframe you are given:
Figure 1: Sample dataframe
then the content of the dictionary will look like this:
Figure 2: Sample output
The dictionary’s key can be a tuple of (col1, col2) or another dictionary where the first key is col1 and the second key is col2. The exact implementation details don’t matter. The point here is that you want a dictionary that tracks the count of all possible pairs in col1 and col2.
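For concreteness, here is a minimal sketch of the setup and the expected result, assuming the column names col1 and col2 from Figure 1 (the values are illustrative):

```python
import pandas as pd

# A small dataframe mirroring Figure 1.
df = pd.DataFrame({"col1": ["A", "A", "B", "B", "C"],
                   "col2": ["E", "F", "F", "G", "G"]})

# The goal: a dictionary counting every (col1, col2) pair, e.g.
# {("A", "E"): 1, ("A", "F"): 1, ("B", "F"): 1, ("B", "G"): 1, ("C", "G"): 1}
```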
Solution
Iterrows() Solution
Here’s what an iterrows() solution would look like given the problem statement described in the preceding section:
Figure 3: Solution using iterrows()
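Since Figure 3 is an image, here is a minimal sketch of what an iterrows()-based implementation looks like; big_df and the exact counting logic are assumptions consistent with the problem statement:

```python
from collections import defaultdict

counts = defaultdict(int)
for _, row in big_df.iterrows():
    # Each row is materialized as a pandas Series, which is where
    # much of the overhead comes from.
    counts[(row["col1"], row["col2"])] += 1
```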
big_df is a data frame whose content is similar to Figure 1 except that it has 3 million rows instead of 5.
On my machine, this solution took almost 12 minutes to execute.
Itertuples() Solution
Here’s what an itertuples() solution would look like:
Figure 4: Solution using itertuples()
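Again as a hedged reconstruction of Figure 4, the itertuples() version differs only in how rows are produced and accessed:

```python
from collections import defaultdict

counts = defaultdict(int)
for row in big_df.itertuples():
    # Each row is a namedtuple, so attribute access is plain Python.
    counts[(row.col1, row.col2)] += 1
```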
This solution only took 8.68 seconds to execute which is about 83x faster compared to the iterrows() solution.
Analysis
So why is itertuples() so much faster compared to iterrows()?
The starting point to understand the difference in speed is to run these solutions through a profiler. A profiler is a tool that will execute a given piece of code while keeping track of the number of times each function is called and its execution time. That way, you can start your optimization process by focussing your attention on the function(s) that consume the most time.
Python comes with a built-in profiler that can be conveniently called from a Jupyter notebook using the %%prun cell magic.
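As a sketch, profiling a cell looks like this; the body of the cell is whichever solution you want to measure:

```python
%%prun
# cProfile runs everything in this cell and prints per-function stats.
counts = {}
for row in big_df.itertuples():
    key = (row.col1, row.col2)
    counts[key] = counts.get(key, 0) + 1
```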
Let’s reduce big_df to just 1,000 rows and look at what are the top 10 functions that took the most time to execute in total under each solution. Here are the results:
Figure 5: Top 10 functions in the itertuples() solution with the longest total execution time
Figure 6: Top 10 functions in the iterrows() solution with the longest execution time
There’s a lot of information to unpack here so for brevity, I will focus on the parts that are relevant to our problem, starting with Figure 5. I encourage the reader to read the profile module’s documentation to understand what the rest of the output means.
According to Figure 5, the itertuples() solution made 3,935 function calls in 0.003 seconds to process 1,000 rows. The function that took up the most execution time was _make which was called 1,000 times, consuming 0.001 seconds of the execution time. This function belongs to the collections module and is defined here.
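To see what _make does in isolation, here is a small illustration; the class and field names are assumptions matching what itertuples() generates by default:

```python
from collections import namedtuple

# itertuples() builds a namedtuple class like this under the hood.
Pandas = namedtuple("Pandas", ["Index", "col1", "col2"])

# One _make call per row, hence 1,000 calls for 1,000 rows.
row = Pandas._make([0, "A", "E"])
```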
_make just creates a tuple out of an iterable and since we have 1,000 rows, it makes sense that this function gets called 1,000 times (the iterable in each call being a row in our dataframe). Noting that the total time that this solution took is 0.003 seconds and rest of the functions took 0 seconds, let’s proceed to analyzing the output in Figure 6.
Figure 6 shows that the iterrows() solution made 295,280 function calls in 0.254 seconds. Compared to the itertuples() solution, all top 10 functions in the iterrows() solution have non-zero tottime values. Moreover, the actual call to iterrows() is not even in the list of 10 top functions that took the longest to execute. In contrast, the call to itertuples() in the itertuples() solution is ranked at position 7 in Figure 5.
This suggests that there is a lot of overhead associated with the call to iterrows(). Looking at the list of functions being called, we see that this overhead pertains to type-checking code, e.g. is_instance and _check in the first and second row respectively. You can verify that this is the case by manually stepping through an execution of iterrows() using a debugger.
So there you have it. The reason iterrows() is slower than itertuples() is due to iterrows() doing a lot of type checks in the lifetime of its call. Now let’s see what we can do with this insight.
Application: Building A Faster Solution
Suppose we didn’t know the function itertuples() exists. What can we do to improve the row iteration performance? Well, in the preceding section, we have identified that the bottleneck is due to excessive type checking so a good first attempt at a solution is to create a data structure that does not do type checks. Here’s an example:
Figure 7: An attempt to iterate faster than iterrows()
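Since Figure 7 is an image, here is a hedged reconstruction of the custom solution (variable names are assumptions):

```python
from collections import defaultdict

counts = defaultdict(int)
# zip yields plain tuples straight from the column values: no Series
# construction, no namedtuple creation, and no per-row type checks.
for col1, col2 in zip(big_df["col1"], big_df["col2"]):
    counts[(col1, col2)] += 1
```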
Line 3 of Figure 7 shows that we create our rows to iterate over by simply zipping the relevant columns. This solution only took 5 seconds to execute over 3 million rows, which is almost twice as fast as the itertuples() solution. Let’s call our solution the custom solution and profile it to see if we can identify the source of the speedup.
Here’s the top 10 functions that took the most time to execute in our custom solution on a dataframe of 1,000 rows:
Figure 8: Top 10 functions in the custom solution with the longest execution time
What is striking about Figure 8 is that it shows the custom solution only made 233 function calls in 0.002 seconds. This is surprising to me since I expected at least 1,000 calls since we are still iterating over 1,000 rows.
Let’s see which function is called the most by sorting the ncalls column in descending order:
Figure 9: Same thing as Figure 8 except sorted by ncalls
Figure 9 shows that the most called function is isinstance which was called only 39 times. This still does not provide any useful information to figure out how the iteration was done with a total of fewer than 1,000 function calls.
Another useful profiling technique is to profile the lines of our code, i.e. see how many times each line is executed and how long it took. Jupyter has a line magic called %lprun which comes with the line_profiler package.
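A hedged sketch of how this could be invoked (the function name here is illustrative):

%load_ext line_profiler

def count_pairs(df):
    content = {}
    for col1, col2 in zip(df["col1"], df["col2"]):
        content.setdefault(col1, {}).setdefault(col2, 0)
        content[col1][col2] += 1
    return content

%lprun -f count_pairs count_pairs(big_df)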
Here’s what the line profile looks like for our custom solution:
Figure 10: Line profile of the custom solution (Time column is in microseconds)
As expected, we see that the iteration does happen 1,000 times (line 12). This suggests that iterating over n rows does not necessarily mean having to call a function n times. So the next logical question to ask is: Who is calling _make in Figure 5 1,000 times and is there any way we can avoid/reduce the number of calls?
Fortunately for us, Python comes with a pstats module that allows us to dig deeper into the output of a function profile run. I refer the reader to the code accompanying this article for details on how to get this information. Anyway, here are all the functions that called _make :
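For example, assuming the profile was first dumped to a file with %%prun -D itertuples.prof , the callers could be listed along these lines (the filename is hypothetical):

import pstats

stats = pstats.Stats("itertuples.prof")
stats.print_callers("_make")  # list every function that called _make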
Figure 11: The functions that called _make
In this case, the output is not useful at all ( <string>:1(<module>) refers to the top-level code for the “script” passed to the profiler, which is the content of the entire cell implementing the itertuples() solution).
Another approach to figure out who is calling _make is by inserting a breakpoint inside _make and then executing the solution inside a debugger. When the breakpoint is hit, we can trace the frames to see the chain of calls that led to _make .
Doing so reveals that the 1,000 calls to _make originate from the call to itertuples() itself, as shown here. The following figure shows the most interesting part of itertuples() :
Figure 12: The origin of the 1,000 calls to _make
Figure 12 shows that there are 1,000 calls to _make because line 927 returns a map that basically calls _make for each row in the dataframe. The interesting part of this snippet is that the call to map is nested under an if statement where one of the conditions is that the name parameter in itertuples() must not be None . If it is, then it will return an iterator that iterates over the zipped columns in the dataframe … which is the same thing as what our custom solution does!
The documentation of itertuples() says that if the name parameter is a string, then it will return named tuples with the given name . If name is None , then it will return regular tuples instead. Our code will work just as well regardless of itertuples() ’s return type. So let’s prefer regular tuples over named tuples so that we can skip the 1,000 calls to _make . This is what happens when we set the name parameter in itertuples() to None :
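A minimal sketch of the resulting loop (again assuming the nested-dictionary counting logic from the problem statement, and that the dataframe has only the two relevant columns):

content = {}
# name=None yields plain tuples and skips the 1,000 calls to _make
for col1, col2 in big_df.itertuples(index=False, name=None):
    content.setdefault(col1, {}).setdefault(col2, 0)
    content[col1][col2] += 1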
Figure 13: Iterating over 3 million rows with itertuples(name=None)
Figure 14: Function profile of iterating over 1,000 rows with itertuples(name=None)
Figure 15: Line profile of iterating over 1,000 rows with itertuples(name=None)
The itertuples(name=None) solution is competitive with our custom solution. It took 5.18 seconds to iterate over 3 million rows whereas our custom solution only took 4.92 seconds.
Conclusion
This article has shown the reader how to use a Jupyter Notebook to:
1. Figure out which function calls are taking the most time to execute, and
2. Figure out which lines in a code snippet are taking the most time to execute
It also has illustrated the need to be adept at using a debugger to step through code and at reading documentation to identify optimization opportunities.
I hope you will consider applying the techniques described in this article the next time you face “slow” Python code. Let me know in the comments if you have any questions.
References
Documentation On Python’s profile Module | https://medium.com/swlh/why-pandas-itertuples-is-faster-than-iterrows-and-how-to-make-it-even-faster-bc50c0edd30d | [] | 2019-10-20 17:34:32.113000+00:00 | ['Jupyter Notebook', 'Pandas', 'Programming', 'Data Science', 'Python'] |
2,006 | How to Create Your First REST API With Deno | Create an API With Deno
What better way to start playing with Deno than by creating our first REST API?
With this little tutorial, I am going to create a very simple array of movies and the five methods to list, search, create, update, and delete elements.
The first step is to create an index file, in this case app.ts . The first thing will be to load Oak, a middleware framework for Deno’s HTTP server. Oak is inspired by Koa, a middleware for Node.js. It seems that they continue with the pun. In the end, it helps us make writing APIs easier.
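The original embedded snippet is not reproduced here, but a minimal sketch of such an app.ts might look like this (the unpinned Oak import URL is an assumption):

import { Application } from "https://deno.land/x/oak/mod.ts";
import router from "./api/router.ts";

const app = new Application();

// Wire up the routes defined in router.ts
app.use(router.routes());
app.use(router.allowedMethods());

console.log("Server running on http://localhost:4000");
await app.listen({ port: 4000 });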
It is a fairly simple example that is practically self-explanatory. The server will listen on port 4000 and load the routes defined in the router.ts file that we will see right after. In the file ./api/controller.ts , I will put the definition of the functions for the different endpoints.
It’s time to define the routes in the router.ts file. Here we will also import the Oak router and the definitions that we will create in the controller.ts .
We instantiate a router and define the five commented routes:
getMovies — Returns all the movies
getMovie — Returns a movie given its ID
createMovie — Creates a new movie
updateMovie — Updates an existing movie
deleteMovie — Deletes a movie
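Putting that together, a hedged sketch of router.ts could look like this (the route paths are assumptions based on a typical movies API):

import { Router } from "https://deno.land/x/oak/mod.ts";
import {
  getMovies,
  getMovie,
  createMovie,
  updateMovie,
  deleteMovie,
} from "./controller.ts";

const router = new Router();

router
  .get("/movies", getMovies)           // list all the movies
  .get("/movies/:id", getMovie)        // get one movie by its ID
  .post("/movies", createMovie)        // create a new movie
  .put("/movies/:id", updateMovie)     // update an existing movie
  .delete("/movies/:id", deleteMovie); // delete a movie

export default router;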
Now it’s time to create the controller.ts file to define the API methods and the test database.
interface Movie {
id: string;
title: string;
rating: number;
}
Then, the movies array:
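For example (the sample titles are purely illustrative; let is used so that deleteMovie can later reassign the array):

let movies: Movie[] = [
  { id: "1", title: "The Matrix", rating: 5 },
  { id: "2", title: "Inception", rating: 4 },
];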
And now, the different methods, starting with the one that lists all the movies. It is really that simple:
/**
* Returns all the movies in database
*/
const getMovies = ({ response }: { response: any }) => {
response.body = movies;
};
Let’s go to the next one, the one in charge of returning a movie from an ID that we can pass as a parameter.
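The original gist is missing here, but a sketch consistent with the getMovies handler above might be:

/**
 * Returns a single movie from the database given its id
 */
const getMovie = ({ params, response }: { params: { id: string }; response: any }) => {
  const movie = movies.find((m) => m.id === params.id);
  if (movie) {
    response.status = 200;
    response.body = movie;
  } else {
    response.status = 404;
    response.body = { message: "Movie not found" };
  }
};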
If we try to launch the request with Postman, we will see that it works.
It is the turn of the createMovie method to create a movie. The code is the following:
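A hedged sketch (note that the exact body-parsing API varies between Oak versions):

/**
 * Creates a new movie from the request body
 */
const createMovie = async ({ request, response }: { request: any; response: any }) => {
  const body = await request.body(); // in some Oak versions body() is synchronous
  const movie: Movie = await body.value;
  movies.push(movie);
  response.status = 201;
  response.body = { message: "Movie created", movie };
};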
If we launch the test request, the server will reply with a message containing the recently created movie data.
If we then launch the request to return all the movies, we will see how the new one appears correctly.
It is the turn of the updateMovie method to update a movie. The code is:
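Again as a sketch, merging the incoming fields into the stored movie:

/**
 * Updates an existing movie given its id
 */
const updateMovie = async (
  { params, request, response }: { params: { id: string }; request: any; response: any },
) => {
  const movie = movies.find((m) => m.id === params.id);
  if (movie) {
    const body = await request.body();
    const updates = await body.value;
    movie.title = updates.title ?? movie.title;
    movie.rating = updates.rating ?? movie.rating;
    response.status = 200;
    response.body = { message: "Movie updated", movie };
  } else {
    response.status = 404;
    response.body = { message: "Movie not found" };
  }
};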
We launch the corresponding PUT request with Postman, and we will get the correct response.
And finally, we only have the deleteMovie method that, in this case, deletes a movie from a given id. What I do is use filter() to update the array, keeping all the movies with a different id than the one sent.
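A sketch of that approach, plus the exports that router.ts expects (assuming the handlers are exported together at the end of controller.ts):

/**
 * Deletes a movie given its id, using filter() as described above
 */
const deleteMovie = ({ params, response }: { params: { id: string }; response: any }) => {
  movies = movies.filter((m) => m.id !== params.id);
  response.status = 200;
  response.body = { message: "Movie deleted" };
};

export { getMovies, getMovie, createMovie, updateMovie, deleteMovie };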
We try with Postman …
And effectively the movie with id = 1 just disappeared.
You can download all the source code for this example in this repository from my GitHub. | https://medium.com/better-programming/how-to-create-your-first-rest-api-with-deno-296330832090 | ['Marc Pàmpols'] | 2020-05-26 14:07:35.353000+00:00 | ['API', 'Typescript', 'Deno'] |
2,007 | Breaking down Google Cloud IAP | But not literally
Recently I’ve had the problem of securing a custom ETL tool for a client project built using a combination of AppEngine and Kubernetes applications on Google Cloud Platform (GCP). I need to control access to these apps from external networks, and ideally integrate with an existing IAM policy. Enter IAP!
Identity Aware Proxy (IAP) is GCP’s offering to lock down applications that would otherwise be publicly exposed on the cloud. The sell is pretty sweet, just turn it on and within minutes you get a free wall of Google’s good stuff surrounding your shamefully exposed app. The security guy stops sweating and everyone is happy, right? Well… kinda: It does what the box says, however there are some design decisions that will leave you a little bit stumped.
In this post I’ll cover how it works behind the scenes, and how you can integrate with a micro-services architecture built around service accounts. I’ll provide a summary of the good and the bad at the end if you’re after a TLDR. Otherwise, let’s see how it works, and more importantly, how you can use it.
Build the Wall!
Turning on IAP is pretty simple. In fact, let’s do it right now. First things first, I’ve deployed a basic flask app that will print out the user’s identity. For testing purposes I’ve done this in App Engine Standard, as it has the users API baked in to validate identity.
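A minimal sketch of such an app (assuming the legacy Python runtime where the users API is available):

from flask import Flask
from google.appengine.api import users  # bundled with App Engine Standard

app = Flask(__name__)

@app.route("/")
def whoami():
    # Print out the identity of the logged-in user, if any
    user = users.get_current_user()
    return "Hello, {}!".format(user.email() if user else "anonymous")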
IAP is available in Security > Identity-Aware Proxy. Once you get here you’re greeted with a screen like this:
It tells us that we need to configure our OAuth consent screen before we can protect the app. This screen is presented to unauthenticated users when they hit a secured endpoint, and provides information like your company name, homepage, and privacy policy. Unfortunately there is no way to automate this configuration yet, which means that an IAP-centric app cannot use a fully-automated deployment.
Anyway, once we’ve configured that, we can turn IAP on for all App Engine apps in the project. This will instantly lock down all App Engine apps completely. Adding the service URL shown below will allow authorised users to access that URL.
After this when I hit my App Engine endpoint I’m prompted to log in and:
Oops — my account doesn’t have access to the App Engine resource yet. On the IAP page under access you’ll find the add button, which will let you give IAP permissions to users at the project level. Behind the scenes this is just applying the “IAP-Secured Web App User” IAM role to the account you provide. This means that we can apply these permissions to already existing groups or roles. Once that change has propagated we get the result:
It worked!
But how does it work?
IAP is the central pillar of BeyondCorp, enabling remote authentication to online-services without a VPN. It can be enabled for App Engine or a Cloud Load Balancer (supporting GCE or GKE instances), and configured via IAM to allow fine grained access-control. Google says it can do authorisation as well, however IAP permissions are applied at the project level. If you turn on IAP for two applications, and grant a user IAP access, they will be able to access both. If you want to get more low level, for example at the application or even endpoint level, you’ll have to build some scaffolding around what IAP gives you.
At a high level, IAP has two main layers:
1. Resource Authorisation — an Oauth2 flow to generate a signed access token. IAP will use this token to validate identity.
2. App Validation — verifying a user’s identity using signed headers generated by IAP. This provides an additional layer of security if someone manages to bypass IAP (or if you forget to turn it on ;) )
Depending on what you’re protecting and the language you’ve written it in you’ll have various levels of support to implement these two layers. The simplest use case is authorising as a user, as demonstrated in the example above. This process can be done programmatically (see here), however if you’re thinking about automating sign-in then you should probably be using service accounts. Unfortunately, automating this process is more complicated; if your app isn’t written in Java, C#, Python, or PHP* then things are going to get cURLy.
* At the time of writing these are the only available languages in the docs.
Bizarrely, this list doesn’t include Node.js. Although Cloud Functions now supports Python, adding IAP tokens to a Node 6 app is a nightmare (see this handy stack overflow thread). With Node 8 you should in theory be able to leverage the google-auth-library, although the example requires a Service Account key file, instead of leveraging the application credentials.
Resource Authorisation
In order to pass IAP border control and make a request against a protected app, we need to generate an OpenID token signed by Google and add it to our request as a header. To do that, we need to generate a JWT signed by the service account that our app is using. Fortunately, the examples in the docs implement this logic for you, so you don’t need to worry about it — unless you’re writing Node 6. Unfortunately, I had to implement this in Node 6.
We used the following approach, based on the above stack overflow link. Note that I had mixed results with this approach — if you get the ambiguous “401: Error Code 13” then the only advice I have for you is to start from scratch on a new project. I wish I was joking. Anyway, here’s what that snippet is actually doing:
1. Get service account access token from instance metadata store
2. Create a JWT header and claim set for our OpenID request
3. Sign that JWT using our service account access token and the Sign Blob API
4. Use signed JWT to get OpenID token
5. Attach that OpenID token to our request as a header
Frankly, if I were to do it again I’d just do it in Python (or one of the other supported languages) and use Google’s sample code.
Once you’ve generated a token, you can make an authenticated request by adding it to the “Bearer” header. A request made to an IAP protected endpoint will be redirected to the IAP gateway, where the token is decoded and validated. If valid, IAP will then create or replace the x-goog headers which can be used by the app to validate identity.
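For reference, a hedged Python sketch of the whole flow using newer versions of the google-auth library (the client ID is a placeholder for your IAP OAuth client):

import requests
from google.auth.transport.requests import Request
from google.oauth2 import id_token

IAP_CLIENT_ID = "EXAMPLE.apps.googleusercontent.com"  # placeholder

def make_iap_request(url):
    # Fetch an OpenID token for the IAP client ID using the ambient
    # service account credentials (e.g. from the metadata server).
    token = id_token.fetch_id_token(Request(), IAP_CLIENT_ID)
    return requests.get(url, headers={"Authorization": "Bearer {}".format(token)})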
App Validation
In order to prevent nefarious parties (or l337 h4xors as they’re known in the industry) from accessing your app, IAP uses two layers of security: The first layer requires a token as generated above, which is used to validate the user’s identity. The second stage involves signing this identity into a second token using keys managed by the IAP service. This token can then be verified using public keys, which is what makes IAP so secure.
This is because spoofing this token would require knowing the private keys used to sign the IAP assertion. As long as your app performs this validation, it is locked down even if IAP isn’t turned on.
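As a sketch, validating the x-goog-iap-jwt-assertion header in Python could look like this (the expected audience string depends on whether you are behind App Engine or a backend service):

import jwt  # PyJWT
import requests

IAP_PUBLIC_KEYS_URL = "https://www.gstatic.com/iap/verify/public_key"

def validate_iap_jwt(iap_jwt, expected_audience):
    # Find the ES256 public key matching the token's key id, then
    # verify the signature and the audience claim.
    key_id = jwt.get_unverified_header(iap_jwt)["kid"]
    key = requests.get(IAP_PUBLIC_KEYS_URL).json()[key_id]
    decoded = jwt.decode(iap_jwt, key, algorithms=["ES256"], audience=expected_audience)
    return decoded["sub"], decoded["email"]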
A Quick Note on GKE
Before I wrap up, a quick note on securing a GKE app with IAP. You’ll need a DNS name and an Ingress load balancer, as covered in the tutorial here. However, there’s a chicken and egg problem with this deployment that’s a pain to deal with:
In order to decode a JWT token inside a GKE app you need to know the backend service ID of that app.
This means that if you want to decode the IAP header inside your app and get the requester’s account you need the load-balancer to already exist. Since this isn’t possible you are left with three options:
1. Redeploy the app with the correct configuration once the load balancer is created
2. Use dynamic config, for example by using a config map linked to a file
3. Look up the backend service ID using the Instance Metadata server at runtime, also known as the “O’Reilly” option.
Wrapping Up
IAP provides a fast and secure way to lock down your apps, however there are a few caveats:
Not being able to automatically create an Oauth2 consent screen will prevent any deployments that use IAP from being fully automated.
Access is provisioned per account for all IAP-protected services at the project level. Access control at the application level will require additional scaffolding on top of what IAP provides.
The support for token generation in Javascript needs a lot of work.
Using this with Kubernetes will complicate deployment requirements
That being said, it still passes with flying colours. Authentication is hard, and having it as a fully managed service more than makes up for the overhead in deployment. Some of these problems are also minor fixes, and will likely become available as the product matures. If you’re searching for a quick and easy way to lock down your services, look no further.
A Note on Cloud Endpoints
If you do need something with more fine-grained access control and better language support have a look at Cloud Endpoints. It doesn’t use the same 2-layer model as IAP, and you will have to manage swagger specs at the application level, however it does offer a more extensible service.
https://www.servian.com/gcp/ | https://medium.com/weareservian/breaking-down-google-cloud-iap-e3b23a8bddc7 | ['Mayan Salama'] | 2019-07-08 04:23:24.057000+00:00 | ['Identity Aware Proxy', 'Kubernetes', 'Microservices', 'Google Cloud Platform', 'Authentication'] |
2,008 | Introduction to Technology Hits | Background
I have been a technologist for almost four decades.
Over the years, I have developed a large group of readers across various academic, industry, and public platforms. My readers want to follow my technology-related content from a single, curated source.
These loyal followers asked me to curate compelling, insightful, and engaging content and disseminate it in a logical and digestible way. The best way to achieve this goal is to turn the content into stories and make them available via a publication. Medium is an ideal platform for this request.
I thought the best way would be to establish a publication focusing on all aspects of technology and covering various reader requirements. I am also connected with thousands of writers. For example, one of my publications on Medium supports around 6,000 writers.
I can guess what you are thinking. Yes, there are thousands of publications about technology. You may ask, how would my publication be different and add value to readers?
I designed this publication by considering the requirements of my readers. By analyzing their needs and wants using a Design Thinking method, I classified my audience under seven categories. Information management, knowledge transfer, and content harvesting are my special interests. I am an inventor and innovator in these areas with industry credentials.
I am announcing this new publication today, which coincides with the fifth milestone of my major publication on Medium.
Scope
The seven personas in technology domains depict the scope of this publication. They are:
technicians, philosophers, entrepreneurs, entertainers, artists, storytellers, and futuristic leaders.
Since Medium allows a maximum of seven tabs in the publication interface, I added them as the amplifiers of stories that can be conveniently consumed by these personas using tags. I will provide a comprehensive guide to the effective use of the publication.
The publication banner depicts the implementation of the functions based on the seven major personas.
Image: screen capture of the publication banner by the author
My aim is to harvest and publish stories within the scope of these seven logical functions. These seven functions can cover almost anyone who has an interest in any aspect of technology.
Let me briefly explain the coverage of each domain and give you an idea of who can contribute to this unique publication and with what types of stories.
Technical
Stories in this domain include the technical, engineering, and scientific aspects of technology. For example, the definition and description of a technology, architecture, design constructs, security, tools, operations, processes, and procedures can be part of this section. “How-to” stories fit well in this domain. The use of technology in all scientific disciplines such as medicine, biotechnology, neuroscience, engineering, environment, climate, and so on can be part of the technical function. Technology professionals such as data scientists, enterprise architects, solution designers, technical specialists, software developers, and system administrators can share their stories in this section.
Philosophical
This function is dedicated to philosophers and deep thinkers. Ideas reflecting the pros and cons of technology constructs can well suit this domain. There are many readers interested in the philosophical aspect of technology. For example, ethics for artificial intelligence and robotics is a popular topic. There are undergraduate and postgraduate degrees offering courses on the philosophical aspect of technology. Students of these degrees are welcome to submit their academic yet engaging stories.
Entertaining
Entertainers use technology widely. The contributors and consumers of this function can be game enthusiasts. Computer games are widespread globally and establish an extensive industry outlook. Service providers use technology to entertain their customers. Social media stories especially podcasts and YouTube can be part of this section. Yes, you can introduce your YouTube channel and podcasts in a story.
Entrepreneurial
This function can serve entrepreneurs in startup companies. This section can be used by technology leaders who plan digital ventures. All business, economic, and financial aspects of technology can be part of this function.
Artistic
Technology and art are interrelated. Many artists use technology to express their artistic thoughts and feelings. You may have heard about famous digital poems. Digital painting and digital music are widespread. You can submit stories about the use of technology for all art forms including design work in this section. I want to cover stories on how technology impacts poets, musicians, and painters.
Personal
This publication is home to storytellers writing about various aspects of technology. You can share your personal experience with technology tools, processes, and services. You can share personal stories reflecting your thoughts and feelings about technological devices such as smartphones, smartwatches, security devices, cooking and gardening gadgets, and various IoT devices serving different purposes.
Futuristic
This section is for thought leaders, inventors, innovators, and strategists. You can share your wild ideas on how the future should be from the technological aspect. Another great topic can be the transformational effects of technology on human life. Ideas for the next generations can be discussed in stories submitted to this section.
In short, anyone writing about any aspect of technology can make this publication home for their stories.
Benefits
As an Editor in Chief with a strong technology background and publishing experience, I will orchestrate editing and publishing activities with the help of several experienced editors.
If you become a contributor to this publication, we will value your content and support you in achieving your writing goals in the technology domain.
We will leverage 50K followers of ILLUMINATION to amplify your messages and showcase your outstanding stories to discerning readers on our special collection called ILLUMINATION-Curated.
As a unique value, we will let you transfer your stories among three publications with ease. This flexibility can give your stories maximum visibility to the interested readers of the other two large publications. In addition, we will allow you to publish your old curated stories distributed to technology-related topics in the past. This opportunity can give your old curated stories a second life by introducing them to a new audience.
You can also join our Slack group to collaborate with hundreds of other Medium writers contributing to ILLUMINATION and ILLUMINATION-Curated. We will create special sections and community clubs to support your writing goals.
If you are interested in becoming a contributor, please send a request via this link. Alternatively, you can leave a brief comment on this story showing your interest in participating.
I am excited about this initiative and look forward to collaborating with you.
You are welcome to join my 100K+ mailing list to collaborate, enhance your network, and receive a technology newsletter reflecting my industry experience. | https://medium.com/technology-hits/introduction-to-technology-hits-7665b8d5e950 | ['Dr Mehmet Yildiz'] | 2020-12-14 15:19:51.134000+00:00 | ['Artificial Intelligence', 'Technology', 'Data Science', 'Entertainment', 'Writing'] |
2,009 | 24 Most Controversial Books of All Time | 24 Most Controversial Books of All Time
Readers.com infographic details most challenged/banned books of all time
“What is freedom of expression? Without the freedom to offend, it ceases to exist.” Salman Rushdie, among many others, finds a book of his on this list of the 24 most controversial books of all time. There are a few conspicuous absentees (Joyce’s Ulysses, for example). Which books were you most surprised not to see? | https://medium.com/electric-literature/24-most-controversial-books-of-all-time-70e484941082 | ['Nicholas Politan'] | 2016-07-25 16:23:09.939000+00:00 | ['Writing', 'Free Speech', 'Infographic', 'Books'] |
2,010 | Pictal Health — Purpose, Vision and Values | For the last few months I’ve been working on a new company to help patients organize and visualize their health stories — Pictal Health. I come from a human-centered design background, and in my past work I have often used design principles to provide creative constraints and help my team make good decisions over the course of a project. So in designing this new venture, I am trying to use similar principles and statements to speak clearly about what Pictal Health is trying to do, the impact we hope to make, and how we want to work.
Below is Pictal Health’s purpose, vision and values, which the book Story Driven helped me develop. While the specific products or services we create may change over time, I hope these core statements will remain fairly consistent. So far they have helped me get clear about what I’m working on, make better use of my time, and make better decisions; I hope they also help others understand what I’m up to. | https://medium.com/pictal-health/pictal-health-purpose-vision-and-values-1d1dfa1007ec | ['Katie Mccurdy'] | 2019-05-31 14:15:26.575000+00:00 | ['Healthcare', 'Startup', 'Design Process', 'Design'] |
2,011 | 7 Pieces of Terrible Writing Advice You Should Never Follow | Why Should You Trust Me?
So far, I’ve said nothing unique. Every writing guru claims they have the secret sauce and every other guru doesn’t. What makes me different?
Here’s your first clue. I’m not going to promise any of this will work exactly the way I say it will. Good advice is nothing more than a suggestion. Nobody can promise you anything because there are too many variables in life.
Most terrible writing advice centers around some guarantee that if you do what you’re told, things will work out in some precise way.
Of course, luck is involved when it comes to writing. But, there are ways to increase your odds of building an audience and having blog posts go viral. I have a bag of tricks, but I don’t know exactly what will happen after I hit publish. No one does.
Anyone who’s promising you their “proven secrets to virality” is a charlatan.
I offer useful strategies that tend to pay off in the long term because long time scales are more predictable.
I’m living proof of that. My blog posts have been read by millions, I’ve published two books with a third coming out this fall, and tens of thousands of people read my work on a monthly basis like clockwork.
But I’ve also been writing for five years.
Many, many, many people who write about writing edit out the part where they were stuck and frustrated and show you the “roadmap for success”, based on a starting point that’s not real — the point where they got traction instead of the very first time they wrote.
I won’t do that.
You’ll get unfiltered straight-to-the-point tips, the opposite of terrible writing advice. A great starting point for success is learning what not to do. Avoid these strategies at all costs. | https://medium.com/better-marketing/7-pieces-of-terrible-writing-advice-you-should-never-follow-f2153531aed5 | ['Ayodeji Awosika'] | 2019-09-18 02:05:50.088000+00:00 | ['Creative Writing', 'Content Marketing', 'Writing', 'Marketing', 'Writing Tips'] |
2,012 | 4 Biggest Myths About Anxiety Everyone Believes | 2. You need to understand the origin of your anxiety
One of the biggest misconceptions about anxiety is that it’s necessary to understand its origins in your life in order to deal with it effectively.
For example, I had a client once who came to see me because she was having panic attacks anytime she drove on the freeway. She told me that she was convinced that the origins of her panic were in her childhood and her father’s habit of driving while intoxicated. And she hoped that by exploring these childhood memories together we would be able to free her from her panic attacks.
Now, it’s not hard to see how a child might develop some significant driving anxiety as a result of being driven around by an intoxicated parent. So my client’s ideas about how to resolve her anxiety were understandable.
But as I tried to explain over the course of a few sessions, the way out of her driving anxiety was going to have very little to do with her past and everything to do with her present.
Because here’s the thing:
The original cause of anxiety is rarely the maintaining cause.
In my client’s case, it’s very possible that, as a result of her father’s drunk driving, she developed a habit of worrying a lot while driving. But when you think about it, her father wasn’t causing her driving anxiety now as a 45-year-old woman. What was causing her driving anxiety and panic now was the habit of worrying about her own anxiety while driving.
At my client’s request, we spent weeks and weeks exploring every nuance of her past and memories about her father and his drinking and driving. And while there were some interesting tidbits to be gleaned, my client’s driving anxiety and panic persisted.
No matter how much insight she got into the origins of her anxiety, the habit of worrying while driving persisted, and along with it, her panic attacks.
And the reason was straightforward: While her father’s drunk driving may have been the initial cause or trigger for her driving anxiety, it was her habit of worrying and catastrophizing in the present that was maintaining it.
This meant that we could explore her past until both of us were blue in the face, but until we took care of the habits in the present that were maintaining her anxiety, she would continue to have panic attacks while driving.
If you really want to free yourself from anxiety, it’s your present, not your past, that holds the key.
What’s more, the original cause or trigger for anxiety is not only unhelpful, most of the time it’s completely unnecessary for addressing anxiety in the present:
Understanding why your mother didn’t love you as much as you wished she had won’t change the fact that you’re in the habit of worrying about what other people think — and as a result, experience a lot of social anxiety.
Understanding how your learning disability as a teenager led to feelings of inadequacy won’t change the fact that you’re in the habit of putting yourself down with constant negative self-talk — and as a result, experience a lot of performance anxiety.
Understanding that worrying about the future was a normal consequence of your traumatic childhood won’t change the fact that you’re in the habit of catastrophizing and worrying about the future now — and as a result, experiencing a lot of generalized anxiety.
There’s nothing wrong with exploring your past and trying to understand how it’s shaped who you are today.
But if you’re serious about feeling less anxious, you need to understand the habits that are maintaining it in the present and address those head-on. | https://medium.com/personal-growth/4-biggest-myths-about-anxiety-everyone-believes-222090ac841e | ['Nick Wignall'] | 2020-12-19 19:56:49.064000+00:00 | ['Self', 'Psychology', 'Anxiety', 'Life', 'Mental Health'] |
2,013 | As a Writer, You Need to Get Into Idea Mode | You did it again, didn’t you?
You let yourself run out of ideas of things to write about. Every time this happens, you promise you won’t let it happen again. You put Idea Generation on your to-do list. You write it in big red letters and draw circles and arrows around it. You make it a Really Big Deal.
But when it comes time to do it, it’s like trying to go to sleep so Santa Claus will come. You just sit there, staring at the screen. Writer’s block? Hell, you have thinker’s block.
But here’s a talent I have developed: a sort of superpower. Instead of idea generation being this active task you have to accomplish, it becomes more passive. Which makes more sense if you think about it. After all, you’ve tried it the other way too many times. Okay, think of ideas. Go!
It doesn’t work, does it? That’s not how ideas come to us. At least, that’s not how the really good ones come to us. They show up out of nowhere. While we sleep. In the shower. Driving down the road. Pretty much anytime, we are not prepared to capitalize on them.
Why is that? I’ve decided that there is this tiny receptor in the back of our brains, a sort of box, if you will. An idea box. And when we are not thinking about writing, and especially when we are not trying to think of ideas to write about, that box just pops open. Like that old Jack-in-the-box you annoyed everyone with when you were a child. Pop Goes the Weasel just started playing in your head, didn’t it? Sorry about that. It will go away. Eventually.
But here’s the thing. With practice, you can open that box at will. It will take time and some positive reinforcement, but you can do it. Open that little box in the back of your brain and let ideas flow into it. Sounds crazy? Well, maybe, but if it works, who cares?
Try this. Open up your favorite social media or news feed. I like Twitter for this as it gives me the most bang for my mental buck, but you do you. Scroll through and start scanning posts. Open up any that interest you, but don’t spend much time on any one. You’re not trying to find out who did what with that thing. You’re capturing ideas.
As you scroll, just keep thinking about that idea box. Don’t look at each post and think, “Can I write about this?” Just keep scrolling and visualizing that open box in the back of your mind. It’s more associative than anything else. Each post, story, Tweet, whatever, is like one of those inkblots in the Rorschach test. What does this make you think of?
There it is. An idea. Write it down. Make a couple of quick notes about it, because you won’t remember what the idea is later. They are very fleeting things, those ideas. Inkblots? Weren’t they a singing group in the ‘30s? No, wait, that was the Ink Spots. Why would I want to write about them?
But don’t spend more than a few seconds on each one, a minute tops. Keep scrolling. The more you do it, the easier and faster they will come. Before you know it, you’ve come up with enough ideas to keep you fresh for a month. But don’t wait a month to do it again. No matter how brilliant the thought is right now, when you get ready to write, it may fade away. You will lose about half of these, so feed the beast often, at least once a week.
Don’t have your computer, tablet, or phone handy? What are you, a caveman? That’s okay, maybe you are stuck somewhere that you can’t spend time scrolling through social media. Like your real job. Get a piece of paper and just look around the room. Scroll through every item in your field of view.
A pencil cup? Seven Office Supplies That Should Be on Every Desk. Four Obsolete Items You Should Get Rid of Today.
Scan from item to item and make sure that box is open. What does that thing make you wonder about? What’s the history behind this stuff? Why don’t we start using this thing instead of that thing? If you can’t come up with ten articles without moving from your chair, you’re not trying.
But again, it takes practice, this idea mode. Maybe, to begin with, you do it for ten minutes once a day. As you get better at it, expand the time. One good weekly session should be enough to fuel your writing for a long time.
And here’s the best part. Here is where it becomes a superpower. After a while, you won’t have to turn it on. You won’t have to open the box. It will stay open all the time. You won’t be able to stop the ideas from coming. And that is a very good thing.
Now, if you will excuse me, I just thought of a great idea for my next article. | https://medium.com/write-i-must/as-a-writer-you-need-to-get-into-idea-mode-680845f02aa4 | ['Darryl Brooks'] | 2020-11-23 17:30:42.735000+00:00 | ['Self Improvement', 'Life Lessons', 'Writing', 'Ideas', 'Self-awareness'] | Title Writer Need Get Idea ModeContent didn’t let run idea thing write Every time happens promise won’t let happen put Idea Generation todo list write big red letter draw circle arrow around make Really Big Deal come time it’s like trying go sleep Santa Claus come sit staring screen Writer’s block Hell thinker’s block here’s talent developed sort superpower Instead idea generation active task accomplish becomes passive make sense think you’ve tried way many time Okay think idea Go doesn’t work That’s idea come u least that’s really good one come u show nowhere sleep shower Driving road Pretty much anytime prepared capitalize I’ve decided tiny receptor back brain sort box idea box thinking writing especially trying think idea write box pop open Like old Jackinthebox annoyed everyone child Pop Goes Weasel started playing head didn’t Sorry go away Eventually here’s thing practice open box take time positive reinforcement Open little box back brain let idea flow Sounds crazy Well maybe work care Try Open favorite social medium news feed like Twitter give bang mental buck Scroll start scanning post Open interest don’t spend much time one You’re trying find thing You’re capturing idea scroll keep thinking idea box Don’t look post think “Can write this” keep scrolling visualizing open box back mind It’s associative anything else post story Tweet whatever like one inkblot Rorschach test make think idea Write Make couple quick note won’t remember idea later fleeting thing idea Inkblots Weren’t singing group ‘30s wait Ink Spots would want write don’t spend second one minute top Keep scrolling easier faster come know you’ve come enough idea keep fresh month don’t wait month matter brilliant thought right get ready write may fade away lose half feed beast often least week Don’t computer tablet phone handy caveman That’s okay maybe stuck somewhere can’t spend time scrolling social medium Like real job Get piece paper look around room Scroll every item field view pencil cup Seven Office Supplies Every Desk Four Obsolete Items Get Rid Today Scan item item make sure box open thing make wonder What’s history behind stuff don’t start using thing instead thing can’t come ten article without moving chair you’re trying take practice idea mode Maybe begin ten minute day get better expand time One good weekly session enough fuel writing long time here’s best part becomes superpower won’t turn won’t open box stay open time won’t able stop idea coming good thing excuse thought great idea next articleTags Self Improvement Life Lessons Writing Ideas Selfawareness |
2,014 | Use C# And ML.NET Machine Learning To Predict Taxi Fares In New York | I’m using the awesome Rainbow CSV plugin for Visual Studio Code, which highlights my CSV data file with these nice colors.
There are a lot of columns with interesting information in this data file, but I will only be focusing on the following:
Column 0: The data provider vendor ID
Column 3: Number of passengers
Column 4: Trip distance
Column 5: The rate code (standard, JFK, Newark, …)
Column 9: Payment type (credit card, cash, …)
Column 10: Fare amount
I’ll build a machine learning model in C# that will use columns 0, 3, 4, 5, and 9 as input, and use them to predict the taxi fare for every trip. Then I’ll compare the predicted fares with the actual taxi fares in column 10, and evaluate the accuracy of my model.
And I will use .NET Core to build my app.
.NET Core is really cool. It’s the multi-platform version of the .NET Framework and it runs flawlessly on Windows, OS X, and Linux.
I’m using the 3.0 preview on my Mac right now and haven’t touched my Windows 10 virtual machine in days.
Here’s how to set up a new console project in .NET Core:
$ dotnet new console -o PricePrediction
$ cd PricePrediction
Next, I need to install the ML.NET NuGet package:
$ dotnet add package Microsoft.ML
Now I’m ready to add some classes. I’ll need one to hold a taxi trip, and one to hold my model’s predictions.
I will modify the Program.cs file like this:
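The original code embed didn’t survive in this copy of the article, so here is a minimal sketch of the two classes. The attribute names are an assumption on my part: I’m using the LoadColumn/ColumnName style of ML.NET 1.x, while older previews spelled the mapping attribute as Column instead.

using Microsoft.ML.Data;

public class TaxiTrip
{
    [LoadColumn(0)] public string VendorId;       // column 0: the data provider vendor ID
    [LoadColumn(3)] public float PassengerCount;  // column 3: number of passengers
    [LoadColumn(4)] public float TripDistance;    // column 4: trip distance
    [LoadColumn(5)] public string RateCode;       // column 5: rate code, loaded as text
    [LoadColumn(9)] public string PaymentType;    // column 9: payment type, loaded as text
    [LoadColumn(10)] public float FareAmount;     // column 10: the fare we want to predict
}

public class TaxiTripFarePrediction
{
    [ColumnName("Score")] public float FareAmount; // regression output column is called "Score"
}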
The TaxiTrip class holds one single taxi trip. Note how each field is adorned with a Column attribute that tells the CSV data loading code which column to import data from.
I’m also declaring a TaxiTripFarePrediction class which will hold a single fare prediction.
Now I’m going to load the training data in memory:
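The loading snippet is also missing from this copy, so here is a hedged reconstruction. I’m assuming the LoadFromTextFile<T> convenience method, which builds the text loader from the attributes on the TaxiTrip class; the file name is just a placeholder.

var mlContext = new MLContext(seed: 0);

// read the CSV; the attribute-decorated TaxiTrip class drives the column mapping
IDataView data = mlContext.Data.LoadFromTextFile<TaxiTrip>(
    "yellow_tripdata.csv",   // placeholder path to the taxi data file
    hasHeader: true,
    separatorChar: ',');

// split into an 80% training partition and a 20% test partition
var split = mlContext.Data.TrainTestSplit(data, testFraction: 0.2);
IDataView trainData = split.TrainSet;
IDataView testData = split.TestSet;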
This code sets up a TextLoader to load the CSV data into memory. Note that all column data types are what you’d expect, except RateCode and PaymentType. These columns hold numeric values, but I’m loading them as string fields.
The reason I’m doing this is that RateCode is an enumeration with the following values:
1 = standard
2 = JFK
3 = Newark
4 = Nassau
5 = negotiated
6 = group
And PaymentType is defined as follows:
1 = Credit card
2 = Cash
3 = No charge
4 = Dispute
5 = Unknown
6 = Voided trip
These actual numbers don’t mean anything in this context. And I certainly don’t want the machine learning model to start believing that a trip to Newark is three times as important as a standard fare.
So converting these values to strings is a perfect trick to show the model that RateCode and PaymentType are just labels, and the underlying numbers don’t mean anything.
With the TextLoader all set up, a single call to Load() is sufficient to load the entire data file in memory.
I only have a single data file, so I am calling TrainTestSplit() to set up a training partition with 80% of the data and a test partition with the remaining 20% of the data.
You often see this 80/20 split in data science; it’s a very common approach to training and testing a model.
Now I’m ready to start building the machine learning model:
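Here too the embedded snippet didn’t make it into this copy, so the following is a sketch of the pipeline described below rather than the author’s exact code. Depending on the ML.NET version, the FastTree trainer may require the separate Microsoft.ML.FastTree NuGet package.

var pipeline = mlContext.Transforms.CopyColumns("Label", "FareAmount")
    .Append(mlContext.Transforms.Categorical.OneHotEncoding("VendorIdEncoded", "VendorId"))
    .Append(mlContext.Transforms.Categorical.OneHotEncoding("RateCodeEncoded", "RateCode"))
    .Append(mlContext.Transforms.Categorical.OneHotEncoding("PaymentTypeEncoded", "PaymentType"))
    .Append(mlContext.Transforms.Concatenate("Features",
        "VendorIdEncoded", "RateCodeEncoded", "PaymentTypeEncoded",
        "PassengerCount", "TripDistance"))
    .AppendCacheCheckpoint(mlContext)
    .Append(mlContext.Regression.Trainers.FastTree());

// train the model on the 80% training partition
var model = pipeline.Fit(trainData);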
Machine learning models in ML.NET are built with pipelines, which are sequences of data-loading, transformation, and learning components.
My pipeline has the following components:
CopyColumns, which copies the FareAmount column to a new column called Label. This Label column holds the actual taxi fare that the model has to predict.
A group of three OneHotEncodings to perform one-hot encoding on the three columns that contain enumerative data: VendorId, RateCode, and PaymentType. This is a required step because machine learning models cannot handle enumerative data directly.
Concatenate, which combines all input data columns into a single column called Features. This is a required step because ML.NET can only train on a single input column.
AppendCacheCheckpoint, which caches all data in memory to speed up the training process.
A final FastTree regression learner, which will train the model to make accurate predictions.
The FastTreeRegressionTrainer is a very nice training algorithm that uses gradient boosting, a machine learning technique for regression problems.
A gradient boosting algorithm builds up a collection of weak regression models. It starts out with a weak model that tries to predict the taxi fare. Then it adds a second model that attempts to correct the error in the first model. And then it adds a third model, and so on.
The result is a fairly strong prediction model that is actually just an ensemble of weaker prediction models stacked on top of each other.
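To make the ensemble idea concrete, here is a toy illustration (not ML.NET code, and not the library’s internals): each weak model contributes a small correction on top of the models that came before it.

using System;
using System.Collections.Generic;

// toy sketch: the final fare is the sum of many weak models, each one
// trained to fix the residual error of the ensemble built so far
float PredictFare(List<Func<TaxiTrip, float>> weakModels, TaxiTrip trip)
{
    float fare = 0f;
    foreach (var weakModel in weakModels)
        fare += weakModel(trip); // each model nudges the estimate closer to the target
    return fare;
}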
With the pipeline fully assembled, I can train the model on the training partition with a call to Fit().
I now have a fully trained model. So now I need to load some validation data, predict the taxi fare for each trip, and calculate the accuracy of my model:
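Once more, the original snippet is gone, so this is a hedged sketch. One version note: the metric properties discussed below were called Rms, L1 and L2 in early previews; in ML.NET 1.x they are RootMeanSquaredError, MeanAbsoluteError and MeanSquaredError.

// score every taxi trip in the 20% test partition
IDataView predictions = model.Transform(testData);

// compare the predictions to the actual fares and compute regression metrics
var metrics = mlContext.Regression.Evaluate(predictions, labelColumnName: "Label");

Console.WriteLine($"RMSE (Rms): {metrics.RootMeanSquaredError}");
Console.WriteLine($"MAE (L1): {metrics.MeanAbsoluteError}");
Console.WriteLine($"MSE (L2): {metrics.MeanSquaredError}");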
This code calls Transform(…) to set up predictions for every single taxi trip in the test partition. The Evaluate(…) method then compares these predictions to the actual taxi fares and automatically calculates three very handy metrics for me:
Rms: this is the root mean square error or RMSE value. It’s the go-to metric in the field of machine learning to evaluate models and rate their accuracy. RMSE represents the length of a vector in n-dimensional space, made up of the error in each individual prediction.
L1: this is the mean absolute prediction error, expressed in dollars.
L2: this is the mean square prediction error, or MSE value. Note that RMSE and MSE are related: RMSE is just the square root of MSE.
To wrap up, let’s use the model to make a prediction.
I’m going to take a standard taxi trip for 19 minutes. I’ll be the only passenger and I’ll pay by credit card.
Here’s how to make the prediction:
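The prediction snippet is missing as well, so here’s a sketch; the sample values are placeholders of my own rather than the author’s exact inputs (rate code 1 = standard, payment type 1 = credit card).

// build a prediction engine for single, on-demand predictions
var engine = mlContext.Model.CreatePredictionEngine<TaxiTrip, TaxiTripFarePrediction>(model);

var trip = new TaxiTrip
{
    VendorId = "1",        // placeholder vendor
    RateCode = "1",        // standard rate
    PassengerCount = 1,    // single passenger
    TripDistance = 3.75f,  // placeholder distance for the trip
    PaymentType = "1",     // credit card
    FareAmount = 0         // unknown; this is what the model predicts
};

var prediction = engine.Predict(trip);
Console.WriteLine($"Predicted fare: {prediction.FareAmount:0.##}");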
I use the CreatePredictionEngine<…>(…) method to set up a prediction engine. The two type arguments are the input data class and the class to hold the prediction. And once my prediction engine is set up, I can simply call Predict(…) to make a single prediction.
I know that this trip is supposed to cost $15.50. How accurate will the model prediction be?
Here’s the code running in the Visual Studio Code debugger on my Mac: | https://medium.com/machinelearningadvantage/use-c-and-ml-net-machine-learning-to-predict-taxi-fares-in-new-york-519546f52591 | ['Mark Farragher'] | 2019-11-19 15:11:07.808000+00:00 | ['Machine Learning', 'Artificial Intelligence', 'Deep Learning', 'Csharp', 'Data Science'] | Title Use C MLNET Machine Learning Predict Taxi Fares New YorkContent I’m using awesome Rainbow CSV plugin Visual Studio Code highlighting CSV data file nice color lot column interesting information data file focusing following Column 0 data provider vendor ID Column 3 Number passenger Column 4 Trip distance Column 5 rate code standard JFK Newark … Column 9 Payment type credit card cash … Column 10 Fare amount I’ll build machine learning model C use column 0 3 4 5 9 input use predict taxi fare every trip I’ll compare predicted fare actual taxi fare column 10 evaluate accuracy model use NET Core build app NET Core really cool It’s multiplatform version NET framework run flawlessly Windows OSX Linux I’m using 30 preview Mac right haven’t touched Windows 10 virtual machine day Here’s set new console project NET Core dotnet new console PricePrediction cd PricePrediction Next need install MLNET NuGet package dotnet add package MicrosoftML I’m ready add class I’ll need one hold taxi trip one hold model’s prediction modify Programcs file like TaxiTrip class hold one single taxi trip Note field adorned Column attribute tell CSV data loading code column import data I’m also declaring TaxiTripFarePrediction class hold single fare prediction I’m going load training data memory code set TextLoader load CSV data memory Note column data type you’d expect except RateCode PaymentType column hold numeric value I’m loading string field reason I’m RateCode enumeration following value 1 standard 2 JFK 3 Newark 4 Nassau 5 negotiated 6 group PaymentType defined follows 1 Credit card 2 Cash 3 charge 4 Dispute 5 Unknown 6 Voided trip actual number don’t mean anything context certainly don’t want machine learning model start believing trip Newark three time important standard fare converting value string perfect trick show model RateCode PaymentType label underlying number don’t mean anything TextLoader set single call Load sufficient load entire data file memory single data file calling TrainTestSplit set training partition 80 data test partition remaining 20 data often see 8020 split data science it’s common approach train test model I’m ready start building machine learning model Machine learning model MLNET built pipeline sequence dataloading transformation learning component pipeline following component CopyColumns copy FareAmount column new column called Label Label column hold actual taxi fare model predict copy FareAmount column new column called Label Label column hold actual taxi fare model predict group three OneHotEncodings perform one hot encoding three column contains enumerative data VendorId RateCode PaymentType required step machine learning model cannot handle enumerative data directly perform one hot encoding three column contains enumerative data VendorId RateCode PaymentType required step machine learning model cannot handle enumerative data directly Concatenate combine input data column single column called Features required step MLNET train single input column combine input data column single column called Features required step MLNET train single input column AppendCacheCheckpoint cache data memory speed training process cache data memory speed 
training process final FastTree regression learner train model make accurate prediction FastTreeRegressionTrainer nice training algorithm us gradient boosting machine learning technique regression problem gradient boosting algorithm build collection weak regression model start weak model try predict taxi fare add second model attempt correct error first model add third model result fairly strong prediction model actually ensemble weaker prediction model stacked top pipeline fully assembled train model training partition call Fit fully trained model need load validation data predict taxi fare trip calculate accuracy model code call Transform… set prediction every single taxi trip test partition Evaluate… method compare prediction actual taxi fare automatically calculates three handy metric Rms root mean square error RMSE value It’s goto metric field machine learning evaluate model rate accuracy RMSE represents length vector ndimensional space made error individual prediction root mean square error RMSE value It’s goto metric field machine learning evaluate model rate accuracy RMSE represents length vector ndimensional space made error individual prediction L1 mean absolute prediction error expressed dollar mean absolute prediction error expressed dollar L2 mean square prediction error MSE value Note RMSE MSE related RMSE square root MSE wrap let’s use model make prediction I’m going take standard taxi trip 19 minute I’ll passenger I’ll pay credit card Here’s make prediction use CreatePredictionEngine…… method set prediction engine two type argument input data class class hold prediction prediction engine set simply call Predict… make single prediction know trip supposed cost 1550 accurate model prediction Here’s code running Visual Studio Code debugger MacTags Machine Learning Artificial Intelligence Deep Learning Csharp Data Science |
2,015 | 5 Life Lessons from 5 Years at VaynerMedia | This week marked my 5th anniversary (or Vaynerversary, as we call it) at a company I love: VaynerMedia. It’s a feat only a handful (no pun intended) of others have achieved to date, and one in which I happen to be quite proud. It reminds me of so much, and all the experiences, lessons and amazing friendships that have come out of it are invaluable.
A photo I took of the VaynerMedia office in Tribeca (Oct, 2010)
As someone who’s about to turn 30 (sigh), 5 years shouldn’t be all that transformative, but time just doesn’t work that way here.
Here, you can hold 6 different job titles in 5 years. You can watch a 20 person team grow into a 500+ one. You can move to 4 separate offices. And you can open 3 new ones (with one on the way). Time isn’t supposed to work that way, right?
I came here in 2010 because I heard Gary V. wanted to build the biggest building in town, and I truly believed he (we) would. I still believe it now.
When you’re lucky enough to be a fly on the wall at a fast-paced company, you undoubtedly pick up some valuable insights and knowledge. In an effort to share some of mine, I’ve written out 5 key things I learned along the way.
These are in no particular order, as I think they’re all super important. You may recognize a few, since some stem from philosophical things I know GV’s spoken about publicly over the years. What can I say? The guy’s quotable… | https://medium.com/the-ascent/5-life-lessons-from-5-years-at-vaynermedia-448844af2606 | ['Steve Campbell'] | 2019-12-10 23:15:41.591000+00:00 | ['Entrepreneurship', 'Startup', 'Life'] | Title 5 Life Lessons 5 Years VaynerMediaContent week marked 5th anniversary Vaynerversary call company love VaynerMedia It’s feat handful pun intended others achieved date one happen quite proud reminds much experience lesson amazing friendship come invaluable photo took VaynerMedia office Tribeca Oct 2010 someone who’s turn 30 sigh 5 year shouldn’t transformative time doesn’t work way hold 6 different job title 5 year watch 20 person team grow 500 one move 4 separate office open 3 new one one way Time isn’t supposed work way right came 2010 heard Gary V wanted build biggest building town truly believed would still believe you’re lucky enough fly wall fastpaced company undoubtedly pick valuable insight knowledge effort share mine I’ve written 5 key thing learned along way particular order think they’re super important may recognize since stem philosophical thing know GV’s spoken publicly year say guy’s quotable…Tags Entrepreneurship Startup Life |
2,016 | What is Google Kubernetes Engine (GKE)? | You can also check out the explainer video where I walk through these concepts in detail
Explainer video on the topic — “What is Google Kubernetes Engine?”
Next steps
If you like this #GCPSketchnote then subscribe to my YouTube channel 👇 where I post a sketchnote on one topic every week!
Follow my website for downloads and prints👇
If you have thoughts or ideas on other topic that you might find helpful in this format, please drop them in comments below! | https://medium.com/google-cloud/what-is-google-kubernetes-engine-gke-d2cb2d17178d | ['Priyanka Vergadia'] | 2020-12-10 05:41:11.400000+00:00 | ['Kubernetes', 'Containers', 'Google Cloud Platform', 'Cloud Computing', 'Cloud'] | Title Google Kubernetes Engine GKEContent also checkout explainer video walk concept detail Explainer video topic — “What Google Kubernetes Engine” Next step like GCPSketchnote subscribe YouTube channel 👇 post sketchnote one topic every week Follow website downloads prints👇 thought idea topic might find helpful format please drop comment belowTags Kubernetes Containers Google Cloud Platform Cloud Computing Cloud |
2,017 | Artificial Intelligence Is Pioneering Advances in Ecology | This blog post was originally published here on CloudOps’ blog.
The GitHub repo for this project can be found at: https://github.com/TristansCloud/YellowstonesVegitiation
“Remote sensing is the acquisition of information about an object or phenomenon without making physical contact with the object… [and] generally refers to the use of satellite or aircraft based sensor technologies.”
It’s the lazy person’s data collection. The ‘scan the entire world every day’ data collection. Remote sensing has given us a continuous stream of data on the state of the world, revolutionizing agriculture, international defence, environmental monitoring, crisis management, telecommunications, weather forecasting, firefighting, and many other fields. Any application that can be framed in a spatial context has likely benefited from advances in remote sensing.
As an ecologist, I’ve watched my field use remote sensing technology to monitor global forest cover change and harmful algae blooms, estimate populations of endangered species, and designate for protection the areas most important for ecosystem functioning.
Not wanting to be left out, I’ve been thinking about what remote sensing can bring to my own research interests. I study the processes that drive evolutionary change at the intersection between evolutionary biology and ecology called eco-evolutionary dynamics. I am particularly interested in the non-living factors that structure an ecosystem: how does the intersection between terrain, climate, geochemistry, and human disturbance (technically a living factor, but a special case) determine what organisms will be living there?
I am also very interested in machine learning solutions and applications of big data. A question naturally arose: can I use remote sensing and machine learning to link my non-living predictors to the resulting ecosystem at a large scale and across different ecosystem types?
To do this, I first had to define my predictors and response variables. To start simple, I chose my response to be open source NDVI images from the Landsat 8 satellite, which photographs the majority of the globe every 16 days. NDVI stands for normalized difference vegetation index, a measure of how much vegetation is in a given area. Plants absorb photosynthetically active light and reflect near infrared light, so the difference between these two wavelengths is a measure of how much healthy plant material is present. I chose my predictor to be a digital elevation model (DEM) and ignored climate, geochemistry and human activity at this first attempt. I selected a study area that should have little deviation in climate, since on this first try at building the model I only wanted to focus on terrain effects. However, I wanted to design a data pipeline that could easily be expanded once I wanted to address more complex predictors and responses.
Kubernetes provided an ideal solution, allowing me to create pods to complete each step in my preprocessing pipeline and delete those resources when no longer needed.
The USGS hosts an API for downloading Landsat 8 images, which I accessed through the programming language R in the download pod. The data was then unzipped in a new pod, and finally my predictor DEM was layered on the NDVI image in a final pod to create my prepared data. This final step is easily expandable to layer on different predictors as I expand the project.
Neural Networks
The hypothesis I wanted to test for this analysis is how well I can predict vegetation growth from a DEM, and how transferable that model is from one area to another, assuming climate and other factors stay the same. To test this, I downloaded NDVI images from two national parks in the Northwestern USA: Salmon Challis national park and Yellowstone national park.
I chose these two places as they are in a similar geographic area so likely have similar climates. They also are both mountainous regions and cover similar amounts of land. Finally, they are both national parks and should have pristine ecosystems relatively removed from human interference (although there was some agriculture and human settlements in both areas).
I selected two scenes, one from Salmon Challis and one from Yellowstone, from the same year and season. My plan was to build a fully convolutional neural net (CNN) and train it on tensors from Salmon Challis to predict tensors in Yellowstone. By taking 51 pixel by 51 pixel sections of the NDVI and DEM images, I created 17,500 individual tensors for Salmon Challis and 17,500 individual tensors for Yellowstone. I then built a CNN to take two inputs, the 51 x 51 DEM as well as a 51 x 51 pixel low resolution DEM that covered a much larger area, in case the large scale geographic features surrounding an area are important to predicting vegetation. The output of the model is the 51 x 51 pixel NDVI image.
Through trial and error I found that an inception framework, inspired by GoogLeNet, improved the stability of model predictions. Inception branches tensors into separate processing pathways, possibly allowing models to understand different features in the data all while maintaining a computationally efficient network. GoogLeNet won the ImageNet Large-Scale Visual Recognition Challenge in 2014 with this style of architecture, and I recommend reading the original research paper (which is not overly technical) of this and other models if you are interested in learning more about network architecture effects on neural network performance. UNet, a separate convolutional network, also inspired some of my architecture.
Overall, the model performance surpasses a classic technique based on generalized additive models (GAMs) but is still unsatisfactory, getting quite a few images wrong. See the final figure for some of the model predictions. I wanted to teach myself to design neural networks, so for now I am avoiding transfer learning from a pretrained network although this is still an option. Through GCP, CloudOps, and the huge amounts of remote sensing data generated daily, I have the resources and data to improve this model. I would love to pass many more predictors, and increase the width and depth of my neural network. Take a look at my github if you would like to see my actual layers in the neural network or the Kubernetes solution I use to download data. I’m still working on this project, so my network architecture may have evolved a bit. Good luck in your own data science adventures!
Kubernetes and cloud native technologies have allowed scientists to store and make sense of the data collected by remote sensing. Nonetheless, these technologies can be difficult to learn and master. CloudOps’ DevOps workshops will deepen your understanding of cloud native technologies with hands-on training. Take our 3-day Docker and Kubernetes workshop to get started using containers in development or production, or our 2-day Machine Learning workshop to make your ML workflows simple, portable, and scalable with Kubernetes and other open source tools.
Tristan Kosciuch
Tristan is an evolutionary biologist interested in the effects of landscape levels on genetic and phenotypic variation. He works in Vancouver Island on threespine stickleback and in the Lake Victoria basin on Nile perch and haplochromine cichlids. His work on stickleback uses remote sensing to quantify environments to test the predictability of evolution.
This blog post was originally published here on CloudOps’ blog.
Sign up for CloudOps’ monthly newsletter to stay up to date with the latest DevOps and cloud native developments. | https://medium.com/datadriveninvestor/artificial-intelligence-is-pioneering-advances-in-ecology-5bd86d2ab8e1 | [] | 2020-11-07 04:00:34.291000+00:00 | ['Kubernetes', 'Data Pipeline', 'Neural Networks', 'Artificial Intelligence', 'Ecology'] | Title Artificial Intelligence Pioneering Advances EcologyContent Artificial Intelligence Pioneering Advances Ecology CloudOps Follow Oct 16 · 6 min read blog post originally published CloudOps’ blog GitHub repo project found httpsgithubcomTristansCloudYellowstonesVegitiation “Remote sensing acquisition information object phenomenon without making physical contact object… generally refers use satellite aircraft based sensor technologies” It’s lazy person data collection ‘scan entire world every day’ data collection Remote sensing given u continuous stream data state world revolutionizing agriculture international defence environmental monitoring crisis management telecommunication weather forecasting firefighting many field application framed spatial context likely benefited advance remote sensing ecologist field able monitor global forest cover change harmful algae bloom estimate population endangered specie designate area important ecosystem functioning protected use remote sensing technology wanting left I’ve thinking remote sensing bring research interest study process drive evolutionary change intersection evolutionary biology ecology called ecoevolutionary dynamic particularly interested nonliving factor structure ecosystem intersection terrain climate geochemisty human disturbance technically living factor special case determine organism living also interested machine learning solution application big data question naturally arose use remote sensing machine learning link nonliving predictor resulting ecosystem large scale across different ecosystem type first define predictor response variable start simple chose myresponse open source NDVI image Landsat 8 satellite photograph majority globe every 16 day NDVI stand normalized difference vegetation index measure much vegetation given area Plants absorb photosynthetically active light reflect near infrared light difference two wavelength measure much healthy plantmaterial present chose predictor digital elevation model DEM ignored climate geochemistry human activity first attempt selected area study little deviation climate across study area start wanted focus terrain affect first try building model However wanted design data pipeline could easily expanded wanted address complex predictor response Kubernetes provided ideal solution allowing create pod complete step preprocessing pipeline delete resource longer needed USGS host API downloading Landsat 8 image accessed programming language R executed download pod data unzipped new pod finally predictor DEM layered NDVI image final pod create prepared data final step easily expandable layer different predictor expand project Neural Networks hypothesis wanted test analysis well predict vegetation growth DEM transferrable model one area another assuming climate factor stay test downloaded NDVI image two national park Northwestern USA Salmon Challis national park Yellowstone national park chose two place similar geographic area likely similar climate also mountainous region cover similar amount land Finally national park pristine ecosystem relatively removed human interference although agriculture human settlement area selected two scene one 
Salmon Challis one Yellowstone year season plan build fully convolutional neural net CNN train tensor Salmon Challis predict tensor Yellowstone taking 51 pixel 51 pixel section NDVI DEM image created 17500 individual tensor Salmon Challis 17500 individual tensor Yellowstone built CNN take two input 51 x 51 DEM well 51 x 51 pixel low resolution DEM covered much larger area case large scale geographic feature surrounding area important predicting vegetation output model 51 x 51 pixelNDVI image trial error found inception framework inspired GoogLeNet improved stability model prediction Inception branch tensor separate processing pathway possibly allowing model understand different feature data maintaining computationally efficient network GoogLeNet ImageNet LargeScale Visual Recognition Challenge 2014 style architecture recommend reading original research paper overly technical model interested learning network architecture effect neural network performance UNet separate convolutional network also inspired architecture Overall model performance surpasses classic technique based generalized additive model GAMs still unsatisfactory getting quite image wrong See final figure model prediction wanted teach design neural network avoiding transfer learning pretrained network although still option GCP CloudOps huge amount remote sensing data generated daily resource data improve model would love pas many predictor increase width depth neural network Take look github would like see actual layer neural network Kubernetes solution use download data I’m still working project network architecture may evolved bit Good luck data science adventure Kubernetes cloud native technology allowed scientist store make sense data collected remote sensing Nonetheless technology difficult learn master CloudOps’ DevOps workshop deepen understanding cloud native technology handson training Take 3day Docker Kubernetes workshop get started using container development production 2day Machine Learning workshop make ML workflow simple portable scalable Kubernetes open source tool Tristan Kosciuch Tristan evolutionary biologist interested effect landscape level genetic phenotypic variation work Vancouver Island threespine stickleback Lake Victoria basin Nile perch haplochromine cichlid work stickleback us remote sensing quantify environment test predictability evolution blog post originally published CloudOps’ blog Sign CloudOps’ monthly newsletter stay date latest DevOps cloud native developmentsTags Kubernetes Data Pipeline Neural Networks Artificial Intelligence Ecology |
2,018 | Data science-Create Tailored Algorithms | Blackcoffer artificial intelligence solutions are easy to use out-of-the-box and are custom-tailored to each individual client’s needs. Our end-to-end AI-enabled platforms speed time to delivery, save costs, reduce risk, and deliver optimized results to give you an immediate competitive advantage and bolster your bottom line.
AI innovation enabled by faster processors, Big Data and novel algorithms
AI is “an area of computer science that deals with giving machines the ability to seem like they have human intelligence”.
Read More | https://medium.com/data-analytics-and-ai/data-science-create-tailored-algorithms-e4f4365e4496 | ['Ella William'] | 2019-06-14 11:11:30.590000+00:00 | ['Artificial Intelligence', 'Analytics', 'Data Science', 'Big Data'] | Title Data scienceCreate Tailored AlgorithmsContent Blackcoffer artificial intelligence solution easy use outofthebox custom tailored individual client’s need endtoend AI enabled platform speed time delivery save cost reduce risk deliver optimized result give immediate competitive advantage bolster bottom line AI innovation enabled faster processor Big Data novel algorithm AI “an area computer science deal giving machine ability seem like human intelligence” Read MoreTags Artificial Intelligence Analytics Data Science Big Data |
2,019 | A Vacation to Mars: The Biggest Scam in Modern History | A Vacation to Mars: The Biggest Scam in Modern History
Project Mars One
A depiction of what the Mars One habitable home on Mars could look like (Source: Mars One)
Colonization has been in the blood of humans for thousands of years, and with Earth’s population reaching its highest peak as well as its ecosystem leaning towards a decline, some people are already thinking of the possibility of moving to a different planet. People such as Elon Musk have shown the possibility of such a project because the required technology is here, but it is still too expensive to “mass produce.”
However, one man in 2011 wanted to bring this vision or dream closer to “reality,” or, better said, to the foolishness of rich investors. Bas Lansdorp was the co-founder and CEO of the private organization Mars One. Before Mars One, Lansdorp became a successful entrepreneur in the western world, proving his ability to not only raise companies but also capital.
Mars One started with Lansdorp’s dream of colonizing Mars and making it a habitable space for humans to live on. As ambitious as this sounded, he had spent quite a few years around space engineers and scientists who saw his drive for this project and therefore supported his vision. As an entrepreneur he knew that this project would require an enormous amount of investment, so he used his entrepreneurial skills to look for people that had the two things he needed:
A desire to move to a different planet. Lots of money!
The money was never seen on paper
Lansdorp came up with an estimate of six billion dollars to start the first missions and get a habitat going that could produce food and allow those who moved there to live with no support from Earth. During the first two years from the start of the company, over 220,000 people invested large amounts of money. These people were promised a chance to end up on Mars, as only a few would be selected at the beginning, and over the years more and more would be able to migrate to the new planet.
Over the years the company kept receiving investments from people all around the world, but they never signed a contract. On paper, the company (Mars One) wasn’t even registered. For eight years, they claimed to be a real company, hiring hundreds of personnel; however, there were only five people in the venture: Lansdorp and four other people who are believed to be his friends.
“Since we started Mars One in March 2011, we received support from scientists, engineers, businessmen and –women and aerospace companies from all over the world. The announcement of our plan in May 2012 resulted in the engagement of the general public, and the support from sponsors and investors. To see our mission evolve this way feels like my dream is becoming a reality.” (taken from Mars One website, written by Bas Lansdorp.)
Bas Lansdorp as a keynote speaker (Source: Mars One)
In order to show his legitimacy, he offered to give free speeches to various organizations about the Mars One project and his vision. Such a humble man. Everything he was doing seemed legitimate, but some felt a bit skeptical about this whole project, as it simply sounded a bit too good to be true. Therefore they started to look up information about the company. As the company was private, this meant that most of the information was also private. However, this didn’t stop some investors who were wondering where their money actually went.
The man behind the company was a mastermind in marketing, as he publicized everything on a professional level whilst using his background as an academic to create credibility around the fake company. The publicity created around the company made people think that his project was real.
Prestigious space and technology institutions, however, thought differently. Research carried out by MIT (Massachusetts Institute of Technology) showed that even if his people made it to Mars, they would die after 68 days of living on the planet, just because of the extremely low temperatures. To combat this, Lansdorp promised all the investors that the project would be finished by 2027.
The company was also backed up by lots of international space organizations, which shows how Lansdorp used this to not only build credibility around the project but also give a reason for investors to trust him with their money. Just look at this introduction video to the project from 2012 to see how convincing it is.
In January of 2019, the company was declared bankrupt, with its private bank account showing the company $25,000 in debt. Since then, there has been no information about Lansdorp or Mars One, as every asset owned by the company was liquidated.
But what happened with all the money raised?
The accounts were never publicly shown, but people do speculate that Mars One raised a few billion dollars. It is believed that Lansdorp took all the money and left the public media. At the same time maybe he just moved to Mars by himself, at the end of the day that was his dream. | https://medium.com/history-of-yesterday/a-vacation-to-mars-the-biggest-scam-in-modern-history-d9d191ed79a8 | ['Andrei Tapalaga'] | 2020-12-18 21:02:19.482000+00:00 | ['Money', 'Space', 'History', 'Marketing', 'Entrepreneurship'] | Title Vacation Mars Biggest Scam Modern HistoryContent Vacation Mars Biggest Scam Modern History Project Mars One depiction Mars One habitable home Mars could look like Source Mars One Colonization blood human thousand year Earth’s population reaching highest peak well ecosystem leaning towards decline people already thinking possibility moving different planet People Elon Musk shown possibility project required technology still expensive “mass produce” However one man 2011 wanted bring vision dream closer “reality” better said foolishness rich investor Bas Lansdorp cofounder CEO private organization Mars One Mars One Lansdrop became successful entrepreneur western world proving ability raise company also capital Mars One started Lansdorp’s dream colonizing Mars making habitable space human live ambitious sounded spent quite year nearspace engineer scientist saw drive project therefore supported vision entrepreneur knew project would require enormous amount investment therefore used entrepreneurial skill look people two thing needed desire move different planet Lots money money never seen paper Lansdorp came estimate six billion dollar start first mission get habitat going could produce food allow moved live support Earth first two year start company 220000 people invested large amount money people promised chance end Mars would selected beginning year would able migrate new planet year company kept receiving investment people around world never signed contract paper company Mars One wasn’t even registered eight year claimed real company hiring hundred personnel however four people venture Lansdorp four people believed friend “Since started Mars One March 2011 received support scientist engineer businessmen –women aeropace company world announcement plan May 2012 resulted engagement general public support sponsor investor see mission evolve way feel like dream becoming reality” taken Mars One website written Bas Lansdorp Bas Lansdorp keynote speaker Source Mars One order show legitimacy offered give free speech various organization Mars One project vision humble man Everything seemed legitimate felt bit skeptical whole project simply sounded bit good true Therefore started look information company company private meant information also private However didn’t stop investor wondering money actually went man behind company mastermind marketing publicized everything professional level whilst using background academic create credibility around fake company publicity created around company made people think project real Although space technology prestigious institution thought differently Research carried MIT Massachusetts Institute Technology show even people made Mars would die 68 day living planet extremely low temperature combat Lansdorp promised investor project would finished 2027 company also backed lot international space organization show Lansdorp used build credibility around project also give reason investor trust money look introduction video project 2012 see convincing January 
2019 company declared bankrupt private bank account showing company 25000 debt Since information Lansdorp Mars One every asset owned company liquified happened money raised account never publicly shown people speculate Mars One raised billion dollar believed Lansdorp took money left public medium time maybe moved Mars end day dreamTags Money Space History Marketing Entrepreneurship |
2,020 | Happiness in Ordinary Things | Happiness is when you slip between fresh, air-dried sheets after bathing in scented oils. It’s the yellow flame of a candle on a winter’s night while the wind whistles outside your door and you snuggle inside.
“The east wind breaks over the branch that twists, an ocean of waves among the thicket. And as the last bird sings, notes splash into the sky, washing the sunset with salty tears to drown the day.” BW
When monochrome the day slides under trees, deep into their roots, and evening spreads her star-blanket wide, creeping over each sleeping house and prowling cat.
Dawn inches, shy into the foliage, licking every grass and berry crimson and dropping diamonds web-ward, startling spiders fast into morning’s welcome.
Waves that lap on the shore gently, enticing you to take off your shoes and dip in your toes. Orange and crimson sunsets that race across the sky, and gusts whistling through wheat in a field, making it dance.
“Each blade of emerald that swabs the dawn meadow. Every thicket flower, the sunset and the alpine grove that plump the evening forest — even the morsel carried by the ant trailing in the dust — brings beauty.” BW
The organic curve and beauty of a snail’s shell, so simple but perfect in every way. Red dresses, and generous carpet bags — think of Mary Poppins. The bright eyes and giggle of a toddler who finds simple things hilarious.
Delicious comfort food and a blazing log fire in the winter. Christmas lights strung across streets and carol singing. Festive get-togethers with people I love.
“A faint drizzle, a haze glistening, drenches down so soft like a mist of all the mornings ever uttered from the mouth of creation to the grass that sways and poppies that paint the meadow.” BW
Warm summer rain, and heat that thaws your bones after a chill wind. Writing, literature, and creativity. Puddles to splash in, kites, wigwams, and rainbows. The scent of cinnamon, fresh coffee, cut grass, and geranium oil.
Trampolines, bouncy castles, real castles, sunsets, sunrises, dawn before anyone else is up apart from the birds. Meditation, piano music, the saxophone.
Colorful vegetables, flowers of all kinds, secret gardens, swings, hedgehogs, and parrots. Beloved pets, close family, friends — the type that last forever — and people who make everyone laugh.
“Jar-wrapped, the herb and tomato fruits from the lingering summer scald, ripe red with luscious wine-scent and lemon, heaving and round as life, heavy and fat.
Pick as I may season’s last offering of scooped out September banquet — that lingering prize and rosette-laden plot still offers succulent squash and blooms. The basil, holy and cinnamon, thrives among the fennel, edible flowers, and figs.
And sweet peas, sunchokes, and okra splash the landscape with nature’s board.” BW
An artist’s pallet, scattered with color. A blank canvas, a blank page, and fresh stationary. Velvet, especially purple, blue, forest green, or deep red. Woodland, oak trees, dreams, and soft pillows.
Ducks that waddle, caterpillars — because of their potential — dragonflies, and moths. Stained glass and Tiffany lamps, yellow shoes, and bohemian art.
Crisp, corkscrew leaves of orange and gold that swoosh as you kick them and shuffle in their colorful carpet. Morning dew on emerald grass and dripping from tightly curled fern-fronds. These ordinary things are life’s treasures.
Fulfilling relationships, and closeness, when you know someone really, really well. Love, wherever it appears, and laughter. Stability and having needs met counts because it’s a strong foundation on which to build.
For me, happiness arises from understanding I am the creator of my well-being and control my emotional state. Not relying on anyone else to make me happy brings the joy of freedom and independence.
“I take a mental snapshot of the day as it pours warmth on bare-skinned knees and let the beechnuts crunching underfoot, birdsong, and indigo fields rise to nestle inside the tiny locker of a brain region meant for wonders. It’s then I spy a butterfly-filled canopy flutter at the oak’s crown. So much to paste within, and hold tight, lest it slips into the abyss.” BW
Everything, each occasion or being that inspires my happiness is filtered through my perception. Others reflect the thoughts I entertain most often. So, if I ever feel less than happy, I know the problem isn’t a lack of outside stimulus; I need to tweak my mindset. Knowing this contributes to my happiness. How about you? | https://medium.com/the-bolt-hole/happiness-in-ordinary-things-e6412720ee5e | ['Bridget Webber'] | 2020-10-21 11:10:28.600000+00:00 | ['Self Improvement', 'Philosophy', 'Lifestyle', 'Psychology', 'Mental Health'] | Title Happiness Ordinary ThingsContent Happiness slip fresh airdried sheet bathing scented oil It’s yellow flame candle winter’s night wind whistle outside door snuggle inside “The east wind break branch twist ocean wave among thicket last bird sings note splash sky washing sunset salty tear drown day” BW monochrome day slide tree deep root evening spread starblanket wide creeping sleeping house prowling cat Dawn inch shy foliage licking every grass berry crimson dropping diamond webward startling spider fast morning’s welcome Waves lap shore gently enticing take shoe dip toe Orange crimson sunset race across sky gust whistling wheat field making dance “Each blade emerald swab dawn meadow Every thicket flower sunset alpine grove plump evening forest — even morsel carried ant trailing dust — brings beauty” BW organic curve beauty snail’s shell simple perfect every way Red dress generous carpet bag — think Mary Poppins bright eye giggle toddler find simple thing hilarious Delicious comfort food blazing log fire winter Christmas light strung across street carol singing Festive gettogethers people love “A faint drizzle haze glistening drenches soft like mist morning ever uttered mouth creation grass sway poppy paint meadow” BW Warm summer rain heat thaw bone chill wind Writing literature creativity Puddles splash kite wigwam rainbow scent cinnamon fresh coffee cut grass geranium oil Trampolines bouncy castle real castle sunset sunrise dawn anyone else apart bird Meditation piano music saxophone Colorful vegetable flower kind secret garden swing hedgehog parrot Beloved pet close family friend — type last forever — people make everyone laugh “Jarwrapped herb tomato fruit lingering summer scald ripe red luscious winescent lemon heaving round life heavy fat Pick may season’s last offering scooped September banquet — lingering prize rosetteladen plot still offer succulent squash bloomsThe basil holy cinnamon thrives among fennel edible flower fig sweet pea sunchoke okra splash landscape nature’s board” BW artist’s pallet scattered color blank canvas blank page fresh stationary Velvet especially purple blue forest green deep red Woodland oak tree dream soft pillow Ducks waddle caterpillar — potential — dragonfly moth Stained glass Tiffany lamp yellow shoe bohemian art Crisp corkscrew leaf orange gold swoosh kick shuffle colorful carpet Morning dew emerald grass dripping tightly curled fernfronds ordinary thing life’s treasure Fulfilling relationship closeness know someone really really well Love wherever appears laughter Stability need met count it’s strong foundation build happiness arises understanding creator wellbeing control emotional state relying anyone else make happy brings joy freedom independence “I take mental snapshot day pours warmth bareskinned knee let beechnut crunching underfoot birdsong indigo field rise nestle inside tiny locker brain region meant wonder It’s spy butterflyfilled canopy flutter oak’s crown much paste within hold tight lest slip abyss” 
BW Everything occasion inspires happiness filtered perception Others reflect thought entertain often ever feel le happy know problem isn’t lack outside stimulus need tweak mindset Knowing contributes happiness youTags Self Improvement Philosophy Lifestyle Psychology Mental Health |
2,021 | Building scalable and efficient ML Pipelines | Using Kubernetes to build ML pipelines that scale
Kubernetes is the gold standard for managing tons of containerized applications, whether they are in the cloud or on your own hardware. Whether it is pipeline building, model building or ML application building, Kubernetes enables containerization, which is a safe way to build and scale any of these scenarios.
Kubernetes can host several packaged and pre-integrated data and data science frameworks on the same cluster. These are usually scalable or they auto-scale, and they’re defined/managed with a declarative approach: specify what your requirements are and the service will continuously seek to satisfy them, which provides resiliency and minimizes manual intervention.
KubeFlow is an open source project that groups leading relevant Kubernetes frameworks. KubeFlow components include Jupyter notebooks, KubeFlow Pipelines (workflow and experiment management), scalable training services (utilized for TensorFlow, PyTorch, Horovod, MXNet, Chainer) and model serving solutions. KubeFlow also offers examples and pre-integrated/tested components.
In addition to typical data science tools, Kubernetes can host data analytics tools such as Spark or Presto, various databases and monitoring/logging solutions such as Prometheus, Grafana and Elastic Search as well. It also enables the use of serverless functions (i.e. auto built/deployed/scaled code like AWS Lambda) for a variety of data-related tasks or APIs/model serving.
The key advantage of Kubernetes over proprietary cloud or SaaS solutions is that its tools are regularly added and upgraded; a Google search or Stack Overflow is often the fastest path to help; and the solution can be deployed anywhere (on any cloud service, on-premises, or even on your own laptop). A community project also forces associated components/services to conform to a set of standards/abstractions, which simplifies interoperability, security, and monitoring, and in turn benefits everyone.
Bringing efficiency to ML pipelines
Unfortunately, just betting on a credible platform like Kubernetes is not enough. Life for data and engineering teams gets easier once we adopt three guiding rules:
1. Optimize for functionality — Create reusable abstract functions/steps which can accept parameters.
2. Build for scalability — Apply parallelism to every step (or as often as possible, within reason).
3. Automation — Avoid manual and repetitive tasks by using declarative semantics and workflows.
The current trend in data science is to build “ML factories,” similar to agile software development, build automated pipelines which take data, pre-process it, run training, generate, deploy and monitor the models. The declarative and automated deployment and scaling approach offered by Kubernetes is a great baseline, but it’s missing a way to manage such pipelines on top of that.
A relatively new tool which is part of the KubeFlow project is Pipelines, a set of services and UI aimed at creating and managing ML pipelines. We can write our own code or build from a large set of pre-defined components and algorithms contributed by companies like Google, Amazon, Microsoft, IBM or NVIDIA.
Kubeflow Pipelines UI
Once we have a workflow, we can run it once, at scheduled intervals, or trigger it automatically. The pipelines, experiments and runs are managed, and their results are stored and versioned. Pipelines solve the major problem of reproducing and explaining our ML models. It also means we can visually compare between runs and store versioned input and output artifacts in various object/file repositories.
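To make this concrete, here is a minimal sketch of how such a pipeline can be defined and compiled with the KubeFlow Pipelines SDK (kfp, v1-style API); the container images, step names and arguments are hypothetical:
import kfp
from kfp import dsl

@dsl.pipeline(name='train-pipeline', description='Pre-process data, then train a model')
def train_pipeline():
    # Each logical step runs as its own container on the cluster
    prep = dsl.ContainerOp(
        name='preprocess',
        image='myorg/preprocess:latest',  # hypothetical image
        arguments=['--input', '/data/raw', '--output', '/data/clean'],
    )
    train = dsl.ContainerOp(
        name='train',
        image='myorg/train:latest',  # hypothetical image
        arguments=['--data', '/data/clean'],
    )
    train.after(prep)  # declare the dependency between the two steps

if __name__ == '__main__':
    # Produce a package that the Pipelines UI can run, schedule and version
    kfp.compiler.Compiler().compile(train_pipeline, 'train_pipeline.yaml')
The compiled package is what the Pipelines service stores, versions, and executes for you.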
A major challenge is always running experiments and data processing at scale. Pipelines orchestrate various horizontal-scaling and GPU-accelerated data and ML frameworks. A single logical pipeline step may run on a dozen parallel instances of TensorFlow, Spark, or Nuclio functions. Pipelines also have components which map to existing cloud services, so that we can submit a logical task which may run on a managed Google AI and data service, or on Amazon’s SageMaker or EMR.
KubeFlow and its Pipelines, like most tools in this category, are still evolving, but it has a large and vibrant multi-vendor community behind it. This guarantees a viable and open framework. Much like the first days of Kubernetes, cloud providers and software vendors had their proprietary solutions for managing containers, and over time they’ve all given way to the open source standard demanded by the community. | https://medium.com/acing-ai/building-scalable-and-efficient-ml-pipelines-a9f61d2ecbbd | ['Vimarsh Karbhari'] | 2020-09-18 15:22:26.109000+00:00 | ['Machine Learning', 'Artificial Intelligence', 'Kubernetes', 'Interview', 'Data Science'] |
2,022 | Why React16 is a blessing to React developers | Just like how people are excited about updating their mobile apps and OS, developers should also be excited to update their frameworks. New versions of these frameworks come with new features and tricks out of the box.
Below are some of the good features you should consider when migrating your existing app to React 16 from React 15.
Time to say Goodbye React15 👋
Error Handling
Error Handling be like :)
React 16 introduces the new concept of an error boundary.
Error boundaries are React components that catch JavaScript errors anywhere in their child component tree. They log those errors, and display a fallback UI instead of the crashed component tree. Error boundaries catch errors during rendering, in lifecycle methods, and in constructors of the whole tree below them.
A class component becomes an error boundary if it defines a new lifecycle method called componentDidCatch(error, info):
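A minimal sketch of such a boundary, in the spirit of the official docs (logErrorToMyService is a hypothetical logging helper):
class ErrorBoundary extends React.Component {
  constructor(props) {
    super(props);
    this.state = { hasError: false };
  }
  componentDidCatch(error, info) {
    // Display fallback UI on the next render
    this.setState({ hasError: true });
    // Optionally report the error somewhere (hypothetical helper)
    logErrorToMyService(error, info);
  }
  render() {
    if (this.state.hasError) {
      // Render any custom fallback UI you like
      return <h1>Something went wrong.</h1>;
    }
    return this.props.children;
  }
}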
Then you can use it as a regular component.
<ErrorBoundary>
  <MyWidget />
</ErrorBoundary>
The componentDidCatch() method works like a JavaScript catch {} block, but for components. Only class components can be error boundaries. In practice, most of the time you’ll want to declare an error boundary component once. Then you’ll use it throughout your application.
Note that error boundaries only catch errors in the components below them in the tree. An error boundary can’t catch an error within itself. If an error boundary fails trying to render the error message, the error will propagate to the closest error boundary above it. This, too, is similar to how catch {} block works in JavaScript.
Check out the live demo:
ComponentDidCatch
For more information on error handling, head here.
New render return types: fragments and strings
Get rid of wrapping the component in a div while rendering.
You can now return an array of elements from a component’s render method. Like with other arrays, you’ll need to add a key to each element to avoid the key warning:
render() {
  // No need to wrap list items in an extra element!
  return [
    // Don't forget the keys :)
    <li key="A">First item</li>,
    <li key="B">Second item</li>,
    <li key="C">Third item</li>,
  ];
}
Starting with React 16.2.0, there is support for a special fragment syntax in JSX that doesn’t require keys.
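A quick sketch of that shorthand syntax:
render() {
  return (
    <>
      <li>First item</li>
      <li>Second item</li>
      <li>Third item</li>
    </>
  );
}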
Support for returning strings:
render() {
  return 'Look ma, no spans!';
}
Portals
Portals provide a first-class way to render children into a DOM node that exists outside the DOM hierarchy of the parent component.
ReactDOM.createPortal(child, container)
The first argument (child) is any renderable React child, such as an element, string, or fragment. The second argument (container) is a DOM element.
How to use it
When you return an element from a component’s render method, it’s mounted into the DOM as a child of the nearest parent node:
render() {
  // React mounts a new div and renders the children into it
  return (
    <div>
      {this.props.children}
    </div>
  );
}
Sometimes it’s useful to insert a child into a different location in the DOM:
render() {
  // React does *not* create a new div. It renders the children into `domNode`.
  // `domNode` is any valid DOM node, regardless of its location in the DOM.
  return ReactDOM.createPortal(
    this.props.children,
    domNode
  );
}
A typical use case for portals is when a parent component has an overflow: hidden or z-index style, but you need the child to visually “break out” of its container. For example, dialogs, hovercards, and tooltips.
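As an illustrative sketch, a Modal component could render its children into a dedicated node outside the app root (here, an assumed <div id="modal-root"> in the host page):
class Modal extends React.Component {
  render() {
    // Escape the parent's overflow/z-index context by rendering elsewhere
    return ReactDOM.createPortal(
      <div className="modal">{this.props.children}</div>,
      document.getElementById('modal-root')
    );
  }
}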
Portals
Custom DOM Attributes
React15 used to ignore any unknown DOM attributes. It would just skip them, since React didn’t recognize them.
// Your code:
<div mycustomattribute="something" />
Would render an empty div to the DOM with React 15:
// React 15 output:
<div />
In React16, the output will be the following (custom attributes will be shown and not be ignored at all):
// React 16 output:
<div mycustomattribute="something" />
Avoid Re-renders by Setting NULL in State
With React16 you can prevent state updates and re-renders right from setState(). You just need to have your function return null.
const MAX_PIZZAS = 20;
function addAnotherPizza(state, props) {
  // Stop updates and re-renders if I've had enough pizzas.
  if (state.pizza === MAX_PIZZAS) {
    return null;
  }
  // If not, keep the pizzas coming! :D
  return {
    pizza: state.pizza + 1,
  };
}
this.setState(addAnotherPizza);
Read more here.
Creating Refs
Creating refs with React16 is now much easier. Why you need to use refs:
Managing focus, text selection, or media playback.
Triggering imperative animations.
Integrating with third-party DOM libraries.
Refs are created using React.createRef() and are attached to React elements via the ref attribute. Refs are commonly assigned to an instance property when a component is constructed so they can be referenced throughout the component.
class MyComponent extends React.Component {
  constructor(props) {
    super(props);
    this.myRef = React.createRef();
  }
  render() {
    return <div ref={this.myRef} />;
  }
}
Accessing Refs
When a ref is passed to an element in render, a reference to the node becomes accessible at the current attribute of the ref.
const node = this.myRef.current;
The value of the ref differs depending on the type of the node:
When the ref attribute is used on an HTML element, the ref created in the constructor with React.createRef() receives the underlying DOM element as its current property.
When the ref attribute is used on a custom class component, the ref object receives the mounted instance of the component as its current property.
You may not use the ref attribute on functional components because they don’t have instances.
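Putting refs to work for the focus-management case mentioned above, a small sketch (the component name is illustrative):
class TextInput extends React.Component {
  constructor(props) {
    super(props);
    this.inputRef = React.createRef();
  }
  render() {
    return (
      <div>
        <input ref={this.inputRef} />
        {/* current points at the underlying input DOM node */}
        <button onClick={() => this.inputRef.current.focus()}>
          Focus the input
        </button>
      </div>
    );
  }
}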
Context API
Context provides a way to pass data through the component tree without having to pass props down manually at every level.
React.createContext
const {Provider, Consumer} = React.createContext(defaultValue);
Creates a { Provider, Consumer } pair. When React renders a context Consumer, it will read the current context value from the closest matching Provider above it in the tree.
The defaultValue argument is only used by a Consumer when it does not have a matching Provider above it in the tree. This can be helpful for testing components in isolation without wrapping them. Note: passing undefined as a Provider value does not cause Consumers to use defaultValue.
Provider
<Provider value={/* some value */}>
A React component that allows Consumers to subscribe to context changes.
Accepts a value prop to be passed to Consumers that are descendants of this Provider. One Provider can be connected to many Consumers. Providers can be nested to override values deeper within the tree.
Consumer
<Consumer>
  {value => /* render something based on the context value */}
</Consumer>
A React component that subscribes to context changes.
Requires a function as a child. The function receives the current context value and returns a React node. The value argument passed to the function will be equal to the value prop of the closest Provider for this context above in the tree. If there is no Provider for this context above, the value argument will be equal to the defaultValue that was passed to createContext().
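Tying the two halves together, a small theme example (the component names are illustrative):
const ThemeContext = React.createContext('light'); // 'light' is the default value

function App() {
  return (
    <ThemeContext.Provider value="dark">
      <Toolbar />
    </ThemeContext.Provider>
  );
}

function Toolbar() {
  return (
    <ThemeContext.Consumer>
      {theme => <button className={theme}>Themed button</button>}
    </ThemeContext.Consumer>
  );
}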
static getDerivedStateFromProps()
getDerivedStateFromProps is invoked right before calling the render method, both on the initial mount and on subsequent updates. It should return an object to update the state, or null to update nothing.
This method exists for rare use cases where the state depends on changes in props over time. For example, it might be handy for implementing a <Transition> component that compares its previous and next children to decide which of them to animate in and out.
Deriving state leads to verbose code and makes your components difficult to think about.
Make sure you’re familiar with simpler alternatives:
If you need to perform a side effect (for example, data fetching or an animation) in response to a change in props, use the componentDidUpdate lifecycle instead.
If you want to re-compute some data only when a prop changes, use a memoization helper instead.
If you want to “reset” some state when a prop changes, consider either making a component fully controlled or fully uncontrolled with a key instead.
This method doesn’t have access to the component instance. If you’d like, you can reuse some code between getDerivedStateFromProps() and the other class methods by extracting pure functions of the component props and state outside the class definition.
Note that this method is fired on every render, regardless of the cause. This is in contrast to UNSAFE_componentWillReceiveProps, which only fires when the parent causes a re-render and not as a result of a local setState.
We compare nextProps.someValue with this.props.someValue. If the two are different, we perform some operation and call setState:
static getDerivedStateFromProps(nextProps, prevState) {
  if (nextProps.someValue !== prevState.someValue) {
    return { someState: nextProps.someValue };
  } else {
    return null;
  }
}
It receives two params, nextProps and prevState. As mentioned previously, you cannot access this inside this method. You’ll have to store the props in the state to compare the nextProps with the previous props. In the above code, nextProps and prevState are compared. If they are different, an object will be returned to update the state. Otherwise, null will be returned, indicating that no state update is required. If the state changes, componentDidUpdate is called, where we can perform the desired operations as we did in componentWillReceiveProps.
Bonus: React Lifecycle events
Lifecycle credits — https://twitter.com/dceddia
Well these are some of the features that you should definitely try while working with React16!
Happy coding 💻 😀 | https://medium.com/free-code-camp/why-react16-is-a-blessing-to-react-developers-31433bfc210a | ['Harsh Makadia'] | 2018-10-09 16:56:50.455000+00:00 | ['React', 'Technology', 'Productivity', 'Tech', 'Programming'] |
2,023 | How to Write a Fundraising Letter: The Best Donor Appeals Include 5 Key Elements | The donor was ready to sell stock to give a $200,000 gift because he knew the education institute needed the money. But it was March 2020, and markets were collapsing. Why would he be so generous at such an uncertain time?
“I’ve found when I fear the Lord, nothing else in the world frightens me,’’ he answered. “But when I stop thinking about God? Then everything else frightens me.’’
Despite a pandemic that kept 4 billion people in lock-down, several nonprofits found it was the perfect time to send out fundraising appeal letters.
For example, when Orchard Lake Schools had to cancel three of its biggest fundraising events (the Ambassador’s Ball, Founder’s Day, and the St. Mary’s Polish Country Fair), a fundraising letter went out spelling out all the numbers: how much was lost and what was needed. Money poured in.
Similarly, our friend Al Kresta at Ave Mario Radio emailed supporters saying $300,000 was needed “to make up the deficit we incurred by canceling our Spring membership drive.’’
Within the first 24 hours, $115,000 came in, and Al wrote donors back the very next day thanking them for that great start, letting them know “we have about 185,000 dollars to go. That’s a great beginning response.’’ He added:
“The question is whether Ave Maria Radio will be able to shine like a matchhead or like a floodlight. We will continue bearing witness to Jesus, the Light of the World. The difference will be the degree of effectiveness.’’
The very best donor appeal letters include these five elements
1. What is the problem or opportunity for good?
You have six seconds to get their attention. So a good appeal starts with the headlines, which are typically half the story. Your headline and body type get to the point:
What is the problem your donor can tackle with a gift? Or if it’s not a problem, what is the opportunity to do good? Past giving history and a clear understanding of who you are writing to are essential.
2. What is your organization doing about it?
Your letter has to show how you are doing something that matters. Ideally, you are helping people in a way no one else can and can demonstrate exactly how you solve a problem or offer an opportunity for good.
3. What is this going to cost?
Return on investment is key. For example, if you write “we can feed a child for just $7.32 per day,’’ the donor thinks, “that’s less than I’d pay to take a child to dinner.’’ They do the math and see the value.
Many years back, our friends at Orchard Lake “sold’’ a donor on a plan to build new tennis courts, but when he saw the price, he immediately said, “that’s way too much money for tennis courts.’’
4. How is this opportunity different from all the others?
Your story shapes your identity, and story+identity should tell you about your mission. Every day, we are bombarded with appeal letters and emails, asking us for money.
The essential question of marketing, religion, education, and nonprofits: Why? Why this cause over some other worthy cause they are about? Why your group over a rival?
5. Why now?
If you don’t give your readers a deadline, they could throw your letter on the “not sure. This can wait — maybe I’ll think about it’’ pile. Once you're in the “maybe’’ pile, you’re likely to get buried in the clutter.
Political fundraisers do well because they include a deadline: most donors know when Election Day and even campaign fundraising deadlines fall.
Key takeaway: We throw away form letters — we cherish love letters
We cherish love letters and throw away form letters. The most personal and moving appeals move people. The best fundraisers are matchmakers, connecting the donor with the group that most moves them. The more personal the connection, the quicker you are to cement the appeal. | https://medium.com/the-partnered-pen/how-to-write-a-fundraising-letter-the-best-donor-appeals-include-5-key-elements-2b5b593d538b | ['Joseph Serwach'] | 2020-10-06 23:05:02.932000+00:00 | ['Marketing', 'Education', 'Work', 'Fundraising', 'Writing'] | Title Write Fundraising Letter Best Donor Appeals Include 5 Key ElementsContent donor ready sell stock give 200000 gift knew education institute needed money March 2020 market collapsing would generous uncertain time “I’ve found fear Lord nothing else world frightens me’’ answered “But stop thinking God everything else frightens me’’ Despite pandemic kept 4 billion people lockdown several nonprofit found perfect time send fundraising appeal letter example Orchard Lake Schools cancel three biggest fundraising event Ambassador’s Ball Founder’s Day St Mary’s Polish Country Fair fundraising letter went spelling number much lost needed Money poured Similarly friend Al Kresta Ave Mario Radio emailed supporter saying 300000 needed “to make deficit incurred canceling Spring membership drive’’ Within first 24 hour 115000 came Al wrote donor back next day thanking great start letting know “we 185000 dollar go That’s great beginning response’’ added “The question whether Ave Maria Radio able shine like matchhead like floodlight continue bearing witness Jesus Light World difference degree effectiveness’’ best donor appeal letter include five element 1 problem opportunity good six second get attention good appeal start headline typically half story headline body type get point problem donor tackle gift it’s problem opportunity good Past giving history clear understanding writing essential 2 organization letter show something matter Ideally helping people way one else demonstrate exactly solve problem offer opportunity good 3 going cost Return investment key example write “we feed child 732 per day’’ donor think “that’s le I’d pay take child dinner’’ math see value Many year back friend Orchard Lake “sold’’ donor plan build new tennis court saw price immediately said “that’s way much money tennis courts’’ 4 opportunity different others story shape identity storyidentity tell mission Every day bombarded appeal letter email asking u money essential question marketing religion education nonprofit cause worthy cause group rival 5 don’t give reader deadline could throw letter “not sure wait — maybe I’ll think it’’ pile youre “maybe’’ pile you’re likely get buried clutter Political fundraiser well include deadline donor know Election Day even campaign fundraising deadline fall Key takeaway throw away form letter — cherish love letter cherish love letter throw away form letter personal moving appeal move people best fundraiser matchmaker connecting donor group move personal connection quicker cement appealTags Marketing Education Work Fundraising Writing |
2,024 | How to Make the Most out of James Clear’s Atomic Habits | 3. Make It Simple
It’s totally normal, at the beginning of a habit, to find yourself looking ahead to where you might end up where you to succeed at sustaining a habit. It’s the same thing when you first walk into a gym.
The first time you catch yourself in the mirror at your gym, mid-repetition through your favorite exercise (deadlift of course), you can’t help yourself but really look into the reflection and see what you will look like if you stay the course of a good gym habit.
But often, these high-goals can be the bane of the habit. Especially at the start where the results are likely to take their time to show themselves to you. Instead of waiting in the mirror, try instead to focus on ways of proactively pushing for the habit-building process to take stock in your life.
A strong method of practicing this is by setting yourself a no excuses framework around the habit you are setting out. This means to step back and see the variables that could jeopardize the habit from taking place.
Whether that’s being an inconvenience, or too time-consuming, there will always be friction against change that you want to set in your days.
If we see going to the gym as the habit for this case, a friction point could be the place. How long would it take me to get there? And if taken further, how long am I going to stay there? Easily enough, you’ll be fast to come up with plenty of resistance points that will push you not to do the habit.
This is where simplifying the process of the habit can help take some decision fatigue off your head.
Now, simplifying a habit is not making it necessarily easier. Rather, it’s removing any unnecessary friction from stopping you from the process of performing the habit.
You do not rise to the level of your goals. You fall to the level of your systems — James Clear
Rather than looking to go to the same gym that your buddies go to, that so happens to lie on the complete opposite side of the town to where you live, look up a nearby gym spot to go after work (or before, if you dare). Take the inconvenience out of the equation and make the commute less of a burden on yourself and more of a point of action on your part to be more intentional with your time.
At the same time that you’ll be economizing on your time, you would be simplifying the choice to go workout to make it unavoidable to yourself.
In addition, by simplifying the habit of its atomic habits (you knew this was coming somewhere) the overall habit becomes far easier to hold onto and sustain.
James’s Habit Tip: Make the habit simple enough to execute that no friction point has enough in it to doubt yourself attending to the habit in the first place. | https://medium.com/age-of-awareness/how-to-make-the-most-out-of-james-clears-atomic-habits-95691b421f37 | [] | 2020-11-18 04:43:07.684000+00:00 | ['Education', 'Learning', 'Habits', 'Productivity', 'Writing'] | Title Make James Clear’s Atomic HabitsContent 3 Make Simple It’s totally normal beginning habit find looking ahead might end succeed sustaining habit It’s thing first walk gym first time catch mirror gym midrepetition favorite exercise deadlift course can’t help really look reflection see look like stay course good gym habit often highgoals bane habit Especially start result likely take time show Instead waiting mirror try instead focus way proactively pushing habitbuilding process take stock life strong method practicing setting excuse framework around habit setting mean step back see variable could jeopardize habit taking place Whether that’s inconvenience timeconsuming always friction change want set day see going gym habit case friction point could place long would take get taken long going stay Easily enough you’ll fast come plenty resistance point push habit simplifying process habit help take decision fatigue head simplifying habit making necessarily easier Rather it’s removing unnecessary friction stopping process performing habit rise level goal fall level system — James Clear Rather looking go gym buddy go happens lie complete opposite side town live look nearby gym spot go work dare Take inconvenience equation make commute le burden point action part intentional time time you’ll economizing time would simplifying choice go workout make unavoidable addition simplifying habit atomic habit knew coming somewhere overall habit becomes far easier hold onto sustain James’s Habit Tip Make habit simple enough execute friction point enough doubt attending habit first placeTags Education Learning Habits Productivity Writing |
2,025 | Lesser-known techniques for data exploration | Exploratory data analysis (EDA) is essentially the first step in the machine learning pipeline. There are many techniques used for EDA, such as:
Checking all columns: name, type, segments
Setting expectation of what the variable might mean and how it may affect the target — and testing the hypothesis
Analyzing the target variable
Using describe() function in Pandas to get a summary of all variables
function in Pandas to get a summary of all variables Checking skewness and kurtosis
Creating scatter plots ( pairplot() in Seaborn is probably the easiest way), distribution plots and box plots
in Seaborn is probably the easiest way), distribution plots and box plots Creating correlation matrix (heat-map); zoomed heat-map if required
Creating scatter plots between the most correlated variables; contemplating whether the correlation makes sense or not
Check for missing data (if a column has more than 15% of missing data, it is probably better to delete the column instead of replacing the missing values)
Checking for outliers (uni-variate as well as bi-variate)
Apart from these, here are some lesser-known tips for EDA: | https://medium.com/bigbrownbag/lesser-known-techniques-for-data-exploration-23eeb6686a22 | ['Soham Ghosh'] | 2019-04-25 13:48:53.111000+00:00 | ['Exploratory Data Analysis', 'Machine Learning', 'Kaggle', 'Analytics', 'Data Science'] | Title Lesserknown technique data explorationContent Exploratory data analysis EDA essentially first step machine learning pipeline many technique used EDA Checking column name type segment Setting expectation variable might mean may affect target — testing hypothesis Analyzing target variable Using describe function Pandas get summary variable function Pandas get summary variable Checking skewness kurtosis Creating scatter plot pairplot Seaborn probably easiest way distribution plot box plot Seaborn probably easiest way distribution plot box plot Creating correlation matrix heatmap zoomed heatmap required Creating scatter plot correlated variable contemplating whether correlation make sense Check missing data column 15 missing data probably better delete column instead replacing missing value Checking outlier univariate well bivariate Apart lesserknown tip EDATags Exploratory Data Analysis Machine Learning Kaggle Analytics Data Science |
2,026 | Planet OS Data Challenge at ExpeditionHack NYC | We’re thrilled to be part of the Expedition Hackathon NYC happening on November 12–13! This is your chance to map the future of sustainability with NGA, Mapbox, IBM Bluemix, Planet OS and others. The hackathon’s focus areas are Oceans, Forests, Conservation and Indigenous People.
To add some motivation to the hours of intense coding and hustling, we decided to put out tons of high-quality environmental data, data integration and computational infrastructure, and reward the best teams with some cool prizes.
All hackathon participants will get free, unlimited access to:
The prizes:
All teams that use our data tools will secure an unlimited free access to Planet OS data tools
data tools The team with the best solution will get special swag and surprises from Planet OS
The general Grand Prize of the hackathon is $3000 and a round trip to DC from NGA to meet NGA Executives.
We have already validated a few business ideas that the teams could work on. Stay tuned for updates! All the updates will be shared on this page so it would be wise to bookmark it. Contact us at [email protected] for further questions.
#PlanetOS #DataChallenge
About Planet OS:
Planet OS is the world’s leading provider of weather, environmental and geospatial data access and operational intelligence, based in Palo Alto, California. The company works with real-world industries — from energy, weather forecasting, agriculture, logistics to insurance — helping them become data-driven, mitigate risks and grow faster. The world’s second largest offshore wind farm runs on Planet OS Powerboard; and the company’s open data service, Datahub, provides access to thousands parameters of high-quality data collected by premier institutions around the world. | https://medium.com/planet-os/planet-os-data-challenge-at-expeditionhack-nyc-5e6a9f192956 | ['Planet Os'] | 2016-12-07 11:47:31.134000+00:00 | ['Hackathon', 'NYC', 'Sustainability', 'Big Data', 'Nga'] | Title Planet OS Data Challenge ExpeditionHack NYCContent We’re thrilled part Expedition Hackathon NYC happening November 12–13 chance map future sustainability NGA Mapbox IBM Bluemix Planet OS others hackathon’s focus area Oceans Forests Conservation Indigenous People add motivation hour intense coding hustling decided put ton highquality environmental data data integration computational infrastructure reward best team cool prize hackathon participant get free unlimited access prize team use data tool secure unlimited free access Planet OS data tool data tool team best solution get special swag surprise Planet OS general Grand Prize hackathon 3000 round trip DC NGA meet NGA Executives already validated business idea team could work Stay tuned update update shared page would wise bookmark Contact u aljashplanetoscom question PlanetOS DataChallenge Planet OS Planet OS world’s leading provider weather environmental geospatial data access operational intelligence based Palo Alto California company work realworld industry — energy weather forecasting agriculture logistics insurance — helping become datadriven mitigate risk grow faster world’s second largest offshore wind farm run Planet OS Powerboard company’s open data service Datahub provides access thousand parameter highquality data collected premier institution around worldTags Hackathon NYC Sustainability Big Data Nga |
2,027 | SEO For Startups with Naguib Toihiri | Raunak: For people who do not know what SEO is, could you give us a quick summary?
Naguib: If you do not know what SEO is, or you have a vague understanding, don’t worry, because it lies in one of the grey areas of the Marketing World. SEO stands for Search Engine Optimisation. The main objective of SEO is to attract quality traffic to your site. So whenever users search for something in Google, SEO is basically working towards making you rank first on it in the organic search results.
Raunak: And it’s basically a free of cost Marketing channel, am I right in saying that?
Naguib: Absolutely.
The way that I divide search engine marketing usually is into 3 parts:
1) Organic — which is SEO (Search Engine Optimisation)
2) Paid — SEA (Search Engine Advertising)
3) Social — SMO (Social Media Optimisation)
SEO is the organic form of marketing, which basically means that we do not pay Google to increase your visibility, we optimise your site to get it ranked higher on Google organically (without ad money).
Raunak: Just to give the readers a bird’s eye view, for a startup that is competing against big businesses on search; what should they be doing?
Naguib: First of all it is relative to your objectives and what results they are looking for.
If they are looking for immediate results, paid search is the way to go. Essentially you will be visible to your target audience the minute you implement your campaign.
For SEO, it will definitely, take time. The big BUT however is SEO does NOT depend on your budget. This is why I love SEO, it basically is a fair competition and is not relative to budget. Even if you do have a bigger budget, your website will not automatically be ranked higher. It’s a combination of technical, content and social know-how.
Aspects that are often overlooked like a mobile-friendly platform or page speed, are what make a big difference in the long-term visibility and technical rank of your site.
Another key tactic for startups to utilise, is pushing out relevant content. If you push more relevant content with the right keywords and structure, and push it out regularly, Google will recognise this and rank you higher.
Raunak: Alright, now you’ve told me once that there are thousands of SEO tools but most of them are not needed. What tools do you recommend?
Naguib: I will give you only 3 tools to focus on and yes all of them are free.
1. Mobile Friendly Tool — Google — This tool is especially relevant in this region and in this year with an incremental rise in sites competing for the same local market who by the way search as much on smartphones as they do on desktop.
2. Page Speed by Google — It audits your website for mobile and desktop loading speed. It gives a ranking out of 100. If you get a ranking of over 70–80, you’re doing fine, otherwise there is always room for optimisation.
3. Google Search Console — How Google considers your website, is if their seeing any crawl error, how many pages are indexed or not. It is a very critical tool for how Google analyses your site and what you can learn from it.
Where can someone looking to learn more get a chance to do that?
I will be instructing the Search Engine Optimization workshop at the upcoming Digital Marketing Track. Otherwise you can add me on LinkedIn to connect. | https://medium.com/astrolabs/seo-for-startups-with-naguib-toihiri-9a97b8b54936 | ['Raunak Datt'] | 2017-09-18 09:39:58.826000+00:00 | ['Marketing', 'Startup', 'Digital', 'Digital Marketing', 'SEO'] | Title SEO Startups Naguib ToihiriContent Raunak people know SEO could give u quick summary Naguib know SEO vague understanding don’t worry lie one grey area Marketing World SEO stand Search Engine Optimisation main objective SEO attract quality traffic site whenever user search something Google SEO basically working towards making rank first organic search result Raunak it’s basically free cost Marketing channel right saying Naguib Absolutely way divide search engine marketing usually 3 part 1 Organic — SEO Search Engine Optimisation 2 Paid — SEA Search Engine Advertising 3 Social — SMO Social Media Optimisation SEO organic form marketing basically mean pay Google increase visibility optimise site get ranked higher Google organically without ad money Raunak give reader bird’s eye view startup competing big business search Naguib First relative objective result looking looking immediate result paid search way go Essentially visible target audience minute implement campaign SEO definitely take time big however SEO depend budget love SEO basically fair competition relative budget Even bigger budget website automatically ranked higher It’s combination technical content social knowhow Aspects often overlooked like mobilefriendly platform page speed make big difference longterm visibility technical rank site Another key tactic startup utilise pushing relevant content push relevant content right keywords structure push regularly Google recognise rank higher Raunak Alright you’ve told thousand SEO tool needed tool recommend Naguib give 3 tool focus yes free 1 Mobile Friendly Tool — Google — tool especially relevant region year incremental rise site competing local market way search much smartphones desktop 2 Page Speed Google — audit website mobile desktop loading speed give ranking 100 get ranking 70–80 you’re fine otherwise always room optimisation 3 Google Search Console — Google considers website seeing crawl error many page indexed critical tool Google analysis site learn someone looking learn get chance instructing Search Engine Optimization workshop upcoming Digital Marketing Track Otherwise add LinkedIn connectTags Marketing Startup Digital Digital Marketing SEO |
2,028 | Are You the CEO of Your Writing Career? | Are You the CEO of Your Writing Career?
Take control of your writing job by acting as if you are a CEO
Photo by LinkedIn Sales Navigator on Unsplash
Are you the chief executive officer (CEO) of your writing career? To become better at writing, act as if you are the CEO of your writing business and look to grow it.
You should work with other writers, ask for help, listen, and make informed decisions. As the CEO of your writing career, you manage your writing like a business.
Here are five tips to help you become the CEO of your writing career, no matter if writing is your full-time job, a side hustle, or something you do for fun. | https://medium.com/change-your-mind/are-you-the-ceo-of-your-writing-career-2339ee20d12e | ['Matthew Royse'] | 2020-12-18 16:37:14.386000+00:00 | ['Mindset', 'Entrepreneurship', 'Careers', 'Inspiration', 'Writing'] | Title CEO Writing CareerContent CEO Writing Career Take control writing job acting CEO Photo LinkedIn Sales Navigator Unsplash chief executive officer CEO writing career become better writing act CEO writing business look grow work writer ask help listen make informed decision CEO writing career manage writing like business five tip help become CEO writing career matter writing fulltime job side hustle something funTags Mindset Entrepreneurship Careers Inspiration Writing |
2,029 | Don’t Be Fooled. Looking for Inspiration Doesn’t Work Anymore. | Don’t Be Fooled. Looking for Inspiration Doesn’t Work Anymore.
6 proven tactics to get your best work done in no time
Photo by Miriam Espacio from Pexels
I’m so done with it.
Every time I run out of ideas or my current ideas aren’t good enough, I switch on the Dora The Explorer-modus and I go surf the internet. Hoping to find the shiny coast of opportunity and inspiration. Better said — I browse YouTube.
I didn’t do the math, but I guess that in 2% of the cases it works.
I came to the conclusion that consuming more content isn’t the answer.
My desk is located next to my bookshelves and in those moments of incompetence I take out some books — interesting or not — and scroll through them. Trying to find inspiration as if the map to Mordor is hidden inside one of them.
The best part?
This strategy has a success ratio of 3,7% — not great either.
That leaves me with the last option: bang my head on the keyboard until I find something to write about. I’ve had days where this behavior took place 4 times a day, but in better times it only happened twice a week. Besides these mental shackles and some bruises, I’m fine. I guess.
Oh, the success ratio? Around 5%.
I’m not an idiot. This year, I’ve published over 65 articles, so there must be something that I do that enables me to consistently push out content. That got me thinking — what are my strategies whenever I’m stuck in this crazy content-block-limboland?
It turned out that there are 6 tactics I subconsciously apply every time I’m stuck. | https://medium.com/the-brave-writer/dont-be-fooled-looking-for-inspiration-doesn-t-work-anymore-9db4606d4653 | ['Jessie Van Breugel'] | 2020-12-21 17:03:04.865000+00:00 | ['Inspiration', 'Freelancing', 'Writing Tips', 'Entrepreneurship', 'Writing'] | Title Don’t Fooled Looking Inspiration Doesn’t Work AnymoreContent Don’t Fooled Looking Inspiration Doesn’t Work Anymore 6 proven tactic get best work done time Photo Miriam Espacio Pexels I’m done Every time run idea current idea aren’t good enough switch Dora Explorermodus go surf internet Hoping find shiny coast opportunity inspiration Better said — browse YouTube didn’t math guess 2 case work came conclusion consuming content isn’t answer desk located next bookshelf moment incompetence take book — interesting — scroll Trying find inspiration map Mordor hidden inside one best part strategy success ratio 37 — great either leaf last option bang head keyboard find something write I’ve day behavior took place 4 time day better time happened twice week Besides mental shackle bruise I’m fine guess Oh success ratio Around 5 I’m idiot year I’ve published 65 article must something enables consistently push content got thinking — strategy whenever I’m stuck crazy contentblocklimboland turned 6 tactic subconsciously apply every time I’m stuckTags Inspiration Freelancing Writing Tips Entrepreneurship Writing |
2,030 | Open Source Dataset for NLP Beginners | Nowadays, Natural language processing is an up growing and booming field of research, . Sometimes, it might be very confusing for an NLP beginner, which areas to explore and how to find the exact dataset for the implementation and hands-on experience.
This blog is focused to provide an overview of different free online datasets for NLP. It is quite difficult to compile all the datasets, as NLP is a broad range research area but keeping respective of a beginner in mind and general problem statements to be implemented at the starting point, I tried to build the following list.
Datasets for Sentiment Analysis
Applying Machine Learning for the Sentiment analysis task needs a large number of specialized datasets.
The following list should hint at some of the ways that you can improve your sentiment analysis algorithm.
Multidomain Sentiment Analysis Dataset: This dataset contains the features of a variety of product reviews taken from Amazon.
IMDB Reviews: This is relatively small dataset was compiled primarily for binary sentiment classification use cases which contain 25,000 movie reviews.
Stanford Sentiment Treebank: Also built from movie reviews, Stanford’s dataset was designed to train a model to identify sentiment in longer phrases. It contains over 10,000 snippets taken from Rotten Tomatoes.
Sentiment140: This dataset consists of 160,000 tweets formatted with 6 fields: polarity, ID, tweet date, query, user, and the text. Emoticons have been pre-removed.
Twitter US Airline Sentiment: This dataset contains tweets about US airlines that are classified as positive, negative, and neutral. Negative tweets have also been categorized by reason for complaint.
Text Datasets
Natural language processing is a massive field of research, but the following list includes a broad range of datasets for different natural language processing tasks, such as voice recognition and chatbots.
20 Newsgroups: This dataset is a collection of approximately 20,000 documents covers 20 different newsgroups, from baseball to religion.
ArXiv: This repository contains all of the arXiv research paper archive as full text, with a total dataset size of 270 GB.
Reuters News Dataset: The documents in this dataset appeared on Reuters in 1987. They have since been assembled and indexed for use in machine learning.
The WikiQA Corpus: This corpus is a publicly-available collection of questions and answers pairs. It was originally assembled for use in research on open-domain question answering.
UCI’s Spambase: This large spam email dataset is useful for developing personalized spam filters,created by a team at Hewlett-Packard,
Yelp Reviews: This open dataset released by Yelp contains more than 5 million reviews.
WordNet: Compiled by researchers at Princeton University, WordNet is essentially a large lexical database of English ‘synsets’, or groups of synonyms that each describe a different, distinct concept.
The Blog Authorship Corpus — This dataset includes over 681,000 posts written by 19,320 different bloggers. In total, there are over 140 million words within the corpus.
General — Datasets for Natural Language Processing
There are a few more datasets for natural language processing tasks that are commonly used in general.
Enron Dataset: This contains, roughly 500,000 messages from the senior management of Enron. This dataset is generally used by people who are are looking to improve or understand current email tools.
Amazon Reviews: This dataset contains around 35 million reviews from Amazon spanning a period of 18 years. It includes product and user information, ratings, and plaintext review.
Google Books Ngrams: A Google Books corpora of n-grams, or ‘fixed size tuples of items’, can be found at this link. The ’n’ in ‘n-grams’ specifies the number of words or characters in that specific tuple.
Blogger Corpus: This is a collection of 681,288 blog posts contains over 140 million words. Each blog included here contains at least 200 occurrences of common English words.
Wikipedia Links Data: Containing approximately 13 million documents, this dataset by Google consists of web pages that contain at least one hyperlink pointing to English Wikipedia. Each Wikipedia page is treated as an entity, while the anchor text of the link represents a mention of that entity.
Gutenberg eBooks List: This annotated list of ebooks from Project Gutenberg contains basic information about each eBook, organized by year.
Hansards Text Chunks of Canadian Parliament: This corpus contains 1.3 million pairs of aligned text chunks from the records of the 36th Canadian Parliament.
Jeopardy: The archive linked here contains more than 200,000 questions and answers from the quiz show Jeopardy. Each data point also contains a range of other information, including the category of the question, show number, and air date.
SMS Spam Collection in English: This dataset consists of 5,574 English SMS messages that have been tagged as either legitimate or spam. 425 of the texts are spam messages that were manually extracted from the Grumbletext website.
Depending on the problem statement you are aspiring to solve download the respective dataset and get explored more of yourself into world of NLP. | https://medium.com/swlh/you-c120c972f8c6 | ['Dr. Monica'] | 2020-06-28 12:47:31.914000+00:00 | ['Machine Learning', 'Python', 'Data Science', 'NLP', 'Artificial Intelligence'] | Title Open Source Dataset NLP BeginnersContent Nowadays Natural language processing growing booming field research Sometimes might confusing NLP beginner area explore find exact dataset implementation handson experience blog focused provide overview different free online datasets NLP quite difficult compile datasets NLP broad range research area keeping respective beginner mind general problem statement implemented starting point tried build following list Datasets Sentiment Analysis Applying Machine Learning Sentiment analysis task need large number specialized datasets following list hint way improve sentiment analysis algorithm Multidomain Sentiment Analysis Dataset dataset contains feature variety product review taken Amazon IMDB Reviews relatively small dataset compiled primarily binary sentiment classification use case contain 25000 movie review Stanford Sentiment Treebank Also built movie review Stanford’s dataset designed train model identify sentiment longer phrase contains 10000 snippet taken Rotten Tomatoes Sentiment140 dataset consists 160000 tweet formatted 6 field polarity ID tweet date query user text Emoticons preremoved Twitter US Airline Sentiment dataset contains tweet US airline classified positive negative neutral Negative tweet also categorized reason complaint Text Datasets Natural language processing massive field research following list includes broad range datasets different natural language processing task voice recognition chatbots 20 Newsgroups dataset collection approximately 20000 document cover 20 different newsgroups baseball religion ArXiv repository contains arXiv research paper archive full text total dataset size 270 GB Reuters News Dataset document dataset appeared Reuters 1987 since assembled indexed use machine learning WikiQA Corpus corpus publiclyavailable collection question answer pair originally assembled use research opendomain question answering UCI’s Spambase large spam email dataset useful developing personalized spam filterscreated team HewlettPackard Yelp Reviews open dataset released Yelp contains 5 million review WordNet Compiled researcher Princeton University WordNet essentially large lexical database English ‘synsets’ group synonym describe different distinct concept Blog Authorship Corpus — dataset includes 681000 post written 19320 different blogger total 140 million word within corpus General — Datasets Natural Language Processing datasets natural language processing task commonly used general Enron Dataset contains roughly 500000 message senior management Enron dataset generally used people looking improve understand current email tool Amazon Reviews dataset contains around 35 million review Amazon spanning period 18 year includes product user information rating plaintext review Google Books Ngrams Google Books corpus ngrams ‘fixed size tuples items’ found link ’n’ ‘ngrams’ specifies number word character specific tuple Blogger Corpus collection 681288 blog post contains 140 million word blog included contains least 200 occurrence common English word Wikipedia Links Data Containing approximately 13 million document dataset Google consists web page contain least one hyperlink pointing English Wikipedia Wikipedia 
page treated entity anchor text link represents mention entity Gutenberg eBooks List annotated list ebooks Project Gutenberg contains basic information eBook organized year Hansards Text Chunks Canadian Parliament corpus contains 13 million pair aligned text chunk record 36th Canadian Parliament Jeopardy archive linked contains 200000 question answer quiz show Jeopardy data point also contains range information including category question show number air date SMS Spam Collection English dataset consists 5574 English SMS message tagged either legitimate spam 425 text spam message manually extracted Grumbletext website Depending problem statement aspiring solve download respective dataset get explored world NLPTags Machine Learning Python Data Science NLP Artificial Intelligence |
2,031 | Could Netflix’s Big Mouth Be Oversimplifying Mental Illness? | Note: This article contains Big Mouth Season 4 spoilers.
Like many other people I know, I spent a good chunk of this weekend binging the entire newly released season of Big Mouth, an animated Netflix comedy about … puberty?
Well, maybe that definition isn’t giving the show enough credit.
Although Big Mouth does revolve around the life of tweens going through puberty, it also tackles topics surrounding relationships, sexual and gender identity, and mental health issues — which makes total sense, given that these are all things that come with growing up.
The show’s creators have brought anxiety and depression to life — literally.
More specifically, depression is “The Depression Kitty”, and she’s a giant purple cat that likes to pin you down and berate you with thoughts that are, well, depressing.
Anxiety is “Tito the Anxiety Mosquito” and to be honest, I kind of hate how accurately they brought anxiety to life as a character. Swarming the kids with his nervous energy and whispering anxious fears into their ears, Tito is truly the worst.
While these two foes have a good run on the show, one of their tormented victims, 13-year-old Jessi Glaser, learns to fight them off with “The Gratitoad”. No, that wasn’t a spelling error — gratitude is represented in the show by a talking toad (if you didn’t already sense that this show is all sorts of weird, here’s your cue to do so).
Although practicing expressing gratitude doesn’t get rid of Jessi’s depression and anxiety completely, they become fairly minimized (quite literally — in the season finale, we see the formerly massive depression kitty shrink down to the size of a cute little house cat).
So, combating depression and anxiety with gratitude — is this realistic?
Well, yes and no. While we can use grateful thinking to improve our relationships (both with ourselves and others), it’s unlikely to be the be-all and end-all of mental health cures.
A recent meta-analysis of 27 individual studies on gratitude and its effects on mental health, originally published in the Journal of Happiness Studies (and summarized in a Healthline article here), points to clear limits in how much gratitude can really accomplish.
While the studies were conducted differently, they all shared one thing: participants were asked to perform some kind of gratitude exercise. Whether it was writing a grateful letter and reading it to the recipient or listing out all the things that went well in a day, those participating in the studies (3,675 people in total) were all asked to practice gratitude.
After the experiments were up, psychologists analyzed the effect of these different gratitude exercises on the participants’ mental health — specifically as it pertained to their symptoms of depression and anxiety.
In short: The effects were insignificant.
As an article on Science Daily summarizes the results: “Go ahead and be grateful for the good things in your life. Just don’t think that a gratitude intervention will help you feel less depressed or anxious.”
This doesn’t mean that practicing gratitude can’t be impactful on your mindset and your relationships, just that it isn’t a viable treatment for depression and anxiety on its own (nor is it a viable replacement for proven treatments like cognitive behavioral therapy).
So what does this say about Big Mouth and The Gratitoad? How about media representation of mental health issues in general?
I’d say it remains to be seen.
The way I see it, the creators of the show didn’t make it seem like Jessi had entirely defeated her depression and anxiety through working with The Gratitoad, only that she had minimized them.
It’s how they’ll choose to move forward with her story in the next season that will really show how they view mental health issues. In my opinion, I think they gave us a hint of Jessi’s depression coming back with the return of the depression kitty in the final minutes of the season finale (albeit a smaller, less-threatening version of her).
Big Mouth has already made mental health-representation history — now let’s hope they don’t fuck it up.
Not a lot of shows have tried to tackle the progression of mental health issues like Big Mouth has — not with children in their early teens, and certainly not with animation to give these ethereal concepts more digestible names and faces.
In a weird way, this infamously uncensored, whacky, over-the-top cartoon has done a lot more for the on-screen representation of mental illness than many of its more “serious” counterparts.
But the creators of this show are walking a very fine line — the line between the benefit of making these concepts digestible and the danger of oversimplifying mental illness to the point of belittling it.
I never thought I’d say this, but for the sake of accurate representation, I hope the depression kitty and anxiety mosquito have some fight left in them in season five. | https://medium.com/an-injustice/could-netflixs-big-mouth-be-oversimplifying-mental-illness-a2eea1cdfb10 | ['Till Kaeslin'] | 2020-12-08 23:31:31.096000+00:00 | ['Entertainment', 'Mental Health', 'Psychology', 'TV Series', 'Gratitude'] | Title Could Netflix’s Big Mouth Oversimplifying Mental IllnessContent Note article contains Big Mouth Season 4 spoiler Like many people know spent good chunk weekend binging entire newly released season Big Mouth animated Netflix comedy … puberty Well maybe definition isn’t giving show enough credit Although Big Mouth revolve around life tweens going puberty also tackle topic surrounding relationship sexual gender identity mental health issue — make total sense given thing come growing show’s creator brought anxiety depression life — literally specifically depression “The Depression Kitty” she’s giant purple cat like pin berate thought well depressing Anxiety “Tito Anxiety Mosquito” honest kind hate accurately brought anxiety life character Swarming kid nervous energy whispering anxious fear ear Tito truly worst two foe good run show one tormented victim 13yearold Jessi Glaser learns fight “The Gratitoad” wasn’t spelling error — gratitude represented show talking toad didn’t already sense show sort weird here’s cue Although practicing expressing gratitude doesn’t get rid Jessi’s depression anxiety completely become fairly minimized quite literally — season finale see formerly massive depression kitty shrink size cute little house cat combating depression anxiety gratitude — realistic Well yes use grateful thinking improve relationship others it’s unlikely beall endall mental health cure recent metaanalysis conducted 27 individual study dealing gratitude effect mental health originally published Journal Happiness Studies summarized Healthline article point clear limit much gratitude really accomplish study conducted differently shared one thing participant asked perform kind gratitude exercise Whether writing grateful letter reading recipient listing thing went well day participating study 3675 people total asked practice gratitude experiment psychologist analyzed effect different gratitude exercise participants’ mental health — specifically pertained symptom depression anxiety short effect insignificant article Science Daily summarizes result “Go ahead grateful good thing life don’t think gratitude intervention help feel le depressed anxious” doesn’t mean practicing gratitude can’t impactful mindset relationship isn’t viable treatment depression anxiety viable replacement proven treatment like cognitive behavioral therapy say Big Mouth Gratitoad medium representation mental health issue general I’d say remains seen way see creator show didn’t make seem like Jessi entirely defeated depression anxiety working Gratitoad minimized It’s they’ll choose move forward story next season really show view mental health issue opinion think gave u hint Jessi’s depression coming back return depression kitty final minute season finale albeit smaller lessthreatening version Big Mouth already made mental healthrepresentation history — let’s hope don’t fuck lot show tried tackle progression mental health issue like Big Mouth — child early teen certainly animation give ethereal concept digestible name face weird way infamously uncensored whacky overthetop cartoon done lot onscreen representation mental 
illness many “serious” counterpart creator show walking fine line — line benefit making concept digestible danger oversimplifying mental illness point belittling never thought I’d say sake accurate representation hope depression kitty anxiety mosquito fight left season fiveTags Entertainment Mental Health Psychology TV Series Gratitude |
2,032 | Úll 2017: Call for Proposals | Úll 2017, April 10–11, Killarney, Ireland
Last year, for the first time, we had an open call for submissions to perform at Úll. The response was fantastic and the result was a line-up rich with varied life experiences, performance styles and areas of expertise. It allowed us to bring you a collection of speakers that pushed beyond the expected, celebrating a beautiful balance of fresh faces and friendly familiars.
This year, we want to extend that invitation again.
We are opening a call for submissions for three types of presentation:
Storytelling. Classic. Special Feature.
The theme this year is, simply, “The Future”.
Storytelling
This is a magical part of the programme. The brief is simple: we are looking for folks to tell a 10-minute story about “The Future”. This isn’t a typical presentation, rather a performance. There are no slides or elaborate visuals, just you and the audience.
Could you tell a tale that will delight folks? Do you have some novel insight or pearls of wisdom to share? Could you bring us on an adventure or just plain old inspire us? If so, we’d like to hear from you.
Apply to tell a story.
Classic
These presentations form the backbone of the conference. Guided by the theme of ‘The Future’, we want you to share your experience and insight. This year, we’ll once again be placing these under the banner of ‘The Builders’ and each presentation will last roughly 10–15 minutes. Presenters can follow a more traditional presentation format or get creative with slides, visuals and props.
Are you building an exciting new app? Have you learned valuable lessons while creating your product or building your business? Have you some interesting forecasts or predictions you’d like to share? If so, we’d like to hear from you.
Apply to present on The Builder’s Track.
Special Feature
The Úll Special Feature is an idea we have developed over the last few years. In a nutshell, rather than give you a stage and a timeslot, we give you a room and invite you to set it up with your presentation.
A Special Feature can be a regular conference talk that you record that attendees can walk in and watch. Or it could be an art installation that teaches attendees about hardware hacking. Or it could be a time machine that takes attendees back to their childhood and forward into old age.
This format is particularly attractive for those more introverted amongst us, or folks who prefer to create a more personal, intimate experience rather than performing on stage.
Can you create an experience that folks will remember long after the conference is over? Can you build an interactive installation? Prepare a talk that you pre-record? Do something completely original? If so, we’d like to hear from you.
Apply to present a Special Feature.
The Package
For anyone who presents at Úll, we will provide:
A travel allowance for flights
A full free ticket to the conference
Up to 2 nights accommodation at The Europe Hotel and Spa Resort (lakeview room with balcony)
Up to 2 nights accommodation in Dublin for the Fringe events
Train transportation to and from Killarney if you arrive in Dublin
Support
All storytellers and feature presenters will have access in advance to test out the AV, walk the stage, or set up their space.
We want everyone who presents at Úll to feel supported. We are happy to work with you on preparing your story or feature. Things we could help with:
Brainstorming ideas
Finding a mentor
Figuring out and budgeting for any additional AV requirements
Arranging suitable rehearsal time
Providing a volunteer to help with your feature
As much mutual support as we can muster
We’re here to support you in any way we can. Presentations form the structural core of a conference, and we want ours to be as strong as it can be.
Applications will be open until February 17, 2017 | https://medium.com/the-%C3%BAll-blog/%C3%BAll-2017-call-for-proposals-78c38480da90 | [] | 2017-01-31 22:22:37.963000+00:00 | ['Storytelling', 'Presentations', 'iOS', 'Apple', 'Conference'] | Title Úll 2017 Call ProposalsContent Úll 2017 April 10–11 Killarney Ireland Last year first time open call submission perform Úll response fantastic result lineup rich varied life experience performance style area expertise allowed u bring collection speaker pushed beyond expected celebrating beautiful balance fresh face friendly familiar year want extend invitation opening call submission three type presentation Storytelling Classic Special Feature theme year simply “The Future” Storytelling magical part programme brief simple looking folk tell 10 minute story “The Future” isn’t typical presentation rather performance slide elaborate visuals audience Could tell tale delight folk novel insight pearl wisdom share Could bring u adventure plain old inspire u we’d like hear Apply tell story Classic presentation form backbone conference Guided theme ‘The Future’ want share experience insight year we’ll placing banner ‘The Builders’ presentation last roughly 10–15 minute Presenters follow traditional presentation format get creative slide visuals prop building exciting new app learned valuable lesson creating product building business interesting forecast prediction you’d like share we’d like hear Apply present Builder’s Track Special Feature Úll Special Feature idea developed last year nutshell rather give stage timeslot give room invite set presentation Special Feature regular conference talk record attendee walk watch could art installation teach attendee hardware hacking could time machine take attendee back childhood forward old age format particularly attractive introverted amongst u folk prefer create personal intimate experience rather performing stage create experience folk remember long conference build interactive installation Prepare talk prerecord something completely original we’d like hear Apply present Special Feature Package anyone present Úll provide travel allowance flight full free ticket conference 2 night accommodation Europe Hotel Spa Resort lakeview room balcony 2 night accommodation Dublin Fringe event Train transportation Killarney arrive Dublin Support storyteller feature presenter access advance test AV walk stage set space want everyone present Úll feel supported happy work preparing story feature Things could help Brainstorming idea Finding mentor Figuring budgeting additional AV requirement Arranging suitable rehearsal time Providing volunteer help feature much mutual support muster We’re support way Presentations form structural core conference want strong Applications open February 17 2017Tags Storytelling Presentations iOS Apple Conference |
2,033 | The Bad Writing Habits We Learned in School: And Advice to Forget Them | Photo by Evan Leith on Unsplash
The Bad Writing Habits We Learned in School: And Advice to Forget Them
‘Good habits make time your ally. Bad habits make time your enemy.’
Intro: Why Term Papers Need to Go
If you’re an undergraduate student right now, you are probably consuming and sharing more forms of communication than at any time in history: texts, blogs, Instagram, tweets, TikTok, email, news.
You are a node in a fast-moving network of incoming and outgoing communication of all kinds.
In a society in which most of us are immersed in massive amounts of information, sociology professor Deborah Cohan writes, the power of writing lies not merely in the ability to absorb and recycle endless amounts of information, but more so: “to appreciate essence, nuance, and depth, to distill and focus on important points without convenient guides to translate all the ideas for [us].”
It’s with this ethos of what writing enables us to do that Cohan calls for the end of a modern staple of higher education: the end-of-semester, final ‘term paper.’
In her essay, The Case Against the Term Paper, Cohan writes: | https://medium.com/swlh/the-bad-writing-habits-we-learned-in-school-and-advice-to-forget-them-7662e7517e61 | ['Gavin Lamb'] | 2020-07-20 22:22:45.146000+00:00 | ['Writing Tips', 'Education', 'Productivity', 'Learning', 'Writing'] | Title Bad Writing Habits Learned School Advice Forget ThemContent Photo Evan Leith Unsplash Bad Writing Habits Learned School Advice Forget ‘Good habit make time ally Bad habit make time enemy’ Intro Term Papers Need Go you’re undergraduate student right probably consuming sharing form communication time history text blog Instagram tweet TikTok email news node fastmoving network incoming outgoing communication kind society u immersed massive amount information sociology professor Deborah Cohan writes power writing lie merely ability absorb recycle endless amount information “to appreciate essence nuance depth distill focus important point without convenient guide translate idea us” It’s ethos writing enables u Cohan call end modern staple higher education endofsemester final ‘term paper’ essay Case Term Paper Cohan writesTags Writing Tips Education Productivity Learning Writing |
2,034 | Machine Learning (ML) Algorithms For Beginners with Code Examples in Python | Machine Learning (ML) Algorithms For Beginners with Code Examples in Python
Best machine learning algorithms for beginners with coding samples in Python. Launch the coding samples with Google Colab
Author(s): Pratik Shukla, Roberto Iriondo, Sherwin Chen
Last updated, June 23, 2020
Machine learning (ML) is rapidly changing the world, from diverse types of applications and research pursued in industry and academia. Machine learning is affecting every part of our daily lives. From voice assistants using NLP and machine learning to make appointments, check our calendar, and play music, to programmatic advertisements — that are so accurate that they can predict what we will need before we even think of it.
More often than not, the complexity of the scientific field of machine learning can be overwhelming, making keeping up with “what is important” a very challenging task. To provide a learning path for those who seek to learn machine learning but are new to these concepts, this article looks at the most critical basic algorithms that will hopefully make your machine learning journey less challenging.
Any suggestions or feedback are crucial for us to continue to improve. Please let us know in the comments if you have any.
📚 Check out our tutorial diving into simple linear regression with math and Python. 📚
Index
Introduction to Machine Learning.
Major Machine Learning Algorithms.
Supervised vs. Unsupervised Learning.
Linear Regression.
Multivariable Linear Regression.
Polynomial Regression.
Exponential Regression.
Sinusoidal Regression.
Logarithmic Regression.
What is machine learning?
A computer program is said to learn from experience E with respect to some class of tasks T and performance measure P, if its performance at tasks in T, as measured by P, improves with experience E. ~ Tom M. Mitchell [1]
Machine learning behaves similarly to the growth of a child. As a child grows, her experience E in performing task T increases, which results in higher performance measure (P).
For instance, we give a “shape sorting block” toy to a child. (We all know that this toy has different shapes and corresponding shape holes.) In this case, our task T is to find an appropriate shape hole for each shape. The child observes a shape and tries to fit it into a shaped hole. Let us say that this toy has three shapes: a circle, a triangle, and a square. In her first attempt at finding a shaped hole, her performance measure (P) is 1/3, which means that the child found 1 out of 3 correct shape holes.
Next, the child tries the task again and notices that she has gained some experience at it. Considering the experience gained (E), she attempts the task once more, and when we measure her performance (P) this time, it turns out to be 2/3. After repeating this task (T) 100 times, the child has figured out which shape goes into which shape hole.
So as her experience (E) increased, her performance (P) also increased: the more attempts she makes at this toy, the better her performance, which results in higher accuracy.
Such execution is similar to machine learning. A machine takes a task (T), executes it, and measures its performance (P). A machine also has a large amount of data, so as it processes that data, its experience (E) increases over time, resulting in a higher performance measure (P). After going through all the data, our machine learning model’s accuracy increases, which means that the predictions made by our model will be very accurate.
Another definition of machine learning by Arthur Samuel:
Machine Learning is the subfield of computer science that gives “computers the ability to learn without being explicitly programmed.” ~ Arthur Samuel [2]
Let us try to understand this definition: it says “learn without being explicitly programmed”, which means that we are not going to teach the computer with a specific set of rules. Instead, we are going to feed the computer enough data and give it time to learn from it, making its own mistakes and improving upon them. For example, we did not teach the child how to fit in the shapes, but by performing the same task several times, the child learned to fit the shapes in the toy by herself.
Therefore, we can say that we did not explicitly teach the child how to fit the shapes. We do the same thing with machines: we give them enough data to work on and tell them what information we want from them. They process the data and learn to make accurate predictions.
Why do we need machine learning?
For instance, we have a set of images of cats and dogs. What we want to do is classify them into a group of cats and dogs. To do that we need to find out different animal features, such as:
How many eyes does each animal have?
What is the eye color of each animal?
What is the height of each animal?
What is the weight of each animal?
What does each animal generally eat?
We form a feature vector from the answers to each of these questions. Next, we apply a set of rules, such as:
If height > 1 foot and weight > 15 lbs, then it could be a cat.
Now, we have to make such a set of rules for every data point. Furthermore, we build a decision tree of if, else if, and else statements and check whether each animal falls into one of the categories.
Let us assume that the result of this experiment was not fruitful as it misclassified many of the animals, which gives us an excellent opportunity to use machine learning.
What machine learning does is process the data with different kinds of algorithms and tells us which feature is more important to determine whether it is a cat or a dog. So instead of applying many sets of rules, we can simplify it based on two or three features, and as a result, it gives us a higher accuracy. The previous method was not generalized enough to make predictions.
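To make this contrast concrete, here is a minimal sketch (all feature values are invented for illustration) of the hand-written rule from above versus a model that learns its own thresholds from labeled examples:
from sklearn.tree import DecisionTreeClassifier

# Invented feature vectors: [height_ft, weight_lbs]
animals = [[0.8, 9], [1.1, 16], [2.0, 60], [0.9, 10], [1.9, 55], [1.2, 18]]
labels = ['cat', 'cat', 'dog', 'cat', 'dog', 'dog']

# The hand-written rule from above: brittle, and we must pick thresholds ourselves
def rule_based(height, weight):
    if height > 1 and weight > 15:
        return 'cat'
    return 'dog'

# Learned rules: the tree picks the important features and thresholds itself
model = DecisionTreeClassifier().fit(animals, labels)
print(rule_based(1.9, 55))          # the hand-written rule misclassifies this dog
print(model.predict([[1.9, 55]]))   # the learned model gets it right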
Machine learning models help us in many tasks, such as:
Object Recognition
Summarization
Prediction
Classification
Clustering
Recommender systems
And others
What is a machine learning model?
A machine learning model is a question-answering system that takes care of processing machine-learning-related tasks. Think of it as an algorithmic system that represents data when solving problems. The methods we will tackle below are beneficial for industry-related purposes to tackle business problems.
For instance, let us imagine that we are working on Google Adwords’ ML system, and our task is to implement an ML algorithm that reaches a particular demographic or area using data. Such a task aims to go from using data to gathering valuable insights that improve business outcomes.
Major Machine Learning Algorithms:
1. Regression (Prediction)
We use regression algorithms for predicting continuous values.
Regression algorithms:
Linear Regression
Polynomial Regression
Exponential Regression
Logistic Regression
Logarithmic Regression
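For a taste of the regression family, here is a minimal linear regression on invented data with scikit-learn:
import numpy as np
from sklearn.linear_model import LinearRegression

# Invented data: years of experience vs. salary (a continuous target)
X = np.array([[1], [2], [3], [4], [5]])
y = np.array([30, 35, 41, 46, 50])

reg = LinearRegression().fit(X, y)
print(reg.coef_, reg.intercept_)  # learned slope and intercept
print(reg.predict([[6]]))         # predict a continuous value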
2. Classification
We use classification algorithms for predicting a set of items’ class or category.
Classification algorithms:
K-Nearest Neighbors
Decision Trees
Random Forest
Support Vector Machine
Naive Bayes
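And a minimal classification sketch with k-nearest neighbors, again on invented data:
from sklearn.neighbors import KNeighborsClassifier

# Invented data: two features per sample, with a class label for each
X = [[0, 0], [0, 1], [1, 0], [5, 5], [5, 6], [6, 5]]
y = [0, 0, 0, 1, 1, 1]

clf = KNeighborsClassifier(n_neighbors=3).fit(X, y)
print(clf.predict([[0.5, 0.5], [5.5, 5.5]]))  # -> [0 1]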
3. Clustering
We use clustering algorithms for summarization or to structure data.
Clustering algorithms:
K-means
DBSCAN
Mean Shift
Hierarchical
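Clustering needs no labels at all; here is a minimal k-means sketch on invented points:
from sklearn.cluster import KMeans

# Unlabeled points that form two loose groups
X = [[1, 2], [1, 4], [2, 3], [8, 8], [9, 10], [8, 9]]

km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
print(km.labels_)           # cluster assignment for each point
print(km.cluster_centers_)  # the learned group centers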
4. Association
We use association algorithms for associating co-occurring items or events.
Association algorithms:
Apriori
5. Anomaly Detection
We use anomaly detection for discovering abnormal activities and unusual cases like fraud detection.
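A minimal anomaly detection sketch with scikit-learn's IsolationForest (the transaction amounts are invented):
from sklearn.ensemble import IsolationForest

# Mostly ordinary transaction amounts, plus one suspicious outlier
X = [[100], [110], [95], [105], [98], [5000]]

iso = IsolationForest(random_state=0).fit(X)
print(iso.predict(X))  # 1 = normal, -1 = anomaly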
6. Sequence Pattern Mining
We use sequential pattern mining for predicting the next data events between data examples in a sequence.
7. Dimensionality Reduction
We use dimensionality reduction for reducing the size of data to extract only useful features from a dataset.
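A minimal dimensionality reduction sketch with PCA, compressing four invented features down to two components:
from sklearn.decomposition import PCA

# Four features per sample, reduced to two components
X = [[2.5, 2.4, 0.5, 0.7], [0.5, 0.7, 2.2, 2.9],
     [2.2, 2.9, 1.9, 2.2], [1.9, 2.2, 3.1, 3.0]]

pca = PCA(n_components=2)
X_reduced = pca.fit_transform(X)
print(X_reduced.shape)                # (4, 2)
print(pca.explained_variance_ratio_)  # variance kept per component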
8. Recommendation Systems
We use recommender algorithms to build recommendation engines.
Examples:
Netflix recommendation system.
A book recommendation system.
A product recommendation system on Amazon.
Nowadays, we hear many buzz words like artificial intelligence, machine learning, deep learning, and others.
What are the fundamental differences between Artificial Intelligence, Machine Learning, and Deep Learning?
📚 Check out our editorial recommendations on the best machine learning books. 📚
Artificial Intelligence (AI):
Artificial intelligence (AI), as defined by Professor Andrew Moore, is the science and engineering of making computers behave in ways that, until recently, we thought required human intelligence [4].
These include:
Computer Vision
Language Processing
Creativity
Summarization
Machine Learning (ML):
As defined by Professor Tom Mitchell, machine learning refers to a scientific branch of AI, which focuses on the study of computer algorithms that allow computer programs to automatically improve through experience [3].
These include:
Classification
Neural Network
Clustering
Deep Learning:
Deep learning is a subset of machine learning in which layered neural networks, combined with high computing power and large datasets, can create powerful machine learning models. [3]
Neural network abstract representation | Photo by Clink Adair via Unsplash
Why do we prefer Python to implement machine learning algorithms?
Python is a popular and general-purpose programming language. We can write machine learning algorithms using Python, and it works well. The reason why Python is so popular among data scientists is that Python has a diverse variety of modules and libraries already implemented that make our life more comfortable.
Let us have a brief look at some exciting Python libraries.
Numpy: It is a math library to work with n-dimensional arrays in Python. It enables us to do computations effectively and efficiently.
Scipy: It is a collection of numerical algorithms and domain-specific tool-boxes, including signal processing, optimization, statistics, and much more. Scipy is a functional library for scientific and high-performance computations.
Matplotlib: It is a trendy plotting package that provides 2D plotting as well as 3D plotting.
Scikit-learn: It is a free machine learning library for the Python programming language. It has most of the classification, regression, and clustering algorithms, and works with Python numerical libraries such as Numpy and Scipy.
Machine learning algorithms fall into two groups:
Supervised Learning algorithms
Unsupervised Learning algorithms
I. Supervised Learning Algorithms:
Goal: Predict class or value label.
Supervised learning is a branch of machine learning (perhaps the mainstream of machine/deep learning for now) concerned with inferring a function from labeled training data. Training data consists of a set of (input, target) pairs, where the input could be a vector of features, and the target instructs what we desire for the function to output. Depending on the type of the target, we can roughly divide supervised learning into two categories: classification and regression. Classification involves categorical targets; examples range from simple cases, such as image classification, to advanced topics, such as machine translation and image captioning. Regression involves continuous targets; its applications include stock prediction, image masking, and others, which all fall into this category.
To illustrate the example of supervised learning below | Source: Photo by Shirota Yuri, Unsplash
To understand what supervised learning is, we will use an example. For instance, we give a child 100 stuffed animals, ten of each kind: ten lions, ten monkeys, ten elephants, and so on. Next, we teach the kid to recognize the different types of animals based on different characteristics (features) of an animal. For example, if its color is orange, then it might be a lion; if it is a big animal with a trunk, then it may be an elephant.
We teach the kid how to differentiate animals; this is an example of supervised learning. Now when we give the kid different animals, he should be able to classify them into the appropriate animal group.
For the sake of this example, we notice that 8/10 of his classifications were correct, so we can say that the kid has done a pretty good job. The same applies to computers. We provide them with thousands of data points along with their actual labels (labeled data is data that has been classified into different groups together with its feature values). The machine then learns from these different characteristics during its training period. After the training period is over, we can use our trained model to make predictions. Keep in mind that we already fed the machine with labeled data, so its prediction algorithm is based on supervised learning. In short, we can say that the predictions in this example are based on labeled data.
Examples of supervised learning algorithms:
Linear Regression
Logistic Regression
K-Nearest Neighbors
Decision Tree
Random Forest
Support Vector Machine
II. Unsupervised Learning:
Goal: Determine data patterns/groupings.
In contrast to supervised learning, unsupervised learning infers, from unlabeled data, a function that describes hidden structures in the data.
Perhaps the most basic type of unsupervised learning consists of dimension reduction methods, such as PCA and t-SNE; PCA is generally used in data preprocessing, while t-SNE is usually used in data visualization.
A more advanced branch is clustering, which explores the hidden patterns in data and then makes predictions on them; examples include K-mean clustering, Gaussian mixture models, hidden Markov models, and others.
Along with the renaissance of deep learning, unsupervised learning gains more and more attention because it frees us from manually labeling data. In light of deep learning, we consider two kinds of unsupervised learning: representation learning and generative models.
Representation learning aims to distill a high-level representative feature that is useful for some downstream tasks, while generative models intend to reproduce the input data from some hidden parameters.
To illustrate the example of unsupervised learning below | Source: Photo by Jelleke Vanooteghem, Unsplash
Unsupervised learning works as it sounds: in this type of algorithm, we do not have labeled data, so the machine has to process the input data and try to draw conclusions about the output. For example, remember the child we gave the shape toy? In this case, she would learn from her own mistakes to find the perfect shape hole for different shapes.
But the catch is that we are not teaching the child the methods to fit the shapes (in machine learning terms, we are not providing labeled data). Instead, the child learns from the toy’s different characteristics and tries to draw conclusions about them. In short, the predictions are based on unlabeled data.
Examples of unsupervised learning algorithms: | https://medium.com/towards-artificial-intelligence/machine-learning-algorithms-for-beginners-with-python-code-examples-ml-19c6afd60daa | ['Towards Ai Team'] | 2020-12-09 23:51:03.187000+00:00 | ['Technology', 'Artificial Intelligence', 'Education', 'Science', 'Innovation'] | Title Machine Learning ML Algorithms Beginners Code Examples PythonContent Machine Learning ML Algorithms Beginners Code Examples Python Best machine learning algorithm beginner coding sample Python Launch coding sample Google Colab Authors Pratik Shukla Roberto Iriondo Sherwin Chen Last updated June 23 2020 Machine learning ML rapidly changing world diverse type application research pursued industry academia Machine learning affecting every part daily life voice assistant using NLP machine learning make appointment check calendar play music programmatic advertisement — accurate predict need even think often complexity scientific field machine learning overwhelming making keeping “what important” challenging task However make sure provide learning path seek learn machine learning new concept article look critical basic algorithm hopefully make machine learning journey le challenging suggestion feedback crucial continue improve Please let u know comment 📚 Check tutorial diving simple linear regression math Python 📚 Index Introduction Machine Learning Major Machine Learning Algorithms Supervised v Unsupervised Learning Linear Regression Multivariable Linear Regression Polynomial Regression Exponential Regression Sinusoidal Regression Logarithmic Regression machine learning computer program said learn experience E respect class task performance measure P performance task measured P improves experience E Tom Mitchell 1 Machine learning behaves similarly growth child child grows experience E performing task increase result higher performance measure P instance give “shape sorting block” toy child know toy different shape shape hole case task find appropriate shape hole shape Afterward child observes shape try fit shaped hole Let u say toy three shape circle triangle square first attempt finding shaped hole performance measureP 13 mean child found 1 3 correct shape hole Second child try another time notice little experienced task Considering experience gained E child try task another time measuring performanceP turn 23 repeating task 100 time baby figured shape go shape hole experience E increased performanceP also increased notice number attempt toy increase performance also increase result higher accuracy execution similar machine learning machine take task executes measure performance P machine large number data process data experience E increase time resulting higher performance measure P going data machine learning model’s accuracy increase mean prediction made model accurate Another definition machine learning Arthur Samuel Machine Learning subfield computer science give “computers ability learn without explicitly programmed” Arthur Samuel 2 Let u try understand definition state “learn without explicitly programmed” — mean going teach computer specific set rule instead going feed computer enough data give time learn making mistake improve upon example teach child fit shape performing task several time child learned fit shape toy Therefore say explicitly teach child fit shape thing machine give enough data work feed information want process data predicts data accurately need machine learning instance set image cat dog want classify group cat dog need find different animal 
feature many eye animal eye color animal height animal weight animal animal generally eat form vector questions’ answer Next apply set rule height 1 foot weight 15 lb could cat make set rule every data point Furthermore place decision tree else else statement check whether fall one category Let u assume result experiment fruitful misclassified many animal give u excellent opportunity use machine learning machine learning process data different kind algorithm tell u feature important determine whether cat dog instead applying many set rule simplify based two three feature result give u higher accuracy previous method generalized enough make prediction Machine learning model help u many task Object Recognition Summarization Prediction Classification Clustering Recommender system others machine learning model machine learning model questionanswering system take care processing machinelearning related task Think algorithm system represents data solving problem method tackle beneficial industryrelated purpose tackle business problem instance let u imagine working Google Adwords’ ML system task implementing ML algorithm convey particular demographic area using data task aim go using data gather valuable insight improve business outcome Major Machine Learning Algorithms 1 Regression Prediction use regression algorithm predicting continuous value Regression algorithm Linear Regression Polynomial Regression Exponential Regression Logistic Regression Logarithmic Regression 2 Classification use classification algorithm predicting set items’ class category Classification algorithm KNearest Neighbors Decision Trees Random Forest Support Vector Machine Naive Bayes 3 Clustering use clustering algorithm summarization structure data Clustering algorithm Kmeans DBSCAN Mean Shift Hierarchical 4 Association use association algorithm associating cooccurring item event Association algorithm Apriori 5 Anomaly Detection use anomaly detection discovering abnormal activity unusual case like fraud detection 6 Sequence Pattern Mining use sequential pattern mining predicting next data event data example sequence 7 Dimensionality Reduction use dimensionality reduction reducing size data extract useful feature dataset 8 Recommendation Systems use recommenders algorithm build recommendation engine Examples Netflix recommendation system book recommendation system product recommendation system Amazon Nowadays hear many buzz word like artificial intelligence machine learning deep learning others fundamental difference Artificial Intelligence Machine Learning Deep Learning 📚 Check editorial recommendation best machine learning book 📚 Artificial Intelligence AI Artificial intelligence AI defined Professor Andrew Moore science engineering making computer behave way recently thought required human intelligence 4 include Computer Vision Language Processing Creativity Summarization Machine Learning ML defined Professor Tom Mitchell machine learning refers scientific branch AI focus study computer algorithm allow computer program automatically improve experience 3 include Classification Neural Network Clustering Deep Learning Deep learning subset machine learning layered neural network combined high computing power large datasets create powerful machine learning model 3 Neural network abstract representation Photo Clink Adair via Unsplash prefer Python implement machine learning algorithm Python popular generalpurpose programming language write machine learning algorithm using Python work well reason Python popular among data 
scientist Python diverse variety module library already implemented make life comfortable Let u brief look exciting Python library Numpy math library work ndimensional array Python enables u computation effectively efficiently Scipy collection numerical algorithm domainspecific toolbox including signal processing optimization statistic much Scipy functional library scientific highperformance computation Matplotlib trendy plotting package provides 2D plotting well 3D plotting Scikitlearn free machine learning library python programming language classification regression clustering algorithm work Python numerical library Numpy Scipy Machine learning algorithm classify two group Supervised Learning algorithm Unsupervised Learning algorithm Supervised Learning Algorithms Goal Predict class value label Supervised learning branch machine learningperhaps mainstream machinedeep learning related inferring function labeled training data Training data consists set input target pair input could vector feature target instructs desire function output Depending type target roughly divide supervised learning two category classification regression Classification involves categorical target example ranging simple case image classification advanced topic machine translation image caption Regression involves continuous target application include stock prediction image masking others fall category illustrate example supervised learning Source Photo Shirota Yuri Unsplash understand supervised learning use example instance give child 100 stuffed animal ten animal kind like ten lion ten monkey ten elephant others Next teach kid recognize different type animal based different characteristic feature animal color orange might lion big animal trunk may elephant teach kid differentiate animal example supervised learning give kid different animal able classify appropriate animal group sake example notice 810 classification correct say kid done pretty good job applies computer provide thousand data point actual labeled value Labeled data classified data different group along feature value learns different characteristic training period training period use trained model make prediction Keep mind already fed machine labeled data prediction algorithm based supervised learning short say prediction example based labeled data Example supervised learning algorithm Linear Regression Logistic Regression KNearest Neighbors Decision Tree Random Forest Support Vector Machine II Unsupervised Learning Goal Determine data patternsgroupings contrast supervised learning Unsupervised learning infers unlabeled data function describes hidden structure data Perhaps basic type unsupervised learning dimension reduction method PCA tSNE PCA generally used data preprocessing tSNE usually used data visualization advanced branch clustering explores hidden pattern data make prediction example include Kmean clustering Gaussian mixture model hidden Markov model others Along renaissance deep learning unsupervised learning gain attention free u manually labeling data light deep learning consider two kind unsupervised learning representation learning generative model Representation learning aim distill highlevel representative feature useful downstream task generative model intend reproduce input data hidden parameter illustrate example unsupervised learning Source Photo Jelleke Vanooteghem Unsplash Unsupervised learning work sound type algorithm labeled data machine process input data try make conclusion output example remember kid gave shape toy case 
would learn mistake find perfect shape hole different shape catch feeding child teaching method fit shape machine learning purpose called labeled data However child learns toy’s different characteristic try make conclusion short prediction based unlabeled data Examples unsupervised learning algorithmsTags Technology Artificial Intelligence Education Science Innovation |
2,035 | Space Science with Python — A Data Science Tutorial Series | Space Science with Python
Python is an amazing language for data science and machine learning and has a lot of great community driven Open Source libraries and projects. How can we use Python to explore and analyse the wonders and mysteries of Space?
Photo by Shot by Cerqueira on Unsplash
Near-Earth Objects, Meteors, ESA’s Rosetta/Philae mission to a comet, the spacecraft Cassini exploring the ring worlds of Saturn … I worked on great projects during my academic studies and later as a doctoral student at the university. As a modern astrophysicist or space scientist, the major work is done in front of the screen: data exploration, data storage and maintenance, as well as the scientific analysis and publication of fascinating results and insights.
I learned a lot during these times and I am very grateful for that. Grateful for the opportunities and the time to explore cosmic wonders at academia’s final frontier.
I used data scientific methods, machine learning and neural network architectures that can be developed and used by virtually anybody thanks to great publication sites, passionate users and a strong open source community. Now, I want to create a link between Data Science and Space Science. On Medium, Twitter, Reddit or at my Public Outreach presentations: People are amazed and fascinated by our cosmos! And I want to contribute something back for the community: A tutorial series that links Space Science with Python.
Overview
This article is an overview and provides short summaries of all articles that I publish here on Medium. This article will be updated continuously and provides a table of contents. All code examples are uploaded on my GitHub repository. Bookmark it to get future updates.
The very first article contains no coding parts. It was written and published as an initial introduction.
Setup of a virtual environment for Python. Installation of the NASA toolkit SPICE and its Python wrapper, spiceypy. Explanation of some so-called SPICE kernels.
Computation of the Solar System Barycentre with respect to the Sun (using SPICE). The tutorial shows that the gravitational centre of our Solar System moves within and outside the Sun. Consequently, the Sun “wobbles” around this common centre.
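A minimal sketch of what this computation looks like with spiceypy; the meta-kernel file name is a placeholder, and the required kernels must be downloaded first (see the setup article):
import spiceypy as sp

# Load a meta-kernel listing the required SPICE kernels (placeholder file name)
sp.furnsh('kernel_meta.txt')

# Convert a UTC timestamp to Ephemeris Time (ET)
et = sp.utc2et('2000-01-01T00:00:00')

# State vector of the Solar System Barycentre (NAIF ID 0) as seen from the Sun (NAIF ID 10)
state, light_time = sp.spkgeo(targ=0, et=et, ref='ECLIPJ2000', obs=10)
print(state[:3])  # x, y, z position in km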
The outer gas giants (Jupiter, Saturn, Uranus and Neptune) are the major gravitational influencers in our Solar System. The computations and visualisations of miscellaneous angular parameters reveal that these planets are the main reason of the movement of the Solar System Barycentre as introduced in tutorial session 2.
April/May 2020: Venus is visible to the naked eye in the evening; right after sunset, our neighbor planet appears as a bright star above the horizon. Close angular distances to the Moon make for a nice photo opportunity. Here, the tutorial explains how to compute the angular distance between Venus, the Moon, and the Sun to determine optimal observation parameters (using SPICE).
A tutorial that explains a core analysis and visualisation part of astronomy and space science: maps. SPICE and matplotlib are used to explain, compute, draw and interpret these maps. Further, two different reference systems are explained that are used in future sessions, too.
SPICE provides so-called kernels that allow one to determine the position and velocity vector of planets, asteroids or spacecraft. The vector computation procedure is shown for the dwarf planet Ceres. Based on the position and velocity vector the corresponding orbital elements are calculated. Further, it is shown how close the asteroid 1997BQ passed by Earth in May 2020.
Comets are a remnant of the formation of our Solar System. Hundreds are known, documented and free available as a dataset. In this session, an SQLite database is created with data from the Minor Planet Center and some parameters are derived using SPICE. Further, the Great Comet Hale-Bopp is used as an example to derive positional information.
Two types of comets are known: P and C Types. The different statistical variations are shown and discussed as well as their possible source of origin.
P Type comets are dynamically associated with Jupiter. This dynamical link is described with the Tisserand Parameter that is introduced and explained. A data scientific analysis of the distribution reveals the significant dynamical differences between C and P Type comets.
This tutorial session is a supplementary article. It describes how one can create animations of the multi-dimensional Tisserand Parameter. These kind of visualisations help one to understand more easily multi-input functions. Online supplementary materials are often provided in publications to support the reader with additional information.
Bias effects are present in virtually any statistical or data scientific research topic. Smaller, and hence fainter, comets are more difficult to detect, and their detectability scales with their distance to the Sun and their activity.
ESA’s Rosetta/Philae mission explored the comet 67P/Churyumov–Gerasimenko from 2014 to 2016. During its 2-year mission, the camera instruments took several images of the comet’s core, from which a 3D shape model was derived. With the package visvis, a Python renderer is programmed to interactively explore this icy world.
There are several sources to predict the trajectory of a comet (here: 67P). We established an SQLite database with data from the Minor Planet Center (see part 7), and we learned how to derive data from the SPICE kernels. The two sources provide different and also non-static results, which are described and compared here.
Part 13 has shown that the orbital elements of 67P from the SPICE kernels change for different Ephemeris times. One possible reason: 67P is a P Type and Jupiter-Family-Comet (part 9) that is being influenced significantly by Jupiter. With the support of SPICE we can show the gravitational influence of the gas giant by computing a simple 2-body solution.
A few weeks ago (End May / Beginning of June 2020) ESA’s Solar Orbiter crossed parts of the dust and ion tail of comet ATLAS. What kind of geometric requirements must be fulfilled to be sure that the spacecraft crossed the ion tail? Using SPICE and the most recent kernels of the spacecraft help us to answer this question.
Brightness, flux density, irradiance, radiance … there are a lot of confusing words and definitions to describe light sources. In astronomy and space science one uses another definition: Magnitude. We create in this basic concept tutorial some functions that are used for future sessions (e.g., brightness computation of asteroids or meteors).
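The core relation is simple: a magnitude difference corresponds to a flux ratio. A minimal helper of the kind used in later sessions could look like this:
import math

def delta_mag(flux_1, flux_2):
    """Magnitude difference between two light sources, given their fluxes."""
    return -2.5 * math.log10(flux_1 / flux_2)

# A source 100 times brighter is 5 magnitudes smaller (i.e., brighter)
print(delta_mag(100.0, 1.0))  # -> -5.0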
It is the 30th of June 2020: Asteroid Day! Today we start with some asteroid-related articles, beginning with an asteroid that passed by at a distance of 3 Lunar Distances: 2020 JX1. Computing the position of an asteroid is not as simple as shown in the past; we need the covariance matrix to determine a possible solution space for the asteroid’s location.
2020 JX1 left the vicinity of our home planet! A distance of 3 Lunar Distances was small in cosmic scales, but large enough to miss us. The error-bars in the orbit solution space (see last session) propagate through the computation. Consequently, the sky coordinates of the asteroid are a solution space, too! A 2D Kernel Density Estimator will help us to determine an area of uncertainty in the sky, to answer the question: Where could the asteroid be?
The brightness of asteroids can be computed by using the so-called H-G magnitude function: an empirically determined equation that depends on the distances between the asteroid and the Earth and Sun, the phase angle, its absolute magnitude, and the slope parameter. What are the special features of this equation? Let’s see …
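A sketch of the H-G relation (the standard system of Bowell et al. 1989); the sample values approximate Ceres and are for illustration only:
import numpy as np

def hg_app_mag(abs_mag, slope_g, r_au, delta_au, phase_rad):
    """Apparent magnitude in the H-G system.

    abs_mag: absolute magnitude H; slope_g: slope parameter G;
    r_au, delta_au: distances to the Sun and Earth in AU;
    phase_rad: phase angle in radians.
    """
    phi1 = np.exp(-3.33 * np.tan(phase_rad / 2.0) ** 0.63)
    phi2 = np.exp(-1.87 * np.tan(phase_rad / 2.0) ** 1.22)
    red_mag = abs_mag - 2.5 * np.log10((1.0 - slope_g) * phi1 + slope_g * phi2)
    return red_mag + 5.0 * np.log10(r_au * delta_au)

# Ceres-like values: H ~ 3.3 mag, G ~ 0.12, at a 10 degree phase angle
print(hg_app_mag(3.3, 0.12, 2.8, 1.9, np.radians(10.0)))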
Tutorial #20 links several topics together: distance and phase angle determination, the apparent magnitude, sky coordinates and so on. The task: Visualising the path of Ceres in the sky for the year 2020 (considering its brightness trend, too). After this article we are good to go to start our first space science project about asteroids and Near-Earth Objects.
Science Project #1
The first part of the project is an introduction into the Near-Earth Object (NEO) topic and does not include any coding yet. The structure of the upcoming weeks is being described.
Our project shall lead to a Python library that can be later used by amateur and professional astronomers and scientists alike. To ensure a credible and sustainable software package the library shall be written in a Test Driven Development (TDD) coding framework. What is TDD exactly? We will figure it out in this session.
A generic TDD example is provided in this step-by-step guide. Using a simple equation (computation of the enclosed angle between 2 vectors) we will try to find a solution based on example for all required computational steps. | https://medium.com/space-science-in-a-nutshell/space-science-with-python-a-data-science-tutorial-series-57ad95660056 | ['Thomas Albin'] | 2020-10-05 14:42:36.278000+00:00 | ['Python', 'Data Science', 'Science', 'Space', 'Programming'] | Title Space Science Python — Data Science Tutorial SeriesContent Space Science Python Python amazing language data science machine learning lot great community driven Open Source library project use Python explore analyse wonder mystery Space Photo Shot Cerqueira Unsplash NearEarth Objects Meteors ESA’s RosettaPhilae mission comet spacecraft Cassini exploring ring world Saturn … worked great project academic study later doctorate student university modern astrophysicist space scientist major work done front screen data exploration data storage maintenance well scientific analysis publication fascinating result insight learned lot time grateful Grateful opportunity time explore cosmic wonder academia’s final frontier used data scientific method machine learning neural network architecture developed used virtually anybody thanks great publication site passionate user strong open source community want create link Data Science Space Science Medium Twitter Reddit Public Outreach presentation People amazed fascinated cosmos want contribute something back community tutorial series link Space Science Python Overview article overview provides short summary article publish Medium article updated continuously provides table content code example uploaded GitHub repository Bookmark get future update first article contains coding part written published initial introduction Setup virtual environment Python Installation NASA toolkit SPICE respectively Python Wrapper spiceypy Explanation socalled SPICE kernel Computation Solar System Barycentre respect Sun using SPICE tutorial show gravitational centre Solar System move within outside Sun Consequently Sun “wobbles” around common centre outer gas giant Jupiter Saturn Uranus Neptune major gravitational influencers Solar System computation visualisation miscellaneous angular parameter reveal planet main reason movement Solar System Barycentre introduced tutorial session 2 April May 2020 Venus visible naked eye evening right sunset neighbor planet appears star horizon Close angular distance Moon create nice photo shoot tutorial explains compute angular distance Venus Moon Sun determine optimal observation parameter using SPICE tutorial explains core analysis visualisation part astronomy space science map SPICE matplotlib used explain compute draw interpret map two different reference system explained used future session SPICE provides socalled kernel allow one determine position velocity vector planet asteroid spacecraft vector computation procedure shown dwarf planet Ceres Based position velocity vector corresponding orbital element calculated shown close asteroid 1997BQ passed Earth May 2020 Comets remnant formation Solar System Hundreds known documented free available dataset session SQLite database created data Minor Planet Center parameter derived using SPICE Great Comet HaleBopp used example derive positional information Two type comet known P C Types different statistical variation shown discussed well possible source origin P Type comet dynamically associated Jupiter dynamical link described 
Tisserand Parameter introduced explained data scientific analysis distribution reveals significant dynamical difference C P Type comet tutorial session supplementary article describes one create animation multidimensional Tisserand Parameter kind visualisation help one understand easily multiinput function Online supplementary material often provided publication support reader additional information Bias effect present virtually statistical data scientific research topic Smaller respectively fainter comet difficult detect detectability scale distance activitiy Sun ESA’s RosettaPhilae mission explored comet 67PChuryumov–Gerasimenko 2014 2016 2 year mission camera instrument took several image comet’s core derived 3 shape model package visvis Python renderer programmed interactively explore icy world several source predict trajectory comet 67P established SQLite database data Minor Planet center see part 7 learned derive data SPICE kernel data provide different also nonstatic result described compared Part 13 shown orbital element 67P SPICE kernel change different Ephemeris time One possible reason 67P P Type JupiterFamilyComet part 9 influenced significantly Jupiter support SPICE show gravitational influence gas giant computing simple 2body solution week ago End May Beginning June 2020 ESA’s Solar Orbiter crossed part dust ion tail comet ATLAS kind geometric requirement must fulfilled sure spacecraft crossed ion tail Using SPICE recent kernel spacecraft help u answer question Brightness flux density irradiance radiance … lot confusing word definition describe light source astronomy space science one us another definition Magnitude create basic concept tutorial function used future session eg brightness computation asteroid meteor 30th June 2020 Asteroid Day Today start asteroid related article beginning asteroid passed distance 3 Lunar Distances 2020 JX1 Computing position asteroid simple shown past need covariance matrix determine possible solution space asteroid’s location 2020 JX1 left vicinity home planet distance 3 Lunar Distances small cosmic scale large enough miss u errorbars orbit solution space see last session propagate computation Consequently sky coordinate asteroid solution space 2D Kernel Density Estimator help u determine area uncertainty sky answer question could asteroid brightness asteroid computed using called HG magnitude function empirically determined equation depends distance asteroid Earth Sun phase angle absolute magnitude slope parameter special feature equation Let’s see … Tutorial 20 link several topic together distance phase angle determination apparent magnitude sky coordinate task Visualising path Ceres sky year 2020 considering brightness trend article good go start first space science project asteroid NearEarth Objects Science Project 1 first part project introduction NearEarth Object NEO topic include coding yet structure upcoming week described project shall lead Python library later used amateur professional astronomer scientist alike ensure credible sustainable software package library shall written Test Driven Development TDD coding framework TDD exactly figure session generic TDD example provided stepbystep guide Using simple equation computation enclosed angle 2 vector try find solution based example required computational stepsTags Python Data Science Science Space Programming |
2,036 | A Beginner's Look at Kaggle | The above heatmaps show the strength of association between each pair of variables. While there is no rigid standard for "Highly Associated" or "Weakly Associated", we will use a cut-off value of |0.1| between our independent variables and survival. We will likely drop features whose association is lower than |0.1|. This is an entirely arbitrary guess, and I may return to raise or lower the bar later (in fact, I decided to keep Age after noticing improved performance when I did).
For now, the only feature that meets the criterion for dropping is SibSp. Additionally, I am choosing to drop Name, Ticket, and Cabin, mostly on a hunch that they don't add much.
It should be noted that correlation between independent (predictor) variables can mean redundant information. This can cause a drop in performance in some algorithms. Some might choose to drop highly correlated predictors. I did not take the time to do that, but you might try it at home!
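If you do want to try that at home, here is a minimal sketch, assuming the train_df built above and an arbitrary 0.8 cut-off (both the threshold and what you do with the flagged columns are your call):

import numpy as np

# Absolute correlations between the numeric predictors (target excluded).
numeric = train_df.drop('Survived', axis=1).select_dtypes('number')
corr = numeric.corr().abs()

# Keep only the upper triangle so each pair is inspected once.
upper = corr.where(np.triu(np.ones(corr.shape, dtype=bool), k=1))

# Columns highly correlated with an earlier column are dropping candidates.
redundant = [col for col in upper.columns if (upper[col] > 0.8).any()]
print(redundant)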
todrop = ['SibSp', 'Ticket', 'Cabin', 'Name']
train_df = train_df.drop(todrop, axis=1)
Let’s take a look at our transformed data frame, replete with new features, categories converted to numerical data, and old features dropped:
Setup for Machine Learning:
During this phase, we will begin to format our data for feeding into a machine learning algorithm. We will then use this formatted data to get a picture of what a few different models can do for us, and pick the best one. This phase is broken into the following parts:
Train/Test Split
Normalize Data of each split
Impute missing values
Let’s go.
Train/Test Split
We will split our data once into training and testing sets. Within the training set, we will use stratified k-fold cross validation to find average performance of our models.
The test set will not be touched until after we have fully tuned each of our candidate models using the training data and k-fold cross validation. Once training and tuning is complete, we will compare the results of each model on the held-out test set. The one that performs the best will be used for the competition.
# Split dependent and independent variables
X = train_df.drop(['Survived'], axis = 1)
Y = train_df.loc[:, 'Survived']

# Split data into training and validation sets
x_train, x_test, y_train, y_test = model_selection.train_test_split(X, Y, test_size=0.2, random_state=333)
Normalizing the Data
Some Machine Learning models require all of our predictors to be on the same scale, while others do not. Most notably, models like Logistic Regression and SVM will probably benefit from scaling, while decision trees will simply ignore scaling. Because we are going to be looking at a mixed bag of algorithms, I’m going to go ahead and scale our data.
# We normalize the training and testing data separately so as to avoid data leaks. Ask at the end!
x_train = pd.DataFrame(pre.scale(x_train),
                       columns=x_train.columns,
                       index=x_train.index)
x_test = pd.DataFrame(pre.scale(x_test),
                      columns=x_test.columns,
                      index=x_test.index)
Imputing Missing Data
You might recall that there were a significant amount of missing Age values in our data. Let’s fill this in with the median age:
# Again, applying changes to the now separate datasets helps us avoid data leaks.
x_train.loc[x_train.Age.isnull(), 'Age'] = x_train.loc[:, 'Age'].median()
x_test.loc[x_test.Age.isnull(), 'Age'] = x_test.loc[:, 'Age'].median()
Let’s make sure our missing data is filled in:
x_train.info()

# Output:
<class 'pandas.core.frame.DataFrame'>
Int64Index: 712 entries, 466 to 781
Data columns (total 11 columns):
# Column Non-Null Count Dtype
--- ------ -------------- -----
0 Pclass 712 non-null float64
1 Sex 712 non-null float64
2 Age 712 non-null float64
3 Parch 712 non-null float64
4 Fare 712 non-null float64
5 Embarked 712 non-null float64
6 Title 712 non-null float64
7 FamilySize 712 non-null float64
8 Alone 712 non-null float64
9 LName 712 non-null float64
10 NameLength 712 non-null float64
dtypes: float64(11)
Now we see that each and every variable that we chose to keep has 712 valid data entries.
Model Selection
Now that we have prepared our data, we want to look at different options available to us for solving classification problems. Some common ones are:
K-Nearest Neighbors
Support Vector Machines
Decision Trees
Logistic Regression
We will train and tune each of these models on our training data by way of k-fold cross-validation. When complete, we will compare the tuned models’ performance on a held out test set.
Training and Comparing Base Models:
First, we want to get a feel for each model's performance before tuning. We will write two functions to help us describe our results. The first will evaluate a model several times over random splits in the data and return the average performance as a dictionary. The second will simply pretty-print that dictionary.
# A function that evaluates each model and gives us the results:
def kfold_evaluate(model, folds=5):
    eval_dict = {}
    accuracy = 0
    f1 = 0
    AUC = 0
    skf = model_selection.StratifiedKFold(n_splits=folds)

    # Perform k splits on the training data.
    for train_idx, test_idx in skf.split(x_train, y_train):
        xk_train, xk_test = x_train.iloc[train_idx], x_train.iloc[test_idx]
        yk_train, yk_test = y_train.iloc[train_idx], y_train.iloc[test_idx]

        # Test performance on this fold:
        model.fit(xk_train, yk_train)
        y_pred = model.predict(xk_test)
        report = metrics.classification_report(yk_test,
                                               y_pred,
                                               output_dict=True)

        # Gather performance metrics for output
        # (the thresholds returned by roc_curve are not needed here).
        prob_array = model.predict_proba(xk_test)
        fpr, tpr, _ = metrics.roc_curve(yk_test, prob_array[:, 1])
        auc = metrics.auc(fpr, tpr)
        accuracy += report['accuracy']
        f1 += report['macro avg']['f1-score']
        AUC += auc

    # Average performance metrics over the k folds
    measures = np.array([accuracy, f1, AUC])
    measures = measures / folds

    # Add metric averages to dictionary and return.
    eval_dict['Accuracy'] = measures[0]
    eval_dict['F1 Score'] = measures[1]
    eval_dict['AUC'] = measures[2]
    eval_dict['Model'] = model
    return eval_dict

# A function to pretty-print our dictionary of dictionaries:
def pprint(web, level):
    for k, v in web.items():
        if isinstance(v, dict):
            print('\t' * level, f'{k}: ')
            pprint(v, level + 1)
        else:
            print('\t' * level, k, ": ", v)
Putting our kfold evaluation function to use:
# Perform evaluation on each model:
evals = {}
evals['KNN'] = kfold_evaluate(KNeighborsClassifier())
evals['Logistic Regression'] = kfold_evaluate(LogisticRegression(max_iter=1000))
evals['Random Forest'] = kfold_evaluate(RandomForestClassifier())
evals['SVC'] = kfold_evaluate(SVC(probability=True))

# Plot results for visual comparison:
result_df = pd.DataFrame(evals)
(result_df.drop('Model', axis=0)
          .plot(kind='bar', ylim=(0.7, 0.9))
          .set_title("Base Model Performance"))
plt.xticks(rotation=0)
plt.show()
Base Model Summary
It appears that we have a clear winner in our Random Forest classifier.
Hyper-parameter Tuning:
Let’s tune up our current champion’s hyper-parameters in hopes of eking out a little bit more performance. We will use scikit-learn’s RandomizedSearchCV which has some speed advantages over using an exhaustive GridSearchCV . Our first step is to create our grid of parameters over which we will randomly search for the best settings:
# Number of trees in random forest
n_estimators = [int(x) for x in np.linspace(start = 200, stop = 2000, num = 10)]
# Number of features to consider at every split
max_features = ['auto', 'sqrt']
# Maximum number of levels in tree
max_depth = [int(x) for x in np.linspace(10, 110, num = 11)]
max_depth.append(None)
# Minimum number of samples required to split a node
min_samples_split = [2, 5, 10]
# Minimum number of samples required at each leaf node
min_samples_leaf = [1, 2, 4]
# Method of selecting samples for training each tree
bootstrap = [True, False]

# Create the random grid from above parameters
random_grid = {'n_estimators': n_estimators,
               'max_features': max_features,
               'max_depth': max_depth,
               'min_samples_split': min_samples_split,
               'min_samples_leaf': min_samples_leaf,
               'bootstrap': bootstrap}

pprint(random_grid, 0)

# Output:
n_estimators : [200, 400, 600, 800, 1000,
                1200, 1400, 1600, 1800, 2000]
max_features : ['auto', 'sqrt']
max_depth : [10, 20, 30, 40, 50, 60, 70, 80, 90, 100, 110, None]
min_samples_split : [2, 5, 10]
min_samples_leaf : [1, 2, 4]
bootstrap : [True, False]
Next, we want to create our RandomizedSearchCV object which will use the grid we just created above. It will randomly sample 10 combinations of parameters, test them over 3 folds and return the set of parameters that performed the best on our training data.
# Create RandomizedSearchCV object
searcher = model_selection.RandomizedSearchCV(
    estimator = RandomForestClassifier(),
    param_distributions = random_grid,
    n_iter = 10,       # Number of parameter settings to sample
    cv = 3,            # Number of folds for k-fold validation
    n_jobs = -1,       # Use all processors to compute in parallel
    random_state = 0   # For reproducible results
)

# Look for the best parameters
search = searcher.fit(x_train, y_train)
params = search.best_params_
params

# Output:
{'n_estimators': 1600,
 'min_samples_split': 10,
 'min_samples_leaf': 4,
 'max_features': 'auto',
 'max_depth': 30,
 'bootstrap': False}
After performing our parameter tuning, we can verify whether the parameters provided by the search actually improve on the base model. Let's compare the performance of the two models before and after tuning.
tuning_eval = {}
tuned_rf = RandomForestClassifier(**params)
basic_rf = RandomForestClassifier()

tuning_eval['Tuned'] = kfold_evaluate(tuned_rf)
tuning_eval['Basic'] = kfold_evaluate(basic_rf)

result_df = pd.DataFrame(tuning_eval)
(result_df.drop('Model', axis=0)
          .plot(kind='bar', ylim=(0.7, 0.9))
          .set_title("Tuning Performance"))
plt.xticks(rotation=0)
plt.show()
result_df
Final Steps:
Now that we have chosen and tuned a Random Forest classifier, we want to test it on data it has never before seen. This will tell us how we might expect the model to perform in the future, on new data. It's time to use that held-out test set.
Then, we will combine the test and training data, and re-fit our model to the combined data set, hopefully giving it the greatest chance of success on the unlabeled data from the competition.
Finally, we will make our predictions on the unlabeled data for submission to the competition.
Final Test on Held Out Data
# Get tuned model predictions on held-out data
y_pred = tuned_rf.predict(x_test)

# Compare predictions to actual answers and show performance
results = metrics.classification_report(y_test, y_pred,
                                        labels = [0, 1],
                                        target_names = ['Died', 'Survived'],
                                        output_dict = True)
pprint(results, 0)
And here is how our model performed:
Died:
precision : 0.7815126050420168
recall : 0.8532110091743119
f1-score : 0.8157894736842106
support : 109
Survived:
precision : 0.7333333333333333
recall : 0.6285714285714286
f1-score : 0.6769230769230768
support : 70
accuracy : 0.7653631284916201
macro avg:
precision : 0.757422969187675
recall : 0.7408912188728702
f1-score : 0.7463562753036437
support : 179
weighted avg:
precision : 0.7626715490665541
recall : 0.7653631284916201
f1-score : 0.761484178861421
support : 179
It looks like we may have experienced some over-fitting. Our model’s performance on the test data is roughly 7–9% lower across the board, but we should expect that our model performs about this well on real world data that it has never before seen.
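As a quick sanity check (my own sketch, not part of the original analysis), you can score the tuned model on both splits to see the gap explicitly:

# Accuracy on the tuning data vs. the held-out split.
train_acc = tuned_rf.score(x_train, y_train)
test_acc = tuned_rf.score(x_test, y_test)
print(f"Train accuracy: {train_acc:.3f} | Test accuracy: {test_acc:.3f}")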
Combine Training and Testing Datasets for Final Model Fit
Now that we have ascertained that our tuned model performs with about 76% accuracy and has an f1-score of 0.74 on new data, we can proceed to train our model on the entire labeled training set. More (good) data is almost always better for an algorithm.
X = pd.concat([x_train, x_test], axis=0).sort_index()
Y = pd.concat([y_train, y_test], axis=0).sort_index()
tuned_rf.fit(X, Y)
Format and Standardize Unlabeled Data
Now that our model has been completely fitted on the training data, it’s time to get ready to make the predictions that we will submit to the competition.
We need to transform our unlabeled competition data in the same manner as when we were formatting our training data. This includes encoding categorical variables, dropping the same features and normalization. The idea here is consistency. What we did to the data that we trained the model on, we need to do to the data we will use to make our final predictions.
# Feature Engineering:
test_df['Title'] = test_df.Name.str.extract(r'([A-Za-z]+)\.')
test_df['LName'] = test_df.Name.str.extract(r'([A-Za-z]+),')
test_df['NameLength'] = test_df.Name.apply(len)
test_df['FamilySize'] = 1 + test_df.SibSp + test_df.Parch
test_df['Alone'] = test_df.FamilySize.apply(lambda x: 1 if x==1 else 0)
test_df.Title = test_df.Title.map(title_dict)

# Feature Selection
test_df = test_df.drop(todrop, axis=1)

# Imputation of missing age and fare data
test_df.loc[test_df.Age.isna(), 'Age'] = test_df.Age.median()
test_df.loc[test_df.Fare.isna(), 'Fare'] = test_df.Fare.median()

# Encode categorical data
for i in test_df.columns:
    if test_df[i].dtype == 'object':
        test_df[i], _ = pd.factorize(test_df[i])

# Center and scale data
test_df = pd.DataFrame(pre.scale(test_df), columns=test_df.columns, index=test_df.index)

# Ensure columns of unlabeled data are in the same order as training data.
test_df = test_df[x_test.columns]
test_df
Make Final Predictions and Common Sense Check:
Roughly 32 percent of the passengers aboard the Titanic lived. We will do one last common-sense check to see if our algorithm predicts roughly the same distribution of survivals. Since a Survived value of 1 implies survival, we can simply add up all predicted survivals and divide by the total number of passengers to get a rough idea of our predicted distribution.
Keep in mind, the competition organizers could have been tricky and given us uneven distributions for training and testing. In that case, this might not work, but I’m assuming they did not.
# Make final predictions
final = tuned_rf.predict(test_df)

# Check the probability of survival according to our predictions.
# It should be roughly 32% (we get 36.6%, which is a bit optimistic).
final.sum() / len(final)

# Get our predictions in the competition rules format:
submission = pd.DataFrame({'PassengerId': test_df.index,
                           'Survived': final})

# Output our submission data to a .csv file:
submission.to_csv('submission2.csv', index=False)
Summary
My focus was not to win a competition, but to learn a way of thinking. After all, if you are like me, you aspire to become a data scientist. Therefore, machine learning experiments should be rigorous and repeatable, but just as important, the process should be uniquely defined by the questions being asked and the data on hand to answer those questions.
However, if you are wondering how well this set-up performed, it achieved an accuracy of 77%. That’s far from perfect!
Again, if you have any feedback, I would love to hear your questions, comments and critiques of my process. Writing this article is part of my own learning process, and I hope you join in.
Thanks! | https://medium.com/analytics-vidhya/a-beginners-look-at-kaggle-b868ceb2eccf | ['Wesley Neill'] | 2020-05-09 17:43:23.183000+00:00 | ['Machine Learning', 'Data Science', 'Python', 'Begginer', 'Kaggle'] |
2,037 | How to Gain Wisdom? Read Some of Aesop’s Fables | How to gain wisdom? Read some Aesop fables
Everyone wants to gain wisdom. Wisdom is one of the greatest qualities that human beings can possess. So, seek it, hold on to it, share it and treasure it. Why? Because it will help you navigate through choppy waters, it will lift you up from the depths of despair, it will help you put everything into perspective, and ultimately it will turn you into the hero of your own story. But, how do you gain wisdom? I suggest that you start by reading some of Aesop’s fables.
With the possible exception of the New Testament, no works written in Greek have been more widespread and better known than Aesop's fables. For more than 2,500 years, Aesop's fables have been teaching people of all ages valuable life lessons in the most entertaining and cynical way.
Want to hear a rags-to-riches story? Meet Aesop the Wise-Fool
Aesop’s life reads just like one of his fables. Aesop is believed to have lived between the period from 620 to 560 BC. He began his life as a slave and was said to have been remarkably ugly with some physical deformities and as this wasn’t enough misfortune, he was born mute, unable to utter a word. On the positive side, he was intelligent, resourceful and kind. His life took a turn for the better after he rescued a priestess of the goddess Isis from a difficult situation after she had strayed from the road and became lost.
From Slavery to Greatness — Meet Aesop who is also known as the Wise-Fool
His divine reward for this act of kindness was the gift of speech and a remarkable ability to conceive and elaborate wise tales in Greek. His talent for storytelling, his wisdom, and his wit literally set him free. Aesop acquired freedom, fame, and fortune in the same breath. Not bad for an ugly, deformed mute. He acquired a kind of celebrity status by hanging out with the most prominent and powerful personalities of the time, offering to solve their problems, giving them sound advice, and telling fables along the way. But in the end, it was his very success that led him to his ruin.
Aesop made a good living as a storyteller, travelling from city to city to perform his art, acquiring fame and fortune along the way. When he arrived in Delphi, he realized that his wit and sarcasm didn't work so well on the Delphian audience, who refused to give him any reward for his performance. Disappointed and vexed by this cold treatment, he lashed out and mocked the Delphians, comparing them to driftwood (something worthwhile at a distance but revealed to be worthless when seen close up). He should have stopped there but continued his tirade, realizing too late how outraged the Delphians were by his insults. They kicked him out of town, but unbeknown to him they hid a golden cup from the Temple of Apollo in his luggage, and as he was leaving the city he was arrested, charged, sentenced to death, and executed unceremoniously by being pushed off a cliff.
Moral of the story: storytelling and wit can set you free, but they can also send you flying off a cliff.
Want to survive a bad situation? Follow the cat and not the fox
I don’t know what was Aesop’s final thought before he died but I am going to speculate that he may have recited to himself the Fox and the Cat fable that he himself wrote a little while before.
The Fox and the Cat A fox was boasting to a cat of its clever devices for escaping its enemies. I have a whole bag of tricks, he said, which contains a hundred ways of escaping my enemies. I have only one, said the cat. But I can generally manage with that. Just at that moment they heard the cry of a pack of hounds coming towards them, and the cat immediately scampered up a tree and hid herself in the boughs. This is my plan, said the cat. What are you going to do? The fox thought first of one way, then of another, and while he was debating, the hounds came nearer and nearer, and at last the fox in his confusion was caught up by the hounds and soon killed by the huntsmen. Miss Puss, who had been looking on, said, Better one safe way than a hundred on which you cannot reckon. Aesop
Want to hear a truly inspirational tale? Meet The Peddlar of Swaffham
Please allow me to take you to Norfolk, England, to a small village called Swaffham, where you will hear the extraordinary tale of the Peddlar of Swaffham.
The Pedlar of Swaffham "Tradition says that there lived in former times in Swaffham, Norfolk, a certain pedlar, who dreamed that if he went to London Bridge, and stood there, he would hear some very joyful news, which he at first slighted, but afterwards, his dream being doubled and trebled upon, he resolved to try the issue of it, and accordingly went to London and stood on the bridge there for two or three days, looking about him, but heard nothing that might yield him any comfort. At last, it happened that a shop keeper there, having noted his fruitless standing, seeing that he neither sold any wares nor asked any alms, went to him and most earnestly begged to know what he wanted there, or what his business was; to which the pedlar honestly answered that he had dreamed that if he came to London and stood there upon the bridge he should hear good news; at which the shop keeper laughed heartily, asking him if he was such a fool as to take a journey on such a silly errand, adding: "I will tell you country fellow, last night I dreamed that I was in Swaffham, in Norfolk, a place utterly unknown to me where I thought that behind a pedlar's house in a certain orchard, and under a great oak tree, if I dig I should find a vast treasure! Now think you, says he, that I am such a fool to take such a long journey upon me upon the instigation of a silly dream? No. No. No. I am wiser. Therefore, good fellow, learn wit from me, and get you home and mind your business." The pedlar, observing his words, what he had said he dreamed, and knowing they concerned him, glad of such joyful news, went speedily home, and dug and found a prodigious great treasure, with which he grew exceedingly rich; and Swaffham Church being for the most part fallen down, he set on workmen and rectified it most sumptuously, at his own charges; and to this day, there is a statue therein with his pack at his back and his dog at his heels; and his memory is also preserved by the same form of picture in most of the old glass windows, taverns and ale houses of that town unto this day." Source: Sidney Hartland — English Fairy and Other Folk Tales (London, ca. 1890), which in turn refers to the Diary of Abraham de la Pryme — 1699. Text available under Creative Commons CC-By-SA-4.0 License.
In this video, I am taking you to Norfolk, UK, in the village of Swaffham, where the fable of the Peddlar of Swaffham originates. Come along with me …
This English tale resonates with me because of its candour and the moral that emanates from it.
My own reflection on this tale is that its moral is as follows:
Listen to your inner voice, your intuition, your gut feeling, your inner compass;
Don’t be afraid to be ridiculed. Be patient. Have grit. Have resilience. Have faith;
Have the courage to act upon your dream and remember that a thousand-mile journey starts with the first step;
The journey will no doubt be marred with uncertainties, danger, surprises and some intriguing encounters;
Pay attention. Listen to the signs. Listen to the messages, the tips you receive on your journey. There may be joyful news awaiting you;
In the end, your courage, your efforts, your convictions will pay off and success will flow towards you, abundance will flow into your life;
When prosperity falls upon you do not hold tight to the wealth you seek but keep a healthy vision of its power to heal and the power it will give you to fulfil your purpose and spread goodness all around you.
And this, my Dear Companion, is Your Quest!
If you liked this post you can follow me on Instagram, Pinterest, or Facebook, or you may also like:
The audio version of my book "This Is Your Quest" is available. Feel free to check it out and use this special promotion code.
Gain Access to Expert View — Subscribe to DDI Intel | https://medium.com/datadriveninvestor/how-to-gain-wisdom-read-some-of-aesops-fables-fcd011976313 | ['Joanne Reed'] | 2020-12-02 20:31:56.139000+00:00 | ['Storytelling', 'Self-awareness', 'Philosophy', 'Wisdom', 'Self Improvement'] |
2,038 | Yes, Social Media Is Making You Depressed | Yes, Social Media Is Making You Depressed
The science is in and it’s not surprising
Photo by Tim Mossholder on Unsplash
The science is in regarding social media, and the findings are not very surprising. The University of Arkansas just released a new study which is the first large, national study to show a link between social media use and depression over time.
The connection is clear between social media and depression.
Essentially, the more time you spend on social media, the more likely you are to become depressed. Participants who used social media for more than 300 minutes per day were 2.8 times more likely to become depressed than those who spent less than 120 minutes per day on social media.
I know what you’re thinking. 300 minutes is a long time. Who spends 5 hours on social media per day? Well, this study was conducted in 2018, before the pandemic of 2020.
According to this source, in 2020 we spend on average 3 hours per day on social media. Consider this fact: 3.96 billion people use social media today, which accounts for roughly half (51%) of the global population.
So if we average 3 hours right now, 5 hours is actually not as high as it sounds. Consider that we are using social media on our phones, tablets, and computers.
I think we could argue that YouTube is even a form of social media with the comments section. LinkedIn wasn’t considered in this study either. Here is a summary that includes some of the highlights of the study.
Some Details Of The Study
The study had a sample size of 1,000 individuals between the ages of 18 and 30 during 2018.
The study focused on the following social media platforms: Facebook, Twitter, Reddit, Instagram, and SnapChat.
“Social media is often curated to emphasize positive portrayals,” said Jaime Sidani, assistant professor of medicine at the University of Pittsburgh and co-author of the study.
“This can be especially difficult for young adults who are at critical junctures in life-related to identity development and feel that they can’t measure up to the impossible ideals they are exposed to.”
Here’s another powerful insight from Sidani. “Excess time on social media may displace forming more important in-person relationships, achieving personal or professional goals, or even simply having moments of valuable reflection.”
Let’s be totally honest: this study is only confirming what we’ve already known. Social media makes it easier to compare ourselves to other people. In turn, once we do that, we often feel inadequate and lonely. Over time, feelings of isolation and loneliness lead to depression.
I’ve heard it said that depression is often anger that is turned inward. That’s what it is like for me: depression is a mix of frustration, anger, and loneliness. Real connection — actually talking to people in a meaningful way — often helps with that.
So what’s the solution? What’s the takeaway? Engage in a real conversation, not a superficial conversation. Yes, you can do this on social media. But it’s not very common. Be willing to be weird. Dig deeper. Have real conversations and don’t settle for the surface level ones. Intentionally engage with other people.
Don’t settle for casual browsing and scrolling. That’s what the app makers want you to do.
Schedule a “virtual coffee” and hang out with people. Honestly, this might be a great time to start a podcast where you intentionally connect with other people. I’m doing that with mine. I’m starting an interview once per month on my Write Your Book Podcast.
Not into podcasting? Sick of Zoom too? I get that. How about a good old-fashioned phone call? That's right, that thing in your hand is good at making phone calls. Do that. Connect with others proactively, not reactively. You'll be glad you did. | https://medium.com/the-partnered-pen/yes-social-media-is-making-you-depressed-40a68f7ba7a4 | ['Jim Woods'] | 2020-12-28 19:01:08.343000+00:00 | ['Self-awareness', 'Social Media', 'Self Improvement', 'Psychology', 'Life Lessons'] |
2,039 | Pro Tips to Help You Get Started With Your Side Project | Pro Tips to Help You Get Started With Your Side Project
Begin with solid foundations to keep the excitement kicking in
Photo by Blake Meyer on Unsplash
Day 1 — You bought your <fancy-name>.io domain name and promised yourself you would finish this product for good, this time.
Day 56 — <fancy-name>.io homepage is still a 404. You refuse to talk to anyone about what went wrong.
How often do you start a project and give up on it?
For lack of structure, discipline, or organization, the project that was once your best idea ever gets boring and messy, and doesn't look as exceptional as it did when you first thought of it.
In short, your project is not even exciting anymore, and you give up.
Here are some tips to help you stay motivated and keep focused on what matters until you ship. | https://medium.com/better-programming/pro-tips-to-help-you-get-started-with-your-side-project-15d01b76e0d8 | ['Thomas Guibert'] | 2020-07-15 15:37:27.012000+00:00 | ['Side Project', 'Technology', 'Software Engineering', 'Productivity', 'Programming'] |
2,040 | A Simple Growth Marketing Plan For SaaS Startups | Instructions
Similar to step three, you should prioritize your new markets/channels based on the following criteria:
Profit margins
Market size
Control
Input/output time ratio
Scalability
Please make sure you look into ways to identify and prioritize new markets only when your startup gets to the “growth” stage, as mentioned in the introduction above.
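To make the prioritization concrete, here is a tiny hypothetical weighted-scoring sketch in Python. The channel names, weights, and 1-5 ratings are all invented placeholders; substitute your own research:

# Hypothetical weights for the five criteria above (they should sum to 1).
weights = {'profit_margins': 0.30, 'market_size': 0.25, 'control': 0.15,
           'input_output_ratio': 0.15, 'scalability': 0.15}

# Made-up 1-5 ratings per candidate channel; replace with your own numbers.
channels = {
    'Content/SEO':  {'profit_margins': 4, 'market_size': 4, 'control': 5,
                     'input_output_ratio': 2, 'scalability': 5},
    'Paid search':  {'profit_margins': 2, 'market_size': 5, 'control': 3,
                     'input_output_ratio': 4, 'scalability': 4},
    'Partnerships': {'profit_margins': 4, 'market_size': 3, 'control': 2,
                     'input_output_ratio': 3, 'scalability': 3},
}

scores = {name: sum(weights[c] * rating for c, rating in ratings.items())
          for name, ratings in channels.items()}

for name, score in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f'{name}: {score:.2f}')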
Bonus: Get a copy of the SaaS growth marketing framework now!
The spreadsheet above is broken down into six tabs. Here's how to use each of them: | https://medium.com/better-marketing/a-simple-growth-marketing-plan-for-saas-startups-543ae2d339b2 | ['Nicolás Vargas'] | 2019-10-30 10:29:37.291000+00:00 | ['Startup', 'Marketing', 'Growth'] |
2,041 | How I Read 69 Books in 2019 without Changing My Routine | How I Read 69 Books in 2019 without Changing My Routine
And how you can do it too.
Photo by Artem Beliaikin on Unsplash
Even though we are (finally) reaching the end of the catastrophic year that is 2020, there is something I want to tell you about 2019.
Yes, 2019, or 1 b.C. (before Corona).
I have never read so many books in a year as I did in 2019. I fell one book short of seventy. I didn't take part in any special reading challenge or struggle to squeeze reading time into a tight routine, and I don't do any form of speed reading (spoiler: I think speed reading fiction is complete nonsense). It felt so natural to finish this amount of books in this timeframe, I was surprised by the result.
I love to read. It’s one of my favorite activities to pass the time — be it at home or in a waiting room. But this year, I applied some “tricks” that increased the number of works I finished.
These aren’t top-secret techniques, nor do they require any special ability or skill. These are easy steps you can do at home if you wish, like me, to read more books from now on.
If you, too, want to read more books in 2020 and 2021, here are some tips: | https://medium.com/a-life-of-words/how-i-read-69-books-in-2019-without-changing-my-routine-accad3ca8875 | ['Morton Newberry'] | 2020-08-22 23:59:11.075000+00:00 | ['Books', 'Readinglist', 'Productivity', 'Audiobooks', 'Reading'] |
2,042 | A Stark Look at Covid-19 and Racial Disparities | A Stark Look at Covid-19 and Racial Disparities
We knew this would happen
Image courtesy of author
Life expectancy in the United States will almost certainly drop in 2020 due to Covid-19 deaths, extending a decline that frustrates economic demographers like David Bishai, MD, a professor at Johns Hopkins Bloomberg School of Public Health. After rising steadily for 50 years, U.S. life expectancy fell in 2015, 2016, and 2017. The drop wasn’t due to infectious disease or war or any biological limit to how long humans can live, but rather persistent systemic inequities and racial disparities in the health system, along with increases in deaths from opioids, alcohol, and suicide — the latter are what Bishai and other experts call “deaths of despair.”
The ultimate story of Covid-19, written through the lens of history with all the final death statistics, will undoubtedly mirror what we already know from hard data on U.S. life expectancy: On average, the haves outlive the have-nots in a country where the responsibility for health care is placed largely on the individual, and life expectancy varies dramatically based on disparities deeply rooted in geography, wealth, and race.
Globally, the United States ranks 50th in life expectancy, trailing such countries as Cuba, Chile, Slovenia, Portugal, France, and Italy. America is a full five years behind several of the leading nations.
And in America, there are notable gaps in longevity. On average, white men outlive black men by about 4.5 years, and white women outlive black women by about 2.7 years.
More glaring, life expectancy varies by a whopping 20.1 years in U.S. counties with the most favorable numbers — mostly on the coasts and scattered around a handful of other states, including Colorado — compared with counties at the bottom of the charts, which are mostly in the South or have large Native American populations. And things are not getting better for those at or near the bottom: Between 1980 and 2014, the worst counties made no progress, researchers concluded in the journal JAMA Internal Medicine.
That geographic disparity disproportionately affects minorities, the poor, people with underlying health conditions like heart disease and diabetes, and people who often have little choice about working from home or even staying home when they’re sick. Then along came Covid-19.
On average, the haves outlive the have-nots in a country where the responsibility for health care is placed largely on the individual.
Segregation of a different sort
“Most epidemics are guided missiles attacking those who are poor, disenfranchised, and have underlying health problems,” says Thomas Frieden, MD, former director of the U.S. Centers for Disease Control and Prevention.
Already, coronavirus deaths prove the point.
While just 22% of U.S. counties are disproportionately black, they accounted for 58% of Covid-19 deaths by April 13, according to a study released May 5 by the Foundation for AIDS Research.
Other research published in April found Covid-19 death rates among black people and Hispanics much higher (92.3 and 74.3 deaths per 100,000 population, respectively) than among whites (45.2) or Asians (34.5).
In Chicago, nearly 70% of Covid-19 deaths have been among black people, who make up 30% of the population. Similarly lopsided statistics have come out of Michigan and Detroit.
An analysis of deaths in Massachusetts, published May 9 by the Boston Globe and based on research by Harvard scientists, finds that the surge in excess deaths in the early days of Covid-19 was 40% greater in cities and towns "with higher poverty, higher household crowding, higher percentage of populations of color, and higher racialized economic segregation" compared to those with the lowest levels of those measures.
These are people who can’t afford to miss a chance to work, often don’t have paid sick leave, may not get proper protection from Covid-19 spread on the job, and tend to already have lower health status due to “persistent health inequities,” says study team member Nancy Krieger, PhD, professor of social epidemiology in the department of social and behavioral sciences at Harvard T.H. Chan School of Public Health.
“It’s been hard for Americans to understand that there are racial structural disparities in this country, that racism exists,” says Camara Jones, MD, an epidemiologist at the Morehouse School of Medicine in Atlanta. “If you asked most white people in this country today, they would be in denial that racism exists and continues to have profound impacts on opportunities and exposures, resources and risks. But Covid-19 and the statistics about black excess deaths are pulling away that deniability.”
Today’s segregation involves factors like severely limited access to healthy foods and green space, and higher exposure to environmental hazards, all contributing to higher rates of obesity, diabetes, high blood pressure, and heart disease, Jones says, echoing the views of many public health researchers.
“Prior to this pandemic and economic calamity, African Americans already lacked health insurance at a rate nearly 40% higher than white people,” says Christopher Hayes, PhD, a labor historian at Rutgers School of Management and Labor Relations. “Many of the highest rates of being uninsured are in Southern states that have not expanded Medicaid and have large black populations.”
Also, the massive unemployment caused by the 2020 global economic shutdown will only worsen the plight of U.S. minorities, putting further strain on families and their options for attending to their health. While the overall unemployment rate rose to 14.7% as of May 8, it jumped to 16.7% among black workers and 18.9% for Hispanic and Latino workers.
“Given that African Americans are disproportionately concentrated in low-wage jobs, and we live in the only rich country without universal health care, too many people only seek medical care in dire situations, and when they do, it can easily be financially ruinous,” says Hayes.
The impact of Covid-19 in the United States will almost surely prove detrimental to the longevity of African Americans and other marginalized ethnic and racial groups, the experts say.
Counting the years
Life expectancy at birth is an estimate of how long a person might be expected to live if known death rates at the time were to remain consistent throughout that person’s life. It is based on a complex calculation of age-specific mortality rates, giving more weight to the probability of death later in life than for young people. Throughout the first half of the 20th century, it spiked up and down significantly in the United States as various deadly infectious diseases swept largely unabated through the population every few years.
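To make that calculation concrete, here is a minimal sketch in Python of a simplified period life table, which turns age-specific death probabilities into life expectancy at birth. Real actuarial tables are far more elaborate, and the numbers below are illustrative only, not actual U.S. mortality data.

```python
def life_expectancy_at_birth(qx):
    """Simplified period life table.

    qx[a] is the probability of dying between exact ages a and a+1,
    held fixed for the whole cohort (the 'period' assumption).
    Deaths are assumed to occur mid-interval.
    """
    survivors = 1.0      # proportion of the cohort still alive
    person_years = 0.0   # total years lived by the cohort
    for q in qx:
        deaths = survivors * q
        person_years += survivors - deaths / 2.0
        survivors -= deaths
    return person_years

# Illustrative only: a flat 1% annual death probability over 110 ages.
print(life_expectancy_at_birth([0.01] * 110))
```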
The spikiness began to change for many reasons, not the least being higher living standards and improved sanitation and hygiene, says Bishai, the Johns Hopkins demographer.
In 1900, tuberculosis was among America’s leading causes of death. Filthy, crowded living and workplace conditions contributed to the spread of TB bacteria. Also, contaminated food, milk, and water caused TB infections and many other foodborne illnesses, from typhoid fever to botulism. Infections began to slow with public health messages that promoted hand-washing, as well as the introduction of refrigerators and pasteurization of dairy products in the 1920s — making food safer. Along the way, several childhood vaccines were introduced, including whooping cough in 1914 and diphtheria in 1926. Smallpox was eliminated in the United States by 1949.
Vaccines for polio, measles, mumps, and rubella, introduced in the 1960s, helped keep the upward longevity trend going. In 1960, the U.S. surgeon general began recommending annual flu vaccines for pregnant women and people over 65 or with chronic diseases.
From the 1960s onward, there were “noticeable gains in life expectancy at the middle and the end of life,” Bishai says. This was helped in part by advances in heart surgery and cancer treatments. Improved insurance coverage, including Medicare, also helped, he says. But his research, published in 2018 in the journal BMC Public Health, finds that increases in life expectancy have slowed here and across the world since 1950.
Then it all came to a screeching halt.
The decline in U.S. life expectancy in 2015, 2016, and 2017 (it ticked up slightly in 2018, and 2019 figures are not out yet) reflects a stark new reality: Death rates are rising not among children or the very old, but among people age 25 to 64, especially in the economically challenged industrial Midwest and Appalachia, according to a study published last year in the journal JAMA.
“In America, that’s where the battle is — it’s in the middle of life,” Bishai says in a phone interview.
“It’s been hard for Americans to understand that there are racial structural disparities in this country, that racism exists.”
Inequalities not addressed
The federal government is well aware of the nation’s regional disparities in health and mortality. In 2011, the CDC created a Social Vulnerability Index that ranks counties by their resilience “when confronted by external stresses on human health, stresses such as natural or human-caused disasters, or disease outbreaks.” It factors things like socioeconomic status, minority status, and even access to transportation. “Reducing social vulnerability can decrease both human suffering and economic loss,” the agency states.
The Social Vulnerability Index, last updated in 2016, reveals the most vulnerable counties in dark blue. Image: CDC
“Health differences between racial and ethnic groups are often due to economic and social conditions” such as living in densely populated areas, lack of access to grocery stores and medical facilities, and lack of paid sick leave, among a host of other systemic factors, the CDC states. “In public health emergencies, these conditions can also isolate people from the resources they need to prepare for and respond to outbreaks.”
Those well-known differences are driving disastrous outcomes in real time as the new coronavirus rips through low-income and poor neighborhoods. Greg Millett, MPH, director of public policy at the Foundation for AIDS Research and leader of the study out last week on the disproportionate number of deaths in predominantly black U.S. counties, ties Covid-19 directly to the known regional inequities.
Underlying health problems, including diabetes, hypertension, and heart disease, which raise the risk of death from Covid-19, “tended to be more prevalent in disproportionately black counties, but greater Covid-19 cases and deaths were still observed in those counties when adjusting for these factors,” Millett and his colleagues write.
“Many people have observed large and consistent disparities in Covid-19 cases and deaths among black Americans, but these observations have largely been anecdotal or have relied on incomplete data,” Millett says. “This analysis proves that county-level data can be used to gauge Covid-19 impact on black communities to inform immediate policy actions.”
Force for change?
Since we don’t know how many people will die in the pandemic, it’s not possible yet to predict the drop it will cause in life expectancy. But it’s a safe bet it will go down, Bishai says, adding that it would take some “miraculous” decrease in other causes of deaths to prevent a dip.
It didn’t have to be so bad. In a pandemic, a rising health tide would lift all boats. Improved overall health among the most disadvantaged, along with better access to health care and the ability for people to confidently stay at home when they are sick — all things that could change with significant governmental policy shifts — would mean fewer infections for everyone, less pressure on hospitals, and a quicker restart of the economy.
Bishai hopes one positive outcome of Covid-19 is that it helps America get past the notion that the federal government is not responsible for the nation’s health. “What makes you healthy is beyond what you choose to eat, and lifestyle, and what your doctor does for you,” he says. He’s not alone in finding it “frustrating” and “bothersome” that our political system has not addressed the dipping life expectancy curve or the gross health disparities across the country.
“The first thing the federal government could do is take charge and actually have a strategy for dealing with the pandemic,” says Hayes, the Rutgers historian. “Telling the states to handle it is not a solution and is a profound refusal to perform basic duties. Who could imagine FDR telling Hawaii to take care of Pearl Harbor or George Bush shrugging his shoulders at New York on 9/11?”
Ultimately, Hayes argues, the federal government needs to provide universal health care, greatly reduce pollution that contributes to poor heart health, and address income inequality by raising the minimum wage.
“The scourge of Covid-19 will end, but health care disparities will persist,” writes Clyde Yancy, MD, an academic cardiologist at Northwestern University, in an April 15 commentary in the journal JAMA. “The U.S. has needed a trigger to fully address health care disparities,” he writes. “Covid-19 may be that bellwether event.” | https://elemental.medium.com/a-stark-look-at-covid-19-and-racial-disparities-5737e56dbe2b | ['Robert Roy Britt'] | 2020-05-14 14:29:46.679000+00:00 | ['Coronavirus', 'Mental Health', 'Healthcare', 'Racism', 'Covid 19']
2,043 | Increasing Accuracy by Converting to a Multivariate Dataset | Increasing Accuracy by Converting to a Multivariate Dataset
Being a data science novice seeking to improve my skills, I continuously go through the competitions I have previously entered and seek to improve their accuracy. One such competition I have reviewed is Analytics Vidhya’s JetRail time series analysis.
There are many ways that one can predict future figures, such as Random Forest, statsmodels functions and Facebook Prophet. Both statsmodels and Prophet offer ways to check whether a date is a weekend or a holiday, but this can be tricky. For example, I do not know what country JetRail is based in, and even assuming it is a western country, an accurate prediction would require knowing that country's holiday schedule. With this in mind, I decided to assume that JetRail is based in a western country where the work week runs from Monday to Friday and the weekend lasts from Saturday to Sunday.
I have previously written about the JetRail dataset as a univariate time series analysis problem, with the link to this post being found here:- How I solved the JetRail time series problem with FB Prophet | by Tracyrenee | Python In Plain English | Nov, 2020 | Medium
In this post, however, I have converted the univariate dataset to a multivariate dataset in an attempt to improve the accuracy. If you would like to know what happened then please read on.
The problem statement and datasets can be found on Analytics Vidhya’s JetRail competition page, the link being here:- Time Series Forecasting (analyticsvidhya.com)
The .ipyn file for this competition question was created in Google Colab, a free online Jupyter Notebook that can be used from any computer that has internet access.
The problem statement for this competition question reads as follows:-
“Welcome DataHacker!
Congratulations on your new job! This time you are helping out Unicorn Investors with your data hacking skills. They are considering making an investment in a new form of transportation — JetRail. JetRail uses Jet propulsion technology to run rails and move people at a high speed! While JetRail has mastered the technology and they hold the patent for their product, the investment would only make sense, if they can get more than 1 Million monthly users with in next 18 months.
You need to help Unicorn ventures with the decision. They usually invest in B2C start-ups less than 4 years old looking for pre-series A funding. In order to help Unicorn Ventures in their decision, you need to forecast the traffic on JetRail for the next 7 months. You are provided with traffic data of JetRail since inception in the test file.”
Because many of the libraries I need to solve this question are already installed on Google Colab, I only needed to import those libraries into the program, being pandas, numpy, seaborn, matplotlib, fbprophet and sklearn.
I then loaded and read the datasets into the program, being train, test and sample:-
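The notebook screenshots did not carry over to this page, but the step looks roughly like the following. The file names are assumptions, and the day-first Datetime format is based on the Analytics Vidhya version of the dataset.

```python
import pandas as pd

# Assumed file names; the competition page supplies a train file,
# a test file and a sample submission file.
train = pd.read_csv("Train.csv", parse_dates=["Datetime"], dayfirst=True,
                    index_col="Datetime")
test = pd.read_csv("Test.csv", parse_dates=["Datetime"], dayfirst=True,
                   index_col="Datetime")
sample = pd.read_csv("sample_submission.csv")
```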
I decided to convert the univariate time series dataset to a multivariate dataset and I accomplished this by adding a column, “dayofweek”. The function, dayofweek, returns a value from 0 to 6 signifying what day of the week the sampling occurred:-
I then created an additional column from the index, which is in datetime format. This column is necessary to perform a datetime analysis.
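A sketch of both of these steps, assuming the frames are indexed by the parsed Datetime column as above:

```python
for df in (train, test):
    df["dayofweek"] = df.index.dayofweek   # 0 = Monday ... 6 = Sunday
    df["Datetime"] = df.index              # Prophet needs a plain column
```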
I then changed the names of the columns to names that Prophet wants to see when it is training and fitting the data:-
I created variables, ID_train and id_test, which stored the data train.ID and test.ID respectively. These columns were then dropped from the datasets because they are not needed to carry out the computations:-
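Roughly, with Count becoming Prophet's target y, Datetime becoming ds, and dayofweek carried along as the extra regressor this post calls add1:

```python
train = train.rename(columns={"Datetime": "ds", "Count": "y",
                              "dayofweek": "add1"})
test = test.rename(columns={"Datetime": "ds", "dayofweek": "add1"})

# Keep the IDs for the submission file, then drop them from the frames.
ID_train = train.pop("ID")
id_test = test.pop("ID")
```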
I plotted a graph of the train dataset because it is important to have a visual representation of how the number of passengers has increased with time:-
I split the train dataframe in two to separate it into training and validation sets. Because this is a time series, the split is based on date rather than a random shuffle:-
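Something like the following; the cut-off date is an assumption, since the article does not state the exact date used:

```python
cutoff = "2014-06-25"   # assumed validation cut-off
train_part = train[train["ds"] < cutoff]
valid_part = train[train["ds"] >= cutoff]
```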
I defined the model, being Facebook Prophet. Prophet normally only wants to see two variables, being “y” and “ds”, but it is possible to add an additional variable, “add1”, which I did in this instance.
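In fbprophet the extra column is registered with add_regressor before fitting, along these lines:

```python
from fbprophet import Prophet

model = Prophet()
model.add_regressor("add1")   # register the day-of-week column
model.fit(train_part[["ds", "y", "add1"]])
```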
I forecasted on the validation set to obtain yhat:-
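A minimal sketch of that step; predict needs the ds column plus every registered regressor:

```python
future = valid_part[["ds", "add1"]]
forecast = model.predict(future)
yhat = forecast["yhat"]
```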
I then plotted a graph of the training and validation datasets’ time series analysis to visually illustrate how Prophet has predicted the numbers of passengers of JetRail:-
I then forecast on the test dataset to obtain yhat for that dataset:-
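Assuming the same column layout as above, this is a one-liner:

```python
test_forecast = model.predict(test[["ds", "add1"]])
```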
I produced a graph of Prophet’s predictions on the test dataset, and it can be seen visually that the number of passengers is anticipated to keep climbing:-
I prepared the submission from the value, yhat, and put it on a dataframe, which I then converted to a .csv file:-
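Roughly like this; the ID and Count column names are assumed from the sample submission file:

```python
submission = pd.DataFrame({"ID": id_test.values,
                           "Count": test_forecast["yhat"].values})
submission.to_csv("submission.csv", index=False)
```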
When I submitted the predictions to Analytics Vidhya’s solution checker I achieved an accuracy of 365.16, which was less than 1 point better than the model I had previously submitted that was univariate. I decided to make the predictions integers, and this reduced the accuracy to about half a point better than the previously submitted univariate version.
I thought that if the days of the week data had improved the accuracy of the predictions then whether or not the day was a weekend might provide further illumination, so I added code to create an extra boolean column that stated whether the day in question was a weekend, and submitted the amended code to the solution checker. Sadly, this extra data did not increase the accuracy of the model, but actually reduced it. The code for this amendment is on my personal Google Colab account, but if anyone wants me to post that code, I will be more than happy to:-
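For readers who asked, the amendment amounts to little more than this sketch, with the new column registered as a second regressor before fitting:

```python
for df in (train, test):
    # Saturday (5) and Sunday (6) count as the weekend.
    df["weekend"] = (df["ds"].dt.dayofweek >= 5).astype(int)

# ... then model.add_regressor("weekend") before calling model.fit().
```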
The code for this post can be found in its entirety in my personal GitHub account, the link is to the right:- Jet-Rail-TS/AV_JetRail_Multivariate_Prophet.ipynb at main · TracyRenee61/Jet-Rail-TS (github.com) | https://medium.com/ai-in-plain-english/slightly-increase-accuracy-of-the-jetrail-competition-question-by-converting-it-to-a-multivariate-1ce846baa781 | [] | 2020-12-22 08:35:21.928000+00:00 | ['Time Series Analysis', 'Data Science', 'Facebook Prophet', 'Python', 'Artificial Intelligence']
2,044 | 11 Social-Media Marketing Tools to Bookmark Now | Want to save time?
Boost productivity?
Get organized?
Develop new, unicorn-level social-media strategies?
The workflow of a social-media marketer can be chaotic and overwhelming — but it doesn’t have to be.
Tools like MobileMonkey, Meet Edgar, and IFTTT, to name a few, help you get the job done and stay sane.
Every social-media marketer should have these 11 tools bookmarked for easy access (I know I do).
Discover them now!
Chatbot marketing is at the forefront of most digital marketers’ minds.
With unprecedented ROI (re: an average 80 percent open rate and 20 percent click-through rate on messages delivered through Facebook Messenger), building a chatbot is your №1 priority.
MobileMonkey is a simple and straightforward chatbot builder where you can create a chatbot in minutes without writing a single line of code.
That’s right — its drag-and-drop interface makes it easy as click-drag-type, and you can have a chatbot up and running in no time.
Did I mention it’s free?
Get on this now.
If you’re posting on multiple social-media accounts at once, then Hootsuite is a must.
It can help make that juggling act with multiple tabs and tons of copy-pasting a whole lot easier.
You can organize and schedule hundreds of posts on all your social-media accounts at once.
Notably, most of its features are free to use.
If you want to get a closer and more organized look at what’s trending or viral, try Tagboard.
It’s a social-listening tool that lets you enter a term, topic, or hashtag to see what’s buzzing.
You can use it to monitor brand or product mentions, or find out what hashtag is making waves.
That information can then give new content ideas and ways to engage the audience.
Standing for “if this, then that,” IFTTT is another one of my go-to social-media automation tools.
It allows you to set up recipes.
For example, you can create a recipe to automatically upload your Instagram posts to a Facebook Page’s album.
Or you can set up recipes that will tweet content from a specific user’s Twitter account, or you can sync your Instagram posts to a Pinterest board.
The possibilities are endless.
IFTTT is a major time-saver and a helpful automation tool for social-media marketers everywhere.
For visual marketers who use images and video, Tailwind is the answer.
It has hashtag lists and tons of shortcuts for your Instagram and Pinterest marketing.
Tailwind also lets you track the performance of your posts to see what works and what doesn’t.
Its competitive pricing makes Tailwind accessible to consultants, small businesses, and large agencies alike.
Visuals in your social-media posts may include photographs of places, objects, events, etc.
A dependable and affordable source of stock photography is an asset for online marketers.
Unsplash is one such website that offers over 810,000 photographs in its library.
The most amazing thing about Unsplash is that it’s free, as unbelievable as that sounds.
If you think your social-media posts are a mess and need something to organize them, Meet Edgar may be for you.
Use Meet Edgar to find old posts on your social-media profiles and reschedule them.
It also has a browser extension to easily add new content you may want to share.
Meet Edgar also lets you edit and update posts in bulk, saving you a lot of time and energy.
Keeping tabs on social-media competition can be rather tedious, but not so with Brand24.
This tool notifies you of sudden changes in conversations.
That can help you track down whatever interactions may affect your image.
Data in Brand24 can be filtered however you want and exported to PDF, spreadsheet, or infographics.
If you’re looking to get hardcore with your metrics, Brand24 is great.
If you’re worried about the grammar in your written content, then Grammarly has you covered.
It’s a great all-in-one online grammar, spell-checking, and plagiarism detection tool.
Using Grammarly can make sure your content is both well-written and original.
Most people don’t have either money for Photoshop or the know-how to properly use it.
Canva is for those who need visuals to go with their content, but need something free and easy.
The drag-and-drop interface makes it very easy for anyone to create good-looking visuals.
It also gives you access to over a million photographs, graphics, and fonts.
Both design novices and professionals can benefit from using Canva for their social-media marketing.
This can be considered the online marketer’s multi-tool with its versatility and effectiveness.
BuzzSumo is one of the best tools ever because it can help you find fresh content on the web.
You can enter a topic or keyword to get a breakdown of what’s getting engagements.
It also analyzes domains and back-links, as well as lists of influencers who are sharing that content.
BuzzSumo is a great tool for all sorts of content marketing and social-media campaigns.
Be a Unicorn in a Sea of Donkeys
Get my very best Unicorn marketing & entrepreneurship growth hacks:
2. Sign up for occasional Facebook Messenger Marketing news & tips via Facebook Messenger.
About the Author
Larry Kim is the CEO of MobileMonkey — provider of the World’s Best Facebook Messenger Marketing Platform. He’s also the founder of WordStream.
You can connect with him on Facebook Messenger, Twitter, LinkedIn, Instagram.
Originally Published on Inc.com | https://medium.com/marketing-and-entrepreneurship/11-social-media-marketing-tools-to-bookmark-now-bf453555639c | ['Larry Kim'] | 2019-06-19 10:26:01.098000+00:00 | ['Marketing', 'Life Hacking', 'Tools', 'Entrepreneurship', 'Social Media']
2,045 | Why it’s Terrifying to Start Writing Again | Hello Muddah, hello Faddah
Here I am at Camp Grenada
Camp is very entertaining
And they say we'll have some fun if it stops raining (Written by Allen Sherman-Sung by Mel Brooks)
Writing is terrifying, especially if you’re afraid of everything, which I am. I do a bang-up job of acting fierce but come behind the curtain, I’m a panic-stricken fool. But, do you know what’s scarier than writing? Not writing. Because when you stop, it feels like you’ve been thrown off the ship into a tropical cyclone without a life jacket. Not that a lifejacket would be much help in a tropical cyclone, but it would be something, and sometimes something keeps you going.
I grew up singing this satirical song about camp. The first lines are shown above. The song is about a postcard written by a camper to his mother and father (Muddah and Faddah). It’s all about the perils of camp. The singing postcard describes poison ivy, missing children, kids eaten by bears, having to read Ulysses. The end of the postcard reveals it’s only been one day. He concludes that all is well now and his parents should disregard his letter.
As an adult, I see this song as a metaphor for adulthood. Days are long. Mondays seem like they should be Fridays. I would love to occasionally write a postcard, asking someone to pick me up and take me out of this one long day, but who would I send it to?
As a writer, the longest day is the day I return to the page after a self-imposed hiatus. You see, writing can feel like an unnecessary act. Like in a cult, once you start questioning those hive beliefs, you want to get the hell out of there. Writing is like that. You shouldn’t question why you’re doing it. It breaks the spell.
Here’s my question. Do you keep writing even when you have nothing to give to the page? Or, do you take breaks so you can reboot, reflect and shift your gaze?
When I keep writing, and there's no light in my attic, I feel like I’m writing one long sentence. Not necessarily a bad sentence, but not a particularly interesting one either. More tortoise than hare.
When I stop to take a break from writing, however, I get scared that I’ll never return. Like I’ve been thrown off a boat into a cyclone and once the storm clears, I can’t see the land or the boat. I am lost.
Once I return to the page, after all this water-treading, I doubt myself. I wonder how I sailed this writing ship before. What muscles did I use to lift these thoughts? What routes did my brain travel to connect the words in my brain to the words on the page? I also doubt the navigation I previously used to find my way. Aren’t all the stars dead? Why was I using them to map my way? I am somewhere, but I am still lost.
I used to be a swimmer. When I raced, you got one false start. After two, you were disqualified. The officials realized some people were using the first false start to their advantage, so now you only get one false start, then you’re history. Go back to the locker room, change, and go home.
Writing has a lot of false starts. One false start after the other. No one comes in and says “Get off the block, ya’ done”, but it feels that way. You keep starting over, again and again. Every time you return to the page, you’re at the beginning. You have to remember how to dive off every time.
This week, I took off two days from writing. I still scribbled down ideas and potential titles, but I was off the block. The block was a mile away. It’s always terrifying.
This morning, I walked back towards the starting block. My chest tightened. I stretched out my brain by inhaling and exhaling. I took a swig from my water bottle, which for writers is coffee. I climbed onto the block. Instead of fear-shaking, I bent over and grabbed the part where my toes go. I squeezed it. I said ‘I’m not afraid of you, block.’ I was a little afraid, but what difference did that make? I’m more afraid of not writing than of writing, so I’m just going to stand here until the starter's gun goes off. I can’t see the finish line, but it’s something, and sometimes something keeps you going. | https://medium.com/illumination/why-its-terrifying-to-start-writing-again-3a32446cdfc9 | ['Amy Culberg'] | 2020-12-23 20:28:21.390000+00:00 | ['Fear', 'Writing Tips', 'Self-awareness', 'Self', 'Writing']
2,046 | GEOINT App: Using web maps as the spatial ground truth. | The Web Map Use Case
Loading a web map can sometimes take a while and should therefore be carried out in the background. ArcGIS Runtime supports the so-called loadable design pattern. Accessing web maps and operational data structured in layers requires the resources to initialize their state asynchronously. The loadable design pattern reflects the behavior that online resources use to load data asynchronously. The pattern also provides a retry mechanism if previous attempts to load have failed, so we can properly handle and resolve situations such as network outages. Online resources process simultaneous and repeated load requests accordingly, and a pending request can be canceled so that we can stop loading a resource.
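ArcGIS Runtime implements this contract in its native core, but the idea is easy to sketch outside of any SDK. The Python below is an illustrative, SDK-agnostic model of a loadable resource; the class and method names are ours for illustration, not the ArcGIS Runtime API.

```python
import asyncio
from enum import Enum

class LoadStatus(Enum):
    NOT_LOADED = "not_loaded"
    LOADING = "loading"
    LOADED = "loaded"
    FAILED_TO_LOAD = "failed_to_load"

class LoadableResource:
    """Toy stand-in for a loadable resource such as a web map or layer."""

    def __init__(self, fetch):
        self._fetch = fetch                 # async callable doing the real I/O
        self.status = LoadStatus.NOT_LOADED
        self.error = None
        self._task = None

    async def load(self):
        # A loaded or failed resource completes immediately; a failed one
        # must be retried explicitly via retry_load().
        if self.status in (LoadStatus.LOADED, LoadStatus.FAILED_TO_LOAD):
            return
        # Simultaneous and repeated calls share one in-flight load.
        if self._task is None or self._task.done():
            self._task = asyncio.ensure_future(self._do_load())
        await self._task

    async def retry_load(self):
        # A failed load may be retried, e.g. after a network outage.
        if self.status == LoadStatus.FAILED_TO_LOAD:
            self.status = LoadStatus.NOT_LOADED
            self._task = None
        await self.load()

    def cancel_load(self):
        # Callers may stop a load that is still in flight.
        if self._task is not None and not self._task.done():
            self._task.cancel()

    async def _do_load(self):
        self.status = LoadStatus.LOADING
        try:
            await self._fetch()
            self.status = LoadStatus.LOADED
        except asyncio.CancelledError:
            self.status = LoadStatus.NOT_LOADED
            raise
        except Exception as exc:            # load failed; keep the cause
            self.error = exc
            self.status = LoadStatus.FAILED_TO_LOAD
```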
The ACLED events are represented as two feature layers of this web map. Both feature layers represent the events as data records, so-called features. The events are visualized using unique value and simple renderers. An event feature has a unique primary key and a point geometry. All features of a feature layer always have the same geometry type and the same spatial reference. This restriction not only has advantages in terms of rendering performance, but also in evaluating the features. It also allows the features to be spatially related to one another in a simple manner.
When querying the features you have to make sure that the corresponding layer is fully loaded first. You get access to the attributes and geometries of every feature. A query can not only filter the resulting feature set, but also define which attributes and whether or not the geometries should be returned.
After investigating a bunch of ACLED items from the Living atlas we took a closer look at one of the items the “Bureau of Conflict and Stabilization Operations” (CSO) published in 2019. The web map was last updated on December 4th in 2019 and can be easily accessed using this item.
If we reuse the examples from Proof of concept — Ramp-up, we just have to replace the default map by using the item. We need to create a new portal item instance using the item’s url and pass this instance into the constructor of the map instance. Compile and run the sample map viewer, and after the web map is loaded we should see an ACLED layer showing on top of a dark-gray basemap.
JavaFX based ACLED web map sample
WPF based ACLED web map sample
Qt Quick based ACLED web map sample
Each SDK uses the language-specific best practices for getting the job done. In Java you create and register a listener using the map or layer instance. When developing with C# you do the same; the listener is just called an event handler, and you can enjoy the async/await pattern. In Qt you use the specific signals and define slots for handling the map and layer events.
We defined a simple use case for stressing the map viewer samples. When the map is fully loaded we just register a map view tap listener/handler/slot. By tapping the map view, a query is scheduled against the feature layer representing the ACLED events. The returned result is analyzed by using the primary key and the returned geometry of each feature. The sample map viewer defines a generic Hashtable/Dictionary/Hash managing all returned features, using the ID as key and the feature itself as value. Whenever a new query result is obtained, a feature is only added when its ID is not already known; if the ID is known, the geometries of both features are compared using the underlying Geometry Engine implementation of the C++ based runtime core. If the geometry has changed, the old feature is replaced by the new feature. The feature layer contains 13,684 features, all having valid, non-empty geometry representations. When analyzing the point geometries by just using their raw coordinates, we saw that 6,301 unique locations were represented by those features. If we wanted to know the real spatial distribution of these ACLED events, we would create a spatial grid and classify locations near each other as a match.
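Stripped of SDK types, the bookkeeping for each query result amounts to something like the sketch below. In the real viewers the comparison is delegated to the runtime's GeometryEngine rather than done on raw coordinates, and the names here are ours for illustration.

```python
known_features = {}  # primary key -> geometry last seen for that feature

def merge_query_result(result):
    """result: iterable of (feature_id, geometry) pairs from one query."""
    for feature_id, geometry in result:
        previous = known_features.get(feature_id)
        if previous is None:
            # First time this ID shows up: remember the feature.
            known_features[feature_id] = geometry
        elif previous != geometry:
            # Known ID with a changed geometry: replace the old feature.
            known_features[feature_id] = geometry
```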
Let us take a look at the following chart representing the overall memory consumption of all three sample map viewers during startup and on shutdown. | https://medium.com/geospatial-intelligence/geoint-app-using-web-maps-as-the-spatial-ground-truth-c8e716e87af8 | ['Jan Tschada'] | 2020-12-06 18:14:12.132000+00:00 | ['Dotnet', 'Geospatial', 'Qt', 'Java', 'Software Engineering']
2,047 | Keeping your story open and accessible for everyone to read | Keeping your story open and accessible for everyone to read
Common questions about Medium and its Partner Program.
The UX Collective is a platform that elevates unheard design voices all over the world. One of the channels we use for knowledge sharing is Medium, an online publishing platform that enables anyone to share their thoughts and build a community around ideas.
Below are some common questions around your options as an author when sharing your content on Medium.
What is the Medium Partner Program?
Medium has recently introduced a Partner Program: a way for writers to get paid for their writing. Writers can choose to submit their stories to the Program and, if the story is approved, they will get paid based on how readers are engaging with their articles. As members read longer, writers earn more.
Is my story going to sit behind a paywall?
No. By default, every new story is open for everyone.
Writers who are enrolled in the Medium Partner Program will have the option to place their stories behind the metered paywall. If they choose to do so, their stories are eligible to earn money. (Source)
Here’s what Ev Williams, the founder of Medium, has to say about the topic:
“It is free to publish on Medium, your content is always yours, and we will host it for free, indefinitely. This applies whether or not you choose to make money by putting your articles behind the Medium paywall or make them completely free. Yes, there is a paywall, which blocks reads after a few articles per month — but only if they were put behind the paywall by the writer (which, again, is optional). Many writers choose not to do so. But if they do, know that they are getting paid when members read their posts.”
How does the paywall work?
If you decide to close your story, it becomes “member-only”. In reality, readers who are not members will be able to see a number of closed stories for free per month, before they are asked to sign up for a Medium membership (US$5/month in most markets). Medium will also give you, the author, a “Friend Link”, which is a specific URL you can share with your network so they can read your article for free, regardless of having a Medium account or membership.
If you choose to keep your stories open, your story will remain free and open for every reader, including logged out users.
But why pay the writer?
Writing, like any other work, takes time. Not everyone has the privilege of being able to write and publish content for free, so it’s fair to give authors the option to get paid based on how many people read and engage with their stories. If you’re a writer and you don’t depend on making money on your writing, you can keep your stories open.
I believe in keeping design knowledge open and available to everyone — can I keep my articles open?
Yes. When you are about to publish your story on Medium, you’ll see a checkbox that gives you the option of whether your story will be open or closed. Your story will only be put behind a paywall if you check that box.
Your story will only be put behind a paywall if you check that box.
When your story is open, anyone can read it, without restrictions. The story will remain open and accessible to everyone.
As publication editors, we don’t have any influence or control over whether your story is part of the program or not. It’s your choice, and your choice only. As it should be.
Does Medium own my content?
According to the founder of Medium: “It is free to publish on Medium, your content is always yours, and we will host it for free, indefinitely.”
For more details, check Medium’s Terms of Service.
How much do writers who decide to close their stories get paid? | https://uxdesign.cc/can-i-keep-my-story-open-and-accessible-to-everyone-to-read-on-medium-ebb91751987 | ['Ux Collective Editors'] | 2020-10-11 01:14:55.466000+00:00 | ['UX', 'UI', 'User Experience', 'Startup', 'Design']
2,048 | Your Love of Big Breasts Isn’t Biologically Hardwired | I was thirteen the first time I was catcalled for having breasts. I developed early, much earlier than getting my first period. And from that moment on, despite my youthful face and obvious lack of sexual maturity, men felt obliged to comment on, stare at, and talk about my breasts.
“Straight men are just hardwired to find bigger breasts more attractive.”
This was a comment I got when I complained about men staring at my breasts with no sense of shame. It was aimed to excuse the men in question, to allow their behavior. And it wasn’t the only comment I got.
Over the years, there have been lots: dirty, smug, scathing comments about how men simply can’t resist looking at big honking bazonkers, and not only can they not resist, but biology is on their side for it.
There’s nothing I despise more than folks — usually straight cis men — using really bad human evolutionary psychology takes to defend their misogyny. You see it when men claim that women are naturally worse at science, better at nurturing, just hardwired to want kids. There’s just something so patronizing about this line of defense that sets my teeth on edge.
“I’m not sexist,” these men seem to say. “I’m not objectifying you on purpose. It’s just science.”
But the science they’re citing, in this case, is wrong. Let’s get into the various mistaken assumptions about breast size.
Bigger breasts have no reproductive value.
Let’s go basic biology for a minute. Traditionally, people are attracted to features that indicate the future of their potential offspring is strong. We like symmetrical features that signify healthy genes, smooth faces that indicate a lack of disease.
And one of the things these men seem to seize upon is the allegedly universal truth that bigger breasts mean a woman is more likely to be reproductively successful, and that’s why they’re so attracted to them.
Photo by Annie Spratt on Unsplash
The truth is that there is no evolutionary reason why men would prefer larger breasts. They’re not linked to higher fertility as a single trait, larger breasts don’t produce more milk for offspring, and if anything, larger breasts might signify that a woman is already pregnant which would count as a mark against her suitability as a mate.
Not only that but in terms of signaling reproductive readiness, they’re flawed at best. Many women develop breasts long before they’re fertile. Just as secondary sexual characteristics in men, like beards, aren’t universally attractive and don’t signify sexual virility or otherwise, breasts don’t either.
Breasts are not found universally attractive.
What a lot of people don’t realize is that many of the indications we take for attractiveness now are simply cultural. It’s not “hardwired” into us to find certain traits attractive, it’s drilled into us as a cultural preference.
Look at thinness, which is deemed a universally appealing trait. But the second you start to dig into any research that’s been done, you can see that women with higher BMIs tend to have more children, and children with higher birth weight, which would suggest that a higher BMI should be deemed sexually attractive. But it isn’t.
Look further afield and you’ll find one culture prefers “tubular” shaped women, instead of the traditional hourglass, whereas others prefer rounder figures because those signify a well-structured community that looks after its members.
In a 1951 study of 191 cultures, anthropologist Clellan Ford and ethologist Frank Beach reported that breasts were considered sexually important to men in 13 of those cultures. —Natalie Wolchover, via New Scientist
(I’ll leave aside the very worthy criticisms of BMI as a measurement for now.)
Nobody likes to talk about that, or about any of the other deviations of what people from different cultures find attractive because that contradicts popular perception of what people find attractive in society now. But big breasts are by no means something every culture deems sexually attractive.
Breast obsession is learned, not hardwired.
Here’s a pretty wild example: would you consider bound feet to be sexually desirable? Probably not — nowadays it’s viewed as a pretty controlling method which caused pain for the women it was inflicted upon.
Photo by Andalucía Andaluía on Unsplash
And yet, until fairly recently, footbinding was sexually appealing. This wasn’t due to some strange hardwired preference. Smaller feet didn’t signify a greater reproductive potential. It was simply a cultural preference tied up with a whole lot of weird misogyny about women being helpless.
Additionally, women can learn to fetishize breasts. There’s no reproductive benefit to women preferring to look at breasts, and yet in selected cultures, women do. That’s not a coincidence — it’s a sign that this kind of attraction is nurture, not nature.
In the cultural view, men aren’t so much biologically drawn to breasts as trained from an early age to find them erotic. — Natalie Wolchover via Live Science
The problem with the breast fetish
Well, what’s all the fuss? Why does it matter that men aren’t really hardwired to be obsessed with breasts? It’s because of the cultural significance placed on breasts.
Women are simultaneously told to cover up and display their breasts. We’re revered and shamed for them. They cement our position in society as mothers, caregivers, nurturers, while simultaneously casting us as harlots, provocateurs, shameless whores. Breasts are demonized when out of the control of men and put on the holiest of thresholds when in their possession.
“[Ms. Yalom] found very little in the record to indicate how women have felt about their breasts: whether they took pleasure in them, the extent to which they chose to display their breasts or if they had any say in the debate over wet-nursing.” — Natalie Angier, via the New York Times
In the past thirty years, there has been so much research done on the alleged importance of the female body — breasts included — and what men find attractive. And there’s been an unsurprising dearth of research that runs contrary to popular cultural expectations.
For example, look at the traditional story we’re told of women being the childbearing mothers staying at home collecting berries while the menfolk were all off hunting mammoths.
It begs the question: Why aren’t strong women seen as more attractive, given that the stronger, bigger, and broader women would have been more capable of protecting their children should. Why is it petite, dainty, helpless, big-breasted, small-waisted women who claim public adoration?
Photo by averie woodard on Unsplash
There’s no research done on this for the same reason the breast fetish isn’t questioned, for the same reason that there is next to no scrutiny on the body shapes and sizes that women prefer: because for the history of science, academia has had a vested interest in protecting the dominant worldview that breasts and the women attached to them are there for the consumption and pleasure of men.
I encourage scientists and readers alike to question their deep-held beliefs about universal attraction and the “natural” preferences and skills reported for men and women. Look at how these have changed over time, between and within cultures. Closely examine your prejudice and be brave enough to question it in the books you read, the people you speak with, and the beliefs you hold. | https://zulie.medium.com/your-love-of-big-breasts-isnt-biologically-hardwired-2f903209a13e | ['Zulie Rane'] | 2019-08-30 15:17:12.266000+00:00 | ['Equality', 'Sexuality', 'Psychology', 'Culture', 'Science'] | Title Love Big Breasts Isn’t Biologically HardwiredContent thirteen first time catcalled breast developed early much earlier getting first period moment despite youthful face obvious lack sexual maturity men felt obliged comment stare talk breast “Straight men hardwired find bigger breast attractive” comment got complained men staring breast sense shame aimed excuse men question allow behavior wasn’t comment got year lot dirty smug scathing comment men simply can’t resist looking big honking bazonkers resist biology side There’s nothing despise folk — usually straight ci men — using really bad human evolutionary psychology take defend misogyny see men claim woman naturally worse science better nurturing hardwired want kid There’s something patronizing line defense set teeth edge “I’m sexist” men seem say “I’m objectifying purpose It’s science” science they’re citing case wrong Let’s get various mistaken assumption breast size Bigger breast reproductive value Let’s go basic biology minute Traditionally people attracted feature indicate future potential offspring strong like symmetrical feature signify healthy gene smooth face indicate lack disease one thing men seem seize upon allegedly universal truth bigger breast mean woman likely reproductively successful that’s they’re attracted Photo Annie Spratt Unsplash truth evolutionary reason men would prefer larger breast They’re linked higher fertility single trait larger breast don’t produce milk offspring anything larger breast might signify woman already pregnant would count mark suitability mate term signaling reproductive readiness they’re flawed best Many woman develop breast long they’re fertile secondary sexual characteristic men like beard aren’t universally attractive don’t signify sexual virility otherwise breast don’t either Breasts found universally attractive lot people don’t realize many indication take attractiveness simply cultural It’s “hardwired” u find certain trait attractive it’s drilled u cultural preference Look thinness deemed universally appealing trait second start dig research that’s done see woman higher BMIs tend child child higher birth weight would suggest higher BMI deemed sexually attractive isn’t Look afield you’ll find one culture prefers “tubular” shaped woman instead traditional hourglass whereas others prefer rounder figure signify wellstructured community look member 1951 study 191 culture anthropologist Clellan Ford ethologist Frank Beach reported breast considered sexually important men 13 culture —Natalie Wolchover via New Scientist I’ll leave aside worthy criticism BMI measurement Nobody like talk deviation people different culture find attractive contradicts popular perception people find attractive society big breast mean something every culture deems sexually attractive Breast obsession learned hardwired Here’s pretty wild example would consider bound foot sexually desirable Probably — nowadays it’s viewed pretty 
controlling method caused pain woman inflicted upon Photo Andalucía Andaluía Unsplash yet fairly recently footbinding sexually appealing wasn’t due strange hardwired preference Smaller foot didn’t signify greater reproductive potential simply cultural preference tied whole lot weird misogyny woman helpless Additionally woman learn fetishize breast There’s reproductive benefit woman preferring look breast yet selected culture woman That’s coincidence — it’s sign kind attraction nurture nature cultural view men aren’t much biologically drawn breast trained early age find erotic — Natalie Wolchover via Live Science problem breast fetish Well what’s fuss matter men aren’t really hardwired obsessed breast It’s cultural significance placed breast Women simultaneously told cover display breast We’re revered shamed cement position society mother caregiver nurturers simultaneously casting u harlot provocateur shameless whore Breasts demonized control men put holiest threshold possession “Ms Yalom found little record indicate woman felt breast whether took pleasure extent chose display breast say debate wetnursing” — Natalie Angier via New York Times past thirty year much research done alleged importance female body — breast included — men find attractive there’s unsurprising dearth research run contrary popular cultural expectation example look traditional story we’re told woman childbearing mother staying home collecting berry menfolk hunting mammoth begs question aren’t strong woman seen attractive given stronger bigger broader woman would capable protecting child petite dainty helpless bigbreasted smallwaisted woman claim public adoration Photo averie woodard Unsplash There’s research done reason breast fetish isn’t questioned reason next scrutiny body shape size woman prefer history science academia vested interest protecting dominant worldview breast woman attached consumption pleasure men encourage scientist reader alike question deepheld belief universal attraction “natural” preference skill reported men woman Look changed time within culture Closely examine prejudice brave enough question book read people speak belief holdTags Equality Sexuality Psychology Culture Science |
2,049 | 20 Simple Ways To Reduce Your Environmental Impact While Travelling | As a current international gap year student, a large part of my life of late has involved travelling. Going abroad and experiencing a different country, without a doubt, is extremely valuable when it comes to fostering understanding and respect towards different cultures. Sadly, however, the act of hopping on a plane and going across countries is largely detrimental towards the environment.
Especially as someone who advocates for low-impact living, I’ve recently become hyper-aware of how damaging and hypocritical this can be, and I thought it would be necessary to address the contradictory nature of travelling as a vegan ‘environmentalist’. While an ideal world wouldn’t require any of these individual adjustments in the first place (read: Neoliberalism has Conned Us Into Fighting Climate Change as Individuals), I am still in the process of rectifying my passion for global understanding with reducing my carbon emissions.
In my eyes, travelling and interacting with different cultures will always be a fundamental way to connect with the world; the gift of nature is what motivates me every day to perform little acts of advocacy for a healthier planet and better future. Unfortunately, I fail to see an immediate way to cut out high-carbon travelling right now, but what I believe we can do in the meantime, is make lifestyle changes to reduce this colossal impact. Over the months, I have been paying extra attention to areas with potential for adjustment towards sustainability. In the process, I have acquired some tokens of advice on how we may lower our impact while travelling, which I’ve listed out below:
Slow Travel: Trekking in Nepal
1. Embrace slow, low-impact travel
The concept of slow travel was definitely drilled into me during my three months spent in Nepal. A 15-hour drive on a bumpy road? No problem. 11-hour trek days carrying all our gear on our backs at 15,000 ft altitude? Sure thing! 17 days of camping without a shower? Come at me…Not only do these practices promote ways to fully absorb everything that’s around you, they also lend to more opportunities to connect with people on the way and enjoy the process while you’re at it. Those 15-hour bus rides without any devices definitely enabled me to bond with my peers and during the times where we were not socialising, it reminded me of the important ‘skill’ of being bored as I was able to utilise this time to reconnect with previous neglected thoughts.
2. Bring your own container, utensils, napkin, earphones/headphones, and blanket on the plane to avoid the packaged ones they provide. And for when you’re eating local street food in places like Thailand or Taipei! If, for whatever reason, you’re still missing some of these ‘zero-waste’ essentials, Net Zero Co kindly sent me some of their products and I am loving them so far! Of course, repurpose what you have before purchasing anything you ‘need’, but their website contains pretty much anything you’re looking for — even if it’s just inspiration.
Reusable container and utensils from Net Zero Co
3. Bring an empty water bottle, cup or mug to fill on the plane (just ask the flight attendants!). Or, if you’re in Hong Kong, bring it to fill at the water dispensers which can be found at every few departure gates. While filling your water bottle on the plane, it’s also a great way to spark conversation with flight attendants when they’re not busy serving food, demonstrating safety procedures, etc. It might be interesting to learn about where the food goes when it’s not eaten! Download water fountain apps if possible (e.g. Water for Free in HK), and a Steri pen or Life Straw can also be a worthwhile investment when it comes to purifying tap water in countries where it’s unsafe to drink out of the faucet directly.
4. Pack lightly; choose carry-on if possible. This saves money AND reduces the need for transportation fuel. I wasn’t aware of this before, but the extra bags do add up and require extra fuel to transport.
The Rainbow Bus to the Farmers Market in Byron Bay, Australia
5. Shop at zero-waste bulk food stores and/or support local farmers markets. Make sure you go with a reusable bag or container — bring multiple in case your friends forget! Buying local means that a) you’re supporting their economy, b) reducing food miles — the food didn’t have to travel huge distances to get to where you are (according to Levi “save the world” Hildebrand, buying local cuts down on the average 1500 miles that food travels to be on your plate), c) it’s probably cheaper, and d) the produce is likely to be more fresh and nutritious!
6. Don’t buy souvenirs. This applies to anywhere you are: buy experiences, not things. You’ll save money, form more memories, and have more stories to share! Avoid falling into the trap of consumerism, capitalism, and pretty much any ‘ism’ that begins with c…Want to show your friends you care about them? Share videos of you in different places telling them how much you love them! And if you really want to bring home a souvenir, buy something that is made locally. These can often be found at different artisan/handmade markets.
7. Avoid transfer flights when possible. Taking off and landing are what generates the most carbon run-off and can be easily avoided. If possible, take flights that are of a higher priority within the airport and/or those with a built-in carbon offsetting program. This means they are less likely to linger around in the airport — emitting more damaging and unnecessary chemicals into the environment. Cheap, ‘affordable’ flights are often the ones that cause your carbon footprint to soar.
Compost Bins at Grampians Eco YHA
8. Avoid food wastage. This applies wherever you are in the world, but only buy what you need if cooking at home or eating out! As I mention in this article, not only does wasting food waste all the resources that went in to its production — from the water and energy used to produce and transport it, to the nutritional value it once contained — when food decomposes in the landfills, it also emits methane gas, which is 21 times more potent than CO2 — leaving an even greater impact on climate change. If you’re concerned about disliking the food on the plane, bring your own snacks in a reusable container (refer to tip #2) instead!
9. Walk — or run — everywhere! This is the best way to explore the area that you’re in; it’s cheap, fun, and sustainable. If you’re not a fan of either, try renting a convenient form of transportation. You can do this with rental systems such as SmartBike in Hong Kong, Lime in the U.S. and some places in Europe and Australia, Bird in LA, City Cycle in Brisbane, the list goes on…Alternatively, you can join a free walking tour, hop-on hop-off tour bus, or just jump on a metro and explore!
10. Do research before you enter the country you’re visiting! See if they have food waste apps available such as OLIO and Too Good To Go or events such as dumpster diving. These are great ways to reduce food waste while meeting people abroad. Before visiting Brisbane, I found out that they have a community herb garden where you can collect herbs to bring home — for free! While I never ended up grabbing any, for those planning to stay longer term, this would be the perfect way to reduce your costs and environmental impact, while potentially making some like-minded friends.
A second-hand book!
11. Invest in a Kindle/download the Kindle app on your phone/tablet. I love bookstores as much as the next person, but maybe this time, you could use the store as your browser, then simply purchase the book off the Kindle app instead. Since libraries are unlikely to be accessible for one-off, short-term travellers, downloading them onto your devices can be an easy way to access your book everywhere you go. Alternatively, you can scout out second-hand book stores (I got this book photographed on the left from a pre-loved store in Byron Bay, for example). It’s likely to be cheaper, too!
12. If staying at a hotel/place with room service, ask the cleaners to NOT wash your sheets/towels, etc every day. Perhaps I’m overgeneralising, but I doubt you’re so dirty that your bedsheets require furnishing after a single night’s sleep. I’m sure you don’t change your sheets at home every day, so it should be no different while you’re away!
13. Bring your own toothbrush, toothpaste (which, again — you can get as tablets or in a jar from Live Zero or Slowood), creams, shampoo, conditioner, soap, safety razor, instead of using ones at hotels that will come in plastic packaging.
A Vegan Meal
14. Eat less meat! An obvious one coming from me, but an important reminder nevertheless. Sure, it’s great to dabble in different cultural cuisines, but once you’ve had a try, it’s a good idea to cut down on the meat consumption — especially beef. If you need a refresher as to why this industry is particularly damaging towards our environment, check out this article.
15. Plogging! Pick up trash whenever you see any on the beach/streets/wherever you go. Or join a local beach clean-up to give back to the environment you’re in.
16. Travel differently: Try WWOOFing (a form of work exchange where you work on an organic farm in exchange for food and accommodation), backpacking (learning to live simplistically out of your backpack), camping (this includes living in a teepee like the one pictured below), travelling and sleeping in a campervan, and — for those looking for something more extreme, you can even try living in a hammock. I read about a girl who did this while travelling Australia, who would bring her hammock around and sleep in people’s backyards for no cost!
Alternatively, if you want something a bit more conventional, look for hostels or hotels with a specific emphasis on being eco-friendly. In Australia, I stayed at the Grampians Eco YHA for a night, and I was impressed by all their sustainability initiatives. They had a herb garden (where residents could take herbs from for free), vermicompost box, chickens with free-range eggs, etc — many factors which contribute to a sustainable food system.
For a slightly more cultural experience, working as an au pair or living with a homestay family works wonders. I did this in Nepal for over a month, and it was one of the most valuable travel experiences I’ve had! Not only was my host family great company, but I also learnt so much more about the area than I would have had I been staying on my own.
Low-impact travel
17. Use reef-friendly sunscreen. It turns out that oxybenzone — an ingredient commonly found in conventional sunscreens — combined with warmer water temperatures is a leading cause of coral bleaching. This mixture disrupts the fish and wildlife, leaches coral of its nutrients and bleaches it white. Not only does this affect the habitat itself, but also harms local economies which depend on tourism that the coral reef attracts. Therefore, when purchasing sunscreen for your next vacation, seek out ‘reef-friendly’ sunscreen which doesn’t contain any toxic chemicals or substances.
18. LEARN their ways. One of the first things I did upon arriving Australia was attend an aboriginal walking tour that led me through the different ways in which indigenous communities have been preserving the land for years. This set the tone for how I’d come to interact with and appreciate different natural landscapes while navigating the country, and it left me with some useful insight on how we can become better stewards of the earth even back home. Try to attend workshops and talks where you can learn something valuable and transfer that knowledge back to your home community!
Canoeing the Noosa Everglades
19. Support tours that don’t destroy habitats. Rather than go on a speed boat that disturbs the serenity of marine habitat, why not try kayaking instead? In Byron Bay, I went on a dolphin kayaking tour, where the guides made a special emphasis not to disrupt the natural movement of the sea life. This is the best way to experience nature while getting some exercise in! Other forms of sustainable exploration include canoeing, cycling, walking, etc.
20. Stay for a longer length of time. There’s a difference between being a tourist and a traveller. While the former can be achieved within a couple of days of landmark-hopping, the latter takes more time and effort but allows you to connect more deeply with a country. By remaining in one place for a longer period of time, not only will you be able to see and do more, you can get much more out of your stay both socially and culturally.
With business, school, and individual trips on the rise, it doesn’t look like overseas travel is going anywhere anytime soon. However, the worst thing we can do is dwell on the fact that we’re doing something ‘wrong’. We can make changes to mitigate our impact, and the simplest thing we can do is reframe our mindset and recognize this reality. Keep in mind, however, that while small lifestyle changes are great, these are best carried out in conjunction with other, more grand acts such as demanding systemic change. I hope this post gave you a bit of inspiration as to how to travel more sustainably, and do let me know if you have any feedback or other ideas!
— — —
If you found this article insightful, please do give it some claps (you can clap up to 50 times)! This goes a long way in helping me reach more people with my work. Also, you can find me on Instagram for more related content!
Originally published at https://www.veganhkblog.com. | https://medium.com/climate-conscious/20-simple-ways-to-reduce-your-environmental-impact-while-travelling-d787e3156966 | ['Eugenia Chow'] | 2020-07-14 13:56:17.158000+00:00 | ['Travel', 'Sustainability', 'Climate Action', 'Culture', 'Environment'] | Title 20 Simple Ways Reduce Environmental Impact TravellingContent current international gap year student large part life late involved travelling Going abroad experiencing different country without doubt extremely valuable come fostering understanding respect towards different culture Sadly however act hopping plane going across country largely detrimental towards environment Especially someone advocate lowimpact living I’ve recently become hyperaware damaging hypocritical thought would necessary address contradictory nature travelling vegan ‘environmentalist’ ideal world wouldn’t require individual adjustment first place read Neoliberalism Conned Us Fighting Climate Change Individuals still process rectifying passion global understanding reducing carbon emission eye travelling interacting different culture always fundamental way connect world gift nature motivates every day perform little act advocacy healthier planet better future Unfortunately fail see immediate way cut highcarbon travelling right believe meantime make lifestyle change reduce colossal impact month paying extra attention area potential adjustment towards sustainability process acquired token advice may lower impact travelling I’ve listed Slow Travel Trekking Nepal 1 Embrace slow lowimpact travel concept slow travel definitely drilled three month spent Nepal 15hour drive bumpy road problem 11hour trek day carrying gear back 15000 ft altitude Sure thing 17 day camping without shower Come me…Not practice promote way fully absorb everything that’s around also lend opportunity connect people way enjoy process you’re 15hour bus ride without device definitely enabled bond peer time socialising reminded important ‘skill’ bored able utilise time reconnect previous neglected thought 2 Bring container utensil napkin earphonesheadphones blanket plane avoid packaged one provide you’re eating local street food place like Thailand Taipei whatever reason you’re still missing ‘zerowaste’ essential Net Zero Co kindly sent product loving far course repurpose purchasing anything ‘need’ website contains pretty much anything you’re looking — even it’s inspiration Reusable container utensil Net Zero Co 3 Bring empty water bottle cup mug fill plane ask flight attendant you’re Hong Kong bring fill water dispenser found every departure gate filling water bottle plane it’s also great way spark conversation flight attendant they’re busy serving food demonstrating safety procedure etc might interesting learn food go it’s eaten Download water fountain apps possible eg Water Free HK Steri pen Life Straw also worthwhile investment come purifying tap water country it’s unsafe drink faucet directly 4 Pack lightly choose carryon possible save money reduces need transportation fuel wasn’t aware extra bag add require extra fuel transport Rainbow Bus Farmers Market Byron Bay Australia 5 Shop zerowaste bulk food store andor support local farmer market Make sure go reusable bag container — bring multiple case friend forget Buying local mean you’re supporting economy b reducing food mile — food didn’t travel huge distance get according Levi “save world” Hildebrand buying local cut average 1500 mile food travel plate c it’s probably cheaper produce likely fresh 
nutritious 6 Don’t buy souvenir applies anywhere buy experience thing You’ll save money form memory story share Avoid falling trap consumerism capitalism pretty much ‘ism’ begin c…Want show friend care Share video different place telling much love really want bring home souvenir buy something made locally often found different artisanhandmade market 7 Avoid transfer flight possible Taking landing generates carbon runoff easily avoided possible take flight higher priority within airport andor builtin carbon offsetting program mean le likely linger around airport — emitting damaging unnecessary chemical environment Cheap ‘affordable’ flight often one cause carbon footprint soar Compost Bins Grampians Eco YHA 8 Avoid food wastage applies wherever world buy need cooking home eating mention article wasting food waste resource went production — water energy used produce transport nutritional value contained — food decomposes landfill also emits methane gas 21 time potent CO2 — leaving even greater impact climate change you’re concerned disliking food plane bring snack reusable container refer tip 2 instead 9 Walk — run — everywhere best way explore area you’re it’s cheap fun sustainable you’re fan either try renting convenient form transportation rental system SmartBike Hong Kong Lime US place Europe Australia Bird LA City Cycle Brisbane list go on…Alternatively join free walking tour hopon hopoff tour bus jump metro explore 10 research enter country you’re visiting See food waste apps available OLIO Good Go event dumpster diving great way reduce food waste meeting people abroad visiting Brisbane found community herb garden collect herb bring home — free never ended grabbing planning stay longer term would perfect way reduce cost environmental impact potentially making likeminded friend secondhand book 11 Invest Kindledownload Kindle app phonetablet love bookstore much next person maybe time could use store browser simply purchase book Kindle app instead Since library unlikely accessible oneoff shortterm traveller downloading onto device easy way access book everywhere go Alternatively scout secondhand book store got book photographed left preloved store Byron Bay example It’s likely cheaper 12 staying hotelplace room service ask cleaner wash sheetstowels etc every day Perhaps I’m overgeneralising doubt you’re dirty bedsheets require furnishing single night’s sleep I’m sure don’t change sheet home every day different you’re away 13 Bring toothbrush toothpaste — get tablet jar Live Zero Slowood cream shampoo conditioner soap safety razor instead using one hotel come plastic packaging Vegan Meal 14 Eat le meat obvious one coming important reminder nevertheless Sure it’s great dabble different cultural cuisine you’ve try it’s good idea cut meat consumption — especially beef need refresher industry particularly damaging towards environment check article 15 Plogging Pick trash whenever see beachstreetswherever go join local beach cleanup give back environment you’re 16 Travel differently Try WWOOFing form work exchange work organic farm exchange food accommodation backpacking learning live simplistically backpack camping includes living teepee like one pictured travelling sleeping campervan — looking something extreme even try living hammock read girl travelling Australia would bring hammock around sleep people’s backyard cost Alternatively want something bit conventional look hostel hotel specific emphasis ecofriendly Australia stayed Grampians Eco YHA night impressed sustainability initiative herb 
garden resident could take herb free vermicompost box chicken freerange egg etc — many factor contribute sustainable food system slightly cultural experience working au pair living homestay family work wonder Nepal month one valuable travel experience I’ve host family great company also learnt much area would staying Lowimpact travel 17 Use reeffriendly sunscreen turn oxybenzone — ingredient commonly found conventional sunscreen — combined warmer water temperature leading cause coral bleaching mixture disrupts fish wildlife leach coral nutrient bleach white affect habitat also harm local economy depend tourism coral reef attracts Therefore purchasing sunscreen next vacation seek ‘reeffriendly’ sunscreen doesn’t contain toxic chemical substance 18 LEARN way One first thing upon arriving Australia attend aboriginal walking tour led different way indigenous community preserving land year set tone I’d come interact appreciate different natural landscape navigating country left useful insight become better steward earth even back home Try attend workshop talk learn something valuable transfer knowledge back home community Canoeing Noosa Everglades 19 Support tour don’t destroy habitat Rather go speed boat disturbs serenity marine habitat try kayaking instead Byron Bay went dolphin kayaking tour guide made special emphasis disrupt natural movement sea life best way experience nature getting exercise form sustainable exploration include canoeing cycling walking etc 20 Stay longer length time There’s difference tourist traveller former achieved within couple day landmarkhopping latter take time effort allows connect deeply country remaining one place longer period time able see get much stay socially culturally business school individual trip rise doesn’t look like overseas travel going anywhere anytime soon However worst thing dwell fact we’re something ‘wrong’ make change mitigate impact simplest thing reframe mindset recognize reality Keep mind however small lifestyle change great best carried conjunction grand act demanding systemic change hope post gave bit inspiration travel sustainably let know feedback idea — — — found article insightful please give clap clap 50 time go long way helping reach people work Also find Instagram related content Originally published httpswwwveganhkblogcomTags Travel Sustainability Climate Action Culture Environment |
2,050 | 10 free tools to help you grow your business | “Most businesses actually get zero distribution channels to work. Poor distribution — not product — is the number one cause of failure.” This Peter Thiel quote should be heeded by every startup founder reading this. Honestly, I’ve made this mistake myself, in my previous failed entrepreneurial experience.
And how many times you’ve seen friends, co-workers or teams pitching their ideas, where there’s a fantastic team, a brilliant product but zero effort on understanding how to distribute the whole package.
Without an excellent distribution model, your business will fail. If you build it, they won’t come. Marketing is at the heart of any good distribution framework, and as a fledgling startup, it’s critical that your company finds cost-effective tools to help amplify your network as quickly as possible.
That’s where this list comes in. Each tool is free to use at least for a “freemium” plan, and will make a genuinely positive impact on your company.
1. SEMrush: From doing competitive analysis, to SEO keyword research, SEMrush is an incredibly powerful tool that will empower you and your colleagues to appraise your company’s online performance in minutes. Their free to-use starter plan provides insightful data that other company’s would make you pay to see. It’s a must use platform, without a doubt.
2. BuzzSumo: If you’re interested in harnessing the power of content marketing, but don’t know where to begin, BuzzSumo is the tool to use. Simply enter a URL or keyword, and this social media monitoring tool will show you the 10 most-shared articles for free. While you have to pay to see other highly shared articles, a top 10 list associated with a specific keyword or website will help you jump-start your content marketing strategy.
3. Canva: Gone are the days where you needed to rely on an Adobe Illustrator expert to create a great looking logo, blog post header, or social media background. Canva is an incredibly intuitive design tool that empowers business owners to quickly create professional graphics.
4. Google Analytics: Probably no introductions needed here. Stop wondering how many people are visiting your website, from where, via what method. Google Analytics provides users a set of rich information, perfect for a new startup interested in analyzing user behavior. Use the “behavior flow” tool to see what pages most of your visitors view first, and to see how visitors explore your website from there. That’ll help you to better optimize their site for UX.
5. GetSocial: Virality is a startup founder’s best friend. You can’t create it without highly clickable social share buttons. That’s where GetSocial comes in, with its social media app store that helps websites improve their traffic, shares, followers and conversions. Also, they’ve optimized the whole mobile social sharing experience.
6. Buffer: Becoming an influencer on social media has never been so easy. Buffer allows you to schedule 10 Tweets, LinkedIn posts or, Facebook posts and tracks all key metrics. Plus, the platform will suggest relevant content for you to share, so that you can grow your audience by providing valuable and relevant content. Buffer is one of the best social media monitoring platforms around and is a must use for founders.
7. Trello: This task management platform is free to use, and will help you stay organized as a business and as a marketer. You can create segmented columns with Trello, which will help you stay on-top of various marketing initiatives like blog posting, social media, and email marketing. Plus you can share Trello boards with your team, that way everyone will be aligned on what needs to get done. On a side note, I use Trello for everything: from shopping list, to finding an apartment to rent, to our day-to-day product management. I also love the use case from the guys at Uservoice.
8. Hubspot Marketing Grader: Hubspot is an all-in-one marketing automation system. While it costs quite a bit to actually use Hubspot, the company offers the Hubspot Marketing Grader that will analyze the overall performance of your marketing strategy online. Use insights from Hubspot to understand what is working and what needs to be fixed if your business is to scale.
9. MozBar: Learn why various website are ranking on Google with the MozBar. This Chrome and FireFox extension shows users ranking factors like page authority, and social media performance as they browse the web. It’s an ideal free tool for founders interested in better understand their competitors and SEO in general.
10 Headline Analyzer: Whether you’re writing a blog post, titling a new page on your website, or editing your pitch deck, headlines have a huge impact on the overall performance of a written marketing initiative. That’s where Co.Schedule’s Headline Analyzer Tool comes in. Simply paste your headline into the tool and it will grade your headline on an F to A scale for virality.
While creating a product that customers can’t resist is a critical component to building any successful startup, building a marketing machine is another key component to creating a business that scales quickly. These 10 free to use marketing tools are sure to make it easier for any founder to grow his or her business quickly. | https://medium.com/getsocial-io/10-free-tools-to-help-you-grow-your-business-a9ab8d73d997 | ['João Romão'] | 2017-07-07 08:41:28.319000+00:00 | ['Growth Hacking', 'Digital Marketing', 'Startup', 'Online Marketing', 'Marketing'] | Title 10 free tool help grow businessContent “Most business actually get zero distribution channel work Poor distribution — product — number one cause failure” Peter Thiel quote heeded every startup founder reading Honestly I’ve made mistake previous failed entrepreneurial experience many time you’ve seen friend coworkers team pitching idea there’s fantastic team brilliant product zero effort understanding distribute whole package Without excellent distribution model business fail build won’t come Marketing heart good distribution framework fledgling startup it’s critical company find costeffective tool help amplify network quickly possible That’s list come tool free use least “freemium” plan make genuinely positive impact company 1 SEMrush competitive analysis SEO keyword research SEMrush incredibly powerful tool empower colleague appraise company’s online performance minute free touse starter plan provides insightful data company’s would make pay see It’s must use platform without doubt 2 BuzzSumo you’re interested harnessing power content marketing don’t know begin BuzzSumo tool use Simply enter URL keyword social medium monitoring tool show 10 mostshared article free pay see highly shared article top 10 list associated specific keyword website help jumpstart content marketing strategy 3 Canva Gone day needed rely Adobe Illustrator expert create great looking logo blog post header social medium background Canva incredibly intuitive design tool empowers business owner quickly create professional graphic 4 Google Analytics Probably introduction needed Stop wondering many people visiting website via method Google Analytics provides user set rich information perfect new startup interested analyzing user behavior Use “behavior flow” tool see page visitor view first see visitor explore website That’ll help better optimize site UX 5 GetSocial Virality startup founder’s best friend can’t create without highly clickable social share button That’s GetSocial come social medium app store help website improve traffic share follower conversion Also they’ve optimized whole mobile social sharing experience 6 Buffer Becoming influencer social medium never easy Buffer allows schedule 10 Tweets LinkedIn post Facebook post track key metric Plus platform suggest relevant content share grow audience providing valuable relevant content Buffer one best social medium monitoring platform around must use founder 7 Trello task management platform free use help stay organized business marketer create segmented column Trello help stay ontop various marketing initiative like blog posting social medium email marketing Plus share Trello board team way everyone aligned need get done side note use Trello everything shopping list finding apartment rent daytoday product management also love use case guy Uservoice 8 Hubspot Marketing Grader Hubspot allinone marketing automation system cost quite bit actually use Hubspot company offer Hubspot Marketing Grader analyze overall 
performance marketing strategy online Use insight Hubspot understand working need fixed business scale 9 MozBar Learn various website ranking Google MozBar Chrome FireFox extension show user ranking factor like page authority social medium performance browse web It’s ideal free tool founder interested better understand competitor SEO general 10 Headline Analyzer Whether you’re writing blog post titling new page website editing pitch deck headline huge impact overall performance written marketing initiative That’s CoSchedule’s Headline Analyzer Tool come Simply paste headline tool grade headline F scale virality creating product customer can’t resist critical component building successful startup building marketing machine another key component creating business scale quickly 10 free use marketing tool sure make easier founder grow business quicklyTags Growth Hacking Digital Marketing Startup Online Marketing Marketing |
2,051 | Mental Health vs. Strong Body | Since there were only a few things I could control entirely, one of them was to get fit.
In my entire 20s I have been slightly chubby, as I loved, and still do (whom am I kidding?) eating anything made from dough: pastries, bread, pasta, you name it. If it screamed carbohydrates, it was my go-to meal.
I have always had this excuse saying that if I ever wanted to lose weight I will do it. The only problem with this — let’s call it on its real name, a delusion — was that it’s been ages since I kept selling myself this, which only made me even more comfortable with the kilos I was adding on the scale on each passing year.
It was January 2018 when I have decided to make a change. While I was not actively thinking about motherhood, the mental seed was always there, waiting. So I have decided to try to lose some weight. I hadn’t had any target but I was sure as hell I didn’t want to reach a new weight milestone on the scale which I would become comfortable with it, too. I was 150 cm in height and weighing 59 kilos.
Since 2017 June, I have stopped eating other meat than fish, and that would have been only on occasions, having switched to a lacto-ovo-vegetarian diet with a predominant preference towards carbohydrates — and lots of it. Anyone who moves from a regular diet to a vegetarian has to be careful on the caloric intake because of the trap of thinking your body will not get enough energy to support your normal activities.
Since this was an underlining concern for me, I ended up choosing very high caloric foods. Knowing what my culprit was, the decision to try the keto diet was a no brainer. At the same time, I also started a very light exercise plan, because I hate any form of physical exercise to even consider going to the gym. The plan involved approximately 10 to 15 minutes a day (usually in the morning) of medium intensity exercises that I could perform in the comfort of my home.
It worked like a charmer. The keto diet (which I would only dare to try again in case of emergency — aka nothing else works) was perfect. I lost around 2 kilos during the first week, which was a strong incentive for me to continue with it, even though, preparing the meal or eating out was very often a challenge for a vegetarian on keto. In parallel, as I was losing weight, my morning routine for exercises became more bearable to the point I started enjoying it and became even disciplined about it. That was it. First time in my life, I was owning it. Week after week, I became leaner and fitter so I decided, in that spring, to go for my first years' jog. It was nothing spectacular at the beginning, but my new discipline habit caught up and helped me come back to it whenever I felt that I will never be a runner. | https://medium.com/in-fitness-and-in-health/becoming-a-mother-a-heartfelt-testimonial-c114f8dfec71 | ['Eir Thunderbird'] | 2020-12-20 20:53:24.186000+00:00 | ['Fitness', 'Mental Health', 'Motivation', 'Feminism', 'Parenting'] | Title Mental Health v Strong BodyContent Since thing could control entirely one get fit entire 20 slightly chubby loved still kidding eating anything made dough pastry bread pasta name screamed carbohydrate goto meal always excuse saying ever wanted lose weight problem — let’s call real name delusion — it’s age since kept selling made even comfortable kilo adding scale passing year January 2018 decided make change actively thinking motherhood mental seed always waiting decided try lose weight hadn’t target sure hell didn’t want reach new weight milestone scale would become comfortable 150 cm height weighing 59 kilo Since 2017 June stopped eating meat fish would occasion switched lactoovovegetarian diet predominant preference towards carbohydrate — lot Anyone move regular diet vegetarian careful caloric intake trap thinking body get enough energy support normal activity Since underlining concern ended choosing high caloric food Knowing culprit decision try keto diet brainer time also started light exercise plan hate form physical exercise even consider going gym plan involved approximately 10 15 minute day usually morning medium intensity exercise could perform comfort home worked like charmer keto diet would dare try case emergency — aka nothing else work perfect lost around 2 kilo first week strong incentive continue even though preparing meal eating often challenge vegetarian keto parallel losing weight morning routine exercise became bearable point started enjoying became even disciplined First time life owning Week week became leaner fitter decided spring go first year jog nothing spectacular beginning new discipline habit caught helped come back whenever felt never runnerTags Fitness Mental Health Motivation Feminism Parenting |
2,052 | An interview with Awkward co-founder Kevin Kalle | You first studied at the Willem de Kooning Academy in Rotterdam, then transferred to Maryland Institute College of Art. What drove you to move to the states and study at MICA?
I felt limited at the Willem de Kooning Academy. It made me look for a challenge and I knew the bar at MICA was really high. I signed up, got accepted, and packed my bags. These days education is behind on the industry, so I think the problem I faced then is something we still face today.
Why do you think education is behind on the industry?
The curriculum does not match the practical experience. It’s hard for schools to innovate, and at the same time, our industry is changing rapidly. Looking back, I’m glad I made the decision to transfer instead of dropping out. It’s not just design skills you learn at school but you also develop social skills, and you get a chance to work in teams and learn to understand other perspectives. These are the things that I think should be emphasized in the work we do; it’s not just about your skill set.
You finished MICA in 2006 and started Awkward in 2011. What did you do during the five-year gap?
Right after school, I co-founded a start-up with 3 others in San Francisco where I learned a lot of things besides design itself. I did that for a while but noticed I still wanted to learn a variety of things instead of just one thing. That’s when I started freelancing for multiple startups and agencies with a strong focus on user interface and icon design. During this time I met Pieter Omvlee from Sketch. Back then he was working on Fontcase and Drawit. | https://medium.com/madeawkward/an-interview-with-awkward-co-founder-kevin-kalle-5874c0439a01 | [] | 2018-09-17 08:06:30.909000+00:00 | ['Design', 'Agency', 'Founder Stories', 'Interview', 'Startup'] | Title interview Awkward cofounder Kevin KalleContent first studied Willem de Kooning Academy Rotterdam transferred Maryland Institute College Art drove move state study MICA felt limited Willem de Kooning Academy made look challenge knew bar MICA really high signed got accepted packed bag day education behind industry think problem faced something still face today think education behind industry curriculum match practical experience It’s hard school innovate time industry changing rapidly Looking back I’m glad made decision transfer instead dropping It’s design skill learn school also develop social skill get chance work team learn understand perspective thing think emphasized work it’s skill set finished MICA 2006 started Awkward 2011 fiveyear gap Right school cofounded startup 3 others San Francisco learned lot thing besides design noticed still wanted learn variety thing instead one thing That’s started freelancing multiple startup agency strong focus user interface icon design time met Pieter Omvlee Sketch Back working Fontcase DrawitTags Design Agency Founder Stories Interview Startup |
2,053 | Did the Protagonist Need a Backstory in Tenet? | Christopher Nolan’s more recent films have, in one way or another, been polarizing, to say the least. Whether it was the narratively messy The Dark Knight Rises, or the heavy-handed dialogue found in Interstellar, or the lack of character development in Dunkirk, there is no shortage of criticism that can be found being levied against Nolan films.
And yet, for how prevalent this is for Nolan’s work, the criticism and critiques never seem to stick, at least not in the same way that it has for the likes of M. Night Shyamalan; which effectively sunk his career and reputation in a big way.
The why behind Nolan’s success is truly fascinating. For while we can criticize his storytelling style all day long, we always find ourselves coming back for more.
Which brings us to Nolan’s latest polarizing project, Tenet.
Tenet is a fascinating character study — not of the protagonist, the, uh, Protagonist — but of Nolan himself. Tenet, probably more so than any other of Nolan’s recent projects gives us a glimpse into how he approaches his minimalistic storytelling process.
The protagonist, the Protagonist
Ironically, the most fascinating criticism about Tenet isn’t the preposterously crazy take on time travel, but about how the film presents its lead character, the protagonist who is purposefully known literally as the Protagonist.
Many have taken humorous jabs at Nolan for this seemingly on-the-nose creative self-indulgence. After all, on the surface, naming your protagonist the Protagonist seems like the sort of thing a film student would do in an attempt to be artistically edgy and unique, but is instead groan-inducing.
And while I’m not saying that Nolan couldn’t have nor shouldn’t have come up with a more appropriate naming convention, it makes me wonder, how much focus did Nolan plan on putting into Tenet’s main character in the first place?
After all, the Protagonist feels like a shell of a character. He seemingly doesn’t have a fleshed-out backstory, and his motivations are unclear at best. While defenders of Tenet have tried to explain away the Protagonist’s coldness and aloofness, you can’t deny that those elements definitely exist within the character.
Which I guess is kind of the point. At the end of the day, all you really need to know about the Protagonist is that he’s cold, efficient, and incredibly competent at his job. Only in very subtle instances do we see cracks in his exterior that hint at an underlying softness in his stoic shell.
So while the Protagonist doesn’t have genuine character development, he does have character.
Seeing a character react to their situation is character, whereas challenging the belief systems of a character is character development.
With the Protagonist, we see him react to plenty of unusual circumstances, but we never get a firm grasp of why he has chosen to face these challenges in the first place or how it makes him feel.
After all, you can’t have character development if the character doesn’t grow or shift their mindset in a meaningful way. And the problem with the Protagonist is that we have no idea what he believes.
But by naming the protagonist the Protagonist, Nolan effectively stripped the character down to its naked core.
In a way, Nolan naming the main character the Protagonist is simply his way of saying, ‘This story isn’t about the character. It’s about the story. Oh, and by the way, he’s the good guy, and he knows he’s the good guy.’
In the tech development industry, many teams have adopted the Lean methodology. Simply put, Lean is meant to help product teams focus on small, doable tasks while cutting out the fat of digital products. In this way, you focus on expanding the elements of your product that are essential.
In much the same way, Tenet seems like part of an extended experiment on Nolan’s part in a quest to find the most efficient way to tell overly complicated stories.
No matter whether you think the use of the title protagonist is fitting or inherently silly, you have to admire Nolan for creating such a complicated story in such a lean, efficient way.
Do Characters Need Backstory?
But all this got me thinking. Is the Protagonist a cold and aloof character simply because he has no backstory, or is there more to the story itself?
After all, the Protagonist is far from the first action hero who has no backstory. The first example that came to mind for me is one of my favorite heroes, Ethan Hunt, in the Mission Impossible franchise.
For as iconic a character as he is, what do we really know about Ethan, exactly? The first film alludes to his upbringing in a small rural town and mentions his mother and Uncle Donald, but besides that, we know nothing about Ethan’s past. Was he in the military or CIA before joining the IMF? Does he have siblings? What did he have to overcome personally and professionally to get to be an IMF agent?
The fact is, we simply don’t know.
Funnily enough, Tom Cruise’s spy character in the criminally under-appreciated Knight and Day has more backstory than Ethan Hunt does in the Mission Impossible films.
And what about characters like Jason Bourne, whose past is deliberately held back from the audience? (Except for Matt Damon’s last entry into the franchise, but we don’t talk about that.)
How is it that a character can be successful like Jason Bourne when the only things we know about him are the same things that the character knows about himself?
The answer is that in these cases, their past simply doesn’t matter. What matters is how the characters react and respond to obstacles in the moment.
In The Bourne Identity, we see Bourne struggle with his amnesia, even going so far as to lash out verbally due to his frustration. We also are able to get into his mind to see how he solves problems, such as when he’s escaping from the embassy.
With Ethan Hunt, we get emotionally invested with him as he deals with the turmoil of seeing his team murdered right in front of him in the original Mission Impossible film. We get to see the aftermath as he struggles with figuring out what to do next, while also dealing with emotional fatigue.
These are only a couple of examples that would seem to suggest that characters don’t need backstories for us as the audience to identify and empathize with them. Which raises the question, are backstories even necessary at all?
Depends On the Story
It’s been posited by some online commentators that backstories are unnecessary. I’ve heard arguments made that you can watch The Dark Knight without having seen Batman Begins and still be able to understand and become engaged in Bruce Wayne’s story.
While this is true, it’s a fact that even though they’re in the same trilogy, The Dark Knight has a totally different story to tell than Batman Begins.
You can’t simply take the storytelling style of The Dark Knight and make Batman Begins. It just wouldn’t work, and vice versa.
Including a backstory or not is completely predicated on the type of story you want to tell. Are you telling a tight, lean spy story that’s mostly focused on espionage and mind-games, or are you diving into a character study where the character’s depth is important to the story and the progression of the plot?
While The Dark Knight is, at its core, a crime thriller, Batman Begins is a character study about Bruce Wayne’s childhood trauma. Both are great stories in their own right, but they’re not equal because they’re not the same.
So no, backstories are not a tool to simply be thrown away. At the same time, not every story ever written needs one either.
Ultimately, it just depends.
Every type of story has pros and cons. With a character study like Batman Begins, you gain the ability for the audience to empathize and become emotionally invested in the hero’s journey, whereas with a crime thriller like The Dark Knight, you can place all your focus on the character’s actions and reactions.
What about Tenet?
I went to see Tenet in theaters with a couple of my brothers, and afterward, while discussing the film, one of my brothers pointed out that in Tenet, it wasn’t the Protagonist’s lack of backstory that was the problem with his character, but that we didn’t get to see him respond in a human way to the obstacles he encounters.
With every new obstacle or piece of information he learns, he accepts everything in stride without ever reacting in a relatable way for the audience to empathize with.
For all intents and purposes, the Protagonist is effectively emotionless. John David Washington does what he can with the character, and I quite liked him in the role, but his character almost felt more robotic than human, like an AI always trying to figure things out while not having any underlying emotions to connect with.
Yes, we get to see him making difficult decisions, but we don’t really get to see the effect that those decisions have on him as a person.
On top of that, the Protagonist only asks direct questions and doesn’t ask for elaboration. For someone who is experiencing a scientific anomaly, he seems numb for most of the runtime since nothing that happens in the course of the story seems to pique his curiosity in the slightest. In a way, the Protagonist feels more like he’s caught up in the current of the story and is just along for the ride as opposed to being an active participant in the plot.
Which, once again, might be kind of the point, but I won’t go into spoilers here.
Ultimately, with Christopher Nolan’s screenplay, the themes, concepts, and storytelling beats took precedence over characterization.
Which, in a way, is logical and totally warranted. Tenet has so many complicated twists and turns that it’s hard to just keep up with what’s happening in the story. If Nolan had inserted deep characterization into the plot, it potentially could have just become too bloated to be engaging.
In essence, Nolan sacrificed characterization for the sake of the plot. Was that the right decision to make?
Well, not only does a story depend on the type of story that the storyteller intends to tell, but it also depends on what the audience expects of certain stories.
After Christopher Nolan’s Dunkirk, I was expecting Tenet to be more of a visual and audio spectacle than a deep character study. In that sense, Tenet totally paid off for me, because while I didn’t become emotionally attached to the characters, I was fully engaged with the story.
So while I sympathize with people who saw Tenet and were disappointed at the lack of characterization — and I’ll readily admit that they’re definitely not wrong for thinking so — I’m not convinced that Christopher Nolan made the wrong decision to forego characterization for the sake of the story.
While I think a more nuanced director like Doug Liman could have turned Tenet’s protagonist into a more relatable character — which probably would have translated into a better movie overall — I’m also simultaneously amazed at the sheer scope and visceral energy of Tenet’s story and filmmaking.
Tenet is one of those movies that keeps you thinking about it for days afterward.
Conclusion
While Tenet told its story in a lean and satisfactory way, it was missing a human element to ground the story on an emotional level.
What this boils down to is that Tenet is one of Christopher Nolan’s lesser movies, but also one of his most fascinating. Tenet tells a story that doesn’t resonate with me emotionally, but the plot keeps the analytical side of my mind constantly engaged.
Much like Ad Astra, which was so cold and emotionless as to render the audience numb, Tenet ultimately was a lesser film because it only hooked me intellectually, not emotionally.
In general, the best films are able to do both, but that doesn’t mean that Tenet was a mistake.
In short, Nolan knew the story he wanted to tell, and he did it in the most efficient way possible.
If you enjoy movies and liked this story, give me some claps and follow me for more stories like this! | https://medium.com/oddbs/did-the-protagonist-need-a-backstory-in-tenet-bc7a80974fd0 | ['Brett Seegmiller'] | 2020-10-06 18:58:30.181000+00:00 | ['Storytelling', 'Cinema', 'Film', 'Writing', 'Movies']
2,054 | April Fools’ 2019: Perception-driven data visualization | April Fools’ 2019: Perception-driven data visualization
Exploring OKCupid data with the most powerful psychological technique for accelerating analytics
This article was a prank for April Fools’ Day 2019. Now that the festivities are over, scroll to the end of the article for the Real Lessons section for a minute of genuine learning.
Evolution endowed humans with a few extraordinary abilities, from walking upright to operating heavy machinery to hyperefficient online mate selection.
Humans have evolved the ability to process faces quickly, and you can use perception-driven techniques to accelerate your analytics.
One of the most impressive is our ability to perceive tiny changes in facial structure and expression, so data scientists have started exploiting our innate superpowers for faster and more powerful data analytics.
Evolution-driven data analysis
Get ready to be blown away by an incredible new analytics technique!
Chernoff Faces are remarkable for the elegance and clarity with which they convey information by taking advantage of what humans are best at: facial recognition.
The core idea behind Chernoff faces is that every facial feature will map to an attribute of the data. Bigger ears will mean something, as will smiling, eye size, nose shape, and the rest. I hope you’re excited to see it in action!
Let’s walk through a real-life mate selection example with OKCupid data.
Data processing
I started by downloading a dataset of nearly 60K leaked OKCupid profiles, available here for you to follow along. Real-world data are usually messy and require quite a lot of preprocessing before they’re useful to your data science objectives, and that’s certainly true of these. For example, they come with reams of earnest and 100% reliable self-intro essays, so I did a bit of quick filtering to boil my dataset down to something relevant to me. I used R and the function I found most useful was grepl().
First, since I live in NYC, I filtered out all but the 17 profiles based near me. Next, I cleaned the data to show me the characteristics I’m most fussy about. For example, I’m an Aquarius and getting along astrologically is obviously important, as is a love of cats and a willingness to have soulful conversations in C++.
After the first preprocessing steps, here’s what my dataset looks like:
The next step is to convert the strings into numbers so that the Chernoff face code will run properly. This is what I’ll be submitting into the faces() function from R’s aplpack package:
Next step, the magic!
Faces revealed
Now that our dataset is ready, let’s run our Chernoff faces visualization! Taa-daa!
Below is a handy guide on how to read it. Isn’t it amazingly elegant and so quick to see exactly what is going on? For example, the largest faces are the tallest and oldest people, while the smilers can sing me sweet C++ sonnets. It’s so easy to see all that in a heartbeat. The human brain is incredible!
Data privacy issues
Unfortunately, by cognitively machine deep learning all these faces, we are violating the privacy of OKCupid users. If you look carefully and remember the visualizations, you might be able to pick them out of a crowd. Watch out for that! Make sure you re-anonymize your results by rerunning the code on an unrelated dataset before presenting these powerful images to your boss.
Dates and dating
Chernoff faces?! You really should check publication dates, especially when they’re at the very beginning of April. I hope you started getting suspicious when this diehard statistician mentioned astrology and were sure by the time I got to the drivel about de-anonymization.
Much love from me and whichever prankster forwarded this to you. ❤
Real lessons
I’ve always been amused by Chernoff faces (and eager for an excuse to share some of my favorite analytics trivia with you), though I’ve never actually seen them making themselves useful in the wild. Even though the article was intended for a laugh, there are a few real lessons to take away:
Expect to spend time cleaning data. While the final visualization took only a couple of keystrokes to achieve, the bulk of my effort was preparing the dataset to use, and you should expect this in your own data science adventures too.
Data visualization is more than just histograms. There’s a lot of room for creativity when it comes to how you can present your data, though not everything will be implemented in a package that’s easy for beginners to use. While you can get Chernoff faces through R with just the single function faces(data), the sky is the limit if you’re feeling creative and willing to put the graphics effort in. You might need something like C++ if you’re after the deepest self-expression.
What’s relevant to me might not be relevant to you. I might care about cat-love, you might care about something else. An analysis is only useful for its intended purpose, so be careful if you’re inheriting a dataset or report made by someone else. It might be useless to you, or worse, misleading.
There’s no right way to present data, but one way to think about viz quality is speed-to-understanding. The faces just weren’t efficient at getting the information into your brain — you probably had to go and consult the table to figure out what you’re looking at. That’s something you want to avoid when you’re doing analytics for realsies.
Chernoff faces sounded brilliant when they were invented, the same way that “cognitive” this-and-that sounds brilliant today. Not everything that tickles the poet in you is a good idea… and stay extra vigilant for leaps of logic when the argument appeals to evolution and the human brain. Don’t forget to test mathemagical things before you deploy them in your business.
If you want to have a go at creating these faces yourself, here’s a tutorial. If you prefer to read one of my straight-faced articles about data visualization instead, try this one. | https://towardsdatascience.com/perception-driven-data-visualization-e1d0f13908d5 | ['Cassie Kozyrkov'] | 2019-04-02 13:29:17.253000+00:00 | ['Analytics', 'Data Science', 'Technology', 'Visualization', 'Artificial Intelligence']
2,055 | Training multiple machine learning models and running data tasks in parallel via YARN + Spark + multithreading | Training multiple machine learning models and running data tasks in parallel via YARN + Spark + multithreading
Harness large scale computational resources to allow a single data scientist to perform dozens or hundreds of Big data tasks in parallel, stretching the limits of data science scaling and automation
image: Freepik.com
Summary
The objective of this article is to show how a single data scientist can launch dozens or hundreds of data science-related tasks simultaneously (including machine learning model training) without using complex deployment frameworks. In fact, the tasks can be launched from a “data scientist”-friendly interface, namely, a single Python script which can be run from an interactive shell such as Jupyter, Spyder or Cloudera Workbench. The tasks can themselves be parallelised in order to handle large amounts of data, such that we effectively add a second layer of parallelism.
Who is this article intended for?
Data scientists who wish to do more work with less time, by making use of large scale computational resources (e.g. clusters or public clouds), possibly shared with other users via YARN. To understand this article you need a good knowledge of Python, a working knowledge of Spark, and at least a basic understanding of Hadoop YARN architecture and shell scripting;
Machine learning engineers who are supporting data scientists in making use of available computational capacity and operating on large-scale data
Introduction
Data science and automation
“Data science” and “automation” are two words that invariably go hand-in-hand with each other, as one of the key goals of machine learning is to allow machines to perform tasks more quickly, with lower cost, and/or better quality than humans.
Naturally, it wouldn’t make sense for an organization to spend more on tech staff that are supposed to develop and maintain systems that automate work (data scientists, data engineers, DevOps engineers, software engineers and others) than on the staff that do the work manually. It’s thus not surprising that a recurrent discussion is how much we can automate the work of data science teams themselves, for instance via automated machine learning.
To achieve cost-effective data science automation, it is imperative to be able to harness computational power from public or private clouds; after all, the cost of hardware is quite low compared to the cost of highly skilled technical staff. While the technology to do so is certainly available, many organisations ended up facing the “big data software engineer vs data scientist conundrum”, or more precisely, the drastic discrepancy between
“Big data software engineer skills”, i.e. skills necessary to manipulate massive amounts of data in complex computational environments, and run these processes in a reliable manner along with other concurrent processes
“Data scientist skills”, i.e. skills necessary to apply algorithms and mathematics to the data to extract insights valuable from a business standpoint
Harnessing computational power is key to automating data science work
image: Freepik.com
Some organisations would make “data scientists” responsible for developing the analytics models in some sort of “controlled analytics environment” where one does not need to think too much about the underlying computational resources or sharing the resources with other processes, and “big data software engineers” responsible for coding “production-ready” versions of the models developed by data scientists and deploy them into production. This setup resulted in obvious inefficiencies, such as:
Data scientists developing sub-optimal models due to not making use of large scale data and computational resources. In some organisations, data scientists even ended up working with single-node frameworks such as Pandas/Scikit-Learn and basing their models entirely on small datasets obtained via sampling or over-engineered features;
Developed models performing well in the analytics environment but not performing well, or being completely unable to run, in the production environment;
The difficulty of evaluating the generation of business value, identifying and fixing problems, as well as making iterative improvements, as data scientists end up dramatically losing oversight of the analytics process once models are sent into production.
Different organisations dealt with this situation in different ways, either by forcing big data software engineers and data scientists to learn the skills of the “other role”, or by creating a “third role”, named “Machine Learning Engineer”, to bridge the gap between the two roles.
But the fact is that nowadays, there are far more resources allowing data scientists without exceptional software engineering skills to work in “realistic” environments, i.e. similar to production, in terms of computational complexity. Machine learning libraries such as Spark MLLib, Kubeflow, Tensorflow-GPU, and MMLSpark allow data preparation and model training to be distributed across multiple CPUs, GPUs, or a combination of both; at the same time, frameworks such as Apache Hadoop YARN and Kubernetes allow data scientists to work simultaneously using the same computational resources, by understanding only basic concepts about the underlying server infrastructure, such as the number of available CPUs/GPUs and available memory.
The intent of this article is to provide an example of how these libraries and frameworks, as well as massive (but shared) computational resources, can be leveraged together in order to automate the creation and testing of data science models.
From individually massively parallelised tasks to massively running tasks in parallel
Frameworks like Spark and Kubeflow make easy to distribute a Big Data task, such as feature processing or machine learning model training, across GPUs and/or hundreds of CPUs without a detailed understanding of the server architecture. On the other hand, executing tasks in parallel, rather than individual parallelised tasks, is not as seamless. Of course, it’s not hard for a data scientist to work with two or three PySpark sessions in Jupyter at the same time, but for the sake of automation, we might be rather interested in running dozens and hundreds of tasks simultaneously, all specified in a programmatic way with minimal human interference.
Naturally, one may ask why bother with running tasks in parallel, instead of simply increasing the number of cores per task and making each task run in a shorter time. There are two reasons:
The processing speed often does not scale with the number of cores. For example, in the case of training machine learning models, if the data is not large enough, there might be zero improvement in computation time by increasing the number of cores from, say, 10 to 100, and sometimes the computational time might even increase due to process and communication overhead, as well as the inability to leverage highly efficient single-processor implementations available in some machine learning libraries;
The accuracy of machine learning models may also decrease due to parallelisation, as those algorithms often rely on suboptimal heuristics to be able to run in distributed fashion, such as data splits and voting.
It is certainly possible, using deployment tools such as Airflow, to run arbitrarily complex, dynamically defined and highly automated data analytics pipelines involving parallelised tasks. However, these tools require low-level scripting and configuration and aren’t suited for the quick “trial and error” experiments carried out by data scientists on a daily basis, often accustomed to trying and re-trying ideas quickly in interactive shells such as Jupyter or Spyder. Also, taking us back to the previously mentioned “big data software engineer vs data scientist” conundrum, organisations might prefer data scientists to spend their time focusing on experimenting with the data and generating business value, not on getting immersed in low-level implementation or deployment.
What will you learn in this article?
In this article, I will show how we can make use of Apache Hadoop YARN to launch and monitor multiple jobs in a Hadoop cluster simultaneously (including individually parallelised Spark jobs), directly from any Python code (including code from interactive Python shells such as Jupyter), via Python multithreading. While the example will consist of training multiple machine learning models in parallel, I will provide a generic framework that can be used to launch arbitrary data tasks such as feature engineering and model metric computation.
Some applications for multiple model parallel training are:
Hyper-parameter tuning: For the same training data set, simultaneously train using different model types (say Logistic Regression, Gradient Boosting and Multi-layer Perceptron) and also different hyperparameter configurations, in order to find the optimal model type/hyperparameter set as quickly as possible (see the sketch after this list);
Multi-label classification: Train multiple binary/multi-class classification models in parallel, where each model training task will use a different column as the label column, such that the resulting combination of models will effectively be a multi-label classifier;
Feature reduction: For a pool of previously ranked features, train multiple models, each using only the top N-ranked features as feature columns, with N being varied across the training tasks.
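As a concrete illustration of the first application, here is a minimal sketch of how one might enumerate one spark-submit command line per hyperparameter value, keyed by a unique application name. The script name, paths and the extra learning-rate argument are illustrative assumptions rather than part of the original setup, and the command line is simplified (it omits the configuration options shown later), but the resulting dictionary has exactly the shape consumed by the parallelisation functions presented at the end of this article:

# Hypothetical sketch: one spark-submit command line per hyperparameter
# value; the training script and its extra argument are assumptions
dict_spark_submit_cmds = {}
for lr in [0.01, 0.05, 0.1]:
    # unique application name per task, so YARN can track each job
    app_name = "lgbm_training_lr_%s" % str(lr).replace(".", "_")
    dict_spark_submit_cmds[app_name] = (
        "spark-submit --name %s lightgbm_training.py "
        "hdfs://user/edsonaoki/datasets/input_data.parquet "
        "hdfs://user/edsonaoki/models %s" % (app_name, lr))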
Technical overview
In our framework, I will call the main task, i.e. the Python code that creates the additional tasks to run in parallel, as the controller task, and the tasks being started by the controller task as the subordinate tasks. (I intentionally avoid using the expression “worker” to avoid confusion, as in Spark, “worker” is a synonym for Spark executor)
The controller task is responsible for:
Defining how many subordinate tasks should be run at the same time and what to do in case one of the tasks fail;
Creating the subordinate tasks, passing the inputs to each task and getting their outputs, if any;
Generating the inputs and processing the outputs of the subordinate tasks.
An interesting aspect of YARN is that it allows Spark to be used both in the controller and subordinate tasks. Although neither is necessary, this allows us to handle arbitrarily large datasets without needing to worry ourselves with data engineering, as long as we have enough computational resources. Namely, the controller task can run Spark in client mode, and the subordinate tasks can run Spark in cluster mode:
In client mode, the Spark driver runs in the environment where the controller’s Python code is being run (which we refer to as the client environment), allowing the use of locally installed interactive shells such as Jupyter, whereas the Spark executors run in the YARN-managed Hadoop cluster, with the interactions between the driver and executors made via a third type of process named Application Master, also running in the Hadoop cluster;
In cluster mode, both the driver and the executors run in the YARN-managed Hadoop cluster. Note that nothing prevents us from having the controller task also run in cluster mode, but interactive shells cannot be used in this way.
The framework is illustrated in the figure below:
Illustration of the parallelisation framework
There are two things to note about the example above:
Although in the example the controller task is also the driver of the Spark process (and thus associated with executors in the Hadoop cluster via the YARN Application Master), this is not necessary, although useful for example if we want to do some preprocessing on the data before deploying to the subordinate tasks;
Although the subordinate tasks do not need to use Spark parallelisation, we will use the spark-submit command to launch them, such that they will always have a Spark driver, although not necessarily Spark executors. This is the case of process 3 above.
Technical implementation
Executing a subordinate task as a Spark job
Before I delve into parallelisation, I will first explain how to execute a subordinate task from a controller task written in Python. As mentioned before, we will do so using the spark-submit shell script contained in the Apache Spark installation, such that the subordinate task will technically be a Spark job, although it does not necessarily have executors or Spark code.
In principle, we can use spark-submit from Python by simply calling the os.system function, which allows us to execute a shell command from Python. In practice, we need to be able to debug and monitor the task; for that purpose, it is better to use the excellent subprocess library. An example:
import json
import subprocess

spark_config_cluster_path = "/home/edsonaoki/spark_config_cluster"

app_name = "some_model_training"
spark_config = {
    "spark.jars.packages":
        "com.microsoft.ml.spark:mmlspark_2.11:0.18.1",
    "spark.dynamicAllocation.enabled": "false",
    "spark.executor.instances": "10",
    "spark.yarn.dist.files": "/home/edsonaoki/custom_packages.tar"
}
command = "lightgbm_training.py "\
    "hdfs://user/edsonaoki/datasets/input_data.parquet "\
    "hdfs://user/edsonaoki/models"

spark_submit_cmd = "SPARK_CONF_DIR=%s spark-submit --name %s %s %s" \
    % (spark_config_cluster_path, app_name,
       " ".join(['--conf %s="%s"' % (key, value) for key, value in
                 spark_config.items()]),
       command)

cmd_output = subprocess.Popen(spark_submit_cmd, shell=True,
    stdout=subprocess.PIPE, stderr=subprocess.STDOUT,
    bufsize=1, universal_newlines=True)

for line in cmd_output.stdout:
    print(line)
cmd_output.communicate()
At the beginning of the code I set the path containing the cluster mode base Spark configuration, which is later used to change the SPARK_CONF_DIR environment variable. This is actually a crucial step if the controller task is configured to run Spark in client mode, since the Spark configuration for cluster mode is typically different from that for client mode.
If you don’t know much about how to configure Spark in cluster mode, you can start by making a copy of the existing SPARK_CONF_DIR. Inside the spark-defaults.conf file, we need to have
spark.submit.deployMode=cluster
instead of
spark.submit.deployMode=client
and certain configuration options, such as spark.yarn.rmProxy.enabled and the spark.driver.options.* options, need to be disabled, as there is no network-specific configuration for the driver when running Spark in cluster mode. Check the Spark on YARN documentation if you are in doubt. Of course, if the controller task is also running Spark in cluster mode, there is no need to have a separate configuration.
Now, looking at the subsequent steps:
app_name = "some_model_training"
spark_config = {
    "spark.jars.packages":
        "com.microsoft.ml.spark:mmlspark_2.11:0.18.1",
    "spark.dynamicAllocation.enabled": "false",
    "spark.executor.instances": "10",
    "spark.yarn.dist.files": "/home/edsonaoki/custom_packages.tar"
}
command = "lightgbm_training.py "\
    "hdfs://user/edsonaoki/datasets/input_data.parquet "\
    "hdfs://user/edsonaoki/models"

spark_submit_cmd = "SPARK_CONF_DIR=%s spark-submit --name %s %s %s" \
    % (spark_config_cluster_path, app_name,
       " ".join(['--conf %s="%s"' % (key, value) for key, value in
                 spark_config.items()]),
       command)
Here I set up the application name, additional Spark configuration options and the command to be executed by the spark-submit script. These are straightforward to understand, but the application name is particularly important in our case — we will later understand why. We also submit a custom Python package via the spark.yarn.dist.files configuration parameter, which, as I will show later, is especially handy since the subordinate task runs in the Hadoop cluster and hence has no access to the Python functions available in the local (client) environment.
Note also that I specify two HDFS paths as arguments to the lightgbm_training.py Python script (the subordinate task’s code), for a similar reason to above: since the Python script will run in the Hadoop cluster, it will not have access to any files in the client environment’s file system, and hence any files to be exchanged between controller or subordinate task must be either explicitly submitted via spark.yarn.dist.files or put into a shared file system such as HDFS or AWS S3.
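The original article does not show the subordinate script itself, so the following is a minimal hypothetical sketch of what lightgbm_training.py might look like, using a plain Spark ML gradient-boosted tree classifier as a stand-in for the MMLSpark LightGBM trainer and assuming the input dataset already contains "features" and "label" columns:

# Hypothetical sketch of lightgbm_training.py; a plain Spark ML GBT
# classifier stands in for the MMLSpark LightGBM trainer
import sys
from pyspark.sql import SparkSession
from pyspark.ml.classification import GBTClassifier

if __name__ == "__main__":
    # paths passed by the controller task as script arguments
    input_path, output_dir = sys.argv[1], sys.argv[2]
    spark = SparkSession.builder.getOrCreate()
    df = spark.read.parquet(input_path)
    # assumes the dataset already has "features" and "label" columns
    model = GBTClassifier(featuresCol="features",
        labelCol="label").fit(df)
    # write the fitted model back to the shared file system (HDFS),
    # where the controller task can retrieve it after the job finishes
    model.write().overwrite().save("%s/gbt_model" % output_dir)
    spark.stop()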
After preparing the spark-submit shell command line, we are ready to execute it using the subprocess.Popen command:
cmd_output = subprocess.Popen(spark_submit_cmd, shell=True,
stdout=subprocess.PIPE, stderr=subprocess.STDOUT,
bufsize=1, universal_newlines=True)
We set shell=True to make Python initiate a separate shell process to execute the command, rather than attempting to initiate spark-submit directly from the Python process. Although setting shell=False is generally preferable when using the subprocess library, doing so restricts the command line format and it’s not feasible in our case.
The stdout , stderr , bufsize and universal_newlines arguments are used to handle the output (STDOUT) and error messages (STDERR) issued by the shell command during execution time. When we are executing multiple subordinate tasks in parallel, we will probably want to ignore all execution time messages as they will be highly cluttered and impossible to interpret anyways. This is also useful to save memory for reasons we will explain later. However, before attempting to run multiple tasks in parallel, it is certainly best to first make sure that each individual task will work properly, by running a single subordinate task with output/error messages enabled.
In the example I set stdout=subprocess.PIPE, stderr=subprocess.STDOUT, bufsize=1 and universal_newlines=True, which will basically direct all shell command output to a First In First Out (FIFO) queue named subprocess.PIPE.
Note that when running a Spark job in cluster mode, subprocess.PIPE will only have access to messages from the YARN Application Master, not the driver or executors. To check the driver and executor messages, you might look at the Hadoop cluster UI via your browser, or retrieve the driver and executor logs post-execution as I will show later. Additionally, if file logging is enabled in the log4j.properties file (located in the Spark configuration), the messages from the Application Master will be logged into a file rather than directed to subprocess.PIPE , so disable file logging if needed.
Finally, to display the output/error messages in the Python script’s output, I continue the code above as follows:
for line in cmd_output.stdout:
    print(line)
cmd_output.communicate()
The purpose of cmd_output.communicate() is to wait for the process to finish after subprocess.PIPE is empty, i.e. no more outputs from the subordinate task are written to it. It is highly advisable to read the entire queue before calling the cmd_output.communicate() method as done above, to prevent the queue from increasing in size and wasting memory.
Monitoring the subordinate task without using debug messages
As I mentioned earlier, when we run tasks in parallel we do not want debug messages to be displayed; moreover, if a large number of tasks are sending messages to an in-memory FIFO queue at the same time, memory usage will increase if messages aren’t being read from the queue as fast as they are generated. A version of the code from the previous section without debugging, starting with the call to spark-submit, is as follows:
import time

# launch the subordinate task, discarding its output and error messages
cmd_output = subprocess.Popen(spark_submit_cmd, shell=True,
    stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL)

def getYARNApplicationID(app_name):
    state = 'RUNNING,ACCEPTED,FINISHED,KILLED,FAILED'
    out = subprocess.check_output(["yarn", "application", "-list",
        "-appStates", state], stderr=subprocess.DEVNULL,
        universal_newlines=True)
    lines = [x for x in out.split("\n")]
    application_id = ''
    for line in lines:
        if app_name in line:
            application_id = line.split('\t')[0]
            break
    return application_id

# poll YARN until the application shows up or the timeout expires
max_wait_time_job_start_s = 120
yarn_application_id = ''
start_time = time.time()
while yarn_application_id == '' and time.time()-start_time \
        < max_wait_time_job_start_s:
    yarn_application_id = getYARNApplicationID(app_name)

cmd_output.wait()

if yarn_application_id == '':
    raise RuntimeError("Couldn't get yarn application ID for application %s"
        % app_name)
The code starts by launching the subordinate task as before, but with debugging disabled:
cmd_output = subprocess.Popen(spark_submit_cmd, shell=True,
stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL)
Since there are no debug messages to be displayed when the process is running, we use cmd_output.wait() instead of cmd_output.communicate() to wait for the task to finish. Note that although we won’t see the Application Master’s messages, we can still debug the Spark job’s driver and executors at runtime via the Hadoop cluster UI.
However, we still need to be able to monitor the task from a programmatic point of view; more specifically, the controller task needs to know when the subordinate task has finished, whether it was successful, and take appropriate action in case of failure. For that purpose, we can use the application name that we set in the beginning:
app_name = "some_model_training"
The application name can be used by YARN to retrieve the YARN application ID, which allows us to retrieve the status and other information about the subordinate task. Again, we can resort to the subprocess library to define a function that can retrieve the application ID from the application name:
def getYARNApplicationID(app_name):
    state = 'RUNNING,ACCEPTED,FINISHED,KILLED,FAILED'
    out = subprocess.check_output(["yarn", "application", "-list",
        "-appStates", state], stderr=subprocess.DEVNULL,
        universal_newlines=True)
    lines = [x for x in out.split("\n")]
    application_id = ''
    for line in lines:
        if app_name in line:
            application_id = line.split('\t')[0]
            break
    return application_id
Observe that getYARNApplicationID parses the output of the yarn application -list shell command. Depending on your Hadoop version the output format may be slightly different and the parsing needs to be adjusted accordingly. If in doubt, you can test the format by running the following command in the terminal:
$ yarn application -list -appStates RUNNING,ACCEPTED,FINISHED,KILLED,FAILED
The tricky aspect is that this method can only work if the application name is unique in the Hadoop cluster. Therefore, you need to make sure you are creating a unique application name, for instance by including timestamps, random strings, your user ID, etc. Optionally, you can also add other filters when attempting to parse the output of yarn application -list , for example, the user ID, the YARN queue name or the time of the day.
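For instance, a minimal sketch of one possible naming scheme (the exact format is up to you, and this one is just an assumption):

import getpass
import time
import uuid

# build an application name that is unique in the cluster by combining
# the user ID, a timestamp and a random suffix
app_name = "%s_model_training_%d_%s" % (getpass.getuser(),
    int(time.time()), uuid.uuid4().hex[:8])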
Since the Spark job takes some time to be registered in YARN after it has been launched using spark-submit, I implemented the loop:
max_wait_time_job_start_s = 120
yarn_application_id = ''
start_time = time.time()
while yarn_application_id == '' and time.time()-start_time \
        < max_wait_time_job_start_s:
    yarn_application_id = getYARNApplicationID(app_name)
where max_wait_time_job_start_s is the time to wait for the registration in seconds, which may need to be adjusted according to your environment.
The meaning of
if yarn_application_id == '':
raise RuntimeError("Couldn't get yarn application ID for"\
" application %s" % app_name)
is straightforward; if there is no application ID, it means the Spark job has not been successfully launched and we need to throw an exception. This may also indicate that we need to increase max_wait_time_job_start_s , or change how the output of yarn application -list is parsed inside getYARNApplicationID .
Checking the final status of the subordinate task
After the subordinate task has finished, checking its final status can be done as follows:
def getSparkJobFinalStatus(application_id):
    out = subprocess.check_output(["yarn", "application",
        "-status", application_id], stderr=subprocess.DEVNULL,
        universal_newlines=True)
    status_lines = out.split("\n")
    state = ''
    for line in status_lines:
        # yarn prints a line of the form "\tFinal-State : SUCCEEDED"
        if len(line) > 15 and line[1:15] == "Final-State : ":
            state = line[15:]
            break
    return state

final_status = getSparkJobFinalStatus(yarn_application_id)
where again, you may need to tune the parsing of yarn application -status depending on your Hadoop version. How to handle the final status is entirely up to you, but one possibility is to store the Spark job’s driver and executor log in a file and raise an exception. For example:
log_path = "/home/edsonaoki/logs/%s_%s.log" % (app_name,
    yarn_application_id)

if final_status != "SUCCEEDED":
    cmd_output = subprocess.Popen(["yarn", "logs",
        "-applicationId", yarn_application_id],
        stdout=subprocess.PIPE, stderr=subprocess.STDOUT,
        bufsize=1, universal_newlines=True)
    with open(log_path, "w") as f:
        for line in cmd_output.stdout:
            f.write(line)
    print("Written log of failed task to %s" % log_path)
    cmd_output.communicate()
    raise RuntimeError("Task %s has not succeeded" % app_name)
Using multithreading to execute subordinate tasks in parallel
If not obvious, before attempting to execute subordinate tasks in parallel, make sure to test as many tasks as possible without parallelisation, as debugging parallel tasks can be incredibly difficult.
To perform parallelisation we will use Python’s concurrent library. The concurrent library uses multithreading and not multiprocessing; i.e. the threads run in the same process, such that from the side of the controller task, there is no real parallel processing. However, since the threads started in the controller task are in I/O mode (unblocked) when waiting for the subordinate tasks to finish, multiple subordinate tasks can be launched asynchronously, such that they will actually run in parallel on the side of the Hadoop cluster. While we can technically use the multiprocessing library instead of the concurrent library to achieve parallelism also from the controller task’s side, I would advise against it as it will substantially increase the memory consumption in the client environment for little benefit — the idea is that the “tough processing” is done in the Hadoop cluster.
When we launch a Spark job, we are typically aware of the constraints of processing and memory in the cluster environment, especially in the case of a shared environment, and use configuration parameters such as spark.executor.memory and spark.executor.instances in order to control the task’s processing and memory consumption. The same needs to be done in our case; we need to limit the number of subordinate tasks that execute simultaneously according to the availability of computational resources in the cluster, such that when we reach this limit, a subordinate task can only be started after another has finished.
The concurrent package offers the futures.ThreadPoolExecutor class, which allows us to start multiple threads and wait for them to finish. The class also allows us to limit the number of threads doing active processing (i.e. not blocked by I/O) via the max_workers argument. However, as I mentioned before, a thread in the controller task is treated as being blocked by I/O when the subordinate task is running, which means that max_workers won’t effectively limit the number of threads. As a result, all subordinate tasks will be submitted nearly simultaneously and the Hadoop cluster can become overloaded.
This can be solved rather easily by modifying the futures.ThreadPoolExecutor class as follows:
import concurrent.futures
from queue import Queue

class ThreadPoolExecutorWithQueueSizeLimit(
        concurrent.futures.ThreadPoolExecutor):

    def __init__(self, maxsize, *args, **kwargs):
        super(ThreadPoolExecutorWithQueueSizeLimit,
              self).__init__(*args, **kwargs)
        # replace the executor's unbounded work queue with a bounded one
        self._work_queue = Queue(maxsize=maxsize)
This new class ThreadPoolExecutorWithQueueSizeLimit works exactly like futures.ThreadPoolExecutor , but it won’t allow more than maxsize threads to exist at any point in time, effectively limiting the number of subordinate tasks running simultaneously in the Hadoop cluster.
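As a quick, standalone sanity check of the class (the toy task below is mine, not from the framework): with maxsize=2 and max_workers=2 , at most two tasks run at a time and at most two more wait in the queue, so later submit calls block instead of piling up:

import time

def toy_task(i):
    time.sleep(1)   # stand-in for launching a subordinate task
    return i

with ThreadPoolExecutorWithQueueSizeLimit(maxsize=2,
        max_workers=2) as executor:
    futures = [executor.submit(toy_task, i) for i in range(6)]
print(sorted(f.result() for f in futures))  # [0, 1, 2, 3, 4, 5]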
We now need to define a function, containing the execution code of the thread, which can be passed as an argument to the class ThreadPoolExecutorWithQueueSizeLimit . Based on the previous code for executing a subordinate task from Python without debugging messages, I present the following generic thread execution function:
def executeThread(app_name, spark_submit_cmd, error_log_dir,
                  max_wait_time_job_start_s=120):
    cmd_output = subprocess.Popen(spark_submit_cmd, shell=True,
        stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL)
    # Wait until the job is registered in YARN, up to the time limit
    yarn_application_id = ''
    start_time = time.time()
    while yarn_application_id == '' and time.time()-start_time\
            < max_wait_time_job_start_s:
        yarn_application_id = getYARNApplicationID(app_name)
    cmd_output.wait()
    if yarn_application_id == '':
        raise RuntimeError("Couldn't get yarn application ID for"\
            " application %s" % app_name)
    final_status = getSparkJobFinalStatus(yarn_application_id)
    log_path = "%s/%s_%s.log" % (error_log_dir, app_name,
        yarn_application_id)
    if final_status != "SUCCEEDED":
        cmd_output = subprocess.Popen(["yarn","logs",
            "-applicationId",yarn_application_id],
            stdout=subprocess.PIPE, stderr=subprocess.STDOUT,
            bufsize=1, universal_newlines=True)
        with open(log_path, "w") as f:
            for line in cmd_output.stdout:
                f.write(line)
        print("Written log of failed task to %s" % log_path)
        cmd_output.communicate()
        raise RuntimeError("Task %s has not succeeded" % app_name)
    return True
As you can see, the function uses the previously defined functions getYARNApplicationID and getSparkJobFinalStatus , and the application name, the spark-submit command line and the directory to store the error logs are passed as arguments to the function.
Note that the function raises an exception in case the yarn application ID cannot be found, or the status of the Spark job is not successful. But depending on the case, we may just want the function to return a False value, so that the controller task knows that this particular subordinate task has not been successful and needs to be executed again, without needing to re-run the tasks that have already been successful. In this case, we just need to replace the lines
raise RuntimeError("Couldn't get yarn application ID for application %s" % app_name)
and
raise RuntimeError("Task %s has not succeeded" % app_name)
with
return False
The next step is to create a generic code to start the threads and wait for their completion, as follows:
def executeAllThreads(dict_spark_submit_cmds, error_log_dir,
                      max_parallel, dict_success_app=None):
    if dict_success_app is None:
        dict_success_app = {app_name: False for app_name in
            dict_spark_submit_cmds.keys()}
    with ThreadPoolExecutorWithQueueSizeLimit(maxsize=max_parallel,
            max_workers=max_parallel) as executor:
        # Submit one thread per subordinate task not yet successful
        future_to_app_name = {
            executor.submit(
                executeThread, app_name,
                spark_submit_cmd, error_log_dir,
            ): app_name for app_name, spark_submit_cmd in
            dict_spark_submit_cmds.items() if
            dict_success_app[app_name] == False
        }
        for future in concurrent.futures\
                .as_completed(future_to_app_name):
            app_name = future_to_app_name[future]
            try:
                dict_success_app[app_name] = future.result()
            except Exception as exc:
                print('Subordinate task %s generated exception %s' %
                    (app_name, exc))
                raise
    return dict_success_app
The mandatory arguments to the function are:
a dictionary with application names as keys and the corresponding job submission command lines as values;
the directory to store the error logs;
the maximum number of subordinate tasks allowed to run simultaneously ( max_parallel ).
The output of the function is also a dictionary containing the return value (True or False) of each subordinate task, indexed by application name. The optional argument is dict_success_app , which can be the return value from a previous execution of the function, in case we only want to run the subordinate tasks that have not already been successful. I will show later how that can be accomplished.
For the reader’s convenience, I put together the complete code of the parallelisation framework below:
import subprocess
import concurrent.futures
import time
from queue import Queue

class ThreadPoolExecutorWithQueueSizeLimit(
        concurrent.futures.ThreadPoolExecutor):
    def __init__(self, maxsize, *args, **kwargs):
        super(ThreadPoolExecutorWithQueueSizeLimit,
              self).__init__(*args, **kwargs)
        # Bounded queue: submit() blocks once maxsize items are pending
        self._work_queue = Queue(maxsize=maxsize)

def getYARNApplicationID(app_name):
    state = 'RUNNING,ACCEPTED,FINISHED,KILLED,FAILED'
    out = subprocess.check_output(["yarn","application","-list",
        "-appStates",state], stderr=subprocess.DEVNULL,
        universal_newlines=True)
    lines = [x for x in out.split("\n")]
    application_id = ''
    for line in lines:
        if app_name in line:
            application_id = line.split('\t')[0]
            break
    return application_id

def getSparkJobFinalStatus(application_id):
    out = subprocess.check_output(["yarn","application",
        "-status",application_id], stderr=subprocess.DEVNULL,
        universal_newlines=True)
    status_lines = out.split("\n")
    state = ''
    for line in status_lines:
        if len(line) > 15 and line[1:15] == "Final-State : ":
            state = line[15:]
            break
    return state

def executeThread(app_name, spark_submit_cmd, error_log_dir,
                  max_wait_time_job_start_s=120):
    cmd_output = subprocess.Popen(spark_submit_cmd, shell=True,
        stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL)
    # Wait until the job is registered in YARN, up to the time limit
    yarn_application_id = ''
    start_time = time.time()
    while yarn_application_id == '' and time.time()-start_time\
            < max_wait_time_job_start_s:
        yarn_application_id = getYARNApplicationID(app_name)
    cmd_output.wait()
    if yarn_application_id == '':
        raise RuntimeError("Couldn't get yarn application ID for"\
            " application %s" % (app_name))
        # Replace the line above by the following if you do not
        # want a failed task to stop the entire process:
        # return False
    final_status = getSparkJobFinalStatus(yarn_application_id)
    log_path = "%s/%s_%s.log" % (error_log_dir, app_name,
        yarn_application_id)
    if final_status != "SUCCEEDED":
        cmd_output = subprocess.Popen(["yarn","logs",
            "-applicationId",yarn_application_id],
            stdout=subprocess.PIPE, stderr=subprocess.STDOUT,
            bufsize=1, universal_newlines=True)
        with open(log_path, "w") as f:
            for line in cmd_output.stdout:
                f.write(line)
        print("Written log of failed task to %s" % log_path)
        cmd_output.communicate()
        raise RuntimeError("Task %s has not succeeded" % app_name)
        # Replace the line above by the following if you do not
        # want a failed task to stop the entire process:
        # return False
    return True

def executeAllThreads(dict_spark_submit_cmds, error_log_dir,
                      max_parallel, dict_success_app=None):
    if dict_success_app is None:
        dict_success_app = {app_name: False for app_name in
            dict_spark_submit_cmds.keys()}
    with ThreadPoolExecutorWithQueueSizeLimit(maxsize=max_parallel,
            max_workers=max_parallel) as executor:
        future_to_app_name = {
            executor.submit(
                executeThread, app_name,
                spark_submit_cmd, error_log_dir,
            ): app_name for app_name, spark_submit_cmd in
            dict_spark_submit_cmds.items() if
            dict_success_app[app_name] == False
        }
        for future in concurrent.futures\
                .as_completed(future_to_app_name):
            app_name = future_to_app_name[future]
            try:
                dict_success_app[app_name] = future.result()
            except Exception as exc:
                print('Subordinate task %s generated exception %s' %
                    (app_name, exc))
                raise
    return dict_success_app
Example: Multi-label model training with 2-level parallelisation using Gradient Boosting binary classifiers
In this example, I will show how to use the framework above to parallelise training of a multi-label classifier with hundreds of labels. Basically, we will train multiple binary classifiers in parallel, where the training of each binary model is itself parallelised via Spark. The individual binary classifiers are Gradient Boosting models trained using the Spark version of the popular LightGBM package, contained in the Microsoft Machine Learning for Spark (MMLSpark) library.
Setting up the controller task
By using the framework above, there are only two other things that the controller task needs to do:
1. Prior to calling the executeAllThreads function, set up the application name and spark-submit command for each subordinate task;
2. After returning from the executeAllThreads function, check which subordinate tasks have been successful and handle their output appropriately.
For the first part, we can start by looking at our previous example where we are submitting a standalone subordinate job:
spark_config_cluster_path = "/home/edsonaoki/spark_config_cluster"
app_name = "some_model_training"
spark_config = {
    "spark.jars.packages" :
        "com.microsoft.ml.spark:mmlspark_2.11:0.18.1",
    "spark.dynamicAllocation.enabled": "false",
    "spark.executor.instances": "10",
    "spark.yarn.dist.files": "/home/edsonaoki/custom_packages.tar"
}

command = "lightgbm_training.py "\
    "hdfs://user/edsonaoki/datasets/input_data.parquet "\
    "hdfs://user/edsonaoki/models"

spark_submit_cmd = "SPARK_CONF_DIR=%s spark-submit --name %s %s %s"\
    % (spark_config_cluster_path, app_name,
    " ".join(['--conf %s="%s"' % (key, value) for key, value in
        spark_config.items()]),
    command)
What do we need to change to adapt the code for multi-label classification? First, for the reasons already mentioned, the application name needs to be completely unique. Assuming that the label columns of the dataset input_data.parquet are contained in a variable lst_labels , one way to ensure likely-unique application names for each subordinate task would be something like:
import time

curr_timestamp = int(time.time()*1000)
app_names = ["model_training_%s_%d" % (label,curr_timestamp) for
    label in lst_labels]
This ensures that application names will be unique as long as the controller task is not started more than once in the same millisecond (of course, if we have a shared YARN cluster, other adaptations may be needed to make the application names unique, such as adding the username to the application name).
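A hedged sketch of that adaptation (the naming scheme is just an illustration; getpass is part of the Python standard library):

import getpass
import time

# Hypothetical naming scheme: username + label + millisecond timestamp
curr_timestamp = int(time.time()*1000)
app_names = ["model_training_%s_%s_%d" %
    (getpass.getuser(), label, curr_timestamp)
    for label in lst_labels]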
We are yet to discuss what the subordinate task code contained in lightgbm_training.py looks like, but let’s suppose it:
Performs some pre-processing on the training data, based on the label column (such as dataset balancing), using a function contained in the custom_packages.tar file submitted along with the Spark job
Trains the model based on the features column and the label column
Saves the trained model in the HDFS system
In this case, the controller task needs to pass the HDFS path of the training dataset, the HDFS path to store the trained models, and the label to be used for each subordinate task, via command-line arguments to lightgbm_training.py . This can be done as shown below:
dict_spark_submit_cmds = dict()
for i in range(len(lst_labels)):
    command = "lightgbm_training.py "\
        "hdfs://user/edsonaoki/datasets/input_data.parquet "\
        "hdfs://user/edsonaoki/models "\
        + lst_labels[i]
    spark_submit_cmd = "SPARK_CONF_DIR=%s spark-submit --name %s "\
        "%s %s" % (spark_config_cluster_path, app_names[i],
        " ".join(['--conf %s="%s"' % (key, value) for key, value in
            spark_config.items()]),
        command)
    dict_spark_submit_cmds[app_names[i]] = spark_submit_cmd
Of course, there are many other ways to customise the subordinate tasks. We might want to use different model training hyperparameters, different datasets, different Spark configurations, or even use different Python scripts for each subordinate task. The fact that we allow the spark-submit command line to be unique for each subtask allows complete customisation.
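As a hedged example of such customisation (the extra learning-rate argument is hypothetical, and lightgbm_training.py would need to parse it as sys.argv[4] ), per-task hyperparameters can simply be appended to each command line:

# Hypothetical per-label learning rates
learning_rates = {label: 0.3 for label in lst_labels}

for i in range(len(lst_labels)):
    command = "lightgbm_training.py "\
        "hdfs://user/edsonaoki/datasets/input_data.parquet "\
        "hdfs://user/edsonaoki/models "\
        + lst_labels[i] + " "\
        + str(learning_rates[lst_labels[i]])
    # spark_submit_cmd is then built exactly as before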
For the reader’s convenience, I put together the controller task’s code prior to and until calling executeAllThreads :
import time

spark_config_cluster_path = "/home/edsonaoki/spark_config_cluster"

curr_timestamp = int(time.time()*1000)
app_names = ["model_training_%s_%d" % (label,curr_timestamp) for
    label in lst_labels]

spark_config = {
    "spark.jars.packages" :
        "com.microsoft.ml.spark:mmlspark_2.11:0.18.1",
    "spark.dynamicAllocation.enabled": "false",
    "spark.executor.instances": "10",
    "spark.yarn.dist.files": "/home/edsonaoki/custom_packages.tar"
}

dict_spark_submit_cmds = dict()
for i in range(len(lst_labels)):
    command = "lightgbm_training.py "\
        "hdfs://user/edsonaoki/datasets/input_data.parquet "\
        "hdfs://user/edsonaoki/models "\
        + lst_labels[i]
    spark_submit_cmd = "SPARK_CONF_DIR=%s spark-submit --name %s "\
        "%s %s" % (spark_config_cluster_path, app_names[i],
        " ".join(['--conf %s="%s"' % (key, value) for key, value in
            spark_config.items()]),
        command)
    dict_spark_submit_cmds[app_names[i]] = spark_submit_cmd

executeAllThreads(dict_spark_submit_cmds, "/home/edsonaoki/logs",
    max_parallel=10)  # assumed concurrency limit; tune to your cluster
For the second part, i.e. what the controller task should do after returning from executeAllThreads : assuming that the successful tasks have saved the trained models in the HDFS system, we can just open these files and process them as appropriate, for instance applying the models to a suitable validation dataset, generating plots and computing performance metrics.
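A minimal sketch of such post-processing, run from the controller task’s own Spark session (the spark variable); it assumes the MMLSpark version in use exposes LightGBMClassificationModel.load for models saved as above, and that a validation dataset exists at the hypothetical path below:

from mmlspark import LightGBMClassificationModel
from pyspark.ml.evaluation import BinaryClassificationEvaluator

# Hypothetical validation dataset path
df_valid = spark.read.parquet(
    "hdfs://user/edsonaoki/datasets/validation_data.parquet")
for label in lst_labels:
    model = LightGBMClassificationModel.load(
        "hdfs://user/edsonaoki/models/trained_model_%s.mdl" % label)
    df_scored = model.transform(df_valid)
    auc = BinaryClassificationEvaluator(labelCol=label)\
        .evaluate(df_scored)
    print("Validation AUC for label %s: %.3f" % (label, auc))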
If we use the parallelisation framework presented earlier as it is, there won’t be “unsuccessful subordinate tasks”, as any failure will result in an exception being raised. But if we modified executeThread to return False in case of task failure, we might store the returned dict_success_app dictionary in a JSON or Pickle file, so that we can later investigate and fix the failed tasks. Finally, we can call executeAllThreads again with the optional argument dict_success_app set, so that we re-run only the failed tasks.
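A sketch of that retry pattern (the JSON file name is arbitrary), assuming executeThread was modified to return False instead of raising:

import json

dict_success_app = executeAllThreads(dict_spark_submit_cmds,
    "/home/edsonaoki/logs", max_parallel=10)
with open("task_status.json", "w") as f:
    json.dump(dict_success_app, f)

# Later, after investigating the logs of the failed tasks:
with open("task_status.json") as f:
    dict_success_app = json.load(f)
dict_success_app = executeAllThreads(dict_spark_submit_cmds,
    "/home/edsonaoki/logs", max_parallel=10,
    dict_success_app=dict_success_app)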
Setting up the subordinate task
Let us now write the code of the subordinate task in the lightgbm_training.py script. The first step is to read the input arguments of the script, i.e. the path of the training dataset in the HDFS filesystem, the path to store the models and the name of the label column:
import sys
train_data_path = sys.argv[1]
model_path = sys.argv[2]
label = sys.argv[3]
Since we are using the Spark version of LightGBM, we need to create a Spark session, which we do as follows:
from pyspark.sql import SparkSession
spark = SparkSession.builder.getOrCreate()
spark.sparkContext.addPyFile("./custom_packages.tar")
Note that there is no need to set up any configuration for the Spark session, as it has already been done in the command line submitted by the controller task. Also, since we explicitly submitted a custom Python package custom_packages.tar to the Spark job, we need to use the addPyFile function to make the contents of the package usable inside our code, as the package is not included in the PYTHONPATH environment variable of the Hadoop cluster.
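For this to work, custom_packages.tar must contain the module at its top level. A hedged sketch of how such an archive could be built in the client environment (the module file name is inferred from the import further below):

import tarfile

# custom_data_preprocessing.py must sit at the root of the archive
with tarfile.open("/home/edsonaoki/custom_packages.tar", "w") as tar:
    tar.add("custom_data_preprocessing.py")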
The code that does the actual processing in the subordinate task is pretty straightforward. The subordinate task will read the training data, call some pre-processing function inside custom_packages.tar (say custom_data_preprocessing.datasetBalancing ), perform the model training, and save the trained model with a unique name back in the HDFS file system:
from custom_data_preprocessing import datasetBalancing
from mmlspark import LightGBMClassifier

df_train_data = spark.read.parquet(train_data_path)
df_preproc_data = datasetBalancing(df_train_data, label)
untrained_model = LightGBMClassifier(learningRate=0.3,
                                     numIterations=150,
                                     numLeaves=45)\
    .setFeaturesCol("features")\
    .setLabelCol(label)
trained_model = untrained_model.fit(df_preproc_data)
trained_model.write().overwrite()\
    .save(model_path + "/trained_model_%s.mdl" % label)

spark.stop()
The full code of lightgbm_training.py is put together below for the reader’s convenience:
import sys

train_data_path = sys.argv[1]
model_path = sys.argv[2]
label = sys.argv[3]

from pyspark.sql import SparkSession
spark = SparkSession.builder.getOrCreate()
spark.sparkContext.addPyFile("./custom_packages.tar")

from custom_data_preprocessing import datasetBalancing
from mmlspark import LightGBMClassifier

df_train_data = spark.read.parquet(train_data_path)
df_preproc_data = datasetBalancing(df_train_data, label)
untrained_model = LightGBMClassifier(learningRate=0.3,
                                     numIterations=150,
                                     numLeaves=45)\
    .setFeaturesCol("features")\
    .setLabelCol(label)
trained_model = untrained_model.fit(df_preproc_data)
trained_model.write().overwrite()\
    .save(model_path + "/trained_model_%s.mdl" % label)

spark.stop()
Conclusion
It is easy to see that the framework presented in this article can be re-used for various tasks other than multiple machine learning model training. A question that may arise is whether it can be used for different cluster environments, for instance with Spark on Mesos rather than Spark on YARN. I believe so, but some adaptations are needed, as the presented code relies heavily on the yarn command to monitor the subordinate tasks.
By using this framework, data scientists can focus more of their time on designing the data tasks, not on manually executing them for dozens or hundreds of small variations. Another advantage is that by harnessing parallelisation, the tasks can be done in much less time, or from a different perspective, without requiring multiple data scientists to work simultaneously to complete the tasks in the same amount of time.
Naturally, this article presents only one of many ways to improve data science automation. Organisations that realise that the time of data scientists and other skilled tech professionals is highly valuable will certainly find increasingly more ways to help these professionals focus on higher-level problems. | https://towardsdatascience.com/how-to-train-multiple-machine-learning-models-and-run-other-data-tasks-in-parallel-by-combining-2fa9670dd579 | ['Edson Hiroshi Aoki'] | 2019-11-12 06:14:24.952000+00:00 | ['Machine Learning', 'Spark', 'Data Science', 'Big Data', 'Python'] |
2,056 | Are You A Walking Paradox Like Me? | It can get lonely and confusing out here, for us boxless ones — those of us who don’t feel like we really fit in with a specific category. It’s even hard to describe the category of “boxless”.
Here are some signs that you’re boxless like me:
You’ve felt lonely because nobody else seems to think and experience life the way you do. On the surface, it might look like you fit in but you don’t feel a sense of true belonging for who you really are.
Your experiences with therapy and coaching are mostly that you were a better therapist or coach to yourself than they were. This adds an additional layer of loneliness because this is the person who is supposed to see you — like, really see you — and even they don’t seem to get it.
You’ve outgrown your parents emotionally and aren’t sure what to do about that.
You can blend in and get along with most anyone, but rarely do you feel a true sense of belonging.
Your relationship with most friends involves you listening to them talk and/or helping them through their problems. On those rare times when you try to open up about your experiences, you don’t usually get a reaction from them that actually helps.
You feel deeply and think deeply and have a rich inner world. This has its perks but sometimes you study less-aware people and you envy their simplicity because it seems easier to be happy that way.
You are sensitive to your surroundings, other people, and your inner world. You are affected by things more deeply than many other people seem to be and you hold on to emotions longer. This is a gift, but instead of using it to your advantage, you beat yourself up for being too sensitive because in our society, sensitivity is equated with weakness.
You push yourself too hard and then burn out. You’ve had periods of your life that cycle through pushing, collapsing into burnout and hiding away, then pushing yourself again to start the cycle over.
You are spiritual, but wary of blindly following any religion or guru.
You have a still small voice always thinking, “There has got to be a better way.”
If you can relate to most all of these bullets — I see you.
Please know you are not alone. Please know it is my life’s mission to walk beside you, to enter your inner world with you and help you find all the answers that are already hidden inside of you. | https://medium.com/just-jordin/are-you-a-walking-paradox-like-me-8d3e7a61682b | ['Jordin James'] | 2020-11-15 22:57:34.671000+00:00 | ['Mental Health', 'Self', 'Psychology', 'Spirituality', 'Inspiration'] |
2,057 | Sometimes My Mind Makes Me Hate Writing | (This story was originally written on October 25th, 2018. It was a good snapshot of my life at the time, so I thought I would republish it. During these psychotic episodes, I have very little control of my mind, and I tried my best to capture the chaos that went on that day.)
I’ve been having a rough time lately. I’m trying to write every day, but the situation in my mind has been appalling. Some days, I can’t even get an extra thought in with all the racket competing for attention. I feel like my mind has a mind of its own.
But you see, I have goals. These goals are not nice-to-haves but written in stone with a chisel. I have to do something about my financial state of affairs, and I need to do it now.
I’m going to admit something hard for me to say. I’m on Social Security Disability (SSDI).
There, I said it.
I don’t know why I’m ashamed of it because I don’t have a choice in the matter. I haven’t been able to keep a job for a long time, and freelancing is problematic because I have problems with consistency. Clients don’t want to hire you if you can’t deliver on deadlines day after day.
I’ve proven over time that I can’t, you know, deliver.
When I have days like today, where the voices in my head are challenging each other for airtime and I can’t form a thought — much less write anything worthwhile — I become anxious and depressed.
Seriously. It took me 2 hours to write the last 217 words.
I’ve been trying to come up with a way to explain the situation I’m in, but all that comes to mind is FML — fuck my life.
I’ll try again.
Photo by Ksenia Makagonova on Unsplash
I can’t hold a job — I’ve proven it time and time again. Along the road I’ve walked the past fifteen years are the shattered, smoking husks of lost opportunities.
Freelancing seemed a viable option to supplement my SSDI, but like any job, they expect you to deliver on deadlines. It’s not personal — it’s business. I get it.
You don’t have to explain to the crying man sitting in the corner.
I thought a solution would be to get something going on Medium or write articles for blogs on my own time and schedule.
I’ve been reading the advice of others about what I need to do to be successful on Medium, or with writing in general. One of the first things always mentioned is you have to write every day and put out content seven days a week.
That’s just not realistic for me.
I sound like a complainer — I know, I disgust myself. But I’ve tried writing and publishing every day, and I even made a schedule. I went one step further and tracked my time to see where it was all going.
I sit at my desk and try to type. I try to make the words flow. Today I ended up with my head in my hands, screaming for my brain to please shut up. I finally gave up and rested in bed with my laptop open. I’m struggling to write this post, 20 words at a time.
My family tries to help, but I can’t tell them that every small noise they make rings in my head like a dinner gong. I can’t tell them everything irritates me.
The worst thing is — if I can’t get my mind under control, we may not eat next month, or the month after that, because my Social Security can’t last forever.
Again, FML.
I can feel the panic building. My stomach feels like I ate a 5-pound burrito and the contents are pushing into my throat in preparation to throw up. My hands are shaking.
Music. I need music. Ed Sheeran — take me away.
Dogs are barking. I still hear them. My daughter, Zoey, is chattering happily in the next room. Control. A woman’s voice is droning on in the back of my head. I’m ignoring her, but she’s persistent. My medication isn’t doing anything to help. PANIC.
I need to take a break.
Photo by Adeolu Eletu on Unsplash
An hour of Netflix and my mind has quieted somewhat. Zoey is sick today and sitting at my desk watching funny YouTube videos on my phone.
It’s as calm as it gets around here — no better time than now to write a few words.
Sometimes, it takes a little distraction for me to be able to focus. Does that make any sense? I need to focus on something other than what’s going on in my head. Sometimes, the things that live in my head are so disturbing that it takes a lot of noise to drown it out.
Photo by Tim Marshall on Unsplash
More breaks. I can’t keep it reigned in.
If SHE is not talking in my head, it’s an old woman. Nothing they say makes sense.
I’m scared because I don’t want Flora to find out the voices are back. She thought the medication was helping, but it’s not. The only time it’s quiet is when I’m drunk, but I promised myself I wouldn’t self-medicate. There are also problems with the headaches. When I drink, the headaches get worse.
Worse yet, when the headaches are screaming in my head, so are the voices.
I know they’re not real. I’ve been dealing with the people in my head long enough to know the people aren’t real people. My mind creates everything.
Knowing it doesn’t help. Knowing it makes me feel like more of a freak.
I have to stop.
Photo by Will Porada on Unsplash
I shoveled the food into my mouth, more out of habit than hunger. I didn’t even taste it. Every little noise distracts my mind — even the sound of the fork touching the plate was torture.
I yelled at Zoey again. She was just playing, but my mind convinced me it was bothering my wife while she was working. I can’t control the anger that builds in my chest.
Now I feel horrible. I’m such an asshole.
Photo by Edin Hopic on Unsplash
I forget kids don’t hold grudges. When I went to check on Zoey, she smiled and hugged me. It’s scary to think that so many people count on me.
I don’t want to lose what’s left of my mind. | https://jasonjamesweiland.medium.com/sometimes-my-mind-makes-me-hate-writing-9e79b98c3088 | ['Jason Weiland'] | 2020-02-13 03:39:49.541000+00:00 | ['Digital Life', 'Mental Health', 'Self', 'Mindfulness', 'Writing'] | Title Sometimes Mind Makes Hate WritingContent story originally written October 25th 2018 good snapshot life time thought would republish psychotic episode little control mind tried best capture chaos went day I’ve rough time lately I’m trying write every day situation mind appalling day can’t even get extra thought racket competing attention feel like mind mind see goal goal nice written stone chisel something financial state affair need I’m going admit something hard say I’m Social Security Disability SSDI said don’t know I’m ashamed don’t choice matter haven’t able keep job long time freelancing problematic problem consistency Clients don’t want hire can’t deliver deadline day day I’ve proven time can’t know deliver day like today voice head challenging airtime can’t form thought — much le write anything worthwhile — become anxious depressed Seriously took 2 hour write last 217 word I’ve trying come way explain situation I’m come mind FML — fuck life I’ll try Photo Ksenia Makagonova Unsplash can’t hold job — I’ve proven time time Along road I’ve walked past fifteen year shattered smoking husk lost opportunity Freelancing seemed viable option supplement SSDI like job expect deliver deadline It’s personal — it’s business get don’t explain cry man sitting corner thought solution would get something going Medium write article blog time schedule I’ve reading advice others need successful Medium writing general One first thing always mentioned write every day put content seven day week That’s realistic sound like complainer — know disgust I’ve tried writing publishing every day even made schedule went one step tracked time see going sit desk try type try make word flow Today ended head hand screaming brain please shut finally gave rested bed laptop open I’m struggling write post 20 word time family try help can’t tell every small noise make ring head like dinner gong can’t tell everything irritates worst thing — can’t get mind control may eat next month month Social Security can’t last forever FML feel panic building stomach feel like ate 5pound burrito content pushing throat preparation throw hand shaking Music need music Ed Sheeran — take away Dogs barking still hear daughter Zoey chattering happily next room Control woman’s voice droning back head I’m ignoring she’s persistent medication isn’t anything help PANIC need take break Photo Adeolu Eletu Unsplash hour Netflix mind quieted somewhat Zoey sick today sitting desk watching funny YouTube video phone It’s calm get around — better time write word Sometimes take little distraction able focus make sense need focus something what’s going head Sometimes thing live head disturbing take lot noise drown Photo Tim Marshall Unsplash break can’t keep reigned talking head it’s old woman Nothing say make sense I’m scared don’t want Flora find voice back thought medication helping it’s time it’s quiet I’m drunk promised wouldn’t selfmedicate also problem headache drink headache get worse Worse yet headache screaming head voice know they’re real I’ve dealing people head long enough know people aren’t real people mind creates everything Knowing doesn’t help Knowing make feel like freak stop Photo Porada Unsplash shoveled food mouth habit hunger didn’t even taste Every little noise distracts 
mind — even sound fork touching plate torture yelled Zoey playing mind convinced bothering wife working can’t control anger build chest feel horrible I’m asshole Photo Edin Hopic Unsplash forget kid don’t hold grudge went check Zoey smiled hugged It’s scary think many people count don’t want lose what’s left mindTags Digital Life Mental Health Self Mindfulness Writing |
2,058 | 7 Powerful Psychology Lessons That Will Boost Your Digital Marketing Game | 1. Emotional Marketing
There are two types of strategies that affect consumers’ buying habits:
Rational marketing that promotes the quality and usefulness of the product, emphasizes the benefits and appeals to the rational or logical consumer.
Emotional marketing that approaches the consumer on a personal level and focuses on the tone, lighting, and mood to increase loyalty and boost conversions.
It's been proven that consumers base their purchase decisions on feelings and emotions rather than on the rational information of products' features and attributes. So, it's worth remembering that customers are more likely to be loyal to brands that evoke a positive emotional response. Use this knowledge in your content marketing strategy and create content that:
Inspires, creates excitement and interest.
Reminds of special moments.
Sparks conversations, reactions, and engagements.
Apple is the perfect example of a company that uses emotions to connect with their consumers and increase brand loyalty. Apple’s marketing strategies tend to create a desire to become a part of a lifestyle movement, to be a part of something bigger. Recently, Apple joined Instagram and their #ShotoniPhone campaign fully encompasses those values. Instead of focusing on shiny product shots, Apple invites regular users around the world to share their iPhone photography with others.
2. Social Proof
According to social psychologist Robert Cialdini, social proof is one of the most important tactics for influencing and convincing customers. Social proof or social influence is based on the fact that people love to follow the behavior of others. We tend to adopt the beliefs or mimic the actions of people we trust and admire. Implement this knowledge in your marketing strategy by using:
User-generated content, testimonials and reviews.
Influencer marketing.
Social plugins and sharing buttons.
For instance, clothing company Old Navy cooperated with social influencer Meghan Rienks on Instagram, Twitter and YouTube. In her videos, Meghan suggested style ideas to her followers using items from Old Navy, thus providing a powerful social proof.
3. Grounded Cognition
Grounded Cognition theory is based on the principle that people can experience a story that they read, watch or hear as if it were happening to them. It also states that people tend to forget dry facts and figures. If you want your customers to remember your message, you have to incorporate it into a story. Taking this into account, you can boost your marketing by:
Speaking to your audience in a friendly way.
Telling the stories they can empathize with.
Sharing a personal story or experience.
High Brew Coffee provides a great example of a personal story that enables the audience to connect with the brand. The founder of the company, David Smith, together with his wife, has shared the story of how they came up with their business idea. They let the audience know exactly where it comes from — a long trip through the Caribbean with their whole family.
4. Paradox Of Choice
Giving people the freedom of choice can positively influence your marketing efforts. However, too many choices make people nervous and can negatively impact conversion rates. According to psychologist Barry Schwartz, providing people with a limited range of choices reduces customers' anxiety and leads to better marketing results. Use this knowledge and:
Emphasize a few key points at a time.
Create clear CTAs.
Give your customers no more than two clear paths to follow.
The Paradox of Choice theory can also be applied if you wish to offer your customers a wider range of choices. For example, while Amazon offers millions of products, they still manage to avoid choice overload. It's done by highlighting a few different categories of products at a time, each with up to 7 product options.
5. Information-Gap Theory
George Loewenstein proposed that people experience a strong emotional response when they notice a gap between what they know and what they want to know. This means that you have to create a feeling of curiosity within your audience and give them information that fulfills their need for knowledge. An effective way to incorporate it in your content marketing is by creating powerful headlines. There are plenty of free online tools that can help, such as:
Take an example from the digital marketing expert Neil Patel who is a master of strong headlines that create curiosity and generate clicks:
6. The Commitment and Consistency Theory
This theory states that if you make a small commitment to something, you are more likely to say yes to a bigger commitment in the future. This means that if you get your customers to make a small commitment towards your brand, like signing up for a newsletter, they are more likely to make a larger commitment e.g. in the form of a purchase or membership. To improve your marketing strategy, start with small commitments like:
Ask for customers’ contact details.
Invite them to subscribe to a newsletter.
Ask prospects to share your content on social media.
Offer them to sign up for an e-book or webinar.
Search Engine Journal, for instance, takes advantage of this theory and offers a free webinar for their website visitors. Although it’s simply asking for a name and an email address, it’s already a small commitment the user makes towards the brand.
7. Loss Aversion Theory
Loss aversion theory refers to the tendency of people to avoid losses rather than acquire gains. The negative feelings associated with loss are even twice as powerful as the good feelings of gain. You can effectively use this theory to your advantage if you analyze your audience, learn their fears and create content that emphasizes the benefits of your brand that ease those fears. There are many analytical tools that can help you know your audience better, for example:
ModCloth used this theory in their email reminders. After a few days after not making a purchase, customers receive a reminder that inventory is running low and the item they looked at might soon not be available anymore.
Wrapping It Up
Using psychological theories is a great way to improve the success of your marketing messages without any additional technologies or big budgets. These theories can help you better understand your customers, consider how your customers think and create a content that cuts through the information overload we’re all bombarded with. | https://medium.com/the-pushcrew-journal/7-powerful-psychology-lessons-that-will-boost-your-digital-marketing-game-38bbc7b661e9 | ['Alex'] | 2019-10-10 19:38:46.202000+00:00 | ['Marketing', 'Marketing Strategies', 'Growth Hacking', 'Psychology', 'Digital Marketing'] | Title 7 Powerful Psychology Lessons Boost Digital Marketing GameContent 1 Emotional Marketing two type strategy affect consumers’ buying habit Rational marketing promotes quality usefulness product emphasizes benefit appeal rational logical consumer Emotional marketing approach consumer personal level focus tone lighting mood increase loyalty boost conversion It’s proven consumer base purchase decision around feeling emotion rather rational information products’ feature attribute it’s worth remembering customer likely loyal brand evoke positive emotional response Use knowledge content marketing strategy create content Inspires creates excitement interest Reminds special moment Sparks conversation reaction engagement Apple perfect example company us emotion connect consumer increase brand loyalty Apple’s marketing strategy tend create desire become part lifestyle movement part something bigger Recently Apple joined Instagram ShotoniPhone campaign fully encompasses value Instead focusing shiny product shot Apple invite regular user around world share iPhone photography others 2 Social Proof According social psychologist Robert Cialdini social proof one important tactic influencing convincing customer Social proof social influence based fact people love follow behavior others tend adopt belief mimic action people trust admire Implement knowledge marketing strategy using Usergenerated content testimonial review Influencer marketing Social plugins sharing button instance clothing company Old Navy cooperated social influencer Meghan Rienks Instagram Twitter YouTube video Meghan suggested style idea follower using item Old Navy thus providing powerful social proof 3 Grounded Cognition Grounded Cognition theory based principle people experience story read watch hear happening also state people tend forget dry fact figure want customer remember message incorporate story Taking account boost marketing Speaking audience friendly way Telling story empathize Sharing personal story experience High Brew Coffee provides great example personal story enables audience connect brand founder company David Smith together wife shared story coming business idea let audience know exactly come — long trip Caribbean whole family 4 Paradox Choice Giving people freedom choice positively influence marketing effort However many choice make people nervous negatively impact conversion rate According psychologist Barry Schwarz providing people limited range choice reduces customers’ anxiety lead better marketing result Use knowledge Emphasize key point time Create clear CTAs Give customer two clear path follow Paradox Choice theory applied also wish offer customer wider range choice example Amazon offer million product still manage avoid choice overload It’s done highlight different category product 7 product option 5 InformationGap Theory George Loewenstein proposed people experience strong emotional response notice gap know want 
know mean create feeling curiosity within audience give information fulfills need knowledge effective way incorporate content marketing creating powerful headline plenty free online tool help Take example digital marketing expert Neil Patel master strong headline create curiosity generate click 6 Commitment Consistency Theory theory state make small commitment something likely say yes bigger commitment future mean get customer make small commitment towards brand like signing newsletter likely make larger commitment eg form purchase membership improve marketing strategy start small commitment like Ask customers’ contact detail Invite subscribe newsletter Ask prospect share content social medium Offer sign ebook webinar Search Engine Journal instance take advantage theory offer free webinar website visitor Although it’s simply asking name email address it’s already small commitment user make towards brand 7 Loss Aversion Theory Loss aversion theory refers tendency people avoid loss rather acquire gain negative feeling associated loss even twice powerful good feeling gain effectively use theory advantage analyze audience learn fear create content emphasizes benefit brand eas fear many analytical tool help know audience better example ModCloth used theory email reminder day making purchase customer receive reminder inventory running low item looked might soon available anymore Wrapping Using psychological theory great way improve success marketing message without additional technology big budget theory help better understand customer consider customer think create content cut information overload we’re bombarded withTags Marketing Marketing Strategies Growth Hacking Psychology Digital Marketing |
2,059 | The 5 things we’ve lived by to create a truly international business from Day One | If you’re building a startup, you have a LOT of decisions to make early on. Most of them will be wrong and that’s OK. You learn from them and move on.
But when it comes to attracting markets beyond your country’s borders, you have to make the right decisions from Day One.
This is especially true if your home market is not the United States or any other English speaking country.
I’ve learned that the hard way with my first startup affinitiz. affinitiz was one of the very first social networks. (Mark Zuckerberg was still in high school to put things in perspective.)
affinitiz, my first startup, only targeted the French market. It was one of the very first social networks back in 2001. It never really took off.
It was launched in 2001 and was built as a French-only business. The website was in French, the app was in French, our PR was targeting French media, and so on.
As a consequence, it was constrained to a small, limited market (France) and it never became big enough to survive.
After 5 years, it had 400,000 registered users in France. I have no doubt that had it been an international business from Day One, it would have had at least 10 times that and would have been a valuable business.
If I had reached that 4 million registered user mark in 2006, I would have sold my company and made millions.
But instead, as a French-only business, affinitiz made only $150,000 in annual revenue. I shut it down in 2009.
My current business, Agorapulse, has users in 170 countries and 14,300 cities around the world.
It also generates more than $3.3M in annual recurring revenue and is profitable.
Only 20 percent of its revenue comes from France. If I had built Agorapulse the same way I built my first startup (French-only), it’d be dead by now.
Doing the right things to make sure your startup is truly international can make or break it.
Here’s what I’ve learned from doing it right with Agorapulse.
1) Use English as your working language for EVERYTHING
If English is not your native language, deciding that it should be your working language is no small commitment.
But think about it. Language is everywhere: your code, naming policy, website, app, your support content, emails, etc.
If you want to grow a global business, you’ll have to assemble a global team right away. Note: Most of them won’t speak your native language.
If you start building your business in your native language (say, French) and, two years down the road, you hire a native Spanish speaker who doesn’t speak a word of French, you’re fucked. (Pardon my French there.)
If you launch your website in French (or any other language that isn’t English) and two years down the road you’re ready to go international, your website will only have built authority in your native language, and none in other languages. Good luck with that.
Long story short: You HAVE to do everything work related in the only universal language there is — English.
For many founders, it’s hard (and certainly not intuitive) to put aside your native language and use a “foreign” language in everything you do.
But it’s the ONLY way to build an international business. If you start building your app, website, support doc and so on in a language that’s not English, you’ll create roadblocks that will very quickly become way too hard to overcome.
2) Localize everything (and watch your words)
Localize, localize, localize. This is especially true for your code and your app interface. Every word, every sentence, tool tip, and button has to be a language variable.
It’s also true with your website.
Sure, it’s time consuming at first, but it will soon become a lifesaver as you start adding more languages to the mix.
It’s a discipline and it’s not always easy to do, but it’s worth it. Otherwise, you get a hodgepodge of languages on your page — which isn’t impressive to anyone. Here’s an example from Hootsuite where French and English words are mixed up on the same page.
The ability to uncorrelate the code from any given language will also allow you to use translation software such as Webtranslateit and let your marketing / product teams manage translations on their own. Your tech guys will save time (and headaches) and your product / marketing teams (or localization team as you grow bigger) will have the flexibility they need.
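To make "every word is a language variable" concrete, here is a minimal sketch of the idea. It is purely illustrative: the file names and keys are invented, and tools like Webtranslateit manage these files for you:

import json

# locales/en.json: {"moderation_rules": "Moderation rules"}
# locales/fr.json: {"moderation_rules": "Règles de modération"}

def load_translations(language_code):
    # One file per language: adding a new language never touches the code
    with open('locales/{}.json'.format(language_code), encoding='utf-8') as translations_file:
        return json.load(translations_file)

def translate(translations, key):
    # Fall back to the key itself so a missing translation never breaks the UI
    return translations.get(key, key)

french = load_translations('fr')
print(translate(french, 'moderation_rules')) # "Règles de modération"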
Localization also plays a role in the UI of your product or website.
As I mentioned earlier, English should be your working language. So when your UI/UX guy works on screens, all the wording should be in English.
I’ve learned the hard way that English has much shorter words or sentences than any other language (well, at least, French, Spanish or Portuguese, the three other languages we use).
For example, “Moderation rules” will become “Règles de modération” in French and, all of a sudden, that button where all the text fits in one nice line in English doesn't look as nice in French.
In English:
In French:
Spanish and Portuguese look very similar to French in terms of length and how it impacts UI.
When you design your product UI or your website in English and expect to localize them in other languages down the road, keep that in mind.
Like with this menu on our app, it looks good in English:
But since “ads comments” is a much lengthier phrase in Spanish, it falls beneath the navigation bar and loses any dropdown menu functionality.
Key takeaway: leave some UI breathing room whenever you can!
3) Hire native speakers
Now that you’ve localized your app and website and you’ve begun creating your content in English, you’re ready to localize EVERYTHING.
In the early days, you’ll likely have a small team with no native speakers.
You’ll be tempted to work with translators to fill in the gaps. When you search on Fiverr or Upwork, it looks easy: there are a LOT of people who claim they can localize / translate your content. There are even companies that specialize in localization jobs.
I’ve tried them all. Trust me when I say localization agencies/companies and freelancers don’t work.
It's a tall task for these outsourced workers to know your jargon, ecosystem, and product. The level of onboarding, proofreading, and micromanaging required for people unfamiliar with your business is overwhelming.
It’s MUCH faster and easier to have your localization capabilities in-house. That’s what we’ve done by hiring native speakers in English (in Ireland and the U.S.), Spanish (in Mexico), and Portuguese (in Brazil).
The awesome thing about having native speakers embedded in your team (as opposed to external service providers) is that they learn your product, ecosystem and jargon along the way. After 4 to 6 months, they know everything they need to know to do their job.
As localization rarely constitutes a full-time job, these team members can also help the company by providing customer support, giving demos to prospects, and helping expand your business in countries that speak their native language.
A win-win to me — because localizing your app and website is not enough to grow.
Growth will only occur if you offer the whole stack in their own language:
website
content
app
support
sales
Just to illustrate, we’ve offered our website and app in Spanish and Portuguese almost from Day One (2013). For 2 years, the MRR from Spanish and Portuguese speaking countries remained painfully low.
But look at what happened after we hired our first full-time Spanish speaking team member:
And what happened after we hired our first Portuguese speaking team member:
You get the point, right?
4) Consider a remote (or semi-remote) organization
Hiring native speakers on your team can be challenging in two ways:
It’s a bigger financial commitment.
It can be hard to find native speakers of your targeted languages if you need them locally in your office.
The solution we’ve found to these two challenges is to hire our native speakers remotely. That’s actually the reason why we’ve become a “semi remote” company. Read more about our story here:
This solution has worked great for us.
First, it’s MUCH easier to find native Spanish speakers in Spanish speaking countries!
Who would’ve thunk it? :)
Same goes for Portuguese and English. If we had to hire them in Paris, our “home”, we’d have a hard time coming up with enough great applicants.
Benefit #1: You have more opportunities of hiring not only native speakers, but great team members!
When you go beyond the confines of your headquarters, you open up the pool of potentially great coworkers. Think of other cities around the world with great talent pools — with a remote or semi-remote approach, the talent in those areas is within your reach.
Benefit #2: You'll find more affordable resources if your remote team members live outside of San Francisco, New York, Paris, London or other major "western" big cities. Opening positions to people living in places where the cost of living is cheaper helps a lot with the cost. That difference gets even bigger if you can hire people in countries where the cost of living is cheaper than yours.
It’s obviously not a key factor, but in the early days, when every penny counts, the ability to spend 50% less on a resource just because her cost of living is 50% less than yours is a great win/win.
5) Deal with the friction more effectively
Everything in a multilingual company is much more complicated, takes more time, and brings more challenges than if only one language is used.
And startups don’t want additional friction that will consume more of its already scarce resources. I get it.
But it’s only true if you do everything in all languages from the start.
The best way to deal with that friction is to always start everything with one language — English — and then test, iterate, and measure for as long as necessary until you get to a state or a process that works well. Then, and only then, do you localize.
For example, our support knowledge base was offered only in English for a loooong time. When it got to a point that we wouldn’t have to change it too much, we localized it.
When we run ads, we always test and iterate on one language and then localize. Sometimes we even just use English on our ads, even if we target worldwide.
In a nutshell, we don’t always localize everything. We try to reduce the friction as much as possible.
Get the right tools
If you’re going to run a remote (or semi-remote) team like I’ve suggested, you’ll need tools to work efficiently. I’ve shared most of the tools we used in this blog post:
Here are a few key ones that will work well for your from-Day-One international business.
Webtranslateit
Webtranslateit is our go-to tool for localizing our app. It makes the process straightforward and you’re guaranteed not to let anything slips through the cracks.
Multisite Language Switcher (WordPress plugin)
We’ve chosen to use a multisite WordPress install for multiple reasons that go beyond the topic of this post. Long story short: Having only one WordPress instance with a language plugin didn’t offer us the flexibility we needed.
The Multisite Language Switcher plugin, however, allows you to switch from one language to the others across your multisite Wordpress setup for any page or blog post. It makes localizing each page and blog post pretty straightforward too.
Support Hero

We love Support Hero for many reasons. Most of them are detailed here:
One of Support Hero's greatest features is that it offers multilingual support and makes it easy to create different versions of your support documentation in different languages and see what has been translated and what has not (and needs to be done).
If your startup is based in the U.S., you can wait longer to go international, but make sure you build the foundation for your future expansion, like localizing your code and website.
Your turn! Do you have a global business? Any tip you’d like to share?
Or are you struggling to go international and would like more details about how we’ve done it? Just ask! | https://medium.com/agorapulse-stories/the-5-things-weve-lived-by-to-create-a-truly-international-business-from-day-one-941add21ba9a | ['Emeric Ernoult'] | 2017-03-22 22:41:21.213000+00:00 | ['Localization', 'SaaS', 'International Development', 'Entrepreneurship', 'Startup'] | Title 5 thing we’ve lived create truly international business Day OneContent you’re building startup LOT decision make early wrong that’s OK learn move come attracting market beyond country’s border make right decision Day One especially true home market United States English speaking country I’ve learned hard way first startup affinitiz affinitiz one first social network Mark Zuckerberg still high school put thing perspective affinitiz first startup targeted French market one first social network back 2001 never really took launched 2001 built Frenchonly business website French app French PR targeting French medium consequence constrained small limited market France never became big enough survive 5 year 400000 registered user France doubt international business Day One would least 10 time would valuable business reached 4 million registered user mark 2006 would sold company made million instead Frenchonly business affinitiz made 150000 annual revenue shut 2009 current business Agorapulse user 170 country 14300 city around world also generates 33M annual recurring revenue profitable 20 percent revenue come France built Agorapulse way built first startup Frenchonly it’d dead right thing make sure startup truly international make break Here’s I’ve learned right Agorapulse 1 Use English working language EVERYTHING English native language deciding working language small commitment think Language everywhere code naming policy website app support content email etc want grow global business you’ll assemble global team right away Note won’t speak native language start building business native language say French two year road hire native Spanish speaker doesn’t speak word French you’re fucked Pardon French launch website French language isn’t English two year road you’re ready go international website built authority native language none language Good luck Long story short everything work related universal language — English many founder it’s hard certainly intuitive put aside native language use “foreign” language everything it’s way build international business start building app website support doc language that’s English you’ll create roadblock quickly become way hard overcome 2 Localize everything watch word Localize localize localize especially true code app interface Every word every sentence tool tip button language variable It’s also true website Sure it’s time consuming first soon become lifesaver start adding language mix It’s discipline it’s always easy it’s worth Otherwise get hodgepodge language page — isn’t impressive anyone Here’s example Hootsuite French English word mixed page ability uncorrelate code given language also allow use translation software Webtranslateit let marketing product team manage translation tech guy save time headache product marketing team localization team grow bigger flexibility need Localization also play role UI product website mentioned earlier English working language UIUX guy work screen wording English I’ve learned hard way English much shorter word sentence language well least French Spanish Portuguese three language use example “Moderation rules” become “Règles de 
modération” French sudden button text fit one nice line English look doesn’t look nice French English French Spanish Portuguese look similar French term length impact UI design product UI website English expect localize language road keep mind Like menu app look good English since “ads comments” much lengthier phrase Spanish fall beneath navigation bar loses dropdown menu functionality Key takeaway leave UI breathing room whenever 3 Hire native speaker you’ve localized app website you’ve begun creating content English you’re ready localize EVERYTHING early day you’ll likely small team native speaker You’ll tempted work translator fill gap search Fiverr Upwork look easy LOT people claim localize translate content even company specialize localization job I’ve tried Trust say localization agenciescompanies freelancer don’t work It’s tall task outsourced worker know jargon ecosystem product level onboarding proofreading micromanaging people unfamiliar business overwhelming It’s MUCH faster easier localization capability inhouse That’s we’ve done hiring native speaker English Ireland US Spanish Mexico Portuguese Brazil awesome thing native speaker embedded team opposed external service provider learn product ecosystem jargon along way 4 6 month know everything need know job localization need rarely constitutes fulltime job team member also help company providing customer support giving demo prospect helping expand business country speak native language winwin — localizing app website enough grow Growth occur offer whole stack language website content app support sale illustrate we’ve offered website app Spanish Portuguese almost Day One 2013 2 year MRR Spanish Portuguese speaking country remained painfully low look happened hired first fulltime Spanish speaking team member happened hired first Portuguese speaking team member get point right 4 Consider remote semiremote organization Hiring native speaker team challenging two way It’s bigger financial commitment hard find native speaker targeted language need locally office solution we’ve found two challenge hire native speaker remotely That’s actually reason we’ve become “semi remote” company Read story solution worked great u First it’s MUCH easier find native Spanish speaker Spanish speaking country would’ve thunk go Portuguese English hire Paris “home” we’d hard time coming enough great applicant Benefit 1 opportunity hiring native speaker great team member go beyond confines headquarters open pool potentially great coworkers Think city around world great talent pool — remote semiremote approach talent area within reach Benefit 2 You’ll find affordable resource remote team member live outside San Francisco New York Paris London major “western” big city Opening position held people living place cost living cheaper help lot cost difference get even bigger hire people country cost living cheaper It’s obviously key factor early day every penny count ability spend 50 le resource cost living 50 le great winwin 5 Deal friction effectively Everything multilingual company much complicated take time brings challenge one language used startup don’t want additional friction consume already scarce resource get it’s true everything language start best way deal friction always start everything one language — English — test iterate measure long necessary get state process work well localize example support knowledge base offered English loooong time got point wouldn’t change much localized run ad always test iterate one language localize Sometimes even use 
English ad even target worldwide nutshell don’t always localize everything try reduce friction much possible Get right tool you’re going run remote semiremote team like I’ve suggested you’ll need tool work efficiently I’ve shared tool used blog post key one work well fromDayOne international business Webtranslateit Webtranslateit goto tool localizing app make process straightforward you’re guaranteed let anything slip crack Multisite Language Switcher WordPress plugin We’ve chosen use multisite WordPress install multiple reason go beyond topic post Long story short one WordPress instance language plugin didn’t offer u flexibility needed Multisite Language Switcher plugin however allows switch one language others across multisite Wordpress setup page blog post make localizing page blog post pretty straightforward love Support Hero many reason detailed One Support Hero’s greatest feature offer multilingual support make easy create different version support documentation different language see translated need done startup based US wait longer go international make sure build foundation future expansion like localizing code website turn global business tip you’d like share struggling go international would like detail we’ve done askTags Localization SaaS International Development Entrepreneurship Startup |
2,060 | 9 Traits You Should Slowly Remove From Your Day-to-Day Life | 9 Traits You Should Slowly Remove From Your Day-to-Day Life
#1 Overthinking the little things
Photo by Ivana Cajina on Unsplash
We’re all human.
I know, so profound.
This isn’t the most enlightening piece of knowledge I’ve offered, but hear me out. In the hustle and bustle of our lives, it’s easy to lose perspective.
We are all, in fact, human beings. We are biological machines that take in oxygen to fuel our cells and expel carbon dioxide.
In fact, I had a real human-chemistry experience the other day.
I was wrapping up my emails for the day at work when one of my workmates rushed in. Their left eye was red and they struggled to keep it open. They anxiously asked if I "knew chemistry."
Now imagine what kind of thoughts were computing in my head.
What about chemistry? What do you need me to do to your eye? I’m so confused and not qualified to do whatever it is you’re about to ask of me.
Lucky for me, my workmate’s contact lens was simply stuck, and they needed me to work with their chemistry student for a few minutes.
I felt instant relief. Oh so you’re working with a chemistry student, I thought. I can totally do that. I felt much better after absolutely overthinking the situation. Why do I stress so much?
Overthinking isn’t my only mentally draining character trait. There’s a long list of other thoughts and feelings I know I need to reduce in my life.
These traits wear us out and put us down, and for no good reason. Life presents us with new problems everyday. There’s a positive way to go about dealing with each one. | https://medium.com/illumination/9-traits-you-should-slowly-remove-from-your-day-to-day-life-8408bd9038f7 | ['Ryan Porter'] | 2020-12-16 23:50:46.893000+00:00 | ['Life Lessons', 'Productivity', 'Motivation', 'Self Improvement', 'Ideas'] | Title 9 Traits Slowly Remove DaytoDay LifeContent 9 Traits Slowly Remove DaytoDay Life 1 Overthinking little thing Photo Ivana Cajina Unsplash We’re human know profound isn’t enlightening piece knowledge I’ve offered hear hustle bustle life it’s easy lose perspective fact human being biological machine take oxygen fuel cell exert carbon dioxide fact real humanchemistry experience day wrapping email day work one workmate rushed left eye red struggled keep open anxiously asked “I knew chemistry” imagine kind thought computing head chemistry need eye I’m confused qualified whatever you’re ask Lucky workmate’s contact lens simply stuck needed work chemistry student minute felt instant relief Oh you’re working chemistry student thought totally felt much better absolutely overthinking situation stress much Overthinking isn’t mentally draining character trait There’s long list thought feeling know need reduce life trait wear u put u good reason Life present u new problem everyday There’s positive way go dealing oneTags Life Lessons Productivity Motivation Self Improvement Ideas |
2,061 | Deep Learning for Developers | Photo by Jason Leung on Unsplash
So you have been working as a Software Engineer for many years, you know different frameworks/languages/libraries, you do know the best practices and use them.
But then in the background you can hear some buzz going on around data science, artificial intelligence, machine learning, deep learning and your inner evil starts tickling the impostor syndrome that makes you feel behind on this topic.
In this blog post, I will try to ensure you understand what Deep Learning is and the things you should know about it from a developer's point of view. I.e. we will try to avoid going deep into maths.
Let’s go!
Let’s start with a business requirement: we are going to create an API, which can recognise if there is a flower in the image (see the picture of this blog post, where a robot is looking at a lego flower).
What do we need to do to implement it using Deep Learning?
How to represent an image as a matrix?
So imagine a square image, which is 64px x 64px. Every pixel has an RGB (red/green/blue) value, where (0, 0, 0) would stand for black and (255, 255, 255) for white.
So if you wanted to represent an image as a matrix — it’s simply a three dimensional matrix, where dimensions are 64 x 64 x 3.
This should be a cold shower for many devs, who, just like me, hate adding matrices and math stuff into code. But Deep Learning requires that.
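To see that in code, here is a quick sketch using NumPy (which is introduced properly below):

import numpy as np

image = np.zeros((64, 64, 3)) # a 64px x 64px all-black image
image[0][0] = [255, 255, 255] # set the top-left pixel to white
print(image.shape) # (64, 64, 3)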
Matrices used: images (data) and labels
We will have two kinds of sets of images, one set for training and one for testing (to see the accuracy of the trained model).
So let’s say if we have a data set of 100 images, we may put it into a matrix and have a four dimensional one: 100 x 64 x 64 x 3 (64px x 64px, 3 — RGB).
Then, each image should have a label, which says 0 (false) or 1 (true) to indicate if you, as a human, see a flower in that image. This is the model pre-training, where you need to give some examples to the software so it knows what a flower is.
What is the Logistic (a.k.a. Sigmoid) Function?
So one of the cryptic terms you'll hear when looking into Deep Learning is the Sigmoid function, or Logistic Function. It uses Euler's number and gives values between 0 and 1.
In deep learning, the algorithms return values between 0 and 1 to give the probability “how likely there is a flower”. Then we round that value (e.g. 0.7 becomes 1) to a binary one.
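For reference, the formula is sigmoid(z) = 1 / (1 + e^(-z)): sigmoid(0) is exactly 0.5, large positive values of z push the output towards 1, and large negative values push it towards 0.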
There is no need to go into the internals of Sigmoid, since it's very easy to define in code or use as an abstraction.
What is Jupyter?
Jupyter is like an IDE for Data Scientists. If JupyterHub is used, then it’s also a versioning system.
Jupyter is a web-based tool where you can create "notebooks" (*.ipynb extension), which consist of Python code and comments/plots/images/tables/etc.
Basically, you read a Jupyter notebook like an article and run the lines of code block by block:
What is NumPy and TensorFlow?
NumPy is a Python library which abstracts many scientific computations. For Deep Learning, we are mostly interested in operations with matrices (multiplication, transposing, shape shifting). If we had to do it in plain Python, it would neither be efficient hardware-wise, nor would you enjoy writing that code.
TensorFlow is an ecosystem of libraries for Machine Learning for different languages (Python, Java, JavaScript, etc.). Deep Learning is a subset of Machine Learning, therefore we are going to use it. A great thing about TensorFlow is that it has a lot of predefined data sets and trained models, so you may use them instead of having to train your own.
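For instance, loading one of those predefined data sets takes a single call. This is shown only for illustration; the example we will build below sticks to plain NumPy:

import tensorflow as tf

# MNIST ships with Keras: 60,000 training and 10,000 testing images of handwritten digits
(mnist_train_images, mnist_train_labels), (mnist_test_images, mnist_test_labels) = tf.keras.datasets.mnist.load_data()
print(mnist_train_images.shape) # (60000, 28, 28)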
Before looking into code examples:
If you want to try running python notebooks, you may use Google’s Colab to have an environment setup quickly
Training data set — it's a set of images to train your model, i.e. let's say you have 1000 images and for each of them you assign a binary value indicating whether there is a flower in it or not
Testing data set — it's a set of images different from the training data set, again with binary values assigned. This data set is used like a fitness function in software engineering, to tell how accurate your model is. Similarly to the human mind: if you learn maths by doing practice tests, your result during the exam will likely be better if you get a test identical to one you already did before rather than a completely new one.
X and Y in data sets: x corresponds to a matrix of images, whereas y represents binary (1 or 0) values that correspond to "yes" or "no"
Some code:
Finally, some code that you may test out in Jupyter. For the purpose of explaining it with a real-world example, I will avoid naming things x, y, z and similar notations, just so you understand what is what.
Let's start with the simplest: we will use NumPy.
import numpy as np
Next, let's introduce data. For the simplicity of this example, all pixels of all training and testing images will be zero, but in the real world you'd need to import images of the same size (e.g. 64px x 64px), where each pixel has 3 values (red/green/blue, i.e. RGB values), and convert them to matrices:
# Constants
training_images = np.zeros((10, 64, 64, 3)) # 10 images, 64px x 64px, 3 — RGB
testing_images = np.zeros((2, 64, 64, 3)) # 2 images, 64px x 64px, 3 — RGB

training_images_labels = np.zeros((1, 10)) # labels for the 10 training images
testing_images_labels = np.zeros((1, 2)) # labels for the 2 testing images
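Just to illustrate what that real-world import could look like, here is a minimal sketch. It assumes the Pillow library is installed and a hypothetical flower_1.jpg file exists:

from PIL import Image

def image_to_matrix(file_path):
    # Force RGB and a fixed 64px x 64px size so every image ends up with the same shape
    image = Image.open(file_path).convert('RGB').resize((64, 64))
    return np.asarray(image) # shape: (64, 64, 3)

# training_images[0] = image_to_matrix('flower_1.jpg')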
Since Logistic Regression doesn't accept 4-dimensional matrices by default, we need to come back to a humanly understandable, two-dimensional model, i.e. flatten the data into a two-dimensional matrix (or a table), where the number of columns is the number of images and each row holds one of the stacked pixel values:
def flatten_images(images_matrix):
    return images_matrix.reshape(images_matrix.shape[0], -1).T

flattened_training_images = flatten_images(training_images)
flattened_testing_images = flatten_images(testing_images)
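A quick sanity check on the shapes: the 10 training images of 64 x 64 x 3 become a single 12288 x 10 matrix after flattening, i.e. one column per image and one row per pixel value (64 * 64 * 3 = 12288).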
Let’s define the sigmoid function (yes, TensorFlow has an abstraction for it, but just for the sake of understanding it):
def get_sigmoid(z):
    return 1.0 / (1 + np.exp(-z))
The activation value (again, this is a calculus thing related to logistic regression), which takes the flattened data (images), weights and bias (more on those later):
def get_activation_value(flattened_data, weights, bias):
    return get_sigmoid(np.dot(weights.T, flattened_data) + bias)
Calculating the weights (one per pixel value, something close to a feature importance) and the bias:
def get_weights_and_bias(flattened_training_data, training_data_labels):
    values_per_data_entry = flattened_training_data.T[0].shape[0] # How many pixels an image has
    amount_of_training_data = flattened_training_data.shape[1] # How many images we have

    # Column vector (one weight per pixel value), so the matrix shapes below stay consistent
    weights = np.zeros((values_per_data_entry, 1))
    bias = 0
    iterations = 1000 # You can set almost any value and optimise it
    learning_rate = 0.5

    for _ in range(iterations):
        activation_values = get_activation_value(flattened_training_data, weights, bias)

        # Gradient of the cost with respect to the weights
        weights_derivative = np.divide(np.dot(flattened_training_data, np.subtract(activation_values, training_data_labels).T), amount_of_training_data)
        weights = weights - learning_rate * weights_derivative

        # Gradient of the cost with respect to the bias
        bias_derivative = np.divide(np.sum(np.subtract(activation_values, training_data_labels)), amount_of_training_data)
        bias = bias - learning_rate * bias_derivative

    return weights, bias
And here's the final function, where you actually train the model and put everything together:
def train_model(flattened_training_data, training_data_labels, flattened_testing_data, testing_data_labels):
    # Getting weights and bias
    weights, bias = get_weights_and_bias(flattened_training_data, training_data_labels)

    # Calculating predictions for each entry in the data set
    training_data_predictions = get_activation_value(flattened_training_data, weights, bias)
    testing_data_predictions = get_activation_value(flattened_testing_data, weights, bias)

    # We only care about binary predictions, i.e. "it is a flower" or "it is not", so rounding
    training_data_predictions = np.around(training_data_predictions, 0)
    testing_data_predictions = np.around(testing_data_predictions, 0)

    # That's it! Just for the sake of testing you may now check the accuracy of your model:
    accuracy_of_this_model = 100 - np.mean(np.abs(testing_data_predictions - testing_data_labels)) * 100
    print('{}%'.format(accuracy_of_this_model))
To run the model, you’d need to insert the values defined earlier:
train_model(flattened_training_images, training_images_labels, flattened_testing_images, testing_images_labels)
Running it would take a few seconds and then print 100%, because all values in our data matrices were zeros and not actual RGB colour values. I.e. this model was doomed to succeed.
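To loop back to the original business requirement (answering "is there a flower in this image?"), here is a minimal sketch of predicting on a single new image. It is illustrative only: is_flower is a name I made up, and since train_model doesn't return the weights and bias, we call get_weights_and_bias directly:

def is_flower(image_matrix, weights, bias):
    # Flatten one 64px x 64px x 3 image into a single column and run it through the model
    flattened_image = image_matrix.reshape(1, -1).T
    probability = get_activation_value(flattened_image, weights, bias)
    return bool(np.around(probability, 0))

weights, bias = get_weights_and_bias(flattened_training_images, training_images_labels)
print(is_flower(np.zeros((64, 64, 3)), weights, bias))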
What wasn’t covered in this blog post
Many things! A classical explanation would cover what Neural Networks are, why they look and act similarly to the human brain, and would go deeper into calculus, so that you could write Deep Learning software without using TensorFlow. You could even do it without NumPy, but it wouldn't be as efficient because of the heavy operations with matrices.
If you did enjoy this brief and simplified intro to Deep Learning and want to know more, I do recommend digging deeper into it at https://www.deeplearning.ai/ — they have a series of different courses that cover all you need to know to start applying it at your work.
Summary
It might be extra difficult to truly understand how deep learning works, but you don't necessarily need to know all of that just to get started. We could see that flower recognition in images might be relatively easy.
In reality, if we had such task coming from business, we’d likely use Google Cloud Vision API (which we extensively use in Zedge for wallpapers) or some other service to do the job. But don’t forget, that Deep Learning could be applied to more things than just images. | https://medium.com/zedge/deep-learning-for-developers-366a02691459 | ['Tomas Petras Rupšys'] | 2020-12-18 07:26:53.949000+00:00 | ['Machine Learning', 'Software Engineering', 'Deep Learning', 'Artificial Intelligence'] | Title Deep Learning DevelopersContent Photo Jason Leung Unsplash working Software Engineer many year know different frameworkslanguageslibraries know best practice use try ensure understand Deep Learning thing know Developer’s point view background hear buzz going around data science artificial intelligence machine learning deep learning inner evil start tickling impostor syndrome make feel behind topic blog post try ensure understand Deep Learning thing know Developer’s point view Ie try avoid going deep math Let’s go Let’s start business requirement going create API recognise flower image see picture blog post robot looking lego flower need implement using Deep Learning represent image matrix imagine square image 64 px x 64px Every pixel RGB redgreenvalue value 0 0 0 would stand black 255 255 255 white wanted represent image matrix — it’s simply three dimensional matrix dimension 64 x 64 x 3 cold shower many devs like hate adding matrix math stuff code Deep Learning requires Matrices used image data label two kind set image one set training one testing see accuracy trained model let’s say data set 100 image may put matrix four dimensional one 100 x 64 x 64 x 3 64px x 64px 3 — RGB image label say 0 false 1 true indicate human see flower image model pretraining need give example software know flower Logistic aka Sigmoid Function one cryptic term you’ll hear looking Deep Learning Sigmoid function Logistic Function us Euler’s number give value 0 1 deep learning algorithm return value 0 1 give probability “how likely flower” round value eg 07 becomes 1 binary one need go internals Sigmoid since it’s easy define code use abstraction Jupyter Jupyter like IDE Data Scientists JupyterHub used it’s also versioning system Jupyter webbased tool create “notebooks” ipynb extension consist python code commentsplotsimagestablesetc Basically read Jupyter notebook article run line code block block NumPy TensorFlow NumPy Python library abstract many scientific computation Deep Learning mostly interested operation matrix multiplication transposing shape shifting plain python — neither would efficient hardwarewise you’d enjoy writing code TensorFlow ecosystem library Machine Learning different language Python Java JavaScript etc Deep Learning subset Machine Learning therefore going use great thing TensorFlow lot predefined data set trained model already may use instead train looking code example want try running python notebook may use Google’s Colab environment setup quickly Training data set — it’s set image train model ie let’s say 1000 image assign binary value wether flower Testing data set — it’s set different image training data set binary value assigned data set used like fitness function software engineering tell accurate model Similarly human mind learn math math test result likely better exam get identical test already rather completely new one X data set x corresponds matrix image whereas represent binary 1 0 value corresponds “yes” “no” code Finally code may test Jupyter purpose explaining given real world example 
avoid naming thing x z similar notation understand Let’s start simplest use NumPy import numpy np Next let’s introduce data simplicity fo example pixel training testing image zero real world you’d need import image size eg 64px x 64px pixel 3 value redgreenblue ie RGB value convert matrix Constants trainingimages npzeros10 64 64 3 10 image 64px x 64px 3 — RGB testingimages npzeros2 64 64 3 2 image 64px x 64px 3 — RGB trainingimageslabels npzeros1 10 label 10 training image testingimageslabels npzeros1 2 label 2 training image Since Logistic Regression doesn’t default accept 4 dimensional matrix need come back humanly understandable two dimensional model ie flatten data two dimensional matrix table amount column mean amount image row — pixel stacked def flattenimagesimagesmatrix return imagesmatrixreshapeimagesmatrixshape0 1T flattenedtrainingimages flattenimagestrainingimages flattenedtestingimages flattenimagestestingimages Let’s define sigmoid function yes TensorFlow abstraction sake understanding def getsigmoidz return 10 1 npexpz activation value calculus thing related logistic regression accepted flattened data image weight bias — later def getactivationvalueflatteneddata weight bias return getsigmoidnpdotweightsT flatteneddata bias Calculating weight something close probability near image bias def getweightsandbiasflattenedtrainingdata trainingdatalabels valuesperdataentry flattenedtrainingdataT0shape0 many pixel image amountoftrainingdata flattenedtrainingdatashape1 many image weight npzerosvaluesperdataentry bias 0 iteration 1000 set almost value optimise learningrate 05 index rangeiterations activationvalues getactivationvalueflattenedtrainingdata weight bias weightsderivative npdividenpdotflattenedtrainingdata npsubtractactivationvalues trainingdatalabelsT amountoftrainingdata weight weight — learningrate weightsderivative biasderivative npdividenpsumnpsubtractactivationvalues trainingdatalabels amountoftrainingdata bias bias — learningrate biasderivative return weight bias here’s final place actually train model put everything one place def trainmodelflattenedtrainingdata trainingdatalabels flattenedtestingdata testingdatalabels Gettings weight bias weight bias getweightsandbiasflattenedtrainingdata trainingdatalabels Calculating prediction entry data set trainingdatapredictions getactivationvalueflattenedtrainingdata weight bias testingdatapredictions getactivationvalueflattenedtestingdata weight bias care binary prediction ie “it flower” “it not” rounding trainingdatapredictions nparoundtrainingdatapredictions 0 testingdatapredictions nparoundtestingdatapredictions 0 That’s sake testing may check accuracy model accuracyofthismodel 100 — npmeannpabstestingdatapredictions — testingdatalabels 100 print‘’formataccuracyofthismodel run model you’d need insert value defined earlier trainmodelflattenedtrainingimages trainingimageslabels flattenedtestingimages testingimageslabels Running would take second print 100 value data matrix zero actual RGB colour value Ie model doomed succeed wasn’t covered blog post Many thing classical explanation would see Neural Networks explanation lookact similarly human brain went deeper calculus could write Deep Learning software without using TensorFlow could even without NumPy wouldn’t efficient heavy operation matrix enjoy brief simplified intro Deep Learning want know recommend digging deeper httpswwwdeeplearningai — series different course cover need know start applying work Summary might extra difficult truly understand deep learning work don’t 
necessarily need know get started could see flower recognition image might relatively easy reality task coming business we’d likely use Google Cloud Vision API extensively use Zedge wallpaper service job don’t forget Deep Learning could applied thing imagesTags Machine Learning Software Engineering Deep Learning Artificial Intelligence |
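To make the article's closing suggestion concrete: here is a minimal sketch of delegating the same flower check to Google Cloud Vision instead of training a model by hand. It assumes the google-cloud-vision Python client is installed and credentials are configured; the file name and the 0.7 score threshold are my own hypothetical choices, not from the article.

# pip install google-cloud-vision
# Assumes GOOGLE_APPLICATION_CREDENTIALS points at a valid service account key.
from google.cloud import vision

def looks_like_flower(image_path: str, min_score: float = 0.7) -> bool:
    """Return True if Cloud Vision labels the image as a flower."""
    client = vision.ImageAnnotatorClient()
    with open(image_path, "rb") as f:
        image = vision.Image(content=f.read())
    response = client.label_detection(image=image)
    # Each label carries a description ("Flower", "Plant", ...) and a confidence score.
    return any(
        label.description.lower() == "flower" and label.score >= min_score
        for label in response.label_annotations
    )

if __name__ == "__main__":
    print(looks_like_flower("lego_flower.jpg"))  # hypothetical sample image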
2,062 | The Best Things I Discovered in 2020 | What a year. Some got rich. Some discovered what ignorance can do. Some learned harsh lessons and retreated to the golf course to lick their wounds and find their ego again.
2020 was a tough year. I’ve never worked in a business environment quite like it, where nobody wants to spend any money.
It wasn’t all bad. 2020 was a year that taught us resilience and love. We survived together in isolation via Zoom calls. Here are the best things I discovered in 2020.
20. A book called “Your Music and People”
Derek Sivers is Tony Robbins for weird people. I like weird. Derek’s books are simple to read and the wisdom is powerful. You can read all of his books in a few hours. He ruthlessly edits tangents to leave you with pure gold.
19. The Last Time I Had Sex With My Wife
Greyson Ferguson wrote a story with this title. I read every word and felt all of his pain. Writing that moves me emotionally is a rare find.
This story is inspiration for anyone who wants to write with emotion and make people feel something.
18. Local walks
I spent most of the year not being allowed to go beyond 5 km from my home. Melbourne had one of the harshest lockdowns anywhere in the world. I basically couldn’t do anything. So I had to get used to finding things to do with my girlfriend. We took a walk around our neighborhood every day.
My neighborhood looks and feels like a new suburb. Sometimes we ignore what’s right in front of our eyes. We get sold lies by travel agents that we need to be in Hawaii to be happy. 2020 showed us travel won’t make you happy.
17. Bose noise-canceling headphones
When you live next to a train line in a student apartment, things get noisy. Bose headphones pump white noise into your ear so your brain can concentrate. I used these headphones to crank up movie soundtracks and write lots of content online.
16. My Octopus
The movie, My Octopus Teacher, was so powerful. You go into the film thinking it’s going to be a documentary. Then you get taken down the rabbit hole of how a single octopus lives. It’s hard to believe the relationship between a man and an octopus was captured on camera.
After watching this film you will question everything you know. You will learn to notice the small things and get lost in your curiosity.
15. Whole food plant-based eating
Cutting out meat, seafood, dairy, oil and sugar has lifted my energy levels. Energy is life. It allows me to perform at my best and focus on writing for hours on end. The closure of restaurants helped me stay disciplined. Now I don’t want to go back to fried food life.
If you want more energy, do what my 104-year-old grandma used to say: eat plants.
14. Giving up SMS
Not sure why this communication channel exists. Who trades phone numbers anymore? SMS is a brain drain for me. Trying to write on a tiny phone keyboard is my definition of hell. Audio messages, video messages and messenger apps on a desktop/laptop work better.
The best mode for your phone is aeroplane mode. It helps you think.
13. “Earth Deluxe” for reminders of beauty
The Instagram account “Earth Deluxe” is just what I needed when I couldn’t leave my home for most of the year. You can travel with your mind, rather than on a plane, with these gorgeous images.
When you feel like you have nothing, you always have a sunset.
12. Creative communities
I’ve always tried to do everything creative, alone. Creative loneliness is a bad idea. I learned in 2020 that creative communities are incredibly powerful.
Many of my new virtual friends this year have come from a couple of writing communities. I made it a habit to do video calls with people from the community every week. It helped me feel connected to this crazy, shutdown world — where everything you try to do is canceled.
What if the answer to “what do I do next” is found in a creative community of people just like you, trying to achieve the same goals as you?
11. The Atlantic
Their long-form essays are the bomb. They taught me what real writing is, although their extremely long paragraphs do my head in and make it hard to follow the words along the page.
10. Twitter Threads
Nicolas Cole got me onto these. Twitter threads are a better way to use twitter. They turn twitter into a blogging platform. Twitter threads force you to be concise and cut out all the extra words and sentences readers don’t need. On twitter, you can say whatever you want. I found that liberating in 2020.
9. Family
When the world turns into an apocalypse you miss your parents. They remind you of where you came from. I was separated from my family for most of 2020 due to lockdown and covid restrictions.
This made me appreciate family even more. Phone calls became more important. Thankfully they are all okay.
Family acts as a reset when chaos temporarily takes over the world.
8. iPhone 12
Okay, calm down. I got a new iPhone and fell in love with photography again. Most cameras on phones suck.
Try taking a picture at night with your phone and you’ll see what I mean. The iPhone 12’s camera is unbelievable and makes the upgrade worth it. The lightning-fast 5G network opens new possibilities for apps, too.
7. Free email courses
I’d never heard of this concept. A free email course helped me engage with readers this year. I realized how much people appreciate when you go deep on a subject and don’t force them to pay money for it.
6. Teaching
2020 was the year I launched an online course. I’ve wanted to do it for years. I’d tried before and failed lots of times. The best part was unexpected.
Watching all the students flock into the private online community was a deeply emotional moment.
Within a few days the community was buzzing with activity and people were taking everything I’d learned as a writer and applying it. To see what you’ve learned be reused in real-time is a ridiculously cool feeling.
Teach others what you know to feel fulfilled.
5. Loom
This handy tool allows you to record your screen and send links to the videos you capture. You can use this tool to help you create your own online course.
4. Proper Finance Gurus
The world of money completely changed forever in 2020. This was the year I took the time to understand finance at an even deeper level. These financial gurus taught me a lot:
#1 by a mile: Raoul Pal
Raoul Pal
Ray Dalio
Alex Saunders
Ivan Liljeqvist
Paul Tudor Jones
Anthony Pompliano
Daniela Cambone, Stansberry Research
Michael Saylor, Microstrategy
3. Todd Brison
You can’t have him. He’s all mine (Okay, I’ll share him with you.)
Todd writes the best emails I have ever seen. Those on my email list get to read them. Every time Todd drops one people go crazy and my inbox lights up. People love personality fused with helpful content. Todd is the hipster yoda of writing. Plus, we taught a writing course together.
2. Blockchain investing
People said I was stupid for investing in Ethereum and Bitcoin.
My original investment has gone up 17,900%. Bitcoin is the best performing asset of the last decade and was up 170% in 2020.
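As an aside for anyone who wants to sanity-check a number like that, the arithmetic is simple; this tiny sketch uses a made-up starting amount, not my actual investment.

def apply_gain(initial: float, gain_percent: float) -> float:
    # Value after a percentage gain: final = initial * (1 + gain / 100)
    return initial * (1 + gain_percent / 100)

print(apply_gain(1000, 17900))  # 180000.0 -> a 17,900% gain is a 180x multiple
print(apply_gain(1000, 170))  # 2700.0 -> up 170% in 2020 means 2.7x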
It pays to ignore the critics and do your own research. You can make enough money to retire early and never work a normal job again if you get yourself a basic financial education.
1. Humanity
The secret to 2020 was forcing myself to see the positive. Watching humanity endure one of the toughest times in history made me emotional. I spent a lot of time looking for how people stayed positive.
The fitness instructors, musicians, and everyday people in Europe using their balconies to spread hope, love, positivity and support were incredible. I’ve never seen anything like it. While a virus stormed the world and killed a lot of people, everyday folks found it in their hearts to help complete strangers.
Thinking about the beauty of humanity in 2020 is enough to bring a grown man like me to tears.
2020 showed us what we’re capable of. 2021 and beyond will show us our ability to recover and make a tremendous comeback. | https://medium.com/the-ascent/the-best-things-i-discovered-in-2020-5307cabeb22e | ['Tim Denning'] | 2020-12-16 21:03:23.617000+00:00 | ['Books', 'Self Improvement', 'Life', 'Money', 'Writing'] | Title Best Things Discovered 2020Content year got rich discovered ignorance learned harsh lesson retreated golf course lick wound find ego 2020 tough year I’ve never worked business environment quite like nobody want spend money wasn’t bad 2020 year taught u resilience love survived together isolation via Zoom call best thing discovered 2020 20 book called “Your Music People” Derek Sivers Tony Robbins weird people like weird Derek’s book simple read wisdom powerful read book hour ruthlessly edits tangent leave pure gold 19 Last Time Sex Wife Greyson Ferguson wrote story title read every word felt pain Writing move emotionally rare find story inspiration anyone want write emotion make people feel something 18 Local walk spent year allowed go beyond 5 km home Melbourne one harshest lockdown anywhere world basically couldn’t anything get used finding thing girlfriend took walk around neighborhood every day neighborhood look feel like new suburb Sometimes ignore what’s right front eye get sold lie travel agent need Hawaii happy 2020 showed u travel won’t make happy 17 Bose noisecanceling headphone live next train line student apartment thing get noisy Bose headphone pump white noise ear brain concentrate used headphone crank movie soundtrack write lot content online 16 Octopus movie Octopus Teacher powerful go film thinking it’s going documentary get taken rabbit hole single octopus life It’s hard believe relationship man octopus captured camera watching film question everything know learn notice small thing get lost curiosity 15 Whole food plantbased eating Cutting meat seafood dairy oil sugar lifted energy level Energy life allows perform best focus writing hour end closure restaurant helped stay disciplined don’t want go back fried food life want energy 104 year old grandma used say eat plant 14 Giving SMS sure communication channel exists trade phone number anymore SMS brain drain Trying write tiny phone keyboard definition hell Audio message video message messenger apps desktoplaptop work better best mode phone aeroplane mode help think 13 “Earth Deluxe” reminder beauty instagram account “Earth Deluxe” needed couldn’t leave home year travel mind rather plane gorgeous image feel like nothing always sunset 12 Creative community I’ve always tried everything creative alone Creative loneliness bad idea learned 2020 creative community incredibly powerful Many new virtual friend year come couple writing community made habit video call people community every week helped feel connected crazy shutdown world — everything try canceled answer “what next” found creative community people like trying achieve goal 11 Atlantic longform essay bomb taught real writing although extremely long paragraph head make hard follow word along page 10 Twitter Threads Nicolas Cole got onto Twitter thread better way use twitter turn twitter blogging platform Twitter thread force concise cut extra word sentence reader don’t need twitter say whatever want found liberating 2020 9 Family world turn apocalypse miss parent remind came separated family 2020 due lockdown covid restriction made appreciate family even Phone call became important Thankfully okay Family act reset chaos temporarily take world 8 iPhone 
12 Okay calm got new iphone fell love photography camera phone suck Try taking picture night phone you’ll see mean iPhone 12’s camera unbelievable make upgrade worth lightningfast 5G network open new possibility apps 7 Free email course I’d never heard concept free email course helped engage reader year realized much people appreciate go deep subject don’t force pay money 6 Teaching 2020 year launched online course I’ve wanted year I’d tried failed lot time best part unexpected Watching student flock private online community deeply emotional moment Within day community buzzing activity people taking everything I’d learned writer applying see you’ve learned reused realtime ridiculously cool feeling Teach others know feel fulfilled 5 Loom handy tool allows record screen send link video capture use tool help create online course 4 Proper Finance Gurus world money completely changed forever 2020 year took time understand finance even deeper level financial guru taught lot 1 mile Raoul Pal Raoul Pal Ray Dalio Alex Saunders Ivan Liljeqvist Paul Tudor Jones Anthony Pompliano Daniela Cambone Stansberry Research Michael Saylor Microstrategy 3 Todd Brison can’t He’s mine Okay I’ll share Todd writes best email ever seen email list get read Every time Todd drop one people go crazy inbox light People love personality fused helpful content Todd hipster yoda writing Plus taught writing course together 2 Blockchain investing People said stupid investing Ethereum Bitcoin original investment gone 17900 Bitcoin best performing asset last decade 170 2020 pay ignore critic research make enough money retire early never work normal job get basic financial education 1 Humanity secret 2020 forcing see positive Watching humanity endure one toughest time history made emotional spent lot time looking people stayed positive fitness instructor musician everyday people Europe using balcony spread hope love positivity support incredible I’ve never seen anything like virus stormed world killed lot people everyday folk found heart help complete stranger Thinking beauty humanity 2020 enough bring grown man like tear 2020 showed u we’re capable 2021 beyond show u ability recover make tremendous comebackTags Books Self Improvement Life Money Writing |
2,063 | I wear this smile like a mask | I wear this smile like a mask
Poetry in free verse
No, I am not lying,
I am only showing you
what you want to see,
because it’s easier to pretend
than to explain
why I am not alright.
I am afraid
that if I try,
you will argue,
tell me how
I have everything I need
that there is something wrong with me
if I am still sad.
But how do I explain
that the opposite of happy isn’t always sad,
and there’s a difference between
not knowing where I want to be
and not wanting to be where I am right now?
I am an expert at pretending
to be happy when I am not.
I think we all are.
That we bury our sadness
beneath those layers of fake smiles
and laughter that fails to hide
the shadows around our eyes.
This is our blessing,
this is our curse,
and we would rather pretend
than explain to you
why we are not alright. | https://medium.com/resistance-poetry/i-wear-this-smile-like-a-mask-6d030fd5603f | ['Anangsha Alammyan'] | 2020-07-23 19:41:40.508000+00:00 | ['Self-awareness', 'Mental Health', 'Depression', 'Poetry', 'Resistance Poetry'] | Title wear smile like maskContent wear smile like mask Poetry free verse lying showing want see it’s easier pretend explain alright afraid try argue tell everything need something wrong still sad explain opposite happy isn’t always sad there’s difference knowing want wanting right expert pretending happy think bury sadness beneath layer fake smile laughter fails hide shadow around eye blessing curse would rather pretend explain alrightTags Selfawareness Mental Health Depression Poetry Resistance Poetry |
2,064 | Why You Should Trade Split Decisions for “Flip Decisions” | Why You Should Trade Split Decisions for “Flip Decisions”
Use Flipism to make in-the-moment choices.
Photo by Pocky Lee on Unsplash
In their book The Leading Brain: Neuroscience Hacks to Work Smarter, Better, and Happier, Friederike Fabritius and Hans Hagemann describe flipping a coin as a powerful way to make decisions. But not in the way we’d normally expect.
Usually, we have two options. Option A is heads and Option B is tails. We flip the coin, and whichever side the coin lands on, we go with that option.
But this is not the ideal way to make decisions. There is a better, more intuitive way and it’s more reflective of what the brain actually wants. In his article at Inc.com, Jeff Haden writes:
“If you’re torn between two choices of seemingly equal merit, flip a coin. If you’re satisfied or relieved by the decision the coin made for you, then go with it. On the other hand, if the result of the coin toss leaves you uneasy and even makes you wonder why you used a coin toss to decide such an important decision in the first place, then go with the other choice instead. Your ‘gut feeling’ alerted you to the ‘right’ decision.”
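To make Haden's procedure concrete, here is a playful sketch of it as a tiny script; the prompt wording and the yes/no check are my own rendering, not from his article.

import random

def flip_decision(option_a: str, option_b: str) -> str:
    # Flip the coin, then let your reaction to the result make the call.
    result = option_a if random.random() < 0.5 else option_b
    print(f"The coin says: {result}")
    feeling = input("Are you satisfied or relieved by that result? (y/n) ").strip().lower()
    # Relief means the coin matched your gut; unease means go the other way.
    if feeling == "y":
        return result
    return option_b if result == option_a else option_a

print("Go with:", flip_decision("take the new job", "stay put"))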
A study from researchers in Switzerland documented a similar process. They told participants that one side of the coin would allow them to take a job at a more prestigious firm with higher pay and longer hours, and the other side would be at a less prestigious firm with lower pay and more flexible hours. The coin was then flipped into the air, but it was never revealed which side it landed on. Research participants were asked to decide which choice their subconscious brain wanted more while the coin was in the air. This choice revealed their underlying desire.
While the flipping of the coin acted as the catalyst for decision making in this study, a second study was performed. In it, researchers suggested participants go for specific choices in a restaurant menu. It was clear that when certain menu items were suggested to participants, they formed stronger opinions about what they wanted. Their final decisions either leaned toward or strongly away from the recommended item.
Whether or not people followed the recommendation didn’t matter. What mattered was the fact that people became much more decisive.
This phenomenon is known as Flipism. Psynso describes Flipism as:
“[a] pseudophilosophy under which all decisions are made by flipping a coin. It originally appeared in the Disney comic “Flip Decision” by Carl Barks, published in 1953. Barks called a practitioner of “Flipism” a “Flippist.” Flipism can be seen as a normative decision theory, although it does not fulfil the criteria of rationality.”
Flipism should probably be taken with a grain of salt. However, when making split decisions or acting in the moment, it can be a really powerful tool.
In a piece about in-the-moment decision making, Neil Patel highlights the positive results he’s had using his instinct, and how those results compound when he became more and more confident in his gut. In fact, studies show that “the more you pay attention to the outcome of trusting your intuition in combination with facts, the better your future decision-making can become.”
The name “split decisions” simply reveals the conflict you face in those moments when you need to take them. The name doesn’t offer any solutions. “Flip decisions,” however, offer a valuable tool in deciphering which direction to go when you have a quick decision to make.
Toss the coin up in the air, forget about it, and your mind will be made. | https://medium.com/big-self-society/why-you-should-trade-split-decisions-for-flip-decisions-3f43034da5eb | ['Jordan Gross'] | 2020-11-20 14:10:58.770000+00:00 | ['Leadership', 'Mental Health', 'Self Improvement', 'Psychology', 'Inspiration'] | Title Trade Split Decisions “Flip Decisions”Content Trade Split Decisions “Flip Decisions” Use Flipism make inthemoment choice Photo Pocky Lee Unsplash book Leading Brain Neuroscience Hacks Work Smarter Better Happier Friederike Fabritis Hans Hagemannthat describe flipping coin powerful way make decision way we’d normally expect Usually two option Option head Option B tail flip coin whichever side coin land go option ideal way make decision better intuitive way it’s reflective brain actually want article Inccom Jeff Haden writes “If you’re torn two choice seemingly equal merit flip coin you’re satisfied relieved decision coin made go hand result coin toss leaf uneasy even make wonder used coin toss decide important decision first place go choice instead ‘gut feeling’ alerted ‘right’ decision” study researcher Switzerland documented similar process told participant one side coin would allow take job prestigious firm higher pay longer hour side would le prestigious firm lower pay flexible hour coin flipped air never revealed side landed Research participant asked decide choice subconscious brain wanted coin air choice revealed underlying desire flipping coin acted catalyst decision making study second study performed researcher suggested participant go specific choice restaurant menu clear certain menu item suggested participant formed stronger opinion wanted final decision either leaned toward strongly away recommended item Whether people followed recommendation didn’t matter mattered fact people became much decisive phenomenon known Flipism Psynso describes Flipism “a pseudophilosophy decision made flipping coin originally appeared Disney comic “Flip Decision” Carl Barks published 1953 Barks called practitioner “Flipism” “Flippist” Flipism seen normative decision theory although fulfil criterion rationality” Flipism probably taken grain salt However making split decision acting moment really powerful tool piece inthemoment decision making Neil Patel highlight positive result he’s using instinct result compound became confident gut fact study show “the pay attention outcome trusting intuition combination fact better future decisionmaking become” name “split decisions” simply reveals conflict face moment need take name doesn’t offer solution “Flip decisions” however offer valuable tool deciphering direction go quick decision make Toss coin air forget mind madeTags Leadership Mental Health Self Improvement Psychology Inspiration |
2,065 | How Not To Apply To An Accelerator (part 6) | This is part 6 of my “self-defense essay”. If you missed the prior installments, start here with The #EpicNovelFail
The #LinkerFail
(a.k.a. The “You find it”)
“Can’t I just send you my pitch deck? It’s all in there.”
I get that question from time to time and it’s a fair question. The entrepreneur has put a lot of time into crafting his deck and making it look pretty. Why fill out an application if the data are in the deck?
In many cases, the data are not all there. Our application questions represent the minimum amount of info we need to feel comfortable inviting a startup to the next stage of the process. I would guestimate that well over 80% of the investor decks we see are missing the answer to at least one of our questions. These aren’t bad decks. Many are likely very effective in getting the startup a meeting with potential investors. They just don’t have all the info we want to see.
The other answer is a bit more subtle. As I mentioned, every Dreamit reviewer sees hundreds of applications over the course of a few short weeks. Even if a deck is ‘complete’, each deck would still present the information in its own way and in its own order. We would have to hunt through the deck to find where the answer to a specific question is while mentally checking off the boxes to make sure all the bases were covered. That adds time and mental load to a process that already consumes massive amounts of both of these scarce resources.
Tip: don’t respond to an application question with “Please see my deck/website/video (link here).”
Next up: The #PoorAttentionToDetailFail | https://medium.com/dreamit-perspectives/how-not-to-apply-to-an-accelerator-part-6-66f517006f32 | ['Andrew Ackerman'] | 2016-11-17 18:48:34.653000+00:00 | ['Entrepreneurship', 'Startup'] | Title Apply Accelerator part 6Content part 6 “selfdefense essay” missed prior installment start EpicNovelFail LinkerFail aka “You find it” “Can’t send pitch deck It’s there” get question time time it’s fair question entrepreneur put lot time crafting deck making look pretty fill application data deck many case data application question represent minimum amount info need feel comfortable inviting startup next stage process would guestimate well 80 investor deck see missing answer least one question aren’t bad deck Many likely effective getting startup meeting potential investor don’t info want see answer bit subtle mentioned every Dreamit reviewer see hundred application course short week Even deck ‘complete’ deck would still present information way order would hunt deck find answer specific question mentally checking box make sure base covered add time mental load process already consumes massive amount scarce resource Tip don’t respond application question “Please see deckwebsitevideo link here” Next PoorAttentionToDetailFailTags Entrepreneurship Startup |
2,066 | From Print to Online: Is the Truth Worth Paying For? | The Truth Is…
Truth is bland. It lacks the glitter that catches our attention. It does not take any sides, so no one wants it. From the streams of information that flood our minds every day, the stream of truth is the least appealing. It’s like a ruin in the middle of a bustling city. It’s there, but no one really cares about it.
But in 2016 something happened that increased the worth of truth. Donald Trump became the President of the United States. So the first spike in digital subscriptions to the Times came soon after Donald Trump was elected.
And shortly after Trump labeled the press “the enemy of the people,” the Times along with Droga5 NY came up with a campaign, The Truth Is Hard. That campaign video was played at the Oscars in 2017. It was the Times' first televised campaign in a decade.
The Truth Is Hard ad at the Oscars
The Truth Is Hard short documentary
The aim of the campaign was to show people that knowing the truth is important. And there is a lot of effort that goes into unearthing the truth. The Times wanted the curious reader to understand that by paying for the subscription, they will support the cause of truthful reporting.
After the first teaser at the Oscars and subsequent print advertisements, the newspaper came out with a slew of short documentaries reinforcing the message.
The documentaries ranged from the heart-wrenching stories of how kids were separated from their parents at the Mexico border, to the appalling conditions of the Rohingya refugees in Myanmar.
The Truth Is Worth It: a story about immigrant children separated from their parents at the Mexican-US border
The Truth Is Worth It: a story about the plight of Rohingya Muslims in Myanmar
Toby Treyer and Laurie Howell, the creative directors at Droga5 who led the campaign, explained it in an interview. They said:
“We thought, wouldn’t it be amazing if we could show everything that went into a headline, but do it as if the journalist was discovering it as they were writing the story?”
The advertisements were giving the public a sneak peek into the life of a New York Times journalist. By showing how hard it is to get to the depth of the stories, dealing with hostile governments, anxious locals, grief-stricken mothers, and rogue assassins, the ads showed us how valuable truth is. | https://medium.com/better-marketing/from-print-to-online-is-the-truth-worth-paying-for-61eb76fc3aaa | ['Mehboob Khan'] | 2020-11-20 15:42:51.955000+00:00 | ['Marketing', 'News', 'New York Times', 'Journalism', 'Advertising'] | Title Print Online Truth Worth Paying ForContent Truth Is… Truth bland lack glitter catch attention take side one want stream information flood mind every day stream truth least appealing It’s like ruin middle bustling city It’s one really care 2016 something happened increased worth truth Donald Trump became President United States first spike digital subscription Times came soon Donald Trump elected shortly Trump labeled press “the enemy people” Times along Droga5 NY came campaign Truth Hard campaign video played Oscars 2017 Times first televised campaign decade Truth Hard ad Oscars Truth Hard short documentary aim campaign show people knowing truth important lot effort go unearthing truth Times wanted curious reader understand paying subscription support cause truthful reporting first teaser Oscars subsequent print advertisement newspaper came slew short documentary reinforcing message documentary ranged heartwrenching story kid separated parent Mexico border appalling condition Rohingya refugee Myanmar Truth Worth story immigrant child separated parent MexicanUS border Truth Worth story plight Rohingya Muslims Myanmar Toby Treyer Laurie Howell creative director Droga5 led campaign explained interview said “We thought wouldn’t amazing could show everything went headline journalist discovering writing story” advertisement giving public sneak peek life New York Times journalist showing hard get depth story dealing hostile government anxious local griefstricken mother rogue assassin ad showed u valuable truth isTags Marketing News New York Times Journalism Advertising |
2,067 | Azure — Deploying React App With Java Backend on AKS | Azure — Deploying React App With Java Backend on AKS
A step by step guide with an example project
AKS is Microsoft Azure’s managed Kubernetes solution that lets you run and manage containerized applications in the cloud. Since this is a managed Kubernetes service, Microsoft takes care of a lot of things for us, such as security, maintenance, scalability, and monitoring. This lets us quickly deploy our applications into the Kubernetes cluster without worrying about the underlying details of building it.
In this post, we are going to deploy a React application with a Java backend. First, we dockerize our app, push that image to the Azure Container Registry, and run that app on Azure AKS. We will see how to build the Kubernetes cluster on Azure AKS, access the cluster from outside, configure kubectl to work with the AKS cluster, and more; a rough sketch of the full CLI workflow follows the contents list below.
Example Project
Prerequisites
Install Azure CLI and Configure
Dockerize the Project
Pushing Docker Image To Container Registry
Creating AKS Cluster
Configure Kubectl With AKS Cluster
Deploy Kubernetes Objects On Azure AKS Cluster
Access the WebApp from the browser
Summary
Conclusion
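Before diving in, here is a rough sketch of the end-to-end CLI workflow the sections above cover, wrapped in a small Python driver. The resource group, registry, cluster names, and manifest path are hypothetical placeholders, and it assumes the Azure CLI, Docker, and kubectl are installed and you are already logged in with az login.

import subprocess

def run(cmd: str) -> None:
    print(f"$ {cmd}")
    subprocess.run(cmd, shell=True, check=True)

RG, ACR, AKS = "my-rg", "myregistry", "my-aks"  # hypothetical names
IMAGE = f"{ACR}.azurecr.io/react-java-app:v1"

run(f"az acr login --name {ACR}")  # authenticate Docker to the container registry
run(f"docker build -t {IMAGE} .")  # dockerize the project
run(f"docker push {IMAGE}")  # push the Docker image to the registry
run(f"az aks create --resource-group {RG} --name {AKS} --node-count 2 --attach-acr {ACR} --generate-ssh-keys")  # create the AKS cluster
run(f"az aks get-credentials --resource-group {RG} --name {AKS}")  # configure kubectl with the cluster
run("kubectl apply -f k8s/manifest.yaml")  # deploy the Kubernetes objects
run("kubectl get service")  # look up the external IP to access the web app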
Example Project
This is a simple project which demonstrates developing and running a React application with Java. We have a simple app in which we can add users, count and display them at the side, and retrieve them whenever you want.
Example Project
If you want to practice your own here is a Github link to this project. You can clone it and run it on your machine as well. | https://medium.com/bb-tutorials-and-thoughts/azure-deploying-react-app-with-java-backend-on-aks-4466adda8cfc | ['Bhargav Bachina'] | 2020-12-16 06:02:34.690000+00:00 | ['DevOps', 'Cloud Computing', 'Web Development', 'Kubernetes', 'Programming'] | Title Azure — Deploying React App Java Backend AKSContent Azure — Deploying React App Java Backend AKS step step guide example project AKS Microsoft Azure’s managed Kubernetes solution let run manage containerized application cloud Since managed Kubernetes service Microsoft take care lot thing u security maintenance scalability monitoring make u quickly deploy application Kubernetes cluster without worrying underlying detail building post going deploy React application Java environment First dockerize app push image Azure container registry run app Azure AKS see build Kubernetes cluster Azure AKS Accessing cluster outside configuring kubectl work AKS cluster many Example Project Prerequisites Install Azure CLI Configure Dockerize Project Pushing Docker Image Container Registry Creating AKS Cluster Configure Kuebctl AKS Cluster Deploy Kubernetes Objects Azure AKS Cluster Access WebApp browser Summary Conclusion Example Project simple project demonstrates developing running React application Java simple app add user count display side retrieve whenever want Example Project want practice Github link project clone run machine wellTags DevOps Cloud Computing Web Development Kubernetes Programming |
2,068 | Nine Ways to Tell Your Design Story on Medium | There are so many examples of successful design writing on Medium. Here are just a few that exemplify how you can use the platform.
I. Share your knowledge
You have a wealth of expertise and experience that readers would find of interest. Facebook product design director Julie Zhuo regularly writes on topics like design process, management, common mistakes, and more:
Pasquale D’Silva explains designing for animation:
A group of designers from leading tech companies collaborate to share best practices, lessons, and stories:
II. Reveal your process
Give readers a peek behind the curtain with insight into how their favorite products were made. Vanessa Koch, who worked on the redesign of Asana, provided insight into the process:
III. Announce a feature or product
Press releases are passé; instead, many designers write Medium posts to showcase new features or products. When Foursquare underwent a redesign and launched Swarm, Sam Brown and Zack Davenport revealed their design thinking in announcing both:
IV. Solve a problem
Show how design can be used to solve a problem. Shortly after Caitlin Winner arrived at Facebook, she noticed that the “friends” icon didn’t adequately represent both women and men. So she redesigned it:
V. Engage with your audience
While you can always broadcast your ideas on Medium, the real value is its network — the ability to interact with your readers to advance thinking. Jennifer Daniel, Erika Hall, Mike Monteiro, and others launched Dear Design Student to solicit and provide advice:
VI. Promote your company’s design talent
Competition for designers has never been fiercer. Showcase your company’s design bench with a dedicated publication, like Facebook’s, Uber’s, and (naturally) Google’s design teams did:
VII. Write ‘non-design’ design stories
You can write about design without writing about the design process. Medium designer Marcin Wichary goes deep on typography and language:
Basecamp founder Jason Fried does design criticism — of the Drudge Report:
VIII. Relate to adjacent fields
Design doesn’t exist in a vacuum, of course. Designers work closely with engineers, researchers, user support, product scientists, content strategists, and others to craft their products. Andrei Herasimchuk is just one of the many designers who’s written about whether designers should learn to code:
Khosla Ventures’s Irene Au explains how designers work effectively with management:
IX. Adapt a speech
Many designers give talks at conferences like SPAN across the country and around the world. You can easily adapt your speech and publish it on Medium, like Google designer Rachel Garb did. | https://medium.com/google-design/nine-ways-to-tell-your-design-story-on-medium-36edb2936bb5 | ['Kate Lee'] | 2015-11-06 18:06:11.832000+00:00 | ['Medium', 'Writing', 'Design'] | Title Nine Ways Tell Design Story MediumContent many example successful design writing Medium exemplify use platform Share knowledge wealth expertise experience reader would find interest Facebook product design director Julie Zhuo regularly writes topic like design process management common mistake Pasquale D’Silva explains designing animation group designer leading tech company collaborate share best practice lesson story II Reveal process Give reader peek behind curtain insight favorite product made Vanessa Koch worked resdesign Asana provided insight process III Announce feature product Press release passé instead many designer write Medium post showcase new feature product Foursquare underwent redesign launched Swarm Sam Brown Zack Davenport revealed design thinking announcing IV Solve problem Show design used solve problem Shortly Caitlin Winner arrived Facebook noticed “friends” icon didn’t adequately represent woman men redesigned V Engage audience always broadcast idea Medium real value network — ability interact reader advance thinking Jennifer Daniel Erika Hall Mike Monteiro others launched Dear Design Student solicit provide advice VI Promote company’s design talent Competition designer never fiercer Showcase company’s design bench dedicated publication like Facebook’s Uber’s naturally Google’s design team VII Write ‘nondesign’ design story write design without writing design process Medium designer Marcin Wichary go deep typography language Basecamp founder Jason Fried design criticism — Drudge Report VIII Relate adjacent field Design doesn’t exist vacuum course Designers work closely engineer researcher user support product scientist content strategist others craft product Andrei Herasimchuk one many designer who’s written whether designer learn code Khosla Ventures’s Irene Au explains designer work effectively management IX Adapt speech Many designer give talk conference like SPAN across country around world easily adapt speech publish Medium like Google designer Rachel Garb didTags Medium Writing Design |
2,069 | Is immunity against corona virus available in the market or is it available at home? | Is immunity against corona virus available in the market or is it available at home?
Find out! This is the most common question of today.
Photo by G.Lodhia.
Is immunity produced by a special kind of food, or is it available in a capsule?
There is a lot of uncertainty around this topic because people do not really understand immunity. The one thing immunity and disease have in common is that both arise inside our body. We do not inhale or ingest diseases or immunity.
Disease is caused by viruses, bacteria, or other agents.
Immunity is our body’s defence mechanism. The stronger we are at fighting a disease, the weaker the action of that disease in the body will be. The effect of any disease, meaning the attack the disease mounts on the body, can be diminished or made negligible only if we are vaccinated against it or if our body’s fighting mechanism, which is called its immunity, is stronger.
As there is no vaccine against coronavirus, the only technique we have to save ourselves is keeping proper hygiene, taking precautions and boosting our immunity.
Immunity is our body’s soldier: white blood cells, called WBCs and also known as leucocytes. Leucocytes protect our body against foreign material that enters while we breathe, that is, during respiration. We can increase our immunity by taking in some materials available at home. We find turmeric, ginger, garlic, and green tea in our kitchens; these are used to increase immunity. Even vitamin C and vitamin E tablets or capsules aid in increasing immunity. Vitamin D is synthesized in our body when we get the morning sun’s rays; capsules are also available, and in fact so are injections.
There are also some personal habits that weaken our immunity, and those should be avoided. To keep immunity strong: avoid harmful snacks, exercise regularly, get adequate sleep, do not take on stress, and keep your body clean.
If we improve and strengthen our immunity, we can make the virus non-functional in our world, or we can erase its effect from our body.
NO DISEASE. NO IMMUNITY. NO LIFE. NO DEATH.
Viruses cause diseases.
Coronaviruses are a group of RNA viruses. According to the experts, a coronavirus particle is normally about 0.125 µm, that is, 125 nm, in size. The smallest size discovered is 0.06 µm and the biggest is 0.14 µm, so the overall size given by the experts is between 60 and 140 nm, which is 0.06 µm to 0.14 µm. The coronavirus has spikes measuring 9 to 12 nm, which give it a shape like a solar corona. A coronavirus is bigger than some of the smallest dust particles. One meter is 1,000,000,000 nanometers.
Photo by G.Lodhia.
Earlier there existed severe coronavirus diseases like COVID-19, named SARS and MERS, which had caused pandemics. There is no specific vaccine or medicine for coronavirus. Investigation to find a vaccine is in progress.
The incubation period is 2 to 14 days before symptoms are seen in someone affected by coronavirus. Its symptoms include:
Respiratory tract failure
Cold
Bronchitis
Pneumonia
Gut diseases
Fever
Sore throat
Loss of smell or taste
Photo by G.Lodhia
Along with hospitalization, there can be over-the-counter treatments as well, such as:
Rest
Avoid overexertion
Drink plenty of water
Use proper masks
Keep proper hygiene
Do not touch things unnecessarily
Sanitize your hands often, or wash them more thoroughly than usual
Avoid touching your face.
Photo by G.Lodhia
Conclusion: Prevention is better than cure. This applies exactly to coronavirus disease.
Hope This Could Enlighten Some Ways To Be Done At Home To Live In This Lock Down Days . | https://medium.com/illumination/is-immunity-against-corona-virus-available-in-the-market-or-is-it-available-at-home-find-out-d8ba357fd182 | ['G.Lodhia M. Edu'] | 2020-06-28 15:27:29.660000+00:00 | ['Article', 'Coronavirus', 'Virus', 'Blog', 'Writing'] | Title immunity corona virus available market available homeContent immunity corona virus available market available home Find common question today Photo Glodhia immunity produced special kind food available capsule uncertain question topic people able understand immunity common similarity immunity disease caused body inhale intake disease immunity Disease caused virus bacteria something else also Immunity body’s mechanism stronger fighting disease weaker action disease body affect disease mean attack disease give body diminished negligible vaccinated body’s fighting mechanism called immunity stronger vaccine coronavirus technique save keeping proper hygiene taking precaution boosting immunity Immunity body’s soldier White blood cell called WBC also called leucocyte Leucocyte protects body foreign material enters breathing mean respiration increase immunity intaking kind material available home find turmeric ginger garlic green tea kitchen used increase immunity Even vitamin C vitamin E tablet capsule aid increasing immunity Vitamin synthesized body get sun ray morning time capsule also available fact injection also available personal habit also decrease immunity avoided like avoiding harmful snack exercising regularly getting adequate sleep stress taken cleanliness body mandatory improve strengthen immunity make virus non functional world make effect deleted body DISEASE IMMUNITY LIFE DEATH Viruses cause disease Corona virus group RNA virus normally 0125 µm 125 nm size coronavirus said expert smallest size discovered 006 µm biggest size 014 µm overall size said expert 60 140 nm 006 µm 0125 µm corona virus spike measure 9 12 nm give shape like solar one Coronavirus bigger smallest dust particle One meter 1000000000 nanometer photo GLodhia Earlier existed severe disease like corona virus diseasecovid 19 named SARS MERS caused pandemic specific vaccine medicine corona virus Investigation find vaccine progress 2 14 day incubation symptom seen affected coronavirusIt symptom like Respiratory tract failure Cold Bronchitis Pneumonia Gut disease Fever Sore throat Loss smell taste Photo GLodhia hospitalization overthecounter treatment alsoSuch Rest Avoid overexertion Drink plenty water Use proper mask Keep proper hygiene touch thing unnecessarily Sanitize hand often wash properly usually done Prevent touching face part Photo GLodhia Conclusion Prevention better cure applies axactly corona virus disease Hope Could Enlighten Ways Done Home Live Lock Days Tags Article Coronavirus Virus Blog Writing |
2,070 | Loss of trust in American democracy is a crisis we have to confront | From CNN
Anthony Marx and Jamie Woodson write that American faith in democracy and the media has declined significantly in the last 40 years, but there are ways to increase trust in these institutions. They require every American to take an active role.
Full story here | https://medium.com/trust-media-and-democracy/loss-of-trust-in-american-democracy-is-a-crisis-we-have-to-confront-4027e8e8212b | ['Knight Commission On Trust', 'Media'] | 2019-02-07 19:32:32.315000+00:00 | ['Trust', 'Journalism', 'Media'] | Title Loss trust American democracy crisis confrontContent CNN Anthony Marx Jamie Woodson write American faith democracy medium declined significantly last 40 year way increase trust institution require every American take active role Full story hereTags Trust Journalism Media |
2,071 | Do You Think Hard Work Equals Success — Think Again | Do You Think Hard Work Equals Success — Think Again
Lessons from “Outliers: The Story of Success” by Malcolm Gladwell
In NPR’s How I Built This podcast, the host, Guy Raz, always asks his guests one question: “How much do you attribute your success to luck and how much do you attribute to hard work?” Episode after episode, entrepreneurs ponder his question and answer whether their success was attributable to hard work or luck, or both.
Of all the episodes I’ve listened to, the answer from the founder of Canva, Melanie Perkins, stuck out to me the most. In the podcast, she answered Raz like so:
I think it's a very interesting question because I think that if you zoom out of luck, then you’ll say, where were you born, who were your parents, what was the education that you got, you know, having good health. There are so many layers of luck. So if you look at all of those things then I couldn’t be luckier. Then on the other side of it, I think we planned enough seeds where eventually one of them grows, so that's kind of another version of luck, maybe you plant 1000 seeds, eventually one of them will grow. You can attribute one of these seeds as luck or hard work for planting 1000 seeds. So I would say little column A, little column B.
Melanie Perkins was very articulate in her answer. Perkins’ elegant answer matches Malcolm Gladwell’s book, “Outliers: The Story of Success”.
In summary, Outliers proves that hard work does not always equate to success. Rather, success is a combination of lucky events and hard work. The first example Gladwell pointed out in his book was one interesting fact about every single professional hockey player.
Take a look at the two charts below and try to see if you can see a pattern:
Outliers: Page 20
Outliers: Page 21
Do you see a pattern for a person who is more likely to be a professional hockey player?
Most of the players were born in January. This is because in Canada, the cut-off date for schools falls in January. So a kid born in January can be a couple of months older than kids born later in the summer. While growing up, a couple of months of difference is a lot in children. This means that kids born in January were slightly older and slightly taller, which gave them advantages. Kids born in January got more training. For a kid, a couple of extra hours of training doesn’t amount to much. However, over time, the kids who got a couple more training sessions here and there ultimately get better than the kids who did not.
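The compounding mechanism is easy to see in a toy simulation (my own sketch, not from the book): give early-born kids a small initial edge, let the top group get a little extra training each year, and January-quarter kids end up overrepresented at the top.

import random

random.seed(1)
kids = [{"month": random.randint(1, 12), "skill": random.gauss(0, 1)} for _ in range(10000)]
for kid in kids:
    kid["skill"] += (12 - kid["month"]) * 0.05  # relative-age edge: older kids start slightly ahead

for _ in range(10):  # each season, the top third gets extra coaching
    kids.sort(key=lambda k: k["skill"], reverse=True)
    for kid in kids[: len(kids) // 3]:
        kid["skill"] += 0.3  # small advantages compound year after year

top = kids[:200]  # the "professional" tier
jan_share = sum(k["month"] <= 3 for k in top) / len(top)
print(f"Share of top players born Jan-Mar: {jan_share:.0%}")  # well above the ~25% baseline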
This means that you are more likely to become a professional hockey player if you were born in January. This sounds like luck to me.
Although you are more likely to become a hockey player if you were born in January, not every single kid born in January ends up playing hockey professionally. This is where the hard work comes in.
Success is a combination of hard work and luck. | https://medium.com/the-innovation/do-you-think-hard-work-equals-success-think-again-bf31d7a43c29 | ['İlknur Eren'] | 2020-12-27 19:25:04.194000+00:00 | ['Books', 'Self Improvement', 'Productivity', 'Advice', 'Reading'] | Title Think Hard Work Equals Success — Think AgainContent Think Hard Work Equals Success — Think Lessons “Outliers Story Success” Malcolm Gladwell NPR’s Built podcast host Guy Raz always asks guest one question “How much attribute success luck much attribute hard work” Episode episode entrepreneur ponder question answer whether success attributed success luck episode I’ve listened founder Canva Melanie Perkins answer stuck podcast answered Raz like think interesting question think zoom luck you’ll say born parent education got know good health many layer luck look thing couldn’t luckier side think planned enough seed eventually one grows thats kind another version luck maybe plant 1000 seed eventually one grow attribute one seed luck hard work planting 1000 seed would say little column little column B Melanie Perkins articulate answer Perkins elegant answer match Malcolm Gladwell’s book “Outliers Story Success” summary Outliers prof hard work always equate success Rather success combination lucky event hard work first example Gladwell pointed book one interesting fact every single professional hockey player Take look two chart try see see pattern Outliers Page 20 Outliers Page 21 see pattern person likely professional hockey player player born January Canada cut date school January kid born January couple month older kid born later summer growing couple month difference lot child mean kid born January slightly older slightly taller gave advance Kids born January got training kid couple hour training time doesn’t equal lot However time kid got couple training ultimately get better kid mean likely become professional hockey player born January sound like luck Although likely become hockey player born January every single kid born January end playing hockey professionally hard work come Success combination hard work luckTags Books Self Improvement Productivity Advice Reading |
2,072 | Medium Article Format | Other Tools for Writing and Editing Your Medium Article
Customizing Your Article’s Properties Before and After Publication
Miscellaneous
Quick Answers to Questions About Medium Formatting
How can I center text on my Medium article?
You cannot center text utilizing the Medium editor or toolbar.
Can I automatically post my WordPress blog articles on Medium?
No, unfortunately you cannot automatically post your blog posts on Medium. This used to be an option in WordPress through a specific integration, but Medium discontinued this option.
Can I edit more than one article at a time? | https://medium.com/blogging-guide/medium-article-format-bc06439c4e7c | ['Casey Botticello'] | 2020-02-23 01:49:05.061000+00:00 | ['Format', 'Typography', 'Design', 'Medium', 'Writing'] | Title Medium Article FormatContent Tools Writing Editing Medium Article Customizing Article’s Properties Publication Miscellaneous Quick Answers Questions Medium Formatting center text Medium article cannot center text utilizing Medium editor toolbar automatically post WordPress blog article Medium unfortunately cannot automatically post blog post Medium used option WordPress specific integration Medium discontinued option edit one article timeTags Format Typography Design Medium Writing |
2,073 | Why Companies Should Pay Attention to the Trend of Minimalist Consumers | Deconstructing digital devices
These dangers of new technologies can also come from the digital devices themselves. The tools that have become part of our daily lives, such as smartphones, compact computers and touch-sensitive tablets, have features designed to do everything.
As such, hardware designers, be it Apple, Sony or Samsung, have promoted devices that make us pay attention to several things at the same time. They were based on the idea that increasing the multifunctionality of devices would bring more value to the consumer.
Yet, as neuroscience studies show, the brain is really good at doing only one thing at a time, because neural networks gather information simultaneously rather than successively. As a result, these technologies lead to distraction and permanently addictive behavior around activities that require little concentration.
Many consumers have become aware of the need to have devices that only provide one service at a time (for example, by turning off social network or call notifications, or filtering applications). Some others have started to think about creating new kinds of products that address a single need.
The Light Phone, for example, is a phone that meets the basic functionality of a normal phone, like the models before smartphones, i.e. calling and SMS, and nothing else. Some others have conceived computers that would perform only a few cognitive tasks.
These initiatives are in line with what Mark Weiser and John Seely Brown called in their seminal article the revolution of “calm technologies”, i.e. less invasive technologies that are deployed in the peripheries of our senses and make less noise. They started from the conviction that technology must be made to serve the human being, the consumer who needs to minimize the influence of the machine on his work and his life. | https://medium.com/curious/why-companies-should-pay-attention-to-the-trend-of-minimalist-consumers-ec52039ecd37 | ['Jean-Marc Buchert'] | 2020-12-22 15:48:12.307000+00:00 | ['Minimalism', 'Product Design', 'Productivity', 'Consumer Behavior', 'Marketing'] | Title Companies Pay Attention Trend Minimalist ConsumersContent Deconstructing digital device danger new technology also come digital device tool become daily life smartphones compact computer touchsensitive tablet feature designed everything hardware designer Apple Sony Samsung promoted device make u pay attention several thing time based idea increasing multifunctionality device would bring value consumer Yet neuroscience study show brain good one thing time neural network gather information simultaneously successively result technology lead distracting permanently addictive behavior activity require little concentration Many consumer become aware need device provide one service time example turning social network call notification filtering application others started think creating new kind product address single need Light Phone example phone meet basic functionality normal phone like model smartphones ie calling SMS nothing else others conceived computer would perform cognitive task initiative line Mark Weiser John Seely Brown called seminal article revolution “calm technologies” ie le invasive technology deployed periphery sens make le noise started conviction technology must made serve human consumer need minimize influence machine work lifeTags Minimalism Product Design Productivity Consumer Behavior Marketing |
2,074 | Not Every Developer Wants to Become a Manager — And That’s Okay | Not Every Developer Wants to Become a Manager — And That’s Okay
Companies should create clear career paths for individual contributors
Photo by Jaromír Kavan on Unsplash
I have only worked in startups with flat hierarchies. Even at companies where there are no clear titles, you usually find three kinds of engineers: The junior developers — fresh out of school; the tech leads to whom everyone reaches out for help and whose technical opinions matter the most; and in the middle, between the juniors and the tech leads, a vast ocean of software engineers with various skills and experiences.
One topic that repeatedly came up in our retros is the lack of career growth opportunities. This topic seemed to puzzle some tech leads who thought that there were a lot of projects and a lot of new things to learn. There were surely a lot of learning opportunities. Still, when the only feedback you get in your 1:1 meetings is “You’re doing great, keep going,” you don’t feel like you’re progressing.
As software engineers, we want our opinion to matter — we want to have an impact. The obvious next step is to become a tech lead but it’s unclear how we get such a position. Or if we even want it. | https://medium.com/better-programming/not-every-developer-wants-to-become-a-manager-and-thats-okay-e7d76b3efd0e | ['Meriam Kharbat'] | 2020-02-17 15:08:24.570000+00:00 | ['Careers', 'Management', 'Programming', 'Startup', 'Software Engineering'] | Title Every Developer Wants Become Manager — That’s OkayContent Every Developer Wants Become Manager — That’s Okay Companies create clear career path individual contributor Photo Jaromír Kavan Unsplash worked startup flat hierarchy Even company clear title usually find three kind engineer junior developer — fresh school tech lead everyone reach help whose technical opinion matter middle junior tech lead vast ocean software engineer various skill experience One topic repeatedly came retro lack career growth opportunity topic seemed puzzle tech lead thought lot project lot new thing learn surely lot learning opportunity Still feedback get 11 meeting “You’re great keep going” don’t feel like you’re progressing software engineer want opinion matter — want impact obvious next step become tech lead it’s unclear get position even want itTags Careers Management Programming Startup Software Engineering |
2,075 | How To Be A Successful Business Owner? | BUSINESS LESSONS
How To Be A Successful Business Owner?
Surviving The Early Stage Of Business Ownership
Photo by Joshua Earle on Unsplash
Many entrepreneurs answer the question of why they went into business with either their passion or the need for an income. The problem is, there is so much more to it.
We enter into business with a particular skillset. Some of us are experts in one area. Some of us are a Jack of All Trades, Master of None.
The true winner in the Entrepreneurial landscape has a mindset that combines the …
→ Go Getter that tackles the challenges in front of them without delay
→ Analyst that looks at the details, reviews results, and makes plans based on them
→ Communicator that shares knowledge with team members
→ Delegator that knows what to have done by others
→ Regulator that stays informed and in compliance with laws
→ Networker that stays connected to the outside world through local meetings and online forums
→ Recruiter that attracts and vets the right people
→ Bean Counter that makes sure there will be a tomorrow by planning strategically and minimizing expenses
→ Coach that trains, supports, and acknowledges the team
→ Evangelist that doesn’t spend a day without building awareness and promoting their business
→ Director that tracks everything
→ Organizer that strives to improve processes
→ Visionary that regularly thinks of new ways to attract customers, conduct business, and introduce new products and services
ALL OF THIS IN ONE PERSON… seems IMPOSSIBLE.
Exhausting? YES. But impossible? No.
I am an example of a Jack of All Trades that took on a challenge in a completely different and heavily regulated industry, Cosmetology. Previously, I had a service-based sole proprietorship and a product-based sole proprietorship. I knew nothing about hair other than what I had observed or picked up from a friendship of 20 years.
Due to the economy and an outdated concept, the business closed. I am proud of all that I did to try to make it work and gained so much from the experience that I want to share as a Mentor to other business owners.
Being a business owner means wearing many hats.
To be successful, new owners do not just dabble, but dig in and master.
If I can do it, so can you. I am here to help! | https://medium.com/swlh/how-to-be-a-successful-business-owner-4169fcfdc08f | ['Colette Becker'] | 2019-12-15 15:59:27.446000+00:00 | ['Startup Lessons', 'Business Owner', 'Startup', 'Operations Management', 'Entrepreneurship'] | Title Successful Business OwnerContent BUSINESS LESSONS Successful Business Owner Surviving Early Stage Business Ownership Photo Joshua Earle Unsplash Many entrepreneur answer question went business either passion need income problem much enter business particular skillset u expert one area u Jack Trades Master None true winner Entrepreneurial landscape mindset combine … → Go Getter tackle challenge front without delay → Analyst look detail review result make plan based → Communicator share knowledge team member → Delegator know done others → Regulator stay informed compliance law → Networker stay connected outside world local meeting online forum → Recruiter attracts vet right people → Bean Counter make sure tomorrow planning strategically minimizing expense → Coach train support acknowledges team → Evangalist doesn’t spend day without building awareness promoting business → Director track everything → Organizer strives improve process → Visionary regularly think new way attract customer conduct business introduce new product service ONE PERSON… seems IMPOSSIBLE Exhausting → YES impossible example Jack Trades took challenge completely different heavily regulated industry Cosmetology Previously servicebased sole proprietorship productbased sole proprietorship knew nothing hair observed picked friendship 20 year Due economy outdated concept business closed proud try make work gained much experience want share Mentor business owner business owner mean wearing many hat successful new owner dabble dig master helpTags Startup Lessons Business Owner Startup Operations Management Entrepreneurship |
2,076 | 4 Eye-Opening Mindfulness Lessons I Learned from a Depressed Buddhist Monk | 4 Eye-Opening Mindfulness Lessons I Learned from a Depressed Buddhist Monk
How to make peace with your mind?
Photo by THÁI NHÀN from Pexels
After being a Buddhist monk for 12 years, Gelong Thubten became depressed. It was a shock for him. And for me, too, when he shared his story. I thought monks have it all figured out.
When we think we have arrived, we are stuck. We can always learn something new about ourselves wherever we are on our journey.
Gelong joined a 4-year retreat on a Scottish island, cut off from the outside world. No news, no internet, no meetings with people outside the retreat location.
He describes the first two years as “falling through space with nothing to hold him”. Gelong thought this retreat was going to be a piece of cake, and then he found himself depressed and anxious.
When he reached rock bottom halfway through the retreat, something changed that helped him overcome his depression.
What is it that can make even a monk with 12 years of meditation experience depressed? Do you recognize his situation? Have we not also been locked away, cut off from the people we love by the pandemic?
Since March, I have barely left the house and have met only a few people. For us normal humans, the pandemic is like being on a monk’s retreat.
The next months are going to be tough with corona cases increasing everywhere in the world. Let’s see how Gelong Thubten overcame his struggle. What can we learn from him to get through this winter happy and energetic, instead of depressed and anxious? | https://medium.com/change-your-mind/4-eye-opening-mindfulness-lessons-i-learned-from-a-depressed-buddhist-monk-99ffa60ed0fc | ['Karolin Wanner'] | 2020-10-16 12:31:50.861000+00:00 | ['Self', 'Mindfulness', 'Spirituality', 'Psychology', 'Mental Health'] | Title 4 EyeOpening Mindfulness Lessons Learned Depressed Buddhist MonkContent 4 EyeOpening Mindfulness Lessons Learned Depressed Buddhist Monk make peace mind Photo THÁI NHÀN Pexels Buddhist monk 12 year Gelong Thubten became depressed shock shared story thought monk figured think arrived stuck always learn something new wherever journey Gelong joined 4year retreat Scottish island cut outside world news internet meeting people outside retreat location describes first two year “falling space nothing hold him” Gelong thought retreat gonna piece cake found depressed anxious reached rockbottom half retreat something changed made overcome depression even make monk 12 year meditation experience depressed recognize situation also locked away cut people love pandemic Since March barely left house met people normal human pandemic like retreat monk next month going tough corona case increasing everywhere world Let’s see Gelong Thubten overcame struggle learn get winter happy energetic instead depressed anxiousTags Self Mindfulness Spirituality Psychology Mental Health |
2,077 | 4 Thoughts I had to Kill to heal from Anxiety | 4. Life will never be as good as it once was
Have you ever listened to an old song and thought to yourself: “Songs were so good back in the day”? Or maybe you had the same reaction when you stumbled upon your childhood pictures: “This was the best time of my life”.
It’s because we only look back at the winners, the victories, the good times, the good feelings, and all the best memories from the past. Known as survivorship bias or survival bias, this is the logical error of concentrating on the people or things that made it past some selection process while overlooking those that did not, typically because of their lack of visibility. In other terms, we completely ignore all the losses, failures, and bad memories from the past. We forget that, back in the day, those old songs or those memories were not as good as our brain is tricking us into remembering them.
Change can be scary, which is why our brain is never happy with what it has in the present. But change is also inevitable. Thus, our brain always hopes and dreams of a perfect future while nostalgically remembering only the best times of the past. It is always trying to trick us into feeling good so we get a little kick of dopamine, its pleasure chemical. But that pleasure can only last so long. Dwelling on the past and romanticizing our youth sabotages our growth.
“The past is history, tomorrow is a mystery, but today is a gift. That is why we call it the present.” — Master Oogway
Stop living in the past and future, they don’t exist. The only thing that exists is the present. Enjoy the present. Savor it. Be grateful for what you have. Life is precious, unique, and beautiful. | https://medium.com/change-your-mind/4-thoughts-i-had-to-kill-to-heal-from-anxiety-1c663fc74a07 | ['Douaa El Khaer'] | 2020-05-03 11:39:27.770000+00:00 | ['Psychology', 'Anxiety', 'Life', 'Mental Health', 'Mindfulness'] | Title 4 Thoughts Kill heal AnxietyContent 4 Life never good ever listened old song thought “Songs good back day” maybe reaction stumbled upon childhood picture “This best time life” It’s look back winner victory good time good feeling best memory past Known Survivorship bias survival bias logical error concentrating people thing made past selection process overlooking typically lack visibility term completely ignore loss failure bad memory past completely forget fact back day old song memory good brain tricking u remember Change scary brain never happy present change also inevitable Thus brain always hope dream perfect future nostalgically remembering best time past always trying trick u feeling good get little dopamine kick pleasure chemical pleasure last long Dwelling past romanticizing youth sabotaging growth “The past history tomorrow mystery today gift call present” — Master Oogway Stop living past future don’t exist thing exists present Enjoy present Savor grateful Life precious unique beautifulTags Psychology Anxiety Life Mental Health Mindfulness |
2,078 | How I Launched a Successful Gig Business with Only Gig Workers | Five key reasons for this business model
Running a gig business with gig workers can be efficient and smooth when done right. Here are five key reasons why I choose this business model:
1. Pay for what you need when you need it
One of the most beneficial aspects of utilizing gig workers in a gig business is that you only pay for what you need when you need it. When the projects are flowing, I’m happy to be hiring these creative experts to do amazing work.
But when things are slow, I don’t have any recurring salaries or overhead expenses chipping away at my bank account. I am able to quickly adjust expenditures whether in an up or down season, keeping my hard costs manageable.
2. No additional equipment
Most freelancers have their own tools. For video production, it’s very common. Camera operators will typically have their own camera gear, lights, and other accessories. Editors will have the latest workstation loaded with editing software and plugins. Sound technicians will have the latest microphones, mixers, and sound tools.
Since freelancers typically come with their own gear, I don’t have to purchase expensive equipment or keep up with the latest technology.
3. No office space
For my gig business, I have chosen not to have a business office. I know this is not ideal for everyone, but I want as little overhead as possible, so I run all of my productions from a home office. The bulk of my work is production management and client communication, which I can do via the Internet and through phone calls. A home office is perfect for my situation.
4. High-quality service
As I mentioned earlier, I quickly learned that I could produce much better work by hiring talented freelancers. As my productions expanded into Fortune 500 companies, I had to deliver high-quality work. This was key in transitioning my business from a small, mediocre video service into a thriving production company with a healthy list of recurring clients.
I’ll say it again and again — if it weren’t for my team of creative freelancers, I wouldn’t be where I am today.
5. Scalability
And finally, another great aspect of using gig workers in a gig business is scalability. The ups and downs of a gig business can be both terrifying and exhilarating. In the down times, yes, you can scale back accordingly and keep your costs low. But when the projects are flowing, the ability to ramp up using freelancers is incredibly appealing.
A few years ago, I was in one of those flow moments. I was juggling about 15 different productions using a variety of freelancers from around the nation. It was intense, but it was also incredibly rewarding both creatively and financially. | https://medium.com/swlh/how-i-launched-a-successful-gig-business-with-only-gig-workers-f8573d577f26 | ['Russ Pond'] | 2020-11-03 13:02:54.566000+00:00 | ['Entrepreneurship', 'Small Business', 'Gig Economy', 'Startup Lessons', 'Startup'] | Title Launched Successful Gig Business Gig WorkersContent Five key reason business model Running gig business gig worker efficient smooth done right five key reason choose business model 1 Pay need need One beneficial aspect utilizing gig worker gig business pay need need project flowing I’m happy hiring creative expert amazing work thing slow don’t recurring salary overhead expense chipping away bank account able quickly adjust expenditure whether season keeping hard cost manageable 2 additional equipment freelancer tool video production it’s common Camera operator typically camera gear light accessory Editors latest workstation loaded editing software plugins Sound technician latest microphone mixer sound tool Since freelancer typically come gear don’t purchase expensive equipment keep latest technology 3 office space gig business chosen business office know ideal everyone want little overhead possible run production home office bulk work production management client communication via Internet phone call home office perfect situation 4 Highquality service mentioned earlier quickly learned could produce much better work hiring talented freelancer production expanded Fortune 500 company deliver highquality work key transitioning business small mediocre video service thriving production company healthy list recurring client I’ll say — weren’t team creative freelancer wouldn’t today 5 Scalability finally another great aspect using gig worker gig business scalability ups down gig business terrifying exhilarating time yes scale back accordingly keep cost slow project flowing ability ramp using freelancer incredibly appealing year ago one flow moment juggling 15 different production using variety freelancer around nation intense also incredibly rewarding creatively financiallyTags Entrepreneurship Small Business Gig Economy Startup Lessons Startup |
2,079 | How journalists can use Instagram to engage and inform | During the process of redeveloping Me Explica in the Tow-Knight Center program, I have experimented with different tools and strategies to create engagement around my content. In my last article, I explained how my new strategy is to focus on social media because that is where citizens are mostly getting their news from.
In its past iterations, Me Explica was an article-based publication, first as a blog, then as a site. As the years went by, I noticed that being on social media requires much more than simply posting links to your content. You need to truly engage with the reader, answer questions, address criticisms and sometimes even accusations. As a small publication, I am able to do it with little effort but I believe even bigger ones need to commit to talking directly with their readers.
Having this in mind, I have been conducting a few experiments on Instagram. Once defined as a "photo-sharing" app, IG is very versatile and allows for the publication of text, photos, videos, cards (images with text), and videos with text (either with subtitles or only text).
I will share what I have been doing, the tools used to create the content and brief observations about the results.
Experiments
My tests revolved around three kinds of content:
(1) Cards
Explainers on images that are perfect for the photo feed and can easily be shared on other platforms.
Card about Petrobras' losses after a statement by the Brazilian President
(2) Video explainers
The presenter (me) talks directly to the audience in native videos that can be short (1 minute for the feed) or long (up to 10 minutes on IGTV).
Video explainer about the militias in Brazil
(3) Video Stories
Even though the Stories feature only allows short videos, creating the content with an external tool can help you make something that lasts a little while longer.
Considerations
These experiments have shown that there is an opportunity to create engaging and informational posts on Instagram. The audience is interested in consuming journalism on a platform that is not made for long-form content but still allows a quick connection to the news. Instagram may not be the best place to break news, but it is a good tool to build on it.
One major difficulty for journalists and outlets is monetizing their Instagram profiles. There is no option for doing that, and there is no indication that Facebook might be interested in building such features. Yet, we have plenty of success stories of news delivery on Instagram, such as former CNN White House correspondent Jessica Yellin, Poynter's MediaWise, and Uno.ar (from Argentina).
Toolkit
Having decided what kind of content you will post, you can use the many tools that can help journalists create posts quickly and efficiently at a low cost. I will share some of the platforms and products used to conduct the tests on Me Explica.
Visuals: Canva
Canva is one of my favorite tools. I use it to make presentations, design posts and covers for social media and to create the card I showed above. It is very intuitive: you need only to drag and drop. There are thousands of templates to choose from. The free version is already very good, but the pro subscription allows you to resize projects so you can post in multiple social media channels. https://www.canva.com
Video: Lumen 5
I have only recently come across Lumen 5, but I'm already a fan. I used it to create the Stories video shown above. It helps you create social media videos very easily, from text or your own audio. Its artificial intelligence creates new frames automatically, speeding up the process. Anyone with little to no design experience can use it and get great results. https://lumen5.com/
Smartphone videos: Cheap tools
Showing my gear on Instagram
I shared on my Instagram account some of the accessories I have been using to film my explainer videos and some people got interested in the equipment. Amazon was the source for both the selfie ring light that can be attached to any smartphone and the lavalier microphone. Considering that Instagram allows for more informal and amateur-ish videos, this set up is very helpful for filming on the go. All for less than 20 dollars.
Conclusion
Instagram is a great tool to explore new ways of delivering information to audiences that no longer want to visit homepages in search of news. It is a good testing ground to get a sense of what might and not might work to engage citizens. Even with its limitations in terms of generating revenue, it can be a good way for smaller outlets and individual journalists to get a sense of what resonates with audiences. It is worth experimenting. | https://medium.com/journalism-innovation/how-journalists-can-use-instagram-to-engage-and-inform-62b0ad80b74b | ['Diogo A. Rodriguez'] | 2019-05-10 02:43:21.857000+00:00 | ['Journalism', 'Storytelling', 'Social Media', 'Instagram', 'Innovation'] | Title journalist use Instagram engage informContent process redeveloping Explica TowKnight Center program experimented different tool strategy create engagement around content last article explained new strategy focus social medium citizen mostly getting news past iteration Explica articlebased publication first blog site year went noticed social medium requires much simply posting link content need truly engage reader answer question address criticism sometimes even accusation small publication able little effort believe even bigger one need commit talking directly reader mind conducting experiment Instagram defined photosharing app IG versatile allows publication text photo video card image text video text either subtitle text share tool used create content brief observation result Experiments test revolved around three kind content 1 Cards Explainers image perfect photo feed easily shared platform Card Petrobras loss statement Brazilian President 2 Video explainers presenter talk directly audience native video short 1 minute feed long 10 minute o IGTV Video explainer militia Brazil 3 Video Stories Even though Stories feature allows smaller video creating content external tool helpful order something last little longer Considerations experiment shown opportunity create engaging informational post Instagram audience interested consuming journalism platform made longform content still allows establishing quick connection news Instagram may best place break news good tool build One major difficulty journalist outlet monetize Instagram profile option news showing Facebook might interested building feature Yet plenty success story news delivery Instagram former CNN White House correspondent Jessica Yellin Poynters Media Wise Unoar Argentina Toolkit decided kind content post use many tool help journalist create post quickly efficiently low cost share platform product used conduct test Explica Visuals Canva Canva one favorite tool use make presentation design post cover social medium create card showed intuitive need drag drop thousand template choose free version already good pro subscription allows resize project post multiple social medium channel httpswwwcanvacom Video Lumen 5 recently come across Lumen 5 Im already fan used create Stories video shown help create social medium video easily text audio artificial intelligence creates new frame automatically speeding process Anyone little design experience use great result httpslumen5com Smartphone video Cheap tool Showing gear Instagram shared Instagram account accessory using film explainer video people got interested equipment Amazon source selfie ring light attached smartphone lavalier microphone Considering Instagram allows informal amateurish video set helpful filming go le 20 dollar Heres microphone here light Conclusion Instagram great tool explore new way 
delivering information audience longer want visit homepage search news good testing ground get sense might might work engage citizen Even limitation term generating revenue good way smaller outlet individual journalist get sense resonates audience worth experimentingTags Journalism Storytelling Social Media Instagram Innovation |
2,080 | Have We Reached the Phase of Smart Financial Crime Detection? | Have We Reached the Phase of Smart Financial Crime Detection?
Financial Technology
Why are financial crimes on the rise? Many people ask this question as cases of financial crime keep climbing. According to a McKinsey report¹, banks have lost millions of dollars in the last decade alone, and this could worsen as criminals upgrade their tactics. Financial crime analytics can help financial institutions and investigators detect fraud and money laundering, assess risk, and report on data to prevent financial crime.
Cases of banking fraud² increase each year, and despite stringent measures, losses continue to spike while financial institutions lack concrete strategies to address this growing problem. Analytics help pinpoint transactions that need further scrutiny, identifying the needle in the haystack of financial data.
Photo by Bermix Studio on Unsplash
With only a 1% success rate in recovering stolen funds, the financial services industry has realized that traditional approaches to dealing with financial crime are not working. Across the ecosystem, regulatory authorities, enforcement agencies, and financial institutions³ are working together to disrupt financial crime. This requires a proactive approach to predict and manage the risks posed to people and organizations and not merely to comply with rules and regulations.
The challenges faced by financial institutions regarding money-laundering activities have increased substantially in the globalization era. Additionally, there is a rising menace of financial crime and counterfeiting. As money launderers become more sophisticated, the effectiveness of anti-money laundering policies is under heightened regulatory scrutiny. The probability of banks facing stiff penalties and reputational loss in case of shortcomings in AML management has increased.
A good example of a tool used for financial crime detection is AMLOCK, an enterprise-level, end-to-end financial crime management solution. It integrates the best of anti-money laundering⁴ and anti-fraud measures to effectively identify, manage, and report financial crime. It provides various features that cater to the profiling, risk categorization, transaction monitoring, and reporting requirements of financial institutions. The features that form part of this offering are in line with AML (Anti-Money Laundering) regulations.
In this article, I will explore current practices in financial crime detection and their use cases, and look at what the future holds for financial technology and fraud reduction.
Overview
Criminals are pervasive in their determination to identify and exploit vulnerabilities throughout the financial services industry. Their ability to collaborate and innovate necessitates a proactive approach towards responding to individual events, while disrupting crime networks. Combating #financialcrime is complementary to generating revenue. The big data analytical capabilities that enable a bank to personalize product offerings also underpin an effective approach to spotting and responding to criminal behavior.
To out-pace fraudsters, financial institutions and payment processors need a quicker and more agile approach to payment fraud detection⁵. Instead of relying on predefined models, applications need the ability to quickly adapt to emerging fraud activities and implement rules to stop those fraud types. Not only should organizations be able to adjust their detection models, the models themselves should be inter-operable with any #datascience, machine learning, open source and AI technique using any vendor.
In addition, to keep fraud from traveling undetected from one area or channel to another, aggregating transactional and non-transactional behavior from across various channels provides greater context and spots seemingly innocuous patterns that connect complex fraud schemes.
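To make this concrete, here is a minimal sketch of an adaptive, anomaly-based screen in Python using scikit-learn's IsolationForest. The feature columns, distributions, and contamination rate are hypothetical illustrations, not a vendor's actual detection model.

```python
# A minimal sketch of adaptive payment-fraud screening on features
# aggregated across channels. All feature names and values are invented.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(42)

# Hypothetical per-transaction features:
# [amount, transactions in the last hour, new-device flag]
legit = np.column_stack([
    rng.lognormal(3.0, 0.5, 5000),            # typical amounts
    rng.poisson(1, 5000),                     # low recent activity
    (rng.random(5000) < 0.05).astype(float),  # rarely a new device
])
suspect = np.column_stack([
    rng.lognormal(6.0, 0.7, 50),              # unusually large amounts
    rng.poisson(8, 50),                       # bursts of activity
    np.ones(50),                              # always a new device
])

# Unsupervised, so no labeled fraud is needed; retraining on fresh data
# lets the model adapt as legitimate behavior drifts.
model = IsolationForest(contamination=0.01, random_state=0)
model.fit(legit)

# -1 marks anomalies worth routing to an analyst for review.
flags = model.predict(suspect)
print(f"{(flags == -1).sum()} of {len(suspect)} transactions flagged")
```

Because the model learns what normal looks like instead of matching known fraud typologies, it can flag schemes that no rule has been written for yet.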
Artificial Intelligence For Financial Crime Detection
Within financial institutions, it is not uncommon to have high false-positive rates, that is, notifications of potential suspicious activity that do not result in the filing of a suspicious transaction report. For AML alerts, high false-positive rates are the norm.
The reason for this is a combination of dated technology and incomplete and inaccurate data. Traditional detection systems provide inaccurate results due to outdated rules or peer groups creating static segmentations of customer types based on limited demographic details.
Photo by Jp Valery on Unsplash
In addition, account data within the institution can be fragmented, incomplete and housed in multiple locations. These factors are part of the reason why alerts and AML are key areas to apply #artificialintelligence, advanced analytics⁶ and RPA.
The technologies can gather greater insight, understand transactional patterns across a larger scale and eliminate tedious aspects of the investigation that are time-consuming and low value. AI can augment the investigation process and provide the analyst with the most likely results, driving faster and more informed decisions with less effort.
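As a hedged illustration of such triage, the sketch below scores a queue of alerts by the probability that they will end in a filed report, so analysts start with the riskiest ones. The features, labels, and model choice are assumptions for demonstration only.

```python
# A minimal sketch of AML alert triage: rank alerts by modeled risk so
# analysts review the most suspicious first. All data here is synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)

# Hypothetical historical alerts: [amount z-score, countries touched,
# prior alerts on the account]; label 1 means a report was filed.
X_hist = rng.normal(size=(2000, 3))
y_hist = (X_hist.sum(axis=1) + rng.normal(0, 0.5, 2000) > 1.5).astype(int)

triage = LogisticRegression().fit(X_hist, y_hist)

# Score today's queue and present it in descending order of risk.
X_today = rng.normal(size=(5, 3))
risk = triage.predict_proba(X_today)[:, 1]
for i in np.argsort(-risk):
    print(f"alert {i}: risk score {risk[i]:.2f}")
```

The analyst still makes the call on every alert; the model only decides the order of the queue and records why each alert was ranked where it was.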
AI based Intelligent Customer Insights
Periodic reviews of customer accounts are performed as part of a financial service organization’s risk management process, to ensure the institution is not unwittingly being used for illegal activities. As a practice, accounts and individuals that represent a higher risk undergo these reviews more often than lower-risk entities. For these higher-risk accounts, additional scrutiny is performed in the form of enhanced due diligence.
This process involves not only looking at government and public watch lists and sanctions lists, but also news outlets and business registers to uncover any underlying risks. As one would expect, such investigations took up the majority of the due diligence process because they typically required lengthy, manual searches and validation that a name really was the individual or entity under review.
With modern technologies like entity link analysis to identify connections between entities based on shared criteria, as well as #naturallanguageprocessing to gain context from structured and unstructured text, much of this investigation process can be automated. By using AI to perform the initial search and review of a large number of articles and information sources, financial institutions gain greater consistency and the ability to record the research results and methodology.
Much like the AML alert triage example previously mentioned, the key is not to automate analysts out of the process. Instead, AI automates the data gathering and initial review to focus the analysts on reviewing the most pertinent information, providing their feedback on the accuracy of those sources and making the ultimate decision on the customer’s risk level.
Analytics for Financial Fraud Detection
Innovation in the payments space is at a level not seen in decades. From mobile payments to peer-to-peer payments⁷ to real-time payments, there is a growing number of payment services, channels and rails for consumers and businesses alike. But these myriad options give fraudsters plenty of openings for exploitation as well.
Easy-to-exploit issues with these new payment services include their speed and lack of transactional and customer behavioral history. These issues put financial institutions and payment processors in a difficult position. If they block a transaction, they could negatively impact a legitimate user, leading the user to either abandon the platform or use a competitor instead.
If the transaction is approved and it is fraudulent, it erodes trust in the payment provider and leads to a loss. Traditional fraud detection systems were designed for a relatively slow-moving fraud environment. Once a new fraud pattern was discovered, a detection rule or model would be created over a matter of weeks or months, tested and then put into production to uncover fraud that fit those known fraud typologies.
Obviously, the weakness of this approach is that it takes too long and relies on identifying the fraud pattern first. In the time it takes to identify the fraud pattern, develop the model, and put it into use, consumers and the institution could experience considerable fraud losses. In addition, fraudsters, aware of this deficiency, can quickly and continuously change the fraud scheme to evade detection.
Case Studies of Financial Crime Technology
Let us now explore some use cases of financial crime technology and how companies benefited from it in reducing fraud.
1. MasterCard
To help acquirers better evaluate merchants, MasterCard created an anti-fraud solution using proprietary MasterCard data on a platform called MATCH that maintains data on hundreds of millions of fraudulent businesses and handles nearly one million inquiries each month. As the volume of data in its platform grew over the years, MasterCard staff found that its homegrown relational database management system lookup solution was no longer the best option to satisfy the growing and increasingly complex needs of MATCH users.
Photo by CardMapr on Unsplash
Realizing that there was an opportunity to deliver substantially better value to its customers, MasterCard turned to the Cloudera Enterprise Data Hub. After successfully building, integrating, and incorporating security into its EDH, MasterCard added Cloudera Search and other tools and workloads to access, search, and secure more data.
2. United Overseas Bank (Asia)
The challenge UOB faced lay in the data limitations of its legacy systems. With legacy databases, banks are restricted in both the amount and the variety of data they can use. As a result, they miss key data attributes that are necessary for anti-money laundering, transaction monitoring, and customer analytics engines to work effectively. UOB established the Enterprise Data Hub⁸ as the principal data platform that, every day, ingests two petabytes of transaction, customer, trade, deposit, and loan data and a range of unstructured data, including voice and text.
3. Bank Danamon (Indonesia)
Bank Danamon is one of Indonesia’s largest financial institutions, offering corporate and small business banking, consumer banking, treasury and capital markets. Bank Danamon uses a machine-learning platform for real-time customer marketing, fraud detection, and anti-money laundering activities. The platform integrates data from about 50 different systems and drives machine-learning applications. Using #machinelearning on aggregated behavior and transaction data in real time has helped Bank Danamon reduce marketing costs, identify new patterns of fraud, and deepen customer relationships.
This is the Best Time to Implement AI for Financial Crime Detection
Financial crime and corruption are at epidemic levels and many countries are unable to significantly reduce corruption. Regulators and financial institutions are looking to innovative AI technology to fix problems that have grown beyond their ability to solve with intuition and existing tools alone. To justify cognitive initiatives, financial services organizations need to show real return on value in such investments.
IBM has been able to demonstrate this value in a variety of use cases, as shown in published client success stories. A misunderstanding about artificial intelligence is the belief that it will replace employees. However, the financial crime analyst is and should always be an essential part of this process. AI, process automation and #advancedanalytics are tools that can perform analyses and tasks in a fraction of the time it would take an employee.
Yet, the ultimate decision-making power still lies with those analysts, investigators and compliance officers for whom this technology provides greater insight and eliminates tedious task work. This augmented intelligence is the next phase of the fight against financial crime, and one that only together financial institutions, regulators and technology partners can win.
What do you think? Is the current technology capable of addressing rising fraud cases and financial crime? Share your comments below and contribute to the discussion on Have We Reached The Phase Of Smart Financial Crime Detection?
Works Cited
¹McKinsey Report, ²Banking Fraud, ³Financial Institutions, ⁴Anti-Money Laundering, ⁵Payment Fraud Detection, ⁶Advanced Analytics, ⁷Peer-to-Peer Payments, ⁸Enterprise Data Hub
More from David Yakobovitch:
Listen to the HumAIn Podcast | Subscribe to my newsletter | https://medium.com/towards-artificial-intelligence/have-we-reached-the-phase-of-smart-financial-crime-detection-9f3d98fb488 | ['David Yakobovitch'] | 2020-12-17 20:01:08.149000+00:00 | ['Opinion', 'Analysis', 'News', 'Artificial Intelligence', 'Technology'] | Title Reached Phase Smart Financial Crime DetectionContent Reached Phase Smart Financial Crime Detection Financial Technology financial crime rise Many people ask question crimecases financial industry rise Banks according McKinsey report¹ lost million dollar last decade alone could worsen criminal upgrade financial crime tactic Financial crime analytics help financial institution investigator detect fraud money laundering ass risk report data prevent financial crime year many case banking fraud² increase despite stringent measure loss continue spike financial institution lacking concrete strategy address growing problem Analytics help pinpoint transaction need scrutiny identifying needle haystack financial data Photo Bermix Studio Unsplash 1 success rate recovering stolen fund financial service industry realized traditional approach dealing financial crime working Across ecosystem regulatory authority enforcement agency financial institutions³ working together disrupt financial crime requires proactive approach predict manage risk posed people organization merely comply rule regulation challenge faced financial institution regarding moneylaundering activity increased substantially globalization era Additionally rising menace financial crime counterfeiting money launderers become sophisticated effectiveness antimoney laundering policy heightened regulatory scrutiny probability bank facing rigid penalty reputation loss case shortcoming AML management increased good example tool used financial crime detection AMLOCK enterprise level endtoend financial crime management solution integrates best antimoney laundering⁴ antifraud measure effectively identify manage report financial crime provides various feature cater profiling risk categorization transaction monitoring reporting requirement financial institution Features form part offering line AML Anti Money Laundering regulation article explore current practice financial crime detection use case explore future look financial technology fraud reduction Overview Criminals pervasive determination identify exploit vulnerability throughout financial service industry ability collaborate innovate necessitates proactive approach towards responding individual event disrupting crime network Combating financialcrime complementary generating revenue big data analytical capability enable bank personalize product offering also underpin effective approach spotting responding criminal behavior outpace fraudsters financial institution payment processor need quicker agile approach payment fraud detection⁵ Instead relying predefined model application need ability quickly adapt emerging fraud activity implement rule stop fraud type organization able adjust detection model model interoperable datascience machine learning open source AI technique using vendor addition eliminate fraud traveling one area channel another undetected aggregating transactional nontransactional behavior across various channel provider greater context spot seemingly innocuous pattern connect complex fraud scheme Artificial Intelligence Financial Crime Detection Within financial institution uncommon high falsepositive rate notification potential suspicious activity result 
filing suspicious transaction report AML alert high false positive norm reason combination dated technology incomplete inaccurate data Traditional detection system provide inaccurate result due outdated rule peer group creating static segmentation customer type based limited demographic detail Photo Jp Valery Unsplash addition account data within institution fragmented incomplete housed multiple location factor part reason alert AML key area apply artificialintelligence advanced analytics⁶ RPA technology gather greater insight understand transactional pattern across larger scale eliminate tedious aspect investigation timeconsuming low value AI augment investigation process provide analyst likely result driving faster informed decision le effort AI based Intelligent Customer Insights Periodic review customer account performed part financial service organization’s risk management process ensure institution unwittingly used illegal activity practice account individual represent higher risk undergo review often lowerrisk entity higherrisk account additional scrutiny performed form enhanced due diligence process involves looking government public watch list sanction list also news outlet business register uncover underlying risk one would think lesscommon investigation took majority due diligence process typically required lengthy manual search validation name individual entity review modern technology like entity link analysis identify connection entity based shared criterion well naturallanguageprocessing gain context structured unstructured text much investigation process automated using AI perform initial search review large number article information source financial institution gain greater consistency ability record research result methodology Much like AML alert triage example previously mentioned key automate analyst process Instead AI automates data gathering initial review focus analyst reviewing pertinent information providing feedback accuracy source making ultimate decision customer’s risk level Analytics Financial Fraud Detection Innovation payment space level seen decade mobile payment peertopeer payments⁷ realtime payment growing number payment service channel rail consumer business alike myriad option also give fraudsters plenty opening exploitation well Easytoexploit issue new payment service include speed lack transactional customer behavioral history issue put financial institution payment processor difficult position block transaction could negatively impact legitimate user leading user either abandon platform use competitor instead transaction approved fraudulent erodes trust payment provider lead loss Traditional fraud detection system designed relatively slowmoving fraud environment new fraud pattern discovered detection rule model would created matter week month tested put production uncover fraud fit known fraud typology Obviously weakness approach take long relies identifying fraud pattern first time take identify fraud pattern develop model put use consumer institution could experience considerable fraud loss addition fraudsters aware deficiency quickly continuously change fraud scheme evade detection Case Studies Financial Crime Technology Let u explore use case financial technology company benefited fraud reduction 1 MasterCard help acquirer better evaluate merchant MasterCard created antifraud solution using proprietary MasterCard data platform called MATCH maintains data hundred million fraudulent business handle nearly one million inquiry month volume data 
platform grew year MasterCard staff found homegrown relational database management system lookup solution longer best option satisfy growing increasingly complex need MATCH user Photo CardMapr Unsplash Realizing opportunity deliver substantially better value customer MasterCard turned Cloudera Enterprise Data Hub successfully building integrating incorporating security EDH MasterCard added Cloudera Search tool workload access search secure data 2 United Overseas Bank Asia challenge UOB faced data limitation legacy system legacy database bank restricted amount data well variety result miss key data attribute necessary antimoney laundering transaction monitoring customer analytics engine work effectively UOB established Enterprise Data Hub⁸ principal data platform every day ingests two petabyte transaction customer trade deposit loan data range unstructured data including voice text 3 Bank Danamon Indonesia Bank Danamon one Indonesia’s largest financial institution offering corporate small business banking consumer banking treasury capital market Bank Danamon us machinelearning platform realtime customer marketing fraud detection antimoney laundering activity platform integrates data 50 different system drive machinelearning application Using machinelearning aggregated behavior transaction data real time helped Bank Danamon reduce marketing cost identify new pattern fraud deepen customer relationship Best Time Implement AI Financial Crime Detection Financial crime corruption epidemic level many country unable significantly reduce corruption Regulators financial institution looking innovative AI technology fix problem grown beyond ability solve intuition existing tool alone justify cognitive initiative financial service organization need show real return value investment IBM able demonstrate value variety use case shown client success story outlined white paper misunderstanding artificial intelligence belief replace employee However financial crime analyst always essential part process AI process automation advancedanalytics tool perform analysis task fraction time would take employee Yet ultimate decisionmaking power still lie analyst investigator compliance officer technology provides greater insight eliminates tedious task work augmented intelligence next phase fight financial crime one together financial institution regulator technology partner win think current technology capable addressing rising fraud case financial crime Share comment contribute discussion Reached Phase Smart Financial Crime Detection Works Cited ¹McKinsey Report ²Banking Fraud ³Financial Institutions ⁴AntiMoney Laundering ⁵Payment Fraud Detection ⁶Advanced Analytics ⁷PeertoPeer Payments ⁸Enterprise Data Hub David Yakobovitch Listen HumAIn Podcast Subscribe newsletterTags Opinion Analysis News Artificial Intelligence Technology |
2,081 | A Layman’s Guide to Data Science: How to Become a (Good) Data Scientist | How simple is Data Science?
Sometimes, when you hear data scientists rattle off a dozen algorithms while discussing their experiments or go into the details of TensorFlow usage, you might think that there is no way a layman can master Data Science. Big Data looks like another mystery of the Universe that will be shut up in an ivory tower with a handful of present-day alchemists and magicians. At the same time, you hear from everywhere about the urgent necessity to become data-driven.
The trick is, we used to have only limited and well-structured data. Now, with the global Internet, we are swimming in the never-ending flows of structured, unstructured and semi-structured data. It gives us more power to understand industrial, commercial or social processes, but at the same time, it requires new tools and technologies.
Data Science is merely a 21st century extension of mathematics that people have been doing for centuries. In its essence, it is the same skill of using information available to gain insight and improve processes. Whether it’s a small Excel spreadsheet or a 100 million records in a database, the goal is always the same: to find value. What makes Data Science different from traditional statistics is that it tries not only to explain values, but to predict future trends.
In other words, we use Data Science both to explain the data we already have and to predict what will happen next.
Data Science is a newly developed blend of machine learning algorithms, statistics, business intelligence, and programming. This blend helps us reveal hidden patterns in raw data, which in turn provides insights into business and manufacturing processes.
What should a data scientist know?
To go into Data Science, you need the skills of a business analyst, a statistician, a programmer, and a Machine Learning developer. Luckily, for the first dive into the world of data, you do not need to be an expert in any of these fields. Let’s see what you need and how you can teach yourself the necessary minimum.
Business Intelligence
When we first look at Data Science and Business Intelligence, we see the similarity: they both focus on “data” to provide favorable outcomes, and they both offer reliable decision-support systems. The difference is that while BI works with static and structured data, Data Science can handle high-speed, complex, multi-structured data from a wide variety of data sources. From a practical perspective, BI helps interpret past data for reporting, or Descriptive Analytics, while Data Science analyzes past data to make future predictions, as in Predictive or Prescriptive Analytics.
Theories aside, to start a simple Data Science project, you do not need to be an expert Business Analyst. What you need is to have clear ideas of the following points:
have a question or something you’re curious about;
find and collect relevant data that exists for your area of interest and might answer your question;
analyze your data with selected tools;
look at your analysis and try to interpret findings.
As you can see, at the very beginning of your journey, your curiosity and common sense might be sufficient from the BI point of view. In a more complex production environment, there will probably be separate Business Analysts to do the insightful interpreting. However, it is important to have at least a rough vision of BI tasks and strategies.
Resources
We recommend you to have a look at the following introductory books to feel more confident in analytics:
Introduction To The Basic Business Intelligence Concepts — an insightful article giving an overview of the basic concepts in BI;
Business Intelligence for Dummies — a step-by-step guidance through the BI technologies;
Big Data & Business Intelligence — an online course for beginners;
Business Analytics Fundamentals — another introductory course teaching the basic concepts of BI.
Statistics and probability
Probability and statistics are the basis of Data Science. Statistics is, in simple terms, the use of mathematics to perform technical analysis of data. With the help of statistical methods, we make estimates for further analysis. Statistical methods themselves depend on the theory of probability, which allows us to make predictions. Both statistics and probability are separate and complicated fields of mathematics; however, as a beginner data scientist, you can start with 5 basic statistics concepts:
Statistical features. Things like bias, variance, mean, median, and percentiles are the first stats techniques you would apply when exploring a dataset. They are all fairly easy to understand and implement in code, even at the novice level (see the sketch after this list).
Probability distributions represent the probabilities of all possible values in an experiment. The most common in Data Science are the Uniform Distribution, which is concerned with events that are equally likely to occur; the Gaussian, or Normal, Distribution, where most observations cluster around the central peak (the mean) and the probabilities for values further away taper off equally in both directions in a bell curve; and the Poisson Distribution, similar to the Gaussian but with an added factor of skewness.
Over- and undersampling help to balance datasets. If the majority class is overrepresented, undersampling selects only some of its data to balance it with the minority class. When data is insufficient, oversampling duplicates the minority class values to reach the same number of examples as the majority class has.
Dimensionality reduction. The most common technique used for dimensionality reduction is PCA, which essentially creates vector representations of features showing how important they are to the output, i.e. their correlation.
Bayesian statistics. Finally, Bayesian statistics is an approach applying probability to statistical problems. It provides us with mathematical tools to update our beliefs about random events in light of new data or evidence about those events.
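To ground the first two concepts, here is a minimal Python sketch that draws samples from the distributions above and computes their basic statistical features; the parameters and sample sizes are arbitrary choices for demonstration.

```python
# A minimal sketch: statistical features of samples drawn from the
# distributions discussed above (parameters are arbitrary examples).
import numpy as np

rng = np.random.default_rng(0)

samples = {
    "uniform": rng.uniform(0, 10, 10_000),   # all values equally likely
    "gaussian": rng.normal(5, 2, 10_000),    # bell curve around the mean
    "poisson": rng.poisson(5, 10_000),       # skewed count data
}

for name, x in samples.items():
    print(
        f"{name:>8}: mean={x.mean():.2f} median={np.median(x):.2f} "
        f"var={x.var():.2f} p25={np.percentile(x, 25):.2f} "
        f"p75={np.percentile(x, 75):.2f}"
    )
```

Running this a few times with different parameters is a quick way to build intuition for how the mean, median, variance, and percentiles behave for each distribution.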
Image credit: unsplash.com
Resources
We have selected just a few books and courses that are practice-oriented and can help you feel the taste of statistical concepts from the beginning:
Practical Statistics for Data Scientists: 50 Essential Concepts — a solid practical book that introduces essential tools specifically for data science;
Naked Statistics: Stripping the Dread from the Data — an introduction to statistics in simple words;
Statistics and probability — an introductory online course;
Statistics for data science — a special course on statistics developed for data scientists.
Programming
Data Science is an exciting field to work in, as it combines advanced statistical and quantitative skills with real-world programming ability. Depending on your background, you are free to choose a programming language to your liking. The most popular in the Data Science community are, however, R, Python and SQL.
R is a powerful language specifically designed for Data Science needs. It excels at a huge variety of statistical and data visualization applications, and being open source, it has an active community of contributors. In fact, 43 percent of data scientists use R to solve statistical problems. However, it is difficult to learn, especially if you have already mastered another programming language.
Python is another common language in Data Science. 40 percent of respondents surveyed by O’Reilly use Python as their major programming language. Because of its versatility, you can use Python for almost all steps of data analysis. It allows you to create datasets, and you can literally find any type of dataset you need on Google. Ideal for entry level and easy to learn, Python remains exciting for Data Science and Machine Learning experts thanks to more sophisticated libraries such as Google’s TensorFlow.
SQL (Structured Query Language) is more useful as a data processing language than as an advanced analytical tool. It can help you carry out operations like adding, deleting, and extracting data from a database, as well as performing analytical functions and transforming database structures. Even though NoSQL and Hadoop have become a large component of Data Science, it is still expected that a data scientist can write and execute complex queries in SQL (see the sketch after this list).
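As a small taste of how Python and SQL work together, here is a minimal sketch using Python's built-in sqlite3 module; the payments table and its rows are invented for illustration.

```python
# A minimal sketch of running SQL from Python, using the standard library.
# The payments table and its rows are invented for illustration.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE payments (customer TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO payments VALUES (?, ?)",
    [("alice", 120.0), ("bob", 40.5), ("alice", 99.9), ("carol", 250.0)],
)

# Aggregate with SQL, then continue the analysis in Python.
query = """
    SELECT customer, COUNT(*) AS n, SUM(amount) AS total
    FROM payments
    GROUP BY customer
    ORDER BY total DESC
"""
for customer, n, total in conn.execute(query):
    print(f"{customer}: {n} payments, {total:.1f} total")

conn.close()
```

The same pattern scales up: SQL does the heavy filtering and aggregation inside the database, and Python takes over for statistics, visualization, or machine learning.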
Resources
There are plenty of resources for any programming language and every level of proficiency. We’d suggest visiting DataCamp to explore the basic programming skills needed for Data Science.
If you feel more comfortable with books, the vast collection of O’Reilly’s free programming ebooks will help you choose the language to master.
Image credit: unsplash.com
Machine Learning and AI
Although AI and Data Science usually go hand-in-hand, a large number of data scientists are not proficient in Machine Learning areas and techniques. However, Data Science involves working with large amounts of data sets that require mastering Machine Learning techniques, such as supervised machine learning, decision trees, logistic regression, etc. These skills will help you to solve different data science problems that are based on predictions of major organizational outcomes.
At the entry level, Machine Learning does not require much knowledge of math or programming, just interest and motivation. The basic thing you should know about ML is that at its core lie three main categories of algorithms: supervised learning, unsupervised learning, and reinforcement learning.
Supervised learning is a branch of ML that works on labeled data; in other words, the information you feed to the model comes with a ready answer. Your software learns by making predictions about the output and then comparing them with the actual answers; a minimal sketch of this workflow follows below.
In unsupervised learning, data is not labeled, and the objective of the model is to create some structure from it. Unsupervised learning can be further divided into clustering and association. It is used to find patterns in data, which is especially useful in business intelligence for analyzing customer behavior.
Reinforcement learning is the closest to the way that humans learn, i.e. by trial and error. Here, a performance function is created to tell the model whether what it did brought it closer to its goal or moved it the other way. Based on this feedback, the model learns and then makes another guess; this repeats, and every new guess is better.
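To illustrate the supervised case, here is a minimal sketch using scikit-learn; the bundled iris toy dataset stands in for real project data, and logistic regression is just one of many possible models.

```python
# A minimal supervised-learning sketch with scikit-learn's bundled toy dataset.
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

X, y = load_iris(return_X_y=True)      # labeled data: features X, ready answers y

# Hold out a test set so predictions can be compared with the actual answers.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=42
)

model = LogisticRegression(max_iter=1000)
model.fit(X_train, y_train)            # learn from the labeled examples

predictions = model.predict(X_test)    # predict, then compare with real labels
print(f"Accuracy: {accuracy_score(y_test, predictions):.2f}")
```

The held-out test set is what makes the comparison between predictions and actual answers honest; evaluating on the training data alone would overstate the model's skill.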
With these broad approaches in mind, you have a backbone for analysis of your data and explore specific algorithms and techniques that would suit you the best. | https://medium.com/sciforce/a-laymans-guide-to-data-science-how-to-become-a-good-data-scientist-97927ad51ed8 | [] | 2020-01-06 15:43:36.312000+00:00 | ['Programming', 'Machine Learning', 'Data Science', 'Artificial Intelligence', 'Business Intelligence'] | Title Layman’s Guide Data Science Become Good Data ScientistContent simple Data Science Sometimes hear data scientist shoot dozen algorithm discussing experiment go detail Tensorflow usage might think way layman master Data Science Big Data look like another mystery Universe shut ivory tower handful presentday alchemist magician time hear urgent necessity become datadriven everywhere trick used limited wellstructured data global Internet swimming neverending flow structured unstructured semistructured data give u power understand industrial commercial social process time requires new tool technology Data Science merely 21st century extension mathematics people century essence skill using information available gain insight improve process Whether it’s small Excel spreadsheet 100 million record database goal always find value make Data Science different traditional statistic try explain value predict future trend word use Data Science Data Science newly developed blend machine learning algorithm statistic business intelligence programming blend help u reveal hidden pattern raw data turn provides insight business manufacturing process data scientist know go Data Science need skill business analyst statistician programmer Machine Learning developer Luckily first dive world data need expert field Let’s see need teach necessary minimum Business Intelligence first look Data Science Business Intelligence see similarity focus “data” provide favorable outcome offer reliable decisionsupport system difference BI work static structured data Data Science handle highspeed complex multistructured data wide variety data source practical perspective BI help interpret past data reporting Descriptive Analytics Data Science analyzes past data make future prediction Predictive Analytics Prescriptive Analytics Theories aside start simple Data Science project need expert Business Analyst need clear idea following point question something you’re curious find collect relevant data exists area interest might answer question analyze data selected tool look analysis try interpret finding see beginning journey curiosity common sense might sufficient BI point view complex production environment probably separate Business Analysts insightful interpreting However important least dim vision BI task strategy Resources recommend look following introductory book feel confident analytics Introduction Basic Business Intelligence Concepts — insightful article giving overview basic concept BI Business Intelligence Dummies — stepbystep guidance BI technology Big Data Business Intelligence — online course beginner Business Analytics Fundamentals — another introductory course teaching basic concept BI Statistics probability Probability statistic basis Data Science Statistics simple term use mathematics perform technical analysis data help statistical method make estimate analysis Statistical method dependent theory probability allow u make prediction statistic probability separate complicated field mathematics however beginner data scientist start 5 basic statistic concept Statistical feature Things like bias variance 
mean median percentile many others first stats technique would apply exploring dataset It’s fairly easy understand implement code even novice level Things like bias variance mean median percentile many others first stats technique would apply exploring dataset It’s fairly easy understand implement code even novice level Probability Distributions represent probability possible value experiment common Data Science Uniform Distribution concerned event equally likely occur Gaussian Normal Distribution observation cluster around central peak mean probability value away taper equally direction bell curve Poisson Distribution similar Gaussian added factor skewness represent probability possible value experiment common Data Science Uniform Distribution concerned event equally likely occur Gaussian Normal Distribution observation cluster around central peak mean probability value away taper equally direction bell curve Poisson Distribution similar Gaussian added factor skewness Sampling help balance datasets majority class overrepresented undersampling help select data balance minority class data insufficient oversampling duplicate minority class value number example majority class help balance datasets majority class overrepresented undersampling help select data balance minority class data insufficient oversampling duplicate minority class value number example majority class Dimensionality Reduction common technique used dimensionality reduction PCA essentially creates vector representation feature showing important output ie correlation common technique used dimensionality reduction PCA essentially creates vector representation feature showing important output ie correlation Bayesian Statistics Finally Bayesian statistic approach applying probability statistical problem provides u mathematical tool update belief random event light seeing new data evidence event Image credit unsplashcom Resources selected book course practiceoriented help feel taste statistical concept beginning Practical Statistics Data Scientists 50 Essential Concepts — solid practical book introduces essential tool specifically data science Naked Statistics Stripping Dread Data — introduction statistic simple word Statistics probability — introductory online course Statistics data science — special course statistic developed data scientist Programming Data Science exciting field work combine advanced statistical quantitative skill realworld programming ability Depending background free choose programming language liking popular Data Science community however R Python SQL R powerful language specifically designed Data Science need excels huge variety statistical data visualization application open source active community contributor fact 43 percent data scientist using R solve statistical problem However difficult learn especially already mastered programming language powerful language specifically designed Data Science need excels huge variety statistical data visualization application open source active community contributor fact 43 percent data scientist using R solve statistical problem However difficult learn especially already mastered programming language Python another common language Data Science 40 percent respondent surveyed O’Reilly use Python major programming language versatility use Python almost step data analysis allows create datasets literally find type dataset need Google Ideal entry level easyto learn Python remains exciting Data Science Machine Learning expert sophisticated library Google’s Tensorflow 
another common language Data Science 40 percent respondent surveyed O’Reilly use Python major programming language versatility use Python almost step data analysis allows create datasets literally find type dataset need Google Ideal entry level easyto learn Python remains exciting Data Science Machine Learning expert sophisticated library Google’s Tensorflow SQL structured query language useful data processing language advanced analytical tool help carry operation like add delete extract data database carry analytical function transform database structure Even though NoSQL Hadoop become large component Data Science still expected data scientist write execute complex query SQL Resources plenty resource programming language every level proficiency We’d suggest visiting DataCamp explore basic programming skill needed Data Science feel comfortable book vast collection O’Reilly’s free programming ebooks help choose language master Image credit unsplashcom Machine Learning AI Although AI Data Science usually go handinhand large number data scientist proficient Machine Learning area technique However Data Science involves working large amount data set require mastering Machine Learning technique supervised machine learning decision tree logistic regression etc skill help solve different data science problem based prediction major organizational outcome entry level Machine Learning require much knowledge math programming interest motivation basic thing know ML core lie one three main category algorithm supervised learning unsupervised learning reinforcement learning Supervised Learning branch ML work labeled data word information feeding model ready answer software learns making prediction output comparing actual answer branch ML work labeled data word information feeding model ready answer software learns making prediction output comparing actual answer unsupervised learning data labeled objective model create structure Unsupervised learning divided clustering association used find pattern data especially useful business intelligence analyze customer behavior data labeled objective model create structure Unsupervised learning divided clustering association used find pattern data especially useful business intelligence analyze customer behavior Reinforcement learning closest way human learnie trial error performance function created tell model getting closer goal making go way Based feedback model learns make another guess continues happen every new guess better broad approach mind backbone analysis data explore specific algorithm technique would suit bestTags Programming Machine Learning Data Science Artificial Intelligence Business Intelligence |
2,082 | Better Marketing Newsletter: How to Make Your Content Stand Out in 2021 | Hey y’all,
In this issue, we’ve got articles about mindfulness and marketing, a bunch of 2021 predictions and trends, and an explanation of why you’re so obsessed with getting 10,000 steps in every day.
We launched the Better Marketing Slack Community last week, and it’s been fun to engage in conversations about newsletter platforms, Medium design, gender in marketing campaigns, and more. If you’re interested in connecting with other Better Marketing readers, come join us!
Featured Articles | https://medium.com/better-marketing/better-marketing-newsletter-how-to-make-your-content-stand-out-in-2021-2a13a5943804 | ['Brittany Jezouit'] | 2020-12-18 15:23:28.773000+00:00 | ['Meditation', 'Writing', 'Media', 'Newsletter', 'Marketing'] | Title Better Marketing Newsletter Make Content Stand 2021Content Hey y’all issue we’ve got article mindfulness marketing bunch 2021 prediction trend explanation you’re obsessed getting 10000 step every day launched Better Marketing Slack Community last week it’s fun engage conversation newsletter platform Medium design gender marketing campaign you’re interested connecting Better Marketing reader come join u Featured ArticlesTags Meditation Writing Media Newsletter Marketing |
2,083 | 7 Common Dreams and What They Say About You | 7 Common Dreams and What They Say About You
How to use your dreams to understand yourself better
Photo by Joshua Abner from Pexels
Not many people pay enough attention to their dreams, and here’s why that should change. Dreams are manifestations of unconscious desires and wishes. They are signals from the brain and body.
Carl Jung, a highly reputed Swiss psychiatrist, saw dreams as “the psyche’s attempt to communicate important things to the individual”, and he valued them above all else, as a way of knowing what was really going on.
Dreams help you make connections about your feelings that your conscious self wouldn’t make. Think of them as free therapy sessions in your mind, nudging you to confront your suppressed emotions.
Before we get into the dream interpretations, let me clarify how you can tell when a dream actually means something. PET scans and MRIs have shown that some dreams are mere “data dumps”, where you dispose of excess information that you collected during the day.
Your brain discards “useless” memories, and saves the valuable ones. So, a random acquaintance or something you thought of during the day popping up in your dreams is very normal and may not signify something deep.
However, many recurring dreams reveal unusual and sometimes bizarre symbolism that cannot be written off as a coincidence. These symbols are strongly connected to the psyche and can help dreamers understand themselves much better. | https://medium.com/indian-thoughts/7-common-dreams-and-what-they-say-about-you-c341222c2849 | ['Bertilla Niveda'] | 2020-11-09 07:52:46.573000+00:00 | ['Psychology', 'Dreams', 'Mental Health', 'Philosophy', 'Self'] | Title 7 Common Dreams Say YouContent 7 Common Dreams Say use dream understand better Photo Joshua Abner Pexels many people pay enough attention dream here’s change Dreams manifestation unconscious desire wish signal brain body Carl Jung highly reputed Swiss psychiatrist saw dream “the psyche’s attempt communicate important thing individual” valued else way knowing really going Dreams help make connection feeling conscious self wouldn’t make Think free therapy session mind nudging confront suppressed emotion get dream interpretation let clarify tell dream actually mean something PET scan MRIs shown dream mere “data dumps” dispose excess information collected day brain discard “useless” memory save valuable one random acquaintance something thought day popping dream normal may signify something deep However many recurring dream reveal unusual sometimes bizarre symbolism cannot written coincidence symbol strongly connected psyche help dreamer understand much betterTags Psychology Dreams Mental Health Philosophy Self |
2,084 | When Trade Went Global | When Trade Went Global
A review of Valerie Hansen’s “The Year 1000: When Explorers Connected the World — and Globalization Began”
If I asked you when explorers connected the world for the first time, what would you say? A month ago, I would have said in 1492: Columbus sailed the ocean blue, and a new exchange of food, ideas, animals, people, and microbes changed the world forever. That’s the first time the world was connected in any meaningful way. But with her new book, Valerie Hansen has convinced me of something that sounded illogical: the world had already been connected before Columbus. Columbus and the other 15th century explorers took it a step further, yes, but they were only continuing what had been started by the Vikings around the year 1000.
Valerie Hansen’s most provocative thesis (one which I don’t want to overstate because she doesn’t) is that the process of globalization began and the world was connected for the first time around the year 1000. That is not to say these explorers created a sustained connection like the one seen in the era of the Columbian Exchange, because that is important in itself. Even more provocative is not a thesis of Hansen’s but a subpoint to support it: the Vikings made contact and traded with the Mayans. If this blows your mind, it did mine too. And if I can summarize the two pieces of evidence to support it: 1) the Mayans drew blonde-haired people in their art (okay evidence but explainable if you’re skeptical), and 2) the Mayans drew Viking slatted boats that were visibly different than any that the Mayans ever built. If they had never seen a Viking boat, how would they draw one when no one around them built boats like that? This evidence convinced me that the Vikings did make contact with the Mayans and had more of an effect on the pre-Columbian Americas than I previously thought.
But to zoom in on this point of Hansen’s does not do justice to the entirety of the book. The Vikings’ travels are an important early point, but the larger argument is that the world was much more connected in and around the year 1000 than is often assumed by non-historians. To show this, Hansen takes the reader on a tour of the world focused around the year 1000. Usually, she goes back to about 800 or 900 to give context for each chapter, and she always continues the narrative in abbreviated form until ~1450 to show the effects of trade in the given region or civilization. However, she laser-focuses the narrative around the year 1000 as much as possible, giving credence to the book’s title. This includes an analysis of the Silk Roads, Trans-Saharan trade, and Indian Ocean trade that interwove the economies, cultures, and societies of the majority of the world by c. 1000. Most of this analysis is also focused on people groups that are not given much focus in most histories of globalization, as those histories most often begin the narrative with Columbus.
Hansen is successful in convincing me that the process of globalization began around the year 1000 and that refusing to acknowledge the accomplishments of earlier societies in globalization leads to a history that is too eurocentric. The Year 1000, however, achieves a balance that highlights the achievements of almost all regions of the world. Most importantly, Hansen reveals and analyzes the economic, social, and cultural connections between these distinct regions.
A world map by Sicilian cartographer al-Idrisi (1100–1165). It shows most of Afro-Eurasia.
For fellow teachers of AP World History: Modern, this book is a tremendous primer to enter the world scene in the year 1200. I almost want to call it “The Global Tapestry: The Book” (a reference to the much-maligned name of Unit 1), but it goes back much further in time and also includes many concepts from Unit 2: Networks of Exchange. Some of the people groups explored in The Year 1000 which also overlap with my curriculum include the Kitan/Liao, the Song dynasty of China, the Seljuk empire, Srivijaya, the Angkor empire, the Maya, Great Zimbabwe, Ghana, and Mali. I look forward to using the book to supplement my teaching, and I think it will be a fantastic resource for many others.
The Year 1000 taught me more about this specific period in history than all other books I’ve read combined. This is because of its relentless focus, yet the heavy emphasis on context and causation will help connect readers’ preexisting knowledge to subject matter they may have no background in. For that reason, I would recommend The Year 1000 to anyone even interested in world history. Anyone can pick it up and be successful, and it will serve as foundational knowledge for future learning as well.
I received an eARC of The Year 1000 courtesy of Scribner and NetGalley, but my opinions are my own. | https://medium.com/park-recommendations/when-trade-went-global-e2b97d96c42d | ['Jason Park'] | 2020-05-10 12:08:12.124000+00:00 | ['World', 'Nonfiction', 'Books', 'History', 'Book Review'] | Title Trade Went GlobalContent Trade Went Global review Valerie Hansen’s “The Year 1000 Explorers Connected World — Globalization Began” asked explorer connected world first time would say month ago would said 1492 Columbus sailed ocean blue new exchange food idea animal people microbe changed world forever That’s first time world connected meaningful way new book Valerie Hansen convinced something sounded illogical world already connected Columbus Columbus 15th century explorer took step yes continuing started Vikings around year 1000 Valerie Hansen’s provocative thesis one don’t want overstate doesn’t process globalization began world connected first time around year 1000 say explorer created sustained connection like one seen era Columbian Exchange important Even provocative thesis Hansen’s subpoint support Vikings made contact traded Mayans blow mind mine summarize two piece evidence support 1 Mayans drew blondehaired people art okay evidence explainable you’re skeptical 2 Mayans drew Viking slatted boat visibly different Mayans ever built never seen Viking boat would draw one one around built boat like evidence convinced Vikings make contact Mayans effect preColumbian Americas previously thought zoom point Hansen’s justice entirety book Vikings’ travel important early point larger argument world much connected around year 1000 often assumed nonhistorians show Hansen take reader tour world focused around year 1000 Usually go back 800 900 give context chapter always continues narrative abbreviated form 1450 show effect trade given region civilization However laserfocuses narrative around year 1000 much possible giving credence book’s title includes analysis Silk Roads TransSaharan trade Indian Ocean trade interwove economy culture society majority world c 1000 analysis also focused people group given much focus history globalization history often begin narrative Columbus Hansen successful convincing process globalization began around year 1000 refusing acknowledge accomplishment earlier society globalization lead history eurocentric Year 1000 however achieves balance highlight achievement almost region world importantly Hansen reveals analyzes economic social cultural connection distinct region world map Sicilian cartographer alIdrisi 1100–1165 show AfroEurasia fellow teacher AP World History Modern book tremendous primer enter world scene year 1200 almost want call “The Global Tapestry Book” reference muchmaligned name Unit 1 go back much time also includes many concept Unit 2 Networks Exchange people group explored Year 1000 also overlap curriculum include KitanLiao Song dynasty China Seljuk empire Srivijaya Angkor empire Maya Great Zimbabwe Ghana Mali look forward using book supplement teaching think fantastic resource many others Year 1000 taught specific period history book I’ve read combined relentless focus yet heavy emphasis context causation help connect readers’ preexisting knowledge subject matter may background reason would recommend Year 1000 anyone even interested world history Anyone pick successful serve foundational knowledge future learning well received eARC Year 1000 courtesy Scribner NetGalley opinion ownTags World Nonfiction Books History Book Review |
2,085 | How to Become a Successful Writer in 5 Not-So-Easy Steps | Bleary-eyed I stumble
From the bed to the floor
Feel the carpet squish
Beneath my toes.
Silent, I tiptoe
Down the dimly-lit hall
To the table
Where my journal waits.
Open it, step inside
My mind
Where will it take me today?
What makes a writer successful? Fame? Money? These days both are elusive. If these are your goals, you may not have the stamina for the expedition. The road may be arduous. There’ll be twists and turns along the way. And maybe even a dark forest or two. If you’re ready to find out if you have the fortitude for the journey, read on.
Being a successful writer is finding joy in the journey, finding the sunlight through the trees, and making new discoveries along the way. It’s showing up, engaging in something meaningful, and celebrating your progress.
If you’re ready to begin, here are the 5 not-so-easy steps:
1. Write. Daily.
The inimitable Jane Yolen, author of 386 books, has this magic word she shares with writers — BIC. It stands for Butt in Chair.
This is the first step in becoming a successful writer. You have to write. Daily.
How many times have you fantasized about seeing your name on a book in a bookstore? Or imagined yourself reading to a room full of kids? Or speaking at a writer’s conference?
None of these experiences will happen if you don’t do the work.
Jane Yolen starts every day with a poem. She calls her morning poems “finger exercises” because they wake her up and get her ready to write. She likens the practice to “priming the pump so the water flows.”
Yolen says she gets grumpy if she just rushes into one of her projects without writing her morning poem. I tried it this morning and the result is the poem above. It works. I was able to get into the flow of this article much quicker than I normally do.
How do you prime yourself for writing?
Try writing a poem each morning. Or, sit down and let your thoughts spill onto the page in a stream of consciousness. Then, tackle your big project. ‘Finger exercises’ remove the dams blocking your flow of words.
If you want to be a successful writer, get your butt in the chair, warm up, and write.
2. Play the long game
Kwame Alexander jokes that he is a “26-year overnight success.”
Don’t expect instant success as a writer. Most successful authors take years to finally break out. Judy Blume received rejections for two years and attended writing courses at night before ever successfully publishing a book. Kwame Alexander self-published fourteen books before he finally found an agent.
What is one trait all successful writers share? Determination.
Play the long game. Stick with writing because you love it, because it completes you and gives your life meaning, not because you’re expecting instant success.
Neal Porter, vice-president and publisher of Neal Porter Books, says picture books take 2–3 years from submission to publication. That’s a long time. If you want to be a traditionally-published writer, you’ll have to be patient. Even if you want to self-publish, you’ll still need patience. Make sure your book is the very best version it can be before you share it with the world.
In the SCBWI interview with Jane Yolen, she said she’s never understood all this stuff about writers bleeding onto the page. She’s joyous when she writes. Perhaps we all need to be a bit more joyous and a little less serious when we write. Our joy will shine through in our writing.
If you’re not prepared to play the long game and you can’t find the joy in your writing, it will show.
3. Read
Reading the work of others will help you tune your ear.
First, read for the joy of it. I can’t tell you how many times I’ve started a book and planned to read it with a writer’s eyes, only to get swept away by the story. If you want to study the craft of writing, first read for the sheer joy of it. Get this out of your system. Only then will you be able to go back and read like a writer.
Find standout passages. What makes them special?
Which scenes provoke emotional reactions? Why?
Notice what techniques the writer uses. Look for the pauses in the story — those quiet, powerful scenes followed by loud, thumping action. How does the author blend the two?
How does an author add layers of meaning and depth? What lies under the surface of the story?
How do the scenes fit together?
Look for the rhythm of the writing. Are the sentences short? Are they long? How does the author merge them seamlessly?
What are the conventions of your genre? Read other books in the same genre. Look for the common elements and also for the ways the author distinguishes himself within the genre.
Explore character. What are the protagonist’s flaws? What are his/her quirks? Everyone has them. Give your character some dimension. How does your character charm the reader? Does he or she have any endearing qualities? Are they a loyal friend or a well-meaning fool?
When you tune your reader’s ear, you’ll notice a world of exciting possibilities open to you as a writer. Be bold. Go forth. Explore.
4. Stay in your own lane
Stop looking ahead or over your shoulders at other writers. It may feel like a punch to the gut when you see someone who seems to come out of nowhere and zoom ahead of you, but remember, we’re all on our own journey and your success may not look like someone else’s.
As your writing develops, try to discover your own voice. Sometimes, it helps to write in the style of other authors when you’re starting out. But, if you stick with writing long enough your goal should be to find your own voice and style. Find it by experimenting, taking chances, and being brave.
“Every great or even every very good writer makes the world over according to his own specification. It is his world and no other. This is one of the things that distinguishes one writer from another. Not talent. There’s plenty of that around.”-Raymond Carver
If you’re copying others, your writing won’t provoke an emotional reaction in the reader. Jill Santopolo, author and associate publisher of Philomel Books, looks for emotion in the books she takes on.
“Those books that touch readers are the ones who really sell.” — Jill Santopolo
Stay in your own lane. Stop comparing yourself with others. Be brave, take chances, experiment and practice. Listen to your intuition to help you find and hone your voice.
“Know yourself. Know what matters. What are your priorities? What will you fight for?” — Julie Strauss-Gabel
Where will your writer’s journey take you?
5. Put yourself out there
“It is impossible to live without failing at something unless you live so cautiously that you might as well not have lived at all — in which case, you fail by default” — J.K. Rowling, Harvard Commencement address
Sharing your work with others is scary. It makes you feel vulnerable. Do it anyway. It’s the quickest way to grow.
In her interview with SCBWI on Saturday, Judy Blume suggested looking for someone supportive of your writing journey. She once had a writing teacher who didn’t believe in her, so she left his class. It’s important to get the right people in your corner. You want people who are honest and direct, but also encouraging.
Who will be in your superhero writing team?
Who will challenge you?
Who will hold you accountable?
Who will celebrate with you when you finish your project?
Create your own superhero writing team. Join a writing group. Support others. Help each other grow and develop. When you do, everyone wins.
Face your fears. Hit publish. Find readers for your manuscript and deal with your discomfort because it’s the only way to grow.
Conclusion
If you want to be a successful writer, write every day, play the long game, read, stay in your own lane, and put yourself out there. You won’t regret it.
“It is our choices, Harry, that show what we truly are, far more than our abilities.” — J.K. Rowling, Harry Potter and the Chamber of Secrets
If you enjoyed this story, you may like these:
Becky Grant is an orphan with the ability to harness the magical powers of gemstones which she uses to stop evil Emperor Amaru from taking over Zatonia, land of wise condors, jewel-eyed cave dwellers, and vicious boarmen. Oh, wait! That’s the protagonist of her debut middle grade novel, The Stone Seer. Nevermind. Becky is really a boring adult who loves to drink coffee, sing in the car, and live vicariously through her middle grade characters. | https://medium.com/a-novel-idea/how-to-become-a-successful-writer-in-5-not-so-easy-steps-10a2bf6662b1 | ['Becky Grant'] | 2020-10-16 15:13:05.081000+00:00 | ['Productivity', 'Success', 'Writing Tips', 'Inspiration', 'Writing'] | Title Become Successful Writer 5 NotSoEasy StepsContent Blearyeyed stumble bed floor Feel carpet squish Beneath toe Silent tiptoe dimlylit hall table journal wait Open step inside mind take today make writer successful Fame Money day elusive goal may stamen expedition road may arduous There’ll twist turn along way maybe even dark forest two you’re ready find fortitude journey read successful writer finding joy journey finding sunlight tree making new discovery along way It’s showing engaging something meaningful celebrating progress you’re ready begin 5 notsoeasy step 1 Write Daily inimitable Jane Yolen author 386 book magic word share writer — BIC stand Butt Chair first step becoming successful writer write Daily many time fantasized seeing name book bookstore imagined reading room full kid speaking writer’s conference None experience happen don’t work Jane Yolen start every day poem call morning poem “finger exercises” wake get ready write likens practice “priming pump water flows” Yolen say get grumpy rush one project without writing morning poem tried morning result poem work able get flow article much quicker normally prime writing Try writing poem morning sit let thought spill onto page stream consciousness tackle big project ‘Finger exercises’ remove damn blocking flow word want successful writer get butt chair warmup write 2 Play long game Kwame Alexander joke “26year overnight success” Don’t expect instant success writer successful author take year finally break Judy Blume received rejection two year attended writing course night ever successfully publishing book Kwame Alexander selfpublished fourteen book finally found agent one trait successful writer share Determination Play long game Stick writing love completes give life meaning you’re expecting instant success Neal Porter vicepresident publisher Neal Porter Books say picture book take 2–3 year submission publication That’s long time want traditionallypublished writer you’ll patient Even want selfpublish you’ll still need patience Make sure book best version share world SCBWI interview Jane Yolen said she’s never understood stuff writer bleeding onto page She’s joyous writes Perhaps need bit joyous little le serious write joy shine writing you’re prepared play long game can’t find joy writing show 3 Read Reading work others help tune ear First read joy can’t tell many time I’ve started book planned read writer’s eye get swept away story want study craft writing first read sheer joy Get system able go back read like writer Find standout passage make special scene provoke emotional reaction Notice technique writer us Look pause story — quiet powerful scene followed loud thumping action author blend two author add layer meaning depth lie surface story scene fit together Look rhythm writing sentence short long author merge seamlessly convention genre Read book 
genre Look common element also way author distinguishes within genre Explore character protagonist’s flaw hisher quirk Everyone Give character dimension character charm reader endearing quality loyal friend wellmeaning fool tune reader’s ear you’ll notice world exciting possibility open writer bold Go forth Explore 4 Stay lane Stop looking ahead shoulder writer may feel like punch gut see someone seems come nowhere zoom ahead remember we’re journey success may look like someone else’s writing develops try discover voice Sometimes help write style author you’re starting stick writing long enough goal find voice style Find experimenting taking chance brave “Every great even every good writer make world according specification world one thing distinguishes one writer another talent There’s plenty around”Raymond Carver you’re copying others writing won’t provoke emotional reaction reader Jill Santopolo author associate publisher Philomel Books look emotion book take “Those book touch reader one really sell” — Jill Santopolo Stay lane Stop comparing others brave take chance experiment practice Listen intuition help find hone voice “Know Know matter priority fight for” — Julie StraussGabel writer’s journey take 5 Put “It impossible live without failing something unless live cautiously might well lived — case fail default” — JK Rowling Harvard Commencement address Sharing work others scary make feel vulnerable anyway It’s quickest way grow interview SCBWI Saturday Judy Blume suggested looking someone supportive writing journey writing teacher didn’t believe left class It’s important get right people corner want people honest direct also encouraging superhero writing team challenge hold accountable celebrate finish project Create superhero writing team Join writing group Support others Help grow develop everyone win Face fear Hit publish Find reader manuscript deal discomfort it’s way grow Conclusion want successful writer write every day play long game read stay lane put won’t regret “It choice Harry show truly far abilities” — JK Rowling Harry Potter Chamber Secrets enjoyed story may like Becky Grant orphan ability harness magical power gemstone us stop evil Emperor Amaru taking Zatonia land wise condor jeweleyed cave dweller vicious boarmen Oh wait That’s protagonist debut middle grade novel Stone Seer Nevermind Becky really boring adult love drink coffee sing car live vicariously middle grade charactersTags Productivity Success Writing Tips Inspiration Writing |
2,086 | Are You In The Wilderness Season? | If we held quiet, we could hear the bears, the crunch of leaves and branches underfoot. The soft sounds the cubs made. We watched them streak black through the trees. We didn’t expect to see them. My pop and I took a cabin in Vermont — ages ago, it seems — and the closest we’d ever come to a wild animal were the thoroughbred yearlings he broke. We lived in New York, after all.
But there they were, not a mile from our cabin nestled deep in the woods, and we crouched down low and didn’t dare breathe. I wondered if they could smell us, how we sweated through layers of clothes in terror, awe, and fear. I held my pepper spray, ready. As if a little tube could protect me from a mother charging. My pop rolled his eyes, but asked if I had an extra. By then, we were shaking because they were close. Is it strange to say we could feel their weight in the distance, their hulk? We pressed our stiff bodies into the earth. The ground was cold and yielding, like a grave.
I could taste the dirt, it stained my lips black. I remember the salt in my teeth.
We lay like that for a few minutes and then they were gone. Disoriented, we got lost on the way back, and by the time we could collapse onto the rugs on the wood floors, the day had folded into black.
On the drive back, we kept retelling the story, adding color and contouring the details as my pop and I were prone to do, but a part of me didn’t feel the story belonged to me. We didn’t belong in the woods. We didn’t know its language, couldn’t navigate the terrain. We were tourists, and it wasn’t until we got back to Long Island that I understood the depth of our foreignness.
While we had our guides, maps, and compasses, we could still get lost. There will always be places where our discomfort snuffs out that which is warm and familiar. I couldn’t shake the cold or the bears out of my bones, and I remember a few weeks later riding the subway all day because this was what I knew. I knew every stop on every train from Brooklyn and Queens to Manhattan and the Bronx. Stations that had remnants of 70s and 80s grit and grime, and stops that were bleached clean and whitewashed new.
This is my country.
Sometimes, it seems there’s nothing more monstrous than forcing someone to stew in sadness. We’re desperate to flee the unknown and the possibility of a pain or loss that has no end. We anesthetize and dull the edges. Quiet unnerves us so we switch on televisions, fans — any form of white noise. We’d rather be uncomfortably comfortable than walk through the wild. We embrace noise and constant velocity because should we pause we face our reckoning. We have to deal with all that we’ve been dodging.
When I first moved to Los Angeles, everything felt quiet. I had no subways or distractions or friends. Nothing was familiar, and the bears became expensive cars careening down the 10 or 405. Every day I woke to open-heart surgery. My skin felt like a graft that didn’t take. I had no tube of pepper spray to protect me from the grief of having lost my estranged mother to cancer. A constant sadness that threatened to swallow me whole.
Then, the tsunami of questions. What was I doing here? Why did I leave the comfort of New York? Why did I have to start over? What if I lost everything? What if I failed? What if I had to slouch my way back east in defeat?
And then another layer deep: Should I have said goodbye to my mother before she died? What kind of woman had I become, and did I like this version of me? What life was I living and is what I wanted or intended? Should I go on when I can’t go on?
I used to think of depression as a dark country. Those who suffered from it held a visa that permitted entry, and we had no instruments to navigate our way through and out. We were never promised a return ticket. That country, being the imperialist motherfucker it is, began to encompass moments of fear, uncertainty, unrest, anxiety, and despair. I realized I didn’t have to leave my house to find sadness. The unknown is always just beyond my reach.
I didn’t need a foreign country or stamps on my passport — I could easily get lost in spaces that once felt familiar. I could lose my way coming home.
So, this put me to thinking of my pop and me in the woods in Vermont. How the terror we felt from having gotten lost was never as long as we thought it was or could be. The pain is always temporary, even when we’re convinced we’ll never claw our way out. This year, I tumbled into the wilderness season knowing pepper spray (or a simple solution) wasn’t going to save me. There exists no simple or easy way out — but the road, cabin, or clearing does exist. This much I know to be true.
I would be lying if I told you the following months don’t make me anxious. I have plans — like I had for 2020 — but there are so many unknowns, wild cards, characters resurrected from the dead, and plot twists. Will I ever be able to leave this country? Will I finally have a semblance of financial security? Will I get a German shepherd? What sustains me, what stops me from jumping out of open windows, is the clearing. Knowing every wilderness has its season. Every shape we take is temporary.
And if we hold still and breathe, maybe the bears won’t make a feast of us. Maybe they’ll move on and disappear in the wild, through the trees. | https://felsull.medium.com/are-you-in-the-wilderness-season-bff39e757444 | ['Felicia C. Sullivan'] | 2020-12-29 02:12:47.880000+00:00 | ['Life Lessons', 'Mental Health', 'Self', 'Relationships', 'Writing'] | Title Wilderness SeasonContent held quiet could hear bear crunch leaf branch underfoot soft sound cub made watched streak black tree didn’t expect see pop took cabin Vermont — age ago seems — closest we’d ever come wild animal thoroughbred yearling broke lived New York mile cabin nestled deep wood crouched low didn’t dare breathe wondered could smell u sweated layer clothes terror awe fear held pepper spray ready little tube could protect mother charging pop rolled eye asked extra shaking close strange say could fear weight distance hulk pressed stiff body earth ground cold yielding like grave could taste dirt stained lip black remember salt teeth lay like minute gone Disoriented got lost way back time could collapse onto rug wood floor day folded black drive back kept retelling story adding color contouring detail pop prone part didn’t feel story belonged didn’t belong wood didn’t know language couldn’t navigate terrain tourist wasn’t got back Long Island understand depth foreignness guide map compass could still get lost always place discomfort snuff warm familiar couldn’t shake cold bear bone remember week later riding subway day knew knew every stop every train Brooklyn Queens Manhattan Bronx Stations remnant 70 80 grit grime stop bleached clean whitewashed new country Sometimes seems there’s nothing monstrous forcing someone stew sadness We’re desperate flee unknown possibility pain loss end anesthetize dull edge Quiet unnerves u switch television fan — form white noise We’d rather uncomfortably comfortable walk wild embrace noise constant velocity pause face reckoning deal we’ve dodging first moved Los Angeles everything felt quiet subway distraction friend Nothing familiar bear became expensive car careening 10 405 Every day woke openheart surgery skin felt like graft didn’t take tube pepper spray protect grief lost estranged mother cancer constant sadness threatened swallow whole tsunami question leave comfort New York start lost everything failed slouch way back east defeat another layer deep said goodbye mother died kind woman become like version life living wanted intended go can’t go used think depression dark country suffered visa would permit entry instrument navigate way never promised return ticket country imperialist motherfucker began encompass moment fear uncertainty unrest anxiety despair realized didn’t leave house find sadness unknown always beyond reach didn’t need foreign country stamp passport — could easily get lost space felt familiar could lose way coming home put thinking pop wood Vermont terror felt gotten lost never long thought could pain always temporary even we’re convinced we’ll never claw way year tumbled wilderness season knowing pepper spray simple solution wasn’t going save exists simple easy way — road cabin clearing exist much know true would lying told following month don’t make anxious plan — like 2020— many unknown wild card character resurrected dead plot twist ever able leave country finally semblance financial security get German shepherd sustains stop jumping open window clearing Knowing every wilderness season Every shape take temporary hold still breathe maybe bear won’t make feast u Maybe 
they’ll move disappear wild treesTags Life Lessons Mental Health Self Relationships Writing |
2,087 | One for the Road | FICTION
One for the Road
I drank myself senseless on Christmas Eve. I knew I shouldn’t have done that; I knew it was a bad idea; it was a terrible idea even, but I did it anyway.
The bearded man at the other end of the bar raised his shot glass; he raised it as ceremoniously as a priest during a Mass, then he froze for an instant, tried to steady his swaying body, even though his hand holding the glass remained impeccably motionless, like that of a crane operator, and declared: “One for the road! Ladies and gentlemen, one for the road!”
He downed it in one gulp, and the entire bar — full of patrons and smoke — also downed their drinks with him. I saw the sudden flashes of light, reflected from the bottoms of the raised glasses, flare up all around me, here and there, as if the starry night had crept into this crowded place and taken it over. I remember waiting for them to finish all that collective raising the toast and drinking — I had never been particularly fond of any mass actions, or inactions for that matter. And only then, when the last glass had landed safely on the runway of the counter or a table, did I allow mine to take off.
“Don’t be shy, ladies and gentlemen! Don’t be shy!” he would reappear a few minutes later, as shaky and wobbly as before, and yell: “One for the road! One for the road, ladies and gentlemen! And Merry Christmas to you!”
It took several of those high-proof farewells to knock him off his feet. Too much cordiality defeated him, apparently; so that, before long, two more or less sober Samaritans had to step up and tow him out of there, tow him back home — his insteps dragging on the ground, like twin turntable needles trying to record something. They left two parallel grooves in the fresh snow — so they had recorded something, after all: his path home.
When I got out of there myself — yet without all that dragging and towing — I felt the sidewalk swim beneath me, just as if it tried to catch me off guard and smack me in the face. I saw everything swim and be in constant motion: the snow-caked shops closed for the night; the blazing street lamps forcing me to squint my eyes; the snowplows sailing majestically down the snow-covered streets like the monumental icebreakers that are about to reach and claim the North Pole; the trashcans being discreetly emptied by the warmly dressed garbage men roaming the empty streets — a swarm of nocturnal creatures creeping out of their lairs only after dark. The world, despite the pervading cold, seemed to be in a sudden mood for swimming — so I swam along with it.
I swam down the road, my plump and short legs desperately worked beneath me — like the needles in the hands of one knitting a jumper — doing their fat best not to let me down; my long coat grazed and caressed the snowdrifts, like a delicate and passionate lover.
Hazily, I saw — through the low-placed windows that I passed on my way — the cheery families gathered around their tables: the late Christmas dinners; I saw the warm and colorful blaze of fairy lights pour from each of the top-floor windows; I felt the festive mood suffusing the air; I saw the Christmas trees sag under the oppressive weight of all the sparkling and glittering nonsense being attached to them, as if they were generals with their chests hung with medals. And then I saw a splash of vomit on my right shoe — I must have stepped into something left by that yeller from the bar: I was a shrapnel victim. I dug the tip of my shoe into the nearby snowdrift — a portable shoeshiner, very convenient.
Then it occurred to me; then it struck me — I couldn’t go back home like this; I just couldn’t return home empty-handed like this — it was Christmas Eve for crying and howling out loud.
I staggered back, back to where the shops and flashy display windows could be found. I staggered back to the shops with their shelves bending and buckling under the overflow of presents, toys, mascots, shiny gift-wrapping papers, and everything that one might wish for on a day like this. I staggered back there, yet only to find them all closed and boarded up — such a cruel lack of compassion on a night like this, on such a special night like this. The tree sellers were gone as well and only the occasional heaps of green needles here and there — impossible to miss on the uncompromising whiteness of the fresh snow, like the blots of blood on the crime scene — marked the spots where they had till recently practiced their trade.
Before I realized what I was really doing, I grabbed a spruce growing next to some building — the first one that I had encountered on my way. I kept wrestling and fighting with it, the snow and needles raining on me, on the ground, on the car parked nearby, until the fairly thin trunk gave up and broke with a snap.
Eager to walk away from there as fast as possible, from the amputated tree trunk sticking accusatorily from the snow, from the incriminating evidence of my wooden crime, I cleaned it up; I straightened the spruce up, as if I were adjusting a child’s clothes before dropping it off at school.
I barely climbed the stairs; it was dark in there. The tree kept brushing against the walls of the narrow staircase, showering the needles all the time, leaving a treacherous trail on the steps — I would deal with it later; I would deal with it tomorrow. I found the door to the apartment to be invitingly ajar — a warm light seeped from there, like the comforting heat from a fireplace. I silently walked in.
In the living room, I saw a table; it was all set: full of empty plates; the empty chairs all around it, as if I intruded on a secret meeting of the huddling furniture. Was I too late? Did I miss it? Had they started without me? Was it that bad?
Cursing myself, I propped the spruce against the wall; it poked and tilted the picture hanging there, like one fingering a loose tooth. I wiped my forehead — it was hot in there; too hot for my liking.
Then I saw it: another Christmas tree, fully decorated, large and proud, sitting in the opposite corner of the room: a glittering impostor — they hadn’t even waited for me to do it.
A sudden wave of drowsiness came over me; I could hardly keep my eyes open: I had to lie down. I directed my clumsy and more and more dragging steps toward the bedroom. The door was open; the bed was nicely made up — every tired man’s dream.
I didn’t even bother to take off my coat, much less the shoes — it was my bed, after all, I could afford a hint of slovenliness, once in a while at least. It was warm; it was pleasantly soothing; it was fine.
Then she appeared in the doorway, like a slice of bread jumping out of a toaster.
“Out, out, out of here!” she started screaming right away. “Get out of here, now.”
A tall and balding man dressed in a ridiculous sweater with a horizontal diamond pattern — was that the best he could do? — materialized right next to her; the silly robust grocer.
“Jeez, not him again. Not that guy again,” the baldie squeezed into the room, past the speechless woman; he leaped to my side and started tugging at my coat’s sleeve, like a toothless dog maltreating a trespasser. “Sir, you can’t be here. Sir, you can’t keep coming here.”
“Call the police,” the woman demanded. “Kids, call the police. Tell them that this man is here; he’s here again.”
I saw a duo of girlish little heads peeking around the doorframe and looking like tiny flowers in a boutonnière.
“Sir, it’s not your home,” the baldie went on and on. “Sir, you can’t be here. It’s not your home.”
“Kids, call the cops,” the woman kept wailing. “Where’s the phone? Call the cops.”
“Sir, get up. Sir, it’s not your bed,” the baldie pleaded. “Please go away. It’s not your home.”
“It was my home,” I slurred, burying my head deeper into the pillow — a blissful smile on my face — into the soft bedclothes. “It used to be my home. It was my home once. I only had one for the road.” | https://medium.com/the-nonconformist/one-for-the-road-9b90780fd14e | ['F. R. Foksal'] | 2020-12-29 09:44:40.141000+00:00 | ['Storytelling', 'Books', 'Short Story', 'Fiction', 'Flash Fiction'] | Title One RoadContent FICTION One Road drank senseless Christmas Eve knew shouldn’t done knew bad idea terrible idea even anyway bearded man end bar raised shot glass raised ceremoniously priest Mass froze instant tried steady swaying body even though hand holding glass remained impeccably motionless like crane operator declared “One road Ladies gentleman one road” downed one gulp entire bar — full patron smoke — also downed drink saw sudden flash light reflected bottom raised glass flare around starry night crept crowded place taken remember waiting finish collective raising toast drinking — never particularly fond mass action inaction matter last glass landed safely runway counter table allow mine take “Don’t shy lady gentleman Don’t shy” would reappear minute later shaky wobbly yell “One road One road lady gentleman Merry Christmas you” took several highproof farewell knock foot much cordiality defeated apparently long two le sober Samaritans step tow tow back home — instep dragging ground like twin turntable needle trying record something left two parallel groove fresh snow — recorded something path home got — yet without dragging towing — felt sidewalk swim beneath tried catch guard smack face saw everything swim constant motion snowcaked shop closed night blazing street lamp forcing squint eye snowplow sailing majestically snowcovered street like monumental icebreaker reach claim North Pole trashcans discreetly emptied warmly dressed garbage men roaming empty street — swarm nocturnal creature creeping lair dark world despite pervading cold seemed sudden mood swimming — swam along swam road plump short leg desperately worked beneath — like needle hand one knitting jumper — fat best let long coat grazed caressed snowdrift like delicate passionate lover Hazily saw — lowplaced window passed way — cheery family gathered around table late Christmas dinner saw warm colorful blaze fairy light pour topfloor window felt festive mood suffusing air saw Christmas tree sag oppressive weight sparkling glittering nonsense attached general chest hung medal saw splash vomit right shoe — must stepped something left yeller bar shrapnel victim dug tip shoe nearby snowdrift — portable shoeshiner convenient occurred struck — couldn’t go back home like couldn’t return home emptyhanded like — Christmas Eve cry howling loud staggered back back shop flashy display window could found staggered back shop shelf bending buckling overflow present toy mascot shiny giftwrapping paper everything one might wish day like staggered back yet find closed boarded — cruel lack compassion night like special night like tree seller gone well occasional heap green needle — impossible miss uncompromising whiteness fresh snow like blot blood crime scene — marked spot till recently practiced trade realized really grabbed spruce growing next building — first one encountered way kept wrestling fighting snow needle raining ground car parked nearby fairly thin trunk gave broke snap Eager walk away fast possible amputated tree trunk sticking accusatorily snow incriminating evidence wooden crime cleaned straightened spruce adjusting child’s clothes 
2,088 | My Top Ten Highest Earning Medium Stories for November 2020 | My Top Ten Highest Earning Medium Stories for November 2020
Plus one honorable mention.
Photo by Viacheslav Bublyk on Unsplash
Last month was my best month yet for my earnings on this platform. I made over $15 — something I did not expect. In addition, I achieved one other unexpected milestone: the story that had been chosen for distribution in October got so many views from Medium readers that I was able to see their interests.
So, without any further delay, here are my top ten most popular Medium stories for November 2020, by the amount earned:
A Simple Process for Tracking All Your Goals in Google Sheets
This story made a whole 10 cents in November and has made $1.30 since I published it in January. In the post, I discuss how to set up a dashboard in Google Sheets so you can track all your goals. It’s received over 1000 views, but most of them came from Google. If you want to check it out, here it is: | https://medium.com/writers-blokke/my-top-ten-highest-earning-medium-stories-for-november-2020-b95acdb87304 | ['Erica Martin'] | 2020-12-03 02:59:55.988000+00:00 | ['Medium', 'Analytics', 'Motivation', 'Reading', 'Writing'] |
2,089 | 7 Typical Traits of Medically Unsocial People | Almost all people suffering from social anxiety deny it until they start seeing it infect other parts of their lives. It degrades their health, destroys their relationships, and decimates their dreams, leaving nothing behind but an unfillable void.
No one wants to live like this, and the majority of people don’t even accept it in their minds. But reality begs to differ when the root cause of all their misery is right under their noses, unnoticed. We all want to live happily and become successful in life — you may have the right attitude, a burning desire, and an unshakable persistence to do so. But it doesn’t necessarily translate into success. You know why? Because of this one flaw in your personality — Social Anxiety.
Obviously, it is curable, but not everyone can fight it easily.
Here are the 7 major traits of medically unsocial people. If you find yourself in a similar situation, then it is time to stop denying it and seek some professional help, ASAP. | https://medium.com/mental-health-and-addictions-community/7-typical-traits-of-medically-unsocial-people-675073576b39 | ['Nishu Jain'] | 2020-12-02 17:54:25.381000+00:00 | ['Social Anxiety', 'Personal Development', 'Relationships', 'Mental Health', 'Psychology'] |
2,090 | Yes, Post-Vacation Burnout Is a Thing | Yes, Post-Vacation Burnout Is a Thing
If a holiday is supposed to leave you refreshed and restored, why are you often more tired than when you left?
Photo by Ricardo Gomez Angel on Unsplash
Have you ever come back from vacation feeling like you badly needed, well, a vacation?
Complaining about how exhausted you are after a week in Cancun isn’t going to win you any sympathy from co-workers, but it isn’t unusual to experience a crash, even after a lovely holiday.
It’s increasingly clear that skipping vacation — as more than half of Americans do — is bad for health and productivity, increasing your risk of both depression and heart attacks. It can also contribute to burnout, a syndrome recently defined by the World Health Organization as exhaustion, negativity, and loss of professional efficacy. Multiple studies suggest that detaching from work on vacation makes us more productive and creative.
But time away isn’t always relaxing — particularly if you spend it flying with kids, appeasing in-laws, or checking email — and reentry can be brutal. An overflowing inbox and multiple fires to put out can leave you feeling more drained and frazzled when you return to your desk than when you left.
With post-vacation burnout, as with most things, prevention is better than cure. Here are some tips to help avoid it.
Choose the right vacation
First of all, be sure you’ve planned a vacation that actually allows for recuperation. Occupational psychologist Sabine Sonnentag at the University of Konstanz in Germany has identified four ingredients that make a vacation restorative, but this is also about personal taste: A week-long mountain-climbing trip might be ideal for some as an escape from work, and simply exhausting for others.
Ideally, schedule your vacation with at least a day’s buffer before you have to go back to work, to give yourself time to settle back in, do laundry, get a good night’s sleep.
Set your out-of-office mindset
At work before you leave, take some time to complete your most unpleasant tasks, so you don’t spend your whole vacation thinking about them.
In addition to setting your email vacation response and Slack status, make sure your colleagues know everything they need to do, and designate someone to address any pressing issues that come up while you’re gone to reduce the chance of getting a panicked message that you need to respond to from your beach blanket.
Next, write a detailed, not-too-ambitious to-do list for your first day or two back, so that you can stumble through the first few days of reentry without straining your jet-lagged brain.
On vacation, truly relax
Plan activities you find both relaxing and pleasurable, like idly browsing a bookstore, doing a jigsaw puzzle, or going for an easy hike. The key is to give yourself a break from trying to achieve anything.
Try to maintain some control over how you spend your time and energy. This can be tough on family vacations — especially when traveling with small children — but it’s important to carve out some time to do what you want, even if it means taking turns watching the kids or hiring childcare.
If your needs align perfectly with what your partner and family want to do, great. If not, arrange to strike out on your own at least once, whether for an early morning run or a solo museum visit.
And relaxation doesn’t have to mean downtime, but it should mean abandoning the need to perform. Consider developing a new skill or building on one, such as kayaking or taking a cooking class in the local cuisine. The activity doesn’t have to be physically risky, or even all that hard, just mentally absorbing enough to keep you focused and in a state of “flow.”
Activities that help you develop a new mastery help combat the discouragement and inadequacy that signal burnout on the job. Another upside? It’s hard to check your phone if your device is sealed in a drybag or your hands are covered in focaccia dough.
As for your work-work, do your best to unplug from it: Nearly 30% of Americans work more than they thought they would on vacation, according to a 2018 study by the American Psychological Association. Constantly checking your email undermines the potential benefits of vacation, and may even negatively affect health and well-being afterward. It also disrupts the potential for creative inspiration that can arise from allowing yourself to be a bit bored, or simply letting your mind wander.
Ease your reentry
If you’ve managed to schedule a buffer day or two, use it to decompress, catch up on sleep, and savor the experience. Print out photos, record memories in a journal, or download recipes that will remind you of your trip. Write thank-you notes to your hosts and travel companions. Plan your next adventure.
Once you return to work, be sure to go to bed and get up at your normal times, and don’t try to make up for being gone by resuming work in double time.
And here’s a useful trick to ease yourself back into the pace of office communication: Don’t announce that you’re back at your desk for a day or two, and wait to update your status and out-of-office message.
Recognize that a vacation isn’t a cure-all
Even if your vacation was blissfully relaxing, don’t be surprised if the mood-boosting effects fade soon after your return — that’s normal. If you don’t feel any better after vacation, or you can’t enjoy it because of work stress, however, you might be suffering from work-related burnout.
If you’re struggling with burnout, chances are your vacation was only a temporary fix. The chronic stress associated with burnout syndrome isn’t resolved by a week or two of time off, no matter how perfectly you plan it, notes Irvin Schonfeld, an occupational health researcher at the City University of New York. Schonfeld’s research suggests that people who score high on the Maslach Burnout Inventory may actually have a form of job-stress-induced depression. Emotional exhaustion, which Schonfeld describes as “the core of burnout,” is also highly correlated with depression, “so it’s tricky to say that burnout is a separate phenomenon from depression,” he says.
Finally, remember that vacation isn’t the only time you’re allowed to relax. Find ways to build “deliberate rest” into your day; take opportunities to let your mind wander or be entertained — even at work, or during your commute.
And pretend you’re on vacation over the weekends. Research suggests that when people adopt a “vacation mindset” on weekends, they do less housework, and spend more time eating and having sex. They’re also happier on Mondays. Go figure. | https://forge.medium.com/yes-post-vacation-burnout-is-a-thing-ef614bc7d49f | ['Emily Underwood'] | 2019-08-19 11:01:01.175000+00:00 | ['Vacation', 'Mental Health', 'Productivity', 'Live', 'Burnout'] |
2,091 | Andrea Yates and the Cost of Ignoring Mental Illness | Photo courtesy of the Houston Police Dept.
When a mother kills her children, it’s condemned as the most heinous crime a person can commit — a cold-blooded act of violence against the most vulnerable victims. But is it always so clear cut? In 2001, the case of Andrea Yates made the world ask that question — and to find the answer, they would have to reckon with the ways that religion, patriarchy, and mental illness can destroy a woman.
She was born Andrea Kennedy on July 2, 1962, the youngest of five children in a Catholic family. Friends and classmates remember her as being very active in extracurricular activities and charity work. She was smart, too — a member of the National Honor Society, she graduated valedictorian of her class at Milby High School in Houston, Texas.
But what they might not have known was that this driven perfectionist also struggled with depression and bulimia. But in her desperate need to appear flawless, she never allowed anyone to know what was going on inside her mind.
After graduation, she went on to earn a degree in nursing, then went to work as a registered nurse at a cancer center. It was around this time that she met Russell “Rusty” Yates. Very soon after meeting, they moved in together. Rusty said that she was extremely uncomfortable with her body — dressing and undressing in the closet — and did not enjoy sex. While some might chalk this up to a strict patriarchal upbringing, it stands out as a red flag for a number of mental disorders.
Rusty was a devout follower of the itinerant street preacher Michael Woroniecki. Woroniecki would travel around the country with his wife and kids preaching, mostly on college campuses, his fire-and-brimstone message. The religion he espoused was a stark fundamentalist Christianity with particularly regressive rules for women, who, he preached, were naturally evil “witches” because they came from Eve. Women were not to educate themselves, work outside the home, or use birth control. Wives were expected to submit themselves to their husbands in all matters. Children, as well, were expected to be seen and not heard, and disobedience of any kind was to be punished with spankings or whippings. Mothers who didn’t beat their children, he taught, were condemning them to hell.
Rusty introduced Andrea to Woroniecki’s teachings, and then to Woroneicki himself. Perhaps his strict fundamentalist teachings didn’t seem so foreign to her, since she had grown up in a Catholic household.
Nevertheless, they lived together for two years before getting married, and in February 1994, she gave birth to their first child, Noah. Andrea, now a devout follower of Woroniecki, quit her job and studies to stay home and be a full-time mom. Later, Andrea would admit that after Noah’s birth, she began to have disturbing visions of knives and stabbings, and she even thought she heard the voice of Satan speaking to her. However, she told no one of these troubling visions.
During that time, the Yateses and the Woronieckis became quite close, even considering each other family, and the women often watched each other’s children.
Soon after Noah was born, they had to move from their four-bedroom home in Houston to a small trailer in Seminole, Florida, for a temporary job Rusty had taken. There, thanks to their anti-contraceptive beliefs, Andrea gave birth to two more sons: John in December 1995 and then Paul in September 1997.
During this time, the Yateses kept in contact with Worniecki and his wife through their newsletter, videos, and letters. In their letters, the Woronieckis would often “diagnose” Andrea as being evil. “God knows how wicked you are,” he wrote. “You must accept the reality that your life is under the curse of sin and death . . .” Andrea was subjected to a near-constant stream of hateful messages like this from a man she believed spoke for God.
Shortly after Paul’s birth, the Yateses moved back to Houston — this time, they purchased their home from Woroniecki: a used Greyhound bus that had been converted to a motorhome. There, in that 350-square-foot bus, Andrea was consumed with caring for a newborn, a toddler, and a preschooler. Besides the work of cooking for the family, feeding the older two, nursing the newborn, and cleaning up after everyone, she was constantly changing and washing cloth diapers (disposable diapers were not allowed by Woroniecki) and homeschooling the oldest. On top of that, she was caring for her aging father, who had Alzheimer’s. When friends or family members would question Rusty, or try to point out it was too much stress to put on his wife, he would shrug it off as being Andrea’s job.
Meanwhile the Woronieckis continued condemning Andrea for not disciplining her children more. Apparently their normal childhood behaviors were seen as “disrespectful” and “not what God wants,” and the Woronieckis insisted that by not forcing the children to be more obedient by whipping them, Andrea was damning their souls to hell.
In February 1999, their fourth child, Luke, was born. Four months later, Andrea called Rusty at work, begging for help. He arrived home to see her nearly catatonic, chewing her fingers.
His solution was to take her and the kids for a walk on the beach. He claimed she seemed better after that, but the next day she tried to overdose on her father’s trazodone. Rusty took her to the hospital, where she was diagnosed with major depressive disorder and put on the antidepressant Zoloft. However, she had to be released after a short time as her insurance would not cover further inpatient services.
After she was sent home, she began seeing a new doctor, who put her on the anti-psychotic drug Zyprexa. But at home, she was back under the spell of Woroniecki, who preached that drugs and medical care were of the devil. Andrea promptly flushed all of her Zyprexa down the toilet.
Her mental health spiraled downwards: she was pulling her hair out and leaving bald spots, picking her skin until it bled, and not eating. She began hearing voices telling her to get a knife. One day Rusty came home from work to find Andrea holding a knife to her own throat. He again rushed her to the hospital.
The hospital recommended electroshock therapy, which the couple refused. So the hospital sent her home with a combination of drugs, including the anti-psychotic Haldol, in conjunction with weekly visits to a psychiatrist.
Thankfully, family members managed to convince Rusty that Andrea and the kids needed to get out of that bus. He purchased a three-bedroom, two-bath home in nearby Clear Lake, Texas. Now that she was out of that cramped bus, under a doctor’s close supervision, and taking the appropriate medication, she seemed to recover.
Doctors warned the couple not to have another child, since women who suffer from postpartum depression and psychosis are at a much higher risk with each birth, and the episodes tend to worsen.
However, now that Andrea was seemingly back to normal again, the couple decided to have another child. Rusty called Andrea’s severe postpartum depression, psychosis, and suicide attempts “like having the flu,” and that if she relapsed, they could just put her back on her meds and everything would be fine. The couple either didn’t know or didn’t care that going off psychiatric drugs can itself trigger severe reactions, and later, make it harder to treat the underlying issue.
So in 2000, Andrea stopped taking both her psychiatric drugs and her birth control. In November, their fifth child, Mary, was born.
The following March, Andrea’s father passed away. His death hit her hard, and she began showing symptoms of severe depression: lethargy, picking bald spots on her scalp, not drinking liquids. Over the next few months, she was in and out of psychiatric hospitals and clinics and subjected to an ever-changing mixture of psychiatric drugs. Rusty was warned by doctors not to leave Andrea alone.
But Rusty still would not face the severity of Andrea’s problems. He arranged for his mother to come to the house to help Andrea regularly, but would leave her alone for short periods of time in order to “make her more independent” and, of course, so she wouldn’t become dependent on him and his mother for her “maternal responsibilities.”
One day in May, his mother arrived at their home to find Andrea filling the bathtub at 4:30 in the afternoon. When questioned, she gave vague answers. This scared Rusty’s mother, so Andrea was sent back to the hospital, where she admitted she had thought about drowning herself and her children.
On June 18, 2001, Rusty took her back to her doctor because she was not getting any better. He reports that the doctor was frustrated that none of the drugs seemed to be working, so he told Andrea, “You need to think happy thoughts!”
Two days later — June 20, 2001 — Rusty left for work around 9. His mother was scheduled to come to the home around 10, leaving Andrea alone with the children for an hour.
As soon as Rusty left, Andrea filled the bathtub. She took John, who was 5, into the bathroom where she held him under the water until he was dead. Then she carried him into the master bedroom and carefully laid his body on the bed.
She then brought in Paul, age 3, and repeated the process. Luke, age 2, was next.
She then drowned 6-month-old Mary, but while she was still floating in the tub, Noah, age 7, came in and asked what was wrong with Mary. He tried to run away, but Andrea caught him, then drowned him, too. She left him floating in the tub, but took Mary and laid her in John’s arms on the bed.
She then called 911, insisting they come to the house, but would not say why.
As soon as she got off the phone with 911, she called Rusty and told him to come home right away. He seemed to intuit what had happened, because he asked her, “How many?” and she answered, “All of them.”
When the police arrived, her first words were, “I just killed my kids.” Her hair and clothes were still wet.
At the station, the court-appointed psychiatrists described Andrea as “the sickest person” they had ever seen. She was nearly catatonic, emaciated, filthy, her scalp checkered with bald spots.
Under questioning, she readily confessed to drowning all five of her children. Her reasoning was a delusion built entirely from Woroniecki’s teachings: she said she had killed her children so that they would go to heaven; if she hadn’t “sent them to God” now, they would surely keep “stumbling” and would go to hell.
She said she knew she was already evil — that the Devil was literally inside of her — and damned to hell. So killing them, in her delusion, wouldn’t make any difference to her eternal soul, but would save her children’s.
On July 30, 2001, she was indicted on two counts of capital murder. The prosecution held off charging her for the other three murders as a kind of fall-back: if they failed to get a conviction, they could then bring the other three charges without violating her right not to be tried for the same crime twice (i.e., “double jeopardy”).
Andrea pled not guilty by reason of insanity, an extremely risky strategy. Nationally, only about 1 percent of criminal defendants take this plea, and of those, only about a quarter of them are successful. In addition, Texas has some of the strictest qualifications for an insanity defense. Known as the M’Naghten Rule, defendants must prove both that they have a mental disease or defect and that they could not tell right from wrong at the time of the crime.
Andrea Yates’ trial began on February 18, 2002. While it was clear that she indeed had a mental disease, her ability to tell right from wrong was at the heart of the trial. The efforts she took in planning the crime, such as waiting until Rusty was gone and locking up the family dog, were used to prove she knew what she was doing was wrong.
It also didn’t help that by the time of the trial, Andrea had been under psychiatric care and seemed more normal: she was lucid in a way she hadn’t been when she committed the crime, and her appearance was clean and well-groomed. As much as it hurt her case, the court psychiatrists could not have ethically withheld treatment.
In March 2002, a jury deliberated only three and a half hours before rejecting the insanity defense and finding her guilty. Although the prosecution had sought the death penalty, the jury rejected it. Instead, she was sentenced to life imprisonment with eligibility for parole in 40 years.
While in prison, she was placed on suicide watch, and later, hospitalized for refusing to eat.
Her attorney filed an appeal, and in 2005, the Texas First Court of Appeals reversed her capital murder conviction.
That same year, Rusty divorced her.
In 2006, her retrial began. She again pled not guilty by reason of insanity, and on July 26, 2006, she was found not guilty by reason of insanity and ordered into the custody of a state mental hospital.
She now resides in the low-security Kerrville State Hospital in Kerrville, Texas, where she receives treatment and counseling. In her free time, she makes cards and aprons that are sold anonymously, with the proceeds sent to a fund to help low-income women access mental health services. Every year, her case is brought up for review, but every year, she waives it. It seems Andrea Yates doesn’t want to be released.
Public opinion was, and still is, split into those who see her as a cold-hearted baby killer and those who see her as a victim of her mental illness — and, possibly, manipulation by a maniacal cult leader. Thankfully, in the end, it doesn’t matter — Andrea Yates will most likely never leave the walls of Kerrville State Hospital alive. | https://delanirbartlette.medium.com/andrea-yates-and-the-cost-of-ignoring-mental-illness-b9e2f1f598ae | ['Delani R. Bartlette'] | 2019-05-13 13:01:00.995000+00:00 | ['True Crime', 'Crime', 'Mental Health', 'Postpartum Depression', 'Psychology'] |
2,092 | 3 Open Source Tools for Ethical AI | #1 — Deon
An ethics checklist for responsible data science, Deon represents a starting point for teams to evaluate considerations related to advanced analytics and machine learning applications from data collection through deployment.
The nuanced discussions spurred by Deon can ensure that risks inherent to AI-empowered technology do not escalate into threats to an organization’s constituents, reputation, or society more broadly. With AI, the stakes are monumental, yet the dangers are potentially indistinct. Instances of algorithmic malpractice are not always clear cut.
Deon checklist from DrivenData spans ethical considerations from Data Collection to Deployment
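For teams that want to try it, here is a minimal sketch of how the checklist is typically pulled into a project (assuming the open-source deon command-line tool from DrivenData; check the project documentation, as flags and defaults can change between releases):

```bash
# install the checklist tool (it ships as a Python package)
pip install deon

# write the default data ethics checklist into the project;
# deon infers the output format (Markdown here) from the file extension
deon --output ETHICS.md
```

Checking the generated file into version control turns the checklist into a living artifact that the team can revisit, and sign off on, at each stage from data collection through deployment.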
As an example of a use case where Deon could have been implemented to improve data product governance, consider the influence of Russian-crafted fake news on the 2016 election. Though the threat did not stem from AI, the impact of the intentionally misleading media content was amplified by the recommendation algorithms of social media, where controversy begets interaction, and users are pushed towards increasingly extreme belief systems.
This pattern of segmentation is beneficial to the algorithms underlying this technology as it leads to wider decision boundaries between classes — but it is detrimental to society, increasing the potential for foreign actors to sow division and undermine critical institutions such as the sanctity of national elections.
Social media companies have come under fire as a result of their failure to detect and reject fake news content. By failing to act, these firms permitted their artificially intelligent recommendation engines to strengthen the destabilizing impact of the deceptive foreign media.
If these firms had undertaken a systematic review of the potential ethical and social implications of fake news amplified by their algorithmic systems prior to the 2016 election, this effort may have resulted in a more robust plan to root out systematic disinformation campaigns.
With the threat of AI-generated deepfakes looming as an ever more realistic weapon in information warfare, the hazard posed by the current state of unpreparedness is heightened going into the 2020 election. | https://medium.com/atlas-research/ethical-ai-tools-b9d276a49fea | ['Nicole Janeway Bills'] | 2020-12-23 11:48:04.442000+00:00 | ['Finance', 'Machine Learning', 'Data Science', 'Artificial Intelligence', 'Python'] |
2,093 | 10 online courses for those who want to start their own business | As options for learning online continue to expand, a growing number of entrepreneurs are using them to keep their staff, and even themselves, on the cutting edge.
Using tools for online training, including videos, apps, and webinars, rather than sending employees to expensive training classes or bringing in pricey consultants to train on site, can save startups both time and money.
“Small businesses are turning to online training for cost, quality, and access reasons,” says Nate Kimmons, Vice President of enterprise marketing at lynda.com.
“Gone are the days of sending employees off to a two-day, in-person class. Online training serves as a 24/7 resource that the learner can access anytime, anywhere at their own pace from any device. It’s simple to use.”
If you are thinking of trying online training, here are a few things to consider and examples of tools to get you started.
Allow for flexibility.
With face-to-face training, you usually get one chance to soak it all in. But many online programs are on-demand, meaning learners can move at their own pace and watch presentations again and again if needed.
The added flexibility allows everyone to work at his or her own pace and better fit the training into a busy schedule.
Go mobile.
Online education also allows for flexibility across technology formats. Employees can learn at home, on the job, or anywhere they use their smartphone.
Do your research.
Not every online course is worth the money. Check out reviews and feedback offered by users of any given online course.
Coming up now are examples of just ten courses available online, offering curriculums in Entrepreneurship, Marketing, Marketing Psychology, and Coding, to name a few. All vary in price from free all the way up to €200, with links to the courses included for further details.
1. Entrepreneurship: Launching an Innovative Business Specialisation
· Developing Innovative Ideas for New Companies: The First Step in Entrepreneurship
· Innovation for Entrepreneurs: From Idea to Marketplace
· New Venture Finance: Startup Funding for Entrepreneurs
· Entrepreneurship Capstone
Develop your entrepreneurial mindset and skill sets, learn how to bring innovations to market, and craft a business model to successfully launch your new business.
Enrol here.
2. Entrepreneurship Specialization
· Entrepreneurship 1: Developing the Opportunity
· Entrepreneurship 2: Launching your Start-Up
· Entrepreneurship 3: Growth Strategies
· Entrepreneurship 4: Financing and Profitability
· Wharton Entrepreneurship Capstone
Wharton’s Entrepreneurship Specialization covers the conception, design, organisation, and management of new enterprises. This four-course series is designed to take you from opportunity identification through launch, growth, financing and profitability.
Enrol here.
3. How to Start Your Own Business Specialization
· Developing an Entrepreneurial Mind-set: First Step towards Success
· The Search for Great Ideas: Harnessing creativity to empower innovation.
· Planning: Principled, Proposing, Proofing, and Practicing to a Success Plan
· Structure: Building the Frame for Business Growth
· Launch Strategy: 5 Steps to Capstone Experience
· Capstone — Launch Your Own Business!
‘This specialization is a guide to creating your own business. We will cover a progression of topics necessary for successful business creation, including mind-set, ideation, planning, action, and strategy. Rather than just describing what to do, the focus will be on guiding you through the process of actually doing it.’
Enrol here.
4. Entrepreneurship: The Part Time Entrepreneur Complete Course
For people who want to pursue entrepreneurship without giving up their full-time jobs: succeeding with a side gig as a part-time entrepreneur.
Identify and take action on part-time entrepreneurship or side gigs that fit your lifestyle.
Be ready to launch your new business.
“Great course. Focused on Part Time which is nice as other courses are about full time and I am not ready for that. Want to make some money as a freelancer and part-time for now. Educational and Instructor is very motivational and encouraging as well. Highly recommend.”
Enrol here.
Price: €200
5. SEO for SEO Beginners
SEO tutorial for beginners: SEO-optimise your site, get to the top of the search results and increase sales! By Seomatico.
Get up to speed with the fundamental concepts of SEO
Discover how to find the best keywords for your website — ‘keyword research’
Find out how to increase your site’s visibility in the search engine results — ‘on page optimisation’
Learn how to build more authority in your niche than competitors so Google puts you at the top of the search results
Price: FREE
In this SEO tutorial for beginners, you’ll learn about the Three Pillars of Powerful SEO:
1. Keyword Research: How to find keywords that attract visitors who want to buy
2. On Page Optimisation: How to increase your site’s visibility in the search engines
3. Off page optimisation: How to build authority on your site using links so Google knows you have the best content for its users.
Enrol here.
6. Twitter Marketing Secrets 2017: A step-by-step complete guide
Discover social media marketing secrets, gain 25,000+ true Twitter fans & followers, and learn Twitter marketing tips!
Reach 25k highly targeted followers in just weeks.
Attract real and targeted followers with zero money and just 20 minutes a day.
Become an influencer on Twitter and sell products and services right away.
“1000+ highly satisfied students within just 5 days of the course launch”
“Best Twitter Marketing Course on Earth!! This Course will Skyrocket your Twitter Career. I highly recommend taking this course”
Price: €120
Enrol here.
7. Marketing Psychology: How to Get People to Buy More & Fast!
Learn a set of “persuasion & influence tools” you can use to ethically motivate people to action in marketing & business
Create marketing that grabs your customer’s attention, triggers curiosity and interest in your product, and ultimately persuades them to take action and BUY from you.
The psychology of capturing attention, how to get people to think and dream about your brand, and the psychology behind getting people to rave about your product after you’d think they’d gotten sick of seeing it.
How to design simple web pages and marketing materials that boost your conversions
Enrol here.
Price: €200
8. Coding for Writers 1: Basic Programming
Learn to both code and write about code
The course uses JavaScript, but talks about other programming languages as well, providing a survey of common languages. It covers common Computer Science concepts, such as variables, functions, conditionals, and loops; a small taste of these is sketched just after this listing.
Price: €45
Enrol here.
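As a small taste of those building blocks, here is a tiny illustrative JavaScript sketch written for this article (it is not taken from the course materials):

```javascript
// a variable: a word-count target for a draft
const target = 500;

// a function with a conditional: does a draft meet the target?
function meetsTarget(words) {
  if (words >= target) {
    return true;
  }
  return false;
}

// a loop: check several drafts against the target
for (const words of [320, 480, 612]) {
  console.log(words, meetsTarget(words));
}
```

Running it with Node.js prints each word count alongside true or false, which is exactly the kind of exercise such a course builds on.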
9. Smart Marketing with Price Psychology
Improve online marketing success with fundamental psychological pricing research for your business and marketing
Price your product or service to maximize revenue
Understand how consumers perceive and think about prices
Run promotions that make people think they’re getting an amazing deal
Think about your full set of products and services in ways that maximize earnings
Price: €35
Enrol here.
10. Entrepreneurship: 5 keys to building a successful business
Learn the core components to starting a great business from an entrepreneur who made his first million at the age of 24
This course gives you an understanding of how successful entrepreneurs think and shows how to apply that thinking in your own life.
The foundations you’ll need to develop a business idea that truly resonates with consumers and addresses an actual market demand.
Price: €90
Enrol here.
Each of these online courses is very flexible, requiring only a small amount of your time per week. They all have excellent feedback from previous users, and all are accessible through apps available in the various app stores on mobile devices. They all tick three very important boxes.
Launching your own business is very time-consuming and requires your undivided attention. Enrolling in courses like these for your staff, or even for your own benefit, might give your business the insight or kick it needs. | https://medium.com/the-lucey-fund/10-online-courses-for-those-who-want-to-start-their-own-business-a58572b00f1e | ['Ian Lucey'] | 2017-03-30 12:37:39.082000+00:00 | ['Online Courses', 'Startup', 'Education', 'Entrepreneurship', 'SEO'] |
2,094 | Roller Coaster Therapy | Roller Coaster Therapy
Strap in.
Photo by Marc Schaefer on Unsplash
Living with addiction and chronic mental illness and visions of blood-hungry demons gnawing at your heels can be like spending your life on a roller coaster. A one-way trip to nowhere. You get used to the thrills and chills, the highs and the lows. You know once you go up, you’re about to come crashing down. You know what lies around the next bend. And still it’s exhausting. You lose a little of yourself upon the completion of each cycle. It’s a ride that has no exit point. You pulled the lever and hopped on. Or fate pulled it for you and threw you into your seat. Round and round we go. Now you’re whirling and twirling through the twilight sky into oblivion. You scream at the stars. You think about taking off your safety harness, jumping and plummeting to the ground four or five stories below. You couldn’t jump from any lower. The cars are whipping through too fast and you are enshrined in webs of metal. You wouldn’t be the first to jump.
Where is the hope? Where is the healing? When and how does it end?
You ask, and you are not alone.
There are other lost souls traveling with you on other cars. In the beginning they were all screaming and either hanging on for dear life or throwing their arms in the air with reckless abandon. Screaming in excitement or in terror. Now they’ve all gone quiet. Their arms lay limp in their laps. Their eyes are bloodshot and glazed over. They are resigned to the maddening meaningless revolutions. Hope departed from their souls long ago, to be replaced by a leaden weariness, a torpor that has constricted their limbs and anesthetized their minds. They feel no anger, no desperation. Can they still be reached? Can they still be saved? In fact, they may be closer to salvation than most.
Still others clamor for control. They think they can regain their agency in a situation that has spun far outside the arena of their thoughts and emotions. They refuse to surrender to a state of graceless passivity. They analyze, calculate, formulate plans. How can we make this work? How can we use this, this and this to our advantage? Where is the weak point? There has to be one. It’s just a ride, after all. I will not do this forever. I’m smart, I’m strong and I will not be rendered a captive observer to my own life experience. If we work together, we can overcome any challenge. All their ideas result in utter failure. They try again and again. They’re a stubborn group, but even they get worn down by the relentless whirling and spinning of the coaster. Time takes its toll. Soon they slump back in their seats, defeated and dejected. But there is still a glint of fire in their eyes.
People begin yelling at each other, venting their frustration, their fear and their blackest despair. Blame spreads like wildfire. It is a deadly contagion, sickening everyone who hears it and internalizes it. The living dead say nothing, their bodies slumped and their jaws slack. But everyone else is looking for recourse from someone. They want to know why. Why them? They want to know how. How could this possibly be? What did they ever do to deserve such hellish torture? Where are all the people? And, inevitably, the question is asked, are we all dead? Some chew this over. Ponder it. Others discard it immediately as complete insanity. They want practical answers and practical solutions.
The pain is palpable. It seems on the verge of coalescing into a living, breathing organism born of human agony. Finally a woman with a quiet but firm voice cuts through the bickering with a question, ‘What if the person or persons who designed this monstrosity wanted to break our minds and spirits so they could create us anew? We’re already a little broken. We’re all dying. What do we have to lose?’ She is instantly rebuffed by a torrent of indignant responses. ‘I’m not broken. How dare you!’ ‘So your solution is just give in to torture?’ ‘I have my dignity to lose, you bitch! And my life!’ ‘They just get to play god with our bodies and minds without our consent? Fuck them and fuck you!’ The woman takes it all in stride and lays back in her seat.
Then a tall, solidly built man in overalls and a dark t-shirt emerges from the darkness. No one notices him at first. The coaster is speeding along too fast and the discussions and recriminations between passengers have descended into full-blown chaos. People have even taken off their safety harnesses and started taking swings at each other. The man apparently takes no notice. If he does, there is no trace of it on his placid expression. He walks calmly over to the lever and as the coaster comes swooping down he pulls it and the cars come to a screeching halt. A few people who unstrapped themselves tumble over the side of their cars, but quickly pick themselves back up and dust themselves off. There is great rejoicing. The passengers swarm the tall man, hugging him fiercely, rubbing his shoulders and patting him on the back.
When they step further out into the open air they find themselves in the middle of a gigantic deserted theme park, with the rides all lit up and buzzing as if it were opening day.
Everyone is confused and disoriented and absolutely exhausted. But they are free. They are free to walk the earth in whatever direction they choose. What a magnificent gift. Some are licking their lips at the thought of a drink. Others are scratching their arms at the thought of a shot of dope. All of them are angry. All of them are depressed. Some want to end it the first chance they get. This was punishment. This wasn’t treatment.
The living dead are now animated enough to start stumbling towards the exit. Most of the people try to follow their lead. It seems the best way to go. The best way to continue their journey. To firmly cement their freedom and return to the glory of the wider world, in all its beautiful ruin.
But they are quickly corralled by the tall man with the black shirt and overalls. He shakes his head.
“You’re not ready yet.”
One woman collapses to her knees and begins to shriek at the sky. An older gentleman approaches the tall man.
“We’ve done our penance. Enough is enough, sir.”
The man frowns. He looks truly sorry. “No, sir. I promise, on my heart and on my word as a good man of God, you will thank me when this ordeal is done.”
Then he turns back to the crowd and points his finger to a ride off in the distance.
“To the Tilt-A-Whirl!” | https://medium.com/grab-a-slice/roller-coaster-therapy-6be6c8c7a0dd | ["Timothy O'Neill"] | 2020-01-26 20:55:55.695000+00:00 | ['Addiction', 'Mental Health', 'Fiction', 'Psychology', 'Horror'] | Title Roller Coaster TherapyContent Roller Coaster Therapy Strap Photo Marc Schaefer Unsplash Living addiction chronic mental illness vision bloodhungry demon gnawing heel like spending life roller coaster oneway trip nowhere get used thrill chill high low know go you’re come crashing know lie around next bend still it’s exhausting lose little upon completion cycle It’s ride exit point pulled lever hopped fate pulled threw seat Round round go you’re whirling twirling twilight sky oblivion scream star think taking safety harness jumping plummeting ground four five story couldn’t jump lower car whipping fast enshrined web metal wouldn’t first jump hope healing end ask alone lost soul traveling car beginning screaming either hanging dear life throwing arm air reckless abandon Screaming excitement terror they’ve gone quiet arm lay limp lap eye bloodshot glazed resigned maddening meaningless revolution Hope departed soul long ago replaced leaden weariness torpor constricted limb anesthetized mind feel anger desperation still reached still saved fact may closer salvation Still others clamor control think regain agency situation spun far outside arena thought emotion refuse surrender state graceless passivity analyze calculate formulate plan make work use advantage weak point one It’s ride forever I’m smart I’m strong rendered captive observer life experience work together overcome challenge idea result utter failure try They’re stubborn group even get worn relentless whirling spinning coaster Time take toll Soon slump back seat defeated dejected still glint fire eye People begin yelling venting frustration fear blackest despair Blame spread like wildfire deadly contagion sickening everyone hears internalizes living dead say nothing body slumped jaw slack everyone else looking recourse someone want know want know could possibly ever deserve hellish torture people inevitably question asked dead chew Ponder Others discard immediately complete insanity want practical answer practical solution pain palpable seems verge coalescing living breathing organism born human agony Finally woman quiet firm voice cut bickering question ‘What person person designed monstrosity wanted break mind spirit could create u anew We’re already little broken We’re dying lose’ instantly rebuffed torrent indignant response ‘I’m broken dare you’ ‘So solution give torture’ ‘I dignity lose bitch life’ ‘They get play god body mind without consent Fuck fuck you’ woman take stride lay back seat tall solidly built man overall dark tshirt emerges darkness one notice first coaster speeding along fast discussion recrimination passenger descended fullblown chaos People even taken safety harness started taking swing man apparently take notice trace placid expression walk calmly lever coaster come swooping pull car come screeching halt people unstrapped tumble side car quickly pick back dust great rejoicing passenger swarm tall man hugging fiercely rubbing shoulder patting back step open air find middle gigantic deserted theme park ride lit buzzing opening day Everyone confused disoriented absolutely exhausted free free walk earth whatever direction choose magnificent gift licking lip thought drink Others scratching arm thought shot dope angry depressed want end first chance get punishment wasn’t treatment living dead 
animated enough start stumbling towards exit people try follow lead seems best way go best way continue journey firmly cement freedom return glory wider world beautiful ruin quickly corralled tall man black shirt overall shake head “You’re ready yet” One woman collapse knee begin shriek sky older gentleman approach tall man “We’ve done penance Enough enough sir” man frown look truly sorry “No sir promise heart word good man God thank ordeal done” turn back crowd point finger ride distance “To TiltAWhirl”Tags Addiction Mental Health Fiction Psychology Horror |
2,095 | 3rd Annual Global Artificial Intelligence Conference [January 23–25th, 2019] | Click the image to RSVP to the conference
Global Big Data Conference's vendor-agnostic 3rd Annual Global Artificial Intelligence (AI) Conference will be held on January 23rd, 24th, and 25th, 2019, covering all industry verticals (Finance, Retail/E-Commerce/M-Commerce, Healthcare/Pharma/BioTech, Energy, Education, Insurance, Manufacturing, Telco, Auto, Hi-Tech, Media, Agriculture, Chemical, Government, Transportation, etc.). It will be the largest vendor-agnostic conference in the AI space. The conference allows practitioners to discuss AI through the effective use of various techniques.
Join the AIMA Thought Leadership @ bit.ly/AIMA-MeetUp
The large amounts of data created by mobile platforms, social media interactions, e-commerce transactions, and IoT devices provide an opportunity for businesses to tailor their services through the effective use of AI. Proper use of Artificial Intelligence can be a major competitive advantage for any business, considering the vast amount of data being generated.
Artificial Intelligence is an emerging field that allows businesses to effectively mine historical data and better understand consumer behavior. This type of approach is critical for any business to successfully launch its products and better serve its existing markets.
The Annual Global AI Conference has been extended to three days based on feedback from participants. The event will feature many of the industry's AI thought leaders and is acclaimed for its highly interactive sessions. This conference provides insights and potential solutions to AI issues from well-known experts and thought leaders through panel sessions and open Q&A sessions. Speakers will showcase successful industry vertical use cases, share development and administration tips, and educate organizations about how best to leverage AI as a key component in their enterprise architecture. It will also be an excellent networking event for executives (CXOs, VPs, Directors), managers, developers, architects, administrators, data analysts, data scientists, statisticians and vendors interested in advancing, extending or implementing AI.
SPEAKERS
Over 100 leading experts in the Artificial Intelligence field will present at our conference. Please send an email to [email protected] for speaking engagements.
YOU GET TO MEET
You get to meet technical experts and senior, VC, and C-level executives from leading innovators in the AI space (executives from startups to large corporations will be at our conference).
WHO SHOULD ATTEND
CEO, EVP/SVP/VP, C-Level, Director, Global Head, Manager, Decision-makers, Business Executives responsible for AI Initiatives, Heads of Innovation, Heads of Product Development, Analysts, Project managers, Analytics managers, Data Scientist, Statistician, Sales, Marketing, human resources, Engineers, AI & Software Developers, VCs/Investors, AI Consultants and Service Providers, Architects, Networking specialists, Students, Professional Services, Data Analyst, BI Developer/Architect, QA, Performance Engineers, Data Warehouse Professional, Sales, Pre Sales, Technical Marketing, PM, Teaching Staff, Delivery Manager and other line-of-business executives.
WHAT YOU WILL LEARN
You'll get up to speed on emerging techniques and technologies by analyzing case studies, develop new technical skills through in-depth workshops, and share emerging best practices and future trends in AI. The depth and breadth of what's covered at the annual Global AI conference requires multiple tracks/sessions. You can either follow one track from beginning to end or pick the individual sessions that most interest you.
1. Industry Vertical Use Cases (where AI applications are and aren't working, which hot technologies are used to implement AI, how to develop AI applications, etc.)
2. Cognitive Computing
3. Chatbot
4. Data Science, Machine Learning & Deep Learning
5. IoT
6. Security
7. NLP
8. Computer Vision
9. Home Assistant
10. Robotics
11. Neural networks
12. Data Mining and Data Analytics
13. Speech Recognition, Image processing, Unsupervised Learning
14. Workshops
Conference Location
Santa Clara Convention Center, 5001 Great America Parkway, Santa Clara, CA 95054 (Map)
CONFERENCE HIGHLIGHTS | https://medium.com/aimarketingassociation/3rd-annual-global-artificial-intelligence-conference-january-23-25th-2019-5ff91ec70467 | ['Federico Gobbi'] | 2019-01-08 23:25:44.118000+00:00 | ['Machine Learning', 'Data Science', 'Artificial Intelligence', 'Marketing', 'Deep Learning'] | Title 3rd Annual Global Artificial Intelligence Conference January 23–25th 2019Content Click image RSVP conference Global Big Data Conference’s vendor agnostic 3rd Annual Global Artificial Intelligence AI Conference held January 23rd January 24th January 25th 2019 industry vertical Finance RetailECommerceMCommerce HealthcarePharmaBioTech Energy Education Insurance Manufacturing Telco Auto HiTech Media Agriculture Chemical Government Transportation etc largest vendor agnostic conference AI space Conference allows practitioner discus AI effective use various technique Join AIMA Thought Leadership bitlyAIMAMeetUp Large amount data created various mobile platform social medium interaction ecommerce transaction IoT provide opportunity business effectively tailor service effective use AI Proper use Artificial Intelligence major competitive advantage business considering vast amount data generated Artificial Intelligence emerging field allows business effectively mine historical data better understand consumer behavior type approach critical business successfully launch product better serve existing market Annual Global AI Conference extended three day based feedback participant event feature many AI thought leader industry Annual Global AI Conference event acclaimed highly interactive session conference provides insight potential solution address AI issue well known expert thought leader panel session open QA session Speakers showcase successful industry vertical use case share development administration tip educate organization best leverage AI key component enterprise architecture also excellent networking event Executives CXO’s VP Directors Managers developer architect administrator data analyst data scientist statistician vendor interested advancing extending implementing AI SPEAKERS 100 leading expert Artificial Intelligence area present conference Please send email eventsglobalbigdataconferencecom speaking engagement GET MEET get meet technical expert Senior VC Clevel executive leading innovator AI space Executives startup large corporation conference ATTEND CEO EVPSVPVP CLevel Director Global Head Manager Decisionmakers Business Executives responsible AI Intiatives Heads Innovation Heads Product Development Analysts Project manager Analytics manager Data Scientist Statistician Sales Marketing human resource Engineers AI Software Developers VCsInvestors AI Consultants Service Providers Architects Networking specialist Students Professional Services Data Analyst BI DeveloperArchitect QA Performance Engineers Data Warehouse Professional Sales Pre Sales Technical Marketing PM Teaching Staff Delivery Manager lineofbusiness executive LEARN You’ll get speed emerging technique technology analyzing case study develop new technical skill indepth workshop share emerging best practice AI future trend depth breadth what’s covered annual Global AI conference requires multiple trackssessions either follow one track beginning end pick individual session interest 1 Industry Vertical Use Cases AI application workingnot working hot Technologies used implement AI develop AI application etc 2 Cognitive Computing 3 Chatbot 4 Data Science Machine Learning Deep Learning 5 IoT 6 Security 7 NLP 8 Computer Vision 
9 Home Assistant 10 Robotics 11 Neural network 12 Data Mining Data Analytics 13 Speech Recognition Image processing Unsupervised Learning 14 Workshops Conference Location Santa Clara Convention Center 5001 Great America Parkway Santa Clara CA 95054 Map CONFERENCE HIGHLIGHTSTags Machine Learning Data Science Artificial Intelligence Marketing Deep Learning |
2,096 | Microservices — Discovering Service Mesh | Microservices — Discovering Service Mesh
Service interactions in the microservices world deal with many non-functional concerns — service discovery, load balancing, fault tolerance, etc. Service Mesh provides the platform to manage these concerns in a more efficient and cleaner way. In this article, we will understand the framework in more detail along with a sample implementation based on Istio and Spring Boot. This is the 9th part of our learning series on Spring Boot Microservices.
Photo by Ricardo Gomez Angel on Unsplash
Why do we need Service Mesh?
The unique proposition of Service Mesh lies not in "what it offers" but in "how it achieves it". It solves problems that originate in the microservices architecture with a different, more mature perspective. It offers a platform where the non-functional concerns related to service interactions are managed more efficiently, and it ensures these operational concerns are not coupled with the business logic, as was the case with earlier solutions.
We have already discussed multiple microservice patterns as part of our spring-boot learning series, including Service Discovery, Load Balancing, API Gateway, Circuit Breaker, and many others. Before moving further, I assume you have a basic understanding of these patterns, as we will be discussing and referring to them throughout this article. Without that background, it will be difficult to understand "what Service Mesh offers". You can check out our learning series to get insight into these patterns if needed.
If we rewind our previous exercises, we will find that the non-functional concerns are tightly coupled with the application logic. For instance, in our service discovery exercise, we implemented the load balancing on the client service side. In our circuit breaker exercise, we implemented the decorators, again on the client service end. These solutions work fine except that they restrict software maintenance, both from infrastructure and business perspectives. Different technologies, standards, and design approaches across the multiple microservices teams create a diversified set of implementations for the same set of problems. This creates a much bigger problem to solve.
Assume a scenario where we have to implement TLS certificates across all service-to-service communications. If different teams start working on this independently, it will become a long-drawn-out exercise, primarily because the operational logic is bundled together with the application logic. This multiplies the implementation complexity. Teams will be busier resolving inconsistencies than focusing on their core business logic.
The primary objective of Service Mesh is to segregate the non-functional concerns, primarily dealing with connecting, securing, and monitoring services, from the application code. With the help of a separate infrastructure layer, we can enable the non-functional features with almost zero impact on the existing services.
Consider the case of our e-commerce system. We have a Product Catalog Service responsible for product management and a Product Inventory Service responsible for product inventory management. If our portal is interested in getting the product details, the call will look as shown in the figure above. With the traditional approach, the logic of Service Discovery and Circuit Breaker will be implemented along with the application logic of Product Catalog Service. Service Mesh promises to separate this out.
Enabling operational concerns with the help of a separate layer improves the overall maintainability of the system significantly. Also, the changes can be managed more effectively as different teams can focus on different concerns. Development teams can focus on the business logic whereas the DevOps teams can focus on implementing the infrastructure concerns.
How Does Service Mesh Work?
There are multiple service mesh technologies, including Linkerd, Istio, Consul, AWS App Mesh, and many more. They more or less work on the same proxy-based architecture. Each business service is associated with one proxy. So in our case, the Product Catalog Service will have one proxy and the Product Inventory Service will have another. The proxies reside alongside the services, and this is the reason they are termed sidecar proxies.
All the sidecar proxies reside in the data plane. They intercept all the calls to and from the service and enable the operational functionalities through them. The list of operational features is long; a few examples are automatic load balancing, routing, retries, fail-overs, access controls, rate limits, automatic metrics, logs, tracing, etc. Most of these features operate at the request level.
For instance, if Product Catalog Service makes an HTTP call to Product Inventory Service, the sidecar proxy on the Product Catalog end can load balance the call intelligently across all the instances of Product Inventory Service. It can retry the request if it fails. Similarly, the sidecar proxy on the Product Inventory Service side can reject the call if it’s not allowed, or is over the rate limit.
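To make this concrete, here is a minimal sketch of what such a request-level retry policy looks like in Istio, the mesh we use later in this article. The rule targets the product-inventory host from the upcoming exercise; the attempt count and timeout are illustrative assumptions, not values from the article's repository.

apiVersion: networking.istio.io/v1alpha3
kind: VirtualService
metadata:
  name: product-inventory
spec:
  hosts:
    - product-inventory
  http:
    - route:
        - destination:
            host: product-inventory
      # retry a failed call up to 3 times, allowing 2s per attempt (illustrative values)
      retries:
        attempts: 3
        perTryTimeout: 2s

The important point is that this behavior lives entirely in mesh configuration; the calling service's code does not change.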
Another important component in Service Mesh is the control plane, which coordinates the behavior of the proxies and provides APIs to manipulate and measure the mesh. It's responsible for managing the sidecar proxies, ingress/egress gateways, service registry, certificates, and other management aspects.
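As a small, concrete touchpoint (assuming Istio, which this article uses below), the control plane's view of the mesh can be inspected from the command line:

# lists every sidecar proxy known to the control plane and whether its
# configuration is in sync with what the control plane last pushed
$ istioctl proxy-status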
Now that we understand the Service Mesh framework to some extent, let's see how it works on the ground. We will be using Istio, the leading service mesh framework, for our sample implementation. It's an open-source technology supported by Red Hat, Google Cloud, IBM Cloud, Pivotal, Apigee, and other technology leaders.
Sample Implementation
Istio is designed to be platform-independent and supports services deployed over Kubernetes, Consul, or Virtual Machines. We will be using Kubernetes as the underlying deployment platform. If you are new to Kubernetes, I suggest getting a basic understanding of it. You can visit my article on Working with Kubernetes for a high-level overview of this topic. For a detailed overview, you can visit its official website.
Istio provides the platform to enable multiple features in the areas of traffic management, security, and observability. With this exercise, we will focus on enabling the Service Discovery and Circuit Breaker patterns along with a flavor of API Gateway. I have already covered Service Discovery using Netflix Eureka in one of the previous exercises. Similarly, I have captured the Circuit Breaker pattern based on Resilience4j and API Gateway in separate articles. In each of those exercises, the implementation of the patterns is significantly coupled to the application logic. In this exercise, we will implement these patterns independently, outside the services' code.
Service Mesh — Sample Implementation
We will be implementing these patterns in the context of two Spring Boot based microservices — Product Catalog Service and Product Inventory Service. Assuming an external client is interested in getting the product details which include product availability as well, we will see how the API Gateway, Service Discovery, and Circuit Breaker patterns are implemented in this call. We will cover the exercise with the help of the following sections —
Setting up
Installing Kubernetes — We will be installing Minikube, which runs a single-node Kubernetes cluster and is best suited for learning purposes. I am using a Debian machine and used the following commands to download and install the minikube package. You can get the installation instructions for other platforms at https://kubernetes.io/docs/tasks/tools/install-minikube/.
###### downloading package
$ curl -LO https://storage.googleapis.com/minikube/releases/latest/minikube_latest_amd64.deb

###### installing package
$ sudo dpkg -i minikube_latest_amd64.deb

###### starting kubernetes cluster
$ minikube start
Installing Docker — We will not be able to start the Kubernetes cluster without an underlying virtualization technology, such as containers or virtual machines. We will be using the most popular container option here — Docker. The following commands install Docker on my machine. You can check other installation options at https://docs.docker.com/engine/install/.
# installing through convenience scripts for testing purpose
$ curl -fsSL https://get.docker.com -o get-docker.sh
$ sudo sh get-docker.sh

# enabling current user to run docker
$ sudo usermod -aG docker $current_user && newgrp docker
Now that we have Docker installed, you can run minikube start to start the Kubernetes cluster.
In this exercise, we will be building container images for our spring-boot based services. If you are new to this topic, you can get a crash course on this here.
Installing Kubectl — This is the command-line tool to access Kubernetes APIs. We will be using it to manage service deployments.
# installing kubectl on Debian
# (assumes the Kubernetes apt repository has already been added to the sources list)
sudo apt-get install -y kubectl
Installing Istio — Here comes our primary framework. We have already installed its pre-requisites, so its installation should be smooth.
######### downloading latest release of istio
$ curl -L https://istio.io/downloadIstio | sh -

######### including istio on path
$ export PATH=$PWD/bin:$PATH

######### installing istio in demo mode
$ istioctl install --set profile=demo

######### instructing istio to automatically inject envoy sidecar proxies
$ kubectl label namespace default istio-injection=enabled
By running the above commands, a lot of things have happened behind the scenes —

1. We are ready to create and run containerized services with the help of Docker.
2. We are ready with our Kubernetes cluster.
3. Istio has installed ingress and egress gateways to control the incoming and outgoing traffic.
4. Istio is ready to deploy its side-car proxies.
Deploying Services, Enabling Service Discovery
It's time to deploy our services. Let's get the microservices code from our Github repository —
This will fetch the code for all the samples. For the purpose of this exercise, we will be dealing with the samples present in the directory spring-boot/istio-example . Before we jump into creating the container images for our services, run the command $ eval $(minikube docker-env) to use the docker environment available with minikube. This will ensure that all the local images are stored in this environment and referenced correctly at runtime.
Let's create a container image for the Product Inventory Service. Change your working directory to spring-boot/istio-example/product_inventory . Dockerfile is already available for this service —
###### product inventory service #####
FROM adoptopenjdk:11-jre-hotspot as builder
ARG JAR_FILE=target/product_inventory-0.0.1-SNAPSHOT.jar
ADD ${JAR_FILE} app.jar
ENTRYPOINT ["java","-jar","/app.jar"]
Run the following command to build the docker image. This will register the image with the local Docker daemon.
##### building container image for product inventory service
$ docker build -t product-inventory:v0.0.1 .
We need to create deployment and service configurations for our Product Inventory Service. Configurations are already available in the root directory of the service — deployment-def.yaml and service-def.yaml
deployment-def.yaml — creating basic deployment configuration based on the container image product-inventory:v0.0.1
apiVersion: apps/v1
kind: Deployment
metadata:
  name: product-inventory
  namespace: default
  labels:
    app: product-inventory
spec:
  replicas: 1
  selector:
    matchLabels:
      app: product-inventory
  template:
    metadata:
      labels:
        app: product-inventory
    spec:
      containers:
        - name: product-inventory
          image: 'product-inventory:v0.0.1'
          imagePullPolicy: Never
          ports:
            - containerPort: 8080
service-def.yaml — exposing the product inventory service so it is accessible inside the Kubernetes cluster.
apiVersion: v1
kind: Service
metadata:
  labels:
    app: product-inventory
    service: product-inventory
  name: product-inventory
  namespace: default
spec:
  ports:
    - port: 8080
      name: http
  selector:
    app: product-inventory
  type: ClusterIP
Run the following commands to apply these configurations —
$ kubectl apply -f deployment-def.yaml
$ kubectl apply -f service-def.yaml
The above commands have done quite a few things. Our Product Inventory Service is deployed and is accessible in the cluster. Istio has also installed a side-car proxy for this service. If you list the container pods by running kubectl get pods , you will see something like this.
product inventory service — running pod
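The original screenshot is not reproduced here; illustratively (hypothetical values, but consistent with the pod name used in the next paragraph), the output looks roughly like this:

NAME                                READY   STATUS    RESTARTS   AGE
product-inventory-db9686d7d-7xsz5   2/2     Running   0          1m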
The READY column displays how many containers the pod is running. If we investigate the running pod with the command kubectl describe pod product-inventory-db9686d7d-7xsz5 , we will see that the pod has two containers — product-inventory & istio-proxy
This means along with our Product Inventory Service, Istio has already installed the side-car proxy. This proxy has the capability to intercept all the incoming and outgoing requests of the Service.
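A quicker check than reading the full describe output, using standard kubectl rather than anything specific to the article's repository, is to print just the container names:

# prints the container names in the pod, e.g. "product-inventory istio-proxy"
$ kubectl get pod product-inventory-db9686d7d-7xsz5 \
    -o jsonpath='{.spec.containers[*].name}'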
Also by applying service-def.yaml we have instructed Istio to register the service in the service registry with the name product-inventory . Even if we create 10 instances of this service, we can communicate to it by referring just the DNS name — product-inventory . Service Discovery and Load Balancing will continue to happen behind the scenes.
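To see this in action, one optional step (my addition, not part of the article's walkthrough) is to scale the deployment; callers keep using the same DNS name while the sidecar proxies spread requests across the replicas:

# run three replicas; clients still call http://product-inventory:8080 unchanged
$ kubectl scale deployment product-inventory --replicas=3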
Similar to the Product Inventory Service, let's configure and deploy the Product Catalog Service:
##### creating docker image for product-catalog service
$ cd istio-example/product_catalog
$ docker build -t product-catalog:v0.0.1 .

##### deploying product catalog service
$ kubectl apply -f deployment-def.yaml
$ kubectl apply -f service-def.yaml
With this, our Product Catalog Service is up and running. Let's take a quick look at how the service is calling the Product Inventory Service. Open the code for ProductCatalogService.java and check the getProductDetails API.
//get product details api
@GetMapping("/product/{id}")
public Product getProductDetails(@PathVariable String id) {
    Product product = mongoTemplate.findById(id, Product.class);
    ProductInventory productInventory = restTemplate.getForObject(
            "http://product-inventory:8080/inventory/" + id, ProductInventory.class);
    product.setProductInventory(productInventory);
    return product;
}
Here, it refers to the Product Inventory Service by the DNS name product-inventory . We have enabled service discovery for both our services, so we should be able to call the getProductDetails API. But wait! To do this, we must first enable access to the Product Catalog Service from outside our cluster. And this will be done with the help of the Gateway configuration.
Implementing Gateway
This is relatively simple. You can find the gateway configuration in the istio-example root directory under the name gateway-config.yaml .
apiVersion: networking.istio.io/v1alpha3
kind: Gateway
metadata:
  name: product-catalog-gateway
spec:
  selector:
    istio: ingressgateway
  servers:
    - port:
        number: 80
        name: http
        protocol: HTTP
      hosts:
        - '*'
---
apiVersion: networking.istio.io/v1alpha3
kind: VirtualService
metadata:
  name: product-catalog
spec:
  hosts:
    - '*'
  gateways:
    - product-catalog-gateway
  http:
    - match:
        - uri:
            prefix: /product
      route:
        - destination:
            host: product-catalog
            port:
              number: 8080
We are defining a routing rule here, instructing Istio to forward all requests whose path starts with /product to the Product Catalog Service. Apply this configuration by running kubectl apply -f gateway-config.yaml .
Our basic API gateway is ready with the above configuration. We can use it for multiple purposes but let's stick to our basic need. Run the following commands to get the service URL for Product Catalog Service
##### identifying service host and port
$ export INGRESS_PORT=$(kubectl -n istio-system get service istio-ingressgateway -o jsonpath='{.spec.ports[?(@.name=="http2")].nodePort}')
$ export INGRESS_HOST=$(minikube ip)
$ export GATEWAY_URL=$INGRESS_HOST:$INGRESS_PORT
$ echo $GATEWAY_URL ## returns something like 192.168.49.2:30682

##### minikube tunnel facilitates creating a network route
$ minikube tunnel &
Now that we have the URL to access the service, let's run curl to access the product details for test-product-123. This should return the product details successfully.
$ curl http://192.168.49.2:30682/product/test-product-123
Congratulations, you have successfully implemented API Gateway and Service Discovery for our services.
Implementing Circuit Breaker
The Circuit Breaker is also easy to configure. You can find the configuration in the root directory — circuit-breaker-config.yaml .
apiVersion: networking.istio.io/v1alpha3
kind: DestinationRule
metadata:
  name: product-inventory
spec:
  host: product-inventory
  trafficPolicy:
    connectionPool:
      tcp:
        maxConnections: 1
      http:
        http1MaxPendingRequests: 1
        maxRequestsPerConnection: 1
    outlierDetection:
      consecutiveErrors: 1
      interval: 1s
      baseEjectionTime: 3m
      maxEjectionPercent: 100
With this, we are applying a DestinationRule configuration on top of the Product Inventory Service. By setting maxConnections: 1 and http1MaxPendingRequests: 1 , we are instructing Istio to trip the circuit breaker if the service receives requests from more than one connection at the same time. The circuit breaker will remain in an open state and will keep rejecting requests until traffic drops back to a single connection.
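The apply step is not shown in the article at this point; assuming the same workflow as the earlier configurations, it would be:

$ kubectl apply -f circuit-breaker-config.yaml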
You can use any performance testing tool to validate this behavior. You can also use the bundled tool called folio under the directory — sample-client . More details on this tool can be found here.
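As one hedged sketch of such a validation, independent of the article's bundled tool: the flags below are standard options of the fortio load-testing client, but the client pod name and the URL are assumptions rather than the article's setup. Firing concurrent connections from inside the cluster should trip the rule:

# 3 concurrent connections x 30 calls; with maxConnections: 1, a share of
# these should be rejected (HTTP 503) by the product-inventory sidecar
$ kubectl exec -it <fortio-client-pod> -- \
    fortio load -c 3 -qps 0 -n 30 http://product-inventory:8080/inventory/test-product-123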
Next Steps
We successfully implemented API Gateway, Service Discovery, and Circuit Breaker with the help of Istio. We did not update our services to implement these infrastructure concerns. Instead, we used the separate infrastructure layer provided by Istio to enable them. We provided the configurations and Istio did the rest of the magic.
We can use Istio to configure other traffic management aspects including request routing, fault injection, request timeouts, and ingress/egress policies. We can use its security layer to configure TLS certificates, authentication, and authorization. Observability is another important area of offering from Istio. We can use it to enable service monitoring, logs, distributed tracing, telemetry, and other visualizations. Check out more on its features at the Istio website.
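To give one flavor of the traffic-management items above, here is an illustrative fault-injection rule using Istio's VirtualService API (my sketch, not a file from the article's repository); it delays every call to the inventory service so you can observe how the catalog service copes:

apiVersion: networking.istio.io/v1alpha3
kind: VirtualService
metadata:
  name: product-inventory
spec:
  hosts:
    - product-inventory
  http:
    - fault:
        delay:
          # delay 100% of requests by a fixed 5s (illustrative values)
          percentage:
            value: 100
          fixedDelay: 5s
      route:
        - destination:
            host: product-inventory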
The solution provided by Service Mesh looks clean but, as usual, it has some side effects too. It adds additional hops for each call in the form of side-car proxies. In our case, one hop is added at Product Catalog Service and another at Product Inventory Service. The additional hops in the form of proxies need additional resources — CPU and memory. As the number of microservices grows, this can increase the resource overhead considerably. We must keep a watch on this!
Additionally, each of the microservices patterns, when implemented through Service Mesh, does limit the control and capabilities over the operational features to some extent. I am sure this concern will subside with time, as the technology matures. For now, this is one of the best approaches to manage your service to service interactions. | https://medium.com/swlh/microservices-discovering-service-mesh-409ed06b5128 | ['Lal Verma'] | 2020-10-30 19:30:17.377000+00:00 | ['Istio', 'Service Mesh', 'Microservices', 'Spring Boot', 'Software Engineering'] | Title Microservices — Discovering Service MeshContent Microservices — Discovering Service Mesh Service interaction microservices world deal many nonfunctional concern — service discovery load balancing fault tolerance etc Service Mesh provides platform manage concern efficient cleaner way article understand framework detail along sample implementation based Istio Spring Boot 9th part learning series Spring Boot Microservices Photo Ricardo Gomez Angel Unsplash need Service Mesh unique proposition Service Mesh lie “what offers” instead “how achieves it” solves problem originated microservices architecture different mature perspective offer platform nonfunctional concern related service interaction managed efficiently ensures operational concern coupled together business logic case earlier solution already discussed multiple microservice pattern part springboot learning series including Service Discovery Load Balancing API Gateway Circuit Breaker many others moving assume basic understanding pattern discussing referring throughout article background difficult understand “What Service Mesh offers” check learning series get insight pattern needed rewind previous exercise find nonfunctional concern tightly coupled application logic instance service discovery exercise implemented load balancing client service side circuit breaker exercise implemented decorator client service end solution work fine except restrict software maintenance infrastructure business perspective Different technology standard design approach across multiple microservices team create diversified set implementation set problem creates much bigger problem solve Assume scenario implement TLS Certificates across “service service communications” different team start working become longlasting exercise primarily due fact operation logic bundled together application logic increase implementation complexity multifold Teams busier resolving inconsistency rather focusing core business logic primary objective Service Mesh segregate nonfunctional concern primarily dealing connecting securing monitoring service application code help separate infrastructure layer enable nonfunctional feature almost zero impact existing service Consider case ecommerce system Product Catalog Service responsible product management Product Inventory Service responsible product inventory management portal interested getting product detail call look shown figure traditional approach logic Service Discovery Circuit Breaker implemented along application logic Product Catalog Service Service Mesh promise separate Enabling operational concern help separate layer improves overall maintainability system significantly Also change managed effectively different team focus different concern Development team focus business logic whereas DevOps team focus implementing infrastructure concern ServiceMesh Works multiple service mesh technology including Linkerd Istio Consul AWS many le work architecture based proxy business 
service associated one proxy case Product Catalog Service one proxy Product Inventory Service another proxy reside alongside service reason termed — sidecar proxy sidecar proxy reside data plane intercept call service enable operational functionality list operational feature long example could automatic load balancing routing retries failovers access control rate limit automatic metric log tracing etc feature operate request level instance Product Catalog Service make HTTP call Product Inventory Service sidecar proxy Product Catalog end load balance call intelligently across instance Product Inventory Service retry request fails Similarly sidecar proxy Product Inventory Service side reject call it’s allowed rate limit Another important component Service Mesh control plane help coordinating behavior proxy provides API manipulate measure mesh It’s responsible managing sidecar proxy ingressegress gateway service registry certificate management aspect understand Service Mesh framework extent let see work ground using Istio sample implementation leading service mesh framework opensource technology supported Redhat Google Cloud IBM Cloud Pivotal Apigee technology leader Sample Implementation Istio designed platformindependent support service deployed Kubernetes Consul Virtual Machines using Kubernetes underline deployment platform new Kubernetes suggest getting basic understanding visit article Working Kubernetes highlevel overview topic detailed overview visit official website Istio provides platform enable multiple feature area traffic management security observability exercise focus enabling Service Discovery Circuit Breaker pattern along flavor API Gateway already covered Service Discovery using Netflix Eureka one previous exercise Similarly captured Circuit Breaker pattern based Resilience4j API Gateway separate article exercise implementation pattern coupled application logic significantly exercise implement pattern independently service code Service Mesh — Sample Implementation implementing pattern context two Spring Boot based microservices — Product Catalog Service Product Inventory Service Assuming external client interested getting product detail include product availability well see API Gateway Service Discovery Circuit Breaker pattern implemented call cover exercise help following section — Setting Installing Kubernetes — installing Minikube run singlenode Kubernetes cluster best suited learning purpose using Debian machine used following command download install minikube package get installation instruction platform httpskubernetesiodocstaskstoolsinstallminikube curl LO downloading package curl LO httpsstoragegoogleapiscomminikubereleaseslatestminikubelatestamd64deb installing package sudo dpkg minikubelatestamd64deb starting kubernetes cluster minikube start Installing Docker — able start Kubernetes cluster need underline virtualization technology container virtual machine function using popular container option — Docker following command help installing Docker machine check installation option httpsdocsdockercomengineinstall installing convenience script testing purpose curl fsSL httpsgetdockercom getdockersh sudo sh getdockersh enabling current user run docker sudo usermod aG docker currentuser newgrp docker Docker installed run minikube start start Kubernetes cluster exercise building container image springboot based service new topic get crash course Installing Kubectl — commandline tool access Kubernetes APIs using manage service deployment installing kubectl Debian sudo 
aptget install kubectl Installing Istio — come primary framework already installed prerequisite installation smooth curl L downloading latest release istio curl L httpsistioiodownloadIstio sh including istio path export PATHPWDbinPATH installing istio demo mode istioctl install set profiledemo instructing istio automatically inject envoy sidecar proxy kubectl label namespace default istioinjectionenabled running command lot thing happened behind scenes— ready create run containerized service help Docker ready Kubernetes Cluster Istio installed ingres egress gateway control incoming outgoing traffic Istio ready deploy sidecar proxy Deploying Services Enabling Service Discovery time deploy service Lets get microservices code Github repository — get code across sample purpose exercise dealing sample present directory springbootistioexample jump creating container image service run command eval minikube dockerenv use docker environment available minikube ensure local image stored environment referred correctly runtime Lets create container image Product Inventory Service Change working directory springbootistioexampleproductinventory Dockerfile already available service — product inventory service adoptopenjdk11jrehotspot builder ARG JARFILEtargetproductinventory001SNAPSHOTjar ADD JARFILE appjar ENTRYPOINT javajarappjar Run following command build docker image update local docker daemon respectively building container image product inventory service docker build productinventoryv001 need create deployment service configuration Product Inventory Service Configurations already available root directory service — deploymentdefyaml servicedefyaml deploymentdefyaml — creating basic deployment configuration based container image productinventoryv001 apiVersion appsv1 kind Deployment metadata name productinventory namespace default label app productinventory spec replica 1 selector matchLabels app productinventory template metadata label app productinventory spec container name productinventory image productinventoryv001 imagePullPolicy Never port containerPort 8080 servicedefyaml — exposing product inventory service accessible inside Kubernetes cluster apiVersion v1 kind Service metadata label app productinventory service productinventory name productinventory namespace default spec port port 8080 name http selector app productinventory type ClusterIP Run following command apply configuration — kubectl apply f deploymentdefyaml kubectl apply f servicedefyaml command done quite thing Product Inventory Service deployed accessible cluster Istio also installed sidecar proxy service see container pod running kubectl get pod see something like product inventory service — running pod READY column display many container pod running investigate running pod command kubectl describe pod productinventorydb9686d7d7xsz5 see pod two container — productinventory istioproxy mean along Product Inventory Service Istio already installed sidecar proxy proxy capability intercept incoming outgoing request Service Also applying servicedefyaml instructed Istio register service service registry name productinventory Even create 10 instance service communicate referring DNS name — productinventory Service Discovery Load Balancing continue happen behind scene Similar Product Inventory Service let configure deploy Product Catalog Service creating docker image productcatalog service cd istioexampleproductcatalog docker build productcatalogv001 deploying product catalog service kubectl apply f deploymentdefyaml kubectl apply f 
servicedefyaml Product Catalog Service running Lets take quick look service calling Product Inventory Service Open code ProductCatalogServicejava check getProductDetails API get product detail api public Product getProductDetails Product product mongoTemplatefindByIdid Productclass ProductInventory productInventory restTemplategetForObject GetMapping productidpublic Product getProductDetails PathVariable String id Product product mongoTemplatefindByIdid ProductclassProductInventory productInventory restTemplategetForObject httpproductinventory8080inventory id ProductInventoryclass productsetProductInventoryproductInventory return product referring Product Inventory Service DNS name — productinventory enabled service discovery service able make call getProductDetails API wait must enable access Product Catalog Service outside cluster done help Gateway configuration Implementing Gateway relatively simpler find gateway configuration istioexample root directory name gatewayconfigyaml apiVersion networkingistioiov1alpha3 kind Gateway metadata name productcataloggateway spec selector istio ingressgateway server port number 80 name http protocol HTTP host apiVersion networkingistioiov1alpha3 kind VirtualService metadata name productcatalog spec host gateway productcataloggateway http match uri prefix product route destination host productcatalog port number 8080 defining routing rule instructing forward request starting product Product Catalog Service Apply configuration running kubectl apply f gatewayconfigyaml basic API gateway ready configuration use multiple purpose let stick basic need Run following command get service URL Product Catalog Service identifying service host port export INGRESSPORTkubectl n istiosystem get service istioingressgateway jsonpathspecportsnamehttp2nodePort export INGRESSHOSTminikube ip export GATEWAYURLINGRESSHOSTINGRESSPORT echo GATEWAYURL return something like 19216849230682 minikube tunnel facilitate creating network route minikube tunnel URL access service let run curl access product detail testproduct123 return product detail successfully curl http19216849230682producttestproduct123 Congratulations successfully implemented API Gateway Service Discovery service Implementing Circuit Breaker Implementing Circuit Breaker also easy configure find configuration root directory — circuitbreakerconfigyaml apiVersion networkingistioiov1alpha3 kind DestinationRule metadata name productinventory spec host productinventory trafficPolicy connectionPool tcp maxConnections 1 http http1MaxPendingRequests 1 maxRequestsPerConnection 1 outlierDetection consecutiveErrors 1 interval 1 baseEjectionTime 3m maxEjectionPercent 100 applying configuration top Product Inventory Service based DestinationRule setting maxConnections 1 http1MaxPendingRequests 1 instructing activate circuit breaker receives request one connection time circuit breaker remain open state keep rejecting request till start receiving request one connection use performance testing tool validate behavior also use bundled tool called folio directory — sampleclient detail tool found Next Steps successfully implemented API Gateway Service Discovery Circuit Breaker help Istio update service implement infrastructure concern Instead used separate infrastructure layer provided Istio enable provided configuration Istio rest magic use Istio configure traffic management aspect including request routing fault injection request timeouts ingressegress policy use security layer configure TLS certificate authentication authorization 
Observability another important area offering Istio use enable service monitoring log distributed tracing telemetry visualization Check feature Istio website solution provided Service Mesh look clean usual side effect add additional hop call form sidecar proxy case one hop added Product Catalog Service another Product Inventory Service additional hop form proxy need additional resource — CPU memory microservices increase increase resource overhead good extent must keep watch Additionally microservices pattern implemented Service Mesh limit control capability operational feature extent sure concern subside time technology matures one best approach manage service service interactionsTags Istio Service Mesh Microservices Spring Boot Software Engineering |
2,097 | Turning a Page | Turning a Page
Accepting and adapting to changes
I now look out of my eighth-floor window each morning, minus the anxiety I felt post-retirement. I feel no guilt about still being in bed and not out at work, as I know everyone else is also at home.
photo by John- Mark Smith on Unsplash
Whilst they are hurrying and scurrying, preparing for work from home, I too am planning my day. Unlike them, my screen time is limited, and I have the choice to decide what I want to do.
It’s liberating to be the master of your own time, and I no longer envy my colleagues, who are now working from home, managing the household chores and their kids without the help they had previously. Most of them have become experts in half dressing for their online classes, waist down still in their shorts or pyjamas.
Working from home was exciting for a short period, but most people long to go back to their previous routines. The novelty wore off soon enough, and they realised they missed the interaction going out to work provided. They also discovered that the work hours had increased as bosses realised they were now available twenty-four seven.
photo by Helena Lopes on Unsplash
The informal exchange between colleagues during coffee breaks that infused them with ideas and energy needed to take on challenging tasks is no longer an option. Continuous screen time, lack of movement and stress will have an unfavorable effect on their long-term health.
We have all got accustomed to feeling safer and more secure within the confines of home, socializing with a limited circle of friends and family members. Evening walks, jogging, cycling, and gardening are now replacing yoga classes and workout routines at gyms. The die-hard enthusiasts have the option of online courses, but most of us need to be out to avoid mental and emotional stress.
Funny, how we have finally accepted this new lifestyle. The other day, we went out for one of those now rare dinners to a friend's house, and it felt strange. Dressing up felt good, though wearing a mask took away some pleasure. After a long gap of six months of eating healthy and on time, the late dinner and the vast array of dishes were tempting but gave me heartburn.
Returning home past midnight, I wasn't sure if the changes were all that bad. Our new routine is healthier; one leads a more disciplined life, ensuring that the food we take in is more nutritious rather than just tasty. All the immunity boosters taken religiously have made me more aware, and I eat more consciously.
My birthday a few days ago was different from the previous years. My spouse pleasantly surprised me with a gift I least expected. I had once expressed a desire for something and then forgot about it as I realized the expenditure was unnecessary, as this was not the time to pamper one’s vanity.
My better half, who usually forgets my birthday and avoids frivolous expenses, outdid himself by taking me by surprise: going out to purchase the said gift and keeping it hidden till D-day.
I haven’t stopped teasing him about his benevolence, wondering if, as I age, he considers me more endangered now!! Or is it his receding hairline making him more appreciative of my hair?
photo credits to the author Anita Sud
Another new one was the number of friends who attempted to reach out and call. I think all of us have realized the importance and significance of staying in touch and spreading love.
The pandemic may have changed the way we live, but it has not affected our spirit. We are more in touch now with family and friends than before and do not take relationships and occasions for granted. The unpredictability of life has made us aware of people and their significance in our lives.
Today we have only emotional expectations from our friends and family. The healthy and unhealthy competition that existed at work is history now.
We appreciate small things and are less demanding. We have gained freedom from the noise and clutter of the past. One short outing instead of the many trips is good enough now. I marvel and wonder how we packed in so much into a day previously.
Luckily, we humans can attach and detach quickly. We change abodes, habits and lifestyles promptly. Let’s hope 2021 brings happiness, cheer and positive changes to all our lives. | https://medium.com/this-shall-be-our-story/turning-a-page-95042d280580 | ['Anita Sud'] | 2020-09-22 01:01:38.952000+00:00 | ['Life Lessons', 'Mental Health', 'Self-awareness', 'Relationships', 'Life'] |
2,098 | Detect Spam Messages with C# And A CNTK Deep Neural Network | The dataset is a TSV file with only 2 columns of information:
Label: ‘spam’ for a spam message and ‘ham’ for a normal message.
Message: the full text of the SMS message.
I will build a binary classification network that reads in all messages and then predicts, for each message, whether it is spam or ham.
Let’s get started. Here’s how to set up a new console project in .NET Core:
$ dotnet new console -o SpamDetection
$ cd SpamDetection
Next, I need to install required packages:
$ dotnet add package Microsoft.ML
$ dotnet add package CNTK.GPU
$ dotnet add package XPlot.Plotly
$ dotnet add package Fsharp.Core
Microsoft.ML is the Microsoft machine learning package. We will use it to load and process the data from the dataset.
The CNTK.GPU library is Microsoft’s Cognitive Toolkit that can train and run deep neural networks.
And Xplot.Plotly is an awesome plotting library based on Plotly. The library is designed for F# so we also need to pull in the Fsharp.Core library.
The CNTK.GPU package will train and run deep neural networks using your GPU. You’ll need an NVidia GPU and Cuda graphics drivers for this to work.
If you don’t have an NVidia GPU or suitable drivers, the library will fall back and use the CPU instead. This will work but training neural networks will take significantly longer.
CNTK is a low-level tensor library for building, training, and running deep neural networks. The code to build deep neural network can get a bit verbose, so I’ve developed a little wrapper called CNTKUtil that will help you write code faster.
You can download the CNTKUtil files and save them in a new CNTKUtil folder at the same level as your project folder.
Then make sure you’re in the console project folder and create a project reference like this:
$ dotnet add reference ..\CNTKUtil\CNTKUtil.csproj
Now I’m ready to start writing code. I will edit the Program.cs file with Visual Studio Code and add the following code:
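Here’s a minimal sketch of that class, assuming the label sits in the first TSV column and the message in the second:

using Microsoft.ML.Data;

/// <summary>
/// The SpamData class contains one single spam or ham message.
/// </summary>
public class SpamData
{
    [LoadColumn(0)] public string Label { get; set; }
    [LoadColumn(1)] public string Message { get; set; }
}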
The SpamData class holds all the data for one single spam message. Note how each field is tagged with a LoadColumn attribute that will tell the TSV data loading code from which column to import the data.
Unfortunately I can’t train a deep neural network on text data directly. I first need to convert the text to numbers.
I will get to that conversion later. For now I’ll add a class here that will contain the converted text:
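A sketch of that class; the two helper methods are explained below:

using System.Linq;
using Microsoft.ML.Data;

/// <summary>
/// The ProcessedData class contains one message after text featurization.
/// </summary>
public class ProcessedData
{
    public string Label { get; set; }
    public VBuffer<float> Features { get; set; }

    // expand the sparse vector into a regular float array
    public float[] GetFeatures() => Features.DenseValues().ToArray();

    // return 1 for spam and 0 for ham
    public float GetLabel() => Label == "spam" ? 1.0f : 0.0f;
}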
There’s the Label again, but notice how the message has now been converted to a VBuffer and stored in the Features field.
The VBuffer type is a sparse vector. It’s going to store a very large vector with mostly zeroes and only a few nonzero values. The nice thing about the VBuffer type is that it only stores the nonzero values. The zeroes are not stored and do not occupy any space in memory.
The GetFeatures method calls DenseValues to expand the sparse vector and returns it as a float[] that our neural network understands.
And there’s a GetLabel method that returns 1 if the message is spam (indicated by the Label field containing the word ‘spam’) and 0 if the message is not spam.
The features represent the text converted to a sparse vector that we will use to train the neural network on, and the label is the output variable that we’re trying to predict. So here we’re training on encoded text to predict if that text is spam or not.
Now it’s time to start writing the main program method:
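A sketch of the opening lines; the file name spam.tsv and the header flag are placeholders for however you saved the dataset:

static void Main(string[] args)
{
    // set up a machine learning context
    var context = new MLContext();

    // load the dataset in memory (tab is the default separator)
    var data = context.Data.LoadFromTextFile<SpamData>("spam.tsv", hasHeader: true);

    // split the data into a 70% training and a 30% testing partition
    var partitions = context.Data.TrainTestSplit(data, testFraction: 0.3);

    // the rest of the code goes here...
}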
When working with the ML.NET library we always need to set up a machine learning context represented by the MLContext class.
The code calls the LoadFromTextFile method to load the CSV data in memory. Note the SpamData type argument that tells the method which class to use to load the data.
I then use TrainTestSplit to split the data in a training partition containing 70% of the data and a testing partition containing 30% of the data.
Note that I’m deviating from the usual 80–20 split here. This is because the data file is quite small, and so 20% of the data is simply not enough to test the neural network on.
Now it’s time to build a pipeline to convert the text to sparse vector-encoded data. I will use the FeaturizeText component in the ML.NET machine learning library:
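A sketch of the pipeline, including the Fit, Transform and CreateEnumerable calls discussed below:

// build a pipeline that featurizes the message text
var pipeline = context.Transforms.Text.FeaturizeText(
    outputColumnName: "Features",
    inputColumnName: nameof(SpamData.Message));

// initialize the pipeline and transform both partitions
var model = pipeline.Fit(partitions.TrainSet);
var trainingData = model.Transform(partitions.TrainSet);
var testingData = model.Transform(partitions.TestSet);

// convert the transformed data to enumerations of ProcessedData
var training = context.Data.CreateEnumerable<ProcessedData>(trainingData, reuseRowObject: false);
var testing = context.Data.CreateEnumerable<ProcessedData>(testingData, reuseRowObject: false);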
Machine learning pipelines in ML.NET are built by stacking transformation components. Here I am using a single component, FeaturizeText, that converts the text messages in SpamData.Message into sparse vector-encoded data in a new column called ‘Features’.
The FeaturizeText component is a very nice solution for handling text input data. The component performs a number of transformations on the text to prepare it for model training:
Normalize the text (remove punctuation and diacritics, switch to lowercase, etc.)
Tokenize each word.
Remove all stopwords
Extract Ngrams and skip-grams
TF-IDF rescaling
Bag of words conversion
The result is that each message is converted to a vector of numeric values that can easily be processed by a deep neural network.
I call the Fit method to initialize the pipeline, and then call Transform twice to transform the text in the training and testing partitions.
Finally I call CreateEnumerable to convert the training and testing data to an enumeration of ProcessedData instances. So now I have the training data in training and the testing data in testing. Both are enumerations of ProcessedData instances.
But CNTK can’t train on an enumeration of class instances. It requires a float[][] for features and float[] for labels.
So I need to set up four float arrays:
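A sketch of those four arrays:

// convert the enumerations into arrays CNTK can train on
var training_data = training.Select(v => v.GetFeatures()).ToArray();
var training_labels = training.Select(v => v.GetLabel()).ToArray();
var testing_data = testing.Select(v => v.GetFeatures()).ToArray();
var testing_labels = testing.Select(v => v.GetLabel()).ToArray();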
These LINQ expressions set up four arrays containing the feature and label data for the training and testing partitions.
Now I need to tell CNTK what shape the input data has that I’ll train the neural network on, and what shape the output data of the neural network will have:
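A sketch of the two declarations, using the Var helper from the CNTKUtil wrapper:

// infer the input width from the featurized training data
var nodeCount = training_data.First().Length;

// declare the input and output shapes for CNTK
var features = NetUtil.Var(new int[] { nodeCount }, DataType.Float);
var labels = NetUtil.Var(new int[] { 1 }, DataType.Float);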
I don’t know in advance how many dimensions the FeaturizeText component will create, so I simply check the width of the training_data array.
The first Var method tells CNTK that my neural network will use a 1-dimensional tensor of nodeCount float values as input. This shape matches the array returned by the ProcessedData.GetFeatures method.
And the second Var method tells CNTK that I want my neural network to output a single float value. This shape matches the single value returned by the ProcessedData.GetLabel method.
My next step is to design the neural network.
I will use a deep neural network with a 16-node input layer, a 16-node hidden layer, and a single-node output layer. I’ll use the ReLU activation function for the input and hidden layers, and Sigmoid activation for the output layer.
The sigmoid function forces the output of a regression network to a range of 0..1 which means I can treat the number as a binary classification probability. So we can turn any regression network into a binary classification network by simply adding the sigmoid activation function to the output layer.
Here’s how to build this neural network:
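A sketch using the CNTKUtil builder methods:

// build the network: 16-16-1 with ReLU, ReLU and Sigmoid activations
var network = features
    .Dense(16, CNTKLib.ReLU)
    .Dense(16, CNTKLib.ReLU)
    .Dense(1, CNTKLib.Sigmoid)
    .ToNetwork();

// print the architecture to the console
Console.WriteLine("Model architecture:");
Console.WriteLine(network.ToSummary());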
Each Dense call adds a new dense feedforward layer to the network. I am stacking two layers, both using ReLU activation, and then add a final layer with a single node using Sigmoid activation.
Then I use the ToSummary method to output a description of the architecture of the neural network to the console.
Now I need to decide which loss function to use to train the neural network, and how I am going to track the prediction error of the network during each training epoch.
I will use BinaryCrossEntropy as the loss function because it’s the standard metric for measuring binary classification loss.
And I’ll track the error with the BinaryClassificationError metric. This is the number of times (expressed as a percentage) that the model predictions are wrong. An error of 0 means the predictions are correct all the time, and an error of 1 means the predictions are wrong all the time.
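Wired up in code, that looks roughly like this; BinaryClassificationError is a CNTKUtil helper:

// set up the loss function and the classification error metric
var lossFunc = CNTKLib.BinaryCrossEntropy(network.Output, labels);
var errorFunc = NetUtil.BinaryClassificationError(network.Output, labels);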
Next I need to decide which algorithm to use to train the neural network. There are many possible algorithms derived from Gradient Descent that we can use here.
I am going to use the AdamLearner. You can learn more about the Adam algorithm here: https://machinelearningmastery.com/adam...
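A sketch of the learner setup; the learning rate and momentum shown here are reasonable defaults, not values confirmed by the text:

// set up an Adam learner
var learner = network.GetAdamLearner(
    learningRateSchedule: (0.001, 1),
    momentumSchedule: (0.9, 1),
    unitGain: true);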
These configuration values are a good starting point for many machine learning scenarios, but you can tweak them if you like to try and improve the quality of the predictions.
We’re almost ready to train. My final step is to set up a trainer and an evaluator for calculating the loss and the error during each training epoch:
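In code, roughly:

// set up a trainer and an evaluator
var trainer = network.GetTrainer(learner, lossFunc, errorFunc);
var evaluator = network.GetEvaluator(errorFunc);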
The GetTrainer method sets up a trainer which will track the loss and the error for the training partition. And GetEvaluator will set up an evaluator that tracks the error in the test partition.
Now I am finally ready to start training the neural network!
I need to add the following code:
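A sketch of the loop shell; the epoch count and batch size match the description below:

// declare some variables to track progress
var maxEpochs = 10;
var batchSize = 64;
var loss = new double[maxEpochs];
var trainingError = new double[maxEpochs];
var testingError = new double[maxEpochs];
var batchCount = 0;

// train the network during several epochs
for (int epoch = 0; epoch < maxEpochs; epoch++)
{
    // the training and testing code shown below goes here...
}

// show the final results
var finalError = testingError[maxEpochs - 1];
Console.WriteLine();
Console.WriteLine($"Final test error: {finalError:0.00}");
Console.WriteLine($"Final test accuracy: {1 - finalError:0.00}");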
I am training the network for 10 epochs using a batch size of 64. During training I’ll track the loss and errors in the loss, trainingError and testingError arrays.
Once training is done, I show the final testing error on the console. This is the percentage of mistakes the network makes when predicting spam messages.
Note that the error and the accuracy are related: accuracy = 1 - error. So I also report the final accuracy of the neural network.
Here’s the code to train the neural network. This should go inside the for loop:
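A sketch that follows the CNTKUtil batching conventions described below:

// train the network using randomized 64-record batches
loss[epoch] = 0.0;
trainingError[epoch] = 0.0;
batchCount = 0;
training_data.Index().Shuffle().Batch(batchSize, (indices, begin, end) =>
{
    // get the current feature and label batch
    var featureBatch = features.GetBatch(training_data, indices, begin, end);
    var labelBatch = labels.GetBatch(training_labels, indices, begin, end);

    // train the network on the batch
    var result = trainer.TrainBatch(
        new[] {
            (features, featureBatch),
            (labels, labelBatch)
        }, false);
    loss[epoch] += result.Loss;
    trainingError[epoch] += result.Evaluation;
    batchCount++;
});

// show the average loss and error for this epoch
loss[epoch] /= batchCount;
trainingError[epoch] /= batchCount;
Console.Write($"Epoch {epoch}: train loss {loss[epoch]:#.####}, train error {trainingError[epoch]:#.####}, ");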
The Index().Shuffle().Batch() sequence randomizes the data and splits it up into a collection of 64-record batches. The second argument to Batch() is a function that will be called for every batch.
Inside the batch function I call GetBatch twice to get a feature batch and a corresponding label batch. Then I call TrainBatch to train the neural network on these two batches of training data.
The TrainBatch method returns the loss and error, but only for training on the 64-record batch. So I simply add up all these values and divide them by the number of batches in the dataset. That gives me the average loss and error for the predictions on the training partition during the current epoch, and I report this to the console.
So now I know the training loss and error for one single training epoch. The next step is to test the network by making predictions about the data in the testing partition and calculate the testing error.
This code goes inside the epoch loop and right below the training code:
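A sketch of the testing code:

// test the network using 64-record batches
testingError[epoch] = 0.0;
batchCount = 0;
testing_data.Batch(batchSize, (data, begin, end) =>
{
    // get the current feature and label batch from the test partition
    var featureBatch = features.GetBatch(testing_data, begin, end);
    var labelBatch = labels.GetBatch(testing_labels, begin, end);

    // test the network on the batch and accumulate the error
    testingError[epoch] += evaluator.TestBatch(
        new[] {
            (features, featureBatch),
            (labels, labelBatch)
        });
    batchCount++;
});

// show the average test error for this epoch
testingError[epoch] /= batchCount;
Console.WriteLine($"test error {testingError[epoch]:#.####}");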
I don’t need to shuffle the data for testing, so now I can call Batch directly. Again I’m calling GetBatch to get feature and label batches, but note that I am now providing the testing_data and testing_labels arrays.
I call TestBatch to test the neural network on the 64-record test batch. The method returns the error for the batch, and I again add up the errors for each batch and divide by the number of batches.
That gives me the average error in the neural network predictions on the test partition for this epoch.
After training completes, the training and testing errors for each epoch will be available in the trainingError and testingError arrays. Let’s use XPlot to create a nice plot of the two error curves so we can check for overfitting:
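A sketch of the plotting code; chart.html is just a placeholder file name, and this assumes: using XPlot.Plotly;

// plot the training and testing error curves
var chart = Chart.Plot(
    new[]
    {
        new Graph.Scatter()
        {
            x = Enumerable.Range(0, maxEpochs).ToArray(),
            y = trainingError,
            name = "training",
            mode = "lines+markers"
        },
        new Graph.Scatter()
        {
            x = Enumerable.Range(0, maxEpochs).ToArray(),
            y = testingError,
            name = "testing",
            mode = "lines+markers"
        }
    });
chart.WithXTitle("Epoch");
chart.WithYTitle("Classification error");
chart.WithTitle("Spam detection training");

// write the plot to disk as a HTML file
File.WriteAllText("chart.html", chart.GetHtml());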
This code creates a Plot with two Scatter graphs. The first one plots the trainingError values and the second one plots the testingError values.
Finally I use File.WriteAllText to write the plot to disk as a HTML file.
I am now ready to build and run the app!
First I need to build the CNTKUtil library by running this command in the CNTKUtil folder:
$ dotnet build -o bin/Debug/netcoreapp3.0 -p:Platform=x64
This will build the CNTKUtil project. Note how I’m specifying the x64 platform because the CNTK library requires a 64-bit build.
Now I need to run this command in the SpamDetection folder:
$ dotnet build -o bin/Debug/netcoreapp3.0 -p:Platform=x64
This will build the app. Note how I’m again specifying the x64 platform.
Now I can run the app:
$ dotnet run
The app will create the neural network, load the dataset, train the network on the data, and create a plot of the training and testing errors for each epoch.
Here’s the neural network being trained on my laptop:
And here are the results:
The final classification error is 0 on training and 0.010 on testing. That corresponds to a final accuracy on testing of 0.99. This means the neural network makes 99 correct predictions for every 100 messages.
These seem like amazing results, but notice how the training and testing curves start to diverge at epoch 2. The training error continues to converge towards zero while the testing error flatlines at 0.01. This is classic overfitting.
Overfitting means that the messages are too complex for the model to process. The model is not sophisticated enough to capture the complexities of the patterns in the data.
And this is to be expected. Processing English text is a formidable problem and an area of active research, even today. A simple 32-node neural network is not going to be able to generate accurate spam predictions.
What if we increased the complexity of the neural network? Let’s double the number of nodes in the input and hidden layers:
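Assuming the same builder as before, the change looks like this:

// double the width of the input and hidden layers
var network = features
    .Dense(32, CNTKLib.ReLU)
    .Dense(32, CNTKLib.ReLU)
    .Dense(1, CNTKLib.Sigmoid)
    .ToNetwork();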
The neural network now has 766,881 configurable parameters to train during each epoch. This is a massive network, but what will the results look like?
Well, check it out:
Nothing has changed. I’m still getting a training error of zero and a testing error of 0.01. And the curves still diverge, now at epoch 1.
Let’s go all out and crank up the number of nodes to 512:
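And the same sketch with 512 nodes per layer:

// crank the layer width up to 512 nodes
var network = features
    .Dense(512, CNTKLib.ReLU)
    .Dense(512, CNTKLib.ReLU)
    .Dense(1, CNTKLib.Sigmoid)
    .ToNetwork();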
The neural network now has an astounding 12,515,841 trainable parameters.
And here are the results:
Again no change. What’s going on here?
The reason this isn’t working is that the original neural network was big enough already. The 16 input nodes can check for the presence of 16 different words in a message to determine if the message is spam or not. That’s actually more than enough to do the job.
The reason I’m getting poor results is that the meaning of an English sentence is determined by the precise sequence of words in the sentence.
For example, in the text fragment “not very good”, the meaning of “good” is inverted by the presence of “not very”. If I simply check for the presence of the word “good”, I get a totally incorrect picture of the meaning of the sentence.
My neural network is looking at all the words in a message at once, ignores their order, and simply tests for spam by checking if certain words appear anywhere or not. This approach is not good enough for language processing.
So what do you think?
Are you ready to start writing C# machine learning apps with CNTK? | https://medium.com/machinelearningadvantage/detect-spam-messages-with-c-and-a-cntk-deep-neural-network-a83aca2a209e | ['Mark Farragher'] | 2019-11-19 14:56:22.994000+00:00 | ['Programming', 'Data Science', 'Artificial Intelligence', 'Csharp', 'Machine Learning'] |
2,099 | Why I Started Writing Shorter Articles | I started with an article. And then another. And then another. My brain was on fire that weekend.
I didn’t wake up with the plan to just sit and write. All I knew is that I wasn’t going to go out for the day without writing something I could put on my blog. The irony is that I had been trying to focus on longer content, but my hand was a bit forced.
I had written an article about quantity over quality which I wanted to live up to. My time had been heavily constrained with work and my baby. I also found it was a lot less pressure to write something when I lowered my minimum word count for what I was willing to write.
Aiming Closer
My aim was to write articles which were roughly 1,500 to 2,000 words, with a minimum of 1,500. I used the same approach for pretty much any kind of article, with the exception of certain short articles. This kind of structure gave me a concrete goal to keep my writing more consistent. I could write things outside of this, but this was my general benchmark for a “complete” article.
One day, I noticed I could knock out a short article which was roughly 1,000 words in less than half the time it took to knock out one which was 1,500 words. Writing two 1,000 word articles also left me feeling less tired than I did after a single 1,500 word article. I had many articles I stalled out on or forced an extra point into to make my minimum. I also had multiple factors from work and life which made it increasingly difficult to allocate time. If I could get the free time, I could knock out more, but that’s not really an option at present.
I decided to start shooting for somewhere between 750 and 1,000 words as my minimum, depending on the type of content. Certain content I wanted to hit 1,200 words before I felt finished, but if I stopped or ran out of ideas, I ended it and moved on. This strategy has worked amazingly so far. My writing has become more organic, though a bit more volatile. I can feel the growth from the writing process more immediately and I can keep the heat up.
I have basically been able to double my output of content without feeling rushed or that I’m missing something. When I want to write more, I do. If I hit a dead end, I wrap it up and move on.
Quantity Over Quality
Sometimes you just need to do more to get more practice. Instead of obsessing over perfection, drop it and move on. Splitting up a task into smaller tasks means more practice with each individual component.
By shrinking the minimum I was aiming for, I could produce more, and it ended up faster. There is more to writing than just the writing itself, especially when creating blog content or writing for something like Medium. You have to consider research, planning, writing, rewriting and editing, media production or procurement, title creation and summary, and polishing. Some of these factors have a fixed cost, some grow evenly with the word count, and for some, the minimum time required per step can grow exponentially.
Skill Sets
Image by Free-Photos from Pixabay
Each of these factors is also its own skill set. Research won’t make your writing itself better, but it provides better evidence and better topics to write about. Media production or procurement just enhances your writing and can help make a better product. Planning makes the writing more coherent and consistent and can give a scaffolding. Rewriting, editing, and polishing are all their own skills which temper writing into something better and better at different steps of the process. Title creation and summary writing are their own kind of writing entirely which impact how your writing is received.
By shortening the writing cycle, I get more practice on the skills that can help shape my writing as well as my writing itself. I can also test more ideas since the cycle is much shorter and a bad article is less of a hit to my productivity. Repeating the process more means I can focus on how everything goes together instead of trying to kill 200 more words for the sake of a number on my screen.
Working Around Time Constraints
Image by annca from Pixabay
My job has calmed down, but it can still take a toll on my time outside the office. As my baby gets bigger and bigger, she needs more and more time with me. She doesn’t want to sleep early anymore either.
Most tasks have a warm-up period before getting productive, and writing is no exception. I have fewer and fewer blocks of time I can allocate to my writing, so I had to simplify my workflow to make use of what I had. It takes me a lot longer to get into the flow when I have to catch up on a massive amount of text. Smaller articles have a lower associated cost to get back in the flow.
I worked as an editor for years, so I have a specific workflow which requires periods of focus. The longer the article, the longer the period necessary. Shortening the writing cycle means I don’t need as much time, so I can fit my editing blocks into gaps of free time more efficiently.
My kid may not cooperate to give me productive writing time for days if she’s going through a growth spurt. The more of the process I can fully complete, the easier it is to keep the momentum going. I can knock a shorter article out on a moderately bad night now. If the baby or work don’t cooperate, I don’t get stuck halfway through the process.
Lowering Pressure
Image by Jan Vašek from Pixabay
More practice and lower time constraints on individual steps lead to less pressure. I don’t have to force the article, I can end it when I want. Setting a limit may arguably be restrictive, but I find it gives me structure which makes me write better. If I just sit down without some end in sight, I’ll either ramble or not write much. I may write for the sake of writing, but that doesn’t mean I don’t have a process for writing.
If I set a minimum, I feel a need to reach it, but some articles just don’t have 1,500 good words in them. A good writer can arbitrarily hit that (or pretty much any arbitrary standard), but I never said I was good. Setting my standards lower and surpassing them has helped me keep on track and write better. Don’t make the bar low enough to be pointless, but an easy win is still a win and can still provide great feedback.
Why It Works
I tend to obsess over the ritual for my writing process. The structure makes me pace myself and not burn myself out. If you tell me to run a mile, I’ll sprint until I can barely walk the last 9/10ths (I’m also not a runner), but if you tell me to run for 20 minutes, I’ll jog at a consistent speed. Setting conditions and restrictions forces me to pace myself. Writing is a release for me, and by controlling the release, I am able to get the most out of it for myself.
This advice may not be as applicable to you if you have plenty of free time and write organically. By slashing my articles down, I have more time to focus on other aspects of writing and perfect my overall process. I can fit more small sessions in where I can, and I feel a lot less pressure to finish an article. Try writing less and see if you don’t get more out of it.
Featured image by Jess Watters from Pixabay | https://medium.com/swlh/why-i-started-writing-shorter-articles-a15be0e214e | ['Some Dude Says'] | 2019-11-12 18:28:30.281000+00:00 | ['Writing Tips', 'Productivity', 'Writing', 'Self', 'Writer'] |