Dataset schema (column name, type, and observed value or length range):
- Unnamed: 0 (int64, values 0 to 192k)
- title (string, length 1 to 200)
- text (string, length 10 to 100k)
- url (string, length 32 to 885)
- authors (string, length 2 to 392)
- timestamp (string, length 19 to 32)
- tags (string, length 6 to 263)
- info (string, length 45 to 90.4k)
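The "Unnamed: 0" column is the telltale sign of a pandas DataFrame that was saved to CSV with its index included. As a purely illustrative sketch (the file name medium_articles.csv is an assumption; the schema above does not name the source file), the records below could be loaded like this:

```python
import pandas as pd

# Hypothetical file name; the schema above does not state where the data lives.
df = pd.read_csv("medium_articles.csv")

# The saved index comes back as "Unnamed: 0"; restore it as the row id
# and parse the ISO-8601 timestamps seen in the records below.
df = df.rename(columns={"Unnamed: 0": "row_id"}).set_index("row_id")
df["timestamp"] = pd.to_datetime(df["timestamp"])

# e.g. inspect the first record shown below (row 2,400)
print(df.loc[2400, ["title", "authors", "tags"]])
```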
2,400
How Does the Moon Really Cause Waves?
Look, I know that the reason why the ocean has waves is because of the Moon. I learned that back in elementary school, and I’ll state it with full confidence. But if you asked an innocent follow-up question, my confidence would unravel. I know that there’s some connection, some sort of link between the phases of the moon and the strength of the tides. Didn’t sailors used to know about the tides by looking at the moon? How are tides caused? Does the Moon pull harder at some times to make stronger waves? Are there tides in freshwater lakes, or just in the ocean? Why does the tide go in and out once per day in some places, but multiple times per day in other places? What about the Sun? Shouldn’t the Sun’s gravity also cause tides? Let’s learn about tides! Here’s how they are caused, what the Moon (and maybe the Sun) have to do with shifting water, and what else can contribute to the splashy waves that we see on the beach.

1. How are tides caused?

The first thing to understand is that not all waves are equal. There are two types of waves: surface waves, which are what we see crashing on the shores of the ocean every few seconds, and long-period waves, which take hours to slowly hit the land. The waves that we see on the surface aren’t caused by the pull of gravity, but are driven by wind. These surface waves can still travel very long distances, even across an entire ocean, but they’re mostly on the surface. They don’t actually carry water over long distances, but instead transmit the energy of the wind through the rise and fall of the water.

Surface waves, caused by the wind, not by gravity. Photo by Jeremy Bishop on Unsplash

Long-period waves, on the other hand, are the waves caused by the pull of other celestial bodies on the Earth. These waves move the entire ocean, and reach deep below the surface. When a long-period wave hits the shore, it takes hours for the full wave to exhaust itself against the land, and it pushes the entire ocean against the shore, making the water rise. We call the ebb and crash of these long-period waves against the shore tides.

2. Does the Moon pull harder at some times to make stronger waves?

Some tides are stronger than others — the strongest tides, with the highest rise and the lowest fall of the ocean, occur during a full moon or during a new moon. However, this isn’t because of just the Moon itself, but instead is due to the interaction of the Moon and the Sun. Long-period waves, which we know as tides when they hit the shore, are caused by the gravitational pull of other celestial bodies, most notably the Moon (because it’s closest) and the Sun (because it’s very large and heavy). And when the Moon and Sun happen to be pulling in the same direction, we get the strongest tides. This happens during a new moon (when the Moon is directly between the Sun and Earth), and during a full moon (when the Earth is directly between the Moon and the Sun).

In this new moon situation, where the Moon and Sun are aligned, their gravitational pulls are both in the same direction and we get stronger tides.

The majority of the strength of the tides comes from the Moon, as it is much closer, but the Sun still contributes somewhat to tidal strength.

3. Are there tides in freshwater lakes, or just in the ocean?

Since tides are caused by the gravitational pull of celestial bodies, mainly the Moon, they aren’t just limited to oceans! Theoretically, a lake can have tidal effects as well.
However, because there’s a lot less water in a lake, pond, or other freshwater body, the gravitational forces pulling on the water produce a much smaller effect, and the tides are much less noticeable. Very large lakes, like the Great Lakes in the northern United States, do see a regular tidal effect, but it’s much smaller than ocean tides.

4. Why does the tide go in and out once per day in some places, but multiple times per day in other places?

Ah, now things start getting more complex. Someone may ask this question in Britain, for example, where vacationers at seaside resorts enjoy a regular two tides per day. The spinning of the Earth–Moon system also generates centrifugal force, which pulls on the water on the Earth’s surface. This leads to additional tides. The first tide occurs when Britain faces the Moon, and the pull of the Moon’s gravity is strongest. The second tide occurs when Britain faces away from the Moon, and centrifugal force pulls the water higher. Essentially, as the Earth rotates, this means that sometimes multiple tides stack up in some regions, while other regions may have less frequent tides, or sometimes almost no tidal activity at all!

5. What about the Sun? Shouldn’t the Sun’s gravity also cause tides?

As mentioned above, in point #2, the Sun does have an effect on tides as well! But despite being much more massive, the Sun is much further away than the Moon, and so its effect is reduced. The Sun’s gravitational pull on the Earth is about 177 times as strong as the Moon’s, but the Sun is also 390 times as far away, and tide-raising forces fall off with the cube of distance rather than the square. Combining these two numbers, we end up with the Sun having about half the tidal strength of the Moon. As mentioned above, this is why tides are strongest during a full moon or a new moon! At those times, the Sun’s gravitational pull is along the same line as the Moon’s pull, and thus we get the strongest tides. These tides, by the way, are called spring tides — not named because they occur in the spring, but because this is when the water most strongly “springs forth”.
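To make that "about half" figure concrete, here is the back-of-the-envelope arithmetic. It uses only the two ratios quoted above and the standard result that tide-raising force scales with mass over distance cubed (while direct gravitational pull scales with mass over distance squared):

\[
\frac{F_{\text{tide},\odot}}{F_{\text{tide},\text{Moon}}}
\approx \frac{M_{\odot}/d_{\odot}^{3}}{M_{\text{Moon}}/d_{\text{Moon}}^{3}}
= \underbrace{\frac{M_{\odot}/d_{\odot}^{2}}{M_{\text{Moon}}/d_{\text{Moon}}^{2}}}_{\approx\,177}
\times \underbrace{\frac{d_{\text{Moon}}}{d_{\odot}}}_{\approx\,1/390}
\approx \frac{177}{390} \approx 0.45
\]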
https://medium.com/a-microbiome-scientist-at-large/how-does-the-moon-really-cause-waves-9ff7b45691e3
['Sam Westreich']
2020-12-28 12:02:47.483000+00:00
['Environment', 'Astronomy', 'Nature', 'Science', 'Oceans']
2,401
6 Imposter Syndrome Triggers to Watch Out For
6 Imposter Syndrome Triggers to Watch Out For
Know the signs so you can rise above them

Photo by airdone on iStock.

Imposter syndrome is described as the psychological pattern in which a person doubts their abilities despite evidence of their competence. It is probably one of the issues most commonly faced by women in tech. The unfortunate thing about imposter syndrome is that it never truly goes away. It does get better with time, though: you learn to manage it, and even use it to fuel much of your learning and growth. Just like a parasite, imposter syndrome uses up your resources to feed itself. Whenever it flares up, it consumes all your brain’s resources, and instead of focusing on the task at hand, you go into a spiral about all the reasons you can’t do the task. Most of the time, the parasite is dormant, but every now and then, something triggers it and ruins everything. Recognizing what triggers your imposter syndrome can be the first step to fighting it off. Triggers differ from person to person, but here are some of the triggers for my imposter syndrome.
https://medium.com/better-programming/6-imposter-syndrome-triggers-to-watch-out-for-b18dc60e9adc
['Angela Mchunu']
2020-09-28 15:11:36.353000+00:00
['Mental Health', 'Imposter Syndrome', 'Software Engineering', 'Programming', 'Women In Tech']
2,402
Don’t Wait for Opportunity — Work Hard to Achieve What You Want
Hard work is the key

In his book Outliers: The Story of Success, Malcolm Gladwell digs deep to determine why some people succeed in this life while others don’t. The primary component that emerged was opportunity. Opportunity in life is as significant as oxygen when it comes to success. That’s true. But my problem is that hard work is often underestimated in the success equation. Some believe that, no matter how hard you work, if an opportunity doesn’t present itself, it will be hard to succeed. That is not always true: many people fail to be successful despite having all the opportunities available to them, while others succeed without having much luck. For those who have become successful without opportunities, what’s the secret? What things did they do that set them apart from their peers? What’s the success factor that they employed that the others didn’t? Was there even any factor? And the answer is yes! It’s hard work.
https://medium.com/the-masterpiece/dont-wait-for-opportunity-work-hard-to-achieve-what-you-want-2a73b6b56aff
['Emmanuel A. Anderson']
2020-12-09 15:16:35.359000+00:00
['The Masterpiece', 'Motivation', 'Success', 'Psychology', 'Self Improvement']
2,403
How Identity—Not Ignorance—Leads to Science Denial
How Identity—Not Ignorance—Leads to Science Denial
Changing the minds of Covid-19 deniers may require a lot more than sound reasoning

During the first months of the novel coronavirus outbreak, many rural parts of the U.S. did not experience the swell in caseloads or hospital admissions that threatened to overwhelm cities like New York, Detroit, and New Orleans. West Texas was one of these comparatively fortunate places. And considering the Lone Star State’s long-running antipathy toward government oversight, it made sense that some there would choose to ignore or downplay warnings from federal and local health officials. But elements of the script have since flipped, and Covid-19 case numbers are now spiking in many counties across West Texas. One might assume that, in the face of rising caseloads, many there would abandon their prior insouciance and embrace masks and other common-sense measures recommended by the nation’s top public health officials. But that doesn’t seem to be happening; if anything, the resolve of many Covid-19 skeptics appears to be stiffening. Even state officials who can no longer ignore the virus continue to lash out at public health authorities. (Last week, Texas Lt. Gov. Dan Patrick criticized Dr. Anthony Fauci, saying that Fauci “has been wrong every time on every issue” and “I don’t need his advice anymore.”)

Anyone who has ever butted heads with a friend, a family member, or a colleague about one of science’s hot-button issues — be it global warming, the safety of vaccines, or the gravity of the current pandemic — has likely walked away from the experience frustrated and exasperated at the other person’s stubborn and apparently nonsensical refusal to consider the facts. But psychologists say that the denial of facts is often rooted in identity and belonging, not in ignorance, and that changing minds may require a lot more than sound reasoning. “The people who deny science are often trying to uphold membership in something that they find meaningful,” says Nina Eliasoph, PhD, a professor of sociology at the University of Southern California. That meaningful thing could be a political or religious affiliation or some other group that prizes certain ideas or ideals. Whatever shape that group takes, the important thing is that it has other members — it’s a community. Once a community absorbs an idea into its collective viewpoint, rejecting that idea becomes akin to rejecting the whole community, Eliasoph says. And that sort of rejection is a very, very difficult thing for any of its members to do. “This is why you talk with people who deny science and the goalposts are always changing,” she says. “What really matters is the membership in the thing that has meaning, and to keep that membership you have to ignore certain ideas and pay attention to others.”

The causes and correlates of denial

Denial, in a nutshell, is the rejection or diminution of a phenomenon that has a large — and sometimes overwhelming — body of supporting evidence. When it comes to science denial, global warming may be the most conspicuous example. Science’s case that the planet is warming, that people are contributing heavily to this warming, and that this warming — if not addressed — will imperil billions of lives is almost unassailable. And yet huge chunks of the American electorate evince some form of climate-change denial.
Even people who are worried about global warming are often unwilling to make even small personal sacrifices that, collectively, could make a meaningful difference. Why do people do this? Experts say that our aversion to cognitive dissonance is one explanation. “Cognitive dissonance is a negative emotional state characterized by discomfort or tension, or maybe feelings of anxiety or guilt, that’s produced from beliefs or behaviors that are inconsistent with one another,” says April McGrath, PhD, an associate professor of psychology at Mount Royal University in Canada who has published work on cognitive dissonance. For example, a person who believes the planet is warming may also want to drive a gas-guzzling SUV, and these competing interests create cognitive dissonance. Because cognitive dissonance is unpleasant, people tend to want to get rid of it. And McGrath says that there are generally two ways that people can do this: change a behavior — that is, ditch the SUV for an electric vehicle — or change a belief. Most people go with option B. “Changing a behavior is usually difficult because most behaviors are rewarding,” she says. Changing a belief is often easier, and that’s where some element of denial comes into play. “This could mean trivializing the source of the dissonance” — telling yourself that switching to an electric car won’t make any difference in the grand scheme — “or adding some new belief or idea that supports or rationalizes your choice,” she says. The latter could entail embracing conspiracy theories that argue climate-change consensus is some kind of nefarious ploy. Before any of us gets too judgy, McGrath says that everyone engages in denial. “We are all constantly bombarded by decisions or choices that create dissonance or conflicts, so we can’t always act in accordance with our ideals,” she says.

Along with cognitive dissonance, there are many other scenarios or psychological states that tend to produce denial. “These are all related to each other — they’re not totally independent,” says Craig Anderson, PhD, a distinguished professor of psychology at Iowa State University. He terms one “belief perseverance,” which refers to people’s attachment to ideas or conceptions that they’ve held in the past. We don’t like to change our minds, Anderson explains, and we tend to ignore new information that challenges our long-held views. (Confirmation bias — seeking out and retaining only the information that supports one’s view — is a related concept.) “Reactance” is another, he says. This refers to the negative feelings that people experience when their freedom is somehow threatened — like if state or local government officials tell them that they can’t shop, dine, travel, or congregate as usual. “Fear is also a big one,” he says. If someone finds a belief or idea to be scary — both global warming and Covid-19 are ready examples — that fear is a powerful motivator of denial. While all of these overlapping factors can feed into denial, some who study human psychology say that group dynamics — coupled with every person’s vital need to belong — are at the root of many science deniers’ seemingly inscrutable beliefs and behaviors.

Scratching a deep psychological itch

Rebekka Darner, PhD, is director of the Center for Mathematics, Science, and Technology at Illinois State University.
Much of her work has focused on improving science literacy and combatting science denial among the general public. Darner says that a key element of effective science teaching and communication involves “self-determination theory.” This theory holds that people have three basic psychological needs that undergird their motivation to engage in any behavior. “The first is a need for autonomy, or the belief that an action came from the self,” she says. The second is the need for competence. “This doesn’t mean that a person actually is competent,” she clarifies. What’s important is that the person believes that they are competent and capable of achieving their goals. “The third one is the need for relatedness — a sense of belonging and that other people need you and value your input,” she says. The social groups that people identify with tend to satisfy all three of these basic psychological needs, Darner says. And because of this, people are strongly motivated to accept their group’s ideas or to engage in behaviors that are valued within their social spheres. For example, she says that some social groups may place a high value on bucking authority (“You’re not going to control me”) and this attitude and its associated behaviors — like not wearing a mask — can supersede all others. Self-determination theory helps explain why the widespread adoption of anti-science or anti-expert views is so dangerous. If a person’s group identity motivates them to deny one element of science — like the person who rejects the theory of evolution on religious grounds — then that can be a problem, but at least it’s somewhat contained. If huge numbers of Americans decide that a core element of their group identity is the rejection of science or of creditable expertise, then that’s a problem of a whole other magnitude.

The good news, Darner says, is that beliefs linked to group identities are not intractable. “Humans are complex, which works in our favor,” she says. “No person associates with a single identity, and we all have a variety of different communities with which we interact.” When people are regularly exposed to diverse groups and ideas that clash with their own, the resulting contradictions create uncertainty. And while people tend to find uncertainty uncomfortable, Darner says that uncertainty is often the precursor of learning and idea reassessment. Unfortunately, she says that some elements of contemporary life may steer people away from these helpful, perspective-balancing encounters with other viewpoints. The ideological myopia — as well as the us-against-them vitriol — that characterizes much of today’s media, both traditional (newspapers, cable news) and new (social media, online message boards), tends to strengthen a person’s opinions and their feeling of being part of a large and like-minded community. Pushing back against all that can be a Sisyphean endeavor. For those hoping to weaken a friend or loved one’s science denial, Darner says that it’s necessary to start from a place of respect and amity. “People need to feel like you value them and their opinion,” she says. “This kind of relationship has to be there first.” It may help to ask questions — rather than offer counter-arguments — and to respond with interest and noncritical feedback to articles or viewpoints the other person shares.
Once you do that and you’ve established more congenial footing, your counterpart may be more willing to consider your side of things. It goes without saying that, however satisfying it may be, telling someone that they’re ignorant and brandishing facts or articles that back your case is the kind of “I’m right and you’re wrong” approach that’s almost certain to fail, and is likely to solidify the person’s opposition to your viewpoints. But even if you say and do all the right things, your odds of success are probably slim. “Individuals very seldom fulfill basic psychological needs for other individuals,” Darner says. “That fulfillment comes from a larger community and identifying with them and being a part of them.” The science denier in your life may eventually come around, but it’s unlikely that you’re going to reel that person back in on your own.
https://elemental.medium.com/how-identity-not-ignorance-leads-to-science-denial-533686e718fa
['Markham Heid']
2020-07-09 05:31:01.391000+00:00
['Identity', 'Life', 'The Nuance', 'Psychology', 'Science']
2,404
The Mosaic
The Mosaic
A poem about sadness and anxiety

Photo by Ashkan Forouzani on Unsplash

we all have become a story
and learned to part away with memories
in a quiet way.

each poem is a layer
that we shed quietly

we all have become a mirror
that knows how to shatter
without making a noise.

each piece which shatters
part of the mosaic.
https://medium.com/scribe/the-mosaic-fd5769e15249
['Priyanka Srivastava']
2020-12-28 08:42:42.712000+00:00
['Poetry', 'Mental Health', 'Sadness', 'Anxiety', 'Writing']
2,405
What Happens If You Realize You’re Writing the Wrong Book?
Be brave enough to see your mistakes

Knowing you have to start again can be a little scary. You just barely got through the fear of starting one book, and now you have to go through it again to start another? I could’ve chosen to ignore my gut feeling that I was doing something wrong. I could’ve continued forcing myself to write. But I would’ve ruined my love for writing if I had done that. The process would have been grueling and miserable, and perhaps I wouldn’t have wanted to try writing a book again for a long time after that. Admitting you made a mistake takes courage. You can’t place the blame on anyone else, because writing a book, whether it’s the right or wrong one, is all on you. No one likes to confess they screwed up, especially in this era where everyone on social media seems to have a perfect life. They don’t mess up, so how could I possibly admit I did? But we all fuck up. A lot. Every day. Sometimes we trip over nothing, say the wrong thing, or write the wrong books. Screwing up is a part of life, so own it.
https://itxylopez.medium.com/what-happens-if-you-realize-youre-writing-the-wrong-book-4d0f4a984216
['Itxy Lopez']
2019-12-15 19:47:35.036000+00:00
['Writing Tips', 'Motivation', 'Self', 'Advice', 'Writing']
2,406
ARK Says Goodbye to Marketing Adviser Jeremy Epstein
Six months ago, ARK signed author, marketing expert, and CEO of Never Stop Marketing, Jeremy Epstein, to a contract to serve as a Marketing Adviser to our Chief Marketing Officer, Travis Walker. Jeremy’s goal was to help inform our team and develop a marketing strategy that would fill the gaps in awareness we were seeing within the industry. Having years of experience in marketing and a strong understanding of the blockchain space, Jeremy helped analyze and understand the areas in which ARK needed to expand our outreach, and he highlighted a need to improve our collaboration with influencers in the space. Working with the team, Jeremy helped put together a list of influencers to target, both in the interoperability space and in blockchain at large. We have already started to implement some of these strategies. Over the course of the next several months, as we launch Core v2 and other major developments for the ARK Ecosystem, the entire community will see a massive increase in outreach, interviews, podcasts, and articles as we push to make ARK a leader in the world’s fastest-growing industry. As our contract comes to a close, we want to thank Mr. Epstein for the insight he has brought to our blossoming project and to let him know that we appreciate the time and energy he has put forth towards ARK and the ARK community. We will continue to welcome him around our Slack and ecosystem as an important ARK community member and supportive hodler, forever. As he transitions his focus to his upcoming book release and a renewed passion for his marketing agency and writing projects, everyone in the ARK Crew wishes Mr. Epstein much success in his future endeavors.
https://medium.com/ark-io/ark-says-goodbye-to-marketing-adviser-jeremy-epstein-203d2123f163
['Matthew Dc']
2018-06-12 22:38:48.819000+00:00
['Arkecosystem', 'Development', 'Marketing', 'Blockchain', 'Bitcoin']
2,407
The economics of Airbnb
Airbnb just went public in a debut that’s been widely celebrated. We dug into the company’s prospectus and learned some interesting facts about their business and travel trends.

Cash flow is highly seasonal. ABNB makes all its money in Q3. As you can see, that’s pretty much the only quarter in which the company is comfortably EBITDA positive in any given year. This is because the bulk of ABNB’s customers travel in Q3, so that is when revenue is earned.

Airbnb has generated EBITDA. During 2017 and 2018, the company did generate material positive EBITDA. In 2019, they swung back to a loss due to rising costs across the board, and it looks highly likely they’ll burn again in 2020.

Airbnb generates significant free cash flow. Thanks to unearned fees, which are the payments customers make when they book a reservation, Airbnb generates free cash flow even in years where it is unprofitable. Cash flow is especially strong in Q1 and Q2, when customers book their stays, and then declines in Q3, when many of the stays actually occur (see seasonality above) and Airbnb has to pay those reservations out to the hosts. Airbnb keeps 15% of every booking.

Strong founder ownership. The three founders (Brian, Nathan, Joseph) own 14.1% to 15.3% each. This is an extraordinary level of ownership for a 3-founder company going public. It is due to two things: strong free cash flow generation and, more importantly, Airbnb’s ability to raise capital at consistently high valuations.

Covid hurt a lot. Bookings in Q2 2020 fell to $3.2bln, whereas in Q2 2019 bookings were $9.8bln. That’s a 67% decline. The rebound in Q3 of 2020, however, was dramatic as demand for travel exploded.

Short trips are in. Thanks in part to covid, people are taking shorter trips. “Short-distance travel within 50 miles of guest origin has been highly resilient, even at the peak of the business interruption in April. Short-distance stays were one of the fastest growing categories prior to the COVID-19 pandemic. This growth was further bolstered by the COVID-19 pandemic, as many guests chose short-distance trips instead of long-distance travel.”

Airbnb’s prospectus provides a very interesting look into travel trends and the economics of running a marketplace. Visit us at blossomstreetventures.com and email us directly with Series A or B opportunities at [email protected]. Connect on LI as well. We invest $1mm to $1.5mm in growth rounds, inside rounds, small rounds, cap table restructurings, note clean outs, and other ‘special situations’ all over the US & Canada.
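As a quick sanity check on the headline decline figure (the two booking values are the ones quoted above), the arithmetic works out as follows:

```python
q2_2019 = 9.8  # gross bookings, $bln, Q2 2019 (quoted above)
q2_2020 = 3.2  # gross bookings, $bln, Q2 2020 (quoted above)

# Year-over-year decline as a fraction of the 2019 base
decline = (q2_2019 - q2_2020) / q2_2019
print(f"YoY decline: {decline:.0%}")  # -> YoY decline: 67%
```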
https://blossomstreetventures.medium.com/the-economics-of-airbnb-dd2ed4828bf7
['Sammy Abdullah']
2020-12-17 13:41:52.505000+00:00
['Airbnb', 'Founders', 'Startup', 'Entrepreneurship', 'Venture Capital']
2,408
What If I’m the Narcissist and Not the Victim?
When I realised that I was in a relationship with a narcissist, I started to read a lot about Narcissistic Personality Disorder. I devoured books and articles — to find some meaning in the chaos that I experienced, to find explanations where there were none, and to confirm that I am not crazy. But, just like medical students who suffer from severe hypochondria and diagnose themselves with all sorts of illnesses that they are learning about, I caught myself finding a lot of narcissistic traits in my otherwise normal personality. I already knew that something was off in my relationship. I knew that we were both suffering. I knew that I was suffering a lot. But everything I read made me start to doubt myself. What if I am the narcissistic one, and not him? What if the problem lies with me, and it is all on me that we are both in this turmoil? It was quite disturbing to consider that I may have changed into someone I never wanted to be. I felt I was selfish — because I was told that I was selfish, not caring about his needs. I was told that I was abusive — when I was trying to have two-way communication and wanted to express my opinions. I felt that I was a terrible person, who always wants attention, who is clingy and demanding and impossible to satisfy. And it was true. I wanted the attention that he used to give me but decided to take away to punish me, only showing me random glimpses of affection as breadcrumbs. I started to become selfish, and I tried to get him to care about me too — instead of always dealing with his problems. According to studies, it is quite common for victims of narcissistic abuse to start questioning whether they, rather than their partners, are the narcissistic one. If you have to ask yourself whether you are narcissistic or not, odds say you’re not. Let me explain.
https://medium.com/mind-cafe/what-if-im-the-narcissist-and-not-the-victim-88ccde8fe62d
['Zita Fontaine']
2020-05-12 06:37:51.246000+00:00
['Mental Health', 'Communication', 'Relationships', 'Psychology', 'Narcissism']
2,409
Bat Coronavirus Rc-o319 Found in Japan: New Relative of SARS-CoV-2
Bat Coronavirus Rc-o319 Found in Japan: New Relative of SARS-CoV-2
This study tells us there are other undiscovered bat coronaviruses, even outside of China.

Background vector created by articular — www.freepik.com

The Centers for Disease Control and Prevention (CDC) has released a study from Japan, titled “Detection and Characterization of Bat Sarbecovirus Phylogenetically Related to SARS-CoV-2, Japan,” this month. In this study, a new bat coronavirus called Rc-o319 is discovered, which belongs to the same evolutionary clade as SARS-CoV-2 and RaTG13. This article will discuss the significance of this finding. (SARS-CoV-2 is the novel coronavirus that causes Covid-19. RaTG13 is a bat coronavirus that is the closest known relative of SARS-CoV-2. SARS-CoV-2 and RaTG13 belong to the beta genus of coronaviruses, under the sarbecovirus clade — betacoronavirus, sarbecovirus. So Rc-o319, RaTG13, and SARS-CoV-2 will be called sarbecoviruses from now on.)

The study’s rationale

Horseshoe bats of the genus Rhinolophus are infamous for being reservoirs of betacoronaviruses. RaTG13 is one such bat sarbecovirus that is 96% identical to SARS-CoV-2 at the genetic level. Current evidence suggests that SARS-CoV-2 evolved from a common ancestor of RaTG13. RaTG13 was first sampled from a bat cave in the Yunnan Province of China. In fact, most bat coronavirus studies are from China. But Rhinolophus species and other bats are also found in other parts of Asia, Europe, and Africa, and not much is known about the coronaviruses they harbor.

“We provide a hypothesis that a bat sarbecovirus with zoonotic potential might exist even outside China, because Rhinolophus spp. bats inhabit Asia, Europe, and Africa.”

Thus, Shin Murakami, associate professor at the Department of Veterinary Medical Sciences of the University of Tokyo, led a study to characterize the complete genome of a bat sarbecovirus called Rc-o319 in Rhinolophus cornutus, a bat species endemic to Japan.

What the study did and found

In 2013, the researchers captured four R. cornutus from a cave in the Iwate prefecture of Japan. They then extracted RNA genetic material from the bats’ feces to screen for any presence of betacoronaviruses. Once candidates were identified, they proceeded to sequence the full genome in 2020. Sequence analyses revealed that the new bat sarbecovirus Rc-o319 is 81.47% genetically identical to SARS-CoV-2. While an 18.5% genetic difference is massive, the full genome and key genes (spike protein and ORF1ab) of Rc-o319 still place it in the same clade as SARS-CoV-2 and RaTG13. The study also showed that Rc-o319 could not infect human cells expressing the human ACE2 receptor. Another distinction of Rc-o319, the study found, is that it does not require TMPRSS2 to complete cell infection. Thus, the bat’s ACE2 receptor alone is sufficient for Rc-o319, whereas human ACE2 and TMPRSS2 are required for SARS-CoV-1 and SARS-CoV-2.

Adapted from Murakami et al. (2020). Phylogenetic tree of full genomes of Rc-o319, SARS-CoV-2, RaTG13 (highlighted in yellow), and others. Phylogenetic trees of other genes (spike protein and ORF1ab) can be found in the main paper.

“Among R. cornutus bats in Japan, we detected sarbecovirus Rc-o319, which is phylogenetically positioned in the same clade as SARS-CoV-2. Sarbecoviruses belonging to this clade previously were detected from other Rhinolophus spp. bats and pangolins…in China and could have played a role in the emergence of SARS-CoV-2,” the authors concluded.
The study also admitted that Rc-o319 is unlikely to jump directly to humans, as it cannot bind to the human ACE2 receptor, unlike RaTG13, which also uses the human ACE2 receptor. However, as R. cornutus live in caves or tunnels with other bat species, and interact with other wild animals during the daytime, Rc-o319 may transmit to coinhabitant animals.

A closer look at Rc-o319

First, the study did not suggest that Rc-o319 is involved in the origin of SARS-CoV-2. Rather, the study tells us that other undiscovered sarbecoviruses could still change the current phylogenetic tree — just like the Japanese study added a new member, Rc-o319, into the sarbecovirus clade. Rc-o319 is only 81.47% genetically identical to SARS-CoV-2, compared to RaTG13 with 96% identity. Scientists have predicted that the 4% genetic difference between RaTG13 and SARS-CoV-2 represents about 50 years of evolutionary time. Indeed, a published study in Nature suggests that the most recent common ancestor of RaTG13 and SARS-CoV-2 arose around 1950–1980. It follows that the most recent common ancestor of Rc-o319 and SARS-CoV-2, as well as of the other sarbecoviruses in between, would be dated back even further. With the current phylogenetic tree, at least five ancestors stand in between Rc-o319 and SARS-CoV-2. So, while Rc-o319 is related to SARS-CoV-2, it is very distantly related. The different biological functions of Rc-o319 and SARS-CoV-2 further support this notion. To restate: compared to SARS-CoV-2, Rc-o319 uses a different form of the ACE2 receptor and does not need the TMPRSS2 co-factor to complete cell infection.

Is it possible that the Covid-19 pandemic started somewhere outside of China? Perhaps so, if a sarbecovirus very closely related to SARS-CoV-2 is discovered outside of China, which Rc-o319 certainly is not. At this point, the Yunnan Province of China, where RaTG13 was sampled, is still the leading candidate region for where Covid-19 started.

Adapted from Murakami et al. (2020). Cropped portion of the phylogenetic tree depicting the associated common ancestors.

Short abstract

Japanese researchers discovered a new bat coronavirus called Rc-o319 that belongs to the same evolutionary clade (betacoronavirus, sarbecovirus) as SARS-CoV-2 and its closest known relative, RaTG13. But Rc-o319 is only 81.47% genetically identical to SARS-CoV-2. By contrast, RaTG13 and SARS-CoV-2 are 96% identical, and even that 4% difference entails about 50 years of evolution. Thus, while Rc-o319 is related to SARS-CoV-2, it is very distantly related. Still, this study tells us that other uncharted coronaviruses — even outside of China — may possibly alter our current knowledge of the SARS-CoV-2 evolutionary tree.
https://medium.com/microbial-instincts/bat-coronavirus-rc-o319-found-in-japan-new-relative-of-sars-cov-2-d6221d90e8d2
['Shin Jie Yong']
2020-11-22 11:54:15.117000+00:00
['Innovation', 'Life', 'Technology', 'Coronavirus', 'Science']
2,410
I’m Only Superhuman
All you ever gave me were a thousand and one reasons to leave. But I never did. I stayed. Against all odds or hope for a better day, I stayed with you. Because I knew — You needed me.
https://medium.com/know-thyself-heal-thyself/im-only-superhuman-fe500c57cec6
['Audrey Malone']
2020-12-29 20:51:09.740000+00:00
['Storytelling', 'Self-awareness', 'Life Lessons', 'Love', 'Poetry']
2,411
Cracking the handwritten digits recognition problem with Scikit-learn
Sklearn Hello World!

The example we’ll run is pretty simple: learn to recognize digits. Given a dataset of digits, learn their shapes and predict unseen digits. This example is based on the Sklearn basic tutorial.

Verify your Python configuration

Before we move forward, just run a simple Python file to make sure you have configured everything properly:

- Open PyCharm.
- Create a new project.
- Create a Python file.
- Add the following line into it: print("Running Sklearn Hello World!")
- Run the file. You should see that string in the console.

Import datasets

Sklearn has some built-in datasets that allow you to get started quickly. You could download the dataset from somewhere else if you wanted to, but in this blog, we’ll use Sklearn’s datasets.

Note: how digits are transformed from images into pixels is out of the scope of this blog. Assume that someone did a transformation to get pixels from scanned images, and that’s your dataset.

1. Edit your Python file and, before the print command, add the following import:

from sklearn import datasets

2. Explore the dataset:

digits = datasets.load_digits()
print(digits.data)

3. Run your Python file. You should see the following output in the console:

[[ 0. 0. 5. ... 0. 0. 0.]
 [ 0. 0. 0. ... 10. 0. 0.]
 [ 0. 0. 0. ... 16. 9. 0.]
 ...
 [ 0. 0. 1. ... 6. 0. 0.]
 [ 0. 0. 2. ... 12. 0. 0.]
 [ 0. 0. 10. ... 12. 1. 0.]]

What you’re seeing in that output are all the digits (or instances), along with all the features that each instance has — in this example, the pixels of each digit. If we printed the value digits.target instead, we would see the real values (classifications) for those digits: array([0, 1, 2, …, 8, 9, 8]).

Features are attributes about an instance. A person may have attributes like nationality, skills, etc. Instead of calling them attributes, they’re called features. In our case, our instances (digits) have the brightness level of each pixel as their features.

Learn from our dataset

ML is about generalizing the behavior of our dataset. It’s like taking a look at the data and saying something like “yes, it seems that next month we’ll increase our sales”. Based on what happened, you’re trying to generalize the situation and predict what may happen in the future. There are basically two ways of generalizing from data:

Learning by heart: this means “memorizing” all the instances and then trying to match new instances to the ones we knew. A good example of this is explained in [1]: if we had to implement a spam filter, one way could be flagging all emails that are identical to emails already flagged as spam. The similarity between emails could be the number of words they have in common with a known spam email.

Building a model to represent data: this implies building a model that generalises from known values to unseen values. The general idea is that if we know that instances A and B are similar and A has a target value of 1, then we can guess that B may have a target value of 1 as well. The difference from the first approach is that by building a model, we’re adjusting it to represent the data, and then we forget about the instances.

A cat-dogs classifier. In our case, we’ll classify by digit: 0, 1, 2, etc. Source

Let’s create a model that represents our data’s behavior. As this is a classification problem (given some instances, we want to classify them based on their features and predict the digit they represent), we will call our component a classifier, and we’ll choose a Support Vector Machine (SVM).
There are many other classifiers in Sklearn, but this one will be enough for our use case. For further details on which component to use for which kind of problem, you can consult the following cheat-sheet:
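To round off the example, here is a minimal sketch of training such a classifier on the digits dataset, closely following the Sklearn basic tutorial this post is based on; the hyperparameter values (gamma=0.001, C=100.) are the tutorial’s illustrative choices, not tuned ones:

from sklearn import datasets, svm

digits = datasets.load_digits()

# Train on every digit except the last one...
clf = svm.SVC(gamma=0.001, C=100.)
clf.fit(digits.data[:-1], digits.target[:-1])

# ...and predict the class of the held-out digit.
print(clf.predict(digits.data[-1:]))  # prints the predicted digit, e.g. [8]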
https://medium.com/overfitted-microservices/cracking-the-handwritten-digits-recognition-problem-with-scikit-learn-b5afc28e2c24
['Ariel Segura']
2019-01-05 01:43:24.570000+00:00
['Machine Learning', 'Python', 'Data Science', 'Software Engineering', 'Scikit Learn']
2,412
Bokeh 2.0.1
Today we release Bokeh 2.0.1: a collection of improvements in automation, documentation, and other minor fixes following the recent 2.0 release. The full list of changes can be seen in the milestone list on GitHub. Some of the highlights include:

- Addressing a Cross-Origin Resource Sharing (CORS) issue seen in Chrome and Chromium-based browsers #9773
- Adding multi-file support for FileInput widgets #9727
- Bokeh server can now serve custom extension code #9799
- A handful of documentation clarifications, corrections, and expansions

As of 2.0.1, Bokeh’s FileInput widget supports multiple file selections. If you have questions after upgrading, we encourage you to stop by the Bokeh Discourse! Friendly project maintainers and a community of Bokeh users are there to help you navigate any issues that arise.

If you are using Anaconda, Bokeh can most easily be installed by executing conda install -c bokeh bokeh. Otherwise, use pip install bokeh. Developers interested in contributing to the library can visit Bokeh’s Zulip chat channels for guidance on best practices and technical considerations.

As always, we appreciate the thoughtful feedback from users and especially the work of our contributor community that makes Bokeh better!
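As an illustration of the new multi-file support, here is a minimal sketch of a server app using the widget; the accept filter and handler name are illustrative, and it assumes that with multiple=True (the behaviour #9727 adds) the widget’s value property arrives as a list of base64-encoded file contents:

from bokeh.io import curdoc
from bokeh.models import FileInput

# A file picker that lets the user select several CSV files at once.
file_input = FileInput(accept=".csv", multiple=True)

def on_upload(attr, old, new):
    # 'new' is a list of base64-encoded file contents when multiple=True.
    print(f"received {len(new)} file(s)")

file_input.on_change("value", on_upload)
curdoc().add_root(file_input)

Run it with bokeh serve as usual to try it out.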
https://medium.com/bokeh/bokeh-2-0-1-362eb5d0729a
[]
2020-06-10 00:13:12.811000+00:00
['Python', 'Python3', 'Visualization', 'Data Science', 'Bokeh']
2,413
How I Escaped My Corporate Fate and Decided to Choose Myself
“Once you realise you deserve a bright future, letting go of your dark past is the best choice you will ever make.” ― Roy T. Bennett, The Light in the Heart

We all make bad choices but sometimes they only become apparent when life is pulled so far off course you don’t recognise who you are anymore. For whatever reason, some of us double down, grit our teeth and push on through, and never stop making the wrong choice, even when truth stares us in the face. I realise that now. Corporate work gave me a stark vision of a possible future and made me realise I had been making the wrong choice for 20 years.

It all came to a head this year. I had reached 42, still working in an office, looking at my manager, eight years my senior, who was cranky as hell. He was miserable, under pressure, pale and balding; the archetype of middle-aged misery. He talked lots about how much TV he watched. Boasted, even. He had an app that clocked up the hours of mindless escapism he had indulged in, a measurement of existential despair in primary colours and interactive graphs.

He’d made a bad choice. He didn’t seem to realise he wasn’t meant to be there, in that office, but I saw him and I saw his mistake. Somewhere down the line, he’d chosen to stay small, and now, there he sat. He was unhappy and stressed and about every two weeks, it erupted out of him. He shouted, berated his team, swore, told people to “fuck off” — the rage spilled over as incongruence gnawed at his soul. He didn’t seem to realise this was the cause of his woe; I think he felt the irritation but didn’t connect it to anything deeper. Still, it couldn’t be kept in, the rage, it bubbled just below the surface; the existential angst caused him to hot-foot from steady leader to explosive child, back and forth and forth and back, as he held on to a crust of sanity between two extremes, never quite managing to be either.

He’d made a mistake. He shouldn’t be in an office but he’s 50 and this is his bed now. A bed of nails. Lie down, get comfortable, this is yours, you chose it.

Someone once told me “if you can’t find someone you want to be in the place you work, then you should quit.” These words rattled in my head as he loomed, one singular rung above me on the corporate hierarchy, a walking, talking half-man on autopilot, who dared not think too deeply about concepts such as happiness and meaning else the walls of reality would come crashing down and a tsunami of truth would sweep him away like a bamboo beach hut.

During this time, there had been a great deal of political quarrelling above us; the corporate gods argued and it rippled down the chain of command in gentle lapping waves of agitation. Sooner or later, it came to fruition and our departmental director was unceremoniously ousted, a sacrificial lamb slaughtered to appease someone, somewhere, I suppose. With expediency that raised more questions than it resolved, a new man arrived to fill the vacant post. He came with smiles, big ideas, ugly PowerPoint presentations and buzzwords of encouragement from the book of ‘How to be a Leader’. Everyone had seen the book; no one told him. Immediately, he displayed signs of cracking under pressure. It’s easy to understand why: his predecessor didn’t play ball, and there he now sat, in the vacated throne, a Damoclesian sword dangling above his hairless head.
He’d been brought in to solve the political machinations above him, and everyone considered him “their man”, but because of this, he could be no one’s, and the stress of this inevitability was his burden to bear. He began looking more stressed with each appearance he made. He arranged his face to talk to others, but I saw him, I saw him because he was my manager, only another rung up, just one choice ahead, a little older, a little balder, a little fatter. He too had made a mistake. He too shouldn’t have been there. Perhaps he knew something my manager did not, he was more self-aware after all, but whatever his wisdom, it did not matter: he was tied in, committed, he’d made promises, this was his only egg, his only basket. At his age — I’m guessing mid-50s — his life could have been anything, but it was crumpled suits from long commutes, high-pressured meetings, weekend working and the endless toil of trying to please everyone, but instead pleasing no one, simply disappointing one person here, one person there. Watch how it unfolds, watch how it unravels, watch his undoing as the corporate gods rattle him as they have rattled us.

I watched these two men, both my superiors, live out their bad choices. I watched as they chose them over and over, I watched them lose the war of attrition on their spirits. Regret hadn’t consumed them yet, not wholly, it had only frayed their edges, but I saw it coming. One day, it will be all they feel.

It is in my reflection of these men that I realised it never stops. The mistake, the bad choice, it never ceases to be made, each day, each minute, each hour, unless you choose again, unless you choose differently. I was these men also, just a bit younger, just a bit slimmer, just with a bit more hair. The three of us were all on the same road, on the same conveyor belt, on the same mouse wheel. Their mistake was my mistake, it was our mistake. The only difference between the three of us was how far down the wrong road we had decided to travel, how much we had gritted our teeth and doubled down on our wrong choices.

It doesn’t need to be this way. Life is not a trap. The stars in the night sky shine down from the past, and looking up at them is looking up at what once was. In the corporate world, looking up at those above us is looking at what is to come. At what could be. Above me, all I saw were the ghosts of Christmas Future, but those two men didn’t yet know they were ghosts. They seemed alive, they walked, talked, made tea, buttoned up shirts, put on ties, and sent emails, but were simply not there.

These two men had long abandoned a search for meaning. Instead, they found themselves in the reflection of a gleaming axle, a shiny cog, a turning wheel, seeking answers in the grinding gears of corporate machinery. They didn’t realise they too were made from cold, hard steel.

For almost 20 years I had wandered down the corporate road, for almost two decades I was empty and miserable, sacrificing my dreams on the cross of certainty. It took me that long for my bad choice to come into sharp focus. There it is, I see it now, my inevitable future shown to me in the reflection of two broken men. That day — when I saw my reflection and a painful epiphany arrived — I left the office and never went back. The reality, of course, was messier, less impulsive, less romantic, but that day I spiritually checked out. It was the first day I did something right.
It was the first day I saw my bad choice and decided to choose differently because it never stops unless you stop choosing it. And that’s all it takes, one new choice for one new life.
https://medium.com/the-ascent/how-i-escaped-my-corporate-fate-and-decided-to-choose-myself-247e58c865b3
['Jamie Jackson']
2020-12-15 18:03:43.070000+00:00
['Work', 'Self-awareness', 'Spirituality', 'Life Lessons', 'Entrepreneurship']
2,414
By (Non)Design: The Connections Between Generic Packaging and Creative Life
19 Feb 2006, from On Kawara’s “Today” painting series

Like many kids who attended state-run American elementary schools in the 1980s, I have barely any recollection of anything that I learned in the actual confines of a classroom, being mostly dependent on family support and autodidactic ability to acquire and retain knowledge. I do, however, have a comparatively vivid recall of all the “extra-curricular” rituals of violence and status-jockeying cruelty that were the rule, not the exception, in these institutions. In among the rapidly decaying memories of this time, I can still remember one popular insult that was hurled around on playgrounds and school buses with sadistic glee: “generic.” In terms of initiating either clumsy fistfights or defeated sobbing fits, it wasn’t as cruelly effective as other period barbs, e.g. “retard” or “L.D.” [an acronym for one placed in ‘learning disabled’ classes], but it often enough managed to strike a nerve and prompt intense, prolonged fits of self-doubt. While someone being mocked as a “retard” was simply being called inept, to be on the receiving end of a “generic” claim was to be simultaneously accused of low status and of being totally devoid of any distinguishing personality traits.

The thing about such insults is that they often prompt their victims to come out on the other side of the aforementioned fits of self-doubt with a desire to throw the detested personality flaws back in their tormentors’ faces: in fact, an entire, paradoxically vivid subculture eventually formed around the adoption of “generic” aesthetics and ideals. To understand the relevance of that both then and now, it is necessary to look back at the retailing landscape as it existed in the late 1970s and 1980s. But first, some more clarification about the pejorative nature of “generic” is in order.

While “generic” seems semantically identical to more contemporary insults like “basic,” each is a clear product of its own time: the latter refers to a lack of imagination in the face of an unprecedented opportunity for self-individuation; an inability to make a passably original expression in an Information Age supposedly defined by endless difference. It is an insult that maintains its edge by denying others a capability for producing a memorable self-image during a time in which anyone recording himself screaming at a video game live stream can ostensibly become a world-renowned “content creator”. By contrast, a “generic” person in the ’80s was, essentially, someone incapable of “correctly” consuming. The hidden implication, I always felt, was that “generic” people were forced into their lot not by external circumstances, but by inadequate levels of ingenuity and poor decision-making skill. They were not only, as the story went, disgraces to themselves, but possible detriments to the national character as well, at a time when free-wheeling American vivacity and spontaneity were still being touted as the cultural forces that would turn Soviet citizens against their masters during the late stages of the Cold War. “Generic” people were incapable of rising to their task as cultural liberators; shuffling automatically and incuriously through life, showcasing such a total deficit of ambition that they practically necessitated the creation of a unique line of consumer goods that responded to their flat affects and self-reduced personal standards. Enough said.

That product line, in a not-so-distant past, was unmistakeable when encountered in the pre-WalMart era of suburban American supermarkets.
Entire “generic aisles” were composed of solid walls of black text on opaque yellow packaging, with individual products, from cookies to breakfast cereal to beer, blending together in a single uniform mass interrupted only by the exposed white strips of metal shelving. It was an artless spectacle that could nevertheless compete with the most sublime works of the Minimalist masters (Donald Judd, Carl Andre etc.) in terms of memorability and semantic clarity. A closer look at the product offerings revealed little more than what could be ascertained from a distance: the already stark two-color printing process was given more imposing weight by the total absence of any additional graphic elements aside from purely functional ones (e.g. UPC codes and ‘nutritional information’ charts), and the black block text announcing the packages’ contents contained no listing of the products’ benefits to the consumer, no elaborative pictograms, nor really any additional subtext to persuade or entice. Elements that might have contributed to both visual and haptic distinction, like the universally recognizable fluted surface and grooves of the Coke bottle, were also ignored.

Given what has already been said about the gravitational pull of full-color American exuberance as a cultural force, this anti-marketing strategy couldn’t have lasted forever (one notable modern holdout is the wholly “generic” Canadian supermarket No Frills), and such generic products eventually made way for somewhat less austere “house brands” featuring an actual modicum of graphic design work. In the absence of such packaging, iconoclastic graphic designers like Art Chantry are left to unironically lament how “everything is so ‘pretty’ now in the grocery aisle.”[i]

The memorable placement of these generic packages in Alex Cox’s 1984 film Repo Man (see above) begins to hint at the erstwhile omnipresence of these products, while re-purposing them as visual signifiers of cultural cynicism (though I occasionally meet Repo Man viewers who are under the impression that the no-brand “BEER” props were comical anomalies fashioned solely for the film). Numerous subcultural outliers in the U.S., and the Western world as a whole, would do generic packaging one better by apparently embracing it: the “generic” album from San Franciscan ‘art-damaged’ punks Flipper, which replicated the sterile black-on-yellow package design of generic foodstuffs to a “T,” was one watershed design that pointed towards a subcultural adoption of the generic anti-aesthetic as something superior to “proper,” element-rich graphic design.

For one, appropriating generic design style for one’s own creative output communicated a certain resistance to being propagandized, and particularly to accepting the propaganda that consumer choices alone provided the molecular structure of a distinct identity (especially as it became steadily more obvious that the preemptively limited choices, in consumer goods as well as broadcast media and political candidates, did not represent everything really available or possible in the marketplace). It is not that bold of an assertion to say that some kind of “genericism as resistance” has manifested in every multi-media, d.i.y. subculture to have existed from the late 1970s to the present.
This tenacity has existed in spite of repeated lessons from market researchers, such as Orth and Malkewitz, whose findings implied that “nondescript designs score low on sincerity, excitement and ruggedness, and average on competence and sophistication…these designs further generate impressions of ‘corporate’ and ‘little value for money’, and do not evoke happy memories.”[ii] Then again, the above is not an exhaustive list of criteria for the appreciation of a given object, and generic packaging appropriated for artistic statements plays upon a different set of cultural impulses, ranging from a distrust of arbitrariness to the many varieties of societal fatigue. For those inundated with other eye-popping pleas for attention defined by dancing typefaces and hyperreal graphic novelty, the “take it or leave it” challenge implied by mock-generic cultural products must have had (as it did for me) an attraction akin to the romantic curiosity one might feel for disengaged, aloof loners after being breathlessly propositioned by dozens of other prospective partners. Everyone from the “white label” underground of techno music to the more institutional (if just barely) culture of the avant-garde classical world has gambled on this psychological quirk with decent enough results: see, for example, the Swiss Hat Art label’s series of ‘modern classical’ masterworks on CD. Elsewhere, the packaging for my CD copy of the late Glenn Branca’s ecstatic Symphony №2 (The Peak of the Sacred) would be almost indistinguishable from a generic product bought at a Kroger supermarket in the early 1980s, save for the deviation of two contrasting text colors being featured on the cover.

Genericism re-envisioned as culture also telegraphs a commitment to essentiality, which is at the core of any ethical statement that this style hopes to make. Chantry, in his musing on the ‘house brand,’ notes that a key to their strategy was “to make the labeling look like they weren’t ‘wasting’ your precious grocery money on elaborate (i.e. expensive packaging…) it all got tossed out anyway, right?”[iii] In doing so, he touches upon a stance that was both ethic and aesthetic, and one which applies to many other non-musical creative artifacts of the late 20th century and beyond, executed in media that did not require packaging.

While not consciously attempting to appropriate or comment on generic packaging, some major works of the avant-garde do capture something of this same contrarian attractiveness and ethical essentialism. One of conceptual artist On Kawara’s most noted works, his Today series of paintings, consisting only of the painting’s date of completion rendered in white block lettering on a single-color background, effectively served as “packages” or framing devices for the artist’s own continual self-development: they were a kind of “embodied time” demonstrating aspects of Merleau-Ponty’s phenomenology (and, unfortunately, doing so in a way too complex to be fully laid out in this short article). Elsewhere, something like Aram Saroyan’s ultra-minimalist poems, e.g. lighght (the entirety of which you have now just read), arguably took the “generic” quality of stark non-descriptiveness into the field of poetry. In the process, they reduced that field’s complex relationship with language to a purely declarative function, and in a way that was shocking enough to become the National Endowment for the Arts’ first bona fide funding controversy.
Rather than traveling further down this road, though, it would be wise to put on the brakes and state the perhaps obvious fact that a simple, nondescript, purely declarative design style has been the very lifeblood of corporate logos and luxury consumer goods for decades now. As to the latter, the marketing aims of the designer fragrance industry are much better encapsulated by something like the austere layout of the CK One bottle (arguably the first truly popular unisex fragrance in the U.S.) than by Katy Perry’s hilariously cloying, cat-shaped Meow container. It’s fascinating to consider how, simply by altering color schemes and diluting some of the blunt force of a bold / block typeface by going “lowercase”, changing the degree of kerning, etc., one can create “genericism” that exudes a much higher degree of “competence and sophistication” while also paying lip service to the essentialist “waste nothing” ethic. With the classical age of actually existing generic packaging behind us, a kind of carefully sculpted generic quality is a valuable weapon in the hands of marketing departments everywhere, and a reliable alternative to adopt when humans’ limitless capacity for boredom and fatigue with established aesthetics comes into play once again. As in Dr. Seuss’s brilliant children’s fable The Sneetches, where a master salesman pits “star-bellied” and generically non-starred creatures against one another in a cyclical divide-and-conquer scheme, alternating acceptance and loathing of the nondescript seems to be an eternal recurrence.

Yet there is one relatively new feature of our current cultural and media landscape that is altering the rules of this game: the simple fact that the relevance of “packaging” itself is eroding. This is certainly true for the music business, as claimed by the Royal Designer for Industry Malcolm Garrett — himself the designer of the notorious “generic” carrying bag design for the Buzzcocks’ Another Music in a Different Kitchen LP:

Packaging is just one interface to the music. The application of creative energy, which once saw physical expression in record sleeves, posters, and club flyers, is now realized in ‘soft’ ways. The interface is now digital, but no less compelling. The point of access is the package, and consequently, identity is expressed in ways that complement rather than define the music.[iv]

Garrett’s invocation of the “interface” brings us right back to the present age of social media and the “internet of everything,” and their attendant imperatives for all to sacrifice their privacy in order to become recognizable creators of “content.” Musing upon these things also, after a fashion, brings us to what was initially so rewarding about announcing one’s creative presence to the world with a strictly uninformative data set. For some, this may have come from nothing else than a contrarian urge, but it was also informed by anonymity as a strategy, i.e. the hope that an austere interface would force prospective fans, supporters or friends to engage in direct contact and communicate unhindered by symbolic distractions, while also repelling those who could not be bothered to do so.
The new equivalent of “genericist” counter-cultural revolt might be nothing other than a voluntary refusal of the dopamine rush of recognition provided by social media networks, and a limitation of personal disclosure to the most purely declarative: something like the Geneva Convention injunction that captured combatants provide captors with no information other than “name, rank, and serial number.” To be sure, there will be a whole new repertoire of schoolyard insults ready to be launched when this strain of non-conformity finally becomes perceived as a genuine force, and when an individual’s level of usefulness to society becomes defined not by their skill in production or consumption, but by their degree of commitment to omnipresence (read: constant ability to be monitored and administered). As always, insults will be loudly bleated by schoolchildren, but only in imitation of those adults who have been successfully propagandized to see any degree of independent thought and action as an existential threat.

[i] Chantry, A. (2015). Art Chantry Speaks. Port Townsend: Feral House.
[ii] Orth, U. & Malkewitz, K. (2008). “Holistic Package Design and Consumer Brand Impressions.” Journal of Marketing, 72(3).
[iii] Chantry (2015).
[iv] Garrett, M. (2015). “Bsolete?” Royal Society for the Encouragement of Arts, Manufactures and Commerce, 161.
https://thomasbeywilliambailey.medium.com/by-non-design-the-connections-between-generic-packaging-and-creative-life-f9ad735891a3
['Thomas Bey William Bailey']
2019-10-04 03:14:16.377000+00:00
['Anonymity', 'Marketing', 'Alternative Music', 'Design', 'Content Creation']
2,415
JIT fast! Supercharge tensor processing in Python with JIT compilation
At Starschema, we’re constantly looking for ways to speed up some of the computationally intensive tasks we’re dealing with. Since a good amount of our work involves image processing, this means that we’re particularly interested in anything that makes matrix computations — sometimes over fairly large tensors, e.g. high-resolution satellite or biomedical imagery — easier and faster. Because imagery often comes in multi-channel or even hyperspectral forms, anything that helps process it faster is a boon: shaving valuable seconds off each operation can, over large data sets, easily make days of difference.

Until relatively recently, it was not uncommon to write development code in a high-level language with good data science and machine learning support, like Python, but rewrite and deploy it in C or C++ for raw speed (indeed, one of the motivations behind Julia was to develop a language that would be fast enough not to require this!). Python is great for putting your quantitative ideas clearly and succinctly, but interior loops in Python have always been slow due to the absence of type information. Python’s duck typing system really comes back to bite when this absence of typing creates unnecessary code and indirection, leading to relatively slow inner loops.

Recently, however, solutions have emerged to get around this problem. The first of these was Cython — injecting C types into your Python code. It is, on the whole, a rather painstaking method of speeding up your code, albeit a lot of computationally intensive code is written in Cython, including code you’ve almost definitely used — much of the SciPy stack, for instance, and almost all of SageMath, were written in Cython. The problem is that ‘Cythonising’ your code can be time-consuming, and often fraught with challenges that require a profound knowledge of C to solve. What if we had a better way to get efficient bytecode from our slow-but-intelligible Python code?

Enter Numba. Numba is what is called a JIT (just-in-time) compiler. It takes Python functions designated by particular annotations (more about that later) and transforms as much of them as it can — via the LLVM (Low Level Virtual Machine) compiler — into efficient CPU and GPU code (via CUDA for Nvidia GPUs and HSA for AMD GPUs). Whereas Cython gave you the tools to use C types directly but made you go out of your way to actually do so, Numba does most of the heavy lifting for you.

The simplest way to get started with Numba is as easy as affixing the @numba.jit decorator to your function. Let’s consider the following function, performing a simple and pretty clumsy LU factorisation:

import numpy as np

def numpy_LUdet(A: np.ndarray):
    y = [1.0]
    n = A.shape[0]
    with np.errstate(invalid='ignore'):
        for i in range(n):
            y[0] = y[0] * A[i, i]
            for j in range(i+1, n):
                A[j][i] = A[j][i] / A[i][i]
                A[j][i+1:] = A[j][i+1:] - (A[j][i] * A[i][i+1:])

Note that as this is a measuring function, it does not return a value; it merely calculates the decomposition. As you can see, for an n x n square matrix, the nested iteration means on the order of n² interpreted loop iterations (each carrying O(n) vectorised work in the slice arithmetic), which is exactly where Python’s interpreter overhead hurts. What’s the best way to speed up this code? We could, of course, rewrite it in Cython.
Numba, on the other hand, offers us the convenience of simply imposing a decorator:

import numpy as np
import numba

@numba.jit()
def numba_LUdet(A: np.ndarray):
    y = [1.0]
    n = A.shape[0]
    with np.errstate(invalid='ignore'):
        for i in range(n):
            y[0] = y[0] * A[i, i]
            for j in range(i+1, n):
                A[j][i] = A[j][i] / A[i][i]
                A[j][i+1:] = A[j][i+1:] - (A[j][i] * A[i][i+1:])

Through that simple decoration, the code already runs significantly faster (once, that is, the code has had a chance to compile in the first run) — approximately 23 times faster than NumPy code for a 10 x 10 matrix.

OK, so how does it work? Unlike with Cython, we did not have to re-cast our code at all. It’s almost as if Numba knew what we wanted to do and created efficient precompiled code. It turns out that’s largely what it does: it analyses the Python code, turns it into an LLVM IR (intermediate representation), then creates bytecode for the selected architecture (by default, the architecture the host Python runtime is running on). This allows additional enhancements, such as parallelisation and compiling for CUDA as well — given the near-ubiquitous support for LLVM, code can be generated to run on a fairly wide range of architectures (x86, x86_64, PPC, ARMv7, ARMv8) and a number of OSs (Windows, OS X, Linux), as well as on CUDA and AMD’s equivalent, ROC.

The drawback is that Numba by definition only implements a strict subset of Python. Fortunately, Numba handles this in two ways:

- Numba has very wide support for NumPy functions (see list here) and Python features (see list here) — although notably, it does not support context handlers (with expressions) or exception handling (try, except, finally).
- Unless running in nopython mode (see below), Numba will attempt to generate optimised bytecode and, failing to do so, simply try to create a Python function (this is known as ‘object mode’ within Numba).

Object mode vs nopython mode

In general, the biggest boon of Numba is that unlike with Cython, you don’t need to rewrite your whole function. All you need to do is to prefix it with the jit decorator, as seen above. This puts Numba on autopilot, allowing it to determine whether it can do something about the code, and to leave the function as it was written if it cannot. This is known as ‘object mode’: if JIT compilation fails because some or all of the function body is not supported by Numba, it will compile the function as a regular Python object. Chances are, the result will still be faster, as Numba may be able to optimise some loops using loop-lifting — so it’s definitely worth the try.

But where Numba really begins to shine is when you compile in nopython mode, using the @njit decorator or @jit(nopython=True). In this case, Numba will immediately assume you know what you’re doing and try to compile without generating Python object code (and throw an exception if it cannot do so). The difference in execution time between object and nopython mode can range from 20% to 40 times (!).

In practice, I’ve found the best approach is to refactor and extract purely optimisable code, and optimise it in nopython mode. The rest can be kept as pure Python functions. This maximises overall optimisation gains without expending compilation overhead (more about which in the next section) unnecessarily.

Where object code is generated, Numba still has the ability to ‘loop-lift’.
This means to ‘lift out’ a loop automatically from otherwise non-JITtable code, JIT compile it, and treat it as if it had been a separate nopython JITted function. While this is a useful trick, it’s overall best to do so explicitly yourself.

Compilation overhead

Because Numba’s JIT compiler has to compile the function to bytecode, there will be an inevitable overhead — often indicated by a pretty slow first run followed by tremendously faster subsequent runs. This is the time cost of JIT compiling a function. While compilation is almost always worth it and needs to be done only once, in performance-critical applications it makes sense to reduce compilation overhead. There are two principal ways to accomplish this with Numba: caching and eager compilation.

The @jit decorator accepts a cache boolean argument. If set to True, it will cache the compiled function into a file-based cache. In general, every time you open and run a Python script, everything that needs to be compiled by Numba gets compiled at that time. However, if you cache the compilation result, subsequent runs will be able to read the bytecode from the cache file. In theory, you can also distribute the cache file, but since Numba optimises for your specific architecture (and supports a bewildering array of architectures, as described above), it may not work portably. It nonetheless remains a good idea to cache functions: compile them once and use them all the time.

Eager compilation is a different way of solving the same problem. Admittedly, the naming is a little misleading — most of the time, these terms are used to indicate when something is compiled (at call time, i.e. lazy, vs. well in advance, i.e. eager). In this case, it refers to a related notion, but one that stretches over what is being compiled, too. Consider the following example:

import math
import numba

@numba.njit
def lazy_hypotenuse(side1: int, side2: int) -> float:
    return math.sqrt(math.pow(side1, 2) + math.pow(side2, 2))

This is lazy compilation because — the Python typing annotations notwithstanding — we have not provided any information to Numba about the function’s possible arguments, and therefore it will compile code at call time, depending on the types of the values side1 and side2 take. Eager compilation, on the other hand, rests on telling Numba well ahead of time what types to expect:

import math
from numba import njit, float32, int32

@njit(float32(int32, int32))
def eager_hypotenuse(side1: int, side2: int) -> float:
    return math.sqrt(math.pow(side1, 2) + math.pow(side2, 2))

The format @jit(<return>(<argument1>, <argument2>, ...)) (or its @njit equivalent) allows the Numba JIT compiler to determine types (check out the documentation for the type system in Numba) and, based on that, pre-generate compiled bytecode. Note that if you have an eager-compiled function and your arguments cannot be coerced into the format you specify, the function will throw a TypingError.

Invoking other JITted functions

As a general rule, Numba will not do recursive optimisation for you. In other words, if you invoke other functions you yourself defined from a JITted function, you must mark those for JITting separately — Numba will not JIT them just because they’re invoked in a JITted function.
Invoking other JITted functions

As a general rule, Numba will not do recursive optimisation for you. In other words, if you invoke other functions you yourself defined from a JITted function, you must mark those for JITting separately; Numba will not JIT them just because they’re invoked in a JITted function. Consider the following example:

import numpy as np
from numba import njit, float32
from typing import List

def get_stdev(arr: List[float]):
    return np.std(np.array(arr))

@njit(float32(float32[:]))
def get_variance(arr: List[float]):
    return get_stdev(arr)**2

In this case, the second function gets the benefit of JIT compilation, but all it does is a computationally inexpensive exponentiation. The computationally more expensive first function has not been annotated, and will therefore run as an ordinary Python function, that is, much more slowly. To get the most out of Numba, the get_stdev() function should also have been given a JIT decorator (preferably @njit, since NumPy’s numpy.std() is implemented by Numba).
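Following that advice, a corrected version might look like this (a sketch; the float32 signatures simply mirror the original example):

import numpy as np
from numba import njit, float32

@njit(float32(float32[:]))
def get_stdev(arr):
    # Now JIT-compiled in nopython mode: numpy.std is supported by Numba.
    return np.std(arr)

@njit(float32(float32[:]))
def get_variance(arr):
    # Calls another JITted function, so the whole call chain stays compiled.
    return get_stdev(arr) ** 2

get_variance(np.arange(5, dtype=np.float32))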
Just how fast is it?

To demonstrate the benefits of JIT, I’ve run a benchmark in which I used the somewhat clumsy LU decomposition above on square matrices from 10 x 10 to 256 x 256. As you can see, Numba-optimised NumPy code is at all times at least a whole order of magnitude faster than naive NumPy code, and up to two orders of magnitude faster than native Python code. Directly invoked LAPACK code, written in FORTRAN 90 (via SciPy’s scipy.linalg.lu_factor(), a wrapper around the *GETRF routine in LAPACK), emerges as the clear winner at larger matrix sizes, while Cython’s performance turns out to be only slightly inferior to the optimised NumPy code.

LU decomposition benchmark for native Python, NumPy, optimised NumPy, LAPACK and Cython code. LAPACK was invoked using a SciPy wrapper. Optimised NumPy code is about an order of magnitude faster than ordinary NumPy code throughout, and up to two orders of magnitude faster than native Python. Cython code, on the other hand, is not significantly faster, whereas the FORTRAN code only begins to lap the optimised NumPy at relatively large matrix sizes.

The ‘bang for the buck’ factor of optimising with Numba is clearly the highest: the NumPy code (orange) and the optimised NumPy code (crimson) differ only by the application of a single decorator.

Of course, Numba has its limitations. Importantly, it only helps to optimise a particular kind of problem, namely processes that include loops or other repetitive structures. For tensor operations and other nested-loop, high-cyclomatic-complexity workloads, it will make a significant difference. Even where you need to restructure your code to fit in with Numba’s requirements, such restructuring is, in my experience, a lot easier than having to rewrite the whole thing in Cython. Acting at the same time as an interface for quickly generating not just faster CPU code but also GPU-enabled code via its CUDA backend, for a slightly more limited subset of functionality (NumPy array math functions are not supported on CUDA, nor are NumPy math functions in general), Numba is worth exploring if your work involves nested loops and/or large or repetitive tensor operations. For writing numerical code, image processing algorithms and certain operations involving neural networks, it is rapidly becoming my tool of choice for writing heavily optimised, fast code.

There’s more to Numba than speed

Numba’s main job, of course, is to speed up functions. But it also does an excellent job at several other things. Perhaps my favourite among these is the @vectorize decorator, which can turn any old function into a NumPy universal function (often just called a ‘ufunc’). If you have a background in R, you might from time to time find yourself wistfully reminiscing about R’s ability to vectorise functions without much ado.

A ufunc is a vectorised wrapper that generalises a function to operate on tensors represented as n-dimensional NumPy arrays (ndarrays), supporting tensor logic like broadcasting, internal buffers and internal type casting. An example is the function numpy.add(), which generalises the addition function (invoked via the addition operator, +) to tensors of any size, including tensors that are not the same size, where NumPy’s broadcasting logic is used to reconcile them.

Magic with Numba’s vectorisation decorator: a simple elementwise function can be generalised to higher-order tensors by nothing more than wrapping it in a decorator. This is, of course, rather inefficient, as failing to specify a signature for possible vectorisations means some type-specific optimisation cannot be carried out. For details on writing good vectorisable code in Numba, please refer to the documentation’s chapter on the vectorisation decorator.

Consider, for instance, the math.log10 function. This is an unvectorised function, intended to operate on single values (‘size-1 arrays’, as the error message quoth). But by simply prepending Numba’s @numba.vectorize decorator, we can generalise math.log10 into a function operating elementwise over NumPy ndarrays representing tensors of pretty much any order (dimensionality).
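A minimal sketch of that promotion (the name vlog10 is mine, not from the original post):

import math
import numpy as np
from numba import vectorize

@vectorize
def vlog10(x):
    # A scalar function, promoted by Numba into a NumPy ufunc.
    return math.log10(x)

arr = np.array([[1.0, 10.0], [100.0, 1000.0]])
print(vlog10(arr))  # applied elementwise over the whole ndarray, whatever its shape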
https://medium.com/starschema-blog/jit-fast-supercharge-tensor-processing-in-python-with-jit-compilation-47598de6ee96
['Chris Von Csefalvay']
2019-03-25 09:16:03.439000+00:00
['Machine Learning', 'Data Science', 'Python', 'Artificial Intelligence', 'Deep Learning']
2,416
Stay In Your Overlaps
Stay In Your Overlaps

Your competitive advantage as a maker and/or an investor is your clarity and discipline around the overlaps of your interests and beliefs.

Aspiring entrepreneurs wonder, “how do you decide which idea is worth committing 5–10 years of your life to build?” Similarly, investors ask, “among all the pitches you get, how do you decide where to invest your energy and money?”

In 2005, I asked myself this question as an entrepreneur when founding Behance and 99U. And starting with my first angel investments in 2010, I have pondered the same question as an investor. By no means is my thesis complete, but at this point it is pretty refined. It can be summed up quickly (and graphically):

My best attempt at my own Maker/Investor Thesis. (made with iPad apps Photoshop Sketch and Paper, working together via Creative Cloud!)

The idea of Behance (to connect and empower the creative world) and the team we assembled were squarely in the overlap of the type of company I aspired to work for and the team I aspired to work with. This was true in 2005 and remains true today. We are focused on empowering careers in the creative industry using technology (among other mediums) to accomplish the mission. As for the team, we deeply value design and hire for raw talent and initiative over experience. No doubt, the experience building Behance helped develop my perspective as an investor.

In 2010, I became an accidental angel investor. I was heads down in the ~5 bootstrapping years of Behance before we raised our own first round of investment. Circumstantially, I had gotten to know other entrepreneurs with similar interests. One, in particular, was Ben Silberman, who invited me to be an early Advisor for what became Pinterest. When Ben raised his seed round, I made my first ever angel investment. I didn’t know any better, so I applied the same thesis behind evaluating Behance as an entrepreneur to evaluating Pinterest as an investor. And I have done so with most of my angel investments since.

As I reflect upon my projects and investments that have either succeeded or failed, I realize the importance of playing within “the overlaps.” When some opportunity lures me beyond the overlaps of my interests and beliefs (as displayed in the graphic above), it feels like gambling with my resources rather than investing and leveraging them.

There’s no playbook for this stuff, and chances are whatever formula works for someone else won’t work for you. The world is not advanced by people replaying another person’s playbook. My advice for building your own playbook: Invest your energy and money in the overlap of what excites you (the opportunity), and who you respect (the team).
https://medium.com/positiveslope/makers-investors-stay-in-your-overlaps-5295ad920d17
['Scott Belsky']
2016-12-11 05:33:08.921000+00:00
['Management', 'Design', 'Investing', 'Entrepreneurship', 'Venture Capital']
2,417
7 Running Quotes To Help You Hack Writer’s Block
Photo by Derick Santos from Pexels

7 Running Quotes To Help You Hack Writer’s Block

“The moment my legs begin to move my thoughts begin to flow.” Henry David Thoreau

Let’s face it: writing can be tough. On some days it can seem to be more of a marathon than a sprint. On other days the words flow from mind to keyboard effortlessly. I write and I run, and I have noticed similarities between the two ventures.

Running is a process of discovery, of healing and growth. It is magical, hypnotic and mind-expanding. And so is writing. You can run in solitude or with a group; you can run a sprint or endure a marathon. You can write when you are sad, happy, inspired or deflated. Both activities leave you drained, help you sleep better and enhance your relationship with yourself. Any problem can be solved by a good run as well as a good writing session.

Writers from Haruki Murakami to Ryan Holiday, Jeff Goins and Joyce Carol Oates all appreciate the importance of running to help clear the fog and move their stories along. Below is a collection of quotes from great writers to help you finish your article.

1. “As a runner, the real race is getting up and running every single day. Life is the marathon. The same is true in writing.” Ryan Holiday

If you want to see results, you need to show up every single day. Each day you show up builds your muscles, strengthens your resolve and helps you develop a focus for the long run.

2. “The moment my legs begin to move my thoughts begin to flow.” Henry David Thoreau

One foot after another, deep breath in and out; sometimes it can be difficult and sometimes it can be easy. You can’t question whether you are doing it right or wrong, you just have to keep going. The same is true with writing; you need to type one word after the other for the ideas to flow.

3. “A problem with a piece of writing often clarifies itself if you go for a long walk.” Helen Dunmore

Stepping away from your copy helps you find new connections between ideas, structure a thought differently and tighten sentences. As you are out running, your mind is busy at work forming connections you might have missed as you were writing. Running acts as the catalyst to the ideas that were marinating in your mind.

4. “In long-distance running the only opponent you have to beat is yourself, the way you used to be.” Haruki Murakami, What I Talk About When I Talk About Running

There is only one person you need to compete with: yourself. You need to compete with the version of you that showed up yesterday, to tweak the process and learn new ways of getting better. Each day is an opportunity to better yourself.

5. “The twin activities of running and writing keep the writer reasonably sane and with the hope, however illusory and temporary, of control.” Joyce Carol Oates

Life can be unpredictable, messy and dark. Your best-laid plans might flop in ways you had not foreseen. But between the stimulus and your response you get the choice to control your reaction. And therein lies your power. In writing and running you get to step away from the heat of the moment, to find solutions to the problems you are facing.

6. “If you don’t acquire the discipline to push through a personal low point, you will miss the reward that comes with persevering. Running taught me the discipline I need as a writer.” Jeff Goins

The challenges we face can feel insurmountable, and we might be tempted to give up. But in pushing past the pain and discomfort, we are building resilience and patience. Through running, writers deepen their ability to focus on a single, consuming task and enter a new state of mind entirely. The deliberate act of moving forward each day reminds you that everything will work out in the end.

7. “For me, running is both exercise and a metaphor. Running day after day, piling up the races, bit by bit I raise the bar, and by clearing each level I elevate myself. At least that’s why I’ve put in the effort day after day: to raise my level… The point is whether or not I improved over yesterday.” Haruki Murakami

Word by word, mile by mile. All you can do is trust the process and put in the work despite your doubts, excuses, and fears. Once you start, the fear begins to dissipate. You realize that the only way to finish an article or a race is to start. Just take one step and keep at it.
https://medium.com/illumination-curated/7-running-quotes-to-combat-writers-block-962d64206634
["Margaret'S Reflections"]
2020-09-25 10:39:39.981000+00:00
['Inspiration', 'Productivity', 'Life Lessons', 'Running', 'Writing']
2,418
Watson Personality Insights Introduction and How to access Watson without SDK
In short: the Holy Grail for marketing campaign authors. Are we happy? Keep on reading; likely we shouldn’t be. Enter some concerns.

“Does it work?” And some concerns.

I am not qualified to answer this question, because I am not a psychologist. The closest I have come to checking this service is this test: I wrote some thoughts in a document and uploaded it to IBM Personality Insights. Next, I took a conventional personality test (https://www.truity.com/test/big-five-personality-test) and compared its outputs with Watson’s. The Big 5 values are very close in both cases. This is not a serious test; if you want a more accurate opinion, you must ask Marketing and Psychology experts.

On the other side, the tool can raise lots of moral and legal issues. A good read about these questions: https://medium.com/taraaz/https-medium-com-taraaz-human-rights-implications-of-ibm-watsons-personality-insights-942413e81117. In this post the author talks a lot about the service’s concerns and its background. Very interesting.

The tech stuff: accessing without an SDK.

The first question is: what is an SDK? An SDK is an additional module that IBM gives us to access its services more easily. We import the SDK into our programming language, and this way we can access the service through the module.

The second: why would I want to access the service without the SDK? There are two main reasons:

I work with Microsoft Business Central AL. This programming language cannot import modules like the IBM Watson SDK, so I have to access the service directly by making an HTTP request to the Watson API. My code could also be useful for other people in dev environments that likewise don’t allow the use of modules.

The other reason is that IBM doesn’t provide an SDK for all of its services. Some beta services, such as Natural Language Understanding, have no SDK.

All the JavaScript Node code is in my Git repo: https://github.com/JalmarazMartn/Watson-personal-insights-node-whithout-SDK

Remarks:

var request = require("request");
var auth = require("./ApiKey.json");   // the access keys to Watson
var transUrl = "https://gateway-lon.watsonplatform.net/personality-insights/api/v3/profile?version=2017-10-13&consumption_preferences=true";
var data2 = require("./profile.json"); // the social media entries

request.post(
  {
    url: transUrl,
    auth,
    headers: {
      "Content-Type": "application/json",
    },
    body: JSON.stringify(data2),
  },
  function (err, response, body) {
    console.log(body);
  }
);

We make an HTTP request to the service, with two files:

ApiKey. These are the access keys to Watson. I leave an example in the repo.

Profile. That’s the file with the social media entries. It looks like this:

{
  "contentItems": [
    {
      "content": "Trump impeachment conclusion is unpredictable due to lack of antecedents.",
      "contenttype": "text/plain",
      "created": 1447639154000,
      "id": "666073008692314113",
      "language": "en"
    },
    {
      "content": "I have serious doubts about Spain basket team, due important players refusing: Rodríguez Ibaka Mirotic",
      "contenttype": "text/plain",
      "created": 1447638226000,
      "id": "666069114889179136",
      "language": "en"
    },
    {
      "content": "Surprising win over Serbia. The keys: defense and Claver performance.",
      ...

That’s all. Have a nice day and be careful: some people are watching us (put on a silver paper hat to avoid it).
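For dev environments where Python is available, the same SDK-free call can be sketched with a plain HTTP client. This is my illustration, not from the original post, and the placeholder API key is hypothetical:

import json
import requests

url = ("https://gateway-lon.watsonplatform.net/personality-insights/api/v3/"
       "profile?version=2017-10-13&consumption_preferences=true")

with open("profile.json") as f:
    profile = json.load(f)  # the same social media entries as above

response = requests.post(
    url,
    auth=("apikey", "YOUR_API_KEY"),  # hypothetical placeholder credentials
    json=profile,                     # sends the payload as application/json
)
print(response.json())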
https://medium.com/analytics-vidhya/watson-personality-insights-introduction-and-how-to-access-watson-without-sdk-89eb8992fff2
['Jesus Almaraz Martin']
2019-11-12 11:45:27.318000+00:00
['Data Science', 'Artificial Intelligence', 'Ibm Watson', 'Sdk', 'Big Data']
2,419
An Overview of Python’s Datatable package
“There were 5 exabytes of information created between the dawn of civilization through 2003, but that much information is now created every 2 days.” (Eric Schmidt)

If you are an R user, chances are that you have already been using the data.table package. data.table is an extension of R’s data.frame structure. It’s also the go-to package for R users when it comes to the fast aggregation of large data (including 100GB in RAM). R’s data.table is a very versatile, high-performance package thanks to its ease of use, convenience and programming speed. It is a fairly famous package in the R community, with over 400k downloads per month and almost 650 CRAN and Bioconductor packages using it (source).
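For context on the Python side (the subject of the article), the datatable package exposes a similar interface. A minimal sketch, with a hypothetical CSV file and column names:

import datatable as dt
from datatable import f, by

frame = dt.fread("transactions.csv")             # fast, multi-threaded CSV reader
totals = frame[:, dt.sum(f.amount), by(f.city)]  # grouped aggregation, data.table-style
print(totals)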
https://towardsdatascience.com/an-overview-of-pythons-datatable-package-5d3a97394ee9
['Parul Pandey']
2019-06-02 06:20:22.673000+00:00
['Python', 'Data Science', 'Pandas', 'Big Data', 'H2oai']
2,420
Interview with Tony Xu
Interview with Tony Xu

Chief Executive Officer & Co-Founder at DoorDash

Hi Tony, for the audience who may not be familiar with you, tell us who you are!

I am the CEO and co-founder of DoorDash.

What is your day-to-day like as a CEO at DoorDash?

It changes day-to-day, but I would say there are a few categories that I’m spending most of my time in. The first category is the operating reviews, and it probably takes the largest portion of my days. It’s about reviewing and tracking the health of our major audiences (consumers, merchants, and Dashers) on the top 5–6 priorities of the company.

The second is spending time with customers, which is usually done in two forms: one is spending time on merchant calls. My calls with merchants range from larger national merchants all the way down to mom-and-pop businesses. In fact, I was just on the phone today with one of the original mom-and-pop merchants I signed up 6–7 years ago! The other form of connecting with customers is actually doing customer support for 15–30 minutes daily. I get dozens of emails per day from all sides of the audience and support some select cases myself.

The third would be in recruiting. I believe that recruiting is one of the most leveraged uses of any manager’s time. I recruit for all roles across the company, not necessarily limited to the roles on my direct team.

The fourth is in talent development through 1:1s with not only my directs but also many others at various levels. I like to give them a sense of what’s going on in the company and what we can be doing better, as well as clarify anything that might be top of mind for them. And then, of course, there’s time spent with external teams occasionally, such as investors, the Board, and the press.

How are you engaged in the product development process these days?

Until Nov 2017, when Rajat, our Head of Product, joined us, I used to lead product hands-on, involved in every product review, every design review, and some technical architecture discussions. Today, it’s a little bit different, and my job has evolved. I still attend major product reviews, but my focus is more on asking questions about the teams’ choices and whether they map to the strategic context of the company over the next 2–3 years. I also still read and respond to each one of our product review/update emails. Hope I’m adding more value than I subtract most of the time!

I’m always amazed by how attentive you are in responses to every product review. How do you find the time to do it despite your busy days?

Haha, I have the advantage of a 3.5-year head start (having run product during DoorDash’s early years), which gave me a lot of historical context and processing time in advance. Frankly, some of the bad decisions were made by me in those early years. And it’s been amazing to see the team driving the evolution of those ideas over the years.

Having seen many successes and failures of different products, what do you care most about when building a product?

First, it’s important to deeply think about the actual problem you’re solving, before getting into features and wireframes. You need to understand the customers’ mental model and their natural behavior. You know that you’ve created the best products when the product feels invisible because it removes friction so seamlessly. In marketplace businesses like ours, it’s also critical to think about the interplaying effects.
Every single decision on one audience impacts the other audiences, and that becomes increasingly important as the marketplace grows larger. I’m always thinking about how to increase the healthiness of the liquidity in the marketplace. And the last part is how each product will scale in the long run. It’s impossible to ship a perfect product that solves all problems overnight. You need to make choices on sequencing, make tradeoffs, and plan for the long-term evolution of the different problems.

What were some recent products that you were proud of seeing shipped?

I’ll give one old and one recent example… When DoorDash just got started in the summer of 2013, we made a decision to ship the driver app (AKA the Dasher app) first, before we shipped a consumer app. I know it sounds like a counter-intuitive decision for a consumer business. Our consumer’s expectation was simple then: they get something delivered on time and as described. It wasn’t necessarily about how they could order something in the most efficient way or how nice the photos of food looked. Since it’s impossible to solve all of the complex problems at once, we made a choice to prioritize making Dashers successful first. In order to do that, we had to take care of a lot of the basics of the complex logistics system. I’m glad we did, and it’s a decision that I still stand by today.

A recent product that I found really interesting was Convenience. We accelerated the launch of our first non-restaurant category during the COVID pandemic, given the high demand. Convenience includes pantry items and household goods, and we’re dealing with a significantly different inventory catalog. While restaurants carry 150 to maybe 200 items, an average supermarket sells tens of thousands of different items. Our team made sure the quality of the storage and delivery operations was solid before getting all of the consumer interfaces right. This is another good example of designing a product for a scalable system, with a mission to deliver convenience goods in a matter of minutes, not hours or days.

[Tony speaking at a biweekly company all-hands, pre-COVID]

What are the challenging problems that you’re excited to solve at DoorDash?

I’ve always been so excited about digitizing what’s happening in the physical world. For example, how long it takes to make something inside of the restaurant, whether an item is available on the shelf, etc. We’ve been working on it for years already, and it remains a perennial problem to solve. Solving this problem truly will serve our mission of empowering local economies, and enabling merchants to participate in the convenience economy. No one has really done that successfully, and that’s why the goods inside of a city can’t easily be transported electronically today. Pioneering to build this piping is really exciting.

Where do you think the DoorDash product will be in 5 years?

We have two products that will continue to grow, both in terms of direction and magnitude. One is a marketplace where we sit in between consumers and merchants. This is the service we’re best known for in the industry today. The merchants could be a restaurant, a convenience store, a grocery store, a retail store, etc. The other product is a platform where we provide the tools we’ve built for ourselves to our merchants. For example, DoorDash Drive is our “logistics as a service” platform product that fulfills the delivery of goods for merchants like Walmart.com, 1-800-Flowers, or Little Caesars Pizza.
Recently we also announced DoorDash Storefront, another platform that provides merchants with e-commerce capability, especially for the 40% of businesses that aren’t online today. Five years from now, while each of these two products grows bigger, they will need to be built on the same set of protocols and reinforce one another.

What do you think design’s role is in DoorDash? And where do you think the team is at now?

Good designers are great problem solvers who start with a very deep understanding of the problem and the customers’ needs before jumping into pixel execution. Their process often includes collecting anecdotes from customers and laying out systemic questions as to what is required to address the customers’ pain points. So I believe the role of design is articulating all of those challenges and asking the right questions. In the end, in collaboration with the product, engineering, and business counterparts, we — as designers — must deliver the simplest solution for the customer while understanding and hiding all of the complexity.

I know you have a strong philosophy in hiring and you always provide great feedback on design candidates. What are the things you typically look for when you approve the offers?

What I’ve learned is that regardless of discipline (whether it’s design, engineering, or a business function), the best people share very similar attributes. These are the attributes of excellence that have made people successful at DoorDash.

The first, I’d say, is having a very strong bias for action. This is difficult because it requires the willingness to be wrong when they act quickly. They’re probably going to make more mistakes than they necessarily want to. But taking risks and putting their reputation at stake is truly how they create the future.

The second is the ability to hold two opposing ideas at the same time, especially in the world of product and design. People often love holding strong points of view, which I think is really important, but the best people also look for discomforting evidence to argue against themselves. That way, they can involve more people in the problem and get to a better outcome.

The third: the best people are trying to get 1% better every day. They put in the effort to learn things quickly, whether toward a professional or a personal goal, and every effort adds up quickly.

The fourth is the ability to operate at the lowest level of detail. Particularly in the design function, while the output looks simple, the inputs to get to that output can typically be very complicated. It’s not the customer’s job to decode a messy menu or ordering functionality. It’s on the designers and the researchers to do the heavy lifting of distilling the simplest solution for the customers.

The final one is that they have strong followership. It doesn’t mean that they necessarily run big teams, but they’re the individual whom everyone else is drawn to. This ability requires a lot of emotional maturity. They usually have the ability to recruit other great people too.

Great. Thanks so much for your time, Tony!

=======

Please learn more about other leaders at DoorDash:

Christopher Payne — Chief Operating Officer
Kathryn Gonzalez — Manager for Design Infrastructure
Radhika Bhalla — Head of UX Research
Rajat Shroff — VP of Product
Sam Lind — Sr Manager for Core Consumer Design
Tae Kim — UX Content Strategist Lead
Will Dimondi — Manager for Merchant Design
https://medium.com/design-doordash/interview-with-tony-xu-f27121c33ed1
['Helena Seo']
2020-06-13 01:09:31.628000+00:00
['Leadership', 'Design', 'DoorDash', 'Product', 'Startup']
2,421
130 Common Design Terms to Know
A

A/B Testing
A/B testing is where you compare two different layouts, such as webpages or application screens, with a single variable changed, to see which one performs best.

Accessibility
This is where you design the layout of a webpage or mobile app taking into account people with disabilities who need to interact with your product easily. This includes designing for people who are blind, color blind, deaf, or have other sensory disorders.

Adaptive
Adaptive means designing something that fits well on multiple devices, such as an iPhone, tablet, or desktop computer. When designing, you have to take into account that people will be viewing information on different platforms.

Affordance
An affordance gives clues or signals to the user about what to do next. For instance, buttons are designed to show a user that if they want to get somewhere, they will need to tap or click on that icon or bit of text.

Ajax
Ajax stands for Asynchronous JavaScript and XML. It’s used to create dynamic web applications and allows for asynchronous data retrieval without having to reload the page a visitor is on.

Alignment
The process of making sure text and images are aligned in a way that visually makes sense to the user. This helps keep everything organized, creates visual connections, and improves the overall experience for the user. For example, left, right, and center are all different types of alignment.

Analogous
Colors that are next to each other on the color wheel. They are often colors you find naturally in nature and are pleasing to the eye.

Anchor Text
Text that is linked to a site; commonly used for SEO (Search Engine Optimization).

Animation
Creating images that look like they are moving, through computer-generated imagery.

Ascender
Ascenders are the vertical, upward strokes that rise above the x-height, as in the letters h, b, and d.

Aspect Ratio
The proportional ratio between an image’s width and height, or W:H. For instance, a square box has an aspect ratio of 1:1.

Avatar
As the name suggests, these are usually images used to represent a person in a different visual form. You usually see these in games or when setting up your profile on a website.

B

Balance
Balance involves the placement of elements on the page so that text and other elements are evenly distributed. Three ways to achieve balance are symmetrically, asymmetrically and radially.

Baseline Grid
A series of invisible, evenly spaced horizontal guidelines that can be used to create consistent vertical spacing for your typography and page elements.

Below-The-Fold
The term ‘below the fold’ refers to the portion of a webpage that a user must scroll to see. A holdover from newspaper publishing, the term was established when there was a physical fold in the middle of the page.

Blur
Creating a soft or hazy effect around an image.

Body Copy
The main text that people will read on a design. The body copy refers to the paragraphs, sentences, or other text that form the main content on any website. In design terms, the body copy of a website is the main text rather than the titles or subtitles.

Brand
Every business needs something that makes it identifiable. Branding is a way of using colors, names, and symbols in design to represent the company as a whole.

C

Cap Height
Back to our friend the baseline — the cap height is the height of the top of a capital letter in any given font above the baseline.
Cap height refers specifically to letters with a flat top, such as H and I. Round letters like O and pointed ones like A may rise above the cap height in their uppercase form.

Case Study
A case study outlines how you tackled a particular problem or project: the problem itself, the solution behind solving it, and why you went that route.

Color Theory
Rules and guidelines that designers use to make sure all the colors used work together properly.

Complementary
Think of these as the best friends of the color world — complementary colors are the colors that sit directly opposite one another on the color wheel. Examples of complementary colors are red and green, blue and orange, and purple and yellow. Using complementary colors tends to make a design more aesthetically pleasing.

Compression
Compression is minimizing the size in bytes of a graphic file without harming the quality of the image or written text.

Contrast
Contrast is the arrangement of opposite elements on a page — in other words, when two things on a page are different. This can be light vs. dark colors, smooth vs. rough textures, or text color vs. background color.

Copy
Every website or mobile app needs copy, the published text that a user will see once they visit your site. This text informs the user about what the page is about and directs them to where they need to go.

Crop
Cropping is taking an image and cutting off the excess if it appears too big or if parts are not important enough to include in the design. Depending upon what you are trying to emphasize in an image, you may need to crop part of it out.

CSS
CSS (Cascading Style Sheets) describes how HTML is supposed to be laid out. CSS ensures developers have a clean, organized and uniform look for their pages. Once the style is created, it can be replicated across all other pages, making consistency much easier.

D

Debt
Design debt accrues when a designer makes short-term decisions in order to meet a deadline; later on, the person using the end product may not have the best experience due to those rushed decisions and shortcuts.

Descender
A descender is the part of a letter that descends below the baseline of that particular character. You will commonly see this with the letters g, y, q, and p.

Display Typeface
A typeface used for text that displays the header on a page, before the subtext or body underneath.

DPI
Dots Per Inch (DPI) is the number of dots per inch in digital design or print. Depending on the density of dots in an image, it can have a higher or lower viewing resolution.

Drop Shadow
In design, a drop shadow is an effect applied to an element that makes it look like there is a shadow behind it or that the element is elevated. You will see drop shadows on buttons or arrows in applications or on web pages.

E

Elements
Elements are what make up a design: size, color, shape, texture, position, density, and direction are all components that make up an object.

End User
The person you are designing the end product for.

EPS
EPS stands for Encapsulated PostScript and is used when you want to print high-resolution illustrations. EPS files are usually created in Adobe Illustrator.

Eye Tracking
Eye tracking is measuring a user’s eye movements and where they focus most when viewing a webpage or other design format.

F

Feathering
Another way of adding transparency to a design. This is usually applied to the outside portion of an object so that you can get a glimpse of an image underneath.
F

Feathering: Another way of adding transparency to a design. This is usually applied to the outside portion of an object so that you can get a glimpse of an image underneath.

Figma App: A common product design tool used to create designs for websites and mobile apps. Once the designs are finished, developers can use the files to create the end product.

Flat: Flat design is a minimalistic approach that focuses on being very simple. It tends to feature plenty of open space, crisp edges, bright colors, and two-dimensional images.

Flowchart: A process in wireframing that shows what a user will do next as they navigate through a mockup of a website or app.

Font: This refers to the text style you will see on any website or anything written online. For example, the font that Google tends to use is Google Sans.

G

Gamification: Gamification is adding elements to a design that mimic game-like qualities to drive more user interaction and engagement. An example would be receiving a gold star after completing your 5k run on an app that tracks your distance. This helps incentivize the user to interact with the app more often.

Golden Ratio: First described by the Greeks, it’s when a line is divided into two parts such that the longer part divided by the smaller part equals the whole line divided by the longer part, giving the number 1.618. The idea behind following the golden ratio is that it makes designs visually pleasing to the eye.

GIF: GIF stands for Graphics Interchange Format, an image format commonly used for animated images.

Gradient: A color gradient, also known as a color ramp or a color progression, is where you start with an initial color in a defined area and move to another. The gradient tool creates a gradual blend of several colors.

Grid: A ruler-like system used to align your objects. Grids are made up of vertical and horizontal lines that create an easy way to make sure objects or text are positioned properly.

GUI: GUI stands for Graphical User Interface: images that represent a certain action that will take place once you tap or click on them. Examples would be scroll bars, menus, icons, pointers, etc.

H

Hex: A six-digit code used to represent a certain color. For example, black has a hex code of 000000 and white a hex code of FFFFFF. These are commonly used in Sketch and Figma when designing. (A worked example follows the K entries below.)

Hierarchy: A process of ordering content from what is most important to what is least important. It helps give order to a design and tells the user what to focus on.

Hue: Hue is the pure color. Basically, it’s just a way to describe a color. Yellow, blue, and orange are all different hues.

High Fidelities: High fidelities refer to when actual color gets added to what was once a wireframe or general outline of a design. This is where things start to come to life and it looks like a functioning webpage or app.

I

Icon: A small image used to represent an action a user is supposed to take to get to their destination. An example is the search icon you will see next to a search engine box online.

iOS: A mobile operating system created by Apple.

Iteration: Iteration in design is where you are constantly changing, testing, and reworking a particular design layout until it makes sense to the user.

J

JPEG: A compressed digital image format that makes files smaller and is commonly used for photo storage.

K

Kerning: The distance between letters in a word.

Knolling: Knolling is arranging objects so that they are either at a 90-degree angle or parallel to each other.
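Hex codes map directly to RGB channel values, so the Hex entry above can be demonstrated with a short conversion. A minimal Python sketch, added for illustration (the function names are mine, not the glossary's):

```python
def rgb_to_hex(r: int, g: int, b: int) -> str:
    """Pack three 0-255 channel values into a six-digit hex code."""
    return f"{r:02X}{g:02X}{b:02X}"

def hex_to_rgb(code: str) -> tuple:
    """Unpack a six-digit hex code back into (r, g, b) channel values."""
    return tuple(int(code[i:i + 2], 16) for i in range(0, 6, 2))

print(rgb_to_hex(0, 0, 0))        # 000000, black, as in the definition
print(rgb_to_hex(255, 255, 255))  # FFFFFF, white
print(hex_to_rgb("FF8800"))       # (255, 136, 0), a shade of orange
```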
L

Landing Page: A landing page, commonly referred to as the home page, is the first page a user will see once they visit a website or application.

Leading: The line height, or spacing between two lines of text.

Logo: A symbol or graphic created to represent or promote your business.

Logo Mark: A design centered around a symbol rather than the brand’s written name. For instance, the swoosh image for Nike would be the logo mark for that company.

Lorem Ipsum: Lorem Ipsum is dummy text used in a design that will eventually get replaced with the actual text later on, once you get the proper copy established.

Lossy: When you compress an image, some of the quality is lost, resulting in what is referred to as lossy compression.

M

Navigation/Menu: A series of linked items that helps direct the user between the different pages of an application or webpage. The navigation is usually located at the top of any app or webpage.

Margins: Margins are the spacing between important elements in a design, such as on a website. Usually you will see this between the outermost part of a website and the main hero image.

Microcopy: Bite-sized content on a webpage or application that helps guide the user. This can be text in buttons, thank-you pages, captions, tooltips, error messages, small print below pages, etc. Good microcopy is compact, clear, and easily delights the user.

Midline: Midline, or mean line, is the imaginary line where all non-ascending letters stop.

Mockup: A mockup is a prototype that provides at least part of the functionality of a system and helps with testing a design.

Monochrome: Monochrome is a color palette made up of various shades and tones of a single color. It’s important to note that while grayscale is monochrome, monochrome is not necessarily grayscale — monochrome images can be made up of any color, like different shades of orange.

Monospace: A monospaced typeface is a typeface where each character is the same width, all using the same amount of horizontal space. They can also be called fixed-width or non-proportional typefaces.

Moodboard: The starting point for a lot of designers, a moodboard is a way for designers to collect lots of visual references or ideas for a new design project. Photos, images, or typography are all elements you could use to create a moodboard. They are used to develop the project’s aesthetic, for inspiration, or to help communicate a particular idea.

MVP: This stands for Minimum Viable Product. The main purpose of an MVP is to collect enough information about a product to help the designer later on with fleshing out the project. The document states the bare minimum a product needs to get into production.

O

Opacity: Often referred to as “transparency,” this is the amount of light you let travel through an object. Adjusting opacity allows you to fade, blend, brighten, or layer within an element. (A worked example follows at the end of the O entries.)

Open Source: Open source means that you are allowed to use and modify images that you find online to fit your preferences.

OpenType: A cross-platform font file format where fonts are scalable.

Orphan: A single line or letter that sits by itself at the end of a paragraph, page, or column.
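Opacity is typically implemented as alpha blending: each output channel is a weighted average of the foreground and background. A minimal sketch of that idea (the channel values here are made up purely for illustration):

```python
def blend(fg: int, bg: int, opacity: float) -> int:
    """Composite one 0-255 channel of a foreground over a background.

    opacity=1.0 is fully opaque (only the foreground shows);
    opacity=0.0 is fully transparent (only the background shows).
    """
    return round(opacity * fg + (1.0 - opacity) * bg)

# A 50%-opaque white pixel over a black background gives a mid-gray.
print(blend(255, 0, 0.5))  # 128
```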
P

Palette: In design, a palette is the particular range of colors that you will use for a website or application.

Pantone: The Pantone Matching System is a standardized color system used for printing and graphic design. It’s used in a number of other industries, including product and fashion design and manufacturing. Each color has its own individual number and name.

PDF: Portable Document Format is a file format used to represent text and images, and is used when you need to save a document and share it with another person.

Persona: A persona is a fictional character that represents the targeted audience you are trying to design a product for.

Persona Mapping: The creation of fictional characters that represent realistic people and what they would want out of a product. Here you design a roadmap based on the targeted audience’s preferences and why they would take those actions.

Pixel: Pixels are the smallest components that make up your screen: the tiny squares in the images you see on your laptop or mobile phone. In design, especially if you are using Sketch or Figma, you will be using these as your base unit for sizing different objects.

Plug-In: Commonly used in design, a plug-in is a third-party extension that helps increase the functionality of your design tools.

PNG: PNG stands for Portable Network Graphics and is a compressed raster graphics format. It’s used on the web and is also a popular choice for application graphics.

PPI: Pixels Per Inch (PPI) is used to describe the pixel density of images on a screen.

Prototype: A prototype is an early model or sample of what a product might look like. Generally, you will design multiple prototypes and test the concept during the beginning phases of designing and building a product.

Proximity: How objects are grouped or spaced on a page. Images that relate to each other are placed closer together, while ones that are not related are spaced further apart.

Q

QA: QA stands for Quality Assurance and is a chance for the designer to review the product before it goes out to be officially tested by a user.

R

Raster: Raster images are constructed out of a set grid of pixels, meaning that when you change the size of or stretch a raster image, it will get a little blurry.

Resolution: Resolution is the level of detail in an image. Images with low resolution have little detail, while high-resolution images have more detail. High-resolution images tend to look crisper, since they have more pixels per square inch than low-resolution images.

Responsive: An approach to web development where the layout changes or adjusts to the screen size of whatever device a user is viewing it on. For example, when a user flips their screen horizontally, the images adjust to that shape, or if a user zooms in on something, that object appears bigger.

RGB: RGB stands for red, green, and blue. These three colors are typically used to show images on a digital screen, and they can be mixed to create any color you want.

Rule Of Thirds: The rule of thirds is a helpful way of aligning the subject of an image to make it as aesthetically pleasing as possible. It involves dividing up your image using 2 horizontal lines and 2 vertical lines to create 9 sections total. You then position the important elements along those lines, or at the points where they intersect. (A worked example follows this entry.)
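To see exactly where the rule-of-thirds lines and intersection points fall, here is a minimal sketch for an arbitrary canvas; the 1920x1080 dimensions are just an example, not from the glossary:

```python
def rule_of_thirds(width: int, height: int):
    """Return the two vertical lines, two horizontal lines, and four
    intersection points that divide a canvas into nine equal sections."""
    verticals = [width // 3, 2 * width // 3]
    horizontals = [height // 3, 2 * height // 3]
    intersections = [(x, y) for x in verticals for y in horizontals]
    return verticals, horizontals, intersections

v, h, points = rule_of_thirds(1920, 1080)
print(v)       # [640, 1280]
print(h)       # [360, 720]
print(points)  # [(640, 360), (640, 720), (1280, 360), (1280, 720)]
```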
S

Sans-Serif: Sans means “without,” and a sans-serif font has no serifs, the hooks at the end of some letters.

Saturation: The intensity of a color.

Script: Script typefaces are fonts based upon historical or modern handwriting styles and are more fluid than traditional typefaces.

Scale: Refers to the relative size of a design element in comparison to another one.

Serif: Serifs are the tiny lines and hooks at the end of the strokes in some letters of the alphabet.

Sketching: A quick drawing done by hand to get an idea onto a piece of paper fast; it is not the end product.

Skeuomorphism: A term often used in user interface design to describe interface objects that mimic their real-world counterparts in how they appear and how a person can interact with them.

Slab-Serif: Slab serif typefaces are identified by thick, block-like serifs.

Sprint: Sprints are the main feature of the Scrum/Agile framework. Sprints are short periods of time in which goals are laid out for a scrum team to complete by the end of the sprint. They usually last no more than a few weeks and occur in stages such as: planning, design, development, implementation, testing, deployment, then review and repeat.

Stem: A vertical stroke in a letterform. Stems can be found in both lowercase and uppercase letters.

Stock Photo: A place you can go to retrieve licensed images for mass use when designing websites, blogs, mobile apps, etc. If you don’t have an onsite photographer, popular sites to visit for stock photos are Unsplash, Pixabay, and Pexels.

Storyboard: A visual representation of a user’s experience with a product or problem. Storyboards are usually frames laid out in a way that documents the overall journey a user takes to a final destination.

Stroke: A feature used to adjust the thickness, width, color, or style of a line’s path.

Style Guide: A style guide is an established rule book that specifies the colors, fonts, and icons used for a particular design. This helps make sure everything stays consistent and uniform brand-wise.

SVG: SVG stands for Scalable Vector Graphics. It’s a file format that helps display vector images on a website. Developers will commonly ask designers for SVG files so that they can easily show that image in their codebase.

Symmetry: Symmetry refers to a sense of harmonious balance and proportion in an overall design when viewed.

T

Template: A template is a set of designs that stay consistent. When designing a website, you want to make sure everything stays on brand, and templates create a space for this.

Texture: The surface characteristic of a particular design, such as smooth or rough.

Thumbnail: A smaller image of an object used to give the person reviewing the design a quick representation.

Thumbnail Sketch: Sketches or drawings that are done very quickly to get an idea on paper, with no corrections.

Tint: A lighter version of a particular color, made by mixing it with white. (A darker version, made by mixing with black, is called a shade.) A worked example follows the U entries below.

Tracking: Tracking is loosening or tightening the spacing across a chosen set of text.

Triadic: Color schemes made of three colors evenly spaced around the color wheel to create contrasting shades.

Typeface: A set of characters, such as letters and numbers, that all share the same design.

Typography: The art of arranging groups of letters, numbers, and characters that share the same typeface into something that is pleasing to the eye.

U

UI: UI, or User Interface, is the actual assets or buttons a user interacts with to get to a specific destination within an app or website. This is the more physical journey, rather than the more psychological experience of UX.

Usability: Here is where you take into account how a user interacts with a certain design. Is the app or website you designed intuitive, safe, and effective at letting users navigate easily through it?

User Flow: The journey the user takes from start to finish, for instance successfully purchasing an item at a checkout.

UX: UX stands for User Experience and refers to the series of steps a user takes to accomplish a goal within a website or app. It’s more about the psychology of why users do what they do with a piece of digital technology and what that overall experience is like for them.
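The Tint entry amounts to mixing a color toward white. A minimal sketch of that channel math (the sample color is chosen purely for illustration):

```python
def tint(rgb: tuple, amount: float) -> tuple:
    """Lighten a color by mixing each channel toward white (255).

    amount=0.0 returns the color unchanged; amount=1.0 gives pure white.
    """
    return tuple(round(c + (255 - c) * amount) for c in rgb)

print(tint((255, 136, 0), 0.5))  # (255, 196, 128), a lighter orange
```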
V

Vector: An image made up of points, lines, and curves that are based upon mathematical equations rather than solid-colored square pixels. The beauty of a vector image is that when you zoom in, you are not seeing pixels but clean, smooth lines.

W

Watermark: A watermark is an image that represents a company; in other words, your logo. You will commonly see watermarks on stickers, water bottles, T-shirts, bags, etc.

Weight: Adding weight to an object makes it appear heavier. Different ways to add weight are giving thickness to a line or deepening the color of an object. All these varied factors can make an image look fuller.

Whitespace: The open space between objects, commonly called negative space; there are no elements occupying that area.

Widow: A widow is a very short line, or one word, located at the very end of a paragraph or column.

Wireframes: The outline or bare bones of what a website or app might look like. There is little to no color added, and its purpose is simply to show what each element does. Think of a house being built where all you have is the skeleton, but none of the fixtures that make it an actual place to inhabit.

X

X-height: Refers to the vertical distance between the baseline and the top of a lowercase letter such as x; basically, how tall the lowercase letters of a typeface are.

Z

ZIP: A zipped file is a compressed version of a file. To zip a file, or to send a zip, is to send a smaller, compressed version of a file so it can be transferred more quickly and easily, such as by email.

All definitions were crafted with help from the following sites:

https://careerfoundry.com/en/blog/ux-design/ux-design-glossary/
https://99designs.com/blog/tips/15-descriptive-design-words-you-should-know/
https://buffer.com/library/53-design-terms-explained-for-marketers/
https://www.smashingmagazine.com/2009/05/web-design-industry-jargon-glossary-and-resources/

As ever, QuarkWorks is available to help with any software application project — web, mobile, and more! If you are interested in our services, you can check out our website. We would love to answer any questions you have! Just reach out to us on our Twitter, Facebook, LinkedIn, or Instagram.
https://medium.com/quark-works/130-common-design-terms-to-know-37849a0e7104
['Cassie Ferrick']
2020-10-26 19:19:23.634000+00:00
['Technology', 'Education', 'Design', 'Startup', 'Self Improvement']
2,422
An Easy Way for Writers to Move From Skinny Ideas to Rock Solid First Drafts
I used to believe that writing was a kind of romance. I would wake in the early hours of the morning, steaming hot cup of coffee in my hand, and light a candle. Just me and the dark and my muse. The scene was set. The wooing had begun. Now all I had to do was wait for her arrival.

So, I would sip my coffee and wait. And wait. Usually, my beloved muse would show up. But she was a mess. Unfortunately, she was almost always drunk. All she could give me were mutterings I could barely make out. Incoherent rambles. Slurred words and vague statements. And usually, if I listened to her muddled monologues long enough, she would throw me a bone. A word. A random thought. Kind of like those movies where the guy is dying and he knows where the killer is hiding the abducted child, but all he can do is whisper some cryptic words as he takes his last breath.

That was where she left me. With a word and a puzzle. And it was infuriating.

It wasn’t supposed to be this way. I was supposed to sit down at the computer and be inspired. Be driven. Instead, I found myself stuck. I had the seed of an idea, but I didn’t know how to make it grow. So I tried what everyone seems to think is the right way to get the juices flowing. Just start writing. That was a “no-go” too.

Then I started thinking about all the strategies I had used in the past. What were the common threads that ran through my most successful articles? What were the common threads that ran through the writers I enjoyed reading? I made a list. And then I created a pre-draft template of subheadings which I filled in before I began actually writing the article.

The results were amazing. By filling out this template, I found ways to practically make my articles write themselves and satisfy my reader as well. And I am hoping my strategy can help you too.

The pre-writing structure that helps me write fuller, more engaging articles

On my blank documents, I write the following headings:

The “Why”
The “How”
Supporting Research
Personal Anecdotes and Examples

Start by brainstorming the “Why”

You must always provide a “why” in your writing. People want to know the reason they should listen to what you have to say. How will you add value to their lives? Will you give them insights on how to improve their relationships, their finances, their health, or their career? Will you validate their opinions, feelings, or ideas, or will you change their minds so that they can lead a happier, more rewarding existence?

Once you brainstorm the “why,” the next step is elaborating on its importance. Help the reader imagine the direction their lives will take if they listen to your advice or choose not to follow it. The best way to do this is to conjure one or both of the following emotions: excitement or fear.

For example, if you are writing on the qualities they need for a successful relationship, make them excited by helping them imagine how much more fulfilling, fun, and intimate their connection with their partner will be. On the other hand, you can create a bleaker future for them to imagine. For example, what will happen if they don’t listen to what you have to say? Will their relationship continue to fall apart, or their communication with their partner dwindle away to nothing? Will they continue to live lives with their significant others more as friends than lovers?

Create a scenario for readers to envision. A “picture” for them to hold on to as they read. This fear or excitement is what hooks your reader.
Now outline the “How”

Now that this picture is in your reader’s mind, tell them how to either make this picture a reality or ensure it doesn’t happen. Some rules to follow when you do this?

Isolate your tips into separate headings

Readers like you to make their lives easier. One of the ways you can do this is to separate each of your ideas into separate bullets or subheadings. The subheading itself tells the reader what they can do in simple terms. If used correctly, it also manages to create more curiosity in your reader. For example, a good subheading is clear enough to let your readers know your general idea or tip, but not so explicit that it keeps them from reading more. The result of this type of subheading is that readers will feel compelled to continue, because they want to know exactly how to put your advice to use.

Make your advice specific

Using the relationship example from before, let’s say you tell your readers that they need to take more time to connect with their partners. Give them specific ideas on how to do this. Don’t just say they need to find time to talk. Give them ways to implement this advice. Maybe you suggest that they take their kids to play in the park so that they can talk together without interruption. Maybe you suggest a daily walk together so that the temptation of entertainment at home or the urge to complete household tasks is eliminated. Specific tips such as these will not only give your readers actionable advice, they will also serve as a catalyst for readers’ own brainstorming. In other words, maybe your tip won’t exactly fit their lifestyle, but it will jumpstart their own ideas on how they can fit your advice into their specific routine or circumstances.

Collect research

The more proof you can give your readers that your advice is sound, the more they will trust you. So do the research to back up what you say. And when you do so, look for reputable sites that are related to the topic at hand. For example, if you are elaborating on the importance of communication in your article’s “Why” section, you could use the following information from the Forum for Family and Consumer Issues: “In one study of couples, both men and women agreed that the emotional connection they shared with their partner was what determined the quality of their relationships and whether they believed they had a good marriage or not.” So now, instead of you alone saying how important communication is, your readers have additional proof for what you say. Also, by providing this hard-and-fast evidence, readers trust you more overall, as it is obvious you have done your “homework” on the topic.

Make a list of personal anecdotes that bond you to your reader

One easy form of “research” you can give your readers is facts learned through personal experience. For example, a psychological study such as the one mentioned above is valuable, but first-hand experience carries equal if not more weight. Not only that, when you share your experiences with readers, a bond is built. You’ve been there. You “get” them. And they see you as not only a giver of advice but as a “friend” of sorts. All readers come to writing seeking commiseration or validation of their feelings, struggles, or beliefs. They want to know you have been there too.
They’re not necessarily looking for a time you experienced the exact same situation (although it’s certainly a plus if you have), but they do want to hear that you’ve been in a related situation, or have indirectly witnessed or been affected by the same experiences they’ve been through. So give this to them. Brainstorm personal examples, or examples from others you have come into contact with, as they relate to the topic. You may find yourself using this as the “glue” between your claim, hard evidence, and your “how to.”

The bottom line: Good writing is not always inspired writing. Sometimes it’s more like a grocery list, a list of objects your readers absolutely can’t do without. So fill their refrigerator by planning ahead and giving them the emotional connection, rock-solid facts, and simple tips they desperately desire.
https://medium.com/the-brave-writer/an-easy-way-for-writers-to-move-from-skinny-ideas-to-rock-solid-first-drafts-b6e911a84699
['Dawn Bevier']
2020-12-18 13:02:09.121000+00:00
['Marketing Strategies', 'Marketing', 'Writing Tips', 'Freelance Writing', 'Writing']
2,423
Applying AI to Group Collaborations.
AI applications, Poetry. Applying AI to Group Collaborations. Applying AI to my research in group collaborations. Sharing some poetic thoughts. Groups collaborate and drink coffee. Can you apply AI to help? Photo by Nikita Vantorin on Unsplash I was thinking, Always dangerous, Could distract me, From drinking coffee, Time to write, Something different, AI is always touted, Greatest good and, Greatest evil by, Unknowing journalists, Ranging from, Benevolent robots, Big love-me eyes to, Ratbag bots with, Glowing red eyes, Guns and, Terrifying missiles. I thought to, Give you insights, Some things about, Systems Thinking so, You could explore, AI concepts and, Make your, Own judgements, Also thought to, Write this, Story in poetry, Why not? Artificial Intelligence, Is about matching, Patterns such as, Facial recognition, Fingerprint identification, Voice recognition, Translating Languages, Pandemic spread, Vaccination simulations, Disease infections, Diagnostics even, Car-tyre wear, Anything that can, Generate a pattern, Such as crowd or, Group and, Individual behaviours, May be ripe, For AI application. But before you, Click your fingers, For AI magic, Some poor sod, Like me, Has to analyse, The system, What makes, It tick, What happens, When I push, This ‘ere, Red button? I studied, How people, Work together, Examined banking, Systems and, Large retailers, Universities, Good thinking, Ho-hum coffee, What do you, Expect from, Carnivorous, Pinch-penny, Management? Collaborative Wellness. Term I invented, PhD research, Had one of my, Research Reviews, Coming up, Needed a name for, All my research, Including PhD, Totalled Nineteen years, Came to me, While drinking, Rare single-origin, Ridiculously priced, But after all, It was coffee, In conversation with, Another nutter, Said I was trying, To answer, “How well people, Worked together?”, Like a flash, “Collaborative Wellness”, Sprang into, Coffee-starved Mind, Determined my fate. Asking Questions People working together, First step in our quest, Ask questions to discover, Who for social connections, How for identifying processes, With What for discovering means, Big bit of paper, Covering table, Felt-tipped pens, Multiple colours, Make a, Rich picture, Describing system, Share with managers, Workers and Kookaburras. Diagram 1: Questions to ask when discovering how collaborations work. Research by John Rose. In essence, Discovery yields, Anatomy of, Linked collaborations, Flows of knowledge, Production line with, Work stations, Each stop along, The line is a, Collaborative, Wellness Unit, Our investigation, One step deeper, Examining how, Purpose is fulfilled, How value is, Delivered to, CWU Stakeholders. Diagram 2. Explores purpose fulfillment of Collaborative Wellness Unit (CWU). Research by John Rose. Now investigator, Step back, Holistic View, Look at Groups, Working and, Exchanging Knowledge, Notice boundaries defined, Create closed systems, So if you’re inclined, You can estimate, Flows of, Knowledge entropy, Useful for, Considering the, Declining value of, Knowledge over time, Not to mention, Head banging. Diagram 3: Groups Working Together. Published Research by John Rose Lastly, Time to put it, All together, Each group or, Process can be, Described as, Being CWU’s, Interacting and, Adapting to, Changing market, Requirements. Goodness I said, “Market”, Didn’t mean it, Obviously need coffee, Nerds forgive me. Diagram 4: Abstracted System now Basis for AI application. Research by John Rose. System Described, Now What? 
It takes some time, To build this overview, Knowledge flows, Collaborations, Interacting, Gathering data, Time to assemble, Input data for, AI TensorFlow, Analysis. As you can see, Applying AI to, Existing systems is, No easy task, Time consuming, Demands attention, Detail and accuracy, Approximate data and, “She’ll be right”, Attitudes upset, AI analysis, Invalidating results. You will find it, Most difficult to, Explain findings and, Predictions especially to, Sceptical managers and, Unsmiling stakeholders. On some, Occasions, I have been as, Popular as, Mud soup, “Sorry guys”, I exclaimed, “Your data is, Just manure, Not fit for, Purpose”, Another story, I’ll keep for, Grandchildren and, Kookaburras. Blessed be, I’m a tree, Not AI. Systems Thinking References. Python Language References. I use Python for, Experimenting and Prototyping in, Deep Learning, If you really, Want to blast ahead. AI References. If you want to, Play around, With proper AI, See for yourself, What can be and, Can’t be done, Stop engaging with, Popular empty heads and, Do something yourself, Suggest you start at, Getting your head, Around Deep Learning. Modelling References. I have used NetLogo, For modelling over, Many years, Gives quick insights, Minimum of work, Don’t need much, By way of, Programming skills, Allows curiosity to, Flourish. Learning Some Basics with Ready-built Libraries. Admittedly sometimes, Jumping into, Deep Learning, Is a bit overwhelming, Try scikit-learn, Gives you quick, Access to validated, Data and tools, See what it’s, All about (a code sketch follows at the end of this piece). AI Background Reading. Alan Turing was, In my opinion, Greatest pioneer of, Concepts in, Artificial Intelligence.
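Since the poem points readers at scikit-learn as a gentle entry point, here is a minimal sketch of the kind of quick experiment it enables. The dataset and model choices are my own illustration, not the author's:

```python
# Load a built-in dataset, fit a classifier, and score it on held-out data.
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=42
)

model = RandomForestClassifier(n_estimators=100, random_state=42)
model.fit(X_train, y_train)
print(f"Held-out accuracy: {model.score(X_test, y_test):.2f}")
```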
https://medium.com/technology-hits/applying-ai-to-group-collaborations-b6eaa950fba1
['Dr John Rose']
2020-10-18 10:33:16.012000+00:00
['Python', 'Artificial Intelligence', 'Deep Learning', 'Systems Thinking', 'Poetry']
2,424
How to Create a New Year’s Resolution That Sticks
How to Create a New Year’s Resolution That Sticks

Your first mistake is calling it a New Year’s resolution

Photo by Kylo on Unsplash

My New Year’s resolution is to remember everything about 2020 and to never take anything in my life for granted again. Yep, I said it. I want to remember every detail of this past year. I want to take the lessons I’ve learned over the past nine months and engrave them into my brain.

Many of you probably want to forget 2020. You want to leave this year in the dust behind you. This year has been horrible, and it’s undone the lives of so many families. What is a New Year’s resolution supposed to look like in 2021 anyway?

Things are shaping up to be a little different this year, and I think my interpretation of resolutions will live on: they don’t work, and I’ll explain why.

Making a resolution is like making a promise without any consequences. What’s the worst that’ll happen if you don’t read two books every month in 2021? Probably nothing. You’ll just feel bad about yourself for a little while but then move on with your life. “There’s always next year,” you’ll say as you mentally close the chapter on the resolution you said you’d fulfill.

I want to say that New Year’s resolutions work. On paper, they should have a higher success rate. They are exciting opportunities to better oneself. Yet, it’s a commonly held notion that New Year’s resolutions fail. You can still accomplish your goal without putting the “resolution” label on it, and I’ll explain how.
https://medium.com/illumination/how-to-create-a-new-years-resolution-that-sticks-615e576d5170
['Ryan Porter']
2020-12-10 06:43:10.425000+00:00
['Inspiration', 'Productivity', 'Motivation', 'Ideas', 'Self Improvement']
2,425
“Never Give Up: You May Be Closer to Success Than You Think.”
Looking at my last MPP payout data, I tried to conjure the good feels and confidence of the comic that’s kept me moving ever-forward all these years, but I know too much now because I’m a member of two Facebook groups that support and encourage Medium writers. What I’ve learned from this generous community is that several writers who have a similar number of followers to me (800) and post a similar number of stories each week (5–8), are earning 5 to 12 times more than I am from their effort. Where my MPP payout for April was $85, other comparable contributors posted earnings of up to $1,000 for the 5-week period. Now, not only was I questioning the wisdom of diverting so much of my potential income-earning hours to the pursuit of making money writing about topics that interest me on Medium, I became concerned that I’d been faking it as a technical ghost-writer for the last 20 years. I even drafted an email to let my core clients know they’d been duped. But I didn’t send it since I worried it wasn’t well-enough written. I can’t stop imagining my confidence-building comic, now a little bit altered. The man in the top frame is still enthusiastically pickaxing his way to success, but it’s a woman in the lower frame—one who bears an uncanny resemblance to me. She is joyfully hacking away on the wrong wall. Image by Gerd Altmann from Pixabay This doesn’t surprise me since I am 90% consistent in turning left when right is called for. Write what you know? But, what if what you know is, you know, wrong? There is actual neuroscience that proves there are people (like my husband) who have an intuitive sense of direction and people (like myself) who feel like the operating system that governs navigation was never uploaded to their brain. What if that broken navigational O/S is at play with my Medium efforts? That is the question that was plaguing my sleep. What if I was taking a left turn with my stories when I should have stayed the course or veered slightly right? What if my story pick-axe was smashing away at the wrong wall? What if, instead of chipping away at the Wall that Pays, my stories have been working toward busting a hole in the Pay Wall? I don’t even know what that means but I know that a) just because I don’t understand something doesn’t mean it’s not true and b) that would be just like me to hear “Wall that Pays” and set my GPS for “Pay Wall.” After dropping my axe and banging my skull against the wall for three weeks, I did something unthinkable: I popped my head out of my cave and asked for help from the wonderfully supportive and successful writers here on Medium. They were specific—brutal and loving all wrapped up in honest feedback. They spun me around, handed me a clear map, and said, “Good luck!” Three days later, I was curated. Twice. And then a third time. That was nine uncurated stories ago. I haven’t lost the map but I have put it down to write some things that feel right for right now. I’m no longer feeling the desperate need for validation from the anonymous, faceless elevators. I have been seen now. Perhaps like a shooting star, now gone from their view. But seen once means I can find my direction again—to crack more holes in that wall—maybe even enough to reach the diamonds that are waiting on the other side. And, if I don’t, these hours and months sharing on Medium will become part of my future success story with Enterprise Idea number 9 or 14 or even 26.
https://medium.com/love-and-stuff/never-give-up-you-may-be-closer-to-success-than-you-think-532c101c8e6
['Danika Bloom']
2019-06-21 23:06:59.876000+00:00
['Advice', 'Life Lessons', 'This Happened To Me', 'Entrepreneurship', 'Writing']
2,426
How I learned 15 programming languages, and why your kids will too
When I was 12 years old, I got sick and had to go to the hospital for a check-up. No need to worry, it turned out I only had angina. But the couple of hours I spent in the hospital back then did, in some way, predetermine my whole life. While I was waiting for the examination, someone gave me a book — just to cheer me up. It turned out to be about BASIC, an early programming language that was and still is among the simplest and most popular programming languages. I loved the whole idea of creating new things from scratch so much that I picked my life’s journey after reading that book. I started with writing code on pieces of paper, and over time, learned another 14 programming languages to truly understand how the world around us works. It might seem unrealistic, but your kids might follow my lead. Anticipating Mortal Kombat The tale takes place in Moscow at the beginning of the 90s. There were not too many people interested in IT and programming back then. What’s more, there were not many people who had personal computers. When I first read the book, I didn’t have a computer, so I tried to write code on pieces of paper. Soon enough my father — a professor in a military academy — managed to get me one of those amazing devices, and I started using my newfound BASIC programming skills to create simple apps, like calendars, planners, and even music apps. Well, apps that made sounds when I commanded them to. However, my biggest pride was the first game I ever created. It resembled the oldest versions of Mortal Kombat. I managed to come up with a brand-new scripting language that would help me to code how characters would fight, fall down, stand up and win. I did all that when I was 13 years old, by the way. My first consulting jobs Apparently, not everyone could do what I could with PCs. When I learned how to code in PASCAL, a more efficient language that encouraged structured programming, my father brought me to his office in the academy, where I showed his colleagues my skills. The academics were shocked by what a teenager with a computer could do and even asked me to help them out, so I began consulting them on programming. Want to know how I learned PASCAL? Dad once told me that at his work people were already using it, and that I was very old-fashioned with my BASIC skills. His words triggered me to go and learn how to code in it and become a multilingual programmer. Dad was also the reason why I learned C and C++. He used these techniques to persuade me to develop myself further quite a lot. Soon after, my mom thought that she could find a use case for my skills as well. She worked for an insurance company and asked me to write a program that helped to optimize their work. The program basically generated automated documents and emails — before that, the company did everything manually. I created a pattern language for optimizing parameters. And whoa — I earned my first 50 bucks! Not bad for a 15-year-old, right? Java Changed It All The end of the 90s was marked, and I’m not afraid of this word, by a revolution in the IT and programming world, when Java was finally born. It was somehow simple and familiar, yet it opened new horizons for programmers around the world. With its automatic memory management and its architecture-neutral, portable nature, Java made all the previous programming languages look like a manual car next to the newest model with an automatic gearbox. I did not have to worry about memory; Java had me covered.
It also made accessing code from different devices possible with its “write once, run anywhere” principle. Moreover, by the time Java appeared, the Internet had become a little bit more common, and I got access to like-minded enthusiasts who were interested in the same things as I was. Now I was not alone, and learning and developing myself became much easier. Money Maker: languages vs connections Once I’d learned Java, I got my first job at one of the country’s biggest banks — a place where people with connections worked. I didn’t have any connections, and I had just entered university, but I knew the magic language that opened many doors. I’m not going to reveal any names, but over the last 15 years, I have worked for 10 different banks where my skills were a perfect fit. I specialized in building processing centres, but it was never that simple. IT guys weren’t just coders at any of those banks, but real problem solvers. The new languages, like PHP (a scripting language used to create interactive HTML Web pages) and Perl (it seems to me like this language was actually created to confuse people, but there are coders who genuinely like it), kept coming, but none of them really won my heart. Meanwhile, I was dealing with real-world problems, like creating the first utility bills payment system for ATMs, or bringing those ATMs to small cities. I even had to stop a riot once! One of the machines we’d installed in the suburbs was supposed to give out salaries to the workers of one of our clients (a factory) but broke down instead, causing mass protests. Although the fault was in no way mine, I managed to fix all the issues and even had to talk to the press. That language was really new to me. Kotlin, or why I joined Crypterium As you can see, working for banks was fun. Yet, the whole system is so deeply rooted in tradition and so backward that I just had to move forward to see for myself how far the technology can go and what I can do to make the world of the future appear faster. Back in 2013, I mastered another language — Kotlin, one of the most fascinating things I’ve learned since Java. It was created by my fellow Russian coders and is supported by Google as a main programming language today. I personally like Kotlin so much that I use it to write everything, and I regard it as a breath of fresh air after almost a decade of standstill. I brought the Kotlin culture to the blockchain startup that I joined last year. What we’re trying to build is a whole new layer on top of the existing financial infrastructure. We like digital assets, mostly cryptocurrencies, so much that we decided to make them as easy to spend as cash. As a result, Crypterium was born and was recognized as one of the most promising fintech projects by KPMG and H2 Ventures. Unlike banks, we are on the edge of the newest technology, and when we tell candidates at Crypterium that we use Kotlin for our general ledger, they get inspired and motivated, and their willingness to work for us grows in geometric progression. The kids will join the IT crowd It might seem like I’m writing this story just to brag, but learning new programming languages is not about looking smart; it’s about getting things done with the best possible tools. Of course, you can try to move all your belongings from one house to another using a bike, but it’s not the best solution, especially when you can use a truck instead.
Over time I was just looking for the tools to get things done, and it doesn’t matter if it’s Delphi or x86 Assembler, Python or JavaScript; I just couldn’t help but wonder how those languages could help us code new things in new ways. It might sound odd to you, but it wouldn’t to your kids, believe me. Technology will shape the future, and whether or not to learn how to code will not be a question in a couple of years. The new generation will discover, just like I did, the whole new galaxy that “talks” to us. They will look deeper into the way things work. When I ride a bus, I wonder how the automatic ticket system or PayPass works. When I am in an elevator, I wonder how it manages to go to all the floors it is commanded to. Technology is getting its hands on everything and is going everywhere, even to the most remote places of our planet — and that is the beauty of tomorrow. In order to understand how this world of the future will function, the kids of today will need to learn technologies from an early age, because knowing even 15 programming languages today will be just the tip of the iceberg tomorrow.
https://medium.com/crypterium/how-i-learned-15-programming-languages-5c54d3ca0383
[]
2019-02-19 10:57:42.720000+00:00
['Python', 'Mobile App Development', 'Cryptocurrency', 'Programming', 'Blockchain']
2,427
Should I Self-Publish?
Pros, Cons and a Curveball for bypassing Publications and self-publishing on Medium Photo by x on Unsplash Considering the pros and cons of self-publishing on Medium just got much more exciting, thanks to the October 2020 desktop interface update. If you’re a writer of micro fiction or poetry you know what I mean… The October 2020 desktop update made two serious changes to the personal homepages of Medium Partner Program participants:
1. “Infinite scroll,” meaning that a writer’s work is displayed one right after the next on our homepages
2. Personal URLs, which are not located behind the paywall
The two changes raised alarm bells for writers, especially producers of works that are less than 200 words, who feared that the interface was now giving away their words for free. Medium has since clarified that the upgrade is programmed to count the “lingering” of members on parts of a home page as reads, even when they don’t click “read more.” On the other hand, if a user is not logged into Medium and is not a paying member, they have access to home pages without the interference of a paywall. Pros and Cons of Publications For those readers who are certain that “infinite scroll” and free previews are not in their best interest, there is a simple solution: publications. When you publish your work with a publication, even a personal one, the story lives first and foremost on that publication’s wall. The share link you generate when promoting the post points to the publication, even though the story does still display on your own homepage as well. Unlike personal profiles, publications still have the option to opt out of the new interface, and most (in my Medium ecosystem, at least…) still do. Therefore, for the time being, a writer who wants to prevent infinite scroll and free previews can create a personal publication and publish their work there, rather than on their home page, at least for a little while longer. Pros In the Suggestion Box, we talk a lot about how important good relationships with publications are in moving up the Medium food chain, from ecosystem to ecosystem, all the more so following the October 2020 upgrades. Plain and simple: if your reach is limited but the publication’s reach is broad, you extend your reach when your work is featured by a high-traffic publication. Cons Self-published stories live ownerless on Medium. As a result, they are suggested by tag in the “More from Medium” footer when it displays, and I have a hunch (just a hunch!) that they are reviewed for curation (it still happens, we’re just not told the categories) more expediently than those stories that are not granted immediate curation by publications (large ones can do this). There are reasons to believe that some good stories can gain traction faster when they are self-published than when they are published in small pubs or pubs with disengaged readerships. It is important that writers understand that Medium made these changes because they believe that they represent system upgrades that will improve the user experience of the platform over time. Indeed, though we may have panicked (prematurely…) about our compensation being impacted if readers don’t have to click into our stories, we have to acknowledge that Medium removing all barriers that stand between our readers and their desire to read more of our work is a good thing. Likewise, just as the music industry found with radio, often the very best way to gain new readers, users or followers is to give them a taste of our work for free.
A Curveball My genre on Medium is personal essay, which means that my holy grail publication is Human Parts. Human Parts has 255,000 followers. Its tagline is “a publication about humanity.” Its editors work with writers to improve their work prior to publication. But most of all, unlike The Ascent, GEN, or Curious (all great Medium publications), Human Parts embraces the genre that is personal essay. They welcome nuance. Their submission guidelines (back when they had submission guidelines…) didn’t ask writers to do things like clearly list actionable takeaways for their readers. Human Parts welcomes literature, and that’s what I’m really here to write. In 2020, Human Parts shifted their editorial focus to center minority voices, a move that I celebrate. As a privileged white woman, albeit an expat, I’m not who they’ve been looking for, and that’s okay. However, in 2019, before this pivot, Human Parts replaced their submission guidelines with a “don’t call us, we’ll call you” notice, and since the Oct 2020 upgrades, even that is now gone. But Human Parts is still putting out new work, and it’s no longer focused only on things like race and culture. No doubt hundreds of longtime Medium writers still have the ability to submit directly to Human Parts, thereby sourcing some of the publication’s featured stories. I’m just not one of those lucky ducks. Beyond those lucky few, Human Parts is doing what they promised in the submission guidelines that came down a few days ago (apologies that I don’t have them here to quote for you…). They are scouring Medium for great, well-written, compelling personal essays, reaching out to those writers and inviting them to publish with Human Parts. In the last few weeks, I’ve messaged with a handful of writers like these who confirm that this has been their experience. The Holy Grail for Personal Essays While generally the best way for a writer with fewer than 1,000 followers to expand their reach is to publish with an active publication that has greater than 1,000 followers, the holy grail of Human Parts changes the calculation for personal essayists like me. Whereas most personal essays that I self-publish simply get pushed out to my followers and are perhaps chosen for distribution, there is always a chance that a lucky, truly excellent essay could be picked up and distributed by Human Parts.
https://medium.com/suggestion-box/should-i-self-publish-a054b277c6a5
['Sarene B. Arias']
2020-10-21 15:19:30.854000+00:00
['Tips', 'Marketing', 'Medium', 'Blogging', 'Writing']
2,428
The 6-Week Void in My Identity
Six weeks. That’s the length of time unaccounted for. Forty-two days of life. Those are the completely dark days. Six weeks is the hole in my heart. The entire month of October and then some. I do not know where I was. I do not know who cared for me. I do not know what happened to me. In his book, “The Body Keeps the Score,” Bessel Van Der Kolk argues extensively for a somatic understanding of trauma. When we face trauma, he suggests, our bodies encode the adverse experience deep within our nervous systems, far below the level of conscious awareness. Increasingly, research surrounding PTSD is pointing toward a similar thesis, namely, that when we relive a traumatic moment, the memory is much more visceral, since it has been buried in our bodies, often never even having been explicitly recounted by consciousness. Our bodies remember, even if our minds do not recall. This view of trauma has led therapists and other mental healthcare practitioners to rethink their approach. We should stop asking what’s wrong with you and instead ask what happened to you, if we really want to help people heal. The problem is, I don’t know what happened to me for the first six weeks of my life, and I might never find out. I do know I was given up for adoption the day I was born, and then I was in the foster system until I was adopted by the people I now call Mom and Dad. Since mine was a closed adoption and all records were sealed, I grew up knowing next to nothing about my biological family, though I was always insatiably curious. I’ve located many of them, and have cobbled together a lot about my origins. I even reconnected with the social worker who handled my case nearly forty years ago. Though he is getting up there in age, he helped me tremendously. Still, no one can tell me what my first six weeks of life were like. Related to Van Der Kolk’s work on trauma, researchers have also begun focusing on the ways early childhood traumas — adverse childhood experiences (ACEs), as they are called — impact development and health, both physiologically and psychologically. Abuse and neglect, for example, can manifest later in life as chronic illness. Growing up with an alcoholic parent can lead a child to develop all sorts of unhealthy coping strategies, and in turn, those can become psychologically debilitating conditions. The body indeed keeps the score, as Nadine Burke Harris eloquently describes in this TED talk: An adverse childhood experience (ACE) that is less often acknowledged is adoption. Not the part where the adoptive parents take you home and you might finally begin to feel settled and start forming trusting and secure attachments. That is, of course, if you are fortunate enough to be adopted by good parents. Countless adoptees out there suffered just as much or even more at the hands of their adoptive parents, some even being murdered by them. This fact alone renders the sweeping claim that adoption is always in the best interest of the child patently absurd. Let me be clear: I had an amazing childhood and my parents absolutely did an admirable job. Nevertheless, the experience of being taken from my first mother and father on the day of my birth absolutely counts as an ACE. There are plenty of studies showing that babies placed in the NICU, for instance, suffer extreme stress and their nervous systems go into overdrive trying to compensate for being ripped apart from the only source of safety they have ever known.
Even in utero, babies begin forming bodily memories, which is why, as soon as 24 hours after birth, they will show marked preference for the breastmilk of their biological mother over that of a genetic stranger. They will even show preference for music and sounds — like their mother’s voice — they were accustomed to hearing while gestating. Newborns have no way to conceptualize that they are a distinct human subject, different from their mother. This is why postnatal care has been increasingly emphasizing the importance of the “fourth trimester” in forming positive life experiences for the baby (and parents) as the transition from one body to two is made. All of this is to say that the separation of a newborn from its biological parents is a preverbal trauma, one that leaves its mark on the baby’s nervous system just like any other ACE. This sort of trauma happens prior to the capacity for linguistic representation, but this does not mean babies do not remember. It is arguably why so many adoptees suffer from debilitating mental illness, are more susceptible to auto-immune disorders, and are more likely to attempt suicide than the non-adopted population. This lecture by psychiatrist Paul Sunderland provides an excellent overview of preverbal trauma and its impact on development and functioning in adoptees. I’ve explored my own preverbal trauma as a potential source of some of the challenges I’ve faced in life. Despite an overwhelmingly positive childhood, it would be a lie to say I have not struggled with my mental health over the years. Having children really brought to the fore just how much being relinquished for adoption impacted me, and that is when I began searching for my biological relatives in earnest. Like I said, I found many of them and learned a lot about my genetic predispositions to certain conditions, including mental illness, and I even learned that my mom was volatile and stressed while she was pregnant with me, potentially a contributing factor to the high blood pressure and crippling anxiety I experience. We all try to build a cohesive narrative to frame our lives, and I have been able to piece together so much more of who I am by learning about where I came from. Yet, I still do not know anything about those first six weeks. Pictures or it didn’t happen! Everyone likes to say this in our image-obsessed culture. I take a ton of pictures of my kids and I think it’s to compensate for this gaping hole in my own pictorial legacy. I don’t have any pictures of me on the day of my birth. But my birthday most assuredly happened. Life most assuredly happened for me during those six weeks where I have no pictures, no stories, and no information about the people with whom I interacted. Having my own children now, I realize just how much life happens in those first few weeks — the attachment formation, the mirroring, learning eye contact and gaze following, emotional regulation, sleep pattern formation, and bonding. What were all those things like for me? Did I bond with my foster family only to be traumatized again when I was abruptly taken from them? Or did they neglect and abuse me, the sad reality of so many fostering situations? Or was I basically in a modern-day orphanage? Was I scared? Was I fed enough? Did I scream out for attention like so many stressed babies, or did I simply collapse into silence out of fear I would be harmed if I was too much? There is abundant compelling science indicating that babies remember things even if they cannot consciously recall them.
An abused child, even at 4 weeks of age, will be impacted by that abuse. Yet, people who uncritically praise adoption tend to ignore these facts. They insist a baby is basically a blank slate, at least for the first few days? Weeks? A whole year? It depends on what agenda they have. Many well-meaning adoptive parents think as long as they shower their child with love, it will cancel out anything bad that happened in the first moments or even months of life. The child has no real memories or conscious mind — their life does not truly begin — until they are adopted. It’s the biggest lie in the industry. If we really want to do adoption right — if that is even possible — we have to first stop insisting that there are no negative side effects to adoption. We need to admit that adoption is traumatic. We need to listen to adoptees when they tell us adoption hurts them. Not doing so is willful ignorance and it perpetuates the marginalization and harm adoptees experience. I have a six-week void in my narrative construction of myself. Many adoptees I know have a far bigger gap. Some of them were so traumatized by being culturally uprooted, internationally transplanted, and psychologically abused that they have effectively fragmented themselves and suffer dissociations as a result of unconsciously trying to cope with the ACEs that mark their development. If only I had pictures, or someone to share stories about me in those first six weeks, I could piece my story together even more and understand what happened to me, which would help me understand my behavioral schemas, drives, and non-conscious coping mechanisms. It would help me understand…me. Though I’ve spent a lifetime on this project, the frustration of knowing there are six weeks I will likely never account for and thus will always come up short is maddening. I’m a perfectionist and finish everything I start as perfectly as I can, but this is one task I doubt I will ever complete. And it is even more frustrating to hear “it doesn’t matter” or “just be positive” or “you got THE BEST family though!” All that gaslighting only makes it worse because it reaffirms the fears I have — that so many adoptees have — that no one will ever understand or take it seriously. My pain is mine alone, save for the adoptee comrades I have who feel it too. As my 40th birthday approaches, I pull out the first picture I have of myself, the day I was brought ‘home.’ I look at this picture every year and wonder if I will ever feel fully home with myself. Because when I look at this picture I see a person who has six weeks of memories and life experiences already accumulated, but no one to share those with. I see this child and I want to talk to her and ask, what happened to you?
https://medium.com/curious/the-6-week-void-in-my-identity-1b1eaf369a3b
['Michele Merritt']
2020-09-24 17:54:18.291000+00:00
['Trauma', 'Mental Health', 'Self', 'Psychology', 'Adoption']
2,429
Cosine Similarity Matrix using broadcasting in Python
Learn how to code an (almost) one-liner Python function to calculate (manually) the cosine similarity or correlation matrices used in many data science algorithms, using the broadcasting feature of the numpy library in Python. Photo by mostafa rezaee on Unsplash Do you think we can say that a professional MotoGP rider and the kid in the picture have the same passion for motorsports, even if they will never meet and are different in all the other aspects of their life? If you think yes, then you have grasped the idea of cosine similarity and correlation. Now suppose you work for a pay TV channel and you have the results of a survey from two groups of subscribers. One of the analyses could be about the similarity of tastes between the two groups. For this type of analysis we are interested in selecting people sharing similar behaviours regardless of “how much time” they watch TV. This is well represented by the concept of cosine similarity, which allows us to consider as “close” those observations aligned along directions that are interesting to us, regardless of how different the magnitudes of the measures are from each other. So, as an example, if “person A” watches 10 hours of sport and 4 hours of movies and “person B” watches 5 hours of sport and 2 hours of movies, we can see the two are (perfectly, in this case) aligned, given the fact that, regardless of how many hours in total they watch TV, in proportion they share the same behaviours. By contrast, if the objective is to analyse those watching a similar number of hours in an interval, the Euclidean distance would be more appropriate, as it evaluates distance the way we are normally used to thinking about it. It’s rather intuitive to see this from the chart below, comparing the two points A and B: the length of segment f=10 (Euclidean distance) versus the cosine of angle alpha = 0.9487, which oscillates between 1 and -1, where 1 means same direction and same orientation, and -1 means same direction but opposite orientation. Simple example of how the cosine of alpha (0.94) shows a good alignment between the two vectors (OA) and (OB) If the orientation is not important in our analysis, the absolute value of the cosine would cancel this effect and consider +1 the same as -1. In terms of formulas, cosine similarity is related to Pearson’s correlation coefficient by almost the same formula, as cosine similarity is Pearson’s correlation when the vectors are centered on their mean: (image by author) Cosine Similarity Matrix: The generalization of the cosine similarity concept when we have many points in a data matrix A to be compared with themselves (cosine similarity matrix using A vs. A) or to be compared with points in a second data matrix B (cosine similarity matrix of A vs. B with the same number of dimensions) is the same problem. So, to make things different from usual, we want to calculate the Cosine Similarity Matrix of a group of points A vs. a second group of points B, both with the same number of variables (columns), like this: (image by author) Assuming the vectors to be compared are in the rows of A and B, the Cosine Similarity Matrix would appear as follows, where each cell is the cosine of the angle between all the vectors of A (rows) and all the vectors of B (columns): (image by author) If you look at the color pattern you see that the vectors “a” replicate themselves by row, while the vectors “b” replicate themselves by column. To calculate this matrix in (almost) one line of code we need to look for a way to use what we know of algebra for the numerator and the denominator and then put it all together.
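To see the pay-TV example above in code, here is a minimal sketch (my addition, not from the article; it assumes only numpy): the hypothetical viewing vectors of person A and person B point in exactly the same direction, so their cosine similarity is 1.0 even though the Euclidean distance between them is large.

import numpy as np

# Hypothetical viewing vectors: [hours of sport, hours of movies]
a = np.array([10.0, 4.0])  # person A
b = np.array([5.0, 2.0])   # person B

# Cosine similarity: dot product divided by the product of the vector lengths
cos_sim = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
print(cos_sim)                # 1.0 -> identical tastes in proportion
print(np.linalg.norm(a - b))  # ~5.39 -> yet far apart in Euclidean terms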
Cell Numerator: If we keep matrix A fixed at (3,3), we take the ‘dot’ product with the transpose of B (B is (5,3), so B.T is (3,5)) and we get a (3,5) result. In Python this is easy with:

num=np.dot(A,B.T)

Cell Denominator: It’s a simple multiplication between 2 numbers, but first we have to calculate the lengths of the two vectors. Let’s find a way to do that in a few Python lines using the numpy broadcasting operation, which is a smart way to solve this problem. To calculate the lengths of the vectors in A (and B) we should do this:
1. square the elements of matrix A
2. sum the values by row
3. take the square root of the values from point 2
In the above case, where A=(3,3) and B=(5,3), the two lines below (remember that axis=1 means ‘by row’) return two arrays (not matrices!):

p1=np.sqrt(np.sum(A**2,axis=1)) # array with 3 elements (it’s not a matrix)
p2=np.sqrt(np.sum(B**2,axis=1)) # array with 5 elements (it’s not a matrix)

If we just multiply them together it doesn’t work, because ‘*’ works element by element and the shapes, as you see, are different. Because the ‘*’ operation is element by element, we want two matrices where the first has the vector p1 vertical and copied across the width len(p2) times, while p2 is horizontal and copied down the height len(p1) times. To do this with broadcasting we have to modify p1 so that it becomes fixed in the vertical direction (a1,a2,a3) but “elastic” in a second dimension. The same with p2, so that it becomes fixed in the horizontal direction and “elastic” in a second dimension. (image by author) To achieve this we leverage np.newaxis like this:

p1=np.sqrt(np.sum(A**2,axis=1))[:,np.newaxis]
p2=np.sqrt(np.sum(B**2,axis=1))[np.newaxis,:]

p1 can be read like: make the vector vertical (:) and add a column dimension; p2 can be read like: add a row dimension and make the vector horizontal. This operation for p2 is in theory not necessary, because p2 was already horizontal, and even if it were an array, multiplying a matrix (p1) by an array (p2) results in a matrix (if they are compatible, of course), but I like the above because it is cleaner and more flexible to changes. Now if you look at p1 and p2 before and after, you will notice that p1 is now a (3,1) matrix and p2 a (1,5) matrix. If you now multiply them with p1*p2 then the magic happens and the result is a 3x5 matrix, like the p1*p2 in grey in the above picture. So we can now finalize the (almost) one-liner for our cosine similarity matrix, with this example complete with some data for A and B:

import numpy as np
A=np.array([[2,2,3],[1,0,4],[6,9,7]])
B=np.array([[1,5,2],[6,6,4],[1,10,7],[5,8,2],[3,0,6]])
def csm(A,B):
    num=np.dot(A,B.T)
    p1=np.sqrt(np.sum(A**2,axis=1))[:,np.newaxis]
    p2=np.sqrt(np.sum(B**2,axis=1))[np.newaxis,:]
    return num/(p1*p2)
print(csm(A,B))

Correlation Matrix between A and B In case you want to modify the function to use it to calculate the correlation matrix, the only difference is that you should subtract from the original matrices A and B their mean by row, and also in this case you can leverage np.newaxis. In this case you first calculate the vector of the means by row as you’d usually do, but remember that the result is again a horizontal vector, and you cannot proceed with the code below:

B-B.mean(axis=1)
A-A.mean(axis=1)

We must make the means vector of A compatible with the matrix A by verticalizing it and copying the now-column vector the width of A times, and the same for B. For this we can use again the broadcasting feature in Python, “verticalizing” the vector (using ‘:’) and creating a new (elastic) dimension for columns.
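As a sanity check (my addition, not part of the original article), the same (3,5) matrix can be recovered from SciPy’s pairwise-distance routine, whose ‘cosine’ metric is defined as one minus the cosine similarity. This sketch assumes SciPy is installed and condenses the csm logic above into one expression:

import numpy as np
from scipy.spatial.distance import cdist

A = np.array([[2, 2, 3], [1, 0, 4], [6, 9, 7]])
B = np.array([[1, 5, 2], [6, 6, 4], [1, 10, 7], [5, 8, 2], [3, 0, 6]])

# Same computation as csm(A, B): (3,1) and (1,5) norms broadcast to (3,5)
sim = np.dot(A, B.T) / (
    np.sqrt(np.sum(A**2, axis=1))[:, np.newaxis]
    * np.sqrt(np.sum(B**2, axis=1))[np.newaxis, :]
)

# SciPy's 'cosine' metric is a distance, i.e. 1 - cosine similarity
assert np.allclose(sim, 1 - cdist(A, B, metric='cosine'))
print(sim.shape)  # (3, 5)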
B=B-B.mean(axis=1)[:,np.newaxis]
A=A-A.mean(axis=1)[:,np.newaxis]

(image by author) Now we can modify our function to include a boolean: if it’s True the function calculates the correlation matrix between A and B, while if it’s False it calculates the cosine similarity matrix:

import numpy as np
A=np.array([[1,2,3],[5,0,4],[6,9,7]])
B=np.array([[4,0,9],[1,5,4],[2,8,6],[3,2,7],[5,9,4]])
def csm(A,B,corr):
    if corr:
        B=B-B.mean(axis=1)[:,np.newaxis]
        A=A-A.mean(axis=1)[:,np.newaxis]
    num=np.dot(A,B.T)
    p1=np.sqrt(np.sum(A**2,axis=1))[:,np.newaxis]
    p2=np.sqrt(np.sum(B**2,axis=1))[np.newaxis,:]
    return num/(p1*p2)
print(csm(A,B,True))

Note that if you use this function to calculate the correlation matrix, the result is similar to that of the numpy function np.corrcoef(A,B), with the difference that the numpy function also calculates the correlation of A with A and of B with B, which could be redundant and force you to cut out the parts you don’t need. For example, the correlation of A with B is in the top-right submatrix, which can be cut out knowing the shapes of A and B and working with indices. Of course there are many methods to do the same thing described here, including other libraries and functions, but np.newaxis is quite smart, and with this example I hope I helped you in that … direction
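To make that np.corrcoef comparison concrete (again my addition, a sketch assuming the same data as above): np.corrcoef(A, B) stacks the rows of A and B into one (3+5) x (3+5) correlation matrix, so the A-vs-B part the author describes is its top-right (3,5) block and can be sliced out by shape.

import numpy as np

A = np.array([[1, 2, 3], [5, 0, 4], [6, 9, 7]])
B = np.array([[4, 0, 9], [1, 5, 4], [2, 8, 6], [3, 2, 7], [5, 9, 4]])

# Row-center, then apply the cosine formula: this yields the correlation matrix
Ac = A - A.mean(axis=1)[:, np.newaxis]
Bc = B - B.mean(axis=1)[:, np.newaxis]
corr_AB = np.dot(Ac, Bc.T) / (
    np.sqrt(np.sum(Ac**2, axis=1))[:, np.newaxis]
    * np.sqrt(np.sum(Bc**2, axis=1))[np.newaxis, :]
)

# The A-vs-B correlations sit in the top-right block of the stacked matrix
full = np.corrcoef(A, B)
assert np.allclose(corr_AB, full[:A.shape[0], A.shape[0]:])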
https://towardsdatascience.com/cosine-similarity-matrix-using-broadcasting-in-python-2b1998ab3ff3
['Andrea Grianti']
2020-12-08 13:21:48.486000+00:00
['Machine Learning', 'Data Science', 'Python', 'Marketing', 'Analytics']
2,430
Remembering Steve Jobs
Remembering Steve Jobs Eight years on, we take a look at the life of a man whose innovations changed the world Known for charismatic presentations, perfectionism and his signature turtlenecks, Steve Jobs was a pioneer of the computing industry. From co-founding Apple Computers Inc. in 1976 through to his death in 2011, Jobs was one of the most polarising figures in the technology industry, whose ideas and ingenuity have impacted billions of people worldwide. But his life was so much more than technological innovations. In a world where we force-feed children from a young age the idea that the road to success is through academia, Steve Jobs is an example to us all of how life is so much more satisfying when you are doing what you love, as opposed to doing what you have been told you should do. Born To Be A College Graduate? Born in 1955, Steve hadn’t even left his mother’s womb before she made the decision that he would go to college. She wanted him to have a better future than the one that she could provide, so following his birth, he was immediately put up for adoption to a college-educated, Catholic couple. But unfortunately for Steve, the couple passed on adopting him, explaining they wanted to adopt a girl instead. As a result, Steve was passed on to Paul and Clara Jobs, neither of whom were college-educated. It was only after they promised his biological mother that Steve would go to college that she agreed to sign the adoption papers. Throughout Steve’s childhood, it was clear for all to see that he was an incredibly gifted individual, but his dislike of formal education was just as apparent. And while his parents used their savings to send their son to college and keep their promise to his biological mother, what they perhaps didn’t anticipate was him dropping out after just six months. ‘Fate, it seems, is not without a sense of irony.’ — Morpheus, The Matrix Steve’s high school yearbook photo, taken before he discovered turtlenecks. (Image credit Seth Poppel) Giving a commencement speech at Stanford University in 2005, Jobs joked that the speech he was giving was the closest he ever came to college graduation. When discussing his time at college, he said: ‘I couldn’t see the value in it, I had no idea what I wanted to do with my life and no idea how college was going to help me figure it out, and here I was, spending all of the money that my parents had saved their entire life.’ Looking back, he said that dropping out of college was one of the best decisions that he ever made, as he was no longer burdened with studying things that didn’t interest him, which allowed him the freedom to study the things that did. He recalled the opportunity he had to take a class in calligraphy, something he found fascinating. The information learned in that class manifested itself more than ten years later, when he incorporated it into the design of the original Macintosh. Without that class, he argued, the Mac would never have had the varied typefaces and fonts that were built into it. The First Bite Of The Apple Apple was famously co-founded in Steve’s parents’ garage in 1976, a far cry from the $5 billion, 2,800,000-square-foot building that currently serves as Apple HQ in Cupertino, California. For the following nine years, Apple was on the rise. Beginning with the Apple I, through to the release of the first Macintosh in 1984, the company had begun to establish itself as one of the premier computing companies in the world. 
Steve Jobs’ public demonstration of the first Macintosh computer in 1984 (Image credit www.businessinsider.com) But in 1985 things began to unravel. Then-CEO John Sculley believed that Jobs was hurting the company, and the two had extremely opposing views on what direction the company should be going in. After failing to regain control of the company, Jobs resigned from his position at Apple, leaving behind the company which he had co-founded, aged twenty, in his parents’ garage. The Inbetween Years When discussing the years that followed his departure from Apple, Jobs said that being fired was the best thing that could have happened to him at that point in his life. After leaving Apple he went on to found not one, but two successful companies. One of them went on to become the most successful animation studio in the world (a small company you may have heard of by the name of Pixar), while the other, a computing company called NeXT, was subsequently bought by Apple in 1996, bringing Jobs back to the company which he had co-founded twenty years previously. What happened in the following years can only be described as a complete revolution in personal computing technology. Returning To The ‘Not-So-Big’ Apple By the time he had regained his position as CEO of Apple in 1997, the company was a stone’s throw away from declaring bankruptcy. After what had been a very prosperous decade for the company, sales were starting to decline. Microsoft had increased its market share by offering personal computers that were much more cost-effective than those offered by Apple. Jobs knew that continued competition between the two companies would spell the end of Apple. So, upon his return, he announced that Apple would be going into partnership with Microsoft. In his address to the Macworld Expo in 1997, he made it clear that there was no reason that the two companies couldn’t both be successful. He said: ‘We have to let go of this notion that for Apple to win, Microsoft has to lose. We have to embrace a notion that for Apple to win, Apple has to do a really good job.’ As part of the deal, Apple received $150 million from Microsoft, and Microsoft Office would be made available on Mac, with the additional announcement that Internet Explorer would be the default web browser on the Mac going forward. The rest, as they say, is history. Starting with the iMac, Apple went on to develop a huge range of products that went far beyond the realm of computers, such as iTunes, the iPod, and the iPad, in addition to its MacBook range. But you know as well as I do that there is one product that is most associated with Apple. There is one product that has dominated the market since it was first released in 2007. I’m talking, of course, about the iPhone. Steve Jobs unveiling the first-generation iPhone at the Macworld Conference in 2007 (Image credit www.time.com) Having sold over 2 billion units since it was released in 2007, Apple has recently launched the 11th iteration of the iPhone. When Steve Jobs first announced the iPhone back in 2007, he said that the company’s aim was to capture 1% of the global mobile phone market. It’s safe to say that they succeeded. As of the financial quarter ending December 2018, Apple had garnered over 50% of the global smartphone market. Final Days In 2003, Jobs was diagnosed with pancreatic cancer. Following his diagnosis, he delayed having surgery, preferring to opt for alternative medical treatment in his attempts to beat the disease. 
It wasn’t until 2004 that he finally bowed to pressure from doctors and underwent surgery, in which the tumour was (apparently) successfully removed. In the years running up to his death, Jobs began to take more of a backseat at Apple. Having spent most of his time with the company taking the lead at Apple events and product unveilings, his retreat from the public eye fuelled speculation about his deteriorating health. He took several medical leaves of absence from his duties at Apple in the years prior to his death. On August 24th, 2011, Steve Jobs officially resigned from his position as CEO of Apple. In his letter to the board he stated: ‘I have always said if there ever came a day when I could no longer meet my duties and expectations as Apple’s CEO, I would be the first to let you know. Unfortunately, that day has come.’ He died six weeks later at his home, surrounded by his family, after enduring a relapse of his pancreatic cancer. His Legacy Is Much More Powerful Than Any iPhone. We live in a world that loves to tell you how to live your life. Steve Jobs showed us that even the best-laid plans don’t always come to fruition, yet he is an example of why that isn’t necessarily a bad thing. Had he stayed in college instead of pursuing what he was truly passionate about, would the Macintosh have ever been invented? Would you be reading this on your iPhone, iPad, or MacBook? Possibly not. We can’t control most of what life throws at us. But what we can control is our response to it. Instead of wallowing in negativity after being fired from the company that he co-founded, he went on to found not one, but two companies. If he had never been fired from Apple in 1985, he may never have founded Pixar, and the world might never have known the delight that is the adventures of Woody, Buzz and the rest of the gang in Toy Story. Had he never founded NeXT, he may never have developed the technology which Apple then went on to use in its core range of products. Remember Steve Jobs. Not only because you have him to thank for the iPhone in your pocket. Remember him because he was a man who didn’t waste his time on Earth doing things he didn’t want to do, because he was too busy doing the things that he loved. Because isn’t that the kind of life we all want to lead? “If you live every day like it was your last, one day you will most certainly be right.” — Steve Jobs
https://medium.com/swlh/remembering-steve-jobs-edf2257bdab3
['Jon Peters']
2019-10-07 20:04:52.664000+00:00
['Innovation', 'Entrepreneurship', 'Culture', 'Technology', 'Apple']
2,431
An Investigation of the California Wildfire Crisis
After running the animation, a couple of trends appear. Since the onset of 2017, there have been far more damaging fires across the state, with coastal and forested areas hit hardest. Before 2018, fires were far more equally distributed in both size and frequency all over California. However, from 2018 onwards, there has been a rampant increase in deadly wildfires across Northern California compared to other regions in the state. 2020 alone has seen the most destructive fires in modern history, with the August Complex Fire and LNU Lightning Complex of Northern California burning an estimated 1.4 million acres to date. The August Complex Fire (depicted as the large, bright yellow circle in the 2020 pane) was extinguished only last month, in November, after an unprecedented cost of $264.1 million. One striking oddity in the trend of wildfire growth is the plummet of wildfires in 2019. With the lowest levels since 2004, the destruction done in 2019 amounted to only 260,000 acres — a mere 6% of the damage done in 2020. However, experts believe this dip will not change any long-term patterns and was simply an irregularity due to unusually heavy precipitation. From this animation, it is clear that the most destructive fires occur in more heavily forested areas, centered around Northern California. However, can the same be said about the frequency of wildfire incidents? Hover over each bar for year-specific information. In this bar chart, broken up by region (NorCal and SoCal), it is clear that while the number of incidents in Southern California is increasing at a slight linear rate, the number of incidents in Northern California appears to be growing exponentially year-on-year. This stark division can be attributed to the presence of more vegetation in NorCal, which acts as a veritable time bomb waiting to ignite during the dry season. With the exception of 2019 — which was an anomalous year for wildfires — there continues to be an overarching trend of both increased frequency and severity of wildfires, especially in Northern California. County-Level Analysis The figure below is a choropleth map — a thematic map colored in proportion to a specific statistical variable. In this case, the choropleth has been partitioned to visualize the cumulative acres burned since 2003 on a county-by-county basis. As expected, Northern California trends towards more land destruction. Hover over the choropleth for information on specific counties. Heavily damaged areas, depicted in purple and black, are also among the most forested regions in California. The San Joaquin Valley, an inland area constituting much of the Central Valley, has a more Mediterranean climate and is far less forested. This, in turn, accounts for fewer acres burned compared to the more northerly Sacramento Valley. Source: Drought Monitor NOAA Climate.gov Prolonged droughts and the rapid drying of vegetation account for the increase in wildfires. However, variability in temperature and precipitation are often the real instigators. Years of drought in California have traditionally been followed by very wet weather, leaving behind vegetation that turns into fuel for wildfires. Cyclical weather patterns, along with strong, warm winds, prime wildfires for destruction. As seen above, Northern California is the biggest victim of moderate-to-severe drought compared to the rest of the state. This dichotomy might be explained by Southern California’s access to the Colorado River Aqueduct, which supplies more than a billion gallons of water a day.
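As a side note for readers who want to reproduce a county-level map like the one above, here is a minimal sketch with Plotly (the charting library named in this article's tags). The county FIPS codes and acre totals below are illustrative assumptions rather than the article's actual data, and the GeoJSON URL is Plotly's public county-boundaries example file.

import json
from urllib.request import urlopen

import pandas as pd
import plotly.express as px

# US county boundaries keyed by FIPS code, from Plotly's public example datasets
url = "https://raw.githubusercontent.com/plotly/datasets/master/geojson-counties-fips.json"
with urlopen(url) as f:
    counties = json.load(f)

# Illustrative totals only; FIPS codes for Lake, Shasta and Trinity counties (CA)
acres = pd.DataFrame({
    "fips": ["06033", "06089", "06105"],
    "acres_burned": [2_000_000, 1_600_000, 1_400_000],
})

# Color each county polygon in proportion to its cumulative acres burned
fig = px.choropleth(acres, geojson=counties, locations="fips",
                    color="acres_burned", color_continuous_scale="Inferno")
fig.update_geos(fitbounds="locations", visible=False)  # zoom to the plotted counties
fig.show()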
Hover over each bar for county-specific information. The bar graph shown above reveals that Lake County, Shasta County, and Trinity County suffer the brunt of wildfire damage across California. The reason these three counties lead the pack is that they all encompass national parks and are more heavily forested than any other region in the state. Coincidentally, the Shasta-Trinity National Forest — the largest national forest in California — was also the site of the August Complex Fire, the most destructive wildfire in modern California history. Across California, many minor incidents occur every day and are often contained without loss of property or life. Even with fewer wildfire incidents relative to other counties, Lake County is still responsible for more than 2 million acres’ worth of destruction over the last two decades. To better understand the impact wildfires have on private property, we drew from data provided by the insurance industry. The choropleth map below serves to highlight figures provided by Verisk Analytics concerning the percentage of California households at high to extreme risk for wildfires. Hover over the choropleth for a risk assessment of various counties. Counties in blue indicate that less than 20% of households are threatened, whereas red counties indicate a larger percentage of households at severe risk for wildfires. Alpine, Trinity, Tuolumne, and Mariposa counties had the highest concentration of severely at-risk households in Northern California. The reason for this disparity is that households in northern counties are dispersed at the edge of forested areas and are often directly in the way of wildfires. In many instances, stray embers carried by the wind make their way to rain gutters, bursting into flames and engulfing hundreds of properties. Investigating Major Wildfire Incidents After investigating wildfire impact on a county-by-county level, we shifted to analyzing specific wildfire incidents. The bar chart below shows the top wildfire incidents in terms of acres burned. Hover over each bar for incident-specific information. Among the wildfires listed, the top six all took place in Northern California between 2018 and 2020. Furthermore, the August Complex — the single largest wildfire and the largest fire complex in recorded California history — is more than twice as large as any other recorded incident. Blazing across Mendocino, Humboldt, Trinity, Tehama, Glenn, Lake, and Colusa counties, the August Complex Fire has been dubbed a “gigafire” — a term never before used in California history, signifying a blaze that burns at least a million acres. From this subset of the top 10 wildfires in California history, the average duration that these wildfires burned was a whopping 59.5 days. Naturally, the question arises: What Makes Wildfires So Hard to Put Out? Wildfires behave in many ways like a combustion-powered hurricane. By channeling air and fuel upward, they can cause forests to spontaneously combust without actually coming into contact with flames. This combination of explosive growth and hellish conditions often renders fire-support teams from both the ground and air useless. In addition, since fires are most prevalent during the dry season, a lack of humidity leaves humans without the aid of Mother Nature. 
Seasonal Disparity For many California natives, fire season has always been marked by the end of summer well into late fall — making up the months of August, September, and October. To better understand the annual pattern of wildfires, we created a boxplot of Acres Burned across wildfire incidents from 2003–2020. By grouping wildfires by the month they started, we aimed to recognize and explain the seasonal disparity in damage. Hover over each point for incident-specific information. Right from the start, it’s clear that wildfires dominate the autumn months. The peak of fire season spans from late July through September and is marked by radical wildfires destroying twice the number of acres compared to incidents in the spring and early summer. Interestingly, there likely exist two distinct fire seasons in California. A study by University of California authors found that California cycles between what is called the Summer Fire Season and the Santa Ana Fire Season. Characterized by dry winds blowing towards the coast from the interior, the Santa Ana Fire Season occurs from October through April — striking more developed areas and inflicting more economic damage. The Summer Fire Season, however, can take place anywhere in the state and often impacts remote/wild areas — as was the case with the August Complex Fire, which engulfed the Mendocino, Six Rivers, and Shasta-Trinity national forests. Making up the rest of the calendar year, the Summer Fire Season runs between June and September. While the Santa Ana Fire Season inflicts more economic damage, the Summer Fire Season accounts for the most land destruction, with millions of acres destroyed every year. In order to prove that there exist two distinct fire seasons in California, we tried to replicate the results reported by University of California researchers. We decided to apply a form of unsupervised machine learning known as K-means clustering to see if the “clusters” of months we identify align with the months outlined in the University of California paper. The idea behind this was to apply K-means clustering to key metrics that might reveal whether a set of months truly deserves to belong in its own distinct fire season. Following the findings of the University of California researchers, we wanted to investigate the metrics that are most representative of fires during the Summer Fire Season and the Santa Ana Fire Season. For example, the researchers observed that ‘Summer Fires’ are more inclined to burn more slowly, while ‘Santa Ana Fires’ tend to burn along the coast. As such, we took into account the average fire duration by month as well as the percentage of fires located in coastal counties. Other metrics include the average acres burned and total wildfire incidents on a month-by-month basis. To ensure that the algorithm weights each metric with the same relative importance during the clustering process, we normalized each column between 0 and 1 using MinMaxScaler from Python’s scikit-learn library. After normalizing our metrics, we were able to use the K-means algorithm to partition our data into clusters, as seen below. There appear to be two distinct clusters based on this dataframe. Cluster A, identified by ‘1’, exists from June through September. Cluster B, identified by ‘0’, exists from October through April. Cluster A is striking in that there are more wildfire incidents, a greater number of acres burned on average, and longer wildfires. 
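For concreteness, here is a minimal sketch of that normalize-then-cluster step. Every number and column name below is invented purely to illustrate the pipeline; the article's actual dataset is not reproduced here.

import pandas as pd
from sklearn.cluster import KMeans
from sklearn.preprocessing import MinMaxScaler

# One row per month; all values are made up to illustrate the shape of the data
monthly = pd.DataFrame({
    "month":        ["Jan", "Feb", "Mar", "Apr", "May", "Jun",
                     "Jul", "Aug", "Sep", "Oct", "Nov", "Dec"],
    "avg_acres":    [1200, 900, 800, 1500, 2500, 9000,
                     14000, 30000, 22000, 6000, 2000, 1100],
    "n_incidents":  [10, 8, 9, 14, 25, 60, 95, 120, 80, 35, 15, 9],
    "avg_duration": [4, 3, 3, 5, 8, 20, 30, 45, 35, 12, 6, 4],
    "pct_coastal":  [0.55, 0.60, 0.50, 0.45, 0.35, 0.20,
                     0.15, 0.10, 0.20, 0.50, 0.55, 0.60],
})

# Scale every metric to [0, 1] so K-means weights them with equal importance
features = ["avg_acres", "n_incidents", "avg_duration", "pct_coastal"]
X = MinMaxScaler().fit_transform(monthly[features])

# Partition the months into two candidate fire seasons
monthly["cluster"] = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
print(monthly[["month", "cluster"]])

With made-up numbers like these, the two clusters should split roughly into the summer months and the rest of the year, mirroring the grouping described above.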
These results share similarities with the ‘Summer Fire Season’ description outlined by University of California researchers. In the same vein, fires in Cluster B tend to take place along coastal areas and burn fast — characteristic of the Santa Ana Fire Season. Hover over each point for incident-specific information. To better visualize the disparity between fire seasons, we applied a color scale to the boxplot from earlier. Months in orange constitute the Santa Ana Fire Season, whereas months in red represent the more destructive Summer Fire Season. Human Activity There are a variety of ways to gauge which incidents can be described as the worst wildfires in California history. Different metrics include size (acres burned), deadliness (lives lost), and destruction (infrastructure destroyed). In order to better understand the part humans play in instigating the worst wildfires in California history, we look at the following subsets: the top 20 largest wildfires, the top 20 deadliest wildfires, and the top 20 most destructive wildfires.
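Pulling subsets like these out of an incident table is a one-liner per metric in pandas. The sketch below assumes hypothetical column names and uses three example rows with approximate, publicly reported figures; it is an illustration, not the article's dataset.

import pandas as pd

# Hypothetical incident table (columns and rows for illustration only)
fires = pd.DataFrame({
    "name": ["August Complex", "Camp Fire", "Mendocino Complex"],
    "acres_burned": [1_032_000, 153_000, 459_000],
    "fatalities": [1, 85, 1],
    "structures_destroyed": [935, 18_804, 280],
})

largest = fires.nlargest(20, "acres_burned")               # top 20 by size
deadliest = fires.nlargest(20, "fatalities")               # top 20 by lives lost
destructive = fires.nlargest(20, "structures_destroyed")   # top 20 by buildings lost
# With fewer than 20 rows, nlargest simply returns every row, sorted descending
print(largest[["name", "acres_burned"]])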
https://ucladatares.medium.com/an-investigation-of-the-california-wildfire-crisis-7104b1cb4a69
['Ucla Datares']
2020-12-22 00:23:52.666000+00:00
['Python', 'California', 'Plotly', 'Wildfires', 'Datares']
2,432
5 Books Bill Gates Thinks You Should Read in 2021
5 Books Bill Gates Thinks You Should Read in 2021 A reading list that will inspire you to think differently. Photo via Flickr I’ve recently noticed that I have an increasing number of things in common with Bill Gates. Unfortunately, I’m not talking about having a billion dollars in my checking account. Instead, I’m referring to a genuine love of reading that inspires me to think differently about the world. How do I continuously find new and exciting books to read? By paying attention to recommendations from passionate readers, such as Bill Gates, and then reading them as soon as I get an opportunity. So below are several interesting books that Bill Gates has recommended. Each of them changed the way I see the world, and I’m confident they will do the same for you, too.
https://medium.com/curious/5-books-bill-gates-thinks-you-should-read-in-2021-19f926e9730c
['Matt Lillywhite']
2020-12-28 13:15:53.716000+00:00
['Education', 'Books', 'Reading', 'Productivity', 'Self Improvement']
2,433
The Psychology of Airport Design
The Psychology of Airport Design How airports are designed using traveller behaviour Photo by chuttersnap on Unsplash As a case study in environmental design, airports are fascinating. At the core, their function seems fairly simple: a holding space for travellers who are waiting for a flight. Yet they’re actually an important retail space for many companies and, although you may not notice it, they’re designed with this firmly in mind. Airport designers think carefully about the journey that travellers make through an airport, from check-in to security to gate. They then draw on behavioural psychology, examining how people move around spaces like airports. Combining these two elements allows airport designers to design a space around the traveller’s path which will entice them with retail and restaurant opportunities. It’s summer, and many of us will be passing through an airport or two over the coming months. Next time you’re in an airport, you might notice some of these ways that airport design reflects psychology and human behaviour. The stressful part is out of the way quickly Photo by Moralis Tsai on Unsplash Taking a flight can be a stressful experience. When you first get to the airport you’re faced with check-in. You’re already worried that you’ll be hit with a mega charge if your bag is over the weight limit. You’re also quizzed with intense questions: did you pack your own bag? After that, you’re directed through a series of queues to get through security. Even if there’s nothing suspicious whatsoever in your bag or on your person, airport security is enough to get your pulse racing — especially if the beeper goes off and you have to have a hands-on search. Airport designers are well aware of this stress. And they also know that after that stress comes relaxation, the start of holiday mode. In terms of retail, this is the key time. All of the airport admin is done, and it’s time to grab a glass of pre-holiday prosecco and browse the duty-free shops. That’s why changes in airports are generally focused on optimising that initial portion of the airport experience: streamlining security checks, or improving at-home check-in, for instance. Get the stress over and done with quickly, and prolong that period of pre-flight relaxation when passengers are more likely to spend money on retail shops and restaurants. Pathways are built right through duty-free Duty-free shops are a key area of income for airports. Because travellers experience that period of relaxation immediately following security, airport designers will usually have the duty-free shop as the first thing that a traveller sees after security. It acts as a ‘re-composure’ space where the traveller can move from stressful process to relaxed retail. Research has shown that if customers have to physically walk past items which are for sale, they’re 60% more likely to make a purchase. That’s why almost every duty-free shop in an airport is configured in such a way that all passengers have to walk through it. It’s usually the gateway between security and the retail space of an airport. By exposing customers to products in this way, they’re able to maximise revenue. Walkways mirror how we walk Most of us are right-handed, meaning that we’ll naturally use our right hand to pull our carry-on luggage. To improve our balance, we’ll therefore tend to walk in an anticlockwise direction. That means that when we’re walking through an airport, most of us are looking to the right far more than we’re looking to the left. 
Airport designers use this behavioural knowledge to inform how they design routes through an airport. They mimic the way that we walk, designing walkways which curve from right to left. The majority of shops will then be placed on the right-hand side, where they are more visible to people who are walking to the left. Metres become minutes Photo by Steven Hille on Unsplash Airports can be quite big spaces, housing thousands of passengers. It can, therefore, take a while to get to your gate when the flight is ready. To mitigate the stress of this, airport signs for gates used to give the metres between your current position and the gate. However, in recent years you may have noticed that those metres have become minutes — the time it takes to walk to the gate. Research found that passengers understood minutes as a marker of distance much more quickly than they could understand the metres. This helps us feel more at ease during our time in the airport, because we know exactly how much time we need to get to the gate. Therefore, we’re likely to spend longer in that retail and restaurant area of the airport, where we’re helping the airport to generate profit. Keep it cool and dark Photo by VanveenJF on Unsplash Recently, ‘smart glass’ has begun to be used in airport design. This smart glass can adjust itself based on the amount of sunlight exposure coming through it, preventing too much heat and sun glare from entering the airport. Dallas-Fort Worth International Airport ran a test with the smart glass in October 2018. They found that when the smart glass was installed, customers were much more likely to stay longer in the airport’s restaurants, and to buy an extra drink or two. Sales of alcohol increased by a huge 80% during the test period, simply because it was cooler and darker in the restaurant. References Insights for this post were gathered using this report by Intervistas, titled: ‘Maximising Airport Retail Revenue’.
https://medium.com/swlh/the-psychology-of-airport-design-5858a5a2db25
['Tabitha Whiting']
2019-07-23 14:36:29.405000+00:00
['Travel', 'Design Thinking', 'Design', 'Psychology', 'Airports']
2,434
When Your Past Haunts Your Current Relationships
“The patient cannot remember the whole of what is repressed in him, and what he cannot remember may be precisely the essential part of it. He is obliged to repeat the repressed material as a contemporary experience instead of remembering it as something in the past.” ~Sigmund Freud The old saying, “we are creatures of habit” rings true here, especially when talking about why we repeat — on autopilot — the things that we instinctively know are shooting us in the foot. Are we gluttons for punishment? Well, behavior-analytically speaking… no. If we continue doing what inevitably doesn’t have our best interests at heart, then we’re sabotaging ourselves, as in compulsively… and repetitively. What I am referring to here is called “repetition compulsion,” a term coined by Sigmund Freud as he watched a young child throw a toy repeatedly and then pick it back up, only to throw it again. In true Freudian analysis, he proposed that the young child was missing his mother, who had left the house earlier, and that the kid’s behavior was a combination of ridding himself of his absentee mother (by tossing the toy) and then bringing mom back (grabbing the toy), thus “fixing” the situation. Freud aside, we are often guilty of a ‘repetition compulsion’ of sorts if we gravitate to the same show to binge over and over. Or, we may head to our favorite getaway and order the same thing from room service each time. While this is usually seen as being in our Netflix comfort zone, or having an affinity for Cobb salad, there isn’t anything necessarily self-destructive in these “repetitions” — as long as these habits aren’t being used to self-sabotage or to avoid or numb other pain. For example, if you binge Netflix as an escape or to numb the pain of every failed relationship, then you’re avoiding digging deeper and unboxing what may keep you locked into a habit of chasing a new relationship each time a problem surfaces in an existing one. However, healthy repetitive behavior isn’t what Freud or other analysts are referring to regarding this phenomenon. More often than not, a repetition compulsion is a series of learned, habitual behaviors and behavior patterns that originate in childhood and negatively influence us throughout our adult lives unless (or until) we choose to conquer them and make healthy changes. Because of their repetitive nature, most of us probably gravitate towards thinking that compulsive bad habits are part of intimate relationships. And, you would be correct in thinking this. Intimate relationships may be where a repetition compulsion has most of its strength and influence, because of the vulnerable emotions and emotional intimacy that are usually tied into them.
https://medium.com/hello-love/when-your-past-haunts-your-current-relationships-66adb4df634
['Annie Tanasugarn']
2020-12-02 16:56:36.890000+00:00
['Mental Health', 'Life Lessons', 'Love', 'Psychology', 'Life']
2,435
The Story of our New Medium Publication Writing Heals
This past week I had an almost non-stop bombardment of family stresses. I won’t go into all of it, but my man Bob’s sister Carol has rapidly progressing M.S. (multiple sclerosis). She is going downhill fast. She is my age, 57. It’s very hard to watch. Bob’s mom, who has been her caretaker, called yesterday, sounding weary and overwhelmed with grief. “This is the hardest thing ever… to be a parent watching your child die slowly!” sigh… “Carol has not stopped eating. She weighs over 300 lbs now. She just doesn’t want to give up her sweets! She said, ‘Mom, I want to go out happy!’ She knows she doesn’t have much time left and wants to eat what she wants. She has pretty much given up. Now she wants to enjoy herself and her life!” I guess I can’t blame her. She had a half-written book she was getting ready to publish. It was a life ‘dream’ of hers. I planned to help her with formatting and editing, but now that dream is over. She can no longer write or focus on anything. It’s sad to see the quickness of her deterioration — how fragile and short life really is. Photo by Clemente Cardenas on Unsplash I also have two friends (both also my age) who have cancer. It doesn’t look good for them either. So… all this had me thinking about the urgency of life. Why we must try to use our time for life-enhancing things. I believe this in my soul!
https://medium.com/writing-heals/the-story-of-our-new-medium-publication-grand-opening-today-ad842706363f
['Michelle Monet']
2019-10-05 23:38:33.093000+00:00
['Mental Health', 'Healing', 'Writing Tips', 'Writing Life', 'Writing']
2,436
The Subtle Art of Writing Copy
The Subtle Art of Writing Copy Good design is based on good copy. That’s why UX writing should definitely be the next thing in your skill set. Should designers… If you work in the design industry, you might have read at least one article starting with the following words: should designers [name the skill here]? This question returns to us like a boomerang, bringing a new thing each time. Should designers code? Should they know how to create breathtaking UI? Should they conduct research and client workshops? What about analytics? The list goes on and on. Should designers write copy? As a response to this phenomenon, we see that many factions have emerged in the design community. From specialization fanatics to one-man-army believers. Recently, the newest skill on everyone’s mind is writing. UX writing, to be specific. So, should designers write? FOLI — the fear of lorem ipsum In 2019 everyone (sic!) knows that lorem ipsum is bad. If you use lorem ipsum, it is because you either believe that content is not your responsibility, or you are too busy (lazy) to come up with your own copy. In defense of those who still like it, though, Scott Kubie, Lead Content Strategist at Brain Traffic, admits he sometimes uses lorem ipsum to see the shape of the text and visualize the paragraphs. In any other case, working on real content is often crucial to the design. If you don’t consider the length of the CTA labels, headings or blog posts when designing the interface, your whole concept will most probably break the second it goes live. Should I do it or should you? This brings up the issue of responsibility. But to talk about responsibility, we need to acknowledge the problem first. Writing content is at the very bottom of both the product team’s and client’s to-do list. Designers assume that the client will create or adjust the content based on their designs; that there will be someone else that will do it better, so they leave some places blank or use “button label” instead of real text. At the same time, clients believe they will get a working product that is not only beautiful but also functional and ready for development. It is thus not surprising that when the end of the project is on the horizon, it becomes clear that we are missing the text, including: error messages and recovery flows; confirmation screens; user-visible metadata like page titles and search engine descriptions; transactional emails; in-app user assistance; support documentation; changelogs; and feature descriptions and marketing copy. As designers, we are responsible for delivering the product, and that definitely includes the copy, at least in a draft phase. Why? Because the copy sometimes has the power to alter the whole design, and we need to be aware of that. Using specific copy is a design decision. Our job is to guide the client on how to shape the content so that it agrees not only with the design itself but also with the content strategy that is best for the product. That said, we are not able to, and shouldn’t, produce the content without the client’s input. That is why cooperation is the key here. Ok, but I can’t write Well, that is simply not true. If you know how to design, you also know how to write. You might not be very good at it, sure, but that is where all the publications on UX writing come in handy. Polishing your skills in this area will help you not only in coming up with better copy, but also in formulating and explaining your ideas to clients. 
And remember, unless you are a UX writer assigned specifically to create content for the project, the fact that you are responsible for it doesn't mean you need to write it all by yourself. Browse through your client's product descriptions, use common language patterns and don't be afraid to ask for feedback. It is not originality that is being assessed here. Sometimes being too creative puts us on the straight path to dark patterns or confirm-shaming, as in the examples below, where copy that was supposed to be funny actually shames users into doing something they might not want to do.

Examples of confirm-shaming — not opting in means that you accept the website insulting or shaming you.

Types of content

In general, we usually divide digital content into 3 categories:

- Interface copy, or microcopy — short text elements like labels for form fields, text on buttons, navigation labels, error messages, etc. The interface would break without them.
- Product copy — not necessarily a direct part of the interface, but it plays an important role in the functioning of the product. It focuses on supporting the reader, e.g. the body of an onboarding email.
- Marketing copy — connected with sales or promotion, often longer and focused on persuading the reader. Here you can be more creative.

Depending on the product, there can be many more categories to deal with. The most crucial one for designers is the first one — microcopy. However, clients sometimes need guidance with the other types of content too, and for your design to work, it is best to address that at an early stage. If the blog posts are very long or difficult to understand, even the most beautiful UI won't improve the user experience. And if the value proposition is not stated clearly enough, the bounce rate might be very high despite the new shiny information architecture. This doesn't mean you are responsible for the content that clients should produce themselves. But creating a draft can start a discussion, and discussion can lead to mutual understanding. After all, it is in your best interest as a designer for the product to work and perform well when it sees the light of day. You can then create that Dribbble shot and attach the link to the real product with pride.

Design and copy are inseparable

DOs and DON'Ts

Below you can find some tips to get you started. The full list of practices to follow if you want to deliver high-quality copy is much longer; think of this as a base on which you can build later on.

DOs

More and more companies create their own content style guides. They are connected with their brand identity, but also use the very basic principles of clear and appropriate style. Check the Shopify, Mailchimp, Buzzfeed or even Material Design content style guides for more inspiration and implement it in your own process. Don't just copy them 1:1, though, as the context of your users will be different depending on whether you are designing a banking app or a social media platform. Try to use these style guides to create your very own.

Try to cover all the possible errors, but while doing that, consider if you really need each message. Can the problem be solved by changing the flow, layout or colors? You may discover that the error message can be avoided by simply getting rid of the error itself.

When starting a project, always agree on who is responsible for the content.
As I've mentioned before, it doesn't mean this person needs to write it all, but they need to manage it, start the discussion and make sure that this issue is being addressed.

Establish the values that will guide you throughout the process. For starters, try being helpful and human in your copy. This means empathy in error messages and avoiding technical jargon.

Not sure if the text is understandable? Try to test a sample in one of the online tools like the Hemingway App. It takes just a few seconds and you get feedback right away.

Make it easier for the user to take in information. Use numerals instead of words for numbers, especially those higher than 9. Replace dates with "today," "yesterday," "tomorrow." Make sure button labels always have action verbs.

DON'Ts

Don't try to be too clever with the microcopy. It is not about creative writing but about being simple and transparent, so that users do not even notice your choice of words. You can work on some more exciting phrases when writing a marketing text for the landing page. Still, in most cases short beats good. Users don't read word for word, and if the text is not easy to scan, they might just not read it at all. Which one of these messages are you most likely to read?

Don't ignore the edge cases. If you don't write the copy for every possible error, there are two ways it can play out. One is that users will get the same error message each time, regardless of the problem. That is definitely not helpful and can get really annoying. Option two is that developers will write these texts for you, which often ends up as very technical jargon. To avoid that, work closely with the engineers, learn about all the edge cases and address them with the right text.

Don't forget about other languages. If your product has more than one language version, you need to consider that when designing. Otherwise, this small lovely button of yours will break when switching from English to German.

Avoid long blocks of text. It is easier to digest information when it is divided into smaller chunks. Want to bring it to the next level? Add subheadings too. This way you can inform the users what they will find in the next couple of paragraphs.

Why it is worth it

Designers can't be everything at once. But stepping out of your comfort zone has so many benefits that it is worth at least considering. And the art of writing needs to be cultivated — after all, it is one of the things that makes us human. If you want to practice writing outside of your work, start with something like the Day One app. See how it goes and work your way up from there.

I was once told that good design does not require words. You can agree or disagree with that sentence, but from my experience as a UX designer, I've learnt that good design is based on good copy — one does not exist without the other. Poorly written text can be misleading, and even the prettiest mockups might not make up for it.
https://medium.com/elpassion/the-subtle-art-of-writing-copy-3a566c367bf7
['Ewelina Skłodowska']
2019-09-16 12:10:49.349000+00:00
['Ux Writing', 'UX Design', 'Design', 'Productivity', 'UX']
2,437
How emotions work to create preference
Two main traits of the human brain work together when creating brand preference: energy conservancy and emotions. Whereas the brain's need to create preference stems from its need to conserve energy, i.e. the survival instinct (read more…), emotions help us create this preference. The important thing here is that emotion is not the brain being lazy; it is the brain's way of evaluating and labeling a choice (and then being able to identify preference). How does this work? Let's again look to Daniel Gilbert. Gilbert says that great psychologists in the end are measured by how they finish the sentence "men differ from monkeys because they …", and Gilbert's claim is that they synthesize the future. What does this mean, to "synthesize the future"? When faced with a decision of a certain magnitude, we imagine ourselves "using" the product, or the product being in "use". We do this by recalling previous experiences that we find relevant and that help us understand: images based on our previous experiences, which we collect because we find them relevant to the situation. Our emotions connected to these previous images are then mixed and create an end-state emotion that we connect to the choice at hand. And the brain works in such a way that emotion created from imagining things has the same effect as emotion from a real situation. Some quotes to support this appeared as three images in the original article.
https://medium.com/137-jokull/how-emotions-work-to-create-preference-8f27c92d6558
['Helge Tennø']
2017-01-21 05:08:52.876000+00:00
['Perspective', 'Advertising', 'Psychology', 'Marketing']
2,438
Plandemic : Debunked
SCIENCE

Plandemic: Debunked

Stop Sharing Propaganda and Misinformation

Altered still from Plandemic meme, 2020

Go on social media right now and you are destined to run into someone sharing or promoting a video that has gone viral called PLANDEMIC. They will parrot the points from this video like good little puppets, without taking the time to research who is giving them the information, or the credibility of the information being presented. This video is pure sensationalism, filled with outrageous lies and mistruths. And the information is being shared by someone who is a known charlatan. If you believe in and promote this video, you've simply fallen for the trick.

First, who is Dr. Judy Mikovits? Google her name. What do you find? I think the first thing a person should notice, sticking out like a giant red flag, is her connection to the anti-vaccine movement. She has become a hero to a movement that claims vaccines are dangerous, cause autism, and kill people, thus making her the hero of a movement that has caused a resurgence in long-conquered diseases like the measles. The next red flag a person should pay attention to is the word DISCREDITED, which always appears next to her name. When a scientist is kicked out of the scientific community and then becomes a champion for conspiracy theories, some might say this is anything but a coincidence. But unlike conspiracy theorists, who connect invisible dots no one else can see, there are dots to be connected here that are as large as planets.

As if the description of the video, which mentions a global conspiracy headed by the Rockefellers with ties to Nazi Germany in World War II, isn't enough to clue people into its insane agenda, Plandemic follows the straightforward and rudimentary template of a conspiracy theory propaganda film, much the same as others that have come before it, like Zeitgeist or Loose Change (videos claiming 9/11 was an inside job). It shows people talking in a darkly lit room, making claims they present on their own authority without any evidence to back them up. It does not give any other perspective. The music is a droning and somewhat ominous keyboard that provides an eerie tone for the footage, making it appear to the viewer like they are seeing something that is supposed to be secret. The viewer is intended to feel this way because this is simply cut-and-paste manipulation tactics 101.

The person who made this movie is not exactly credible either. Mikki Willis has made quite a few questionable "documentaries." In one of them, he concludes that a filmmaker named Daniel Northcott contracted leukemia and died due to finding a cursed Mayan bone… Luckily for you, I've watched this 30-minute video so you don't have to. Allow me to walk you through the claims it makes, and why they are blatantly incorrect and dangerous:

Dr. Judy Mikovits

The video claims Judy Mikovits to be "the most accomplished scientist of her generation." NOT TRUE. She is known to be a fraud who manipulated laboratory conditions to produce her intended results. She was fired from the lab she worked in over concerns of integrity. She has repeated dangerous claims linking autism to vaccines after initially trying to link them to Chronic Fatigue Syndrome. While her early career does include some good work researching HIV and AIDS, this good potential was quickly undermined by her pursuits working for a private lab with a biased interest in Chronic Fatigue Syndrome. The video claims Dr.
Judy Mikovits revolutionized the treatment of AIDS in 1991 with her doctoral thesis. She did publish a thesis and at least one other AIDS study, such as this one, but there is no evidence to support the claim that her research "revolutionized" the treatment of this disease.

In this video, Mikovits claims that Big Pharma was behind her being jailed and slandered, and that she was arrested with NO CHARGES and her home was searched WITHOUT A WARRANT. These claims are outlandish, and easily proven to be FALSE. Mikovits was charged with theft. There was a warrant, issued out of Washoe County, Nevada. The charges were related to her taking samples and equipment from the lab she worked in without consent, and to concerns that she would destroy evidence. On top of that, she was only in jail for five days, whereas she makes it seem like she was in prison for years.

The Mikovits SCIENCE scandal

Plandemic sensationalizes Mikovits as a victim by claiming that her published article in the magazine SCIENCE was something that shook the scientific community and that a conspiracy worked to take down her work. This is FALSE. Her paper was found to be manipulated, and peer-reviewed studies could not replicate her findings. She worked under conditions that were heavily biased toward linking XMRV to Chronic Fatigue Syndrome, because the owner of the institute she worked for had a daughter suffering from the disease. She manipulated the lab to present false positives. Thus, her paper was retracted from SCIENCE. Mikovits hates Dr. Anthony Fauci because he is the one who ordered the review of her research.

Dr. Fauci and the AIDS Epidemic

Mikovits makes the horrific claim that Dr. Anthony Fauci stole her research and suppressed it, thus leading to millions of deaths from AIDS. She is essentially blaming one person for the entire AIDS epidemic, and trying to claim she had come up with a miracle cure. There is no evidence for any of this. What we should realize is that this is a gigantic straw man, making an enemy with all the credibility of the boogeyman out of someone who in actuality is a scientific hero, as Fauci, although not without his faults, has done considerable good in this arena. Look at her motivation here. It is easily seen through. She seeks revenge. Ironically, however, she seems to feel no remorse for claiming she worked in the lab that weaponized the Ebola virus and killed over 11,000 people, even though this is also easily proven to be a lie, as Ebola was discovered in 1976, well before she became a "scientist."

Anthony Fauci. Fair Use.

The Bayh-Dole Act

Mikovits claims that the Bayh-Dole Act allows researchers to gain patents for treatments they discover, and that this is a conflict of interest for the scientific community. This is far more complicated than Mikovits wants us to believe, and again, her reasons are personal. The Bayh-Dole Act allows non-profit organizations to retain patents on their inventions even when they were found through federally funded research. However, it also allows the federal government to take control of an invention if it is concluded that doing so is necessary for the public good. During her research into CFS, Mikovits had a $1.5 million grant that she tried to take with her when she left the institute.

XMRV

Plandemic alleges that Mikovits discovered XMRV and that it is linked to plagues responsible for millions of deaths. FALSE. She did not discover it. It was found by Dr. Robert Silverman.
Silverman linked it to prostate cancer, and worked with Mikovits on her study of CFS. He later retracted his own research, admitting that he had made errors. There is no evidence to support Mikovits' claims that this virus is the root cause of everything she wants it to be the cause of. This is sensationalism.

Coronavirus Claims

Plandemic and Mikovits go on at length about their real agenda, which is to make dangerous and misleading claims regarding the coronavirus and the current global pandemic. In a nutshell, here are the claims made:

- Covid-19 had to be made in a lab. FALSE.
- Covid-19 would take 800 years to occur naturally from SARS 1. FALSE. Viruses mutate so fast that we have already seen numerous examples occurring in real time. This claim is what the scientific community would likely refer to as BULL****.
- The government is purposefully faking the Covid-19 numbers. FALSE. It is important to note here that if you die from pneumonia caused by the flu, the cause of your death is still the flu. This is a conspiracy theory that acts under the presumption that whatever must be true in the United States has to be true all over the world. That is impossible.
- Italy was hit harder because it utilized flu vaccines in 2019 containing a strain of H1N1 common in dogs. ARE YOU KIDDING ME???????
- Hydroxychloroquine is the best treatment for Covid-19. FALSE.
- People should not be sheltering in place. Nor should they be wearing masks or gloves. GIVE ME A BREAK.
- The MeDiA is Fake News, and the people making this Fake News should be put in jail.

That last point is what convinces me this video was funded by the Trump Administration. It is nothing but a HUGE distraction from the myriad ways Trump has failed the American people in a time of absolute crisis, having created a scenario which has caused over 70,000 people in America to die, and still counting. The video runs the gamut of popular Trump talking points, and points fingers at the people Trump likes to point fingers at. Coincidence? See how easy it is to jump to conclusions?

Plandemic makes so many false claims, and at such a fast pace, that it is almost impossible to keep up with it. And that is part of the manipulation technique. It bombards you and overwhelms you with information, making you more susceptible to believing it, because it is stating so many horrific things with such authority that how could it not be true? But you need to stop and think. Who are these people? Why should I believe them over the experts? A few of the people speaking in this video are never even given credentials. They are just strangers wearing scrubs. I might as well put on a set of scrubs and make a video myself. Would you believe me if I did? (EDIT: These "doctors" have been identified. Dan Erickson and Artin Massihi are urgent care workers in California and are frequently on Fox News spreading false information. The other "doctor" is a chiropractor named Eric Nepute from St. Louis, who is notable for telling people that tonic water is the cure for everything that ails you.)

This viral video showcases the inherent dangers of propaganda and misinformation campaigns. It shows how readily people believe things because they want to believe them, rather than seeking the truth. It shows how people will search for explanations outside the realm of rationality or possibility when their senses are flooded with real fear of mortality. It is important that we as a people do not fall for such instinctual defense mechanisms.
It is important that we vet the information we are given and that we refuse to spread this false propaganda narrative. STOP SHARING THIS VIDEO. STOP PROMOTING FEAR. STOP BEING A WILLING PARTICIPANT IN MAKING THE PROBLEM WORSE. Stop it. Just stop.
https://medium.com/politically-speaking/plandemic-debunked-403a6e7d3ff7
['Jay Sizemore']
2020-06-15 21:12:11.773000+00:00
['Propaganda', 'Plandemic', 'Science', 'Judy Mikovits', 'Coronavirus']
2,439
Android Image Color Change With ColorMatrix
Binary the Color

We have the primary colors of red, green, and blue, and the secondary colors of magenta, cyan, and yellow. We could convert all colors to these binary colors, depending on the dominant color of the pixel. To do that, we need to come up with a formula. But before that, one important note:

If a calculated color value is greater than 255, it will be capped at 255. If a calculated color value is smaller than 0, it will be capped at 0.

This note is very handy for our case. We want to ensure that all color values end up as either 255 or 0. We could use a formula like this:

NewColor = 255 * OriginalColor - 128 * 255 (capped to the range 0 to 255)

Let's test the values:
- Assuming the original color is 0, the new value is -32640. Since it is capped at a minimum of 0, it becomes 0.
- Assuming the original color is 255, the new value is 32385. Since it is capped at a maximum of 255, it becomes 255.
- Assuming the original color is 127, the new value is -255, which is capped to 0.
- Assuming the original color is 128, the new value is exactly 0.
- Assuming the original color is 129, the new value is 255.

So we have shown that any original color of 128 or less is converted to 0, and any original color of 129 to 255 is converted to 255. With that, we can write our matrix as below:

[ 255,   0,   0, 0, -128*255,
    0, 255,   0, 0, -128*255,
    0,   0, 255, 0, -128*255,
    0,   0,   0, 1,        0 ]

You'll realize that the decision to convert to either 0 or 255 is based on the coefficient of 128 that we set. If we make that a variable, we can then adjust how bright or dim the binary color result is, as per the demo below.
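To make this concrete, here is a minimal Kotlin sketch (not from the original article) of how such a matrix can be wrapped in Android's ColorMatrixColorFilter. ColorMatrix and ColorMatrixColorFilter are real android.graphics classes; the function name binaryColorFilter and the threshold parameter are our own illustration of making the 128 coefficient variable:

import android.graphics.ColorMatrix
import android.graphics.ColorMatrixColorFilter

// Builds a filter that snaps every color channel to 0 or 255.
// `threshold` plays the role of the 128 coefficient above:
// lowering it brightens the result, raising it dims it.
fun binaryColorFilter(threshold: Float = 128f) = ColorMatrixColorFilter(
    ColorMatrix(floatArrayOf(
        255f,   0f,   0f, 0f, -threshold * 255f,  // red row
          0f, 255f,   0f, 0f, -threshold * 255f,  // green row
          0f,   0f, 255f, 0f, -threshold * 255f,  // blue row
          0f,   0f,   0f, 1f, 0f                  // alpha left unchanged
    ))
)

// Usage, e.g. on an ImageView:
// imageView.colorFilter = binaryColorFilter()

The framework treats the fifth column of the matrix as a flat offset and clamps each resulting channel to the 0 to 255 range, which is exactly the capping behavior the formula relies on.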
https://medium.com/mobile-app-development-publication/android-image-color-change-with-colormatrix-e927d7fb6eb4
[]
2020-12-24 13:39:44.573000+00:00
['Android', 'Android App Development', 'Mobile App Development', 'Programming', 'Design']
2,440
Social Media: The Death of Real World Interaction?
The digital age has been transformed into one of social media and networking. With over a billion monthly active users on sites like Facebook alone, it is hard to argue against social networking being ubiquitous. These social sites act as gatekeepers for the harboring of online connections between users. These forms of online communication are not relegated to specific age groups either, as more than 73% of online adults today (18-65+) are on some sort of social site (Social Networking Fact Sheet). As more and more people continue to find ways to communicate in the digital world, however, new issues arise that have never been faced before. These issues span major sectors of our cultures and societies, from the physical to the psychological. While new technologies are ushering in new mediums and outlets for interaction, old ones are soon forgotten. In a world where we can get a message across to millions of people with the click of a button, the most fundamental type of communication, human face-to-face interaction, is becoming less and less important. Social media can have catastrophic effects on humans as social creatures if used to replace rather than enhance, provoking false senses of connection, psychological changes in how people approach relationships, and negative emotional responses to these types of communications.

Social media is often becoming a replacement for building and establishing connections in the real world, and there is something fundamentally wrong with this mentality. In a study conducted by the Pew Research Center, 54 percent of those surveyed said they text their friends at least once a day, while only 33 percent said they talk face-to-face with their friends on a consistent basis (Antisocial Networking). This tells us several things. Direct interaction is no longer seen as the best way to communicate, especially among teens, and people are not putting as much value as they once did on face-to-face interaction. Psychologist Sherry Turkle puts it brilliantly in describing the road we are going down by spending all of our time on online communication: "We are sacrificing conversation for mere connection" (Connected, but Alone?). We are sacrificing the experiences and understanding of real-world interactions that are necessary for our development for a mere connection established in social media, one that is superficial. These connections, which are no more than surface deep, are becoming sufficient replacements for face-to-face interaction among social media users because they are easier to establish, but they have dire consequences for social development in the future. Ms. Turkle also details this phenomenon very well in her talk: "…So from social networks to sociable robots, we're designing technologies that will give us the illusion of companionship without the demands of friendship" (Connected, but Alone?). It is undeniable that we as humans look for companionship throughout our lives. After all, we are social creatures; however, a text saying "I love you" is not the same thing as someone saying it directly to another person. It does not provoke the same level of emotional attachment, and this, among other things, is what is wrong with social media and why direct interaction is still so vital in our lives.
For adolescents especially, the skill of maintaining real-world interactions (and it can really be considered a skill, given how our society is coming to approach this type of communication) is the "bedrock" of development. Real-world interaction allows us to understand each other profoundly and to get to know each other down to the most fundamental parts of who we are. Social media and its social connections just don't have the same level of profound connectedness. This is why the false sense of connection that comes as a byproduct of social media is so dangerous to who we are and who we end up becoming. We are in fact becoming more "connected" through social media in the very sense of the word, but this "connection" is not one we want to replace our real-life connections with. Social media can truly have harmful effects on us psychologically if we use the medium to replace rather than enhance, and if we do not realize that the connections we are establishing through these mediums do not suffice for our social development.

With the emergence of online communication there has also been a difference in the way we approach technology when it comes to relationships and companionship. Psychologically, we have a mentality different from that of past generations because of this new technology. In Sherry Turkle's Alone Together: Why We Expect More from Technology and Less from Each Other, Turkle clearly lays out this idea:

As infants, we see the world in parts. There is the good: the things that feed and nourish us. There is the bad: the things that frustrate or deny us. As children mature, they come to see the world in more complex ways, realizing, for example, that beyond black and white, there are shades of gray. The same mother who feeds us may sometimes have no milk. Over time, we transform a collection of parts into a comprehension of wholes. With this integration, we learn to tolerate disappointment and ambiguity. And we learn that to sustain realistic relationships, one must accept others in their complexity. When we imagine a robot as a true companion, there is no need to do any work. (Turkle 60)

Although the robots mentioned in this piece of Turkle's writing refer to physical technology, the idea very much applies to how we see things when dealing with digital technology. We have adopted the notion that online means of connection can substitute for those connections that are so vital in the real world, when in fact that is simply not true. As per the Pew Research study and countless more like it, people are substituting this new form of communication for its real-world counterpart, so this is not a psychological adaptation taken up by a select few. In real-life conversations, we learn to deal with the shortcomings and complexities of others, and vice versa. Every real-life conversation is like practice, a warm-up for the game of social fluidity, if you will. This simply cannot happen with any robot or any digital connection. With a digital connection you have all the time and energy in the world to project yourself as the perfect version of who you would like to be. No one has this luxury in the real world, and avoiding real-world interaction altogether is simply impossible. Social media has brought forth a drastic change in how we treat relationships. This mental adaptation to how we treat this form of online technology is not a path we should be going down, and it can ultimately spell trouble for future generations.
The wrong message is being created by users of these networks who think that it is alright to replace rather than enhance, which is what these networks were originally intended for. We have led ourselves to believe that online interactions themselves can be companions, because in a way we feel more comfortable in these spaces. Several studies on the matter, however, have produced opposite results about how we feel emotionally when we use social media.

Social media is affecting its users not only in how they act socially, but in how they feel socially when using these sites. According to advocates of these online social systems, an online social connection is supposed to evoke sensations of emotional satisfaction, as this type of communication is still social in nature and we as humans get satisfaction from social activities. What has been seen, however, is that the more people use sites like Facebook, Twitter, and WhatsApp, the more anxious and emotionally taxed they become. A study of roughly 300 people by the Salford Business School found that these social networks are exacerbating negative emotions. The researchers found that "If you are predisposed to anxiety it seems that the pressures from technology act as a tipping point, making people feel more insecure and more overwhelmed. These findings suggest that some may need to re-establish control over the technology they use, rather than being controlled by it" (Anxiety UK). More than half of the respondents reported having negative emotions after using social networking sites (Anxiety UK). This corroborates the idea that social media cannot be used to replace the interactions which take place in the real world. It may seem that these digital interactions are satisfactory on the surface, but there is something within us, much deeper than we can come to realize: no matter how hard we try to indulge ourselves in our digital communications, we cannot escape the truth that these interactions are not enough.

Younger generations especially are vulnerable to the vortex that is social media. For the first time in history, face-to-face interaction has dropped to third behind texting and IM/FB messaging in the so-called "iGeneration," those born from 1990 to 1999 (Rosen). As these younger generations are nurtured around technology and social media, it becomes increasingly difficult to get out of a digitally driven social life. With such severe emotional implications, the vast amounts of time spent on these sites should not be promoted, especially among adolescents.

There are other views, however, on the benefits of having a social life online. Some argue that social media is a beneficial tool, allowing us to become more connected than we ever were by letting us reach a much greater audience. Others argue that social media allows people to build social lives where it is hard to build them in the real world. While these arguments can certainly be true, the fact of the matter is that social media does not replace real-world interaction, and while it is of benefit to have connections with dozens of people at once, this tool often becomes a replacement for real-world interaction. What has been seen is that social media simply does not produce the same level of psychological well-being as real-world interactions, which is why "direct" interaction is still so important, as shown by the Public Library of Science's study.
"Because we also asked people to indicate how frequently they interacted with other people "directly" since the last time we text messaged them, we were able to test this idea. Specifically, we repeated each of the aforementioned analyses substituting "direct" social interaction for Facebook use. In contrast to Facebook use, "direct" social interaction did not predict changes in cognitive well-being, and predicted increases (not decreases) in affective well-being" (PLOS ONE).

The study clearly illustrates that we may perceive social media and "direct" interaction to be on equal ground cognitively speaking. Emotionally, however, the very quality of our ability to be satisfied is diminished by the use of social media and the lack of real-world interaction, which in turn can have harmful effects on how we develop socially. It is possible to have a balance between real and digital social connections, but these online connections HAVE to be used to enhance, not replace, which has unfortunately not been the case, as corroborated by the aforementioned studies.

Digital technology is evolving at an alarming rate. Face-to-face interaction has become the third method of communication, behind text messaging and IM messaging, in just a matter of a few years (Rosen). Billions of people around the world are flocking to social networking sites in hopes of creating online connections. The desire for, accessibility of, and interest in these digital connections have put the most fundamental type of communication, face-to-face interaction, in their shadow. It is almost disturbing that humans can abandon such a vital part of our social makeup without thinking twice. We want to have social interactions, but we don't want to go through the trials and tribulations of real-world interactions. It is these complexities in interaction, however, that help us adapt to different social situations in the future, and this is something social media is not preparing us for. Social media can be greatly beneficial if used to enhance those relationships we hold dear in the real world, but more often than not these real-world relationships are being substituted altogether by a digital experience, so those benefits end up having no merit. People would rather text message someone than talk to them face to face, and that says something about who we have become as a society. We prefer interacting with a computer screen or mobile device to interacting with each other directly, and there is something vastly wrong with this way of thinking. In the words of psychologist Sherry Turkle, in today's world we prefer to be "alone together" (Turkle). As direct interaction becomes less prevalent, the false sense of connection, negative psychological adaptations in how we approach digital technology, and negative emotional responses to online outlets brought on by social media are having devastating effects on who we become as social creatures.
https://medium.com/musings-of-a-writer/social-media-the-death-of-real-world-interaction-5e2f33cfd8ee
['Marcos Suliveres']
2018-02-12 18:29:43.242000+00:00
['Social Media', 'Marketing', 'Tech', 'Internet', 'Psychology']
2,441
Use C# and a CNTK Neural Network To Predict House Prices In California
The file contains information on 17k housing blocks all over the state of California:

Column 1: The longitude of the housing block
Column 2: The latitude of the housing block
Column 3: The median age of all the houses in the block
Column 4: The total number of rooms in all houses in the block
Column 5: The total number of bedrooms in all houses in the block
Column 6: The total number of people living in all houses in the block
Column 7: The total number of households in all houses in the block
Column 8: The median income of all people living in all houses in the block
Column 9: The median house value for all houses in the block

We can use this data to train a deep neural network to predict the value of any house in and outside the state of California. Here’s what the data looks like. This is a plot of all the housing blocks in the dataset color-coded by value: you can sort of see the shape of California, with the highest house values found in the Los Angeles and San Francisco areas.

Okay, let’s get started writing code. Please open a console or Powershell window. You are going to create a new subfolder for this assignment and set up a blank console application:

$ dotnet new console -o HousePricePrediction
$ cd HousePricePrediction

Also make sure to copy the dataset file(s) into this folder, because the code you’re going to type next will expect them here. Now install the following packages:

$ dotnet add package Microsoft.ML
$ dotnet add package CNTK.GPU
$ dotnet add package XPlot.Plotly
$ dotnet add package Fsharp.Core

Microsoft.ML is the Microsoft machine learning package. We will use it to load and process the data from the dataset. The CNTK.GPU library is Microsoft’s Cognitive Toolkit, which can train and run deep neural networks. And XPlot.Plotly is an awesome plotting library based on Plotly. The library is designed for F#, so we also need to pull in the Fsharp.Core library.

The CNTK.GPU package trains and runs deep neural networks using your GPU. You’ll need an NVidia GPU and CUDA graphics drivers for this to work. If you don’t have an NVidia GPU or suitable drivers, you can also opt to train and run the neural networks on your CPU. In that case, please install the CNTK.CPUOnly package instead.

CNTK is a low-level tensor library for building, training, and running deep neural networks. The code to build a deep neural network can get a bit verbose, so I’ve developed a wrapper called CNTKUtil that will help you write code faster. Please download the CNTKUtil files and save them in a new CNTKUtil folder at the same level as your project folder. Then make sure you’re in the console project folder and create a project reference like this:

$ dotnet add reference ..\CNTKUtil\CNTKUtil.csproj

Now you are ready to add classes. You’ll need a new class to hold all the information for a single housing block. Edit the Program.cs file with Visual Studio Code and add the code for the HouseBlockData class (a sketch follows below).

The HouseBlockData class holds all the data for one single housing block. Note how each field is tagged with a LoadColumn attribute that tells the CSV loading code which column to import data from. We also have a GetFeatures method that returns the longitude, latitude, median age, total number of rooms, total number of bedrooms, total population, number of households, and median income level of a housing block. And there’s a GetLabel method that returns the median house value in thousands of dollars.
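The post’s inline code block did not survive the scrape. Based on the description above, the class probably looked roughly like this minimal sketch; LoadColumn, GetFeatures, and GetLabel come straight from the text, while the property names and the rescaling are my own guesses:

using Microsoft.ML.Data;

public class HouseBlockData
{
    [LoadColumn(0)] public float Longitude { get; set; }
    [LoadColumn(1)] public float Latitude { get; set; }
    [LoadColumn(2)] public float HousingMedianAge { get; set; }
    [LoadColumn(3)] public float TotalRooms { get; set; }
    [LoadColumn(4)] public float TotalBedrooms { get; set; }
    [LoadColumn(5)] public float Population { get; set; }
    [LoadColumn(6)] public float Households { get; set; }
    [LoadColumn(7)] public float MedianIncome { get; set; }
    [LoadColumn(8)] public float MedianHouseValue { get; set; }

    // the eight input features, in the column order described above
    public float[] GetFeatures() => new[]
    {
        Longitude, Latitude, HousingMedianAge, TotalRooms,
        TotalBedrooms, Population, Households, MedianIncome
    };

    // the label: the median house value, rescaled to thousands of dollars
    public float GetLabel() => MedianHouseValue / 1000f;
}

Dividing by 1000 is one way to get the “value in thousands of dollars” the text mentions; whether the original code rescaled here or the dataset already stored values that way is not visible from the post.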
The features are the house attributes that we will use to train the neural network on, and the label is the output variable that we’re trying to predict. So here we’re training on every column in the dataset to predict the median house value.

Now we need to set up a custom TrainingEngine, a helper class from the CNTKUtil library that will help us train and run a deep neural network. Note the CreateFeatureVariable override, which tells CNTK that our neural network will use a 1-dimensional tensor of 8 float values as input. This shape matches the 8 values returned by the HouseBlockData.GetFeatures method. And the CreateLabelVariable override tells CNTK that we want our neural network to output a single float value. This shape matches the single value returned by the HouseBlockData.GetLabel method.

We’re almost done with the training engine. Our final step is to design the neural network. We will use a deep neural network with an 8-node input layer, an 8-node hidden layer, and a single-node output layer to predict house prices. We’ll use the ReLU activation function everywhere. The CreateModel override builds this neural network. Note how each call to Dense adds a dense layer with the ReLU activation function to the network. The final output layer consists of only a single node without activation.

With the training engine fully set up, we can now load the dataset in memory. We’re going to use an ML.NET data pipeline for the heavy lifting. This code calls the LoadFromTextFile method to load the CSV data in memory. Note the HouseBlockData type argument that tells the method which class to use to load the data. We then use TrainTestSplit to split the data into a training partition containing 80% of the data and a testing partition containing 20% of the data. Finally, we call CreateEnumerable to convert the two partitions to an enumeration of HouseBlockData instances.

Now we’re ready to set up the training engine. We’re instantiating a new training engine and configuring it to use the MSE metric (= Mean Square Error) to measure the training and testing loss. We’re going to train for 50 epochs with a batch size of 16 and a learning rate of 0.001.

Now let’s load the data from the ML.NET pipeline into the neural network. The SetData method loads data into the neural network and expects training features, training labels, testing features, and testing labels, in that order. Note how we’re using the GetFeatures and GetLabel methods we set up earlier. And that’s it: starting the training engine trains the neural network.

After training completes, the complete training and testing curves will be stored in the training engine. Let’s use XPlot to create a nice plot of the two curves so we can check for overfitting. This code creates a Plot with two Scatter graphs. The first one plots the TrainingCurve and the second one plots the TestingCurve. Both curves are defined as the loss values per training epoch. And note the Sqrt method to convert the MSE loss to RMSE (= Root Mean Square Error). Finally, we use File.WriteAllText to write the plot to disk as an HTML file. All of these pieces are sketched in one place below.

We’re now ready to build the app, so this is a good moment to save your work ;) Go to the CNTKUtil folder and type the following:

$ dotnet build -o bin/Debug/netcoreapp3.0 -p:Platform=x64

This will build the CNTKUtil project. Note how we’re specifying the x64 platform because the CNTK library requires a 64-bit build.
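Since none of the post’s snippets survived, here is the rest of the program sketched in one place before we continue with the build. The ML.NET calls (MLContext, LoadFromTextFile, TrainTestSplit, CreateEnumerable) and the XPlot types are real APIs, and SetData, TrainingCurve, TestingCurve, and the three overrides are named in the text. Everything else that touches CNTKUtil (NetUtil.Var, the Dense and ToNetwork helpers, the engine configuration properties, the Train method) is reconstructed from the prose; the subclass name, dataset file name, and MetricType value are my own placeholders:

using System;
using System.IO;
using System.Linq;
using CNTK;
using CNTKUtil;
using Microsoft.ML;
using XPlot.Plotly;

// hypothetical subclass of CNTKUtil's TrainingEngine
class HousePriceEngine : TrainingEngine
{
    // input: a 1-dimensional tensor of 8 float values (matches GetFeatures)
    protected override Variable CreateFeatureVariable() =>
        NetUtil.Var(new[] { 8 }, DataType.Float);

    // output: a single float value (matches GetLabel)
    protected override Variable CreateLabelVariable() =>
        NetUtil.Var(new[] { 1 }, DataType.Float);

    // 8-node input layer, 8-node hidden layer, single output node, ReLU everywhere
    protected override Function CreateModel(Variable features) =>
        features
            .Dense(8, CNTKLib.ReLU)
            .Dense(8, CNTKLib.ReLU)
            .Dense(1)               // no activation on the output layer
            .ToNetwork();
}

static class Program
{
    static void Main()
    {
        // load the CSV with ML.NET and split it into 80% training / 20% testing
        var context = new MLContext();
        var data = context.Data.LoadFromTextFile<HouseBlockData>(
            "california_housing.csv",   // placeholder file name
            hasHeader: true, separatorChar: ',');
        var parts = context.Data.TrainTestSplit(data, testFraction: 0.2);
        var training = context.Data.CreateEnumerable<HouseBlockData>(parts.TrainSet, reuseRowObject: false).ToArray();
        var testing = context.Data.CreateEnumerable<HouseBlockData>(parts.TestSet, reuseRowObject: false).ToArray();

        // configure the engine: MSE loss, 50 epochs, batch size 16, learning rate 0.001
        var engine = new HousePriceEngine
        {
            Metric = MetricType.MSE,    // assumed property and enum
            NumberOfEpochs = 50,
            BatchSize = 16,
            LearningRate = 0.001
        };

        // training features, training labels, testing features, testing labels, in that order
        engine.SetData(
            training.Select(v => v.GetFeatures()).ToArray(),
            training.Select(v => v.GetLabel()).ToArray(),
            testing.Select(v => v.GetFeatures()).ToArray(),
            testing.Select(v => v.GetLabel()).ToArray());
        engine.Train();

        // plot RMSE (the square root of the MSE loss) per epoch, save the chart as HTML
        var epochs = Enumerable.Range(1, 50).ToArray();
        var chart = Chart.Plot(new[]
        {
            new Scatter { x = epochs, y = engine.TrainingCurve.Select(Math.Sqrt), name = "training" },
            new Scatter { x = epochs, y = engine.TestingCurve.Select(Math.Sqrt), name = "testing" }
        });
        File.WriteAllText("chart.html", chart.GetHtml());
    }
}

One detail worth noting: on my reading of the text, SetData takes plain float arrays, which is why the GetFeatures and GetLabel helpers were written to return floats rather than ML.NET column types.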
Now go to the HousePricePrediction folder and type:

$ dotnet build -o bin/Debug/netcoreapp3.0 -p:Platform=x64

This will build your app. Note how we’re again specifying the x64 platform. Now run the app:

$ dotnet run

The app will create the neural network, load the dataset, train the network on the data, and create a plot of the training and testing loss for each epoch. The plot is written to disk in a new file called chart.html. Here’s what it looks like: the training and testing curves stay close together, with the loss slowly dropping with each successive epoch. There is no hint of overfitting. The final RMSE is 80.64 on training and 81.59 on testing, which means my app’s predictions are roughly $80k off. That’s not a bad start. (Disclaimer: RMSE is not expressed in dollars, so this is only a rough approximation of the average prediction error.)

Now I’ll expand the number of nodes in each neural network layer to 64, by changing the CreateModel method in the TrainingEngine class accordingly. The neural network now has 4,801 trainable parameters. And here’s the result: the training process is more unstable now, with some epochs reporting a large testing loss. But the network always corrects itself in subsequent epochs, and there’s still no sign of overfitting. The final RMSE is now 66.63 for training and 67.64 for testing. A nice improvement!

Now let’s add another layer. With the extra layer, the neural network has 8,961 trainable parameters. And here is the result: the training curves look about the same, with a final RMSE of 65.01 for training and 65.55 for testing. Even though I doubled the number of trainable parameters in the network, the results hardly improved.

Let’s do one more experiment. I’ll remove the extra layer and increase the number of nodes in each layer to 128. The neural network now has 17,793 trainable parameters. And here are the results: again the curves look unchanged, with a final RMSE of 64.73 for training and 64.84 for testing.

You can see that adding more hidden layers or increasing the number of nodes per layer is not improving the final results. We’re not getting the loss below 64. In machine learning we’re always looking for the simplest model that provides the most accurate predictions, because if we make the model too complex, it will tend to overfit. So it looks like for this dataset, the optimal neural network that delivers the best tradeoff between accuracy and complexity is the second one we tried, with two 64-node layers.

So what do you think? Are you ready to start writing C# machine learning apps with CNTK?
https://medium.com/machinelearningadvantage/use-c-and-a-cntk-neural-network-to-predict-house-prices-in-california-f776220916ba
['Mark Farragher']
2019-11-19 14:58:49.623000+00:00
['Machine Learning', 'Data Science', 'Artificial Intelligence', 'Csharp', 'Deep Learning']
2,442
For Those Who Try
For those who try to write something meaningful, something which will resonate with the reader, take heart. Prepare yourself.

It’s another day, possibly one of frustration and sadness, because you just looked at your stats and realized you have fewer reads than the day before. Perhaps you’re sitting in front of your computer at the moment, staring at white space, wondering why you can’t think of anything to write. Maybe you’re telling yourself it doesn’t matter because no one’s going to read what you write anyway.

For those of you experiencing this right now, it’s not over yet. Now is not the time to quit. Now is a momentary splash of worry and difficulty which will pass. Maybe it will come back tomorrow, perhaps not, but it shouldn’t be your focus. For those of you who try, now is the time to do just that. Try again.

What we do is challenging. We all know it because we live it every day as we try to make our way forward, as we attempt to claw our way to the top. We struggle with setback after setback. And for all our hard work, we often achieve lackluster results. It’s just so damned frustrating most days, isn’t it?

Up to this point, you’ve probably thought about moving on, searching for something else to nurture your creative spark, what, at least a thousand times? We all have. But the dream still lives inside of us. It lives in you, doesn’t it? We all have that burning desire that won’t go away, the constant urge which prompts us to sit down and write something. Even if we know the chance of what we do going viral is maddeningly slim, we have to write.

For those who try, it’s never been a conscious decision, a purposeful choice to become a writer. It’s not a choice. It’s a dream, a firm resolve, and an understanding of what and who we are. If you’ve not thought of yourself in this light before, remember this, hold on to this. You’re a writer. And you may be thinking, based on the poor results you’re currently experiencing, that you probably aren’t a good one, but that’s okay. We all start the same way; we all share the same pains, the same frustrations, the same heartbreaks. It’s how you deal with them that makes the difference. Remember, it’s not over yet. Unless you decide it is.

For those who try, the difficulties and challenges experienced will never go away, and if we do it long enough, we eventually figure it out. We learn that we’re only as good as, or as bad as, the last thing we wrote. Even if the last thing we wrote was yesterday. And when we sit in front of that white space, we often remember all those dreams which prompted us to begin this writing journey in the first place. We remember and compare them to our current progress. Most times, it saddens us. Sometimes it infuriates us; it makes us so angry we want to just check out and quit.

But we don’t stop. Instead, we try again. Somehow an idea manages to bubble up from where we neither know nor care. All we know is that it’s something we want to say. It’s something compelling us to speak, and so we say it. We tell the story the best way we can. We forget all the challenges which have knocked us about all these days, and we write. Another day, another story, and without us realizing it, another life lesson learned.

For those who try, it’s essential to remember that while the yield of our harvest may be poor most days, we must continue to plow the fields. We must continue to sow these fields with our writing. Though not backbreaking work, I’m sure to all those who try, it seems that way most days.
At the very least, it’s mind-numbing, isn’t it? Most days, it seems as if it’s mind-numbing, grueling tedium as we lay down word, after word, after word. Some days it all becomes an endless stream, a blur of things written and things yet to be written. And yet, we continue to plow those fields, don’t we?

So, if you take away nothing else from this piece, remember this. It’s not over yet. There’s still time for you to write something, time for you to tell us another story, give us another opportunity to think differently about something, provide a perspective only you can provide. For those who try, it’s what we do.

Thank you so much for reading. You didn’t have to, but I’m certainly glad you did.

Let’s keep in touch: [email protected]

© P.G. Barnett, 2020. All Rights Reserved.
https://medium.com/the-top-shelf/for-those-who-try-153f1acff8f9
['P.G. Barnett']
2020-09-09 13:51:53.507000+00:00
['Self-awareness', 'Self', 'Writers On Medium', 'Awareness', 'Writing']
2,443
Why design systems fail, and how to make them work
For a short period of time I worked on a design system at WebNL, an agency specialised in web design, development and marketing based in the Netherlands. Our design system was aimed at improving the bridge between the design and development of the products we’re making. In this blog I will explain how we did this, and why it ultimately didn’t work. Hopefully this will prevent others from making the same mistakes we made, even though we’ve learned a lot from them.

The beginning of the journey

When I started working at WebNL, one of my first tasks was to look into the possibilities of improving the transition between design and development of web products. Traditionally this has been a process of developers ‘copying’ the mock-ups made by designers. The designers did their work primarily in Sketch. They translated their vision of the product into static designs. The developers then wrote HTML, CSS, Javascript and PHP to convert these static designs into a working product. One of the biggest ambitions inside the company was to find a way to make this process less time consuming, as the work was basically done twice.

So the first step I took was to find out more about ways in which this process of ‘copying’ could be automated. I looked into automation in Sketch and found out there were plugins that used the Sketch API for this purpose. But the plugins I found lacked reliability, and I wasn’t really interested in writing my own Sketch plugin. I looked further and discovered that Sketch had recently opened up their file format so their files could be used in other tools. Every property of every group and layer in the design was now easily accessible outside of Sketch, and I quickly realised that I could use this to translate these properties into working products automatically.

Building the first prototype

After my discovery I quickly made a proof of concept. It was a very simple prototype that could turn a Sketch file into a working website. A Sketch file is basically a zip file consisting of images and JSON files. The prototype translated these JSON files into a Javascript array so it could read all the properties stored inside the Sketch files and use them to generate a standard website (a rough sketch of this idea follows at the end of this passage).

Our developers were using a centralized file in which SCSS variables were stored. These variables controlled visual aspects of elements like colors, typography, buttons, and form elements. I took those elements and built them into a library of Sketch symbols which could be edited by designers. Designers could then use these elements as a starting point for new projects. When the visual appearance of these symbols had been changed, I could take the Sketch file and use it to create a new file with variables. Designers could now control these elements in the final product.

/********************** Colors **********************/
$brand-primary: rgb(229,0,68);
$brand-secondary: rgb(172,171,171);
$brand-tertiary: rgb(0,0,0);
$brand-lightest: rgb(248,249,250);
$brand-darkest: rgb(52,58,63);
$brand-success: rgb(85,184,144);
$brand-error: rgb(229,0,68);
$brand-warning: rgb(255,190,61);
$brand-info: rgb(23,162,184);
$text-color: black;

There were also some drawbacks. We could only translate properties of the design into code if they had been standardised. Designers could change properties like colors, font properties, borders and shadows, which were then translated into working code. But the layers and symbols they added would not be translated. That didn’t seem like a problem.
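The original prototype was written in JavaScript and its code is not shown in the post. Purely to make the mechanism concrete, here is a rough equivalent in C# (the language used elsewhere in this collection): unzip the .sketch archive, read the JSON inside, and emit an SCSS variables file. The file name, the JSON paths, and the idea of naming shared styles after variables are all assumptions here; real Sketch documents nest these properties more deeply.

using System;
using System.IO;
using System.IO.Compression;
using System.Text.Json;

class SketchToScss
{
    static void Main()
    {
        // a .sketch file is an ordinary zip archive; document.json holds the
        // shared styles, pages/*.json hold the layers
        using var archive = ZipFile.OpenRead("design.sketch");   // hypothetical file name
        using var document = JsonDocument.Parse(archive.GetEntry("document.json").Open());

        // turn every shared layer style into a line in the variables file,
        // e.g. "$brand-primary: rgb(229,0,68);"
        using var scss = new StreamWriter("_variables.scss");
        foreach (var style in document.RootElement
                                      .GetProperty("layerStyles")
                                      .GetProperty("objects")
                                      .EnumerateArray())
        {
            var name = style.GetProperty("name").GetString();    // e.g. "brand-primary"
            var color = style.GetProperty("value")
                             .GetProperty("fills")[0]
                             .GetProperty("color");

            // Sketch stores channels as 0..1 doubles; SCSS wants 0..255 ints
            int r = (int)Math.Round(color.GetProperty("red").GetDouble() * 255);
            int g = (int)Math.Round(color.GetProperty("green").GetDouble() * 255);
            int b = (int)Math.Round(color.GetProperty("blue").GetDouble() * 255);
            scss.WriteLine($"${name}: rgb({r},{g},{b});");
        }
    }
}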
When designers would come up with new properties or elements, developers could just write new code to extend the existing code. I also started making more complicated elements like cards and menus in a standard way, to make sure designers would not have to come up with new properties or elements as much as before.

A modular approach for the symbols in our design system

My first prototype got everyone at the company excited. The standardised way things worked had the potential to speed up the workflow of designers and developers alike. While designers could use the standardised elements as templates to make a jumpstart, developers would spend less time on getting things right. We got permission to spend 100 hours as an investment for future projects. I used these hours to make more elements and translate them into code. A frontend developer worked alongside me to build the same elements as HTML with SCSS properties. When we were done, we started using the design system in production. The results were still moderate in the early phase, but showed a lot of potential.

Realising we were building a design system

Ironically, when we started to work on the system we didn’t know what a design system was. We had never heard of the term before, until our boss introduced it as an existing thing and as a way to give the project a noticeable name. We named our project the WebNL Design System, and I started to look into other companies that used design systems. During this time I read about Brad Frost, a pioneer in design systems. He talked a lot about them, and he was even writing a book about it. From his book I learned about atomic design systems, a concept I implemented in our design system.

Atoms, molecules, organisms, templates and pages

I also read about how Airbnb was automating the design process. They used intelligent image recognition to analyse sketches made on paper and translate them into working prototypes immediately. I showed a video of their work inside my own company, and that caused people to be even more excited about the potential of design systems.

Another example from Airbnb was react-to-sketch, which Airbnb uses to generate Sketch symbols from existing React components. This way they can use the React components as a single source of truth. For us that didn’t work, because we started a lot of new projects where the Sketch designs were the source of truth. So instead we tried to generate code components from existing Sketch symbols.

This difference also exposed another difficulty we had compared to other companies. They usually had a single brand, providing a single service through a few digital products. But we were making products for a wide range of brands providing even more services, so our design system had to be more flexible. Vox Media has an excellent example of a flexible design system that can be used across brands. To me this proves the feasibility of such a design system, even though it still makes things hard when trying to automate the workflow between design and development.

Fixing bugs in production

After the first hype about our design system, things started to head south. We used the system extensively, but never without trouble. We decided to use the system in short sprints where products were made within one week, because that was where we needed it the most. But on several occasions, especially in the beginning, we had to solve issues during the sprints. Instead of spending time on production, we had to debug the system and produce bugfixes.
Sometimes the designers had simply broken things while editing the Sketch file. During those first trials I worked on getting fixes into the system and making things more robust, so designers couldn’t accidentally break things. And it worked: the system became better and more reliable. But the system still wasn’t living up to expectations.

Managing expectations

Beforehand, we didn’t expect that having a design system that could automate things would have us spending less time on projects. The time we saved could be spent as extra time on our projects, we reasoned. But after a while, a product manager still mentioned that we weren’t spending less time. So not everyone was expecting the same thing from our design system.

But things were also not exactly how we expected them to be, because there were still a lot of bugs, not related to the design system but related to the projects. So any time left at the end of the projects would be spent on solving bugs instead of nice features. In a way this was not what anyone had expected to happen. But I didn’t see this as a problem. We just had to make the system more efficient so more time could be freed up, and fewer bugs would be produced.

Error handling

Yet this wasn’t where our problems ended. Even though the system had become more reliable, the designers were still making mistakes while building their Sketch files. These mistakes didn’t break the system anymore, because I had set up error messages that could be analysed by the developers. My idea was that these messages would cause developers and designers to talk more about problems together, so they would understand each other better. But while they were indeed talking more to each other, it didn’t help them understand each other. The designers still didn’t understand the design system. Eventually I even heard some developers who weren’t directly working with the design system talk about how it didn’t work because designers weren’t using it right. I realized that I had to spend more time explaining the system to designers and co-creating with them.

Teaching designers about design systems

I had already spent a lot of time with developers. But I hadn’t spent much time explaining the design system to designers, assuming they would intuitively know how to use it. This was a mistake. After that realisation I spent a lot of time teaching our designers how to use the design system. I found out that they had some understanding of components, but they just weren’t used to working with nested components, naming conventions, and layer and text styles. This caused them to ignore some core Sketch principles that the design system relied upon.

But moreover, they also weren’t used to working with design templates. Before the design system was created, they always started out with a blank page, using ideation to create new and innovative designs. They wanted each design to be unique and incomparable to any other. Even though the design system had been built upon patterns used in their previous work, they wanted to deviate from that work. This caused headaches for developers, because complying with the wishes of the designers meant they now had to do more work instead of less.

The end of our Design System

We did eventually reach a point where designers understood enough about Sketch principles and design systems to use them without much trouble. But by the time we reached this point, an unexpected decision was made to completely overhaul our standard codebase.
There would be no central file with SCSS variables anymore, making it harder to generate SCSS variables from our Sketch files. All of the existing code components were also put out of order; they would all have to be rebuilt before we could automate them again.

At the same time, Invision launched their Design Systems Manager (DSM), a product which had become available in beta a short while after we had made our first prototype. DSM offered an API to translate designs into SCSS variables, like we had been doing ourselves before. Now it was out of beta and could be used in production. Even better, it offered a Sketch plugin for designers which made it easier for them to work with the components and styles used in our design system. We also decided that it would be best to switch to their API for future use, as we had found out that Sketch was continuously updating their file format, making it time-consuming to keep generating SCSS variables ourselves.

These events finally made us decide to pull the plug on the design system. We would have to rebuild the design system in a new way to make it automated again, and we just didn’t have the time at that moment. Instead we focused on smaller improvements with Invision DSM and our new codebase.

Takeaway

I still think design systems can do a lot of good, and at WebNL we are also still working on new design systems for clients. They are just more customized now and less automated. But there are some lessons we have learned that everyone should take in mind before creating their own design system.

Manage expectations. Don’t make yourself or other people think your design system will change the world by saving you time. Instead, focus on things that are really important, like designers and developers understanding each other.

Don’t do everything at once. At the start of your journey, it can be tempting to try and make a complete design system. This won’t work, as you’ll have to explain and decide upon everything you make together with other designers and developers. Instead, try to take small steps over time.

Design for people. The biggest mistake I made was thinking that I could improve the connection between designers and developers by putting a system between them. It’s much better to actually get them in a room and have them make decisions together, even when this process takes a lot of time and effort.

I hope these lessons can help you avoid the mistakes we made during our first attempt at building a design system. Hopefully I will be able to share more about our new design systems workflow in the near future. I’m also curious to know how other people use design systems in their workflow. Leave a comment if you’d like to share your experience or if you have any questions.
https://uxdesign.cc/why-design-systems-fail-and-how-to-make-them-work-6f6d812e216d
['Daniël De Wit']
2019-01-03 17:40:56.407000+00:00
['Development', 'Design Systems', 'Design', 'Sketch', 'UX']
2,444
Self Esteem and Expectations
Self Esteem and Expectations

Balancing on your pedestal

Photo by Timothy Dykes on Unsplash

Growing one’s self-esteem is like sorting the wheat from the chaff: a sieving system where beliefs and actions and habits are brought to light to be examined and queried. Does this empower me or disempower me? Do I want this or not?

Perhaps it is maturity, but lately I’ve been more discerning. I’ve been questioning the very atmosphere of expectations — social and personal — and whether I need to obey them or not. I’m cherry-picking. When I look at the expectations in my hands I realise they don’t belong to me — they disintegrate in my calloused palms as if nothing more than powdery ash. Like little delusions. Some of them mass delusions. Even traditions are not making much sense. All those expectations are like nails snagging my lace dress.

I have a long friendship with a man who is so audacious it’s impressive. He does not bend to anyone. He stands on a lot of toes and puts many a nose out of joint. Other men object to his Alpha maleness, but my friend doesn’t care for those definitions. He genuinely doesn’t care what people think of him. Impervious. He defines himself. He’s not anchored by social mores. He’s authentic and game. He does whatever he wants as if his life depends on it. Ten years ago it made my jaw drop. I had the epiphany — if he can do whatever he wants and get away with it, I can do whatever I want!

But knowledge and implementation are two different things. Repetition helps it integrate. It’s been a decade of chipping away, practicing using my voice, saying no, putting my well-being first, recognising the power-point where I drop the ball and roll over. I jot notes in the margins of my experience like an actor learning a new script. How interesting, I think; next time I’ll do it like this. I scribble — poise.

As your sense of loyalty to self increases, you kind of stand up inside yourself. Less tempted to spend energy being sparkly at a party, preferring instead to observe. Not interested in proving anything to anyone anymore, but in self-improvement, to see what more you can learn, experience, attempt and master. Noticing the point where you’d usually appease and instead standing your ground. Many expectations I now dismiss with a flick of the wrist. I have other things to do. I have a high price on my head.

To free ourselves of the expectations of others, to give us back to ourselves there lies the great, singular power of self-respect. - Joan Didion

Yesterday a man barrelled towards me on a city street and instead of making way for him I continued cutting my own line — at the last moment he had to move his body so as not to bump me. Someone did an experiment about that once, about how women move aside for men. I’ve been limiting the number of times I say sorry, reserving the word for when I mean it, instead of saying it as a means of deferment.

Self-respect is a discipline, a habit of mind that can never be faked but can be developed, trained, coaxed forth. - Joan Didion

Standing on the pedestal you make for yourself means you can see clearly. You’re braver. The little knocks and nicks and snide remarks don’t penetrate. It means shining your own light and not becoming desolate over failed attempts. It means not getting sucked into snake dens. It means you like yourself a lot; you make no apologies for who you are or how you live or what you choose. It means throwing out the labels and categories. You don’t accept criticisms from people who have no skin in the game.
You can divine motives by feel and vibe, and you don’t sacrifice anything to unworthy causes. You make your own decisions. You stand on the rock of yourself. You realise, and it sinks in: if it’s meant to be, it’s up to me. You care less for the opinions of others. You take yourself in hand. You stop worrying about being told off. You start eating with your hands.

If you don’t care to be liked, they can’t touch you. - Naval Ravikant

He that respects himself is safe from others. He wears a coat of mail that none can pierce. - Henry W. Longfellow

Don’t throw your power away like a slut. - Anon

Thanks for reading, Louise
https://medium.com/illumination/self-esteem-and-expectations-be6895aa708e
['Louise Moulin']
2020-12-07 01:54:15.298000+00:00
['Self-awareness', 'Self Love', 'Motivation', 'Life', 'Self Improvement']
2,445
Jake Reisch ’15 makes headphones that improve seniors’ lives
As co-founder and CEO of Eversound, Jake Reisch ’15 leads a team that creates wireless headphones designed to improve quality of life for older adults and ease communication between residents of senior living communities and their caregivers. To date, the technology has been adopted by over 500 senior living communities. Eversound’s founders were named to the “Forbes 30 Under 30” list for consumer technology in 2018.

Eversound headphones can be used for group activities, communication with caregivers, music therapy, and visits with friends and family.

What does your business do, and what problem does it solve?

Eversound’s goal is to improve health outcomes and quality of life for seniors in elder care communities. Social isolation among seniors is linked to higher rates of mortality and greater health care costs. Eversound makes easy-to-use headphones that enhance focus and engagement for seniors with hearing loss or dementia. They can be used for group activities, communication with caregivers, music therapy, and visits with friends and family. We also provide member communities with a digital library of activities they can use to stimulate social interaction.

How did you get the idea for your business?

My co-founders and I had watched as our loved ones’ senses declined and they struggled to remain connected to the world around them. We wanted to create something that would help. Many of Eversound’s users have lost their spouses or their children, and some of them no longer have anyone visiting them. We see ourselves as advocates above all else.

“Social isolation among seniors is linked to higher rates of mortality and greater health care costs…. We wanted to create something that would help.”

Starting a business is a big risk, especially straight out of school. How did you decide to take the risk?

My last semester at Cornell, we talked to people in over 100 senior living communities about the idea for Eversound. Almost no one believed in us. John Alexander ’74, MBA ’76, a former Cornell Entrepreneur of the Year, was the only person who saw the potential. He gave us a few words of encouragement, our first investment and countless hours of mentorship. That really helped to get us off the ground.

How has your experience at Cornell impacted how you approach business?

Going through Cornell’s eLab accelerator helped to structure my thinking. It forced me into a customer-centric development approach and taught me how to address each problem we faced. Ken Rother, Tom Schryver, Zach Shulman, Brad Treat and Deb Streeter were all critical figures in my learning experience at Cornell.

To date, Eversound’s technology has been adopted by over 500 senior living communities.

What has been your proudest moment as an entrepreneur? Why?

I recently had a check-in with one of our most valuable partners, who has Eversound in use in over 50 assisted living communities. She told us about the impact we were having on the residents and the staff’s lives, and how amazing their Eversound account manager was to work with. It was indescribably rewarding to think back on where we started and to hear the impact we’re having now.

“Going through Cornell’s eLab accelerator helped to structure my thinking. It forced me into a customer-centric development approach and taught me how to address each problem we faced.”

Who or what inspires you?

Many of my mentors inspire me with their good will and authenticity. It takes a lot of dedication to accomplish what our investors and advisors have accomplished.
Life is short, and I firmly believe that rallying people around important missions can make a difference. If you had one piece of advice for someone just starting out, what would it be? Force yourself into the habit of monthly reporting with a consistent and focused metric dashboard. After you take your first funding, update your investors monthly, without fail, on the good and the bad. It builds rapport, creates accountability and forces you to look soberly at the progress you’re making over time.
https://medium.com/cornell-university/jake-reisch-15-makes-headphones-that-improve-seniors-lives-b837a6d96161
['Cornell University']
2019-12-18 20:42:29.832000+00:00
['Cornell', 'Technology', 'Cornell University', 'Startup', 'Entrepreneurship']
2,446
Day One Python Engineer
__author__ = "Alex Varatharajah"

class SoftwareEngineer:
    """So you made it! Through all the applications, all the tests and all the interviews… Welcome to ONZO! I have been here just over a month now as a Software Engineer, and I’d like to do a short retrospective on what my first month has been like."""

def __init__(self, day_one_python_engineer):
What better place to start than your first day? It is highly likely that you will be reading a “Day One Python Engineer” or an equivalent guide for your role on Confluence. You’ll be setting up your virtual environments, installing your IDEs, setting up Git and cloning repositories, creating a branch, writing tests for a codebase you have no idea about, raising a PR… Googling what PR means… The point is, it can all seem very overwhelming. That’s OK; it is normal. If you manage to raise a PR on your first day, kudos :+1: (don’t forget to install Slack).

def my_first_few_days(self, stand_up):
I came to ONZO with a small Python background from my previous job and a passion to learn the fundamentals of programming best practice. Having not worked at a company that follows agile principles before, I have found it such a breath of fresh air to come into a team with such a good philosophy for it. The first few times, I felt Stand-Up was a bit intimidating, as you are speaking in front of a whole new group of people, but once you realise that everyone is on the same page, ready to listen and willing to help, it just becomes very natural… It also helps that everyone is so friendly and welcoming. My first few days were spent pair programming with various other engineers in the team. This was great because I got hands-on with our codebase while having someone with whom to discuss the structure and why we do certain things. I would thoroughly recommend doing this (though you may not have a choice); it is an easy way to share knowledge and to catch flaws in the code you are writing at the source, before they go into production.

def my_first_ticket(self, sprint_before_you_can_walk):
Pretty soon after, I picked up my first ticket, which was a design for a postal-code-to-lat-long lookup. This involved creating a Confluence page to discuss ideas about how best to attack the problem. Once this was done, it was sent round the team for comment before we organised a meeting to discuss the ideas and wrote stories to fulfil the epic. Initially, I thought “what have I got myself in for?”, but actually it has given me the best opportunity to get stuck into a small cog of the machine, start playing around with ideas and learn a lot in a short amount of time.

def day_to_day(self, pythonic_python):
Since my first ticket, I have been helping mostly with the Python side of projects. Improvements to algorithms made by our data science team need to be made available for client use. They are written in Python, so a big focus of mine is helping to get them into production. One of my personal interests (and objectives) is to make the Python side of data science as efficient as possible. At ONZO, I have been allowed to research new technologies and given the opportunity to remove inefficiencies in our Python codebase. A lot of my days have been spent refactoring code: writing functions and unit tests for those functions, while making sure the new functionality still passes the original unit tests.

def __del__(self, pros_and_cons):
If you are working at ONZO, you’ll be working in a very relaxed environment. Everyone is very friendly and supportive. I feel like I have been here longer than I actually have; to have been embedded so quickly can only be a good thing.

Pros:
- Agile principles in action. You are trusted; you are responsible and accountable for what you are doing. However (to quote Dumbledore), “help is always available at ONZO for those who ask for it”.
- So much Python work, I’m never bored
- Knowledge sharing sessions
- Regular 1–2–1s with managers and colleagues
- Flexible working
- Caffeine until your brain explodes
- Dangerous amounts of Tunnock’s Caramel Wafers

Cons:
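The day_to_day section above mentions refactoring functions while keeping the original unit tests green. As a minimal sketch of that workflow (the function and tests below are hypothetical illustrations, not ONZO code):

# Hypothetical refactor-plus-tests example: the behaviour is pinned down by
# the original unit tests, which must still pass after the rewrite.

def normalise_readings(readings):
    """Refactored implementation: same behaviour as before, simpler code."""
    total = sum(readings)
    if total == 0:
        return [0.0] * len(readings)
    return [value / total for value in readings]

# The pre-existing tests run unchanged against the refactored function
# (for example, with pytest).
def test_normalise_readings_sums_to_one():
    result = normalise_readings([2, 3, 5])
    assert abs(sum(result) - 1.0) < 1e-9

def test_normalise_readings_handles_all_zeros():
    assert normalise_readings([0, 0]) == [0.0, 0.0]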
https://medium.com/onzo-tech/day-one-python-engineer-24214ef2f8d
[]
2018-12-03 15:51:54.404000+00:00
['Python', 'Utilities', 'Energy', 'Software Engineering']
2,447
What an Ostrich Can Teach Us About Gut Health
When we consider the gut microbiome, we usually think of two things: gut-related diseases, such as irritable bowel syndrome (IBS), or probiotic supplements. (At least, this is what I think about, as a microbiome scientist. Most people that I speak with are not microbiome scientists, and they just give me weird looks when I begin enthusiastically speaking about the gut microbiome.) But the gut microbiome — the collection of trillions of bacteria, comprising hundreds of different species, all living in an uneasy balance with each other inside our intestinal tract — isn’t just for humans. Dogs, cats, mice, cattle, and just about every other animal on the planet has its own gut microbiome. Because we generally care more about curing irritable bowels in people than in mice, most studies of the gut microbiome tend to focus on humans. Recently, however, researchers in Sweden published a paper looking at the impact of the gut microbiome on juvenile mortality — in ostriches. Here’s what they found.

Not Only the Good Ostriches Die Young
An ostrich can live for a very long time — if it makes it past childhood. Most adult ostriches live for at least 40–45 years, with some making it to as old as 75. This is quite long-lived; by comparison, ducks and chickens live 5–10 years and a goose lives 10–15 years. (Of course, some other birds, such as parrots, can live even longer — up to 75 years in captivity.) However, many ostriches don’t even make it to their first birthday. In one study of more than 2,500 chicks, more than three-quarters of them — 78% — didn’t even survive 90 days beyond hatching! This is a concern for us not just because it’s sad to know that most baby ostriches don’t make it, but because ostrich farming is a profitable industry. Ostriches are highly profitable and have several advantages over raising beef cattle or other birds:
- They adapt well and need little shade or protection from the elements;
- Unlike chickens, ostriches are capable of aggressively defending themselves from predators;
- The meat, eggs, skin (for leather), and feathers of ostriches all sell for excellent prices;
- Ostriches produce more meat for the amount of consumed resources (higher efficiency) than cattle.
If you’re a farmer, a herd of ostriches could be big bucks — if you can keep your chicks from dying. Experiments have looked at different ways of incubating and raising ostriches. A more intensive and nurturing system produced fewer dead birds, but it’s not a perfect solution. So what kills baby ostriches? One possibility — it could be related to their gut microbiomes.

Is an Out-of-Whack Ostrich Gut Linked to Chick Death?
In the research paper, Videvall and her team used a method called 16S rRNA sequencing to look at the composition of different bacteria in the guts of baby ostriches at various time points. 16S rRNA sequencing is a bit like scanning barcodes at a grocery-store checkout line; it examines a specific gene, called the 16S subunit, in order to identify different bacteria. Each family of bacteria has slight variations in its 16S gene that differentiate it from other types of bacteria. By using computers to match the 16S gene of all the bacteria from a sample to a reference, we can quickly determine which bacteria are present in that sample.
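As a toy illustration of that barcode-matching idea (this is not the study’s actual pipeline; real 16S classification aligns full reads against curated reference databases, and the signature sequences below are made up):

from collections import Counter

# Hypothetical, heavily simplified 16S "signatures" for three families.
REFERENCE_16S = {
    "Lactobacillaceae": "AGGTCC",
    "Enterobacteriaceae": "TTGACA",
    "Clostridiaceae": "CCATGG",
}

def classify_reads(reads):
    """Count which reference signatures appear in the sample's reads."""
    counts = Counter()
    for read in reads:
        for family, signature in REFERENCE_16S.items():
            if signature in read:
                counts[family] += 1
    return counts

sample = ["AAAGGTCCTT", "GGTTGACAAC", "AAAGGTCCGG"]
print(classify_reads(sample))
# Counter({'Lactobacillaceae': 2, 'Enterobacteriaceae': 1})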
“You’re taking my poop… for WHAT?!?” Photo by Krzysztof Niewolny on Unsplash

The researchers used this 16S sequencing method to take “snapshots” of the gut microbiomes of baby ostriches as they grew — and, when they died, from autopsies of the dead birds. They then compared the microbiomes of the surviving ostriches with those of the chicks that died during the first 90 days of life. What did they find? Overall, the birds that died had drastically reduced microbial diversity — that is, there were far fewer species of bacteria in their guts. If the healthy birds had an Amazon rainforest of different species thriving in their guts, the sick birds had a cornfield — far fewer different organisms. Additionally, some species of bacteria seemed to be more prevalent in the sick birds, while other species were more present in the healthy birds. It wasn’t just how many kinds of bacteria were present; having the right ones, rather than the wrong ones, also seemed to play a role.

One interesting conclusion the authors reached came not from the birds directly — but from their environment. How did the bacteria that seemed connected with early death get into the birds? Sequencing of the environment showed that they didn’t come from the food, water, or soil where the birds were raised. Instead, it seemed that small numbers of these “bad bacteria” were in the baby birds from the beginning. In the healthy birds, other species crowded out these bad bacteria so that they couldn’t take over. But in the sick birds, the lack of diversity let the bad bacteria proliferate, taking over the gut environment.

There’s Always More to Study
Of course, this study doesn’t put the final nail in the coffin for the question of why so many baby ostriches die. There are still a number of outstanding questions, including:
- What mechanism makes some of these bacteria bad, and what’s different about the good bacteria?
- Can we reduce the loss of diversity in sick ostrich chicks?
- Is this causative? In other words, we see low gut diversity in the sick birds, but is that what’s responsible for their deaths?
- When and how would we intervene to restore a more diverse gut microbiome with healthy bacterial species?
These questions aren’t just relevant to ostriches — we have the same questions about many of the bacteria that we see in human gut microbiomes. We’re working on answers, but we don’t have them yet. Interestingly, many of the “bad bacteria” seen in the low-diversity guts of sick ostrich chicks are closely related to bacteria that are found in humans — and that are associated with negative outcomes. Perhaps if we can better figure out how to understand and improve the human gut, we can do the same for ostriches!
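A note on “diversity”: the paper’s exact metric isn’t given here, but the Shannon index is one standard way to quantify both how many species are present and how evenly they are spread. A minimal sketch, with invented counts:

import math

def shannon_index(species_counts):
    """Higher values mean more species, more evenly represented."""
    total = sum(species_counts)
    return -sum((n / total) * math.log(n / total)
                for n in species_counts if n > 0)

healthy_gut = [30, 25, 20, 15, 10, 5, 5]  # many species, even spread
sick_gut = [80, 15, 5]                    # few species, one dominant

print(round(shannon_index(healthy_gut), 2))  # ~1.77: the "rainforest"
print(round(shannon_index(sick_gut), 2))     # ~0.61: the "cornfield"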
https://medium.com/a-microbiome-scientist-at-large/what-an-ostrich-can-teach-us-about-gut-health-cb56e71ede90
['Sam Westreich']
2020-12-14 12:12:31.358000+00:00
['Biology', 'Environment', 'Science', 'Farming', 'Microbiome']
2,448
You shouldn’t cheat on your partner and this is why
by: E.B. Johnson

Our relationships form a cornerstone of our happiness, but when they become corrupted, the waters get muddied. Life is complex, and it’s hard to stay centered and focused on one another at all times. We drift, and our affections and our attentions drift too. Things go wrong and we start to wonder if the grass wouldn’t be greener in some other pasture. No matter what’s going on in your relationship, infidelity is never acceptable. When we commit to our partners, we promise to do the right thing when it comes to their emotions and our needs. That’s not to say that you’re doomed to spend forever with someone who is no longer right for you. But it does mean you have to do the hard work when temptations or difficulties come along.

Commitment is important in all stages. Many dream of building a life with someone, but they don’t always consider what setbacks or challenges can come with that. It’s not all picket fences and butterflies. Relationships are hard work, and that hard work doesn’t always pay off. Sometimes we run up against divides that push us away from one another. In those instances, we can become tempted to cheat. This is never the answer, though. Commitment remains important even when things are bad. When you commit to be in a relationship with someone, you commit to do right by them — even when things are falling apart.

You are allowed to change your mind. You are allowed to want out of your relationship, and you’re allowed to fall in love with someone else. Things change. People change. What you don’t have the right to do is harm your partner or lie to them. Our lives are the sum of the decisions we make. Committing to someone is making a promise to them, and among those promises is telling the truth. If you’re tempted to cheat on your partner, it’s time to open up and figure out what you really want from them and from your relationship in general. In order to do this, though, you’re going to have to dig deep and be brutally honest with yourself, and with your partner too.

Why you shouldn’t cheat on your partner.
Relationships go through ups and downs, and sometimes they fail. No matter how hard things get, however, we don’t have a right to cheat on our partners. When we commit to someone, we make a promise to do the right thing. Cheating only creates bigger issues, more stress, and an array of complex emotions and patterns which can be hard to heal.

Creating bigger issues
We all experience hardships in our relationships, but infidelity never makes those challenges easier. Perhaps your relationship isn’t broken; you’re just experiencing a momentary lapse or pressure point that’s making it hard to connect. By engaging in infidelity, you create a bigger problem — one which you may not be able to come back from. Cheating is complex, and it involves deep-rooted emotions. If you want to come back from problems, you can’t run to another person… you have to run to your partner.

Inflicting unfair injury
No matter what angle you view it from, cheating is wrong, and it inflicts serious pain on the other side of your partnership. When your partner is a good person, the act of cheating inflicts an injury that is both unfair and unnecessary. The hurt of cheating runs so much deeper than simply ending something you were both invested in. It also teaches the other person toxic lessons, which follow them throughout their remaining relationships. Betrayal is nothing to take lightly, and its wounds last a lifetime.
Corrupted reputation
Affairs are never an event that remains between two people. Rightly, our reputations become corrupted when word gets out about our inability to stay faithful to the people that we’ve committed to. Word will get around, and some people closest to you will begin to see you in a different light. Little by little, this can impact the way they see you in their lives, and the way in which you’re able to interact with your community at large.

Degraded social circles
Do you think that your affair will only touch you and your partner? Don’t console yourself with this thought. Not only will you potentially lose your spouse or loved one through your decision to cheat, you will potentially lose friends and family in response to your actions. Never underestimate the loyalty that the people we love will feel to a wronged partner. And no matter what they decide to do, you will have to accept it as a result of your actions.

Emotional dysfunction
When we cheat, we don’t just hurt the other person in deep and irreparable ways. We also cause a lot of damage to ourselves emotionally and cultivate feelings of guilt and shame, which change our personalities and our relationships with others. On top of that, we create even greater stress for ourselves, which causes more mistakes in other parts of our lives, as well as physical erosion, which impacts our quality of life.

Cultivating toxic patterns
Cheating, more often than not, is part of a toxic cycle of self-destruction which undermines our long-term happiness time and time again. Cheaters tend to cheat in every relationship they’re in, whether that infidelity is emotional or physical. It becomes a toxic pattern which pulls people in and then pushes them away before reaching true vulnerability. It’s also a way to constantly chase “greener pastures” rather than putting in the work it takes to last.

Handling your urge to cheat the right way.
Are you struggling with an urge to cheat? Has someone new come into your life, or have things changed drastically between you and your partner? You have to process these challenging emotions the right way, and that happens by figuring out the underlying issues and opening up communication channels the right way.

1. Figure out the underlying issue
The urge to cheat isn’t necessarily something that happens overnight. Generally, it results from long-standing issues that have been ignored or otherwise swept under the rug. For example, you and your partner could be dealing with a long-term conflict that’s caused you both to shut down and shut one another out. Over time, this coldness compounds and presses you both to look outward for the comfort you can’t find within the relationship. Instead of embracing your urge to cheat as the natural “next step” in a failing relationship, take a step back and question what the underlying issues really are. Where is this new desire coming from? What is in you that is seeking someone who isn’t your partner? Avoid blaming it all on the other person. Relationships don’t (usually) fail because of a single person’s actions. We both make the decision to stop communicating. We make the decision to put our partners last and everything else in our lives first. Don’t analyze your partner or act on your urges until you get clear on what you’re not getting. Then figure out how that’s feeding your need to cheat.

2. Think before you react
Temptation is a powerful thing.
One moment you are happily engaged in your life, and the next moment you’re presented with something you didn’t even realize you were lacking. For some, this temptation is gambling or engaging in other risky or addictive behaviors. For others, though, that temptation can come in the form of a person who offers something you perceive your partner not to have. You have to think before you react to and act on this temptation. While your brain might be telling you that this is something you will never encounter again, that just isn’t true when you break it down. Is this person really offering you anything you couldn’t find at home with the right work and communication? If they can — then why are you settled in a relationship that isn’t giving you what you need? As humans, we claim to be so much better than the animals we rule over, but we ourselves are animals who often struggle to control our base impulses. Rise above your animal nature and think things through. To cheat will only detonate the good in your life. You need to move forward (in any direction) with maturity and good faith.

3. Open up communication channels
Like it or not, communication is a fundamental part of facing up to and resolving your urge to cheat on your partner. You have to communicate with your inner self and get aligned with what you want, both emotionally and morally. You also have to communicate with your partner once your truths have been reached, and get their perspective if you want to repair things or move forward in a different way. Spend some time with your inner self. Make it a regular habit, and spend that time getting reacquainted with your needs and your future designs. We all deserve to be happy in relationships and lives which are aligned to our authentic selves. Sometimes our relationships change and no longer fit the person that we’re becoming. If that’s the case, you have to sit your partner down and be honest and candid with them. Find a safe space where you can both be secure and share what’s going on inside your head. Then, you can come together to find solutions and make mutual decisions on what comes next.

4. Get some perspective
Our intimate relationships are intense, and they take up a lot of our time and our focus. When we’ve spent a long time with the same person, it’s easy to get tunnel vision and lose sight of the bigger picture. You need to get some perspective if you’re dealing with ideas of infidelity. From time to time, this can help shift us back into line with our partner. Or, it can reveal some more critical realities for us to embrace. Once you’ve opened up to your partner and taken some time to figure out what your ultimate relationship needs are, you need to take a step back and get some perspective. Extra-marital affairs and outside relationships are exciting. They give us that butterfly feeling and they get our blood racing again. That’s tempting, especially if you’ve been settled down with the same person for a long time. You have to question your reality on it, though. Are you chasing something you genuinely need, or are you excited about the prospect of a new adventure in territory you’ve never visited before? The time you’ve put in with someone is important. The fantasy presented by an affair is also important to acknowledge. Ground yourself in reality and get some perspective.

5. Do right by your commitments
Like it or not, the commitment we make to our partners applies even to the challenging parts and the ending of our relationships.
To commit to someone isn’t just to say that you won’t cheat on them. It’s also making a promise to be truthful to them, even when your truth hurts them. That’s what it is to do right by someone. But you can only do this when you look to the future and the bigger victories (and losses) at stake. Have you decided that you can’t resist your urges? Have you decided that you need something different, or something better? That’s fine. Do right by your commitments and tell your partner that it’s time to call it a day on your partnership. Communicate that you’ve changed and that what you want from your relationships has changed too. You don’t need to give them any gritty details; you just need to ensure that you aren’t betraying their trust. The pain that comes from infidelity is so much greater than the pain that comes from a relationship that’s come to a close. If either of you ever want to be civil to one another again — if you want a genuine chance of healing — then you have to do right by one another and cut the cord if that’s the only thing left to do.

Putting it all together…
No matter how strong your urge to cheat might be, it’s never the right answer for a crumbling relationship. You have the right to walk away from something which isn’t working, but you also have a responsibility to be honest and faithful to your partner. Are you sitting on the fence with a difficult decision to make? You need to handle your urge to cheat the right way. Figure out the underlying issues behind your urge to cheat, and then figure out whether they are worth repairing with your partner. Think before you react. Is this temptation worth losing all the time and effort you’ve put into your relationship? Is it worth losing your friends and your happiness? These are all things we have to consider. Sit down with your partner and open up. Be compassionately honest with them and let them know where you stand. Perhaps the two of you can work things out; you’ll never know until you talk and see where you both stand. Then, you can get a more realistic perspective on where you’re both at and make the decisions which are authentically aligned to your happiness and your commitment to one another.
https://medium.com/lady-vivra/you-shouldnt-cheat-on-your-partner-a48e980f768e
['E.B. Johnson']
2020-10-27 07:07:05.291000+00:00
['Self', 'Nonfiction', 'Relationships', 'Psychology', 'Dating']
2,449
Wandering in the Pandemic Wilderness
As a clinician in an outpatient mental health practice, I have been searching for the right analogy for what this time has felt like for my patients … and for me. As with many traumas, there is that initial shock and denial. As Kübler-Ross wisely observed, next will often come anger and bargaining, perhaps settling into a sense of depression and, finally, acceptance.

Photo by Blake Cheek on Unsplash

But unlike a specific traumatic event that may have a beginning-middle-end, we don’t yet have a sense of the scope and duration of this pandemic. It feels more like a series of waves that continually crash upon us: a tsunami at first, then other waves, some deceptively small, others overwhelming. At times you can feel like you have gotten a good breath, and then relief. At other times, we may feel that we are flailing about in the water, unable to feel the mushy ground underneath or to find something stable on which to hold. It is hard to swim or even to know which direction we should go. Yet also, as a person of faith who works in the “Bible Belt”, the image that I keep returning to is the Exodus from Egypt and that time of wandering in the wilderness for the children of Israel. There were lessons there for the people … and maybe for us too.

The people complained a lot.
Over and over and over they complained. The people complained to God; they complained to Moses their leader and to Aaron their priest. And especially early in that journey, those complaints took a form that was not unlike grief or mourning. Even though what they had left may have been oppressive and difficult, the people longed to be back in what they knew, what was stable, what was “normal”.

Photo by Aaron Burden on Unsplash

We too complain about what we miss and what may be lost. We miss the communities where we sat and worked with others. Perhaps it is congregational singing, passing the peace, hugging each other. Maybe it is the restaurant and the sounds of clanging dishes, the variety of smells of food around us. It may be children playing together, now socially distanced from the parks and playgrounds. Our grief is appropriate because we have experienced loss. We should honor our mourning … but not allow it to stop us putting one foot in front of the other.

The people learned to eat manna.
Out of the complaints of the people on this long journey, God provided manna. This food was their daily sustenance. They would gather enough for the day and no more. On the day before the day of rest (Sabbath), they could gather enough for two days. If they took more than what they needed, the food would spoil. This continually reminded the people that they should gather only what they needed for that day. And when they were sick of manna, the people complained again. God sent an abundance of quail, but not without making the point that God was frustrated with the people for not being satisfied with what they had been given. Many of us, too, may have to learn to have enough. We may have to look around at what we have for the day, to recognize that there is enough, and to be content there.

Photo by Austin Kehmeier on Unsplash

And although God is not happy with their complaints, God still responds. As a parent, I am reminded of the times when I have had to acknowledge that my children needed what they were asking for … even if I had initially said “no” or failed to give it.
We remember that the God with which we are presented in Exodus is a God who seems to have lots of feelings about the people, sometimes loving and gracious, sometimes frustrated and vindictive. Regardless, this is a God who remains in relationship with the people with whom God is covenanted, committed to, through all the ups and downs of that journey. This is the sort of steadfastness that one needs in a companion on this wandering path.

We are tempted to build a golden calf.
During one long stretch when their leader was absent, the people pressured their priest to build an idol. The people wanted something solid and tangible, not this God who said “I am that I am”. With the accumulated jewelry and metal from the people, they melted down their desire in order to form a golden calf. There is a strong desire in all of us for predictability and control. We look to our leaders and experts for this. But we should be careful not to make an idol of them.

Photo by Philipp Knape on Unsplash

My work as a clinician reminds me that when we are anxious and fearful, angry, and in pain, we will try nearly anything to find relief. This is a normal response. The desire to have life feel predictable again, or to feel that someone somewhere has control or an answer, helps us feel safe. But there is danger in the easy answer. Someone offering a quick solution that appears tangible and “real” could be an idol of our own making. Life in the wilderness is hard. And when we want it to be over, we can find ourselves holding on to someone or something that is not our answer. In many ways, this “building the golden calf” is a sort of bargaining, a trying to gain control one last time before acknowledging again our sadness at what we have lost and taking our steps toward an uncertain future.

Photo by Christopher Sardegna on Unsplash

In the wilderness, we walk with God, day by day, step by step.
We accept where we are. We eat what we have. We camp for the night. We move on the next day. This is the cycle of wandering in the wilderness … and perhaps what is best during this pandemic. We may not necessarily know where we are going. Our vision is limited to where we presently are. We try to worry less about the future by grounding ourselves in what is present. This is not the same as walking blindly, but accepting that we can only know this step … then the next. We will not be returning “home” anytime soon … if ever. There is grief to acknowledge in that. Perhaps this new place has lessons to teach us. Maybe there are promises there that we cannot quite fathom yet. But for now, we’ll pack lightly, walk one step at a time, continue to follow the signs that God has given us, and try to get used to the taste of manna.
https://medium.com/caring-for-souls/wandering-in-the-pandemic-wilderness-6fda613b7ac4
['Jason B. Hobbs Lcsw']
2020-05-23 22:23:05.994000+00:00
['Spirituality', 'Covid 19', 'Mental Health', 'Coronavirus', 'Religion']
2,450
Considerations When Measuring Chatbot Success
Considerations When Measuring Chatbot Success
And What Principles You Should Implement…

Introduction
Performance measures are important to organizations wanting to track their investment in a conversational interface… but standards and metrics differ by industry, and obviously by company within each industry. Due to the nascent nature of the technology, companies are also eager to learn from one another, with some overestimating the importance and impact of their chatbot, and others heavily discounting the significant impact their conversational interface is having.

Industry Type Matters
Call Deflection
Obviously, chatbots are implemented across a vast array of industries. These industries use different parameters, parameters which they deem crucial to measuring success in their environment.

Microsoft Power Virtual Agents have Analytics Built In

Banking and financial sectors use chatbots to perform existing tasks faster. An important driver is lower call volumes and how much is saved through call deflection.

Quality Conversations
The most common, and probably most important, chatbot performance metric is conversation length and structure. In most cases, conversation transcripts are reviewed and manually classified so that points for improvement can be noted. Organisations are aiming for shorter conversations and simplified dialog structures. A conversation, or a specific dialog, always has a happy path which developers hope the user will find and stick to.

Digression in a Chatbot Conversation
A rudimentary and simplistic approach would be to have a repair path, or a few: paths intended to bring the conversation back to the happy path from points of digression, hence ‘repairing’ it. This approach might lead to a situation called fallback proliferation.
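To make the metrics above concrete, here is a minimal sketch of how average conversation length, a fallback (repair-path) rate, and a happy-path completion rate might be computed from reviewed transcripts. The transcript format is an assumption for illustration, not a standard:

transcripts = [  # hypothetical, manually classified transcripts
    {"turns": 6,  "fallbacks": 0, "reached_goal": True},
    {"turns": 14, "fallbacks": 3, "reached_goal": False},
    {"turns": 8,  "fallbacks": 1, "reached_goal": True},
]

total_turns = sum(t["turns"] for t in transcripts)
avg_length = total_turns / len(transcripts)
fallback_rate = sum(t["fallbacks"] for t in transcripts) / total_turns
completion = sum(t["reached_goal"] for t in transcripts) / len(transcripts)

print(f"average turns per conversation: {avg_length:.1f}")  # 9.3
print(f"fallbacks per turn: {fallback_rate:.2%}")           # 14.29%
print(f"happy-path completion rate: {completion:.0%}")      # 67%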
https://cobusgreyling.medium.com/considerations-when-measuring-chatbot-success-93aaaac0cb86
['Cobus Greyling']
2020-05-21 15:55:40.499000+00:00
['Chatbots', 'NLP', 'Artificial Intelligence', 'Design', 'Conversational UI']
2,451
ReElivate — Creating Better Social Virtual Experiences
ReElivate — Creating Better Social Virtual Experiences
A marketplace connecting experience providers and companies to deliver unique, memorable, virtual experiences

The Problem
The coronavirus pandemic has made companies heavily reliant on Zoom and virtual communication. While these virtual communication services have been life savers during this crazy time, they have not been able to replace in-person social interaction. Virtual happy hours and coffee chats have grown repetitive, and people are looking for better ways to socialize virtually.

What The Company Does
ReElivate is a platform that connects companies with experience providers to help them create better virtual experiences. ReElivate is a marketplace built to support companies in a coronavirus world, so companies can better engage their customers, teams, and clients. Events are centered around six categories: cooking, tasting, entertainment, crafts, care, and games. The platform also includes a concierge service for companies that want a higher level of account management and assistance planning the experiences.

The Market
The company is serving a market that is smaller than traditional event management but focused on B2B. Some competitors include Airbnb and Kapow, but ReElivate believes the customers it is targeting are underserved by both.

Business Model
ReElivate is a traditional marketplace that charges hosts a commission on the experiences that are booked.

Traction
ReElivate was founded in September and is working with more than 50 companies for its pilot, and with more than 20 local companies as hosts of experiences, including Improv Asylum. The company has started to book experiences for November and will continue to add hosts and companies throughout the month. The self-service marketplace will launch by the end of the year.

Founding Team Background
The founding team has over 30 years of experience in technology startups. Jon Conelias and Jason McCarthy were both executives at The Grommet. Conelias has been a CFO and operator for the past 15 years at marketplace companies focused on both B2B and B2C channels, with multiple successful exits. McCarthy has been in marketplace operations for eight years and founded The Grommet Wholesale business.

What They Need Help With
The company is looking for hosts to provide experiences — the more interesting the better. The company is also looking to inform companies of its services to help connect them with the right experiences. Connect with the ReElivate team.
https://medium.com/the-startup-buzz/reelivate-creating-better-social-virtual-experiences-a8ef73a186dc
['Bram Berkowitz']
2020-12-22 20:02:47.666000+00:00
['Marketplaces', 'Venture Capital', 'Startup', 'Coronavirus', 'Social']
2,452
Scaling Malcolm Gladwell
FIVE IDEAS…

Developer credential management
Enabling least-privilege for infrastructure developers
In certain development environments, a "least-privilege" framework is optimal. This means that the developers working on a project are given only the information necessary to carry out their task, without access to broader (potentially sensitive) materials. There's a need for a credential management solution that grants ephemeral access to infrastructure resources (think GCP or AWS) in a secure and compliant way. Crucially, existing solutions like Sailpoint don't support infrastructure resources, and compliance in that respect is vital for businesses at scale. I'd like to see a credential management system that solves this problem, making it as easy to share access to infrastructure as it is to give "Comment" or "Edit" access on a Google Doc. — Astasia Myers, Enterprise Investor at Redpoint Ventures

Personalized podcast ads
Voice synthesis technology to scale the soothing tones of Malcolm Gladwell
Historically, podcast advertisements have worked through direct response: advertisers pay a flat fee per episode based on the audience size (usually $10–30 per thousand listeners) and provide hosts with copy. Hosts record themselves reading that script and then place it somewhere in the episode's static mp3 file. There are a few problems with this. Because the ad is hard-coded into that static file, it's impossible to personalize the messaging for listeners in different demographics and geographies. Data collection is tricky (downloads are inaccurately counted as listens), back catalogs are difficult to monetize (you'd have to alter that static file), and programmatic advertising is impossible. Hosts might want to confine an advertiser to a certain number of downloads, for example. Canned advertisements do exist and can be inserted dynamically, but host-read ads remain the gold standard. Given that, how can we combine the dynamism of programmatic ads with the intimacy of host-read ones? Voice synthesis technology may be the solution. A new startup would help existing podcasters create "voiceprints" based on existing content. Then, within the system, advertisers could bid to target anonymized individuals based on their demographics but divorced from the podcast they were listening to. This would look a lot like Google's and Facebook's ad platforms. Whenever an advertiser won a bid, the system would create a synthetic version of a host-read commercial, stitch it into the episode, and deliver it to the chosen user segment. The result could be a gamechanger, helping podcasts close the monetization gap. — Elaine Zelby, Investor at SignalFire

Staking creators
Discovering creators first and sharing in their success
The overabundance of digital content has led consumers to seek out individual creators to serve as curators and tastemakers. As with many discovery-driven activities, there's a sense of pride in finding a creator (or brand) that others haven't. Social capital can be earned by demonstrating one's ability to identify these personalities first. Right now, though, there's no great way to showcase and validate this ability. As an early fan, you want to be able to visibly signal your support and maybe even benefit from it. Combine this desire with a creator's need for capital, and you can imagine a kind of fan-creator investing relationship in which fans "stake" creators and then capture some of the upside in the event they blow up.
A platform that enabled this behavior, giving the next Charli D'Amelio the cash to go full-time, would be intriguing. — Jerry Lu, Investor at Advancit Capital

Essential oils for everyday life
Curated therapeutic oils for modern ailments
Tylenol isn't always the answer, especially for millennials who are more apt to reach for custom supplement packs than a couple of shots of bourbon to treat a variety of maladies. One solution? Essential oils. I imagine a beautifully branded collection of high-quality products designed to do everything from turning your shower into an Equinox (eucalyptus, of course) to relieving the tension headaches that come from staring at a screen all day (peppermint). It's time for essential oils to have a glow-up. A great DTC play would be to make them feel cool, curated, and quality-controlled, removing the need to ask your weird aunt who works for an essential oil MLM scheme or to set foot in the overwhelming supplement aisles of a Whole Foods. — Willa Townsend, Director of Business Development at Banza

Twitter podcast app
Prove it can work, then sell
This is a little different from the usual RFS, mostly because I think it's a business made to exit, likely on a short timeframe. Maybe they don't know it yet, but Twitter could be the most powerful podcasting app on the planet. The company's social interest graph is perfectly and uniquely positioned to solve personalized discovery for podcasting. Even Spotify, Google, and Apple don't have access to the same kind of information. So here's the play: create a podcast app with 10x better discovery, leveraging Twitter's API. Add easy social sharing so that recommended podcasters and episodes can be shared on the platform. If properly implemented, big podcasters would be excited by the ability to reach large audiences through a distribution platform that hasn't been tapped. That could lead to a breakout trajectory that would cause Twitter to take notice. There's a risk, of course: if Twitter saw this working, it might turn off API access and clone the product. But I think there's a genuine chance it gets taken off the table for a nice sum. — Alex Carter, Co-founder of the 1st social podcast app on iOS

Have something to say about these ideas? Are you working on something similar? Vote for your favorite idea and share thoughts by hitting the button below.
Vote for an idea 👉
Get free startup ideas from leading VCs by joining RFS 100
https://medium.com/swlh/scaling-malcolm-gladwell-21ff00e563fc
['Mario Gabriele']
2020-12-03 00:55:14.175000+00:00
['Innovation', 'Startup', 'Entrepreneurship', 'Startup Ideas', 'Venture Capital']
2,453
Starting Conversations about Customer Privacy and AI
Starting Conversations about Customer Privacy and AI
A guide for UX professionals
By Derek DeBellis, Penny Marsh Collisson, Angelo Liao, and Mar Gines Marin

I love when AI makes a recommendation that accounts for me, my goals, and my context. I love when AI automates part of my workflow by recognizing where I'm going and cutting out some of the work required to get there. I also love when companies respect my privacy. I'm not alone. I've heard this countless times in user interviews: people want personalized AI-driven experiences that cater to their specific needs while also respecting their privacy. When we operate with shared values and communicate about how to put those values into practice, researchers and product teams can help deliver both personalization and privacy to our customers.

At Microsoft, we've been compiling privacy practices that we think every UX professional should know and understand. The list below isn't exhaustive, but we've found that the ideas it contains help UX professionals exploring AI and privacy. We also include questions you can ask your product and data engineers to kickstart a conversation about AI privacy and design.

Collecting the right data
A lot of the beauty of advanced statistical approaches resides in their ability to handle rich, multidimensional sources of data. The more features a dataset has, however, the more effort is required to make sure that no one can be identified. Take care, also, to collect and use the data in a manner that aligns with your company's values and your customers' desires and values.

Conversation starters
• What features would be contained within this data set?
• How important are these features for the model's performance?
• Do we have a justification for needing that piece of information?
• Does having that information increase the odds that we compromise someone's anonymity?
• Are we (and our partner teams) selecting and using data in a manner that our customers have both comprehended and agreed to under their own volition?
• Are we collecting and using the data in a way that reflects our customers' values?

Exploring the shape of data without exploring the content or individuals
AI systems don't need to know much about individuals to make useful predictions for them. Current approaches allow data to be aggregated, featurized, and encoded to anonymity without detracting from the ability to do computations on it. The important patterns can be retained after adding noise to the data; this noise makes it extremely difficult to trace results back to content or individuals. There are also techniques that make sure queries and models return statistical or aggregate results, not raw or individuating ones. Words can be represented as vectors of numbers: these vectors appear meaningless to us, but they often contain patterns valuable to the AI system.

Conversation starters
• If I run a query on this data, is it possible that the results will be associated with a small subset of individuals?
• Do we have a way to make sure our queries return statistical, aggregate, and de-identified results?
• Is it possible to determine whose data was in the initial training set?
• How are we anonymizing and encoding data to ensure privacy?
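One common way to implement the noise idea above is differential privacy. The article doesn't name a specific mechanism, so treat the following as a hedged sketch rather than a description of what any particular product does: the Laplace mechanism applied to a simple counting query, with an arbitrarily chosen epsilon.

import numpy as np

def noisy_count(records, epsilon=1.0):
    # A counting query changes by at most 1 when one person's record is
    # added or removed (sensitivity 1), so Laplace noise with scale
    # 1/epsilon yields an epsilon-differentially-private count.
    return len(records) + np.random.laplace(loc=0.0, scale=1.0 / epsilon)

# The caller sees an aggregate figure close to the truth, but any single
# individual's presence or absence is statistically masked.
print(noisy_count(['user_a', 'user_b', 'user_c'], epsilon=0.5))

A lower epsilon means more noise and stronger privacy; finding the right trade-off is exactly the kind of decision the conversation starters above are meant to surface.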
Handling customer data
Modern technology allows us to address many concerns about how, when, how long, and where data is handled. For example, a customer's information doesn't always need to travel to the cloud for AI to work. Advances have made it possible to get sophisticated models onto a customer's device without taking up all the device's memory or processing power. Once on the device, AI can function offline, without needing to constantly connect to the cloud. There are, in addition, many ways to maintain privacy within the cloud.

Conversation starters
• If we want personalized models, how do we build, store, and update them?
• Are we housing our AI models in the cloud or on the device? Why?
• How do we update our general models?
• Who, if anyone, can look at the data? When? How? What data exactly?
• How long is the data being stored?
• Where is the data being stored?

Providing customers with transparency and control
Ultimately, you're asking these questions so you can give customers what they want, which our research shows is transparency and control. You want people to have the information they need to decide whether they want to use the AI-driven features. Make sure you're presenting this information in an easily understandable way. And if customers decide they don't want to use AI-powered features, they should have the controls to make the necessary adjustments.

Conversation starters
• Do we have answers to the questions users are asking?
• Do customers have the information they need to determine if using our AI is worthwhile?
• Do customers have the controls necessary to manage their experience? If so, are these controls nuanced enough? Are they too nuanced?

The real UX work begins after you sift through these questions
We hope that these questions help open conversations with the people on your team building AI-driven experiences. This communication reinforces a shared objective and leads to an understanding of how you can help protect user privacy. That knowledge empowers us, in turn, to help our customers navigate privacy in AI-driven products and to communicate these intricacies in ways that are better, simpler, and clearer.

Authors
Angelo Liao is a program manager working on AI in PowerPoint. Mar Gines Marin is a program manager working on AI in Excel. Penny Collisson is a user research manager working on AI in Office. Derek DeBellis is a data scientist and user researcher working on AI in Office.

With special thanks to Simo Ferraro, Zhang Li, Curtis Anderson, Josh Lovejoy, Ilke Kaya, Ben Noah, Bogdan Popp, and Robert Rounthwaite.
https://medium.com/microsoft-design/starting-conversations-about-customer-privacy-and-ai-41de0352dedc
['Derek Debellis']
2019-12-05 18:21:21.111000+00:00
['User Experience', 'Artificial Intelligence', 'Microsoft', 'Research And Insight', 'Design']
2,454
From model inception to deployment
From model inception to deployment
Machine Learning model training & scalable deployment with Flask, Nginx & Gunicorn, wrapped in a Docker container

We all have been in this position after we are done building a model :p

At some point, we have all struggled with deploying a trained Machine Learning model, and a lot of questions start popping up. What is the best way to deploy an ML model? How do I serve the model's predictions? Which server should I use? Should I use Flask or Django for creating the REST API? What about shipping it inside Docker? Don't worry, I've got you covered! :)

In this tutorial, we will learn how to train and deploy a machine learning model in production, with more focus on deployment, because this is where we data scientists usually get stuck. We will use two Docker containers, one for the Flask app and another for the Nginx web server, shipped together with docker-compose. If you are new to Docker or containerization, I would suggest reading this article.

High-Level Architecture
High-level design of a large-scale Machine Learning model deployment

Setting up
Here is the GitHub link for this project. Below is the folder structure that we will follow for this project.

Let's break this piece into three parts:
— Training a Machine Learning model using Python & scikit-learn
— Creating a REST API using Flask and Gunicorn
— Deploying the Machine Learning model in production using Nginx, and shipping the whole package in a Docker container

Model Training
To keep things simple and comprehensive, we will use the iris dataset to train an SVM classifier.

iris_svm_train.py

Here, we are training a Support Vector Machine with a linear kernel, which gives a pretty decent accuracy of 97%. Feel free to play around with the training part: try Random Forest or XGBoost and perform hyper-parameter optimization to beat that accuracy. Make sure you execute iris_svm_train.py, because it saves the model inside the 'model' folder (refer to the GitHub repo).

Building a REST API
Creating a Flask app is very easy. No kidding! All you need to know is how a request from the client (user) is sent to the server, how the server sends back the response, and a little bit about the GET and POST methods. Below, we load our saved model and process new data (the request) from the user in order to send predictions (the response) back.

app.py

We will use Gunicorn to serve our Flask API. If you are on Windows, you can use Waitress (a pure-Python WSGI server) as an alternative to Gunicorn. Execute the command:

gunicorn -w 1 -b :8000 app:app

and hit http://localhost:8000 in your browser to ensure your Flask app is up and running. If you get the message 'Hoilaaaaaaaaa!', then you are good to go!

If you want to test the predict (POST) method, use curl or Postman:

curl --header "Content-Type: application/json" --request POST --data '[{"sepal_length":6.3,"sepal_width":2.3,"petal_length":4.4,"petal_width":1.3}]' http://localhost:8000/predict
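The article refers to app.py without the embedded gist surviving here, so as a rough sketch only: the route names, model path, and response format below are assumptions based on the prose (a greeting at the root and a POST /predict endpoint that accepts a JSON list of measurements).

import pickle
import numpy as np
from flask import Flask, request, jsonify

app = Flask(__name__)

# Load the classifier saved by iris_svm_train.py (file name assumed).
with open('model/iris_svm.pkl', 'rb') as f:
    model = pickle.load(f)

@app.route('/')
def index():
    # The greeting the article says confirms the app is running.
    return 'Hoilaaaaaaaaa!'

@app.route('/predict', methods=['POST'])
def predict():
    # Expects a JSON list of records, as in the curl example above.
    records = request.get_json()
    features = np.array([[r['sepal_length'], r['sepal_width'],
                          r['petal_length'], r['petal_width']] for r in records])
    return jsonify(model.predict(features).tolist())

With a module along these lines, gunicorn -w 1 -b :8000 app:app resolves app:app to this file and its Flask instance.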
Deploying the ML model in production
Finally, the fun part begins :) We will use the Nginx web server as a reverse proxy for Gunicorn, meaning users hit Nginx from the browser and it forwards the request to your application. Nginx sits in front of Gunicorn, which serves your Flask app. More information on why Nginx is needed when we already have Gunicorn: link

nginx.conf

Wrapping everything inside a Docker container
Congratulations, you have made it to the last part. Now, we will create two Docker files, one for the API and one for Nginx. We will also create a docker-compose file which contains information about our two containers. You have to install docker and docker-compose for this to work. Let's ship our scalable ML app and make it portable and production-ready.

Docker file for API (keep it in the api folder)
We have created a Docker file for the API, which needs to be saved inside the 'api' folder along with the other files, including requirements.txt (which lists the Python packages required by your app).

Docker file for Nginx (keep it in the nginx folder with the nginx.conf file)

docker-compose.yml
docker-compose.yml is the master file which binds everything together. As you can see, it contains two services, one for the api and one for the server (nginx). Now, all you need is a single command to run your ML app:

cd <project/parent directory>
docker-compose up

Output of the above command

Cheers! Your dockerized, scalable Machine Learning app is up and running, accepting requests on port 8080 and ready to serve your model's predictions. Open a new terminal to run the predict method using curl, or use Postman.

Predictions from your deployed ML model

Thank you for making it till here. Comment below if you face any challenges running the project or have any feedback. Happy Learning!!
https://medium.com/datadriveninvestor/from-model-inception-to-deployment-adce1f5ed9d6
['Akshay Arora']
2018-11-28 10:04:23.359000+00:00
['Machine Learning', 'Python', 'Artificial Intelligence', 'Deep Learning', 'Docker']
2,455
Startup Metrics
Startup Metrics
"In God we trust; all others must bring data"

This post lists my favourite articles on startup metrics. Learn what to measure from rock stars such as David Skok, Andrew Chen and Dave McClure. Happy reading!

METRICS OVERVIEW: Key concepts explained
THE CONVERSION FUNNEL: Describing user flows as funnels
CUSTOMER ACQUISITION & RETENTION: Capture and maintain user loyalty
SOME NUMBERS: Metrics in the real world

Happy reading,
— Livio (@LivMKk)

Thank you for reading & recommending ❤
P.S.: If you care about measuring the right metrics, you should read about Financial Planning for SaaS startups
https://medium.com/startup-info/startup-metrics-155db194b3a9
['Livio Marcheschi']
2017-07-16 16:59:58.999000+00:00
['Metrics', 'Digital Marketing', 'Startup', 'Entrepreneurship', 'Growth Hacking']
2,456
COVID-19: Impact on Housing Security Across the U.S.
COVID-19: Impact on Housing Security Across the U.S.
Jbochenek

Housing is essential, but not guaranteed. This has never been more obvious than since the COVID-19 lockdowns stranded Americans from their jobs, and thus their incomes. Without income, paying for routine and necessary bills such as food and housing can become a struggle. Housing insecurity is certainly not a new addition to America, but for the first time, we have week-by-week data on how it has impacted households across the country.

Starting in April, the U.S. Census Bureau began a new project, the Household Pulse Survey, with the goal of determining the social and economic impacts of COVID-19 on the American populace. Phase one lasted from April 23rd to July 21st, and this analysis examines those 12 weeks (the calendar-savvy will notice that this is in fact 13 weeks, but that will be discussed below).

The Household Pulse Survey phase one results are available as Public Use Files (PUF), where each row is a response. However, for privacy reasons, the PUF does not include location indicators, which were desired for this analysis. Instead, we used the summarized data, which was slightly edited due to nested headers. The file we used is available here. For this, we also worked in Google Colab for easier code sharing across the team.

First we imported the necessary packages:

from google.colab import drive
import pandas as pd
import matplotlib.pyplot as plt
import seaborn as sns
import plotly.express as px
import plotly
from sklearn import preprocessing
from urllib.request import urlopen
import json

# This will prompt for authorization.
drive.mount('/content/drive')

Then we imported the data:

Household = '/content/drive/My Drive/Data/Housing/Household Pulse Survey/phase-one-household-pulse-survey-tool overall.xlsx'
Phase1 = pd.read_excel(Household, sheet_name='Data')

The data has three different levels of location: nationwide, state level, and the top 15 largest metro areas. It was important to separate these out, as we wanted to make comparisons within these location groups, not between them.
We grabbed only the rows we wanted into three different datasets:

State = ['Alabama', 'Alaska', 'Arizona', 'Arkansas', 'California', 'Colorado',
         'Connecticut', 'Delaware', 'District of Columbia', 'Florida', 'Georgia',
         'Hawaii', 'Idaho', 'Illinois', 'Indiana', 'Iowa', 'Kansas', 'Kentucky',
         'Louisiana', 'Maine', 'Maryland', 'Massachusetts', 'Michigan', 'Minnesota',
         'Mississippi', 'Missouri', 'Montana', 'Nebraska', 'Nevada', 'New Hampshire',
         'New Jersey', 'New Mexico', 'New York', 'North Carolina', 'North Dakota',
         'Ohio', 'Oklahoma', 'Oregon', 'Pennsylvania', 'Rhode Island', 'South Carolina',
         'South Dakota', 'Tennessee', 'Texas', 'Utah', 'Vermont', 'Virginia',
         'Washington', 'West Virginia', 'Wisconsin', 'Wyoming']

US = ['United States']

Metros = ['Atlanta-Sandy Springs-Alpharetta, GA Metro Area',
          'Boston-Cambridge-Newton, MA-NH Metro Area',
          'Chicago-Naperville-Elgin, IL-IN-WI Metro Area',
          'Dallas-Fort Worth-Arlington, TX Metro Area',
          'Detroit-Warren-Dearborn, MI Metro Area',
          'Houston-The Woodlands-Sugar Land, TX Metro Area',
          'Los Angeles-Long Beach-Anaheim, CA Metro Area',
          'Miami-Fort Lauderdale-Pompano Beach, FL Metro Area',
          'New York-Newark-Jersey City, NY-NJ-PA Metro Area',
          'Philadelphia-Camden-Wilmington, PA-NJ-DE-MD Metro Area',
          'Phoenix-Mesa-Chandler, AZ Metro Area',
          'Riverside-San Bernardino-Ontario, CA Metro Area',
          'San Francisco-Oakland-Berkeley, CA Metro Area',
          'Seattle-Tacoma-Bellevue, WA Metro Area',
          'Washington-Arlington-Alexandria, DC-VA-MD-WV Metro Area']

StatesP1 = Phase1[Phase1['Geography (State or Metropolitan Area)'].isin(State)]
USP1 = Phase1[Phase1['Geography (State or Metropolitan Area)'].isin(US)]
MetroP1 = Phase1[Phase1['Geography (State or Metropolitan Area)'].isin(Metros)]

It soon became obvious that 50 states is a large number to handle in visualizations, so we added another level to the state data: Divisions. The U.S. Census describes the US at several location levels; one of the most familiar is Regions: Midwest, Northeast, South, and West. There are also Divisions, which split the regions up even further. Figure 1 below shows the breakdown of Regions into Divisions.

Figure 1. Regions and Divisions of the United States

We used a data dictionary to add that to the data. I'm including this here so maybe no one else has to write this code again.
Divisions = {'Alabama': 'East South Central', 'Alaska': 'Pacific',
             'Arizona': 'Mountain', 'Arkansas': 'West South Central',
             'California': 'Pacific', 'Colorado': 'Mountain',
             'Connecticut': 'New England', 'Delaware': 'South Atlantic',
             'District of Columbia': 'South Atlantic', 'Florida': 'South Atlantic',
             'Georgia': 'South Atlantic', 'Hawaii': 'Pacific',
             'Idaho': 'Mountain', 'Illinois': 'East North Central',
             'Indiana': 'East North Central', 'Iowa': 'West North Central',
             'Kansas': 'West North Central', 'Kentucky': 'East South Central',
             'Louisiana': 'West South Central', 'Maine': 'New England',
             'Maryland': 'South Atlantic', 'Massachusetts': 'New England',
             'Michigan': 'East North Central', 'Minnesota': 'West North Central',
             'Mississippi': 'East South Central', 'Missouri': 'West North Central',
             'Montana': 'Mountain', 'Nebraska': 'West North Central',
             'Nevada': 'Mountain', 'New Hampshire': 'New England',
             'New Jersey': 'Middle Atlantic', 'New Mexico': 'Mountain',
             'New York': 'Middle Atlantic', 'North Carolina': 'South Atlantic',
             'North Dakota': 'West North Central', 'Ohio': 'East North Central',
             'Oklahoma': 'West South Central', 'Oregon': 'Pacific',
             'Pennsylvania': 'Middle Atlantic', 'Rhode Island': 'New England',
             'South Carolina': 'South Atlantic', 'South Dakota': 'West North Central',
             'Tennessee': 'East South Central', 'Texas': 'West South Central',
             'Utah': 'Mountain', 'Vermont': 'New England',
             'Virginia': 'South Atlantic', 'Washington': 'Pacific',
             'West Virginia': 'South Atlantic', 'Wisconsin': 'East North Central',
             'Wyoming': 'Mountain'}

StatesP1["State"] = StatesP1["Geography (State or Metropolitan Area)"].astype('category')
StatesP1['Division'] = StatesP1['State'].map(Divisions)

We needed to do some exploratory data analysis to determine the quality of the data and any adjustments that would need to be made.

sns.displot(StatesP1, x="Housing Insecurity Percent", element="step",
            col="Division", col_wrap=3)

g = sns.boxplot(x="Division", y="Housing Insecurity Percent",
                # hue="Selected Horizontal Dimension",
                data=StatesP1, palette="Set3")
g.set(xlabel='Division', ylabel='Housing Insecurity (%)')
g.set_xticklabels(g.get_xticklabels(), rotation=45, ha="right")

Figure 2. Histogram of Housing Insecurity Percent from the Household Pulse Survey by Census division, April 2020 to July 2020
Figure 3. Boxplot of Housing Insecurity Percent from the Household Pulse Survey by Census division, April 2020 to July 2020

Overall, we were very pleased with the distribution of the data: in the histograms it looks relatively normal, and in the boxplots we only see one true outlier. For the purposes of this analysis, we kept that outlier, as it represented important trend data.

We also wanted to get a first look at the actual data: how has housing security changed over the 12-week period across the US?

g = sns.relplot(kind='line', data=StatesP1,
                y='Housing Insecurity Percent', x='Week Number')

g = sns.relplot(kind='line', col='Division', col_wrap=5,
                col_order=['Pacific', 'West North Central', 'East North Central',
                           'Middle Atlantic', 'New England', 'Mountain',
                           'West South Central', 'East South Central',
                           'South Atlantic'],
                data=StatesP1,
                y='Housing Insecurity Percent', x='Week Number')
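The plotly.express, urlopen, and json imports above go unused in the code shown here, which hints that a map was planned. Purely as a hedged sketch of one plausible continuation (the week filter, the state-abbreviation mapping, and the figure title are all assumptions, and the mapping is truncated for space):

week12 = StatesP1[StatesP1['Week Number'] == 12].copy()

# px.choropleth with locationmode='USA-states' expects two-letter codes,
# so state names would need a mapping like this (rest of the dict assumed).
state_abbrev = {'Alabama': 'AL', 'Alaska': 'AK', 'Arizona': 'AZ'}
week12['state_code'] = week12['State'].map(state_abbrev)

fig = px.choropleth(week12, locations='state_code', locationmode='USA-states',
                    color='Housing Insecurity Percent', scope='usa',
                    title='Housing Insecurity Percent, Week 12')
fig.show()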
https://medium.com/swlh/covid-19-impact-on-housing-security-across-the-u-s-6c9d787ce2d
[]
2020-12-16 17:40:54.131000+00:00
['Data Science', 'Python', 'Housing', 'Coronavirus', 'Covid 19']
2,457
Here’s how we upgraded our marketing analytics
I hate interrupting my analysis workflow by tabbing between different applications and interfaces. It's irritating, it decreases your productivity, and it just makes things harder to understand. So I could empathize when one of our marketing people came up to me and expressed their need for an online marketing dashboard. In their vision, this dashboard would unite all our most important online marketing indicators and help them immensely by removing the need to go back and forth between the analytics views of different platforms.

But online marketing data is isolated: it lives in silos, and the individual platforms don't make it easy to integrate them with one another. Luckily, most of them offer API services, so we rolled our sleeves up and built a basic data pipeline, which resides entirely in the cloud and feeds our Tableau dashboard.

The data
As far as social media platforms go, Starschema mostly uses Facebook and, to a much lesser extent, Instagram and Twitter. Our leads are generated through our website, whose traffic we measure with Google Analytics, which we also use for our standalone, Wordpress-based blogs. It would have been nice to get traffic data from Medium as well, but unfortunately that platform doesn't offer an API for it, so it's not currently in our scope.

The pipeline
If you're only interested in the visuals and not how the data got there, just skip this section. We are not a small firm anymore, so whatever we created needed to be as enterprise-ready as possible, not least because we wanted to showcase this and use it as a proof of concept for other projects. We also wanted something with low cost and maintenance, since we want to be able to deploy this for smaller firms that might not necessarily have tech personnel on board. Thus, we opted for the Google Cloud Platform, mostly because its generous free tier ended up completely covering our requirements.

The idea is to have scheduled Python scripts download the data through the APIs, flatten it, and load it into our Marketing Data Warehouse, which we set up in BigQuery for the sake of simplicity. In a more mature environment, we would put a frontend onto App Engine to drive the Scheduler and the Functions, but in our case we skipped this and manage everything through the GCP console.

The very simple pipeline architecture we set up for this project

The dashboard
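As a footnote to the pipeline section above: the loader scripts themselves aren't shown, so here is a minimal sketch of what one scheduled Cloud Function might look like. The fetch helper is a hypothetical placeholder for a real platform API call, and the dataset and table names are assumptions.

import pandas as pd
from google.cloud import bigquery

def fetch_facebook_insights():
    # Hypothetical placeholder: in the real pipeline this would call the
    # platform's API and flatten the nested response into rows.
    return pd.DataFrame([
        {'date': '2019-04-01', 'impressions': 1200, 'engagements': 85},
    ])

def load_marketing_data(event=None, context=None):
    # Entry point for a Cloud Function fired by Cloud Scheduler (via Pub/Sub).
    df = fetch_facebook_insights()
    client = bigquery.Client()
    job = client.load_table_from_dataframe(df, 'marketing_dwh.facebook_insights')
    job.result()  # block until the load job completes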
https://medium.com/starschema-blog/unified-marketing-analytics-69426752b2e5
['Istvan Korompai']
2019-04-25 14:40:16.450000+00:00
['Marketing', 'Tableau', 'Analytics', 'Dataviz', 'Google Cloud Platform']
2,458
How To Get More Personal With Your Users As A Tech Company: The “Tailored Web Design” Concept
Reducing the number of users who unsubscribe, and therefore decreasing lost revenue? Yes please.

Yesterday's article was about traditional marketing lessons and how they should be mere guidelines. For those who haven't read it, here's the main idea, roughly speaking: traditional marketing rules tell us that the homepage should "convert", but that doesn't always apply to SaaS products. When I'm coming back to your website to unsubscribe, is there anything that will remind me of the added value? Or is it just the story I heard at the beginning, which I've definitely forgotten?

This morning I had an idea which still needs to be explored, but I'll list it here. We don't have tailored ways to address both new visitors and recurring visitors on our websites. And we concluded yesterday that it matters, especially if by "recurring visitor" we mean a paying user who's looking to churn: that's revenue we lose.

How about this: the website looks one way the first time it's visited and then changes to something else from the second visit onwards. We can adjust this slightly. Maybe it looks the same the first 2 or 3 times; call that the "pitch phase". Only then does it change into phase 2. Maybe phase 2 kicks in only for people who have converted. With cookies, that can happen very easily; it does already, to an extent, on a lot of websites (a minimal sketch of this cookie-based switching appears at the end of this piece). My only concession would be a pop-up, or maybe a badge at the top, saying something along the lines of "First time here? Click me.", adjusted to the brand's voice.

Why would you do that? Very roughly speaking, user churn is likely to be fixed from two major directions:
Adding new core features to your product
Reminding the users about the product's value delivery

We're looking closely at the second bullet point. I'm thinking this concept could work for SaaS companies/startups because once their users have converted, they need to be reminded about the value delivery in a different manner.

Objection: "Yes Daniel, but that's why the homepage is a pitch all the time! Because if they land there, they're reminded of the core features! Why would I want to change that?"

What I'm saying is that you have the opportunity to speak to your converted users in a different manner. Your homepage/landing page right now is "mixed up": it presents the same thing to both new and existing users. How about talking differently to your existing (paying) users? You don't need to explain "the idea in a couple of words" to them; you have the chance to tell them a bit more, since they already know some things. Of course, maybe you'll want to re-explain the simple version of the idea, but you can use different language. Because they converted, you know their habits, language and way of communicating. With 2019's technology, we can implement what I'm proposing without immense amounts of effort.

Objection 2: "But my users will become annoyed if they always have to click 'no' in case I put something like that up."

Later I'll tackle that with Basecamp's example.

Different things are shown based on who I am? Credits: Undraw.co

Where did this idea come from? If you've ever used Google/Facebook ads, you probably know where this comes from. For those who don't, the brief explanation is that it traditionally works like this:

You create a couple of audiences that you think are relevant to what you're doing.
You create something relevant and valuable to them. But this is not the "sign up to get this ebook" kind of bullshit; it could be something of actual help, where you "lose money" on the ad.
You A/B test these audiences until you see what works better.
You "retarget" these audiences with the thing you sell. Maybe it's now that you ask them to download your eBook, if you want to play the long game, or maybe you go straight for the sale.

In practical terms, that means this. Let's say you're selling a product that helps people clean and maintain their watches. You create these audiences: audience 1 is people who have liked the Rolex page, are 30 to 45, and work in this city; audience 2 has liked a watch influencer's page and is 40 to 55. You end up with 30 audiences.

You create a video that's basically a YouTube tutorial about cleaning your watch. That works wonders in a Facebook/Instagram feed, since it's "masked": it's harder to tell whether it's from a page you liked or an actual ad (as opposed to "BUY NOW", which is definitely an ad). People don't build up buyer's resistance, as you're not selling anything to them.

These 30 audiences are A/B tested, and then the top 5 audiences are picked (i.e. the 5 audiences that watched the most of the video and/or engaged with a comment, a like, etc.).

You then run an ad that actually sells your kit/product to those who have watched more than 60% of the video (i.e. "converted" in terms of video watch time).

This is the idea in short. Now, when it comes to how a SaaS company/startup presents itself, there's no difference between what's shown to a new user and what's shown to one who's more engaged with the product. It's like comparing what I've just described above to a newspaper ad: the same thing is shown to everyone, regardless of their interaction level. I'm proposing changing what's shown to people based on a simple delimitation: whether they converted or not.

Credits: Undraw.co and Ch Daniel

This idea, taken even further
Now that I've given this context from the advertising world, I can go even further. What if within your product you've got multiple audiences? Take Trello. Trello can be used by people who are:
Into project management and working with their teams
Organised people who use it for their personal lives

And within category 1, we've got the startup kind of team that's just starting out and the more professional company; we can go on and on naming audiences. How about a point along the onboarding process where these users place themselves into an audience pigeonhole? Then, based on which audience they are in, they'll see different versions of the homepage website (not the app!), should they ever go there. If not the homepage, then whatever page they have to go through before cancelling the payment. And that's not to hold them as hostages (that's another thing I believe in); rather, it's something that's there to remind them about the value delivery, either before they unsubscribe or when they happen to visit that page.

Is this happening already? Since I've mentioned Trello: their homepage takes you to their app if you're logged in, or to their landing page if you're not. Different behaviours based on the conversion level of the user. What if this website talked to me differently since I've already signed up? And also differently to those who have paid for premium? Credits: Trello.com

Basecamp shows this pop-up if you're logged in.
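As promised above, here is the cookie-based phase switching as a minimal sketch. Everything in it is assumed for illustration (a Flask backend, the template names, the cookie name); the point is only how little machinery the idea needs.

from flask import Flask, render_template, request

app = Flask(__name__)

@app.route('/')
def homepage():
    # Phase 1 ("pitch phase") for new visitors; phase 2 for converted users.
    if request.cookies.get('converted') == 'true':
        return render_template('home_existing_user.html')
    return render_template('home_pitch.html')

@app.route('/signup', methods=['POST'])
def signup():
    # ...create the account, then mark this browser as converted.
    response = app.make_response(render_template('welcome.html'))
    response.set_cookie('converted', 'true', max_age=60 * 60 * 24 * 365)
    return response

The "First time here? Click me." badge would live on the phase-2 template, linking back to the pitch, so a visitor on a shared machine can always reach the original story.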
https://medium.com/startup-grind/how-to-get-more-personal-with-your-users-as-a-tech-company-the-tailored-web-design-concept-26071e7dbe7c
['Ch Daniel']
2019-05-25 09:46:07.838000+00:00
['Design', 'SaaS', 'UX', 'Startup', 'Web Design']
2,459
Tensorflow vs PyTorch for Text Classification using GRU
Preprocessing

The dataset contains some columns that are not important for this problem, so they were dropped. This is what the data frame looks like. We apply some preprocessing to facilitate the data modeling: contractions, punctuation, non-alphanumeric characters, and stop words are removed using regex.

import re
import numpy as np   # used by the padding and target code further down
import pandas as pd  # used when feeding the data to Keras
from nltk.corpus import stopwords

def decontract(sentence):
    sentence = re.sub(r"n\'t", " not", sentence)
    sentence = re.sub(r"\'re", " are", sentence)
    sentence = re.sub(r"\'s", " is", sentence)
    sentence = re.sub(r"\'d", " would", sentence)
    sentence = re.sub(r"\'ll", " will", sentence)
    sentence = re.sub(r"\'t", " not", sentence)
    sentence = re.sub(r"\'ve", " have", sentence)
    sentence = re.sub(r"\'m", " am", sentence)
    return sentence

def cleanPunc(sentence):
    cleaned = re.sub(r'[?|!|\'|"|#]', r'', sentence)
    cleaned = re.sub(r'[.|,|)|(|\|/]', r' ', cleaned)
    cleaned = cleaned.strip()
    cleaned = cleaned.replace("  ", " ")  # collapse double spaces
    return cleaned

def keepAlpha(sentence):
    alpha_sent = ""
    for word in sentence.split():
        alpha_word = re.sub('[^a-z A-Z]+', '', word)
        alpha_sent += alpha_word
        alpha_sent += " "
    alpha_sent = alpha_sent.strip()
    return alpha_sent

# The stop-word pattern was missing from the original snippet; something like this is needed:
re_stop_words = re.compile(r"\b(" + "|".join(stopwords.words('english')) + r")\b\s*")

def removeStopWords(sentence):
    global re_stop_words
    return re_stop_words.sub("", sentence)

# Apply the cleaning steps described above (lower-casing is assumed so the stop-word regex matches)
data['Text'] = data['Text'].str.lower()
data['Text'] = data['Text'].apply(decontract)
data['Text'] = data['Text'].apply(cleanPunc)
data['Text'] = data['Text'].apply(keepAlpha)
data['Text'] = data['Text'].apply(removeStopWords)

# Collapse characters repeated three or more times (e.g. "soooo" -> "so")
data['Text'] = data['Text'].apply(lambda x: re.sub(r'(\w)(\1{2,})', r'\1', x))

Now the text is cleaner, and we can transform the data into a form the neural networks can interpret. The form we are going to use here is word embedding, one of the most common techniques in NLP. Word embedding starts by mapping each word to a numerical key — much like the vocabulary of a Bag of Words approach — and then learns a dense vector per key. The vectors created by word embedding preserve similarities between words, so words that regularly occur nearby in the text will also be in close proximity in vector space. There are two advantages to this approach: dimensionality reduction (a more efficient representation) and contextual similarity (a more expressive representation). There are a few ways of applying this method; the one we use here is the Embedding Layer, which sits on the front end of a neural network and is fit in a supervised way using backpropagation.

To do that, it is necessary to vectorize and pad the text so that all the sentences are uniform in length. The dataset is hefty (almost 600,000 rows), and a portion of the texts have a high token count — the top quartile of lengths ranges from 51 to 2,030 tokens — which would add unnecessary padding to the vast majority of observations and, consequently, be computationally expensive. Thus, I remove the rows with more than 60 tokens and sample 50,000 observations, because a bigger sample crashes the kernel.

data['token_size'] = data['Text'].apply(lambda x: len(x.split(' ')))
data = data.loc[data['token_size'] < 60]
data = data.sample(n=50000)

Then we build a vocabulary based on the sample, which the Embedding Layer will be built on.
# Construct a vocabulary
class ConstructVocab():
    def __init__(self, sentences):
        self.sentences = sentences
        self.word2idx = {}
        self.idx2word = {}
        self.vocab = set()
        self.create_index()

    def create_index(self):
        for sent in self.sentences:
            self.vocab.update(sent.split(' '))
        # sort vocabulary
        self.vocab = sorted(self.vocab)
        # add a padding token with index 0
        self.word2idx['<pad>'] = 0
        # word to index mapping
        for index, word in enumerate(self.vocab):
            self.word2idx[word] = index + 1  # 0 is the pad
        # index to word mapping
        for word, index in self.word2idx.items():
            self.idx2word[index] = word

inputs = ConstructVocab(data['Text'].values.tolist())

Vectorize the text:

input_tensor = [[inputs.word2idx[s] for s in es.split(' ')] for es in data['Text']]

Add padding:

def max_length(tensor):
    return max(len(t) for t in tensor)

max_length_input = max_length(input_tensor)

def pad_sequences(x, max_len):
    padded = np.zeros((max_len), dtype=np.int64)
    if len(x) > max_len:
        padded[:] = x[:max_len]
    else:
        padded[:len(x)] = x
    return padded

input_tensor = [pad_sequences(x, max_length_input) for x in input_tensor]
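To make the mapping concrete, here is a quick illustrative check (not from the original post) of what vectorization and padding do to a single review, using the objects defined above:

# Illustrative: inspect one vectorized, padded review
sample = data['Text'].iloc[0]
ids = [inputs.word2idx[w] for w in sample.split(' ')]
print(sample)
print(ids)                                   # integer ids; 0 is reserved for <pad>
print(pad_sequences(ids, max_length_input))  # right-padded with zeros to the uniform length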
""" #global _LOCAL_DEVICES if tfback._LOCAL_DEVICES is None: devices = tf.config.list_logical_devices() tfback._LOCAL_DEVICES = [x.name for x in devices] return [x for x in tfback._LOCAL_DEVICES if 'device:gpu' in x.lower()] tfback._get_available_gpus = _get_available_gpus K.tensorflow_backend._get_available_gpus() Here is the function for the model creation: from keras.layers import Dense, Embedding, Dropout, GRU from keras.models import Sequential from keras import layers def create_model(): model = Sequential() model.add(Embedding(vocab_inp_size, embedding_dim, input_length=max_length_input)) model.add(Dropout(0.5)) model.add(GRU(units)) model.add(layers.Dense(5, activation='sigmoid')) model.compile(loss='binary_crossentropy',optimizer='adam', metrics=['accuracy']) return model We also implement a callback function, so we can know the time spent in each epoch of the training. class timecallback(tf.keras.callbacks.Callback): def __init__(self): self.times = [] # use this value as reference to calculate cummulative time taken self.timetaken = time.process_time() def on_epoch_end(self,epoch,logs = {}): self.times.append((epoch,time.process_time() -self.timetaken)) Now we can train the neural network in batches. timetaken = timecallback() history = model.fit(pd.DataFrame(X_train), y_train, epochs=10, verbose=True, validation_data=(pd.DataFrame(X_val), y_val), batch_size=64, callbacks = [timetaken]) We train for 10 epochs, and the net already starts to overfit. The accuracy of the model with the test set is ~89% and takes ~74s/epoch during the training phase. The accuracy seems high, but when we have a better look at the confusion matrix, we notice that the model struggles with the medium rates (between 2–4). The model falsely classifies 2 as 1 and 4 as 5, having a high percentage of false positives. Confusion matrix of the Tensorflow model PyTorch The PyTorch is not so straight forward, and it is a deeper preparation of the data must be implemented before transforming it into tensors. # Use Dataset class to represent the dataset object class MyData(Dataset): def __init__(self, X, y): self.data = X self.target = y self.length = [np.sum(1 - np.equal(x,0)) for x in X] def __getitem__(self, index): x = self.data[index] y = self.target[index] x_len = self.length[index] return x, y, x_len def __len__(self): return len(self.data) We create the MyData class, and then we encapsulate it with DataLoader for two reasons: organization and avoid compatibility issues in the future. import torch from torch.autograd import Variable from torch.utils.data import Dataset, DataLoader TRAIN_BUFFER_SIZE = len(X_train) VAL_BUFFER_SIZE = len(X_val) TEST_BUFFER_SIZE = len(X_test) BATCH_SIZE = 64 TRAIN_N_BATCH = TRAIN_BUFFER_SIZE // BATCH_SIZE VAL_N_BATCH = VAL_BUFFER_SIZE // BATCH_SIZE TEST_N_BATCH = TEST_BUFFER_SIZE // BATCH_SIZE train_dataset = MyData(X_train, y_train) val_dataset = MyData(X_val, y_val) test_dataset = MyData(X_test, y_test) train_dataset = DataLoader(train_dataset, batch_size = BATCH_SIZE, drop_last=True, shuffle=True) val_dataset = DataLoader(val_dataset, batch_size = BATCH_SIZE, drop_last=True, shuffle=True) test_dataset = DataLoader(test_dataset, batch_size = BATCH_SIZE, drop_last=True, shuffle=True) Pytorch differs mainly from Tensorflow because it is a lower-level framework, which has upsides and drawbacks. The organizational schema gives the user more freedom to write custom layers and look under the hood of numerical optimization tasks. 
PyTorch

PyTorch is not so straightforward, and a deeper preparation of the data must be implemented before transforming it into tensors.

import torch
from torch.autograd import Variable
from torch.utils.data import Dataset, DataLoader  # imported up front so MyData can subclass Dataset

# Use the Dataset class to represent the dataset object
class MyData(Dataset):
    def __init__(self, X, y):
        self.data = X
        self.target = y
        self.length = [np.sum(1 - np.equal(x, 0)) for x in X]

    def __getitem__(self, index):
        x = self.data[index]
        y = self.target[index]
        x_len = self.length[index]
        return x, y, x_len

    def __len__(self):
        return len(self.data)

We create the MyData class and then encapsulate it with DataLoader, for two reasons: organization, and avoiding compatibility issues in the future.

TRAIN_BUFFER_SIZE = len(X_train)
VAL_BUFFER_SIZE = len(X_val)
TEST_BUFFER_SIZE = len(X_test)
BATCH_SIZE = 64
TRAIN_N_BATCH = TRAIN_BUFFER_SIZE // BATCH_SIZE
VAL_N_BATCH = VAL_BUFFER_SIZE // BATCH_SIZE
TEST_N_BATCH = TEST_BUFFER_SIZE // BATCH_SIZE

train_dataset = MyData(X_train, y_train)
val_dataset = MyData(X_val, y_val)
test_dataset = MyData(X_test, y_test)

train_dataset = DataLoader(train_dataset, batch_size=BATCH_SIZE, drop_last=True, shuffle=True)
val_dataset = DataLoader(val_dataset, batch_size=BATCH_SIZE, drop_last=True, shuffle=True)
test_dataset = DataLoader(test_dataset, batch_size=BATCH_SIZE, drop_last=True, shuffle=True)

PyTorch differs from Tensorflow mainly in that it is a lower-level framework, which has upsides and drawbacks. The organizational schema gives the user more freedom to write custom layers and to look under the hood of numerical optimization tasks. On the other hand, the price is verbosity, and everything must be implemented from scratch. Here we implement the same model as before.

import torch.nn as nn

class RateGRU(nn.Module):
    def __init__(self, vocab_size, embedding_dim, hidden_units, batch_sz, output_size):
        super(RateGRU, self).__init__()
        self.batch = batch_sz
        self.vocab_size = vocab_size
        self.embedding_dim = embedding_dim
        self.hidden_units = hidden_units
        self.output_size = output_size
        # layers
        self.embedding = nn.Embedding(self.vocab_size, self.embedding_dim)
        self.dropout = nn.Dropout(p=0.5)
        self.gru = nn.GRU(self.embedding_dim, self.hidden_units)
        self.fc = nn.Linear(self.hidden_units, self.output_size)

    def initialize_hidden_state(self, device):
        return torch.zeros((1, self.batch, self.hidden_units)).to(device)

    def forward(self, x, lens, device):
        x = self.embedding(x)
        self.hidden = self.initialize_hidden_state(device)
        output, self.hidden = self.gru(x, self.hidden)
        out = output[-1, :, :]  # hidden state of the last time step
        out = self.dropout(out)
        out = self.fc(out)
        return out, self.hidden

After the model is implemented, we use the GPU in case it is available, and we write the loss function alongside an accuracy function to check the model's performance.

use_cuda = True if torch.cuda.is_available() else False
device = torch.device("cuda:0" if torch.cuda.is_available() else "cpu")

model = RateGRU(vocab_inp_size, embedding_dim, units, BATCH_SIZE, target_size)
model.to(device)

# loss criterion and optimizer
criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters())

def loss_function(y, prediction):
    target = torch.max(y, 1)[1]
    loss = criterion(prediction, target)
    return loss

def accuracy(target, logit):
    target = torch.max(target, 1)[1]
    corrects = (torch.max(logit, 1)[1].data == target).sum()
    accuracy = 100. * corrects / len(logit)
    return accuracy

Finally, we are all set to train the model.

EPOCHS = 10

for epoch in range(EPOCHS):
    start = time.time()
    total_loss = 0
    train_accuracy, val_accuracy = 0, 0

    for (batch, (inp, targ, lens)) in enumerate(train_dataset):
        loss = 0
        predictions, _ = model(inp.permute(1, 0).to(device), lens, device)
        loss += loss_function(targ.to(device), predictions)
        batch_loss = (loss / int(targ.shape[1]))
        total_loss += batch_loss

        optimizer.zero_grad()
        loss.backward()
        optimizer.step()

        batch_accuracy = accuracy(targ.to(device), predictions)
        train_accuracy += batch_accuracy

We also train for 10 epochs here, and the overfitting problem previously faced repeats itself. The accuracy is ~71%, but in terms of speed PyTorch wins by far, at ~17 s/epoch. The accuracy here is considerably lower, but this is somewhat misleading, because the confusion matrix is similar to the Tensorflow model's, suffering from the same pitfalls.

Confusion matrix of the RateGRU

Conclusion

Tensorflow and PyTorch are both excellent choices. As far as training speed is concerned, PyTorch outperforms Keras, but in terms of accuracy Keras wins here. I personally find Tensorflow more intuitive and concise, not to mention the wide access to tutorials and reusable code — though I am biased, as I have had more contact with Tensorflow so far. PyTorch is more flexible, encouraging a deeper understanding of deep learning concepts, and it enjoys extensive community support and active development, especially among researchers.
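As a footnote to the comparison above: the post quotes the ~71% test accuracy but does not show the evaluation loop it came from. A minimal sketch of how it could be computed with the pieces defined earlier (illustrative, not the author's code):

# Illustrative: evaluate the trained RateGRU on the test loader
model.eval()
test_accuracy = 0
with torch.no_grad():
    for batch, (inp, targ, lens) in enumerate(test_dataset):
        predictions, _ = model(inp.permute(1, 0).to(device), lens, device)
        test_accuracy += accuracy(targ.to(device), predictions)
print(test_accuracy / TEST_N_BATCH)  # mean accuracy over test batches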
https://medium.com/swlh/tensorflow-vs-pytorch-for-text-classification-using-gru-e95f1b68fa2d
['Rodolfo Saldanha']
2020-05-27 15:34:20.624000+00:00
['Machine Learning', 'Python', 'Neural Networks', 'Artificial Intelligence', 'Deep Learning']
2,460
How to Make Decisions as a Team (When That Team Keeps Growing)
Reframing decisions.

At their core, the decisions we make every day, both at home and at work, are nothing more than bets. As much as we like to think we make decisions based on all the available information, this is rarely the case. Without all of the information, we are essentially betting on the outcome of whatever we decide based on the limited information we have. We don't like to admit this, because we all want to think that if a decision is being made, especially in the world of business, the decision-maker is sure it will be right. In a startup this is magnified, as you will often be doing something disruptive, new, or different, meaning that the answers will rarely be laid out in front of you. After all, you will never be 100% certain of a future that does not exist yet.

Over the last year in my current role, I have been part of a project to design and build an entirely new product that will take the company in a new direction. This has been an exciting project with lots of moving parts. With a team that was growing quickly around us, it was vital that, as the product team, we were able to communicate what we called our "comfortable uncertainty" to the wider organization. Early on, it became easy for conversations to descend into "Do we even know x is going to work?", which was absolutely the wrong way to be thinking about delivering a new product, especially when trying to do something disruptive. This question is unhelpful in several ways, but ultimately at its core lies the question "Are we 100% certain?" — which, as we know, is just not possible.

In a world where we are thinking in bets, the question we can ask ourselves is "Have we done enough to be confident in this decision?" By asking this, we move the conversation away from a binary position and enable ourselves to have a more constructive discussion, where we can understand the actions and decisions we have made to get to where we are. This, however, has required a pretty significant shift in the fundamental culture of the organization: moving to a world where everyone understands and embraces the fact that the decisions we make as a team are moving us in a direction where we cannot possibly know all the answers. This becomes even more difficult as the team grows and new hires become more specialized, coming from larger, more mature organizations where this way of thinking often goes against conventional wisdom.

Moving to comfortable uncertainty

In our experience, the first step in adopting this new way of thinking about decision making in your growing team is to train everyone to be comfortable in this uncertainty. This should absolutely be a trait you look for in new hires and something that should be preached internally. This is, however, easier said than done. Merely stating that the team now has to be more comfortable without knowing the answers will be seen by skeptics as passive management or poor planning. We have found it crucial to be able to give tangible evidence — a north-star metric, or a stick in the ground far in the future that people can look at: "We don't know all the answers now, but we know this is where we are heading."

Many people will always feel a certain way when they hear the word bet in this context. It implies that we, as an organization, are going to leave our decisions to chance or take wild stabs in the dark. This simply isn't the case.
Thinking in bets is all about how you frame decision making; it is not "I bet this will work." Like any bet, we must ask, "How confident am I in my decision, and what is my confidence threshold, based on its importance?" Larger decisions will, of course, require us to have more confidence, whereas there will be smaller, less destructive decisions that we can make quickly with less confidence. Framing it this way reduces the need to find a definitive solution and instead forces us to allocate each decision the correct amount of effort relative to its importance. This not only allows us to make more decisions faster but also helps us prioritize the most critical issues and make sure we are focusing on the right things.
https://medium.com/swlh/how-to-make-decisions-as-a-team-when-that-team-keeps-growing-a0636fe4a63
['Jamie Carr']
2019-12-13 18:01:01.437000+00:00
['Work', 'Leadership', 'Startup Lessons', 'Productivity', 'Startup']
2,461
Latest Social Media Marketing Trends in 2020
Digital Marketing

Embrace these Two Game-Changing Social Media & Digital Marketing Trends to Take Your Customer Engagement to a Whole Different Level

With around 5 billion people using the internet and the number of active social media users touching the 4 billion mark, it is needless to say that social media has become an unbeatable platform for marketing. Companies from almost every industry are embracing this fact and are channelling their efforts into levelling up their social media marketing practices and their engagement with customers. Social media is a world in itself and is evolving faster than any other platform when it comes to marketing activity. In order to stay on top of their game, companies must embrace trends in social media as they come. The following two recent trends and best practices are driving customer engagement and helping companies get traction like never before.

1. Content is King, Context is Kingdom, and Storytelling is the Royal Guard

Creating attractive and engaging content is no doubt one of the key metrics of success when it comes to social media marketing. However, what is more important is the context in which this content is used. The same post cannot always be used on all of a company's social media channels to drive its marketing activity; different content should be curated for different social channels, targeted at the different audiences on those channels. For example, Instagram is a great platform to drive customer engagement and awareness for a particular product. Viewers on Instagram are looking for entertainment and engagement, and they do not want to be pushed by an advertisement into making a purchase right away. Creating pushy ads for Instagram, for example, might backfire on a company's marketing campaign.

To make sure that the context and the content both resonate with viewers seamlessly, social media updates or advertisements alone are not enough — a royal guard is needed, and I consider storytelling to be this royal guard. Storytelling allows a company to tell the story behind its products and services to the consumer in a given context, thus allowing the content to breathe and flow seamlessly among consumers.

"Just as a king is no one without his kingdom, his people and his guards, Content is nothing without the Context and the Storytelling."

Social media apps understood the concept and importance of storytelling even before the companies looking to market their products on these apps did. Instagram, Facebook, and Snapchat are all examples of platforms that have integrated a story feature, which allows anyone to upload stories, create engaging posts and polls, ask questions, or start a discussion. It is a classic way of community building, and the companies that can leverage this story feature will stay ahead of their competition all the way.

Marks and Spencer is a great example of a business that uses Instagram's story feature very efficiently to drive customer engagement. Proof of Marks and Spencer's successful use of the feature is the fact that the following M&S enjoys is more an organic play than a paid-for one. This is because they produce great content specifically for their audience on Instagram and give this content a beautiful, breathable context with the help of visually engaging, attractive storytelling.
Heineken is yet another example of a brand that has got the basics right and mastered the social media marketing game on Instagram. Heineken used its sponsorship of the UEFA Champions League not just to increase sales penetration among loyal customers but also to attract new digital customers.

Image captured by Vijeshwar Datt

The brand discovered that many of its consumers watching the UEFA Champions League, which it sponsors, were doing so through digital devices only, meaning they wouldn't see activations taking place in so-called traditional media; it was found that 8 out of 10 people were following the game on social media channels. Heineken leveraged this fact to its advantage by starting a UEFA campaign on Instagram using the story feature. This practice not only helped them connect with their existing consumers but also helped them attract new consumers who are die-hard fans of the game in general.

2. Augmented Reality is the New Cool in Marketing

Augmented reality allows brands to create one-of-a-kind, immersive experiences that drive connection and brand-building opportunities. With the help of AR, brands are able to provide virtual tours, hold virtual events, and enable customers to try their products virtually without leaving the comfort of their homes.

Nike has always been very innovative and open when it comes to embracing new marketing practices, and it has incorporated AR into its marketing and customer engagement excellently. In July 2019, Nike integrated an AR feature into its app in the US market that allows customers to scan their feet and get the correct shoe size the first time, taking the guesswork out of buying shoes online. The feature is a great addition to the Nike app, letting customers get the correct shoe size and see what a particular shoe would look like on them.

Image captured by Laura Chouette

Companies such as LVMH and Estee Lauder in the fashion industry could benefit from integrating AR into their marketing campaigns, providing buyers with the unique experience of trying out clothes, accessories, and beauty products in a virtual setting — a unique shopping experience. EA Sports in the gaming industry is also leveraging the concept to engage with gamers, using AR to give them an unmatched gaming experience and creating buzz around the brand. Electronic Arts CEO Andrew Wilson even said, in 2017, that AR is a more interesting experience for gamers than VR.

To make the most out of social and online media, companies must embrace new trends as they come and should be able to take risks in these campaigns.
https://medium.com/swlh/embrace-these-two-game-changing-social-media-digital-marketing-trends-to-take-your-customer-57edee1dc41
['Madhur Dixit']
2020-06-09 19:42:09.677000+00:00
['Storytelling', 'Marketing', 'Social Media Marketing', 'Digital Marketing', 'Content Marketing']
2,462
Trading Dashboard with Yfinance & Python.
Beginner level coding with advanced techniques.

Table of Contents:

Pull Data with the yfinance API
Set the Short and Long Windows (SMA)
Generate Trading Signals
Plot Entry/Exit Points
Backtest
Analyze Portfolio Metrics
Serve Dashboard

Introduction

To begin, let's first understand the goal of this article, which is to provide the average retail investor with a quick and easy way to pull live data, use that data to highlight key indicators, and create a nice, clean, readable table before investing in a particular company (or companies). This process will help you take emotion out of the equation and give you enough information to make informed decisions. Substitute any stock ticker you would like at the bottom of the code block:

# Import libraries and dependencies
import numpy as np
import pandas as pd
import hvplot.pandas
from pathlib import Path
import yfinance as yf

# Cloudflare
net = yf.Ticker("net")
net

# Set the timeframe you are interested in viewing.
net_historical = net.history(start="2018-01-2", end="2020-12-11", interval="1d")

# Create a new DataFrame called signals, keeping only the 'Date' & 'Close' columns.
signals_df = net_historical.drop(columns=['Open', 'High', 'Low', 'Volume', 'Dividends', 'Stock Splits'])

Moving Averages:

Next, we want to create columns for the short and long windows, also known as the simple moving averages. In this case, we will be using the 50-day and 100-day averages. In the code below we set the trading signal to 0 or 1; this tells Python at which points we should buy or sell a position. Keep in mind that when the SMA50 crosses above the SMA100 (the resistance level), this is a bullish breakout signal.

# Set the short window and long window
short_window = 50
long_window = 100

# Generate the short and long moving averages (50 and 100 days, respectively)
signals_df['SMA50'] = signals_df['Close'].rolling(window=short_window).mean()
signals_df['SMA100'] = signals_df['Close'].rolling(window=long_window).mean()
signals_df['Signal'] = 0.0

# Generate the trading signal 0 or 1,
# where 0 is when the SMA50 is under the SMA100, and
# where 1 is when the SMA50 is higher (or crosses over) the SMA100
signals_df['Signal'][short_window:] = np.where(
    signals_df['SMA50'][short_window:] > signals_df['SMA100'][short_window:], 1.0, 0.0
)

# Calculate the points in time at which a position should be taken, 1 or -1
signals_df['Entry/Exit'] = signals_df['Signal'].diff()

# Print the DataFrame
signals_df.tail(10)

The third step towards building our dashboard is creating a chart with green and red markers for the entry/exit indicators.

Plotting the Moving Averages with hvPlot:

# Visualize exit position relative to close price
exit = signals_df[signals_df['Entry/Exit'] == -1.0]['Close'].hvplot.scatter(
    color='red', legend=False, ylabel='Price in $', width=1000, height=400
)

# Visualize entry position relative to close price
entry = signals_df[signals_df['Entry/Exit'] == 1.0]['Close'].hvplot.scatter(
    color='green', legend=False, ylabel='Price in $', width=1000, height=400
)

# Visualize close price for the investment
security_close = signals_df[['Close']].hvplot(
    line_color='lightgray', ylabel='Price in $', width=1000, height=400
)

# Visualize moving averages
moving_avgs = signals_df[['SMA50', 'SMA100']].hvplot(
    ylabel='Price in $', width=1000, height=400
)

# Overlay plots
entry_exit_plot = security_close * moving_avgs * entry * exit
entry_exit_plot.opts(xaxis=None)
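As a quick, illustrative sanity check (not part of the original walkthrough), you can list the dates on which the crossovers actually fired:

# Illustrative: print the dates where the strategy enters and exits
print(signals_df.loc[signals_df['Entry/Exit'] == 1.0].index.tolist())   # SMA50 crossed above SMA100
print(signals_df.loc[signals_df['Entry/Exit'] == -1.0].index.tolist())  # SMA50 crossed back below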
Next, we will set an initial amount of investment capital and the number of shares. For this example, let's say we want to buy 500 shares of Cloudflare.

# Set initial capital
initial_capital = float(100000)

# Set the share size
share_size = 500

# Take a 500 share position where the dual moving average crossover is 1 (SMA50 is greater than SMA100)
signals_df['Position'] = share_size * signals_df['Signal']

# Find the points in time where a 500 share position is bought or sold
signals_df['Entry/Exit Position'] = signals_df['Position'].diff()

# Multiply share price by entry/exit positions and get the cumulative sum
signals_df['Portfolio Holdings'] = signals_df['Close'] * signals_df['Entry/Exit Position'].cumsum()

# Subtract the cumulative cost of the positions from the initial capital to get the liquid cash in the portfolio
signals_df['Portfolio Cash'] = initial_capital - (signals_df['Close'] * signals_df['Entry/Exit Position']).cumsum()

# Get the total portfolio value by adding the cash amount to the portfolio holdings (or investments)
signals_df['Portfolio Total'] = signals_df['Portfolio Cash'] + signals_df['Portfolio Holdings']

# Calculate the portfolio daily returns
signals_df['Portfolio Daily Returns'] = signals_df['Portfolio Total'].pct_change()

# Calculate the cumulative returns
signals_df['Portfolio Cumulative Returns'] = (1 + signals_df['Portfolio Daily Returns']).cumprod() - 1

# Print the DataFrame
signals_df.tail(10)

Visualize the exit positions relative to our portfolio:

# Visualize exit position relative to total portfolio value
exit = signals_df[signals_df['Entry/Exit'] == -1.0]['Portfolio Total'].hvplot.scatter(
    color='red', legend=False, ylabel='Total Portfolio Value', width=1000, height=400
)

# Visualize entry position relative to total portfolio value
entry = signals_df[signals_df['Entry/Exit'] == 1.0]['Portfolio Total'].hvplot.scatter(
    color='green', legend=False, ylabel='Total Portfolio Value', width=1000, height=400
)

# Visualize total portfolio value for the investment
total_portfolio_value = signals_df[['Portfolio Total']].hvplot(
    line_color='lightgray', ylabel='Total Portfolio Value', width=1000, height=400
)

# Overlay plots
portfolio_entry_exit_plot = total_portfolio_value * entry * exit
portfolio_entry_exit_plot.opts(xaxis=None)

# Prepare DataFrame for metrics
metrics = ['Annual Return', 'Cumulative Returns', 'Annual Volatility', 'Sharpe Ratio', 'Sortino Ratio']
columns = ['Backtest']

# Initialize the DataFrame with index set to evaluation metrics and column as `Backtest` (just like PyFolio)
portfolio_evaluation_df = pd.DataFrame(index=metrics, columns=columns)

Perform Backtest:

In this section we will highlight five indicators.

1. Cumulative return — the total return on the investment.
2. Annual return — the return on the investment received that year.
3. Annual volatility — the daily volatility times the square root of 252 trading days.
4. Sharpe ratio — measures the performance of an investment compared to a risk-free asset, after adjusting for its risk.
5. Sortino ratio — differentiates harmful volatility from total overall volatility by using the standard deviation of negative portfolio returns (the downside deviation) instead of the total standard deviation of portfolio returns.
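Written out as formulas (with the risk-free rate taken as zero, an assumption the code below makes implicitly), the two ratios are:

\[
\text{Sharpe} = \frac{\bar{r} \cdot 252}{\sigma \sqrt{252}}, \qquad
\text{Sortino} = \frac{\bar{r} \cdot 252}{\sigma_{\text{down}} \sqrt{252}}
\]

where \(\bar{r}\) is the mean daily portfolio return, \(\sigma\) the standard deviation of the daily returns, and \(\sigma_{\text{down}}\) the downside deviation (the root mean square of the below-zero daily returns).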
# Calculate cumulative return
portfolio_evaluation_df.loc['Cumulative Returns'] = signals_df['Portfolio Cumulative Returns'][-1]

# Calculate annualized return
portfolio_evaluation_df.loc['Annual Return'] = (
    signals_df['Portfolio Daily Returns'].mean() * 252
)

# Calculate annual volatility
portfolio_evaluation_df.loc['Annual Volatility'] = (
    signals_df['Portfolio Daily Returns'].std() * np.sqrt(252)
)

# Calculate Sharpe Ratio
portfolio_evaluation_df.loc['Sharpe Ratio'] = (
    signals_df['Portfolio Daily Returns'].mean() * 252) / (
    signals_df['Portfolio Daily Returns'].std() * np.sqrt(252)
)

# Calculate Downside Return
sortino_ratio_df = signals_df[['Portfolio Daily Returns']].copy()
sortino_ratio_df.loc[:, 'Downside Returns'] = 0
target = 0
mask = sortino_ratio_df['Portfolio Daily Returns'] < target
sortino_ratio_df.loc[mask, 'Downside Returns'] = sortino_ratio_df['Portfolio Daily Returns']**2
portfolio_evaluation_df

# Calculate Sortino Ratio
down_stdev = np.sqrt(sortino_ratio_df['Downside Returns'].mean()) * np.sqrt(252)
expected_return = sortino_ratio_df['Portfolio Daily Returns'].mean() * 252
sortino_ratio = expected_return / down_stdev

portfolio_evaluation_df.loc['Sortino Ratio'] = sortino_ratio
portfolio_evaluation_df.head()

# Initialize trade evaluation DataFrame with columns.
trade_evaluation_df = pd.DataFrame(
    columns=[
        'Stock', 'Entry Date', 'Exit Date', 'Shares',
        'Entry Share Price', 'Exit Share Price',
        'Entry Portfolio Holding', 'Exit Portfolio Holding',
        'Profit/Loss']
)

Loop through the DataFrame: if the 'Entry/Exit' value is 1, record the entry trade metrics; if it is -1, record the exit trade metrics, calculate the profit, and append the record to the trade evaluation DataFrame.

# Initialize iterative variables
entry_date = ''
exit_date = ''
entry_portfolio_holding = 0
exit_portfolio_holding = 0
share_size = 0
entry_share_price = 0
exit_share_price = 0

for index, row in signals_df.iterrows():
    if row['Entry/Exit'] == 1:
        entry_date = index
        entry_portfolio_holding = abs(row['Portfolio Holdings'])
        share_size = row['Entry/Exit Position']
        entry_share_price = row['Close']
    elif row['Entry/Exit'] == -1:
        exit_date = index
        exit_portfolio_holding = abs(row['Close'] * row['Entry/Exit Position'])
        exit_share_price = row['Close']
        # Profit is exit proceeds minus entry cost (the original snippet had the operands reversed)
        profit_loss = exit_portfolio_holding - entry_portfolio_holding
        trade_evaluation_df = trade_evaluation_df.append(
            {
                'Stock': 'NET',
                'Entry Date': entry_date,
                'Exit Date': exit_date,
                'Shares': share_size,
                'Entry Share Price': entry_share_price,
                'Exit Share Price': exit_share_price,
                'Entry Portfolio Holding': entry_portfolio_holding,
                'Exit Portfolio Holding': exit_portfolio_holding,
                'Profit/Loss': profit_loss
            },
            ignore_index=True)

PLOT RESULTS:

price_df = signals_df[['Close', 'SMA50', 'SMA100']]
price_chart = price_df.hvplot.line()
price_chart.opts(title='Cloudflare', xaxis=None)

Final Step: Print Dashboard

portfolio_evaluation_df.reset_index(inplace=True)
portfolio_evaluation_table = portfolio_evaluation_df.hvplot.table()
portfolio_evaluation_table
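The table of contents promised "Serve Dashboard"; the post itself renders the table inline (e.g., in a notebook). One way to actually serve these hvPlot objects as a standalone dashboard is Panel — a sketch, not from the original article; the layout and title are hypothetical:

# Illustrative: serve the charts and table as a Panel app
# Run with: panel serve dashboard.py
import panel as pn

pn.extension()

dashboard = pn.Column(
    "# Cloudflare Trading Dashboard",  # hypothetical title
    price_chart,                       # price + moving averages
    portfolio_entry_exit_plot,         # entry/exit markers on portfolio value
    portfolio_evaluation_table,        # backtest metrics
)
dashboard.servable()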
Thanks for reading! If you found this article useful, feel welcome to download my personal code on GitHub. You can also email me directly at [email protected] and find me on LinkedIn. Interested in learning more about data analytics, data science, and machine learning applications? Follow me on Medium.

https://medium.com/analytics-vidhya/trading-dashboard-with-yfinance-python-56fa471f881d
['Scott Andersen']
2020-12-18 13:57:31.065000+00:00
['Python', 'Dashboard', 'Stock Analysis', 'API', 'Finance']
2,463
R.I.P. Dangerfields: The oldest comedy club in the world. (1969–2020)
Photo ©Copyright 2020 Jason Chatfield R.I.P. Dangerfields: The oldest comedy club in the world. (1969–2020) It may have been a bit of a shithole, but it was my shithole. No Respect, I tell ya. October 14, 2020 When I moved to New York 6 years ago, I had a notebook with 7 years worth of jokes in it that I’d been performing in Australia. None of them worked in New York. I flushed my notebook down the toilet at the Ludlow Hotel, blocked the toilet and fled as the water rose and flooded the bathroom. A worthy death for such tired, dreadful material. Over the following 3 years, I went out every night of the week, did 3–4 spots each night and worked up a new hour of material (ok, 35 minutes of actually decent material, 25 minutes of B and C-grade material). I got a manager, an agent, started booking casinos, clubs and doing TV commercials and shows. It was a lot of work, and it didn’t once feel like it. I loved every minute of going out there and building an act. Auditioning to get ‘passed’ at clubs was nerve-wracking, but I managed to get my foot in the door at a few ground-level places to cut my teeth at some late-night spots. (Getting passed is getting approved to be put on their regular roster of comics. You send in your avails to the booker each week and they give you the times/shows that you’ll be on that week. The hardest club to get passed at is The Comedy Cellar; the best club in New York.) The first club I got ‘passed’ at was called LOL. It wasn’t so much a comedy club as a converted sex dungeon in Times Square with a cheap vinyl banner that said ‘LOL STANDUP COMEDY’ on it. It had two separate rooms inside running concurrent shows every night, filled with people from the mid-west who, 15 minutes earlier, had been told they were about to see Chris Rock, Louis CK and Tina Fey (not a stand-up comic). As you can imagine, by the time my schlubby face got up on stage, they had realised how badly they’d been screwed, and every night there were people asking for their money back. One time the booker got punched in the face by an angry punter. I would perform there 2–3 nights a week, sometimes 3 or 4 shows in a night, 10–15 minutes apiece. Sometimes I’d be hosting, other times I’d close out the show. We’d do shows every night, in 100 degrees or in the middle of a blizzard. Working that club taught me to deal with hostile audiences and how to digest uncooked hotdogs. Before long, new management came in and I was turfed out the door along with a swag of other comics who had been working there since it opened. It was at that point that a booker I was working with at Broadway Comedy Club put me up to audition for Dangerfields. He’d been producing outside shows for both clubs and threw me up with a few other comics for consideration. I passed. Within a month I was performing there 2–3 times a week, and booking road gigs at Casinos through their management company. It was my new home club.
https://medium.com/sketchesbychatfield/r-i-p-dangerfields-the-oldest-comedy-club-in-the-world-1969-2020-8c627eff0f08
['Jason Chatfield']
2020-10-14 21:15:49.292000+00:00
['Comedy', 'Humor', 'New York', 'Dangerfields', 'Writing']
2,464
The Grand Master — Short Story. “For my entire life, I have moved along…
Photo by Jeswin Thomas from Pexels “For my entire life, I have moved along a path that was set for me. It was as if I was being thought through rather than actually producing these thoughts. After many years of reflection and guilt, I have decided to explain it to you. For those listening to this speech, it will come as quite a surprise that this is the truth. I have achieved a great many things in all of my years. I have contributed much to the progress of science and the understanding of the cosmos. All of the awards I received in the past decades haven’t fazed my resolve, but with the honor bestowed upon me by the Nobel committee, I cannot continue this sham any longer. I stand here in the beautiful Stockholm Concert Hall, thanking you for my award in physics, but I must speak the truth. I’ve been a pawn in the game of the Grand Master, and now I reveal it to the world.” Around me and before me sat some of the most renowned scientists in the world. All were adorned with the most expensive tuxedos, the most ornate gowns. The Swedish Royal Family sat at my left and whispered to one another. Queen Silvia’s nine-pronged tiara glistened in the stage lights. The stare laid upon me by King Gustaf penetrated my very soul. I could see it in his eyes. I mustn’t speak the truth. He knew as well as I how the chess game worked. I likely would not be able to finish my speech, but I don’t expect they were prepared for this. I continued. “Where to begin? Ah yes, on a quiet spring day in 1975, I was sitting with friends around a campfire on a weekend getaway from MIT. We discussed many topics in our escape from the realities of collegiate life. There were no professors to demand changes to our theses, no modifications of our arguments. We were free to think without the guide rails of the intellectual enforcers. At least, that is what I thought. If I had known what would happen in those trees, I wouldn’t have gone,” I paused, eyeing the room for feedback. Everyone seemed slightly uncomfortable. “I realized that my friends had already joined the game, and I was the new initiate. The promise of prestige and power to a young man was hard to pass on. It was in this dark forest that I became enslaved to the Grand Master.” The room became even more uncomfortable, as they didn’t know if I was going to make the final revelation or was simply being comedic and sarcastic. That was my usual style of communication. I quite enjoyed making the room squirm. I could see it as they whispered amongst themselves and stiffened in their seats. It was time to say it. “At first, I didn’t understand what was happening. Was I hallucinating? Was I drugged? As I grew older, I understood what happened that day. I have spoken with many of you about such a subject — I won’t name any names. Many of us, if not all, were initiated into the grand chess game. Each is assigned a role on the board, this board being quite more complex than what one would traditionally think of as chess, and as long as one plays their rank, they can continue, take part in the winnings, and live a decadent life. We all are connected to the stream of the Grand Master, and we do as we’re told. But not today. Not anymore!” King Gustaf stood up. “Oh, what a laugh you are making, Dr. Wilson,” he said in his thick Swedish accent. “Your highness, it is quite a laugh, but please, let me continue with my speech.” The King didn’t know how to react, but I could see the guards at the back of the auditorium being radioed.
I was sure to be hastened off of the stage in a moment, but I had to continue. “For those of you watching the live stream, know this — everything that I have discovered was given to me from the stream of thought of the Grand Master. I was a willing servant for most of my life, but now I must reveal the truth behind it. Those on this stage will likely call me mad, delusional, or some variation of the words. Science was not built by a series of geniuses, but by intellectual slaves, and I am tired of being a vessel from which ideas emerge.” A security guard from across the room began to make his way to the stage. I only had moments left. “Who is the Grand Master, you may ask? I’ve spent most of my life trying to find that answer, but I’ve realized that I won’t be able to. This is why I’ve made my stand here, on this prestigious stage. A madman cannot make it to this stage, save for John Nash. He knew what I know now. He couldn’t live with himself, just as I cannot any longer. Those watching must achieve what I could not in my life. You must find the Grand Master. You must!” The security guard was now at my back, and I was escorted off of the stage. I didn’t know what would happen from this moment on, but I expected I would be placed in an institution for the remainder of my days. Luckily, I didn’t have many remaining. I did this for all of those who had to live with this secret. I did this for John, who played the grand chess game and was broken by it. It breaks so many. He breaks so many. It’s now up to the next generation to transcend the game.
https://medium.com/beyond-the-river/the-grand-master-48ab02c03adb
['Drunk Plato']
2020-06-11 14:46:33.652000+00:00
['Short Story', 'Science', 'Mystery', 'Psychology']
2,465
An Idiot With a Plan Can Beat a Genius With Hope
Is it just me or does it seem like the quiet ones that you least expected to succeed in high school are the ones now living the lives we all dreamt of heading into adulthood? I mean, I guess I heard the idea of people "peaking" early — I just didn't have any true understanding of what that meant when I was younger. Listen, I'm not trying to prop myself or anyone up in writing this — it's just something that's been on my mind. I wasn't a part of the "popular" kids growing up. I had my friends and we were always kind of just doing our thing. Then — into adulthood, I moved away from home, "peeked" at what others from my graduating high school class were up to, kept my head down, and got to work. Now, I'm making a very healthy living doing what I love to do and helping a community of other badasses achieve the same for themselves. It got me thinking about all the other weird kids I grew up with and how I notice a lot of them are doing really rad things in life. Of course, I'm able to see some of my own confirmation bias here — it isn't a perfect generalization; however, I do think there is a trend. I don't think I'm anything special. Of course, I know that me being me is special — just like you being you is special. What I mean is I don't have any special skills, knowledge or education that allowed me to accomplish what I have, and I know there are millions of other people who fall into that camp as well. In this piece, I'm going to go over the plan and share the strategy I have followed over the years that has helped me build, scale, and grow my online business so you can do the same.
https://medium.com/the-ascent/an-idiot-with-a-plan-can-beat-a-genius-with-hope-bf60faa4b3bd
['Jon Brosio']
2020-10-17 13:03:08.498000+00:00
['Blogging', 'Motivation', 'Entrepreneurship', 'Life', 'Self Improvement']
2,466
A Concise Guide to Remember More of What You Read
1. Start What You Can Finish
Before you pick up a book, use what I’d like to call the “Three-Pronged Questionnaire”: What do I want to learn or read? They can be categories such as fiction/non-fiction, self-help, politics, science, relationships, cooking, etc. Why am I reading this? What do I hope to get out of this book? To help you with that, going through the table of contents, book summary, and reviews gives you a wonderful sense of what that book is all about. The list of questions above serves to ensure you’re reading a book that will pique your interest in the long run. If you’re likely to enjoy the purpose of the book, you’ll make an effort to understand the context of what the author has written. As one of my teachers used to put it, “If you study to remember, you’ll forget. If you study to understand, you’ll remember.” — Which do you remember better: the content in your history textbook or the logic behind why 5+6 = 11?
2. Annotate (The Messy Way)
Scribbling notes on books is not something new, but the way you’re making notes on them makes a difference. Merely underlining or highlighting an excerpt or an idea in the book isn’t effective in imprinting those words in your memory. Instead, I’d highlight a specific sentence or paragraph of ideas, draw a curly bracket beside it, and rephrase it in my own words. Doing this not only summarises the key points the author is trying to convey, but it also deepens your understanding. It’s the same as telling someone what you’re trying to remember, except you’re doing it for yourself. Putting something in your own words helps you retrieve that information later on. Don’t believe me? Try explaining the process of evaporation and revisit the concept a day or two later.
Put tabs on the first page
Sometimes, I’d also annotate striking ideas on the first page of the book. It’s usually filled with a title in the middle of a blank page, so there is plenty of space left for me to write. On it is where I’ll write a short subtitle along with the page number relating to a concept that appealed to me. Whenever I want to refresh my memory on some grand ideas or lessons listed in the book, I just need to turn to the page.
3. Create Your Encyclopedia of Book Summaries
Much of my advice here takes a great deal of work on your part, but summarising each chapter and its takeaways helps in remembering what you read. I use Notion to compile all the book summaries I’ve written. You may not be carrying your pile of books all the time, so having an app like this allows you to retrieve information wherever you go. Here are some useful functions that helped me organise my notes better:
Collapsible drop lists — useful for parking a chunk of the information under the main header of a chapter
Underline, bold, and italicise functions to emphasise various key concepts
Colour tags to differentiate the categories of books you’ve read
Embed web links, images, or videos — could be book reviews or summaries by others online that you find useful
Another bonus tip I’d like to share is a compilation of my favourite websites to visit for concise book summaries:
https://medium.com/the-innovation/a-concise-guide-to-remember-what-you-read-16d651f64132
['Charlene Annabel']
2020-12-22 11:05:43.406000+00:00
['Books', 'Reading', 'Productivity', 'Productivity Hacks', 'Self Improvement']
2,467
Post-modern rock-pooling
This piece was originally published in Mediaview, Geology Today — a publication of the Geological Society of London and the Geologists’ Association. Stygobites — Niphargus aquilex (Image: Chris Proctor). Gazing into nature’s aquarium, a replica of life in the ocean, rock pools show us a glimpse of the distant marine world of crabs, shrimps and all manner of crustaceans jostling for life in their aquatic domain. But is the coast as far as they venture? The holidaying shores of the seaside may hold the classic rock-pool, but a similar crustacean abundance exists unknown beneath our dry, clad feet. Deep within our inland geology, a rich biodiversity of crustaceans is only now beginning to be unearthed, living squeezed into the tiny nooks and crannies carved into the subterranean landscape. The creatures of these unseen depths are known as stygobites, and after so long buried beneath the ground they have become like the ghosts of their more marine counterparts. Their bodies have become wraith-like, sightless and an eerie white, while they have sprouted further‐reaching limbs and antennae for fumbling around in the rocky crevices. These are not for catching in seaside fishing nets and buckets; they inhabit a unique ecological and geological niche, after a dual ancestry arising from both freshwater and marine animals. A suitably unique home is formed by deep underground hydro-geology, as groundwater erodes extensive submerged channels that permeate the land. They are found throughout these clandestine networks; from sparse, thin rock fissures to the deep aquifers in chalk, limestone and other rock strata, and even in the infinitesimal liquid spaces between the gravel grains of riverbeds. Stygobites — Niphargus glenniei (Image: Andy Lewington) Although also recorded in cave pools, it is thought that stygobites are native to the isolated channels of phreatic water (ground water below the water table) deep in rock beds, and only by flooding and heavy rains are they brought into the fringes of our world, as they are flushed out into cave and river systems. Despite their apparent isolation, this aquatic subterranean habitat has enabled stygobites to become relatively widespread throughout the subsurface, and they are wider ranging than the related troglobites (which are terrestrial, as opposed to the aquatic stygobites). This is likely due to the dynamics of water associated with flooding, allowing stygobites to disperse and spread in range. If you want to know which rocks beneath your feet may hold this secret life, research has shown that stygobites appear to favour fissured, carbonate strata, which may be because such rock provides the most fitting basis for their habitat. Stygobites are not just fascinating, largely unknown creatures in our landscape, but have wider implications, from revealing more about biogeochemical processes deep in continental geology, to acting as indicators for the condition of subsurface waters and our increasing impact on them through aquifer drainage as a water resource. Even with their natural secrecy and our predominant ignorance of them, it seems that even these remote creatures can’t escape the global anthropogenic changes to the Earth. It is thought that the stygobites’ adaptations for stable aquatic environments (such as long life‐cycles and slower egg development) may not withstand a modernity of farmed aquifers, where water levels must follow the rhythm of mankind’s insatiably thirsty lifestyle.
Stygobites attempting to survive such altered environments may migrate or simply decline, becoming dormant under such stressors. This unfortunate trend may, however, also provide a hidden benefit, as such effects may be usable as biomarkers of pollution or climate change. Knowledge of our geological past can also be gleaned from these unassuming creatures, such as our past climate through the distribution of the stygobite species. As past glaciers froze vast areas of the land surface, the stygobites’ ecosystem was deprived of nutrients and water, starving their population and leaving gaps in their distribution that still remain today, although recent research also suggests the survival of groups of stygobites from previously glaciated areas in other parts of the world, such as Canada and Ireland. Studies in everything from the micro‐structure of groundwater channels and aquifers, to large, extensive geological changes, and even to our past climate, can be advanced through a better understanding of these hidden rock-poolers. (Readers can find additional information in: Lamoreux, J., Journal of Cave and Karst Studies, 2004, v.66, pp.18–19; and, Robertson, A.L., et al., 2009. The distribution and diversity of stygobites in Great Britain: an analysis to inform groundwater management. Quarterly Journal of Engineering Geology and Hydrogeology, v.42, pp.359–368.)
https://medium.com/swlh/post-modern-rock-pooling-2f6b59eb65e8
['Georgia Melodie Hole']
2020-05-18 21:30:56.849000+00:00
['Creative Writing', 'Geology', 'Science', 'Wildlife', 'Writing']
2,468
Visualising COVID19
Visualising COVID19
Analysis of coronavirus from an Epidemic to Pandemic
Photo by Markus Spiske on Unsplash
Coronavirus was first identified in the Wuhan region of China by December 2019, and by March 11, 2020, the World Health Organization (WHO) categorised the COVID-19 outbreak as a pandemic. A lot has happened in the months in between, with major outbreaks in Iran, India, the United States, South Korea, Italy and many more countries. We know that COVID-19 spreads through respiratory droplets, such as through coughing, sneezing, or speaking. But this is an approach to visualise how quickly the virus spread across the globe and how it took the form of a massive pandemic from an outbreak in China. This is an attempt to visualize COVID-19 data from the first several weeks of the outbreak to see at what point this virus became a global pandemic, and finally to visualise its numbers across some severely hit nations. The data used for the visualisations has been collected from the publicly available data repository created by Johns Hopkins University’s Center for Systems Science and Engineering. Firstly, we use data till 17th March 2020, covering the first several weeks of the outbreak, to see at what point this virus became a global pandemic.
A. Importing the Dataset and required Libraries
Loading the readr, ggplot2 and dplyr packages in R. Reading the data for confirmed cases from datasets/confirmed_cases_worldwide.csv using the read_csv function and assigning it to the variable confirmed_cases_worldwide.
B. First Glance at the Data by Plotting the Confirmed Cases Throughout the World
The data above shows the cumulative confirmed cases of COVID-19 worldwide by date. Just reading numbers in a table makes it hard to get a sense of the scale and growth of the outbreak. Hence, drawing a line plot to visualise the confirmed cases worldwide. Using confirmed_cases_worldwide, drawing a ggplot with aesthetics cum_cases (y-axis) versus date (x-axis) and ensuring it is a line plot by adding line geometry. Setting the y-axis label to “Cumulative Confirmed Cases”.
C. Comparing China to the Rest of the World
The y-axis in that plot indicated a very steep rise, with the total number of confirmed cases around the world reaching approximately 200,000 by 17th March 2020. Beyond that, some other things can also be concluded: there is an odd jump in mid-February, then the rate of new cases slows down for a while, then speeds up again in March. Early on in the outbreak, the COVID-19 cases were primarily centred in China. Hence, plotting confirmed COVID-19 cases in China and the rest of the world separately to see if it gives us any insight. Reading in the dataset for confirmed cases in China and the rest of the world from datasets/confirmed_cases_china_vs_world.csv, assigning to confirmed_cases_china_vs_world. Using glimpse() to explore the structure of confirmed_cases_china_vs_world. Drawing a ggplot of confirmed_cases_china_vs_world, and assigning it to plt_cum_confirmed_cases_china_vs_world. Adding a line layer. Adding aesthetics within this layer: date on the x-axis, cum_cases on the y-axis, and then grouping and coloring the lines by is_china.
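As a rough sketch, the steps in sections A–C might look like this in R, assuming the file paths and column names (date, cum_cases, is_china) described above:

library(readr)
library(ggplot2)
library(dplyr)

# A. Read the worldwide cumulative case counts
confirmed_cases_worldwide <- read_csv("datasets/confirmed_cases_worldwide.csv")

# B. Line plot of cumulative confirmed cases over time
ggplot(confirmed_cases_worldwide, aes(x = date, y = cum_cases)) +
  geom_line() +
  ylab("Cumulative Confirmed Cases")

# C. China versus the rest of the world, one coloured line per group
confirmed_cases_china_vs_world <- read_csv("datasets/confirmed_cases_china_vs_world.csv")
glimpse(confirmed_cases_china_vs_world)

plt_cum_confirmed_cases_china_vs_world <- ggplot(confirmed_cases_china_vs_world) +
  geom_line(aes(x = date, y = cum_cases, group = is_china, color = is_china))
plt_cum_confirmed_cases_china_vs_world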
D. Annotation
We can observe that the two lines have very different shapes. In February, the majority of cases were in China. That changed in March, when it really became a global outbreak: around March 14, the total number of cases outside China overtook the cases inside China. This was days after the WHO declared a pandemic. There were a couple of other landmark events that happened during the outbreak. For example, the huge jump in the China line on February 13th, 2020, wasn’t just a bad day regarding the outbreak; China changed the way it reported figures on that day (CT scans were accepted as evidence for COVID-19, rather than only lab tests). By annotating events like this, we can better interpret changes in the plot; hence we modify plt_cum_confirmed_cases_china_vs_world to mark them.
E. Adding a Trend Line to Chinese Cases
To get a measure of how fast the number of cases in China grew, we need to add a trend line to the Chinese cases plot. A good starting point was to see if the cases grew faster or slower than linearly. We can see there is a clear surge of cases around February 13, 2020, with the reporting change in China. However, a couple of days after, the growth of cases in China slows down, and to describe COVID-19’s growth in China after February 15, 2020, we add this trend line. Filtering rows of confirmed_cases_china_vs_world for observations of China where the date is greater than or equal to “2020-02-15”, and assigning it to china_after_feb15. Using china_after_feb15, drawing a line plot of cum_cases versus date. Adding a smooth trend line, calculated using the linear regression method, without the standard error ribbon.
F. Adding a Trend Line to Rest of the World Cases
From the plot above, the growth rate in China is slower than linear, which indicated that China had at least somewhat contained the virus in late February and early March. Now, similarly comparing the growth of cases across the globe. Filtering rows of confirmed_cases_china_vs_world for observations of Not China, and assigning them to not_china. Using not_china, drawing a line plot of cum_cases versus date, and assigning it to plt_not_china_trend_lin. Adding a smooth trend line, calculated using the linear regression method, without the standard error ribbon.
G. Adding a Logarithmic Scale to the Trend for Rest of the World
From the plot above, we can see a straight line does not fit well at all, and the rest of the world cases grew much faster than linearly. Hence, trying to add a logarithmic scale to the y-axis to check if the rise is exponential. Modifying the plot, plt_not_china_trend_lin, to use a logarithmic scale on the y-axis.
H. Countries outside of China which have been hardest hit by COVID19
With the logarithmic scale, we get a much closer fit to the data. From a data science point of view, a good fit is great news. But unfortunately, from a public health point of view, it meant that cases of COVID-19 in the rest of the world grew at an exponential rate, which is quite evident today. Not all countries are being affected by COVID-19 equally, and it would be helpful to know where in the world the problems were the greatest. Hence, to find the countries outside of China with the most confirmed cases in our dataset, data was imported on confirmed cases by country. Chinese data has been excluded to focus on the rest of the world. We look at the output of glimpse() to see the structure of confirmed_cases_by_country. Then, using confirmed_cases_by_country, we group by country, summarise to calculate total_cases as the maximum value of cum_cases, and get the top seven rows by total_cases.
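Again as a sketch, sections E–H might look like the following; the value labels for is_china (“China” / “Not China”) and the datasets/confirmed_cases_by_country.csv path are assumptions based on the descriptions above:

# E. Trend line for Chinese cases after February 15, 2020
china_after_feb15 <- confirmed_cases_china_vs_world %>%
  filter(is_china == "China", date >= "2020-02-15")   # assumed label "China"

ggplot(china_after_feb15, aes(x = date, y = cum_cases)) +
  geom_line() +
  geom_smooth(method = "lm", se = FALSE)   # linear trend, no error ribbon

# F/G. Rest of the world, then a log10 y-axis to test for exponential growth
not_china <- confirmed_cases_china_vs_world %>%
  filter(is_china == "Not China")          # assumed label "Not China"

plt_not_china_trend_lin <- ggplot(not_china, aes(x = date, y = cum_cases)) +
  geom_line() +
  geom_smooth(method = "lm", se = FALSE)
plt_not_china_trend_lin + scale_y_log10()

# H. Top seven countries outside China by total confirmed cases
confirmed_cases_by_country <- read_csv("datasets/confirmed_cases_by_country.csv")
glimpse(confirmed_cases_by_country)

confirmed_cases_by_country %>%
  group_by(country) %>%
  summarise(total_cases = max(cum_cases)) %>%
  top_n(7, total_cases)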
I. Plotting the Hardest Hit Countries as of Mid-March 2020
Even though the outbreak was first identified in China, there is only one country from East Asia (South Korea) in the above table. Four of the listed countries (France, Germany, Italy, and Spain) are in Europe and share borders. To get more context, we can plot these countries’ confirmed cases over time. Reading in the dataset for the seven hardest-hit countries outside China from datasets/confirmed_cases_top7_outside_china.csv, and assigning it to confirmed_cases_top7_outside_china. Using glimpse() to explore the structure of confirmed_cases_top7_outside_china. Using confirmed_cases_top7_outside_china, drawing a line plot of cum_cases versus date, grouped and colored by country, and setting the y-axis label to “Cumulative Confirmed Cases”.
J. Plotting the Hardest Hit Countries as of Today
Now, in order to analyse the hardest-hit countries as of today, we have to import fresh data, updated up to today, i.e. 28th June 2020; hence we use the coronavirus library from GitHub (the dev version), which is updated on a daily basis.
Conclusion
From the above analysis, we can conclude the timing of the shift of the virus from being an epidemic in Wuhan, China to becoming a world-crisis pandemic. We can also observe a significant rise of cases in China after mid-February, due to improvements in testing and the acceptance of CT scans as a test for the coronavirus. Also, using a regression trend we can clearly see the exponential rise of the cases across the world, and at last we can visualise the countries which are severely hit by the virus in today’s time. This has been an effective way to study the growth in the number of cases across the world, and especially in China, through visualisations in R using the readr, ggplot2 and dplyr libraries. Looking at India specifically, it currently ranks 4th in the number of confirmed cases, and it is correct to state that the number of cases is increasing day by day. Even though the government is being negligent and removing the lockdown, one must keep in mind that the virus has not been eradicated, and everyone should maintain the social distancing norms while giving the utmost priority to one’s hygiene.
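To close the loop on sections I and J, a sketch of the remaining steps; the RamiKrispin/coronavirus GitHub repository is an assumption here, since the article names only “the coronavirus library from GitHub”:

# I. Line plot of the seven hardest-hit countries outside China
confirmed_cases_top7_outside_china <- read_csv("datasets/confirmed_cases_top7_outside_china.csv")
glimpse(confirmed_cases_top7_outside_china)

ggplot(confirmed_cases_top7_outside_china,
       aes(x = date, y = cum_cases, group = country, color = country)) +
  geom_line() +
  ylab("Cumulative Confirmed Cases")

# J. Daily-refreshed figures from the dev version of the coronavirus package
# remotes::install_github("RamiKrispin/coronavirus")   # assumed repository
library(coronavirus)
update_dataset()   # pulls the latest daily data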
https://medium.com/analytics-vidhya/visualising-covid19-d3577ebee496
['Kartikay Laddha']
2020-07-04 16:43:36.588000+00:00
['Data Science', 'Business Analysis', 'Coronavirus', 'Visualization', 'Covid 19']
2,469
Why your Kubernetes configuration strategy is broken…
...and here’s how to fix it
At kapitan.dev we believe the current way to manage Kubernetes configurations is broken. Actually, it probably goes even deeper than that, and you will soon see why. I have a strange way to look at Kubernetes: for me, Kubernetes is something that allows me to define, package and distribute complex applications with a set of configuration files. Meaning, I can define every aspect of the deployment of a complex multi-component application with Kubernetes resources: the services that make up the application, configuration files, network policies, load balancers, RBAC, monitoring, alerting, auth. Everything can be captured, packaged and distributed using declarative Kubernetes resource definitions. And yet, when we use tools like helm and kustomize, we tend to focus on one specific component at a time, effectively losing the big picture of what Kubernetes really is all about. To draw a parallel with “old” tech, we are ~6 years into the age of Kubernetes, which completely disrupted the way we deploy our services, and yet we are still at the “rpm” or “deb” stage. To be fair, at least helm gets closer to a “yum” or “apt-get”, but doesn’t go much further than that. Let me give you some examples of where these traditional approaches fall short.
Example 1: Adding a new component to your infrastructure
Imagine that you just created a new component/service and you want to deploy it to your infrastructure. Fine, get your helm/kustomize configuration and deploy it, right? Pronto! Presto! But here is the “expectation vs reality” moment: life has a way to get to you, and for the new component to work, you also need to:
Add an env variable to another service
Add a new route to your ingress
Add a new annotation to all other services you have
Create a new DB username/password for the new service to use
Create a new network policy associated with the service
Add a CD step to deploy the new component
Please take a second to let it sink in, and answer these questions: How many steps, tools and pull requests will you have to deal with in order to fulfil a “business as usual” operation? Could anyone in your company/team fulfil this request? If the answer is not “1, 1, 1” and “yes”, please keep reading.
Example 2: Enabling a feature flag
This other example is nothing different from the previous one, just something that is expected to happen more often. Let’s imagine you have worked for weeks across teams to define a new behaviour for your application, and it’s behind a feature flag: several feature flags, to be precise. Because you have fully embraced microservices like a boss, you need to do the following to enable the new behaviour, which we shall call “holiday sales reporting”:
set FLAG_HOLIDAY_SALES_REPORTING=true on the frontend component
add the --enable-json-output flag on the backend component
point the /report route to a new service
Same questions as before, really: how easy is it for you to achieve this with your current setup? How much coordination is needed? Now a bonus question: what if, after the holidays, you need to turn off this flag? How many steps will you have to go through? And what if, in the meanwhile, --enable-json-output has actually become required by another feature as well? And what if you have only added this feature to a couple of environments? How do you document that the feature is enabled?
Solving it with Kapitan
When you use kapitan, you don’t just capture the configuration of one single component, but rather the full configuration of everything else that is needed to run the whole application: Kubernetes resources, Terraform files, documentation, scripts, secrets. The typical setup uses a “target file” to capture “at least” everything that you would normally put in one namespace, but you can easily track resources that need to be created in other namespaces (e.g. istio-system). A typical target file (i.e. targets/production.yml) would look like this:

classes:
  - common
  - profile.production
  - location.europe
  - release.production
  - component.frontend
  - component.backend
  - component.mysql

parameters:
  description: "Production environment"

Head over to our repository https://github.com/kapicorp/kapitan-reference for a more complete example, e.g. Weaveworks’ “sock shop”. A class (i.e. component.frontend) points to a file on the local disk, so you would expect to find a file inventory/classes/component/frontend.yml to capture the configuration needed for the frontend component, which would look like this:

parameters:
  components:
    frontend:
      image: company/frontend:${release}
      port:
        http:
          service_port: 80
  ingresses:
    global:
      paths:
        - backend:
            serviceName: frontend
            servicePort: 80
          path: /web/*

Head over to our repository https://github.com/kapicorp/kapitan-reference for a more complete example.
Solving Example 1: Adding a new component to your infrastructure
With kapitan, adding a new component consists of creating a new class to define that component: inventory/classes/components/report.yml

parameters:
  components:
    # define new users
    users:
      report:
        username: report
        password: ?{gkms:${target_name}/report||randomstr|base64}

    # definition of the component itself
    report:
      image: company/report:${release}
      env:
        MYSQL_USERNAME: ${users:report:username}
        MYSQL_PASSWORD:
          secretKeyRef:
            key: mysql_password
      # Create a new Secret resource
      secrets:
        secret:
          data:
            mysql_password:
              value: ${users:report:password}
      # Create new network policy
      network_policies:
        default:
          ingress:
            - from:
                - podSelector:
                    matchLabels:
                      role: frontend
              ports:
                - protocol: TCP
                  port: 80
      # <add more report component configurations here>

    frontend:
      env:
        # Add a new env variable to the frontend component
        REPORT_SERVICE: http://report:80

  # Add a new ingress
  ingresses:
    global:
      paths:
        - backend:
            serviceName: report
            servicePort: 80
          path: /report/*

  # Add a new annotation to all components in this target
  generator:
    manifest:
      default_config:
        annotations:
          company.com/report: active

Notice how this one file captures everything you need to do to configure the new report service in your setup:
When you add the component.report class to a target file (or to another class file, i.e. application.website), Kapitan will take care of configuring everything that is needed for the component to work in one go.
As you might have guessed, when you remove it, all that extra configuration goes away.
Secrets are automatically generated for you (and, in the example, encrypted using Google KMS).
If you have a Kapitan integration with your CD software (i.e.
the yet-to-be-released Spinnaker integration), your CD pipelines will also be modified to include the new component.

Solving Example 2: Enabling a feature flag

The solution here is pretty similar in principle to Example 1: you create a class and you add it to your target. Let’s name that class features/holiday_report.yml:

parameters:
  frontend:
    env:
      FLAG_HOLIDAY_SALES_REPORTING: 'true'
  backend:
    args:
      - --enable-json-output
  # Add a new path to the ingress
  ingresses:
    global:
      paths:
        - backend:
            serviceName: report
            servicePort: 80
          path: /holiday_report/*

Now, when you want to enable or disable this feature, all you need to do is add or remove the class from the target file:

classes:
  - common
  - profile.production
  - location.europe
  - release.production
  - component.frontend
  - component.backend
  - component.report
  - component.mysql
  - features.holiday_report

parameters:
  description: "Production environment"

Notice how it becomes much cleaner and easier to understand which features are enabled where. If you have docs that are automatically generated by Kapitan, you could add the information related to the holiday_report feature only to the affected targets.

Final words

I hope you have enjoyed this article and now understand what Kapitan can offer. If you want to learn more, please check out our blog and our website https://kapitan.dev
https://medium.com/kapitan-blog/why-your-kubernetes-configuration-strategy-is-broken-c54ff3fdf9c3
['Alessandro De Maria']
2020-12-28 06:59:06.843000+00:00
['Kustomize', 'Helm', 'Microservices', 'Kubernetes', 'DevOps']
2,470
Personalised Christmas Cards With Unique Promo Codes
Every year, with the arrival of autumn, retailers feel the gust of approaching Christmas fever. It’s the last call to plan a winning strategy. According to research data, people are willing to spend more than ever on Christmas gifts. Needless to say, it’s hard work for brands chasing consumers with special offers at this busy time. In this post, we’re going to show you how these holiday promotions can be neatly wrapped into personalized Merry Christmas cards. But first, why are the cards so important to your strategy?

What’s under the tree

CreditDonkey asked Americans what gifts they’re about to buy this Christmas. For every ten consumers, seven plan to buy a gift card. It may seem like an easy way out of time-consuming searches for matching gifts; however, there’s undoubtedly more under the hood. Data shows us that it’s not only a convenient choice for buyers but also a present desired by a great majority of receivers: 82% of survey subjects claim that gift cards are what they’d like to find beneath their Christmas tree.

Gift cards aren’t the only way you can use a promotion toolkit to drive sales; what’s even more important is the spending mood that consumers catch and keep with the holiday spirit. Around Christmas time, customers are more willing to spend their money, not only on gifts but also on their own needs and habits. This means they’re more susceptible to incentives which can be pushed out in front of them in special Christmas offers.

What if you could wrap all the incentives for your employees, business partners or end customers into personalized Christmas cards? Or use gift cards to endow them with incentives not only as receivers but also as buyers? Let’s look at three examples of personalized “Merry Christmas” cards in an email message, made using Voucherify. Each of them includes a unique, trackable incentive like a discount coupon, a gift card with predefined credits or a referral code. Unique codes are the key to seeing how your Christmas promotions perform and to turning insights into action for future events. What’s most important here is that we’ve designed the cards in such a way as to allow for endowing a receiver (our customer), giving us a chance for immediate new acquisitions. These messages make universal Christmas cards a powerful weapon for sales.

Voucherify is a promotion management system that develops, manages, and distributes coupon, referral, and loyalty solutions for businesses of every shape and size, worldwide. If you’re interested in having a consultative talk to help you decide how you should implement your promotions, let us know at sales@voucherify.io — we’re always happy to help!

Christmas cards with individual customer codes for tracking

Example 1: A unique gift card for your employees and a discount for their friends. The codes of both the unique gift card and the discount for friends can be copied or scanned (QR code) from an email. Coupon history is stored in the Voucherify dashboard: you can see the order details and customer data attached to every redemption that has ever occurred.

Example 2: A gift card for your partners and a referral code for referred companies. This is how B2B companies can implement our strategy.

Example 3: Exclusive discounts for your loyal customers and their friends to use in brick-and-mortar stores.

Summary

Every holiday opens up new opportunities for your growth. Besides well-targeted offers, keep in mind how you’re going to track your performance.
The extreme holiday traffic is not only about the sales themselves but also about learning about your audience. The more data you gather, the more insights you will have for the future.
https://medium.com/voucherify/personalised-christmas-cards-with-unique-promo-codes-c74a5a162238
['Jagoda Dworniczak']
2018-10-08 12:16:38.409000+00:00
['Christmas', 'Sales', 'Marketing', 'Ecommerce', 'Startup']
2,471
Rechunker: The missing link for chunked array analytics
by Ryan Abernathey and Tom Augspurger

TLDR: this post describes a new python library called rechunker, which performs efficient on-disk rechunking of chunked array storage formats. Rechunker allows you to write code like this:

from rechunker import rechunk

target_chunks = (100, 10, 1)
max_mem = "2GB"
plan = rechunk(source_array, target_chunks, max_mem,
               "target_store.zarr", "temp_store.zarr")
plan.execute()

…and have the operation parallelized over any number of Dask workers.

Motivation

Chunked arrays are a key part of the modern scientific software stack in fields such as geospatial analytics and bioinformatics. Chunked arrays take a large multidimensional array dataset, such as an image captured over many timesteps, and split it up into many “chunks” — smaller arrays which can comfortably fit in memory. These chunks can form the basis of parallel algorithms that can make data science workflows go a lot faster.

Example of a chunked array, as represented by Dask.

Chunked arrays are implemented both in parallel computing frameworks — such as Dask and NumpyWren — and in on-disk storage formats. Some storage formats that support chunked arrays include HDF5, TileDB, Zarr, and Cloud Optimized GeoTIFF. When these chunked array storage formats are paired with the above computing frameworks, excellent scaling performance can be achieved.

However, chunked array workflows can fail hard when the chunks are not aligned with the desired analysis method. A great example can be found in this post from a user on the Pangeo forum: geospatial satellite data is often produced as a global map once per day, creating a natural chunk structure (e.g. one file per day). But what happens if you want to do a timeseries analysis at each point in space? This analysis can’t be parallelized over chunks. Many array-based workflows get stuck on similar problems.

One existing solution is to use Dask’s rechunk function to create a new chunk structure lazily, on the fly, in memory. This works great for some problems. For others, particularly those involving a full rechunk (every source chunk goes into every target chunk), Dask’s algorithm can run out of memory or produce an unmanageably large number of tasks. (More details can be found in the post linked above.) To address this problem, we created a new package that aims to solve this specific problem in an optimal way: rechunker.

The Rechunker Algorithm

Rechunker takes an input chunked array (or group of arrays) stored in a persistent storage device (such as a filesystem or a cloud storage bucket) and writes out an array (or group of arrays) with the same data, but a different chunking scheme, to a new location. Along the way, it may create a temporary, intermediate copy of the array in persistent storage. The reliance on persistent storage is a key difference between Rechunker and Dask’s rechunk function.

Figuring out the most efficient way to do this was a fun computer science problem to solve. Via our Discourse forum, many people contributed to the discussion and shared different ideas they had implemented in the past. We identified a couple of key requirements for Rechunker’s algorithm:

Respect memory limits. Rechunker’s algorithm guarantees that worker processes will not exceed a user-specified memory threshold.

Minimize the number of required tasks. Specifically, for N source chunks and M target chunks, the number of tasks is always less than N + M.
Be embarrassingly parallel. The task graph should be as simple as possible, to make it easy to execute using different task scheduling frameworks. This also means avoiding write locks, which are complex to manage, and inter-worker communication.

These considerations led to the creation of an algorithm we call Push-Pull-Consolidated.
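To make the opening example concrete, here is a minimal usage sketch. It is a hypothetical illustration rather than code from the rechunker docs: the array shape, chunk sizes, and store names are made up, and it assumes the zarr and rechunker packages are installed.

import zarr
from rechunker import rechunk

# A year of daily global grids, stored one timestep per chunk
# (the layout that makes per-pixel timeseries analysis painful).
source = zarr.open(
    "source.zarr", mode="w",
    shape=(365, 720, 1440), chunks=(1, 720, 1440), dtype="f4",
)

# Rechunk to be contiguous in time and small in space,
# keeping each worker under the 2GB memory limit.
plan = rechunk(source, (365, 10, 10), "2GB",
               "target_store.zarr", temp_store="temp_store.zarr")
plan.execute()  # runs the plan (with Dask, by default)

Executing the plan writes the intermediate copy to temp_store.zarr and the final rechunked array to target_store.zarr, matching the persistent-storage approach described above.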
https://medium.com/pangeo/rechunker-the-missing-link-for-chunked-array-analytics-5b2359e9dc11
['Ryan Abernathey']
2020-07-21 12:01:02.033000+00:00
['Python', 'Data Science', 'Distributed Systems', 'Geospatial', 'Big Data']
2,472
The stars above inspire thoughts of perfection.
The stars above inspire thoughts of perfection. We look up to see constancy, harmony, eternity and above all serenity. Just remember that mud and stardust are ultimately the same thing.

From Shakespeare’s Merchant of Venice:

“Sit, Jessica. Look, how the floor of heaven
Is thick inlaid with patines of bright gold:
There’s not the smallest orb which thou behold’st
But in his motion like an angel sings,
Still quiring to the young-eyed cherubins;
Such harmony is in immortal souls;
But whilst this muddy vesture of decay
Doth grossly close it in, we cannot hear it.”

From Marcus Aurelius’s Meditations:
https://stevengambardella.medium.com/the-stars-above-inspire-thoughts-of-perfection-9cb72a9a51b8
['Steven Gambardella']
2020-12-04 18:48:58.336000+00:00
['Self', 'Philosophy', 'Books', 'Psychology', 'Culture']
2,473
Tools to build a prototype web app in one month without writing code
Tools to build a prototype web app in one month without writing code

Prototyping stack (for non-developers)

For non-technical people who have an idea for a digital product, I’ve noticed two reasons that keep them from pursuing it: 1) anxiety towards sacrificing some aspect of their life to make the time, and 2) convincing themselves that they need a technical cofounder and/or funding to build something tangible. This essay aims to dismiss reason #2 by clarifying some free (or price-of-several-coffees) tools I’ve used to build a functional web application version-1; my prototyping stack. One key advantage of building a prototype is that it drastically improves how seriously you’re taken by peers in co-working spaces, potential cofounders, prospective mentors/angel investors, and target users.

m = more information available, v = vlad’s (my own) personal experiences

Organize

Although I was skeptical of how a “visual idea board” would be different than just writing ideas into a notebook, I was surprised to find that it had a side-effect of helping jumpstart momentum for doing work [v.1]. The first idea board was used purely for brainstorming along several categories which I guessed at being necessary for the idea to become tangible — the foundational product idea was found in the resulting conversations [v.2]. This is also a great time to apply the Jobs To Be Done framework as a way to understand how someone’s life can be improved; as a consequence, this frame of thinking helps surface ideas for product features that would impact those improvements [m.1]. The point here isn’t to get stuck in brainstorming-paralysis, but rather to settle on a hypothesis for one “job to be done”, and then continue through the rest of this prototyping stack.

First idea board (using the free tool Trello) was decommissioned after figuring out what to build

Design

The amount of incredible (and freely accessible) designs out there makes the motto “you don’t have to re-invent the wheel” valid. Since you’re still in the brainstorming phase — and probably don’t have a background in human factors — it’s a good idea to borrow inspiration from best-practice user interfaces [v.3]. At this point, I was primarily spending time bookmarking designs which looked like they could be used to fulfill my “jobs to be done” hypothesis.

Dribbble showcases professional-level design choices for any kind of user interface

Mock-Up (& Iterate)

Have you ever played around with Microsoft Paint? Well, there are tools which are just as simple to use, except they let you create a “fake app” that you can click around to navigate to various screens. At this point, I would take the inspirations from the “Design” phase and build them in this mock-up tool [v.4]. These tools also export nicely into a mobile app, so this is a great time to hand your phone to prospective users and silently observe how they use your interface to gather feedback for design changes. The more times you can do this — build mockup, demo to target user, observe their hesitations with usability, re-build mockup — the more professional your user experience will feel.

The non-free (low cost) tool Proto lets you create multiple screens to experiment with user-interfaces

Functional App

What if you could use your Microsoft Paint skills to clone an app like Uber or AirBnB over a weekend [m.2]? That’s the main appeal of a new class of app building tools, which feel like the next evolution of the tools used in the “mock-up” phase.
Although there is a steep learning curve [v.5], showing up to a cofounder speed-dating event or a prospective-investor coffee chat becomes a 10x better experience when you have the first version of a functional app.

The free tool Bubble.is is probably my favorite tool discovered in 2019, for building fully-functional web (and mobile) apps where writing code is not a necessity for producing the first-version prototype

Data (honorable mention)

Despite poor design choices, user experience, app load times, bugs, etc., one way to immediately stand out with a prototype is through the data that you are now able to collect — not just typical user info, but behavioral data [v.6] which helps describe a trend starting to happen in your market; this is a conversation every angel investor wants to have (and wants to be the first to have).
https://medium.com/swlh/tools-to-build-a-prototype-web-app-in-one-month-without-writing-code-dcda1afda5dd
['Vlad Shulman']
2019-06-08 16:01:42.533000+00:00
['Startup', 'Product Management', 'Design']
2,474
The Startup Failure Curve: 7 Important Stats to Know
Photo by Quino Al on Unsplash

Have you ever heard the statistic that 90 percent of businesses fail within the first year? Maybe you heard that it was in the first 5 years, or that it’s actually 80 percent of businesses, but chances are you heard a number like this at some point in your life, without much direct evidence to back it up. It’s certainly true that the majority of new businesses do fail — only a minority ever find success — but the stats aren’t nearly as dramatic as some would have you believe. Instead, failure tends to unfold over a curve, and understanding that curve could help keep your business from falling victim to the most common pitfalls.

The Startup Failure Curve

So what are the “real” statistics for business failure? It’s a complicated question, because definitions of “failure” might vary, and to be certain, there are many different types of businesses, each with different survival rates. Still, there are some critical facts we can use to better understand what the failure curve really looks like.

1. 66 percent of businesses with employees survive at least 2 years.

According to the most recent report from the SBA, with data from the Bureau of Labor Statistics, about two-thirds of all businesses with employees last at least two years. Those aren’t bad odds compared to the “90 percent” statistic that persists.

2. About half of businesses survive at least 5 years.

The same study found that this group of businesses tended to last at least 5 years at a rate of around 50 percent.

3. The economy does not directly affect the failure curve.

These data come from a span of more than a decade, stretching back into the 1990s. The curve was not significantly affected by times of economic prosperity or by recessions, making rates of success and failure even more consistent.

4. Failure rates are similar across industries.

Have you ever heard someone say that restaurants and bars are especially risky business investments, since they have a higher rate of failure than other businesses? The data suggest this isn’t true. The food service and hotel industry has a similar failure curve to the manufacturing, construction, and retail trade industries. The differences are negligible at nearly every point on the curve.

5. 25 percent of businesses fail in the first year.

As you might expect, the failure curve is steeper at the beginning, with 25 percent of small businesses failing within the first year, according to data compiled by Statistic Brain. This is likely due to the learning curve associated with business ownership; the longer you remain in business, the more you learn, and the more resilient you are to problems that could otherwise shake your foundation. It’s a period that naturally weeds out the weakest candidates as well.

6. Reasons for failure vary.

According to the same data, a whopping 46 percent of all company failures were attributable to “incompetence,” a blanket term that can refer to emotional pricing, failure to pay taxes, a lack of planning, no financing knowledge, and/or no experience in record keeping. Another 30 percent of company failures were attributable to unbalanced experience, or a lack of management experience.

7. 75 percent of venture capital-backed startups fail.

Of course, for VC-backed startups, the picture isn’t as pretty; according to one report, about 75 percent of all VC-backed startups ultimately fail.
This could be due to a number of reasons, including the highly competitive nature of VC competitions and the volatility of tech startups that emerge on the scene.

When Failure Is a Good Thing

If you’re reading these statistics, and you’re still worried about your business being classified as a “failure,” keep in mind that failure can actually be a good thing. For starters, many businesses that fail in the first year didn’t have the potential for long-term success; early failure actually spares them significant expenses, and frees up their entrepreneurs to pursue more valuable opportunities. On top of that, going through the process of starting a business and watching it fall apart can teach you valuable lessons, which you can apply to future opportunities; failed entrepreneurs who get back on the horse have a higher likelihood of success the second time around.

So what should you take away from all this? First, if you’ve thought about becoming an entrepreneur, but have been intimidated by the thought of becoming part of an overwhelming majority of failed entrepreneurs, reconsider your position; that majority isn’t nearly as strong as you might have previously believed. Every entrepreneur faces failure in some form, but it doesn’t always lead to the failure of the entire business. Second, if you can make it past that trying first year, you can probably keep your business successful for years to come. And finally, even if your business does fail, it isn’t the end of the world; you’ll have new knowledge and new experiences you can use to fuel your next venture.
https://jaysondemers.medium.com/the-startup-failure-curve-7-important-stats-to-know-f5a3fc617e43
['Jayson Demers']
2020-11-09 23:45:59.624000+00:00
['Entrepreneur', 'Startup Life', 'Startup', 'Failure', 'Entrepreneurship']
2,475
Google Kubernetes Engine (GKE) announcements from Cloud Next 2018
There was an almost overwhelming number of announcements at Cloud Next this year, so I want to focus on the technologies I care most about: Kubernetes and GKE!

GKE On-Prem

This is an important evolution of GKE for people who want the flexibility and power of Kubernetes in their own datacentre, but don’t want to invest in whole teams to manage the entire stack. Joe Beda discussed having multiple layers of Ops teams in his talk at KubeCon 2017. The GKE console will also provide unified management for your clusters across GCP and on-prem, super cool!

GKE On-Prem cluster (moscone)

Service Mesh - https://sebiwi.github.io/comics/service-mesh/

Service Mesh

Service mesh is thrown around in buzz-wordy evangelism these days, but as projects such as Istio mature, the benefits for security, observability, and traffic management are starting to make people take notice. Istio v1.0 was announced, showing the product has reached a point of API stabilisation that will lead to much greater adoption. A Managed Istio (alpha) product was also announced that will remove even more complexity for GKE users.

Cloud Services Platform family

GKE Serverless add-on

If you already use GKE and want to provide a Serverless platform to your developers, this add-on looks ideal. Google also provided a form for requesting early access. This could be useful if you want to develop on a Serverless stack that’s more portable than services like Cloud Functions or AWS Lambda. In the future, if many developers adopt a common Serverless framework (like Knative), your Serverless components could be less coupled to a specific vendor.

Knative

This one is more for the Serverless platform developers out there. Knative is a suite of building blocks for creating modern, container-based, Serverless applications. Google teamed up with Pivotal, IBM, RedHat and SAP to develop this open source framework, which was then used to build the GKE Serverless add-on. Knative helps with three main use cases for Serverless:

Serving: deploying and serving Serverless applications and functions.
Build: on-cluster container builds.
Eventing: a loosely coupled eventing system compatible with CloudEvents.

Expect more of your favourite Serverless platforms and projects in the ecosystem to announce support for running on top of Knative/Kubernetes in the future, if they haven’t already.
https://medium.com/weareservian/google-kubernetes-engine-gke-announcements-from-cloud-next-2018-7a9409872643
['Dylan Graham']
2019-07-08 04:20:20.148000+00:00
['Gke', 'Google Cloud Platform', 'Serverless', 'Cloud Computing', 'Kubernetes']
2,476
Design Patterns Saga: The Beginning
Factory Design Pattern

What are your associations with the word factory? Someplace where workers manufacture goods. This is exactly that. The Factory Pattern is a creational pattern whose purpose is to create objects, just like a factory in the real world. In this pattern, object creation happens in factories, without exposing the creation logic to the client.

Imagine that your software implements a sushi bar and you want to create sushi to serve at your bar. There are many different types of sushi, but let’s start with the California and Dragon rolls that were mentioned in the polymorphism example. A lot of the sushi rolls we’ve become familiar with are a Western take on Japanese Maki sushi. Therefore, to implement the sushi bar software, you need a Maki interface and concrete California and Dragon classes implementing Maki. Let’s say that all you need to create a maki roll is to add fish, fillings, and toppings. Can you smell polymorphism? Kidding, this is an overriding example, as we mentioned earlier.

Now, you need a createRoll(RollType rollType) method. The method will first declare the maki variable, which will refer to the maki object to be created. Think of it as the maki base, which is traditionally made with a sheet of nori wrapped around a layer of rice. The California Roll, for example, is an inside-out sushi roll with a layer of rice on the outside and a sheet of nori on the inside. The roll type parameter will determine which maki roll is actually instantiated. When you have a base for your roll, you can call methods to add fish, fillings, and toppings. These methods do not care what type of roll is created. All they want is a maki base to operate on. The method looks like this (see the sketch at the end of this post).

Your brilliant chef adds more and more roll types to the menu: Spicy Tuna Roll, Spider Roll. In this example, the list of conditionals grows and grows as new roll types are added. Notice that what we do with the roll after creating it doesn’t change. Each roll needs to contain fish, fillings, and toppings. This is all getting pretty complicated.

To decouple maki instantiation from the client, you can delegate the responsibility of maki base creation, based on the provided roll type, to a Sushi Factory. In general, a factory object is an instance of a class which has a method to create product objects (maki bases, in our case). Now, this Sushi Factory can be used by the Sushi Bar service. In other words, the sushi bar is now a client of the sushi factory. Here is a UML diagram of the Sushi Bar service you just implemented.

Let’s look at what you have gained here. The Sushi Bar service and its createRoll(RollType rollType) method may not be the only client of the Sushi Factory. Other clients, such as Sushi Delivery and Sushi Takeaway, may use Sushi Factories to create maki as well. Since all of the actual maki creation happens in the Sushi Factory, you can simply add new roll types to your factory, or change the way the rolls are instantiated, without modifying the client’s code.

That’s all! Now you know your onions 🤓. And I know that I feel like ordering sushi delivery. This post made me hungry. Bon appetit!
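To make the pattern concrete, here is a minimal sketch of the sushi factory described above. The post’s original snippets are Java-flavoured; this rendering uses Python, and the names (SushiFactory, create_maki_base, and the concrete fillings and toppings) are illustrative assumptions rather than the author’s exact code.

from abc import ABC, abstractmethod
from enum import Enum, auto

class RollType(Enum):
    CALIFORNIA = auto()
    DRAGON = auto()

class Maki(ABC):
    # The maki base: every concrete roll knows how to build itself up.
    @abstractmethod
    def add_fish(self): ...
    @abstractmethod
    def add_fillings(self): ...
    @abstractmethod
    def add_toppings(self): ...

class California(Maki):
    def add_fish(self): print("imitation crab")
    def add_fillings(self): print("avocado, cucumber")
    def add_toppings(self): print("sesame seeds")

class Dragon(Maki):
    def add_fish(self): print("eel")
    def add_fillings(self): print("cucumber")
    def add_toppings(self): print("avocado slices")

class SushiFactory:
    # All instantiation logic lives here, hidden from clients.
    def create_maki_base(self, roll_type: RollType) -> Maki:
        if roll_type is RollType.CALIFORNIA:
            return California()
        if roll_type is RollType.DRAGON:
            return Dragon()
        raise ValueError(f"Unknown roll type: {roll_type}")

class SushiBar:
    # The client: it only needs *some* maki base to operate on.
    def __init__(self, factory: SushiFactory):
        self.factory = factory

    def create_roll(self, roll_type: RollType) -> Maki:
        maki = self.factory.create_maki_base(roll_type)
        maki.add_fish()
        maki.add_fillings()
        maki.add_toppings()
        return maki

roll = SushiBar(SushiFactory()).create_roll(RollType.CALIFORNIA)

Adding a Spicy Tuna Roll now only touches the factory: clients like SushiBar (or SushiDelivery and SushiTakeaway) keep calling create_roll unchanged.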
https://medium.com/swlh/design-patterns-saga-the-beginning-17ea936472cc
['Gene Zeiniss']
2020-07-05 05:05:33.133000+00:00
['Backend', 'Design Patterns', 'Factory Pattern', 'Polymorphism', 'Java']
2,477
I Knew Happiness Once
I Knew Happiness Once

Sky Collection quote prompt №24

Photo by Denise Jones on Unsplash

Looking back to where I have been,
Trying to figure out what went wrong,
Where did you go, and why do you elude me?
I knew happiness once.

I need closure from our relationship,
But you refuse to go there,
Does growing apart just happen?
I knew happiness once.

That job of my dreams -
It was like living a nightmare,
When did I become a visitor and not family?
I knew happiness once.

The friends who disappeared,
Without even a goodbye,
Why did they not miss me as I missed them?
I knew happiness once.

I clung to you madly,
And yet you still left me,
Alone, afraid, and paralyzed.
I knew happiness once,

I spent hours analyzing the past,
Revisiting the times you filled my heart,
But you had left all of my memories.
I knew happiness once.

But I have left it behind.
Fickle and fleeting is no longer for me,
I built a strong foundation instead,
Ready to face any disaster.

I know happiness now,
It grows inside of me
And depends on no one else.
https://medium.com/sky-collection/i-knew-happiness-once-35912099f691
['Kim Mckinney']
2020-12-11 20:46:22.589000+00:00
['Self-awareness', 'Mental Health', 'Happiness', 'Relationships', 'Poetry']
2,478
Collections in Python
Array

NumPy is a popular library for working with scientific and engineering data. Here, we highlight the array manipulation capabilities offered by NumPy. A NumPy array is an N-dimensional grid of homogeneous values. It can be used to store a single value (scalar), the coordinates of a point in N-dimensional space (vector), a 2D matrix containing the linear transformations of a vector (matrix), or even N-dimensional matrices (not tensors, though).

>>> import numpy as np
>>> a_vector = np.array([1, 2, 3])
>>> print('vector shape:', a_vector.shape)
vector shape: (3,)
>>> a_matrix = np.array([[1, 2, 3], [4, 5, 6], [7, 8, 9]])
>>> print('matrix shape:', a_matrix.shape)
matrix shape: (3, 3)

Now, let us look at some of the frequently used array operations.

Query/Filter/Mask

>>> numbers = np.array([1, 2, 3, 4, 5, 6, 7, 8, 9, 10])

# mask (note: & binds tighter than == in Python, so this is (numbers & 1) == 0)
>>> mask = numbers & 1 == 0
>>> mask
array([False,  True, False,  True, False,  True, False,  True, False,  True])

# filter out the odds
>>> numbers[mask]
array([ 2,  4,  6,  8, 10])

# zero out the odds and retain the shape
>>> numbers * mask
array([ 0,  2,  0,  4,  0,  6,  0,  8,  0, 10])

Reshape

Reshaping simply rearranges the existing items in an array into a new shape.

# Reshape a row vector to a column vector
>>> row = np.array([1, 2, 3])
>>> np.reshape(row, (3, 1))
array([[1],
       [2],
       [3]])
>>> np.reshape(row, (-1, 1))
array([[1],
       [2],
       [3]])

Transform

All the power of NumPy comes from its ability to efficiently transform large arrays of data for scientific and engineering computations. This is really a vast topic and we will only touch upon a few key transformations here.

# Vector
>>> row = np.array([1, 2, 3])

# Scale
>>> row * 2
array([2, 4, 6])

# Shift
>>> row + np.array([5, 5, 5])
array([6, 7, 8])

# Rotate by 90 degrees (counterclockwise) around the z-axis
>>> row = np.array([1, 2, 3])
>>> rotation = np.array([[0, -1, 0], [1, 0, 0], [0, 0, 1]])
>>> np.dot(rotation, row)
array([-2,  1,  3])

# Transpose
>>> rows = np.array([[1, 2, 3], [2, 3, 4]])
>>> rows.T
array([[1, 2],
       [2, 3],
       [3, 4]])

Sort

Sorting is a bit tricky: np.sort does not behave the way Python’s list sort does when you want to order rows.

# Sorting vectors on the x-coordinate
>>> rows = np.array([[2, 1, 3], [1, 2, 3]])

# naive sort
>>> np.sort(rows, axis=0)
array([[1, 1, 3],
       [2, 2, 3]])
# output does not contain the same vectors at all!

# Correct method:
# Obtain the sorted indices for the first column (x)
# and then use those indices to sort all the columns
>>> ind = np.argsort(rows[:, 0], axis=0).reshape(-1, 1)
>>> ind = np.repeat(ind, rows.shape[-1], axis=-1)
>>> ind
array([[1, 1, 1],
       [0, 0, 0]])
>>> np.take_along_axis(rows, ind, axis=0)
array([[1, 2, 3],
       [2, 1, 3]])
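A side note on the sort example, not from the original post: because sorting rows by one column just permutes whole rows, the same result can also be obtained more directly by fancy-indexing with argsort.

>>> rows = np.array([[2, 1, 3], [1, 2, 3]])
>>> rows[np.argsort(rows[:, 0])]
array([[1, 2, 3],
       [2, 1, 3]])

This avoids building the repeated index array, though take_along_axis generalizes better when each column must be reordered independently.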
https://medium.com/swlh/collections-in-python-d8954b006bb7
['Rajaram Gurumurthi']
2020-10-26 22:12:38.511000+00:00
['Machine Learning', 'Python', 'Data Science', 'Programming', 'Java']
2,479
How I Created a Course on Lane Detection and Lane Keeping
A while ago I was searching the web because I wanted to learn how lane-keeping systems work. I knew that these systems use a camera to detect the lane boundaries and then some control algorithms to keep the vehicle centered within the lane. But I wanted to understand this in more detail, and ideally implement a simple version of lane detection and lane-keeping myself.

I love Massive Open Online Courses on platforms like Coursera and Udacity, so naturally, I started looking there first. Udacity offers the famous “Self-Driving Cars Nanodegree”, but I didn’t want to spend thousands of euros. On Coursera, you can find the “Self-Driving Cars Specialization” by the University of Toronto, and since you can audit it for free, I tried it out. I learned how a vehicle can follow a path using a method called Pure Pursuit. And even better, I saw that the course offered an exercise where you would implement Pure Pursuit and try it out in the Carla Simulator. Neat! However, the course did not cover lane detection.

So I continued searching for lane detection tutorials. I found a lot of stuff online that explained how to detect which pixels in the image are the lane markers. But I also wanted to know how to go from detecting lane marker pixels to generating a path on the road that a Pure Pursuit controller can follow. How do you go from pixels to (x, y) coordinates in meters? I did not find any easy-to-understand tutorials for that. Eventually, I found some older papers mentioning “Inverse Perspective Mapping”, which was kind of what I was looking for. But the mathematical formulas seemed complicated and I found no derivations. I understood that the idea is to use the assumption that the road is flat to invert the projection equation of the pinhole camera model. In the end, I derived the equations I needed myself. Naturally, I wanted to implement and test them. What I wanted to do consisted of three steps:

South Park reference. Image from Wikipedia.

1. Implement a method to detect lane boundary pixels
2. Convert the lane boundary pixels to a list of road coordinates (x, y), measured in meters, and fit a polynomial y(x) for the left and the right lane boundary
3. Feed the polynomials into Pure Pursuit to control a vehicle

Since I had learned about the Carla Simulator on Coursera, I decided to apply this pipeline there. I had found the equations for step 2 and had watched the Coursera videos on Pure Pursuit, so I knew how to implement step 3. I only needed to pick a method for step 1.

Lane detection is a computer vision/perception problem, and you probably heard that deep learning methods are dominating that field. You might also know that it is common for deep learning researchers and practitioners to publish data sets as “challenges” and to compare the performance of their neural nets on leaderboards. For lane detection, one important data set is the TuSimple Lane Detection Challenge, and I scanned some research papers that focussed on this. One paper stood out to me, because of its elegant approach and excellent exposition: End-to-end Lane Detection through Differentiable Least-Squares Fitting. In their paper, they also described a baseline model, which is extremely simple but still performs quite well. Good enough for me!

So finally I had found the resources to implement my own lane-detection and lane-keeping system. Since it had taken me so long to gather all this information, I decided to create my own online course: “Algorithms for Automated Driving”.
This course should guide readers through implementing lane detection and lane keeping for themselves. I published the course in the form of an online book today, and here you can see a screenshot of the landing page (you can also just visit the course by clicking here).
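To make the flat-road idea a bit more concrete, here is a minimal Python sketch of step 2, assuming a pinhole camera with known intrinsics, height, and pitch. None of this is code from the course: the matrix K, the pose values, and the dummy pixel detections below are made up for illustration, and the course derives the geometry properly.

import numpy as np

# Hypothetical camera parameters (not from the course).
K = np.array([[1000.0, 0.0, 640.0],
              [0.0, 1000.0, 360.0],
              [0.0, 0.0, 1.0]])  # pinhole intrinsics
h = 1.3       # camera height above the road in meters (assumed)
pitch = 0.05  # downward pitch in radians (assumed)

def pixel_to_road(u, v):
    """Project pixel (u, v) onto the road plane, assuming the road is flat."""
    # Back-project the pixel to a viewing ray in the camera frame
    # (x right, y down, z forward).
    ray = np.linalg.inv(K) @ np.array([u, v, 1.0])
    # Rotate the ray into a road-aligned frame so the road plane is horizontal.
    c, s = np.cos(pitch), np.sin(pitch)
    ray = np.array([[1, 0, 0], [0, c, -s], [0, s, c]]) @ ray
    # The ray hits the road plane h meters below the camera.
    point = (h / ray[1]) * ray
    return point[2], point[0]  # (forward, lateral) distance in meters

# Turn detected boundary pixels into road coordinates and fit y(x) (step 2).
boundary_pixels = [(600, 500), (620, 450), (635, 420)]  # dummy detections
xs, ys = zip(*(pixel_to_road(u, v) for u, v in boundary_pixels))
left_boundary_poly = np.polyfit(xs, ys, deg=2)

The polynomial coefficients are exactly the kind of compact lane description that a Pure Pursuit controller can consume as a reference path in step 3.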
https://medium.com/swlh/how-i-created-a-course-on-lane-detection-and-lane-keeping-a78598914cfa
['Mario Theers']
2020-11-26 14:05:31.325000+00:00
['Python', 'Self Driving Cars', 'Education', 'Jupyter Notebook', 'Deep Learning']
2,480
5 Reasons to Create Your Own Medium Publication (And 3 Reasons You Shouldn’t)
1. Control The Visibility and Distribution of Your Own Content For new writers on Medium, there is a bit of a paradox when it comes to writing for publications. When you don’t have a large following and haven’t written for major Medium publications, it can be very difficult to get approved as a writer for any of them. Conversely, once you are published in a few major publications, seemingly everyone wants your work. For writers who want to control the circulation of their own content, creating a Medium publication can be a great option. I’m a big believer in the idea that you should never let others stop you from pursuing your goals, so I started several Medium publications to better showcase my articles. While this doesn’t magically grant you followers, it does allow you to pick and choose which stories you would like to feature in your publication. Even if your article is selected for a major publication, it will most likely be pushed off the publication’s “featured article” section fairly quickly. 2. Gain Access to More Detailed Analytics A second benefit of creating your own publication is the increased access to data analytics pertaining to your articles. Medium only gives writers a relatively small amount of data on their articles (number of views, reads, claps, fans, and some traffic sources), so any increased insight into your content’s data analytics is extremely valuable. Below are screenshots of the enhanced “views” and “visitors” data that Medium publication owners have: Views: The total number of views your publication has received on all posts and pages. Medium Publication Views Visitors: The average number of unique daily visitors who have visited your publication. Each visitor is counted once per day, even if they view multiple pages or the same page multiple times. Medium Publication Visitors 3. Utilize Features Only Available in Medium Publications When you create your own publication, there are several useful features that you gain access to. The two features I find the most useful are the “homepage promos” tabs and the “letters” function. Homepage promotions enable you to add custom blocks to your publication that link your readers to a post, a feature page, or even an external link (outside of Medium). Below is an example from one of my publications, Black Edge Consulting:
https://medium.com/blogging-guide/5-reasons-to-create-your-own-medium-publication-and-3-reasons-you-shouldnt-8dddf72b5247
['Casey Botticello']
2020-07-10 03:00:46.880000+00:00
['Social Media', 'Medium', 'Journalism', 'Ideas', 'Writing']
2,481
Best Resources for Deep Learning
Best Resources for Deep Learning Deep Learning Educational Resources Deep learning is a machine learning method that uses neural networks for prediction tasks. Deep learning methods can be used for a variety of tasks including object detection, synthetic data generation, user recommendation, and much more. In this post, I will walk through some of the best resources for getting started with deep learning. Let’s get started! Online Resources There are several online resources that are great for getting started with deep learning. Sentdex Sentdex is a YouTube channel, run by Harrison Kinsley, that has several tutorials on how to implement machine learning algorithms in Python. While the channel contains many great tutorials on other machine learning algorithms like support vector machines, linear regression, tree-based models, and k-nearest neighbors, the tutorials on deep learning are a great place to start if you want to get your hands dirty with deep learning. The playlist Machine Learning with Python has a great 14-part series on learning to implement various neural networks such as simple multi-layer dense networks, recurrent neural networks, long short-term memory networks (LSTMs), and convolutional neural networks. The series also goes over TensorFlow basics, preprocessing, training & testing, and installing the GPU version of TensorFlow. The channel also has a playlist called Neural Networks from Scratch, which has tutorials on how to build neural networks starting with their fundamental components. This is a great place to learn how neural networks work under the hood. DataCamp DataCamp is a subscription-based platform that is great for those starting out in data science and machine learning. It has many great courses for learning how to implement neural networks. Specifically, I recommend the Introduction to Deep Learning course. This course gives you hands-on, practical knowledge of how to use deep learning with Keras through DataCamp’s interactive learning platform. This means that in between videos, you apply what you’ve learned by writing and running real code. It goes over basic concepts such as forward propagation, activation functions, neural network layers, and learned representations. It also goes into detail on neural network optimization with backpropagation, applying neural networks to regression and classification, and how to further fine-tune neural network models. After learning the basics, I recommend the course Advanced Deep Learning with Keras. This goes into detail discussing the Keras API and how to build neural networks using its functional building blocks. It also goes into some advanced concepts around categorical embeddings, shared layers, and merged layers in neural networks. Andrew Ng’s Deep Learning Lectures The resources I listed above heavily focus on implementation and practical application. For a more theoretical treatment of neural networks, I recommend Andrew Ng’s lectures on deep learning. The lectures cover many of the fundamentals, including gradient descent and the calculus behind it, vectorization, activation functions, backpropagation, model parameters and hyperparameters, and much more. I highly recommend these lectures if you’re interested in the math & theory behind neural networks. Books Hands-On Machine Learning with Scikit-Learn & TensorFlow, by Aurélien Géron If you learn more effectively using books, this book is a great place to start learning how to implement neural networks.
This book covers many machine learning topics, including the fundamentals of neural networks and how to build simple multi-layer dense neural networks, convolutional neural networks, and recurrent neural networks. Deep Learning, by Ian Goodfellow This book covers much of the theory and math behind a variety of neural network architectures. The book covers the prerequisite math concepts behind neural networks, the math behind many modern neural networks, and even outlines the work being done in deep learning research. Conclusions In this post, we discussed several resources that are useful for getting started with deep learning. First, we discussed the Sentdex YouTube channel, which covers many practical examples of how to build neural networks in Python for classification and regression tasks. This is a great place to start if the theory and math of neural networks intimidate you but you’d still like to get started building neural network models. We also went over DataCamp, which provides a great interactive learning platform where you solve coding exercises in between videos. Once you’re comfortable with implementing the code for deep learning algorithms, Andrew Ng’s course is great for deepening your knowledge of the theory and math behind deep learning. If you’re better suited to learning from books, Hands-On Machine Learning contains many great chapters discussing how to implement neural networks in Python. If you’re interested in learning the theory from a book, Ian Goodfellow’s Deep Learning is a great resource. I hope you found this post useful/interesting. Thank you for reading!
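To illustrate the functional building blocks mentioned in the Advanced Deep Learning with Keras description above, here is a minimal sketch; the layer sizes and the regression setup are my own illustrative choices, not taken from the course.

from tensorflow import keras

# Functional API: each layer is called on the output of the previous one,
# so the model is defined as a graph from inputs to outputs.
inputs = keras.Input(shape=(10,))
hidden = keras.layers.Dense(32, activation="relu")(inputs)
outputs = keras.layers.Dense(1)(hidden)

model = keras.Model(inputs=inputs, outputs=outputs)
model.compile(optimizer="adam", loss="mse")
model.summary()  # prints the layer graph and parameter counts

The same call-a-layer-on-a-tensor pattern is what makes the shared and merged layers discussed in the course possible: you can reuse one layer object on two different inputs, or join branches with keras.layers.concatenate.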
https://towardsdatascience.com/best-resources-for-deep-learning-f4c774356734
['Sadrach Pierre']
2020-09-07 02:50:09.934000+00:00
['Data Science', 'Python', 'Artificial Intelligence', 'Education', 'Deep Learning']
2,482
Use Python to Upload Your First Dataset on Kaggle— Taiwan Housing Project (1/2)
Step 1: Collect Data from the Open Data Platform

On the website of the Ministry of the Interior, we can download the transaction records of real estate based on region and timeframe.

Step 2: Observe What We Collected

What can we see from the dataset below?

- The first row of data is actually the English translation of the column names
- Some missing values (NaN)
- In the column transaction year month and day (交易年月日), the year information follows the civil calendar rather than the Gregorian calendar
- Text contents are in Chinese

Original Dataset

Step 3: Preprocess Data

According to our observations, we will preprocess the dataset step by step (the snippets assume pandas has been imported as pd and the raw file loaded into df).

Drop the First Row

As mentioned, the first row is the record of column names in English, so drop it. Note that drop is not in place by default, so assign the result back:

df = df.drop(0, axis=0)

Rename Column Names

To make the dataset easier to use, we translate all the column names into English.

COL_NAME = ['district', 'transaction_type', 'address', 'land_shift_area', 'urban_land_use', 'non_urban_use', 'non_urban_use_code', 'transaction_date', 'transaction_number', 'shift_level', 'total_levels', 'building_state', 'main_use', 'main_building_material', 'complete_year', 'building_shift_total_area', 'num_room', 'num_hall', 'num_toilet', 'num_partition', 'management_org', 'total_ntd', 'unit_ntd', 'carpark_category', 'carpark_shift_area', 'carpark_ntd', 'note', 'serial_no']
df.columns = COL_NAME

Drop Useless Columns

Some columns are mostly missing (above 90% of the rows) or carry no useful information, so drop them.

DROPED_COLUMNS = ['non_urban_use', 'non_urban_use_code', 'note', 'serial_no']
df = df.drop(DROPED_COLUMNS, axis=1)

Transform Data Types

By checking df.info(), we can transform data types into reasonable ones.

df['land_shift_area'] = df['land_shift_area'].astype(float)

Deal with Missing Values

For each column, there are different ways to deal with missing values. Here let's look at the simplest example: fill missing values of total_levels with 0, because there is no building in a transaction of land or a parking lot. Other columns might require additional information, such as related columns, to infer the missing values. The details can be found in my GitLab.

df['total_levels'] = df['total_levels'].fillna(0)  # land and car park transactions have no shift levels

Generate Additional Features

The column transaction_number (e.g., “土地1建物0車位0”) tells us how many land lots, buildings, and car parks are included in a transaction. Using regular expressions (hence the import re below), we can extract this key information as new features.

import re
df['number_of_land'] = df['transaction_number'].apply(lambda x: int(re.findall('土地\d+', x)[0][2:]))
df['number_of_building'] = df['transaction_number'].apply(lambda x: int(re.findall('建物\d+', x)[0][2:]))
df['number_of_carpark'] = df['transaction_number'].apply(lambda x: int(re.findall('車位\d+', x)[0][2:]))

Another column, transaction_date, helps us extract the transaction year. After adding 1911 to the year of the civil (Minguo) calendar, we get standard Gregorian years.

df['transaction_year'] = df['transaction_date'].apply(lambda x: str(1911 + int(x[:-4])))  # year should be a categorical value

Do Translation

To make this dataset more widely usable, we translate the text content into English. Since the translation package has a limit on API calls, we do not translate the text row by row; instead, we translate the unique words in each column first and then map them back onto the original dataset.

# 0. fields to translate
COL_TO_TRANSLATE = ['transaction_type', 'urban_land_use', 'main_use', 'main_building_material', 'carpark_category']

# 1. find unique words and do translation
from translate import Translator
translator = Translator(from_lang="zh-TW", to_lang="english")
dic_translation = {}
for col in COL_TO_TRANSLATE:
    for word in pd.unique(df[col]).tolist():
        dic_translation[word] = translator.translate(word)

# 2. conduct replacement
for col in COL_TO_TRANSLATE:
    df[col] = df[col].map(dic_translation)

Next, I will show how I uploaded the preprocessed data to Kaggle and documented more information about it. Happy Journey on Data Science! : ) For all detailed content in this section, please check GitLab: https://gitlab.com/chrissmart/taiwan-housing-price-prediction/-/blob/master/src/data_collection.ipynb
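As a quick sanity check of the regular-expression extraction above, here is what it does on a single sample value. The string comes from the article; the standalone snippet is just an illustration.

import re

sample = '土地1建物0車位0'  # one land lot, zero buildings, zero car parks
# findall returns ['土地1']; drop the two-character prefix and cast to int
# (a raw string avoids escape-sequence warnings in newer Python versions)
int(re.findall(r'土地\d+', sample)[0][2:])  # -> 1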
https://medium.com/python-in-plain-english/use-python-to-upload-your-first-dataset-on-kaggle-taiwan-housing-project-1-2-41bf611a43c5
['Peiyuan Chien']
2020-06-21 20:48:37.459000+00:00
['Data Science', 'Python', 'Programming', 'Housing', 'Kaggle']
2,483
How to Break into Data Science
When your data needs to get dressed up, Tableau is a fool-proof style service. It offers a sleek, drag-and-drop interface for data analytics with native integration to pull data from CSVs, JSON files, Google Sheets, SQL databases, and that back corner of the dryer where you’ve inevitably forgotten a sock. Data is automatically separated into dimensions (qualitative) and measures (quantitative) — and presumed to be ready for chart-making. Of course, if there are still a few data cleaning steps to be undertaken, Tableau can handle the dirty laundry as well. For example, it supports re-formatting data types and pivoting data from wide to tall format. When ready to make a chart, simply ctrl+click the features of interest and pick an option from the “Show Me” box of defaults. This simplicity of interaction enables even the most design-impaired data scientist to easily marshal data into a presentable format. Tableau will put your data into a suit and tie and send it to the boardroom. Follow these tips to go from “good” to “great” in your data visualization abilities. Gain inspiration from master chart-makers Throughout my time as a business analyst at a Big Four firm, these three blogs were my go-tos for how to create a great-looking, functional Tableau dashboard. Keep these 4 guidelines in mind #1 — Sheets are the artist’s canvas and dashboards are the gallery wall. Sheets are for creating the artwork (ahem, charts), which you will then position onto a dashboard (using a tiled layout with containers — more on this in a second) along with any formatting elements. #2 — To save yourself time, set Default Properties for dimensions and measures. This will provide a unified approach to color, number of decimal points, sort order, etc. and prevent you from having to fiddle with these settings each time you go to use a given field. #3 — Along those lines, make use of the overarching Format Workbook and Format Dashboard options instead of one-off formatting tweaks. #4 — Avoid putting floating objects into your dashboards. Dragging charts around becomes a headache once you have more than two or three to work with. You can make your legends floating objects, but otherwise stay away from this “long-cut.” Instead, use the tiled layout, which forces objects to snap into place and automatically resizes if you change the size dimensions of your dashboard. Much faster and simpler in the long run. Get started with your first dashboard In summary, the Tableau platform is easier than finger paints to use, so if you’re ready to get started, Tableau Public is the free version that will allow you to create publicly accessible visualizations — like this one I put together after web-scraping some info on questionable exempted developments from the Washington DC Office of Zoning — and share them to the cloud. Getting ready to present financials to the C-suite. Photo by Lisa Fotios on Pexels. After investigating data from your local community, another good sample project is pulling your checking account data and pretending you’re presenting it to a CEO for analysis. Read more about the difference between a data scientist and a data analyst: Now if you not-so-secretly love data viz and need to find more time to devote to putting your models into production (🙋‍♀️), let’s move on to… 🦁 Learning DevOps Your machine learning model is only as good as its predictions and classifications on data in a real-world setting.
Give your model a fighting chance by gaining at least a basic understanding of DevOps — the field responsible for integrating development and IT. Reframe your thinking about what data science is or isn’t. In this brilliant article, deep learning hero Andrej Karpathy argues that machine learning models are the new hotness in software — instead of following if-then rules, data is their codebase. Get a sense for how this works in the enterprise. This clever novel fictionalizes The DevOps Handbook and is surprisingly readable. (Not free — but if you buy a copy, give it to your coworker and hope they become super passionate about productionizing your models.) Introduce your machine learning model to the wild. Check out this article about how to use Streamlit for both deployment and data exploration. I’d be remiss if I didn’t also mention Docker and Kubernetes as enterprise-level tools for productionization.
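To give a feel for why Streamlit is attractive here, below is a minimal sketch of a data exploration app. The file name, widgets, and column handling are hypothetical (not from the article referenced above), and a real deployment would load your trained model instead. Save it as app.py and launch it with streamlit run app.py.

import pandas as pd
import streamlit as st

st.title("Model explorer")  # page title rendered in the browser

uploaded = st.file_uploader("Upload a CSV of model predictions")
if uploaded is not None:
    df = pd.read_csv(uploaded)
    st.dataframe(df.head())  # interactive preview of the first rows
    st.line_chart(df.select_dtypes("number"))  # quick plot of numeric columns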
https://towardsdatascience.com/new-data-science-f4eeee38d8f6
['Nicole Janeway Bills']
2020-11-14 22:35:43.066000+00:00
['Machine Learning', 'Data Science', 'Python', 'Artificial Intelligence', 'Programming']
2,484
Objectivity vs. Subjectivity: An Incongruity That Isn’t Really
Photo by Alex wong on Unsplash Nearly two years ago I started wearing glasses. At some point since, I developed the strong impression that I had forgotten to take my glasses off after going to bed at night or lying down for a nap. They had become a part of me to such an extent that, like a phantom limb, I sensed I was still wearing them even though they weren’t there. I could even perceive the faint outline of their rims through my closed eyelids. If I happen to pull the blanket up over my face so that a fold touches the ridge of my nose just so, I become positively convinced I’m still wearing them and have to run a hand over my face to confirm I’ve taken them off. I’m sure I am not the only person who regularly has experiences such as this. The feeling that something is still being worn or that something is touching our skin when it objectively isn’t can be mildly disturbing. Unless one is intentionally seeking out experiences that cause mismatches between perception and reality, whether by taking drugs or via other means, even minor experiences like this can trigger some reflection about our actual grasp on reality. That subjective experiences don’t always accurately describe our environment isn’t exactly news. Indeed, subjectivity’s public stock has been steadily declining for well over a century, while its sibling rival, objectivity, has seen an unprecedented surge in credibility. Our collective lack of faith in subjectivity has grown in spite of the fact that when it comes to our own feelings we continue to inevitably overrate their importance. Objectivity’s worth has reached almost self-evident proportions in some circles. To be sure, human frailties like confirmation bias and blind spots created by feelings such as love or disgust do in fact make a certain degree of self-awareness critical to any effort to define reality with precision. We don’t want our doctor’s judgment to be too clouded by empathy when she’s making a diagnosis or evaluating our best course of treatment. Nor do we want our judges making rulings from the bench that are heavily colored by personal beliefs or a desire for revenge. But the fact remains, no conscious creature can possibly obtain anything like a truly objective point of view. Objectivity’s appeal, the philosopher Thomas Nagel wrote in his famous essay What Is It Like to Be a Bat?, is that it moves us “toward a more accurate view of the real nature of things. This is accomplished,” Nagel concluded, “by reducing our dependence on individual or species-specific points of view toward the object of investigation. We describe it not in terms of the impressions it makes on our senses, but in terms of its more general effects and of properties detectable by means other than the human senses.” To put it another way, objectivity isn’t a kind of transcendent view from nowhere. It’s actually a universal view from anywhere. A water molecule will ultimately appear the same from the point of view of either a hypothetical silicon-based life form or an actual carbon-based one. Likewise, it will remain unchanged from the vantage point of a species with one eye, two eyes, a compound eye, or no eyes whatsoever. In every case, it will consist of two hydrogen atoms and one oxygen atom because that’s what a water molecule is. All that matters is that the species analyzing it has developed the capacity to detect it. But the purpose of Nagel’s essay was neither to praise nor bury objectivity.
His point was that the one thing we can never be truly objective about is our own experience. Beyond a certain level of complexity, it’s like something to be whoever we are. Consciousness means that even if who we happen to be is Spock or Data, our self-assessments will still have the quality of being subjective. There is no point of view from which our own experience can be truly understood for what it is. Nagel wrote: It is difficult to understand what could be meant by the objective character of an experience, apart from the particular point of view from which its subject apprehends it. After all, what would be left of what it was like to be a bat if one removed the viewpoint of the bat? Fortunately, the “problem” consciousness poses for objectivity is only really a problem if you’re wedded to the idea that individual consciousness can be reduced to an objective essence (self or soul) in the first place. That we actually have such an essence is far from certain. In fact, there have been people making very good arguments that we probably don’t for over two millennia now. In his excellent book, Why Buddhism Is True, Robert Wright describes in some detail the many things modern science, particularly psychology, has confirmed the Buddha got right, or at least probably did. Wright spends some time on what he describes as the Buddha’s “Seminal Not-Self Sermon,” commonly translated as Discourses on the Not-Self. In this sermon the Buddha, according to Wright’s overview, asks his disciples which of what Buddhists refer to as the five aggregates “qualify as self”: form (or the physical body); sensation (feelings); perception; mental formation; or consciousness. He asked, ‘Is it just the physical body (form)?’ ‘Is it just our feelings?’ And so on. “If form were self,” the Buddha says, “then form would not lead to affliction, and it should obtain regarding form: ‘May my form be thus, may my form not be thus.’”
Our father and mother, our wife and babes, are bone of our bone and flesh of our flesh. When they die, a part of our very selves is gone.’ I would go even further than James. Consider the role friends and other contacts we make over the course of our lifetimes play in making us who we are today. Many of these contributions to our identity we aren’t even conscious of. Yet at the same time the number of people we honestly couldn’t imagine being the same without certainly extends well beyond our immediate family. Wright sums the situation up as follows when describing the related Buddhist concept of emptiness: In other words: nothing possesses inherent existence; nothing contains all the ingredients of ongoing existence within itself; nothing is self-sufficient. Hence the idea of emptiness: all things are empty of inherent, independent existence. With the self no longer in the picture, there is no subject for us to contend with. The perceiver becomes a collection of characteristics molded by a combination of biology, personal experience and culture, none of which alone qualifies as the individual subjective viewer. What is it that is being influenced by all these feelings? By adopting a supposedly objective point of view in order to eliminate all the feelings that cloud our judgment, who is the subject we are discarding in order to obtain this more accurate view of the world? In recognizing there is no self, the objective/subjective dichotomy suddenly becomes not so much two sides of the same coin as a false choice created by a faulty dualistic premise. One of the ten images developed by the psychiatrist Hermann Rorschach to help doctors effectively evaluate how their patients visually experience the world. Perhaps the best demonstration of the fluidity of the boundary between subjects and objects is the famous, if widely misunderstood, Rorschach Test. The ten inkblots used in the test are not random smears of ink like many people think, but carefully crafted images created by the psychiatrist Hermann Rorschach. Rorschach had been fascinated his entire life with how people see the world. In addition to his psychiatric training, he was the son of an artist with a considerable artistic talent of his own. This made him well suited for research into human perception; an area that had been largely overlooked by his more famous contemporaries, Freud and Jung. Rorschach’s inkblots are not the visual equivalent of free association. As Damion Searls puts it in his book, The Inkblots: Hermann Rorschach, His Iconic Test, and the Power of Seeing, “The image itself constrains how you see it — as on rails — but without taking away all your freedom: different people see differently, and the differences are revealing.” Put another way, a Rorschach inkblot rests on the boundary between something that’s really there and multiple, if constrained, ways of viewing it. It’s hardly as fixed as a water molecule or the law of gravity, but it’s far from an entirely relativistic image either. In this regard, it’s an excellent metaphor for the complex patterns of relationships that make up both societies and ecosystems. According to Searls, Rorschach’s insight was that “perception included much more [than the physical mechanics of seeing or other sensations], all the way to interpreting what was perceived.” In his recent book on Buddhism, Robert Wright also draws attention to the fact that perception and interpretation cannot be treated as separate actions.
To make this case he quotes the psychologist Robert Zajonc: There are probably very few perceptions and cognitions in everyday life that do not have a significant affective component, that aren’t hot, or in the very least tepid. And perhaps all perceptions contain some affect. We do not just see ‘a house’: we see ‘a handsome house,’ ‘an ugly house,’ or ‘a pretentious house.’ We do not just read an article on attitude change, on cognitive dissonance, or on herbicides. We read an ‘exciting’ article on attitude change, an ‘important’ article on cognitive dissonance, or a ‘trivial’ article on herbicides. The point here isn’t that what we call objective reality doesn’t exist. Rather it’s that any species with the capacity to unveil truth can’t possibly be objective about its own experiences. There are no objective scientists or philosophers out there. There are no objective people out there, period. We all have feelings about our existence that color every decision we make, no matter how rational we think we’re being. Furthermore, we all have the impression there’s an inner objective self or essence guiding the whole show, but there isn’t. As was stated earlier, what makes something objectively true isn’t that it has been dispassionately observed, but that every single possible subjective observer can’t help but ultimately reach the same conclusion about its nature given the proper intellectual and technological tools to make the necessary examination. No matter how anyone feels about a water molecule, or through what physiological lens or mechanical device it is viewed, it will still be two hydrogen atoms and one oxygen atom. The same can’t be said about the relationships we form with each other or with our environment. It’s only by realizing we are enmeshed in the world rather than separate “objective” outside observers that we can truly hope to make any real progress in our understanding.
https://craig-axford.medium.com/objectivity-vs-subjectivity-an-incongruity-that-isnt-really-5c29ffe93c81
['Craig Axford']
2018-12-16 22:30:00.090000+00:00
['Philosophy', 'Spirituality', 'Consciousness', 'Psychology', 'Science']
2,485
How My School Will Stay Open in COVID-Crippled Spain
A torrent of tragedy — that’s our dear year 2020 so far. Assuming that we survive the perils of existence — those from nature and, even more so, those from our own misguided steps — and humanity’s distant offspring look back on the trials and tribulations of their bumbling ancestors, the narrative of 2020 would be ripe for future historians. Perhaps it will even be one of those defining years like 1066 and the Battle of Hastings or 1492 and the New World. Maybe 2020 will be called the Loss of Eden, defined by the singular and symbolic face mask that has essentially divided our life-intake system from the life-giving system. And yet, when those future historians read the narrative of 2020, hidden amongst the fear and confusion and crises and death and sorrow, they will uncover brilliant gems of humanity at its best. The selflessness and sacrifice of the world’s doctors and nurses are the crown jewel of 2020. And rightfully so. However, there’s a small school perched on a hill overlooking city and port on the Mediterranean island of Mallorca, and future historians looking for proof of humanity’s best would do well to pay attention. This school — my school — overcame the odds and immense pressure during the first wave of COVID. Many teachers knew little more than Gmail and the basics of a Google Doc. In the end, we kept quality online education rolling, just as paying parents had expected at the beginning of the academic year in September 2019. And this is our strategy to once again do the impossible: stay open for the 2020/2021 academic year in COVID-crippled Spain.
https://medium.com/age-of-awareness/how-my-school-will-stay-open-in-covid-crippled-spain-18140507d6bc
['Drew Sparkman']
2020-09-09 08:27:26.889000+00:00
['Education', 'Productivity', 'Travel', 'Coronavirus', 'Covid 19']
2,486
I Tried Being Nicer to Myself for a Day
I don’t know why I don’t do this more often. Photo by Septian simon on Unsplash Wow, this was a challenge. Is it really sad that this took a conscious effort to say nice things to myself? Apparently, I’m kind of mean on a regular basis. When I look in the mirror, I focus on the “flaws” that I see. If my pants are fitting tighter than usual or I’m having a breakout or the circles under my eyes look darker than usual, I’ll sigh and think to myself, I wish I looked better. I even have a hard time accepting praise from other people. I’ll brush off compliments with an unnecessary explanation, but I’m the first to internalize any criticism. I became aware of how detrimental my negative self-talk can be when a friend told me it made me less attractive and not so fun to be around. She said that nobody else notices the “flaws” that I speak of and if I can’t take a compliment it makes people not want to give them to me anymore. When people feel good about themselves, it’s infectious. When they don’t, it’s a repellant. There’s a reason Eeyore, the pessimistic donkey in Winnie the Pooh, is always off by himself. Pity parties are best enjoyed solo. I don’t want to be like Eeyore. I don’t want to push away my friends with my sad-sack attitude about myself. So, I tried to be nicer to myself. I focused on only giving myself compliments with no critiques for a full day. This is how it went. Instead of starting my morning run with thoughts of how much I hope it will help me lose weight, because I hate my thighs, I thought about how great I felt that I woke up early to do it. I felt lucky to live right next to such a nice park that I can run around. I enjoyed every step because I was happy to be able to do it. I ended the run feeling more energized because I focused on positive thoughts instead of negative ones. I usually pick myself apart in the mirror before I take a shower, but today I skipped the mirror and instead just tried to focus on the parts of my body that I like while I was washing up. I washed my hair thinking that I love my curly hair. As I scrubbed my stomach, I complimented my small waist. While washing my face, I remembered my cystic acne from when I was in college and was so happy that my skin was relatively smooth now. I got out of the shower feeling good about myself. Getting dressed is always a problem for me, because if something doesn’t fit perfectly or like I think it should to be flattering, I can spiral into thinking I’m just so hideous. In order to bypass that altogether, I just picked my favorite spring dress that always fits and makes me feel beautiful. Easy. No struggles on what to wear. My closet should be filled with clothes like that. I looked in the mirror, told myself I looked pretty great and I believed it. If you only tell yourself to see the positives, it can change your entire mindset. When I got in my car, I commended my driving skills, because I’ve only gotten in one accident since I was 16 and that wasn’t even my fault. I even snuck a peek at myself in the rear view mirror and thought, Ok! I look cute today! Even just the thought made me smile. I sang along to a song on my playlist and thought, I don’t sound too bad. If I took voice lessons, I might just be unstoppable. Not all compliments have to be grounded in reality, but it felt good to think it. It even made me laugh out loud. Laughing at yourself is a pretty amazing joy. When I was in class that evening, I got a paper back with a 98% score. 
Normal me would have been hung up on that 2% and probably would have gone up to the teacher at the end of class to ask about it. But on this day, I didn’t. I thought I was a pretty kick-ass student, and I truly didn’t care about that 2%. By the time I got home, I can honestly say I felt lighter than I usually do. Showering myself with compliments all day boosted me up. It put me in a great mood all day, and I think that mood showed to others as well. A couple of people told me that I looked happy, which made me feel even better. I also found myself taking compliments in stride. I believed them, which is key. I smiled, said ‘thank you’, and kept moving. I’m going to make a conscious effort to do this more often. Obviously, I still had moments when little self-critiques tried to creep back into my mind, but I never let them stay around too long. Negativity is draining, especially if it’s consuming your thoughts. I need to be nicer to myself, and I know that I can be. It takes work, but the work is worth my well-being. If you find yourself stuck in a cycle of negative self-talk, I suggest this challenge. Try to only compliment yourself for a full day. I’m sure you will notice a change. Be kind to yourself. You deserve it.
https://maclinlyons.medium.com/i-tried-being-nicer-to-myself-for-a-day-81c66e078b61
['Maclin Lyons']
2019-04-26 20:35:00.243000+00:00
['Mental Health', 'Self-awareness', 'Self', 'Self Love', 'Life Lessons']
2,487
Four Lessons I Learned From Reading Origin by Dan Brown
Four Lessons I Learned From Reading Origin by Dan Brown #3 is vital for Novelists Screenshot by the Author Over the weekend, I was feeling a bit drained of morale, so I decided that rather than lie in bed all weekend, I would read a book and have a genuine reason to lie in bed all weekend. I decided to read a book by the famous author Dan Brown, because it had been sitting on my bookshelf for a while. The first chapter — or rather, the front matter of the book — was enough to convince me to keep reading. Dan Brown is a visual writer, and an astounding one at that. His words created bold images in my head, and his choice of adjectives was spot on. The first thought I had when I put the book down was that I had to read it all over again — I can count on my right hand the number of books that have had that effect on me. Before the enthusiasm of talking about my new favorite author consumes this article, here are four lessons I learned from reading Dan Brown’s book Origin. 1. Research can be made intriguing “All art, architecture, locations, science, and religious organizations in this novel are real.” This paragraph piqued my interest. A stupendous amount of research went into writing Origin. The names of the artists and artworks featured in this book alone are enough to make one’s head swirl, but the way Dan weaved facts with fiction was seamless and intriguing. In this interview, Dan Brown says it took him two years to research Origin in Spain. Amazing! I kept alternating between googling the landmarks (which he painted to produce the sweet music only a skilled wordsmith can craft) and reading, but alas, the strong pull of the book won, and I was content to read through and note down the locations for later viewing. 2. Character Development is vital A quote that comes to mind is: “No one is ever the villain of their own story.” ― Cassandra Clare There is something I find more appealing in reading books when compared to watching their respective movie adaptations. The fact that I get to peek into a character’s head and see their motives, rather than just watching their actions, brings me closer to every single character in the book, be it the good guy or the bad guy. Origin was written in third-person omniscient POV, and every relevant character was featured in a chapter, especially when they had a confrontation with the main character. More, every single character had a backstory. Hence, no matter what actions they took with or against the protagonist, those actions were justifiable. For me, it made them realistic. And the fact that the characters didn’t see what we did (we were always a step ahead of each character, because we could see into the heads of the other characters involved) gave us more empathy for their diverse causes. Most noticeably in Luis Avila, who was on a mission to serve his God by assassinating the atheist Edmond Kirsch, but still waited long enough to save a waitress from being disrespected. 3. A single strong plot is enough to pull a book The book was geared toward answering the ultimate question of human existence: “Where do we come from, and where are we going?” This question became the central plot. I was deep into chapter 17 when I realized that Dan hadn’t yet told me anything about what I was reading, but I was still eager to turn the next page. This is because each chapter was progressive. The secret was that each chapter reminded us of the problem at hand and stoked our curiosity to discover the secret of our origin.
I still felt the same enthusiasm in chapter 58 that I did in chapter 17. All chapters were directed toward solving the question posed in the first chapters: “Would we ever have the answer to the question Edmond claimed to have discovered the answer to?” I didn’t think we were ever going to get the answer; I thought the catastrophe the religious leaders feared would materialize. And I wondered how the author, being a writer, would be able to come up with a substantial answer to the amazing scientific discovery. I legit came out of the book to ponder the author: how could he know so much! The arguments were so sound and solid that the author made me question everything I knew about religion from a logical point of view. 4. Readers love mystery I didn’t realize how much I loved mystery until I read this book. Here again is my talk about character backstory: to have everything you assumed about a character change due to new information is the kind of epiphany I want to see in books! For a quick summary of the lessons mentioned in this post, here are the points: 1. Research can be made intriguing 2. Character development is vital 3. A single plot is enough to pull a book 4. Readers love mystery In all, Dan is a master of his craft, and I can’t express how much I love his mode of writing. My next goal is to read every single one of his books before the end of this year!
https://medium.com/books-are-our-superpower/four-lessons-i-learned-from-reading-origin-by-dan-brown-94669b784d79
['Deborah Oyegue']
2020-09-10 13:16:01.417000+00:00
['Writing', 'Books', 'Reading', 'Fiction', 'Books And Authors']
2,488
Disaster Recovery on Kubernetes
Using VMWare’s Velero to back up and restore, perform disaster recovery, and migrate Kubernetes resources. Photo by Markus Spiske on Unsplash

Although Kubernetes (and especially managed Kubernetes services such as GKE, EKS, and AKS) provides out-of-the-box reliability and resiliency with self-healing and horizontal scaling capabilities, production systems still require disaster recovery solutions to protect against human error (e.g. accidentally deleting a namespace or secret) and infrastructure failures outside of Kubernetes (e.g. persistent volumes). While more companies are embracing multi-region solutions, that is a complicated and potentially expensive option if all you need is simple backup and restore. In this post, we’ll look at using Velero to back up and restore Kubernetes resources, and demonstrate its use as a disaster recovery and migration tool.

Are Backups Still Needed? A key point that is often lost when running services in high availability (HA) mode is that HA (and thus replication) is not the same as having backups. HA protects against zonal failures, but it will not protect against data corruption or accidental removals. It is very easy to mix up the context or namespace and accidentally delete or update the wrong Kubernetes resources. This may be a Custom Resource Definition (CRD), a secret, or a namespace. Some may argue that with IaC tools like Terraform and external solutions to manage some of these Kubernetes resources (e.g. Vault for secrets, ChartMuseum for Helm charts), backups become unnecessary. Still, if you are running a StatefulSet in your cluster (e.g. an ELK stack for logging, or self-hosted Postgres to install plugins not supported on RDS or Cloud SQL), backups are needed to recover from persistent volume failures.

Velero Velero (formerly known as Ark) is an open-source tool from Heptio (acquired by VMWare) to back up and restore Kubernetes cluster resources and persistent volumes. Velero runs inside the Kubernetes cluster and integrates with various storage providers (e.g. AWS S3, GCP Storage, Minio) as well as restic to take snapshots either on demand or on a schedule.

Installation Velero can be installed via Helm or via the CLI tool. In general, the CLI gets the latest updates, while the Helm chart lags slightly behind with compatible Docker images. However, with each release, the Velero team does a great job updating the documentation to patch the CRDs and the new Velero container image, so upgrading the Helm chart to the latest version isn’t a huge concern.

Configuration Once you have the server installed, you can configure Velero via the CLI or by modifying values.yaml for the Helm chart. The key configuration steps are installing the plugins for the storage provider and defining the backup storage location as well as the volume snapshot location:

configuration:
  provider: aws
  backupStorageLocation:
    name: aws
    bucket: <aws-bucket-name>
    prefix: velero
    config:
      kmsKeyId: <my-kms-key>
      region: <aws-region>
  volumeSnapshotLocation:
    name: aws
    config:
      region: ${region}
  logLevel: debug

(Note: There is an issue with the CRDs in the latest Helm chart causing the configured backup storage and volume snapshot locations to not be set as the defaults.
If you decide to name the storage and snapshot locations, add --storage-location <name> --volume-snapshot-location <name> to the following Velero commands.)

Creating a Backup To create a backup, simply apply the backup command to a namespace or select resources by labels:

$ velero backup create nginx-backup --include-namespaces nginx-example
$ velero backup create postgres-backup --selector release=postgres

When the backup command is issued, Velero runs through the following steps: 1. Call the Kubernetes API to create the Backup CRD. 2. The Velero BackupController validates the request. 3. Once the request is validated, it queries the Kubernetes resources, takes snapshots of the disks to back up, and creates a tarball. 4. Finally, it initiates the upload of the backup objects to the configured storage service. Image Credit: OpenShift

Restoring Data To list the available backups, first run:

$ velero backup get

Now you can restore from a backup by issuing:

$ velero restore create RESTORE_NAME \
    --from-backup BACKUP_NAME

Velero also supports restoring objects into a different namespace if you do not wish to override the existing resources (append --namespace-mappings old-ns-1:new-ns-1 to the above command). This is useful if you are experiencing an outage and want to keep the broken resources around to diagnose later, while immediately restoring the service. Velero can also change the storage class of persistent volumes during restores; a sketch of the mapping is given at the end of this post. This may be a good way to migrate workloads from HDD to SSD storage, or to a smaller disk if you over-provisioned the persistent volume (see the documentation for the configuration). Finally, you can also selectively restore sub-components of a backup. Inspect the backup tarball by running:

$ velero backup download <backup-name>

From the tarball, you can choose the manifest for a specific resource and individually issue kubectl apply -f on it. This is useful if you took a snapshot of the entire namespace rather than filtering by labels.

Scheduled Backups Instead of only creating backups on demand, you can also configure scheduled backups for critical components. Via CLI:

$ velero schedule create mysql --schedule="0 2 * * *" --include-namespaces mysql

Via Helm values:

schedules:
  mysql:
    schedule: "0 2 * * *"
    template:
      labelSelector:
        matchLabels:
          app: mysql
      snapshotVolumes: true
      ttl: 720h

Notice the ttl configuration, which specifies when scheduled backups expire. If you are using a cloud storage provider, you can leverage lifecycle policies, or control retention via Velero as shown above, to reduce storage costs.

Other Uses Besides simply taking backups, Velero can be used as a disaster recovery solution by combining schedules and read-only backup storage locations.
Configure Velero to create a daily schedule:

$ velero schedule create <SCHEDULE NAME> --schedule "0 7 * * *"

If you need to recreate resources due to human error or an infrastructure outage, change the backup location to read-only to prevent new backup objects from being created:

$ kubectl patch backupstoragelocation <STORAGE LOCATION NAME> \
    --namespace velero \
    --type merge \
    --patch '{"spec":{"accessMode":"ReadOnly"}}'

Restore from the backup in another location:

$ velero restore create --from-backup <SCHEDULE NAME>-<TIMESTAMP>

And finally, revert the backup location to be writable again:

$ kubectl patch backupstoragelocation <STORAGE LOCATION NAME> \
    --namespace velero \
    --type merge \
    --patch '{"spec":{"accessMode":"ReadWrite"}}'

This process also works to migrate clusters to a different region (if the provider supports it) or to capture the last working version prior to a Kubernetes upgrade. Finally, even though Velero does not natively support migration of persistent volumes across clouds, you can configure restic to take backups at the filesystem level and migrate the data for a hybrid-cloud backup solution.

Other Solutions While Velero is very easy to use and configure, it may not fit your specific use case (e.g. cross-cloud backup). As mentioned above, Velero integrates with other solutions such as restic or OpenEBS, but if you are looking for alternatives, the following list provides both open-source and enterprise options:
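(A closing footnote on the storage class remapping mentioned in the restore section above. Based on my reading of the Velero documentation — treat the details as an assumption to verify against your Velero version — the mapping is declared through a ConfigMap in the velero namespace carrying the velero.io/change-storage-class plugin label. A minimal sketch; the storage class names are purely illustrative:)

apiVersion: v1
kind: ConfigMap
metadata:
  # The name is arbitrary; the labels are what Velero looks for
  name: change-storage-class-config
  namespace: velero
  labels:
    velero.io/plugin-config: ""
    velero.io/change-storage-class: RestoreItemAction
data:
  # <old storage class>: <new storage class> (illustrative names)
  old-hdd-class: new-ssd-class

With a ConfigMap like this in place, any restored persistent volume or claim that referenced the old class should come back referencing the new one.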
https://medium.com/dev-genius/disaster-recovery-on-kubernetes-98c5c78382bb
['Yitaek Hwang']
2020-09-24 07:10:26.796000+00:00
['Backup', 'Software Engineering', 'Disaster Recovery', 'Programming', 'Kubernetes']
2,489
1 Critical Skill Successful People Often Lose Over Time
What happens when you only think James’s experience reminded me of my time in France. I don’t know if you’re aware, but people who live in France speak French. I do not. Naturally, there was a bit of a communication gap. I would walk into the office and hear a coworker say: “Blah, blah, blah, blah, Todd?” I would smile and reply with the only word I knew at the time: “Oui!” What did I say yes to? I didn’t know. Nobody ever returned later in the day expecting me to help beat a person to death with crusty baguettes, though, so I counted that as a win. Still, I’m a communicator. I wanted to understand French, not just fake it. My plan was to think my way to learning French. Each day, I sat in the cafeteria reading the French newspaper. I chewed on smelly cheese and thought: “Okay, they just used the word lancé, and I definitely saw that word in another story about a software company releasing a new product. It must mean launch.” This process continued for months. My employer moved me to France to work, but most days I just eavesdropped. One morning my coworker Margaux came in late and furious. That was the day I learned how to curse at trains. Another day I sat in on a meeting about the company’s media bank with four people. They spoke French the entire hour. I caught snippets here and there about mot-clés after the conversation had already moved 10 minutes ahead. I thought and thought and thought and thought. The French language consumed me as much as any hobby ever had. Finally, two weeks before I left the country, a reckoning. A colleague opened the door for our meeting, looked me right in the eyes, and said: “Est-ce que cette chambre vous convient?” For the first time, the words instantly translated. “Is this room okay?” A chorus of invisible angels sang my praises. I understood! All the reading and thinking and listening quietly must have paid off. Euphoria had to be pouring out of my eyeballs. I’d never been so proud. That lasted for about 1.2 seconds. I opened my mouth to reply. I couldn’t. Why couldn’t I reply? Because I still couldn’t speak French. Ashamed I couldn’t answer more thoroughly, I just nodded and muttered my old standby: “oui.” She gave me that look you give sad puppy dogs, and said: “Faut-il parler en anglais?” I nodded miserably. Yes, we should speak in English.
https://medium.com/personal-growth/1-critical-skill-successful-people-often-lose-over-time-edb5fb20fe9c
['Todd Brison']
2020-06-23 16:13:12.051000+00:00
['Motivation', 'Personal Development', 'Entrepreneurship', 'Success', 'Inspiration']
2,490
Why user testing with real copy is like the ultimate bacon sarnie
First written for and published on naturalinteraction.com Imagine you’ve spent months designing and building a shiny new website. It looks amazing and you’re sure it’s going to raise online profits and convert every visitor on their first visit. You tested the design concept and even ran click testing on the information architecture — what could go wrong? Words. Getting the content right, getting that microcopy right, using simple, on-brand, audience-appropriate language. These things are all important when it comes to delivering a great user experience. And yet, those words are often the last thing on the list and rarely form part of the testing process. Wherever possible, you should always user test with real copy. If you don’t have the final copy ready, at least use a draft, or something as close to final as you can get. Say no to a dry sarnie Lorem Ipsum is a well-known form of dummy text, used to fill spaces on designs and wireframes before the polish. It dates back to the 1500s, when it was first used by typesetters and printers to show how a page layout might look. It’s still a useful tool but, well, it’s a bit dry. You need to add some sauce (aka real words) so that when you’re usability testing a product or design with representative users, you really get to know that they fully understand the point of it all, and that they find it easy to use from both a visual and a reading point of view. The gold standard (aka the sauce) Having the content written and ready to go at the point you start user testing your designs is pretty rare. Being able to test the design in conjunction with its content, with real users, is absolutely the gold standard. Why? Because your users may well pick up on things you’ve missed. This is especially key if you’re working on the user interface for something like an app or a piece of software. Call-to-action text and tooltips need to work hand in hand with the visual appearance of the product to enable a fast onboarding process and, of course, a smooth user experience. In this podcast episode, Chris Myhill from Just UX Design talks about a project he worked on for a large supermarket chain, where it was actually the copy that caused usability issues, not the design, which surprised the whole team. They were using sector jargon and complex wording, both of which were confusing users. By providing more context, simplifying the content, and pairing it with icons, he was able to really improve the overall product. Realistic best practices Areas that are important, such as call-to-action buttons and instructions, should be as clear as possible, because these are the places where customers are most likely to struggle or get confused. If you’re not able to test your product or website with real, polished copy, at least ensure the following are as close to final as possible: Instructions Tooltips Microcopy Call to action Sign up process Do what you can — something is better than nothing If you’re working with a client, push for real copy to be supplied at the time of your initial brief, and if that’s not possible, at least write something relevant in place of lorem ipsum to give your users a more holistic sense of the end product. In conclusion, and if I was pushed to make an analogy — and let’s face it, I’ve been building up to it all the way through — I would say that user testing without real copy is like eating a bacon sarnie without brown sauce. It’s better than nothing at all, but add that sauce and you’ve got yourself something truly great.
https://uxdesign.cc/why-user-testing-with-real-copy-is-like-the-ultimate-bacon-sarnie-2886ba0b8d3d
['Alex Ryder']
2019-09-17 22:28:06.498000+00:00
['Copywriting', 'Marketing', 'Startup', 'Content Strategy', 'Ux Writing']
2,491
Saying That You Feel Ugly and Calling Yourself Ugly are Two Extremely Different Things
Photo by Florian Pérennès on Unsplash So it is almost 4 AM here in Florida, and per usual I am awake till the crack of dawn obsessing over anything and everything. Recently, I did a shoot for a small reality show which was based around a blind date and going through each other’s phones. For months I waited anxiously to see the footage, and tonight I was finally able to watch it, laughing on the outside while pinching my belly fat and crying inside at the same time. Looking at the set-up, it appeared to be the ugly duckling and the swan paired together, but deep down I know that is just a feeling and not a fact. That is why I prefer to say that I feel ugly rather than sealing that horrid perception of myself by means of two words: “I am”. A few days ago, I wrote a post on here about men and their struggles with eating disorders, in much of a positive and empowering tone. However, the very next day I was wearing a tank top, feeling very self-conscious about breathing in public because of the way my perceived belly would protrude. That night, I took my medicine, got the munchies, and binge ate relentlessly and regretfully all at once. The next day, my belly and man boobs were in the back of my mind because I was not even all that hungry. However, night time came along, and when it was time to shower, I got naked and saw my body, feeling desperate, sad, and uncomfortable in my own skin. Earlier in the day I had no cup size and a flat belly, only to feel like a balloon at shower hour. These feelings are symptoms of a common disorder known as Body Dysmorphic Disorder. “A woman in tattoos and lingerie is wrapped in a white robe on a hotel bed” by Stas Svechnikov on Unsplash Now, whenever I make mention of my own personal struggles with BDD, most of the time I am shrugged off because people think I am looking for validation. Additionally, when the subject comes up, it usually comes off as if I actually believe I am ugly, even though it is merely a feeling that comes and goes like a fair-weather side boy. For starters, anyone who talks about their mental illnesses should be treated as nothing less than brave for opening up about an issue that is still shoved under the rug by society. More to the point, I do not believe these things about myself but merely feel them, so intensely that the illusion in my mind makes them feel real. Regarding tonight, seeing myself on a taped blind date triggered insecurities that normally do not have any ounce of power over me. I didn’t see who I normally think I am whenever there is no mirror around and I’m free from my ego. Absolutely not. All I noticed was my hunched back, puffy cheeks, overly feminine qualities (which I’m unapologetic about but still insecure over), and an image pale in comparison to the perfectly crafted wallflower across from me. I shed a tear and smiled all at once because of how grateful I am to be self-aware, but the pain still exists. Photo by Jairo Alzate on Unsplash All in all, I absolutely refuse to let my mind’s distortion inhibit me from moving forward in my life. It is still nearly impossible for me to love a body picture enough to post it online, and when I do, it’s usually removed within a day or two because of my insecurities. In addition, whenever I take pictures I am usually hiding part of my face, giving a kiss on the cheek, or doing anything to avoid smiling, because I feel my face is puffy when I smile, and it keeps me loathing myself.
Ultimately, my insecurities come with a wisdom that makes the pain all worth the while. Knowing that language is extremely powerful can help us understand that feelings are not facts, so feeling a certain way does not equate to being that way. Overall, these are the things I see when I look at myself, but they are not the bricks that build the house of my identity. Seeing the clip made me feel ugly, but there is no ugly bone in my body. It took me forever to get to this point, but ever since I replaced “I am” with “I feel” before using the words ugly or fat, the pain of Body Dysmorphic Disorder became fifty percent less extreme than it initially was. Tweaking my language relieves the pain so much more than the four medications I take for these disorders. And the best part about it is that this wisdom doesn’t have a copay.
https://astoldbynaomi.medium.com/talking-about-feeling-ugly-and-calling-yourself-ugly-are-two-different-things-cb9befb46d1
['Naomi Eden']
2018-08-15 19:11:24.854000+00:00
['Self Improvement', 'Body Image', 'Writing', 'Mental Health', 'Life']
2,492
AI Movies Recommendation System Based on K-Means Clustering Algorithm
AI Movies Recommendation System Based on K-Means Clustering Algorithm

Overview of Article In this article, we’ll build an artificial intelligence movies recommendation system using k-means, which is a clustering algorithm. We’ll recommend movies to users that are more relevant to them based on their previous history. We’ll only import data where users have rated movies 4+, as we want to recommend only those movies which users like most. Throughout this article, we use the Python programming language with its associated libraries, i.e. NumPy, Pandas, Matplotlib, and Scikit-Learn. Moreover, we assume that the reader has familiarity with Python and the aforementioned libraries.

Introduction to AI Movies Recommendation System In this busy life, people don’t have time to search for their desired item; they want it brought to their table, or at least found with little effort. So, recommendation systems have become an important way to help people make the right choice for their desired thing and to grow our product. Since data is increasing day by day, in this era of such large databases it has become a difficult task to find the most relevant item of our interest, because often we can’t search for an item of interest with just a title, and sometimes it is even harder than that. So, a recommendation system helps us to serve the most relevant items to each individual in our database. In this article, we’ll build a movies recommendation system. Movies recommendation systems have become an essential part of movie websites, because an individual can’t tell which movies will interest him from just a title or genre. Sometimes an individual likes action movies, but he/she will not always like every action movie. To handle this problem, many authors have provided a better way: recommend a movie to user 1 from the watch list or favorite movies of another user 2 whose movies database is more relevant to user 1. That is, if the taste of two people is the same, then each of them will like the other’s favorite food. Many tech giants have been using these recommendation systems in their applications, like YouTube, Netflix, etc. For this task, machine learning (ML) models have helped us a lot to build such recommendation systems based on users’ previous watch history. ML models learn from users’ watch history and categorize them into groups which contain users of the same taste. Different types of ML models have been used, like clustering algorithms, deep learning models, etc.

K-Means Clustering Algorithm K-Means is an unsupervised machine learning algorithm which can be used to categorize data into different groups. In this article, we’ll use this algorithm to categorize users based on their 4+ ratings on movies. I’ll not describe the background mathematics of this algorithm, but I’ll give a little intuition for it. If you want to understand the mathematical background of this algorithm, then I suggest searching for it on Google; many authors have written articles on its mathematical background. Since the complete mathematics behind this algorithm is handled by the Scikit-Learn library, we will only understand and implement it. Note: Plots of data in this section are designed randomly and only for intuition of the k-means algorithm. Figure 1 — Scatter Plot Before K-Means Clustering Suppose that we have 2-dimensional data of the form (x₁, x₂), plotted in Figure (1). Next we want to divide this data into groups.
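(As an aside: if you want to reproduce a toy picture like Figure (1) yourself, here is a minimal sketch. The make_blobs parameters are illustrative choices of mine, not values taken from the article’s figures.)

from sklearn.datasets import make_blobs
from matplotlib import pyplot as plt

# 300 two-dimensional points scattered around 3 centers (illustrative values)
X, _ = make_blobs(n_samples = 300, centers = 3, cluster_std = 1.0, random_state = 0)
plt.scatter(X[:, 0], X[:, 1], s = 15)
plt.xlabel('x1')
plt.ylabel('x2')
plt.show()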
If we take a look at the data, we can observe that it can be divided into three groups. In this plot, which is designed only for intuition, even an untrained eye can see the three groups. But sometimes we have very complex and big data, or the data is 3-dimensional, 4-dimensional, or more generally 100-dimensional, 1000-dimensional, or even more than that. Then it is not possible for a human to categorize such data, and we can’t even plot such higher-dimensional data. Also, sometimes we don’t know the optimal number of clusters we should have for our data. So, we use clustering algorithms, which can work on big data of even thousands of dimensions, and there are methods which can be used to find the optimal number of clusters. Figure 2 — Scatter Plot After K-Means Clustering In Figure (2), a demonstration of k-means clustering is shown. The data of Figure (1) has been categorized into three groups and is presented in Figure (2) with a unique color for each group. A question arises: how does k-means actually categorize the data? To categorize data into groups which contain the same type of items, the k-means algorithm follows six steps. Figure 3 — Graphical Abstract of K-Means Algorithm Figure (3) presents the following steps of the k-means algorithm: 1. Select the number of clusters we want for our dataset. Later, an elbow method will be explained for selecting the optimal number of clusters. 2. Select k random points, called centroids, which are not necessarily from our dataset. To avoid the random initialization trap, which can get stuck in bad clusters, we’ll use k-means++ to initialize the k centroids; it is provided by Scikit-Learn in the k-means algorithm. 3. The k-means algorithm assigns each data point to its closest centroid, which finally gives us k clusters. 4. Each centroid is re-centered to the position which is now the actual center of its own cluster and becomes the new centroid. 5. All clusters are reset, and each data point is again assigned to its new closest centroid. 6. If the new clusters are the same as the previous clusters, OR the total iterations have completed, the algorithm stops and gives us the final clusters of our dataset. Else, it moves again to step 4. Elbow Method The elbow method is the best way to find the optimal number of clusters. For this, we need to find the within-cluster sum of squares (WCSS). WCSS is the sum of the squared distances of each point from its centroid, and its mathematical formula is the following:

WCSS = \sum_{i=1}^{K} \sum_{j=1}^{N_i} \left\| P_{i,j} - C_i \right\|^2

where K is the total number of clusters, Nᵢ is the size of the i’th cluster (that is, the number of data points in the i’th cluster), Cᵢ is the centroid of the i’th cluster, and Pᵢ,ⱼ is the j’th data point of the i’th cluster. So, what will we do with WCSS? WCSS tells us how far the data points are from their centroids. As we increase the number of clusters, WCSS becomes smaller; after some value of K, WCSS starts to decrease slowly, and we stop there and choose the optimal number of clusters. I suggest googling the elbow method and taking a look at clearer examples of it. Here we have a figure for the intuition of the elbow method. Figure 4 — Elbow Method Plot A demonstration of the elbow method is shown in Figure (4). We can observe that when the number of clusters K moves from 1 to 5, the WCSS value decreases rapidly, from roughly 2500 to 400. But from 6 clusters onward it decreases slowly.
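(To make the WCSS formula concrete, here is a minimal NumPy sketch, on hypothetical toy data, that computes WCSS by hand from a fitted k-means model. Note that Scikit-Learn already exposes this quantity as kmeans.inertia_, which is what the elbow-method class later in this article relies on.)

import numpy as np
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs

X, _ = make_blobs(n_samples = 300, centers = 3, random_state = 0)  # toy data
kmeans = KMeans(n_clusters = 3, init = 'k-means++', n_init = 10, random_state = 0).fit(X)

# WCSS: for each cluster i, sum the squared distances of its points to centroid C_i
wcss = sum(
    np.sum((X[kmeans.labels_ == i] - c) ** 2)
    for i, c in enumerate(kmeans.cluster_centers_)
)
print(wcss, kmeans.inertia_)  # the two values should agree up to floating-point error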
So, from Figure (4) we can judge that 5 clusters is a good choice for our dataset. Further, the curve looks like an arm, and the elbow joint marks the optimal number of clusters, which in this case is 5. Later we’ll see that we don’t always get such a smooth curve, so in this work I have described another way to observe the changes in WCSS and find the optimal number of clusters.

Methodology Used in this Article In this article, we’ll build a clustering-based algorithm to categorize users into groups of the same interest using the k-means algorithm. We will use data where users have rated movies 4+, on the supposition that if a user rates a movie 4+, then he/she likes it. We have downloaded the database The Movies Dataset from Kaggle.com, which is a MovieLens dataset. In the following sections, we describe the whole project: Importing Dataset -> Data Engineering -> Building K-Means Clustering Model -> Analyzing Optimal Number of Clusters -> Training Model and Predicting -> Fixing Clusters -> Saving Training -> Finally, Making Recommendations for Users. The complete movies recommendation system project can be downloaded from my GitHub repository AI Movies Recommendation System Based on K-means Clustering Algorithm. A Jupyter notebook of this article is also provided in the repository; you can download it and play with it. URL: https://github.com/asdkazmi/AI-Movies-Recommendation-System-K-Means-Clustering URL: https://www.kaggle.com/rounakbanik/the-movies-dataset?select=ratings.csv Now let’s start to work on the code:

Importing All Required Libraries

import pandas as pd
print('Pandas version: ', pd.__version__)
import numpy as np
print('NumPy version: ', np.__version__)
import matplotlib
print('Matplotlib version: ', matplotlib.__version__)
from matplotlib import pyplot as plt
import sklearn
print('Scikit-Learn version: ', sklearn.__version__)
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.cluster import KMeans
import pickle
print('Pickle version: ', pickle.format_version)
import sys
print('Sys version: ', sys.version[0:5])
from sys import exc_info
import ast

Out:
Pandas version: 0.25.1
NumPy version: 1.16.5
Matplotlib version: 3.1.1
Scikit-Learn version: 0.21.3
Pickle version: 4.0
Sys version: 3.7.4

Data Engineering This section is divided into two subsections. First, we will import the data and reduce it to a sub-DataFrame, so that we can focus on our model and see what kind of users have rated movies and what recommendations they get based on that. Second, we’ll perform feature engineering so that the data is in a form which is valid for the machine learning algorithm.

Preparing Data for Model We have downloaded the MovieLens dataset from Kaggle.com.
First we’ll import the ratings dataset, because we want users’ ratings on movies; we’ll then filter the data to rows where users have given 4+ ratings.

ratings = pd.read_csv('./Prepairing Data/From Data/ratings.csv', usecols = ['userId', 'movieId', 'rating'])
print('Shape of ratings dataset is: ', ratings.shape, '\n')
print('Max values in dataset are\n', ratings.max(), '\n')
print('Min values in dataset are\n', ratings.min(), '\n')

Out:
Shape of ratings dataset is: (26024289, 3)
Max values in dataset are
userId 270896.0
movieId 176275.0
rating 5.0
dtype: float64
Min values in dataset are
userId 1.0
movieId 1.0
rating 0.5
dtype: float64

Next we’ll filter this dataset for only 4+ ratings:

# Filtering data for only 4+ ratings
ratings = ratings[ratings['rating'] >= 4.0]
print('Shape of ratings dataset is: ', ratings.shape, '\n')
print('Max values in dataset are\n', ratings.max(), '\n')
print('Min values in dataset are\n', ratings.min(), '\n')

Out:
Shape of ratings dataset is: (12981742, 3)
Max values in dataset are
userId 270896.0
movieId 176271.0
rating 5.0
dtype: float64
Min values in dataset are
userId 1.0
movieId 1.0
rating 4.0
dtype: float64

So, now the minimum rating given by users is 4.0, and the dataset has been reduced from about 2.6e7 rows to about 1.3e7, roughly half of the original dataset. But the dataset is still large, and we want to reduce it more. For the intuition of this article, I want to work on a small dataset. So, we will take a subset of this dataset for only the first 200 movies. Later, when we reduce it further to the first 100 users, we may have fewer than 200 movies that were actually rated by those users, and we want to work with around 100 movies.

movies_list = np.unique(ratings['movieId'])[:200]
ratings = ratings.loc[ratings['movieId'].isin(movies_list)]
print('Shape of ratings dataset is: ', ratings.shape, '\n')
print('Max values in dataset are\n', ratings.max(), '\n')
print('Min values in dataset are\n', ratings.min(), '\n')

Out:
Shape of ratings dataset is: (776269, 3)
Max values in dataset are
userId 270896.0
movieId 201.0
rating 5.0
dtype: float64
Min values in dataset are
userId 1.0
movieId 1.0
rating 4.0
dtype: float64

Still the dataset is large, so we take another subset of the ratings by restricting it not to all users but to some of them, i.e. the first 100 users.

users_list = np.unique(ratings['userId'])[:100]
ratings = ratings.loc[ratings['userId'].isin(users_list)]
print('Shape of ratings dataset is: ', ratings.shape, '\n')
print('Max values in dataset are\n', ratings.max(), '\n')
print('Min values in dataset are\n', ratings.min(), '\n')
print('Total Users: ', np.unique(ratings['userId']).shape[0])
print('Total Movies which are rated by 100 users: ', np.unique(ratings['movieId']).shape[0])

Out:
Shape of ratings dataset is: (447, 3)
Max values in dataset are
userId 157.0
movieId 198.0
rating 5.0
dtype: float64
Min values in dataset are
userId 1.0
movieId 1.0
rating 4.0
dtype: float64
Total Users: 100
Total Movies which are rated by 100 users: 83

And finally, it’s done. We have a dataset of shape (447, 3), which includes 4+ ratings of 83 movies by 100 users. We started with 200 movies, but when we restricted the data to the first 100 users, it turned out that 117 of those movies had not been rated by any of them. Now we are no longer worried about the ratings column; we have supposed that each movie which is rated 4+ by a user is of his/her interest. So, if a movie is an interest of user 1, then that movie will also be an interest of another user 2 of the same taste. We can drop this column, as each listed movie is a favorite of the corresponding user.
Back to our subset. Since we are no longer concerned with the rating column, and we have supposed that every movie a user rated 4+ is of his/her interest (so if a movie interests user 1, it will also interest another user 2 of similar taste), we can drop that column: every remaining movie is a favorite of its user.

users_fav_movies = ratings.loc[:, ['userId', 'movieId']]

Because we filtered the DataFrame, its index is no longer contiguous, so we reset it.

users_fav_movies = users_fav_movies.reset_index(drop = True)

And here is our final DataFrame of the first 100 users' favorite movies from the list of the first 200 movies. The DataFrame below is printed transposed.

users_fav_movies.T

Now let's save this DataFrame to a csv file locally, so that we can use it later.

users_fav_movies.to_csv('./Prepairing Data/From Data/filtered_ratings.csv')

Data Featuring

In this section, we will create the sparse matrix that we'll feed to k-means. First, let's define a function that returns a movies list for each user in the dataset.

def moviesListForUsers(users, users_data):
    # users = a list of user IDs
    # users_data = a dataframe of users' favourite (or watched) movies
    users_movies_list = []
    for user in users:
        users_movies_list.append(str(list(users_data[users_data['userId'] == user]['movieId'])).split('[')[1].split(']')[0])
    return users_movies_list

The method moviesListForUsers returns a list containing one string per user, each string holding that user's favorite movie IDs. Later we will use CountVectorizer to extract features from these strings. Note: moviesListForUsers returns the list in the same order as the users list it receives, so to avoid any mix-up we build the users list with np.unique, which returns it sorted.

The method defined above needs a users list and the users_data DataFrame. We already have users_data, so let's prepare the users list.

users = np.unique(users_fav_movies['userId'])
print(users.shape)

Out:
(100,)

Now let's prepare the list of movies for each user.

users_movies_list = moviesListForUsers(users, users_fav_movies)
print('Movies list for', len(users_movies_list), ' users')
print('A list of first 10 users favourite movies: ', users_movies_list[:10])

Out:
Movies list for 100 users
A list of first 10 users favourite movies: ['147', '64, 79', '1, 47', '1, 150', '150, 165', '34', '1, 16, 17, 29, 34, 47, 50, 82, 97, 123, 125, 150, 162, 175, 176, 194', '6', '32, 50, 111, 198', '81']

Above are the favorite-movie strings for the first 10 users: the first string contains the first user's favorite movie IDs, the second the second user's, and so on. Notice that the 7th user's list is longer than the others.

Next, we'll prepare a sparse matrix with one row per user and one column per movie: 1 if the user has the movie in his/her favorites, 0 otherwise. Let's first define a function for the sparse matrix.

def prepSparseMatrix(list_of_str):
    # list_of_str = a list of strings of users' favourite movies, separated by commas ","
    # Returns the sparse matrix and the feature names it is built on,
    # i.e. the movie IDs in the same order as the columns of the sparse matrix
    cv = CountVectorizer(token_pattern = r'[^\,\ ]+', lowercase = False)
    sparseMatrix = cv.fit_transform(list_of_str)
    return sparseMatrix.toarray(), cv.get_feature_names()

Now let's prepare the sparse matrix.

sparseMatrix, feature_names = prepSparseMatrix(users_movies_list)

Let's put it into a DataFrame for a clearer presentation, with columns representing movies and the index holding user IDs.

df_sparseMatrix = pd.DataFrame(sparseMatrix, index = users, columns = feature_names)
df_sparseMatrix
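As a quick cross-check, not part of the original pipeline, the same binary user-by-movie matrix can be built directly in pandas; a minimal sketch, assuming users_fav_movies from above (note that the columns come out as numeric movie IDs rather than the string features CountVectorizer produces):

# Count (userId, movieId) pairs, then clip to 1 to make the matrix binary
crosstab_matrix = pd.crosstab(users_fav_movies['userId'], users_fav_movies['movieId']).clip(upper = 1)
print(crosstab_matrix.shape)   # should be (100, 83), matching sparseMatrix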
Is the matrix we defined above exactly what we want? Let's check it for a few users. First, take a look at some users' favorite-movie lists.

first_6_users_SM = users_fav_movies[users_fav_movies['userId'].isin(users[:6])].sort_values('userId')
first_6_users_SM.T

Now let's check that the users with these IDs have the value 1 in the columns of their favorite movies and 0 otherwise. Remember that in the df_sparseMatrix DataFrame the indexes are user IDs.

df_sparseMatrix.loc[np.unique(first_6_users_SM['userId']), list(map(str, np.unique(first_6_users_SM['movieId'])))]

Comparing the two DataFrames, we can see that our sparse matrix is correct, with values in the proper places. With the data engineering done, let's create our machine learning clustering model with the k-means algorithm.

Clustering Model

To cluster the data, we first need to find the optimal number of clusters. For this purpose, we define a class for the elbow method with two main methods: one that runs the k-means algorithm for a range of cluster counts, and one that shows the plots.

class elbowMethod():
    def __init__(self, sparseMatrix):
        self.sparseMatrix = sparseMatrix
        self.wcss = list()
        self.differences = list()

    def run(self, init, upto, max_iterations = 300):
        for i in range(init, upto + 1):
            kmeans = KMeans(n_clusters = i, init = 'k-means++', max_iter = max_iterations, n_init = 10, random_state = 0)
            kmeans.fit(self.sparseMatrix)
            self.wcss.append(kmeans.inertia_)
        self.differences = list()
        for i in range(len(self.wcss) - 1):
            self.differences.append(self.wcss[i] - self.wcss[i + 1])

    def showPlot(self, boundary = 500, upto_cluster = None):
        if upto_cluster is None:
            WCSS = self.wcss
            DIFF = self.differences
        else:
            WCSS = self.wcss[:upto_cluster]
            DIFF = self.differences[:upto_cluster - 1]
        plt.figure(figsize = (15, 6))
        plt.subplot(121).set_title('Elbow Method Graph')
        plt.plot(range(1, len(WCSS) + 1), WCSS)
        plt.grid(b = True)
        plt.subplot(122).set_title('Differences in Each Two Consecutive Clusters')
        len_differences = len(DIFF)
        X_differences = range(1, len_differences + 1)
        plt.plot(X_differences, DIFF)
        plt.plot(X_differences, np.ones(len_differences) * boundary, 'r')
        plt.plot(X_differences, np.ones(len_differences) * (-boundary), 'r')
        plt.grid()
        plt.show()

Why write the elbow method as a class? Because we don't know in advance where the elbow (the optimal number of clusters) will appear. By keeping the WCSS values in an attribute of the object, we don't lose them between runs: we might first run the elbow method for clusters 1 to 10, plot it, and find that there is no elbow joint yet, so we run the same instance for 11 to 20, and so on until the elbow appears. This saves us from re-running everything from cluster 1 each time, and the data of the previous runs is never lost.

You may have noticed that the class method showPlot draws two plots. This is a second strategy, for when we can't observe a clear elbow: we plot the difference between each two consecutive WCSS values and set a boundary, for a clearer view of how WCSS is changing. When the changes in WCSS stay inside the required boundary, we'll say we have found the elbow, after which the changes are small. See the plots below, where we first analyze clusters 1 to 10 with a boundary of 10. Remember that the DataFrame df_sparseMatrix was only for presentation; for the algorithm we always use the matrix sparseMatrix itself.
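As an aside, not used in this article: scikit-learn also offers the silhouette score as a heuristic for choosing k. It measures how well-separated the clusters are, with values closer to 1 being better. A minimal sketch, assuming sparseMatrix from above:

from sklearn.metrics import silhouette_score

# Silhouette needs at least 2 clusters, so start the scan at k = 2
for k in range(2, 16):
    labels = KMeans(n_clusters = k, init = 'k-means++', n_init = 10, random_state = 0).fit_predict(sparseMatrix)
    print(k, round(silhouette_score(sparseMatrix, labels), 3))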
Let's first create an instance of the elbow method on our sparseMatrix.

elbow_method = elbowMethod(sparseMatrix)

First we run it for 1 to 10 clusters, i.e. k-means runs for k=1, then for k=2, and so on up to k=10.

elbow_method.run(1, 10)
elbow_method.showPlot(boundary = 10)

We don't have any clear elbow yet, and the differences aren't inside the boundary either. So let's run it for clusters 11 to 30.

elbow_method.run(11, 30)
elbow_method.showPlot(boundary = 10)

What happened? We still don't have an elbow, but we do have differences inside the boundary: looking at the differences graph, we observe that after cluster 14 the differences stay almost entirely inside it. So we will run k-means with 15 clusters, because the 14th difference is the difference between k=14 and k=15. With the analysis of the optimal number of clusters k done, let's move on to fitting the model and making recommendations.

Fitting Data on Model

Let's first create the same k-means model and run it to make predictions.

kmeans = KMeans(n_clusters = 15, init = 'k-means++', max_iter = 300, n_init = 10, random_state = 0)
clusters = kmeans.fit_predict(sparseMatrix)

Now let's create a DataFrame where we can see each user's cluster number.

users_cluster = pd.DataFrame(np.concatenate((users.reshape(-1, 1), clusters.reshape(-1, 1)), axis = 1), columns = ['userId', 'Cluster'])
users_cluster.T

Next we'll define a function that creates a list of DataFrames, one per cluster, where each DataFrame contains a movieId and the count for that movie (count: the number of users in the cluster who have that movie in their favorite list). A movie with a higher count is more likely to interest the users in the cluster who have not watched it yet. For example, we'll create a list of the following form:

[dataframe_for_Cluster_1, dataframe_for_Cluster_2, ..., dataframe_for_Cluster_N]

where the Count column of each DataFrame represents the total number of users in the cluster who have that particular movie in their favorites. We sort each DataFrame by Count in order to prioritize the movies most favorited within the cluster.

def clustersMovies(users_cluster, users_data):
    clusters = list(users_cluster['Cluster'])
    each_cluster_movies = list()
    for i in range(len(np.unique(clusters))):
        users_list = list(users_cluster[users_cluster['Cluster'] == i]['userId'])
        users_movies_list = list()
        for user in users_list:
            users_movies_list.extend(list(users_data[users_data['userId'] == user]['movieId']))
        users_movies_counts = list()
        users_movies_counts.extend([[movie, users_movies_list.count(movie)] for movie in np.unique(users_movies_list)])
        each_cluster_movies.append(pd.DataFrame(users_movies_counts, columns = ['movieId', 'Count']).sort_values(by = ['Count'], ascending = False).reset_index(drop = True))
    return each_cluster_movies

cluster_movies = clustersMovies(users_cluster, users_fav_movies)

Now let's take a look at one of the DataFrames in cluster_movies.

cluster_movies[1].T

This cluster contains 30 movies. The movie with ID 1, favorited by 19 users, has top priority, followed by the movie with ID 150, favorited by 8 users.
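For reference, an alternative to the loop above rather than the article's code: the same per-cluster count tables can be produced with a pandas merge and groupby; a minimal sketch, assuming users_cluster and users_fav_movies from above:

# Attach each user's cluster label to their favourites, then count per (cluster, movie)
merged = users_fav_movies.merge(users_cluster, on = 'userId')
cluster_movies_alt = [
    grp.groupby('movieId').size().rename('Count').sort_values(ascending = False).reset_index()
    for _, grp in merged.groupby('Cluster')
]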
Now let's see how many users we have in each cluster.

for i in range(15):
    len_users = users_cluster[users_cluster['Cluster'] == i].shape[0]
    print('Users in Cluster ' + str(i) + ' -> ', len_users)

Out:
Users in Cluster 0 -> 35
Users in Cluster 1 -> 19
Users in Cluster 2 -> 1
Users in Cluster 3 -> 5
Users in Cluster 4 -> 8
Users in Cluster 5 -> 1
Users in Cluster 6 -> 12
Users in Cluster 7 -> 2
Users in Cluster 8 -> 1
Users in Cluster 9 -> 1
Users in Cluster 10 -> 1
Users in Cluster 11 -> 11
Users in Cluster 12 -> 1
Users in Cluster 13 -> 1
Users in Cluster 14 -> 1

As we can see, some clusters contain only 1, 2 or 5 users. We don't want clusters that small, because we can't recommend enough movies within them: a user alone in a cluster gets no movie recommendations at all, and even a user in a cluster of size 2 won't get many. So we have to fix these small clusters.

Fixing Small Clusters

Many clusters contain too few users. We don't want any user to be alone in a cluster; let's say we want at least 6 users in each cluster. So we have to move the users of each small cluster into the large cluster whose movies are most relevant to them.

First of all, we'll write a function to get a user's favorite-movies list.

def getMoviesOfUser(user_id, users_data):
    return list(users_data[users_data['userId'] == user_id]['movieId'])

Now we'll define a function for fixing the clusters.

def fixClusters(clusters_movies_dataframes, users_cluster_dataframe, users_data, smallest_cluster_size = 11):
    # clusters_movies_dataframes: a list containing the movies DataFrame of each cluster
    # users_cluster_dataframe: a DataFrame containing user IDs and their cluster numbers
    # smallest_cluster_size: the smallest cluster size we are willing to keep
    each_cluster_movies = clusters_movies_dataframes.copy()
    users_cluster = users_cluster_dataframe.copy()
    # Convert each DataFrame in each_cluster_movies to a list containing only movie IDs
    each_cluster_movies_list = [list(df['movieId']) for df in each_cluster_movies]
    # First, prepare a list of the users in each cluster -> [[Cluster 0 users], [Cluster 1 users], ..., [Cluster N users]]
    usersInClusters = list()
    total_clusters = len(each_cluster_movies)
    for i in range(total_clusters):
        usersInClusters.append(list(users_cluster[users_cluster['Cluster'] == i]['userId']))
    uncategorizedUsers = list()
    i = 0
    # Now remove the small clusters and collect their users into "uncategorizedUsers".
    # When we remove a cluster, we must also shift down the cluster numbers of the users
    # that come after the deleted cluster. E.g. if we delete cluster 4, users in clusters
    # 5, 6, 7, ..., N are renumbered to 4, 5, 6, ..., N-1.
    for j in range(total_clusters):
        if len(usersInClusters[i]) < smallest_cluster_size:
            uncategorizedUsers.extend(usersInClusters[i])
            usersInClusters.pop(i)
            each_cluster_movies.pop(i)
            each_cluster_movies_list.pop(i)
            users_cluster.loc[users_cluster['Cluster'] > i, 'Cluster'] -= 1
            i -= 1
        i += 1
    for user in uncategorizedUsers:
        elemProbability = list()
        user_movies = getMoviesOfUser(user, users_data)
        if len(user_movies) == 0:
            print(user)
        user_missed_movies = list()
        for movies_list in each_cluster_movies_list:
            count = 0
            missed_movies = list()
            for movie in user_movies:
                if movie in movies_list:
                    count += 1
                else:
                    missed_movies.append(movie)
            elemProbability.append(count / len(user_movies))
            user_missed_movies.append(missed_movies)
        user_new_cluster = np.array(elemProbability).argmax()
        users_cluster.loc[users_cluster['userId'] == user, 'Cluster'] = user_new_cluster
        if len(user_missed_movies[user_new_cluster]) > 0:
            each_cluster_movies[user_new_cluster] = each_cluster_movies[user_new_cluster].append(
                [{'movieId': new_movie, 'Count': 1} for new_movie in user_missed_movies[user_new_cluster]],
                ignore_index = True)
    return each_cluster_movies, users_cluster

Each uncategorized user is assigned to the surviving cluster whose movie list overlaps most with his/her favorites (the fraction stored in elemProbability), and any of the user's favorites missing from that cluster are appended to it with a count of 1. Now, run it.

movies_df_fixed, clusters_fixed = fixClusters(cluster_movies, users_cluster, users_fav_movies, smallest_cluster_size = 6)

To observe the changes made by fixing the clusters, let's first take a look at the data we had before and then at the data after fixing. First we print the clusters that contain at most 5 users.

j = 0
for i in range(15):
    len_users = users_cluster[users_cluster['Cluster'] == i].shape[0]
    if len_users < 6:
        print('Users in Cluster ' + str(i) + ' -> ', len_users)
        j += 1
print('Total Cluster which we want to remove -> ', j)

Out:
Users in Cluster 2 -> 1
Users in Cluster 3 -> 5
Users in Cluster 5 -> 1
Users in Cluster 7 -> 2
Users in Cluster 8 -> 1
Users in Cluster 9 -> 1
Users in Cluster 10 -> 1
Users in Cluster 12 -> 1
Users in Cluster 13 -> 1
Users in Cluster 14 -> 1
Total Cluster which we want to remove -> 10

Now look at the users' cluster DataFrame.

print('Length of total clusters before fixing is -> ', len(cluster_movies))
print('Max value in users_cluster dataframe column Cluster is -> ', users_cluster['Cluster'].max())
print('And dataframe is following')
users_cluster.T

Out:
Length of total clusters before fixing is -> 15
Max value in users_cluster dataframe column Cluster is -> 14
And dataframe is following

Since we'll remove the 10 smallest clusters and be left with 5, after fixing we expect the max value in the Cluster column (which starts from index 0) to be 4. Let's see what happened after fixing the data: all 10 small clusters should have been removed, and the users_cluster DataFrame should no longer contain any user with an invalid cluster.

print('Length of total clusters after fixing is -> ', len(movies_df_fixed))
print('Max value in users_cluster dataframe column Cluster is -> ', clusters_fixed['Cluster'].max())
print('And fixed dataframe is following')
clusters_fixed.T

Out:
Length of total clusters after fixing is -> 5
Max value in users_cluster dataframe column Cluster is -> 4
And fixed dataframe is following
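A design note, as an aside: the overlap-ratio rule above is simple and transparent, but another option, not used in this article, would be to reassign each leftover user to the surviving cluster with the nearest centroid in sparse-matrix space; a rough sketch, assuming users, sparseMatrix and clusters_fixed from above:

def nearest_surviving_cluster(user_row, users, sparseMatrix, clusters_fixed):
    # user_row: the user's 0/1 row vector taken from sparseMatrix
    best_cluster, best_dist = None, np.inf
    for c in np.unique(clusters_fixed['Cluster']):
        member_ids = clusters_fixed[clusters_fixed['Cluster'] == c]['userId']
        # Centroid = mean of the rows belonging to this cluster's members
        centroid = sparseMatrix[np.isin(users, member_ids)].mean(axis = 0)
        dist = np.linalg.norm(user_row - centroid)   # Euclidean distance to the centroid
        if dist < best_dist:
            best_cluster, best_dist = c, dist
    return best_cluster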
Now let's see what happened when the 10 clusters were deleted and how the remaining users' cluster numbers were adjusted. Take the 11th cluster: it already contained enough users (11 of them), so we did not want to delete it; but since we now have only 5 clusters and the max value of the Cluster column is 4, what actually happened to cluster 11? There were 7 small clusters before cluster no. 11 (clusters 2, 3, 5, 7, 8, 9 and 10), all removed, so the label 11 should have been shifted back to 4.

print('Users cluster dataFrame for cluster 11 before fixing:')
users_cluster[users_cluster['Cluster'] == 11].T

Out:
Users cluster dataFrame for cluster 11 before fixing:

Now let's look at cluster 4 after fixing.

print('Users cluster dataFrame for cluster 4 after fixing which should be same as 11th cluster before fixing:')
clusters_fixed[clusters_fixed['Cluster'] == 4].T

Out:
Users cluster dataFrame for cluster 4 after fixing which should be same as 11th cluster before fixing:

Both DataFrames contain the same user IDs, so we did not disturb any surviving cluster, and the list of per-cluster movie DataFrames was renumbered in the same way. Now let's take a look at the list of movie DataFrames.

print('Size of movies dataframe after fixing -> ', len(movies_df_fixed))

Out:
Size of movies dataframe after fixing -> 5

Now let's look at the sizes of the clusters.

for i in range(len(movies_df_fixed)):
    len_users = clusters_fixed[clusters_fixed['Cluster'] == i].shape[0]
    print('Users in Cluster ' + str(i) + ' -> ', len_users)

Out:
Users in Cluster 0 -> 45
Users in Cluster 1 -> 21
Users in Cluster 2 -> 8
Users in Cluster 3 -> 15
Users in Cluster 4 -> 11

Each cluster now contains enough users for us to make recommendations. Let's also look at the size of each cluster's movie list.

for i in range(len(movies_df_fixed)):
    print('Total movies in Cluster ' + str(i) + ' -> ', movies_df_fixed[i].shape[0])

Out:
Total movies in Cluster 0 -> 64
Total movies in Cluster 1 -> 39
Total movies in Cluster 2 -> 15
Total movies in Cluster 3 -> 50
Total movies in Cluster 4 -> 25

We are done training the k-means machine learning model, predicting a cluster for each user, and fixing the small clusters. Finally, we need to store this training so that we can use it later. For this we will use the pickle library, which we imported earlier, to save and load the trained objects. Let's first design a class for saving and loading: general save/load methods plus convenience methods for our particular files.

class saveLoadFiles:
    def save(self, filename, data):
        try:
            file = open('datasets/' + filename + '.pkl', 'wb')
            pickle.dump(data, file)
        except:
            err = 'Error: {0}, {1}'.format(exc_info()[0], exc_info()[1])
            print(err)
            return [False, err]
        else:
            file.close()
            return [True]

    def load(self, filename):
        try:
            file = open('datasets/' + filename + '.pkl', 'rb')
        except:
            err = 'Error: {0}, {1}'.format(exc_info()[0], exc_info()[1])
            print(err)
            return [False, err]
        else:
            data = pickle.load(file)
            file.close()
            return data

    def loadClusterMoviesDataset(self):
        return self.load('clusters_movies_dataset')

    def saveClusterMoviesDataset(self, data):
        return self.save('clusters_movies_dataset', data)

    def loadUsersClusters(self):
        return self.load('users_clusters')

    def saveUsersClusters(self, data):
        return self.save('users_clusters', data)

In the class above, exc_info (imported from the sys library) is used for error handling and error messages. We'll use the saveClusterMoviesDataset/loadClusterMoviesDataset methods to save/load the list of per-cluster movie DataFrames, and the saveUsersClusters/loadUsersClusters methods to save/load the users' clusters DataFrame.
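As an aside, not part of the original code: joblib offers a one-line alternative for persisting objects like these; a minimal sketch (the .joblib filename is illustrative):

import joblib

# joblib.dump/load mirror pickle's save/load behaviour for arbitrary Python objects
joblib.dump(movies_df_fixed, 'datasets/clusters_movies_dataset.joblib')
movies_df_restored = joblib.load('datasets/clusters_movies_dataset.joblib')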
Now let's try it. We run the save methods and print the responses to check whether any error comes up; if they return True, it means our files have been saved successfully in the proper place.

saveLoadFile = saveLoadFiles()
print(saveLoadFile.saveClusterMoviesDataset(movies_df_fixed))
print(saveLoadFile.saveUsersClusters(clusters_fixed))

Out:
[True]
[True]

The response is True for both save methods, so our trained data is saved and we can use it later. Let's check that we can load it back.

load_movies_list, load_users_clusters = saveLoadFile.loadClusterMoviesDataset(), saveLoadFile.loadUsersClusters()
print('Type of Loading list of Movies dataframes of 5 Clusters: ', type(load_movies_list), ' and Length is: ', len(load_movies_list))
print('Type of Loading 100 Users clusters Data: ', type(load_users_clusters), ' and Shape is: ', load_users_clusters.shape)

Out:
Type of Loading list of Movies dataframes of 5 Clusters: <class 'list'> and Length is: 5
Type of Loading 100 Users clusters Data: <class 'pandas.core.frame.DataFrame'> and Shape is: (100, 2)

We have successfully saved and loaded our data using the pickle library. We worked with a very small dataset here, but movie recommendation systems usually work with very large datasets, like the one we started with, where each cluster has plenty of movies to recommend. Now we need to design the functions that make recommendations to users.

Recommendations for Users

Here we'll create a class that recommends to a user the most-favorited movies in his/her cluster that the user has not added to favorites yet. And whenever a user adds another movie to his/her favorite list, we also have to update the cluster movies dataset.

class userRequestedFor:
    def __init__(self, user_id, users_data):
        self.users_data = users_data.copy()
        self.user_id = user_id
        # Find the user's cluster
        users_cluster = saveLoadFiles().loadUsersClusters()
        self.user_cluster = int(users_cluster[users_cluster['userId'] == self.user_id]['Cluster'])
        # Load the user's cluster movies DataFrame
        self.movies_list = saveLoadFiles().loadClusterMoviesDataset()
        self.cluster_movies = self.movies_list[self.user_cluster]          # dataframe
        self.cluster_movies_list = list(self.cluster_movies['movieId'])    # list

    def updatedFavouriteMoviesList(self, new_movie_Id):
        if new_movie_Id in self.cluster_movies_list:
            self.cluster_movies.loc[self.cluster_movies['movieId'] == new_movie_Id, 'Count'] += 1
        else:
            self.cluster_movies = self.cluster_movies.append([{'movieId': new_movie_Id, 'Count': 1}], ignore_index = True)
        self.cluster_movies.sort_values(by = ['Count'], ascending = False, inplace = True)
        self.movies_list[self.user_cluster] = self.cluster_movies
        saveLoadFiles().saveClusterMoviesDataset(self.movies_list)

    def recommendMostFavouriteMovies(self):
        try:
            user_movies = getMoviesOfUser(self.user_id, self.users_data)
            cluster_movies_list = self.cluster_movies_list.copy()
            for user_movie in user_movies:
                if user_movie in cluster_movies_list:
                    cluster_movies_list.remove(user_movie)
            return [True, cluster_movies_list]
        except KeyError:
            err = "User history does not exist"
            print(err)
            return [False, err]
        except:
            err = 'Error: {0}, {1}'.format(exc_info()[0], exc_info()[1])
            print(err)
            return [False, err]
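The update path can be exercised directly; a small hypothetical usage sketch (the movie ID 50 is arbitrary, chosen only for illustration):

# Record that user 12 has favourited movie 50: its Count in the cluster's table
# is incremented (or a new row is added) and the dataset on disk is re-saved,
# so running this actually modifies the saved training data
userRequestedFor(12, users_fav_movies).updatedFavouriteMoviesList(50)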
Now let's try it out by making recommendations. First we'll import data that has movie details like title and genres, not only IDs.

movies_metadata = pd.read_csv('./Prepairing Data/From Data/movies_metadata.csv', usecols = ['id', 'genres', 'original_title'])
movies_metadata = movies_metadata.loc[movies_metadata['id'].isin(list(map(str, np.unique(users_fav_movies['movieId']))))].reset_index(drop = True)
print('Let us take a look at the movie metadata for all the movies we had in our dataset')
movies_metadata

Out:
Let us take a look at the movie metadata for all the movies we had in our dataset

Here is the list of movies that the user with ID 12 has added to his/her favorite movies.

user12Movies = getMoviesOfUser(12, users_fav_movies)
for movie in user12Movies:
    title = list(movies_metadata.loc[movies_metadata['id'] == str(movie)]['original_title'])
    if title != []:
        print('Movie title: ', title, ', Genres: [', end = '')
        genres = ast.literal_eval(movies_metadata.loc[movies_metadata['id'] == str(movie)]['genres'].values[0].split('[')[1].split(']')[0])
        for genre in genres:
            print(genre['name'], ', ', end = '')
        print(end = '\b\b]')
        print('')

Out:
Movie title: ['Dancer in the Dark'] , Genres: [Drama , Crime , Music , ]
Movie title: ['The Dark'] , Genres: [Horror , Thriller , Mystery , ]
Movie title: ['Miami Vice'] , Genres: [Action , Adventure , Crime , Thriller , ]
Movie title: ['Tron'] , Genres: [Science Fiction , Action , Adventure , ]
Movie title: ['The Lord of the Rings'] , Genres: [Fantasy , Drama , Animation , Adventure , ]
Movie title: ['48 Hrs.'] , Genres: [Thriller , Action , Comedy , Crime , Drama , ]
Movie title: ['Edward Scissorhands'] , Genres: [Fantasy , Drama , Romance , ]
Movie title: ['Le Grand Bleu'] , Genres: [Adventure , Drama , Romance , ]
Movie title: ['Saw'] , Genres: [Horror , Mystery , Crime , ]
Movie title: ["Le fabuleux destin d'Amélie Poulain"] , Genres: [Comedy , Romance , ]

And finally, these are the top recommended movies for that user (we scan the first 15 recommended IDs and print the 10 that have metadata available):

user12Recommendations = userRequestedFor(12, users_fav_movies).recommendMostFavouriteMovies()[1]
for movie in user12Recommendations[:15]:
    title = list(movies_metadata.loc[movies_metadata['id'] == str(movie)]['original_title'])
    if title != []:
        print('Movie title: ', title, ', Genres: [', end = '')
        genres = ast.literal_eval(movies_metadata.loc[movies_metadata['id'] == str(movie)]['genres'].values[0].split('[')[1].split(']')[0])
        for genre in genres:
            print(genre['name'], ', ', end = '')
        print(']', end = '')
        print()

Out:
Movie title: ['Trois couleurs : Rouge'] , Genres: [Drama , Mystery , Romance , ]
Movie title: ["Ocean's Eleven"] , Genres: [Thriller , Crime , ]
Movie title: ['Judgment Night'] , Genres: [Action , Thriller , Crime , ]
Movie title: ['Scarface'] , Genres: [Action , Crime , Drama , Thriller , ]
Movie title: ['Back to the Future Part II'] , Genres: [Adventure , Comedy , Family , Science Fiction , ]
Movie title: ["Ocean's Twelve"] , Genres: [Thriller , Crime , ]
Movie title: ['To Be or Not to Be'] , Genres: [Comedy , War , ]
Movie title: ['Back to the Future Part III'] , Genres: [Adventure , Comedy , Family , Science Fiction , ]
Movie title: ['A Clockwork Orange'] , Genres: [Science Fiction , Drama , ]
Movie title: ['Minority Report'] , Genres: [Action , Thriller , Science Fiction , Mystery , ]

And with that, we have successfully recommended movies to a user based on his/her interests, using the movies most favorited by similar users.

You're Done

Thanks for reading this article.
If you want the deployment code for this whole project, please visit my GitHub repository AI Movies Recommendation System Based on K-means Clustering Algorithm and download it to work with; it is completely free for everyone. Thank You
https://asdkazmi.medium.com/ai-movies-recommendation-system-with-clustering-based-k-means-algorithm-f04467e02fcd
['Syed Muhammad Asad']
2020-08-19 11:21:08.647000+00:00
['Machine Learning', 'Artificial Intelligence', 'Recommendation System', 'Python', 'K Means Clustering']
2,493
Smile, You’re on Camera: The Future of Emotional Advertising
Smile, You’re on Camera: The Future of Emotional Advertising For those worried about “Big Brother,” you should probably stop reading now. There is a new technology on the market that takes behavioral tracking to a whole new level. Born from MIT’s Media Lab, Affectiva allows advertisers to record and analyze human emotional responses based on subtle, involuntary facial cues. The insights generated by this software completely surpass other creative testing methods by providing a treasure trove of accurate, objective and on-demand data. Affectiva requires no $40K eye-tracking goggles or other extraneous technology, just your own computer. By tapping into any webcam’s existing functions, Affectiva can scan faces for subtle micro-shifts. The slightest uptick of an eyebrow or twitch of the mouth could indicate an emotional response to content. In real time, Affectiva catalogues facial movements and displays results almost immediately after completion of the video. My Affdex data after watching Budweiser’s “Puppy Love” ad. It gets me every time! Not only does Affectiva track key emotions like happiness, sadness and anger, it also has the potential to detect cultural nuances. Its software is now able to catch the “politeness smile,” an expression prevalent in Southeast Asia and India but rare in the Americas, Africa and Europe. As its database of faces expands and Affectiva’s “emotional AI” continues to grow in complexity, advertisers will be able to predict and decipher unique emotional responses to their work across cultures, genders and national borders. This technology also solves the age-old dilemma of advertisers and psychologists alike. While tests and surveys can attempt to gauge emotional responses to ads before, during and after exposure, their results are often subjective and not generalizable to the natural environments where consumers would actually watch them. However, Affectiva’s technology can be used on any device, anywhere in the world. When using this technology, the only cue that hints you’re in a study is the light that comes on next to your webcam. You might be wondering, are brands watching me right now? Do they have a databank of videos of me crying to TD Bank’s #TDThanksYou ads? The answer is no, do not worry. Affectiva’s services are explicitly opt-in and require consent from the end user. We reached out to Affectiva to see how exactly they recruit subjects and will update this when we receive a response. In the meantime, you can try Affectiva out for yourself here. So, this is cool and all, but how can we use it to optimize ads? The information generated by Affectiva can be used to amplify key emotional moments, help place a call to action, or just prove that an ad is objectively awesome. The output of Affdex Discovery, Affectiva’s ad analysis software, clearly maps out the levels of surprise, smile, concentration, dislike, valence, attention, and expressiveness throughout a video. It also segments the data by age-group and gender, allowing marketers to see reactions specific to their target demographic. Best of all, performance on Affdex tests can accurately predict sales growth. Nice. Affectiva is truly disruptive, even beyond advertising. Its technology has been used to build an app that helps people with Autism get real-time feedback on social interactions and in a video game that adapts to the player’s emotions. For brands, Affectiva represents a way to avoid the nemesis of emotional advertising, neutrality. 
Unilever, Kellogg’s, Mars, and CBS are already on the bandwagon… who’s next?
https://medium.com/comms-planning/smile-youre-on-camera-the-future-of-emotional-advertising-a179cd8366ed
['Ali Goldsmith']
2017-10-09 16:34:29.852000+00:00
['Marketing', 'Psychology', 'Emotions', 'Digital Marketing', 'Advertising']
2,494
Web scraping with Python & BeautifulSoup
The web contains lots of data. The ability to extract the information you need from it is, without a doubt, a useful one, even a necessary one. Of course, there are lots of datasets already available for download, in places like Kaggle, but in many cases you won’t find the exact data you need for your particular problem. Chances are, though, that you’ll find what you need somewhere on the web and will have to extract it from there. Web scraping is the process of doing this: extracting data from web pages.

In this article, we’ll see how to do web scraping in Python. There are several libraries you can use for this task; here we will use Beautiful Soup 4. This library takes care of extracting data from an HTML document, not downloading it. For downloading web pages, we need another library: requests. So we’ll need two packages:

requests — for downloading the HTML code from a given URL
beautifulsoup4 — for extracting data from that HTML string

Installing the libraries

Now, let’s start by installing the required packages. Open a terminal window and type:

python -m pip install requests beautifulsoup4

…or, if you’re using a conda environment:

conda install requests beautifulsoup4

Now, try to run the following:

import requests
from bs4 import BeautifulSoup

If you don’t get any error, then the packages are installed successfully.

Using requests and Beautiful Soup to extract data

From the requests package we will use the get() function to download a web page from a given URL:

requests.get(url, params=None, **kwargs)

Where the parameters are:

url — the URL of the desired web page
params — an optional dictionary, list of tuples, or bytes to send in the query string
**kwargs — optional arguments that request takes

This function returns an object of type requests.Response. Among this object’s attributes and methods, we are most interested in the .content attribute, which holds the HTML string of the target web page. Example:

html_string = requests.get("http://www.example.com").content

After we get the HTML of the target web page, we use the BeautifulSoup() constructor to parse it and obtain a BeautifulSoup object that we can use to navigate the document tree and extract the data we need:

soup = BeautifulSoup(markup_string, parser)

Where:

markup_string — the string of our web page
parser — the name of the parser to be used; here we will use Python’s default parser: “html.parser”

Note that the first parameter is named “markup_string” rather than “html_string” because BeautifulSoup can be used with other markup languages as well, not just HTML, as long as we specify an appropriate parser; e.g. we can parse XML by passing “xml” as the parser.
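In practice, these two steps are often wrapped in a small helper. Here is a minimal sketch of one (the get_soup name and the error handling are our own additions, not part of the original code):

import requests
from bs4 import BeautifulSoup

def get_soup(url):
    # Download the page; raise_for_status() turns HTTP error
    # codes (404, 500, ...) into exceptions instead of letting
    # us silently parse an error page
    response = requests.get(url)
    response.raise_for_status()
    return BeautifulSoup(response.content, "html.parser")

soup = get_soup("http://www.example.com")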
A BeautifulSoup object has several methods and attributes that we can use to navigate within the parsed document and extract data from it. The most used method is .find_all():

soup.find_all(name, attrs, recursive, string, limit, **kwargs)

name — the name of the tag; e.g. “a”, “div”, “img”
attrs — a dictionary with the tag’s attributes; e.g. {“class”: “nav”, “href”: “#menuitem”}
recursive — boolean; if false, only direct children are considered; if true (the default), all descendants are examined in the search
string — used to search for strings in the element’s content
limit — limits the search to at most this number of found elements

Example:

soup.find_all("a", attrs={"class": "nav", "data-foo": "value"})

The line above returns a list of all “a” elements that also have the specified attributes. HTML attributes that cannot be confused with this method’s parameters or Python’s keywords (unlike “class”) can be passed directly as function parameters, without putting them inside the attrs dictionary. The HTML class attribute can also be used this way, but instead of class=”…” write class_=”…”. Example:

soup.find_all("a", class_="nav")

Because this method is the most used one, it has a shortcut: calling the BeautifulSoup object directly has the same effect as calling the .find_all() method. Example:

soup("a", class_="nav")

The .find() method is like .find_all(), but it stops the search after it finds the first element, which is the one returned. It is roughly equivalent to .find_all(..., limit=1), except that instead of returning a list it returns a single element.

The .contents attribute of a BeautifulSoup object is a list of all its child elements. If the current element contains no nested HTML elements, then .contents[0] is just the text inside it. So, after getting the element that holds the data we need using the .find_all() or .find() methods, all we have to do to get the data inside it is to access .contents[0]. Example:

soup = BeautifulSoup('''
<div>
    <span class="rating">5</span>
    <span class="views">100</span>
</div>
''', "html.parser")

views = soup.find("span", class_="views").contents[0]

What if the piece of data we need is not inside the element, but stored as the value of an attribute? We can access an element’s attribute value as follows:

soup['attr_name']

Example:

soup = BeautifulSoup('''
<div>
    <img src="./img1.png">
</div>
''', "html.parser")

img_source = soup.find("img")['src']
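Putting these pieces together, here is a short sketch that collects an attribute value from all matching elements (the markup is invented for illustration; .get() is the dictionary-style accessor that returns None instead of raising a KeyError when the attribute is missing):

from bs4 import BeautifulSoup

soup = BeautifulSoup('''
<div>
    <img src="./img1.png">
    <img src="./img2.png">
    <img alt="no source here">
</div>
''', "html.parser")

# img['src'] would raise KeyError on the last tag;
# img.get("src") simply returns None for it
sources = [img.get("src") for img in soup.find_all("img") if img.get("src")]
# ['./img1.png', './img2.png']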
Web scraping example: get the top 10 Linux distros

Now, let’s see a simple web scraping example using the concepts above. We will extract a list of the top 10 most popular Linux distros from the DistroWatch website. DistroWatch (https://distrowatch.com/) is a website featuring news about Linux distros and open source software that runs on Linux. On its right side, it has a ranking of the most popular distros; from this ranking we will extract the first 10.

Firstly, we download the web page and construct a BeautifulSoup object from it:

import requests
from bs4 import BeautifulSoup

soup = BeautifulSoup(
    requests.get("https://distrowatch.com/").content,
    "html.parser")

Then, we need to find out how to identify the data we want inside the HTML code. For that, we will use Chrome’s developer tools. Right-click somewhere on the web page and then click on “Inspect”, or press “Ctrl+Shift+I”, to open Chrome’s developer tools. If you then click on the little arrow in the top-left corner of the developer tools and click on some element on the web page, you will see in the dev tools window the piece of HTML associated with that element. After that, you can use the information from the dev tools window to tell Beautiful Soup where to find the element.

In our example, we can see that the ranking is structured as an HTML table and each distro name sits inside a td element with class “phr2”. Inside that td element is a link containing the text we want to extract (the distro’s name). That’s what we do in the next few lines of code:

top_ten_distros = []
distro_tds = soup("td", class_="phr2", limit=10)
for td in distro_tds:
    top_ten_distros.append(td.find("a").contents[0])

The result is a Python list with the names of the ten most popular distros.
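For convenience, here is the whole example as one self-contained script (a sketch that assumes DistroWatch’s markup still places the ranking names in td cells with class “phr2”; the function name is ours):

import requests
from bs4 import BeautifulSoup

def top_distros(n=10):
    # Download and parse the DistroWatch home page
    html = requests.get("https://distrowatch.com/").content
    soup = BeautifulSoup(html, "html.parser")
    # Each distro name is the text of the <a> inside a <td class="phr2"> cell
    cells = soup.find_all("td", class_="phr2", limit=n)
    return [td.find("a").contents[0] for td in cells]

if __name__ == "__main__":
    for rank, name in enumerate(top_distros(), start=1):
        print(rank, name)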
https://towardsdatascience.com/web-scraping-with-python-beautifulsoup-40d2ce4b6252
['Dorian Lazar']
2020-11-27 14:30:31.033000+00:00
['Artificial Intelligence', 'Python', 'Data Science', 'Programming', 'Web Scraping']
2,495
Data pipelines on Spark and Kubernetes
Data pipelines on Spark and Kubernetes

Considerations for using Apache Spark and Kubernetes to process data

If you’re running data pipelines and workflows to move data from its sources into the data lake, the team will usually need to process huge amounts of data. To do this in a scalable, cost-effective way and to handle complex computation steps across large amounts of data, Kubernetes is a great choice for scheduling Spark jobs, compared to YARN. Apache Spark is a framework that can quickly perform processing tasks on very large data sets, and Kubernetes is a portable, extensible, open-source platform for managing and orchestrating the execution of containerized workloads and services across a cluster of multiple machines.

From an architectural perspective, when you submit a Spark application you interact directly with the Kubernetes API server, which schedules the driver pod (the container running the Spark driver). The Spark driver and the Kubernetes cluster then talk to each other to request and launch Spark executors. This can happen statically, or dynamically if you enable dynamic allocation.

Dependency management

When the team uses Kubernetes, each Spark app has its own Docker image, which gives the team full isolation and full control of the environment. The team can pin its Spark version, Python version, and dependencies with this architecture. These containers package not just the code required to execute the workload but also all the dependencies needed to run that code, removing the hassle of maintaining a common set of dependencies for all workloads running on shared infrastructure.

Dynamic autoscaling

Another capability of this setup is that the team can run Spark applications with dynamic allocation enabled, backed by autoscaling on the cluster. This also leads to better resource management: the scheduler takes care of picking which nodes to deploy the workloads on, and in the cloud, scaling a cluster up or down is quick and easy because it is just a matter of adding or removing VMs, something the managed Kubernetes offerings have helpers for. In practice this is a major cost saver.

Deployment

In today’s hybrid cloud world, enterprises want to prevent lock-in. Running Spark on Kubernetes means building once and deploying anywhere, which makes a cloud-agnostic approach scalable.

Metrics and security

For metrics, the team can export everything to a time series DB. This makes it possible to superpose Spark stage boundaries on the resource usage metrics. Kubernetes has a role-based access control (RBAC) model and built-in secrets management, and there are many open source projects the team can leverage, like HashiCorp Vault, that make managing security easy.

Finally, running Spark on Kubernetes will save the team time. Data scientists’, data engineers’, and data architects’ time is valuable, and this setup will bring more productivity to those people and departments, which could lead to savings.
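As a concrete illustration of that submission flow, here is a minimal spark-submit sketch against a Kubernetes master. The API server address, image name, and application path are placeholders; the configuration keys themselves are standard Spark-on-Kubernetes properties (shuffle tracking is what lets dynamic allocation work without an external shuffle service):

spark-submit \
  --master k8s://https://<k8s-api-server>:6443 \
  --deploy-mode cluster \
  --name my-pipeline \
  --conf spark.kubernetes.container.image=<registry>/my-spark-app:latest \
  --conf spark.executor.instances=4 \
  --conf spark.dynamicAllocation.enabled=true \
  --conf spark.dynamicAllocation.shuffleTracking.enabled=true \
  local:///opt/spark/app/pipeline.py

The local:// scheme tells Spark the application file is already baked into the container image, which is the usual pattern when each app ships as its own Docker image.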
https://medium.com/acing-ai/data-pipelines-on-spark-and-kubernetes-8346d246ff6e
['Vimarsh Karbhari']
2020-09-03 15:27:06.936000+00:00
['Machine Learning', 'Artificial Intelligence', 'Spark', 'Data Science', 'Kubernetes']
2,496
LOL — Issue 26
I reproduced a poem that I’d written, An Ode To Cockroaches, on an Instagram post where she’d written a photo poem, Ode To Spiders. She messaged me and shared this poem with me, saying that she went batshit crazy after she read it. The poem is so confrontational and intense, with its roaches reference, that it will make you shudder in awe.
https://medium.com/lol-weekly-list-of-lit/lol-issue-26-d54b0a7bd214
['Arihant Verma']
2017-08-23 05:51:08.828000+00:00
['Storytelling', 'Poetry', 'Writing', 'Pornography', 'Lolissue']
2,497
How to Solve Conflict Productively at Work
How to Solve Conflict Productively at Work

Seven strategies to turn tensions into fuel for growth.

“Conflict is the beginning of consciousness.” — M. Esther Harding

Tensions are a source of personal and organizational growth. Conflict is neither good nor bad; managed poorly, it can deteriorate culture and collaboration. Conflict keeps teams and organizations alive. It’s the tension that challenges people to adapt, learn, innovate, and grow.

Unfortunately, most organizations and leaders see conflict as a bad thing. They have an idealized version of collaboration and expect people always to get along and agree on everything. Positive dissent is vital to maximize opportunities and uncover new ones. Cognitive dissonance makes teams smarter, as research shows. Innovation feeds off diverse perspectives, skills, and experiences.

A practical approach to addressing conflict is adhering to the following ethos: friction creates energy, and energy drives creativity. You can try to avoid conflict, but you cannot escape it. Tensions are unavoidable. Arguing is an excellent thing if you and your team can do it in a healthy way. Great leaders confront conflict rather than avoid it. They turn tensions into fuel for growth.

The balance between appreciation and challenge

“Peace is not the absence of conflict, but the ability to cope with it.” — Mahatma Gandhi

Silence is the enemy of collaboration. 85% of people have failed to raise an issue with their boss even when it would harm the organization. How can your team grow if you don’t share your genuine opinions? How can your company innovate if your colleagues keep their best ideas to themselves? Managing diverse perspectives, tensions, and disagreements is not easy. But silence causes more harm in the long run.

Practice Radical Candor instead. Find the sweet spot between caring about your colleagues and challenging them. Caring too much about your team can be harmful; people need to be challenged too. Your role as a leader is to help people grow. Only appreciating the good can be as detrimental as not caring at all. Radical Candor means saying what you really think while also caring about the person.

As Kim Scott explains in Radical Candor, how we manage conflict can be broken down on a grid. One axis is Challenge Directly, the other is Care Personally. Most people fail to provide Radical Candor when they fall into one of the following quadrants.

Obnoxious Aggression happens when you challenge someone but don’t care. It’s praise that isn’t sincere, or criticism that isn’t delivered kindly. Aggression fuels defensiveness; feedback feels like torture, not a gift.

Ruinous Empathy happens when you want to be nice and don’t challenge people. You provide unspecific praise or sugarcoat the feedback. Ruinous Empathy feeds ignorance; it doesn’t give people clear insights to improve their game.

Manipulative Insincerity happens when you neither care about people nor challenge them. You praise others without being specific or sincere, or criticize them without being kind. Manipulative Insincerity seeds mistrust; it encourages backstabbing, passive-aggressiveness, and toxic behaviors.

Radical Candor is the sweet spot. You help people grow in a positive, caring way. It means pushing others beyond their comfort zone without being disrespectful. You solve tensions in a healthy way.

Seven strategies to turn conflict into fuel

“Never have a battle of wits with an unarmed person.” — Mark Twain

1. Turn arguing into a natural practice

Don’t avoid conflict; face it head-on. The sooner you address tensions, the easier you’ll solve them. Conflict also becomes more personal once it escalates. Teamwork is a contact sport, and friction creates energy that can propel your team forward. Train your people to build a practice of arguing in a healthy way.

2. Build a culture of trust and respect

Radical Candor is not about saying everything that comes to mind; it’s about being helpful but also respectful. Start by setting clear ground rules for dissent. Don’t assume respect means the same thing to everyone. Include your team in creating the framework, and make it clear and public. No name-calling or personal attacks; there is no winner but the team. Be ready to show some vulnerability: leaders expect people to challenge each other but often have a hard time accepting criticism themselves. Be patient. It takes time to find balance, especially if your team usually plays too nice.

3. Address real tensions

Most conflict is built on miscommunication and misunderstanding. People assume things or let emotions filter their judgment. Practice separating real tensions from perceived ones. Which are the real, objective problems? And which are we creating? Also, avoid anticipation. Most teams worry about what might happen in the future. Don’t get stuck on future tensions; they might happen or not. Focus on solving real, present tensions.

4. Focus on the task, not the person

Conflict becomes a war when we make it personal. Most people can’t separate the idea from the person. Encourage a spirit of curiosity. Focus the debate on the idea, task, or project. Avoid making it, or taking it, personally. Keep the discussion about facts, logic, and events, not people. Train your team to separate their identities from their points of view: that someone doesn’t like their ideas doesn’t mean they are being attacked. Give people the benefit of the doubt. Remember Wikipedia’s rule: “Assume good faith.”

5. Encourage diversity of thinking

Cognitive biases come in many forms and shapes. They blind our perspective and make us feel overconfident; we want to repeat our past successes. Diversity of thought requires more than hiring diverse talent. Encourage people to speak up. Take turns so loud people don’t influence or silence quiet voices. Challenge people’s ideas and assumptions, and invite them to challenge yours as well.

6. Be intellectually humble

Usually, people get stuck trying to be right. Their discussions are not meant to find the best solution; they just want to win an argument. Intellectual humility turns people into better leaders, as I wrote here. Humble leaders don’t let their ego blind their judgment, and they feel okay saying “I was wrong.” Reward people for making progress, not for being right. Solving tensions is not about winning an argument but finding the best answer to a problem. It takes wisdom to integrate opposing views.

7. Hit me with your best shot

Start by asking your team to criticize you. Ask for feedback. Prove you can take criticism before you start dishing it out. Embrace your discomfort so people will embrace theirs. Kim Scott suggests not letting anyone off the hook: if they don’t say much, push back. It will take time for people to feel comfortable criticizing you, so pay close attention to silence. Soliciting feedback is an ongoing practice. Start small. Use casual meetings to ignite the conversation. Try kicking off a meeting by asking, “What’s not working?” and “What’s working?”

Most people were taught to stay silent when they don’t have anything nice to say. Conflict is not about being harsh either, but about feeling comfortable addressing tensions. You can only solve a problem that is made public.

Tensions are fuel for growth. There’s no perfect way to avoid conflict, and avoidance only makes things worse. The key to solving tensions is addressing them head-on. Model the behavior: prove you can take criticism first, before you encourage others to be radically candid.
https://medium.com/swlh/how-to-solve-conflict-productively-at-work-778e36755d60
['Gustavo Razzetti']
2019-09-17 15:04:21.492000+00:00
['Leadership', 'Work', 'Productivity', 'Conflict', 'Teamwork']
2,498
Recipe for Success. A new system that creates clean gas…
Recipe for Success

Green Heat is helping Ugandan households fire their kitchens with environmentally friendly gas heat made from natural waste.

Mama Justice doesn’t let her age slow her down. At 70 years old, she runs a small pig farm in Buwambo, Wakiso District in Uganda. And until recently, she gathered firewood every day to cook each meal for her four grandchildren. Mama Justice is a fantastic cook — easily whipping up meals of banana-like matoke, and cassava and groundnut paste with fish after her many years of practice. And she takes pride in her kitchen, even though the smoke from the firewood often made it difficult for Mama Justice and her grandchildren to breathe in her small home.

Mama Justice in her home in Buwambo, Wakiso District (Uganda).

After 70 years of hauling firewood, Mama Justice was beginning to struggle with the physical burden. A friend suggested that Green Heat, a social enterprise based in Kampala, could help. The Green Heat team recommended that she use an anaerobic digester instead of firewood. Anaerobic digesters use biodegradable waste, such as plant leaves and livestock manure, to create clean-burning fuel. Microorganisms and bacteria break down the waste in a dark, oxygen-starved environment until the mixture has fermented and renewable gas is produced.

Mama Justice’s kitchen filling with smoke from her wood-burning stove.

Vianney Tumwesige, the managing director of Green Heat, and his team informed Justice that the digester could pipe clean gas directly into her kitchen — eliminating her reliance on firewood and ridding her home of dangerous smoke. The innovative Green Heat digester would also help her conserve water, saving her time-consuming and backbreaking trips to the well multiple times a day. “Green Heat’s digester recycles water back into the system,” explains Tumwesige. “It’s better for the environment and less work for the farmer to maintain.”

Mama Justice had never used waste from her pigs as a source of energy before, and she had her doubts. As the cook responsible for feeding such a large family, she needed a reliable source of fuel. And she was concerned that cooking with a different kind of fuel would change the taste of her food. Justice wanted to know that her beloved recipes wouldn’t change when her fuel source did. Tumwesige assured her that her food would remain delicious.

It took some patience. First, Mama Justice had to invest in a few more pigs to provide enough manure to power the system. Then the digester needed several months to begin breaking down the waste and start filling the fuel tanks. In all, it took about 3 months and 2,000 liters of waste to get the system up and running at full capacity.

Then came the true test. Mama Justice set up at her new gas stove, with its shimmering blue flame, and set to work peeling onions, adding tomatoes, and making her “famous green sauce.” It took over an hour to prepare, but when she was finished, she took a large spoon and tried the sauce — cooked for the first time with clean gas. The food tasted delicious, as always. Best of all, she didn’t have itchy, watery eyes like she usually did after an hour spent laboring over a hot, wood-burning stove! Mama Justice was so excited she invited the whole Green Heat team to stay for lunch.

Mama Justice and her family gather for a delicious meal.

All her life, Mama Justice had endured problems from the smoke in her kitchen — coughing, wheezing, and enduring eye pain.
The Green Heat anaerobic digester not only brought energy into her home using waste she was already producing and eliminated her need to gather wood; it also provided a healthier environment in which to cook for her family. And as she teaches her recipes to the next generation, the only tears they will cry will be from peeling onions — not from smoke.

The Hattaway team co-created this story with Green Heat and Securing Water for Food, an organization working with entrepreneurs and scientists around the world to help farmers grow more food with less water. To learn more about Green Heat, visit them on Facebook.
https://medium.com/aspirational/recipe-for-success-firing-ugandan-kitchens-with-natural-waste-97f82e90bd29
['Hattaway Communications']
2018-08-04 00:12:37.220000+00:00
['Storytelling', 'Archive', 'Sustainability', 'Innovation', 'Uganda']
2,499
The RSI² Leading Indicator. Detecting Trend Exhaustion Early in Trading.
The Relative Strength Index

We all know about the Relative Strength Index (RSI) and how to use it. It is without a doubt the most famous momentum indicator out there, and this is to be expected, as it has many strengths, especially in ranging markets. It is also bounded between 0 and 100, which makes it easier to interpret. Its fame itself contributes to its potential: the more traders and portfolio managers look at the RSI, the more people will react to its signals, and this in turn can push market prices. Of course, we cannot prove this idea, but it is intuitive, as one of the foundations of Technical Analysis is that it is self-fulfilling.

J. Welles Wilder came up with this indicator in 1978 as a momentum proxy with an optimal lookback of 14 periods. It is bounded between 0 and 100, with 30 and 70 as the agreed-upon oversold and overbought zones respectively. The RSI can be used through four known techniques:

Oversold/overbought zones as indicators of short-term corrections.
Divergence from prices as an indication of trend exhaustion.
Drawing graphical lines on the indicator to find reaction levels.
Crossing the 50 neutrality level as a sign of changing momentum.

The RSI is calculated in a rather simple way. We first take one-period price differences; this means subtracting from every closing price the one before it. Then we calculate the smoothed average of the positive differences and divide it by the smoothed average of the negative differences. That ratio is the Relative Strength (RS), which is then plugged into the RSI formula, RSI = 100 − 100 / (1 + RS), to be transformed into a measure between 0 and 100.
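To make the recipe concrete, here is a minimal Python sketch of that calculation using Wilder’s smoothing, which is an exponential moving average with alpha = 1/lookback (the function name and the use of pandas are our choices, not from the original article):

import pandas as pd

def rsi(close, lookback=14):
    # One-period price differences: current close minus previous close
    delta = close.diff()
    # Split the moves into gains and losses (both kept positive)
    gains = delta.clip(lower=0)
    losses = -delta.clip(upper=0)
    # Wilder's smoothed averages of gains and losses
    avg_gain = gains.ewm(alpha=1 / lookback, min_periods=lookback).mean()
    avg_loss = losses.ewm(alpha=1 / lookback, min_periods=lookback).mean()
    # Relative Strength, then RSI = 100 - 100 / (1 + RS)
    rs = avg_gain / avg_loss
    return 100 - 100 / (1 + rs)

Applied to a pandas Series of closing prices, this returns values between 0 and 100, with readings below 30 and above 70 marking the usual oversold and overbought zones.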
https://medium.com/python-in-plain-english/the-rsi%C2%B2-leading-indicator-detecting-trend-exhaustion-early-in-trading-284a59dc1ea3
['Sofien Kaabar']
2020-12-15 14:13:50.035000+00:00
['Machine Learning', 'Artificial Intelligence', 'Python', 'Finance', 'Data Science']