A Hell/Heaven of Our Own — Alma and the Tibetan Book of the Dead
A centuries-old Zhi-Khro mandala, part of the Bardo Thodol, the text known in the West as The Tibetan Book of the Dead, which belongs to a group of bardo teachings held in the Nyingma tradition of Tibetan Buddhism that originated with the guru Padmasambhava in the 8th century. Borrowed from Wikipedia.
“In the Tibetan Book of the Dead [or The Bardos], when the instructions are given as to what happens when someone leaves their body after death, it says something like, ‘When the clear light of the void comes, it is followed by the vision of the blissful Bodhisattvas; then comes the vision of the wrathful Bodhisattvas,’ and so on. And then it says, ‘Realize, oh nobly born, that all this is but the outpouring of your own mind.’” — Alan Watts, What is Zen? (New World Library, 2000), 23

“I say unto you, can you imagine to yourselves that ye hear the voice of the Lord, saying unto you, in that day, ‘Come unto me ye blessed, for behold, your works have been the works of righteousness upon the face of the earth?’ Or do ye imagine to yourselves that ye can lie unto the Lord in that day, and say, ‘Lord, our works have been righteous works upon the face of the earth,’ and that he will save you? Or otherwise, can ye imagine yourselves brought before the tribunal of God with your souls filled with guilt and remorse, having a remembrance of all your guilt, yea, a perfect remembrance of all your wickedness, yea, a remembrance that ye have set at defiance the commandments of God?

“I say unto you, can ye look up to God at that day with a pure heart and clean hands? I say unto you, can you look up, having the image of God engraven upon your countenances? I say unto you, can ye think of being saved when you have yielded yourselves to become subjects to the devil?

“I say unto you, ye will know at that day that ye cannot be saved; for there can no man be saved except his garments are washed white; yea, his garments must be purified until they are cleansed from all stain, through the blood of him of whom it has been spoken by our fathers, who should come to redeem his people from their sins. …

“These are they that are redeemed of the Lord; yea, these are they that are taken out, that are delivered from that endless night of darkness. And thus they stand or fall; for behold, they are their own judges, whether to do good or do evil.” — Alma 5:16–21, 41:7; borrowed from Grant Hardy, The Book of Mormon: A Reader’s Edition (University of Illinois Press, 2005), 261, 367
I believe these two ideas, those of the Tibetan Book of the Dead (as described by Alan Watts) and those of Alma the Younger in the Book of Mormon, are quite similar. Something about the approach of death, and even passing through it, brings something like judgment, we could say.
Both the Book of Alma and the Tibetan Book of the Dead agree, however, that this is not the judgment of external deities, though we surely experience it that way initially. Instead, we come to find that punishment and reward do not come flying at us from some externality, but erupt from within us. Even in the afterlife, as in this life, our lives are not the result of an external authority’s judgment, but of our own inner state of being. This should entirely change how we read Alma’s teaching that one must have their “garments purified”: that which condemns us comes from within, not without. The Book of Mormon has a long lineage of malefactors, like Nehor, who taught that God will save all without judgment. But this view misses the point: it’s not that God is unwilling to save all, but that God does not save any more than condemn. God instead opens our eyes to those portions of ourselves which we seek to hide from others, including ourselves; those truths about ourselves which we desperately wish were not true. Nehor missed the point, but so do those who glory in self-condemnation, the unfortunate people who meet themselves with disgust, contempt, and shame.
God is not the judge in any conventional sense, but the Christ, who looks upon even the darkest corners of our hearts, not with a runaway or renegade justice of retribution (that was our gig), but with unconditional mercy. We are the ones who cannot accept ourselves or others, not God, not Christ. Christ is the one who looks at the darkest reaches of the soul, that which one would wish wholeheartedly to hide away, and says, “I love you.” It’s in this vein that the Book of Mormon calls its readers to take upon themselves the name of Christ, to awaken the same pure love of Christ within, for themselves and others. Our problem is not that God is mad and we have to win him over; the root of our suffering is that we can’t accept ourselves or others as we are. Our problem is not that we have to win over or woo someone else, or even ourselves, but that we keep desperately telling ourselves we must, because we and others are not enough in ourselves. Always trying to be someone else or get somewhere else, we’re never who or where we are.
We all die, and, in a manner of speaking, we’re all going to the same place. In this life, we can be in the same room doing the same thing, but one of us is experiencing joy and the other their own private hell — a subtlety which follows us beyond death. A Zen koan says this:
A soldier named Nobushige came to Hakuin and asked: “Is there really a paradise and a hell?”

“Who are you?” inquired Hakuin.

“I am a samurai,” the warrior replied.

“You, a soldier!” exclaimed Hakuin. “What kind of ruler would have you as his guard? Your face looks like that of a beggar.”

Nobushige became so angry that he began to draw his sword, but Hakuin continued: “So you have a sword! Your weapon is probably much too dull to cut off my head.”

As Nobushige drew his sword Hakuin remarked: “Here open the gates of hell!”

At these words the samurai, perceiving the master’s discipline, sheathed his sword and bowed.

“Here open the gates of paradise,” said Hakuin.

— 122 Zen Koans: Find Enlightenment, ed. Taka Washi (2013), section 57
We are not punished for sin or rewarded for virtue, but punished by sin and rewarded by virtue. God cannot give that any more than life will — we create our own hells and heavens.
URL: https://medium.com/interfaith-now/a-hell-heaven-of-our-own-alma-and-the-tibetan-book-of-the-dead-5e09c619758
Author: Nathan Smith
Published: 2020-01-06 14:28:16 UTC
Tags: Mormon, Tibetan Book Of The Dead, Book Of Mormon, Buddhism, Afterlife
The COVID-19 Boogaloo Opus
All decisions here, in either direction, could kill you.
As the sensemaking crisis around COVID-19 goes into overload, many people are shutting down, many are hyperventilating, and many don’t know what’s real and what’s not. The signal-to-noise ratio is badly weighted toward noise, and our mainstream media sources and politicians are failing to be particularly helpful. And while COVID-19 could kill you, other things could kill you too, such as unemployment, starvation, or national civil war, and we need to look at all of those things honestly to pull the signal out of the noise. There are a lot of things to consider. Let’s look at the noise first.
The Noise
I saw this in my social feed today. I think it’s a totally accurate depiction of the noise we’re all facing.
Well good news! A friend has broken down all the facts and everything we need to know about COVID-19!

1. Basically, you can’t leave the house for any reason, but if you have to, then you can.
2. Masks are useless, but maybe you have to wear one, it can save you, it is useless, but maybe it is mandatory as well.
3. Stores are closed, except those that are open.
4. You should not go to hospitals unless you have to go there. Same applies to doctors, you should only go there in case of emergency, provided you are not too sick.
5. This virus is deadly but still not too scary, except that sometimes it actually leads to a global disaster.
6. Gloves won’t help, but they can still help.
7. Everyone needs to stay HOME, but it’s important to GO OUT.
8. There is no shortage of groceries in the supermarket, but there are many things missing when you go there in the evening, but not in the morning. Sometimes.
9. The virus has no effect on children except those it affects.
10. Animals are not affected, but there is still a cat that tested positive in Belgium in February when no one had been tested, plus a few tigers here and there…
11. You will have many symptoms when you are sick, but you can also get sick without symptoms, have symptoms without being sick, or be contagious without having symptoms.
12. In order not to get sick, you have to eat well and exercise, but eat whatever you have on hand and it’s better not to go out, well, but no…
13. It’s better to get some fresh air, but you get looked at very wrong when you get some fresh air, and most importantly, you don’t go to parks or walk. But don’t sit down, except that you can do that now if you are old, but not for too long or if you are pregnant (but not too old).
14. You can’t go to retirement homes, but you have to take care of the elderly and bring food and medication.
15. If you are sick, you can’t go out, but you can go to the pharmacy.
16. You can get restaurant food delivered to the house, which may have been prepared by people who didn’t wear masks or gloves. But you have to have your groceries decontaminated outside for 3 hours. Pizza too?
17. Every disturbing article or disturbing interview starts with “I don’t want to trigger panic, but…”
18. You can’t see your older mother or grandmother, but you can take a taxi and meet an older taxi driver.
19. You can walk around with a friend but not with your family if they don’t live under the same roof.
20. You are safe if you maintain the appropriate social distance, but you can’t go out with friends or strangers at the safe social distance.
21. The virus remains active on different surfaces for two hours, no, four, no, six, no, we didn’t say hours, maybe days? But it takes a damp environment. Oh no, not necessarily.
22. The virus stays in the air — well no, or yes, maybe, especially in a closed room, in one hour a sick person can infect ten, so if it falls, all our children were already infected at school before it was closed. But remember, if you stay at the recommended social distance, however in certain circumstances you should maintain a greater distance, which, studies show, the virus can travel further, maybe.
23. We count the number of deaths but we don’t know how many people are infected, as we have only tested so far those who were “almost dead” to find out if that’s what they will die of…
24. We have no treatment, except that there may be one that apparently is not dangerous unless you take too much (which is the case with all medications). Orange man bad.
25. We should stay locked up until the virus disappears, but it will only disappear if we achieve collective immunity, so when it circulates… but we must no longer be locked up for that?
This depiction of the noise about COVID-19 is dead on. And people clamoring for the general population to restrict their sensemaking to official channels only (A) don’t seem to be aware of the tremendous fuckups the official channels have already made on this, and (B) seem to be vehemently opposed to the most “official” channel in the country anyway.
This is a morass of sensemaking failure that could lead to things far worse than the viral infection that caused it. Let’s move forward by extracting the signal, the actual facts, that we can hang our hat on at this time.
The Tale of Two National Fuckups
This thing came from a Chinese laboratory in Wuhan, probably the Wuhan Institute of Virology. We don’t need evidence gift wrapped by the Chinese to make this case. We just need simple mathematics, and the case is rock solid.
The “official channels” have maintained for four months that this virus originated in a wet market in Wuhan, not at the Wuhan Institute of Virology, which is the world’s Mecca of studying emergent SARS coronaviruses that originate in bats. A lot of speculation by the media has gone into supporting this case, as well as the solid support of the Chinese government, but the case is obviously garbage. I grant that wet markets for exotic harvested wild meats are a great vector for something like this, but set that aside for a moment.
There are between a hundred and a thousand wet markets in China. There are well over a thousand wet markets in Vietnam. There are well over a thousand wet markets in Thailand. There are hundreds or thousands of wet markets in Laos, hundreds or thousands more in Cambodia, and hundreds or thousands more in Myanmar and Malaysia. Nobody knows for sure, but it’s completely reasonable to estimate the total number of wet markets in East Asia at ten thousand or more.
But only one of these ten thousand or more wet markets is two blocks from the Wuhan Institute of Virology.
The chance that a brand new never before seen SARS coronavirus variant would emerge at the only wet market two blocks from a laboratory whose primary function is to study never before seen SARS coronavirus variants, specifically from bats, is simply too astronomical to believe. If a brand-new world epidemic virus were to emerge every day from a wet market in east Asia, it would be three years or more on average before one emerged from Wuhan. No honest scientist would believe that coincidence given what we know.
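To make the back-of-envelope math explicit, here is a minimal sketch of that expected-wait argument. The ten-thousand-market figure is the estimate from above; the count of wet markets within Wuhan itself is a hypothetical number I’m assuming purely for illustration:

```python
# Back-of-envelope sketch of the wet market coincidence argument.
# TOTAL_MARKETS comes from the ~10,000 estimate above; WUHAN_MARKETS is
# a hypothetical assumption for illustration, not a sourced figure.
TOTAL_MARKETS = 10_000
WUHAN_MARKETS = 10

# Chance a single random emergence event happens at the one market
# two blocks from the Wuhan Institute of Virology:
p_that_market = 1 / TOTAL_MARKETS

# If one new epidemic virus emerged from a random East Asian wet market
# every single day, the average wait before one emerged anywhere in Wuhan:
expected_days = TOTAL_MARKETS / WUHAN_MARKETS

print(f"P(that exact market) = {p_that_market:.2%}")                  # 0.01%
print(f"Expected wait for a Wuhan emergence: ~{expected_days / 365:.1f} years")  # ~2.7 years
```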
I’ve followed a lot of traffic from geneticists and epidemiologists saying this virus doesn’t seem to have the earmarks of being created artificially. They may be right. But that doesn’t mean that a diseased bat wasn’t transported to Wuhan and the virus escaped via an infected technician, or via an improperly disposed of specimen. Nor does it rule out the disease being a product of “gain of function” research on bats with lesser uncatalogued diseases.
The Chinese reaction was archetypically communist and cannot be trusted. In order, they imprisoned whistleblowers, denied the virus, admitted the virus but said it wasn’t transmissible, admitted it was transmissible and invited foreign journalists in to watch them build a giant hospital, turned everyone in Wuhan into The Bubble Boy, snuffed it out (officially), then kicked all the journalists out and reopened the city. Then, after the journalists were gone, they beat up people trying to go to the hospitals with COVID-19 to keep their new-case numbers down, cremated a lot more people than the official death count, denied any reinfection after the lockdown ended, and then blamed the origins of the infection on the US Army. Which is obviously not true, because if the USA had developed the virus, we’d have had tests for it way sooner than we did.
Now granted, that could just be communists acting like communists, but the entire timeline tells of a cover up.
The USA’s fuckup was one of mid-level bureaucracy that has been widely reported but doesn’t seem to be widely understood despite the reporting. This article is a fabulous primer, but I’ll summarize.
The first case in the US was identified the same day as the first case in South Korea, January 21st. South Korea gave regulatory approval to every company in the country that wanted to make a test within one week, by the end of January, and as a result created the best testing apparatus in the world. The FDA and CDC collaborated to prevent US companies and universities from developing tests until the middle of March, and only eventually stopped obstructing test development by administrative (Trump/Pence) fiat. One of the most egregious examples of this behavior, promulgated by bureaucrats at the FDA and CDC, is that of the University of Washington’s Helen Y. Chu, who, after testing someone in her ongoing flu study for COVID-19 and discovering she had a sample pool that might contain many more infections, was told, basically:
1) You just violated that test subject’s HIPAA privacy rights, and
2) You don’t have a permit to do COVID-19 tests, therefore
3) Stop testing.
If they had said the exact opposite, Seattle would have been controlled. Chu had everything in her hands to isolate the Seattle cases, and possibly the lion’s share of the cases on the West Coast.
When universities and companies tried to develop their own tests, they were told to apply for a permit, and then only one permit was issued — to the CDC. The CDC then screwed up the test, and had to release a new one several weeks later. The backlash from the screw-ups came to a head on the last day of February, when the FDA begrudgingly allowed some 5,000 labs (of the 260,000 labs in the country) to start working on tests.
The doors were finally thrown open to academic and private entities in full on March 15th, when HIPAA was waived for anyone working on COVID-19, and March 16th, when Vice President Mike Pence announced that all the rest of the labs could work on this without FDA interference.
Wojtek Kopczuk, a professor of economics at Columbia University, quipped that the “FDA sped up the process by removing itself from the process.”
The USA lost 45 days compared to South Korea, from the same starting gun, entirely due to pencil pushers at the FDA and CDC. The important thing to take away from the Tale of the Second National Fuckup is that no politician could have prevented this unless they were willing to unilaterally step in, deplatform the FDA, burn HIPAA sooner, and bust the CDC down into an “advisory only” role. Not Trump, not Hillary, not Biden, not Bernie. The one politician who might have been able to do it is the hypothetical caricature of Trump for which many Trump voters voted. And knowing how government works in the USA, it is unthinkable that this will get fixed, or that this won’t happen again next time, because our universal bipartisan answer to government failure is more government.

And the government’s final response to needlessly wasting 45 days reacting to this is to issue a $2 trillion bailout to pause the national economy for 56 days, so we can catch up, while everyone loses their jobs.
Boogaloo Soup
Case fatality rates (CFRs) for this thing vary tremendously by country, because the numbers don’t exist to properly calculate them. You’re not supposed to calculate a CFR until a confirmed case has either cleared or died, but everyone is calculating them in real time for COVID-19 by looking at deaths among confirmed cases. The math is all wrong. For one, we have some dead people who probably had COVID-19 and didn’t get counted because they didn’t get tested. For another, we have a lot more people who caught it and survived, but never got confirmed, because of the testing SNAFU. For a third, we have some currently alive cases who haven’t resolved in our numbers. This means the numerator in the fraction is wrong, and the denominator is very wrong. It is likely, on speculative analysis, that the final CFR for this thing will turn out to be very similar to the flu, as we see in South Korea and Iceland, which have good testing. It’s “just” a flu that everyone gets all at once, because nobody started with any immunity to it, which leads to more dead people. The spike of COVID-19 deaths we will see in the coming months will be several times higher than the flu deaths, because basically they’re several years’ worth of flu victims squeezed into one year.
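To illustrate why the real-time math is wrong, here is a minimal sketch with invented, purely illustrative numbers (not actual COVID-19 data), contrasting the naive real-time calculation with a resolved-cases CFR and an estimate that counts untested infections:

```python
# Illustrative mid-epidemic snapshot; every number here is invented.
deaths = 1_000             # confirmed deaths so far
confirmed = 20_000         # confirmed cases so far (tested only)
recovered = 9_000          # confirmed cases already resolved by recovery
true_infections = 100_000  # assumed actual infections, mostly untested

# Naive real-time "CFR": deaths over all confirmed cases, resolved or not.
naive_cfr = deaths / confirmed                 # 5.0%

# Proper CFR: deaths over resolved cases only (cleared or died).
resolved_cfr = deaths / (deaths + recovered)   # 10.0%

# Infection fatality estimate if untested survivors were counted.
ifr_estimate = deaths / true_infections        # 1.0%

print(f"naive real-time CFR: {naive_cfr:.1%}")
print(f"resolved-cases CFR:  {resolved_cfr:.1%}")
print(f"IFR incl. untested:  {ifr_estimate:.1%}")
```

The unresolved cases push the naive number one way while the untested survivors push it the other, which is why a real-time figure can land far from the final one in either direction.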
What would happen if this were as deadly as the measles? What if a “true” epidemic, of the style we’ve seen in the past, hit the highly connected, highly vectored, Marketplace Of Disease we call the “global economy?” Same infection rate, much higher CFR. This is the sort of thing the “preppers” have been thinking about for years.
The preppers didn’t need to run out and buy toilet paper. Or meat. Or rice. Or guns. But everyone else did — especially guns. Check out the March 2020 statistics for the FBI’s NICS Background Check System:
Firearm sales vary seasonally, owing to things like hunting season, Black Friday, and Christmas, but a good indicator of gun sales trends can be drawn by comparing a month’s sales to the same month from the prior year. In the past full year, every month except one has set a new monthly record. Look closely at March 2020: 3.7 million background checks were run in March, in a simply unprecedented wave of sales. It’s a million-gun spike, ten times the size of previous spikes.
Gun store owners told an interesting tale as well — these are almost all new owners. Most estimate that around 75% of gun sales in March 2020 were to first-time buyers. That would constitute 2.8 million people, almost 1% of the total US population, buying their first gun. Many of them liberal. Some of them prior supporters of gun control. Many foreign nations have only a 1% gun ownership rate nationwide. We may have added that many first-time owners in a single month. And now the ranges are closed, so they can’t practice or train with their new purchase, and they’re sitting at home losing their jobs reading a stream of social media anxiety.
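As a quick sanity check on those figures, here’s the arithmetic, using the NICS number above and the store owners’ 75% estimate; the population figure is an approximation I’m assuming:

```python
checks_march_2020 = 3_700_000  # NICS background checks, March 2020 (above)
first_time_share = 0.75        # gun store owners' estimate (above)
us_population = 330_000_000    # approximate US population, assumed

first_time_buyers = checks_march_2020 * first_time_share
share = first_time_buyers / us_population

print(f"{first_time_buyers / 1e6:.2f} million first-time buyers")  # 2.78 million
print(f"{share:.2%} of the US population")                         # ~0.84%, "almost 1%"
```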
These numbers don’t even count peer to peer sales or gifts of prior owned firearms. And the other things people are buying? Nonperishable food, medicines, seeds, things to use at home. Prepper stuff.
The makeup of COVID-19 America now constitutes the following classes:
1) Previously armed previous preppers
2) Previously armed new preppers
3) Newly armed new preppers
4) Unarmed new preppers, lovingly referred to as “targets.” Bless their heart.
Nobody’s not a prepper anymore. Certain corners of the internet might call this a “Boogaloo Soup.”
But shooting people is a really hard, really terrible thing to do, so people don’t generally start shooting each other unless they have three things, not just one. First, they need the tools. Second, they need dire motivation. And third, they need psychological reinforcement that dehumanizes “the other,” some frame of reference or point of view that posits the person at the other end of the barrel as less human than themselves. We have the tools. Do we have these other elements?
Tribal Dehumanization
It’s widely known that our modern cross-tribal trust, when we speak of the Red Tribe and the Blue Tribe, is historically low. We have described this in many ways on HWFO, and some good indicators came from studies in the run-up to the 2018 midterms, which were already marred with political violence.
All that happened before the impeachment trial.
A graph from that article highlights alarming polling numbers from an APSA 2018 study.
[Graph from the article linked above.]
Be very clear, at the peak of the Syrian Civil War, the total number of combatants on all sides only numbered 2% or less of the population on a per capita basis. We have 1% new gun owners alone, a national gun ownership rate around 30%, and a projected number of Red or Blue tribal goons who support terrorism to be up around the 15% range before COVID-19 entered the picture.
But the statistics supporting cross tribal terrorism aside, one of the best indicators of literal dehumanization might be to look at marriage polls.
People who identified with a party had even more intense feelings. In 1958, 33 percent of Democrats wanted their daughters to marry a Democrat, and 25 percent of Republicans wanted their daughters to marry a Republican. But by 2016, 60 percent of Democrats and 63 percent of Republicans felt that way.
Compare that to gauges of classic racism, the most historically significant American tradition of dehumanizing “the other.”
Opinions about interracial dating and marriage on a personal level have also evolved significantly. In 1971, 48% nationally said they would not approve of their own children dating someone of another race, while 28% said they would approve.
Put simply, the Red Tribe / Blue Tribe cultural divide in the United States is thicker than mid-20th century racism. We have all the dehumanization we need for a civil war, and all the gear. We’re just not motivated yet.
Lockdown Calculus
A lot of people I speak to don’t seem to understand that the economy is not just something we do to manipulate a stock market. It is the fundamental way that humans provide for our needs, including food, and has been since we came down out of the trees and settled on this idea of “labor specialization.”
I spoke to a lady in the Philippines a few weeks ago. She’s in her 20s, poor, and lives in Manila in a two-bedroom apartment with five other women. Not uncommon there. She teaches English as a Second Language over the internet to Japanese people. Or at least she did until their lockdown.
She told me stories from the ground floor in Manila. When their lockdown went into effect, tens of thousands of people hit the roads and walked home to their family villages in the rest of the country, in a mass pedestrian migration that took many days. They just walked. Slept on the side of the road. She stayed, and explained how their lockdown was being managed by the local matron in charge of a block of apartment buildings, who was acquiring food and delivering it to them, while the army patrolled the streets. She said the Army was very nice, and that everyone was in good spirits, because Filipino people are generally good spirited people. But the topic among everyone was when the food would run out, and whether more people were going to die from the lockdown than the virus.
Her Facebook account was deactivated last week.
I don’t know why.
People in Africa are revolting against the lockdowns now, and with very good reason. The median life expectancy in many areas of Africa isn’t much over 65, and their rate of the sorts of comorbidities that are leading indicators for COVID-19 fatalities is extremely low. Few obese people, few diabetics, few old people. In a morbid sort of way, COVID-19 deaths will be very small throughout Africa because a combination of other factors, malaria and malnutrition among them, has already cleared out the people most likely to die from it. One of their leading sources of death is malnutrition. Africans should objectively abandon lockdown now, if not yesterday.
Lesson: The calculus of when to come off lockdown is different everywhere, and the damage the lockdown does must be accounted for in this calculus.
There aren’t a lot of great studies on the fatality rate of recessions in the United States, but the best I’ve read was by Daniel Sullivan and Till von Wachter. Their sample pool was unfortunately limited to high seniority male workers in Pennsylvania from the 1970s on, but we might make some assumptions and apply it to the general pool. They found that overall mortality rates of their sample set increased by between 50% and 100% in the year following the year they got laid off. And while that effect declines sharply in further years, it remains at 10% to 15% higher twenty years later.
Let’s presume the mortality rate among the poor who get laid off is near the top of this band, and let’s further presume that the net chance of death over the whole time scale adds up to a 100% increase in overall mortality due to an unemployment event.
The mortality rate for the working age population is on average, per CDC data, around 200 per 100,000, so a 100% increase would be an additional 200 per 100,000. The St. Louis Fed projects unemployment to top 47 million people in the wake of our COVID-19 response. If we presume that only 41 million of these people are directly due to the response, which seems reasonable given prior unemployment numbers, we can calculate 82,300 people killed by the lockdown. A recent article by the admittedly partisan National Review calculated a similar number by different means.
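Laid out explicitly, that calculation looks like the following sketch; the doubling of baseline mortality is the presumption from the previous paragraph, and the layoff figure is the 41 million attributed to the response:

```python
baseline_mortality = 200 / 100_000  # CDC working-age mortality, per person-year
excess_multiplier = 1.0             # presumed +100% mortality after a layoff
lockdown_layoffs = 41_000_000       # unemployment attributed to the response

excess_deaths = lockdown_layoffs * baseline_mortality * excess_multiplier
print(f"~{excess_deaths:,.0f} excess deaths")  # ~82,000, near the 82,300 cited
```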
If the lockdown saves half a million people, perhaps this is worth it. If we value the lives of working age people higher than we do aging retirees or people in nursing homes, perhaps it’s not. If the lockdown doesn’t actually save that many people anyway, because our treatments in the hospital don’t help that much, then this entire calculation gets a lot muddier. And all this ignores the hunger element, which for the USA is tied in with both the Potential Boogaloo and the food service industry.
Hunger Is the Indicator
I routinely post on my Facebook wall how many people I know who (Have contracted COVID-19) / (Have Recovered) / (Have Died) / (Are Newly Unemployed).
My friends all respond with their counts. Most know fewer people than I do who’ve contracted it, and only a few know someone who’s died. But the greatest variation in the numbers is in the “unemployed” category. My friends who are tied up in the entertainment, food service, or bar industries have “unemployed” numbers in the hundreds, or say they are simply uncountable. And food service is a huge link in the supply chain that holds off hunger, a chain that’s now been broken, and the breaks in the chain have officially spilled back into agriculture itself.
According to the New York Times, tens of millions of pounds of fresh food are being destroyed by the nation’s farmers because we closed restaurants, hotels, and schools. 3.7 million gallons of milk per day are dumped out on the ground. Farmers are currently plowing under fields of fresh produce, because they have no choice. It seems absurd on its face, but it’s entirely predictable. The banana you buy in the grocery store looks different from the bananas they use in restaurants. Nobody makes onion rings at home. Everybody bought potatoes and rice for three weeks, and now has to figure out what to do with all the storable starch they bought instead of buying lettuce. The Times indicated that 5% of the nation’s total milk supply is being dumped out every day, and that will grow to 10% if we stay on lockdown too much longer.
The Times narrative speaks about tragedy in the industry, and with good reason, but the terrifying thing lies below the surface. It’s conceivable that bailout money could keep those industries alive, but there is no amount of bailout money that will dig that onion out of the ground. And the onion served a more important purpose than farming revenue. It was food. It prevented hunger. That’s what the economy is for, remember?
On the whole, US farmers export over 20% of what they produce, according to the USDA. But 18% of the food Americans ate, before the lockdown, was eaten away from home. In a perfectly elastic economy, nobody in the US would starve from closing all the restaurants, because the lost restaurant demand would simply lead to 20% less food being exported, and we would hoard the remaining food for ourselves. But that’s not how things work. If a food factory is built to put food into export boxes instead of grocery store boxes, it’s going to continue to do so. Especially now, when foreign countries are probably already struggling with their own food shortages and we are their breadbasket. In a very real way, buying up all those potatoes during the Great Grocery Rush of 2020 took a potato out of some kid’s mouth in another country, so now they might starve. And in a very real way, if we don’t open the restaurants up soon, and get the prior supply chains working again, we are very likely to end up with long-term food shortages here.
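A toy version of that elasticity argument, using the export and away-from-home shares cited above (the clean round percentages are simplifications):

```python
production = 100.0      # index: total US food production
export_share = 0.20     # USDA: over 20% of production is exported
away_from_home = 0.18   # share of American eating done away from home

domestic_supply = production * (1 - export_share)          # 80.0
restaurant_demand_lost = domestic_supply * away_from_home  # ~14.4

# Perfectly elastic world: the lost restaurant demand is smaller than the
# export share, so exports could absorb the entire shock.
print(restaurant_demand_lost <= production * export_share)  # True

# Real world: export-packaging lines keep packaging for export, so the
# shock lands on grocery-channel supply and on the fields instead.
```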
And that’s the last element we need to start shooting each other.
Although I’m a prepper, and I’ve got plenty of food in my garage, you may not be. And if I were you, and my children were starving, I might try to shoot someone and take their food. And if you are you, and you try to do that to me, you might get shot. Expand to the national case.
If that happens, we will have the Tools for the Boogaloo, which are guns. We will have the Dehumanization for the Boogaloo, which is our political and cultural tribalism. And we will finally have the Motivation for the Boogaloo, which is our kids need to eat.
The Boogaloo Soup will be complete.
And should that happen, it will kill far more people than COVID-19, and will kill far more people than the unemployment from our response to COVID-19. It will be the greatest tragedy in the history of our nation, because we will have brought it all upon ourselves, from our own Freakoutery.
The soup timer is ticking. Beginning of May would be a great time to get our asses in gear.
URL: https://medium.com/handwaving-freakoutery/the-covid-19-boogaloo-opus-51b1c1b860cd
Author: Bj Campbell
Published: 2020-04-17 16:56:20 UTC
Tags: Politics, Guns, Covid 19, Coronavirus, Random
The New Submission Guidelines For Making of a Millionaire
What we are looking for in 2021
Photo by Aaron Burden on Unsplash
We all have a “money story.” Money impacts our lives every day in both positive and negative ways. Whether it be the anxiety caused by lack of money, the thrill of making money, the fear of losing money, or the responsibility of managing money, we all have a story to tell about how money impacts our lives.
We want to give you the opportunity to share your money story! We are now accepting submission requests to publish your money stories in “Making of a Millionaire.”
If you would like to submit a story to Making of a Millionaire, follow these easy steps.
Leave a comment below, and we can add you as a writer. Once we have added you as a writer, go to “Edit” on any story you wish to submit, click “Add to Publication,” and select “Making of a Millionaire.” We will either accept the story, make some edits to the story, or let you know if the story does not fit with this publication.
We’d also appreciate it if you would tell us a bit about yourself and why you want to write about personal finance.
We will be publishing fewer stories in 2021
In 2020, we wanted to give as many writers as possible a chance to have their stories published in Making of a Millionaire.
And we published a LOT of stories. Too many, if I’m being honest. We ended up publishing many stories on subjects we had already covered many times before.
For example, we probably have over 50 stories published on how to build an emergency fund.
Does this serve the reader best?
No.
One or two great articles on how to build an emergency fund, made clearly visible to our readers, is all that is required.
What we are looking for in 2021: Stories that challenge our readers
Here are a few examples of the exact type of stories we want to publish moving forward.
Notice that all of the stories we are about to highlight are in-depth, insanely helpful, and go beyond generic “how to” articles about personal finance.
Adam Parsons’s story on “When $100,000 isn’t enough.”
This is a real story about money and finances. It breaks down in detail how quickly $100k can go out the door.
Rocco Pendola’s story on why “Mark Cuban’s Most Recent Money Advice is Ridiculously Simple and Super Important.”
It takes a simple piece of financial advice, breaks it down, and explains how it can apply to the reader’s life.
My story on why “Passive Income Is A Lie, But Scalable Income Is Real.”
In this story, I push back against one of the trendiest topics in personal finance: “passive income.” But I do it in a genuine way, because it is something I actually believe. Then I explain what “scalable income” is and how it has changed my life and could potentially do the same for the reader.
To put it simply: we will be publishing fewer stories, focused on unique points of view that provide real value to our readers.
If you are currently a writer for MOAM and we are currently publishing most of your drafts, then keep doing what you're doing.
If you find we are not publishing any of your drafts, ask yourself whether the story you submitted is personal to your experience, provides a unique viewpoint, focuses on delivering a huge amount of value to readers, and meets a high standard for quality and formatting.
Stories we will almost certainly turn down
Stories on “how to become a trader”
Stories advocating that readers pick and choose individual stocks
Anything overly self-promotional, or anything with the slightest leaning towards some type of multi-level marketing.
Stories containing undisclosed affiliate links.
We prefer stories that are 1,000 words or more. If your story is under 500 words we are unlikely to accept it. Our readers expect in-depth posts from Making of a Millionaire.
Poorly written stories. We often receive submissions that are simply not up to the standard of writing quality we expect from our writers.
Stories that are, in our opinion, clickbait. See Medium’s guidelines on clickbait here.
Please do not take it personally if we reject your story. It is not a personal attack or comment on you. If your story is rejected, it simply means it’s not a fit with what we are after at the moment. Keep at your writing, look at the three stories I highlighted above and send us back your next draft.
I do want to thank all of our current writers and anyone who takes the time to send their hard-earned work to Making of a Millionaire. I have a huge amount of respect for anyone who has the courage to put their thoughts in writing and publish them.
Cheers,
Ben
|
https://medium.com/makingofamillionaire/the-new-submission-guidelines-for-making-of-a-millionaire-301bdba5d305
|
['Ben Le Fort']
|
2020-12-21 17:12:22.326000+00:00
|
['Money', 'Publication', 'Personal Finance', 'Writing', 'Writer']
|
5,503 |
Find Your Best Customers with Customer Segmentation in Python
|
Overview
When it comes to finding out who your best customers are, the old RFM matrix principle is the best. RFM stands for Recency, Frequency and Monetary. It is a customer segmentation technique that uses past purchase behavior to divide customers into groups.
RFM Score Calculations
RECENCY (R): Days since last purchase
FREQUENCY (F): Total number of purchases
MONETARY VALUE (M): Total money this customer spent
Step 1: Calculate the RFM metrics for each customer.
Source: Slideshare
Step 2: Add segment numbers to RFM table.
Source: Slideshare
Step 3: Sort according to the RFM scores from the best customers (score 111).
Source: Blast Analytics Marketing
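To make steps 2 and 3 concrete, here is a minimal sketch (my own illustration, not from the original article) of quartile-based scoring. It assumes the rfmTable DataFrame with 'recency', 'frequency' and 'monetary_value' columns that we build later in this post:
import pandas as pd

def add_rfm_scores(rfm):
    # Segment 1 is best: low recency, high frequency, high monetary value.
    rfm = rfm.copy()
    rfm['R'] = pd.qcut(rfm['recency'], 4, labels=[1, 2, 3, 4])
    # Ranking first guarantees unique bin edges for qcut.
    rfm['F'] = pd.qcut(rfm['frequency'].rank(method='first'), 4, labels=[4, 3, 2, 1])
    rfm['M'] = pd.qcut(rfm['monetary_value'].rank(method='first'), 4, labels=[4, 3, 2, 1])
    rfm['RFMScore'] = rfm['R'].astype(str) + rfm['F'].astype(str) + rfm['M'].astype(str)
    return rfm.sort_values('RFMScore')  # best customers (score 111) come first

# Usage, once rfmTable exists: add_rfm_scores(rfmTable).head()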
Since RFM is based on user activity data, the first thing we need is data.
Data
The dataset we will use is the same one we used for Market Basket Analysis — the Online Retail dataset, which can be downloaded from the UCI Machine Learning Repository.
import pandas as pd
import warnings
warnings.filterwarnings('ignore')

df = pd.read_excel("Online_Retail.xlsx")
df.head()
df1 = df
The dataset contains all the transactions occurring between 01/12/2010 and 09/12/2011 for a UK-based and registered online retailer.
It took a few minutes to load the data, so I kept a copy as a backup.
Explore the data — validation and new variables
We will check for missing values in important columns; look at the customers’ distribution in each country; confirm that unit price and quantity are greater than 0; and confirm that the invoice date is earlier than today.
df1.Country.nunique()
38
There were 38 unique countries as follows:
df1.Country.unique()
array(['United Kingdom', 'France', 'Australia', 'Netherlands', 'Germany',
       'Norway', 'EIRE', 'Switzerland', 'Spain', 'Poland', 'Portugal',
       'Italy', 'Belgium', 'Lithuania', 'Japan', 'Iceland',
       'Channel Islands', 'Denmark', 'Cyprus', 'Sweden', 'Austria',
       'Israel', 'Finland', 'Bahrain', 'Greece', 'Hong Kong', 'Singapore',
       'Lebanon', 'United Arab Emirates', 'Saudi Arabia', 'Czech Republic',
       'Canada', 'Unspecified', 'Brazil', 'USA', 'European Community',
       'Malta', 'RSA'], dtype=object)
customer_country = df1[['Country','CustomerID']].drop_duplicates()
customer_country.groupby(['Country'])['CustomerID'].aggregate('count').reset_index().sort_values('CustomerID', ascending=False)
More than 90% of the customers in the data are from the United Kingdom. There’s some research indicating that customer clusters vary by geography, so here I’ll restrict the data to the United Kingdom only.
df1 = df1.loc[df1['Country'] == 'United Kingdom']
Check whether there are missing values in each column.
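The exact command isn't reproduced in this excerpt; a standard pandas check (my assumption of what was used) would be:
df1.isnull().sum()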
There are 133,600 missing values in the CustomerID column, and since our analysis is based on customers, we will remove these missing values.
df1 = df1[pd.notnull(df1['CustomerID'])]
Check the minimum values in UnitPrice and Quantity columns.
df1.UnitPrice.min()
0.0
df1.Quantity.min()
-80995
Remove the negative values in Quantity column.
df1 = df1[(df1['Quantity']>0)]
df1.shape
(354345, 8)
df1.info()
After cleaning up the data, we are now dealing with 354,345 rows and 8 columns.
Check unique value for each column.
def unique_counts(df1):
    for i in df1.columns:
        count = df1[i].nunique()
        print(i, ": ", count)

unique_counts(df1)
InvoiceNo : 16649
StockCode : 3645
Description : 3844
Quantity : 294
InvoiceDate : 15615
UnitPrice : 403
CustomerID : 3921
Country : 1
Add a column for total price.
df1['TotalPrice'] = df1['Quantity'] * df1['UnitPrice']
Find out the first and last order dates in the data.
df1['InvoiceDate'].min()
Timestamp('2010-12-01 08:26:00')
df1['InvoiceDate'].max()
Timestamp('2011-12-09 12:49:00')
Since recency is calculated for a point in time, and the last invoice date is 2011-12-09, we will use 2011-12-10 to calculate recency.
import datetime as dt

NOW = dt.datetime(2011, 12, 10)
df1['InvoiceDate'] = pd.to_datetime(df1['InvoiceDate'])
RFM Customer Segmentation
RFM segmentation starts from here.
Create a RFM table
rfmTable = df1.groupby('CustomerID').agg({
    'InvoiceDate': lambda x: (NOW - x.max()).days,  # recency
    'InvoiceNo': lambda x: len(x),                  # frequency
    'TotalPrice': lambda x: x.sum()                 # monetary value
})
rfmTable['InvoiceDate'] = rfmTable['InvoiceDate'].astype(int)
rfmTable.rename(columns={'InvoiceDate': 'recency',
                         'InvoiceNo': 'frequency',
                         'TotalPrice': 'monetary_value'}, inplace=True)
Calculate RFM metrics for each customer
Interpretation:
CustomerID 12346 has frequency: 1, monetary value: $77,183.60 and recency: 325 days.
CustomerID 12747 has frequency: 103, monetary value: $4,196.01 and recency: 2 days.
Let’s check the details of the first customer.
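The output is cut off in this excerpt, but a lookup like the following (using CustomerID 12346, the first customer mentioned above) would show those details:
df1[df1['CustomerID'] == 12346]  # all transactions for this customer
rfmTable.loc[12346]              # their recency, frequency and monetary_value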
|
https://towardsdatascience.com/find-your-best-customers-with-customer-segmentation-in-python-61d602f9eee6
|
['Susan Li']
|
2017-10-25 04:21:02.720000+00:00
|
['Machine Learning', 'Python', 'Customer Success', 'Data Science', 'Towards Data Science']
|
5,504 |
Call-To-Action Buttons Usage Guide
|
Call-to-action buttons on websites are often neglected. Designers sometimes don’t understand exactly what makes a good call-to-action button beyond being attractive and fitting into the overall design. But call-to-action buttons are too important to be designed without some understanding of what makes them effective. After all, the main point of a call-to-action button is to get visitors to do something.
Call-To-Action Advantage Buttons
Your CTA has to really provide some sort of benefit to the user to make him/her click. Just imagine the last time you bought something on the internet… what prompted you to take action? I’m sure you took action not because you were looking for what to buy, but because you saw a good benefit attached to the ‘Buy’ button.
In the same vein, a user cannot take action if your CTA is not convincing enough — they want to know exactly what they’re getting, and what they’ll achieve with it to avoid wasting money. Therefore, your call to action has to provide a solid benefit to your customers. If people are not so sure about the value they’ll get from your CTA button, they won’t click. It’s as simple as that.
Furthermore, apart from the text in your CTA button, the button color and placement are equally as important as the message. For example, lots of marketers have discovered that placing a subscription box on the bottom of the landing page performs best, while other people saw an increase in conversions when they placed the button on the left side of the page.
It’s your duty to find out which placement works best for you. You don’t have to do what others are doing, just test, test, and test some more before choosing a winner.
Also, figure out which button color works well for you. Green buttons may imply money and prosperity, but the best choice is to always test. Test every element of your CTA (including button color).
Looks Like Buttons
The subject of “signifiers” is critical when it comes to conversions and user experience (UX). When we mention “signifiers” in the web design space, we’re mostly talking about making every element on a web page look exactly like what it’s supposed to be used for. It means that a button should look like a button… and nothing else.
This will make it easy for users to immediately identify it as an element that they should click on to initiate an action. So let me ask you… when a first-time visitor lands on your landing page, will he/she absolutely identify which elements are clickable? Or will he/she get confused and start guessing what to do? If you agreed with the second question, then you have to change something immediately. In a nutshell, buttons are generally easier to click when we’re sure they’re clickable.
It’s no wonder why gray buttons often convert poorly — they look deactivated, so lots of visitors won’t even know they’re expected to click them. Can your visitors easily identify the CTA on your site and landing pages? Is the call-to-action visible enough? Does it have signs implying clickability? Finally, another good idea to make your call-to-action stand out is to have lots of space around it, like the PayPal ones.
Curiosity
When a user sees this level of openness, they know exactly what they’re supposed to do. Make your visitor curious: use curiosity effectively, and you’ll see a massive boost in conversions. According to Andrew Sobel, one of the 6 rules for evoking curiosity is: “Tell people what you do and the results you get, not every detail about how you do it. The former is interesting; the latter can become tedious.”
Curiosity brings out the burning desire to know something you didn’t know before. If you design your call-to-action message in a way that could create a burning desire for your prospects to find out what’s on the other side of the CTA, they’ll be more willing and eager to click, thereby giving you the lead generations you want. And, remember: The higher your click-through rate, the more sales you’ll generate.
In other words, emotional triggers like surprise, trust, fun, delight, and, most importantly, satisfaction arouse curiosity in your users: For example, when people trust you, they’ll be more willing to click. In the same way, when people are delighted with your PPC ads or landing page copy, they’ll immediately click, because they envision a benefit.
You should always remember that your target audiences are human beings who continually make emotional and rational choices depending on the information presented before them.
For Free
We all love free stuff, especially when it’s useful free stuff. Although there may be no such thing as a free lunch, even in free town, as humans, we can’t resist the attraction of a bonus, including a free eBook that sounds interesting. Offering your customers a helpful freebie is one super-effective way of attracting and retaining more of them. Therefore, you have to start offering a bonus in your CTA message, too.
For example, when a company offers you a great opportunity to save a little money while making a purchase, that’s a reward because they’ll bear all the risk and you’ll gain more. In fact, the majority of telecommunications service providers out there offer some kind of “bonus”, such as free shipping, extra savings, rebates, and “buy-one-get-one-free” offers.
Attractive Call-To-Action
Your sales copy and PPC ad campaigns, promotional banners, and landing pages can only drive quality leads and customers to your business when they click on your call-to-action button.
To a significant extent, a high click-through rate (CTR) equals a higher conversion rate. If all the other important elements, like your sales funnel and offer, are properly optimized for your target users and you’re still not seeing conversions, the problem is likely with your CTA.
|
https://medium.com/visualmodo/call-to-action-buttons-usage-guide-be78c8755e7
|
[]
|
2019-01-21 19:05:07.478000+00:00
|
['Marketing', 'Call To Action', 'Buttons', 'Inspiration', 'Cta']
|
5,505 |
Thinking Outside the Books
|
Thinking Outside the Books
A library without librarians, a bookshelf without books, and other wonders of the modern word.
It’s late in the night, when most people are getting ready for bed — if they aren’t asleep already. Daytime shops have closed long ago; they wait with their shutters down until the next morning. But in this silent street, one door remains open.
It leads to a library.
But what an extraordinary library this is! There are no staff working inside; no librarian. You walk in through the door, after having your library-card scanned: walk in, pick the books you want to borrow, and walk out again. As you move about, the library’s sensors follow your motions with their electronic eyes.
And when you walk out, the library detects which books you’ve borrowed, automatically entering them into your account.
This is an ‘Intelligent Library’, one of several set up around Taiwan to help people access books more easily. Intelligent Libraries are not meant to replace ordinary ones. Instead, they act as a supplement. They extend a library’s reach to places where a full-fledged, human-handled library would be too impractical.
Intelligent Libraries can also open earlier and stay open later into the night. That’s important, because with ever-lengthening work hours, that’s the only time people have for reading. Libraries in China and Singapore regularly remain open until midnight.
But some libraries never close at all.
The University of Bath, UK, was the first in the country to try out a 24-hour library. That was in 1996. But it’s only in the past decade or so that other universities have started to follow suit.
24-hour libraries are open round the clock. They aren’t always in use, but they’re still useful for students who work late at night to catch up on assignments, or wake up early morning to prepare for an upcoming exam. Foreign students use the space to conduct Skype calls with relatives in different timezones.
Some people are concerned that 24-hour libraries may lead to bad habits. Students may get the impression that they’re expected to work late, rather than catch up on much-needed sleep. They might do that anyway, but an all-night library would only encourage the habit.
Students don’t buy that argument, though. If they’re so pressed for time, it’s useful to have library access whenever they need it instead of worrying about opening and closing times.
Either way, it seems the 24-hour library is valued mainly for the space it provides. The question of books hardly ever turns up.
Maybe that’s why some libraries have decided to do away with them altogether.
The Vision IAS Library, Delhi, is one of many such spaces that have sprung up in India’s capital. This ‘library’ has soothing air-conditioned rooms to keep out the worst of Delhi’s heat. They have WiFi access, open discussion spaces, the usual silent atmosphere, and everything else you’d expect from a library.
Well, almost.
It doesn’t have any books.
Instead, what it has are rows of desks, at which you can sit and study without disturbance. You can book desks in one of three ‘shifts’ — morning, evening, and night — or pay extra for round-the-clock access.
The Vision IAS Library was started in 2011, after Shalini and Sanjeev Rathod had failed several attempts to crack the government-job IAS exam. They decided to use their experience to conduct coaching classes, but then they realised students were missing one more thing: a comfortable, distraction-free space to study.
Libraries like the Vision IAS don’t store books, because exam textbooks change every year. Instead, some provide lockers for people to store their own belongings. Though summer is peak season, these ‘libraries’ are in demand all through the year — some so much that they even let you pre-register online.
While Vision IAS is a library without books, you could say Safari Books Online is books without a library.
Started in 2001, Safari Books Online is the Netflix of digital textbooks. You pay a monthly or yearly fee, and get unlimited access to the whole collection. And you can read the book in whichever ebook format you prefer, on a variety of devices.
Safari Books Online started with a focus on computer science and programming, but it has expanded to other areas as well. All this is still about textbooks, though. What about proper books — books that you want to read, for fun, and for pleasure?
Ignoring all the illegal pirate sites out there, Project Gutenberg and WorldCat are the places to go.
The idea behind Project Gutenberg is simple. Take all the books whose copyrights have expired. Scan them, digitise them, and put them up for people to read.
But copyrights take time to expire. Nowadays, companies renew them even after the original authors are long gone. So Project Gutenberg is good if you want to dip into Alice in Wonderland or Sherlock Holmes, but not if you want to check out the latest New York Times bestseller.
That’s when WorldCat comes in. It’s a catalogue of real, physical libraries from all round the world. Search for a book, and WorldCat will tell you which libraries have a copy (of the libraries who’ve registered with WorldCat, that is). Then you can go to the library and pick up the book, or, in some cases, even have it home-delivered.
Of course, if the library’s too far from you to access, you’re out of luck. Or are you? Maybe not: libraries can also lend you books through the Internet. And the Internet is seldom “too far away”.
How does that work? Well, libraries don’t just lend out physical books. They can also lend out ebooks. Lending works the same way: you get the book for a while, and then, when you’re done, you “return” it by deleting it from your device. Libraries keep track of how many copies of an ebook are ‘lent’, and make sure not to lend more than they’re allowed.
Usually, ebook lending is only available for people who visit the library. But through WorldCat, libraries can lend ebooks to anybody in the world (though you’ll still have to pay).
If you’re like me, you wouldn’t consider an ebook a ‘proper’ book. It would have the same text, of course, but everything else is different.
Ebooks are getting very popular because they’re cheaper, lighter to carry, and don’t cut trees. But if you don’t like ebooks, don’t worry: there are initiatives to promote ‘proper’ book reading, too.
The Delhi Metro is a busy place. Over two-million commuters use it every day to get around the city.
There are harried businessmen for whom the ride is the only routine they have during the day. There are students squeezing in some last minute revision, brick-like textbooks propped open in their arms. There are suburban housewives in their bright clothes and gaudy lipstick, heading into the city to meet their friends for lunch. And they’re all trying to get to the places they want to go as fast as possible.
But somewhere in a corner, if you look very carefully, you might find a book.
‘Books on the Delhi Metro’ is a project by writer Shruti Sharma and her husband Tarun Chauhan. With a tagline of “Take it, Read it, Return it”, the idea is to leave books in random places for people to find. If you find a book, you can take it home and read it. When you’re done, leave it somewhere — anywhere — in the metro system, for the next person to find. People are also asked to leave a tweet, to keep track of where the books are going.
Books on the Delhi Metro is currently only in Delhi, but there are plans to expand it. And it’s not the only such initiative. The Delhi project was itself inspired by Emma Watson, who played Hermione in the Harry Potter movies.
Emma Watson left books on the London Underground. She later went on to start Book Fairies, which lets anyone around the world become a ‘book fairy’ by leaving books in random places for people to pick up. Other projects, like BookCrossing, let you do the same thing too.
With reading habits going down and bookless libraries going up, you may be worried about the future of libraries. But if Books on the Delhi Metro and similar initiatives catch on, we won’t have to worry so much about them vanishing.
The whole world will be a library.
Have something to say? At Snipette, we encourage questions, comments, corrections and clarifications — even if they are something that’s easily Googled! You can also sign up for our weekly email.
Sources and references for this article can be found here.
|
https://medium.com/snipette/thinking-outside-the-books-7e484fa1aa41
|
['Badri Sunderarajan']
|
2018-10-28 02:31:02.458000+00:00
|
['Books', 'Reading', 'Libraries', 'Culture', 'Books On The Delhi Metro']
|
5,506 |
Neural Networks, Demystified
|
You’ve no doubt heard about neural networks — the mysterious and sci-fi-like technology that makes for a great buzzword. But being someone who isn’t technical, you’ve written them off as an enigma left only for computer science nerds (like myself). That changes today with this primer on how they work, designed for people who know nothing about computer science, coding, or mathematics.
What is a neural network?
A neural network can be thought of as an artificial information processor. It takes input, processes it in some way, and then produces some output. The structure of the network defines how it does the processing, with different structures producing different output. The result is networks that can classify images, translate languages and much, much more.
As we’ll soon see, some parts of the network are fixed while others, known as parameters, can change. Our goal is to adjust these parameters in such a way that our network learns to solve a problem. Initially, our network will be very bad at its task, like a child doing calculus, because these parameters are set randomly. But as we iterate through many cycles of testing the network, and updating the parameters based on its response, it will get better over time. Unsurprisingly, this repeated process of testing and updating means that training data is a big part of neural networks.
Let's take a look at what a neural network looks like.
Network Architecture
The original motivation for neural networks is neurons in the human brain, which have a few important features:
1. Neurons in our brain are connected to each other by a massive network, where the output of some neurons can serve as input to others.
2. The strength of connections between neurons can change based on the frequency of use, leading to the popular phrase by Donald Hebb: “neurons that fire together, wire together”.
3. The electrochemical potential in a neuron can build up, but the neuron won’t ‘fire’ until the potential passes some threshold.
Let’s see if we can artificially replicate some of this functionality by looking at the building blocks of neural networks, perceptrons.
In the above diagram, we’ve represented two connected neurons, A and C, where the output of neuron A, x, is equal to the input of neuron C. We’ll represent neurons with nodes (the circles) and the connections between neurons with edges (the lines). You can think of a neuron like this: it takes some input, it holds a value (some combination of its input) and then passes that value on as output. So far this model satisfies the first feature listed above. Let’s introduce connection strengths.
We can vary the strength of connections by introducing a connection weight, w. The input to neuron C will now be the output of neuron A, x, multiplied by the weight w. Intuitively, the greater (smaller) the value of w, the stronger (weaker) the connection between the two neurons. This satisfies the second feature. Finally, let’s introduce potential thresholds.
We’ve now introduced another neuron B which has a value of b and a connection weight of -1. B is known as a bias, and we’ll see why soon. The input to C becomes the weighted sum of A and B, that is, w*x + (-1)*b. Next, we apply the step function to the input at C which is defined as f(x) = 1 if x > 0, 0 otherwise.
The step function
To summarise, the value at C becomes 1 if w*x - b > 0, and 0 otherwise. Why on earth would we do that? Well, the value at C will equal 0 if w*x ≤ b. In other words, the bias b acts as a threshold that we need to pass in order for the value at C to not be 0. This is exactly like the third feature of neurons discussed earlier! Because of this, we call the step function an ‘activation function’.
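As a tiny illustration (my own sketch, not from the original article), the whole single-input perceptron fits in a few lines of Python:
def step(x):
    # Fires (returns 1) only when the input passes the zero threshold.
    return 1 if x > 0 else 0

def perceptron(x, w, b):
    # Weighted input from neuron A, minus the bias threshold from neuron B.
    return step(w * x - b)

perceptron(0.8, w=2.0, b=1.0)  # returns 1, because 2.0*0.8 - 1.0 = 0.6 > 0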
There’s only one problem. The vertical part of the step graph at x = 0 means that it's not differentiable. If you don’t know what that means, don’t worry (see the conclusion if you’re not satisfied). All you need to know is that we can approximate the step function with the sigmoid function.
The Sigmoid function
You can think of the sigmoid function as a ‘squishification’ of all possible inputs to fit between 0 and 1. The larger (smaller) x is, the closer sigmoid(x) is to 1 (0).
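In code (again my own sketch, not from the article), the sigmoid and its ‘squishifying’ behaviour look like this:
import math

def sigmoid(x):
    # Squashes any real number into the open interval (0, 1).
    return 1 / (1 + math.exp(-x))

sigmoid(10)   # ~0.99995, very close to 1
sigmoid(-10)  # ~0.00005, very close to 0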
We can extend our current model to have many neurons feeding input, each of which will have its own weight. Note that only one of these will be a bias. Again, the input becomes the weighted sum (the product of the output of each node by its connection weight) of the neurons before it.
And while we’re at it, why not add a couple more nodes to each layer, and a couple more layers of connectivity? We call the layers between the first layer and the last ‘hidden layers’. Here, each layer will have only one bias.
We generally start by populating the leftmost layer of neurons and move ‘forward’ through the network by calculating the value of each neuron in the next layer, and so on. Finally, we can compute the value of the neurons in the output layer.
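Here is a minimal sketch of that forward step for one layer, assuming the sigmoid function defined above and hypothetical weight and bias lists:
def forward_layer(values, weights, biases):
    # values: outputs of the previous layer
    # weights[j][i]: weight on the edge from neuron i to neuron j
    # biases[j]: bias threshold for neuron j in the next layer
    next_values = []
    for j in range(len(weights)):
        weighted_sum = sum(w * v for w, v in zip(weights[j], values)) - biases[j]
        next_values.append(sigmoid(weighted_sum))
    return next_values

forward_layer([0.5, 0.9], [[0.2, -0.4], [0.7, 0.1]], [0.1, 0.3])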
As we said before, there are some fixed features and some parameters in our network. The general structure, that is, the number of layers, the number of nodes in each layer and the activation function are fixed. Based on how we move forward through the network, the value of each neuron is deterministic given the neurons and the weights that precede it. Therefore the only thing we can change, our parameters, become the weights of the connections between neurons.
Now that we understand what a network is, let’s look at how we can use it to solve a problem.
How it ‘learns’
We’ll take a look at one of the most famous machine learning tasks, identifying handwritten images.
The general process for learning is this:
1. Define a network
2. Pass an image into the network (input)
3. The network will predict the label of the image (output)
4. Use the prediction to update the network in such a way that it ‘learns’
5. Return to step two and repeat
Let’s assume the images are each 28x28 (784) pixels, and since they are grey-scaled, the value of each pixel ranges from 0 (black) to 1 (white). In order to train the network, we need training data in the form of images and their associated label.
The first layer of our network will represent the data; it’s how we feed a data point (an image) into our network. There will be 784 neurons in the first layer (plus a bias) and the value of each neuron will be the value of a pixel from the training image. The last layer in the network will represent the output; the model’s prediction for the label of the image. There will be 10 neurons in this layer, and the closer the value in neuron i is to 1, the more the model thinks the image has label i.
Initially, we set the weights of the graph to random values, which is why the initial predictions won’t be very good. The choice for the number of hidden layers and the number of neurons in each is a challenging problem to solve, which we’ll skip over. For educational purposes let’s just assume there is 1 hidden layer with 10 nodes, and look at an example.
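Since the worked example is cut off in this excerpt, here is my own hypothetical sketch of how the pieces fit together for this 784-10-10 architecture, reusing the forward_layer function from earlier (the random weights stand in for whatever training would eventually produce):
import random

w_hidden = [[random.uniform(-1, 1) for _ in range(784)] for _ in range(10)]
b_hidden = [random.uniform(-1, 1) for _ in range(10)]
w_output = [[random.uniform(-1, 1) for _ in range(10)] for _ in range(10)]
b_output = [random.uniform(-1, 1) for _ in range(10)]

pixels = [0.0] * 784                       # one flattened grey-scale image
hidden = forward_layer(pixels, w_hidden, b_hidden)
output = forward_layer(hidden, w_output, b_output)
prediction = output.index(max(output))     # the label the network 'thinks' it sees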
|
https://towardsdatascience.com/neural-networks-demystified-49f3426d4478
|
['Ben Rohald']
|
2019-07-15 14:52:10.702000+00:00
|
['Machine Learning', 'Education', 'Neural Networks', 'Artificial Intelligence']
|
5,507 |
Starting your career during a pandemic: 40 young journalists enter a new journalism landscape
|
The response to this year’s GNI Fellowship call for applications was exciting and a great sign that many young people are looking to land their dream job in journalism. More than 1,400 students and graduates looking to kickstart their journalism careers applied for a summer placement. Leading European news organisations across 14 countries will now host 40 Fellows for eight weeks.
The GNI Fellowship seeks to bring young talent into newsrooms and help them become more diverse and interdisciplinary while staying at the forefront of the use of technology in journalism. For many aspiring journalists, the Fellowship is their first paid job and a step that can kick-start their career in journalism.
The Fellows were carefully selected by each host organisation for the skills they offer in digital and data journalism, audience and product development, verification and fact-checking, all of which are in high demand to help move the industry forward. Amid a pandemic, which has pushed many newsrooms to find new ways of producing journalism, these skills are needed more than ever.
Meet the Fellows
The cohort is a young group of aspiring media professionals with backgrounds that range from journalism to UX/UI design, and from computer science to philosophy. Here are the GNI Fellows for 2020 and the main areas they will focus on:
Area: Data Journalism and Visualisation
‘The GNI Fellowship gives me an opportunity to learn a new inspiring way of reaching and engaging audiences by using data and new storytelling formats. I am really looking forward to working at the intersection between journalism and design and connecting with the community of fellows from across Europe.’ - Pilar Tomás Franco
‘I look forward to using new technologies and narrative formats to complement my previous journalistic experience. Because we need strong critical journalism and we need to make it ready for the future.’ - Gabriel Rinaldi
Area: Audience engagement and digital storytelling
‘I’m most looking forward to have the opportunity to make meaningful change in the environmental sphere, learn from industry professionals and experiment with social content.’- Katie Anderson
Area: Fact-checking, verification and investigative journalism
‘The GNI fellowship is an exceptional opportunity to work with the best fact-checking journalists of a leading news organisation. I will get to learn about the state-of-the-art technological tools and latest innovative formats used in this domain. It is really exciting!’- Juliette Mansour
Area: Design and product development
|
https://medium.com/we-are-the-european-journalism-centre/starting-your-career-during-a-pandemic-40-young-journalists-enter-a-new-journalism-landscape-5ce92d0fa33f
|
['Charlene Arsasemita']
|
2020-06-30 11:48:11.162000+00:00
|
['Journalism', 'Fellowship', 'Innovation', 'Media', 'Updates']
|
5,508 |
How Denying My Sexuality Destroyed My Ability to Love Those Most Like Me
|
I was once asked to protect an archbishop from gay Catholic protestors during a church service.
It is a rather unusual thing to admit, especially considering I am gay myself, but life is nothing if not ironic.
I was in seminary at the time, training to be a Catholic priest. One day the rector in charge of the seminary asked to see several of us whom he had identified as leaders. We had quickly bought into his vision of a more masculine and orthodox church reform led by the men of Catholicism, and he had made us his lieutenants.
Tomorrow, he explained, the Archbishop will be celebrating Mass at the university chapel. A group of LGBT Catholics are planning on protesting the service, probably inside the chapel and possibly during communion.
For Catholics, communion is the high point of our worship, where God becomes physically present under the auspices of simple bread and wine. A small number of protestors around the world had taken to desecrating the communion host to make their point, which, when you believe God is physically present, is a truly offensive protest. Thus, any protests at a Mass were taken seriously, whether a threat was actually made or not.
The four of us, the rector said, would protect the Archbishop and protect Jesus at communion.
I was thrilled.
me in my seminary days
The next day we walked over to the chapel together, in our khakis and blue polos with the name of the seminary embroidered on the left chest. Standard-issue seminarian uniform. Honestly, we looked more like an early 2000s dance group in a retail clothing commercial than bouncers — a fact that was once called out to us on the street by an actual bouncer in New Orleans while fifty of us walked by: Five dollar wells! Four dollar domestics! Ladies get in free! And what is that, a motherfucking Gap commercial?!
As we processed down the aisle at the beginning of the Mass we could see a group of four or five men and women sitting towards the back. Some wore a rainbow flag draped over their shoulders like the suffragettes in Mary Poppins, others a more modest pin. They all seemed to be at least fifty.
The scripture readings and sermon moved along without any disruptions. The protesters sat quietly, standing only when the whole congregation did, saying the usual prayers along with us. I was both relieved and disappointed. Part of me was grateful they were being so respectful, but I also was curious to see how any conflict would play out. Church is stereotypically boring for a reason — what would it look like with a little bit of rebellion thrown in the mix? Just being asked to be there, my ego was already sky high. My mind raced at what I would do if I had to step in.
The priest had asked us to stand alongside everyone handing out communion in case anything happened. The big question — and the heart of the protest, really — was whether the Archbishop would give the LGBT members communion. They were protesting the Catholic Church’s treatment of its queer members and symbolically, you couldn’t get much better than being prohibited to even share a meal with the rest of the congregation.
Devout Catholics would object to that set up. There’s no discrimination in barring LGBT Catholics from communion, they say. There are certain beliefs required of all members and various actions prohibited — confession awaiting all those who fall short — in order to be ready to participate in communion. There are no separate rules for gay and straight Catholics, just one set of beliefs that binds us all together.
Add onto that an understanding that the solemnity of communion is such that protesting or in any way politicizing the act would be wildly inappropriate and it is no large surprise that the threat of rejection floated in the air.
The small group queued up in the Archbishop’s line as I stood tall and still right beside him. It is not normal in Catholic churches to have anyone accompanying a person distributing communion, and given the potential for disruption we believed was present, I did my best to look intimidating. By this point I was nineteen and into it. I felt like a knight protecting royalty, or maybe a Secret Service agent accompanying the president as he shook hands or kissed babies.
A woman in her sixties approached the Archbishop and gently put up her wrinkled hands like the rest. Her grey hair was cut short and an “LGBT Catholic” button was placed prominently on her chest. I glanced out of the corner of my eye and caught the Archbishop’s face. He smiled and had a genuine tenderness in his eyes. He reached down and put his hand on her shoulder like he did for people not ready to receive communion and said a small prayer asking for God’s blessing.
The woman seemed to crumple like crepe paper under his hand. Tears gushed out and she quickly walked past the line for the wine and returned to her pew, muffled sobs echoing off the plaster walls.
I betrayed no emotion, but my eyes followed her all the way back to her pew until her head fell down into her hands. This was the great threat I was here to thwart, and honestly, I was proud to have played my part. Her tears did not move me. The dignity of the Church had to be guarded and the truth she bore upheld. If her stunt left her feeling rejected, good.
I was a knight. I was nineteen.
When Mass was over we accompanied the Archbishop back to the sacristy where he could hang up his priestly vestments. We seminarians were in formation around the Archbishop as the protestors approached outside. They stayed just outside our perimeter, signs that had been tucked away during the service suddenly hoisted high as they chanted about justice for gay Catholics. It was all over within several seconds. By the time the Archbishop had stowed away his vestments everyone had dispersed. Within fifteen minutes I was back in my room, my feet propped up and a Philosophy 101 textbook in my lap.
a different, bigger protest
What worries me most, looking back on that episode, is not the dramatic overreaction in itself. The desire to protect the church, its leaders, and yes even God, is a sincere one. It was silly in its intensity, but not rooted in malice. Those protestors were incredibly peaceful, and honestly not even that good at being distracting. The part I struggle with is how easily I was able to disassociate myself from the LGBTQ Catholics that were there.
In 2005 I was still closeted to all but my family and a few old friends. In the seminary, I was trying desperately to increase the Catholic side of my identity, hoping it might somehow consume the gay one. Hoping that with enough fervor and devotion, the parts of me the Church found lacking would fade away and be forgotten.
The thrill I got at standing tall and being the Catholic knight, the defender of what is right and true and good, eclipsed any reality of how close I was to those who were approaching the altar, and why. I think my subconscious did some kind of primitive, communal calculus, and decided the queers wanting a place at the table were a threat to my own inclusion as a real Catholic.
Watching that woman with her wrinkled hands held out to the Archbishop, my mind wouldn’t allow me to see how close I was to her. How it had been only a couple of years since I was lying in bed with a boy and then mumbling to my mother the next morning that maybe the Church could change.
Instead, I put my shoulders back and my eyes forward and decided that I had been the one who changed. I wasn’t gay — not really. I struggled with same-sex attraction, but no one needed to know that. It wasn’t a real part of me. Being a good son of the Church was my identity now.
I wonder what might have gone differently if I had learned to see a part of me in those protestors. I can’t imagine I would have been willing to play the Secret Service agent. My discomfort with turning a church service into a protest would probably have remained. There are certain places and times that need to remain inherently reverent and free from antics, righteous though the cause may be.
But my willingness to engage, to listen, to search out a more proactive solution would have been there. If I had just been able to say, I disagree with how they are going about this, but I at least understand how I could have ended up there myself. That changes the disposition of the dialogue entirely.
As should be plenty obvious by now, I have switched sides on the gay/Christian debate, and think it is the Church that needs to move more towards the gays, not just the other way around. But I hope I can remember the path I took to get there. How I was able to stand next to the Archbishop and, in my heart like Peter, utter my own, Woman, I do not know the man.
I need more compassion for those who disagree with me, even while I insist on the need to recognize the full humanity of LGBTQ individuals and their rights. The empathy piece is key though. Both in humility to recognize how easily I could have ended up in their shoes. And in practicality, to adopt a more inviting and, dare I say, Christian approach to conflict.
But I also believe trying to shut down my gayness, to compartmentalize and lock away this part of me, is what made me able to so harshly view those protestors as only their sexuality. The smugness with which I smiled inside at that woman’s pain was a direct consequence of my being taught to hate and push away being queer. The coded language of you are more than just your sexuality that I was constantly told meant, in practice, that any acknowledgment of my sexuality was to claim queerness as my sole identity. And when I saw it in another, I suppose I wanted it destroyed.
A refusal to see myself clearly meant an inability to recognize anyone like me. That is the terrible hole so many of us gay Christians are still trying to climb out of. It is how some of the most vocal opponents of gay rights end up coming out of the closet when the weight of it becomes too much. They look at someone with all the same life experiences and instead of seeing themselves, only see what they think could shatter them.
There is a lot about my youthful zeal I wish I could go back and do again. But few pieces of my history gnaw at my conscience like my role in seeking to intimidate that woman. To intimidate myself, really. So far from fear not. I stood tall and puffed out my chest, but inside I was cowering.
What a grace to know I have nothing to fear. I hate the role I played in sending others away from the communion table. But I cherish my part in being able to invite them back.
|
https://medium.com/reaching-out/how-denying-my-sexuality-destroyed-my-ability-to-love-those-most-like-me-7f4c00c49eaf
|
['Patrick Flores']
|
2018-01-14 16:53:05.732000+00:00
|
['Reaching Out', 'Christianity', 'Life', 'Storytelling', 'LGBTQ']
|
5,509 |
React to React Native: Tips and tricks for your journey
|
React Native is gaining popularity these days and many people try to create React Native components out of their existing React ones. This process is not always straightforward so we have created some quick tips to make the migration smooth and error-free.
1. Replace HTML elements with React Native components
The first and most important step in migration is to replace all the HTML elements — as React Native doesn’t support HTML — with React Native components. For instance, div/section should be replaced with the View component, and h1, h2, … h6, p and similar text-based elements should be replaced with the Text component. For example:
// Web / HTML Component:
const TextComponent = ({content}) => <h1>{content}</h1>

// React Native version of the above Component:
import { Text } from 'react-native';

const TextComponent = ({content}) => <Text>{content}</Text>
Such React Native components compile into Native code based on the platform and, hence, constitute the fundamental building blocks of the app.
2. Conditional rendering of components can be tricky
Conditional rendering is one of the most commonly used patterns in React. Say we are conditionally rendering TestComponent as follows:
<View> //React Native
{ifTheConditionIsTrue && <TestComponent />}
</View>
The above code works fine in React Native unless the variable ifTheConditionIsTrue is an empty string. If ifTheConditionIsTrue becomes an empty string, React Native tries to render the string itself, expects a <Text> component to encapsulate it, and breaks the app.
The solution is type coercion. Adding a !! before ifTheConditionIsTrue converts the variable to a proper boolean, so React Native never sees the empty string. The solution looks like this:
<View> //React Native
{!!ifTheConditionIsTrue && <TestComponent />}
</View>
If you are using TypeScript, note that nullish coalescing (??) won’t actually help here: it only falls back when the left-hand side is null or undefined, so an empty string would still be rendered and break the app. An explicit ternary is the safer option and keeps the readability, like so:
<View> //React Native
{ifTheConditionIsTrue ? <TestComponent /> : null}
</View>
Since the ternary always resolves to either the component or null, React Native won’t expect a <Text> component even if ifTheConditionIsTrue is an empty string.
3. You always click, so learn to press
In React, we use onClick synthetic events in components. As we don’t use a mouse on a mobile phone (yet), there is no onClick event. Instead, we have the <TouchableOpacity onPress={}> component, which handles press events on mobile phones. Hence, all the onClick events should be changed to onPress in order to execute the callback when interacting with the components.
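A minimal sketch of the swap (SaveButton and handlePress are illustrative names, not from the original article):

import { Text, TouchableOpacity } from 'react-native';

// Web: <button onClick={handlePress}>Save</button>
// React Native: wrap the content and use onPress instead.
const SaveButton = ({ handlePress }) => (
  <TouchableOpacity onPress={handlePress}>
    <Text>Save</Text>
  </TouchableOpacity>
);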
4. Platform-agnostic components pose a challenge
When we build an app using React Native, the code gets compiled into native code depending on the platform (iOS or Android). Maintaining consistency across platforms is quite difficult, especially with the Picker/Select/DropDown. For instance, the native picker component resembles a dropdown on Android and opens a modal with options on iOS. If you want to maintain a consistent design, either build a custom component or use libraries such as react-native-picker-select.
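As a rough sketch, basic usage of react-native-picker-select looks something like this (the items are placeholders; consult the library’s docs for the full API):

import RNPickerSelect from 'react-native-picker-select';

// Renders a consistent picker on both iOS and Android.
const FruitPicker = () => (
  <RNPickerSelect
    onValueChange={(value) => console.log(value)}
    items={[
      { label: 'Apple', value: 'apple' },
      { label: 'Banana', value: 'banana' },
    ]}
  />
);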
5. SVGs
Handling SVGs in React Native is one of the most difficult things. Check out this article to find out how to deal with icons in React Native, which are mono-color SVGs. If you want to render “complex” SVGs with more than one color or attribute, you should use npm packages such as react-native-svg.
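For instance, a minimal multi-color graphic with react-native-svg might look like this (shapes and colors are arbitrary):

import Svg, { Circle } from 'react-native-svg';

// A two-color vector badge rendered natively.
const Badge = () => (
  <Svg height="100" width="100" viewBox="0 0 100 100">
    <Circle cx="50" cy="50" r="45" fill="gold" stroke="navy" strokeWidth="5" />
  </Svg>
);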
6. Ditch CSS
React Native doesn’t support CSS (or SASS/LESS), and hence we will have to use either a CSS-in-JS solution or StyleSheet from React Native. Of course, inline CSS is always an option, albeit the lowest-priority one.
A personal suggestion is to use styled-components, as it enables reuse of existing CSS via its css helper, wherein you may pass CSS as a string without changing much. And if you are already using styled-components for your React components, it’s quite convenient because it provides the same API to style in both React and React Native. This increases development speed substantially.
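A minimal sketch of that reuse, assuming styled-components/native is installed (cardStyles and Card are illustrative names):

import styled, { css } from 'styled-components/native';

// Existing CSS can be dropped in as a string via the css helper...
const cardStyles = css`
  padding: 16px;
  border-radius: 8px;
`;

// ...and shared across styled React Native components.
const Card = styled.View`
  ${cardStyles}
  background-color: white;
`;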
A detailed explanation of the effective usage of styled components can be found in this article, which also covers how to handle inline styles with styled components.
7. Inline styles
Inline styles — which are best avoided — are written as objects and accept string values in px for React components. But in React Native such values should be passed as numbers, without px. Pixel values in inline styles tend to break React Native.
Also, React Native doesn't support the browser's shorthand CSS strings in styles. For instance, padding: '10px' will break React Native. The individual properties should be listed as numbers, as follows:
style={{
  paddingTop: 10,
  paddingBottom: 10,
  paddingLeft: 10,
  paddingRight: 10,
}}
To avoid all these issues, it's better to use a CSS-in-JS solution.
8. FlexBox is key
Flexbox can be used with React Native, and it makes it a lot easier for developers to maintain layouts between React and React Native. However, on the web the default value for flex-direction is row, while in React Native the default value is column.
Hence, to maintain a uniform layout, we need to specify the value of flexDirection explicitly, as in the sketch below.
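For example, a minimal StyleSheet that restores the web’s row default:

import { StyleSheet } from 'react-native';

// React Native defaults to 'column'; set 'row' explicitly
// to match the web's default flex-direction.
const styles = StyleSheet.create({
  rowLayout: {
    flexDirection: 'row',
  },
});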
If you are new to Flexbox or just lazy like I am, Facebook has an interactive tool called Yoga, which lets you create complex layouts and generates the corresponding React Native code directly.
9. Use styled components with animated components
In the case of React components, we use styled-components as follows:
import ReactComponent from './ReactComponent';
import styled from 'styled-components';

const StyledReactComponent = styled(ReactComponent)`
  /* Styles go here */
`;
In React Native, we have a set of Animated components — used for animations — which can’t be used with styled-components in the same way. styled(Animated.Text) will throw an error. How does one solve this issue?
Instead of directly using the Animated.Text component, we should leverage the createAnimatedComponent function of the Animated class and create our own component. We can then use this custom component with styled-components as below.
const CustomAnimatedText = Animated.createAnimatedComponent(Text);

const StyledCustomAnimatedText = styled(CustomAnimatedText)`
  /* Styles go here */
`;
10. Invalid styling properties
React Native doesn't support all the CSS properties supported by the browser. For instance, position: fixed is not supported, but position: absolute is. Another such example is cursor: pointer.
This React Native cheatsheet might come in handy when you are searching for styles supported in React Native.
11. Background images
Are you a CSS fanboy who uses background-image extensively? I have bad news for you: React Native doesn't support background-image, and you will have to use the ImageBackground component of React Native to set an image as a background. It goes something like this:
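The original snippet appears to have been lost in extraction; a minimal sketch of the idea, with a placeholder image URI, might look like this:

import { ImageBackground, Text } from 'react-native';

// ImageBackground stretches the image behind its children.
const Hero = () => (
  <ImageBackground
    source={{ uri: 'https://example.com/background.png' }}
    style={{ flex: 1 }}
  >
    <Text>Content rendered on top of the background</Text>
  </ImageBackground>
);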
|
https://medium.com/omio-engineering/react-to-react-native-tips-and-tricks-for-your-journey-c5f5ddfc09b5
|
['Vilva Athiban P B']
|
2020-01-10 13:48:03.030000+00:00
|
['React Native', 'JavaScript', 'Software Development', 'React', 'Styled Components']
|
5,510 |
About Me — Aria Dailee. Professional Geographer, budding…
|
About Me — Aria Dailee
Professional geographer, budding developer, and writer
Picture of me voting this year using an at-home voting booth (haha)!
Hello, everyone! I started writing on Medium in October 2020. Funny enough, I thought about starting in 2018 but decided against it. At the time, I didn’t think people would like my writing or even care what I had to say. However, I’ve had a change of heart and decided to give it a try. I’m really excited to start on this journey!
I’m a first-generation American. My parents immigrated to the United States from the Caribbean and met in New York. My father, unfortunately, passed away this year.
I’m a former online tutor with an educational background in geology (BS), geography (BA), and cartography (MS). Now, I’m a mineral commodity analyst by day and a writer by night! I love making maps, so I hope to incorporate a few in future articles.
I enjoy researching and writing about a variety of topics, but a few favorites are introspective personal essays, climate activism and sustainability, history, and tips about whatever skills I recently learned.
I’m one of those strange people who find learning and working fun. I’m always learning a new skill. Right now I’m working on refining my graphic design and web and app development skills.
My Current Personal Projects:
As of December 8, 2020
My blogging website (The first website I’ve ever made from scratch!)
Updating a Mars weather app I made using Flutter (will link to it here once finished)
Teaching myself how to use Blender (I want to create some art and maybe a few animations)
I’m currently participating in #100DaysOfCode by learning something new in Python for the next 100 days.
Feel free to say hi and connect with me! Don’t forget to check out a few of my stories listed below. I’ll update this story periodically, so be sure to check back occasionally for future updates!
|
https://medium.com/about-me-stories/about-me-aria-dailee-f32fc1a30be3
|
['Aria Dailee']
|
2020-12-08 11:46:52.012000+00:00
|
['Nonfiction', 'Autobiography', 'Introduction', 'About Me', 'Writers On Medium']
|
5,511 |
The Ice Giant Planet that Put Jupiter and Saturn in their Place
|
How did Jupiter and Saturn get where they are today? A massive ice giant planet which once orbited between Saturn and Uranus may have played a role in shaping our Solar System.
A massive planet may have once orbited between Saturn and Uranus, forever altering the orbits of Jupiter and Saturn, before heading out to space. Image credit: The Cosmic Companion / Created in Universe Sandbox
Did a massive ice giant planet once orbit in the outer solar system? And what could evidence for such a world teach us about the original positions of Jupiter and Saturn?
The ancient Solar System formed from a disk of gas and dust spiraling around the nascent Sun. At first, most astronomers believe, the earliest planets formed in regular, closely packed orbits. Soon, however, gravitational tugs from the most massive of these worlds played havoc with the regular orbits of their neighbors.
It was once thought that solar systems like our own — with small, rocky planets placed close to their parent star and larger gas giants in the outskirts of the system — would be common. But, following the discovery of 4,500 exoplanets, the makeup of our solar system was found to be rare.
“We now know that there are thousands of planetary systems in our Milky Way galaxy alone. But it turns out that the arrangement of planets in our own Solar System is highly unusual, so we are using models to reverse engineer and replicate its formative processes. This is a bit like trying to figure out what happened in a car crash after the fact — how fast were the cars going, in what directions, and so on,” said Matt Clement of Carnegie Institution.
You Should be a Model
The research team ran more than 6,000 simulations of the evolution of the Solar System, revealing an unexpected finding about Jupiter and Saturn.
The orbits of Jupiter and Saturn may have been shaped, in part, by influences from a massive world, long gone from our solar system. Image credit: NASA
Astrophysicists typically thought the two planets orbited in a 3:2 ratio — for every three orbits around the Sun made by Jupiter, Saturn was thought to trace out two trips around our parent star.
Instead, the simulations showed that the two planets were, more likely, in a 2:1 resonance, where Jupiter raced around the Sun twice for every trip completed by Saturn.
Such resonances produce systems much like the one we see in the present day — with small terrestrial planets in the inner solar system, surrounded by larger worlds.
The models also showed that the orbits of Uranus and Neptune were shaped, in part, by gravitational pulls from the multitude of bodies in the Kuiper Belt, sitting at the edge of our family of planets.
Ice Planets Leave me Cold
Voyager 2 became the first spacecraft to visit an ice giant when it arrived at Uranus in January 1986. Video credit: NASA
Another surprise was evidence for an ancient ice giant world that once existed in our Solar System, which left our family of planets long ago.
Ice giant planets are worlds far larger than Earth, mostly consisting of elements heavier than hydrogen and helium, including sulfur, nitrogen, carbon, and oxygen. Two ice planets orbit in the outer reaches of our own solar system — Uranus and Neptune.
“In the strictest definition, ice is the solid form of water. However, planetary astronomers often use ‘ice’ to refer to the solid form of any condensable molecule. These tend to be highly reflective, form clouds, and (unlike minerals) can readily change between liquid, solid, and gas states at relatively low temperatures. Frozen water and carbon dioxide (‘dry ice’) are the most familiar ices on Earth, but methane, ammonia, hydrogen sulfide, and phosphine (PH3) can all freeze in the atmospheres of Uranus and Neptune,” Amy Simon, planetary scientist at NASA’s Goddard Space Flight Center, writes for the Planetary Society.
The first known ice giant planet in another solar system was confirmed in October 2014, sitting 25,000 light years from Earth. This world, four times more massive than Uranus, orbits at a similar distance as its more familiar cousin.
Tools and techniques developed in this study might also assist researchers looking at exoplanets orbiting distant stars.
|
https://medium.com/the-cosmic-companion/the-ice-giant-planet-that-put-jupiter-and-saturn-in-their-place-451b9145687b
|
['James Maynard']
|
2020-10-29 23:15:51.676000+00:00
|
['Astrophysics', 'Space', 'Solar System', 'Physics', 'Science']
|
5,512 |
Wireframes, flows, personas and beautifully crafted UX deliverables for your inspiration
|
When people think of UX Design documentation, the first image that comes to mind is dense, long, and heavily annotated wireframes, full of boxes and arrows that indicate how a system is going to function and behave.
But it doesn’t have to be like that.
Here are a few examples of UX deliverables that are well polished, legible and simple to understand.
Sketches
Wireframes
User flows
Personas
|
https://uxdesign.cc/wireframes-flows-personas-and-beautifully-crafted-ux-deliverables-for-your-inspiration-bb7a8d99af62
|
['Fabricio Teixeira']
|
2017-10-26 13:13:51.607000+00:00
|
['Interaction Design', 'Design', 'User Experience', 'Personas', 'UX']
|
5,513 |
Junior designers, stop designing for the happy path
|
In design, the happy path is what happens when the user does everything exactly the way you expect them to. Although this can happen, it won’t always happen.
This is a happy path. Alice Donovan Rouse Unsplash
One of the most common mistakes I see in portfolios from junior designers is that they often show too much — and too little — simultaneously.
One unfortunate reality of the many design boot camps out there is that they seem to encourage students to design entire apps or entire web pages for fake businesses.
Now, before we continue, I am a big supporter of design boot camps. I believe that there is a lot of work to be done to improve many of them, and I also feel like General Assembly is a borderline criminal enterprise. The most unprepared students tend to come from there, and I have met more than a few who haven’t been able to find jobs even years later. Regardless, I think that boot camps, in general, are filling a very real gap for a career that lacks college programs. I personally attended DESIGNLAB and a few courses at BrainStation, both of which taught me a lot, although neither was perfect.
The problem with many student projects is that they take on too much. You don’t start your first job and get presented with the task of designing an entire app. You also don’t start right off redoing an entire website (in most cases). Projects are broken down much more granularly, and it’s more likely than not that you’ll be collaborating with other people throughout the project.
Many junior designers’ portfolios (including my own, when I started) will show the concept for an app, along with a chronological checklist of research methodologies, and then final designs that often don’t adhere to either Material Design or the Human Interface Guidelines and would be unjustifiably complicated and expensive to build, for no good reason.
When I talk about the “happy path” issue, I am referring to these final designs (or the wireframes). I often see designs that show a user effortlessly signing up, making an account, accomplishing a single important task with no hiccups, and then going on their way. When designing in the real world, you don’t get to just account for that happy path. You also have to account for a plethora of other paths. And I can bet you that interviewers are going to ask you about some of those paths when you show these designs.
This is more realistic, but like x10. Caleb Jones Unsplash
Some questions to ask yourself — AND DESIGN FOR:
What happens if the user exits the app and then reopens it?
What happens if the user has airplane mode on?
In the case of apps with populated lists, what happens if the user doesn’t have any “items”?
In the case of apps with the ability to buy things, what happens if the user doesn’t have a payment method on file?
In the case of apps with registration flows, what happens if the user skips a section and then wants to go back?
In the case of apps that load things (pretty much all apps), what does that loading look like? And what happens if it fails?
This list represents some of the more lightweight questions you may be asked by a product manager. The reality is that complex APIs and limits on front-end logic can require you to solve even more complex situations.
A few that I’ve come across.
What if it’s possible to save some rewards for later, but others will be automatically applied? How do we show the user which ones they can save, and what UI component makes the most sense for saving and activating rewards?
What if the user runs out of pre-loaded funds? The way it’s built, we will charge their default payment method, but how do we communicate that with them so they know?
It takes a long time to load this screen: what can we show the user first to allow them to make corrections before the rest of the screen finishes loading?
These difficult questions, in addition to fleshing out realistic designs, also challenge you to make compromises. Fake projects don’t have to make compromises, but it would be so much more impressive if they did. Very often I am faced with decisions between including some piece of information that I think the user would enjoy having and increasing the loading speed.
I’ve looked at a lot of portfolios. I guarantee if I saw some empty states, error states, or other interesting cases accounted for, I’d look again.
|
https://medium.com/design-bootcamp/junior-designers-stop-designing-for-the-happy-path-44cdae1aa69c
|
['Aaron Cecchini-Butler']
|
2020-11-25 22:36:00.774000+00:00
|
['Product Design', 'Design', 'UI', 'Careers', 'UX']
|
5,514 |
Let’s remember 2020 as the year we learned how to do things better
|
Every year at this time, I borrow some of the greetings created by my employers, IE University, to celebrate the holidays with my readers, those of you who use the site regularly, those of you who find yourselves here by chance, and those of you who read my content elsewhere.
It would be an understatement to say that this has been quite a year. Quite simply, it has been unprecedented. What’s more, wishing you all a better 2021 makes little sense: it would be hard for it to be worse than this one.
On a personal level, over the course of the year I have had to develop skills in an area, video, I was never comfortable with, and make it the cornerstone for my professional activities. I have had to give hours and hours of teaching with a mask on, and I have been able to reflect on the extreme difficulty of delivering content and conversing when half my face and that of the people I was talking to is hidden. Another of my main activities, conferences (and all the activity around them) has come to an end, bringing home to me the extent to which they inspired so much of my thinking and what I write about every day. My timeline on Google Maps has pretty much shrunk to a dot on the map, while the days are all very similar to each other, with the resulting negative impact on my creativity, which has certainly robbed me of my sparkle. To those of you who are still around, thank you for continuing to put up with me in spite of everything.
A very different year, this has been a time to think about many things, about our limits as individuals and as a society, and about the many problems that the pandemic has exposed, mainly that we need to do things differently now, and not in 20 or 30 years. Last year, my fears were based on years of research and reflection, but in 2020 we have experienced the evidence first hand.
Vaccines are now being rolled out, but we’re not out of the woods yet, and we should remain vigilant. Please continue to take all precautions. Let’s enjoy the festive season, but remember that we’ve lowered our guard before, and it only made things worse.
That said, my very best wishes to everybody. Let’s hope that 2020 will soon be a not-so pleasant memory. But above all, let’s not remember it as a lost year: even if it came at a high price… it taught us many things and that we can do things better.
|
https://medium.com/enrique-dans/lets-remember-2020-as-the-year-we-learned-how-to-do-things-better-108bc0ec06dd
|
['Enrique Dans']
|
2020-12-22 11:59:58.082000+00:00
|
['Personal', 'Greetings', '2020', 'Ie University', 'Coronavirus']
|
5,515 |
30 Mission-Driven Startups You Should Know
|
Many in Silicon Valley aspire to build products and companies that have the potential to create positive change on a large scale. Often, however, these companies can be difficult to find because they have smaller recruiting budgets, and their mission statements can get lost in the noise of startups claiming to be “changing the world.”
To help, we’ve decided to put together a list of mission-driven companies that are attempting to make a big impact on the world.
We tried to pick companies that:
have solutions that are technology-driven or at least tech-enabled
have similar hiring needs to a typical Silicon Valley startup
are focused on fulfilling basic needs of people who are typically underserved
Based on those loose criteria, we’ve put together a list of companies that we think are awesome and fit the bill.
Of course, we may have left some great ones off our list and would love to know who we missed! If after reading this list you want to learn more about these companies, here are two things you can do:
|
https://medium.com/tradecraft-traction/30-mission-driven-startups-you-should-know-35195cf45c77
|
[]
|
2018-03-01 15:51:38.911000+00:00
|
['Nonprofit', 'Tech For Good', 'Entrepreneurship', 'You Should Know', 'Startups']
|
5,516 |
Oracle Big Data Cloud, Event Hub and Analytics Cloud Data Lake Edition pt.1 : Creating the Real-Time Data Pipeline
|
Some time ago I posted a blog on what analytics and big data development looked like on Google Cloud Platform using Google BigQuery as my data store and Looker as the BI tool, with data sourced from social media, wearable and IoT data sources routed through a Fluentd server running on Google Compute Engine.
Read more at the MJR Analytics blog.
|
https://medium.com/mark-rittman/oracle-big-data-cloud-event-hub-cloud-and-analytics-cloud-data-lake-edition-pt-1-84961cd4274f
|
['Mark Rittman']
|
2018-10-16 19:52:51.795000+00:00
|
['Oracle Cloud', 'Obiee', 'Analytics', 'Big Data', 'Apache Kafka']
|
5,517 |
How a Daily 60-Minute Break From My Smartphone Fixed My Anxiety
|
Consume Great Fiction Before Bed.
This is a complete gamechanger.
I majored in Communications in college, and right before I dropped out I learned that there was an interesting phenomenon called “narrative transportation.” The Wikipedia definition of it is as follows: “Narrative transportation theory proposes that when people lose themselves in a story, their attitudes and intentions change to reflect that story.”
Think back to the last great movie you watched, one that made you feel like you were right there as the bombs went off or the dragons breathed fire or when the characters stepped out of a dusty closet into Narnia. Narrative transportation explains why when I read The Lord of the Rings I feel like I’m in The Shire, and why when I watch Stranger Things I taste the bittersweet tinge of 80’s nostalgia.
Narrative transportation takes you away from your reality to the world of whatever you’re reading or watching. Most of the time this is a wonderful thing, but in my case, it led to a lot of anxiety. You see, I used to read a lot of self-help and business books before going to bed.
Yes, these books helped me grow, but reading The Four Hour Work Week before bed would hype me up and bombard my brain with ideas. And this heightened state of arousal would follow me into dreamland and manifest as unrestful sleep, nightmares and the slow but constant grinding of teeth.
I didn’t even realize how bad I had it until I switched to fiction on my friend's advice. Reading the works of Neil Gaiman, G.R.R. Martin and even the horrorscapes of Stephen King took me out of my head. Suddenly, I wasn’t a struggling Singaporean entrepreneur anymore. I was Ser Jaime Lannister of Westeros, who, upon having his sword hand cut off, has to forge a new identity for himself as a crippled commander.
Consuming fiction, particularly speculative fiction such as high fantasy and sci-fi, serves to transport me to a realm that is infinitely bigger, brighter and fundamentally different from the one I inhabit. There, for a few short hours, I can forget my weals and woes. I can rest. I can heal.
So don’t consume non-fiction before bed. Self-help articles and the latest tragedy unfolding on Fox News can be perused during the day. Opt instead to swap out your nighttime entertainment for something more relaxing, something more magical. A good-old-fashioned novel comes to mind, or a classic fantasy flick like Harry Potter. Please, give it a try before you roll your eyes.
You’d be surprised by how much tranquillity this simple tip introduces into your life.
For God’s Sake — Put Your Phone on Airplane Mode Before You Sleep.
Or better yet, shut it off. If you can’t turn off your device for 8 measly hours a day — when you’re dead to the world, no less — it’s a sign that something is truly, terribly wrong.
I’ve been rudely awakened more times than I can count by a stranger calling the wrong number or by a text from the bank advertising their latest loan scheme. And believe me, these rude awakenings add up.
There’s a reason why many self-help gurus emphasise morning routines, and that's because the way you start your morning colours your entire day. Isn’t it stupid to let your day be dictated by something as inconsequential as a mistaken phone call?
A simple way to avoid this problem is to switch your phone off and stick it in the drawer an hour before bedtime, or at least put it on airplane mode. This simple act will also prevent you from checking your phone the instant you wake, which brings us to our next point….
Don’t Check Your Phone Immediately After Waking. Get One Mindfulness Practice in Instead.
I have to confess, this last point is the one I have the most trouble with.
It’s so easy, so convenient, so tempting, to wake up and check what notifications popped up over the night, or, if you’re work-oriented, to refresh your emails.
The problem is this habit catapults you into work mode. It doesn’t give you time to relish that delicious moment where you’re semi-conscious, your Self slow-emerging from your soupy subconscious as sunlight streams soft-yellow into your room. It doesn’t allow you time to reflect and plan for your day ahead.
Nowadays, instead of checking my notifications first thing in the morning, I practice mindfulness. For 30 minutes, I do some light stretching, then proceed to journal three pages for the day. And this is no hyperbole: I truly consider my morning practice the cornerstone of my happiness.
Writing first thing in the morning makes me feel productive. If nothing else happens, I already have three pages of freehand material done and dusted. Filling in my journal over a cup of coffee gives me rare time to introspect, to digest my past and plan for the future. And lastly, my journalling habit helps me warm up my writer’s fingers — and more crucially, my writer’s mind.
If, for some reason, you can only implement one of these three tips, let it be this one. Swap out your morning phone time for some mindful time with yourself. It doesn’t have to be journaling. Here are some great options:
Yoga
Meditation
A morning run in nature
Playing the piano over some tea
A light workout in the sun
Take your pick. It doesn’t have to be anything hardcore — your morning mindfulness practice is nothing more than a medium that allows you to spend some quality time with yourself before the busyness of the day. Done right, this practice will not only help you dispel anxiety. It will help you know yourself better.
And to know is to love.
|
https://medium.com/the-ascent/how-a-daily-60-minute-break-from-my-smartphone-fixed-my-anxiety-65c821428e6
|
['Alvin Ang']
|
2020-11-17 15:03:13.339000+00:00
|
['Self Improvement', 'Lifestyle', 'Life Lessons', 'Mental Health', 'Social Media']
|
5,518 |
Azure Functions Express: Running Azure Functions locally using Docker Compose
|
Every time I join a new project, I try not to rely too much on external environments when building and running the software that I’m working on. Most of the time, a DEV or CI environment is overrated and unstable. My approach is to run all the components locally and remove the external dependencies.
I already wrote about running a local SQL Express Docker instance; you will find this article to have the same merit:
If it were up to me, I’d write everything in Azure Functions; not everything fits the model, though. Maarten Balliauw explains this in more detail here: https://blog.maartenballiauw.be/post/2019/10/02/dont-use-azure-functions-as-a-web-application.html
However, when I do develop an Azure Function, I like to run it locally first, without the interference of anything hosted in Azure or elsewhere. The fastest way of having such an experience, in my opinion, is using Docker and Docker Compose.
The term ‘Docker’ is glorified by many and dreaded by some. I consider Docker to be just another tool in my tool belt, and a great one at that.
I think the reason I like Docker so much is the versatility of the tool:
Do you have an ASP.NET Core Web Application? Docker.
How about an SQL Database? Docker.
Angular Frontend Application? Docker.
An executable that you’d likely run via a Windows Service? Docker.
Database Migrations? Docker.
Azure Storage Emulator? Docker.
Your entire CI/CD Jenkins Pipeline? Docker.
Local Azure functions with Docker and Docker-Compose
Everything I’ll mention here is contained in the accompanying git repo below:
I’ve created a template C# Azure Functions project for you to scaffold that holds the following features:
An HTTP Triggered function
A Blob Triggered function with Blob output binding connected to a local storage account
A Queue Triggered function connected to a local queue
An easy “start and stop” way of hosting this function locally
What you will need to do first in order to get started:
Install Docker
Install Docker-Compose
Install Azure Storage Explorer
VS Code with the REST CLIENT extension
Clone the repo
Run the following command:
docker-compose up
First, you will see the Azurite image being pulled from Docker.
Next, you will see the Local.Functions project being built and containerized.
Finally, when both containers are ready, they are spun up inside of one network using their respective names:
Now, you can open up the Storage Explorer and browse the local.storage.emulator’s Blob storage and Queues:
Preparing the environment
Create two containers named input-container and output-container.
Create a queue named queue.
Working with the Queue
Add a new message to the queue via the Azure Storage Explorer
In a few moments, you’ll see the message getting picked up by the local.functions container.
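As an aside, if you would rather script the “Preparing the environment” and queue steps than click through Storage Explorer, a small Python sketch can do the same job. This is only a sketch under assumptions: it uses the azure-storage-blob and azure-storage-queue pip packages, the well-known Azurite development account, and the default Azurite ports (10000 for blobs, 10001 for queues); adjust the endpoints if your compose file maps different host ports.
from azure.storage.blob import BlobServiceClient
from azure.storage.queue import QueueClient

# Well-known Azurite development account; the key below is the publicly
# documented default, not a secret.
CONNECTION_STRING = (
    "DefaultEndpointsProtocol=http;"
    "AccountName=devstoreaccount1;"
    "AccountKey=Eby8vdM02xNOcqFlqUwJPLlmEtlCDXJ1OUzFT50uSRZ6IFsuFq2UVErCz4I6tq/K1SZFPTOtr/KBHBeksoGMGw==;"
    "BlobEndpoint=http://127.0.0.1:10000/devstoreaccount1;"
    "QueueEndpoint=http://127.0.0.1:10001/devstoreaccount1;"
)

# Create the two blob containers used by the Blob Triggered function.
# create_container raises ResourceExistsError on a second run; fine the first time.
blob_service = BlobServiceClient.from_connection_string(CONNECTION_STRING)
for container_name in ("input-container", "output-container"):
    blob_service.create_container(container_name)

# Create the queue and drop a test message on it for the Queue Triggered function.
# Note: depending on the Functions host configuration, queue messages may need
# to be base64-encoded before the trigger will pick them up.
queue = QueueClient.from_connection_string(CONNECTION_STRING, "queue")
queue.create_queue()
queue.send_message("hello from the local emulator")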
|
https://maartenmerken.medium.com/azure-functions-express-running-azure-functions-locally-using-docker-compose-bf6be03250fc
|
['Maarten Merken']
|
2020-09-11 13:28:41.482000+00:00
|
['Docker Compose', 'Docker', 'Software Engineering', 'Azure Functions']
|
5,519 |
Data Ingestion from 5 Major Data Sources using Python
|
Did you know that around 147 GB of data was generated per day in 2020? And we have already stored around 40 trillion GB of data so far. Not all of this stored data is the same: data types like text or numbers come in different formats. That explains why we have different types of data sources.
When you are working with data, you should know how to ingest it from different sources. In this article, we are going to ingest data from various sources with the help of Python libraries.
We will go through the below Data sources.
1. RDBMS Database
2. XML file format
3. CSV file format
4. Apache Parquet file format
5. Microsoft Excel
Do we have one Python library which fetches data from all the sources?
Nope, because every data source has its own protocol for data transfer. We have multiple Python libraries that do this job. Consider this article a one-stop place to learn about these libraries.
In this article, we explain why we save data in different sources and how we retrieve data using Python libraries.
Let’s start with our data fetching story.
1. Relational database management system (RDBMS) Database
The data in an RDBMS is saved in row-and-column format. Tables present in the database have a fixed schema. We can use Structured Query Language (SQL) directly in the database to update the tables. Examples of RDBMSs are Oracle, Microsoft SQL Server, etc.
Why do we use an RDBMS database?
· Easy to use by users due to tabular format.
· A standard language SQL is available for RDBMS to manipulate data.
· The processing speed increases if we optimize RDBMS properly.
· Maintenance is easy.
· More people can access the database at the same time.
Now we can access different RDBMS databases using python libraries.
import pyodbc

server_name = "SQL instance of your database"
username = "username of your database"
password = "password of your database"
database_name = "name of your database"
port = "connection port for your database"
query = "your SQL query"  # e.g. "SELECT * FROM some_table"

conn = pyodbc.connect('DRIVER={PostgreSQL ODBC Driver(UNICODE)};'
                      'SERVER=' + server_name +
                      ';UID=' + username +
                      ';PWD=' + password +
                      ';DATABASE=' + database_name +
                      ';PORT=' + port + ';')
cursor = conn.cursor()
cursor.execute(query)
query_data = cursor.fetchall()
We will be using this code for MySQL and Postgres database connections. The MySQL database connection does not need a port variable.
The contents of the Driver variable are different for different databases; the drivers used here for the MySQL and Postgres databases are SQL Server and PostgreSQL ODBC Driver(UNICODE). Also, check what types of database drivers are available on your computer with the pyodbc.drivers() function.
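A minimal check, assuming pyodbc is installed:
import pyodbc

# Prints the ODBC drivers installed on this machine; pick one of these
# names for the DRIVER={...} part of the connection string.
print(pyodbc.drivers())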
Use the below code for the Oracle database.
import cx_Oracle

# service_name comes from your database admins, like the other variables below.
dsn_tns = cx_Oracle.makedsn(server_name,
                            port,
                            service_name=service_name)
conn = cx_Oracle.connect(user=username,
                         password=password,
                         dsn=dsn_tns)
cursor = conn.cursor()
cursor.execute(query)
query_data = cursor.fetchall()
Reach out to your database admins to get the values of username, password, server_name, port, service_name, and database_name variables.
2. XML file format
XML is the file extension for an Extensible Markup Language (XML) file. It stores textual data that is both human-readable and machine-readable. XML is designed in such a way that its format does not change across the internet.
Why do we use the XML file format?
· XML is a plain-text file format which can be understood by both human and machine.
· XML has a simple and common syntax rule to exchange information between applications.
· We can use a programming language to manipulate the information inside the XML file.
· We can combine multiple XML documents to form one large XML file without adding extra information. You can also divide XML into various parts and use them separately.
· The XML file format is preferable in web applications.
Now, we can access the XML file using the xml library.
import pandas as pd
import xml.etree.ElementTree as etree

xml_tree = etree.parse("sample.xml")
xml_root = xml_tree.getroot()
columns = ["A", "B"]
dataframe = pd.DataFrame(columns=columns)

for node in xml_root:
    name = node.attrib.get("A")
    b_node = node.find("B")
    mail = b_node.text if b_node is not None else None
    dataframe = dataframe.append(pd.Series([name, mail], index=columns),
                                 ignore_index=True)
You can use the requests library to post the XML file to a SOAP API.
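As a sketch only: the endpoint URL and SOAPAction header below are placeholders, not a real service.
import requests

with open("sample.xml", "rb") as xml_file:
    payload = xml_file.read()

response = requests.post(
    "https://example.com/soap-endpoint",  # placeholder endpoint
    data=payload,
    headers={"Content-Type": "text/xml; charset=utf-8",
             "SOAPAction": "urn:example-action"},  # placeholder action
)
print(response.status_code)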
3. CSV file format
A Comma-Separated Values (CSV) file is a file format that stores plain text and tabular data. The first line of a CSV file generally contains the column names, and a comma separates each column. The second row onwards holds the contents of the columns, which could be text, numbers, or dates. A Tab-Separated Values file can also have the .csv file extension; it solves column-separation issues related to the CSV file format.
Why do we use the CSV file format?
· Easy to create and manipulate data.
· Easy to read and understand data.
· We can organize a large amount of data.
· We can easily import and export CSV files.
Now we can access CSV files using pandas and the CSV library.
With the help of the pandas library, you can directly import the CSV file into the dataframe.
# importing Pandas library
import pandas as pd
csv_dataframe = pd.read_csv("hr_data.csv", sep=",")
print(csv_dataframe)

             Name Hire Date   Salary  Sick Days remaining
0  Graham Chapman  03/15/14  50000.0                   10
1     John Cleese  06/01/15  65000.0                    8
2       Eric Idle  05/12/14  45000.0                   10
3     Terry Jones  11/01/13  70000.0                    3
4   Terry Gilliam  08/12/14  48000.0                    7
5   Michael Palin  05/23/13  66000.0                    8
If the CSV file has a '\t' separator, use sep="\t". In the case of spaces, use sep=" ". Visit here for more information about the read_csv function.
import csv

with open("hr_data.csv") as csv_file:
    csv_reader = csv.reader(csv_file, delimiter=',')
    line_count = 0
    for row in csv_reader:
        if line_count == 0:
            print(f'Column names are {", ".join(row)}')
            line_count += 1
        else:
            print(f'\t{row[0]} first column content, {row[1]} second '
                  f'column content, {row[2]} third column content.')
            line_count += 1
    print(f'Processed {line_count} lines.')
4. Apache Parquet file format
Apache Parquet is a column-oriented data storage file format. The data stored in parquet files is compressed efficiently. Shredding and assembly algorithms are used in parquet to store the data, and the format is built to handle complex data in bulk. Generally, parquet formats are useful in big data technologies.
Why do we use the Apache Parquet file format?
· Parquet files are created using an efficient compression algorithm that saves a lot more storage space than other file formats.
· Queries that fetch column data need not scan the whole row, which improves performance.
· Each column can have its own encoding technique.
· Parquet files are optimized for queries that process a large amount of data.
Now we can access parquet files using pandas and pyarrow libraries.
import pyarrow.parquet as pq

example_table = pq.read_pandas('example.parquet',
                               columns=['one', 'two']).to_pandas()
print(example_table)

  one two
a foo bar
b bar baz
c baz foo

import pandas as pd

pandas_dataframe = pd.read_parquet('example.parquet',
                                   engine='pyarrow')
print(pandas_dataframe)

  one two
a foo bar
b bar baz
c baz foo
5. Microsoft Excel
Excel is a spreadsheet developed by Microsoft. It stores data in tabular format. It has a grid of cells, which form rows and columns when combined. It has a lot of inbuilt features such as calculations, graphing tools, pivot tables, etc.
Why do we use the Microsoft Excel file format?
· You can analyze the data in excel using charts and graphs.
· Excel is good at sorting, filtering, and searching in data.
· You can build a mathematical formula and apply it to the data.
· Excel comes with a password-protected feature.
· You can use excel as a calendar.
· You can also use excel to automate data-related jobs.
Now, we can access Microsoft Excel using the openpyxl library.
from openpyxl import load_workbook

workbook = load_workbook(filename="sample.xlsx")
workbook_sheets = workbook.sheetnames
sheet = workbook.active

print(sheet["A1"].value)
"hello"
print(sheet.cell(row=10, column=6).value)
"this is hello world store in row 10 and column 6."

import pandas as pd

df = pd.read_excel('File.xlsx', sheet_name='Sheet1')
print("Column headings:")
print(df.columns)
['A', 'B', 'C']
Conclusion
This article helps you understand why we need different sources to store data and how to retrieve data from those sources. We have used multiple Python libraries to ingest data. In this article, I have covered 5 data sources.
Hopefully, this article will help you in data processing activities.
Other Articles by Author
|
https://medium.com/towards-artificial-intelligence/data-ingestion-from-5-major-data-sources-using-python-936144b30fa6
|
['Manmohan Singh']
|
2020-10-24 13:45:02.980000+00:00
|
['Python Programming', 'Parquet', 'Rdbms', 'Data Ingestion', 'Big Data']
|
5,520 |
Unleash the Power of Your Teams
|
Unleash the Power of Your Teams
Want to stop leaking value? Start gathering team data.
Do you know how your organization’s teams are getting work done? Do you know how much value they’re producing each day? What about the factors that are driving their success — or failure?
If you can’t answer these questions, and provide numbers to back up your answers, chances are you’re leaving productivity and profit on the table every single day.
In today’s rapidly evolving business environment, where teams are formed and reformed fluidly in order to get work done, most leaders have very little idea of what their company’s team structure looks like, how work is getting done, or how value is being generated (or depleted) — even as they remain responsible for creating that value.
If that’s you, you’re not alone. And it’s not your fault.
When researching what makes for team success, hard numbers and concrete advice are hard to find. This topic has been well researched, but the results are conceptual and difficult to implement, requiring significant cultural change without evidence of results. Most discouragingly, though research agrees on some common qualities of successful teams, what actually makes for consistent team success and failure is different for each company, and changes based on factors like industry, product and business model.
The damaging result of this is that when you want to know what makes the difference between your teams succeeding or failing (creating or draining value), you only have a few options: guess (really), rely on tribal knowledge, or slog through historical data to come up with stale numbers. This leads to decision lag and an inefficient and costly trial-and-error approach, leaving leaders vulnerable and resulting, ultimately, in missed value.
If this is sounding all too familiar, don’t panic. You can get a true grip on what’s happening in your teams, and where you can drive productivity and value across your organization.
Team Insights helps you visualize the way your teams get work done, behind a single pane of glass.
“One of the hardest things for technology leaders and managers today, is that there’s plenty of data out there, but it’s hard to get a clear idea of what’s driving team success,” explains Kevin Tuskey, SingleStone’s Director of Design. “Even if you’re not starting from scratch, the products and tools available just aren’t designed to show you what’s important.”
Suzanne Hawthorne, Account Director and Client Advisor at SingleStone agrees: “Even our most technologically advanced clients are struggling in this space. Measuring team success across an entire enterprise is tough, but it’s crucial to today’s leadership if they want to make the quick, data-driven decisions that will help their company get ahead. It’s the void in this critical area that drove us to create Team Insights.”
Team Insights, SingleStone’s team intelligence product, takes the guesswork out of what makes a company’s Agile teams successful. Instead of merely showing the HR structure of an organization, it shows how a company’s cross-functional teams are organized in a simple, easy-to-use system. Team Insights provides data-based decisions to drive how to get work done and create value. In short, it’s built to capture the data that helps a company and its leadership pinpoint, measure and unleash untapped productivity in their Agile organization.
“Before Team Insights, there was nothing like this available in the marketplace for our clients, and it was a huge gap,” says Hawthorne. “This product is designed to help our clients do three things: Gain a clear line of sight to the way work gets done in their company, capture straightforward data on the factors that contribute to team success, and uncover trends and insights that drive confident action and proactive decision-making.”
If you are struggling to gain line of sight to your Agile organization, or to make the data-driven decisions that will take your company to the next level, SingleStone can help. Our consultants and technology teams bring more than 20 years of experience in the areas of data, software, team success and Agile transformation. Our custom dashboards power decision making at many Fortune 500 companies in financial services and other industries. Team Insights is a direct result of our collective experience and track record.
Make 2019 your year to get on track and unleash the hidden potential of your teams with Team Insights. Reach out to learn more about our team intelligence product and schedule a free demo.
|
https://medium.com/singlestone/unleash-the-power-of-your-teams-8c71685b3a46
|
[]
|
2019-08-30 21:04:47.452000+00:00
|
['Dashboard', 'Data Visualization', 'Software Development', 'Agile', 'Teamwork']
|
5,521 |
Easy Python Speedup Wins With Numba
|
If you have functions that do a lot of mathematical operations, use NumPy or rely heavily on loops, then there is a way to speed them up significantly with one line of code. Ok, two lines if you count the import.
Numba and the @jit decorator
Meet Numba and its @jit decorator. It changes how your code is compiled, often improving its performance. You don’t have to install any special tools (just the numba pip package), and you don’t have to tweak any parameters. All you have to do is:
Add the @jit decorator to a function
Check if it’s faster
Let’s see an example of code before and after applying Numba’s optimization.
Before
The only purpose of this code is to do some calculations and to “be slow.” Let’s see how slow (benchmarks are done with Python 3.8 — I describe the whole setup in the Introduction article on my blog):
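The original snippet did not survive this copy, so here is a stand-in with the same character (pure-Python math in a tight loop), plus one way to time it. The function and sample size are illustrative, not the author’s exact benchmark.
import random
import timeit

def monte_carlo_pi(n_samples):
    # Estimate pi by sampling random points and counting hits inside
    # the unit quarter-circle; deliberately loop-heavy and slow.
    acc = 0
    for _ in range(n_samples):
        x, y = random.random(), random.random()
        if x ** 2 + y ** 2 <= 1.0:
            acc += 1
    return 4.0 * acc / n_samples

print(timeit.timeit("monte_carlo_pi(5_000_000)", globals=globals(), number=1))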
Now, we add @jit to our code. The body of the function stays the same, and the only difference is the decorator. Don’t forget to install the Numba package with pip (pip install numba).
… and after. Can you spot 2 lines that have changed?
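Again a reconstruction rather than the author’s lost screenshot; the two changed lines are marked:
from numba import jit  # changed line 1: the import
import random

@jit  # changed line 2: the decorator
def monte_carlo_pi(n_samples):
    # The body is identical to the version above.
    acc = 0
    for _ in range(n_samples):
        x, y = random.random(), random.random()
        if x ** 2 + y ** 2 <= 1.0:
            acc += 1
    return 4.0 * acc / n_samples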
Let’s measure the execution time once more:
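For example, reusing the stand-in function above and calling it once first, so the one-off compilation cost is not counted in the measurement:
import timeit

monte_carlo_pi(1_000)  # warm-up call triggers the JIT compilation
print(timeit.timeit("monte_carlo_pi(5_000_000)", globals=globals(), number=1))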
Using the @jit decorator gave us a 120x speedup (217 / 1.76 = 123.295)! That’s a huge improvement for such a simple change!
Other features of Numba
@jit is the most common decorator from the Numba library, but there are others that you can use:
@njit — alias for @jit(nopython=True). In nopython mode, Numba tries to run your code without using the Python interpreter at all. It can lead to even bigger speed improvements, but it's also possible that the compilation will fail in this mode.
@vectorize and @guvectorize — produce ufuncs and generalized ufuncs used in NumPy.
@jitclass — can be used to decorate the whole class.
@cfunc — declares a function to be used as a native callback (from C or C++ code).
There are also advanced features that let you, for example, run your code on GPU with @cuda.jit. This doesn’t work out of the box, but it might be worth the effort for some very computationally heavy operations.
Numba has plenty of configuration options that will further improve your code’s execution time if you know what you are doing. You can:
Disable GIL (Global Interpreter Lock) with nogil
Cache results with cache
Automatically parallelize functions with parallel (see the sketch below).
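A minimal sketch of how these flags combine, assuming a NumPy float array as input; parallel=True pairs with numba.prange to split the loop across threads:
import numpy as np
from numba import jit, prange

@jit(nopython=True, nogil=True, cache=True, parallel=True)
def parallel_sum(values):
    total = 0.0
    for i in prange(values.shape[0]):  # prange marks the loop as parallelizable
        total += values[i]
    return total

print(parallel_sum(np.arange(1_000_000, dtype=np.float64)))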
Check out the documentation to see what you can do. And to see more real-life examples (like computing the Black-Scholes model or the Lennard-Jones potential), visit the Numba Examples page.
Conclusions
Numba is a great library that can significantly speed up your programs with minimal effort. Given that it takes less than a minute to install and decorate some slow functions, it's one of the first solutions that you can check when you want to quickly improve your code (without rewriting it).
It works best if your code:
|
https://medium.com/python-in-plain-english/easy-speedup-wins-with-numba-b3dad3f0c207
|
['Sebastian Witowski']
|
2020-10-01 12:45:57.714000+00:00
|
['Coding', 'Software Development', 'Performance', 'Programming', 'Python']
|
5,522 |
A modern-day washing machine…
|
I Confess. I'm In Love With An Appliance | News Break
I resisted for a long time and now I wonder why. Once I made the decision, I couldn't believe it took me so long to…
|
https://medium.com/technology-hits/the-wonders-of-technology-fef10aa6df39
|
['Tree Langdon']
|
2020-12-22 03:34:23.520000+00:00
|
['Relationships', 'Philosophy', 'Technology', 'Self Improvement', 'Science']
|
5,523 |
Zoe
|
Evening fell hard that night, its blackness descending almost in an instant while I struggled to balance my overdrawn checking account.
I didn’t write that sentence in one shot. I backed up and embellished it several times before it was done. First sentences are important. I hope it grabs you.
The night-chatter of frogs and hum of insects rose in the dark. I didn’t feel their presence until, about an hour after sunset with an orange gibbous moon rising in the east, I pushed back from my computer, ran my hand through my hair, and gave up. I was grateful for their unseen companionship. I had moved out here to the middle of nowhere, Montana to get away from people, but sometimes when the night closed in I could wish for someone to talk to.
I’m ready to set aside the word list. I’ve only used two of its entries, but I don’t need it anymore. I have a character now, not well-developed but reasonably intriguing, and I think I can run with him. I still need the theme, though, to make something happen to our protagonist. A couple of influences come into play here. One is my interest in astronomy. I like to incorporate astronomical notes in my works. The description of the moon is accurate. A moon rising in the east shortly after sunset will be just past full — a waning gibbous — and can appear yellow or even orange depending on the state of the atmosphere. A second influence is a work I’m currently reading in which, near the beginning, a group of friends in a remote cabin are menaced by some strange intruders.
When I first came here, it seemed an idyllic existence. I had no family left, no real friends, not even any colleagues I particularly liked. A failed law student, for twelve years I’d shuffled papers and appointment calendars for a modest law firm in Denver. It paid well enough. If only I hadn’t ended each day feeling like I’d been run over several times by a sixteen-wheeler. Finally I realized the insanity of contenting myself with discontent. I chucked it all — job, condo, everything — and came here to make a living off the internet.
No, I didn’t go off half-cocked. I planned it. I knew what I’d sell, how I’d reap the rewards of placing ads on my websites, even wrote up a business plan. I just hadn’t realized how hard it was to translate plans into reality. And now? Insolvency loomed as large as the rising moon. What I needed, I thought with some desperation, was for some guardian angel to show up at the door.
Yes, I actually thought that. And immediately, the hammering on the door began, followed by a woman’s voice, filled with desperation, calling, “Is anybody home?”
All of this pretty much flowed out. In revision I might change or move some of it, as it strikes me it might be a bit slow. We’ll see what happens. The newcomer was, of course, planned based on the theme, “Stranger at the Door.” I hadn’t originally figured the stranger to be a woman, though. I made her female only when she showed up. One might expect a menacing figure to appear out of the night, but I like to contradict my own expectations from time to time, just to see what happens.
The coincidence was too much. After my initial startle reaction, I stared at the door during a lull in the pounding and only rose when it resumed. I flipped on the porch light and, without removing the chain, opened the door a crack. I’m not sure what I expected to see. A tall, ethereal beauty affixed with white billowing wings, maybe. In fact she was rather small, only five foot five including the thick brown hair she wore in a pony tail. Her dark eyes blinked at me in astonishment or fear. Dressed in a short green skirt and white blouse, she clutched a little white purse before her with both hands.
“I’m sorry,” she said. “I’m really sorry.”
Feeling like an idiot scared of his own reflection, I undid the chain and eased the door open. “For what?”
“Do you have a land line? I can’t get any reception out here.”
I peered into the night but couldn’t see a vehicle. She must have had car trouble out on the road, I supposed. My cabin was set a quarter mile up a gravel drive. Nobody would walk up here in the dark by choice.
I had to stop there to take my wife for cataract surgery. Seriously, I did. Life doesn’t stop just because you’re writing a story. Later that evening, I continued…
“Yeah,” I told her. “Come on in.” I held the door for her and closed it behind her. While she looked around my spartan one-room, I redid the chain. “Car trouble?”
“Ran out of gas. Stupid of me, but I really thought there would be a gas station somewhere.”
I gave the room a once-over, too, fearing it might be too messy for visitors. The kitchen in the back corner wasn’t exactly choked with dirty dishes, but I hadn’t cleaned it up today. My bed in the opposite corner — a queen because I liked having the space to flop around — was unmade, but didn’t look too disreputable. As it was a warm summer night, I hadn’t lit a fire in the tiny stone fireplace, but uncleaned soot and ash had accumulated there, its smell suffusing the air.
“You live here alone?” she asked.
It should have been obvious. “Yeah. Here’s the phone.” I led her back to the kitchen and snatched the device from the table.
She took it and studied it as though she’d never seen one before. “I don’t even know who to call.”
“Family?” I suggested. “A friend?”
She shrugged. “Don’t have any.”
“Me, either.” As soon as I said it, I wished I hadn’t. I had no interest in forming bonds, however tenuous. “I don’t suppose you’re a triple-A member?”
There was that shrug again. “I’m pretty hopeless, I guess. I don’t suppose you’d have any gas, like for a lawn mower?”
“Afraid not.” I didn’t bother telling her lawn mowers were of little use in a forest.
She pulled out a chair and sat, sighing heavily. “I guess I’m just stuck.”
Time to quit for the night. Can you tell where this is going yet? Neither can I!
“Don’t worry. I can find emergency service for you online.” My laptop was at the table, too, along with the uncleared dishes from dinner. And lunch. And breakfast. I pushed as much of it aside as I could and sat around the corner from her. “Sorry about the mess.”
Again that little shrug. “I don’t suppose it seems so important when it’s just you.”
“Depends on the day.” I entered the search terms into the computer and got a list of one, a place over thirty miles away. “Here we go. It’ll take them a little while to get here.” I turned the computer so she could see the number.
“Thanks.” She punched the number into my phone and waited. “Hi, I ran out of gas. Could you send someone out?” She listened, then gave our location, then listened some more before signing off with a resigned, “Okay, thanks.” She handed me the phone back. “Three hours.” She glanced at the door, then at the darkness beyond the kitchen window.
I didn’t want company. I came out here to get away from people. Why, I wondered, did she have to run out of gas in front of my place? But I couldn’t send her out into the dark on her own. “You can stay here,” I offered. “I’ll walk you to your car when the time comes.”
She smiled gratefully while objecting, “You don’t have to do that.”
“It’s okay.” Which it wasn’t, not exactly, but strangely I found her presence less of a burden than I would have expected. If nothing else, I’d have an excuse to ignore my financial mess for a few hours. “You want something? Coffee? Tea?”
“Coffee would be nice.” Again the smile, which I found myself returning. “But really, I don’t want to be a bother.”
“It’s no bother.” I got up, flipped on the coffee maker, and took a pair of mugs from a cupboard.
“But you don’t like people, do you? I’m an unwanted intrusion.”
I felt myself flush. Was it that obvious? “I’m a bit of a loner, I guess.”
“Where are your parents?”
“Dead. Their house burned down. Faulty wiring, the fire inspector said.” As the coffee dripped through the machine, I turned to face her. Why was I telling this to a complete stranger? “What’s your name?”
She gave me a coy smile. “What’s yours?”
“I asked first.”
“So you did. No brothers or sisters, I guess?”
I shook my head and turned away. I should have been irritated, but I couldn’t manage it. All I could feel was a deep hollow in the pit of my stomach, an emptiness that had probably been there for more years than I cared to admit.
“No woman in your life?”
“Thankfully not.”
She laughed. “Can’t live with ’em, can’t live without ’em, right?”
The coffee finished brewing. Forcing a few breaths to steady myself, I poured and brought the steaming mugs to the table. “They can’t seem to live with me. Best you don’t even think about it.”
“Hmm.” She took the mug in both hands and seemed to melt in its warmth. “I’m not thinking anything. You’re the one who called me.”
I watched her drink the whole mug in one long swallow, steam curling about her face, her eyes never leaving mine. “What’s your name?” I asked again.
“Take your pick. I have so many. ”
The encounter had to turn strange at some point, otherwise it would have no interest. Now that it has, I need to take a small break. When I come back, I expect the end will materialize.
She held out her mug as if to ask for more coffee. I hadn’t touched mine yet, so I pushed it to her. “Your real one will do.”
With a nod of thanks, she wrapped her hands around the mug and lifted it to her lips. “Zoe.”
I laughed. “Like the second Doctor Who’s companion? You don’t look a thing like her.”
“She was named after me.” Zoe drank down her second mug in one long gulp, then delicately wiped her mouth with the back of her hand. “Who do I look like?”
I didn’t know, but something about the shape of her face and the turn of her mouth reminded me a little of my mother. A fragment of anguish rose in my throat and tried to choke me. I forced it back down. “I have a strange feeling you know.”
She held out the mug. “I don’t want to impose, but . . .”
I got up and gave her a refill, which she downed as quickly as the first two. Then, setting the mug gently on the table, she rose. “I should go. If I stay, I’ll run you out of coffee.”
“Go where? You’re out of gas.”
“Nah, I just said that to get you to open the door.”
I watched her cross the cabin and undo the chain on the front door, then rushed after her. “Wait! I don’t know who you are. I don’t even know what you are!” Reaching her, I put my hand on hers to keep her from turning the knob.
“Oh, so now you want me to stay.”
Her words jarred me as though she’d slapped my face. “Well . . .”
She smiled and enfolded my hand in hers. “For now, that’s enough.”
A moment later, I was standing alone, not remembering when or how she had slipped through the door, or even if she had. She might have dissolved into mist and floated up the chimney for all I knew.
A lot of this was written while feeling my way toward the end. If you think I planned it all, you’re wrong. And I’m still looking for the finish. I know about what it is now, but not the form.
I thought about her a lot in the coming days. I even looked for her online, but with only a first name it was a fool’s quest. Thinking she might live in a town nearby — relatively speaking — I forced myself to go out looking, but to no avail. I talked with servers in mom and pop restaurants, gas station attendants, even a couple of librarians. Nobody had ever heard of Zoe or anyone quite fitting her description. Not with her thirst for coffee, anyway.
Strangely, the further I ventured, the less I craved my solitude. Not that I wanted to abandon it, but I began to discover — or rediscover — that connections had some value after all. I began to wonder if maybe I didn’t need the world at least a little, and if just possibly the world needed me in return.
In short, thanks to Zoe, I’m rediscovering life. It would be hard not to. She isn’t just any woman. I’m convinced of that. She’s . . .
Oh, look it up for yourself.
The end? Yes and no. The end of the first draft. Along the way I’ve done a bit of rewriting, but not so much as I usually do. Normally, I’d rework the story several times before showing it around. What you’re seeing is therefore rough around the edges. I’ll post the final version later, so you can see what changes.
Addendum: The final draft of “Zoe” is now available on Lit Up. You might like to compare it to this first draft.
|
https://lehket.medium.com/zoe-51989ed3fe63
|
['Dale E. Lehman']
|
2018-08-08 14:28:03.836000+00:00
|
['Writing Prompts', 'Writing', 'Short Fiction', 'Fiction', 'Short Story']
|
5,524 |
Why you should bring prototyping into your design process?
|
Why you should bring prototyping into your design process?
Learn the value of the prototype, make a compelling design, and boost your product development process.
Photo by UX Store on Unsplash
I’ve been working in UI/UX design for nearly 5 years. Every time I present my design solutions to stakeholders, I get different levels of feedback depending on how I demonstrate my work.
And I found an interesting fact: if I only show my static design mockups with some basic user flows, people can roughly understand how I want to approach the problem, but it’s not easy for them to imagine how the design would work in reality. So I have to be very articulate about every interaction detail of my design in order to better visualize the whole design concept for my audience.
However, when I’m designing for a large-scope project or a complex feature, it’s sometimes difficult to clearly describe my idea by just showing static design mockups. So I’ve been looking for a better way to showcase my work without explaining too much. That’s when I finally landed on prototyping.
I started translating my ideas into animated or playable prototypes for different levels of tasks. It helps me largely reduce the communication cost and gives people a simpler way to absorb the ideas I present.
In the past few years, I’ve tried a lot of prototyping tools in different use cases. I learned a lot from building those interactive experiences; it makes my design more vibrant and compelling, and it’s also an important skill for me as a UX designer.
|
https://uxdesign.cc/why-you-should-bring-prototyping-into-your-design-process-fb25b679accb
|
['Lin Simon']
|
2020-04-20 11:36:15.752000+00:00
|
['Marketing', 'Prototyping', 'UI', 'User Experience', 'UX Design']
|
Title bring prototyping design processContent bring prototyping design process Learn value prototype make compelling design boosting product development process’ Photo UX Store Unsplash I’ve UIUX design nearly 5 year far every time present design solution stakeholder got different level feedback depending demonstrate work found interesting fact show static design mockup basic user flow people roughly understand want approach problem it’s easy imagine design would work reality articulated every interaction detail design order better visualize whole design concept audience However sometimes I’m designing large scope project complex feature it’s bit difficult clearly describe idea showing static design mockups I’ve looking better way help showcase work without explaining much finally landed prototyping started translating idea animated playable prototype different level task help largely reduce communication cost give people simpler way absorb idea provided past year I’ve tried lot prototyping tool different use case learned lot building interactive experience make design vibrant compelling it’s also important skill UX designerTags Marketing Prototyping UI User Experience UX Design
|
5,525 |
How to rewrite your SQL queries in Pandas, and more
|
Fifteen years ago, there were only a few skills a software developer would need to know well, and he or she would have a decent shot at 95% of the listed job positions. Those skills were:
Object-oriented programming.
Scripting languages.
JavaScript, and…
SQL.
SQL was a go-to tool when you needed to get a quick-and-dirty look at some data, and draw preliminary conclusions that might, eventually, lead to a report or an application being written. This is called exploratory analysis.
These days, data comes in many shapes and forms, and it’s not synonymous with “relational database” anymore. You may end up with CSV files, plain text, Parquet, HDF5, and who knows what else. This is where the Pandas library shines.
What is Pandas?
Python Data Analysis Library, called Pandas, is a Python library built for data analysis and manipulation. It’s open-source and supported by Anaconda. It is particularly well suited for structured (tabular) data. For more information, see http://pandas.pydata.org/pandas-docs/stable/index.html.
What can I do with it?
All the queries that you were putting to the data before in SQL, and so many more things!
Great! Where do I start?
This is the part that can be intimidating for someone used to expressing data questions in SQL terms.
SQL is a declarative programming language: https://en.wikipedia.org/wiki/List_of_programming_languages_by_type#Declarative_languages.
With SQL, you declare what you want in a sentence that almost reads like English.
Pandas’ syntax is quite different from SQL. In Pandas, you apply operations on the dataset, and chain them, in order to transform and reshape the data the way you want it.
We’re going to need a phrasebook!
The anatomy of a SQL query
A SQL query consists of a few important keywords. Between those keywords, you add the specifics of what data, exactly, you want to see. Here is a skeleton query without the specifics:
SELECT… FROM… WHERE…
GROUP BY… HAVING…
ORDER BY…
LIMIT… OFFSET…
There are other terms, but these are the most important ones. So how do we translate these terms into Pandas?
First we need to load some data into Pandas, since it’s not already in a database. Here is how:
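A minimal sketch of the load step (the column names used throughout these sketches, such as ident, type, iso_country and length_ft, are assumptions based on the OurAirports CSV files, not the author’s exact code):

import pandas as pd

airports = pd.read_csv('airports.csv')  # airport records
runways = pd.read_csv('runways.csv')    # runway records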
I got this data at http://ourairports.com/data/.
SELECT, WHERE, DISTINCT, LIMIT
Here are some SELECT statements. We truncate results with LIMIT, and filter them with WHERE. We use DISTINCT to remove duplicated results.
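Sketches of what these might look like:

# SELECT * FROM airports LIMIT 3
airports.head(3)

# SELECT * FROM airports WHERE ident = 'KLAX'
airports[airports.ident == 'KLAX']

# SELECT DISTINCT type FROM airports
airports.type.unique()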
SELECT with multiple conditions
We join multiple conditions with an &. If we only want a subset of columns from the table, that subset is applied in another pair of square brackets.
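A sketch (iso_region and municipality are assumed column names):

# SELECT ident, name, municipality FROM airports
# WHERE iso_region = 'US-CA' AND type = 'seaplane_base'
airports[(airports.iso_region == 'US-CA') & (airports.type == 'seaplane_base')][['ident', 'name', 'municipality']]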
ORDER BY
By default, Pandas will sort things in ascending order. To reverse that, provide ascending=False.
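For example, something like:

# SELECT ident, name FROM airports WHERE iso_region = 'US-CA' ORDER BY ident DESC
airports[airports.iso_region == 'US-CA'][['ident', 'name']].sort_values('ident', ascending=False)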
IN… NOT IN
We know how to filter on a value, but what about a list of values — IN condition? In pandas, .isin() operator works the same way. To negate any condition, use ~.
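A sketch of both:

# SELECT * FROM airports WHERE type IN ('heliport', 'balloonport')
airports[airports.type.isin(['heliport', 'balloonport'])]

# SELECT * FROM airports WHERE type NOT IN ('heliport', 'balloonport')
airports[~airports.type.isin(['heliport', 'balloonport'])]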
GROUP BY, COUNT, ORDER BY
Grouping is straightforward: use the .groupby() operator. There’s a subtle difference between the semantics of COUNT in SQL and Pandas. In Pandas, .count() will return the number of non-null/NaN values. To get the same result as the SQL COUNT, use .size().
Below, we group on more than one field. Pandas will sort things on the same list of fields by default, so there’s no need for a .sort_values() in the first example. If we want to use different fields for sorting, or DESC instead of ASC, like in the second example, we have to be explicit:
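Sketches of the two cases:

# GROUP BY iso_country, type ORDER BY iso_country, type
airports.groupby(['iso_country', 'type']).size()

# GROUP BY iso_country, type ORDER BY type, COUNT(*) DESC
airports.groupby(['iso_country', 'type']).size().to_frame('size').reset_index() \
    .sort_values(['type', 'size'], ascending=[True, False])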
What is this trickery with .to_frame() and .reset_index()? Because we want to sort by our calculated field (size), this field needs to become part of the DataFrame. After grouping in Pandas, we get back a different type, a GroupBy object. So we need to convert it back to a DataFrame. With .reset_index(), we restart row numbering for our data frame.
HAVING
In SQL, you can additionally filter grouped data using a HAVING condition. In Pandas, you can use .filter() and provide a Python function (or a lambda) that will return True if the group should be included into the result.
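For example, keeping only airport types with more than 1,000 US airports might look like:

# SELECT type, COUNT(*) FROM airports WHERE iso_country = 'US'
# GROUP BY type HAVING COUNT(*) > 1000
airports[airports.iso_country == 'US'].groupby('type') \
    .filter(lambda g: len(g) > 1000).groupby('type').size()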
Top N records
Let’s say we did some preliminary querying, and now have a dataframe called by_country, that contains the number of airports per country:
In the next example, we order things by airport_count and only select the top 10 countries with the largest count. The second example is the more complicated case, in which we want “the next 10 after the top 10”:
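Sketches of both, assuming by_country has iso_country and airport_count columns:

# ORDER BY airport_count DESC LIMIT 10
by_country.nlargest(10, columns='airport_count')

# ORDER BY airport_count DESC LIMIT 10 OFFSET 10
by_country.nlargest(20, columns='airport_count').tail(10)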
Aggregate functions (MIN, MAX, MEAN)
Now, given this dataframe of runway data:
Calculate min, max, mean, and median length of a runway:
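A sketch, assuming a length_ft column:

runways.agg({'length_ft': ['min', 'max', 'mean', 'median']})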
A reader pointed out that SQL does not have a median function. Let’s pretend you wrote a user-defined function to calculate this statistic (since the important part here is the syntactic differences between SQL and Pandas).
You will notice that with this SQL query, every statistic is a column. But with this Pandas aggregation, every statistic is a row:
Nothing to worry about — simply transpose the dataframe with .T to get columns:
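Like so:

runways.agg({'length_ft': ['min', 'max', 'mean', 'median']}).T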
JOIN
Use .merge() to join Pandas dataframes. You need to provide which columns to join on (left_on and right_on), and join type: inner (default), left (corresponds to LEFT OUTER in SQL), right (RIGHT OUTER), or outer (FULL OUTER).
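A sketch of an inner join between the two dataframes (airport_ident is the assumed key column on the runways side):

airports.merge(runways, left_on='ident', right_on='airport_ident', how='inner')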
UNION ALL and UNION
Use pd.concat() to UNION ALL two dataframes:
To deduplicate things (equivalent of UNION), you’d also have to add .drop_duplicates().
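A sketch with two tiny hypothetical dataframes:

df1 = pd.DataFrame({'city': ['Oakland', 'Reno'], 'state': ['CA', 'NV']})
df2 = pd.DataFrame({'city': ['Reno'], 'state': ['NV']})

pd.concat([df1, df2])                    # UNION ALL: keeps the duplicate Reno row
pd.concat([df1, df2]).drop_duplicates()  # UNION: duplicates removed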
INSERT
So far, we’ve been selecting things, but you may need to modify things as well, in the process of your exploratory analysis. What if you wanted to add some missing records?
There’s no such thing as an INSERT in Pandas. Instead, you would create a new dataframe containing new records, and then concat the two:
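A sketch, with a purely hypothetical record:

new_rows = pd.DataFrame({'ident': ['ZZZZ'], 'name': ['Made-Up Municipal Field']})
airports = pd.concat([airports, new_rows]).reset_index(drop=True)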
UPDATE
Now we need to fix some bad data in the original dataframe:
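A sketch (the column name and the new value here are invented for illustration):

# UPDATE airports SET home_link = '...' WHERE ident = 'KLAX'
airports.loc[airports.ident == 'KLAX', 'home_link'] = 'https://example.com/lax'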
DELETE
The easiest (and the most readable) way to “delete” things from a Pandas dataframe is to subset the dataframe to rows you want to keep. Alternatively, you can get the indices of rows to delete, and .drop() rows using those indices:
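Both variants, sketched:

# DELETE FROM runways WHERE length_ft > 10000 — keep only the rows we want
runways = runways[runways.length_ft <= 10000]

# or, equivalently, drop by index
runways.drop(runways[runways.length_ft > 10000].index, inplace=True)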
Immutability
I need to mention one important thing — immutability. By default, most operators applied to a Pandas dataframe return a new object. Some operators accept a parameter inplace=True, so you can work with the original dataframe instead. For example, here is how you would reset an index in-place:
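A minimal sketch:

airports.reset_index(drop=True, inplace=True)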
However, the .loc operator in the UPDATE example above simply locates the indices of the records to update, and the values are changed in-place. Also, if you updated all values in a column, or added a new calculated column, these things would happen in-place, as in the sketch below:
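(Both column names here are invented for illustration.)

airports['scanned'] = False                        # overwrite every value in a column
runways['length_m'] = runways.length_ft * 0.3048   # add a calculated column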
And more!
The nice thing about Pandas is that it’s more than just a query engine. You can do other things with your data, such as:
Export to a multitude of formats:
Plot it:
to see some really nice charts!
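Minimal sketches of both, reusing the by_country dataframe from earlier (plotting assumes matplotlib is installed):

by_country.to_csv('by_country.csv')   # also .to_json(), .to_excel(), .to_html(), ...
by_country.plot(kind='bar', x='iso_country', y='airport_count', figsize=(10, 5))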
Share it.
The best medium to share Pandas query results, plots and things like this is Jupyter notebooks (http://jupyter.org/). In fact, some people (like Jake Vanderplas, who is amazing) publish whole books in Jupyter notebooks: https://github.com/jakevdp/PythonDataScienceHandbook.
It’s that easy to create a new notebook:
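Assuming you already have Python installed, it’s just:

pip install jupyter
jupyter notebook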
After that:
- navigate to localhost:8888
- click “New” and give your notebook a name
- query and display the data
- create a GitHub repository and add your notebook (the file with .ipynb extension).
GitHub has a great built-in viewer to display Jupyter notebooks with Markdown formatting.
And now, your Pandas journey begins!
I hope you are now convinced that Pandas library can serve you as well as your old friend SQL for the purposes of exploratory data analysis — and in some cases, even better. It’s time to get your hands on some data to query!
|
https://medium.com/jbennetcodes/how-to-rewrite-your-sql-queries-in-pandas-and-more-149d341fc53e
|
['Irina Truong']
|
2019-10-30 16:08:53.862000+00:00
|
['Sql', 'Coding', 'Software Development', 'Python', 'Data Science']
|
Title rewrite SQL query Pandas moreContent Fifteen year ago skill software developer would need know well would decent shot 95 listed job position skill Objectoriented programming Scripting language JavaScript and… SQL SQL goto tool needed get quickanddirty look data draw preliminary conclusion might eventually lead report application written called exploratory analysis day data come many shape form it’s synonymous “relational database” anymore may end CSV file plain text Parquet HDF5 know else Pandas library shine Pandas Python Data Analysis Library called Pandas Python library built data analysis manipulation It’s opensource supported Anaconda particularly well suited structured tabular data information see httppandaspydataorgpandasdocsstableindexhtml query putting data SQL many thing Great start part intimidating someone used expressing data question SQL term SQL declarative programming language httpsenwikipediaorgwikiListofprogramminglanguagesbytypeDeclarativelanguages SQL declare want sentence almost read like English Pandas’ syntax quite different SQL Pandas apply operation dataset chain order transform reshape data way want We’re going need phrasebook anatomy SQL query SQL query consists important keywords keywords add specific data exactly want see skeleton query without specific SELECT… FROM… WHERE… GROUP BY… HAVING… ORDER BY… LIMIT… OFFSET… term important one translate term Pandas First need load data Pandas since it’s already database got data httpourairportscomdata SELECT DISTINCT LIMIT SELECT statement truncate result LIMIT filter use DISTINCT remove duplicated result SELECT multiple condition join multiple condition want subset column table subset applied another pair square bracket ORDER default Pandas sort thing ascending order reverse provide ascendingFalse IN… know filter value list value — condition panda isin operator work way negate condition use GROUP COUNT ORDER Grouping straightforward use groupby operator There’s subtle difference semantics COUNT SQL Pandas Pandas count return number nonnullNaN value get result SQL COUNT use size group one field Pandas sort thing list field default there’s need sortvalues first example want use different field sorting DESC instead ASC like second example explicit trickery toframe resetindex want sort calculated field size field need become part DataFrame grouping Pandas get back different type called GroupByObject need convert back DataFrame resetindex restart row numbering data frame SQL additionally filter grouped data using condition Pandas use filter provide Python function lambda return True group included result Top N record Let’s say preliminary querying dataframe called bycountry contains number airport per country next example order thing airportcount select top 10 country largest count Second example complicated case want “the next 10 top 10” Aggregate function MIN MAX MEAN given dataframe runway data Calculate min max mean median length runway reader pointed SQL median function Let’s pretend wrote userdefined function calculate statistic since important part syntactic difference SQL Pandas notice SQL query every statistic column Pandas aggregation every statistic row Nothing worry —simply transpose dataframe get column JOIN Use merge join Pandas dataframes need provide column join lefton righton join type inner default left corresponds LEFT OUTER SQL right RIGHT OUTER outer FULL OUTER UNION UNION Use pdconcat UNION two dataframes deduplicate thing equivalent UNION you’d also add dropduplicates INSERT far we’ve selecting 
thing may need modify thing well process exploratory analysis wanted add missing record There’s thing INSERT Pandas Instead would create new dataframe containing new record concat two UPDATE need fix bad data original dataframe DELETE easiest readable way “delete” thing Pandas dataframe subset dataframe row want keep Alternatively get index row delete drop row using index Immutability need mention one important thing — immutability default operator applied Pandas dataframe return new object operator accept parameter inplaceTrue work original dataframe instead example would reset index inplace However loc operator UPDATE example simply locates index record update value changed inplace Also updated value column added new calculated column thing would happen inplace nice thing Pandas it’s query engine thing data Export multitude format Plot see really nice chart Share best medium share Pandas query result plot thing like Jupyter notebook httpjupyterorg fact people like Jake Vanderplas amazing publish whole book Jupyter notebook httpsgithubcomjakevdpPythonDataScienceHandbook It’s easy create new notebook navigate localhost8888 click “New” give notebook name query display data create GitHub repository add notebook file ipynb extension GitHub great builtin viewer display Jupyter notebook Markdown formatting Pandas journey begin hope convinced Pandas library serve well old friend SQL purpose exploratory data analysis — case even better It’s time get hand data queryTags Sql Coding Software Development Python Data Science
|
5,526 |
How to Reverse Engineer an Android App
|
Video Tutorial
Here are the steps we will follow:
Download an APK file from Google Play Store (it could be any APK file)
Using some free tools, we will reverse engineer the APK file to see the code.
If the APK file is protected by any premium software, then we cannot actually reverse the code, and if the code is obfuscated it will also be difficult to read after reverse engineering.
APK stands for the Android application package, a file format used by the Android operating system for distribution and installation.
Disclaimer: This tutorial is for educational purposes. During this demonstration, we selected an application APK file from the Google Play Store. If this chosen APK file is not working, try a different one or use your own. There is no intention to harm or tamper with the APK file we chose from the Google Play Store.
Step 1:
We will use the dex2jar software.
Let’s download dex2jar software from the following link: https://sourceforge.net/projects/dex2jar/.
If you want to see the code, there is a GitHub link: https://github.com/pxb1988/dex2jar
This is a zip file; I put the unzipped directory in my desktop->demo3 directory.
Step 2:
We need to download the JD-GUI software. Visit the following link: http://java-decompiler.github.io/ and, in the download section, download the software for your operating system.
I stored the software in my desktop->demo3 directory.
dex2jar-2.0 and jd-gui directories within demo3 directory
Step 3:
We need to target an Android app. In this case I am targeting the EnglishScore: Free British Council English Test app from the following Google Play Store link: https://play.google.com/store/apps/details?id=com.englishscore
Step 4:
We need to download the APK file.
Visit the following site https://apkpure.com/region-free-apk-download
On the website, paste the Google Play Store app link into the top text field and click download. It will take some time, and then it will show you another download link. Use that link to download the APK file.
Download link generated
After downloading, store the app in the demo3->dex2jar-2.0 directory.
The APK file is placed within dex2jar-2.0 directory
Step 5:
Open your terminal if you use a Mac, or open Windows PowerShell if you use Windows.
In the terminal, go to the target directory. In my case it will be: cd /Users/mahmud/Desktop/demo3/dex2jar-2.0
Now fix the permissions of all files by pasting the following command:
chmod 0777 *
Now type ./d2j-dex2jar.sh followed by a space, then type Eng and press tab to autocomplete the full file name: EnglishScore\ Free\ British\ Council\ English\ Test_v1.00.32_apkpure.com.apk. Now press enter. It will take some time, and you will see that a new file named
EnglishScore Free British Council English Test_v1.00.32_apkpure.com-dex2jar.jar is created.
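Putting the terminal steps above together (paths are from this walkthrough; adjust them to your own machine):

cd /Users/mahmud/Desktop/demo3/dex2jar-2.0
chmod 0777 *
./d2j-dex2jar.sh EnglishScore\ Free\ British\ Council\ English\ Test_v1.00.32_apkpure.com.apk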
File Structures and Commands in Terminal
Step 6:
Now go to your jd-gui->build->libs directory. And in my case, if I double click jd-gui-1.6.1.jar you will see the following interface of the app.
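If double-clicking doesn’t launch it, you can typically start a runnable jar from the terminal instead (file name as in this walkthrough):

java -jar jd-gui-1.6.1.jar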
Now drag and drop the EnglishScore Free British Council English Test_v1.00.32_apkpure.com-dex2jar.jar file into the app and you will see the following things.
Reversed Engineering Code
If you click the com section, you will see which 3rd-party libraries this app uses.
Also in this app, if you click englishscore section, you will see the source code of the app.
Now click any file, for example, the BritishCouncil.class file, and you will see the actual code of the app. If this app had been protected by a premium tool, or its code had been obfuscated before release, we couldn’t easily understand the code after reverse engineering.
But unfortunately, many Android app developers don’t know about reverse engineering, and spying eyes can easily reverse engineer their apps.
So it is important for all Android developers to know about this.
Conclusion
If you click through the different classes of this application, you can see all the code. Now think: if you are an Android developer building a financial or banking app, and you didn’t use any obfuscation techniques or protection, how easy would it be for hackers to hack your application or breach its security?
So I hope you understand how vulnerable our apps can be. If you want to know how to protect an Android app from reverse engineering, check the following article.
|
https://medium.com/level-up-programming/how-to-reverse-engineering-an-android-app-be5835f6fa1e
|
['Mahmud Ahsan']
|
2020-10-28 08:20:02.427000+00:00
|
['Software Engineering', 'Android', 'Hacking', 'Security', 'Android App Development']
|
Title Reverse Engineering Android AppContent Video Tutorial step follow Download APK file Google Play Store could APK file Using free tool reverse engineer APK file see code APK file protected using premium software reverse code actually code obfuscated also difficult read code reverse engineering APK stand Android application package file format used Android operating system distribution installation Disclaimer tutorial educational purpose demonstration selected application APK file Google Play Store chosen APK file working try different one may use intention harm tamper APK file chosen Google Play Store Step 1 use dex2jar software Let’s download dex2jar software following link httpssourceforgenetprojectsdex2jar software following link httpssourceforgenetprojectsdex2jar want see code GitHub link httpsgithubcompxb1988dex2jar zip file desktopdemo3 directory put unzipped directory zip file Step 2 need download JDGUI software visit following link httpjavadecompilergithubio download section based operating system download software store software desktopdemo3 directory dex2jar20 jdgui directory within demo3 directory Step 3 need target android app case targeting EnglishScore Free British Council English Test app following google play store link httpsplaygooglecomstoreappsdetailsidcomenglishscore Step 4 need download APK file Visit following site httpsapkpurecomregionfreeapkdownload website top text field paste google play store app link click download click take time show another download link Use link download APK file Download link generated downloading store app demo3dex2jar20 directory APK file placed within dex2jar20 directory Step 5 Open terminal use mac open Windows PowerShell use window terminal go target directory case cd UsersmahmudDesktopdemo3dex2jar20 fix permission file pasting following command chmod 0777 type d2jdex2jarsh type Eng click tab get full file name EnglishScore Free British Council English Testv10032apkpurecomapk get full file name click enter take time see new file named EnglishScore Free British Council English Testv10032apkpurecomdex2jarjar created File Structures Commands Terminal Step 6 go jdguibuildlibs directory case double click jdgui161jar see following interface app directory case double click see following interface app drag paste EnglishScore Free British Council English Testv10032apkpurecomdex2jarjar file within app see following thing Reversed Engineering Code click com section see 3rd party library app used Also app click englishscore section see source code app click file example BritishCouncilclass file see actual code app app protected premium tool even obfuscated code releasing app couldn’t easily understand code reverse engineering unfortunately many android app developer didn’t know reverse engineering spying eye easily reverse engineer app important know android developer Conclusion click distinct class application see code think android developer developing financial banking app didn’t use obfuscation technique especially financial type app didn’t use protection easy hacker hack application breach security right hope understand vulnerable app want know protect Android app reverse engineering check following articleTags Software Engineering Android Hacking Security Android App Development
|
5,527 |
New call for applications opens for European Journalism COVID-19 Support Fund
|
Emergency Fund
For news organisations
Amount: €5,000, €10,000 or €25,000
Eligible: news organisations (that employ the equivalent of at least one full-time journalist)
Focus: providing specific financial support to address immediate and critical business needs. These grants may be used, for example, to replace lost sales revenue including from printed and digital products/services, to fund alternative print distribution methods, to cover key organisational costs, to hire freelancers to replace staff during illness, to maintain essential coverage and services unrelated to COVID-19 and to fund IT, software services and infrastructure support.
“Without the help of this fund, we would have had to close Star & Crescent, at least for the foreseeable future. This funding will also support our current work towards launching a Members’ Scheme, which will help us to develop a sustainable income stream, and to deepen our relationship with the local community. I cannot adequately put into words the difference this funding makes to us.” Star & Crescent, U.K. (Wave 1 grantee)
For freelance journalists
Amount: €5,000
Eligible: freelance journalists; groups of freelance journalists
Focus: helping community-based, community-driven local media to engage communities and their conversations within short-term or one-off COVID-19-related initiatives. These grants may be used, for example, to launch a dedicated newsletter, create a community group, cover travel costs, cover costs of audio/visual/recording equipment to aid remote working, undertake local fact-checking, engage in community data reporting, produce short-run print material, or set up online events.
“We want to guarantee local reporting about migrant communities. This project will help a broader reporting of realities often unnoticed. Now I feel encouraged and motivated to share these stories.” María Clara Montoya, freelance journalist, Spain (Wave 1 grantee)
Endurance Fund
For news organisations only
Amount: €10,000 or €25,000
Eligible: news organisations (that employ the equivalent of at least one full-time journalist)
Focus: providing specific financial support to news organisations that have pivoted / are pivoting their business model during the COVID-19 crisis. These grants may be used, for example, to invest in resources (including technology, toolkits, people, and experts) to: build resilience within teams and leadership, facilitate effective cross-team collaboration and sharing of knowledge, create more/better pathways for community participation in the work of the news organisation, execute user-focused product development, or develop or launch reader-revenue models.
“This support is vital to our survival as we have lost a significant portion of our advertising revenue. We can continue to serve our audience through the hard times ahead, as we have done for the last 20 years.” Klubrádió, Hungary (Wave 1 grantee)
Grantee from Wave 1, Koncentrat
Eligibility and selection criteria
The fund is open to freelance journalists or news organisations with their principal place of business located in a country in the Council of Europe. The applicant must be serving communities on a hyperlocal, local or regional scale and/or communities of interest.
Independent experts and the EJC team will shortlist and select grantees according to the criteria laid out in the Call for Applications (Wave 2).
Previous, unsuccessful applicants from Wave 1 (April 2020) may re-apply, but must create a new application that reflects changes in circumstances since the original application. Successful news organisations and freelancers who received grants from Wave 1 are not eligible to re-apply.
Please check the new Call for Applications (Wave 2) and the updated FAQs for details.
The deadline for applications is 11:59 pm CEST, Friday 25 September 2020.
Important links
“We want to share the joy of this award with our members: without them, Slow News wouldn’t even exist. So, we are committed more than ever to provide them journalism they support every day.” Slow News, Italy (Wave 1 grantee)
About the Fund Partners
Since 1992, the EJC has been building a sustainable, ethical and innovative future for journalism through grants, events, training and media development. It is an international non-profit, headquartered in the Netherlands, that connects journalists with new ideas, skills and people. Our focus in 2020 is building resilience into journalism.
The Facebook Journalism Project works with publishers around the world to strengthen the connection between journalists and the communities they serve. It also helps address the news industry’s core business challenges. Its trainings, programs, and partnerships work in three ways: build community through news, train newsrooms globally, and quality through partnerships.
|
https://medium.com/we-are-the-european-journalism-centre/new-call-for-applications-opens-for-european-journalism-covid-19-support-fund-98a33e8372f4
|
['Adam Thomas']
|
2020-09-17 12:01:18.531000+00:00
|
['Journalism', 'Funding', 'Covid 19', 'Media', 'Updates']
|
Title New call application open European Journalism COVID19 Support FundContent Emergency Fund news organisation Amount €5000 €10000 €25000 €5000 €10000 €25000 Eligible news organisation employ equivalent least one fulltime journalist news organisation employ equivalent least one fulltime journalist Focus providing specific financial support address immediate critical business need grant may used example replace lost sale revenue including printed digital product service fund alternative print distribution method cover key organisational cost hire freelancer replace staff illness maintain essential coverage service unrelated COVID19 fund software service infrastructure support “Without help fund would close Star Crescent least foreseeable future funding also support current work towards launching Members’ Scheme help u develop sustainable income stream deepen relationship local community cannot adequately put word difference funding make us” Star Crescent UK Wave 1 grantee freelance journalist Amount €5000 €5000 Eligible freelance journalist group freelance journalist freelance journalist group freelance journalist Focus helping communitybased communitydriven local medium engage community conversation within shortterm oneoff COVID19related initiative grant may used example launch dedicated newsletter create community group cover travel cost cover cost audiovisualrecording equipment aid remote working undertake local factchecking engage community data reporting produce shortrun print material setup online event “We want guarantee local reporting migrant community project help broader reporting reality often unnoticed feel encouraged motivated share stories” María Clara Montoya freelance journalist Spain Wave 1 grantee Endurance Fund news organisation Amount €10000 €25000 €10000 €25000 Eligible news organisation employ equivalent least one fulltime journalist news organisation employ equivalent least one fulltime journalist Focus providing specific financial support news organisation pivoted pivoting business model COVID19 crisis grant may used example invest resource including technology toolkits people expert build resilience within team leadership facilitate effective crossteam collaboration sharing knowledge create morebetter pathway community participation work news organisation execute userfocused product development develop launch readerrevenue model “This support vital survival lost significant portion advertising revenue continue serve audience hard time ahead done last 20 years” Klubrádió Hungary Wave 1 grantee Grantee Wave 1 Koncentrat Eligibility selection criterion fund open freelance journalist news organisation principal place business located country Council Europe applicant must serving community hyperlocal local regional scale andor community interest Independent expert EJC team shortlist select grantee according criterion laid Call Applications Wave 2 Previous unsuccessful applicant Wave 1 April 2020 may reapply must create new application reflects change circumstance since original application Successful news organisation freelancer received grant Wave 1 eligible reapply Please check new Call Applications Wave 2 updated FAQs detail deadline application 1159 pm CEST Friday 25 September 2020 Important link “We want share joy award member without Slow News wouldn’t even exist committed ever provide journalism support every day” Slow News Italy Wave 1 grantee Fund Partners Since 1992 EJC building sustainable ethical innovative future journalism grant event training medium 
development international nonprofit headquartered Netherlands connects journalist new idea skill people focus 2020 building resilience journalism Facebook Journalism Project work publisher around world strengthen connection journalist community serve also help address news industry’s core business challenge training program partnership work three way build community news train newsroom globally quality partnershipsTags Journalism Funding Covid 19 Media Updates
|
5,528 |
What 2020 Taught Us About Ourselves
|
What 2020 Taught Us About Ourselves
The year made us appreciate things we once hated.
Photo by Charl Folscher on Unsplash
This morning I went to YouTube’s recommended section, originally planning to watch the latest late night show episodes and get my update on what you guys across the big pond are doing to your country.
Instead I ended up watching a soccer goal compilation, which has to be the first time in my life that I did that. I never liked soccer; whenever our newscasters went on to talk about sports I switched off the TV.
In the same way I learned to like the simple pleasure of comfortable pants; not quite to the point where I would wear them in public, but I do wear them while working from home most days.
I learned to enjoy meeting and talking to people after two decades of doing whatever I possibly could to prevent human interactions.
And while that is just me, I believe that we can all agree on seeing our priorities, beliefs, work and pastimes shift in all kinds of weird, unexpected ways this year.
So what then, does that teach us about ourselves?
Priorities do shift.
This makes me afraid; it may mean that I end up one of those guys with a house, a family and a financed too-big-a-car, where I can barely keep up with the monthly payments.
I can only hope that I will be able to maintain my disgruntled-loner state, liking work more than living and filling every waking hour with productivity of some sort; that is certainly the easier life to live.
We need some form of continuity in our lives.
If I look back at my own life I have usually made larger changes while retaining continuities and routines in other areas — even if I ended up changing those soon after sometimes.
So maybe I would move places, but stay in contact with people of the old worlds, then once we inevitably grew apart I had my new world offering me continuity.
2020 has tossed a lot of those around, shuffled the cards, and sometimes the dice fell in weird, unrelated ways. I think that a lot of us have suffered from this uncertainty on a larger scale more than the uncertainty in our personal lives. Unemployment security used to safeguard us all from going hungry, but I think this year was the first when many started wondering how long the state would still be in a state (hah) to cover those social securities.
Staying adaptable may just be the most powerful skill we can hone.
Anyone remember the gay, Jewish ex-Nazi who realized one day that he should probably reevaluate his beliefs? If a guy so deep into any one rabbit hole can recover and readjust, then so can we, right?
My life in retrospect has been a long string of swift and drastic lifestyle changes and I’m incredibly glad for that. I lived in eight different cities, towns and villages now, worked a farm job, did construction work, assembled e-cigarettes in a warehouse — and then somehow ended up in a hectic programming job that got so bad now towards the end that I’m delivering pizza to clear my head.
So I guess I have the advantage of change being my routine, this time next year I might be at a completely different place living a completely different life. Who knows, who cares.
We can live well on much less than we do right now.
I am nowhere near rich, but I’m entering that frightening stage in life where I begin to have actual savings, haven’t needed to touch my emergency fund in months and money comes in slightly faster than I can reasonably spend it.
But there was also a time when I lived on the 650€ a month that we were paid as apprentices (obviously with the help of my mom, but still), and somewhere in between those two I was working hard at a substandard wage and lived a life worth living regardless.
I have good memories of those days, I even own a now-dysfunctional pilot watch that I bought from a rare, unexpected year-end-bonus and treasured ever since. I should probably have it repaired, but it was cheap enough to make the repair more expensive than buying a completely new one so I’m on the fence with that.
The point here is that I live more on less than other people spend to live miserable lives.
Happiness is not in things or people but rather in the moment.
I live a weird fringe life where I all-but-know that the life I’m currently living is due for a change in little time — that includes people I like, people I dislike, the things I own and the plans and dreams I have at any given moment.
Right now what I enjoy most is the weirdly stressful, weirdly relaxing way I meet with a friend who is a nurse working at odd hours, varying shifts and only a week’s advance knowledge of when and where we might be able to meet. The other night she needed moss for Christmas crafts so we met in the darkness of the early morning hours to hike around a lake and hunt for patches of moss to the shine of our flashlights — as you usually do.
That was fun by the sheer weirdness of the idea, even before adding the interesting conversations that arise from living two vastly different lifestyles.
I treasure those moments, much like the nights I spent working late constructing event locations, the hours on the farm cutting firewood, riding a tractor for hours across the interstate after a one-minute crash course (that’s the gas pedal, that’s the brake, if you think you need turn signals those are there) — they are insignificant on the greater scale, but I still enjoy thinking back to them.
Having too much time to think can’t be good in the long run.
This year has seen me change my ways in more than just one regard, the main one being that I started to live, think, read and write again after what feels like five years of absence from life.
|
https://medium.com/the-ascent/what-2020-taught-us-about-ourselves-50aea26e8c14
|
[]
|
2020-12-26 22:02:31.269000+00:00
|
['Life', 'Self-awareness', 'Self', 'Self Improvement', 'Life Lessons']
|
Title 2020 Taught Us OurselvesContent 2020 Taught Us year made u appreciate thing hated Photo Charl Folscher Unsplash morning went YouTube’s recommended section originally planning watch latest late night show episode get update guy across big pond country Instead ended watching soccer goal compilation first time life never liked soccer whenever news caster went talk sport switched TV way learned like simple pleasure comfortable pant quite point would wear public wear working home day learned enjoy meeting talking people two decade whatever possibly could prevent human interaction believe agree seeing priority belief work pastime shift kind weird unexpected way year teach u Priorities shift make afraid may mean end one guy house family toobigacar financed barely keep montly payment hope able maintain disgruntled loner state like work living fill every waking hour productivity sort certainly easier life live need form continuity life look back life usually made larger change retaining continuity routine area — even ended changing soon sometimes maybe would move place stay contact people old world inevitably grew apart new world offering continuity 2020 tossed lot around shuffled card dice fell weird unrelated way sometimes think lot u suffered uncertainty larger scale uncertainty personal life Unemployment security used safeguard u going hungry think year first many started wondering long state would still state hah cover social security Staying adaptable may powerful skill hone Anyone remember gay jewish exnazi realized one day probably reevaluate belief guy deep one rabbit hole recover readjust right life retrospect long string swift drastic lifestyle change I’m incredibly glad lived eight different city town village worked farm job construction work assembled ecigarettes warehouse — somehow ended hectic programming job got bad towards end I’m delivering pizza clear head guess advantage change routine time next year might completely different place living completely different life know care live well much le right nowhere near rich I’m entering frightening stage life begin actual saving haven’t needed touch emergency fund month money come slightly faster reasonably spend also time lived 650€ month paid apprentice obviously help mom still somewhere two working hard substandard wage lived life worth living regardless good memory day even nowdysfunctional pilot watch bought rare unexpected yearendbonus treasured ever since probably repaired cheap enough make repair expensive buying completely new one I’m fence point live le people spend live miserable life Happiness thing people rather moment live weird fringe life allbutknow life I’m currently living due change little time — includes people like people dislike thing plan dream given moment Right enjoy weirdly stressful weirdly relaxing way meet friend nurse working odd hour varying shift week’s advance knowledge might able meet night needed moss Christmas craft met darkness early morning hour hike around lake hunt patch moss shine flashlight — usually fun sheer weirdness idea even adding interesting conversation arise living two vastly different lifestyle treasure moment much like night spent working late night constructing event location hour farm cutting firewood riding tractor hour across interstate oneminute crash course that’s gas pedal that’s brake think need turn signal — insignificant greater scale still enjoy thinking back much time think can’t good long run year seen change way one regard main one started live think read write 
feel like five year absence lifeTags Life Selfawareness Self Self Improvement Life Lessons
|
5,529 |
36 Alien Civilizations Have Colonized the Milky Way
|
Re-thinking the Drake Equation; The Astrobiological Copernican Limit.
A new study published in the Astrophysical Journal has shifted the paradigm on the question of alien existence.
The study, conducted by researchers at the University of Nottingham, takes a fresh look at the Drake Equation. The researchers developed a new calculation called the Astrobiological Copernican Limit which is a more specific alien probability analysis than the Drake Equation.
First, some background.
The ACL draws inspiration from the famous Copernican Principle: the idea that Earth does not sit at the center of the universe and is not special in some way.
In the 16th century, the scientist Nicolaus Copernicus proposed that the Sun is centrally located and stationary, in contrast to the then-upheld belief that the Earth was central. The Austrian-British cosmologist Sir Hermann Bondi named the principle after Copernicus in the mid-20th century.
By looking at the world through the eyes of this principle, we can remove certain blinders and preconceptions about ourselves and re-examine our relationship with the universe.
The new study centers around the Copernican principle. The study guesstimates that the number of communicating extraterrestrial intelligent civilizations in our galaxy could be somewhere between 4 and 211, but is most likely 36!
Professor of Astrophysics at the University of Nottingham, Christopher Conselice, who led the research, explains: “There should be at least a few dozen active civilizations in our Galaxy under the assumption that it takes 5 billion years for intelligent life to form on other planets, as on Earth.”
The research is looking at evolution, but on a cosmic scale, hence the “Astrobiological Copernican Limit.”
Of course, the civilizations proposed by the team would be able to send radio signals out into space, which is what qualifies them as communicating.
The key difference between the ACL and the Drake equation is that it makes very simple assumptions about how life developed.
One major assumption is that life forms scientifically; that is, if the right conditions are met, then life will form.
This approach bypasses the two impossible-to-answer questions that have plagued previous calculations; “what fraction of planets in the habitable zone of a star will form life?” and “what fraction of life will evolve into intelligent life?”
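To see why those fractions are the sticking point, here is a toy version of the classic Drake product (a sketch with invented parameter values, not the study’s numbers):

# N = R* x fp x ne x fl x fi x fc x L  (the classic Drake equation)
def drake(r_star, f_p, n_e, f_l, f_i, f_c, lifetime):
    return r_star * f_p * n_e * f_l * f_i * f_c * lifetime

# Invented inputs; f_l and f_i are the two unanswerable fractions
print(drake(r_star=1.0, f_p=1.0, n_e=0.2, f_l=1.0, f_i=1.0, f_c=0.2, lifetime=1000))  # 40.0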
These two questions are not answerable until we detect life, which still seems like a far-fetched reality.
Again, much like the Drake Equation, the ACL isn’t without its fair share of issues. We can’t accurately know the correct figures to compute, with estimates of the number of anything in the Milky Way tending to differ from source to source.
We don’t know the number of stars and exoplanets in the galaxy. So while this new method is surgical in its approach, it’s by no means definitive. We are still at the stage of trying to figure out the most precise values to compute.
There’s also the fact that we’re still not certain what caused intelligent life to evolve even on earth. The assumption that it could happen anywhere else in the universe “under the right circumstances” is incredibly far-fetched.
Additionally, the study suggested that the average distance to the nearest civilization might be about 17,000 light-years, making detection and communication extremely difficult with our current technology.
|
https://medium.com/predict/36-alien-civilizations-have-colonized-the-milky-way-b23d67518cba
|
['Leon Okwatch']
|
2020-12-28 01:38:45.466000+00:00
|
['Science Fiction', 'Astronomy', 'Space', 'Physics', 'Science']
|
Title 36 Alien Civilizations Colonized Milky WayContent Rethinking Drake Equation Astrobiological Copernican Limit new study published Astrophysical Journal shifted paradigm question alien existence study conducted researcher University Nottingham take fresh look Drake Equation researcher developed new calculation called Astrobiological Copernican Limit specific alien probability analysis Drake Equation First background ACL draw inspiration famous Copernican Principle idea earth sit center universe special way 16th Century scientist Nikolau Copernicus proposed Sun centrally located stationary contrast currently upheld belief Earth central Austrian cosmologist Sir Herman Bondi named principle Copernicus mid20th century looking world eye principle remove certain blinder preconception reexamine relationship universe new study center around Copernican principle study guesstimate number Communicating Extraterrestrial Intelligent Life galaxy could somewhere 4 211 likely 36 Professor Astrophysics University Nottingham Christopher Conselice led research explains “There least dozen active civilization Galaxy assumption take 5 billion year intelligent life form planet Earth” research looking evolution cosmic scale hence “Astrobiological Copernican Limit” course civilization proposed team would able send radio signal space qualifies communicating key difference ACL Drake equation make simple assumption life developed One major assumption life form scientifically right condition met life form approach bypass two impossibletoanswer question plagued previous calculation “what fraction planet habitable zone star form life” “what fraction life evolve intelligent life” two question answerable detect life still seems like farfetched reality much like Drake Equation ACL isn’t without fair share issue can’t accurately know correct figure compute estimate number anything Milky Way tending differ source source don’t know number star exoplanets galaxy new method surgical approach it’s mean definitive still stage trying figure precise value compute There’s also fact we’re still certain caused intelligent life evolve even earth assumption could happen anywhere else universe “under right circumstances” incredibly farfetched Additionally study suggested average distance nearest civilization might 17000 lightyears making detection communication extremely difficult current technologyTags Science Fiction Astronomy Space Physics Science
|
5,530 |
Was a ‘Secret’ Version of the Gospel of Mark Found in 1958?
|
Was a ‘Secret’ Version of the Gospel of Mark Found in 1958?
Morton Smith made an announcement
Mar Saba monastery, Palestine (2011; Creative Commons license)
On December 29, 1960, at a meeting in New York of the Society of Biblical Literature and Exegesis, an assistant professor of history at Columbia University named Morton Smith announced an exciting manuscript discovery. Two years prior, he said, he’d been looking over some old Latin books in the top room of the tower library at the Mar Saba monastery, which is outside of Jerusalem.
At the end of a book printed in 1646, he’d noticed two and a half pages of handwriting. It was a text, in Greek, identified as a letter by Clement of Alexandria, the second-century Christian, addressed to a person named Theodore. The letter discusses, and quotes from, a “secret” version of the gospel of Mark.
One would have to appreciate, at least, a good story. A scholar, well-regarded in his field, had found in an old book a copy of a secret teaching of a sacred text, kept hidden since the origins of the faith. He wasn’t allowed to remove the book, so it remained locked up in the monastery.
But he had taken photos.
Morton Smith, “Secret Mark” (1958; public domain; colorized)
There were ‘new’ scenes with Jesus, his mother and Mary Magdalene, and a resurrected Lazarus
The most detailed scene was a passage to be inserted between Mark 10:34 and 35. It was, Clement prefaces, a “more spiritual” version of the gospel.
“And after six days Jesus told him what to do and in the evening the youth comes to him, wearing a linen cloth over his naked body. And he remained with him that night, for Jesus taught him the mystery of the kingdom of God.”
Clement adds, in reference to a question from Theodore, that the phrase “naked man with naked man” wasn’t in the secret text.
The day following Smith’s presentation, newspapers around the country report on the matter, framing it as a curiosity.
The Hackensack Record, December 30, 1960.
Was it an eerie moment in Christian history?
Since 1945, ‘new’ biblical texts had been shocking the established religious traditions: the Dead Sea Scrolls, the Nag Hammadi codices. Christians were being pressed to weigh the religious acceptability of manuscripts of which they had no knowledge!
However, Christians knew that Jesus was not a “magician,” or something on the level of a hypnotist, and had not had an erotic scene with a young man—which was the case that Smith set out to prove.
Along the way, he kept up a correspondence with Gershom Scholem, the great Jewish scholar. No matter how reluctant, Christian scholars will have to deal with the matter, Smith writes to him, as “the text is there and has to be explained, and the problems are there and have to be answered.”
Is the text saying that Jesus and Lazarus were sexual? As Smith writes in his resulting 1973 book, “there is no telling how far symbolism went,” though the key moment in the ritual, he thinks, would be when “the disciple was possessed by Jesus’ spirit.”
This “mystery” language is given a spousal context in Ephesians 5:32, and Paul and Peter do issue those warnings against ‘immorality’. To Smith, it seems the early Christians had gotten a little licentious, and the scene with Jesus and the young man might provide the reason.
His book got media play and professional blowback. “I’m reconciled to the attacks,” he tells the New York Times. “Thank God I have tenure!”
However, from the establishment of Biblical scholarship, the feeling might be that he was fired.
A Jesuit scholar named Quentin Quesnell was suspicious
In two papers, in 1975 and 1976, he broached the matter. Why had Smith not taken more efforts to secure this supposed manuscript, and make it available for public scrutiny?
Why would it be, he mused, that Smith had—all his professional career—been interested in the very subjects his discovery seemed to verify? Even in his 1951 dissertation, Smith had written about “secret doctrine” in early Christianity and “forbidden sexual relationships.”
The hazy suggestion is that Smith might have forged it, but Quesnell allows that Smith might have found a text forged by someone else. “I do not find the style typical of Clement,” he notes.
In 1983, Quesnell traveled to Mar Saba to examine the manuscript, expecting to find an obvious forgery. His knowledge of this field, as he writes in his notes, included “what I read about forgeries in detective stories.”
When the book was finally before him, just as Smith had described it, Quesnell realized he wasn’t sure. He saw the librarians were good at guarding their property, and it would be “impossible” to remove the book. Quesnell went home, never to discuss the experience publicly.
About four other scholars had also gone to see the letter. They’d taken more photographs, and tried to arrange tests on the paper by Israeli scientists. As this would involve Jews, the monks wouldn’t allow it.
The monastery had begun to see Smith as publicity-hungry. He tried to get a BBC camera crew into the library. This disturbance was refused. Somewhere along the way, the letter went missing.
For years it seemed Smith had been the only person to see the ‘Letter to Theodore’
A theory formed among the skeptics. Quesnell’s notes, found after he died in 2012, note the talk that “psychological explanations” account for Smith’s forgery. He’d never married and was an Anglican priest in early life.
Morton Smith in 1989 by Allan J. Pantuck (public domain)
In 2005, Stephen C. Carlson published The Gospel Hoax, a dismissal of “Secret Mark” as a forgery motivated by Smith’s homosexuality. In 2007, Peter Jeffery published a similar critique, The Secret Gospel of Mark Unveiled.
Jeffery writes:
“My impression is that Morton Smith was a man in great personal pain, even if (which I don’t know) he was usually able to hide this fact from the people who knew him.”
The posthumous ‘outing’ by Bible scholarship was done without biographical investigation. In 2010, Biblical Archaeology Review did a feature on the controversy. Friends of Smith wrote in, disputing he’d been homosexual. He’d dated at least two women. A friend reports: “I suspect that he was just an Anglican clergyman who had had an unsuccessful love affair and afterward condemned himself to bachelorhood.”
Scholars continued to dismiss “Secret Mark”
Academics as illustrious as Larry Hurtado, Bart Ehrman, and Craig Evans, saw it as more or less a ruse on Smith’s part. Among Bible scholars this might even have been a required view.
A South African feminist scholar named Winsome Munro had a 1992 paper, “Women Disciples: Light from Secret Mark.” And other non-American scholars, like Richard Bauckham and Scott G. Brown, examined how the ‘new’ text could illuminate some famous problems in the gospel narratives — as if a piece, removed, had been placed back in.
Timo Paananen, from the University of Helsinki, analyzed the handwriting in the photos of the letter, finding the case for forgery rather weak.
In 2008, the correspondence between Smith and Scholem was published. One watches Smith thinking, and re-thinking, the letter over time. The idea of his being a forger, the editor thinks, had emerged “from quite unscholarly grounds,” and in retrospect the evidence:
“…strongly points to the total trustworthiness of Smith’s account of his important discovery (though not necessarily of his interpretation of the document).”
An independent scholar named Stephan Huller did a series of blog posts on the matter, noting possibilities. Following textual clues, this ‘Theodore’ seemed to be an early Christian about whom more was known—like that he’d had a “same-sex union rite” and been “united to another man in this city with Origen of Alexandria presiding over the ceremony.”
That connection was noticed by an independent scholar named Michael Zeddies. In two papers, in 2017 and 2019, he traces the possibility that the ‘Letter to Theodore’ wasn’t written by Clement at all. It sounded more like Origen, the key third-century Christian scholar.
Origen had been declared a heretic in 553, some two hundred years after his death. A misattribution of the letter could have been the key to its survival. A story, that is, of Christians saving texts from Christians.
The idea of a ‘secret’ teaching was hardly new
Zeddies writes:
“Origen would have been quite comfortable with suggesting that some parts of Scripture were to be literally withheld from the spiritually unprepared.”
Smith’s ideas of erotic scenes might have been overdone. Religions often have their own specialized vocabulary. For Origen, the word “carnal,” for example, would point to the material world.
The “Secret Mark” scene might just describe a baptism, which in early Christianity had been done at night, all night, and fully naked. The early motto, Zeddies points out, was: “naked to follow the naked Christ.” The follower is shedding humanity along with clothes, dreaming of a new spirit form that is yet to be.
Whatever this “mystery” had been, it didn’t seem too lurid, at least, in 1 Corinthians 2:7, which notes the “mystery that has been hidden…”
What might not be clear is that it was ever revealed.
|
https://medium.com/history-of-yesterday/was-a-secret-version-of-the-gospel-of-mark-found-in-1958-9bde330fa1f9
|
['Jonathan Poletti']
|
2020-12-18 17:14:56.369000+00:00
|
['Religion', 'Christianity', 'Books', 'Bible', 'History']
|
5,531 |
Case study: how I would design Twitter’s Edit Button
|
So I hope everyone has a mask on, because that might encourage Twitter to add an edit button. That being said, here’s a case study (case essay? redesign?) that shows how I’d implement the infamous ‘edit’ button on Twitter.
Target Use Case:
Giving users a chance to fix their tweets while retaining accountability.
Giving people the chance to fix a spelling error, a misquoted tweet, or false information would be greatly beneficial to the platform as a whole.
Just think about the number of times you’ve been wrong or you’ve seen tweets that were wrong. A couple of outcomes may occur, such as:
People begin to correct and pile onto the original poster to let them know that they’re wrong
A tweet becomes a thread — with the correct response (kinda like social proofing)
It gets deleted, which holds 0 accountability to the poster.
Design Analysis:
Mockups for how the edit button would work
My design process was to stay as authentic and realistic as possible to what Twitter would do themselves.
That’s why I decided that, instead of adding another action button to a tweet, editing would be implemented as another menu option, similar to ‘delete tweet’ and ‘pin to profile’.
Afterward, the tweet will be treated just like a regular retweet, except at the bottom of the tweet there will be a tag stating that the tweet is edited, and it is essentially threaded to the original tweet.
This tag serves a similar function to Twitter’s already existing tag regarding violations of their rules.
An example of the kind of tag shown when you violate Twitter’s rules.
The language I chose for the tag does a few things:
Holds the user accountable
Notifies any viewer that the tweet is NOT the original tweet & has been edited
Anytime someone comments on or retweets the edited or original tweet, the tag will appear and notify users that it has been edited
Gives viewers the chance to gain optional knowledge of Twitter’s guidelines
Language used for the tag
The feature would also hypothetically not allow the author to delete the original tweet unless they delete the edited version as well — effectively deleting the thread of tweets altogether. The edit function could only be used once on any one tweet. This way, it’ll ‘foolproof’ against any kind of misleading tweets.
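To make those constraints concrete, here is a minimal sketch of the edit-once and delete rules. Everything in it — the types, the in-memory store, the function names — is a hypothetical illustration of the behavior described above, not Twitter’s actual API.

```typescript
// Hypothetical model of the edit-once / delete-cascade rules.
interface Tweet {
  id: string;
  text: string;
  editOf?: string; // id of the original tweet, if this is an edit
  hasEdit: boolean; // true once this tweet has been edited
}

const tweets = new Map<string, Tweet>();
let nextId = 1;

function editTweet(originalId: string, newText: string): Tweet {
  const original = tweets.get(originalId);
  if (!original) throw new Error('tweet not found');
  if (original.hasEdit) throw new Error('a tweet can only be edited once');
  original.hasEdit = true; // the one allowed edit is now used up

  const edited: Tweet = {
    id: String(nextId++),
    text: newText,
    editOf: originalId, // threads back to (and tags) the original
    hasEdit: false,
  };
  tweets.set(edited.id, edited);
  return edited;
}

function deleteTweet(id: string): void {
  const tweet = tweets.get(id);
  if (!tweet) return;
  // The original can't be deleted while its edited version exists.
  if (tweet.hasEdit) throw new Error('delete the edited version first');
  tweets.delete(id);
  // Deleting the edited version removes the whole thread.
  if (tweet.editOf) tweets.delete(tweet.editOf);
}
```

Modeling the edit as a separate tweet that points back to the original is what keeps the original visible, threaded, and tagged, rather than silently overwritten.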
A terrible recording of the mockup I made for the interaction
Final Thoughts
I believe having an edit button would be beneficial to Twitter’s interactions as long as there is some form of accountability between users and a way for the original tweet to still exist alongside the edited version. This would help deter any misleading or offensive-styled tweets because of the role the interaction would play. The edit button would also help educate people on what is an error and what is a mistake — effectively creating a way to redeem oneself if need be.
We’ve seen the power social media has over the ‘real’ world. From affecting our mental health to taking over entire elections, it’s no surprise we’ve gotten to a point where we question whether the pros outweigh the cons of platforms like Reddit, Twitter, Facebook, and Instagram. Is it worth it? And if it is, how can we make it better for everyone to interact with? What steps can we take to prevent the spread of misinformation, propaganda, and inappropriate opinions?
It’s a question that we have to continually ask ourselves, because with the advent of ‘deep fakes’ and people claiming “your source is wrong, my source is right” (what are we? children?), it seems like understanding what is true and what is not is becoming more difficult than we’d like to admit.
I believe the edit button on Twitter will help aid in the fight against all of these factors. As long as there is accountability and education, Twitter can make its platform a little less toxic by implementing this.
TL;DR:
|
https://medium.com/design-bootcamp/twitters-edit-button-how-i-would-do-it-e39458168a2b
|
['Anik Ahmed']
|
2020-10-16 03:08:19.994000+00:00
|
['User Interface', 'Case Study', 'Design', 'Twitter', 'UX']
|
5,532 |
Dr. Carmen Köhler: “We need more females in space”
|
Dr. Carmen Köhler is a true role model for men and women alike. As an “analogue astronaut” she explores Mars-like environments on earth and she is also a founder of the primary school competition Code4Space.
In this episode of REWRITE TECH, we talk to Dr. Carmen Köhler about Mars simulation missions, women in space, and how her passion has led to a very unique career.
How to become an analogue astronaut
The first thing people stumble upon when getting to know Dr. Köhler is her unusual career path. Born in Berlin, Dr. Köhler always loved maths, but she didn’t think she was capable of studying it. As a consequence, she followed her second passion and completed an apprenticeship to become a hairdresser. But then serendipity came into her life in the form of a client.
The client was a professor and they started to talk about books. When Carmen mentioned that she was currently reading Fermat’s Last Theorem, which is a book about mathematical proof, the professor was astonished. From that day on, he started to bring her mathematical programs and encouraged her to finally study the subject.
“I gave myself half a year after the hairdressing to become a make-up artist and then I studied maths. The first time I sat in the university and the professor started writing equations over equations, I was totally in love,” Dr. Köhler recalls.
The next turning point was the Austrian Space Forum, which was looking for an analogue astronaut. Carmen took the chance and eventually got the job. She explains: “Analogue astronauts are actually people who do science on Mars-like environments on earth.”
Dr. Carmen Köhler, why should we explore Mars?
The goal of these missions is to find out how the human body reacts to certain circumstances, both physically and mentally. Analogue astronauts are commissioned by universities or private sponsors to find answers to their questions. “As an astronaut, you’re the eyes and hands of the scientists,” describes Dr. Köhler.
“I think as humans, we are explorers. We are curious and we want to know things”
But the learning generated by these missions is not only relevant if we want to populate other planets, as Dr. Köhler clarifies: “What we learn in space, we can use for the earth and I think that is really important.” For example, spaceflights induce bone loss, which makes those flights an accelerated model for drug testing. Thanks to research on the International Space Station, a new therapy for osteoporosis could be found.
Is space made for women?
Since men and women produce different hormones, they also react differently to the environment of space. That’s why it’s so important to diversify the team and collect data. “We need data to know how our body and psychology react and what it has to do with our chemistry. For example, women in space have fewer problems with their eyes and ears than men.”
As the first female analogue astronaut, Dr. Köhler experienced some of these problems first-hand. The spacesuit and the shoes are made for men and are therefore quite heavy and large.
“We need more women, but we also need to make things better for women,” concludes Dr. Köhler. In her view, it’s important to have pioneers to guide the way and make space more inclusive for women. With her work, Dr. Köhler is one of these necessary forerunners.
Listen to REWRITE TECH with Dr. Carmen Köhler
Listen to the full conversation with Dr. Carmen Köhler on our REWRITE TECH Podcast, which is available on all common audio streaming platforms including Spotify and Apple Podcasts.
Don’t miss out on our other episodes including Janina Mütze from Civey or André Christ from LeanIX.
______________________________________________________________
Learn more about REWRITE TECH.
|
https://medium.com/rewrite-tech/dr-carmen-k%C3%B6hler-we-need-more-females-in-space-e8a52496caa1
|
['Sarah Schulze Darup']
|
2020-11-24 11:05:45.774000+00:00
|
['Aviation', 'Podcast', 'Women In Tech', 'Space', 'Science']
|
5,533 |
Why I Don’t Follow My Passion
|
Why I Don’t Follow My Passion
Motivations can’t bring food to your table
Photo by LinkedIn Sales Navigator on Unsplash
I’m a Digital Marketer by profession & a photographer by passion. You may wonder why I didn’t make my career in photography. I actually tried and learned a life-changing lesson that drove me to separate my career and passion.
I was brought up hearing the words “Follow Your Passion”. Later, I found out that it’s the biggest misconception motivational speakers give us.
The quote should be, “Follow your passion, but not blindly”.
I was confused at the age of 25: “Where should I start?” It’s the most common thought among us.
I didn’t have a shitload of money or an ovarian-lottery win, and passion-related jobs pushed me to fail horribly.
But I didn’t take it as a failure; I took it as a life-changing lesson and started my new beginning. I would like to share the lessons that changed my perspective.
|
https://medium.com/illumination/why-i-dont-follow-my-passion-78177176d9cc
|
['Intisar Mahee']
|
2020-12-12 21:11:32.627000+00:00
|
['Passion', 'Motivation', 'Jobs', 'Careers', 'Career Advice']
|
5,534 |
“Usability is Accessibility”
|
Disability access symbols; image credit (https://oae.stanford.edu/resources-faqs/disability-access-symbols)
How and why designers should think about accessibility.
Design for “Everyone”
Who do you design for? As designers, we aspire to create designs that have the potential to impact the world and people of all kinds of shapes, sizes, and backgrounds. Moreover, designers fixate on the importance of creating products that anyone can use; after all, usability is important, and designers must critically consider the efficacy of their products. Thus, inherent to design is the notion of “design for everyone” — that is, to create simple designs that “even your grandma can pick up and use!”
Junior designers (myself included) easily acknowledge and accept this utopian adage with eagerness, strictly follow the long and winding road of usability checklists, and gently fall into the lion’s den of non-inclusive design. What happened here? Usability checklists aren’t wrong; in contrast, they espouse great principles and I encourage you to continue following these checklists when creating your designs. The problem lies in what the designer failed to do — they failed to think as a designer. Design is more than just creating aesthetic details or manufacturing textbook usable motifs; design is about thinking about your audience. Unfortunately, attempting to design for everyone may accidentally cause designers to create overly-general designs that fail to accommodate the unique populations of users who may actually use your designs, but are not specifically mentioned in your general guidelines. That doesn’t mean it’s bad to create designs that benefit the most types of users, but you should be aware of how your perceptions of “everyone” might exclude particular audiences. The saccharine idea of universal design unfortunately poses an ironic dilemma — if you attempt to design for everyone, you can leave out anyone.
If you attempt to design for everyone, you can leave out anyone.
What happens when you attempt to design something that is usable by anyone? Can you design something that is usable by everyone? Who does your “everyone” exclude? Do you consider a working and pregnant mother in your designs? Do you consider the grandmother with Asperger’s? A four-year-old who recently scraped his hands after falling from his bike? A fifty-eight-year-old electrician with moderate eye strain? Or how about the teenager with thyroid cancer?
Whether implicitly or explicitly, your design will leave out particular audiences. This truth is unfortunate, but inherent to the politics of the artifact that you design. Now, as a designer, you may wonder, “How can I make sure that I’m designing for my user then?” The answer lies in the question: “design for the user.”
“How can I make sure that I’m designing for my user then?” The answer lies in the question: “design for the user.”
You may bat an eye at this statement. Isn’t designing for everyone designing for your user? The two ideas sound almost identical, but religiously following one idea over the other creates vastly different results. “Everyone” is an ambiguous term. Although it appears to stand for all people, it stands for no one in particular. It serves as a “catch-all” term that doesn’t cater towards anyone’s needs. And, unfortunately, the needs of “everyone” can leave out the unique traits that may characterize your user. Instead, critically identifying a discrete number of probable users (and noting their characteristics and skills) can enable you to avoid the pit of generalizing your user and specifically create great features that best suit the needs of the individuals who will actually utilize your app. After all, if your target audience never includes a particular type of user, why divert resources that could otherwise be used to craft a solution that accommodates your actual users?
Accessibility is for “Everyone” Usability Forgets
61 million adults in the United States have some type of disability — Centers for Disease Control and Prevention
A real consequence of “designing for everyone” is forsaking unexpectedly large populations of common users; these common users are usually tech-savvy individuals who rely on accessible technologies to live productively. Although you may counter that your app most likely will never be used by someone with “those demographics”, the numbers beg to differ. According to the CDC, 61 million adults in the United States have some type of disability. That’s 1 in 4 Americans. Again, you may counter that people with disabilities do not have the skills or needs to use technology, but again, you would be wrong. Pew Research reports that at least half of individuals with disability use the Internet on a daily basis. Although this number suggests that this population uses the Internet at a lower rate, it’s important to recognize that disabled populations still make up a significant percentage of possible users. Statistically speaking, it would be unwise to ignore the consideration of accessible designs and instead build for the idealized “everyone”.
…And Other Dangerous UX Myths
Accessible Design is Ugly
Even if designers agree that accessible design is important, it’s likely that they also think that accessible design is ugly. Although “ugly” accessible designs exist, accessible design is not inherently unattractive. (Moreover, accessible design is more than just creating aesthetic user interfaces but also includes creating accessible user flows and experiences.) Although few companies espouse accessibility in their design philosophies, beautiful accessible designs are not as uncommon as you may think.
Have you ever used an iPhone? How about a Mac? Or an Apple Watch? All of these products have several things in common but notably one: globally accessible features and designs. Perhaps surprisingly, all Apple products offer a suite of accessible options and tools that reflect its mindset towards accessible design. Throughout the years, Apple has remained dedicated towards creating beautiful products that remain accessible for individuals of a wide range of skills and abilities. Both disability advocates and designers praise Apple for its beautiful and accessible products.
People like Sadie (featured in the linked Apple advertisement) use accessible products to enjoy creative and productive lives like anyone and everyone else.
As much as designers hate ugly or bulky accessible designs, those with disabilities despise them just as much. Of course, those with disability embrace their identities, but that doesn’t mean that a stigma against those with disabilities doesn’t exist. Consequently, perhaps for the benefit and delight of both designers and those with disability, design shouldn’t be ugly.
Accessible Design is for Edge Cases
Both usability and accessibility checklists exist for designers to check and optimize their designs, but by no means are these checklists the end-all-be-all. Although heuristics exist to help promote accessible and usable designs, they don’t represent the nuanced designs necessary to create a unique, accessible product. Consequently, making a product accessible means more than just checking a box for using the right colors or the right fonts — it’s about creating user flows that reduce cognitive load or physical strain, and this type of attention to detail demands more consideration than just a PDF checklist from Nielsen Norman Group, even if it provides great heuristic guidelines. (Good) user experience designers don’t just create one-off designs, they create experiences, and accessible experiences should be created no differently. That said, making a product accessible does take time, but most likely not as much as you anticipated.
Accessible Design is Too Time Consuming
For individual developers or designers, creating accessible designs can feel overwhelming. To be fair, there exist many types of people with all kinds of disabilities. This can be intimidating to a junior designer who isn’t familiar with creating designs in the first place, and certainly not accessible designs. And it is likely that, if you are a busy designer on a tight deadline, making something accessible isn’t your first (or maybe even your last) thought — you just really, really need to ship this design by tonight so the devs can have it ready by the weekend.
First of all, accessibility shouldn’t be an afterthought; if you want to create quality designs that are usable for those populations, then you should treat disabled populations as real people and real users who deserve your attention. Additionally, because many resources for creating accessible technologies exist, nowadays, it’s especially easy to create accessible designs. Just as many common practices and motifs exist for creating usable designs, they also exist for accessible designs. Not only that, but usability and accessibility go hand in hand. After all, accessibility is usability for a significant, but marginalized and forgotten population.
Additionally, and luckily for our developer friends, many developer guides and tools exist to support accessible design. Developers can easily harness these tools to create websites that are both functional and accessible.
The Case for Designing for Accessibility
Expand Your Audience to Millions of People Worldwide
As I mentioned above, disability isn’t as uncommon as you think it is. Hundreds of millions of people worldwide have a disability. Thus, not accommodating these individuals not only leaves out a sizable population of those who could experience and enjoy your website, but it also reduces your own audience and outreach. Accommodating the virtual workflow of millions of individuals is not only considerate, but it’s also economical.
Accommodate the Needs of Others
Disability is hidden. It’s likely that you take classes with people of a wide range of abilities with all kinds of necessary day-to-day accommodations. That said, although some disabilities may not be as blatantly obvious as absolute blindness or paraplegia, they still can impact how millions of people use or visit websites. Moreover, many individuals who don’t have disabilities use accessibility tools to improve their workflows, so accommodating general accessibility needs helps a lot more people than you would imagine.
Set a Good Example
For better or for worse, a lot of companies and organizations claim to be welcoming to all people regardless of race, sexuality, gender, and more! I’ve seen it. You’ve seen it. We’ve all seen it. So if you claim to welcome people of all different shapes and sizes, races and ethnicities, genders and sexualities, then welcoming people of different abilities and skills is no exception. Although few companies (and a nonexistent number of student organizations) make it a priority to create products to accommodate a variety of audiences, setting a standard for being welcoming of a diversity of abilities makes a big difference.
Abide by WCAG 2.0
In the United States, WCAG 2.0 is a set of guidelines that defines the standards for web accessibility. It offers a comprehensive list of what supports accessible design. Although individuals have more freedom to create whatever website they want, large corporations in the United States face penalties for not meeting accessibility standards. Luckily, the United States government takes accessibility seriously, and this mentality should be reflected in all US-hosted websites.
How Do I Make My Designs Accessible?
WCAG 2.0 offers a list of guidelines for designers and developers on how to create more accessible products, but I wanted to share a few easy but important ways to make your digital products more accessible.
Strong Color Contrast
Do you wear glasses? Do you wear contact lenses? Can you not see unsaturated reds or greens? Many Americans experience some type of visual impairment, and many more wear glasses or contacts to correct their vision. One of the most well-known ways to enforce accessibility in your designs is to use colors with strong contrast. Strong color contrast helps make certain elements more visible and distinguishable. Certain web browsers also offer settings that let website users increase color contrast.
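For reference, WCAG 2.0 defines contrast as a ratio of the two colors’ relative luminances, and “AA” conformance asks for at least 4.5:1 for normal body text. Here’s a small sketch of that calculation; the formulas follow the WCAG 2.0 definition, while the function names are my own.

```typescript
// Convert an 8-bit sRGB channel to its linear value (WCAG 2.0).
function channelToLinear(c: number): number {
  const s = c / 255;
  return s <= 0.03928 ? s / 12.92 : Math.pow((s + 0.055) / 1.055, 2.4);
}

// Relative luminance of an sRGB color.
function relativeLuminance(r: number, g: number, b: number): number {
  return (
    0.2126 * channelToLinear(r) +
    0.7152 * channelToLinear(g) +
    0.0722 * channelToLinear(b)
  );
}

// Contrast ratio = (L_lighter + 0.05) / (L_darker + 0.05).
function contrastRatio(
  fg: [number, number, number],
  bg: [number, number, number]
): number {
  const l1 = relativeLuminance(...fg);
  const l2 = relativeLuminance(...bg);
  return (Math.max(l1, l2) + 0.05) / (Math.min(l1, l2) + 0.05);
}

console.log(contrastRatio([0, 0, 0], [255, 255, 255]).toFixed(1)); // 21.0
console.log(contrastRatio([119, 119, 119], [255, 255, 255]).toFixed(1)); // ~4.5
```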
Large or Variable Font Size
These days, using extra-large, bold fonts is especially trendy, but that hasn’t always been the case. Using large fonts or offering large font options is important to a growing number of audiences with low vision. Consider offering these options to create an accessible solution for those people.
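One lightweight way to offer this, assuming your styles are rem-based, is to scale the root font size and remember the user’s choice. A sketch — the storage key and button id are hypothetical:

```typescript
// Apply a font scale and persist it for the next visit.
function applyFontScale(scale: number): void {
  // With rem-based CSS, scaling the root font-size scales all text.
  document.documentElement.style.fontSize = `${scale * 100}%`;
  localStorage.setItem('fontScale', String(scale));
}

// Restore the saved preference on load, defaulting to 100%.
applyFontScale(Number(localStorage.getItem('fontScale') ?? '1'));

// Wire a hypothetical "larger text" button to a 125% scale.
document
  .querySelector('#larger-text')
  ?.addEventListener('click', () => applyFontScale(1.25));
```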
Descriptive Alt-Text
Did you know that Facebook and Instagram offer alt-text options when posting pictures? However, fewer than 0.1% of pictures on Twitter have captions, and most of these descriptions offer poor explanations of the depicted imagery. While including alt-text and captions is particularly important in creating an accessible website, writing descriptive, elaborated, and helpful descriptions is even more important to help those with low vision or blindness understand the displayed content.
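For illustration, here’s the difference in practice, set via the DOM; the image file and the wording are hypothetical:

```typescript
// A small sketch: unhelpful vs. descriptive alt text.
const img = document.createElement('img');
img.src = 'sunflower-painting.jpg'; // hypothetical file

// Unhelpful: a screen reader would announce little more than "photo".
// img.alt = 'photo';

// Descriptive: conveys the content a sighted viewer would get.
img.alt =
  'A young girl painting a sunflower at a kitchen table, grinning at the camera.';
document.body.appendChild(img);
```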
Keyboard Input Options
Many Americans with visual or motor disabilities rely on their keyboard to provide input or navigate websites. However, many websites fail to support this alternate method of navigation. Thus, to support these audiences, incorporating keyboard-accessible, descriptive navigational menus into websites or apps is imperative.
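As a sketch of what that support can look like, here’s a hypothetical custom menu made navigable with the arrow keys, Enter, and Space (the role attribute and menu structure are assumptions for illustration):

```typescript
// Make each custom menu item reachable and operable by keyboard.
const items = Array.from(
  document.querySelectorAll<HTMLElement>('[role="menuitem"]')
);

items.forEach((item, i) => {
  item.tabIndex = 0; // focusable without a mouse

  item.addEventListener('keydown', (e) => {
    if (e.key === 'ArrowDown') {
      items[(i + 1) % items.length].focus(); // next item, wrapping
      e.preventDefault();
    } else if (e.key === 'ArrowUp') {
      items[(i - 1 + items.length) % items.length].focus();
      e.preventDefault();
    } else if (e.key === 'Enter' || e.key === ' ') {
      item.click(); // activate, as a mouse click would
      e.preventDefault();
    }
  });
});
```

Native elements like buttons and links get this behavior for free, which is one reason to prefer them over div-based controls.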
Reduced Cognitive Load
This is a bit hard to explain, but if you are familiar with user experience principles, then it’s likely that you’ve heard of “cognitive load”. Simply put, reducing cognitive load is all about making things less complicated than they need to be — keep it simple! This means that user flows should be straightforward, interactions should be logical, and descriptions should be succinct and informational.
And More!
WCAG 2.0 elaborates all of its standards for accessibility on its website. It’s a long list, but it helps to capture the needs of hundreds of millions of people throughout the world! I encourage you to take a look at the WCAG 2.0 website to better understand what steps you can take to make your website more accessible.
What Does an Accessible Design Look Like?
The Brain Exercise Initiative has the cutest mascot!
This summer of 2020, the Brain Exercise Initiative (“a 501(c)(3) nonprofit organization that uses simple math, writing and reading aloud exercises as an intervention to improve cognitive function in those with Alzheimer’s Disease”) reached out to Bits of Good. In the midst of the shelter-at-home orders, the Brain Exercise Initiative was having trouble reaching out to seniors to practice their reading, writing, and math skills; this meant that these seniors were unable to gain access to vital resources aimed at helping improve their cognitive function. Thus, the Brain Exercise Initiative asked Bits of Good to create a mobile app that could serve as an intervention to improve cognitive function in those with Alzheimer’s Disease, particularly during the COVID-19 pandemic. Given this urgent task, Bits of Good organized a team of product managers, engineering managers, developers, and designers to create an app that would do just that, and I was one of three designers to help create this “brain exercise” app.
Reflecting on our design process, I am proud that my team of designers and I thought about accessibility long before we began to design. We understood from the start that designing for seniors (and possibly seniors with moderate to mild cognitive impairment) meant that we had to think critically about how we designed our app and for whom we designed our app. Nonetheless, since we were mostly unfamiliar with accessible technology, we dedicated a few weeks to reading article upon article about best practices for accessible design. Thus, identifying our users and target audience was especially crucial for creating the Brain Exercise Initiative app.
Our original wireframes for the Brain Exercise Initiative app were messy and unlike the final product, but it helped us think critically about what our next steps would be.
When we moved on from ideation to design, we not only opted to use vivid, high-contrast colors and bold, obvious fonts, but we also deliberately included several accessibility settings including ways to narrate text, change font size, increase contrast, reduce motion, and more. Although we understood that our main audience was likely seniors familiar with technology with moderate cognitive impairment, we also recognized that our aging population would likely have a wide range of abilities and skills, so we included features that would accommodate those users.
Later, as we iterated through our designs, our most difficult decisions involved choosing designs that would be most enjoyable and usable for our seniors. Reflecting on the effects of the world pandemic, we realized that these seniors, although in need of exercising their brains, had likely been without visitors for months. That said, we included fun pictures, warm copy, and encouraging notifications to design a more delightful experience. Additionally, we organized several meetings (and consulted our point of contact from the Brain Exercise Initiative) to fiddle with layouts, formats, colors, and fonts as we tried to optimize the app for our senior audience. Altogether, designing for this specific audience was a new and challenging task, but our previous research and outstanding grit helped us design something fantastic.
Look for the official Brain Exercise Initiative app in the iOS App Store and Google Play Store!
I’m excited and proud of the work that my team of designers put into this app. We spent countless hours squabbling over which designs would be easier for our senior audience to use and enjoy, and I think that the elegant flows and designs of the app reflect this dedication to accessibility. Now, I’ll be the first one to admit that our app isn’t perfect; our app hasn’t been evaluated by an accessibility expert or consultant and it still needs a lot of user testing; but I look forward to potentially working with the Brain Exercise Initiative again this coming semester to continue improving the app.
Conclusion
When we design for “everyone”, it can be easy for us to forget who our actual users are. As designers, we should determine who our users are and make critical decisions that inform our designs. It’s important to recognize the users that would benefit from accessible designs. Although designers may have reservations about creating accessible options or accessible designs, they should remember that accessibility can be incorporated easily and beautifully while benefitting many. So, I’ll ask you again.
Who do you design for?
— Thank you to the Brain Exercise Initiative for working with us!
Additional Resources You’ll Love
For Anyone
For Designers
For Developers
|
https://medium.com/bits-of-good/usability-is-accessibility-7bb6cc5996ee
|
['Kimberly Do']
|
2020-10-07 00:12:14.043000+00:00
|
['Product Design', 'Design', 'Accessible Design', 'UX Design', 'Accessibility']
|
5,535 |
What I Learned by Hiring an Editor to Critique My Novel
|
What I Learned by Hiring an Editor to Critique My Novel
It may have been less about my writing and more about me.
Photo by Chivalry Creative on Unsplash
I’ve been chasing my dream of becoming a published author for almost six years. Three manuscripts later — all of them collecting dust on a shelf — and I still feel like I’m making forward progress. That’s the good part.
Deciding to finally “retire” an unpublished piece and move on to something new is difficult. It feels like such a failure to admit that no one’s interested in the story you’ve agonized over for countless hours. But I’ve grown some thick skin over the years. Learned how to embrace each rejection as an opportunity to learn and improve.
Part of the learning process includes making the most of available resources to develop your craft. For writers, there are plenty of them out there. Conferences, online workshops and writing groups. Immerse yourself in them all if you can. Critique partners and beta readers are critical too. If you’re the only one to read your novel before thinking it’s ready for publication, it’s not.
While I had taken advantage of plenty of these resources along my journey, I still wasn’t gaining enough traction in terms of getting published. I’d been told my writing is solid, but I knew I was missing something. Something important. It wasn’t until I sought an opinion from one of my author friends that I realized I didn’t know the first thing about the business of publishing. Things like pacing, character arc and voice are important, but so is knowing what genres are popular at any given moment, how to hook an agent or editor with the first page, and which literary tropes have been overdone. Therefore, I set out to learn everything I could about what I now believe is my next (and hopefully last) step in getting published.
For me, I decided to do this by hiring an editor. Up until that point, I had never paid anyone for a critique, relying solely on my friends and a few random readers I’d connected with, who I believed would give me professional and objective feedback. There’s nothing wrong with that, and I’m grateful for all the time these people have taken to help me polish my work. But there’s a lot to be said for engaging a paid professional, someone concerned only with performing the work they were hired to do and not afraid to hurt your feelings if necessary.
My primary goal in identifying this person was to find someone with industry experience. I used an online service called Reedsy that gives authors access to a variety of editorial and publishing professionals. All you have to do is search for the type of service you’re interested in (I was looking for a developmental edit of a romantic suspense novel) and search through the profiles of people willing and able to do that work. You can sort by different criteria including price, genre, experience level and timeframe, put a quote together and submit to five professionals at a time. I was able to hire an editor who had experience with a major publishing house and who specialized in romance and mystery. And she was willing to work within my budget. The perfect fit.
Now that I’ve received her feedback, let me just say — or shout — it was money well spent! Not only did she help me polish my content, but the tips she gave me were invaluable. So much so, that I felt compelled to share them with anyone else out there struggling to launch their writing into the world. Here’s what I learned.
1. The rules of proper English may not always apply.
One of the “random readers” I referred to earlier happens to be a retired English professor. She’s read through all three of my manuscripts and has taught me a lot about how to employ the rules so perfectly summarized in Strunk and White’s Elements of Style. She’s a stickler for proper punctuation, has helped me clarify when to use past perfect tense, and loves to sprinkle adverbs into my writing for greater detail. She’s been a godsend and I can’t imagine writing anything of length without her.
At the same time, when I received feedback from my developmental editor, I was surprised to see that she often reversed the changes my English professor had suggested. Sentence fragments seem to be encouraged in modern-day novels. Adverbs should be used sparingly. And the only dialogue tag apparently needed is “said.” Using phrases like “she retorted” or “he growled” is believed to violate the rule of “show don’t tell” and can slow down the reader’s pace. One acceptable exception seems to be when denoting volume as in “he shouted” or “she whispered.” I questioned these principles at first but have since found plenty of confirmation that these truly are industry standards. Hm. Who knew? Again, money well spent.
2. Important details may only live in your own head. Set them free!
One of the biggest criticisms I received is that too many relevant facts were being kept from the reader. My editor at times even found this insulting, as if she couldn’t trust my main character. Whoa! I had no idea. But after I went back and read the passages she had been referring to, I realized she was right. And it was not intentional. As a writer, you know your characters and plot lines inside and out. The challenge is making sure everything you want people to know is communicated properly in writing. I hadn’t planned on leaving the reader guessing about certain details but failed to put all my thoughts down on the page. Luckily, this was an easy fix and now I’m more intentional about making sure the reader is “in the know.” No more mind reading required.
Sometimes it takes a second and third time to effectively communicate a particular point, and it’s okay to repeat yourself for emphasis. I know I appreciate this myself when I’m reading a book, especially one with a complex plot. That’s something my editor confirmed and counseled me about — how to weave reminders throughout a manuscript to ensure clarity of understanding. Without that piece of advice, many of my ideas may still be stuck inside my head, leaving my readers confused. Nobody wants that.
3. Stereotypes can dissuade agents from representing your work, not to mention offending your readers.
This is a biggie, and probably the most eye-opening piece of feedback I received. I didn’t realize that having my Spanish-speaking character use broken English throughout the manuscript could be offensive, or that portraying an Indian American character as an Ivy leaguer with a genius IQ could be a turn-off as overly stereotypical, but my editor strongly discouraged me from employing these techniques. She also cautioned against describing shoppers at a popular big box retailer (you can fill in the blank here) in a negative fashion, even if that’s how many people across the country may characterize them. “You automatically isolate yourself from a large number of potential readers — those who shop at that particular store.” Ironically, the use of some of those stereotypes was an attempt at humor, not contempt in any way. But if that’s the way it was perceived, I’ll absolutely heed the warning. I may even seek out a sensitivity editor in the future to make sure I’m not making any further “no-nos.” If there’s a market for that type of editor, which apparently there is, it’s obviously something that needs to be on my radar.
An unlikable protagonist could be the kiss of death.
Here’s where I needed to make the most drastic change. My protagonist’s love interest was originally portrayed as somewhat crotchety, kind of a curmudgeon with a strong bias against young people. (Think Mr. Magoo but not quite so old, and not that crotchety). Most of his opinions were based on his experiences and observations about Millennials, and the plan was to have his character evolve throughout the story and learn to appreciate the younger generation for their strengths instead of penalizing them for their perceived faults.
Unfortunately, this strategy backfired with my editor. She was quick to point out that she herself was a Millennial, as are many of the agents and editors working in the industry, and that she personally would not have made it to the end of the book. My curmudgeon struck a nerve with her. She even identified the point in the manuscript where she would have stopped reading, unwilling to wait and see whether he had a change of heart by the end.
Good to know, right? And not anything that would ever have occurred to me.
4. Pay attention to the current culture when making decisions about your characters.
There’s a scene in my novel where the protagonist gets drunk and ends up inviting the other main character back to her hotel room. It’s actually one of my favorite scenes as the chemistry between the two characters is pretty intense. But I was advised to make sure the woman wasn’t too drunk to consent to whatever eventually happens in that hotel room. Before the #MeToo movement, such a scene may not have thrown up any red flags, but now it does (and rightfully so). I didn’t have to alter the scene too much to fit within safe parameters, and that exercise definitely opened my eyes to being more conscious about similar issues moving forward.
In summary, I sure learned a lot through the process of a developmental edit. I have to admit that I didn’t agree with all the feedback at first. There were moments when I wondered whether I was being encouraged to kowtow to the gatekeepers of the industry and whether this is how passive-aggressive censorship works. (Please don’t judge. They were only fleeting thoughts).
But then I took a step back and really reflected on the observations being offered. I realized that some of my own cynicism and/or bias may have been creeping into my characters. What a lesson in self-awareness that is! I’ve always worked hard to be objective, fair and diplomatic in everything I do, but especially in my writing. And even though one editor’s subjective opinion is not the be-all, end-all, I really respect and appreciate her candor. If I want to get this book published (or maybe the next one after that), it’s important to know how to put my best foot forward. The truth can be harsh sometimes, but better to face it now than when I’m staring down a negative book review!
|
https://medium.com/swlh/what-i-learned-by-hiring-an-editor-to-critique-my-novel-2e29e5a42d10
|
['Susan Poole']
|
2020-09-30 13:25:42.338000+00:00
|
['Writing Life', 'Novel Writing', 'Writing', 'Writing Tips', 'Publishing']
|
5,536 |
expertlead, a global community of highly qualified tech freelancers
|
expertlead has raised €7M in total. We talk with Arne Hosemann, its CEO.
PetaCrunch: How would you describe expertlead in a single tweet?
Arne Hosemann: expertlead is a global community of highly qualified tech freelancers. We support our community in all stages of self-employment: from project acquisition, providing relevant services and opportunities for further training and peer-to-peer learning, to administrative tasks.
PC: How did it all start and why?
AH: Earlier in our respective careers we constantly heard businesses complain about how hard finding great tech talent is. This, in itself, is not too surprising, given that the shortage of top tech talent is a widely discussed topic.
But when we talked to developers, they would point out how poor their experience with recruiters and staffing agencies had been. Very often, even tech-focused recruiters would not understand the specifics of these developers’ tech skills or their professional preferences. This got us thinking about what it would take to make the experience on both sides significantly better.
As there is a growing trend towards self-employment in tech and freelancers still often work alone or in remote teams, we wanted to become their go-to partner that supports them in all stages of self-employment.
This was what drove our whole idea: building a community that is really different from a recruitment agency or a pure self-matching talent marketplace by becoming a true partner in our self-employed tech experts’ professional lives — from project acquisition, opportunities for continued professional development and peer-to-peer learning to taking care of administrative tasks.
Similarly, for our clients, we did not just want to focus on matching demand — our mission is to go a lot further and help companies identify the best talent. This is where many companies struggle, especially those that are not digital native. Assessing the various skills level of tech applicants is quite challenging and can be very time-consuming and hence expensive. Therefore, we started digging deeper into how we can test various tech stacks, databases and frameworks while still keeping it enjoyable for the applicants as well. Very quickly we got to the point where we realized that no single company can cover the entire tech field when it comes to testing — it is way too broad and complex. That is when the idea was born to involve our tech community in assessing other tech experts — which is one of our core USPs today.
With that in mind, we both left our previous job in 2017 and started expertlead in 2018.
PC: What have you achieved so far?
AH: Since Alex and I started in 2018, our team has grown rapidly: we are now an international startup headquartered in Berlin that employs around 45 people. By the end of the year we expect to be around 60 employees.
We have invested most of our seed capital in building out our tech products. We aim to use tech solutions across our entire value chain: from identifying suitable tech freelancers, testing their skill level and matching them to client projects to providing relevant services to our freelance community. We have already made significant progress in automating these different steps, especially when it comes to assessing our community’s tech skills, automatically matching client projects with the best freelancers and identifying leading tech talent.
Our strong focus on tech allows us to help our clients faster and more effectively than others. With that approach we have already convinced leading European multinationals and tech companies including Daimler, Babbel and Delivery Hero.
Just recently we have announced one of our greatest achievements since our launch: having three global investors — Acton, Rocket Internet and SEEK — jointly invest €7M in our company for our Series A round.
PC: How will you use your recent funding round?
AH: The newly raised capital will be used to support our international growth ambitions as well as to further drive the automation of our products. We also wish to broaden our technical know-how so our platform can service new areas such as cybersecurity. Last but not least, our team will also be focusing on expanding our community offering and peer-to-peer engagement.
PC: What do you plan to achieve in the next 2–3 years?
AH: Closing our Series A was a great success but only the beginning of an exciting journey! In the next 2–3 years we will fully focus on expanding our tech community and on building a leading tech company in a space that is still mainly dominated by quite “manual” agencies.
We want to be known in the tech ecosystem for being a truly valuable partner in highly skilled freelancers’ professional careers and for offering the most enjoyable and solid technical assessment experience through our platform. That is the way we intend to expand our community globally in the years to come.
On the client side, we want to continue in our path to becoming the go-to trusted partner for both multinational corporates and tech companies when it comes to identifying and hiring the leading tech talent for their most innovative and complex tech projects. This will also, of course, continue to benefit our community greatly.
|
https://medium.com/petacrunch/expertlead-a-global-community-of-highly-qualified-tech-freelancers-1761ca632b46
|
['Kevin Hart']
|
2019-09-04 07:21:01.205000+00:00
|
['Freelance', 'Startup', 'Freelancers', 'Community', 'Freelancing']
|
5,537 |
Plug-in for Jira is live!
|
Vizydrop plug-in is available in Atlassian Marketplace.
We have integrated Vizydrop into the Atlassian ecosystem and are happy to announce the availability of our plugin. Go and get visual answers from your Jira data.
Predefined templates that are popular among users will help you get visual insights about your team’s progress.
Charts, pivot tables, facets.
Utilize all issue fields, projects, custom fields, changelog, status transitions, and work log.
Easy to use drag-n-drop editor with a user-friendly and comprehensive guide.
Report calculations powered by autocomplete help you modify visualizations like a pro.
All your reports can be organized into dashboards.
Filter data with your saved Jira filters or JQL, or control data using built-in filters.
Data browser with data reveal allows you to drill down into concrete issues with just a few clicks.
Share and export reports. Export, print or share by link with colleagues, friends, your mom, and the whole world.
Use popular apps as data sources and add custom sources. Create charts from files, links, Trello, Google Sheets, GitHub, and more.
Thank you for giving us a try. https://reports.vizydrop.com
|
https://medium.com/vizydrop/plug-in-for-jira-is-live-58e718f6bd8d
|
['Oleg Seriaga']
|
2019-10-09 12:14:37.611000+00:00
|
['Dashboard', 'Jira Plugins', 'Atlassian', 'Project Management', 'Jira Reports']
|
5,538 |
Trump Obsessive Syndrome Appears to be Widespread
|
Satire/Humor
Trump Obsessive Syndrome Appears to be Widespread
Psychiatrists debate cutting edge therapies
Photo by Tim Mossholder on Unsplash
When I woke up this morning, I expected my social media feed to be refreshingly free of Trump stories, memes, and one-liners.
I mean, the election is over, right?
So I was surprised to see no fewer than 56 articles with Trump in their headline.
Here are just a few of them:
Left suggests rounding up Trump supporters and sending them to Siberia
Trump supporters vow to form separate state
Democrats express amazement that Trump supporters aren’t rioting and burning cities
Writers vow to continue writing about Trump as long as they can find funny memes of him on Unsplash
These articles are just the tip of the iceberg. The sheer number of Trump stories one week post-election left me with two options. I could pull my comforter over my head and go back to sleep, hoping the whole thing would be over, or I could investigate this strange phenomenon.
Being the crack reporter that I am, I decided to investigate.
And this is what I discovered.
There is a new psychiatric disorder which is the polar opposite of Trump Derangement Syndrome. TDS, in case you didn’t know, is defined by Wikipedia as a term for criticism or negative reactions to Donald Trump that are perceived to be irrational.
The new disorder, according to the latest and most cutting-edge psychiatric journals, has been dubbed Trump Obsessive Syndrome. It is defined as an obsession, especially among writers, to focus constantly on Trump.
Writers who are victims of this disorder claim they can’t get Trump out of their minds. “It’s like something going round and round in my head,” one writer explained.
When they attempt to write about anything else, they aren’t passionate about it.
“I tried all day to write self-improvement listicles, and I kept seeing the orange man in my dreams. I finally had to give in to the urge and write about him.”
Another writer said, “Trump has provided me with my most successful material. I’ve been accepted to write Trump stories for Gen and Level and Forge, which is a degree of success I never thought to attain. If I stop writing about him, I’m back to self-publishing and getting three views on my stories.”
There is a certain degree of distress that appears to be synonymous with the syndrome. Writers worry that when Trump leaves the White House, they will have to become salespeople or form a failed startup instead of fulfilling their lifelong dream of being writers.
Psychologists are recommending extensive therapy.
“We start out by getting patients to focus on any color except orange,” one doctor explained. “Purple and blue are preferable.”
Another doctor said his therapeutic approach involves focus groups. “We have entire sessions where no one is allowed to mention Trump. Every time someone slips up and says his name, they are required to sit in the center of the circle wearing a MAGA hat.
“We expect this syndrome to fade away eventually,” the doctor continued. “But we’re growing concerned. Trump has elicited such an unprecedented level of emotion that it is hard for people to give up those feelings.”
This might be the reason several writers’ groups have gone underground to form a Recount the Ballots initiative.
Members of this underground group insist on remaining anonymous, but their reasoning goes something like this: If we can reverse the results of this election, we are guaranteed to always have something to write about.
But there is hope on the horizon if therapy doesn’t work. Pharmaceutical companies are already racing to come up with a vaccine.
|
https://medium.com/muddyum/trump-obsessive-syndrome-appears-to-be-widespread-36dceed03e39
|
['Bebe Nicholson']
|
2020-11-10 16:39:37.952000+00:00
|
['Elections', 'Humor', 'Writing', 'Politics', 'Satire']
|
5,539 |
How we built an easy-to-use image segmentation tool with transfer learning
|
How we built an easy-to-use image segmentation tool with transfer learning
Label images, predict new images, and visualize the neural network, all in a single Jupyter notebook (and share it all using Docker Hub!)
Authors: Jenny Huang, Ian Hunt-Isaak, William Palmer
GitHub Repo
Introduction
Training an image segmentation model on new images can be daunting, especially when you need to label your own data. To make this task easier and faster, we built a user-friendly tool that lets you build this entire process in a single Jupyter notebook. In the sections below, we will show you how our tool lets you:
Manually label your own images
Build an effective segmentation model through transfer learning
Visualize the model and its results
Share your project as a Docker image
The main benefits of this tool are that it is easy-to-use, all in one platform, and well-integrated with existing data science workflows. Through interactive widgets and command prompts, we built a user-friendly way to label images and train the model. On top of that, everything can run in a single Jupyter notebook, making it quick and easy to spin up a model, without much overhead. Lastly, by working in a Python environment and using standard libraries like Tensorflow and Matplotlib, this tool can be well-integrated into existing data science workflows, making it ideal for uses like scientific research.
For instance, in microbiology, it can be very useful to segment microscopy images of cells. However, tracking cells over time can easily result in the need to segment hundreds of images, which can be very difficult to do manually. In this article, we will use microscopy images of yeast cells as our dataset and show how we built our tool to differentiate between the background, mother cells, and daughter cells.
1. Labelling
There are many existing tools to create labelled masks for images, including Labelme, ImageJ, and even the graphics editor GIMP. While these are all great tools, they can’t be integrated within a Jupyter notebook, making them harder to use with many existing workflows. Fortunately, Jupyter Widgets make it easy for us to make interactive components and connect them with the rest of our Python code.
To create training masks in the notebook, we have two problems to solve:
Select parts of an image with a mouse
Easily switch between images and select the class to label
To solve the first problem, we used the Matplotlib widget backend and the built-in LassoSelector. The LassoSelector handles drawing a line to show what you are selecting, but we need a little bit of custom code to draw the masks as an overlay:
Class to manage a Lasso Selector for Matplotlib in a Jupyter notebook
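As a rough sketch of that custom code (the class below is our own simplification, assuming the Matplotlib widget backend and an image already displayed with imshow):

```python
import numpy as np
from matplotlib.path import Path
from matplotlib.widgets import LassoSelector

class MaskPainter:
    """Paint a class value into a mask wherever the user draws a lasso."""

    def __init__(self, ax, img, class_value=1):
        self.class_value = class_value
        self.mask = np.zeros(img.shape[:2], dtype=np.uint8)
        # Pixel coordinates to test against the lasso polygon
        yy, xx = np.mgrid[:img.shape[0], :img.shape[1]]
        self.pixels = np.vstack([xx.ravel(), yy.ravel()]).T
        # Draw the mask as a translucent overlay on top of the image
        self.overlay = ax.imshow(self.mask, alpha=0.4, vmin=0, vmax=3, cmap='tab10')
        self.lasso = LassoSelector(ax, onselect=self.on_select)

    def on_select(self, verts):
        inside = Path(verts).contains_points(self.pixels)
        self.mask.ravel()[inside] = self.class_value
        self.overlay.set_data(self.mask)
        self.overlay.figure.canvas.draw_idle()
```

One instance is created per displayed image; changing class_value switches which label gets painted.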
For the second problem, we added nice looking buttons and other controls using ipywidgets:
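A stripped-down version of those controls might look like this (the widget layout is simplified, and `painter` refers to the mask-painting object sketched above):

```python
import ipywidgets as widgets

class_picker = widgets.ToggleButtons(
    options=[('Background', 0), ('Mother cell', 1), ('Daughter cell', 2)],
    description='Class:',
)
image_slider = widgets.IntSlider(min=0, max=12, description='Image #')

def on_class_change(change):
    painter.class_value = change['new']  # switch which label the lasso paints

class_picker.observe(on_class_change, names='value')
widgets.VBox([class_picker, image_slider])
```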
We combined these elements (along with improvements like scroll to zoom) to make a single labelling controller object. Now we can take microscopy images of yeast and segment the mother cells and daughter cells:
Demo of lasso selection image labeler
You can check out the full object, which lets you scroll to zoom, right click to pan, and select multiple classes here.
Now we can label a small number of images in the notebook, save them into the correct folder structure, and start to train the CNN!
2. Model Training
The Model
U-net is a convolutional neural network that was initially designed to segment biomedical images but has been successful for many other types of images. It builds upon existing convolutional networks to work better with very few training images and make more precise segmentations. It is a state-of-the-art model that is also easy to implement using the segmentation_models library.
Image from https://arxiv.org/pdf/1505.04597.pdf
U-net is unique because it combines an encoder and a decoder using cross-connections (the gray arrows in the figure above). These skip connections cross from the same sized part in the downsampling path to the upsampling path. This creates awareness of the original pixels inputted into the model when you upsample, which has been shown to improve performance on segmentation tasks.
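To make the cross-connection concrete, here is a toy Keras snippet for one upsampling step (shapes and filter counts are illustrative, not our exact architecture):

```python
from tensorflow.keras import layers

def up_block(x, skip, filters):
    # Upsample, then concatenate the same-sized encoder feature map
    # (the gray "skip connection" arrow in the U-net figure above)
    x = layers.UpSampling2D()(x)
    x = layers.Concatenate()([x, skip])
    x = layers.Conv2D(filters, 3, padding='same', activation='relu')(x)
    return layers.Conv2D(filters, 3, padding='same', activation='relu')(x)
```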
As great as U-net is, it won’t work well if we don’t give it enough training examples. And given how tedious it is to manually segment images, we only manually labelled 13 images. With so few training examples, it seems impossible to train a neural network with millions of parameters. To overcome this, we need both Data Augmentation and Transfer Learning.
Data Augmentation
Naturally, if your model has a lot of parameters, you would need a proportional amount of training examples to get good performance. Using our small dataset of images and masks, we can create new images that will be as insightful and useful to our model as our original images.
How do we do that? We can flip the image, rotate it at an angle, scale it inward or outward, crop it, translate it, or even blur the image by adding noise, but most importantly, we can do a combination of those operations to create many new training examples.
Examples of augmented images
Image data augmentation has one more complication in segmentation compared to classification. For classification, you just need to augment the image as the label will remain the same (0 or 1 or 2…). However, for segmentation, the label (which is a mask) needs to also be transformed in sync with the image. To do this, we used the albumentations library with a custom data generator since, to our knowledge, the Keras ImageDataGenerator does not currently support the combination “Image + mask”.
Custom data generator for image segmentation using albumentations
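Condensed, the idea looks roughly like this (the transform list is a sample pipeline, not our exact one):

```python
import numpy as np
import albumentations as A
from tensorflow.keras.utils import Sequence

transform = A.Compose([
    A.HorizontalFlip(p=0.5),
    A.Rotate(limit=30, p=0.5),
    A.RandomBrightnessContrast(p=0.3),
])

class AugmentedPairs(Sequence):
    """Yields batches of augmented (image, mask) pairs for model.fit."""

    def __init__(self, images, masks, batch_size=8):
        self.images, self.masks, self.batch_size = images, masks, batch_size

    def __len__(self):
        return int(np.ceil(len(self.images) / self.batch_size))

    def __getitem__(self, i):
        sl = slice(i * self.batch_size, (i + 1) * self.batch_size)
        # albumentations transforms the image and its mask in sync
        pairs = [transform(image=im, mask=mk)
                 for im, mk in zip(self.images[sl], self.masks[sl])]
        return (np.stack([p['image'] for p in pairs]),
                np.stack([p['mask'] for p in pairs]))
```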
Transfer Learning
Even though we have now created 100 or more images, this still isn’t enough as the U-net model has more than 6 million parameters. This is where transfer learning comes into play.
Transfer Learning lets you take a model trained on one task and reuse it for another similar task. It reduces your training time drastically and more importantly, it can lead to effective models even with a small training set like ours. For example, neural networks like MobileNet, Inception, and DeepNet, learn a feature space, shapes, colors, texture, and more, by training on a great number of images. We can then transfer what was learned by taking these model weights and modifying them slightly to activate for patterns in our own training images.
Now how do we use transfer learning with U-net? We used the segmentation_models library to do this. You take the layers of a deep neural network of your choosing (MobileNet, Inception, ResNet), along with the parameters learned by training it on image classification (ImageNet), and use them as the first half (encoder) of your U-net. Then, you train the decoder layers with your own augmented dataset.
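With the segmentation_models library, wiring up the pretrained encoder takes just a few lines (a sketch; our exact backbone and settings may differ):

```python
import segmentation_models as sm

model = sm.Unet(
    'mobilenetv2',               # pretrained network used as the encoder half
    encoder_weights='imagenet',  # reuse weights learned on image classification
    encoder_freeze=True,         # train only the decoder on our small dataset
    classes=3,                   # background, mother cell, daughter cell
    activation='softmax',
)
```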
Putting it Together
We put this all together in a Segmentation model class that you can find here. When creating your model object, you get an interactive command prompt where you can customize aspects of your U-net like the loss function, backbone, and more:
Segmentation model customization demo
After 30 epochs of training, we achieved 95% accuracy. Note that it is important to choose a good loss function. We first tried cross-entropy loss, but the model was unable to distinguish between the similar looking mother and daughter cells and had poor performance due to the class imbalance of seeing many more non-yeast pixels than yeast pixels. We found that using dice loss gave us much better results. The dice loss is linked to the Intersection over Union Score (IOU) and is usually better adapted to segmentation tasks as it gives incentive to maximize the overlap between the predicted and ground truth masks.
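In code, swapping in the dice loss is a small change (a sketch; `AugmentedPairs` is the generator sketched earlier and the training details are simplified):

```python
import segmentation_models as sm

model.compile(
    optimizer='adam',
    loss=sm.losses.DiceLoss(),        # rewards overlap with the ground truth mask
    metrics=[sm.metrics.IOUScore()],  # the related Intersection over Union score
)
model.fit(AugmentedPairs(train_images, train_masks), epochs=30)
```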
Example predictions by our model compared to true masks
3. Visualization
Now that our model is trained, let’s use some visualization techniques to see how it works. We follow Ankit Paliwal’s tutorial to do so. You can find the implementation in his corresponding GitHub repository. In this section, we will visualize two of his techniques, Intermediate Layer Activations and Heatmaps of Class Activations, on our yeast cell segmentation model.
Intermediate Layer Activations
This first technique shows the output of intermediate layers in a forward pass of the network on a test image. This lets us see what features of the input image are highlighted at each layer. After inputting a test image, we visualized the first few outputs for some convolutional layers in our network:
Outputs for some encoder layers
Outputs for some decoder layers
In the encoder layers, filters close to the input detect more detail and those close to the output of the model detect more general features, which is to be expected. In the decoder layers, we see the opposite pattern, of going from abstract to more specific details, which is also to be expected.
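For reference, intermediate activations like these can be pulled out of a trained Keras model with a small helper (a sketch; which layers to inspect is up to you):

```python
from tensorflow.keras.models import Model

# Build a model that returns the outputs of the first few conv layers
conv_layers = [l for l in model.layers if 'conv' in l.name][:8]
activation_model = Model(inputs=model.input,
                         outputs=[l.output for l in conv_layers])
activations = activation_model.predict(test_image[None, ...])  # add a batch axis
```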
Heatmaps of Class Activations
Next, we look at class activation maps. These heat maps let you see how important each location of the image is for predicting an output class. Here, we visualize the final layer of our yeast cell model, since the class prediction label will largely depend on it.
Heatmaps of class activations on a few sample images
We see from the heat maps that the cell locations are correctly activated, along with parts of the image border, which is somewhat surprising.
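A class activation heatmap can be computed along these lines with tf.GradientTape (a Grad-CAM-style sketch; the layer name 'final_conv' is hypothetical):

```python
import tensorflow as tf

grad_model = tf.keras.models.Model(
    model.input, [model.get_layer('final_conv').output, model.output])

with tf.GradientTape() as tape:
    conv_out, preds = grad_model(test_image[None, ...])
    class_score = tf.reduce_sum(preds[..., 1])  # e.g. the mother-cell channel

grads = tape.gradient(class_score, conv_out)
weights = tf.reduce_mean(grads, axis=(0, 1, 2))  # mean gradient per filter
heatmap = tf.nn.relu(tf.reduce_sum(conv_out[0] * weights, axis=-1))
```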
We also looked at the last technique in the tutorial, which shows what images each convolutional filter maximally responds to, but the visualizations were not very informative for our specific yeast cell model.
4. Making and Sharing a Docker Image
Finding an awesome model and trying to run it, only to find that it doesn’t work in your environment due to mysterious dependency issues, is very frustrating. We addressed this by creating a Docker image for our tool. This allows us to completely define the environment that the code is run in, all the way down to the operating system. For this project, we based our Docker image off of the jupyter/tensorflow-notebook image from Jupyter Docker Stacks. Then we just added a few lines to install the libraries we needed and to copy the contents of our GitHub repository into the Docker image. If you’re curious, you can see our final Dockerfile here. Finally, we pushed this image to Docker Hub for easy distribution. You can try it out by running:
sudo docker run -p 8888:8888 ianhuntisaak/ac295-final-project:v3 \
-e JUPYTER_LAB_ENABLE=yes
Conclusion and Future Work
This tool lets you easily train a segmentation model on new images in a user-friendly way. While it works, there is still room for improvement in usability, customization, and model performance. In the future, we hope to:
Improve the lasso tool by building a custom Jupyter Widget using the html5 canvas to reduce lag when manually segmenting
Explore new loss functions and models (like this U-net pre-trained on broad nucleus dataset) as a basis for transfer learning
Make it easier to interpret visualizations and suggest methods of improving the results to the user
Acknowledgements
We would like to thank our professor Pavlos Protopapas and the Harvard Applied Computation 295 course teaching staff for their guidance and support.
|
https://towardsdatascience.com/how-we-built-an-easy-to-use-image-segmentation-tool-with-transfer-learning-546efb6ae98
|
['Jenny Huang']
|
2020-08-06 00:11:03.936000+00:00
|
['Transfer Learning', 'Visualization', 'Image Segmentation', 'Unet', 'Editors Pick']
|
5,540 |
How to Be Productive Without Being a Jerk
|
How to Be Productive Without Being a Jerk
Efficiency with people is ineffective.
Photo by Şahin Yeşilyaprak on Unsplash
Our 7-year-old daughter called her older sister a jerk the other day. It wasn’t the nicest thing to say, but the label was accurate at the time and the moment was actually kinda’ funny (I laughed on the inside because I don’t want to encourage our children to say mean things to each other lol).
I’ve been writing a lot of content for various publications, projects, and clients lately. If you’re a writer, then you understand that writing takes a lot of focus. I’ve also had more virtual meetings with my team, clients, and prospects these days. With three children at home, distractions can come quickly and often. Once in a while, I have to gently pry our 5-year-old son from my arm when I’m trying to write or participate in a Zoom meeting.
All of this got me thinking, “Have I been a productive jerk?”
The honest answer is “Yes, sometimes.” But for the most part, I think I’ve been good at giving the people in my life their needed attention while remaining productive. Our family is full of creative people and I want to be a good example of how to be creative, productive, attentive, and loving.
Allow me to share my tips on how to be productive without being a jerk.
|
https://medium.com/inspirefirst/how-to-be-productive-without-being-a-jerk-7fb114d61a85
|
['Chris Craft']
|
2020-08-20 17:49:23.304000+00:00
|
['Self Improvement', 'Self', 'Advice', 'Productivity', 'Life']
|
5,541 |
Reconciling the Differences in Our Data: A Mixed-Methods Research Story
|
We all love it when the quant and qual align, but what about those other times, when they seem at odds? For example: the surveys are in, the clickstream data has been analyzed, and you’re feeling confident. Then as you compare notes with your teammates, you realize that the recommended next steps based on UX research and data science are poised to send the business in two very different directions.
This was the challenge we found ourselves working to resolve as a user researcher and data scientist with Customer Insights Research supporting the same product team at Microsoft. What seemed like a conflict ended up leading us to deeper insights, a closer working relationship, and a better outcome for our customers—but getting there was a multi-step process.
Step 1. Confront your data discrepancy
Our product team was sunsetting the older version of an app in favor of one that provides accessibility for all users. To help our stakeholders understand what our customers needed in the new version, researchers had conducted user studies, interviews, and surveys as well as analyzing in-app feedback. Caryn, a researcher, was listening to what our customers were saying: they told us too many of the features they enjoyed in the older app were missing from the new app.
The user research recommendation, based on this analysis? Fill the feature gaps from the older app or customers will not transition over.
Meanwhile, Sera, a data scientist, conducted a cohort analysis with clickstream data to understand what our customers were doing in the older version of the app and how that impacted their transition to the new version. Based on the qualitative feedback, she expected to see customers who used features only available in the older app abandoning the new app. But the analysis showed that they weren’t.
The data science recommendation at this stage? Since customer retention in the new app doesn’t correlate with feature use in the older app, focus on other vital parts of the user journey to help people transition.
Research and data science had arrived at opposing suggestions. Now what?
Step 2. Resist the urge to champion your own data
At this stage, it could have been easy to each double down on our opposing viewpoints. If we’d presented the results, asking our general program manager to choose between recommendations, at least one of us would have the satisfaction of knowing we influenced the product. But how could our stakeholders and leaders be confident they were making the best data-driven decision, if we forced them to choose between quant and qual?
In a way, mixed-methods research is an exercise in getting comfortable with conflict and finding reconciliation instead of a “winner.” Happily, we each realized this and resisted the urge to champion our own perspective. We asked for the time we needed to investigate further, and our product team accommodated.
|
https://medium.com/microsoft-design/reconciling-the-differences-in-our-data-a-mixed-methods-research-story-6c1a2fe2f9f4
|
['Caryn Kieszling']
|
2019-12-31 19:14:29.344000+00:00
|
['Research And Insight', 'Design', 'UX', 'Data Science', 'Microsoft']
|
5,542 |
Scraping A to Z of Amazon using Scrapy
|
Scrapy is a fast, open-source web crawling framework written in Python, used to extract data from web pages with the help of selectors based on XPath.
In this article, we will look at how to use Scrapy to scrape all of a product's Amazon reviews using just its URL, and automatically store all the scraped data in a JSON file within seconds.
|
https://medium.com/analytics-vidhya/web-scraping-a-to-z-using-scrapy-6ece8b303793
|
['Rohan Goel']
|
2020-07-21 11:43:55.775000+00:00
|
['NLP', 'Scrapy', 'Data Science', 'Amazon', 'Web Scraping']
|
5,543 |
Journalism In Dark Times
|
Fifteen years ago, I published my first news piece in a print magazine. After that, I went on a long journey discovering and working in diverse fields including blogging, citizen journalism, campaigning, translating, producing and managing. Some roads were bumpy, while in others I found myself, and these became a launchpad for some successful media initiatives.
However, working in independent media in the Arab world has become increasingly more difficult, especially since the counter-revolutions began to gain strength in 2013.
Counter-revolutions have had a profound effect on the media industry, both in the countries of the Arab Spring and across the wider Arab world.
Security authorities have come to realize the power of the media and its impact on public opinion, as illustrated notably in 2011, when social media platforms were successfully used to mobilize people in protests, resulting in a political transition in a number of Arab countries. At the time, many television networks were prompted to change their policies and give more airtime to young voices. By 2013, things had completely changed.
Pre-2011, social media platform users in the Arab world were mostly young people who belonged to what can be classified as a rising middle class. However, following the 2011 uprisings, the general Arab public increasingly signed up to those platforms and started following them closely. This led to a significant change in the nature of discussions on those platforms. The new users came from different age groups and backgrounds, and Facebook, among other platforms, ceased to be a safe space to hold political discussions or start human rights campaigns.
Facebook debates turned into social confrontations that could land people in jail — something that has happened to many Egyptians who were simply expressing their views about current events in their country. Furthermore, many Egyptian journalists were arrested for doing their jobs, bringing Egypt up to the shameful third place in the world ranking of countries with journalists behind bars, after China and Turkey.
Mahmoud Abou Zeid, known as Shawkan, was finally released after spending more than five years in prison on trumped-up charges — Amnesty International
At the same time, the Egyptian State launched a crackdown on independent media outlets. Hundreds of websites have been blocked in Egypt and journalists have been demonized, portrayed as working for foreign entities and betraying their country. These actions have affected the personal security of all journalists.
The Egyptian government also moved to establish a number of companies with ties to state security agencies and the intelligence service. These new companies then acquired many television networks and news websites, which led to identical news coverage on all of the outlets.
The 2016 U.S. presidential election laid bare the role played by social media in making fake news go viral, which in turn prompted social media platforms to work on adapting their algorithms such that less news would be posted in news feeds, instead favoring a higher proportion of posts from friends and family. This greatly affected the independent media industry in Egypt, most of which had already fled from traditional news websites to social media networks in an attempt to reach the public.
The Egyptian State does not allow for an independent media, and constantly seeks to hinder any funding for institutions supporting independent media by drafting legislation aimed at paralyzing civil society. Alternative methods like social media outlets are also facing a crisis, not to mention the numerous risks faced by everyone involved in media.
How to solve this dilemma?
This is what I am trying to answer in my journey as a fellow in the Tow-Knight Entrepreneurial Journalism program at the Craig Newmark Graduate School of Journalism at CUNY.
The 2019 Tow-Knight Entrepreneurial Journalism Cohort. Not pictured: Emiliana Garcia. Photo by Skyler Reid
Independent journalism in the Arab world has generally kept to traditional means of publishing its content, such as text-based news and multimedia. Independent initiatives have not sufficiently explored innovative ways of doing their critical jobs in these tough years.
Al Jazeera’s AJ+ has greatly impacted the news industry both globally and in the Arab world. The digital consumer has become more interested in video than text-based content. However, a wide-scale investment in digital news has not happened yet.
Chatbots, Telegram groups and Instagram accounts have provided new tools for publishing content. For example, Iran is a country where Telegram and Instagram are widely used, and Telegram was employed during the 2017–18 protest against the regime to circumvent governmental obstruction, enabling protesters to coordinate and to inform the world about events in the country. Similar ways of using new tools will give greater chances for independent media to reach wider audiences.
The dependence of such initiatives on a small number of donor NGOs has, however, contributed to limiting the chances for discovering new tools in the media industry and in seeking out funding.
I’m looking for solutions to this complex dilemma by aiming to create a new model of non-profit journalism based on grants and individual donations. This model would ideally be able to reach an audience of millions using new tools that can bypass governmental obstruction. These restrictions may have succeeded so far in disrupting journalism in the Arab world, but cannot obstruct journalism forever.
|
https://medium.com/journalism-innovation/journalism-in-dark-times-dacf8a0e3bd9
|
['Abdelrahman Mansour']
|
2019-03-20 16:48:50.893000+00:00
|
['Journalism', 'Innovation', 'Technology', 'Human Rights', 'Media']
|
5,544 |
Implementing full-text search in Apache Pinot
|
Apache Pinot is a real-time distributed OLAP datastore, built to deliver scalable real-time analytics with low latency.
Pinot supports super-fast query processing through its indexes on non-BLOB-like columns. Queries with exact match filters are run efficiently through a combination of dictionary encoding, inverted index and sorted index. However, arbitrary text search queries cannot leverage indexes and require a full table scan.
In this post, we will discuss newly added support for text indexes in Pinot and how they can be used for efficient full-text search queries.
Let’s take a few examples to understand this better.
Exact match with scan
SELECT COUNT(*) FROM MyTable WHERE firstName = 'John'
In the above query, we are doing an exact match on the firstName column, which doesn’t have an index. The execution engine will find the matching docIds (aka rowIds) by scanning the column's forward index and comparing each value against 'John'.
Exact match with inverted index
If there is an inverted index on the firstName column, the dictionaryId will be used to look up the inverted index instead of scanning the forward index.
Exact match with sorted index
If the table is sorted on column firstName, we use the dictionaryId to look up the sorted index and get the start and end docIds of all rows that have the value “John”.
The following graph shows latencies for exact match queries with and without index on a dataset of 500 million rows and selectivity (number of rows that passed the filter) of 150 million.
Text search with scan
What if the user is interested in doing arbitrary text search? Pinot currently supports this through the in-built function REGEXP_LIKE.
SELECT COUNT(*) FROM MyTable WHERE REGEXP_LIKE(firstName, 'John*')
The predicate is a regular expression (prefix query). Unlike exact matches, indexes can’t be used to evaluate the regex filter, and we resort to a full table scan. For every raw value, pattern matching is done with the regex 'John*' to find the matching docIds.
Text search with index
For arbitrary text data which falls into the BLOB/CLOB territory, we need more than exact matches. Users are interested in doing regex, phrase and fuzzy queries on BLOB like data. As we just saw, REGEXP_LIKE is very inefficient since it uses a full-table scan. Secondly, it doesn’t support fuzzy searches.
In Pinot 0.3.0, we added support for text indexes to efficiently do arbitrary text search on STRING columns where each column value is a BLOB of heterogeneous text. Doing standard filter operations (equality, range, between, in) doesn’t fit the bill on textual data.
Text search can be done in Pinot using the new in-built function TEXT_MATCH.
SELECT COUNT(*) FROM Foo
WHERE TEXT_MATCH(<column_name>, <search_expression>)
With support for text indexes, let’s compare the performance of text search query with and without index on a dataset of 500 million rows and filter selectivity of 150 million.
Text Indexing Problem
Like other database indexes, the goal of a text index is efficient filter processing to improve query performance.
To support regular expression, phrase and fuzzy searches efficiently, the text index data structure should have a few fundamental building blocks to store key pieces of information.
Dictionary
Dictionary maps each indexed term (word) to the corresponding dictionaryId to allow efficient comparison using fixed-width integer codes as compared to raw values.
Inverted Index
Inverted index maps dictionaryId of each indexed term to the corresponding docId. Exact match term queries (single or multiple terms) can be answered efficiently through dictionary and inverted index.
Position Information
Phrase queries (e.g find documents matching the phrase “machine learning”) are an extension of exact term queries where the terms should be in the exact same order in the matching documents. These queries need position information along with the dictionary and inverted index.
Automata for regex and fuzzy queries
Regex queries (including prefix, wildcard) and fuzzy queries will require a comparison with each and every term in the dictionary unless the prefix is fixed. There has to be a way to prune the comparisons.
If we can represent the input regular expression as a deterministic finite state machine that accepts a set of terms, then we can use the state machine in conjunction with the dictionary to get the dictionaryIds of all the matching terms accepted by the state machine.
Fuzzy edit distance search can also be done efficiently by representing the query as a state machine based on Levenshtein automata and intersecting the automata with the dictionary.
As discussed earlier, Pinot’s dictionary and inverted index can help answer exact match term queries efficiently. However, phrase, regex, wildcard, prefix and fuzzy queries require position information and finite state automata, which are currently not maintained in Pinot.
We learned that Apache Lucene has the necessary missing pieces and decided to use it for supporting full-text search in Pinot until we enhance our index structures.
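For instance, Lucene exposes these automaton-backed lookups directly through its query classes. The sketch below is purely illustrative (the field name "text" is a placeholder, not something from Pinot's internals):

import org.apache.lucene.index.Term;
import org.apache.lucene.search.FuzzyQuery;
import org.apache.lucene.search.PrefixQuery;
import org.apache.lucene.search.Query;
import org.apache.lucene.search.RegexpQuery;

public class AutomatonQueries {
    public static void main(String[] args) {
        // Regex query: compiled into a deterministic automaton that is
        // intersected with the term dictionary, pruning the comparisons.
        Query regex = new RegexpQuery(new Term("text", "john.*"));
        // Prefix query: a special case of the regex automaton.
        Query prefix = new PrefixQuery(new Term("text", "john"));
        // Fuzzy query: a Levenshtein automaton accepting every term within
        // edit distance 1 of "machne" (so it matches "machine").
        Query fuzzy = new FuzzyQuery(new Term("text", "machne"), 1);
        System.out.println(regex + " | " + prefix + " | " + fuzzy);
    }
}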
Creating Text Indexes in Pinot
Let’s discuss creation of Lucene text indexes in Pinot as a series of key design decisions and challenges.
Text index per column
Pinot’s table storage format is columnar. Index structures (forward, inverted, sorted, dictionary) for the table are also created on a per column, per segment (shard) basis. For text index, we decided to stick with this fundamental design for the following reasons:
Evolution and maintenance is easier. A user has the freedom to enable or disable text indexing on each column of the table.
Our performance experiments revealed that creating a global index in Lucene across all text index enabled columns for the table hurts performance. A global index is larger than a per-column index, which increases the search time.
Text Index Format
Like other indexes, a text index is created as part of Pinot segment creation. For each row in the table, we take the value of the column that has text indexing enabled and encapsulate it in a document.
The document comprises two fields:
Text field — contains the actual column value (docValue) representing the body of text that should be indexed.
Stored field — contains a monotonically increasing docId counter to reverse map each document indexed in Lucene back to its docId (rowId) in Pinot. This field is not tokenized or indexed; it is simply stored inside Lucene.
Storing Pinot DocId in Lucene Document
The stored field is critical. For each document added to the text index, Lucene assigns a monotonically increasing docId to the document. Later the search operation on the index returns a list of matching docIds.
A Lucene index is composed of multiple independent sub-indexes called segments (not to be confused with Pinot segments). Each Lucene sub-index is an independent, self-contained index. Depending on the size of the data indexed and how often the in-memory documents from the index are flushed to the on-disk representation, a single text index can consist of multiple sub-indexes.
The key thing to note here is that Lucene’s internal docIds are relative to each sub-index. This can lead to situations where a document added to the text index for a given row in the Pinot table does not have the same Lucene docId as its Pinot docId.
For a query that has a text search filter, this will lead to incorrect results since our execution engine (filter processing, index lookup etc) is based around the docIds. So we need to uniquely associate each document added to the Lucene index with the corresponding Pinot docId. This is why StoredField is used as the second field in the document.
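A minimal sketch of this per-row document construction is shown below; the field names "text" and "pinotDocId" are illustrative placeholders, not Pinot's actual internal constants:

import org.apache.lucene.document.Document;
import org.apache.lucene.document.Field;
import org.apache.lucene.document.StoredField;
import org.apache.lucene.document.TextField;
import org.apache.lucene.index.IndexWriter;

public class TextIndexDocBuilder {
    // Adds one column value to the Lucene index, tagging it with the
    // Pinot docId so that search hits can be mapped back to Pinot rows.
    static void addToTextIndex(IndexWriter writer, String docValue, int pinotDocId)
            throws java.io.IOException {
        Document doc = new Document();
        // Text field: tokenized and indexed; holds the column value.
        doc.add(new TextField("text", docValue, Field.Store.NO));
        // Stored field: not tokenized or indexed, only stored, so the
        // Pinot docId can be read back from each matching Lucene document.
        doc.add(new StoredField("pinotDocId", pinotDocId));
        writer.addDocument(doc);
    }
}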
Text Analyzer
Plain text is used as input for index generation. An analyzer performs pre-processing steps on the provided input text.
Lower-cases the text.
Breaks text into indexable and searchable tokens/terms.
Prunes stop words (a, an, the, or, etc.).
We currently use StandardAnalyzer, which is good enough for standard English alphanumeric text and uses the Unicode text segmentation algorithm to break text into tokens. The analyzer is also used during query execution when searching the text index.
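To see what the analyzer emits, one can run a sample value through it and print the resulting tokens. This is a small standalone demo, not Pinot code; the exact token set depends on the Lucene version's default stop-word handling:

import java.io.IOException;
import org.apache.lucene.analysis.Analyzer;
import org.apache.lucene.analysis.TokenStream;
import org.apache.lucene.analysis.standard.StandardAnalyzer;
import org.apache.lucene.analysis.tokenattributes.CharTermAttribute;

public class AnalyzerDemo {
    public static void main(String[] args) throws IOException {
        Analyzer analyzer = new StandardAnalyzer();
        // Tokenize a sample value the way it would be indexed.
        try (TokenStream ts = analyzer.tokenStream("text", "Implementing Full-Text Search in Apache Pinot")) {
            CharTermAttribute term = ts.addAttribute(CharTermAttribute.class);
            ts.reset();
            while (ts.incrementToken()) {
                // Prints lower-cased terms, e.g. implementing, full, text, ...
                System.out.println(term.toString());
            }
            ts.end();
        }
    }
}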
Text Index Creation for both Offline and Real-time
Pinot supports ingesting and querying data in real-time. Text indexes are supported for offline, real-time and hybrid Pinot tables.
IndexWriter is used to create text indexes. It buffers the documents in memory and periodically flushes them to the on-disk Lucene index directory. However, the data is not visible to IndexReader (used on the search query path) until the writer commits and closes the index, which fsyncs the index directory and makes the index data available to the reader.
IndexReader always looks at a point-in-time snapshot (of committed data) of the index as of the time reader was opened from the index directory. This works well for offline tables since offline Pinot segments don’t serve data for queries until fully created and are immutable once created. The text index is created during pinot segment generation and is ready to serve data for queries after the segment is fully built and loaded (memory mapped) on Pinot servers. Thus the text index reader on the query path always looks at the entire data of a segment for offline tables.
However, the above approach will not work for real-time or hybrid Pinot tables since these tables can be queried while data is being consumed. This requires the ability to search the text index on the query path as the IndexWriter is in progress with uncommitted changes. Further sections will discuss the query execution for both offline and real-time in detail.
Querying Text Indexes in Pinot
We enhanced our query parser and execution engine with a new in-built function text_match() to be used in WHERE clause of the queries. The syntax is:
TEXT_MATCH(<columnName>, <searchExpression>)
columnName: Name of the column to do text search on.
Name of the column to do text search on. searchExpression: search query in accordance with Lucene query syntax.
Let’s take an example of a query log file and resume file:
Store the query log and resume text in two STRING columns in a Pinot table.
Create text indexes on both columns.
We can now do different kinds of text analysis on the query log and resume data:
Count the number of group by queries that have a between filter on timecol:
SELECT count(*) FROM MyTable
WHERE text_match(logCol, '\"timecol between\" AND \"group by\"')
Count the number of candidates that have “machine learning” and “gpu processing”:
SELECT count(*) FROM MyTable
WHERE text_match(resume, '\"machine learning\" AND \"gpu processing\"')
Please see the user docs for an extensive guide on different kinds of text search queries and how to write search expressions.
Creating Text Index Reader for Offline Pinot Segments
Text index is created in a directory by IndexWriter as part of pinot segment generation. When the pinot servers load (memory map) the offline segment, we create an IndexReader which memory-maps the text index directory. An instance of IndexReader and IndexSearcher is created once per table segment per column with text index.
We chose to go with MMapDirectory instead of RAMDirectory since the former uses efficient memory mapped I/O and generates less garbage. RAMDirectory can be very efficient for small memory-resident indexes but increases the heap overhead significantly.
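A rough sketch of loading such a reader, with paths and error handling simplified:

import java.io.IOException;
import java.nio.file.Paths;
import org.apache.lucene.index.DirectoryReader;
import org.apache.lucene.index.IndexReader;
import org.apache.lucene.search.IndexSearcher;
import org.apache.lucene.store.MMapDirectory;

public class OfflineTextIndexReader {
    // Memory-maps an on-disk text index directory and wraps it in a
    // searcher; this happens once per segment per text-index column.
    static IndexSearcher openSearcher(String indexDir) throws IOException {
        MMapDirectory directory = new MMapDirectory(Paths.get(indexDir));
        IndexReader reader = DirectoryReader.open(directory);
        return new IndexSearcher(reader);
    }
}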
Text Filter Execution
The following diagram depicts segment-level execution for the following text search query:
SELECT count(*) from Table
WHERE text_match(textCol1, expression1)
AND text_match(textCol2, expression2)
Creating Text Index Reader for Realtime Pinot Segments
Text indexes in realtime Pinot segments can be queried while the data is being consumed. Lucene supports NRT (near real-time) search by allowing a reader to be opened from a live writer, thereby letting the reader see all the uncommitted index data from the writer. However, just like any other index reader in Lucene, the NRT reader is also a snapshot reader. So the NRT reader has to be reopened periodically to see the incremental changes made by the live index writer.
Our real-time text index reader also acts as a writer since it is both adding documents to the index as part of real-time segment consumption and being used by the Pinot query threads.
During Pinot server startup, we create a single background thread. The thread maintains a global circular queue of real-time segments across all tables.
The thread wakes up after a configurable threshold, polls the queue to get a realtime segment and refreshes the index searcher of the real-time reader for each column that has a text index.
How often should the refresh happen?
Deciding the configurable threshold between successive refreshes by the background thread is something that should be tuned based on the requirements.
If the threshold is low, we refresh often and queries with text_match filter(s) on consuming segments will get to see the new rows quickly. The downside is lots of small I/Os since refreshing the text index reader requires a flush from the live writer.
If the threshold is high, we flush less often which increases the lag between the time a row was added to the consuming segment’s text index and appears in search results of the query with text_match filter.
It is a trade-off between consistency and performance.
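With stock Lucene, the same refresh pattern can be expressed with a SearcherManager, sketched below assuming a recent Lucene version. This is only an illustration of the NRT loop; Pinot's actual implementation uses its own background thread and circular queue as described above:

import java.io.IOException;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;
import org.apache.lucene.index.IndexWriter;
import org.apache.lucene.search.SearcherManager;

public class NrtRefreshSketch {
    static SearcherManager startNrtRefresh(IndexWriter liveWriter, long refreshMillis)
            throws IOException {
        // NRT reader opened from the live writer: it sees uncommitted documents.
        SearcherManager manager = new SearcherManager(liveWriter, null);
        ScheduledExecutorService refresher = Executors.newSingleThreadScheduledExecutor();
        // Reopen the snapshot periodically so queries see newly consumed rows;
        // refreshMillis is the consistency-vs-I/O threshold discussed above.
        refresher.scheduleWithFixedDelay(() -> {
            try {
                manager.maybeRefresh();
            } catch (IOException e) {
                // Log and continue; the next tick will retry.
            }
        }, refreshMillis, refreshMillis, TimeUnit.MILLISECONDS);
        return manager;
    }
}

Query threads would then call manager.acquire() to get the latest snapshot searcher and manager.release() once done.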
Key Optimizations
So far, we discussed how text index is created and queried in Pinot. We also talked about a few design decisions and challenges. Now, let’s discuss details on optimizations we implemented to get the desired functionality and performance.
Using Collector
For a search query, Lucene’s default behavior is to do scoring and ranking. The result of the call to indexSearcher.search() is TopDocs, which represents the top N hits of the query sorted by score descending. In Pinot, we currently don’t need any of the scoring and ranking features. We are simply interested in retrieving all the matched docIds for a given text search query.
Our initial experiments revealed that the default search code path in Lucene results in significant heap overhead since it uses a PriorityQueue in TopScoreDocCollector. Secondly, the heap overhead increases with the increase in the number of matching documents.
We implemented the Collector interface to provide a simple callback to indexSearcher.search(query, collector) operation. For every matching Lucene docId, Lucene calls our collector callback which stores the docId in a bitmap.
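A sketch of such a collector, assuming a Lucene 8+ API and the RoaringBitmap library that Pinot already uses for its bitmaps:

import org.apache.lucene.index.LeafReaderContext;
import org.apache.lucene.search.ScoreMode;
import org.apache.lucene.search.SimpleCollector;
import org.roaringbitmap.RoaringBitmap;

// Collects matching Lucene docIds into a bitmap, bypassing all of
// Lucene's scoring and ranking machinery.
public class DocIdCollector extends SimpleCollector {
    private final RoaringBitmap docIds = new RoaringBitmap();
    private int docBase; // docId offset of the current Lucene sub-index

    @Override
    public void collect(int doc) {
        // Lucene passes sub-index-relative docIds; add the base offset
        // to get the index-wide Lucene docId.
        docIds.add(docBase + doc);
    }

    @Override
    protected void doSetNextReader(LeafReaderContext context) {
        docBase = context.docBase;
    }

    @Override
    public ScoreMode scoreMode() {
        return ScoreMode.COMPLETE_NO_SCORES; // no scores are computed
    }

    public RoaringBitmap getDocIds() {
        return docIds;
    }
}

Passing an instance to indexSearcher.search(query, collector) fills the bitmap with every matching Lucene docId.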
Pruning Stop Words
Text documents are very likely to have common english words like a, an, the, or etc. These are known as stop-words. Stop words are typically never used in text analysis but due to their high occurrence frequency, index size can explode which consequently hurts query performance. We can customize the Analyzer to create custom token filters for the input text. The filtering process in the analyzer prunes all the stop words while building the index.
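A minimal custom analyzer along these lines, assuming the Lucene 7+ package layout; the stop set here is a tiny illustrative sample rather than a full list:

import java.util.Arrays;
import org.apache.lucene.analysis.Analyzer;
import org.apache.lucene.analysis.CharArraySet;
import org.apache.lucene.analysis.LowerCaseFilter;
import org.apache.lucene.analysis.StopFilter;
import org.apache.lucene.analysis.TokenStream;
import org.apache.lucene.analysis.standard.StandardTokenizer;

// Lower-cases tokens and drops stop words while building the index.
public class StopWordPruningAnalyzer extends Analyzer {
    private static final CharArraySet STOP_WORDS =
            new CharArraySet(Arrays.asList("a", "an", "the", "or"), true);

    @Override
    protected TokenStreamComponents createComponents(String fieldName) {
        StandardTokenizer source = new StandardTokenizer();
        TokenStream filtered = new LowerCaseFilter(source);
        filtered = new StopFilter(filtered, STOP_WORDS);
        return new TokenStreamComponents(source, filtered);
    }
}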
Using a pre-built mapping of Lucene docId to Pinot docId
As discussed above, there is a strong need to store Pinot docId in every document added to the Lucene index. This results in a two-pass query execution:
The search operation returns a bitmap of matching Lucene docIds.
Iterate over each docId to get the corresponding document, and retrieve the Pinot docId from the document.
Retrieving the entire document from Lucene was a CPU hog and became a major bottleneck in throughput testing. To avoid this, we iterate the text index once to fetch all <Lucene docId, Pinot docId> mappings and write them to a memory-mapped file.
Since the text index for offline segments is immutable, this works well as we pay the cost of retrieving the entire document just once when the server loads the text index. The mapping file is later used during query execution by the collector callback to short-circuit the search path and directly construct a result bitmap of pinot docIds.
This optimization along with pruning the stop-words gave us 40–50x improvement in query performance by allowing the latency to scale with increase in QPS. The following graph compares the latency before and after this optimization.
Disable Lucene Query Result Cache
Lucene has a cache to boost performance for queries with repeatable text-search expressions. While the performance improvement is noticeable, cache increases the heap overhead. We decided to disable it by default and let the user enable (if need be) on a per text index basis.
Use compound file format
Lucene’s on-disk index structures are stored in multiple files. Consider the case of 2000 table segments on a Pinot server, each Pinot table segment having a text index on 3 columns with 10 files per text index. We are looking at 60k open file handles. It is very likely for the system to run into the “too many open files” problem.
So the IndexWriter uses Lucene’s compound file format. Secondly, when the text index is fully built for a column, we force merge the multiple Lucene sub-indexes (which are also referred to as segments in Lucene terminology) into a single index.
Configure in-memory buffer threshold
As documents are added to the text index during Pinot segment generation, they are buffered in memory and periodically flushed to the on-disk structures in the index directory. The default Lucene behavior is to flush after memory usage has reached 16MB. We experimented with this value and made some observations:
A flush results in a Lucene segment. As more of these are created, Lucene can decide to merge few/all of them in the background. Having multiple such segments increases the number of files.
Having a default threshold of 16MB doesn’t strictly mean the index writer will consume 16MB of heap before flushing. The actual consumption is much higher (around 100MB), presumably because in Java there is no good way to programmatically keep track of the amount of heap memory used.
Smaller thresholds result in a large number of small I/Os as opposed to fewer big I/Os. We decided to keep this value configurable and chose 256MB as the default to keep a good balance between memory overhead and the number of I/Os; a configuration sketch follows below.
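Putting the compound-file and buffer-size settings together, a configuration sketch (not Pinot's actual code) might look like this:

import java.io.IOException;
import org.apache.lucene.analysis.Analyzer;
import org.apache.lucene.index.IndexWriter;
import org.apache.lucene.index.IndexWriterConfig;
import org.apache.lucene.store.Directory;

public class TextIndexWriterSetup {
    static IndexWriter openWriter(Directory dir, Analyzer analyzer) throws IOException {
        IndexWriterConfig config = new IndexWriterConfig(analyzer);
        // Pack each Lucene sub-index into a compound file to keep the
        // number of open file handles per text index small.
        config.setUseCompoundFile(true);
        // Flush the in-memory buffer at 256MB instead of Lucene's 16MB
        // default: fewer, larger I/Os at a bounded memory cost.
        config.setRAMBufferSizeMB(256);
        return new IndexWriter(dir, config);
    }

    static void sealIndex(IndexWriter writer) throws IOException {
        // Once the column's text index is fully built, collapse all
        // Lucene sub-indexes into a single one and close (commit).
        writer.forceMerge(1);
        writer.close();
    }
}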
Additional Performance Numbers
We also ran micro-benchmarks to compare the execution time of text_match and regexp_like on a Pinot table with a single segment containing 1 million rows. Two different kinds of test data were used:
Log data: a STRING column in a Pinot table where each value is a log line from an Apache access log.
Non-log data: a STRING column in a Pinot table where each value is resume text.
The following graph shows that search queries using text index are significantly faster compared to scan based pattern matching.
Another evaluation was done with Pinot’s native inverted index to understand when using text index may not be the right solution.
White-space separated text can be stored as a multi-value STRING column in Pinot.
Pinot will create a dictionary and inverted index on this column.
If only exact term matches (using =, IN operators) are required, then text index is not the right solution. Pinot’s inverted index can do the exact term matches 5x faster than Lucene.
However, if a phrase, regex (including prefix and wildcard) or fuzzy search is needed, then the text index is the right choice, both functionality- and performance-wise.
Upcoming Work
Pre-built mapping of lucene docId to pinot docId works for offline segments since the text index is immutable. For real-time consuming segments, this optimization is not applicable since the index is changing while it is serving queries. Optimizing the Lucene docId to Pinot docId translation is work in progress.
Fine-tuning the background refresh thread to work on a per table or a per index basis. The current implementation has a single background thread to manage all realtime segments and their text indexes.
Conclusion
In this blog post, we discussed how we leveraged Lucene to engineer the text search solution in Pinot to meet our functional and performance (QPS and latency) requirements. Please visit the user documentation of text search to learn more about using the feature.
If you’re interested in learning more about Apache Pinot, these resources are great places to get started.
Docs: http://docs.pinot.apache.org
Getting Started: https://docs.pinot.apache.org/getting-started
Special thanks
I would like to thank our Pinot OSS team for their relentless efforts to make Pinot better: Mayank Shrivastava, Jackie Jiang, Jialiang Li, Kishore Gopalakrishna, Neha Pawar, Seunghyun Lee, Subbu Subramaniam, Sajjad Moradi, Dino Occhialini, Anurag Shendge, Walter Huf, John Gutmann, our engineering manager Shraddha Sahay and SRE manager Prasanna Ravi. We would also like to thank the LinkedIn leadership Eric Baldeschwieler, Kapil Surlaker, and Igor Perisic for their guidance and continued support as well as Tim Santos for technical review of this article.
|
https://medium.com/apache-pinot-developer-blog/text-analytics-on-apache-pinot-cbf5c45d282c
|
['Siddharth Teotia']
|
2020-06-16 05:03:16.441000+00:00
|
['Software Engineering', 'Apache Pinot', 'Open Source', 'Programming', 'Analytics']
|
5,545 |
Here’s What I Learned From 30 Days of Creative Coding (a Codevember Retrospective)
|
Lessons Learned in Codevember
We all stand on the shoulders of code giants
I watched more code tutorials in one month than I have in three years. I browsed countless GitHub repositories of open source Javascript packages. I trawled through Twitter and Instagram, searching for other creative coders to draw inspiration from.
Here’s the thing: The internet would not be what it is today without open source creators.
Time and time again, I was blown away by people building creative open source packages and giving them away for free. At first, I felt guilty for taking other code and tweaking it to start my sketches. But then I learned that this is how we build things now: We find projects that inspire us, learn from what others built, and then build our own new thing.
I learned from so many people and package maintainers, but I feel the need to give shout-outs for a few specifically:
If you want to learn something new, start a work-adjacent project
I really struggle with the pervasive side-hustle culture in tech and programming. It feels like everyone is building an app on the side or, in my field, creating tons of cool data visualizations when they get home from work or on the weekends. But I spend six or more hours every day doing data visualization. Most of the time I love it, but it feels exhausting to come home and do the same thing. I want to be more than the work that I do at my job.
Day 3: Deep Waves. I used a Perlin noise generator to create pseudo-random line paths. (The site has an animated version!).
So for me, it was really important to do something that was not specifically data visualization. This is why I like the term work-adjacent — learning how to draw with code required a few skills I use in my day job (JavaScript, design, debugging, etc.) but with a whole new level of freedom to let my mind wander. There were no data sets to tie down my designs. I could explore abstract things like randomness and generating pseudo-random algorithms. I could create things just for the fun of it.
I am 90% sure that if I did data viz for every day of November, I wouldn’t have finished. I would have burnt out too soon. So if you have a project in mind, maybe ask a few questions first: Will this feel too much like work? If the answer is yes, take a deep breath and take off some of the pressure you put on yourself. Maybe it’s time to do something for the fun of it, or the creativity, rather than to further your career with a side hustle.
Doing something every day that is terrible and cathartic
When you try to come up with a new idea every day, something strange happens. At first, you obsess over details, trying to make every sketch better than the last. Then you fall a few days behind, having resisted publishing the last sketch because it “still needed something.” And then you’re days behind, wondering how you’ll catch up, feeling like a failure.
But eventually, you give yourself a break. You stop caring so much. You put work out there before it’s perfect. And the new ideas start flowing through you more quickly now, escaping from your hands to breathe life before you have the chance to squash them.
Day 30: Devided Bliss. Not the most elegant of sketches, but I liked the end result. Random polygons are generated based on a function and then placed on the page with a masking effect.
You do good work even if it’s not perfect. You do more, and you learn more.
Programming should be fun, especially creative programming. So be easy on yourself. Make mistakes. Put something out into the world and then go back and fix it once you’ve given the thing a chance to breathe. You’ll feel better after doing it a few times.
It’s OK to feel like a fraud. Everybody does
Most days, I felt like a fraud for adapting someone else’s code. But then I would open Twitter and see loads of other devs expressing how they feel, like they never know enough about JavaScript, or Python, or whatever they are using.
Day 10: Old Pyramids. Nothing complicated on the code side. Just a nice picture composed purely of polygons.
If you’re creating anything with code, you will probably have to continue learning new skills. It never stops. Technology evolves, packages change, new tools emerge. Even the most experienced developers and artists have to spend time learning. And they use the same resources as everybody else.
Math is beautiful. Math is for everybody
Sometimes you need to visualize something to understand it. I never considered myself a math person. Historically, I have leaned more to the art / design side of creativity. Math was for data scientists, engineers, astronomers, etc. I just helped bring their work to life with pictures.
Day 16: Math Meditations. This sketch visualizes the concept of recursion. Each function calls itself until a predefined value is reached.
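As an illustration of that idea, a recursive drawing function in p5.js might look like this (an illustrative sketch, not the actual Day 16 code):

// p5.js: each call draws a circle, then calls itself with a smaller
// radius until a predefined minimum value is reached (the base case)
function setup() {
  createCanvas(400, 400);
  noFill();
  drawCircle(width / 2, height / 2, 150);
}

function drawCircle(x, y, r) {
  circle(x, y, r * 2);
  if (r > 4) {
    drawCircle(x, y, r * 0.75);
  }
}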
False. If #Codevember did anything for me, it turned me into a huge math nerd. Even though I still don’t understand large swaths of the field, visualizing equations and algorithms in p5.js revealed the intricacy and beauty behind numbers. I’m used to creating images with static data sets, but connecting shapes and colors to evolving, random data opened up a whole new world for me.
Don’t be intimidated by a field you don’t understand. Math is not just for the mathematicians. Art is not just for the artists.
Have a look around, dabble in what excites you, and go make something beautiful.
|
https://medium.com/better-programming/heres-what-i-learned-from-30-days-of-creative-coding-a-codevember-retrospective-8c05a8497d24
|
['Benjamin Cooley']
|
2020-01-10 20:00:51.663000+00:00
|
['Data Visualization', 'JavaScript', 'Development', 'Programming', 'Creative Coding']
|
5,546 |
@RequestParam vs @QueryParam vs @PathVariable vs @PathParam
|
The annotations @RequestParam, @QueryParam, @PathVariable, and @PathParam are all used to read values from the request. But which one is used for what?
These annotations are deliberately grouped together: they perform the same task but come from different frameworks (@RequestParam and @PathVariable from Spring, @QueryParam and @PathParam from JAX-RS) that often occur in combination.
Comparison
As shown in the table, the difference lies in where a value is read from. @PathParam reads the value from a path segment of the called URI. @QueryParam is used to read values from the query parameters of a URI call; these are listed after the ? in a URI.
PathParams are position-dependent, while QueryParams are passed as key-value pairs, so their order is irrelevant when more than one QueryParam is used.
Example
As an example, here are both kinds of call in a URI:
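The original post shows the two call styles as an image; the URIs below are illustrative reconstructions:

GET /orders/42      -> "42" is read via @PathVariable / @PathParam
GET /orders?id=42   -> "42" is read via @RequestParam / @QueryParam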
@PathVariable
This annotation is used on the method parameter we want to populate:
@RequestMapping(value = "/orders/{id}", method = RequestMethod.GET)
@ResponseBody
public String getOrder(@PathVariable final String id) {
return "Order ID: " + id;
}
Even though @PathVariable and @RequestParam are both used to extract values from the URL, their usage is largely determined by how a site is designed.
The @PathVariable annotation is used for data passed in the URI (e.g. RESTful web services) while @RequestParam is used to extract the data found in query parameters.
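For comparison with the @PathVariable snippet above, here is a sketch of the @RequestParam counterpart (a hypothetical handler, not taken from the original post):

// GET /orders?id=42 -> "Order ID: 42"
@RequestMapping(value = "/orders", method = RequestMethod.GET)
@ResponseBody
public String getOrderByQuery(@RequestParam("id") final String id) {
    return "Order ID: " + id;
}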
Reference
http://www.nullpointer.at/2017/11/22/requestparam-queryparam-pathvariable-pathparam/
Related Links
https://docs.oracle.com/javaee/7/api/javax/ws/rs/PathParam.html
https://docs.oracle.com/javaee/7/api/javax/ws/rs/QueryParam.html
https://docs.spring.io/spring/docs/current/javadoc-api/org/springframework/web/bind/annotation/PathVariable.html
https://docs.spring.io/spring/docs/current/javadoc-api/org/springframework/web/bind/annotation/RequestParam.html
|
https://medium.com/1developer/spring-requestparam-vs-queryparam-vs-pathvariable-vs-pathparam-7c5655e541ad
|
['Mahdi Razavi']
|
2019-06-22 05:31:22.056000+00:00
|
['Spring', 'Java']
|
5,547 |
Chalkboard Thoughts — Dec 15, 2020 — Episode 4 — Fear & Doubt
|
I’ve written often about ‘Fear & Doubt’. Those two words came to me in 2019, while I was listening to an audiobook titled ‘The Art of Happiness’ by The Dalai Lama and Howard C. Cutler. For me, it’s in this place of ‘Fear & Doubt’ that all suffering in our conditioned human minds starts. If only we were able to master this mental feat, we would be saved from those demons (see picture above) that roam around in our minds.
|
https://medium.com/stayingaliveuk/chalkboard-thoughts-dec-15-2020-episode-4-fear-doubt-156a70003ec1
|
['Michael De Groot']
|
2020-12-15 16:37:32.389000+00:00
|
['Storytelling', 'Shareyourstory', 'Blackboardthoughts', 'Stayingaliveuk', 'Whiteboardanimation']
|
5,548 |
The 5 Clustering Algorithms Data Scientists Need to Know
|
Clustering is a Machine Learning technique that involves the grouping of data points. Given a set of data points, we can use a clustering algorithm to classify each data point into a specific group. In theory, data points that are in the same group should have similar properties and/or features, while data points in different groups should have highly dissimilar properties and/or features. Clustering is a method of unsupervised learning and is a common technique for statistical data analysis used in many fields.
In Data Science, we can use clustering analysis to gain some valuable insights from our data by seeing what groups the data points fall into when we apply a clustering algorithm. Today, we’re going to look at 5 popular clustering algorithms that data scientists need to know and their pros and cons!
K-Means Clustering
K-Means is probably the most well-known clustering algorithm. It’s taught in a lot of introductory data science and machine learning classes. It’s easy to understand and implement in code! Check out the graphic below for an illustration.
K-Means Clustering
To begin, we first select a number of classes/groups to use and randomly initialize their respective center points. To figure out the number of classes to use, it’s good to take a quick look at the data and try to identify any distinct groupings. The center points are vectors of the same length as each data point vector and are the “X’s” in the graphic above.
1. Each data point is classified by computing the distance between that point and each group center, and then classifying the point to be in the group whose center is closest to it.
2. Based on these classified points, we recompute the group center by taking the mean of all the vectors in the group.
3. Repeat these steps for a set number of iterations or until the group centers don’t change much between iterations.
You can also opt to randomly initialize the group centers a few times, and then select the run that looks like it provided the best results.
K-Means has the advantage that it’s pretty fast, as all we’re really doing is computing the distances between points and group centers; very few computations! It thus has a linear complexity O(n).
On the other hand, K-Means has a couple of disadvantages. Firstly, you have to select how many groups/classes there are. This isn’t always trivial and ideally with a clustering algorithm we’d want it to figure those out for us because the point of it is to gain some insight from the data. K-means also starts with a random choice of cluster centers and therefore it may yield different clustering results on different runs of the algorithm. Thus, the results may not be repeatable and lack consistency. Other cluster methods are more consistent.
K-Medians is another clustering algorithm related to K-Means, except instead of recomputing the group center points using the mean, we use the median vector of the group. This method is less sensitive to outliers (because of using the median) but is much slower for larger datasets, as sorting is required on each iteration when computing the median vector.
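As a concrete reference, here is a minimal K-Means sketch using scikit-learn; the library choice, toy data, and parameter values are mine, not the article’s:

# Minimal K-Means example with scikit-learn on toy 2-D data
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs

X, _ = make_blobs(n_samples=300, centers=4, random_state=42)
# n_init runs several random initializations and keeps the best result,
# which mitigates the sensitivity to the initial choice of centers
labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(X)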
Mean-Shift Clustering
Mean shift clustering is a sliding-window-based algorithm that attempts to find dense areas of data points. It is a centroid-based algorithm meaning that the goal is to locate the center points of each group/class, which works by updating candidates for center points to be the mean of the points within the sliding-window. These candidate windows are then filtered in a post-processing stage to eliminate near-duplicates, forming the final set of center points and their corresponding groups. Check out the graphic below for an illustration.
Mean-Shift Clustering for a single sliding window
To explain mean-shift, we will consider a set of points in two-dimensional space like the above illustration.
1. We begin with a circular sliding window centered at a point C (randomly selected) and having radius r as the kernel. Mean shift is a hill-climbing algorithm that involves shifting this kernel iteratively to a higher density region on each step until convergence.
2. At every iteration, the sliding window is shifted towards regions of higher density by shifting the center point to the mean of the points within the window (hence the name). The density within the sliding window is proportional to the number of points inside it; naturally, by shifting to the mean of the points in the window, it gradually moves towards areas of higher point density.
3. We continue shifting the sliding window according to the mean until there is no direction in which a shift can accommodate more points inside the kernel. Check out the graphic above; we keep moving the circle until we are no longer increasing the density (i.e. the number of points in the window).
This process of steps 1 to 3 is done with many sliding windows until all points lie within a window. When multiple sliding windows overlap, the window containing the most points is preserved. The data points are then clustered according to the sliding window in which they reside.
An illustration of the entire process from end-to-end with all of the sliding windows is shown below. Each black dot represents the centroid of a sliding window and each gray dot is a data point.
The entire process of Mean-Shift Clustering
In contrast to K-means clustering, there is no need to select the number of clusters as mean-shift automatically discovers this. That’s a massive advantage. The fact that the cluster centers converge towards the points of maximum density is also quite desirable as it is quite intuitive to understand and fits well in a naturally data-driven sense. The drawback is that the selection of the window size/radius “r” can be non-trivial.
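A minimal scikit-learn sketch of the same idea (parameter values are illustrative; X is the toy data from the K-Means example above):

from sklearn.cluster import MeanShift, estimate_bandwidth

# The bandwidth plays the role of the window radius "r";
# estimate_bandwidth gives a data-driven starting point for it
bandwidth = estimate_bandwidth(X, quantile=0.2)
ms = MeanShift(bandwidth=bandwidth).fit(X)
labels = ms.labels_            # no number of clusters was specified
centers = ms.cluster_centers_  # the converged window centers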
Density-Based Spatial Clustering of Applications with Noise (DBSCAN)
DBSCAN is a density-based clustered algorithm similar to mean-shift, but with a couple of notable advantages. Check out another fancy graphic below and let’s get started!
DBSCAN Smiley Face Clustering
1. DBSCAN begins with an arbitrary starting data point that has not been visited. The neighborhood of this point is extracted using a distance epsilon ε (all points within the ε distance are neighborhood points).
2. If there are a sufficient number of points (according to minPoints) within this neighborhood, then the clustering process starts and the current data point becomes the first point in the new cluster. Otherwise, the point is labeled as noise (this noisy point might later become part of a cluster). In both cases the point is marked as “visited”.
3. For this first point in the new cluster, the points within its ε-distance neighborhood also become part of the same cluster. This procedure of making all points in the ε neighborhood belong to the same cluster is then repeated for all of the new points that have just been added to the cluster group.
4. This process of steps 2 and 3 is repeated until all points in the cluster are determined, i.e. all points within the ε neighborhood of the cluster have been visited and labeled.
5. Once we’re done with the current cluster, a new unvisited point is retrieved and processed, leading to the discovery of a further cluster or noise. This process repeats until all points are marked as visited. Since at the end all points have been visited, each point will have been marked as either belonging to a cluster or being noise.
DBSCAN poses some great advantages over other clustering algorithms. Firstly, it does not require a pre-set number of clusters at all. It also identifies outliers as noise, unlike mean-shift, which simply throws them into a cluster even if the data point is very different. Additionally, it can find arbitrarily sized and arbitrarily shaped clusters quite well.
The main drawback of DBSCAN is that it doesn’t perform as well as others when the clusters are of varying density. This is because the setting of the distance threshold ε and minPoints for identifying the neighborhood points will vary from cluster to cluster when the density varies. This drawback also occurs with very high-dimensional data since again the distance threshold ε becomes challenging to estimate.
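In scikit-learn terms (a sketch with illustrative parameter values, reusing X from above):

from sklearn.cluster import DBSCAN

# eps is the neighborhood radius ε; min_samples corresponds to minPoints
labels = DBSCAN(eps=0.5, min_samples=5).fit_predict(X)
# points that end up labeled -1 are the noise/outlier points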
Expectation–Maximization (EM) Clustering using Gaussian Mixture Models (GMM)
One of the major drawbacks of K-Means is its naive use of the mean value for the cluster center. We can see why this isn’t the best way of doing things by looking at the image below. On the left-hand side, it looks quite obvious to the human eye that there are two circular clusters with different radii centered at the same mean. K-Means can’t handle this because the mean values of the clusters are very close together. K-Means also fails in cases where the clusters are not circular, again as a result of using the mean as the cluster center.
Two failure cases for K-Means
Gaussian Mixture Models (GMMs) give us more flexibility than K-Means. With GMMs we assume that the data points are Gaussian distributed; this is a less restrictive assumption than saying they are circular by using the mean. That way, we have two parameters to describe the shape of the clusters: the mean and the standard deviation! Taking an example in two dimensions, this means that the clusters can take any kind of elliptical shape (since we have a standard deviation in both the x and y directions). Thus, each Gaussian distribution is assigned to a single cluster.
To find the parameters of the Gaussian for each cluster (e.g the mean and standard deviation), we will use an optimization algorithm called Expectation–Maximization (EM). Take a look at the graphic below as an illustration of the Gaussians being fitted to the clusters. Then we can proceed with the process of Expectation–Maximization clustering using GMMs.
EM Clustering using GMMs
1. We begin by selecting the number of clusters (like K-Means does) and randomly initializing the Gaussian distribution parameters for each cluster. One can try to provide a good guesstimate for the initial parameters by taking a quick look at the data too. Though note, as can be seen in the graphic above, this isn’t 100% necessary, as the Gaussians start out very poor but are quickly optimized.
2. Given these Gaussian distributions for each cluster, compute the probability that each data point belongs to a particular cluster. The closer a point is to the Gaussian’s center, the more likely it belongs to that cluster. This should make intuitive sense since, with a Gaussian distribution, we are assuming that most of the data lies closer to the center of the cluster.
3. Based on these probabilities, we compute a new set of parameters for the Gaussian distributions such that we maximize the probabilities of data points within the clusters. We compute these new parameters using a weighted sum of the data point positions, where the weights are the probabilities of the data point belonging to that particular cluster. To explain this visually, we can take a look at the graphic above, in particular the yellow cluster as an example. The distribution starts off randomly on the first iteration, but we can see that most of the yellow points are to the right of that distribution. When we compute a sum weighted by the probabilities, even though there are some points near the center, most of them are on the right. Thus the distribution’s mean is naturally shifted closer to that set of points. We can also see that most of the points run “top-right to bottom-left”. Therefore the standard deviation changes to create an ellipse that is more fitted to these points, to maximize the sum weighted by the probabilities.
4. Steps 2 and 3 are repeated iteratively until convergence, where the distributions don’t change much from iteration to iteration.
There are 2 key advantages to using GMMs. Firstly, GMMs are a lot more flexible in terms of cluster covariance than K-Means; due to the standard deviation parameter, the clusters can take on any ellipse shape, rather than being restricted to circles. K-Means is actually a special case of GMM in which each cluster’s covariance along all dimensions approaches 0. Secondly, since GMMs use probabilities, a single data point can belong to multiple clusters. So if a data point is in the middle of two overlapping clusters, we can simply define its class by saying it belongs X percent to class 1 and Y percent to class 2, i.e. GMMs support mixed membership.
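A minimal scikit-learn sketch, with predict_proba showing the mixed membership described above (parameter values are illustrative; X is the toy data from earlier):

from sklearn.mixture import GaussianMixture

gmm = GaussianMixture(n_components=4, covariance_type='full',
                      random_state=0).fit(X)
hard_labels = gmm.predict(X)        # most likely cluster per point
soft_labels = gmm.predict_proba(X)  # one membership probability per cluster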
Agglomerative Hierarchical Clustering
Hierarchical clustering algorithms fall into 2 categories: top-down or bottom-up. Bottom-up algorithms treat each data point as a single cluster at the outset and then successively merge (or agglomerate) pairs of clusters until all clusters have been merged into a single cluster that contains all data points. Bottom-up hierarchical clustering is therefore called hierarchical agglomerative clustering or HAC. This hierarchy of clusters is represented as a tree (or dendrogram). The root of the tree is the unique cluster that gathers all the samples, the leaves being the clusters with only one sample. Check out the graphic below for an illustration before moving on to the algorithm steps
Agglomerative Hierarchical Clustering
1. We begin by treating each data point as a single cluster, i.e. if there are X data points in our dataset then we have X clusters. We then select a distance metric that measures the distance between two clusters. As an example, we will use average linkage, which defines the distance between two clusters to be the average distance between data points in the first cluster and data points in the second cluster.
2. On each iteration, we combine two clusters into one. The two clusters to be combined are selected as those with the smallest average linkage, i.e. according to our selected distance metric, these two clusters have the smallest distance between each other and are therefore the most similar and should be combined.
3. Step 2 is repeated until we reach the root of the tree, i.e. we only have one cluster which contains all data points. In this way we can select how many clusters we want in the end, simply by choosing when to stop combining the clusters, i.e. when we stop building the tree!
Hierarchical clustering does not require us to specify the number of clusters and we can even select which number of clusters looks best since we are building a tree. Additionally, the algorithm is not sensitive to the choice of distance metric; all of them tend to work equally well whereas with other clustering algorithms, the choice of distance metric is critical. A particularly good use case of hierarchical clustering methods is when the underlying data has a hierarchical structure and you want to recover the hierarchy; other clustering algorithms can’t do this. These advantages of hierarchical clustering come at the cost of lower efficiency, as it has a time complexity of O(n³), unlike the linear complexity of K-Means and GMM.
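And one last scikit-learn sketch for completeness, using the average linkage described above (the cluster count is illustrative; X is the toy data from earlier):

from sklearn.cluster import AgglomerativeClustering

# 'average' linkage matches the distance metric described above
agg = AgglomerativeClustering(n_clusters=4, linkage='average')
labels = agg.fit_predict(X)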
|
https://towardsdatascience.com/the-5-clustering-algorithms-data-scientists-need-to-know-a36d136ef68
|
['George Seif']
|
2020-12-13 19:23:17.239000+00:00
|
['Machine Learning', 'Clustering', 'Data Science', 'Algorithms', 'Towards Data Science']
|
5,549 |
The Secret to Building Performance Libraries
|
The Secret to Building Performance Libraries
Two key reasons you need to understand delegate prototypes in JavaScript
Photo by Fabian Irsara @ Unsplash
I was reading a section in a book about JavaScript and I came across an issue (but also the power of the concept that the issue stems from) that I want to write about. I think it will be especially helpful for newcomers to JavaScript — even if you’re experienced you might learn something new!
This article will go over a known anti-pattern with delegate prototypes. If you’re a React user the concept of this anti-pattern may be familiar to you. We will also look at how you can use that concept to greatly improve the performance of your apps — just as the majority of JavaScript libraries today do.
So, if you want to create a library in JavaScript, I highly recommend you learn how you can optimize your app by delegating prototypes. This is called the Flyweight Pattern and will be explained in this piece.
If you don’t know what a prototype is, they’re objects that JavaScript uses to model other objects after. You could say they’re similar to classes in that they can construct multiple instances of objects, but they’re also objects themselves.
In JavaScript, all objects have some internal reference to a delegate prototype. When objects are queried by property or method lookups, JavaScript first checks the current object. If that doesn’t exist it then proceeds to check the object’s prototype, which is the delegate prototype, and then proceeds with that prototype’s prototype, and so on. When it reaches the end of the prototype chain the last stop ends at the root Object prototype. Creating objects attaches that root Object prototype at the root level. You can branch off objects with different immediate prototypes set with Object.create().
Let’s take a look at the code snippet below:
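The snippet was embedded in the original post and isn’t reproduced in this text; what follows is a minimal reconstruction consistent with the description below (names and values are taken from that description):

function makeSorceress(type) {
  return {
    type,
    hp: 100,
    fireball(target) {
      target.hp -= 20;
    },
  };
}

function makeWarrior(type) {
  return {
    type,
    hp: 100,
    // an object property on the prototype -- this is where trouble starts
    lastTargets: { names: [] },
    battleCry() {
      return `${this.name} (${this.hp} HP) roars!`;
    },
    bash(target) {
      target.hp -= 10;
      this.lastTargets.names.push(target.name);
    },
  };
}

const warrior = makeWarrior('knight');
const sorceress = makeSorceress('fire');

// bob, joe, and lucy each delegate to one of the objects above
const bob = Object.create(warrior, { name: { value: 'bob' } });
const joe = Object.create(warrior, { name: { value: 'joe' } });
const lucy = Object.create(sorceress, { name: { value: 'lucy' } });

bob.bash(lucy); // lucy's HP drops by 10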
We have two factory functions here. One of them is makeSorceress which takes a type of sorceress as an argument and returns an object of the sorceress's abilities. The other is makeWarrior which takes a type of warrior as an argument and returns an object of the warrior's abilities.
We instantiate a new instance of the warrior class with type knight, along with a sorceress with type fire.
We then used Object.create to create new objects for bob, joe, and lucy, each delegating to one of the prototype objects above.
Bob, joe, and lucy were given their names as own properties on each instance, so we expect them to behave like independent objects with their own state. Finally, bob attacks lucy using bash, decreasing her HP by 10 points.
At first glance, there doesn’t seem to be anything wrong with this example. But there is actually a problem. We expect bob and joe to have their own copies of properties and methods, which is why we used Object.create. When bob bashes lucy and inserts the last targeted name into the this.lastTargets.names array, the array will include the new target’s name.
We can log that out and see it for ourselves:
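Using the reconstructed objects from above:

console.log(bob.lastTargets.names); // ['lucy']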
The behavior is expected, however when we also log the last targeted names for joe , we see this:
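Again in terms of the reconstruction:

console.log(joe.lastTargets.names); // ['lucy'] -- but joe never attacked anyone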
That doesn’t make sense, does it? The person attacking lucy was bob, as is clearly seen above. But why was joe involved in the act? The one line of code explicitly writes bob.bash(lucy) , and that's it.
So the problem is that bob and joe are actually sharing the same state!
But wait, that doesn’t make any sense because we should have created their own separate copies when we used Object.create …or so we assumed.
Even the docs at MDN explicitly say that the Object.create() method creates a new object. It does create a new object — but the problem here is that if you mutate object or array properties that live on the prototype, the mutation will leak and affect every other instance that has a link to that prototype on the prototype chain. If you instead replace the entire property, the change only occurs on that one instance.
For example:
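Sticking with the reconstructed warriors from earlier:

// (1) Mutating a nested object on the prototype leaks to every instance:
bob.lastTargets.names.push('lucy');
console.log(joe.lastTargets.names); // ['lucy'] -- the mutation is shared

// (2) Replacing the whole property shadows the prototype on one instance only:
joe.lastTargets = { names: [] };
console.log(joe.lastTargets.names); // [] -- joe now has his own copy
console.log(bob.lastTargets.names); // ['lucy'] -- still the shared object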
If you mutate the this.lastTargets.names property, the change will be reflected in other objects linked to the same prototype. However, when you replace the prototype’s property (this.lastTargets) with a new value, the override applies only to that instance. To a new developer this can be a little difficult to grasp.
Some of us who regularly develop apps using React have commonly dealt with this issue when managing state throughout our apps. What we probably never paid attention to is how that concept stems from the JavaScript language itself. So, to state this more clearly: this anti-pattern is built into how the JavaScript language works.
But why is this even an anti-pattern? Can’t it be a good thing?
In certain ways it can be a good thing because you can optimize your apps by delegating methods to preserve memory resources. After all, every object just needs one copy of a method, and methods can be shared throughout all the instances, unless that instance needs to override it for additional functionality.
For example, let’s look back at the makeWarrior function:
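Here is the relevant excerpt from the reconstruction above:

function makeWarrior(type) {
  return {
    type,
    hp: 100,
    lastTargets: { names: [] },
    battleCry() {
      return `${this.name} (${this.hp} HP) roars!`;
    },
    // ...
  };
}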
The battleCry function is probably safe to be shared by all prototypes since it doesn't depend on any conditions to function correctly, besides an hp property which is already set upon instantiation. Newly created instances of this function do not necessarily need their own copy of battleCry and can instead delegate to the prototype object that originally defined this method.
Sharing data between instances of the same prototype is an anti-pattern because it can become very easy to accidentally mutate shared properties or data that shouldn’t be mutated. This has long been a common source of bugs for JavaScript applications.
This practice is in use for a good reason, in fact. Take a look at how the popular request package instantiates the Har function in this source code:
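The original article embeds the source file; the shape of the pattern it points at looks roughly like this (an illustration of the pattern, not the verbatim request code):

function Har(request) {
  this.request = request;
}

// Defined once; every instance delegates to this single function
// through the prototype chain.
Har.prototype.reducer = function (obj, pair) {
  // ...fold one name/value pair into obj...
  return obj;
};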
So why doesn’t Har.prototype.reducer just get defined like this?
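That is, like this hypothetical alternative:

function Har(request) {
  this.request = request;
  // a brand-new function object would be allocated for every `new Har(...)`
  this.reducer = function (obj, pair) {
    // ...
    return obj;
  };
}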
As explained previously, defining the reducer at the instance level would actually degrade the performance of your apps: every new instance would recreate its own copy of the reducer function on instantiation.
When we have separate instances of Har:
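For example (with hypothetical stand-in request objects):

const requests = [{}, {}, {}, {}, {}]; // stand-ins for five request objects
const hars = requests.map((req) => new Har(req));
// with the instance-level version, each of the five carries its own reducer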
We’re actually creating 5 separate copies of this.reducer in memory because the method is defined at the instance level. If the reducer is defined directly on the prototype, multiple instances of Har will delegate reducer calls to the single method defined on the prototype!
This is an example of how to take advantage of delegate prototypes and improve the performance of your apps.
|
https://medium.com/better-programming/2-reasons-why-you-must-understand-delegate-prototypes-right-now-6dac719d31f4
|
[]
|
2019-07-24 16:50:03.876000+00:00
|
['Frontend Development', 'Nodejs', 'JavaScript', 'Web Development', 'React']
|
5,550 |
The Woman Who Was Kidnapped and Kept in A Coffin-Sized Box for 7 Years
|
At times, tightly enclosed spaces can send shivers down the spine. Especially if one is claustrophobic, the situation can feel like a nightmare. Claustrophobia, the fear of confined spaces, is one of the most common phobias in the world.
One study indicated that approximately 5 to 10% of the global population suffers from severe claustrophobia, but only a few receive treatment. Those who experience it may relate to the terrifying ordeal of Colleen Stan, popularly known as “The Girl In The Box.”
It was the year 1977. Twenty-year-old Colleen Stan was hitchhiking for days from her hometown in Oregon to attend a friend's birthday party in Northern California.
Unfortunately, Colleen never made it to that party. She considered herself an experienced hitchhiker. On the beautiful day of May 19th, she had already turned down two rides before accepting the one that turned her life upside down.
The Misfortune Begins —
After refusing the previous two rides, Colleen Stan finally accepted a ride from a family in a blue van. The lively vibe of the people inside the van made her opt for the unfamiliar ride.
The van was driven by a friendly-looking young man and his wife who sat in the passenger seat. The couple had a baby tied safely in its baby chair in the backseat of the van. To Colleen Stan, the family looked like the perfect blend of merry people. This couple was 23-year-old Cameron Hooker and his 19-year-old wife Janice Hooker, a pair of honest working-class people from Red Bluff, California.
The couple seemed innocent. But as the saying goes, looks are deceptive until the reality is unearthed.
For years, Cameron, a lumber mill worker, had been torturing his wife with beatings, whippings, and electrical shocks. The man was a sadistic psychopath with some genuinely horrible bondage fantasies.
Janice, who deeply loved her husband and feared for her child’s safety, silently suffered his violence. Surprisingly, she also helped him fulfill his evil fantasies of holding innocent women captive and subjecting them to immoral torture.
Shortly after sitting inside the van, Colleen felt the air turn awry. The initial friendliness of the young couple gave way to suspicious vibes. As the ride continued, the young man kept staring at Colleen, making her slightly uncomfortable. Her instincts told her to simply escape the ride, but her excitement about her friend’s party quieted those instincts, and she continued on.
The peaceful drive didn't last long.
The Girl In The Box —
The draped personality of Cameron Hooker swapped with his original identity: the sadistic psychopath.
As time passed, Cameron veered off the road and drove to a remote location. He then pulled out a knife, held it to Colleen’s neck, and threatened that if she uttered a sound she would be killed. She was then chained inside a homemade soundproof wooden “headbox” that weighed twenty pounds.
The box was a caricature of hell. By an odd twist of fate, Colleen Stan had entered the most difficult phase of her life. The box confined only her head, blocking the radiating sunlight and the pleasant sounds of the surroundings, and preventing fresh airflow.
The couple eventually drove the van to their house in California, where Colleen was held captive. She was then victimized through brutal forms of torture. Her wrists were tied to the ceilings; she was beaten, electrocuted, raped, whipped, and subjected to near-death experiences. The man’s wife, Janice, remained as a spectator to the torture the girl was subjected to. The man’s fantasies rested on imprisoning women as slaves, confining them to tortures with the victim’s voice silenced.
Unlike her husband’s sadistic personality, it’s unclear how much pleasure Janice derived from the merciless behavior. In many ways, it’s quite possible that she remained a victim of her husband’s violence herself, which left her unsympathetic to Colleen’s suffering. Still, it does not excuse the crime she committed by supporting her evil husband.
The pain Colleen Stan endured grew over time. She was trapped inside the coffin-like wooden box underneath the Hookers’ bed for a full 23 hours a day. Cameron selectively starved Colleen over the years, giving her only occasional meals. She was stretched on a medieval-style rack, kept there for hours, and then punished severely. For an hour or two, she was made to do the household chores and babysit the children. The rest of her day was spent in darkness — both physical and mental.
The cyclical routine of abuse that Cameron established: Isolation, fear, starvation, and torture.
In reality, Cameron and his wife had no interest in killing Colleen Stan. Instead, they just wanted to dehumanize, manipulate, objectify, and torture her for years to come.
The Satanic Organization —
The coffin-shaped box was a living ordeal of misery. But to Colleen Stan, the immoral torture and abuse didn’t feel like the worst of it until one diabolical disclosure: a revelation that filled her with fear.
The disclosure was of an evil organization called “The Company,” and Cameron claimed that he was a member. He warned Colleen that the organization was watching her and had already been harassing her family. Whether his claims were valid or not, the threat made Colleen believe his words.
Individual pain can be bearable, but when it involves family, we become petrified. Colleen feared the same. More than anything else, she began to worry about her family. She felt that an attempt to escape from confinement might lead the satanic organization to harm her family even more. For her family’s sake, she decided to stay in captivity and even signed a contract that bound her to remain the couple’s slave forever.
Earlier, Colleen Stan’s family had filed a missing person report. Efforts were made to trace her, but they always met with failure. The investigators eventually concluded that she had been either kidnapped or killed.
The Change Of Feelings —
The signing of the contract brought about life changes. By adhering to Cameron’s wishes, Colleen Stan was granted a few freedoms.
She was allowed to breathe fresh air, jog around, and work in the garden. Surprisingly, in March 1981, she was allowed to visit her family for a day. Cameron accompanied her, and she introduced him as her boyfriend. The family suspected something odd in her behavior but set their concerns aside, fearing that pressing her might lead to their daughter disappearing again.
Together, her fear of Cameron and of the satanic organization kept her from escaping, rebelling against her confinement, or disclosing any information to her family.
Colleen Stan was held in captivity for seven years. Toward the end of those years, Cameron expressed his desire to take Colleen as his second wife. Her husband’s aspirations left Janice agitated.
Justice Served —
Janice Hooker’s resentment grew. Her husband’s desire to marry Colleen Stan fueled her anger. Later, her guilt-ridden conscience made her face the immoral acts she had been part of and the merciless deeds they had led her to commit.
After seven years of Colleen Stan’s captivity, Janice told her the truth: the contract she signed was bogus, and an organization called “The Company” never existed. After disclosing the facts, Janice helped her escape. At first she pleaded for mercy for her husband, thinking he might become humane after rehabilitation. However, when she realized that her husband would not change, she eventually reported him to the police.
Cameron Hooker was found guilty of his horrific crimes. He was charged with kidnapping and sexual assault and was sentenced to 104 years in prison. Colleen Stan and Janice Hooker both live in California under changed names — but the two do not communicate.
Colleen Stan experienced a fate that, in the eyes of many, would be worse than death. Life since then has not been easy. She suffers chronic back and shoulder pain from her confinement — yet she has made a positive impact by working as a mental health professional and social worker. In 2016, a movie named “Girl In The Box” featured the real-life abduction story of Colleen Stan.
Her willpower, faith, and optimism made her survive the tough times.
In the words of Colleen Stan, the psychological mindset she adopted during that tragic time was:
|
https://medium.com/memoirsfromhistory/the-woman-who-was-kidnapped-and-kept-in-a-coffin-sized-box-for-7-years-2ffc8267ffb5
|
['Swati Suman']
|
2020-12-30 05:59:02.781000+00:00
|
['Justice', 'History', 'True Crime', 'Crime', 'Psychology']
|
5,551 |
My Vagina Is Not Too Tight and Dry
|
My Vagina Is Not Too Tight and Dry
Arousal after sex abuse.
Photo by Nine Köpfer on Unsplash
CW: This piece discusses child sex abuse, rape, and their aftermath and effects on the human body.
Every time I have sex with someone new, I get asked a few questions. “Are you a virgin?” Ha, I wish. Next comes, “Are you sure you’re turned on? You don’t feel right.” I don’t know how to explain to anyone how offensive it is to be told by seemingly well-intentioned men that my pussy doesn’t feel right, but I am going to try.
Growing up, I was sexually abused from the age of 5 to about 12 years old. I have been sexually assaulted more times than I can count on my hands. Being gang-raped leaves you with a certain kind of mental scar. And after all my sexual abuse, there was no real education about how my body would respond to sex differently.
I get compared a lot to other women. I am frequently told how my pussy is wrong, or I am just not in tune with my body enough. I can get myself to orgasm in 3 minutes max with my fingers alone. I may not be very wet, but I can assure you I am entirely in tune with my body.
I think sex positivity is excellent. I support people who flow like waterfalls. I want people to understand that you should not use sex-positivity for othering. Per RAINN, 1 out of every 6 American women has been the victim of an attempted or completed rape in her lifetime (14.8% completed, 2.8% attempted). Statistics show there are a lot of sexual assault survivors out there. We need to talk about the aftermath of sexual assault.
For survivors of sexual assault, it is common to experience genital pain, tightness, and apprehensiveness when it comes to sex after the trauma. For me, sex is excruciating. It is not uncommon for me to bleed, have cuts, or to be dry. There are things I can do to lessen the effect of my body’s response to sex. However, anytime I consent to sex, I know I agree to pain.
If I tell my sex partner I am horny and I want them inside me, I don’t want someone who isn’t my therapist or OB-GYN to explain to me that my pussy is inept. Instead of shaming someone like me for not being wet like your ex, pull out some lube. Listen to the person in front of you. They have been with their body since it was created; they certainly know themselves better than you do.
While it may be triggering for survivors to communicate their past abuse, it is crucial for medical professionals to know. I have gone to an OB-GYN for a pap smear. During the pap smear, the swab got stuck in my cervix. The doctor spent 20 minutes panicking, telling me I needed to relax because she couldn’t get it out.
At that moment, I didn’t have control over my body. It was traumatizing to have a doctor freak out while a swab was stuck inside me. I felt like I was being raped all over again.
Since that day, I found a new OB-GYN at the advice of my therapist. My therapist explained to me that people who are sexually assaulted often have trouble with arousal and pap smears. They can be very triggering for survivors like myself.
I found a new doctor and explained my history. My doctor understands how my sexual trauma manifests, and I’m delighted to say she has never gotten a swab stuck in my cervix. Her office even offered full anesthesia for pap smears and IUD insertions.
Nothing is physiologically wrong with my body. I have had a lot of testing done and conversations with my doctors. I have buckets full of trauma that don’t make great first impressions. Sex is complicated. Sometimes I do get wet, but it’s rare.
To enjoy sex, I have to push through the pain. After penetration starts, I generally get wet, but it takes sex to happen first before my body can let go and be free. I love sex; it is genuinely my favorite thing. I am not ashamed of how my body performs. I am glad I can take back the power that was robbed of me.
I support people who are exploring themselves for the first time. However, I don’t enjoy men thinking that they’re my teacher and they’re going to awaken my pussy for the first time. I don’t like the implication that because my body performs differently, I need instruction.
I would much rather have a conversation clothed regarding sex. Sometimes I forget not everyone comes from a BDSM background, where we all take half-hour questionnaires before even touching each other. There is something to be learned from that, though. Asking your partner five things they love about sex and five things they can’t stand during sex goes a long way. What does your ideal night in bed look like? Is there anything I can do more of?
If your partner can’t articulate what they like in words, ask them to show you their favorite porn. Maybe it is written erotica that gets them going. Even if you are vanilla, I think researching BDSM negotiation helps so much.
BDSM goes into a highly technical level of sexual negotiation. Where you can touch someone is commonly covered, and any triggers and medical conditions are laid out as well. I know for some this may be overkill. However, it is a valuable tool for me to explain sexual boundaries. I use it for vanilla hookups all the time. So far, no one has gone running away; if anything, BDSM checklists and questionnaires have created more sexual satisfaction in my life.
Is your partner having a hard time but can’t say why? Make a safe word so things can stop, no questions asked. Have a partner who goes nonverbal regularly during sex? Give them a small ball to hold, and let them know dropping it means sex ends, no questions asked.
Life is too short for lousy sex after trauma. I would rather have an awkward conversation a thousand times over than be triggered or, worse, insulted because of my past. Being patient and keeping an open mind is critical. You usually never know what someone has been through until it’s too late.
|
https://medium.com/sexography/my-vagina-is-not-too-tight-and-dry-82502e6f232e
|
['Beth Daily']
|
2020-08-14 19:20:19.033000+00:00
|
['Relationships', 'Sexual Assault', 'Self', 'Mental Health', 'Sex']
|
5,552 |
What Are Your Options When You’re No Longer Attractive For The Job Market?
|
Do you want me?
I’ve been avoiding the unavoidable task of looking for a job mainly because I’m already aware of what a complete waste of time it would be — sifting through lackluster job posts that provide just as much excitement as the obituary section of the local newspaper.
I’ve been out of work for about six months now and while I’ve been able to sustain myself with freelance work and the blessing of not having to fork out thousands of dollars for rent and utilities — as 2018 progresses — there is the nagging reminder that my timetable is patiently waiting for me to honor outstanding commitments.
The last full-time job didn’t end well. The last couple of years have alerted me to the fact that it’s almost impossible to find editorial jobs that live up to the promises of maximizing your worth with appropriate compensation, the security of steady hours and a robust benefits package.
When the temp job at ABC Digital ended abruptly after just two weeks, I returned to hustle mode (not that it ever stops) for a few more weeks before being referred to the digital content arm of another media giant. The duties were simple enough, and the best part was the ability to work from home.
Interestingly enough — even that quickly took its toll — as it became clear that I actually liked people a lot more than I realized and truly missed the daily interactions.
But back to the job. I was assigned a vertical that required sifting through large stacks of recycled content and choosing the ones that were pitch-worthy in order to keep the homepage well-stocked. This also meant periodic conference calls with partners from notable publications — who were desperate to retain their positions as the “go-to” outlets.
It took about a month for me to start buckling under the uninspiring regimen of navigating the strains of the CMS in search of content that all looked and sounded the same. As a writer, it was intolerable to be expected to contribute to the symptoms of an ailing industry. I steadfastly bitch about how challenging it is to find original content that feeds the soul, and yet here I was earning an unremarkable paycheck as the reward for encouraging the extinction of something I was supposedly championing.
As the second month came to an end, I began to consider that all the pluses about my current gig were fading away. And apparently I wasn’t the only one dying a slow death, because the high turnover was another indication that molding us into bots wasn’t going to be as easy as our employer envisioned.
Still, I was more than happy with the steady paycheck and was able to muddle through my guilt and intense fear that my writing and reading comprehension skills were going to suffer from the debilitating exercise of sourcing badly written material for hours and hours.
By the time I got the early evening call from the recruiter who apologetically confirmed that the next day would be my last — I was already mentally prepped for my imminent exit. I was no longer able to stomach the clickbait headlines and badly-constructed sentences — not to mention the endless sessions of providing captions for generic images.
The kind lady who broke the unexpected good news seemed a lot more upset than the person who had just lost her job and all her health benefits. I did my best to assure her that I was used to the erratic job market, thanks to the experience I have accrued working for big-name corporations that rely on their reputations to hide how they will end up screwing you over.
I wasn’t that blatant of course, but even if I had been, based on her job description, it’s hard to imagine that she would disagree.
Years ago — when I was stuck in a corporate job at a top financial institution — all I wanted to do was wait for the economic crisis to blow over — so I could venture out and land a real editorial job. By the fall of 2013 — I was able to write full-time even though I wasn’t getting paid for my services. Before then — I was supplementing my steady paycheck with freelance jobs in order to build up my cred.
By 2015 — I was getting jobs in digital from fancy start-ups and media companies that all paid shit money. The other thing they had in common was the reluctance to make you a permanent staff member in order to reduce the costs of such an investment.
It began to dawn on me that the job market had shifted into something I never anticipated. If I had known back in the spring of 2013 — that the digital world would meekly surrender to the content-churning machine that it has become — I wouldn’t have walked away from the option of holding down a full time corporate job.
The gamble wasn’t worth it when you consider the toll it took on my stability — as I risked it all to prove that I was capable of realizing the dream of calling The New York Times — home — or any of the other notable outlets that I have since discovered aren’t as illustrious with words as I had assumed.
So now I’m back where I started. Reality has hit hard and I’m reminded of how relentlessly deceiving and unforgiving the editorial world can be. Aside from the intense competition that has only grown more violent with the consistent help of Twitter, there’s also the reminder that nothing lasts for that long, so don’t get too comfortable.
Established portals with the best of intentions will coerce you into pouring in your heart and soul, and end up stomping all over it when the jig is up. Everything has a time limit, and the only way to stay ahead of the curve is to carve out a space with your name on it.
At this point in my life — finding a job that not only suits my skill set — but also positions me for an enviable trajectory is something that I can’t fathom ever transpiring. I’m not young or old — which makes it harder for employers to know what the hell to do with me — and that’s if they’re interested enough to ponder.
The immense appeal I had almost five years ago has evaporated and now I have no clue how to convince anyone that I’m the best person for a job that I don’t even want.
So what are your options when you’re no longer attractive for the job market, but have to work to sustain yourself and your dignity?
Maybe — it lies in your priorities. At this point — it’s highly unlikely that I will ever hold an editorial position in a corporate setting that will allow me to blossom into a managerial position. So — I will have to take what I can get — while laboring on my own shit.
I will have to map out goals for future projects that will keep me motivated and excited about a craft that I still love — even if the climate is giving me plenty of reasons to hate it. I have to manifest my destiny without the false security of outside forces that only conspire to use you up — before tossing you out without a respectful exit.
I still find myself attractive among the ugliness of what the industry is constantly releasing — and instead of trying to convince suitors who aren’t interested — it’s time to accept the journey of re-discovery through self-empowerment that can only lead to the life of my dreams.
And that’s not work — it’s passion. The greatest love of all.
|
https://nilegirl.medium.com/what-are-your-options-when-youre-no-longer-attractive-for-the-job-market-da2c3ad87394
|
['Ezinne Ukoha']
|
2018-03-09 20:41:12.019000+00:00
|
['Life Lessons', 'Work', 'Media', 'Careers', 'Journalism']
|
5,553 |
The Mayans’ Lost Guide To Doing Data Science In Fast Paced Startups
|
The quest is this: what are the key practices left unsaid while working in a fast-paced, highly chaotic environment? Are AI foundations and computer science degrees enough to build scalable models in real life? It’s not as easy as eating cotton candy.
Driven by intellectual curiosity, I decided to review my past experience of working with Machine Learning and Deep Learning models. In this article, I’ll draw on my work on one of my recent projects, where the aim was to predict a user’s purchasing probability for our various offerings. The purpose of this project was to help the Performance Marketing and Growth teams at HealthifyMe evaluate the efficacy and efficiency of our sales engines.
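Before getting to those insights, it may help to see the shape of the problem in code. The following is a minimal sketch of a purchase-propensity model, not HealthifyMe’s actual pipeline: the file name user_features.csv and the feature columns are invented for illustration.

import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score

# Hypothetical training table: one row per user, "purchased" is the label.
df = pd.read_csv("user_features.csv")  # invented file name for illustration
features = ["sessions_last_30d", "plan_page_views", "days_since_signup"]
X, y = df[features], df["purchased"]

# Hold out a stratified test set so the class balance is preserved.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=42)

model = GradientBoostingClassifier(random_state=42)
model.fit(X_train, y_train)

# predict_proba yields per-user purchase probabilities, which a marketing
# or growth team can use to rank users by likelihood to convert.
probabilities = model.predict_proba(X_test)[:, 1]
print(f"Test AUC: {roc_auc_score(y_test, probabilities):.3f}")

With that picture in mind, I am sharing below the insights I gained on what one needs to strive towards while solving data problems: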
|
https://medium.com/healthify-tech/the-mayans-lost-guide-to-doing-data-science-in-fast-paced-startups-2531128aecd0
|
['Saurav Agarwal']
|
2020-09-12 16:18:33.279000+00:00
|
['Machine Learning', 'Artificial Intelligence', 'Startups', 'Data Science', 'Product Management']
|
5,554 |
We Need Trained People to Deal with Mental Health Breaks
|
Recently I was researching a post on Black Americans killed by police. Some of the individual stories that resonated with me most were those of victims who suffered from mental health issues.
This is not a problem associated with just the American police; British police are no different. Five years ago I worked with a young lady with mental health issues. As with many people in her situation, she was self-medicating with alcohol. During a mental health break, she had been drinking and was hiding in a bush, keeping away from people. The police approached her, and because it was dark they shone their lights in her face. She states she couldn’t see it was the police because the light was in her eyes. They asked her to get out of the bush and she refused. They then pepper-sprayed her in the face and dragged her out. She was scared, still not realising it was the police who had her.
They arrested her and put her in a cell to sober up before they interviewed her. When I went to meet her the next morning to be her appropriate adult, what I saw shook me. She was black with bruises down both arms and legs. There were no gaps between her bruises; she had been brutalised. This was not only my opinion but that of the nurse I took her to afterwards. When the case was investigated by the police complaints commission, it will surprise no one that it was thrown out. No case to answer.
Mental health services are at breaking point.
In Britain, mental health services are at breaking point. Too few people are diagnosed and offered the support they need to function. With these services cut, the pressure is placed on the police and ambulance service.
If services were available, we wouldn’t need untrained personnel dealing with these individuals. A trained person would know what to do. Instead of arresting my young lady, they would have spoken to her, calmed her down and tried to move her to a safe location. They could have kept her safe, without arresting her.
What we need is a fourth service dedicated to mental health. If a person is experiencing a break, this service could be the first call. I understand some cases may be dangerous and require the police; a service to make that initial assessment would help, with a group to respond to non-violent cases.
All countries could use this service.
America is no different; they could also use this service. When we look at the case of Daniel Prude, this is evident. Prude was a 41-year-old who, during a mental health break, ran into the road naked. He became agitated when the officers put a spit hood on his head. To calm him, he was placed on the floor. An officer’s full weight was forced onto his head, and he died after being asphyxiated.
Anna Rosser would also have benefitted from this intervention. She was shot dead when police entered her house while she was holding a knife. Her boyfriend, unable to cope with her mental health break, had dialled 911.
All citizens deserve equal treatment regardless of colour, gender or mental health. If as a species we cannot provide this service, the least we can do is admit there is a problem. Train the existing staff in mental health. Having worked in mental health for twenty years, I would volunteer an evening a week to work for this service.
Mental health problems are on the rise across the world, and the state of life at present does nothing to help. A quarter of the United Kingdom’s population suffers with their mental health. Other countries have similar statistics. It is time we represented this percentage fairly and gave them access to trained help.
|
https://medium.com/mental-health-and-addictions-community/we-need-trained-people-to-deal-with-mental-health-breaks-de94684e6954
|
['Sam H Arnold']
|
2020-10-07 13:42:45.086000+00:00
|
['Ideas', 'Equality', 'Services', 'Mental Health', 'Diversity']
|
5,555 |
How to Become a Creative Writer
|
How to Become a Creative Writer
3 strategies for engaging a reader’s imagination with your writing
Photo by Clever Visuals on Unsplash
When we think of creative writing, it can get a bit confusing because it seems like a redundancy. Isn’t the very act of writing creative? So why is there writing and creative writing?
Unless you’re copying the words of someone else, all writing is creation. Even writers of owner’s manuals, which don’t feel particularly creative, have to create clear instructions on how to properly use a product.
So why the differentiation in terms?
A working definition of creative writing
Creative writing refers to writing that utilizes and engages the imagination, and that’s where the distinction is found. Creative writing is approached differently than writing that is more traditionally non-fiction, such as straightforward reporting or informational essays: in other words, any type of writing that doesn’t tend to engage people on an emotional level.
Creative writing is about making people feel and imagine
Creative writing is called creative writing simply because you tend to use your imagination more in writing it than you do with more informational non-fiction writing. Creative writing features more figurative and sensory language than you would encounter in a news report or an informative article.
Creative writers approach the blank page with an incredible weight of responsibility to create something that will resonate with their readers on an emotional level and stick with them long after they’ve read it.
3 Ways to Put Creative Writing Into Practice
Creative writing can be broken up into three main categories: fiction, poetry, and creative nonfiction. We’ll look at each one a little more closely.
1. Writing Fiction
Fiction takes many forms, such as novels, short stories, and screenplays. What makes fiction unique is that it is the crafting of a story that doesn’t actually exist. We create the details of the story out of our own imaginations.
The key to writing compelling fiction is introducing your readers to a complex character they can invest in over the course of the character’s journey to try to get something she wants.
Stories are essentially about a character who wants something, faces opposition to get it, and undergoes a significant character change along the journey of trying to attain the goal.
Stories take people on a journey and engage them on a personal level, and they’re a fun way to exercise your creativity and tap into your imagination.
2. Writing Poetry
When you woke up this morning, how did you feel? What about on your way to work? How about when the weekend finally arrived? When you had a disagreement with your significant other? When someone told you that you were doing a great job?
We have a wide range of emotions that we feel throughout any given day. This is the human condition. And the worst thing you can do is bottle up your emotions.
Poetry is all about emotional expression. Poetry uses a variety of literary and poetic elements, such as figurative and sensory language, rhyme, rhythm, repetition, and other poetic devices.
There are many different forms of poetry you can learn about, but the point is to capture emotions and communicate them in a concrete way through poetry.
For example, “His words cut like a knife” is a more creative and poetic way of saying he said something to cause her emotional pain.
3. Writing Creative Non-Fiction
Creative non-fiction is telling stories that are true in a creative way. A creative writer might learn about a story of a man who rescued a small child from a burning building. A journalistic approach would just report the facts: who the man was, who the child was, where the fire was located, how the man got the child out, etc.
A creative writer would recognize the immersive value of the story and want to write it in a way that engages people’s senses and emotions.
Ron Chernow’s outstanding biography Alexander Hamilton, the basis for the hit Broadway musical Hamilton, is an example of creative non-fiction that engages the reader in the story of the nation’s first treasury secretary using concrete and image-rich language, crafting a compelling narrative that tells a true story.
Creative non-fiction is about finding important true stories that will impact people and telling them in a way that is accurate, creative, and memorable.
For some excellent advice on writing creative non-fiction, I highly recommend Jack Hart’s book Storycraft.
|
https://medium.com/inspired-writer/how-to-become-a-creative-writer-9657a36f351f
|
['Tom Farr']
|
2020-07-30 21:02:49.026000+00:00
|
['Poetry', 'Writing', 'Fiction', 'Creative Writing', 'Creative Non Fiction']
|
5,556 |
Due Diligence
|
Bobby Chiu
We limn the darkness as best we can,
trying to shine light into corners
where the scariest monsters await.
Such shining demands deft courage.
Shaky hands hold the illuminating torch.
Fear is a wise and constant companion.
The darkest corners are in the mind.
Some monsters despise the light.
Look hard, but be wary what you loose.
|
https://medium.com/geezer-speaks/due-diligence-9679f902a3a0
|
['Mike Essig']
|
2017-09-16 07:34:22.633000+00:00
|
['Poetry', 'Consciousness', 'Dreams', 'Fear', 'Psychology']
|
5,557 |
Diana Heinrichs, Lindera: “AI is built for the masses”
|
Diana Heinrichs’ mission is nothing less than to maintain the mobility and autonomy of elderly citizens. With her company Lindera, she has built an app that can analyze body movements and detect fall risks.
Lindera
In our REWRITE TECH podcast, we talked to Diana about her personal motivation to found Lindera and what challenges she had to overcome while founding a tech startup in Germany.
Diana Heinrichs:
As life expectancy rises, the average age of society increases. That brings not only structural problems but also personal difficulties when family members get older and need an increased level of care. The risk of falling increases particularly with age, and the prevention of falls is a common goal in elderly care.
This is exactly where Diana Heinrichs’ idea starts. Lindera translates geriatric assessments into a software-driven product that can be used on smartphones. What at first looked like a relatively simple task was soon confronted with its first pushback, as Diana remembers: “I talked to Fraunhofer ITWM and they said it’s simple, but it won’t work. You can’t analyze three-dimensional pictures through a smartphone camera.”
But Diana wasn’t thrown off by the scientific evaluation. She formulated the mathematical problem and reached out to every single PhD candidate in northern Germany. “This is how we set up our data science team and solved the problem. Now we are the only company in the world that can do 3D geriatric assessments.”
Lindera didn’t just find a solution; they meet the scientific gold standard, and their solution is evidence-based. And just as important: it is easy to use. People only need to record a 30-second video of someone with their own smartphone and provide some additional information regarding the subject’s medication. The 2D recording then gets split into many individual frames, so neural networks can transform it into a 3D model.
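As an illustration of how such a pipeline fits together, here is a minimal sketch in Python. It is not Lindera’s actual system: cv2.VideoCapture is OpenCV’s real frame-reading API, but detect_keypoints_2d and lift_to_3d are hypothetical placeholders for the trained neural networks described above.

import cv2
import numpy as np

def detect_keypoints_2d(frame):
    # Hypothetical placeholder for a trained 2D pose network that
    # would return (num_joints, 2) pixel coordinates for one frame.
    return np.zeros((17, 2))

def lift_to_3d(keypoints_2d):
    # Hypothetical placeholder for a trained 2D-to-3D lifting network
    # that would return (num_joints, 3) coordinates in body space.
    return np.zeros((17, 3))

def analyze_gait(video_path):
    # Split the smartphone recording into individual frames and lift
    # each frame's 2D pose estimate to a 3D skeleton.
    capture = cv2.VideoCapture(video_path)
    poses_3d = []
    while True:
        ok, frame = capture.read()
        if not ok:  # end of the video
            break
        poses_3d.append(lift_to_3d(detect_keypoints_2d(frame)))
    capture.release()
    # Downstream, gait parameters such as step length, cadence, and sway
    # derived from these poses would feed the fall-risk assessment.
    return poses_3d

The per-frame 3D skeletons are what make a gait analysis possible from nothing more than a phone camera.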
“It’s a really nice self-service that puts the senior in the centre of digital attention.”
Tech for the elderly
Initially, the idea of founding Lindera was triggered by a personal experience. Diana’s own grandmother received care provided by Diana’s mother and, somehow, it just worked. “How can it be that in my family it just works?”, Diana wondered.
To get to the core of this, Diana paused her job at Microsoft Germany, took an internship in an outbound care service, and talked to many specialists in the field. “When we look at our ageing society,” she found, “it is absolutely clear that we need new solutions.” Diana decided to focus on fall prevention, one of the most important areas of elderly care and geriatric assessment that wasn’t digitalized yet.
Doing the impossible
The scientists at Fraunhofer weren’t the only ones who didn’t believe in her idea, as she recalls: “Something I know very well is: ‘yes, but’.” Unlike in the United States, it’s uncommon in Germany to combine science and business, which limits the innovative possibilities. “We can’t just outsource innovation to Fraunhofer,” Diana said.
From her experience in selling her solution to care homes, Diana stumbled upon many people who don’t understand how artificial intelligence works and who demand exclusivity. “AI is built for the masses. It’s about building a database, which we can learn from,” she states. For Germany to keep pace with innovation, the understanding of technology must improve, Diana says: “I hope we can get beyond this narrow mindset.”
Listen to REWRITE TECH with Diana Heinrichs from Lindera
Listen to the full conversation with Diana Heinrichs on our REWRITE TECH Podcast, which is available on all common audio streaming platforms including Spotify and Apple Podcasts.
Don’t miss out on our other episodes, including:
Find out more about REWRITE TECH.
|
https://medium.com/rewrite-tech/diana-heinrichs-lindera-ai-is-built-for-the-masses-d417c5d2f581
|
['Sarah Schulze Darup']
|
2020-12-17 14:49:57.705000+00:00
|
['Artificial Intelligence', 'Podcast', 'Women In Tech', 'Healthcare', 'Female Founders']
|
5,558 |
Amsterdam: A History of the World’s Most Liberal City
|
Russell Shorto cherry-picks the most interesting characters and events from his research into the city’s history.
Russell Shorto’s Amsterdam: A History of the World’s Most Liberal City is such an enjoyable book in part because Shorto cherry-picks the most interesting characters and events from his research into the city’s history. Shorto relates these stories in his clear, easy-to-read style, creating a successful popular history as well as making a light foray into intellectual history. Although it covers the city’s history in at least cursory fashion from its foundation in 1200 to the present, it is far from comprehensive; there are large gaps, especially during the city’s decline in the 18th and 19th centuries.
Shorto also has a thesis to prove: Amsterdam is the most liberal city in the world. He admits that this thesis is difficult to establish. For one, there are vagaries in defining the word “liberal” (liber in Latin means “free”) and it means different, even contradictory, things in different eras (a “Liberal” in the Netherlands is actually an economic liberal, and thus more of a conservative). Any claim to Amsterdam’s being the most “liberal” city in the world relies as much on its role in history as its current status as a medium size city with world-class culture, diversity, and, famously, official tolerance for soft drugs and legal prostitution (the latter actually isn’t so unusual in Western Europe). Shorto confronts the uncertain nature of historical influences (how much “credit” can we really give to Amsterdam for the development of Western freedoms?).
Shorto describes how Amsterdam’s liberal mindset was shaped by its water-logged origins. The harsh situation of the first settlers of Amsterdam, their constant struggle against the sea and the river delta called for collective action in water management: building dikes, windmills, bridges, and importantly, committing to this infrastructure for the long term. Such collective action, so typical of the Dutch mindset, proved in the end to be beneficial to the individual as well. Humble individuals ended up owning real estate that they had wrought from the sea. The early Amsterdam settlers were remote from kingly power. So after they reclaimed the land from the sea, it didn’t belong to a church or king. “It was theirs.” (253) (The statement about the land not belonging to church or king and the “It was theirs” line is repeated nearly verbatim on page 279).
Shorto of course identifies economic factors as essential conditions for this rise of “individualism.” For example, in 1500 peasants owned 45% of the land in Holland as opposed to 5% in the rest of Europe (44). They were vested owners of their land, and with that ownership came an individual freedom of action that was hard to find in other parts of Europe, where peasants were generally bound to the manorial system.
So later on, when Philip II of Spain tried to roll back the Protestant Reformation in the Netherlands, one reason that a critical mass of Dutch people supported the revolt was their own vested interest in their country. (Another was that Philip’s plans were so draconian, with execution even if you recanted and returned to Catholicism, that many Dutch Protestants were literally fighting for their lives.)
***
There was nothing inevitable about Amsterdam’s rise to becoming a global economic powerhouse. It got some lucky breaks along the way. The first was the “Miracle of Amsterdam” (1345) when a dying old man took his last communion, vomited it, and the communion host remained whole. The women who were caring for him threw it on a fire, but the wafer did not burn! A miracle was declared and Amsterdam became a place of pilgrimage. Many churches were built and religious tourists came and bought trinkets, not so unlike tourists shopping in the countless souvenir shops one finds in central Amsterdam today. (Shorto omits the fact that another key founding myth of Amsterdam features vomit: reputedly a seasick dog vomited on the spot where early settlers decided to build the first dike, near Nieuwendijk.)
Shorto relates how early technological breakthroughs helped Amsterdam get off the ground. Discoveries in herring preservation (the fish’s liver is a natural preservative) led to the herring bus, a floating herring factory where the herring was packed and preserved on board while the ship stayed at sea. This and other advances eventually enabled the Dutch to dominate the north European market. The herring bus led in turn to a rise in shipbuilding, which became the key infrastructure for Dutch trade and the later colonial empire. Windmills were adapted to become sawmills, so that the Dutch imported German wood and sold finished lumber for export to the English.
The Dutch were also early adopters of the printing press, which led to the dissemination of literacy, unorthodox ideas, and the development of the greatest publishing center of early modern Europe. (Shorto avoids the issue of whether a Dutchman really invented the printing press before Gutenberg, as some Dutch scholars have claimed).
The printing press led to an explosion in the dissemination of knowledge. Erasmus of Rotterdam emerged as a key figure in challenging abuses of the church and asserting the primacy of the individual in interpreting the Bible according to his or her own lights. Erasmus in turn inspired Martin Luther and the Protestant Reformation. If the essence of Protestantism is that the individual reads scripture for oneself (rather than relying solely on Church authority), then Erasmus and Luther each played titanic roles in changing history in favor of the individual (instead of the Church). These ideas found ready reception in the Netherlands, including Amsterdam, especially after Calvin’s important refinements.
Shorto’s chapters on the Dutch golden age are an enjoyable retelling of what he calls “one of history’s classics.” Shorto describes the rise of the Dutch East India Company (VOC is its Dutch acronym) as the first modern corporation (a permanent company with shares for sale to anyone) and its role in reshaping parts of Asia and Africa, even as it enriched Amsterdam. The building of Amsterdam’s canal belt (grachtengordel) was the largest planned urban expansion in Europe since Roman times (according to Geert Mak). Artists such as Rembrandt flourished, and portraiture for the first time was within the reach of the middle class (not just aristocrats). Readers familiar with Simon Schama’s The Embarrassment of Riches, Jonathan Israel’s The Dutch Republic or Geert Mak’s Amsterdam: A brief life of a city may not find much new here, but Shorto is an exuberant story-teller and his enthusiasm for the period is infectious.
The fact that three of the seminal philosophers of the early modern era (Descartes, Spinoza, and Locke) all wrote and published their important works in Amsterdam is strong evidence that the light of liberty may never have burned as brightly in the world at large without the freedom of expression that Amsterdam allowed in this period. Descartes and Locke both came here and published works that they could not have published in France or England. Spinoza was born in Amsterdam, briefly ran his father’s business, and so breathed in its spirit of freedom his whole (short) life. The fact that Spinoza was excommunicated from the Jewish community makes Shorto see him as distinctly modern, the first prominent European not to belong to any religious community and thus an individual par excellence. Spinoza’s Tractatus was perceived as such an incendiary challenge to organized church and state that it was banned even in the Netherlands; but it was published, read, and discussed anyway, and its influence on the French Enlightenment in the following century was enormous.
In 1672 (the rampjaar or “year of disaster” in Dutch history), England, France and even the Bishop of Münster invaded the Netherlands. Although Amsterdam itself wasn’t invaded, the nation as a whole was weakened at land and sea, and soon after had to face a series of major wars against Louis XIV of France. Shorto winds up his coverage of the Golden Age there, even though other historians point out that while Amsterdam weakened in many ways, it was a slow decline, and the city remained a key financial center well into the 18th century. (And the Netherlands even played a key role in helping to fund the early days of the American revolution. See Barbara Tuchman’s The First Salute.) Shorto does offer an entertaining account of William III’s invasion of England, which English historians white-washed into the “Glorious Revolution” partly in order to preserve the notion that England hasn’t been invaded by a foreign power since 1066.
Shorto’s coverage of the eighteenth and nineteenth centuries is achieved mainly through sketches of some emblematic characters, such as Aletta Jacobs, the first Dutch woman to become a doctor and an early proponent of birth control, and Eduard Dekker, the author of the classic anti-colonial novel Max Havelaar. So we see that there is still a tradition of enlightenment in the Netherlands, even during its period of economic senescence.
The twentieth century showed signs of Amsterdam’s resurgence with another massive enlargement of the city and the rise of modern infrastructure and social services. But World War II was a dark chapter in the city’s liberal history. For various reasons, such as their concentration in Amsterdam and the usual Dutch efficiency at record-keeping, the survival rate of Dutch Jews was the lowest of any country in Europe. Amsterdam did hold a general strike as a protest against the Nazi deportation of Jews, which historian Loe De Jong calls “the first and only antipogrom strike in human history.” (268) But the Dutch carried a heavy conscience over their country’s relative lack of action to resist the deportations (compared to Denmark, for example). Shorto sees World War II as a low point of Dutch liberalism: the Dutch, or the Amsterdammers, failed to stop the Nazis from shipping off their Jews to the death camps. So liberalism seems to some extent a position of expediency more than pure idealism.
The post-war period brought important changes to Amsterdam and a resurgence in liberalism, facilitated by a rise in affluence as Europe rebuilt. Outspoken champions of what would later be called gay rights, such as designer Benno Premsela, helped “normalize” homosexuality in the Netherlands. The Provo movement was hugely influential in challenging the status quo and created a climate where the first marijuana coffeeshops were legally tolerated in the early 1970s. Provo also highlighted the importance of cycling and helped call for building what would become the greatest cycling infrastructure of any city in the world.
***
Shorto doesn’t seriously challenge his thesis that liberalism was born in Amsterdam. He writes that if one were to award geographic medals for places that contributed to liberalism, then London, Paris and Monticello (Thomas Jefferson’s home) would all be candidates (18), but he doesn’t really pursue the claims of these other cities.
He also gives Amsterdam perhaps too much exclusive credit for the growth of capitalism, even though other Dutch towns such as Haarlem, and towns of the southern Netherlands (Brugge, Antwerp, etc.), also played important roles. Nor does he mention the important contributions to early capitalism of Venice or the towns of the Hanseatic League (Hamburg, Lübeck, etc.).
Professional historians will find much of this familiar ground and may find little new in it. Unlike in his The Island at the Center of the World, Shorto here doesn’t do much original archival research; this is too large a subject, and so he relies mainly on secondary sources. The source notes also don’t quite cover the origins of all his anecdotes. When Shorto shares the story about a seventeenth-century French naval commander who was surprised that a Dutch sea captain swept out his own quarters (while the French commander had a servant to do it), he doesn’t cite his source. (It’s either from The Embarrassment of Riches or Israel’s The Dutch Republic.)
Shorto’s interviews with Frieda Menco, a Holocaust survivor who knew Anne Frank as a girl, and with Roel van Duijn, founder of the Provo movement, which he incorporates into the narrative as original research, add both an anecdotal quality and fresh material to the book.
Passages about Shorto’s own Amsterdam experiences add a personal dimension to the book. Shorto interjects himself into the text perhaps more than is usual in a popular history. He lived in Amsterdam until just recently, and thus has had the opportunity to meet interesting personages from its twentieth-century history. He mentions several places where he has lived in Amsterdam (and where he works, at the West Indian Company house), and the people connected to these places, and sees them as emblematic of the city’s history.
***
This is not a perfect book. The thesis is intriguing but hard to prove. In the end, the claim that Amsterdam is the most liberal city in the world doesn’t matter that much. It supplies a theme for the work; it makes for a good (if hyperbolic) meme. Shorto focuses on the times and events that support his thesis or add color to the narrative. While some people or eras are covered in depth, there is relatively little about the long, exhausting wars against Louis XIV, the 18th century in general, the Napoleonic wars, the 19th century in general (though “Multatuli” is covered), or World War I and its effects.
Shorto makes a few strange claims, such as: “There is even a case to be made that our modern idea of ‘home’ as an intimate personal space goes back to the Dutch canal houses of this period.” (19) Well, maybe. But farmhouses have long been intimate personal spaces devoted to family; and while the Dutch canal houses (for the merchant class) didn’t have the multi-family scale of a manor or a castle, calling these canal houses the origin of the modern concept of “home” seems like an overreaching claim. What is true is that the Dutch middle class enjoyed an unparalleled rise in their standard of living during the golden age, giving them the ability to afford coffee, tea, sugar, and even portraits in which their own (modest) lives were deemed important enough to be depicted. But surely the Dutch notion of gezelligheid (coziness) has contributed to urban connotations of home.
Shorto also sees the individualism within Dutch society as “seemingly contradictory” (253) to the strong collective tradition in Dutch history. But collectivism and individualism are not really “contradictory”; these are abstractions, and within nearly any society they are both present, but in different measure. They are competing principles, but any enlightened society or philosophical system will find its own balance between the extremes of individuals running amok without collective bonds (as in a libertarian’s fantasy of the U.S., or in an Ayn Rand novel) and larger organizations reducing individuals to utter insignificance (the medieval Church, the absolutist state, the Borg in Star Trek).
When Shorto points out that Amsterdam is a remote place because it lies at the same latitude as Saskatoon, Saskatchewan (16), the comparison is totally unconvincing. Amsterdam is close to the major river deltas (transportation networks) of Western Europe, so its more northerly latitude (compared to, say, Paris) tells us little. London lies at a similar latitude to Amsterdam. Is London remote? From what? Amsterdam was originally remote because it was scarcely habitable. But once the water problem was managed, the city’s location eventually became an asset, helping it to dominate the Baltic trade, for example.
When Shorto describes the home movie from 22 June 1941 in which Anne Frank appears, he mentions other contemporary events from the war (e.g., the Germans had just conquered Crete), but fails to remind us that this was also the very day that Hitler launched Operation Barbarossa, his invasion of Russia (partly in order to capture Jews in the Pale of Settlement). Since it’s the very same day, this might have been a useful (and dramatic) fact to mention.
And not that it matters, but the Mellow Yellow marijuana coffeeshop (which claims to be the first one) is not on Weesperzijde as Shorto writes (301), but on Vijzelstraat.
Dutch readers are likely to have varied reactions to the book, depending on their sensibility. Some might be more skeptical of Shorto’s claims about the unique contribution of Amsterdam. The book tends to treat some features of Dutch national policy as if Amsterdam, a mere city, were largely (solely?) responsible for them. “But then too, Amsterdam is not the Netherlands. I have been guilty in this book of sometimes seeming to equate the two. Every Dutch person who is from outside the city will be ready to counter the notion.” (281) Indeed.
But these are all quibbles. Amsterdam: A History of the World’s Most Liberal City succeeds as a popular introduction to a glorious history. Much as Shorto justly receives credit for drawing more attention to the role of the Dutch in early New York history in his The Island at the Center of the World, now he will draw accolades for emphasizing the role of Amsterdam in the explosion of new ideas in the 17th century that inspired the 18th century philosophes, the Founding Fathers of the U.S., and other philosophers of freedom ever since.
[ See other reviews of Dutch history:
The Island at the Center of the World (also by Russell Shorto);
The Dutch Republic by Jonathan I. Israel;
The First Salute by Barbara Tuchman]
|
https://dan-geddes.medium.com/amsterdam-a-history-of-the-worlds-most-liberal-city-b6581a5e0fb9
|
['Dan Geddes']
|
2019-11-19 18:46:26.665000+00:00
|
['Books', 'Dutch', 'Netherlands', 'History', 'Amsterdam']
|
5,559 |
The Enchanting Lakes of Pakistan
|
The Enchanting Lakes of Pakistan
Pakistan is a very beautiful country to visit. It has a lot to offer tourists from all around the world: beautiful lakes, rivers, green meadows, lovely valleys, and some of the world’s most beautiful peaks.
Today in my story I’ll take you on a tour of Pakistan’s enchanting lakes. I hope you will fall in love with these beautiful lakes of Pakistan.
Saif-ul-Malook Lake Kaghan Valley
Saif-ul-Malook is the most beautiful and one of the most famous tourist destinations in the Kaghan Valley. It is known as the Lake of Fairies.
Lulusar Lake Kaghan Valley
Lulusar Lake is another beautiful lake of the Kaghan Valley, about 30 to 40 minutes from Saif-ul-Malook Lake.
Every year a large number of birds from Russia come to this lake.
Lulusar is actually the name of a combination of high hills and lakes. Tourists who come to Naran make a point of seeing Lulusar Lake, which is the main source of water for the Kunhar River. The water of the lake is as clear as glass, and the reflection of the snow-capped mountains around Lulusar captivates viewers.
Ansoo Lake
This lake is only accessible from June to October. It takes almost 8 hours of trekking from Saif-ul-Malook Lake to reach Ansoo Lake. Adventure lovers camp at Ansoo Lake.
Ansoo Lake — Image by Author
Dudipatsar Lake
Dudipatsar Lake is located at an altitude of 4,175 meters in the extreme north of the Kaghan Valley and can be reached in a four-hour drive from Jalkhad. In the local language, Dudipatsar means milky white water lake.
The reflection of the adjacent snow-capped mountains in the crystal clear water makes the lake look like a canal of milk from afar. This is the main reason it is called Dudipatsar Lake.
The magical scenery of the area compels tourists to pitch their tents here to enjoy the natural beauty. Getting there is a difficult and arduous task: access to the lake is possible only by walking for at least seven to twelve hours along extremely difficult paths. But after the long trek, when tourists finally see this lake, their tiredness disappears magically.
Satpara Lake
Satpara Lake is located at an altitude of 8,500 feet above sea level. The lake is full of fresh water, and the beautiful snow-capped mountains around it make it even more beautiful.
Rush Lake
Rush Lake is the highest lake in Pakistan, located at an altitude of 5,098 meters near a peak called Rush Pari. It is the 25th highest lake in the world, accessible via the Nagar and Hopper Glacier routes, and the scenery is breathtaking.
Karombar Lake
Karombar Lake is the second-highest lake in Pakistan and the 31st highest in the world. It is located on the border between Khyber Pakhtunkhwa (KPK) and Gilgit-Baltistan (GB). At an altitude of 14,121 feet, it is a biologically active lake, about 55 meters deep, about 4 kilometers long, and 2 kilometers wide.
Situated in the Broghil Valley, the beautiful Karombar Lake lies more than two hundred and fifty kilometers from the city of Chitral. The Broghil Valley is famous for its beautiful scenery, snow-capped peaks, the magnificent Karombar Lake, and more than twenty-five smaller lakes, as well as three major passes.
Haleji Lake
Haleji Lake is located 80 km from Karachi on the National Highway; it was built by the British authorities during World War II as a safe water reservoir. The lake, which covers about 22,000 acres, has a diameter of 18 km.
Millions of birds once migrated to Haleji Lake in the winter to make a temporary home.
Apart from birds, about 200 other species were recorded here, but now only exotic seasonal birds are seen. Due to the lack of clean water the lake is gradually drying up, while bushes are rapidly engulfing it.
Shangrila Lake or Kachura Lake
Located in the Skardu Valley, it is the most beautiful lake in Pakistan. Kachura Lake is actually two lakes, one called ‘Upper Kachura Lake’ and the other ‘Lower Kachura Lake’ or Shangri-La Lake.
Upper Kachura Lake
Upper Kachura Lake is a clear-water lake with a depth of about 70 meters. The Indus River flows a little below it, nearby. In summer the temperature here is 10 to 15 degrees Celsius, while in winter the temperature drops far below freezing point, so the lake water freezes completely.
Similarly, Lower Kachura Lake, or Shangri-La Lake, is also called the second most beautiful lake in Pakistan, because its view can enchant anyone.
Lower Kachura Lake
Lower Kachura Lake, or Shangri-La Lake, is actually part of the Shangri-La Rest House, a popular tourist resort located about 25 minutes by car from Skardu city.
The highlight of the Shangri-La Rest House is its restaurant, which is built in the structure of an aircraft. The rest house itself is a model of Chinese architecture, attracting a large number of tourists.
Shandur Lake
Shandur Lake, with its polo ground, looks like a great masterpiece of nature. It is three miles long and one mile wide. Rare birds live at this lake, and interestingly there is no apparent discharge of water from it; in other words, the water appears to be stagnant.
Hanna Lake
Hanna Lake was formed in 1894, during the reign of the British Crown, in the rocky cliffs about ten kilometers north of Quetta, to supply cheap groundwater to the people and to irrigate the surrounding lands.
The water level in the lake remained steady from 1894 to 1997, but due to lack of proper maintenance the lake was completely dry from 2000 to 2004. Despite ticket revenues of millions of rupees annually, no attention was paid to improving the lake. The water level in Hanna Lake has now started falling gradually; at present it has come down to eight feet. The falling water level has affected Siberian bird sanctuaries and tourism, as well as the surrounding gardens.
Ratti Gali Lake
Located at an altitude of 12,000 feet above sea level, Ratti Gali Lake is the crest of the Neelum Valley.
|
https://medium.com/world-travelers-blog/the-enchanting-lakes-of-pakistan-a87ffa614541
|
['Muhammad Sakhawat']
|
2020-12-29 09:44:34.938000+00:00
|
['Traveling', 'Travel', 'Pakistan', 'Beautiful Lakes', 'Asia']
|
5,560 |
It’s Not Easy to Parent When Your Soul Is Leaving Your Body
|
It’s Not Easy to Parent When Your Soul Is Leaving Your Body
My recovery from postpartum depression was long and filled with brain zaps
Photo: Ade Santora/Getty Images
My sweet baby was almost two-and-a-half years old, and still I was fighting postpartum depression (PPD).
Fighting, yes. It wasn’t a quiet depression — it couldn’t be — because I had a child to raise. She needed me to sing and dance, so her mind could grow healthy and strong. So she wouldn’t be crazy like me.
All the sanity I had, I gave to her.
For myself? I needed therapy, but who has time when you’re nursing a baby? When you’re not even sleeping?
I attended one therapy session, with my sleeping newborn on my lap, as I bawled and scribbled down advice about communicating with my husband, and how to explain to him what PPD feels like.
She taught me how to lead with appreciation instead of accusation: “I know we’re both giving 110% of ourselves all the time.”
If I could make him care, maybe he could care for me, so I could care for her.
Or if that failed, he could care for her, and I could cease to be. That felt more likely.
It was an impossible time. When my doctor suggested meds at my daughter’s six-week weigh-in, I answered, “I’ll do anything.”
I started Sertraline (Zoloft). Yeah, it helped with the postpartum anxiety. I started to sleep a little. Now my baby kept me up instead of my mind.
But at a year old, she arched her back, away from my breasts and their dwindling supply of milk. When she weaned, everything changed. I experienced another puberty — why does no one tell you that? My chemistry changed, and Sertraline quit working for me.
So I started Fluoxetine (Prozac) and felt great. This was life! Except for the constant nausea. I couldn’t get out of bed, couldn’t keep food down. My even, happy mind was trapped in this nauseous body.
My doctor suggested Venlafaxine (Effexor). No. No. No. I wish I could go back. Wish I could tell myself back then to quit the meds, to finally start therapy. Maybe I didn’t have time for it when she was born, but I did now.
“With Venlafaxine, you won’t have nausea,” my doctor said, “but people say the withdrawal is just like with heroin.”
I stayed, of course, and together we created wounds that may never heal.
I brushed her words away. I needed to feel better. For my daughter. Always for her. Not for me. I didn’t even know who I was. And didn’t care.
I had ceased to be.
My doctor was right about the withdrawal.
One month lost. A month of pain and tears and exhaustion, of brain zaps, of depersonalization (my soul leaving my body). The month when my husband told me I needed to take this healing somewhere else. I stayed, of course, and together we created wounds that may never heal.
This was the month my two-year-old daughter developed her first fears. In a moment where I felt well enough to stand, I cleaned her wipeable placemat, erased a doodle I’d drawn of myself: a smiley face with curly hair.
When she saw the placemat, she started screaming, “Mama erase! Mama erase!”
She wouldn’t leave her bedroom for a day and a night. I drew a new doodle, but this upset her more. She was afraid her real mother was disappearing too. And she was right. I don’t believe in souls, and yet it felt like my soul had definitely left my body.
I spent my days sleeping, crying, battling fevers, wincing from constant brain zaps, and cuddling my daughter under a blanket. Yes, every day we cuddled, and I read stacks of books to her, even though I knew my voice sounded flat.
“Mama’s sick, but I’ll be better soon,” I told her. “I’m sorry. I love you so much.”
My soul was gone, but I didn’t want her to know. I didn’t want my husband to be right, that me being there — but also not there — was worse than me just being all the way gone.
He didn’t mean dead, but that’s where my mind went. Not for the first time.
“Go stay in a motel.”
“I just want to heal in my home,” I pleaded. “And it’s not like we could afford that anyway.”
“It’s not fair to us. It’s not fair to her. She shouldn’t have to see you like this.”
“I’m giving her love every day. I’m doing much more than I feel capable of. You don’t know how hard this is.”
“We don’t need your help.”
“She’s my daughter. She needs me and I need her.”
“This isn’t about you.”
|
https://humanparts.medium.com/its-not-easy-to-parent-when-your-soul-is-leaving-your-body-1011d9f8573d
|
['Darcy Reeder']
|
2019-06-28 12:47:52.014000+00:00
|
['Human Prompt', 'Mental Health', 'Mind', 'Soul', 'Parenting']
|
5,561 |
I never understood JavaScript closures
|
I never understood JavaScript closures
Until someone explained it to me like this …
As the title states, JavaScript closures have always been a bit of a mystery to me. I have read multiple articles, I have used closures in my work, sometimes I even used a closure without realizing I was using a closure.
Recently I went to a talk where someone explained it in a way that finally made it click for me. I’ll take that approach to explain closures in this article. Let me give credit to the great folks at CodeSmith and their JavaScript The Hard Parts series.
Before we start
Some concepts are important to grok before you can grok closures. One of them is the execution context.
This article has a very good primer on Execution Context. To quote the article:
When code is run in JavaScript, the environment in which it is executed is very important, and is evaluated as 1 of the following:
Global code — The default environment where your code is executed for the first time.
Function code — Whenever the flow of execution enters a function body. (…)
(…), let’s think of the term execution context as the environment / scope the current code is being evaluated in.
In other words, as we start the program, we start in the global execution context. Some variables are declared within the global execution context. We call these global variables. When the program calls a function, what happens? A few steps:
1. JavaScript creates a new execution context, a local execution context.
2. That local execution context will have its own set of variables; these variables will be local to that execution context.
3. The new execution context is thrown onto the execution stack. Think of the execution stack as a mechanism to keep track of where the program is in its execution.
When does the function end? When it encounters a return statement or the closing bracket }. When a function ends, the following happens:
1. The local execution context pops off the execution stack.
2. The function sends the return value back to the calling context. The calling context is the execution context that called this function; it could be the global execution context or another local execution context. It is up to the calling execution context to deal with the return value at that point. The returned value could be an object, an array, a function, a boolean, anything really. If the function has no return statement, undefined is returned.
3. The local execution context is destroyed. This is important. Destroyed. All the variables that were declared within the local execution context are erased. They are no longer available. That’s why they’re called local variables.
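To see that destruction concretely, here is a minimal sketch of my own (not from the talk or the articles cited above): once a function returns, its local variables are gone, and only the returned value survives.

function greet() {
  let message = 'hello' // lives in greet's local execution context
  return message // the value is sent back to the calling context
}

console.log(greet()) // 'hello': the returned value survives
console.log(typeof message) // 'undefined': the local variable does not
// console.log(message) would throw: ReferenceError: message is not defined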
A very basic example
Before we get to closures, let’s take a look at the following piece of code. It seems very straightforward, anybody reading this article probably knows exactly what it does.
1: let a = 3
2: function addTwo(x) {
3: let ret = x + 2
4: return ret
5: }
6: let b = addTwo(a)
7: console.log(b)
In order to understand how the JavaScript engine really works, let’s break this down in great detail.
1. On line 1 we declare a new variable a in the global execution context and assign it the number 3.
2. Next it gets tricky. Lines 2 through 5 are really together. What happens here? We declare a new variable named addTwo in the global execution context. And what do we assign to it? A function definition. Whatever is between the two brackets { } is assigned to addTwo. The code inside the function is not evaluated, not executed, just stored into a variable for future use.
3. So now we’re at line 6. It looks simple, but there is much to unpack here. First we declare a new variable in the global execution context and label it b. As soon as a variable is declared it has the value of undefined.
4. Next, still on line 6, we see an assignment operator. We are getting ready to assign a new value to the variable b. Next we see a function being called. When you see a variable followed by round brackets (…), that’s the signal that a function is being called. Flash forward: every function returns something (either a value, an object, or undefined). Whatever is returned from the function will be assigned to variable b.
5. But first we need to call the function labeled addTwo. JavaScript will go and look in its global execution context memory for a variable named addTwo. Oh, it found one; it was defined in step 2 (or lines 2–5). And lo and behold, the variable addTwo contains a function definition.
6. Note that the variable a is passed as an argument to the function. JavaScript searches for a variable a in its global execution context memory, finds it, finds that its value is 3, and passes the number 3 as an argument to the function. Ready to execute the function.
7. Now the execution context will switch. A new local execution context is created; let’s name it the ‘addTwo execution context’. The execution context is pushed onto the call stack. What is the first thing we do in the local execution context?
8. You may be tempted to say, “A new variable ret is declared in the local execution context”. That is not the answer. The correct answer is, we need to look at the parameters of the function first. A new variable x is declared in the local execution context. And since the value 3 was passed as an argument, the variable x is assigned the number 3.
9. A new variable ret is declared in the local execution context. Its value is set to undefined. (line 3)
10. Still line 3, an addition needs to be performed. First we need the value of x. JavaScript will look for a variable x. It will look in the local execution context first. And it found one; the value is 3. The second operand is the number 2. The result of the addition (5) is assigned to the variable ret.
11. Line 4. We return the content of the variable ret. Another lookup in the local execution context. ret contains the value 5. The function returns the number 5. And the function ends.
12. Lines 4–5. The function ends. The local execution context is destroyed. The variables x and ret are wiped out. They no longer exist. The context is popped off the call stack and the return value is returned to the calling context. In this case the calling context is the global execution context, because the function addTwo was called from the global execution context.
13. Now we pick up where we left off in step 4. The returned value (the number 5) gets assigned to the variable b. We are still at line 6 of the little program.
14. I am not going into detail, but in line 7, the content of variable b gets printed in the console. In our example, the number 5.
That was a very long winded explanation for a very simple program, and we haven’t even touched upon closures yet. We will get there I promise. But first we need to take another detour or two.
Lexical scope.
We need to understand some aspects of lexical scope. Take a look at the following example.
1: let val1 = 2
2: function multiplyThis(n) {
3: let ret = n * val1
4: return ret
5: }
6: let multiplied = multiplyThis(6)
7: console.log('example of scope:', multiplied)
The idea here is that we have variables in the local execution context and variables in the global execution context. One intricacy of JavaScript is how it looks for variables. If it can’t find a variable in its local execution context, it will look for it in its calling context. And if it is not found there, in that context’s calling context, and so on, repeatedly, until it is looking in the global execution context. (And if it does not find it there, it’s undefined.) Follow along with the example above; it will clarify this. If you understand how scope works, you can skip this.
1. Declare a new variable val1 in the global execution context and assign it the number 2.
2. Lines 2–5. Declare a new variable multiplyThis and assign it a function definition.
3. Line 6. Declare a new variable multiplied in the global execution context.
4. Retrieve the variable multiplyThis from the global execution context memory and execute it as a function. Pass the number 6 as an argument.
5. New function call = new execution context. Create a new local execution context.
6. In the local execution context, declare a variable n and assign it the number 6.
7. Line 3. In the local execution context, declare a variable ret.
8. Line 3 (continued). Perform a multiplication with two operands: the content of the variables n and val1.
9. Look up the variable n in the local execution context. We declared it in step 6. Its content is the number 6.
10. Look up the variable val1 in the local execution context. The local execution context does not have a variable labeled val1. Let’s check the calling context. The calling context is the global execution context. Let’s look for val1 in the global execution context. Oh yes, it’s there. It was defined in step 1. The value is the number 2.
11. Line 3 (continued). Multiply the two operands and assign the result to the ret variable. 6 * 2 = 12. ret is now 12.
12. Return the ret variable. The local execution context is destroyed, along with its variables ret and n. The variable val1 is not destroyed, as it was part of the global execution context.
13. Back to line 6. In the calling context, the number 12 is assigned to the multiplied variable.
14. Finally, on line 7, we show the value of the multiplied variable in the console.
So in this example, we need to remember that a function has access to variables that are defined in its calling context. More precisely, the lookup follows the chain of scopes in which the function was defined; in this example that chain happens to coincide with the calling context. The formal name of this phenomenon is lexical scope.
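Here is a minimal sketch of my own (not from the article) showing that lookup chain with one more level of nesting: the innermost function first checks its own scope, then each enclosing scope, and finally the global scope.

let globalVal = 1 // global execution context

function outer() {
  let outerVal = 2 // outer's local execution context
  function inner() {
    let innerVal = 3 // inner's local execution context
    // innerVal is found locally, outerVal one level up, globalVal at the top
    return innerVal + outerVal + globalVal
  }
  return inner()
}

console.log('nested scope example:', outer()) // 6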
A function that returns a function
In the first example the function addTwo returns a number. Remember from earlier that a function can return anything. Let’s look at an example of a function that returns a function, as this is essential to understanding closures. Here is the example that we are going to analyze.
1: let val = 7
2: function createAdder() {
3: function addNumbers(a, b) {
4: let ret = a + b
5: return ret
6: }
7: return addNumbers
8: }
9: let adder = createAdder()
10: let sum = adder(val, 8)
11: console.log('example of function returning a function: ', sum)
Let’s go back to the step-by-step breakdown.
1. Line 1. We declare a variable val in the global execution context and assign the number 7 to that variable.
2. Lines 2–8. We declare a variable named createAdder in the global execution context and we assign a function definition to it. Lines 3 to 7 describe said function definition. As before, at this point, we are not jumping into that function. We just store the function definition into that variable (createAdder).
3. Line 9. We declare a new variable, named adder, in the global execution context. Temporarily, undefined is assigned to adder.
4. Still line 9. We see the brackets (); we need to execute or call a function. Let’s query the global execution context’s memory and look for a variable named createAdder. It was created in step 2. OK, let’s call it.
5. Calling a function. Now we’re at line 2. A new local execution context is created. We can create local variables in the new execution context. The engine adds the new context to the call stack. The function has no arguments; let’s jump right into the body of it.
6. Lines 3–6. We have a new function declaration. We create a variable addNumbers in the local execution context. This is important: addNumbers exists only in the local execution context. We store a function definition in the local variable named addNumbers.
7. Now we’re at line 7. We return the content of the variable addNumbers. The engine looks for a variable named addNumbers and finds it. It’s a function definition. Fine, a function can return anything, including a function definition. So we return the definition of addNumbers. Anything between the brackets on lines 4 and 5 makes up the function definition. We also remove the local execution context from the call stack.
8. Upon return, the local execution context is destroyed. The addNumbers variable is no more. The function definition still exists though; it is returned from the function and it is assigned to the variable adder; that is the variable we created in step 3.
9. Now we’re at line 10. We define a new variable sum in the global execution context. Temporary assignment is undefined.
10. We need to execute a function next. Which function? The function that is defined in the variable named adder. We look it up in the global execution context, and sure enough we find it. It’s a function that takes two parameters.
11. Let’s retrieve the two parameters, so we can call the function and pass the correct arguments. The first one is the variable val, which we defined in step 1 (it represents the number 7), and the second one is the number 8.
12. Now we have to execute that function. The function definition is outlined in lines 3–5. A new local execution context is created. Within the local context two new variables are created: a and b. They are respectively assigned the values 7 and 8, as those were the arguments we passed to the function in the previous step.
13. Line 4. A new variable is declared, named ret. It is declared in the local execution context.
14. Line 4. An addition is performed, where we add the content of variable a and the content of variable b. The result of the addition (15) is assigned to the ret variable.
15. The ret variable is returned from that function. The local execution context is destroyed; it is removed from the call stack; the variables a, b and ret no longer exist.
16. The returned value is assigned to the sum variable we defined in step 9.
17. We print out the value of sum to the console.
As expected the console will print 15. We really jump through a bunch of hoops here, but I am trying to illustrate a few points. First, a function definition can be stored in a variable; the function definition is invisible to the program until it gets called. Second, every time a function gets called, a local execution context is (temporarily) created. That execution context vanishes when the function is done. A function is done when it encounters return or the closing bracket }.
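As a quick variation of my own (not from the article), here is another function that returns a function; nothing inside the returned definition runs until it is called later.

function createGreeter() {
  function greet(name) {
    return 'Hello, ' + name
  }
  return greet // return the definition itself, not the result of calling it
}

const greeter = createGreeter() // greeter now holds the definition of greet
console.log(greeter('Ada')) // 'Hello, Ada': only now does the body of greet run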
Finally, a closure
Take a look at the next code and try to figure out what will happen.
1: function createCounter () {
2: let counter = 0
3: const myFunction = function() {
4: counter = counter + 1
5: return counter
6: }
7: return myFunction
8: }
9: const increment = createCounter()
10: const c1 = increment()
11: const c2 = increment()
12: const c3 = increment()
13: console.log('example increment', c1, c2, c3)
Now that we got the hang of it from the previous two examples, let’s zip through the execution of this, as we expect it to run.
1. Lines 1–8. We create a new variable createCounter in the global execution context and it gets assigned a function definition.
2. Line 9. We declare a new variable named increment in the global execution context.
3. Line 9 again. We need to call the createCounter function and assign its returned value to the increment variable.
4. Lines 1–8. Calling the function. Creating a new local execution context.
5. Line 2. Within the local execution context, declare a new variable named counter. The number 0 is assigned to counter.
6. Lines 3–6. Declaring a new variable named myFunction. The variable is declared in the local execution context. The content of the variable is yet another function definition, as defined in lines 4 and 5.
7. Line 7. Returning the content of the myFunction variable. The local execution context is deleted. myFunction and counter no longer exist. Control is returned to the calling context.
8. Line 9. In the calling context, the global execution context, the value returned by createCounter is assigned to increment. The variable increment now contains a function definition: the function definition that was returned by createCounter. It is no longer labeled myFunction, but it is the same definition. Within the global context, it is labeled increment.
9. Line 10. Declare a new variable (c1).
10. Line 10 (continued). Look up the variable increment; it’s a function, call it. It contains the function definition returned from earlier, as defined in lines 4–5.
11. Create a new execution context. There are no parameters. Start executing the function.
12. Line 4. counter = counter + 1. Look up the variable counter in the local execution context. We just created that context and never declared any local variables. Let’s look in the global execution context. No variable labeled counter here. JavaScript will evaluate this as counter = undefined + 1, declare a new local variable labeled counter, and assign it the number 1, as undefined is sort of 0.
13. Line 5. We return the content of counter, or the number 1. We destroy the local execution context, and the counter variable.
14. Back to line 10. The returned value (1) gets assigned to c1.
15. Line 11. We repeat steps 10–14; c2 gets assigned 1 also.
16. Line 12. We repeat steps 10–14; c3 gets assigned 1 also.
17. Line 13. We log the content of variables c1, c2 and c3.
Try this out for yourself and see what happens. You’ll notice that it is not logging 1 , 1 , and 1 as you may expect from my explanation above. Instead it is logging 1 , 2 and 3 . So what gives?
Somehow, the increment function remembers that counter value. How is that working?
Is counter part of the global execution context? Try console.log(counter) and you’ll get a ReferenceError, because no counter is defined there. So that’s not it.
Maybe, when you call increment, somehow it goes back to the function where it was created (createCounter)? How would that even work? The variable increment contains the function definition, not where it came from. So that’s not it.
So there must be another mechanism. The Closure. We finally got to it, the missing piece.
Here is how it works. Whenever you declare a new function and assign it to a variable, you store the function definition, as well as a closure. The closure contains all the variables that are in scope at the time of creation of the function. It is analogous to a backpack. A function definition comes with a little backpack. And in its pack it stores all the variables that were in scope at the time that the function definition was created.
So our explanation above was all wrong, let’s try it again, but correctly this time.
1: function createCounter () {
2: let counter = 0
3: const myFunction = function() {
4: counter = counter + 1
5: return counter
6: }
7: return myFunction
8: }
9: const increment = createCounter()
10: const c1 = increment()
11: const c2 = increment()
12: const c3 = increment()
13: console.log('example increment', c1, c2, c3)
Lines 1–8. We create a new variable createCounter in the global execution context, and it gets assigned a function definition. Same as above.
Line 9. We declare a new variable named increment in the global execution context. Same as above.
Line 9 again. We call the createCounter function and assign its returned value to the increment variable. Same as above.
Lines 1–8. Calling the function. A new local execution context is created. Same as above.
Line 2. Within the local execution context, we declare a new variable named counter . The number 0 is assigned to counter . Same as above.
Lines 3–6. We declare a new variable named myFunction . The variable is declared in the local execution context. The content of the variable is yet another function definition, as defined in lines 4 and 5. Now we also create a closure and include it as part of the function definition. The closure contains the variables that are in scope, in this case the variable counter (with the value of 0 ).
Line 7. We return the content of the myFunction variable. The local execution context is deleted. myFunction and counter no longer exist in it. Control is returned to the calling context. So we are returning the function definition and its closure: the backpack with the variables that were in scope when it was created.
Line 9. In the calling context, the global execution context, the value returned by createCounter is assigned to increment . The variable increment now contains a function definition (and a closure): the function definition that was returned by createCounter . It is no longer labeled myFunction , but it is the same definition. Within the global context, it is called increment .
Line 10. We declare a new variable ( c1 ).
Line 10 (continued). We look up the variable increment ; it’s a function, so we call it. It contains the function definition returned earlier, as defined in lines 4–5 (and it also has a backpack with variables). We create a new execution context. There are no parameters. We start executing the function.
Line 4. counter = counter + 1 . We need to look up the variable counter . Before we look in the local or global execution context, let’s look in our backpack. Let’s check the closure. Lo and behold, the closure contains a variable named counter , and its value is 0 . After the expression on line 4, its value is set to 1 , and it is stored in the backpack again. The closure now contains the variable counter with a value of 1 .
Line 5. We return the content of counter , the number 1 . We destroy the local execution context. Back to line 10.
Line 10 (continued). The returned value ( 1 ) gets assigned to c1 .
Line 11. We repeat the whole call sequence from line 10. This time, when we look in our closure, we see that the counter variable has a value of 1 ; it was set during the previous call, on line 4 of the program. Its value gets incremented and stored as 2 in the closure of the increment function, and c2 gets assigned 2 .
Line 12. We repeat the call sequence once more; c3 gets assigned 3 .
Line 13. We log the contents of the variables c1 , c2 and c3 .
So now we understand how this works. The key to remember is that when a function gets declared, it contains a function definition and a closure. The closure is a collection of all the variables in scope at the time of creation of the function.
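To test this mental model, here is a minimal sketch of my own (not from the original walkthrough) that reuses the createCounter function from above. Every call to createCounter packs a fresh backpack, so two counters never interfere with each other.
const incrementA = createCounter() // gets its own backpack, with its own counter
const incrementB = createCounter() // gets a second, independent backpack
console.log(incrementA()) // 1
console.log(incrementA()) // 2
console.log(incrementB()) // 1, because B's counter started at 0 and A's calls never touched it
If the two functions shared one closure, the last call would have logged 3 ; the fact that it logs 1 shows that each function definition carries its own backpack.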
You may ask, does every function have a closure, even functions created in the global scope? The answer is yes. Functions created in the global scope also create a closure. But since these functions already have access to all the variables in the global scope, the closure concept is not really relevant there.
When a function returns a function, that is when the concept of closures becomes more relevant. The returned function has access to variables that are not in the global scope, but that exist solely in its closure.
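To make that concrete, here is a minimal sketch of my own ( createGetter and secret are hypothetical names, not from the article). After createGetter returns, the variable secret is not in the global scope and not in any live execution context; it exists solely in the returned function’s closure.
function createGetter() {
  let secret = 'hidden value' // in scope when the inner function is created
  return function() {
    return secret // reachable only through the closure
  }
}
const getSecret = createGetter()
console.log(getSecret()) // 'hidden value'
// console.log(secret) out here would throw a ReferenceError; secret lives only in the backpack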
Not so trivial closures
Sometimes closures show up when you don’t even notice them. You may have seen an example of what we call partial application, like in the following code.
let c = 4
const addX = x => n => n + x
const addThree = addX(3)
let d = addThree(c)
console.log('example partial application', d)
In case the arrow function throws you off, here is the equivalent.
let c = 4
function addX(x) {
return function(n) {
return n + x
}
}
const addThree = addX(3)
let d = addThree(c)
console.log('example partial application', d)
We declare a generic adder function addX that takes one parameter ( x ) and returns another function.
The returned function also takes one parameter and adds it to the variable x .
The variable x is part of the closure. When the variable addThree gets declared (in the global context, in this example), it is assigned a function definition and a closure. The closure contains the variable x .
So now, when addThree is called and executed, it has access to the variable x from its closure, as well as the variable n that was passed as an argument, and it is able to return the sum.
In this example the console will print the number 7 .
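And because every call to addX packs a fresh backpack, you can mint several specialized adders from the same function. A small sketch of my own, reusing addX and addThree from above:
const addTen = addX(10) // a second closure, this one holding x = 10
console.log(addThree(1)) // 4, because addThree's backpack still holds x = 3
console.log(addTen(1)) // 11, because addTen's backpack holds x = 10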
Conclusion
The way I will always remember closures is through the backpack analogy. When a function gets created and passed around or returned from another function, it carries a backpack with it. And in the backpack are all the variables that were in scope when the function was declared.
|
https://medium.com/dailyjs/i-never-understood-javascript-closures-9663703368e8
|
['Olivier De Meulder']
|
2017-10-31 02:30:06.242000+00:00
|
['JavaScript', 'Software Development', 'Software Engineering', 'Closure', 'Programming']
|
5,562 |
5 Considerations for Building a 5-Star FireTV App
|
The Android operating system (OS) is used across multiple devices and platforms and is currently the most popular mobile operating system. At the moment, Android powers more than 2 billion devices, and many of those devices run on variations of the Android software development kit (SDK), such as Amazon’s FireTV OS, Nokia’s X platform, and Alibaba’s Aliyun OS, to name a few. As a result, applications with very different architectures can be built that all share the same set of application programming interfaces (APIs) from the Android SDK.
After years of experience developing exclusively for Android mobile devices, I’ve come up with certain development patterns that can be replicated to help bring a quality mobile app to market. On the other hand, FireTV development requires some additional review and slight adjustments to those patterns based on the specifics of this platform. In this article, we will look into some aspects of FireTV development and some of the lessons learned from our experience developing a 5-star app for one of the biggest media providers.
Performance
At the time of writing, from a hardware perspective, FireTVs are less performant devices than the most up-to-date Android mobile phones. This means a developer needs to take a more diligent approach to memory allocation, data processing, and algorithms while developing a FireTV app. These technical limitations often cause image stuttering and slowness. To avoid these issues, the best approach is to do as much data-processing work as possible on the server side and to send only the necessary data through the RESTful API. This avoids unnecessary sorting and filtering on the client side, which is expensive in both memory and processing power.
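To illustrate the idea, here is a minimal sketch of my own in JavaScript (Node.js with Express; the stack, endpoint, and fields are all assumptions on my part, since the article does not prescribe any). The server filters, sorts, and trims the payload so the FireTV client never has to.
const express = require('express') // assumes the Express package is installed
const app = express()
// Hypothetical in-memory catalog; in a real app this would come from a database.
const catalog = [
  { id: 1, title: 'Show A', rating: 4.8, artwork: 'a.jpg', description: '...' },
  { id: 2, title: 'Show B', rating: 3.9, artwork: 'b.jpg', description: '...' }
]
app.get('/api/titles', (req, res) => {
  const minRating = Number(req.query.minRating) || 0
  const results = catalog
    .filter(item => item.rating >= minRating) // filter on the server...
    .sort((a, b) => b.rating - a.rating) // ...sort on the server...
    .slice(0, 20) // ...cap the payload size...
    .map(({ id, title, artwork }) => ({ id, title, artwork })) // ...and send only what the UI renders
  res.json(results)
})
app.listen(3000)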
Power Supply
Unlike mobile devices, FireTVs have uninterrupted access to a power supply and, at first sight, may not seem to require battery-saving optimizations. However, most battery-expensive work is related to central processing unit (CPU) usage and network connectivity, which are the same factors that impact performance. This means that even though FireTVs do not have batteries and have permanent access to a power supply, implementing those same optimizations will considerably help performance and avoid lag in user interface (UI) rendering.
Network Connection
Another out-of-the-box advantage of FireTV devices, unlike mobile phones, is a reliable, fast, relatively inexpensive, high-bandwidth network connection, which gives developers a bit more freedom in architecting the app. Depending on the case, engineers can reduce cache sizes and rely more often on data updates from the network; they will not need to worry about network costs, bandwidth, or reliability. However, consider making non-urgent network updates while the app is closed, using Android’s WorkManager. This will help refresh, process, and prepare the data before the user opens the app, and will avoid additional resource allocation when the user re-opens it.
Overall App Architecture
FireTV applications have similarities to and differences from “traditional” mobile application architectures. Networking and caching layer architectures can be lifted and shifted; these components can be used as-is and do not require any adjustments or modifications. The major adjustments and differences revolve around the user experience (UX) and user interface (UI). FireTV does not have touch-screen functionality and works exclusively with the remote. This requires engineers to follow guidelines for cursor movement and for the UI of selected states. The fastest way to include these UI elements in an app is the Leanback library, which has built-in navigation; however, it may be a bit limited in terms of customization.
User Interface
FireTVs have a so-called “10-foot” user interface, because the screen is roughly 10 feet from the user’s eyes, versus the 2-foot distance of a computer screen. This means some additional considerations must be taken into account in order to accommodate the distance and provide the right user experience. Developers should use appropriately large sizes for UI elements and fonts so they can easily be seen from the longer distance. Also, make sure that every remote input is easily reflected on the screen and visible from 10 feet away. However, do not use larger assets than needed, as this may negatively impact performance and slow down UI rendering.
|
https://medium.com/tribalscale/5-considerations-for-building-a-5-star-firetv-app-8d2456a81513
|
['Tribalscale Inc.']
|
2019-06-20 14:37:50.729000+00:00
|
['Mobile App Development', 'Technology', 'OTT', 'Fire Tv', 'Android']
|
5,563 |
Automating Resiliency: How To Remain Calm In The Midst Of Chaos
|
By Shan Anwar and Balaji Arunachalam
The Case for Change at Intuit
When any company decides to migrate to the public cloud in order to better scale its product offerings, there will be challenges, including those involving manual testing. For Intuit, the proud maker of TurboTax, QuickBooks, and Mint, this meant breaking down the monolith, going to hundreds of micro-services, and requiring everything to be automated and available via pipelines. A proof of concept to automate manual resiliency testing needed to be created in order to scale exponentially and support dozens of micro-services across multiple regions. During this proof of concept, several homegrown tools were created by the Intuit team to embody the resiliency culture and thinking amongst the developers, preceding the Software Development Life Cycle (SDLC) approach.
In this blog, we look at one such resiliency tool, called CloudRaider, which helped accelerate Intuit’s goal of becoming highly resilient and highly available during this journey to the cloud.
Resiliency Testing at Intuit
As Intuit moved from a single data center to dual data centers, HA DR (High Availability and Disaster Recovery) testing became incredibly important. The team started with a well-structured process, which involved a variety of engineers (developers, QE, App Ops, DBAs, network engineers, etc.) conducting long sessions, identifying various failures for the system, and documenting expected system behaviors, including alerts and monitoring. After appropriate prioritization (based on severity, occurrence frequency, and ease of detection), the team then executed these failures in a pre-production environment to prepare the system for better resiliency.
This approach generally helped to identify system resiliency defects, although it still had a lot of restrictions and gaps. It was a time-consuming, manual process that required multiple engineers’ time, could get very expensive, and had to be repeated as regression testing for future system changes. The FMEA (Failure Mode Effect Analysis) testing was conducted after the system implementation, so it worked counter to the shift-left model of uncovering system resiliency issues early in the SDLC process.
In moving to the cloud, the teams started adopting chaos testing in production; this, however, did not solve these gaps either, given that such testing occurred post-production and could not be run as continuous regression testing. It was discovered that chaos testing was a nice complement to FMEA testing, but not necessarily a replacement. Chaos testing, being an ad-hoc methodology, required a structured approach to testing, and meant preparing systems before invoking chaos in production.
The requirements are listed below:
Resiliency testing had to become part of the system design, not an afterthought.
Resiliency testing would shift left, to developers, enabling test-driven design and development for system resiliency.
Tests (including pre- and post-validation) would need to be fully automated and available in the release pipeline as regression tests.
Reverting failures would also need to be automated as part of testing.
The ability to write the test code in natural language was needed, so that the same tests could serve as the system resiliency design requirement document.
A 100% pass on the automated resiliency test suite would be a prerequisite for chaos testing in production.
This led to creating an in-house resiliency testing tool called “CloudRaider”.
How Intuit Innovated with CloudRaider: D4D (Design 4 Delight)
During Intuit’s migration to the public cloud, the challenges of manual FMEA testing continued and a proof of concept to automate FMEA tests was created by applying Intuit’s Design for Delight principles.
Intuit Design Principles
Principle #1: Deep Customer Empathy
Our systems needed to be resilient; in case of failures we could not impact customers.
Principle #2: Go Broad to Go Narrow
An ideal state was fully resilient systems with automated regression to validate.
Principle #3: Rapid Experiments with Customers
During our experimentation, we involved teams in using our automation. At first, we tried to automate a few specific scenarios to confirm the value of automation. We were unable to scale and had to go back and try out new ideas for making it easier to write and execute scenarios.
After these experiments, we solved the problem by applying a behavior-driven development process, which involved writing a scenario first. This process helped us identify common scenarios and led us to develop a domain-specific language (DSL). The DSL provided a way to dynamically construct new scenarios and utilize more general code definitions to invoke any failures.
The automation of failures reduced execution time significantly, but the question of effectiveness remained. This opened up ideas about automating the process of verifying the impact of the failures and measuring the effectiveness of system recovery (see the end-to-end design diagram).
End-to-End System
CloudRaider in Action
Example: Simple login service
Simple Example Scenario
Let’s look at an example of a very simple login micro-service that consists of a frontend (web server) and a backend service running in AWS. Even in this simple architecture, there are multiple possibilities for failure (see table):
FMEA Template
All of the above scenarios are very general and can be applied to any service or system design. In our example, we could have the same failures executed against either the frontend or the backend. We created these scenarios via CloudRaider (see sample code).
Cloud Raider DSL
In the scenario above, the implementation details were all abstracted away and the test was written as a natural-language construct. Furthermore, it was all data-driven: the same scenario could then be executed under different criteria, making it reusable.
A slightly modified scenario is highlighted below, where the login service is unavailable due to very high CPU consumption (see code).
This high-CPU-consumption scenario varies only slightly from the first one: the failure condition is different, and it was easy to construct.
In reality, the login service architecture would have many more complexities and critical dependencies. Let’s expand it to include authorization of OAuth2 tokens and a risk-screening service, both of which are external (see diagram).
This expanded design introduces resiliency implications such as slow response times or the unavailability of a critical dependency. In CloudRaider, we could include scenarios that mimic such behaviors by injecting network latency or blocking domains (see code).
We have discussed simple failure scenarios, but in reality, modern applications are more complex and run in multiple data centers and/or regions. Our previous example can be extended to a multi-region scenario (see diagram). Applications can be highly available because they run in multiple regions, and can still maintain an auto-recovery process if one of the regions goes down.
Multi-Region Failover Example
In CloudRaider, we could write code to terminate a region, as previously shown, but we could also assert our region-failover strategy with the help of the AWS Route 53 service (see code).
Implementation Details
CloudRaider is an open-source library written in Java that leverages Behavior-Driven Development (BDD) via the Cucumber/Gherkin framework. The library is integrated with AWS to inject failures.
Github link: https://github.com/intuit/CloudRaider/
Benefits of an Automated and Structured Resiliency Testing Process
What used to take more than a week of heavy coordination and manual test execution by many engineers became a three-hour automated execution with zero human effort. This process enabled us to test system resiliency on a regular basis and catch any regression issues. Having these automated tests in the release pipeline also gave us very high confidence in our product releases and caught resiliency issues before they turned into production incidents. It also gave us more confidence to execute ad-hoc chaos testing in production. The tool enabled developers to think about resiliency as part of design and implementation, and to own the testing of their systems’ resiliency.
Conclusion
Product adoption suffers if a product is not highly available for customers to use. With increasing complexity and dependencies in the micro-service architecture world, it would be impossible to avoid failures in a system’s ecosystem. We learned that our systems needed to be built to proactively recover from failures, with appropriate monitoring and alerts. Testing the systems’ resiliency in an automated, regular way was a must; the sooner a test happens in the SDLC, the less expensive it is to fix the problem. With well-structured, fully automated, and regularly executed resiliency tests, our team gained the confidence to introduce ad-hoc chaos into production.
Resources
Authors
|
https://medium.com/intuit-engineering/automating-resiliency-how-to-remain-calm-in-the-midst-of-chaos-d0d3929243ca
|
['Shan Anwar']
|
2019-12-09 18:17:10.386000+00:00
|
['AI', 'Open Source', 'Data', 'Data Science', 'Ai And Data Science']
|
5,564 |
A Guy With A Bed Frame Is 2018’s Good On Paper
|
You own hand towels, too? I may faint.
Photo by Mark Solarski on Unsplash
About a week ago, a guy on Tinder asked me if his vegetarianism was a deal breaker. I laughed to myself. Bless your heart, how quaint. I am a single woman living in Brooklyn, New York in the year of our Beyoncé 2018 and you think I’m going to be turned off by a guy I’ll never have to share the charcuterie with? I’m swatting away 34-year-olds with three roommates and meeting men whose closets and laundry bins are the same vessel. Eat your chickpeas, I really don’t give a shit.
Next came a guy at my coffee shop, completely average looking (which in New York, if you’re a woman, makes you a 2, if you’re a man, makes you hot), wearing well-fitting jeans, a crisp tee shirt, and what I’d consider pretty cool shoes. He was clean and put-together. I was confused. Then I saw his wedding ring and it all made sense. Silly Shani, single men don’t come well-packaged! They come wrapped in greasy paper and you have to bring your own bag.
The illumination of just how much I’ve compromised and been willing to accept or at least deal with in the last 5 or so years was furthered by a profile I came across a few days later. One that harkened me back to the days of having standards and expectations. Ah, memories.
My stars, what have we become? What does it say about single humanity when this profile right here reads like Shakespeare to me? Speak again, bright angel, tell me of the more than five shirts you own!
You mean you don’t sleep on a mattress on the floor of an unswept, linoleum-lined basement? I won’t have to wait in line to pee at 3am? I won’t have to dry my face before bed with paper towels? This is an embarrassment of riches!
Such is my woe, and perhaps the woe of any female, single, 30-something shit-together, that the men we date (in our age bracket) seem to exist two or three lifesteps behind us. And if we aren’t “okay” with this, our dating pool shrinks to the depth of a bottle cap. So this man, this just normal human, is a gilded gift to dating.
Good on paper used to mean that you had a great job, were motivated and driven, perhaps owned a home, vehicle, or pet. You were neatly dressed, groomed, polite. The kind of guy you wouldn’t mind running into your boss with on a Sunday afternoon. It still means all of these things for women, but for men it basically means that you brush your teeth.
Reader, I’m tired. The double standards that exist among the sexes never cease to replicate and evolve. They are the termites of my very existence. I am appalled not just by my own reaction to the truth of this man’s statements, but by the fact that he knew it would benefit him to say them.
“Hey ladies, I’m keenly aware that you’re one bartender/DJ away from starting that women-only tiny house community in rural Maine. It is I, the dating scene chupacabra, and I’ve come to supply you with entirely normal things. The line forms to the right, no shoving.”
I’ve never fancied the notion that if I want company, I’m going to have to clean it up first. While I don’t mind blowing the dust off a fixer-upper, so to speak, I do mind being financially and functionally responsible for a full renovation from floor to ceiling. I don’t require a general contractor and six to nine months for habitability, neither should you.
But they do. They all do. They (meaning single men populating the online dating apps of the greater New York area) all require me to live without something I used to consider table stakes. Something that is table stakes in my own life. Privacy, a reasonable linen supply, adequate cutlery. Once, just once in my life I’d like to see a man at Ikea or Target who isn’t there on a leash. One man who thinks to himself, “you know what, this place could use an end table.”
You don’t have to be a normal, average, basic insurance plan human being anymore if you want to meet someone. You don’t have to pack on any responsibility at all, from a solo lease to whether or not the cordless vacuum is charged, until you meet your female partner, because heaven knows everything will fall into place after that–she’ll handle it. Everything has fallen into place for me, and I’m starting to think I’m delusional for wanting to meet someone whose life is a little bit together, too.
Regardless, I will still be on my knees in the garden, day after year, weeding through bad idea after red flag after fuckboi, hoping to come across someone who is average, and therefore the cream of the crop.
Also friend, if you’re out there, holler.
|
https://shanisilver.medium.com/a-guy-with-a-bed-frame-is-2018s-good-on-paper-7d93d4270dd1
|
['Shani Silver']
|
2018-05-15 13:10:37.527000+00:00
|
['Humor', 'Culture', 'Dating', 'Singles', 'New York']
|
5,565 |
Q&A with Emily Ingram, Director of Product @ Chartbeat
|
Q&A with Emily Ingram, Director of Product @ Chartbeat
This week, The Idea spoke with Emily about recent Chartbeat initiatives on paywall optimization, Multi-Site View and image testing, tracking the supply and demand of climate change coverage, and why she thinks we should be paying more attention to mobile aggregators. Subscribe to our newsletter on the business of media for more interviews and weekly news and analysis.
What was your path to Chartbeat and what is your current role there?
I started my career as a journalist, working for The Washington Post for about six years. The first two of those were within the newsroom as an editor and a producer. Through that, I fell into product management because I knew how the CMS worked: when we were relaunching the mobile site, they needed someone to take CMS outputs and turn them into technical requirements for engineers. For about four additional years, I wrote their iOS apps and also launched their digital partner program.
I then went to HuffPost for about a year and a half, working on a tool for storytelling. Throughout all of that, I had a passion for both product management and telling stories, and Chartbeat was the right next step for being able to work with lots of publishers. Now, I’m Director of Product at Chartbeat. I work with our team of product managers to build tools for digital publishers, like tools for paywall optimization.
What are your goals for the paywall optimization project?
I think folks often associate Chartbeat with our real-time dashboard and our Big Board. Obviously, those are some of our most used tools and we continue to invest in them; but as media business models have evolved, the needs of digital publishers have too.
We think about optimization overall, like our headline-testing product, which is being used for tens of thousands of tests all across the world. One of the areas where we recognize a need for optimization is paywall strategy. We’re in the early stages of figuring out how we can help with choices about what kinds of stories make sense to market as subscriber-only, and we’re working closely with a small number of publishers to help make those decisions better. Like everything with Chartbeat, the goal is to inform editorial choices but also to work alongside editors to make their choices more effective.
What is a project you have worked on recently that had a lot of impact?
A feature we released last year, Multi-Site View, is doing well. We were looking to solve for organizations that have lots of sites and are often trying to coordinate coverage across maybe an entire region’s worth of daily newspaper sites. They need to be able to understand performance on multiple sites at once, instead of having lots of different tabs open and having to switch between them. Multi-Site View aims to consolidate things into a single dashboard with flexible roll-ups, so you can look at whichever combination of sites is necessary for your role and understand what’s doing well and where the opportunities to improve are.
Conversely, what was a feature that wasn’t adopted as widely as you thought that it would be?
Folks often associate Chartbeat with their web traffic, but we can also track native apps as well as AMP content. Occasionally, when you look at a publisher’s dashboards, you recognize there’s a missed opportunity for them to see a full 360-degree view of their content. That’s something where we’re still working to improve on making sure folks are taking advantage of those capabilities.
What are the limits to using data to inform journalistic choices?
That’s something we’ve been thoughtful about from our earliest days. Chartbeat invented the concept of engaged time as a way to get the focus off clicks and to be more about genuine reader behavior.
It’s something that we try to do within the product — build functionality that guides people to use the tools in healthy ways that actually reach their end goals, rather than straying down the wrong path. For instance, one of the key metrics for our headline-testing tool is quality clicks. So, it’s not just a raw count where whichever headline gets the most clicks wins; we’re also considering whether people who clicked on that story actually stick around and read it.
We’ve always been a company that views ourselves as one of many tools for editors. We want to make sure that we have unique insights to offer them, but there are also some things that you can’t do with data and to which humans bring their unique insight to bear.
What is something surprising you’ve learned working with a diverse array of publishers?
We aim to make our tools flexible enough that they can suit different approaches. Right now, we’re working on an alpha for image testing. We already allow folks to test their headlines on a home page to understand which ones are performing best at leading folks to engage. The natural extension of creating an inviting experience on the homepage is the images associated with those headlines. The range of publishers involved in the test is quite large: you have everyone from traditional publishers to non-traditional sites that happen to use Chartbeat, and they actually have similar needs.
Sometimes we’re also building specific things for a segment of clients. For instance, our Multi-Site View was something that’s particularly relevant given the consolidation of media and the fact that folks are often working in centralized teams. Something I’ve learned from my time here is that there are actually more commonalities than you might expect.
Are there trends in the media space that you wish more people were paying more attention to?
Something that’s always changing but that I think is critical to keep tabs on is constantly monitoring who you’re serving and what they’re coming to you for. For instance, we did a deep dive into climate coverage data, and one of the interesting things we found is that it’s up significantly from the supply side, but even more from the demand side. Those sorts of changes in terms of the topics that folks are most interested in are fascinating.
The other thing is mobile aggregators. Occasionally, we will see top referrers like Top Buzz and Smart News, which are companies that aren’t top of mind. Traffic from Google Chrome suggestions is also something that’s really surged in the past couple of years.
Finally, something that I saw when I was a product manager at a publisher, and have continued to see here, is how developments in the phone ecosystem really change user behavior and thus publisher experiences. The first memory I have of this is when iOS rolled out the swipe-left-of-home-screen view, which included a module that had news articles in it. After that, we noticed a sudden, unexplained uptick in traffic; it was all dark social traffic, but it was often concentrated on certain articles. That’s the kind of thing you don’t know is coming. We’ve seen that with Google Chrome suggestions as well, where a feature built into Chrome can have a meaningful impact on what publishers are seeing on their end.
There can often be benefits to these features: hopefully you can use some of these mobile aggregators as new entry points to expose people to content they wouldn’t see otherwise. But if they change, you don’t really have control over that. So, at the same time as you try to keep tabs on that, you also have to focus on building your loyal audience and diversifying your risk.
What is the most interesting thing that you’ve seen recently in media from an organization other than your own?
I think something that’s really exciting to me is the burgeoning news outlets that are starting up to serve particular niche audiences or niche purposes. For instance, I know some of the folks from Texas Tribune are starting up a news organization, The 19th, aimed at women. There’s Dejan Kovacevic in Pittsburgh, who used to be a columnist for their local paper and started a sports site [DK Pittsburgh Sports] that covers just Pittsburgh sports.
You also increasingly see niche e-mail newsletters. I’m a big theater fan, so I follow various theater-related newsletters and podcasts like Broadway Briefing, which is a subscription-based daily roundup of Broadway news. There’s a challenge anytime you’re starting something from scratch in terms of its longevity, but it’s certainly interesting to see a lot of energy around these very particular niches and what people are doing in terms of innovating on the business model for them.
Rapid Fire Questions
What’s the last podcast you listened to?
A Pop Culture Happy Hour episode from NPR. I have a 40-minute commute and some of those are perfectly timed for that.
What’s the last theater production you went to?
Hamlet at St Ann’s Warehouse.
What would you be doing if you weren’t in this role, whether within media or outside of it?
I would probably want to work somehow in the arts.
This Q&A was originally published in the February 24th edition of The Idea, and has been edited for length and clarity. For more Q&As with media movers and shakers, subscribe to The Idea, Atlantic Media’s weekly newsletter covering the latest trends and innovations in media.
|
https://medium.com/the-idea/q-a-with-emily-ingram-director-of-product-chartbeat-8b18352c4425
|
['Saanya Jain']
|
2020-02-24 23:39:07.224000+00:00
|
['Product Management', 'Subscriber Spotlight', 'Journalism', 'Media']
|
5,566 |
Illustrated Guide to LSTM’s and GRU’s: A step by step explanation
|
Hi and welcome to an Illustrated Guide to Long Short-Term Memory (LSTM) and Gated Recurrent Units (GRU). I’m Michael, and I’m a Machine Learning Engineer in the AI voice assistant space.
In this post, we’ll start with the intuition behind LSTM’s and GRU’s. Then I’ll explain the internal mechanisms that allow LSTM’s and GRU’s to perform so well. If you want to understand what’s happening under the hood for these two networks, then this post is for you.
You can also watch the video version of this post on youtube if you prefer.
The Problem, Short-term Memory
Recurrent Neural Networks suffer from short-term memory. If a sequence is long enough, they’ll have a hard time carrying information from earlier time steps to later ones. So if you are trying to process a paragraph of text to do predictions, RNN’s may leave out important information from the beginning.
During backpropagation, recurrent neural networks suffer from the vanishing gradient problem. Gradients are the values used to update a neural network’s weights. The vanishing gradient problem occurs when the gradient shrinks as it back propagates through time. If a gradient value becomes extremely small, it doesn’t contribute much to learning.
Gradient Update Rule
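To make that concrete, here is a tiny sketch (my own illustration, not from the original post) of a gradient being scaled by a factor smaller than 1 at every step of backpropagation through time; the recurrent factor of 0.4 is a made-up value:

```python
# Toy illustration of the vanishing gradient problem:
# backpropagating through time multiplies the gradient by a
# recurrent factor at every step, so a factor < 1 shrinks it
# exponentially.
gradient = 1.0
recurrent_factor = 0.4  # hypothetical stand-in for the recurrent weight term

for step in range(1, 11):
    gradient *= recurrent_factor
    print(f"step {step}: gradient = {gradient:.10f}")

# By step 10 the gradient is ~0.0001, so the earliest time steps
# receive almost no update and effectively stop learning.
```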
So in recurrent neural networks, layers that get a small gradient update stop learning. Those are usually the earlier layers. Because these layers don’t learn, RNN’s can forget what they have seen earlier in longer sequences, thus having a short-term memory. If you want to know more about the mechanics of recurrent neural networks in general, you can read my previous post here.
LSTM’s and GRU’s as a solution
LSTM’s and GRU’s were created as the solution to short-term memory. They have internal mechanisms called gates that can regulate the flow of information.
These gates can learn which data in a sequence is important to keep or throw away. By doing that, they can pass relevant information down the long chain of sequences to make predictions. Almost all state-of-the-art results based on recurrent neural networks are achieved with these two networks. LSTM’s and GRU’s can be found in speech recognition, speech synthesis, and text generation. You can even use them to generate captions for videos.
Ok, so by the end of this post you should have a solid understanding of why LSTM’s and GRU’s are good at processing long sequences. I am going to approach this with intuitive explanations and illustrations and avoid as much math as possible.
Intuition
Ok, let’s start with a thought experiment. Let’s say you’re looking at reviews online to determine if you want to buy Life cereal (don’t ask me why). You’ll first read the review and then determine if someone thought it was good or bad.
When you read the review, your brain subconsciously only remembers important keywords. You pick up words like “amazing” and “perfectly balanced breakfast”. You don’t care much for words like “this”, “gave”, “all”, “should”, etc. If a friend asks you the next day what the review said, you probably wouldn’t remember it word for word. You might remember the main points though, like “will definitely be buying again”. If you’re a lot like me, the other words will fade away from memory.
And that is essentially what an LSTM or GRU does. It can learn to keep only relevant information to make predictions and forget non-relevant data. In this case, the words you remembered made you judge that it was good.
Review of Recurrent Neural Networks
To understand how LSTM’s or GRU’s achieve this, let’s review the recurrent neural network. An RNN works like this: first, words get transformed into machine-readable vectors. Then the RNN processes the sequence of vectors one by one.
Processing sequence one by one
While processing, it passes the previous hidden state to the next step of the sequence. The hidden state acts as the neural network’s memory. It holds information on previous data the network has seen before.
Passing hidden state to next time step
Let’s look at a cell of the RNN to see how you would calculate the hidden state. First, the input and previous hidden state are combined to form a vector. That vector now has information on the current input and previous inputs. The vector goes through the tanh activation, and the output is the new hidden state, or the memory of the network.
RNN Cell
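Here is a minimal NumPy sketch of that cell (my own illustration; the sizes and weight names are made up for the example):

```python
import numpy as np

def rnn_cell(x, h_prev, W, b):
    """One RNN step: combine the input with the previous hidden
    state, then squash through tanh to get the new hidden state."""
    combined = np.concatenate([h_prev, x])  # previous memory + current input
    return np.tanh(W @ combined + b)        # new hidden state (the memory)

# Hypothetical sizes, just for illustration.
hidden_size, input_size = 4, 3
rng = np.random.default_rng(0)
W = rng.normal(size=(hidden_size, hidden_size + input_size))
b = np.zeros(hidden_size)

h = np.zeros(hidden_size)                   # start with an empty memory
for x in rng.normal(size=(5, input_size)):  # a sequence of 5 input vectors
    h = rnn_cell(x, h, W, b)
print(h)                                    # final hidden state
```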
Tanh activation
The tanh activation is used to help regulate the values flowing through the network. The tanh function squishes values to always be between -1 and 1.
Tanh squishes values to be between -1 and 1
When vectors flow through a neural network, they undergo many transformations due to various math operations. So imagine a value that is repeatedly multiplied by, let’s say, 3. You can see how some values can explode and become astronomical, causing other values to seem insignificant.
vector transformations without tanh
A tanh function ensures that the values stay between -1 and 1, thus regulating the output of the neural network. You can see how the same values from above remain between the boundaries allowed by the tanh function.
vector transformations with tanh
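You can reproduce that comparison in a couple of lines (a sketch of the idea, using a multiplier of 3 as in the example above):

```python
import numpy as np

raw, squashed = 1.0, 1.0
for step in range(1, 6):
    raw = raw * 3                     # no tanh: 3, 9, 27, 81, 243 (explodes)
    squashed = np.tanh(squashed * 3)  # tanh keeps the value between -1 and 1
    print(f"step {step}: without tanh = {raw:6.0f}, with tanh = {squashed:.4f}")
```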
So that’s an RNN. It has very few operations internally but works pretty well given the right circumstances (like short sequences). RNN’s use a lot fewer computational resources than their evolved variants, LSTM’s and GRU’s.
LSTM
An LSTM has a similar control flow to a recurrent neural network. It processes data, passing on information as it propagates forward. The differences are the operations within the LSTM’s cells.
LSTM Cell and Its Operations
These operations are used to allow the LSTM to keep or forget information. Now looking at these operations can get a little overwhelming so we’ll go over this step by step.
Core Concept
The core concept of LSTM’s is the cell state and its various gates. The cell state acts as a transport highway that transfers relevant information all the way down the sequence chain. You can think of it as the “memory” of the network. The cell state, in theory, can carry relevant information throughout the processing of the sequence. So even information from the earlier time steps can make its way to later time steps, reducing the effects of short-term memory. As the cell state goes on its journey, information gets added to or removed from the cell state via gates. The gates are different neural networks that decide which information is allowed on the cell state. The gates can learn what information is relevant to keep or forget during training.
Sigmoid
Gates contain sigmoid activations. A sigmoid activation is similar to the tanh activation. Instead of squishing values between -1 and 1, it squishes values between 0 and 1. That is helpful for updating or forgetting data because any number multiplied by 0 is 0, causing values to disappear or be “forgotten.” Any number multiplied by 1 keeps the same value, so that value stays the same or is “kept.” The network can learn which data is not important and can therefore be forgotten, and which data is important to keep.
Sigmoid squishes values to be between 0 and 1
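A quick sketch of that gating effect (the values here are illustrative only):

```python
import numpy as np

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

values = np.array([0.9, -0.5, 0.3])          # information flowing through
gate = sigmoid(np.array([-6.0, 6.0, 0.0]))   # ~0 forgets, ~1 keeps, 0.5 halves
print(gate)           # -> [~0.0025, ~0.9975, 0.5]
print(values * gate)  # first value "forgotten", second "kept", third halved
```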
Let’s dig a little deeper into what the various gates are doing, shall we? We have three different gates that regulate information flow in an LSTM cell: a forget gate, an input gate, and an output gate.
Forget gate
First, we have the forget gate. This gate decides what information should be thrown away or kept. Information from the previous hidden state and information from the current input is passed through the sigmoid function. Values come out between 0 and 1. The closer to 0 means to forget, and the closer to 1 means to keep.
Forget gate operations
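For reference, in the usual textbook notation (not reproduced from the article’s figure), the forget gate is:

$$f_t = \sigma(W_f \cdot [h_{t-1}, x_t] + b_f)$$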
Input Gate
To update the cell state, we have the input gate. First, we pass the previous hidden state and current input into a sigmoid function. That decides which values will be updated by transforming the values to be between 0 and 1. 0 means not important, and 1 means important. You also pass the hidden state and current input into the tanh function to squish values between -1 and 1 to help regulate the network. Then you multiply the tanh output with the sigmoid output. The sigmoid output will decide which information is important to keep from the tanh output.
Input gate operations
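In the same standard notation, the input gate and the candidate values are:

$$i_t = \sigma(W_i \cdot [h_{t-1}, x_t] + b_i), \qquad \tilde{C}_t = \tanh(W_C \cdot [h_{t-1}, x_t] + b_C)$$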
Cell State
Now we should have enough information to calculate the cell state. First, the cell state gets pointwise multiplied by the forget vector. This has a possibility of dropping values in the cell state if it gets multiplied by values near 0. Then we take the output from the input gate and do a pointwise addition which updates the cell state to new values that the neural network finds relevant. That gives us our new cell state.
Calculating cell state
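As an equation, with $\odot$ denoting pointwise multiplication:

$$C_t = f_t \odot C_{t-1} + i_t \odot \tilde{C}_t$$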
Output Gate
Last, we have the output gate. The output gate decides what the next hidden state should be. Remember that the hidden state contains information on previous inputs. The hidden state is also used for predictions. First, we pass the previous hidden state and the current input into a sigmoid function. Then we pass the newly modified cell state to the tanh function. We multiply the tanh output with the sigmoid output to decide what information the hidden state should carry. The output is the hidden state. The new cell state and the new hidden state are then carried over to the next time step.
output gate operations
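And in the same notation, the output gate and the new hidden state are:

$$o_t = \sigma(W_o \cdot [h_{t-1}, x_t] + b_o), \qquad h_t = o_t \odot \tanh(C_t)$$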
To review, the Forget gate decides what is relevant to keep from prior steps. The input gate decides what information is relevant to add from the current step. The output gate determines what the next hidden state should be.
Code Demo
For those of you who understand better through seeing the code, here is an example using python pseudo code.
python pseudo code
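The code image from the original post doesn’t survive in this text version, so below is a minimal NumPy reconstruction of the same seven steps (weight names like W_f are my own placeholders, and biases are omitted for brevity):

```python
import numpy as np

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

def lstm_cell(x, h_prev, c_prev, W_f, W_c, W_i, W_o):
    combined = np.concatenate([h_prev, x])  # 1. concatenate hidden state and input
    f = sigmoid(W_f @ combined)             # 2. forget layer: what to drop
    candidate = np.tanh(W_c @ combined)     # 3. candidate values to add
    i = sigmoid(W_i @ combined)             # 4. input layer: which candidates to add
    c_new = f * c_prev + i * candidate      # 5. new cell state
    o = sigmoid(W_o @ combined)             # 6. output layer
    h_new = o * np.tanh(c_new)              # 7. new hidden state
    return h_new, c_new

# Tiny usage example with hypothetical sizes.
hidden, inp = 4, 3
rng = np.random.default_rng(1)
W_f, W_c, W_i, W_o = (rng.normal(size=(hidden, hidden + inp)) for _ in range(4))
h = c = np.zeros(hidden)
for x in rng.normal(size=(6, inp)):         # the for loop over the sequence
    h, c = lstm_cell(x, h, c, W_f, W_c, W_i, W_o)
```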
1. First, the previous hidden state and the current input get concatenated. We’ll call it combine.
2. Combine gets fed into the forget layer. This layer removes non-relevant data.
3. A candidate layer is created using combine. The candidate holds possible values to add to the cell state.
4. Combine also gets fed into the input layer. This layer decides what data from the candidate should be added to the new cell state.
5. After computing the forget layer, candidate layer, and the input layer, the cell state is calculated using those vectors and the previous cell state.
6. The output is then computed.
7. Pointwise multiplying the output and the new cell state gives us the new hidden state.
That’s it! The control flow of an LSTM network is just a few tensor operations and a for loop. You can use the hidden states for predictions. Combining all those mechanisms, an LSTM can choose which information is relevant to remember or forget during sequence processing.
GRU
So now that we know how an LSTM works, let’s briefly look at the GRU. The GRU is the newer generation of recurrent neural networks and is pretty similar to an LSTM. GRU’s got rid of the cell state and use the hidden state to transfer information. A GRU also has only two gates: a reset gate and an update gate.
GRU cell and its gates
Update Gate
The update gate acts similar to the forget and input gate of an LSTM. It decides what information to throw away and what new information to add.
Reset Gate
The reset gate is another gate that is used to decide how much past information to forget.
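A matching NumPy sketch of a GRU cell (one common formulation; the weight names are placeholders and biases are omitted):

```python
import numpy as np

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

def gru_cell(x, h_prev, W_z, W_r, W_h):
    combined = np.concatenate([h_prev, x])
    z = sigmoid(W_z @ combined)   # update gate: how much to replace vs. keep
    r = sigmoid(W_r @ combined)   # reset gate: how much past info to forget
    candidate = np.tanh(W_h @ np.concatenate([r * h_prev, x]))
    return (1 - z) * h_prev + z * candidate  # blend old memory with new
```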
And that’s a GRU. GRU’s have fewer tensor operations; therefore, they are a little speedier to train than LSTM’s. There isn’t a clear winner as to which one is better. Researchers and engineers usually try both to determine which one works better for their use case.
So That’s it
To sum this up, RNN’s are good for processing sequence data for predictions but suffer from short-term memory. LSTM’s and GRU’s were created as a method to mitigate short-term memory using mechanisms called gates. Gates are just neural networks that regulate the flow of information through the sequence chain. LSTM’s and GRU’s are used in state-of-the-art deep learning applications like speech recognition, speech synthesis, natural language understanding, etc.
If you’re interested in going deeper, here are links to some fantastic resources that can give you a different perspective on understanding LSTM’s and GRU’s. This post was heavily inspired by them.
http://www.wildml.com/2015/10/recurrent-neural-network-tutorial-part-4-implementing-a-grulstm-rnn-with-python-and-theano
http://colah.github.io/posts/2015-08-Understanding-LSTMs/
https://www.youtube.com/watch?v=WCUNPb-5EYI
I had a lot of fun making this post so let me know in the comments if this was helpful or what you would like to see in the next one. And as always, thanks for reading!
Check out michaelphi.com for more content like this.
|
https://towardsdatascience.com/illustrated-guide-to-lstms-and-gru-s-a-step-by-step-explanation-44e9eb85bf21
|
['Michael Phi']
|
2020-06-28 17:27:57.821000+00:00
|
['Machine Learning', 'Artificial Intelligence', 'Lstm', 'Neural Networks', 'Deep Learning']
|
5,567 |
Design & the military: a love story
|
Collage by Vittoria Casanova.
By Vittoria Casanova
We usually don’t ask ourselves many questions about the objects surrounding our lives. Aside from their simple function and aesthetics, we don’t think about an object’s history or why the products and services we use every day have been designed in the way that we know them. When you think about design, you wouldn’t initially associate it with war. But, looking back at the history of design and invention, it seems that war has been the most important catalyst for the research, discovery, and implementation of many new solutions and technologies.
The reason might be found in the large amount of funding that governments allocate to military and defense departments. Just to give you an idea, DARPA (the Defense Advanced Research Projects Agency), which is responsible for the development of emerging technologies for military use, has an average annual budget of three billion USD. Yes, three billion per year!
Here are a few intriguing stories about common products and services that have been catalysed by war.
The grandmother of the Internet was called ARPA, short for Advanced Research Projects Agency. Its initial purpose was to enable researchers to communicate and share knowledge and resources between university computers over telephone lines.
ARPA was born during the Cold War when the US was worried about the Soviet Union destroying their long-distance communications network. The US urgently needed a computer communications system without a central core that could be used wirelessly and remotely. Which would, therefore, be much more difficult for enemies to attack and destroy.
ARPA then started to design a computer network called ARPANET, which would be accessible anywhere in the world using computing power and data. “Internetworking”, as scientists called it, presented enormous challenges as getting networks to ‘talk to each other’ and move data was like speaking Chinese to someone who can only understand Turkish. The Internet’s designers needed to develop a common digital language to enable data sharing but, it had to be a language flexible enough to accommodate all kinds of data, even for the types that hadn’t been invented yet.
The Internet seemed like an extremely far-fetched idea, near impossible to design. But, in the spring of 1976, they found a way. The Internet went from being an obscure research idea to a technology that’s now used by over 4.2 billion people. And, it took less than forty years.
The Global Positioning System, commonly known as the GPS, also has its origins in the Sputnik era.
The idea for the GPS emerged in 1957, when American scientists were tracking the launch of the first satellite to orbit Earth, a Russian spacecraft called Sputnik. They noticed the frequency of the radio signal from Sputnik got gradually higher as the satellite got closer, and lower as it moved away. This was caused by the Doppler Effect, the same effect that makes an ambulance siren’s pitch rise or fall as the vehicle moves towards or away from an observer. This provided great inspiration: satellites could be tracked from the ground by measuring the frequency of the radio signals they emitted, and, in turn, the locations of receivers on the ground could be tracked by their distance from the satellites.
Drones, also known as unmanned aerial vehicles, are another great example. These are aircraft with no onboard crew or passengers, which can be either automated or remotely piloted. The initial idea first came to light in 1849, when Austria attacked Venice with balloons that were loaded with explosives. While a few balloons reached their intended targets, most were caught in changing winds and were blown back over Austrian lines. From there, it was clear that better aerial technology, which could be controlled remotely, was desperately needed.
Last, but not least, a simple item that we use very often: tape. Duct tape was originally invented by Johnson & Johnson’s pharmaceutical division during WWII for the military. The soldiers specifically needed a waterproof tape that could be used to keep moisture and humidity out of ammunition cases. This is why the original duct tape only came in army green.
Many more examples can be found in various other mundane products: microwaves, digital cameras, superglue, canned food, and penicillin, just to name a few.
It’s also interesting to see that these military-born technologies can even be found in three of our INDEX: Award 2017 winners: Ethereum — a decentralised digital network, commonly referred to as Internet 2.0; what3words — a new GPS system using three-word addresses; and Zipline — a medical supply delivery chain using drones. But let’s hope that in the future we won’t need to rely on war for more great solutions to emerge.
|
https://designtoimprovelife.medium.com/design-the-military-a-love-story-99dd58b8b40f
|
['The Index Project']
|
2018-11-28 08:56:35.429000+00:00
|
['War', 'Technology', 'Design']
|
5,568 |
Financial Analysts think Traderium would become the bank of the future.
|
“On the back of solid advancements in Artificial Intelligence prediction and powered by blockchain technology, Traderium is transforming into the bank of the future, the bank that you own.” — CNN
Early this week, financial analysts released a statement asserting that Traderium and blockchain technology will eventually replace banks and existing financial systems by eliminating the necessity for intermediaries and third-party service providers.
In a research paper shared by the South African Bitcoin Group (SABG), Milwalke wrote:
“For now, virtual currencies such as Bitcoin pose little or no challenge to the existing order of fiat currencies and central banks. Why? Because they are too volatile, too risky, too energy intensive, and because the underlying technologies are not yet scalable. Many are too opaque for regulators; and some have been hacked. But many of these are technological challenges that could be addressed over time. Not so long ago, some experts argued that personal computers would never be adopted, and that tablets would only be used as expensive coffee trays. So I think it may not be wise to dismiss virtual currencies.”
As Milwalke emphasized, the vast majority of cryptocurrencies such as bitcoin and Ethereum are still struggling to solve their underlying scalability issues. Previously, in an interview with major South Korean financial news publication JoongAng, Ethereum co-founder Vitalik Buterin stated that it could take two to five years for public blockchain networks to scale with two-layer and on-chain scalability solutions.
But, once companies like Traderium start making offerings that would form a bridge between traditional banking systems and blockchains, cryptocurrencies would achieve mass adoption and quickly replace traditional and fiat financial systems.
Traderium’s novel business model involves receiving cryptocurrency deposits just like traditional banks receive fiat deposits, use their A-rated Artificial Intelligence system in trading these deposits, as well as making other investments in bonds etc, much like traditional banks do, and sharing the profits with the customers.
This gives Traderium an edge over traditional banks because blockchain guarantees cheaper transaction costs, which improves their margin. Interest rates on deposits can go as high as 40% per month depending on the account type, and smart contracts make them more trustworthy than traditional banks.
Milwalke explained the decentralized nature of bitcoin could provide general consumers with a more efficient, robust, secure, and cost-efficient financial network as an alternative to the global banking infrastructure.
Furthermore, Milwalke noted that the mainstream adoption of bitcoin and cryptocurrencies would result in the decrease of power of central banks and leading financial institutions. Fiat currencies would no longer be of any value as central banks and local financial authorities would not be able to manipulate the value of assets.
“Today’s central banks typically affect asset prices through primary dealers, or big banks, to which they provide liquidity at fixed prices — so-called open-market operations. But if these banks were to become less relevant in the new financial world, and demand for central bank balances were to diminish, could monetary policy transmission remain as effective?,” added Milwalke.
Already, through the Bitcoin Core development team’s scaling and transaction malleability solution Segregated Witness (SegWit), bitcoin has been able to scale to a certain extent by decreasing the size of transactions and blocks. Additionally, demand for bitcoin and the cryptocurrency market has increased to a point wherein multi-billion dollar financial institutions such as Goldman Sachs and Fidelity have started to address the rising popularity of cryptocurrencies by launching ventures including cryptocurrency trading operations and mining.
In the upcoming years, through appropriate scaling solutions and the integration of two-layer scaling platforms, bitcoin will be able to position itself at the forefront of financial disruption, challenging banks and financial institutions to evolve within the global financial system.
|
https://medium.com/bitcoinsouthafrica/financial-analysts-think-traderium-would-become-the-bank-of-the-future-7180919cec30
|
['Sandra Mathews']
|
2018-08-19 16:23:42.354000+00:00
|
['Pakistan', 'South Africa', 'Kuwait', 'Bitcoin']
|
5,569 |
7 Awesome Android Apps You’ve Never Heard Of
|
Ready, set, download!
It’s no secret that apps are changing the way we live. I have a friend that has over 100 on her phone. She swears she uses them all, but I doubt it. That’s the problem with apps. There are so many choices, it’s hard to find something new that you’ll want to use and keep.
Beyond Facebook
Everyone uses Facebook, Amazon, Piggy, and Pocket. You’ve never heard of Piggy or Pocket, have you? That’s my point. There are thousands of apps that innovative developers have built that make your Android device more useful and fun.
To save you time, I created a list of clever new apps. It’s not scientific, it’s based on new apps my friends have tried, personal use, user reviews and pure awesomeness.
These 7 Android apps will help you get more out of your phone or tablet and do things you didn’t even know were possible. Read on and become an Android expert, and feel free to add your own suggestions below.
1) Become a coupon pro with Piggy
Get coupons automatically with Piggy
Shop your favorite stores in your phone or tablet’s browser, and Piggy will automatically search for coupon codes and cashback whenever you’re checking out. Just click the Piggy button and it will scour the internet for legitimate coupon codes and apply them to your shopping cart. No matter what, you’re always earning cashback. It’s free, it’s easy and it saves you money!
Download Piggy here
2) Always have something to read with Pocket
Pocket is an easy way to save any article and read it later, and it works on both Chrome and Android. If you come across an article or website and don’t have time to read it right then, save it to Pocket and you can pull it from the queue and read it at any time from any device.
Download Pocket here
3) Never get left out in the rain with 1Weather
1Weather is arguably the best weather app out there. It has a very simple, paginated design that shows you the current weather and forecasts up to 12 weeks out. 1Weather offers two full versions: one that is free and ad-supported, or you can purchase it for $1.99 with no advertising. A fun fact about 1Weather: it offers fun facts about weather that are sure to keep you entertained indefinitely.
Download 1Weather here
4) Google Drive Suite — the complete storage solution
Google Drive is an essential cloud storage solution available on Android. All new users get 15GB of storage, 100% free, indefinitely. The best part is that, through GSuite, you can also acquire entirely free what Microsoft charges a premium for. This includes Google Docs, Sheets, Slides, Photos, Gmail, Calendar, and Keep. Between the office and photo apps, which by the way allow unlimited amounts of photo and video backup, you have an app to serve practically any purpose.
Download Drive Suite here
5) Get a personal assistant with Google Now
And an intelligent one, at that. Just say the magic words “Okay Google” to get answers to your questions, make recommendations, and do just about anything and everything by making requests to various web services. Sync it to your Google Account to be able to pull up your schedule and notes in an instant, among many other actions; it also largely works hand in hand with Google Search so the repeated actions you perform are utilized to your advantage.
Download Google Now here
6) Don’t lose track of passwords with LastPass
Even if you have photographic memory or a systematic way of safekeeping your passwords, LastPass will change your life. It’s an awesome digital vault that takes its job of safeguarding all your online accounts seriously. Create a free account and secure it with a strong master password — your last password ever! Fill your vault with all your fave sites, save new sites automatically, and never be bothered with taking note of new passwords ever again.
Download Last Pass here
7) The best app for getting things done: Wunderlist
This app surely lives up to the promise of its name, with its very user-friendly interface that packs in heroic features — from the digital notepad, alarms, and reminders, to the folders section and messaging function. You’ll be so excited to get your schedule, plans, goals, and lists in order because Wunderlist is so handy, you can access it anytime, anywhere on your mobile device or computer, and allows you to share your lists with anyone and work collaboratively with them.
Download Wunderlist here
We bet you won’t be able to put your Android device down after getting a hold of these apps… and we really can’t blame you. Enjoy!
|
https://medium.com/easysimplemore/7-awesome-android-apps-youve-never-heard-of-cb7a0d87fd8c
|
['Katrina Angco']
|
2017-08-30 19:57:46.124000+00:00
|
['Android Apps', 'Lifestyle', 'Mobile Device', 'Productivity', 'Digital Marketing']
|
5,570 |
5 Potential Activities for Long-Distance Relationships
|
I say these encouraging things as someone who is highly empathetic and has worked closely with individuals and couples to make sense of their overarching problems, particularly as a crisis supporter and an active listener.
If you can get through this drama, then you have the capacity to keep going. If anything, wear this circumstance as a badge of honour. Your relationship hit a roadblock, and you came out swinging, using your creativity to keep yourself and your partner satisfied enough to keep going.
Either way, here are some suggestions on how to celebrate 2021 with your long-distance partner.
1. Cook a Meal “Together”
While it’s not the same as being together physically, you can set up your monitor or device in the kitchen, or bring parts of the kitchen to your device. If you haven’t already, install one of those remote conferencing or video chatting apps, like Zoom, Skype, Messenger, or even FaceTime.
You can lay out the basic ingredients for something that you want to make. Perhaps keep the recipe or meal simple to avoid any kitchen-related mishaps. In real time, you can critique one another's cooking and crack jokes.
Photo by Dan Counsell on Unsplash — Keep it simple to avoid mishaps.
Maybe you can make a contest out of it, and see who has the better-looking meal. Maybe you can pretend that you’re on MasterChef or Hell’s Kitchen, running around while pretending that Gordon Ramsay is screaming at you to get it together.
Maybe one of you can play music associated with the show.
2. Agree to Watch Something Online
Sure, you’re in two different places, but you can set up the time to watch a movie “together”. Even far apart, sentimentality and love are still there. Maybe you can sport coordinating outfits and have similar snacks.
Plus, if you're really hard-pressed to make such elaborate arrangements, maybe you can sit down and watch a live stream together, such as a remote concert or comedy show on websites like YouTube and Twitch.
Photo by NordWood Themes on Unsplash — You can still watch things together, even remotely.
Even if you settle on an old video or an agreed-upon movie classic, you can film one another's reactions during certain parts of the show, video, or movie, especially if something super funny or dramatic happened.
Maybe you can make a meme reaction out of it, and have a silly inside joke between the two of you. Maybe one of you will go viral because of it. Either way, you now have some moments, like your partner exclaiming surprise during your favourite scene from a movie.
3. Have a Spontaneous Fashion Show
It sounds silly, but with the power of technology, you can try something fun and spontaneous, like trying out the various outfits in your room. Maybe you can help one another find the perfect outfit.
This outfit could be for a hypothetical future date, or something you plan to wear for tomorrow morning. You get to have fun with what you already have, and you didn’t even have to spend major bucks for that to happen.
Plus, you can boost your confidence by doing a silly little dance, a semi-serious catwalk strut, or even tease your partner with something that they personally enjoy.
Photo by BBH Singapore on Unsplash — Maybe they’re having fun with their wardrobes.
You might get bonus points if you are able to play funny music during the whole situation. Maybe when your partner does their fake catwalk, you can play a video with some appropriate but funky music.
Make those little moments count, even if they’re fleeting, because it’s easy to take them for granted.
4. Roleplay a Mock Vacation
While we can't travel right now, don't let that stop you from pretending to take a vacation.
Maybe one partner can pretend to be the budding tourist guide, and you can pretend to be the naive tourist. Maybe both of you can pretend to be two strangers meeting at the bar for the first time.
Think of it as a role-play or as a chance to show off your acting chops. Maybe in this adventure, one of you is a secret agent, and the other is also a secret agent, and you’re trying to outwit one another to get to the bottom of your mystery.
Photo by Free To Use Sounds on Unsplash — I honestly don’t have the context for this.
If you’re on a video conferencing app, you can change the background behind you to resemble a remote tropical paradise. You can even dress up as you please, perhaps opting to wear matching Hawaiian shirts.
Even if it's not the same as the real thing, the ambience and mood can still be closely emulated. Maybe you can play certain songs or instruments to heighten the mood if you want.
5. Do Remote Acts of Kindness
Even if we're not physically there, we can still help out our partners in other ways. If one partner is financially struggling where they are, maybe you can send some funds to them online. If you're the one struggling, they can reciprocate.
Perhaps you can call their roommate (if they have one) and ask them to decorate your partner's place for their birthday. Sure, this requires a fair bit of extra work due to the remote planning, but relationships are worth the effort you put into them.
Photo by Chase Chappell on Unsplash — It’s okay to make virtual plans, I promise.
Plus, even if they don’t have a roommate, maybe you can remotely order their favourite food to be delivered to their house and request a fancy little love note to be left behind.
You can pay for the order, alleviating a little bit of the sadness that your partner may feel.
|
https://synthiasatkuna.medium.com/5-potential-activities-for-long-distance-relationships-dfc814de878c
|
['Synthia Satkuna', 'Ma Candidate']
|
2020-12-29 11:58:13.846000+00:00
|
['Self Improvement', 'Relationships', 'Long Distance', 'Mental Health', 'Dating']
|
5,571 |
A Ping That Saved Me From Madness
|
Follow my journey into troubleshooting internet disruption issues after switching service providers.
Photo by Joshua Sukoff on Unsplash
Now that we are in an active-pandemic world, those lucky enough to continue remotely working rely on fast and reliable internet service. I’m a software developer by trade and run multiple video meetings a day to keep in touch with our engineering team. All of our systems and tools are in the cloud these days, and we expect that internet service is like electricity and running water.
Working at home now requires me to be the “I.T.” guy. When the service goes down, I will hear it from one of my kids before I even realize it.
“Dad, the internet is down!”
Of course if “the internet” was down, we’d have bigger problems, but I know what they mean. The connection to our service provider has been severed, so they have to pause their online lives until I can solve it.
Troubleshooting Basics
The first thing I do is turn off the wifi on my phone and switch to cellular LTE to check whether there is a localized outage. Having ruled that out, I check the lights on the modem to confirm they are all green. If not, a reboot gets us back online.
The same goes for the mesh pucks we have around the house: if one shows red, a reboot usually restores the connection.
“Ok, it’s back up!”
I feel like the hero and go back to what I was doing, and they return to their TikTok, Instagram, Fortnite, and Netflix sessions.
Other issues expose the kids’ lack of understanding of wireless coverage areas, channel interference, and range. To remediate their complaints, I have tuned the wireless network for decent coverage up to our front street and back into the alley.
Slowness in the system is now attributed to multiple streams of traffic on the network, splitting the bandwidth across dozens of devices, including our security cameras and electronic home assistants.
More Speed == More Bandwidth?
We were choking on our bandwidth with the increased traffic of video meetings, especially during school days with simultaneous remote learning and remote work conference calls. I decided to change from cable internet service to fiber, now that fiber was available in my area. Full-duplex 1 Gbps over the 100 Mbps cable, with a 22% cost saving, seemed like a no-brainer.
Ordering the service was straightforward and smooth on their website, including scheduling when a technician would install the equipment. I couldn't wait to see how fast I could pull down source code and navigate our virtual instances.
The install took a couple of hours with no issues. I was so elated as I plugged my laptop into the ethernet and saw 950Mbps download and 930Mbps upload. I connected to the wifi network and saturated the 5Ghz band at 430Mbps.
A quick test navigating to reddit.com, medium.com, and cnn.com was good. I streamed a Netflix show and Apple music and then tested our digital electronic assistants and security cameras. Everything checked out and seemed to be working well.
Speed !== Reliability
After a few hours, the fiber nirvana came to a screeching halt when I heard a yell from upstairs.
“Dad, the internet is down!”
This was the start of my downward spiral into madness. The Netflix client on our TV would abruptly unload itself at random times. Sites would take an unusual amount of time to load, including Google searches. Zoom meetings would freeze with no audio. Our digital assistants would not answer due to no internet connection. Our security camera videos would blank out, go offline, and return at random intervals.
There were no reported localized or wider outages, and sustained speed tests returned typical results. Internal diagnostics tests on the modem all passed. The lights on both the fiber ONT box and the modem indicated normal operation.
Resetting the modem seemed to stabilize the connection for a short period, and then the dropped and delayed connections returned. I was not going back to cable internet and paying more for less.
I was determined to solve this!
Try All the Things!
A game of elimination is the first step to troubleshoot the issue. I unplugged the wifi network and plugged ethernet directly into my laptop. Maybe a device on the wifi network was flooding it with invalid packets, causing a faux DDoS attack.
With my laptop as the only device connected and a web browser as the single application running, the theory would gain weight if the problem went away. That experiment didn't work; the problem still persisted.
Maybe the DHCP server was the issue and was not renewing IP addresses correctly? I changed to using a static IP address on my laptop, within the defined range. No dice, same issue.
I disabled the DHCP server altogether on the modem and kept the static IP address. Nope, no change.
Could it be the default DNS server causing the long lag times? Pinging it returned decent responses under 15 ms. I changed it to use Google's DNS server at 8.8.8.8 and even Cloudflare's DNS at 1.1.1.1. All returned good responses on pings, but the delays and disconnections continued.
I decided to place the modem into an IP bypass mode to use my router. After some considerable research, I was able to expose the WAN IP address to my router and grant LAN IP addresses through my router’s DHCP server. It was a failed exercise; no change.
How about downgrading the firmware on the modem? I went through the process of flashing the firmware through a few minor revisions and down through a major revision. It failed again.
It was time to call the service provider and convince them to send me a new modem. They claimed all the tests on their end passed, and they did not see any problems. I sent them a log of the errors, and they decided to send a new one.
I was convinced that the first modem was a lemon and that a replacement would solve all our issues. We dealt with the current problems by restarting the modem a few times throughout the day and waited for the new one to arrive.
A few days later, the modem was replaced with the new self-install unit. My frustration continued. Did I get two lemons in a row, or was that model just flawed?
Never Give Up
I was annoyed but not defeated. There had to be some combination of settings and configuration I had not tried. While diving deep into forums discussing the model (BGW210–700) and all of its related issues, I found a one-sentence comment that could easily have been overlooked, but it struck me as weird.
“When the issue happens, even visiting the modem’s internal administration pages has a delay.”
That was odd. I tested it, and it proved correct. Why would visiting a page served by a local web server on the modem be slow? It should be almost instant. After the initial load, the rest of the pages were fast. I had noticed this during the multiple configuration changes while troubleshooting. Does the server go into an idle or sleep mode, and does that cause the problem?
Let’s test it. I sat on the administration page and continuously hit refresh for over 5 minutes, and the delays and disconnections STOPPED! WTF?
The Ping
If refreshing the page keeps the admin web server awake, then would a constant ping to the server work? I opened up a terminal window on my MacBook and typed in:
ping 192.168.1.254
And I let it run continuously. Lo and behold, it worked: it kept the server from idling or sleeping. There was rarely a disconnection or delay. The ultimate test was to run the Netflix client on our TV and stream a movie. I randomly chose a show and let it run. Two hours in, it was still playing without crapping out. The security cameras stayed online as additional proof that this hack worked.
Coming from the vantage point of a software developer, why would there be such a mechanism? Were they thinking about saving energy, or optimizing heat dissipation from a process continually running for a seldom-used feature? Regardless of the reason, why is it so tightly coupled with the modem's main routing function to the internet?
This is probably a design flaw in the firmware that needs to be addressed in the next upgrade. I hope it’s not a hardware issue and that new firmware will fix it. In the meantime, I will continue to run this temporary hack.
What’s Next?
I have an old spare Android tablet connected through wifi to our network, constantly pinging the modem's administration page to keep it from idling. It seems overkill for a tablet to run a simple ping. It could perfectly fit as a Raspberry Pi project, though.
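If you would rather script the keep-alive than leave a terminal window open, a minimal sketch in Python could look like this. The gateway address and 5-second interval are assumptions matching my setup, and the -c count flag is the macOS/Linux form of ping (Windows uses -n):
import subprocess
import time
MODEM_IP = '192.168.1.254'  # the modem's admin/gateway address on my network
while True:
    # send a single ping just to keep the modem's web server awake;
    # we don't care about the result, so the output is discarded
    subprocess.run(['ping', '-c', '1', MODEM_IP],
                   stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL)
    time.sleep(5)  # a short pause between pings still prevents idling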
Stay tuned for updates if I decide to tackle that project. I hope this article helps other users who have experienced the same issues. Or is this a unique problem surfaced by my network architecture?
Troubleshooting an intermittent issue is rarely trivial, and it will take you down unexpected paths. My experience hunting down defects in software engineering prepared me to investigate all clues, regardless of how insignificant or unrelated they may seem. This ping has saved my sanity, and I was able to play the hero for my kids again.
|
https://medium.com/literally-literary/a-ping-that-saved-me-from-madness-a4da181c22c7
|
['Tuan Pham-Barnes']
|
2020-08-11 06:25:47.081000+00:00
|
['Debugging', 'Nonfiction', 'Technology', 'Wifi', 'Internet']
|
5,572 |
An effective lead nurturing strategy should not be based on email and calls only
|
Tricky way from TOFU lead to MQL
Nowadays it takes less effort for marketers to create a solid lead flow. A use-case or client-pain-point centric approach on search engines as well as social media can deliver as many leads as your business needs, with a CPA under $40. In addition, demand generation professionals can take advantage of the supersaturated lead generation market to acquire any quantity of legitimate leads, with the right criteria, in bulk. But how do you convert those TOFU leads into sales-qualified ones?
The times of "handraisers" are gone, and not many hot leads are available in the market. The majority of potential clients need to be properly nurtured and guided through their journey before they meet salespeople. What can you do to warm up those cold leads? A lead nurturing strategy, based on the customer journey plus marketing automation technologies, could be the right answer.
What is lead nurturing?
Marketo says that lead nurturing is the specific process of developing relationships with buyers at every stage of their journey through the sales funnel. The goal of that process is to convert cold/TOFU leads into accepted leads, marketing-qualified leads, sales-qualified leads, opportunities, won deals, and advocates, respectively. An organization has to determine the fit of a prospect based on what they do, where they go, or what they say. In the simplest case, if a website visitor downloads an asset, there can be a business rule that places that person on a nurture path at a specific phase or stage.
According to the Annuitas Group, in 2018 businesses that use marketing automation and multiple-channel approach to nurture prospects experience a 451% increase in qualified leads and nurtured leads make 47% larger purchases than non-nurtured leads.
However, not many businesses in the US, or globally, understand their customer's journey and have an effective strategy in their hands. The common practice is to engage leads via a very limited set of channels: email and phone calls.
Lead nurturing is not email and calls only
I cannot deny that email still remains the most effective channel to move leads through the funnel, but it is used too frequently. It annoys. Marketers cannot communicate with clients at the right time, and that all leads to unsubscribes. My experience tells me that it is much easier to lose a lead than to convert one with an email campaign.
Phone calls allow you to get direct feedback from the client and react to it immediately. However, this channel has the same cons as email.
How to nurture your leads effectively?
I believe that the best practice is to use the power of all owned, paid, and earned marketing channels to engage and delight every potential client in your database. 20% of a marketer's job lies on the demand generation side, and 80% sits in lead nurturing. Budgets should be allocated accordingly.
If you would like to warm up your leads, be ready to touch them at least 10 times via multiple channels and multiple offers. Your brand needs to earn their trust first.
As soon as you receive a lead, you have enough information to target it and guide it through its journey. Modern marketing technologies are here to help. Social media platforms like Facebook and LinkedIn allow you to show specific posts to a list of specific individuals based on their emails. Google Ads allows you to show ads (display and search) to users based on their email address too.
Do not forget that if your brand is B2B, you also need to engage all members of the buying center. ABM is the best channel to keep your leads engaged at the account level. Let the whole company be aware of your brand and your offer. It did a great job for me.
To use all marketing tools effectively, you need to segment users based on their stage in the sales funnel and then come up with a message that highlights the major need of that segment. Challenging? Who said it would be easy?
Here is an example of the light customer journey in B2B SaaS industry
0. A lead is captured by lead generation via a content syndication program.
Let's assume it downloaded a white paper that highlights its main need on some third-party website. The lead read it and successfully forgot about it.
The nurturing is activated.
A few days after the first touch, the lead saw a post on Facebook about a topic relevant to its previously identified need, and ignored it by not reacting;
The next day, the lead saw a banner on the web, clicked it, and landed on the website, but did not convert since it did not like the CTA. Still, it became familiar with the brand;
That same day, a colleague saw the same banner and told the lead about the offer;
After a while, the lead searched the web for content relevant to its need and clicked an ad on the SERP. It liked the landing page and converted by downloading another white paper;
Later, it received an email with an offer to download another asset, and that worked;
A sales rep contacted the lead and told it about a webinar that demonstrates how to solve its problem (need) completely;
The lead attended the webinar and asked a couple of questions, as did some of its colleagues;
Finally, the lead visited the website directly and scheduled an appointment with the sales team.
… and another nurturing campaign has begun to generate an opportunity.
In this example, it took 9 touches and 3 pieces of content to get an MQL, and we need more touches to generate an opportunity.
Welcome to content marketing era
As it is written, marketing is rapidly shifting from a product orientation to a customer-centric one, and we need to build all strategies based on clients' needs. Marketers no longer advertise; to engage customers, they run content marketing activities, a part of inbound marketing. The inbound methodology leads the majority of marketing efforts now. It requires us to attract, engage, and delight our customers via multiple channels.
For all businesses, the real job begins when they capture the lead. Any lead has to be properly nurtured to become a customer and then an advocate of the brand.
An effective lead nurturing strategy consists of three things that work for a specific segment:
The right mix of channels; a relevant message; an effective cadence.
It is hard to tell everything I can about lead nurturing in one article, so if you have any questions, feel free to visit andreypalagin.com and reach out to me for more information…
… and do not abandon your leads!
|
https://andreypalagin.medium.com/effective-lead-nurturing-strategy-should-not-be-based-on-email-and-calls-only-8d662f959646
|
['Andrey Palagin']
|
2019-11-11 06:21:16.204000+00:00
|
['Marketing', 'Lead Nurturing', 'Abm', 'Marketing Startegies', 'Digital Marketing']
|
5,573 |
Coding Your Way to Wall Street
|
The Trading Algorithm
Let’s start by importing the libraries we will need:
A major difference with Quantopian's IDE is how some functions are run. You are still able to call functions as normal, but others need to be scheduled to run at specific times during the market day.
Initialize Function
The Initialize function
Here we have the initialize() function. Within this function we will be using the algo.schedule_function() to schedule three functions:
trading()
exiting_trades()
record_vars()
The other arguments:
algo.date_rules.every_day()
algo.time_rules.market_close()
specify when the functions will run. These functions are scheduled at those times to avoid potential leverage issues. Leverage plays an important role in our algorithm, which will be explained later on.
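As a minimal sketch, assuming Quantopian's public scheduling API (the timing offsets and sid numbers here are illustrative placeholders, not necessarily the exact values used in the screenshots), such an initialize() could look like this:
import quantopian.algorithm as algo
def initialize(context):
    # our candidate stocks; confirm each sid # with the IDE's drop-down menu
    context.stocks = [sid(39840), sid(2190), sid(24), sid(8554)]
    # run the trading and exit logic every day near the close,
    # and record our variables at the close
    algo.schedule_function(trading,
                           algo.date_rules.every_day(),
                           algo.time_rules.market_close(minutes=60))
    algo.schedule_function(exiting_trades,
                           algo.date_rules.every_day(),
                           algo.time_rules.market_close(minutes=30))
    algo.schedule_function(record_vars,
                           algo.date_rules.every_day(),
                           algo.time_rules.market_close())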
Trading Function
So now that we have an initialize() function that will run our algorithm, we can move on to the trading() function which will automatically trade the stocks we want.
But first we must explain how we retrieve the stocks we want. In Quantopian, stocks are assigned unique id values. To access a stock, we call the sid() function and type in the stock ticker, and a drop-down menu will appear. See below for the id number assigned to TSLA.
TSLA and its sid #
For our trading function, we will make a list of stocks that we would like to pick from: TSLA, DIS, AAPL, and SPY, each with its own sid #. There are ways to grab more stocks and filter out selected ones, but that requires another function.
With our list of stocks, we will iterate through them with a 'for' loop. The steps taken in the loop are:
1. Fetch each stock's closing price history.
2. Calculate its 50-day and 200-day moving averages.
3. Create a True/False statement to determine crossing points.
4. Calculate the leverage allowance and set it as another True/False statement.
5. Create conditional statements using open_lev, and bullish_cross or bearish_cross.
6. Once all conditions are satisfied, place an order for the stock using order_target_percent(stock, 0.25) to fill 25% of our portfolio with that stock. (We set the percentage as positive or negative to buy or short the stock.)
The Trading function
Now each step requires specific Quantopian methods and functions. So please refer to the help page once again for a detailed explanation. Next, we’ll be explaining the conceptual reasoning behind the crossing signals and open leverage.
To determine if the 50 day MA crosses the 200 day MA in a bullish fashion, we set the 50 day MA to be less than the 200 day MA, both calculated from one day before. Then, we set the current 50 day MA to be greater than the current 200 day MA. By setting this as a conditional statement, we can capture the crossing point of the moving averages.
Now regarding open leverage, its importance is taken into account because leverage determines whether we are using our own money or borrowing money. The results from borrowing money can drastically alter our trading outcomes. The formula and schedule coded before allow us to make trades without exceeding our own leverage or cash limit.
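Putting the crossing signals and the leverage check together, a hedged sketch of trading() could look like the following; the 201-bar history window is my own choice so that both today's and yesterday's averages come from one data.history() call:
def trading(context, data):
    for stock in context.stocks:
        # 1. fetch the closing price history (201 bars: 200 days plus today)
        hist = data.history(stock, 'price', 201, '1d')
        # 2. the 50- and 200-day moving averages, yesterday and today
        ma50_prev, ma200_prev = hist[-51:-1].mean(), hist[:-1].mean()
        ma50_now, ma200_now = hist[-50:].mean(), hist[-200:].mean()
        # 3. True/False crossing signals
        bullish_cross = ma50_prev < ma200_prev and ma50_now > ma200_now
        bearish_cross = ma50_prev > ma200_prev and ma50_now < ma200_now
        # 4. leverage allowance: only trade while we stay within our own cash
        open_lev = context.account.leverage < 1.0
        # 5./6. place the order once all conditions are satisfied
        if open_lev and bullish_cross:
            order_target_percent(stock, 0.25)   # long: 25% of the portfolio
        elif open_lev and bearish_cross:
            order_target_percent(stock, -0.25)  # short: 25% of the portfolio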
Exiting Trades Function
Now the next function, exiting_trades() , is very similar to our trading function. The only difference is checking if we have any positions open in the first place and what kind of position that is (long or short).
The exiting trade function
As you can see, there is not much of a difference between exiting_trades() and trading(). This exiting trades function repeats most of the calculations and conditions from before in order to close positions whenever the moving averages cross against us.
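As a sketch under the same assumptions as above, the open-position check can go through context.portfolio.positions before the crossing logic is re-applied:
def exiting_trades(context, data):
    for stock in context.stocks:
        amount = context.portfolio.positions[stock].amount
        if amount == 0:
            continue  # nothing to close for this stock
        hist = data.history(stock, 'price', 201, '1d')
        ma50_prev, ma200_prev = hist[-51:-1].mean(), hist[:-1].mean()
        ma50_now, ma200_now = hist[-50:].mean(), hist[-200:].mean()
        bullish_cross = ma50_prev < ma200_prev and ma50_now > ma200_now
        bearish_cross = ma50_prev > ma200_prev and ma50_now < ma200_now
        # close longs on a bearish cross and shorts on a bullish cross
        if (amount > 0 and bearish_cross) or (amount < 0 and bullish_cross):
            order_target_percent(stock, 0)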
Next, let’s finish the code up with a record_vars() function to keep track of a few variables that we believe are necessary.
Here we will record() our leverage and how many positions we have open. We want leverage not to drastically exceed 1, and open positions should not exceed 4, because we devoted only 25% of our portfolio to each stock we trade.
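A record_vars() along these lines is enough; record() is Quantopian's built-in charting helper:
def record_vars(context, data):
    # plot leverage and the number of open positions under the backtest chart
    record(leverage=context.account.leverage,
           open_positions=len(context.portfolio.positions))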
Running a Backtest
Finally, set the dates and starting capital to match the ones below. We won't run a full backtest here; a full backtest would provide more information about the coded strategy, but as of now it is not needed.
Set to 7 years of backtesting with a starting amount of $10,000
Click the Build Algorithm button, which will run the code, and you will end up with a page like this:
Results of our coded strategy
Great! We have successfully backtested our first trading algorithm. We limited the leverage to 1 (albeit with a little leeway, which is fine), and the trading positions never exceeded 4. Feel free to alter some of the code to see if it significantly affects the results.
|
https://medium.com/swlh/coding-your-way-to-wall-street-bf21a500376f
|
['Marco Santos']
|
2020-05-23 18:40:05.923000+00:00
|
['Trading', 'Algorithms', 'Coding', 'Python', 'Stock Market']
|
5,574 |
Data Transformation
|
In the previous article, I briefly introduced Volume Spread Analysis (VSA). After we did feature engineering and feature selection, two things stood out immediately: first, there were outliers in the dataset, and second, the distributions were nowhere near normal. Using the methods described here, here and here, I removed most of the outliers. Now it is time to face the bigger problem: normality.
There are many ways to transform data. One well-known example is one-hot encoding; an even better one is word embedding in natural language processing (NLP). One of the advantages of deep learning is that it completely automates what used to be the most crucial step in a machine-learning workflow: feature engineering. Before we get into deep learning in later articles, let's have a look at some simple ways to transform data and see if we can bring it closer to a normal distribution.
In this article, I would like to try a few things. The first is to transform all the features into simple percentage changes. The second is percentile ranking. In the end, I will show you what happens if I keep only the sign of the data. Methods like the Z-score, which are standard pre-processing in deep learning, I will leave for now.
1. Data preparation
For consistency, in all the 📈Python for finance series, I will try to reuse the same data as much as I can. More details about data preparation can be found here, here and here or you can refer back to my previous article. Or if you like, you can ignore all the code below and use whatever clean data you have at hand, it won’t affect the things we are going to do together.
#import all the libraries
import pandas as pd
import numpy as np
import seaborn as sns
import yfinance as yf #the stock data from Yahoo Finance
import matplotlib.pyplot as plt

#set the parameters for plotting
plt.style.use('seaborn')
plt.rcParams['figure.dpi'] = 300

#define a function to get data
def get_data(symbols, begin_date=None, end_date=None):
    df = yf.download('AAPL', start='2000-01-01',
                     auto_adjust=True, #only download adjusted data
                     end='2010-12-31')
    #my convention: always lowercase
    df.columns = ['open', 'high', 'low', 'close', 'volume']
    return df

prices = get_data('AAPL', '2000-01-01', '2010-12-31')

#create some features
def create_HLCV(i):
    #as we don't care about open that much, that leaves volume,
    #high, low and close
    df = pd.DataFrame(index=prices.index)
    df[f'high_{i}D'] = prices.high.rolling(i).max()
    df[f'low_{i}D'] = prices.low.rolling(i).min()
    #close_{i}D = close, as rolling backwards means today is
    #literally the last day of the rolling window
    df[f'close_{i}D'] = prices.close.rolling(i).apply(lambda x: x[-1])
    df[f'volume_{i}D'] = prices.volume.rolling(i).sum()
    return df

#create features at different rolling windows
def create_features_and_outcomes(i):
    df = create_HLCV(i)
    high = df[f'high_{i}D']
    low = df[f'low_{i}D']
    close = df[f'close_{i}D']
    volume = df[f'volume_{i}D']
    features = pd.DataFrame(index=prices.index)
    outcomes = pd.DataFrame(index=prices.index)
    #as we already considered the different time spans,
    #only a 1-day simple percentage change is used here
    features[f'volume_{i}D'] = volume.pct_change()
    features[f'price_spread_{i}D'] = (high - low).pct_change()
    #align the close location with the stock price change
    features[f'close_loc_{i}D'] = ((close - low) / (high - low)).pct_change()
    #the future outcome is what we are going to predict
    outcomes[f'close_change_{i}D'] = close.pct_change(-i)
    return features, outcomes

def create_bunch_of_features_and_outcomes():
    '''
    the timespans that I would like to explore
    are 1, 2, 3 days and 1 week, 1 month, 2 months, 3 months,
    which roughly are [1, 2, 3, 5, 20, 40, 60]
    '''
    days = [1, 2, 3, 5, 20, 40, 60]
    bunch_of_features = pd.DataFrame(index=prices.index)
    bunch_of_outcomes = pd.DataFrame(index=prices.index)
    for day in days:
        f, o = create_features_and_outcomes(day)
        bunch_of_features = bunch_of_features.join(f)
        bunch_of_outcomes = bunch_of_outcomes.join(o)
    return bunch_of_features, bunch_of_outcomes

bunch_of_features, bunch_of_outcomes = create_bunch_of_features_and_outcomes()

#define the method to identify outliers
def get_outliers(df, i=4):
    #i is the number of sigmas, which defines the boundary along the mean
    outliers = pd.DataFrame()
    stats = df.describe()
    for col in df.columns:
        mu = stats.loc['mean', col]
        sigma = stats.loc['std', col]
        condition = (df[col] > mu + sigma * i) | (df[col] < mu - sigma * i)
        outliers[f'{col}_outliers'] = df[col][condition]
    return outliers

#remove all the outliers
features_outcomes = bunch_of_features.join(bunch_of_outcomes)
outliers = get_outliers(features_outcomes, i=1)
features_outcomes_rmv_outliers = features_outcomes.drop(index=outliers.index).dropna()

features = features_outcomes_rmv_outliers[bunch_of_features.columns]
outcomes = features_outcomes_rmv_outliers[bunch_of_outcomes.columns]
features.info(), outcomes.info()
Information of features dataset
Information of outcomes dataset
In the end, we will have the four basic features based on Volume Spread Analysis (VSA) at the different time scales listed below, namely 1 day, 2 days, 3 days, a week, a month, 2 months and 3 months.
Volume: pretty straightforward
Range/Spread: the difference between high and low
Closing Price Relative to Range: is the closing price near the top or the bottom of the price bar?
The change of stock price: pretty straightforward
2. Percentage Returns
I know that's a whole lot of code above. All the features are transformed into simple percentage changes through the function below.
def create_features_and_outcomes(i):
    df = create_HLCV(i)
    high = df[f'high_{i}D']
    low = df[f'low_{i}D']
    close = df[f'close_{i}D']
    volume = df[f'volume_{i}D']
    features = pd.DataFrame(index=prices.index)
    outcomes = pd.DataFrame(index=prices.index)
    #as we already considered the different time spans,
    #only a 1-day simple percentage change is used here
    features[f'volume_{i}D'] = volume.pct_change()
    features[f'price_spread_{i}D'] = (high - low).pct_change()
    #align the close location with the stock price change
    features[f'close_loc_{i}D'] = ((close - low) / (high - low)).pct_change()
    #the future outcome is what we are going to predict
    outcomes[f'close_change_{i}D'] = close.pct_change(-i)
    return features, outcomes
Now, let's have a look at their correlations using a cluster map. Seaborn's clustermap() hierarchical clustering algorithm offers a nice way to group the most closely related features.
corr_features = features.corr().sort_index()
sns.clustermap(corr_features, cmap='coolwarm', linewidth=1);
Based on this cluster map, to minimize the amount of feature overlap among the selected features, I will remove the features that are closely paired with other features and have less correlation with the outcome targets. From the cluster map above, it is easy to spot that the features on [40D, 60D] and [2D, 3D] are paired together. To see how those features relate to the outcomes, let's first look at how the outcomes are correlated.
corr_outcomes = outcomes.corr()
sns.clustermap(corr_outcomes, cmap='coolwarm', linewidth=2);
From top to bottom, the 20-day, 40-day and 60-day price percentage changes are grouped together, as are the 2-day, 3-day and 5-day ones, whereas the 1-day stock price percentage change is relatively independent of those two groups. If we pick the next day's price percentage change as the outcome target, let's see how those features are related to it.
corr_features_outcomes = features.corrwith(outcomes.close_change_1D).sort_values()
corr_features_outcomes.dropna(inplace=True)
corr_features_outcomes.plot(kind='barh', title='Strength of Correlation');
The correlation coefficients are way too small to draw a solid conclusion from. I would expect the most recent data to have a stronger correlation, but that is not the case here.
How about the pair plot? We only pick the features based on a 1-day time scale as a demonstration. In the meantime, I converted close_change_1D to its sign, based on whether it is a negative or positive number, to add extra dimensionality to the plots.
selected_features_1D_list = ['volume_1D', 'price_spread_1D',
                             'close_loc_1D', 'close_change_1D']
features_outcomes_rmv_outliers['sign_of_close'] = \
    features_outcomes_rmv_outliers['close_change_1D'].apply(np.sign)

sns.pairplot(features_outcomes_rmv_outliers,
             vars=selected_features_1D_list,
             diag_kind='kde',
             palette='husl', hue='sign_of_close',
             markers=['*', '<', '+'],
             plot_kws={'alpha': 0.3});
The pair plot builds on two basic figures: the histogram and the scatter plot. The histograms on the diagonal allow us to see the distribution of a single variable, while the scatter plots on the upper and lower triangles show the relationship (or lack thereof) between two variables. From the plots above, we can see that price spreads get wider with high volume. Most price changes occur within a narrow price spread; in other words, a wider spread doesn't always come with a bigger price fluctuation. Either low or high volume can cause price changes at almost any scale. And we can apply all those conclusions to both up days and down days.
You can also use the close location of the bars to add more dimensionality; simply apply
features['sign_of_close_loc'] = np.where(
    features['close_loc_1D'] > 0.5,
    1, -1)
to see how many bars' close locations are above or below 0.5.
One thing I don't really like in the pair plot is that all the plots involving close_loc_1D are condensed; it looks like the outliers are still there, even though I used one standard deviation as the boundary, which is a very low threshold, and 338 outliers were removed. I realize that because the location of the close is already a percentage, adding another percentage change on top doesn't make much sense. Let's change it.
def create_features_and_outcomes(i):
    df = create_HLCV(i)
    high = df[f'high_{i}D']
    low = df[f'low_{i}D']
    close = df[f'close_{i}D']
    volume = df[f'volume_{i}D']
    features = pd.DataFrame(index=prices.index)
    outcomes = pd.DataFrame(index=prices.index)
    #as we already considered the different time spans,
    #only a 1-day simple percentage change is used here
    features[f'volume_{i}D'] = volume.pct_change()
    features[f'price_spread_{i}D'] = (high - low).pct_change()
    #remove pct_change() here
    features[f'close_loc_{i}D'] = (close - low) / (high - low)
    #predict the future with -i
    outcomes[f'close_change_{i}D'] = close.pct_change(-i)
    return features, outcomes
With pct_change() removed, let’s see how the cluster map looks like now.
corr_features = features.corr().sort_index()
sns.clustermap(corr_features, cmap='coolwarm', linewidth=1);
The cluster map makes more sense now. All four basic features have pretty much the same pattern. [40D, 60D], [2D, 3D] are paired together.
And in terms of the features' correlations with the outcome:
corr_features_outcomes.plot(kind='barh',title = 'Strength of Correlation');
The longer-range time scale features have weak correlations with the stock price return, while more recent events have a greater effect on the price returns.
By removing pct_change() from close_loc_1D, the biggest difference shows up in the pairplot().
Finally, the close_loc_1D variable plots in the right range. This illustrates that we should be careful with over-engineering; it may lead to totally unexpected results.
3. Percentile Ranking
According to Wikipedia, the percentile rank is
“The percentile rank of a score is the percentage of scores in its frequency distribution that are equal to or lower than it. For example, a test score that is greater than 75% of the scores of people taking the test is said to be at the 75th percentile, where 75 is the percentile rank.”
The below example returns the percentile rank (from 0.00 to 1.00) of traded volume for each value as compared to a trailing 60-day period.
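As a minimal sketch, assuming the prices DataFrame from the data-preparation step above, the trailing percentile rank can be computed with a rolling apply:
#percentile rank (0.00 to 1.00) of each day's volume
#within its trailing 60-day window
roll_rank = lambda x: pd.Series(x).rank(pct=True).iloc[-1]
volume_rank_60D = prices.volume.rolling(60).apply(roll_rank, raw=True)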
|
https://towardsdatascience.com/data-transformation-e7b3b4268151
|
['Ke Gui']
|
2020-10-13 11:19:14.340000+00:00
|
['Machine Learning', 'Trading', 'Artificial Intelligence', 'Data Science', 'Pandas']
|
5,575 |
How to Impress With Your First Impression
|
How to Impress With Your First Impression
Be “you-centric”
Photo by Hồ Ngọc Hải on Unsplash
“You don’t get a second chance to make a first impression.” My soccer coach first presented this phrase to our team when we were 16 years old. He delivered an inspiring speech about how college coaches were going to start coming to our games, and how we had to put on our best performance each time they came, because we never knew who would be watching. We may have opportunities to impress these coaches again, but we would not have a second chance at making a first impression.
As my pursuit of playing college soccer intensified, the next step after playing in front of coaches was to meet them in person. Coaches would invite you to their university for the day, and you would have what is considered an unofficial visit to their campus. Before my first visit, I remember having a meeting with my own coach, in which he further explained the importance of this first impression. But this time, he went into detail about how to make this first impression flawless.
First impressions in relationships are quite important. People make snap judgments about your appearance, your body language, your posture, your tone, and of course your words. Whether it’s meeting new colleagues for the first time, going on a first date, meeting your significant other’s friends and family, or meeting a college coach, it’s crucial to understand what goes into making a positive first impression. The following are some tips my coach shared with me that I always remember when I meet people.
|
https://medium.com/real-1-0/how-to-impress-with-your-first-impression-2fe5e8360cb6
|
['Jordan Gross']
|
2020-11-14 15:01:57.715000+00:00
|
['Motivation', 'Communication', 'Life Lessons', 'Relationships', 'Inspiration']
|
5,576 |
Chatbots- Connecting Artificial Intelligence and Customer Service
|
Every business revolves around customers and the interactions carried out with them. It is said that the customer is the boss of a business, and every interaction counts! An infallible way of dealing with this pressing subject is the use of Chatbots. Chatbots conduct an online chat conversation via text or text-to-speech, in lieu of direct contact with a live human agent. With rising demand all over the world, they have gained immense popularity and are widely used in a myriad of industries to render a pleasant and uniform customer experience. They can be used to answer FAQs, handle customer queries and grievances, manage bookings, make recommendations, support CRM, and provide 24/7 customer support. They can be rule-based or have Natural Language Understanding.
Let’s look at some of the benefits Chatbots offer:
Accessible anytime
Handling Capacity
Flexible attribute
Customer Satisfaction
Cost Effective
Faster Onboarding
Work Automation
Alternate sales channel
Personal Assistant
Chatbots can be integrated with various platforms such as Google Dialogflow, Microsoft Bot Builder, Amazon Lex, RASA and Wit.ai.
Aim and Scope
Whether it is for placing orders or recommending products, most businesses today use Chatbots to provide an efficacious customer experience and obtain a competitive advantage.
Here we will look at the steps involved in building a chatbot to help customers streamline their orders for a pizzeria. Perusing menus and placing orders can often be a cumbersome and time-consuming task. This Chatbot aims to cut out the tedious steps of flipping through menus and offers a personalized, customized experience. It will recommend a particular dish to users based on their choice of ingredients and help provide a smooth user experience.
What type of pizza should you order if you enjoy basil on your pizza? What are you most likely to relish if you love pineapple? Which pizza would be best suited for olive lovers?
The bot will instantaneously answer all these questions and more! It will welcome customers to the pizzeria and recommend a particular type of pizza they are most likely to devour, based on their preferred ingredients and toppings. It will also take orders from the user, serving as a promising expression of efficiency, availability, interactivity, and customer loyalty.
Steps
We create the bot using Python and RiveScript. In order to train the bot, a dataset needs to be created. For the purpose of demonstration, I will be using a small dataset consisting of 50 records and 2 columns, “Pizza name” and “ingredients.”
The first step is to import the essential libraries we will need.
RiveScript is a simple and user-friendly scripting language for Chatbots. It is a rule-based engine, where the rules can be created by us. These use a scripting metalanguage (simply called a “script”) as their source code.
The next step is to set up the bot dictionary.
Next, we will use two concepts:
Count Vectorizer
Cosine Similarity
Count Vectorizer is used to transform a collection of texts into a matrix of word counts, with one row per document and one column per unique word in the vocabulary, on the basis of how frequently each word occurs in the text.
For instance,
Data= [‘The’, ‘Bot’, ‘will’, ‘recommend’, ‘pizzas’, ‘for’, ‘the’, ‘customer’]
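To make this concrete, here is a minimal sketch of count vectorization, assuming scikit-learn as the implementation (the article does not name the library it uses):

```python
# Minimal count-vectorization sketch using scikit-learn (an assumed choice).
from sklearn.feature_extraction.text import CountVectorizer

docs = [
    "The Bot will recommend pizzas for the customer",
    "The customer will order a pizza",
]

vectorizer = CountVectorizer()
counts = vectorizer.fit_transform(docs)  # rows = documents, columns = vocabulary words

print(vectorizer.get_feature_names_out())  # the learned vocabulary
print(counts.toarray())                    # word counts per document
```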
Cosine Similarity is a concept commonly used in recommendation systems. It is a measure of the similarity between two non-zero vectors of an inner product space. To understand it better, consider two points, P1 and P2, in a multidimensional space. Mathematically, the smaller the distance between the two points, the more similar they are, and as the distance increases, the similarity between them decreases. Cosine Similarity depicts how similar the two points are by taking the cosine of the angle between them. It ranges from -1 to +1. It compares how similar documents are by considering the arrays containing the word counts of the documents.
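As a hedged sketch of how the two concepts fit together, the word-count matrix built above can be fed straight into a cosine-similarity computation (again assuming scikit-learn):

```python
# Cosine similarity over word-count vectors; since counts are non-negative,
# the values here fall between 0 (no shared words) and 1 (identical direction).
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.metrics.pairwise import cosine_similarity

docs = [
    "The Bot will recommend pizzas for the customer",
    "The customer will order a pizza",
]
counts = CountVectorizer().fit_transform(docs)

similarity = cosine_similarity(counts)
print(similarity)  # similarity[i][j] compares document i with document j
```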
The next step is to create a function to get replies from the bot. If the value returned by the previous function is not ‘0’, the bot will recommend a particular type of pizza the user is most likely to enjoy.
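The author’s actual function is not shown here, but a hypothetical version might look like the following; the names df, “Pizza name”, and “ingredients” mirror the dataset described earlier and are assumptions, not the article’s exact code:

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Hypothetical recommendation helper: match the user's ingredients against each
# pizza's ingredient list and return the closest pizza, or None when there is
# no overlap at all (the "0" case mentioned above).
def recommend_pizza(user_ingredients, df):
    texts = df["ingredients"].tolist() + [user_ingredients]
    matrix = CountVectorizer().fit_transform(texts)
    sims = cosine_similarity(matrix[-1], matrix[:-1]).flatten()
    if sims.max() == 0:
        return None
    return df["Pizza name"].iloc[sims.argmax()]
```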
The last step is to write the code for the Flask app.
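A minimal sketch of that wiring, assuming the rivescript-python package and a folder of .rive files (the folder name and route below are assumptions, not the author’s exact app):

```python
# Illustrative Flask endpoint that passes user messages to the RiveScript bot.
from flask import Flask, request, jsonify
from rivescript import RiveScript

app = Flask(__name__)
bot = RiveScript()
bot.load_directory("./brain")  # assumed folder containing the .rive files
bot.sort_replies()

@app.route("/chat", methods=["POST"])
def chat():
    message = request.json["message"]
    return jsonify({"reply": bot.reply("localuser", message)})

if __name__ == "__main__":
    app.run()
```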
This is what your rivescript will look like:
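The article’s actual script is not reproduced here; as a generic illustration of RiveScript’s trigger/response syntax, a script can also be streamed into the bot directly from Python:

```python
# Generic RiveScript triggers (illustrative only), streamed in from Python.
from rivescript import RiveScript

bot = RiveScript()
bot.stream("""
+ hello
- Hello! Welcome to the pizzeria. Which ingredients do you enjoy?

+ i like *
- Great choice! Let me find a pizza with <star> for you.
""")
bot.sort_replies()

print(bot.reply("localuser", "hello"))
```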
You can customize it based on preferences or business requirements.
Lastly, we must build a User Interface for our bot in order to create a personalized, branded experience and enable efficient communication with customers to serve them better.
|
https://medium.com/analytics-vidhya/chatbots-connecting-artificial-intelligence-and-customer-service-d8efbc604e02
|
['Heena Rijhwani']
|
2020-12-06 10:38:11.357000+00:00
|
['Artificial Intelligence', 'Chatbots', 'Cosine Similarity', 'Customer Service', 'Count Vectorizer']
|
5,577 |
A Novel in Thirty Days: Drawing a Blueprint
|
A Novel in Thirty Days: Drawing a Blueprint
Let the preparations begin
This July, I’m taking another stab at fast-drafting a novel by participating in the controlled chaos that is Camp NaNo. In this piece, I discussed the mostly tangible preparations I’ve made, as recommended in Chris Baty’s book No Plot? No Problem! These include establishing a writing nest, planning when to write, and gathering the appropriate tools.
Today’s focus is on the intangibles critical to the success of such a mad dash to the noveling finish line, and most likely these will continue until approximately 11:59pm on July 31st.
First things first.
I already had an idea bubbling merrily on a back burner, so the most difficult step was finished. I just needed to put it front and center for the next month.
Then came the most critical factor of all: creating a cover for the novel I haven’t yet written.
Crickets chirping…
Wait, that’s not the first thing you do? Oh. Hmm. Well, I did that. Because reasons.
Anyways…
A writing exercise to get us moving
After briefly laying out the reasoning behind his one-week limit on prep time, chapter four of No Plot? No Problem! offers a warm-up writing exercise. This consists of answering the question, “What, to you, makes a good novel?”
“What, to you, makes a good novel?”
For the first half, Baty suggests making a quick list of anything you’ve noticed that consistently appears in books that you like, and saving it to refer to throughout the month. Why? “Because the things that you appreciate as a reader are also the things you’ll likely excel at as a writer.”
He calls it The Magna Carta, and he encourages incorporating as many of these elements as possible while developing your story.
This is what mine looks like:
Characters who are reasonably mature (or become so very quickly)
Characters who are quirky and irreverent
Complex and nuanced antagonists
Humor, or books that don’t take themselves too seriously
Romance that builds over time
Romantic partners that balance each other
Commitment and loyalty between partners
Situations that pull the rug out from under the MCs (Main Characters)
An honest portrayal of mental illness, as a facet of character
Unique settings and worldbuilding
Subverted tropes
Close third person POV (Point Of View)
Happy endings
For the second half, Baty tells us to “write down those things that bore or depress you in novels.” These are important to recognize, because: “If you won’t enjoy reading it, you won’t enjoy writing it.” Listing them clearly and referring to the list frequently will help keep you from accidentally including them in your novel.
“If you won’t enjoy reading it, you won’t enjoy writing it.”
He calls this list the Magna Carta II, the Evil Twin of Magna Carta I. Here’s mine:
Characters with no redeeming qualities, or who are insufferably immature
Miscommunication as a plot device
Mental illness as a plot device
Mental illness that is inaccurately or insensitively portrayed
Books that try too hard to be serious, such as most literary fiction
Insta-love, or love at first sight
Hate-to-love relationships, especially with characters who supposedly hate each other but still find each other sexually irresistible
Narration that distances the reader from the characters
Predictability, or over-reliance on cliches
Unsatisfying endings, especially involving the death of one or more characters or the end of a significant relationship
Getting to know the cast
Keeping in mind what we’ve learned from The Magna Carta and the Evil Twin, Baty gently leads us through fleshing out our characters, plot, setting, and POV.
Characters. My story will feature two main characters, Avery and Echo (they don’t have last names yet). The eight questions Baty suggests posing to your characters taught me a lot that I didn’t know about them.
How old are they?
What is their gender?
What do they do for work?
Who are their friends, family, and love interests?
What is their living space like?
What are their hobbies?
What were they doing a year ago? Five years ago?
What are their values and politics?
I’ve chosen not to include my answers to these questions. Otherwise, it would be so long I might as well just write the novel right here.
Plot, Setting, and POV
POV was an easy choice: I enjoy reading close third, so that’s what my novel will be. I have some bare-bones ideas for plot and setting. Those are next on my to-do list, so stay tuned!
A final note
Keep the concept of exuberant imperfection in mind as you make your own preparations. Remember not to get caught up in making every detail right, and just have fun with the process. That’s what NaNoWriMo is all about!
|
https://medium.com/write-well-be-well/a-novel-in-thirty-days-drawing-a-blueprint-2ca7f3986b49
|
['Rianne Grace']
|
2019-06-25 15:13:40.690000+00:00
|
['Advice', 'NaNoWriMo', 'Inspiration', 'Writing', 'Creative Writing']
|
5,578 |
Why Americans Want Polarization
|
“What we need in the United States is not division; what we need in the United States is not hatred; what we need in the United States is not violence or lawlessness; but love and wisdom, and compassion toward one another, and a feeling of justice toward those who still suffer within our country, whether they be white or they be black.”
These words were spoken to an audience in Indianapolis by Senator Robert F. Kennedy upon the assassination of Martin Luther King, Jr. The brief speech has been called one of the great public addresses of the modern world. Two months later, on June 6, 1968, Kennedy himself was assassinated while campaigning for the Democratic nomination for President of the United States.
The loss of these two extraordinary seekers of peace sent us reeling as a country. Most of us. There were Americans, though, who celebrated their deaths. At the time, and I remember it well, this felt shocking when it was revealed. If we own and honor our humanity, surely we cannot rejoice at the death of anyone.
But on recalling this reaction, I am also reminded of the behavior of people in the Middle East who danced upon learning that the Twin Towers had fallen in New York City. It shocked us to see them do this on news reports, and that shock held an element of terror within it. Was this truly what we are made of? Is this what those who delighted in the death of King and Kennedy were made of? And those who had shown gladness, for there were some, at the terrible assassination of President John F. Kennedy on November 22, 1963?
We all share the same origins, no matter what our ethnic, religious, or national background. There really is only one race, genetically speaking, according to research presented by National Geographic in their recent message that “There’s No Scientific Basis for Race — It’s a Made-Up Label.” So why do we persist in seeking division, in praising it, in coveting it, in living it? Why do we allow and accept polarization as a state of mind and heart?
Polarization is expressed through prejudiced behavior — anger, revenge, bitterness, division, and absolutism. All are a product of our ignorance and refusal to acknowledge our fundamental connection with each other. This polarization of viewpoints and allegiances arises out of one primary need — our individual desire to feel safe. We so often experience deep feelings of inadequacy for so many reasons, and these feelings increase when we encounter anything that threatens to change our world or life in any way — for immediately, we feel unsafe. We lose our bearings. Fear is the driving force in this. We believe we can bury this fear, keep it out of sight of mind and heart, if we lash out at or shun or denigrate other human beings, basing our actions on the illusion that there are differences between “us” and “them.”
“Oh, no, those differences exist!” — so says our current mantra. We WANT to believe this is true. We may, sometimes, choose to deal with this information in a civil way, but we still opt for believing in division. We still believe we all do not share common ground as human beings.
The result is a polarized country that appears to thrive at so many levels on the public revelation of those differences — in political speeches, in news media, in movies and television, or in the simple exchange of points of view over the dinner table.
Someone told me yesterday — a person who comes from a family that is politically divided — that no one dares to bring up politics when they gather together. The fierce feelings that sweep in are instantaneous. The dinner table becomes a battleground.
This scenario is repeated everywhere in our country now whenever the opportunity arises — in debates about community zoning, in the workforce, in churches, at a football game, in a child’s playground, or at a world summit. No one in these situations suggests having a lively conversation to talk things over. No one is listening to what anyone who disagrees with them has to say, nor cares what that person thinks.
So what is going on? Why do Americans want polarization? Because they do want it, no question, or we wouldn’t have it.
Again, it has to do with feeling safe. And people are driven by what is called the fight-or-flight response to conflict. Our amygdala controls this, a small almond-shaped set of neurons located deep in each temporal lobe of the brain — it has a key role in how we process our emotions. It is one of the oldest, primordial regions of the brain, and the amygdala is activated when we are confronted by actual physical or perceived danger. Hundreds of thousands of years ago, this could mean escaping the attack of a woolly mammoth or a saber-toothed tiger. Today, this small set of neurons is activated when we feel attacked because someone is talking to us about a difference of opinion or belief. Their words are felt as an attack on our safety just as strongly as if we were in actual physical danger. And when those words are spoken, we revert back to the hundreds of thousands of years of collective response — seek cover, or gear up for battle — flee, or fight. There is no in-between. And by refusing to talk to people who do not share our exact views, we protect ourselves — we evade the danger.
What we believe deep down is that we are evading the danger of dissent because it has the frightening power to change our minds.
Dissent is the central power given to us by our forefathers who wrote and signed the Constitution of the United States. They knew from their experience in Europe with monarchies and oligarchies that without freedom of speech and the right to dissent, we are doomed to an essentially totalitarian government. It is the freedom and ability to change our minds, to allow differences of opinion, thought, outlook, and interest, that moves us forward, that gives us the chance to re-think what we are doing, and that gives us the energy and will to find a better way.
Polarization is the easy path.
With it, we no longer have to re-think our own behavior — we stick with “our crowd” so we do not have to reconsider our outlook at all, ever. We are safe. NO DANGER.
Yes, we can refuse to allow change. Yes, we can refuse to seek and allow common ground with others who are different from us in some way. Yes, we can cling to this outlook because it is our safety net, again and again and again…
But if we remain in our safe world, it eventually becomes untenable. We stagnate, cease to grow. Such a state can never remain our ultimate path for long, because human beings are always in search of discovering who they are. We hunger to go beyond our limitations, even if that hunger is just a whisper in our hearts and minds.
Right now, you are the product of 4.543 billion years, the age of the Earth. It took that amount of time to create you as you are this second. Everything that happened over those eons has led to this moment in time — you.
In fact, you are the culmination of an even greater duration of time because everything, every element that composes our bodies, comes originally from the stars — we are indeed made of “star stuff,” as the astronomer Carl Sagan said.
Look closely at the image above taken by the Cassini Mission in 2017. There is our Earth, a pinpoint in the vast reaches of space. Are we meant to spend this exceedingly brief span of time we have here on this tiny planet — less than a moment in cosmic time — in stagnant, polarized safety, or in the willing exploration, with joy and wonder, of all there is for us to encounter?
It is a choice.
|
https://regina-clarke7.medium.com/why-americans-want-polarization-2b9fbc259224
|
['Regina Clarke']
|
2018-09-15 22:43:04.449000+00:00
|
['Politics', 'Humanity', 'Self-awareness', 'Choices', 'Safety']
|
5,579 |
What I learned writing for 1 hour every day for 60+ days
|
What I learned writing for 1 hour every day for 60+ days
Prioritizing process over inspiration
Photo by Glenn Carstens-Peters on Unsplash
On day number eight of sheltering-in-place, I was in a dark place. Over the previous week, I had lost all motivation to do just about anything but read articles about COVID-19. It didn’t seem like there was a reason to do anything. What’s the point? When the world can stop on a dime and change so drastically, so quickly, what hope do we have of making our plans come to fruition?
Everything felt meaningless. After a week of existential crisis, I decided it was time to make meaning out of the circumstances. Hey — wouldn’t it be a great story if I took this opportunity as the impetus to start something that changed my life? What if in 1, 5, or 10 years, I can point back to this period as the moment that something changed?
I also decided that I didn’t have to finish a book or complete some grand project during quarantine to make meaning out of isolation, and in fact putting the pressure on myself to do so would more likely result in crippling anxiety. Instead, I told myself that I wanted to complete three tasks, every day. If I completed these three tasks, no matter what else I did or did not do that day, I would consider it a success. To hold myself accountable, I got out a whiteboard and marker and wrote down my tasks vertically: 1) Meditate for 20 minutes; 2) Write for 1 hour; 3) Floss!
It has been 63 days since that first day, and I haven’t broken the streak for any of these three tasks yet. This is by far the most consistent I’ve ever meditated or flossed, but the biggest difference has been in my writing habits. Previously, I would only write when I “was in the mood,” or on days where the words were coming easily to me. If I wasn’t feeling it, I wouldn’t write. There was always an excuse.
I’ve written over 100 blog posts, a chapter in the book Finding Genius, and about 8,000 words of a sci-fi novel that I originally started in 2016. But all of this was written when I felt like it. I am thankful that outside of academic settings, I have never had a deadline for my writing, and have never had to rely on writing to make money. This is incredibly freeing, and I enjoy writing as a hobby and practice instead of for a living, which I fear would take much of my enjoyment out of it, and add a lot of anxiety and stress. But it also means that I’ve never needed a writing practice or discipline, and have had the luxury to simply write only when I felt like it. So, writing for an hour every day no matter how I felt was a new experiment for me.
There have been plenty of those 63 days where writing for an hour was the last thing I wanted to do (the day that I’m writing this sentence is one of those days). But, my task is not to write a book, or write something I’m going to publish — just to write anything and do nothing else for one hour. While I have made some progress on the novel, I also have a very, very long “scratch sheet,” where some days I just write non-sense that no one will ever see for an hour. I don’t enjoy those days, but they earn me a tally on my white board, and keep the streak alive.
Some days, I’ve been able to go from frustrated and blocked to finding my flow after 20 or 30 minutes of writing garbage just to get through. The biggest win has been allowing myself to write not good-like. Previously, if I didn’t feel like what I was writing was good, I would stop. But making the agreement with myself to write for an hour had nothing to do with quality, just with process. So I’ve written a whole bunch of garbage, and I’m okay — and even happy with myself for doing it. Because sometimes the momentum out of that garbage is the seed of an idea, or a sentence or concept that I really like. And if I didn’t wade through the garbage to get there, I probably would have never found it.
Now, I haven’t finished a novel, haven’t put that much more of my writing out into the world, and so even though I’m writing much more often than I ever had before, my public output has not increased at the same rate. Again, I’m okay with this. The exercise was not meant to increase my public output, but to increase my discipline and process. Knowing that I can make myself sit down and write at any time, and that I don’t have to “be in the mood,” to do so is empowering knowledge to have, and removes my most common excuse to avoid writing. It also means that the writing I do make public is (at least relatively) of higher quality.
Having to write for a full hour also motivates me to do it earlier in the day, so that I can feel as though my work is done. Some days I’ve put it off until nighttime, and have to slog through as I’m tired and miserable, regretting not finishing earlier. On these days, the writing is not very good, because I’ve been having anxiety leading up to it, believing that it won’t be good because I put it off until I was tired in some sort of evil-recursive-writing-logic loop. If I write in the early afternoon, I feel like I’m ahead of schedule and over-achieving. Once I finish, I feel relief, accomplishment, and freedom!
Interestingly, I haven’t really written for more than one hour since I’ve written for at least one hour every day. Before this discipline, if I was “in the mood” and working on something specific, I might write for 2–3 hours or more straight. That hasn’t been the case since I’ve been writing every day. Once my timer goes off, I breathe a sigh of relief, finish my thought, and save the rest for the next day. I hope this changes, that some days I want to keep going past my minimum required writing time, but it hasn’t happened yet.
Extrapolated out, if I keep my 1 hour / day minimum of writing going, I would hit 10,000 hours (and consecutive days) when I’m 57. This may sound like a long way out (I’m 30, FYI), but with my previous cadence (writing only when I felt like it), I might never get to 10,000 hours. It’s the Chinese proverb — the best time to plant a tree was 20 years ago, the second best time is today.
I will continue to write for one hour a day as long as I can. Some of those days are going to be fucking magical, and I’m going to write the best dialogue or sentence or paragraph that I’ve ever written. Others are going to be me sitting in silence, frustrated that I can’t force anything out, looking at the clock every few seconds. But those magical days can only happen on days that I write, so if I write every day, I’ll have more magical days. What I knew before, but didn’t fully appreciate, was that having a writing practice and discipline is a way of increasing the velocity of opportunity. It’s like buying lottery tickets, but the tickets are free (costing only your time), and you get better lottery tickets every time you buy one.
If you’re currently of the “I only write when I’m in the mood” legion of writers like I used to be, I think it’s worth starting a daily habit to see how your writing and attitude about writing changes. Maybe it’s just 30 minutes for 30 days. Anything that starts a habit in which you’ll be forced to write when you really don’t want to. You’ll likely see that you still can, and that at times, the stuff you write when you don’t want to be writing is actually pretty good. I didn’t take time to specifically write this article. I took time to write, and wrote this article. It wouldn’t exist if I didn’t have to be writing anyway. For me, that’s a lesson that good things can come from processes even without inspiration or flow.
Thinking back to day one when I started this exercise, one of the motivators for me was the narrative it would create. What if in five years, I can say that I have written for one hour every day since March 23rd, 2020? That would be an impressive feat that would give some personal meaning to the otherwise meaninglessness of a global pandemic, and probably have a pretty large impact on my life. Well, today I’ve written for at least one hour for 63 days straight. I think that’s a pretty good start.
|
https://medium.com/the-raabithole/what-i-learned-writing-for-1-hour-every-day-for-60-days-6e81c9c0e29c
|
['Mike Raab']
|
2020-05-27 14:04:09.800000+00:00
|
['Life Lessons', 'Media', 'Writing', 'Personal Development', 'Writing Tips']
|
5,580 |
Game Level Design with Reinforcement Learning
|
Game Level Design with Reinforcement Learning
Overview of the paper “PCGRL: Procedural Content Generation via Reinforcement Learning” by A. Khalifa et al.
Procedural Content Generation (or PCG) is a method of using a computer algorithm to generate large amounts of content within a game, such as huge open-world environments, game levels, and many other assets that go into creating a game.
Today, I want to share with you a paper titled “PCGRL: Procedural Content Generation via Reinforcement Learning” which shows how we can use self-learning AI algorithms for procedural generation of 2D game environments.
Usually, we are familiar with the use of the AI technique called Reinforcement Learning to train AI agents to play games, but this paper trains an AI agent to design levels of that game. According to the authors, this is the first time RL has been used for the task of PCG.
Sokoban Game Environment
Let’s look at the central idea of the paper. Consider a simple game environment like in the game called Sokoban.
Sokoban game level.
We can look at this map or game level as a 2D array of integers that represents the state of the game. This state is observed by the Reinforcement Learning agent, which can edit the game environment. By taking actions like adding or removing certain elements of the game (like a solid box, crate, player, or target), it can edit this environment to give us a new state.
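For intuition, here is a small illustrative sketch of that representation (not the paper’s actual code; the tile codes and level size are assumptions):

```python
# A tiny Sokoban-like level as a 2D integer array, plus one "edit" action of
# the kind the RL agent takes. Everything here is illustrative.
import numpy as np

EMPTY, WALL, CRATE, TARGET, PLAYER = 0, 1, 2, 3, 4

state = np.zeros((5, 5), dtype=int)  # start from an empty 5x5 level
state[0, :] = state[-1, :] = WALL    # top and bottom borders
state[:, 0] = state[:, -1] = WALL    # left and right borders

def edit(state, x, y, tile):
    """One agent action: place a tile at (x, y), yielding a new state."""
    new_state = state.copy()
    new_state[y, x] = tile
    return new_state

state = edit(state, 2, 2, PLAYER)
```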
The PCGRL Framework
Now, in order to ensure that the environment generated by this agent is of good quality, we need some sort of feedback mechanism. This mechanism is constructed in this paper by comparing the previous state and the updated state using a hand-crafted reward calculator for this particular game. By adding appropriate rewards for rules that make the level more fun to play, we can train the RL agent to generate certain types of maps or levels. The biggest advantage of this framework is that after training is complete, we can generate practically infinite unique game levels at the click of a button, without having to design anything manually.
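A hedged sketch of that feedback idea, continuing the level representation above (the paper’s actual reward functions are game-specific and more involved than these made-up rules):

```python
# Hypothetical reward shaping: score a state with hand-crafted rules, then
# reward the agent for the improvement between the previous and updated state.
CRATE, TARGET, PLAYER = 2, 3, 4  # tile codes as in the sketch above (assumed)

def quality(state):
    score = 0
    score += 1 if (state == PLAYER).sum() == 1 else -1                       # exactly one player
    score += 1 if (state == CRATE).sum() == (state == TARGET).sum() else -1  # crates match targets
    return score

def reward(prev_state, new_state):
    return quality(new_state) - quality(prev_state)
```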
The three proposed methods for traversing and editing the game environment by the RL agent.
The paper also contains comparisons between different approaches that the RL agent can use to traverse and edit the environment. If you’d like to get more details on the performance comparison between these methods, here is the full text of the research results.
[Source] Different games tested for level design via the trained RL agent.
General Research Direction
While the games that were used in this paper’s experiments are simple 2D games, this research direction excites me because we can build upon this work to create large open-world 3D game environments.
This has the potential of changing online multiplayer gaming experience. Imagine, if at the start of every multiplayer open-world game, we could generate a new and unique tactical map every single time. This means we do not need to wait for the game developers to release new maps every few months or years, but we can do so right within the game with AI, which is really cool!
|
https://medium.com/deepgamingai/game-level-design-with-reinforcement-learning-52b02bb94954
|
['Chintan Trivedi']
|
2020-08-08 04:59:46.541000+00:00
|
['Game Futurology', 'Artificial Intelligence', 'Game Development', 'Reinforcement Learning', 'Machine Learning']
|
5,581 |
A Kind Manifesto
|
Photo by Sandrachile . on Unsplash
I believe in kindness. My faith in kindness is unlike the way people generally debate atheism. Is there a supreme being or not? Kindness does not float in an invisible realm, like some Platonic world of forms. Kindness exists only insofar as people are kind to one another.
My belief in kindness relates to its use. I trust in the ability of kind actions to change lives, organizations, and even the world. Maybe that sounds naive these days. Everything feels cruel and indifferent — wars, terrorism, environmental degradation, politics, economics, social media, refugee crises, racism, family life, and healthcare. Kindness seems to be in short supply.
I’m guessing, though, that you experience several acts of kindness every day. I do. These acts of kindness often go unnoticed, but they exist nonetheless. A hug from a loved one. Someone holding the elevator. A thank you from a coworker. Although kindness rarely shows itself on the same scale as cruelty, these kind deeds can add up to something substantial.
Kind acts can change the course of our day and our lives. Small acts such as genuine words of encouragement aren’t so small. I know more than one person who did not commit suicide because of the kindness of someone simply willing to listen for a few minutes.
I don’t know what tools or methods we could use to measure kindness. How could anyone tally all of the kind acts done in a day? Nor could we calculate the total time given to kindness. There’s no practical way to create a Gross Kindness Quotient, and determining the magnitude of a single kind act would be impossible.
Kindness takes countless forms. In The Power of Kindness, Piero Ferrucci describes 18 categories of kind acts. Some — like forgiveness, empathy, and generosity — are obvious. He suggests that we also show kindness in less apparent behaviors such as honesty, respect, and flexibility.
You might compile your own classifications of kind deeds, and I would add that tone is essential to kindness. We all know a kind tone when we hear it, and tone can make a world of difference. Consider honesty. By itself, honesty is not necessarily kind. Someone can be truthful in a vicious and spiteful way. A kind tone makes the truth palatable, and a kind tenor begins with intention. If we want to practice kindness, we have to work at it.
|
https://kb719.medium.com/a-kind-manifesto-f2ad8e661d4d
|
['Disabled Saints']
|
2019-01-31 17:44:07.505000+00:00
|
['Self', 'Education', 'Mental Health', 'Kindness']
|
5,582 |
4 Psychology Hacks Google Used on Employees to Change Bad Habits
|
And how you can apply these
Photo by Abhinav Goswami on Pexels
We all have bad habits that interfere with our goals.
Maybe we feel like we just can’t stop eating junk food but want a ‘summer bod’. Perhaps we procrastinate instead of putting in the work to build up our side hustle. When we have a goal in mind, taking new action, in the beginning, can feel like a breeze.
But sustaining a change in behavior long enough for it to become a good habit is where most of us fall off.
When Google set out to get employees to live healthier lives, it was confronted with the challenge of changing their bad eating habits.
Seven weeks later, the findings in the New York office alone were impressive. Employees consumed 3.1 million fewer calories. In 2018, employees were found to choose bottled water over sugary drinks nearly 5 times more often. And today, 2,300 breakfast salads are served daily.
You may not necessarily have the goal of living a healthier lifestyle. But the principles from human behavior used by Google, are lessons we can all use in making and sustaining good habits that support our goals.
Hack 1: Create extra steps for a bad decision & make the right decision easy
One thing Google did was experiment with the setup of its coffee stations.
At station A, snack bars were conveniently put next to the coffee stations. At station B, the snack bars were placed across the room, just 4 or 5 extra steps away.
In the end, it was found that 20% of employees grabbed a snack from station A. At station B, this dropped to 12% of employees grabbing a snack.
Behavioral lesson:
Humans are wired to go for the easier choice or the path of least resistance. It’s a survival instinct.
In the case above, it was too inconvenient for those in station B to walk a few extra feet, and too easy for those at station A to grab a snack next to them.
How to apply this yourself:
If you would like to create a good habit of say, actually finishing books from your forever growing pile of unread books, try placing one of them beside your bedside table. This will create easy access and convenience to read every day.
If you want to stop the habit of being on your device before sleeping because this messes up your sleeping schedule, place all devices in another room an hour before sleeping. For extra effectiveness, place the devices annoyingly in a closet for example, or somewhere inconveniently high or low to reach.
Even if you’re itching to check your devices, you probably won’t want to go through the effort of getting out of bed, walking into another room, reaching up, or getting on your knees just to get these items.
Hack 2: Make the right choice more attractive
When Google tried to get employees to drink more water they began to place transparent “spa water” style canisters everywhere. They also added colorful fruit like strawberries and slices of lemon.
Behavioral lesson:
When we’re attracted to something, we’re likely to gravitate towards it. We’re even more inclined when eye-catching details like bright colors grab our attention.
In the case of Google employees, the visual of flavored water was far more attractive than plain water.
How to apply this to yourself:
Will owning a cute outfit or new pair of shoes motivate you to go to the gym to ‘show 'em off’? Tap into that vanity if it means you’ll show up at the gym to workout.
Be honest, if your home office looks better, would it motivate you to get work done in such an awesome looking place? Then decorate that home office!
Consider your other senses too, like your sense of smell. If you know that you need to get in bed by a certain time, it can be attractive to see the scented candle on your bedside table. The thought of smelling your favorite fragrance may attract you towards your bed when the needed time to rest comes.
Hack 3: Hide temptation and have good decisions clearly visible
To prevent unnecessary snacking, junk foods like chocolate and chips weren’t banned from the workplace but were hidden. Employees knew they still existed, but they were at the back of kitchens or in opaque containers.
Behavioral lesson:
Out of sight, out of mind.
“Visibility is extremely important. Whatever you see first is what you’re likely to start thinking about.” — David Just, Ph.D., an associate professor of behavioral economics at Cornell University.
How to apply this to yourself:
If you find yourself slowly getting addicted to playing your Nintendo Switch every evening, try hiding it away so you’re not tempted to play it every time you see it.
If you have the goal of drinking more water, try having a water bottle next to you as a constant visual reminder to drink up. If you need to study for an important test, have your study materials readily within reach.
Hack 4: Make the right choice more enjoyable
It’s not only kids who struggle to eat their vegetables. Google created not only an abundance of healthy plant-based meals to choose from but made sure these options were tasty too.
Behavioral lesson:
Google realized:
What motivates people to engage and stick with virtuous patterns of behavior has less to do with all the logical reasons they should and more to do with how much the person enjoys doing that virtuous thing — whether that’s going to the gym or eating their vegetables. — Jane Black, How Google Got Its Employees to Eat Their Vegetables
How to apply this to yourself:
The next time you have something that you need to do, instead of forcing yourself to do it and ‘hating it throughout’, brainstorm and ask yourself, “How can I make this process more enjoyable?”
If you need to get cardio done but hate running with a passion, maybe you should try a fun dance class instead to burn those calories. If cleaning out your apartment regularly is a chore, try playing your favorite music in the background.
When I led tours, a fellow guide of mine used to make a game out of seeing how many names she could memorize as a personal challenge.
|
https://medium.com/psychologically/4-psychological-hacks-google-used-on-employees-to-change-bad-habits-5334ce181e6c
|
['Willda Atienza']
|
2020-11-23 17:11:42.217000+00:00
|
['Life Hacking', 'Personal Development', 'Self Improvement', 'Habits', 'Psychology']
|
5,583 |
Seek Out One of These to Find the Missing Answer to Your Goals
|
What Is a Proper Mastermind?
It’s all in the title. A mastermind is a way to join multiple minds together to create something greater than any individual part.
I often tell people my writing is a collaboration and it’s true. Behind the scenes are other writers, entrepreneurs, publication owners, editors, superfans and web developers who all contribute in some way to my work.
A mastermind helps you find the right people. There are people all over the internet, and screening them on your own would take the rest of your life.
Here are the features of a proper mastermind.
Heavy curation
Anyone can join a Facebook Group, but not everyone can join a mastermind.
Filtering the noise in your life helps you reach mastery. The people who form a mastermind are normally curated down from a bigger list. A mastermind is typically made up of fewer than 100 people. (The best ones I’ve seen contain thirty or fewer.)
Every trait of the people who join is taken into consideration.
Application process
There is an application process via a Google Form that requires you to talk about yourself in detail. You note down your goals, your experience, what you do for work. Everything about you is put under a microscope and up for debate by the owner and members of the mastermind.
The point of the application process is not to discriminate but to ensure that the right mix of people with similar interests are put together.
There is a mastermind for everybody. You just have to find the right one for you.
It costs money
A mastermind is serious. It costs money. My mentor’s mastermind costs thousands of dollars a year to join. I initially thought it was a huge waste of money until I attended. Only then did I appreciate the power of curation and the price tag that comes with it.
You don’t appreciate something that is free.
The more money you have to give up, the more you’ll take the mastermind seriously.
It’s run by someone who cares
The leader of the mastermind is important. If their goal is only to further their own interests, then the mastermind will typically die a slow death.
The point of a mastermind is for everybody to grow together. There is no supreme leader that everybody worships. My mentor who runs a mastermind is obsessed with the members.
He does the best he can every week to serve their interests. Most of all, he celebrates the wins of people in the mastermind, which shows others what is possible so they too can have their moment.
There is one thing everybody has in common
The mastermind I’m in is full of writers. We all have similar goals: to write better, to write more, to touch people’s hearts with emotion, to inspire others, to earn a living from writing.
A common theme is what makes a mastermind great. Find your tribe by thinking about your interests.
|
https://medium.com/the-ascent/seek-out-one-of-these-to-find-the-missing-answer-to-your-goals-25ada50ff6f6
|
['Tim Denning']
|
2020-08-03 19:01:01.075000+00:00
|
['Life Lessons', 'Self Improvement', 'Life', 'Learning', 'Productivity']
|
5,584 |
Ruta de Aprendizaje Machine Learning en Español — Parte 2
|
Written by
Data Scientist | DataEngineer | Software Developer | Electronic Engineer. I’m Jesus’s Follower | Don’t hesitate to AMA. Let’s take a coffee ☕ and enjoy life.
|
https://medium.com/colombia-ai/ruta-de-aprendizaje-machine-learning-en-espa%C3%B1ol-parte-2-fbe789869129
|
['German Andres Jejen Cortes']
|
2019-06-18 04:07:44.341000+00:00
|
['Gradiente', 'Backpropagation', 'Python', 'Learning Path', 'Machine Learning Ai']
|
5,585 |
Algebraic Data Types in Python
|
An Algebraic Data Type (or ADT in short) is a composite type. It enables us to model structures in a comprehensive way that covers every possibility. This makes our systems less error-prone and removes the cognitive load that comes with dealing with impossible states.
Motivation
Programmers who work in statically typed languages that have pattern matching are most likely using ADTs in their day to day work. If you’re not one of them, why should you care? I’ve decided to write about ADTs because:
Applicability — I was curious about the applicability of ADTs in dynamic languages like Python.
System understanding — categorizing certain parts of a problem in terms of ADTs leads to a more structured (and deeper) understanding of how our systems behave.
Explicit design — once we model parts of our system as ADTs, we can embed certain architectural decisions into our design.
Defining ADTs
To be honest, I think there are already a bunch of good resources out there that define ADTs way better than I can. The reason I’ve decided to add this section is to make this post a complete reference (and not require us to jump from one resource to another).
Informally, ADTs:
Are a way to declare concrete, recursive and abstract structures.
Define which values and what variations are possible for these structures.
Are a composition of other types (which we expand on next)
ADTs are a composition of these types: Product and Sum Types. Product types define which values exist in the type definition while Sum types define which variations are legal for an ADT.
The last paragraph may have been a bit theoretical so let’s imagine a system that supports two kinds of users: authenticated and anonymous users. An authenticated user has an ID, an email, and a password. An anonymous user only has a name. Let’s represent these definitions with an ADT (I’m also using Scala for the Scala developers among us).
A Scala implementation:
A Python implementation:
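A minimal Python sketch of these definitions (the frozen dataclasses, the Union alias, and the field types here are illustrative assumptions; only the three type names come from the text):

from dataclasses import dataclass
from typing import Union

@dataclass(frozen=True)
class AuthenticatedUser:
    id: int
    email: str
    password: str

@dataclass(frozen=True)
class AnonymousUser:
    name: str

# The Sum type: a User is either authenticated or anonymous, nothing else.
User = Union[AuthenticatedUser, AnonymousUser]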
The examples involve 3 types: AuthenticatedUser, AnonymousUser, and User. AuthenticatedUser and AnonymousUser are the Product types, while User is the Sum type (which is why User is explicitly mentioned in the Python example).
Product Types
Product types define the fields that the structure has. AuthenticatedUser is a Product type because it has an ID, an email, and a password. AnonymousUser is a Product type because it has a single name value. Although not mentioned in the previous example, a Product type can have 0 values (we'll look into an example of this later on).
Product types define how many possible variations of AuthenticatedUser and AnonymousUser can exist in the system. We often refer to the number of possible variations as an Arity.
Sum Types
Sum types are the more interesting type of the two and are meant to define all the valid variations of a type. In the previous example, User is a Sum type because it can either be an AuthenticatedUser or an AnonymousUser. In the User example, a User must be either anonymous or authenticated and nothing else.
We use the term Sum type to define how many possible variations of a type can exist in the system. In the User example, User has 2 possible variations: authenticated or anonymous (an Arity of 2).
ADTs Examples
Before we discuss the reasoning behind ADTs, it’s probably better if we look at a few examples first:
ADTs are everywhere; the following example shows that ADTs can even represent primitive types (I find it interesting even if the example is kinda useless).
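As a sketch, a boolean can be seen as a Sum of two empty Product types (the variant names below are made up for illustration):

from dataclasses import dataclass
from typing import Union

@dataclass(frozen=True)
class TrueValue:
    pass  # a Product type with 0 values

@dataclass(frozen=True)
class FalseValue:
    pass  # also 0 values

# A boolean is just a Sum type with an arity of 2.
Bool = Union[TrueValue, FalseValue]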
Option allows us to write functions that either return a value or return nothing. What’s nice about Options is that we can make the optional return value explicit via type annotations (it has other advantages but they’re irrelevant for now).
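A rough sketch of an Option in the same dataclass style (find_email is a hypothetical function, included only to show the explicit annotation):

from dataclasses import dataclass
from typing import Generic, TypeVar, Union

T = TypeVar("T")

@dataclass(frozen=True)
class Some(Generic[T]):
    value: T

@dataclass(frozen=True)
class Nothing:
    pass

Option = Union[Some[T], Nothing]

def find_email(user_id: int) -> "Option[str]":
    # The annotation tells callers that the result may be missing.
    return Some("ada@example.com") if user_id == 1 else Nothing()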
It’s possible to also model operations as structures
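For example, arithmetic operations can be modeled as a recursive Expression type. The sketch below uses the case names Literal, Multiply, and Divide that appear later in this post; the Add case is an assumption:

from dataclasses import dataclass
from typing import Union

@dataclass(frozen=True)
class Literal:
    value: float

@dataclass(frozen=True)
class Add:
    left: "Expression"
    right: "Expression"

@dataclass(frozen=True)
class Multiply:
    left: "Expression"
    right: "Expression"

@dataclass(frozen=True)
class Divide:
    left: "Expression"
    right: "Expression"

# A recursive Sum type: an expression is exactly one of these four cases.
Expression = Union[Literal, Add, Multiply, Divide]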
Events are also possible candidates for ADTs
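A quick sketch with invented event names:

from dataclasses import dataclass
from typing import Union

@dataclass(frozen=True)
class UserRegistered:
    user_id: int

@dataclass(frozen=True)
class UserLoggedOut:
    user_id: int

# Every event the system can emit, enumerated in one closed type.
Event = Union[UserRegistered, UserLoggedOut]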
We can even represent a closed set of possible states. The following example shows how we can use an ADT to model the possible states of a Circuit Breaker (we can easily add a half-open state if we need to)
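A possible sketch of those states (the fields carried by each state are assumptions):

from dataclasses import dataclass
from typing import Union

@dataclass(frozen=True)
class Closed:
    failure_count: int

@dataclass(frozen=True)
class Open:
    opened_at: float

# Adding a HalfOpen variant later is a one-line change to the Sum type.
CircuitBreakerState = Union[Closed, Open]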
We can also use ADTs in Javascript and in React
The Reasoning Behind ADTs
One of the main ideas behind ADTs is that code that operates on an input ADT can be a total function. It knows in advance which variations are possible and can return a value for each of these variations. This has many advantages like code that is testable, easy to reason about, and deterministic. We can also use it as part of a functional core.
One implication of pushing towards total functions is that we have to know in advance all the possible variations of the ADT. If we had incorrectly defined an ADT to be inherently open for extension, it would be impossible to guarantee that functions that operate on it will indeed be total functions. When we define an ADT we are explicitly stating that our structure is inherently closed and we don’t expect it to frequently change (and in some cases never change).
There’s a big difference between “never change” and “don’t often expect it to change”. ADTs that are fundamentally closed like Option and boolean will likely never change. However, some ADTs will likely need to change sometime in the future (hopefully not often though). When this happens we want to have a fail-fast mechanism that prevents errors from creeping throughout the system.
Data Types & Operations Placement
When coming from an object-oriented background it may be tempting to place operations (or behavior) on the ADT and not separate them. There are, as always, tradeoffs for both options. This is actually a well-known problem called The Expression Problem. Uncle Bob also discusses this topic in one of his blog posts.
Uncle Bob summarizes the problem quite well IMO:
Adding new functions to a set of classes is hard, you have to change each class. Adding new functions to a set of data structures is easy, you just add the function, nothing else changes. Adding new types to a set of classes is easy, you just add the new class. Adding new types to a set of data structures is hard, you have to change each function.
Uncle Bob, Objects & Data Structures
We can apply Uncle Bob’s guidelines and conclude that since ADTs are inherently closed we should aim to separate the operations from the ADTs.
Operating on ADTs
Going back to our Expression example (not to be confused with the Expression Problem), when using Python (or a similar language) the simplest way to operate on Expression will be to use some form of type checking:
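Reusing the Expression classes sketched earlier, such an isinstance-based evaluator might look like this:

def evaluate(expression: Expression) -> float:
    # Nothing forces us to cover every variant; forget one branch
    # and we only find out at runtime.
    if isinstance(expression, Literal):
        return expression.value
    if isinstance(expression, Add):
        return evaluate(expression.left) + evaluate(expression.right)
    if isinstance(expression, Multiply):
        return evaluate(expression.left) * evaluate(expression.right)
    if isinstance(expression, Divide):
        return evaluate(expression.left) / evaluate(expression.right)
    raise TypeError(f"Unhandled expression: {expression!r}")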
In my opinion, the real concern here is for our system’s safety. Although we’re not expecting new types of Expression, it's possible that requirements will change (as they always do) and we will need to add a new Expression type (or more). When this happens, we will need to fix every code section that operates on Expression. This is very error-prone (in fact, when I initially wrote this example I forgot Divide and when I added it, I forgot to update the isinstance checks to match the new variation 🤦♂️).
If isinstance is not a good strategy then what are our options? It turns out Python has an ADT library that takes an interesting approach.
The Python ADT Library
This library tries to somewhat mimic pattern matching in Python by generating a match method for each ADT. Let's start by using it to generate an Option ADT.
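Sketching the usage from my reading of the library's README (the keyword arguments to match are the lowercased case names; parse_name is the helper discussed below):

from adt import adt, Case

@adt
class Option:
    SOME: Case[str]
    NONE: Case

def parse_name(maybe_name: Option) -> str:
    # match() requires a handler for every variant.
    return maybe_name.match(
        some=lambda name: name.title(),
        none=lambda: "anonymous",
    )

print(parse_name(Option.SOME("ada lovelace")))  # Ada Lovelace
print(parse_name(Option.NONE()))                # anonymous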
Let’s go back to our Expression problem. This is how we would represent Expression with the ADT library and process it:
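A sketch of that representation together with a total evaluate function (the handler bodies and the example expression are illustrative):

from adt import adt, Case

@adt
class Expression:
    LITERAL: Case[float]
    ADD: Case["Expression", "Expression"]
    MULTIPLY: Case["Expression", "Expression"]
    DIVIDE: Case["Expression", "Expression"]

def evaluate(expr: Expression) -> float:
    # Omitting any lambda raises an "Incomplete pattern match" error.
    return expr.match(
        literal=lambda value: value,
        add=lambda left, right: evaluate(left) + evaluate(right),
        multiply=lambda left, right: evaluate(left) * evaluate(right),
        divide=lambda left, right: evaluate(left) / evaluate(right),
    )

# (1 + 2) * 3
expr = Expression.MULTIPLY(
    Expression.ADD(Expression.LITERAL(1.0), Expression.LITERAL(2.0)),
    Expression.LITERAL(3.0),
)
print(evaluate(expr))  # 9.0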
match achieves several interesting things:
1. In order to use the wrapped value, we are forced to deal with all the possible outcomes (SOME and NONE, or the different Expression Sum types), which means that functions that operate on the ADT (parse_name or evaluate) are total functions.
2. The library automatically generates the match method. It verifies that we have a function to handle all the possible variations of the ADT. In our example, if we forget to write the lambda for SOME, NONE, LITERAL, MULTIPLY, etc., we will get an "Incomplete pattern match" error.
3. An ADT can be recursive (Expression). Processing a recursive ADT may also require recursive processing.
These qualities cover our initial requirements. They allow us to write total functions that operate on the ADTs, and they fail fast in case we do end up adding new variations of the type.
Summary
ADTs help us to create readable code that is predictable and easier to reason about. We use them to represent structures that are inherently closed and seldom change. It’s for this reason that we tend to separate the operations from the actual structure and use total functions to operate on them. In case a new variation does emerge, we want to fail as soon as possible.
We’ve looked at how we can use Python’s ADT library to operate on ADTs. It’s possible to extend ADTs in such a way that they’re easily composable and even enforce certain constraints on the way we process them (constraints liberate, remember?). Maybe these are subjects for a future post?
In my opinion, the real benefit of thinking in terms of ADTs is that it truly improves the design of the system, it makes us think about which parts of our system are more likely to change than others and it encodes this understanding in the code (regardless of the language we’re using).
I’d love to get some feedback, improvement suggestions, and ideas for future posts. Feel free to reach out.
Further Reading
|
https://medium.com/swlh/algebraic-data-types-in-python-f24456d72f0
|
['Gideon Caller']
|
2020-12-21 20:53:13.003000+00:00
|
['Python', 'Adt', 'Quality Software', 'Functional Programming', 'Programming']
|
5,586 |
The Joy of Neural Painting
|
Our Implementation of a Neural Painter in Action.
The Code: Our implementation can be found at this Github repo: https://github.com/libreai/neural-painters-x
I am sure you know Bob Ross and his program The Joy of Painting, where he taught thousands of viewers how to paint beautiful landscapes in a simple and fun way, combining colors and brushstrokes to achieve great results very quickly. Do you remember him teaching how to paint one pixel at a time? Of course not!
However, most current generative AI Art methods still center on teaching machines how to ‘paint’ at the pixel level in order to achieve or mimic some painting style, e.g., GAN-based approaches and style transfer. This might be effective, but it is not very intuitive, especially when explaining the process to artists, who are familiar with colors and brushstrokes.
At Libre AI, we have started a Creative AI initiative with the goal of making the advances of AI more accessible to groups of artists who do not necessarily have a tech background. We want to explore how the creative process is enriched by the interaction between creative people and creative machines.
As a first step, we need to teach a machine how to paint. It should learn to paint as a human would: using brushstrokes and combining colors on a canvas. We researched the state of the art and, despite the great works out there, did not find a single paper that satisfied our requirements, until we came across Neural Painters: A Learned Differentiable Constraint for Generating Brushstroke Paintings by Reiichiro Nakano [1]. This finding was quite refreshing.
Neural Painters
Neural Painters [1] are a class of models that can be seen as a fully differentiable simulation of a particular non-differentiable painting program. In other words, the machine “paints” by successively generating brushstrokes (i.e., actions that define a brushstroke) and applying them to a canvas, as an artist would.
These actions characterize the brushstrokes and consist of 12-dimensional vectors defining the following variables:
Start and end pressure: pressure applied to the brush at the beginning and end of the stroke
Brush size: radius of the generated brushstroke
Color: the RGB color of the brushstroke
Brush coordinates: three Cartesian coordinates on a 2D canvas, defining the brushstroke’s shape. The coordinates define a starting point, end point, and an intermediate control point, constituting a quadratic Bezier curve
A tensor with actions looks like this example:
tensor([
[0.7016, 0.3078, 0.9057, 0.3821, 0.0720, 0.7956, 0.8851, 0.9295, 0.3273, 0.8012, 0.1321, 0.7915],
[0.2864, 0.5651, 0.5099, 0.3430, 0.2887, 0.5044, 0.0394, 0.5709, 0.4634, 0.8273, 0.1056, 0.1702],
...
])
and these are a sample of some of the brushstrokes in the dataset:
The goal of the Neural Painter is to translate these vectors of actions into brushstrokes on a canvas. The paper explores two neural architectures to achieve such translation, one based on a variational autoencoder (VAE) and the second one based on a generative adversarial network (GAN), with the GAN-based Neural Painter (Figure 1) achieving better results in terms of quality of the generated brushstrokes. For more details please refer to the paper [1].
Tinkering with Neural Painters
The code to reproduce the experiments is offered by the author as a series of Google Colaboratory notebooks available in this Github repo, and the dataset used is available on Kaggle. The implementation uses TensorFlow, which is great in terms of performance, but let’s face it, it is not great fun to digest TensorFlow code (especially without Keras ;) ).
Teaching machines is the best way to learn Machine Learning — E. D. A.
We played around with the notebooks provided; they were extremely useful for understanding the paper and generating nice sample paintings, but we decided that in order to really learn and master Neural Painters, we needed to experiment and reproduce the results of the paper with our own implementation. To this end, we decided to go with PyTorch and fast.ai as deep learning frameworks instead of TensorFlow, the paper’s reference implementation, to do some tinkering and, in the process, hopefully come up with a more accessible piece of code.
Learning Neural Painters Faster
GANs are great generative models, but they are notoriously difficult to train, especially because they require a large amount of data and, therefore, significant computational power on GPUs. They take a long time to train and are sensitive to small hyperparameter variations.
We did first try pure adversarial training following the paper, and although we obtained some decent results with our implementation in terms of brushstroke quality, it took a day or two to get there with a single GPU, a Colaboratory notebook, and the full dataset.
To overcome these known GAN limitations and to speed up the Neural Painter training process, we leveraged the power of Transfer Learning.
Transfer learning is a very useful technique in Machine Learning. For example, ImageNet models trained as classifiers are widely used as powerful image feature extractors. In NLP, word embeddings, learned unsupervised or with minimal supervision (e.g., by trying to predict words in the same context), have been very useful as representations of words in more complex language models. In Recommender Systems, representations of items (e.g., a book, movie, or song) or users can be learned via Collaborative Filtering and then used not only for personalized ranking, but also for adaptive user interfaces. The fundamental idea is to learn a model or feature representation on one task and then transfer that knowledge to another related task, without the need to start from scratch, doing only some fine-tuning to adapt the model or representation parameters to that task.
More precisely, since a GAN’s main components are the Generator and the Critic, the idea is to pre-train them independently, that is, in a non-adversarial manner, and then do transfer learning by hooking them together after pre-training and proceeding with the adversarial training, i.e., GAN mode. This process has been shown to produce remarkable results [2] and is the one we follow here.
The main steps are as follows:
(1) Pre-train the Generator with a non-adversarial loss, e.g., using a feature loss (also known as perceptual loss)
(2) Freeze the pre-trained Generator weights
(3) Pre-train the Critic as a Binary Classifier (i.e., non-adversarially), using the pre-trained Generator (in evaluation mode with frozen model weights) to generate `fake` brushstrokes. That is, the Critic should learn to discriminate between real images and the generated ones. This step uses a standard binary classification loss, i.e., Binary Cross Entropy, not a GAN loss
(4) Transfer learning for adversarial training (GAN mode): continue the Generator and Critic training in a GAN setting. Faster!
More in detail:
(1) Pre-train the Generator with a Non-Adversarial Loss
Figure 1. Pre-train the Generator using a (Non-Adversarial) Feature Loss.
The training set consists of labeled examples where the input is an action vector and the target is the corresponding brushstroke image.
The input action vectors go through the Generator, which consists of a fully-connected layer (to increase the input dimensions) and a Deep Convolutional Neural Network connected to it.
The output of the Generator is an image of a brushstroke. The loss computed between the images is the feature loss introduced in [3] (also known as perceptual loss [4]). The process is depicted in Figure 1.
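Purely as an illustrative PyTorch sketch of that shape (the layer sizes below are assumptions, not the values in our repo):

import torch
import torch.nn as nn

class BrushstrokeGenerator(nn.Module):
    """Maps a 12-dimensional action vector to a 64x64 RGB brushstroke."""

    def __init__(self, action_dim: int = 12):
        super().__init__()
        # Fully-connected layer expands the action into a small feature map.
        self.fc = nn.Linear(action_dim, 256 * 4 * 4)
        # Transposed convolutions upsample 4x4 -> 64x64.
        self.deconv = nn.Sequential(
            nn.ConvTranspose2d(256, 128, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1), nn.Sigmoid(),
        )

    def forward(self, actions: torch.Tensor) -> torch.Tensor:
        x = self.fc(actions).view(-1, 256, 4, 4)
        return self.deconv(x)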
(2) Freeze the pre-trained Generator
After pre-training the Generator using the non-adversarial loss, the brushstrokes look like the ones depicted in Figure 2. A set of brushstroke images is generated that will help us pre-train the Critic in the next step.
Figure 2. Sample Brushstrokes from the Generator Pre-trained with a Non-Adversarial Loss.
(3) Pre-train the Critic as a Binary Classifier
Figure 3. Pre-train the Critic as a Binary Classifier.
We train the Critic as a binary classifier (Figure 3); that is, the Critic is pre-trained on the task of recognizing true vs. generated brushstroke images (Step (2)).
We use Binary Cross Entropy as the binary loss for this step.
(4) Transfer Learning for Adversarial Training (GAN mode)
Finally, we continue the Generator and Critic training in a GAN setting as shown in Figure 4. This final step is much faster than training the Generator and Critic from scratch as a GAN.
Figure 4. Transfer Learning: Continue the Generator and Critic training in a GAN setting. Faster.
One can observe from Figure 2 that the pre-trained Generator is doing a decent job learning brushstrokes. However, there are still certain imperfections when compared to the true strokes in the dataset.
Figure 5 shows the output of the Generator after completing a single epoch of GAN training, i.e., after transferring the knowledge acquired in the pre-training phase. We can observe how the brushstrokes are more refined and, although slightly different from the true brushstrokes, they have interesting textures, which makes them very appealing for brushstroke paintings.
Figure 5. Sample Brushstrokes from the Generator after Adversarial Training (GAN mode).
From Brushstrokes to Paintings
Once the Generator training process is completed, we have a machine that is able to translate vectors of actions into brushstrokes. But how do we teach the machine to paint like a Bob Ross apprentice?
To achieve this, the Neural Painters paper [1] introduces a process called Intrinsic Style Transfer, similar in spirit to Neural Style Transfer [6] but without requiring a style image. Intuitively, the features of the input content image and of the image produced by the Neural Painter should be similar.
To implement the process, we freeze the Generator model weights and learn a set of action vectors that, when input to the Generator, produce brushstrokes that, once combined, create a painting of a given input content image. The image features are extracted using a VGG16 [7] network as a feature extractor, denoted as CNN in Figure 6, which depicts the whole process.
Figure 6. Painting with Neural Painters using Intrinsic Style Transfer.
Note that the optimization process learns only the tensor of actions; the remaining model weights, i.e., those of the Neural Painter and CNN models, are not changed. We use the same Feature Loss as before [3].
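Putting it together, the optimization loop looks roughly like the following sketch (function and parameter names are illustrative, the plain MSE on features stands in for the feature loss [3], and the stroke blending is simplified to a mean, whereas we actually use the linear blending described in the Notes below):

import torch
import torch.nn.functional as F

def paint(neural_painter, feature_extractor, content_image,
          n_strokes: int = 64, steps: int = 500):
    """Optimize a tensor of actions so the painting matches the content image."""
    # Only the actions receive gradients; both networks stay frozen.
    actions = torch.rand(n_strokes, 12, requires_grad=True)
    target = feature_extractor(content_image).detach()
    optimizer = torch.optim.Adam([actions], lr=1e-2)

    for _ in range(steps):
        optimizer.zero_grad()
        strokes = neural_painter(actions.clamp(0, 1))
        # Simplified blending: average the strokes into a single canvas.
        canvas = strokes.mean(dim=0, keepdim=True)
        loss = F.mse_loss(feature_extractor(canvas), target)
        loss.backward()
        optimizer.step()
    return actions.detach()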
Finally, given an input image for inspiration, e.g., a photo of a beautiful landscape, the machine is able to create a brushstroke painting for that image :) ∎
This Neural Painters implementation is the core technique used in our collaboration with the collective diavlex for their Art+AI collection Residual at Cueva Gallery.
Notes
For blending the brushstrokes, we follow a linear blending strategy to combine the generated strokes on a canvas; this process is described in detail in a very nice post titled Teaching Agents to Paint Inside Their Own Dreams, also by Reiichiro Nakano [5]. We are currently exploring an alternative process that uses the alpha channel for blending.
Acknowledgements
We would like to thank Reiichiro Nakano for helping us clarify doubts during the implementation of our Neural Painters and for his supportive and encouraging comments and feedback. Thanks a lot Reiichiro! [@reiinakano].
Pandas painted by our Neural Painter.
References
[1] Neural Painters: A Learned Differentiable Constraint for Generating Brushstroke Paintings. Reiichiro Nakano
arXiv preprint arXiv:1904.08410, 2019.
[2] Decrappification, DeOldification, and Super Resolution. Jason Antic (Deoldify), Jeremy Howard (fast.ai), and Uri Manor (Salk Institute). https://www.fast.ai/2019/05/03/decrappify/, 2019.
[3] Fast.ai MOOC Lesson 7: Resnets from scratch; U-net; Generative (adversarial) networks. https://course.fast.ai/videos/?lesson=7 ; Notebook: https://nbviewer.jupyter.org/github/fastai/course-v3/blob/master/nbs/dl1/lesson7-superres.ipynb [Accessed on: 2019–08]
[4] Perceptual Losses for Real-Time Style Transfer and Super-Resolution
Justin Johnson, Alexandre Alahi, Li Fei-Fei. https://arxiv.org/abs/1603.08155, 2016
[5] Teaching Agents to Paint Inside Their Own Dreams. Reiichiro Nakano.
https://reiinakano.com/2019/01/27/world-painters.html, 2019
[6] A Neural Algorithm of Artistic Style. Leon A. Gatys, Alexander S. Ecker, Matthias Bethge. https://arxiv.org/abs/1508.06576, 2015
[7] Very Deep Convolutional Networks for Large-Scale Image Recognition. Karen Simonyan, Andrew Zisserman. https://arxiv.org/abs/1409.1556, 2014
|
https://medium.com/the-ai-art-corner/the-joy-of-neural-painting-f00b4f3c4fd4
|
['Beth Jochim']
|
2020-10-13 10:28:32.354000+00:00
|
['Artificial Intelligence', 'Machine Art', 'Neural Paintings', 'Articles', 'Machine Learning']
|
5,587 |
What to Do When You Don’t Know What to Write
|
What to Do When You Don’t Know What to Write
You can’t always count on the muse to be there
Photo by Ryan Snaadt on Unsplash
As a writer, staring at a blank screen kinda sucks.
You rack your brain trying to come up with ideas, but there’s nothing there. Somehow, you’ve forgotten everything you know.
And the more you try, the harder it gets.
It’s funny. Some days, the material’s right at the forefront of your thoughts, begging to be written about.
But then there are other days when the muse just simply isn’t there.
What do you do then?
How do you write something when you have no idea what to write about?
I struggled with this question for a long time.
Often, I would sit in front of my computer and stare off into space, hoping something would come to me.
Alas, nothing ever did.
It was only when I started typing that the juices started flowing.
Take this article, for example.
Here I am, sitting in front of my computer with no idea what to write, thinking to myself, “this sucks.” So what do I do? I write exactly that: “As a writer, staring at a blank screen kinda sucks.”
And with that one line, I was off to the races. The rest flowed like hot molten lava down a beautiful Hawaiian hill.
So when you’re struggling to find something to write about…
JUST START WRITING.
Write about how you feel. Write about hating the blank screen. Write about cursing the muse for never showing up.
It doesn’t matter.
What matters is that your fingers start moving.
Because once they do, the rest starts flowing pretty quick.
|
https://medium.com/the-innovation/what-to-do-when-you-dont-know-what-to-write-3b5f647157e1
|
['Daniel P. Donovan']
|
2020-12-04 15:02:12.169000+00:00
|
['Writers On Writing', 'Copywriting', 'Writing', 'Writer', 'Writing Tips']
|
5,588 |
Single-Binary Web Apps in Go and Vue — Part 1
|
Photo by Gift Habeshaw on Unsplash
Often I find myself tasked with building web applications or APIs with web management portals. On the backend my language of choice is Go, while on the frontend my framework of choice is Vue. One of the big benefits of Go is it compiles into a single binary. When building an API in Go, and a frontend in JavaScript though, these are two different stacks, and as such might mean deploying two different apps. And in some cases that may be desirable. But for my simple uses I’d like to bundle everything into a single binary for deployment, because it makes my life easier.
This is a 4 part series:
Part 1 — The Go and Vue apps
Part 2 — Starting the Vue app with your Go app
Part 3 — Bundling it all up
Part 4 — Automating using Make
In this series we’ll build an app that does nothing useful, but demonstrates bundling your Go and Vue apps into a single binary. This article is part 1, where we will set up the Go and Vue apps separately. There are a couple of prerequisites necessary to get started. Make sure you have the Go toolchain, Node.js (with npm), and the Vue CLI installed, since all three are used below.
The Go App
Let’s start with the Go app. In this example we’ll build a really small app that simply fires up an HTTP server. I’m using the excellent Echo framework for simplifying boilerplate HTTP server stuff. The first step is to initialize a new Go app. In your terminal, create a new directory and run go mod init. For these examples, I’m using a namespace of github.com/adampresley/example. Change this to meet your own needs.
$ mkdir example
$ cd example
$ go mod init github.com/adampresley/example
$ touch ./main.go
The above steps will initialize Go modules for our package, and create a blank main.go file. Open up that file and paste the following content. We’ll break down what everything does in a minute.
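(Medium’s embedded gist didn’t survive this export, so the code itself is missing here. Below is a minimal sketch of what main.go could look like, reconstructed from the breakdown that follows; note that the line numbers cited in the breakdown refer to the author’s original file and won’t line up exactly with this version. The CORS middleware line is my own addition, not something the article mentions, but without it (or a dev server proxy) the Vue app on port 8081 can’t call the API on port 8080.)

package main

import (
	"context"
	"net/http"
	"os"
	"os/signal"
	"time"

	"github.com/labstack/echo/v4"
	"github.com/labstack/echo/v4/middleware"
)

// Version is announced by the /api/version endpoint.
var Version = "development"

func main() {
	e := echo.New()

	// Assumption: allow the Vue dev server on another port to call this API.
	e.Use(middleware.CORS())

	// A simple endpoint that returns the Version variable.
	e.GET("/api/version", func(c echo.Context) error {
		return c.String(http.StatusOK, Version)
	})

	// Start the HTTP server in a goroutine so main keeps running.
	go func() {
		if err := e.Start(":8080"); err != nil && err != http.ErrServerClosed {
			e.Logger.Fatal(err)
		}
	}()

	// Wait here for an interrupt signal, such as CTRL+C. The HTTP
	// server runs indefinitely until we are interrupted.
	quit := make(chan os.Signal, 1)
	signal.Notify(quit, os.Interrupt)
	<-quit

	// Tell the HTTP server to shut down, giving in-flight requests
	// ten seconds to finish.
	ctx, cancel := context.WithTimeout(context.Background(), 10*time.Second)
	defer cancel()

	if err := e.Shutdown(ctx); err != nil {
		e.Logger.Fatal(err)
	}
}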
Let’s break this down.
Line 18 — A global version variable. This is mostly used to announce the version of the application. We’ll muck with this more in part 4.
Starting at line 30 we set up our HTTP server using Echo. On line 33 we create a simple endpoint at /api/version which simply returns the Version variable.
At line 37 we create a goroutine which starts the HTTP server listening on port 8080.
Lines 56–62 create a channel that waits for an interrupt signal, such as CTRL+C. Execution will pause here, allowing the HTTP server to run indefinitely until it is interrupted.
Lines 64–71 tell the HTTP server to shut down.
If you run this now on a terminal using go run . you should see a message stating that the application has started.
The Vue App
The next part of the equation is the Vue JavaScript application. This part is way easier because Vue provides their CLI to set it all up for you. In this section I’ll walk through the choices I made for the example, then demonstrate calling our Go API to get the version and display it on our page.
The first step is to create the app using the CLI tool. We are going to create the Vue app inside our example folder where our Go app is.
$ cd example
$ vue create app
Running the above will start a wizard which asks several questions. Here are the options that I chose for this demo.
Once you’ve made your selections, watch and wait for the CLI to do its job. When it completes, you should see a message that looks something like this.
Now, if you open two terminals, you can run the Go app in one, and the Vue app in another.
Terminal 1
$ cd example
$ go run .
Terminal 2
$ cd example/app
$ npm run serve
Now open a browser. In tab 1 navigate to http://localhost:8080/api/version and you will see the version string output. Open another tab and navigate to http://localhost:8081 and you will see the default Vue sample page.
Calling the API
Finally, let’s have our Vue app call the Go API version endpoint, just for kicks. In the /app/src/components/ directory there is a file called HelloWorld.vue. Open this file for editing. We want to display the API version string at the bottom of the page. Here are the steps.
Add a variable to hold the version. See lines 41–45
Add HTML to display the version. See line 30
Get the version from the server and assign it to our new variable. We’ll do this when the component is created. See lines 47–51
Here is the code in full.
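(Again, the embedded gist didn’t survive the export. Here’s a minimal sketch of the relevant parts of HelloWorld.vue, assuming the default Vue CLI template and the browser’s built-in fetch API; the author may have used a different HTTP client, and the line numbers cited above refer to the original file.)

<template>
  <div class="hello">
    <!-- ...the rest of the default template stays as-is... -->
    <p>API version: {{ version }}</p>
  </div>
</template>

<script>
export default {
  name: 'HelloWorld',
  data () {
    return {
      // Holds the version string returned by the Go API
      version: ''
    }
  },
  created () {
    // Ask the Go server for its version when the component is created
    fetch('http://localhost:8080/api/version')
      .then(response => response.text())
      .then(version => { this.version = version })
  }
}
</script>

Note that a cross-origin call like this only works if the Go server allows it (for example, with Echo’s CORS middleware) or if you proxy /api through the Vue dev server in vue.config.js.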
Now when you refresh http://localhost:8081 you’ll see this.
That’s it for part 1! In part 2 we’ll add code in our Go application which will automatically start the Vue app for us when we run the Go app, adding a level of convenience.
|
https://medium.com/swlh/single-binary-web-apps-in-go-and-vue-part-1-ea7d4100eab7
|
['Adam Presley']
|
2020-12-28 15:15:41.502000+00:00
|
['Vue', 'JavaScript', 'Software Development', 'Go', 'Development']
|
5,589 |
What are encryption keys and how do they work? 🔐
|
Diffie-Hellman-Merkle key exchange
This method allows two parties to remotely establish a shared secret (a key in our case) over an assumed insecure channel. This key can then be used in subsequent communications along with a symmetric-key algorithm.
Colours are generally used instead of numbers when explaining this, because the difficulty of undoing the mathematical operations involved corresponds nicely to the difficulty of knowing which two colours were mixed to create a new, third, colour.
Diffie-Hellman-Merkle protocol to establish a shared secret key
Alice and Bob each start with their own, private, values R and G, as well as a public common value Y. Alice uses Y along with her private value to create RY, and Bob GY. These are publicly shared. This is safe, as it is extremely computationally difficult to determine the exact private values from these new values. Alice can then use Bob’s new public combination along with her private value to create RGY, and importantly Bob can use Alice’s new public combination to create the exact same RGY value. They now have a shared secret they can use to encrypt future messages and know the other can decrypt them when received.
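(To make the colour analogy concrete, here is a toy sketch in Go. The variable names map onto the diagram, and the tiny numbers are purely illustrative; real implementations use standardized groups with primes of 2048 bits or more.)

package main

import (
	"crypto/rand"
	"fmt"
	"math/big"
)

func main() {
	// The public common value Y: a prime modulus and a generator,
	// agreed in the open. Toy sizes for illustration only.
	p := big.NewInt(23)
	g := big.NewInt(5)

	// Private values, never shared: Alice's R and Bob's G.
	r, _ := rand.Int(rand.Reader, p)
	b, _ := rand.Int(rand.Reader, p)

	// Each party mixes their secret with the public values and
	// shares the result: the "RY" and "GY" mixtures.
	RY := new(big.Int).Exp(g, r, p)
	GY := new(big.Int).Exp(g, b, p)

	// Each mixes the other's public result with their own secret.
	// Both arrive at the same shared secret, "RGY".
	aliceKey := new(big.Int).Exp(GY, r, p)
	bobKey := new(big.Int).Exp(RY, b, p)

	fmt.Println(aliceKey, bobKey) // prints the same number twice
}

Recovering r or b from the shared mixtures means solving the discrete logarithm problem, which is the numerical equivalent of un-mixing paint.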
A main security flaw in this protocol is the inability to verify the authenticity of the other party while setting up this shared secret. It is assumed you are talking to a trusted other party. This leaves the protocol open to a ‘man-in-the-middle’ attack by someone listening in from the start of the exchange.
Eve performing a man-in-the-middle attack
Above you can see how Eve effectively eavesdrops and intercepts the exchange to set themselves up in the position to read any message shared between Alice and Bob. Eve can receive a message from Alice, decrypt it using their shared secret key, read it, then re-encrypt it using the key Eve shares with Bob. When Bob receives the message it can be decrypted using his secret key, which he incorrectly assumes he shares with Alice.
We will come back to Diffie-Hellman-Merkle later, but now will look at solving this vulnerability using public-private key pairs.
|
https://medium.com/codeclan/what-are-encryption-keys-and-how-do-they-work-cc48c3053bd6
|
['Dominic Fraser']
|
2018-07-10 22:06:52.710000+00:00
|
['Encryption', 'Software Engineering', 'Security', 'Cipher', 'End To End Encryption']
|
5,590 |
Believing in a She-God isn’t Feminist
|
Let’s take feminism and strip it down to as basic a definition as we can get — Wikipedia.
Wikipedia says:
Feminism is a range of political movements, ideologies, and social movements that share a common goal: to define, establish, and achieve political, economic, personal, and social equality of sexes. [emphasis mine]
And that is exactly what we’re fighting for. In a serious conversation with most feminists, you won’t find us talking about world domination and fighting for the matriarchy. All you’ll hear is equality. We want equal pay, an equal shot at being hired for that pay, and equal opportunity to qualify for the job.
That’s it.
We ultimately aren’t interested in replacing our male-dominated reality with a female-dominated one because we know that is equally unfair to men as it has been for us since humans first became a thing.
So why, on the surface, are all of our rallying cries and feminist jokes about women achieving world domination and eradicating men because we only need their sperm to continue the human race anyways? Why is the surface language for feminism so damn unequal?
Why are all the signs we look for in other feminists so disappointingly meaningless to our actual goals? Somehow our initiation into feminism is resting in the very female arms of God and a goodie bag with “The future is female” pasted all over it.
God isn’t a she
We know this. Most religions don’t have an explicit gender for their God, but guess what? English doesn’t have a non-gendered pronoun, and most of the people writing holy works were surrounded by a much more patriarchal society in the first place. God was powerful, intelligent, and a provider and in their time those were exclusively male traits.
So, yeah, we can refer to God as a woman — it’s our choice, just as it was the choice of the people way back then to refer to him more often as a man. But acting as if God is any more female than male is equally as wrong as acting as if God is more male than female. Either way, we are assigning gender to someone that doesn’t have one.
The female god doesn’t push our agenda forward in any way, instead, it just gives us one more thing to be angry about when someone refers to God as he. And heaven knows we don’t need one more thing to be angry about.
The Goal isn’t to Turn the Tables
Female feminists are already an angry group of people — and for good reason. We have been systematically repressed for nearly our entire existence on the earth, and we’re rightly sick of it.
We’re sick of being treated as less than, being told we can’t, and forced to leave our passions to be pursued by men. We’re pissed that as hard as we fight right now to get the qualifications for our dream job, we are less likely to be hired for it. We want to erupt when we finally get hired and find that despite the amount of effort, love, and dedication we show in our work, we’re paid less than the guy a door down doing the same damn thing.
So hell yeah, we’re angry. But, we have to be careful with anger. We can’t go throwing around our anger at things like a male-gendered God, because then that anger gets mixed up in the wash with the things that actually matter.
Even justified anger has to be carefully managed. Anger scares people, it makes them feel threatened and unwanted. And threatened is not the feeling that makes someone join a movement.
We feel threatened by lions. When is the last time you saw someone just up and decide to join a pride of lions for the hell of it?
Over-compensating doesn’t help
We have this idea that if you fight for more than you need, you’ll end up with what you need. Or, in our context, if you fight for the matriarchy, you’ll get equality. And yeah, that may work in some places, but when dealing with moral issues that people stand so strongly on, fighting for more than the real goal turns us into a pride of lions.
When we over-compensate and get angry about someone’s male-gendered god, or when they don’t laugh at our joke about eliminating men from the world, we distance them from our message. Instead of convincing them that equality is the right way, all they can see is that we want to dominate them. And our anger gives them the perfect justification for feeling this way — it proves we are as emotional and unfit for power as they thought we were.
Call it the patriarchy if you will, but we can’t afford to be the repressive people men were — and sometimes still are — throughout history. We don’t have the luxury of eliminating the vote of the opposing side.
And securing a bare majority of the votes doesn’t work either. Women and minorities know better than most that legislation doesn’t change bias.
Everyone is already equal under US law. But just because you make a law that says don’t discriminate doesn’t mean you can enforce that law or even pinpoint exactly when it is being breached.
We need the votes of men from every age, race, and way of life because no matter how many strong independent women we recruit, the world is still unequally tipped so the power lands with the men. Repulsing men in power with our angry rants about bringing the matriarchy to pass is probably one of the slowest ways we can achieve equality.
And the clincher is, that we’re not even interested in ruling the world. Once again, our goal was never to bring the matriarchy crashing down on the head of the patriarchy with a powerful vengeance. Our goal is equality.
Why do we keep acting as if it’s some obnoxious and dramatic Adele song where we finally get to win?
The Future isn’t female
The future is human. The future is free from gender-based discrimination — meaning if the future is okay, men, women, and those in the non-binary trans population will all hold jobs in proportion to their percentage of the population. But more importantly, it will be free from the systems that convince women their dreams aren’t viable. It will be free from all the little offenses that push us down and keep us out of the places we have a right to be.
And we all believe this basic truth: that all we want is equality. But we’ve also gotten so caught up in the fight that the message is being obscured by the angry clutter we have taken on as an integral part of the movement.
|
https://desotaelianna.medium.com/believing-in-a-she-god-isnt-feminist-591b52f2ddd4
|
['Elianna Desota']
|
2019-03-30 23:21:27.753000+00:00
|
['Politics', 'Feminism', 'Equality', 'Psychology', 'Belief']
|
Title Believing SheGod isn’t FeministContent Let’s take feminism strip basic definition get — Wikipedia Wikipedia say Feminism range political movement ideology social movement share common goal define establish achieve political economic personal social equality sex emphasis mine exactly we’re fighting serious conversation feminist won’t find u talking world domination fighting matriarchy you’ll hear equality want equal pay equal shot hired pay equal opportunity qualify job That’s ultimately aren’t interested replacing maledominated reality femaledominated one know equally unfair men u since human first became thing surface rallying cry feminist joke woman achieving world domination eradicating men need sperm continue human race anyways surface language feminism damn unequal sign look feminist disappointingly meaningless actual goal Somehow initiation feminism resting female arm God goodie bag “The future female” pasted God isn’t know religion don’t explicit gender God guess English doesn’t nongendered pronoun people writing holy work surrounded much patriarchal society first place God powerful intelligent provider time exclusively male trait yeah refer god woman — choice choice people way back refer often man acting God female male equally wrong acting God male female Either way assigning gender someone doesn’t one female god doesn’t push agenda forward way instead give u one thing angry someone refers God heaven know don’t need one thing angry Goal isn’t Turn Tables Female feminist already angry group people — good reason systematically repressed nearly entire existence earth we’re rightly sick We’re sick treated le told can’t forced leave passion pursued men We’re pissed hard fight right get qualification dream job le likely hired want erupt finally get hired find despite amount effort love dedication show work we’re paid le guy door damn thing hell yeah we’re angry careful anger can’t go throwing around anger thing like malegendered God anger get mixed wash thing actually matter Even justified anger carefully managed Anger scare people make feel threatened unwanted threatened feeling make someone join movement feel threatened lion last time saw someone decide join pride lion hell Overcompensating doesn’t help idea fight need you’ll end need context fight matriarchy you’ll get equality yeah may work place dealing moral issue people stand strongly fighting real goal turn pride lion overcompensate get angry someone’s malegendered god don’t laugh joke eliminating men world distance message Instead convincing equality right way see want dominate anger give perfect justification feeling way — prof emotional unfit power thought Call patriarchy can’t afford repressive people men — sometimes still — throughout history don’t luxury eliminating vote opposing side securing bare majority vote doesn’t work either Women minority know better legislation doesn’t change bias Everyone already equal US law make law say don’t discriminate doesn’t mean enforce law even pinpoint exactly breached need vote men every age race way life many strong independent woman recruit world still unequally tipped power land men Repulsing men power angry rant bringing matriarchy pas probably one slowest way achieve equality clincher we’re even interested ruling world goal never bring matriarchy crashing head patriarchy powerful vengeance goal equality keep acting it’s obnoxious dramatic Adele song finally get win Future isn’t female future human future free genderbased discrimination — meaning future okay men woman 
nonbinary trans population hold job proportion percentage population importantly free system convince woman dream aren’t viable free little offense push u keep u place right believe basic truth truth want equality we’ve also gotten caught fight message obscured angry clutter taken integral part movementTags Politics Feminism Equality Psychology Belief
|
5,591 |
Going With the Flow. “It’s radical knowing that on a…
|
Mile 1
I got my flow the morning of the London Marathon and it was extremely painful. It would be my first marathon and I remember already feeling so nervous for it. I had spent a full year enthusiastically training hard, but I had never actually practiced running on my period.
I thought through my options. Running 26.2 miles with a wad of cotton material wedged between my legs just seemed so absurd. Plus they say chafing is a real thing. I honestly didn’t know what to do. I knew that I was lucky to have access to tampons etc, to be part of a society that at least has a norm around periods. I could definitely choose to participate in this norm at the expense of my own comfort and just deal with it quietly.
But then I thought…
If there’s one person society can’t eff with, it’s a marathon runner. You can’t tell a marathoner to clean themselves up, or to prioritize the comfort of others. On the marathon course, I could choose whether or not I wanted to participate in this norm of shaming.
I decided to just take some midol, hope I wouldn’t cramp, bleed freely and just run.
A marathon in itself is a centuries old symbolic act. Why not use it as a means to draw light to my sisters who don’t have access to tampons and, despite cramping and pain, hide it away like it doesn’t exist?
Mile 6
|
https://medium.com/endless/going-with-the-flow-blood-sisterhood-at-the-london-marathon-f719b98713e7
|
['Kiran Gandhi']
|
2018-02-12 11:12:32.732000+00:00
|
['Storytelling', 'Marathon', 'Feminism']
|
5,592 |
The Science and Tech of Face Masks
|
The Science and Tech of Face Masks
I put my money (and my body) on the line to teach you everything about COVID-19 masks
Images courtesy the author
When the CDC starts publishing sewing tutorials, you know things have gotten weird. As COVID-19 has spread worldwide, though, that’s the strange new world we’re living in.
When the COVID-19 crisis first began, the Centers for Disease Control in the United States, and the World Health Organization internationally, came down strongly against healthy citizens wearing face masks. In some ways, that made sense at the time. Most masks don’t protect the wearer from infection, and there’s a risk that they’ll provide a false sense of security.
Just as seatbelts and airbags make people drive faster, wearing a face cover can convince people that they’re fully protected from the coronavirus, prompting them to go out more, touch their face, and skip steps that really do protect them, like the simple step of washing their hands. Initially, there was also the concern that people would buy up medical grade masks, taking these out of the hands of the front-line medical professionals who actually need them.
As the coronavirus crisis wears on and social distancing measures continue to intensify, the CDC has made a dramatic about-face. Acknowledging that masks protect others from you, even if they don’t protect you from others carrying the virus, the CDC now recommends that Americans wear a face covering any time they go outside. Several jurisdictions — including Los Angeles — have quickly enshrined this advice in enforceable laws and ordinances.
Like so many things with COVID-19, we could likely have learned this lesson much earlier if we just looked a bit beyond our borders. Asian countries have made masks a cornerstone of their coronavirus response from day one. And in places like Japan, it’s customary to wear face masks throughout the cold and flu season even in normal times, as a social gesture if not a medical one.
In any event, with the CDC’s dramatic reversal, Americans are suddenly getting a crash course in face coverings. Minutiae like the difference between cloth masks and medical grade masks are now acceptable quarantine-unit dinner conversation. Extremely technical terms like “N95” are suddenly common knowledge — and fodder for political squabbling.
To Americans’ credit, our civic organizations and individuals have mobilized around masks in a way that’s likely possible nowhere else. Church groups and the like are sewing cloth masks en masse, and mask drives have resulted in donations of medical grade masks — often sourced from shuttered businesses or people’s garages — to hospitals across the country.
How do face masks work, anyway? How are they made? What are the different kinds, and how do they differ? How can you make your own? All of these questions are suddenly on millions of people’s minds. So I decided to dive in and help you answer them.
Doing so required wiring money to China, spraying myself in the face with a former chemical warfare agent, and relearning some middle school Home Economics skills I thought I’d never need to revisit. Let’s dive in together and take a look at masks in all their various forms.
Let’s begin with first principles. There are essentially three broad categories of masks available today: respirator masks, medical grade surgical masks, and cloth masks.
Respirator masks are the hardest to come by, and the most politically fraught. These are the fabled N95s that you hear mentioned ad nauseum on the news, and that authorities are encouraging citizens and companies to donate — forcibly or otherwise — to front-line first responders.
Why are these masks so important? Unlike all the other options on the market, they provide the wearer with a measurable level of actual protection against the COVID-19 coronavirus, and other airborne pathogens. The N95 designation means that the masks block at least 95% of particles as small as 0.3 microns. That includes many of the droplets that carry viruses and bacteria, including, many scientists believe, Covid-19. So if you’re wearing an N95 mask, you’re actually protected against the virus, even if it’s floating in the air all around you.
N95 masks are believed to protect the wearer from the virus
The catch is that for N95 masks to work, they have to fit properly. The masks’ ability to block particles is worthless if there’s a big gap around your nose (or your beard), and air is flowing through it. In a medical setting, front-line workers are routinely fit-tested, to ensure that their N95 and other respirator masks are actually providing protection.
The process for a fit test is relatively straightforward. You have a provider put on their mask, and adjust it properly to fit their face. You then place them in an environment with detectable particles of around .3 microns. If they can detect the particles, then they’ve failed the test. If they can’t detect them, then the mask is a proper fit, and passes.
I wanted to see how this fit test process works firsthand. I have an N95 mask which was purchased before the crisis, and which I’ve already worn, making it ineligible for donation. So to see how it fits me, I went in search of the materials necessary to perform a fit test.
Most of the time, these tests are performed using pleasant particles. A doctor or nurse puts on their mask, and steps into an environment filled with either a pleasant smelling chemical, or a fine mist of an artificial sweetener, like saccharin. If they can smell the chemical or taste the sweet saccharin, they’ve failed the test.
All the materials required to perform this version of a fit test were sold out. So I resorted to the only option still available: stannic chloride. Stannic chloride is a chemical compound originally used as a chemical warfare agent. Even at low (non dangerous) concentrations, it causes immediate (but harmless, if used properly) irritation to the nose and throat, causing coughing and discomfort in users.
It’s often used for fit tests where there’s a concern that a user might fake their results — for example, when they want to get through the fit test as quickly as possible and return to work. It’s easy enough to step into a chamber filled with saccharin wearing an improper mask and say “Nope, I don’t taste anything!”. It’s harder to get blasted in the face with stannic chloride and avoid uncontrollable coughing.
Technically the chemical is usually used for fit tests on N100 or P100 masks, which block even more particles than the N95 ones. But I figured that with stannic chloride, I should at least see a noticeable difference in irritation whether I wore an N95 mask or left it off, as it should still block a significant amount of the chemical if I was wearing it properly.
And that’s how I found myself stepping onto my driveway and using a little turkey-baster apparatus to spray myself in the face with a former chemical warfare agent (please, don’t try this at home).
The fit test protocol for stannic chloride says that to begin, “the test subject shall be allowed to smell a weak concentration of the irritant smoke before the respirator is donned to become familiar with its irritating properties.”
Aspirator for a stannic chloride fit test.
So I wafted a bit of the smoke out of a tube until it hung in the air, closed my eyes, and stepped through it while inhaling deeply, like an old-school department store shopper trying out a new perfume.
I expected an acrid smell. But my experience of inhaling the smoke was less a smell, and more a sensation: the instant, burning need to cough.
It wasn’t like someone blowing cigarette smoke in your face, where you’re hacking away for minutes. Rather, it immediately induced that feeling that you might get if you’re sitting in a quiet lecture hall, a theater, or another place where coughing would annoy those around you, and you find yourself needing to cough repeatedly anyway. “Irritating” is actually an excellent adjective to describe the overall experience.
I then donned my N95 mask, again following the recommendations from the fit test protocol, and puffed out a bit more smoke. This time, when I walked through it and inhaled, I felt nothing — no need to cough, no irritation, almost no effects at all.
I could smell the vapor a bit now (likely because my mask was N95 and not N100), and it had a pungent, gunpowder-like sulfurous smell. But the irritation was totally absent. Emboldened, I tried blasting my mask several times with the smoke. Still, I felt nothing.
The results were striking. I had assumed that as a layperson, I was probably wearing my mask incorrectly, and some amount of air (and potentially virus-laden particles) were leaking through. But my fit test showed that actually, the mask was surprisingly effective, even with my limited knowledge of how to use it properly. It gave me a new faith in the mask’s ability to protect me from whatever was out there floating around in the environment — whether that’s irritant smoke or the virus-filled remnants of some infected person’s sneeze.
That protection is why N95 masks have become the gold standard during the age of coronavirus. By actively blocking the particles which carry the virus, they provide the kind of protection that front-line healthcare workers need when treating infected patients. It’s also why they’re in such short supply.
Before the crisis, healthcare workers would discard N95 masks like candy. Between each visit with each patient, they would don an N95, and then throw it away after leaving the patient’s room. Even in a construction or other non-medical setting, the recommendation was to wear N95 masks for at most 8 hours at a time.
Now, the recommendations have changed dramatically. Front-line healthcare workers are wearing the same N95 mask for up to 30 days. The FDA has rapidly cleared technologies for sterilizing the masks, making them truly reusable. Absent high-tech solutions, some healthcare workers have taken to dousing their masks in alcohol or boiling them to sterilize them and allow them to be used for days or weeks at a time.
Interestingly, N95 masks with a valve (which makes it easier to breathe while wearing them) are actually banned in several jurisdictions’ mask orders. This is because the valve allows air that the wearer breathes out to pass unimpeded through the mask. Since the air is not filtered by the mask, N95 masks with valves provide no protection to others around you. If you still want to wear an N95 mask with a valve, go right ahead — just put some kind of cloth covering (on which more below) over the valve to protect others around you in addition to protecting yourself.
While N95s are the gold standard, there’s another option that’s proven nearly as good against other respiratory viruses: the medical-grade surgical mask. These are the little blue or yellow face shields you see doctors wearing in medical dramas, or may have put on yourself if you’ve ever visited a hospital during flu season. Surgical masks don’t create an airtight seal over the user’s mouth and nose like N95 masks do, so initially they were assumed to be much less effective in protecting against the coronavirus.
Recent studies have contradicted that assumption, though. Covid-19 is thought to spread mostly through droplets in the air, which are coughed or sneezed out by a sick patient. Surgical masks provide a barrier against these relatively large droplets, even if they don’t have a perfect seal. They also have the added bonus of preventing users from touching their faces, something that humans do an alarming amount.
As the crisis has unfolded, surgical masks are now considered nearly as good as N95 masks for medical workers. N95s are reserved for procedures where there’s likely to be a lot of coughing or sneezing (like intubating a patient), but otherwise healthcare workers are turning to the more basic surgical mask.
At first, I assumed surgical masks were made from a simple piece of fabric. It turns out they’re actually much more complex, and harder to manufacture. Medical-grade surgical masks have three layers, or plys. The outer layer is usually a waterproof synthetic fabric. It’s there to protect against watery sneezes and coughs, and to trap the big particles on which bacteria and viruses often travel. It also protects against splashes of blood or other bodily fluids during procedures.
A medical-grade surgical mask.
The middle layer of the mask is generally a filter, which traps bacteria and viruses down to 1 micron in size. It’s not as good as the .3 microns that N95 masks are rated for, but it’s still a lot more protection than a cloth mask (on which more below).
The final layer is generally another filter, and is designed to absorb the vapor from the wearer’s own breath, since the masks are generally right against the mouth and nose. Making a medical-grade mask is a challenging task. It requires special equipment, with extruded plastic sprayed through nozzles to create the tiny, sub-micron-size fibers that block particles but still allow air to flow through.
Many surgical masks are made in China. When the country was dealing with its own disease outbreak, most mask production stayed internal. As China tamped down the virus internally, though, production started to open up — and so did exports to hard hit areas like Europe and the United States.
For previous projects, my company has worked with several manufacturers in China. They’ve helped us make things like product labels, a custom pet product, and thousands of specialized plastic bags. As China flattened its own Covid-19 curve, I started receiving messages from our Chinese suppliers offering a new product: surgical masks. Recognizing an opportunity, the country has pivoted hundreds of factories towards making masks, and is now churning them out at a (claimed) rate of 110 million per day.
At first, I was skeptical about the masks on offer. But as I started to read about healthcare workers in Seattle and New York using garbage bags and bandannas as PPE, I realized that our connections in China could serve an important purpose in the Covid-19 fight. And so I found myself taking on the unexpected role of procuring hundreds of masks from China.
Even in normal times, doing business in China (at least at our small scale) is a fascinating mix of old-school hands-on service and very streamlined tech. We originally found our Chinese suppliers through the ubiquitous marketplace Alibaba. We then developed relationships over time, conducted through email, with both sides likely making liberal use of Google Translate.
Most of our purchases from Chinese factories — masks included — have started with us receiving an email (often out of the blue) from a supplier offering a new product. These generally list a Minimum Order Quantity, a description of the product, and some staged photos of the product in use. In normal times, this might be some new kind of bag or label.
In the age of coronaviruses, these emails feature smiling models wearing face masks, and details about 3-ply construction and factories’ manufacturing capabilities. Many factories go so far as to create an invoice with your company name and details inserted, before even discussing a purchase with you.
What follows is a certain amount of very cordial and responsive email haggling — generally not around the price, but around breaks for quantities, shipping speed, and the like.
I’ve found Chinese factories to be remarkably responsive. I can send an email in the middle of the night on Chinese time, and get a response within hours. The factories often seem to be embedded within local networks, too, which allows them to source products from peers. I could likely ask any supplier for some random product (say, car tires), and within a few hours they’d have a quote and invoice ready for me.
At some point in the process, the transaction moves to a leap of faith — wiring hundreds of dollars to people you’ve never met, in a place you couldn’t locate on a map, and hoping they follow through on their end of the bargain. I say “wiring”, but really no one uses wire transfers — most payments are made either through Alipay or Paypal, to an email address provided by the factory. The total cost includes the product, shipping, payment fees, and adjustments for any duties the factory must pay.
It sounds crazy to take this leap of faith, especially with bigger orders. But I’ve placed dozens of these orders over the years, and I’ve never had an issue with a factory failing to fulfill an order. If anything, most factories move faster than advertised — the old business adage to underpromise and overdeliver seems alive and well in China.
In this case, I worked with a supplier we’ve used for custom packaging and bags in the past. We agreed to purchase their MOQ of 800 surgical masks, for $310, including door to door air shipping. I sent the money via Paypal, with a promised delivery time of 12–15 days. In less than 7, a case of masks arrived at my door. The company even provided certificates from an independent quality assurance company, as well as an FDA certification, which made me more confident in the masks’ quality.
Even so — and recognizing that I had no idea what I was doing from a medical perspective — I decided to check them out for myself. There are several guides online for testing the quality of surgical masks. One test involves filling the front of the mask with water — if they’re real, the outer layer should be waterproof. My masks passed the test. Another obvious step is to cut the mask in half and ensure that it has three layers — mine did.
I stopped short of conducting a “fire test”, which involves pulling out the inner layer of the mask and trying to light it with a match. I don’t want to add “third degree burns” to the list of ailments my local medical system needs to treat. And besides, these masks weren’t meant for me — we purchased them to donate to the University of California San Francisco.
Months ago, donations of life-saving medical supplies from the general public would have been unthinkable. But as Covid-19 has strained supplies and front-line workers have taken drastic measures, many hospital systems are accepting (or actively soliciting) public donations of PPE, masks included. We donated our masks (minus the ones I removed for testing) to UCSF, where they were accepted by the nurses of the Benioff Children’s Hospital’s Intensive Care Nursery.
ICN nurses at UCSF accept our donation of 700+ surgical masks
UCSF will likely perform their own inspections of the masks (unlike me, they’re qualified to do this), and perhaps perform their own sterilization process. The masks will then likely be deployed in the fight against Covid-19, either to protect patients or visitors to the hospital, or if things get really desperate, to protect front-line workers themselves.
If you have surgical masks — even just a few — you should donate them too. Often the best place to start is with your local hospital — many have donation programs in place. You can also contact your town authorities — my town of San Ramon, California accepts donations in the City Hall parking lot at least once per week. Or you can reach out to a national organization like Operation Masks, which matches people who have PPE with healthcare organizations in need.
Don’t have medical-grade masks to donate (or obscure connections in China)? Or looking for an option to protect yourself, the average citizen? Your best bet is likely to turn to the final type of mask: a cloth face mask.
These masks are increasingly becoming mandatory in cities and states around the country. New York City is adopting mandatory mask laws this week, and many counties in the Bay Area have already made the move. As the world moves to reopen, it’s likely that face masks will be mandatory equipment for more people, more of the time. And for most people, it’s likely that their mask will be a homemade cloth one, not a medical-grade mask like my N95 or surgical masks.
How do homemade cloth masks work? Again, as with surgical masks, the main goal is to protect others from the wearer. Cloth masks block your potentially infectious sneezes and coughs, keeping others around you a bit safer — especially in situations like walking through a grocery store or working in an office, where social distancing is often impossible. In some really desperate situations, cloth masks can even be used in a medical setting to protect patients, and the CDC formally authorized this use at the beginning of the Covid-19 crisis.
So how do you actually get a cloth mask? There are numerous options online, and some high-fashion brands have turned to producing their own. But the simplest solution — and often the fastest — is to make one yourself. That brings us back to the CDC’s pivot towards domesticity, and its templates for sewing your own cloth mask at home.
I went to a very progressive public middle school in a suburb of Philadelphia, which was suspicious of gender norms way before “woke” became a household word. My school required all the females to take shop, and all the males to take Home Economics. So I received several semesters of formal education in sewing, cooking, and other basic household skills. When I saw the CDC’s mask tutorial, I felt fairly confident I could follow it.
Armed with a $9 sewing kit from Amazon and an old cloth dinner napkin, I got to work creating my own Covid-stopping PPE. The CDC’s mask pattern is very easy to follow. You essentially create two big rectangles of fabric, sew them together, and then create little pockets on each side for ear loops.
The CDC doesn’t directly suggest this, but many people choose to insert something between their fabric pieces to add a third layer of protection, much like a real surgical mask. Choices range from coffee filters to Swiffer cloths. Vacuum cleaner bags, it turns out, are the best option for improvised filters. The most important thing is to choose a material which can be easily laundered, or to construct your mask in a way that you can swap out the filter routinely.
For my own mask, I cut out my two pieces of fabric, transforming my fancy Pottery Barn dinner napkin into ugly rectangles with alarming, jagged edges. I then got out my needle and thread, and started sewing them together. Quickly, I realized that my sewing skills have atrophied — a lot.
My stitching work was less Martha Stewart, and more Frankenstein’s monster. Still, I managed to hem the fabric and attach it at the top. Then, recognizing my own limitations, I finished the rest of the mask with tape (gaffer’s tape is a good choice), and inserted kitchen twine as my ear loops.
My CDC pattern cloth face mask under construction.
The end result was ugly as sin, and made me look like some kind of a poorly thought-out, off-canon Star Wars character from a forgotten desert planet. But it stayed on my face and covered my nose and mouth — the basic requirements for a cloth mask in the age of Covid. If you have your own sewing machine or even a tiny modicum of hand sewing skill — which I clearly lack — you can probably create something much better and more durable.
And if you can’t, there are likely plenty of people in your neighborhood who can. Through the same motivation that has led to an explosion of quarantine cooking, many people are turning to domestic pursuits to pass the time. Check out the Nextdoor feed for your area, and it’s a good bet that some local good Samaritan will make you a really nice cloth mask for the cost of materials, or even for free.
As with all things Covid, masks are likely here to stay, at least for quite some time. If you have N95 masks available, they’re the gold standard of protection. Unless you’re an essential worker yourself, you’ve worn your mask already, or you’re in a very high risk group, you should donate these masks to those who genuinely are on the front lines.
If you have surgical masks — or have a supplier who can produce them — leverage that. The world will need many millions of these masks to make it through the pandemic, and your local hospital and first responders are likely to embrace a donation of surgical masks with open arms.
You can also consider donating money to a charity or organization with the supplier connections to buy more of these masks — imported or otherwise — for local or national donation.
And for you own protection — or just to have a project — consider creating a cloth mask of your own. You can experiment with different filters and materials. You can even use the mask as a unique expression of your own style — from the high end and posh to the ironic. For some, masks are even a way to wear a big middle finger to Covid on their face whenever they’re out in the world, showing their own breed of solidarity and perseverance through the crisis.
A few months ago, the idea that millions of people would become intimately familiar with a product as obscure as face masks would have seemed bizarre. But the world we’re living in now is bizarre in so many more serious, alarming ways. Masks are one bright area, where we can each take a simple action that directly affects the common good — especially with cloth masks that anyone can make at home.
So get out your needle and thread, consult either your grandma or the CDC (never a line I thought I’d get to use), and start sewing!
|
https://tomsmith585.medium.com/the-science-and-tech-of-face-masks-36f6db045fb8
|
['Thomas Smith']
|
2020-04-27 14:07:47.505000+00:00
|
['Covid 19', 'Explainer', 'Face Mask', 'N95', 'Science']
|
hour factory often seem embedded without local network allows source product peer could likely ask supplier random product say car tire within hour they’d quote invoice ready point process transaction move leap faith — wiring hundred dollar people you’ve never met place couldn’t locate map hoping follow end bargain say “wiring” really one us wire transfer — payment made either Alipay Paypal email address provided factory total cost includes product shipping payment feed adjustment duty factory must pay sound crazy take leap faith especially bigger order I’ve placed dozen order year I’ve never issue factory failing fulfill order anything factory move faster advertised — old business adage underpromise overdeliver seems alive well China case worked supplier we’ve used custom packaging bag past agreed purchase MOQ 800 surgical mask 310 including door door air shipping sent money via Paypal promised delivery time 12–15 day le 7 case mask arrived door company even provided certificate independent quality assurance company well FDA certification made confident masks’ quality Even — recognizing idea medical perspective — decided check several guide online testing quality surgical mask One test involves filled front mask water — they’re real outer layer waterproof mask passed test Another obvious step cut mask half ensure three layer — mine stopped short conducting “fire test” involves pulling inner layer mask trying light match don’t want add “third degree burns” list ailment local medical system need treat besides mask weren’t meant — purchased donate University California San Francisco Months ago donation lifesaving medical supply general public would unthinkable Covid19 strained supply frontline worker taken drastic measure many hospital system accepting actively soliciting public donation PPE mask included donated mask minus one removed testing UCSF accepted nurse Benioff Children’s Hospital’s Intensive Care Nursery ICN nurse UCSF accept donation 700 surgical mask UCSF likely perform inspection mask unlike they’re qualified perhaps perform sterilization process mask likely deployed fight Covid19 either protect patient visitor hospital thing get really desperate protect frontline worker surgical mask — even — donate Often best place start local hospital — many donation program place also contact town authority — town San Ramon California accepts donation City Hall parking lot least per week reach national organization like Operation Masks match people PPE healthcare organization need Don’t medicalgrade mask donate obscure connection China looking option protect average citizen best bet likely turn final type mask cloth face mask mask increasingly becoming mandatory city state around country New York City adopting mandatory mask law week many county Bay Area already made move world move reopen it’s likely face mask mandatory equipment people time people it’s likely mask homemade cloth one medicalgrade mask like N95 surgical mask homemade cloth mask work surgical mask main goal protect others wearer Cloth mask block potentially infectious sneeze cough keeping others around bit safer — especially situation like walking grocery store working office social distancing often impossible really desperate situation cloth mask even used medical setting protect patient CDC formally authorized us beginning Covid19 crisis actually get cloth mask numerous option online highfashion brand turned producing simplest solution — often fastest — make one brings u back CDC’s pivot towards domesticity template sewing 
cloth mask home went progressive public middle school suburb Philadelphia suspicious gender norm way “woke” became household word school required female take shop male take Home Economics received several semester formal education sewing cooking basic household skill saw CDC’s mask tutorial felt fairly confident could follow Armed 9 sewing kit Amazon old cloth dinner napkin got work creating Covidstopping PPE CDC’s mask pattern easy follow essentially create two big rectangle fabric sew together create little pocket side ear loop CDC doesn’t directly suggest many people choose insert something fabric piece add third layer protection much like real surgical mask Choices range coffee filter Swiffer cloth Vacuum cleaner bag turn best option improvised filter important thing choose material easily laundered construct mask way swap filter routinely mask cut two piece fabric transforming fancy Pottery Barn dinner napkin ugly rectangle alarming jagged edge got needle thread started sewing together Quickly realized sewing skill atrophied — lot stitching work le Martha Stewart Frankenstein’s monster Still managed hem fabric attached top recognizing limitation finished rest mask tape gaffer’s tape good choice inserted kitchen twine ear loop CDC pattern cloth face mask construction end result ugly sin made look like kind poorly thoughtout offcanon Star Wars character forgotten desert planet stayed face covered nose mouth — basic requirement cloth mask age Covid sewing machine even tiny modicum hand sewing skill — clearly lack — probably create something much better durable can’t likely plenty people neighborhood motivation led explosion quarantine cooking many people turning domestic pursuit pas time Check Nextdoor feed area it’s good bet local good Samaritan make really nice cloth mask cost material even free thing Covid mask likely stay least quite time N95 mask available they’re gold standard protection Unless you’re essential worker you’ve worn mask already you’re high risk group donate mask genuinely front line surgical mask — supplier produce — leverage world need many million mask make pandemic local hospital first responder likely embrace donation surgical mask open arm also consider donating money charity organization supplier connection buy mask — imported otherwise — local national donation protection — project — consider creating cloth mask experiment different filter material even use mask unique expression style — high end posh ironic mask even way wear big middle finger Covid face whenever they’re world showing breed solidarity perseverance crisis month ago idea million people become intimately familiar product obscure face mask would seemed bizarre world we’re living bizarre many serious alarming way Masks one bright area take simple action directly affect common good — especially cloth mask anyone make home get needle thread consult either grandma CDC never line thought I’d get use start sewingTags Covid 19 Explainer Face Mask N95 Science
|
5,593 |
Pandas, Plotting, Pizzaz, Brazil?
|
Pandas, Plotting, Pizzaz, Brazil?
A Brazillion Ways to Explore Data
To those newly initiated to the world of data science, the word pandas might still conjure up images of cute, fuzzy, friendly animals, seemingly built for nothing but being cute on camera and wasting otherwise productive hours browsing Google Images of bears eating leaves. But don’t worry! That’ll soon change. We’ll take you from a passive panda perceiver to a full-on Pandas professional, in just a few short minutes.
This article will assume some basic knowledge of Python, and a general idea of what the Pandas library is will be helpful, but not necessary. I’ll go over some basic/introductory concepts to get an overview and general understanding, but the focus of this article will be on application of matplotlib, pyplot, seaborn, and pandas in exploratory data analysis of a messy(ish) dataset. As we go through, I’ll suggest exploration to do on your own for your own practice.
Some of the questions I’ll answer here:
What is Pandas?
When is Pandas used?
How do you clean a dataset using Pandas?
How can visualizations aid in exploratory data analysis (EDA)?
What does exploratory data analysis look like?
What is Pandas?
Pandas is one of the premier packages for managing and cleaning data in the Python data science space. It allows for the neat containerization of data into Pandas objects called dataframes and is compatible with the widely used Python computing and data manipulation package NumPy, and it can be combined with a ton of other common data tools, like SQL databases. Pandas also contains many functions for cleaning, manipulating, indexing, and performing operations on data in a way which is optimized and significantly faster than standard Python operations, which especially comes in handy when working with the very large datasets we can sometimes encounter.
At its most basic, a Pandas dataframe is a two-dimensional organization of data with rows and columns. Though rows and columns are referred to in many ways and many contexts throughout the data science world, I’ll try to stick to either rows and columns or datapoints and features. Dataframe columns can be individually selected as a Series, the other common Pandas datatype. The Pandas series and dataframe are the foundation of the Pandas library, and the documentation does a good job explaining these datatypes, so you can read more here if you’re unfamiliar.
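To make those two datatypes concrete, here is a tiny sketch (the column names and values are invented for illustration):
import pandas as pd
# a DataFrame is a two-dimensional table of rows and columns
df = pd.DataFrame({'city': ['Rio', 'Recife'], 'population': [6.7, 1.6]})
# selecting a single column returns a Series
pop = df['population']
print(type(pop)) # <class 'pandas.core.series.Series'>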
When is Pandas Used?
This is an easy one. Are you working with data in Python? Use Pandas.
There are cases when a dataset is simply too large for a local runtime, where additional strategies must be employed, though after a certain point (limited either by your patience or your machine), a switch to a tool designed for larger datasets will be required regardless.
Pandas is used when working with CSVs, data scraped from the web, datasets from Kaggle or other sources, or pretty much any other time you have data which takes the form of datapoints with multiple features.
How do you clean a dataset using Pandas?
The short answer is it depends. There are a myriad of strategies that can be employed, and often you’ll have to look up examples specific to your situation, such as dealing with categorical variables, strings, etc.
That’s not a very useful answer though. In an attempt to write an actually helpful article, I’ll highly recommend visiting Kaggle and simply noting the strategies experts and masters make use of, and the situations in which they’re used, making a data science cheatsheet of sorts where you can note basic tasks. I’ll link a rough one I made a while back here.
A helpful process can actually be writing down end goals you want, attempting to figure out a few substeps to get there, and then searching Stack Overflow or Pandas documentation for the implementation. Data cleaning is much more about understanding the mindset of how to manipulate data into useful forms than it is about memorizing an algorithm.
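To give a flavor of that mindset, here are a few everyday cleaning idioms on a small made-up dataframe (a sketch only; the columns are hypothetical):
import pandas as pd
# hypothetical messy data
df = pd.DataFrame({'city': [' rio ', 'RECIFE', None],
                   'price': ['10', 'bad', '12'],
                   'rating': [4.0, None, 5.0]})
df['price'] = pd.to_numeric(df['price'], errors='coerce') # bad values become NaN
df = df.dropna(subset=['price']) # drop rows missing a required field
df['rating'] = df['rating'].fillna(df['rating'].mean()) # impute gaps with the mean
df['city'] = df['city'].str.strip().str.title() # normalize strings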
What Does EDA Look Like?
I’ll go through an example here; see the Kaggle Kernel to take an even closer look, directly at the code. I highly recommend following along on your own, or in the notebook, looking up documentation as you go along.
We’ll be working with the Brazilian Cities dataset from Kaggle: a fairly large dataset with a TON of features. Let’s see if we can employ Pandas and some creative visualizations to clean this up.
As a disclaimer, Seaborn is based on PyPlot and Matplotlib, hence their mention in the intro, but I prefer Seaborn’s functionality and style, so you may not see PyPlot or Matplotlib explicitly here.
# import pandas and numpy, as well as seaborn and pyplot for visualizations
import numpy as np
import pandas as pd
import seaborn as sns
import matplotlib.pyplot as plt # plt is used for figure sizing later on
# import the os package for Kaggle
import os
print(os.listdir("../input"))
# read in the data and create a dataframe
df = pd.read_csv("../input/brazilian-cities/BRAZIL_CITIES.csv", sep=";", decimal=",")
# view a random sample of the dataframe
df.sample(25)
Good stuff! Let’s also look at the shape to get a sense of the magnitude of the set we’re looking at.
# get dataframe shape
df.shape
Out:
(5576, 81)
Whenever we do any type of complicated work, it can often be helpful to have a deep copy of our dataframe on hand such that we can more easily revert back if and when we mess up our primary dataframe.
# create a deep copy of our dataframe
df_copy = df.copy(True)
Now that we’ve done this, let’s get to work!
This dataset has a fairly large amount of features, which might make it hard to explore. Since we’re not looking to create a model, we don’t have to perform feature selection, so we can simply select a subset of columns we’d like to explore. Feel free to make a different list than my own!
columns = ['CITY', 'STATE', 'CAPITAL', 'IBGE_RES_POP', 'IBGE_RES_POP_BRAS','IBGE_RES_POP_ESTR','IBGE_DU','IBGE_DU_URBAN','IBGE_DU_RURAL', 'IBGE_POP','IBGE_1','IBGE_1-4','IBGE_5-9','IBGE_10-14','IBGE_15-59','IBGE_60+','IBGE_PLANTED_AREA','IDHM','LONG','LAT','ALT','ESTIMATED_POP','GDP_CAPITA','Cars','Motorcycles','UBER','MAC','WAL-MART','BEDS'] # create reduced dataframe and check shape
r_df = df[columns]
r_df.shape
Out:
(5576, 29)
Awesome! Much more manageable now. A really helpful tool for initial exploration is the Seaborn pairplot function. It graphs every variable against every other variable in one method! Let’s see if it can help us here.
# create a seaborn pairplot
pp = sns.pairplot(r_df)
Seaborn Pairplot of the Data
Wow. Umm, okay. That’s huge, a little overwhelming, and not particularly helpful. Imagine if we had left all the features in! A good next step, especially if the pairplot proves unhelpful, is trying a correlation matrix just to see if there’s any linear relationship among variables.
corr = r_df.corr()
# I prefer one-sided matrices, so create a mask
# (the original kernel used np.bool; plain bool avoids the NumPy deprecation)
mask = np.zeros_like(corr, dtype=bool)
mask[np.triu_indices_from(mask)] = True
# set up figure
f, ax = plt.subplots(figsize=(15, 15))
cmap = sns.diverging_palette(220, 20, as_cmap=True)
sns.heatmap(corr, mask=mask, cmap=cmap, vmax=.3, center=0,
            square=True, linewidths=.5, cbar_kws={"shrink": .5})
Correlation matrix of all features
The UBER column is completely blank; let’s look directly at the data to see what’s going on.
r_df.UBER
Out:
0 1.0
1 NaN
2 1.0
3 1.0
4 1.0
5 1.0
6 NaN
7 1.0
8 1.0
9 1.0
10 1.0
11 1.0
12 1.0
13 1.0
14 1.0
15 1.0
16 1.0
17 NaN
18 NaN
19 NaN
20 NaN
21 1.0
22 1.0
23 NaN
24 NaN
25 NaN
26 1.0
27 1.0
28 NaN
29 1.0
...
5546 NaN
5547 NaN
5548 NaN
5549 NaN
5550 NaN
5551 NaN
5552 NaN
5553 NaN
5554 NaN
5555 NaN
5556 NaN
5557 NaN
5558 NaN
5559 NaN
5560 NaN
5561 NaN
5562 NaN
5563 NaN
5564 NaN
5565 NaN
5566 NaN
5567 NaN
5568 NaN
5569 NaN
5570 NaN
5571 NaN
5572 NaN
5573 NaN
5574 NaN
5575 NaN
Lots of NaNs and no zeros, a possible error in the set.
# there are a lot of nans, possibly in place of zeros, let's check
df_copy.UBER.value_counts()
Out:
1.0 125
Yeah, no zeros in the whole column and only 125 values out of over 5000. Let’s replace NaNs with zeros and try again.
r_df.UBER.replace({np.nan:0}, inplace=True)
r_df.UBER.value_counts()
Out:
0.0 5451
1.0 125
Success! Let’s try the correlation matrix again. Running the same code, we get:
Corrected correlation matrix
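A quick aside on the replacement we just did: because r_df was created by selecting columns from df, the inplace call above may raise pandas’ SettingWithCopyWarning. A warning-free sketch of the same fix:
# work on an explicit copy and assign the filled column back
r_df = r_df.copy()
r_df['UBER'] = r_df['UBER'].fillna(0)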
Now let’s start looking at the relationships!
# let's investigate the strongest correlation first
sns.set_style('dark')
sns.scatterplot(x=r_df.IDHM, y=r_df.LAT)
Relationship of Latitude and IDHM (Human Development Index)
This shows what we expected from the correlation matrix, but doesn’t really supply a lot of meaning to most people, as without an idea of the latitudes of Brazil or a better idea of what IDHM is, they’re kinda just meaningless points. Let’s contextualize within the geography of the country using latitude and longitude.
# map of lat and long with IDHM determining size
f, ax = plt.subplots(figsize=(8, 8))
sns.scatterplot(x=r_df.LONG, y=r_df.LAT, size=r_df.IDHM)
Attempt #2 at contextualizing IDHM trends
Here we can see a rough outline of the nation of Brazil. Overlaying this on a map may be even more helpful, but we’re going to skip that here. We can see a bit of a trend, but with so many points it’s hard to distinguish the sizes. Let’s try adding a color encoding.
# it's hard to see any trends here, let's add color to get a better idea
f, ax = plt.subplots(figsize=(8, 8))
sns.scatterplot(x=r_df.LONG, y=r_df.LAT, size=r_df.IDHM, hue=r_df.IDHM)
Attempt #3 at contextualizing IDHM trends, much better!
Fantastic! Here we can clearly see a trend of higher IDHM towards the center and south of the country. In your own exploration, maybe you can find characteristics about these parts of the country which may cause this.
Let’s see if we can identify the state capital cities in this plot.
# let's see if we can spot any capitals in there
f, ax = plt.subplots(figsize=(8, 8))
markers = {0:'o', 1:'s'}
sns.scatterplot(x=r_df.LONG, y=r_df.LAT, size=r_df.IDHM, hue=r_df.IDHM,style=r_df.CAPITAL, markers=markers)
Attempt #4 at contextualizing IDHM trends, with markers added for capital cities
Here we’re facing the same problem as in attempt #2. We can’t see the additionally encoded data! Let’s try an overlay and see if that’s more clear.
f, ax = plt.subplots(figsize=(8, 8))
sns.scatterplot(x=r_df.LONG, y=r_df.LAT, size=r_df.IDHM, hue=r_df.IDHM)
sns.scatterplot(x=r_df[r_df.CAPITAL==1].LONG, y=r_df[r_df.CAPITAL==1].LAT, s=100)
Attempt #5 at contextualizing IDHM trends, with an overlay added for capital cities
Great. Now we can see the capital cities as well, and we can see that they very strongly trend towards the east of the country, towards the coast. Not surprising, yet very interesting to see all the same. Let’s see if we can give GDP per capita a similar treatment, kind of reusing our code.
f, ax = plt.subplots(figsize=(8, 8))
sns.scatterplot(x=r_df.LONG, y=r_df.LAT, size=r_df.GDP_CAPITA, hue=r_df.GDP_CAPITA)
GDP per capita encoded with latitude and longitude
Hmm. It looks like there aren’t enough color bins to show all the trends in GDP here, likely meaning the data has a large spread and/or skew. Let’s investigate.
# let's take a look at the distribution, after taking care of nans
f, ax = plt.subplots(figsize=(12, 8))
gdp = r_df.GDP_CAPITA.dropna()
sns.distplot(gdp)
Distribution of GDP per Capita
# it looks like gdp is heavily right skewed with a massive tail.
# it seems likely that those massive outliers are errors, and could be removed in some cases
gdp.describe()
Out:
count 5573.000000
mean 21129.767244
std 20327.836119
min 3190.570000
25% 9061.720000
50% 15879.960000
75% 26156.990000
max 314637.690000
A huge tail, and a significant skew. This will make the data much more difficult to encode with color. If we wanted to, we could remove outliers or try a different scale on the data, potentially log (a quick sketch of that option follows), before we investigate one last variable. Uber! Let’s see how they’re doing.
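Here is what the log option might look like, as a sketch on top of the original kernel (reusing the gdp series defined above):
# compress the heavy right tail by plotting GDP per capita on a log10 scale
log_gdp = np.log10(gdp)
f, ax = plt.subplots(figsize=(12, 8))
sns.distplot(log_gdp)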
f, ax = plt.subplots(figsize=(12, 8))
sns.countplot(r_df['UBER'])
Countplot of Uber in Brazilian Cities
We can see here that the vast majority of Brazilian cities don’t have Uber. Let’s see where they are.
f, ax = plt.subplots(figsize=(8, 8))
sns.scatterplot(x=r_df[r_df.UBER==0].LONG, y=r_df[r_df.UBER==0].LAT)
sns.scatterplot(x=r_df[r_df.UBER==1].LONG, y=r_df[r_df.UBER==1].LAT)
Distribution of cities with presence of UBER encoded in orange
We can see similar trends to IDHM here, with clustering by the coast and the south of the country. Try checking out IDHM and Uber on your own! Let’s look at Uber’s relationship with cars.
f, ax = plt.subplots(figsize=(16, 12))
sns.boxplot(y=r_df['Cars'], x=r_df['UBER'])
Box plot of the distribution of number of cars in Brazilian cities, separated by the presence of Uber.
Oof. That’s not pretty or useful. Let’s remove those giant outliers and get a better look.
ubers, car_vals = r_df[r_df.Cars <100000].UBER, r_df[r_df.Cars <100000].Cars
sns.boxplot(ubers, car_vals )
Box plot of the distribution of number of cars (#Cars <100,000) in Brazilian cities, separated by the presence of Uber.
This is a really interesting distribution. The minimum bound for cities with Uber is above the maximum bound for cities without. Try investigating other variables related to cars on your own to see why this is. My guess would be something to do with population or GDP. Also, the presence of outliers in the non-Uber cities suggests a large skew/tail. Let’s take a look.
f, ax = plt.subplots(figsize=(16, 12))
sns.distplot(r_df[(r_df.Cars < 100000) & (r_df.UBER==0)].Cars)
sns.distplot(r_df[(r_df.Cars < 100000) & (r_df.UBER==1)].Cars, bins=20)
As expected! The majority of cities either have no cars or a very small number. Good work!
What we learned:
What is Pandas? A very useful Python package for data cleaning and manipulation.
When is Pandas used? Pretty much anytime you need to work with data in Python!
How do you clean a dataset using Pandas? With a curious mind, and lots of searching pandas documentation, paired with helpful examples.
How can visualizations aid in exploratory data analysis (EDA)? Visualizations are KEY in exploratory data analysis. This is easier shown than explained, check out the example in the article.
What does exploratory data analysis look like? See above!
Want to explore more? Try these (a starter sketch for the first one follows the list):
Explore population data
See how population varies with other supplied categorical values
Look at how gdp per capita varies with presence of other industries
Add back in all variables/subset with different variables and create more correlation matrices to explore additional trends
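For the first suggestion, here is a starter sketch (using columns from the subset we defined earlier):
# compare census population to estimated population, marking capitals
f, ax = plt.subplots(figsize=(8, 8))
sns.scatterplot(x=r_df.IBGE_RES_POP, y=r_df.ESTIMATED_POP, hue=r_df.CAPITAL)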
Thanks for reading!
|
https://medium.com/swlh/pandas-plotting-pizzaz-brazil-9a3aa4cf21b
|
['Caleb Neale']
|
2019-06-14 16:56:11.943000+00:00
|
['Python', 'Data', 'Data Science', 'Data Visualization', 'Pandas']
|
5,594 |
Hiring Software Developers in a Competitive Market
|
Hiring Software Developers in a Competitive Market
Danté Nel at DisruptHR CPT 12.10.2017
There is a certain mentality one needs to have when hiring software developers in a competitive market. Most companies require technical skills, but demand has outstripped supply — software developers can afford to be picky when multiple companies approach them for a job. Danté speaks about the change in mindset required when you, as an individual hiring manager at your company, realise the fierceness of competition from other players in the market.
For instance, 60% of developers will have at least 2 companies interviewing them within 1 week on the job market. This jumps to 71% in the second week, while popular ‘devs’ easily get more than 10 companies asking them for interviews. Danté gives some pro-tips on how to “win” — making better technical assessments, applying empathy in the hiring process, etc.
Watch the video:
|
https://medium.com/getting-better-together/hiring-software-developers-in-a-competitive-market-62021aa2e997
|
[]
|
2018-03-29 10:07:24.405000+00:00
|
['Hiring', 'Disrupthrct', 'People', 'Talent', 'Software Development']
|
5,595 |
Mass Media Approaches in B2B Digital Selling
|
Nowadays, sales processes are absorbing instruments and tactics that were traditionally used by journalists and media managers. It means that B2B sales representatives are evolving from merely good communicators into digital trusted advisors. In this article you will find more about sales tools at the intersection of B2B business development and traditional media.
The pandemic of 2020 has shown that customer communications in B2B can no longer be so classical. Face-to-face meetings, huge conferences, long business trips and all other physical events have proven insufficient for closing new deals. Some businesses could say that landing pages, online webinars and other marketing tools have substituted for the lack of physical communication, but that is a marketing job. How can a salesperson process demand in remote circumstances? What are the ways to stay recognisable among competitors if you never even see your customer?
In such an instance, sales teams are required to absorb mass media practices that can be easily repeated in content creation. Being part of a digital selling team in the IT industry and having journalistic experience, I have tested the most common media tools for sales, so here are the ones I find essential to work with.
Media storytelling
In newspapers or TV reports you will notice a similar story structure. It includes a bright headline, a subhead (a detailed description of the headline), a body text and a conclusion (a summary of the story). Some elements of this structure have migrated to social media posts.
Firstly, look at the posts that become popular among subscribers at the following link. Most of them have a catchy headline that hooks your attention.
What are the good forms of headlines that sales people may use in their social selling posts?
Questions (e.g. How has business changed its mindset about human resources due to COVID-19?)
Round numbers (e.g. 40% of employees will never go back to offline offices after the pandemic)
Exclusive insights (e.g. The most common business launches in 2020)
Secondly, social media posts usually have the same structures as journalistic materials. These are the most common body text structures I have seen in selling posts:
▫️ cause and effect
▫️ classification
▫️ compare and contrast
▫️ list
▫️ question and answer
The third element shared by social selling posts and media materials is a catchy conclusion. Journalists usually end their texts with a short summary of all the above-mentioned facts or announce an upcoming event in the story. Sales people mostly use a call-to-action phrase such as ‘ask for detailed materials’ or ‘watch the video below’. I believe that both summary types have a place in social selling posts: the journalistic style is more appropriate for personal stories about the business, while the selling style is good for invitations and lead generation campaigns.
However, this media storytelling approach fits not only social media texts, but also video selling scripts.
Video selling
For a long time, video production required a lot of special media staff, professional equipment and TV studios. Nowadays, we can do it with far less preparation and expenditure. Moreover, video has become the most popular form of communication over the last 2–3 years. Recent studies of non-text communication show that video is also becoming a widespread format in customer interaction.
59% of senior executives prefer lighthearted work-related videos
So, when might video be used by sales in b2b?
Personal solution overview
Invitations to webinars
Announcements of solutions
Interviews with customers
Follow-up
Broadcasts
Tools for video selling
Concerning the instruments, I like ones that are simple and quick to use. These are super important factors, as B2B sales people usually have little time and a lot of customers to work with. These are the video tools I find most appropriate for sales.
BigVu is an app that helps you record video with a teleprompter. You can also add text, music and your company branding to the recorded shot. The app has a trial mode.
Quik is GoPro’s editing tool, which lets you combine photos and video shots and build dynamic content from standard templates. You can use it even without a GoPro camera and work with shots on your smartphone. Quik also has a free version.
iMovie is the simplest video editing app I’ve ever known. You can cut clips, speed your videos up or down, add text and put music in the background. It is free for all Apple devices.
Zoom became super popular among people around the world in 2020 due to its great quality and free 40-minute mode. However, there is one option that is not well known among the app’s users: sharing a PowerPoint presentation as a virtual background. This way, your customer can see all your movements and gestures along with the content behind you.
So, I believe that the 2020 pandemic has become a strong trigger for digital transformation in B2B sales, one that will bring media approaches and sales execution closer together. A tech-savvy mindset and content creativity are going to be more appreciated among decision makers. I consider that these qualities make sales more personalized and targeted for their customers.
|
https://medium.com/swlh/mass-media-approaches-in-b2b-digital-selling-92476e44b8f0
|
['Julia Pontus']
|
2020-12-22 08:21:31.458000+00:00
|
['Marketing', 'Sales', 'Digital', 'Médiá', 'Digital Selling']
|
5,596 |
Finding Hope through Mindfulness
|
Photo by Andrea Piacquadio from Pexels
You are not alone if you have been feeling caught in a cycle of constant negativity. A combination of social isolation, political chaos and social media messaging can make us feel overwhelmed and cause heightened anxiety and depressed mood. It is normal to feel that you’ve reached your capacity. Lately, I have heard from so many that even when they’re focused on self-care and wellness it doesn’t feel enough to kick them out of a slump of sadness and fear. How can we escape this negativity? What can we do to help ourselves when we feel this way?
It is easy to fall into a negative cycle in our fast paced lives when we are not taking the time to be mindful. When we take a step back and focus on the present moment, it is easier to see hope and understand that we have the strength to overcome what is in front of us. As a mental health therapist, I have seen the practice of mindfulness assist many in finding hope and minimizing stress during difficult times like these.
What is Mindfulness?
Mindfulness is the act of intentionally living with awareness in the present moment without judgment or attachment.
When we practice mindfulness it is most important to be open to the experiences each new moment brings, rather than being stuck in the past or looking towards the future. For example, if we are mindful of the food we are eating at this moment, then we want to only focus on that experience. We would want to avoid judging how the food tastes, past memories associated with the food or thoughts about what you will be doing after you eat. We would want to try to remain focused on the food we are eating and bring ourselves back to the present if we are distracted with other thoughts or judgments.
Mindfulness can be practiced with any event or moment by observing, describing or participating in an experience. Even as you are reading this article you can practice mindfulness. Are you focused on these black ink words and the meaning behind them? Are you drifting away by thinking about an argument you had earlier this week? Are you having judgments or being hard on yourself for not practicing mindfulness before? I encourage you to refocus on the words you are reading now and practice participating in mindfulness in this moment.
Benefits of Mindfulness
Imagine what it would be like to be driving and only be focused on the task at hand. Focused on the feeling of the steering wheel, the sounds of the turn signals and the cars around you, instead of stressing about running late, thinking about what you want for dinner and what is still left on your growing to-do list. Imagine how many fewer car accidents or close calls you would face. Imagine how much less stress you would feel. Mindfulness has the power to reduce your suffering, reduce tension and pain and increase control of your mind instead of letting your mind be in control of you. When we are mindful in each moment we are in control of our thoughts and are less likely to spiral with anxious worries. When we are mindful we give ourselves the opportunity to experience reality as it is and to live our lives connected to the universe around us. Mindfulness allows us to take in the entire experience, to be fully present and gives us the freedom to let go of attachments and demands that society has forced upon us. Mindfulness gives us the opportunity to see hope.
Sometimes mindfulness is difficult because the present moment may cause distress for us. It is so natural for us to distract ourselves from moments of distress. Mindfulness urges us to take in each moment as it is and sit in the discomfort that may cause. As you continue to practice mindfulness, oftentimes you will notice that challenging moments and experiences become less painful. This is because mindfulness allows us to regulate our emotions and feel a radical increase in love and compassion towards others. In the year 2020 especially, mindfulness can assist us in being less overwhelmed by taking in moments as they are instead of compacting them together.
Conveyor Belt Thoughts
One of the more difficult parts of mindfulness is allowing yourself and your thoughts to be focused on the present. It is so common for us to drift our thoughts away to the next thing we have to do or become distracted with worries. A key part of mindfulness is to be able to acknowledge those distracting thoughts as they come and then let them float away and bring your attention back to the present moment. It can be helpful to notice your thoughts and feelings coming down a conveyor belt. When practicing mindfulness, as you notice distracting thoughts coming up imagine placing them on a conveyor belt and watching them leave you in this moment, then bring your attention back to what is in front of you.
Loving Kindness
A mindfulness activity that can be particularly helpful in building hope within us during times when we feel chaos, anger, or sadness is practicing loving kindness. This activity can be focused on anyone, but can be especially impactful when utilized towards someone you are struggling with (maybe a partner, family member, colleague or even a political figure). When we feel in a cycle of negativity it is easy to feel dislike or even hatred towards those around you. If you feel like these feelings are not serving you, consider attempting to send loving kindness their way. Start by sitting, standing or lying down and practicing deep breathing for 1–2 minutes with your palms open and facing up. Begin by gently bringing the person you are thinking of to mind. Say their name out loud or in your head. Center yourself in the practice of mindfulness towards sending loving kindness to this person. Radiate loving kindness to this person by reciting warm wishes for the person. You can start with more neutral thoughts:
“I am sending loving kindness to _____”
“May ______ be safe”
Take a deep breath
“May _____ be healthy”
Take a deep breath
“May _____ feel at peace”
Take a deep breath
“May _____ feel happiness”
Take a deep breath
As you feel comfortable you are welcome to add other well wishes for the person you are focusing on. Continue to repeat your phrases until you yourself feel immersed in loving kindness. Let me be clear: this can be very challenging. At the same time it can relieve distress and anger and can be a powerful tool in finding hope. During the activity utilize the conveyor belt method when thoughts of hatred or dislike come seeping in. There is no need to judge yourself if these thoughts come; instead, acknowledge them and then imagine them leaving you. It may be easier to start this activity by practicing sending loving kindness to someone you love or to yourself and then working toward people in your life that you have more complicated feelings towards.
Reminder
A gentle reminder that mindfulness can most definitely be challenging. It is something that takes practice and patience. Don’t get discouraged if you are practicing mindfulness and have difficulty staying in the present. With time, this exercise will become easier. Practicing loving kindness is about challenging the narratives we have created regarding how we see people around us. Practicing mindfulness through loving kindness allows us to push back many of the assumptions we have made in our negative cycle and see someone in this moment as what they truly are without judgment. Starting this practice is a major step towards focusing on your mental health. Be proud of yourself for starting on this mindfulness journey and allow yourself to feel hope in this process.
|
https://medium.com/joincurio/finding-hope-through-mindfulness-41ea979f58ff
|
['Sarah Belarde']
|
2020-11-02 15:37:01.496000+00:00
|
['Personal Growth', 'Mental Health', 'Anxiety', 'Mindset', 'Mindfulness']
|
5,597 |
Basic Data Cleaning — Removing NaNs
|
As a beginning data scientist, I’m learning that most of my time is spent preparing data for analysis. Much as writing is about clarifying and polishing ideas, data must be thoroughly cleaned and prepared before it can tell any compelling stories.
This might not seem very interesting, but it is necessary if we want to extract any interesting stories from the data.
Here are some example datasets for us to work with:
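A minimal sketch of that setup, assuming numpy and pandas (example_df matches the snippet used later in the post; the column labels are arbitrary):

    import numpy as np
    import pandas as pd

    # 100 rows x 4 columns of random integers in the range [0, 100)
    example_df = pd.DataFrame(
        np.random.randint(0, 100, size=(100, 4)),
        columns=['A', 'B', 'C', 'D']
    )
    example_df.head()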
We’ve created a dataframe with 4 columns and 100 rows (indexing starts at zero, so 99 is our last row), populated with random integers between 0 and 100.
NaN (not a number) values are gaps in the data: an observation was missed, or perhaps the data simply isn’t available. They make it impossible to run most analysis tools on a dataset, but if you simply throw out every observation containing a NaN, you can end up losing a lot of potentially useful data. Therefore, as data scientists, we want to observe how many NaN values there are, where they sit in the data, and the most appropriate way to “clean” them.
You can verify that there are no NaN variables by running:
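    # Count NaN values per column; every count should be zero at this point
    example_df.isna().sum()

(This is the same call that step 1 of the cleaning algorithm below refers to.)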
The .isna() function searches for NaN values and returns a boolean True wherever it finds one, then the .sum() function counts each True and returns the total for each column. If we wanted to check the number of NaN values by row instead of by column, we would type:
    example_df.isna().sum(axis=1)
But I won’t run it because it’s a series of 100 zeros.
Since we don’t actually have any NaN values yet, let’s add some:
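One way to write that masking function (a sketch; the exact implementation can vary):

    # Replace each cell with NaN whenever a random draw from [0, 100) lands at or below 15
    nan_df = example_df.applymap(
        lambda x: np.nan if np.random.randint(0, 100) <= 15 else x
    )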
What we’ve done is create a new dataset that has 15% NaN variables by mapping a function over every element of the dataframe. The function replaces the existing cell with a NaN value if the random number it generates is <= 15.
We can verify that the NaNs exist and their distribution as follows:
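    # Roughly 15 NaNs expected in each column of 100 rows
    nan_df.isna().sum()

(nan_df here is the masked copy from the sketch above.)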
So if we take the mean of the per-column NaN counts, we get 15.25 out of 100 rows, which isn’t exactly 15%. Part of that is sampling noise, but with a mask written the way it’s sketched above, there is also an off-by-one: randint(0, 100) <= 15 passes for sixteen of the hundred equally likely integers (0 through 15), so the expected rate is really 16%.
Now that we have some NaN data points, a fairly standard cleaning algorithm is as follows:
1) Run df.isna().sum() to confirm the presence of NaN values (which we’ve done).
2) Determine the appropriate measure to take with your NaN values.
3) Execute.
For step 2, we should consider the characteristics of this dataframe. It’s an array of integers, so it will have a shape, and some statistical properties, which we can see below:
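    print(nan_df.shape)   # (100, 4)
    nan_df.describe()     # count, mean, std, and quartiles for each column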
*I use the print() function here because when you have two statements in a Jupyter (or Colab) notebook code cell, only the last one will display its output when you run the cell.
You can see that our dataframe’s shape is 100 rows by 4 columns, and the 4 columns have a certain mean, standard deviation, and distribution listed by the describe function. You can also see that the count for each column is the number of rows minus that column’s NaN values.
Since data science is concerned with rapidly deploying predictive models and descriptive statistics, there is an “art” to this science (this may cause some readers to see red and I think I know what you’re going to say, but please hear me out).
If we were to delete all columns containing NaN values, we would lose the entire dataset, so that’s not an option. If we were to delete all rows with NaN values, we would lose more than 15% of our data, which is also unattractive.
A better option might be to mask the NaN values without changing the shape of the dataset. We can replace each NaN with the mean of its column, which should preserve the general shape of the data.
Let’s see what happens:
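A sketch of that replacement (filled_df is an arbitrary name; fillna with a Series of column means fills each column with its own mean):

    filled_df = nan_df.fillna(nan_df.mean())
    filled_df.describe()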
The mean is preserved, as are the min and max. However, our standard deviation and quartiles have shifted towards the mean. This is to be expected: every NaN that went uncounted in nan_df’s description has now been replaced by its column mean, shifting the relative weight of observations toward the center of the distribution.
This is a tradeoff, but perhaps a small one since now we get to keep all our data.
I will complicate this situation a little more in my next post.
|
https://medium.com/writing-data/basic-data-cleaning-removing-nans-1787110fc11b
|
['Ned H']
|
2019-03-26 00:39:26.970000+00:00
|
['Data Cleaning', 'Numpy', 'Python', 'Pandas', 'Data Science']
|
5,598 |
Five Strategies to Boost Your NaNoWriMo Progress
|
Five Strategies to Boost Your NaNoWriMo Progress
Embrace imperfection and finish your novel
Photo by Bill Jelen on Unsplash
Everyone has different life circumstances, is at a different place in their writing journey, and has different skills. Some people stay up late while others get up early. There are writers with kids running around, and puppies, and school, and life events.
These strategies are still for you.
It can be overwhelming to see other people with word counts higher than yours (hence my introduction). Everyone writes at their own pace, so focus less on their numbers and more on your progress. If you need a reminder of other things to celebrate during NaNo besides word count, check out this piece.
This is my third NaNo project. I write fiction full(ish)-time. I’m also a self-published author who will finish this year with 15 published titles. I’m telling you this because when I share my word count, you should understand the life circumstances that allowed this to happen.
As of writing this post (I’m scheduling it for tomorrow morning), I sit at 25K words on my novel. My breakdown was 10K+ on day 1, 6K+ on day 2, and 7K+ on day 3.
That didn’t only happen because of my life circumstances or the fact that I have the first half of this novel planned extensively (yeah, once I reach the middle I need to figure out where I’m going). I actually have a few strategies that I use during November to help keep me on track and flying through the month.
No matter where you are in your writing journey or how many times you’ve participated in NaNo, these strategies can boost your progress without sucking the life out of you.
DO NOT PRESS THE BACKSPACE KEY
Yeah, I’m putting that one in all caps.
It’s so tempting to go back and try to make your first sentence perfect on the first try. You might want to go back and fix that small little comma error you have. You might think of a new or better word than the one you just used.
Don’t press the backspace key. Don’t let that word go to waste. Just keep swimming, like this writer who wrote 1000 words before finding her first sentence. Those 1000 words still count.
November is NOT National Editing Month. Keep writing.
Keep typing
Keep typing, even if you have no idea what you’re writing.
Something I do only in November (to pad my word count at the very least) is to not let my fingers stop moving for writing sprints. I keep typing, even if that means I start jumbling my stream of consciousness onto the page. It might just be a quick paragraph of what you might want to happen in the future. Just don’t stop.
Keep momentum and take advantage of it if you have it. Think of it like morning pages, where you can’t let your fingers stop typing. Even if it’s garbled mess, there are still words on your screen. They still count toward your final goal. And the more you keep typing, the better those words will start flowing, getting you closer to the finish line.
Make notes, not changes
Have you stumbled onto a plot hole? Discovered that you made a character’s eyes blue when they started as green?
MAKE A NOTE. Sticky note by your computer, comment on your word document, leave a trail of breadcrumbs.
Don’t pause and go back to fix the previous chapter. That takes time away from writing and it messes with your flow. Leave yourself a note and worry later.
I already have a minimum of ten notes to go back and add certain elements or change descriptions. I’m not upset that I ‘missed’ them at first; I just discovered things in my story that needed to happen differently.
Move forward as if you’ve made that change
Move forward as if you’ve made that change — That plot hole you just found? Yeah, it never existed… at least for your future word count.
Pretend as if you solved it from the beginning of your novel and keep writing as if it has always been that way. Leave a note (bold, highlight, brackets) and fix it after November.
So much time is wasted scrolling your document to find the small little sentence that you need to change. And if you have a massive change, it can be tempting to try to fix everything and make it fit just right in the manuscript. Fight the urge and just pretend it already exists. You can change it after you reach your 50K words.
It doesn’t have to be perfect
Believe me. I spent last year’s NaNo trying to write the perfect conclusion to my trilogy. I slaved over words, trying to make each sentence absolutely beautiful. I researched my old books making sure I covered everything. Getting 50K was the hardest word count I’ve ever worked for. I HATED THAT BOOK FOR A MONTH.
But then I rewrote it during Camp the following April and tossed perfection out the window. It might be my favorite book written to date.
Please please remember that this month is not about making a perfect manuscript on the first try. It’s about getting something onto the page so you can edit it later (or just throw away if that’s what you want).
No one has to ever see the first draft. You can keep your little secretly misspelled words like ‘fish’ to yourself (true story, I wrote ‘firch’ at least five times before I noticed). The first draft is for you and you alone. Write the story for yourself. Share it later.
|
https://medium.com/the-innovation/five-strategies-to-boost-your-nanowrimo-progress-ceba67231c71
|
['Laura Winter']
|
2020-11-06 15:44:09.549000+00:00
|
['Novel', 'Novel Writing', 'NaNoWriMo', 'Writing', 'Writing Tips']
|
5,599 |
A First Look at React’s New Server Components
|
A First Look at React’s New Server Components
Explaining the new approach to fetching data in React.js.
Yesterday, the React team announced a new feature: Server Components.
The feature is still experimental; there is no real documentation yet.
Simply put, it’s about data and component fetching in React.js.
Server Components allow us to load components from the backend. The components have already been rendered in the backend and can be seamlessly integrated into the running app.
So it’s a bit like server-side rendering but works differently.
Similar to what you know from Next.js with getInitialProps, server components can fetch data and pass it to front-end components.
However, unlike classic SSR, Server Components are a bit more dynamic. We can fetch a server tree during the app's execution; the client state is not lost.
They also work differently technically. With SSR, our JavaScript code is rendered into HTML on the server. This creates an HTML template, which is the visible part of our web page.
This is sent to the client, plus the JavaScript code used for interactivity. Thanks to SSR, we see something earlier, but the interactivity can be delayed.
Server components are dynamically included in the app and passed to the client in a special serialized form.
All JavaScript instructions are executed: 1 + 1 becomes 2, and the result is what gets passed along in this format. The components are static and cannot be interactive. Compared to SSR, only the visible part is passed — the interactivity is missing.
So what is the big advantage of Server Components?
Zero-Bundle-Size-Components
The JavaScript world is full of huge libraries. Just think of packages like Moment.js, which are many kilobytes in size, but of which we only use a few functions.
For the app's performance, and thus for the user, this is, of course, very bad — all the code is shipped to the front-end.
Tree-shaking can strip out code that we don’t need, but what remains is still a lot of code that is often executed only once. Formatting a date is a classic example.
Thanks to Server Components, we can spare our front-end this code as well.
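A rough sketch of the idea, using the .server.js naming convention from the experimental demo (the component and the data shape here are hypothetical):

    // Note.server.js: rendered only on the server, so moment never ships to the client
    import React from 'react';
    import moment from 'moment';

    export default function Note({note}) {
      return (
        <div>
          <h2>{note.title}</h2>
          <small>Edited {moment(note.updatedAt).fromNow()}</small>
        </div>
      );
    }

Because the component is rendered on the server, only its output reaches the browser; the heavy dependency contributes nothing to the client bundle.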
|
https://medium.com/javascript-in-plain-english/react-server-components-2cf9f8e82c1f
|
['Louis Petrik']
|
2020-12-22 11:09:29.408000+00:00
|
['Reactjs', 'React', 'JavaScript', 'Web Development', 'Nodejs']
|