For those who don’t want to read the new ‘Robert Galbraith’ serial killer tale
By Tucker Lieberman, September 16, 2020. Source: https://medium.com/books-are-our-superpower/robert-galbraith-serial-killer-jk-rowling-transphobic-5b79031cae6e
Why are people complaining about J. K. Rowling’s new book?
J. K. Rowling, the author of the Harry Potter series, has recently made a series of anti-transgender statements. I won’t repeat them here, but you can look them up if you like. They are on her website and her Twitter account.
As a result, a large number of transgender people and their allies have expressed a collective desire to stop consuming her work (or at least to avoid paying for it or otherwise to refrain from amplifying or encouraging her as an author). Many people have fond memories of Harry Potter, and they do not wish to give up this fantasy world that meant a great deal to them. I won’t start a fight about that; I was never a Harry Potter fan, so I cannot “give up” something that I never had, and I don’t feel I can tell others exactly how to part with something they care about. That process might be personal, and it might play out differently for different people. Nonetheless, I can make a general statement that readers should feel warmly and gently invited to avoid supporting a billionaire author who has recently chosen to use her enormous platform to denigrate transgender people.
While I never had much interest in J. K. Rowling’s work and have even less now, I had a reason to pick up her latest book, and I have a reason to tell you about it.
Troubled Blood was released yesterday (September 15, 2020) under J. K. Rowling’s pen name, Robert Galbraith. It’s the fifth novel in a series featuring the fictional detective Cormoran Strike. This instalment is 944 pages. The LGBT+ publication Pink News announced that it features “a cis male serial killer who dresses as a woman to kill his cis female victims.” (“Cis” is an abbreviation for “cisgender,” meaning “not transgender.”)
Here’s why I care. In 2018, I published Painting Dragons, an examination of the “eunuch villain” trope. A eunuch villain is not the same as a cross-dressing villain, but there may be some overlap in the Venn diagram; to put it another way, at least in the fantasy-land of metaphors, the concepts are adjacent. There is a broader problem of “queer villains,” and, here, I’m talking about the sort of queerness that has to do more with gender than with sexuality. Since I’ve positioned myself as a person who is knowledgeable about this literary trope, I feel it is my responsibility to weigh in on Troubled Blood.
I read the 944 pages on the day it came out. Because, as a transgender person myself, I have no motivation to preserve or heighten the suspense of this particular author’s book, this essay includes some detail from the end of the book. I don’t think of it as a spoiler. How can I spoil something that the transgender community has made a collective decision not to enjoy? It is, rather, simply information. I am informing.
If you don’t want to read it, but you’re curious what’s in it, let me tell you about it.
What happens in ‘Troubled Blood’
A serial killer, Dennis Creed, began his murder career in England in 1968 when he was in his early thirties. He rented a permanent room in a boarding house on Liverpool Road near Paradise Park. This is where he kept the women he abducted. Now 77 — the novel is set in 2014 — he has been in jails and psychiatric facilities for decades. Some unsolved crimes seem to point to him.
The detective finds photographs of Creed “at various ages, from pretty, curly-haired blond toddler all the way through to the police mugshot of a slender man with a weak, sensual mouth and large, square glasses.” In Chapter 53, we are told:
“Dennis Creed had been a meticulous planner, a genius of misdirection in his neat little white van, dressed in the pink coat he’d stolen from Vi Cooper, and sometimes wearing a wig that, from a distance, to a drunk victim, gave his hazy form a feminine appearance just long enough for his large hands to close over a gasping mouth.”
Having confused his victims — by drugging them or by appearing from a distance to be a trustworthy female — Creed drove them to the boarding house, chained them to the radiator in the basement, physically tortured them for months in especially sadistic ways, and eventually killed and dismembered them. He did this multiple times. Police knew there was an “Essex Butcher” but didn’t identify Dennis Creed until 1976.
Relying heavily on information in The Demon of Paradise Park, a 1985 true-crime book devoted to Creed’s crimes (which of course exists only within this novel), detectives reopen the 1974 disappearance of Dr. Margot Bamborough. No one had ever found her body or convincingly tied her murder to Creed. The detective goes to the psychiatric facility to interview Creed, who has a “working-class, East London accent” and, by then, a “triple chin.” He provides information about one of his prior victims. (At the end of the nearly thousand-page book: Yes, they find Bamborough’s body. No, her killer wasn’t Creed.)
Within the Dr. Bamborough case, there’s another theme of gender confusion: A mysterious patient at the clinic the day the doctor disappeared. The patient’s name had been written in the receptionist’s log simply as “Theo question mark.” One of the doctors remembered seeing this person and assumed Theo was a man. The receptionist insisted otherwise: “She was broad-shouldered, I noticed that when she came to the desk, but she was definitely a woman.” The detectives are interested in finding and questioning Theo, whoever he or she is, so they repeatedly bring up the mystery of Theo’s gender.
Oddly — remember, this case is reopened forty years later, in 2014 in the novel’s fictional world, and the novel is being published in 2020 — no character ever floats the idea that Theo might have been a transgender woman. In fact, the prefix “trans-”, applied to gender (as in “transgender,” “transsexual,” “transvestite,” or simply “trans”), appears nowhere in this book. Theo’s possible gender is never discussed in this light.
Nor is Creed’s crossdressing as part of his misogynistic violence ever analyzed psychologically.
So those are the relevant parts of the plot in J. K. Rowling’s new book.
Is it original?
No.
In 1937 in New York City, amid public panic about murderous rapists, the magazine Inside Detective warned that one killer on the run “MAY BE MASQUERADING IN FEMALE ATTIRE!” Place the emphasis on the “may be,” please, because he was not. (You can read more about this true case in Harold Schechter’s book The Mad Sculptor.)
Within fiction, you can find the “crossdressing killer” motif in Psycho (a 1959 novel, then a film), Dressed to Kill (a 1980 film), and Silence of the Lambs (a 1988 novel, then a film).
If you’d like more information on these cinematic tropes, I highly recommend the Netflix documentary Disclosure, released earlier this year.
Is it weird?
…that a woman author who claims to worry that transgender women threaten cisgender women’s safety should devote her career to (a) writing explicit scenes in which a man tortures women, not to inform readers but to entertain them, and that she should do so (b) under a masculine pen name?
Yes. That is very, very weird.
Is it transphobic?
Yes.
Considering the book’s sins in isolation — that is, if I had read the text without knowing who wrote it — I’d say its sins are relatively mild. Regarding the serial killer, Dennis Creed, the crossdressing element could have been explained better to make it more than just a replay of old horror movies. His crossdressing was just a deliberate ruse to lure his victims; nothing more is ever done with that information. Regarding the mysterious visitor to the doctor’s office, Theo, the detectives make themselves look foolish in failing to consider the possibility that Theo is transgender. There’s no reason to exclude the word “transgender” from the novel. The author was not struggling to fit a word limit. The topic could have been better addressed: distinguish concepts of clothing and identity, acknowledge that transgender people generally aren’t violent, and exculpate queer people at the end. Add another page or ten and deal with it. If I didn’t know who the author was, I’d say these were sins of omission and of ignorance.
But the book can’t be considered in isolation from its author. This is a billionaire author, famous for writing Harry Potter, who has, just within the past year, assumed the mantle of anti-transgender rants. She absolutely knew what she was doing in this novel. Her framing is intentional. She wants to scare people about transgender women, not only in fiction, but also in real life. We know from life context that this book is serving a larger agenda.
Nick Cohen, a columnist for the Spectator who read an early review copy of Troubled Blood, wrote on the book’s release day that “transvestism barely features. When it does, nothing is made of the fact that the killer wears a wig and a woman’s coat…” But that’s exactly the problem. Why does Rowling mention it at all, if she intends to make “nothing” of it? Especially when she’s been criticized all year for expressing her anti-transgender viewpoints? If she cared how this was received and interpreted, she could have made a bigger effort. If anything, I imagine she is happy to leverage this novel to deliberately capitalize on the publicity she gets from repeatedly offending transgender people.
Readers are taught to consider a work as a whole and not complain excessively about a relatively tiny detail. Teachers want us to demonstrate that we’ve actually read the entire book; meanwhile, living authors and marketers generally plead with readers to “be fair” (and, ideally, generous) to their personality and product. But I am not in school anymore, nor do I have a motivation to be generous to a billionaire whose new brand is slandering my community.
I wonder if J. K. Rowling wrote 944 pages with the intention of minimizing the passage about crossdressing so that her defenders can object that her book, as a whole, isn’t about that.
They would be correct; the book, as a whole, is not about the villain putting on a wig.
But part of the book is about that.
The transphobic part.
UX … it’s more than just graphic design
By Elisabeth Mccumber, February 20, 2018. Source: https://medium.com/madison-ave-collective/ux-its-more-than-just-graphic-design-45e894517fbc
Every specialist involved in designing UX is a UX designer.
A content strategist sits at a table with a stack of user personas, drawing bubbles on a page, mapping the information a web user is going to need, how they’re going to use it, and in what order.
A graphic designer stands at their desk, drafting content blocks on a wireframe, anticipating the needs of the user who will be visiting that page.
An interaction designer sits in traffic on their way home, thinking about what a button should do when the user clicks it, and what type of user action should make the email signup form unfold before them on the page.
All these specialists are helping to design a user experience. And while it’s easy to suppose that “design” is simply a shorthand for “graphic design,” in the case of UX, it’s so much more.
What’s user experience?
Well — in addition to being a buzzword — UX is also “an important, strategically relevant, multidisciplinary field for solving very difficult problems at the intersection of technology, spaces, the Internet and people.” (So says Trip O’Dell, product design manager at Amazon.)
Literally defined, UX is a person’s perceptions and responses from the use of a product, system, or service. That’s how the International Organization for Standardization puts it.
In less stuffy speech, user experience is “how you feel about every interaction you have with what’s in front of you in the moment you’re using it.” User Testing Blog followed that latter definition with several worthy questions:
Can you use it?
Can you find it?
Does it serve a need that you have?
Do you want to use it?
Do you find it valuable?
Do you trust it?
Is it accessible to you?
These questions comprise a good litmus test for UX on the Internet. When creating a website, you’re aiming for “yes” all the way down the line.
What’s UX design?
According to Wikipedia, user experience design is “the process of enhancing user satisfaction with a product by improving the usability, accessibility, and pleasure provided in the interaction with the product.”
Make it fantastic, in other words. Do everything you can to wow your user on all those litmus questions above.
Which brings us to the point
UX design is not a synonym for graphic design for the web. While it’s easy to assume that hey, design is design, these animals are pretty different. And in this case, that difference is pretty crucial. A graphic designer plays an important role in UX design — but there are other roles, no less important.
Truth: you can’t have peerless UX without a disarmingly attractive, elegantly simple, self-explanatory visual design.
Equally true, you can’t have great UX without an architecture that’s sensitive to a user’s needs, structuring information in a logical, comprehensible way.
Or without page layouts (wireframes) that offer the right content, in the right place, so intuitively that a user doesn’t even have to think about what they came for, because they’re already doing it.
Or without on-point messaging that appeals to the user’s immediate practical priorities and underlying emotional needs in a deeply compelling way.
Or without on-page elements — breadcrumbs, for example — that support the experience by making the website effortlessly navigable.
Or user testing to catch hangups and refine the design. Or best-practice web development to put the site on its feet and get it rolling.
In short, there’s a difference between designing a visual user interface (UI design) and designing every aspect of a multi-dimensional experience (UX design). Here’s Kyla Tom, lead graphic designer at Madison Ave. Collective, on the big picture:
“Web design … requires content development from individuals with editorial expertise, a graphic designer to really dig into the final UI design and create iconography, an interaction designer who knows exactly how smooth actions and transitions need to be, and a back-end as well as front-end developer to maintain the site and bring everything to life on screen.”
In short, UX is teamwork.
UX designers come in many shapes
Because UX design isn’t the sole purview of any one individual — some mythical being who’s able to handle it all solo — it’s worth thinking about the various specialists involved in designing an excellent user experience, and acknowledging their role as UX designers. The magic happens at the intersection of several very different, very vital skill sets:
Information architecture
Content strategy
Wireframing
Graphic design
Copywriting
User interaction
Web development
And it’s more than the sum of the parts. UX is strategic. It’s iterative. It’s multidisciplinary. UX design is what happens when content, graphic design, and development click. It’s the satisfaction you feel when, as a user, you land on a website and your needs are answered before you even have to ask.
That’s no buzzword. That’s an ideal worth striving for.
So, when asked what UX design is, don’t fill in the bubble next to “graphic design for the web.” Remember that on the multiple choice test, the correct answer for UX is: all of the above.
Use Entities For Watson Assistant Node Conditions
By Leo Mazzoli, February 5, 2020. Source: https://medium.com/ibm-data-ai/use-entities-for-watson-assistant-node-conditions-4cc33b2f25ba
When evaluating “non-intent” user responses in Watson Assistant (WA), try to use entities instead of evaluating the contents of “input.text”. Entities are both reusable and not case sensitive, meaning you will get cleaner code.
Using “input.text” in WA is a great way to capture and save the input into a context variable for later use or for determining the length of what was said, but for dialog node conditions, it can directly short-cut some of WA’s capabilities and can become a maintenance nightmare.
For example, let’s say the user is asked “Would you like to receive your statement by mail or fax?”. If you use “input.text” to test the user response for the value of “mail” you will miss common variations.
The condition (input.text == "Mail") || (input.text == "mail") doesn’t capture all case variations
The condition (input.text.toLowerCase() == "mail") is better for case sensitivity, but would not handle the situation where the utterance is something like “send it by mail”
In speech applications, mis-transcriptions (homonyms) are possible. Neither of the cases above would work if the utterance came to WA as “male”
Avoid these issues by setting up entities to capture key items in the utterance and configuring the node conditions to look for those entities.
Create Entity
WA Entities — My entities
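Since the screenshot isn’t reproduced here, the entity behind it might be configured roughly as follows (the values and synonyms are illustrative assumptions; note that “male” is included as a synonym so the speech mis-transcription mentioned above still matches):

Entity: @deliveryPreference
Value: mail (synonyms: male, post, by mail)
Value: fax (synonyms: facsimile, by fax)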
Configure Dialog
WA Dialog
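Likewise, as an illustrative sketch of the dialog nodes in that screenshot, each node condition tests for an entity value rather than raw text:

If assistant recognizes: @deliveryPreference:mail → respond with the mail flow
If assistant recognizes: @deliveryPreference:fax → respond with the fax flow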
Using @deliveryPreference:mail for the condition…
Tests for the occurrence of “mail” in the utterance
Captures all synonyms of “mail” configured for the entity (ie “male”)
Is case insensitive
Ignores any additional words in the utterance
There are many uses for “input.text” and powerful string methods available that can be used to evaluate the object. However, when configuring node conditions, it’s good practice to try to use entities to simplify and organize your WA design.
We have built a workspace analyzer that detects “input.text” conditions at https://github.com/cognitive-catalyst/WA-Testing-Tool/. Download the tool and navigate to the ‘validate_workspace’ section. This will help you quickly discover these conditions and others that you may wish to improve.
Find more Watson Assistant Best Practices at https://medium.com/ibm-watson/best-practices-for-building-and-maintaining-a-chatbot-a8b78f0b1b72. For help implementing these practices, reach out to IBM Data and AI Expert Labs and Learning.
Handling asynchronous errors in Scala at Hootsuite
By Brian Pak, August 9, 2018. Source: https://medium.com/hootsuite-engineering/handling-asynchronous-errors-in-scala-at-hootsuite-935f3d0461af
Introduction
Every day Hootsuite makes hundreds of thousands of API calls and processes millions of events that happen on various social networks. Our microservice architecture and a handful of asynchronous servers with efficient error handling make this possible. Let’s take a look at how the Scala servers deal with errors.
Different types of error handling in Scala
First, let’s see what kinds of error handling mechanisms exist in Scala.
Exceptions
Unlike Java, all exceptions in Scala are unchecked. We need to write a partial function in order to catch one explicitly. It is important to make sure that we are catching what we want to catch. For example, use scala.util.control.NonFatal to catch the normal errors only.
// Example code
import scala.util.control.NonFatal

try {
  dbQuery()
} catch {
  case NonFatal(e) => handleErrors(e)
}
If we replace NonFatal(e) with _, the block will catch every single exception, including JVM errors such as java.lang.OutOfMemoryError.
Options
Programming in Java often produces abuse of null to represent an absent optional value, and it has led to many nasty NullPointerExceptions. Scala offers a container type named Option to get rid of the usage of null. An Option[T] instance may or may not contain an instance of T. If an Option[T] object contains a present value of T, then it is a Some[T] instance. If the value is absent, then it is the None object.
// Example code
val maybeProfileId: Option[String] = request.body.profileId

maybeProfileId match {
  case None => MissingArgumentsError("ProfileId is required")
  case Some(profileId) => updateProfileId(profileId)
}
Note that Some(null) is still possible in Scala and it is potentially a very nasty bug. When we have code that returns null, it is best to wrap it in Option().
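For example (the Java service and its findUser method are hypothetical stand-ins for any null-returning API):

// Option(...) converts a null result to None, whereas Some(...) would happily wrap the null
val maybeUser: Option[User] = Option(javaUserService.findUser(id))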
Try
Unlike Option, Try can be used to handle specific exceptions more explicitly. Try[T] represents a computation that may result in a wrapped value of type T (a Success[T]) when it succeeds, or a wrapped Throwable (a Failure[T]) when it fails. If you know that a computation may result in an error, you can simply use Try[T] as the return type of the function. This allows the clients of the function to explicitly deal with the possibility of an error.
// Example code
import scala.util.{Try, Success, Failure}

Try(portString.toInt) match {
  case Success(port) => new ServerAddress(hostName, port)
  case Failure(_) => throw new ParseException(portString)
}
Either
Either is the more complicated but better way to handle errors; we can create a custom Algebraic Data Type to structure and maintain the exceptions. Either takes two type parameters; an Either[L, R] instance can contain either an instance of L or an instance of R. The Either type has two sub-types, Left and Right. If an Either [L, R] object contains an instance of L, then it is a Left[L] instance and vice versa. For error handling, Left is used to represent failure and Right is used to represent success by convention. It’s perfect for dealing with expected external failures such as parsing or validation.
// Example code
trait ApiError {
  val message: String
}

object ApiError {
  case object MissingProfileError extends ApiError {
    override val message: String = "Missing profile"
  }
}

def getProfileResult(
  response: Either[ApiError, ProfileResponse]
): Result =
  response match {
    case Right(profileResponse) =>
      Ok(Json.toJson(profileResponse))
    case Left(ApiError.MissingProfileError) =>
      NotFound(ApiError.MissingProfileError.message)
  }
Asynchronous usage
We have looked at a variety of methods for handling errors, but how are they used in multi-threaded environments?
Future with failure
Scala has another container type called Future[T], representing a computation that is supposed to complete and return a value of type T eventually. If the process fails or times out, the Future will contain an error instead.
// Example code
val hasPermission: Future[Boolean] = permission match {
  case "canManageGroup" => memberId match {
    case Some(memberId) => canManageGroup(memberId)
    case _ => Future.failed(BadRequestException(MissingParams))
  }
}
Future without failure
If we review the example code above, one improvement we can make is to not raise an exception for a missing argument. To handle the error in a more controlled, self contained way, we can combine the usage of Future and Either.
// Example code
import cats.syntax.either._

val hasPermission: Future[Either[PermissionError, Boolean]] =
  permission match {
    case "canManageGroup" => memberId match {
      // assumes an implicit ExecutionContext is in scope and that
      // BadRequest builds a PermissionError
      case Some(memberId) => canManageGroup(memberId).map(_.asRight)
      case _ => Future.successful(BadRequest(MissingParams).asLeft)
    }
  }
Simplify Future[Either[L, R]] with Cats EitherT
While it is a good practice to handle errors or perform validation asynchronously using Future and Either, adding chains of operations such as (flat)mapping and pattern matching on the containers can require a lot of boilerplate. EitherT can be used to remove the hassle. EitherT[F[_], A, B] is a lightweight wrapper for F[Either[A, B]]. In our case, Future[Either[L, R]] can be transformed into EitherT[Future, L, R] which gets rid of the extra layer between Future and Either.
// Example code
def updateFirstName(name: String):
  Future[Either[DataError, UpdateResult]] = ???

def updateLastName(name: String):
  Future[Either[DataError, UpdateResult]] = ???

def updateFirstAndLastName(firstName: String, lastName: String):
  Future[Either[DataError, UpdateResult]] =
  updateFirstName(firstName) flatMap { firstEither =>
    updateLastName(lastName) flatMap { secondEither =>
      (firstEither, secondEither) match {
        case (Right(res1), Right(res2)) => sumResult(res1, res2)
        case (Left(err), _) => Future.successful(Left(err))
        case (_, Left(err)) => Future.successful(Left(err))
      }
    }
  }
The function can be re-written using EitherT as:
// Example code
import cats.data.EitherT
import cats.implicits._

def updateFirstAndLastName(firstName: String, lastName: String):
  EitherT[Future, DataError, UpdateResult] =
  for {
    firstRes <- EitherT(updateFirstName(firstName))
    lastRes <- EitherT(updateLastName(lastName))
    result <- EitherT(aggregateResult(firstRes, lastRes))
  } yield result
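At the edge of the program, unwrap the transformer with .value to get back the plain Future[Either[...]] (the argument values here are just illustrative):

val result: Future[Either[DataError, UpdateResult]] =
  updateFirstAndLastName("Ada", "Lovelace").value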
Conclusion
Most Scala services at Hootsuite use all of the error handling patterns mentioned above in appropriate situations. Either is widely used to gracefully control business errors, Try filters expected failures more explicitly, and Option is seen in a lot of places where the value can be absent. The combination of Future and Either is definitely the most prominent, but this can make the code quite noisy due to double wrapping of objects. This problem is solved by adopting EitherT, the monad transformer from the Cats library. It allows us to create clean and readable but powerful asynchronous code.
Easy Text Annotation in a Jupyter Notebook
By Siphu Langeni, October 10, 2020. Source: https://towardsdatascience.com/tortus-e4002d95134b
How to use tortus annotation tool
Image by author
At the heart of any sentiment analysis project is a good set of labeled data. Pre-labeled datasets can be found on various sites all over the internet. But…
What if you have come up with a custom dataset that has no labels?
What if you have to provide those labels before proceeding with your project?
What if you are not willing to pay to outsource the task of labeling?
I was recently faced with this very issue while retrieving text data from the Twitter Streaming API for a sentiment analysis project. I quickly discovered annotating the data myself would be a painful task without a good tool. This was the inspiration behind building tortus, a tool that makes it easy to label your text data within a Jupyter Notebook!
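As a rough sketch of how the tool is used inside a notebook (the constructor signature and attribute names here are assumptions, so check the tortus README before relying on them):

# pip install tortus
import pandas as pd
from tortus import Tortus

df = pd.read_csv('tweets.csv')               # any DataFrame with a text column
tortus = Tortus(df, 'text', num_records=10)  # assumed constructor signature
tortus.annotate()                            # renders the labeling widget in the notebook
labeled = tortus.annotations                 # assumed attribute holding the labeled rows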
Investigate and solve Compute Engine cold starts like a detective🕵🏽♀️
By Stephanie Wong, September 14, 2020. Source: https://medium.com/google-cloud/investigate-and-solve-compute-engine-cold-starts-like-a-detective-%EF%B8%8F-66a03736cb03
Season of Scale
“Season of Scale” is a blog and video series to help enterprises and developers build scale and resilience into your design patterns. In this series we plan on walking you through some patterns and practices for creating apps that are resilient and scalable, two essential goals of many modern architecture exercises.
In Season 2, we’re covering how to optimize your applications to improve instance startup time! If you haven’t seen Season 1, check it out here.
1. How to improve Compute Engine startup times (this article)
2. How to improve App Engine startup times
3. How to improve Cloud Run startup times
Shaving seconds off compute startup times might take a bit of detective work. How do you know if the issue lies within request, provision, or boot phases? In this article, we hone in on profiling Compute instances. I’ll explain how to pinpoint whether provisioning, scripts, or images contribute to slower instance startup times.
Review
So far we have looked at Critter Junction, a multiplayer online game where players follow life simulation as a critter. They’ve successfully launched and globally scaled their gaming app on Compute Engine. With their growing daily active users, we helped them set up autoscaling, global load balancing, and autohealing to handle globally distributed and constantly rising traffic.
Cold start time woes
But Critter Junction has been seeing longer-than-wanted startup times for their Compute Engine instances, even though they set everything according to our autoscaling recommendations. They knew they were running some logic on their game servers on Compute Engine, like taking user inputs to spawn them onto a new critter’s island. After profiling their startup times, they were seeing cold starts of more than 380 seconds, while the response latency for a request was in the 300-millisecond range.
They also did a performance test to see how long Compute Engine was taking to create their instances versus how much time their code was taking to run.
Right from Cloud Shell, it showed:
Request, Provision, Boot
Request is the time between asking for a VM and getting a response back from the Create Instance API acknowledging that you’ve asked for it. You can profile this by timing how long it takes Google Cloud to respond to the Insert Instance REST command.
Provision is the time Compute Engine takes to find space for the VM on its architecture. Use the Get Instance API on a regular basis and wait for the status flag to change from provisioning to running.
Boot time is when startup scripts and other custom code executes up to the point when the instance is available. Just repeatedly poll a health check that is served by the same runtime as your app. Then time the change between receiving 500, 400 and 200 status codes.
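As a sketch of what those measurements could look like from Cloud Shell (the instance name, zone, and health-check URL are placeholders, not from the original article):

# Request: time how long the API takes to acknowledge the create call
time gcloud compute instances create profiling-test --zone=us-central1-a --async

# Provision: poll until the status flag flips from PROVISIONING to RUNNING
while [[ "$(gcloud compute instances describe profiling-test \
  --zone=us-central1-a --format='value(status)')" != "RUNNING" ]]; do
  sleep 1
done

# Boot: poll the app's health check until it returns a 200
until [[ "$(curl -s -o /dev/null -w '%{http_code}' "http://INSTANCE_IP/health")" == "200" ]]; do
  sleep 1
done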
After doing these, Critter Junction noticed the majority of instance startup time usually happened during the boot phase, when the instance executes startup scripts. This is not uncommon, so you should profile your boot scripts to see which phases are creating performance bottlenecks.
Introducing the Seconds Variable
To get a sense of what stages of your script are taking the most boot time, one trick is to wrap each section of your startup script with a command that utilizes the SECONDS variable, then append the time elapsed for each stage to a file, and set up a new endpoint to serve that file when requested.
SECONDS=0
# do some work
duration=$SECONDS
echo "$(($duration / 60)) minutes and $(($duration % 60)) seconds elapsed."
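Applied to a real startup script, the pattern might look like this (the stage functions and log path are illustrative):

log=/var/log/startup-timing.log

SECONDS=0
install_packages                                # e.g. apt-get install ...
echo "install_packages: ${SECONDS}s" >> "$log"

SECONDS=0
initialize_game_server                          # custom init logic
echo "initialize_game_server: ${SECONDS}s" >> "$log"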
This let Critter Junction dig even deeper to poll the endpoint and get data back without too much heavy lifting or modification to their service.
And there it was!
An example graph generated by timing the startup phases of the instance. Notice that the graph on the right is in sub-second scale.
The performance bottleneck seemed to be public images — preconfigured combinations of the OS and bootloaders. These images are great when you want to get up and running, but as you start building production-level systems, the large portion of bootup time is no longer booting the OS, but the user-executed startup sequence that grabs packages and binaries, and initializes them.
Use custom images
Critter Junction was able to address this by creating custom images for their instances, which you can create from source disks, images, snapshots, or images stored in Cloud Storage, and then use to create VM instances.
Custom images list
When the target instance is booted, the image information is copied right to the hard drive. This is great when you’ve created and modified a root persistent disk to a certain state and want to save that state to reuse it with new instances, and when your setup includes installing (and compiling) big libraries, or pieces of software.
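As a quick sketch, baking a custom image from a prepared disk might look like this (the resource names are placeholders):

gcloud compute images create critter-game-image \
  --source-disk=game-server-disk \
  --source-disk-zone=us-central1-a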
Armed and ready
When you’re trying to scale to millions of requests per second, being serviced by thousands of instances, a small change in boot time can make a big difference in costs, response time and most importantly, the perception of performance by your users. Stay tuned for what’s next for Critter Junction.
And remember, always be architecting.
Design interval training
Sprint
In 2016 the tech world seemed abuzz about a “Design Sprint” process detailed in the book Sprint by Jake Knapp and other Google Ventures employees. The premise was that a new area of opportunity for product could be tackled by a dedicated small team in just five working days, at least conceptually. The authors had honed the process during real company visits.
Plenty of readers latched on to the idea and tried it at their own companies. On the flipside, there was a backlash, with some feeling like a thoughtful ideation phase could not possibly be completed in a week, or that blocking out several busy calendars for a week was just not feasible.
Somewhere in-between, I feel there are emphases from the Sprint process that are important to cover, even if they are not all possible within the same week. By tackling these key areas intermittently over a longer period of time, you can cover some vital stages of the ideation process with more room to think and fit in other demands. Tackling these steps in a more protracted way can mean you get all the bases covered at a more manageable pace.
Focus on the Problem
The start of the Sprint week is spent learning all you can about the problem space being tackled, something which is all too commonly skipped over in the rush to get things done. “Ask the experts,” the Sprint book advises; get a thorough understanding of the subject matter at hand. Getting deep into the problem is an important fundamental to tackle before working out any solutions. Talk to those affected, particularly those suffering from the problem, to learn more. You do not necessarily need to schedule all your interviews in one afternoon though. Take your time to dwell on the problem; who is affected and why does it matter to them? When does it impact their lives? How widespread is the problem? Who is not affected? Your understanding of the problem will inform all your later work, so you need to make sure that understanding is as complete as possible before you move on to solutions…
More is more
Being set up with a thorough understanding of the problem at hand, your team is then in a great position to work on solving it. There is usually more than one solution to a problem, and you do not always get things right the first time; continuous improvement and failing fast are popular tech concepts. Sprint advocates aiming for multiple solutions right away; iterating on sketches until you come up with the most compelling variant. Try spending more time in-between to allow your ideas to marinate, and one day whilst tackling something else you may get that brain-wave that adds the missing piece you’ve been looking for.
Prototype
Waiting until engineering resources can be scheduled to undertake your new idea can be costly in terms of both time and money. Prototyping allows you to validate the way ideas fit together, both in your own mind and others, without the heavy upfront costs. Linking together a few mock-ups without code or a back-end allows viewers to visualise your product before you set about building it. Coming up with a believable prototype in one day when sprinting can be a challenge though, and is certainly rushed. Spending a bit longer allows you to cover up the seams and ensure that your prototype does a convincing job. Then you can use the prototype for validating the idea in question, rather than answering questions about whether the finished version would be full colour.
Try it out
The earlier you can get feedback, the earlier you can learn. The earlier you can learn, the earlier you can shape your solution to be more in-line with what’s required, or learn from your mistakes and start over. It’s like the virtuous feedback loop detailed in the well-known Lean Startup book by Eric Ries, except starting it before you build puts you even further upstream in terms of learning. That’s why Sprint crams Friday feedback into that week as the final key requirement; without the knowledge gained from the feedback sessions, you just have nicely presented assumptions, and risk proceeding towards outputs over outcomes.
Commencing the feedback gathering promptly but spreading it over a slightly longer period, say one session every day or two, can be more practical though. That way you do not have to try and slot in five people to help you out in one day; you can accommodate their schedules. You can spend more time discussing the area at hand with your participants both before and after the prototype test to get deeper into the issues without hurrying them through your script and revolving door. If something goes wrong, as it always can, and you discover a gaping prototype flaw during the first session that threatens to compromise all your feedback, you can realistically correct it for the later tests.
That isn’t to say you need more than five tests to see the trends though; the Sprint book very helpfully references a study showing the diminishing returns you get after the fifth session, and other similar studies reinforce the finding. It just means you can spend more time gathering, organising and analysing your feedback before taking action, soaking up all the value your in-person tests will give.
More haste, less speed
I would encourage you to read the Sprint book in detail for great insights in the areas mentioned. By all means try a Design Sprint at your company if you feel you can be accommodating enough to make it work. Life can get in the way though, so rather than taking an all-or-nothing approach, why not deconstruct the Design Sprint and tackle the key stages along a timeline that works for your situation? That may be a more realistic way for you to cover the important analysis steps that can put your product on the right track. | https://medium.com/the-daily-standup/design-interval-training-50b09bf8bd25 | ['Mark Jones'] | 2017-10-06 02:57:07.589000+00:00 | ['Product Design', 'Design Sprint', 'Design', 'User Experience', 'Product Management'] | Title Design interval trainingContent Sprint 2016 tech world seemed abuzz “Design Sprint” process detailed book Sprint Jake Knapp Google Ventures employee premise new area opportunity product could tackled dedicated small team five working day least conceptually author honed process real company visit Plenty reader latched idea tried company flipside backlash feeling like thoughtful ideation phase could possibly completed week blocking several busy calendar week feasible Somewhere inbetween feel emphasis Sprint process important cover even possible within week tackling key area intermittently longer period time cover vital stage ideation process room think fit demand Tackling step protracted way mean get base covered manageable pace Focus Problem start Sprint week spent learning problem space tackled something commonly skipped rush get thing done “Ask experts” Sprint book advises get thorough understanding subject matter hand Getting deep problem important fundamental tackle working solution Talk affected particularly suffering problem learn necessarily need schedule interview one afternoon though Take time dwell problem affected matter impact life widespread problem affected understanding problem inform later work need make sure problem complete possible move solutions… set thorough understanding problem hand team great position work solving usually one solution problem always get thing right first time continuous improvement failing fast popular tech concept Sprint advocate aiming multiple solution right away iterating sketch come compelling variant Try spending time inbetween allow idea marinate one day whilst tackling something else may get brainwave add missing piece you’ve looking Prototype Waiting engineering resource scheduled undertake new idea costly term time money Prototyping allows validate way idea fit together mind others without heavy upfront cost Linking together mockups without code backend allows viewer visualise product set building Coming believable prototype one day sprinting challenge though certainly rushed Spending bit longer allows cover seam ensure prototype convincing job use prototype validating idea question rather answering question whether finished version would full colour Try earlier get feedback earlier learn earlier learn earlier shape solution inline what’s required learn mistake start It’s like virtuous feedback loop detailed wellknown Lean Startup book Eric Ries except starting build put even upstream term learning That’s Sprint crams Friday feedback week final key requirement without knowledge gained feedback session nicely presented assumption risk proceeding towards output outcome Commencing feedback gathering promptly spreading slightly longer period say one session every day two practical though way try slot five 
people help one day accommodate schedule spend time discussing area hand participant prototype test get deeper issue without hurrying script revolving door something go wrong always discover gaping prototype flaw first session threatens compromise feedback realistically correct later test isn’t say need five test see trend though Sprint book helpfully reference study showing diminishing return get fifth session similar study reinforce finding mean spend time gathering organising analysing feedback taking action soaking value inperson test give haste le speed would encourage read Sprint book detail great insight area mentioned mean try Design Sprint company feel accommodating enough make work Life get way though rather taking allornothing approach deconstruct Design Sprint tackle key stage along timeline work situation may realistic way cover important analysis step put product right trackTags Product Design Design Sprint Design User Experience Product Management |
4,107 | 6 Tips For Junior Devs, After 10,000h Of Engineering Software | Since the beginning of my career as a Software Dev in early 2018, I constantly sought growth and challenge in my day-to-day.
I have attended a multitude of tech conferences, networked with dozens of world-class engineers, and consumed hundreds of hours of tech-related content in my spare time.
On top of that, I have graduated with a Software Engineering degree and wrote a couple of successful Medium articles. I’ve built many side projects, contributed to open source libs, and even tried building a bunch of tech startups.
None of that would have taught me anything about how to be a developer, had I not learned it the hard way on the job, working with living software and real people.
By now, I’ve worked in multiple companies (been fired from one too), working different parts of the technology stack. I’ve done QA automation, Back-end, Front-end, Ops, DevOps, and now ended up as a Junior Site Reliability Engineer.
Some say that my experience is way above the Junior position.
Yet, I still proudly hold on to this title, believing that a title is worth as much as one’s professional maturity. | https://vyrwu.medium.com/5-tips-for-junior-devs-from-over-10-000h-of-software-engineering-9aad682f6468 | ['Aleksander', 'Vyrwu'] | 2020-12-21 00:54:22.572000+00:00 | ['Advice', 'Programming', 'Software Development', 'Engineering', 'Junior'] | Title 6 Tips Junior Devs 10000h Engineering SoftwareContent Since beginning career Software Dev early 2018 constantly sought growth challenge daytoday attended multitude tech conference networked dozen worldclass engineer consumed hundred hour techrelated content spare time top graduated Software Engineering degree wrote couple successful Medium article I’ve built many side project contributed open source libs even tried building bunch tech startup Nothing would taught anything developer learned hard way job working living software real people I’ve worked multiple company fired one working different part technology stack I’ve done QA automation Backend Frontend Ops DevOps ended Junior Site Reliability Engineer say experience way Junior position Yet still proudly hold title believing title worth much one’s professional maturityTags Advice Programming Software Development Engineering Junior |
4,108 | Ocean Waves (Sinusoidal) Regression | Definition :
A sine wave or sinusoidal wave is a mathematical curve that describes a smooth periodic oscillation. A sine wave is a continuous wave; one complete cycle runs from 0 to 360 degrees.
Table Representing Sine Values
Generation of Sine Wave
The sinusoidal function is given by Y = A*Sin(B(X + C)) + D, where A is the amplitude, B determines the period (Period = 2π/B), C is the phase shift, and D is the vertical shift:
Sine Function Formula
The period of the sine curve is the length of one cycle of the curve. The natural period of the sine curve is 2π, so a coefficient of B=1 is equivalent to a period of 2π. To get the period of the sine curve for any coefficient B, just divide 2π by B.
Real Life Application Of Sine Function :
(1) Generation of music waves.
(2) Sound travels in waves.
(3) Trigonometric functions in constructions.
(4) Used in space flights.
(5) GPS location calculations.
(6) Architecture.
(7) Electrical current.
(8) Radio broadcasting.
(9) Low and high tides of the ocean.
(10) Buildings.
Now, I’m going to show you different kinds of sine waves that can be generated by modifying the function’s parameters. My ultimate goal is to show you how changing the parameters affects the shape of the graph. After that, I’m going to take an example that shows how we can implement sinusoidal regression in Python.
First of all, we are going to have a look at different graphs of sine waves by modifying the parameter values.
Why are we going to do this?
As we know, data visualization has a major role in data science. While working with data (regression) we need to find the best fit curve for it. For that, our function will have several parameters. Now, if we don’t know what happens when we change these parameters, then it’s going to be a cumbersome journey, right? So here we’ll take examples to understand what happens when we change the parameter values.
How we should understand it?
We will take our main sine function and then modify the parameter values; for each variation there will be a graph to visualize it. What I want you to do is take a pen and paper and try to plot the sine graph while going through the examples. I think that’ll help you understand better.
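If you would rather check your hand-drawn sketches against the computer, here is a minimal plotting helper you can reuse for every example below (the function name plot_sine and the plotting range are my own choices for illustration, not from the original figures):

import numpy as np
import matplotlib.pyplot as plt

def plot_sine(A, B, C, D):
    # Y = A*Sin(B(X + C)) + D
    X = np.linspace(-2*np.pi, 2*np.pi, 500)
    Y = A*np.sin(B*(X + C)) + D
    print("Period =", 2*np.pi/B) # follows the article's Period = 2*pi/B convention
    plt.plot(X, Y)
    plt.axhline(0, color="gray", linewidth=0.5)
    plt.title("Y = {}*Sin({}(X+{}))+{}".format(A, B, C, D))
    plt.show()

plot_sine(1, 1, 0, 0) # Example 1 : Y = SinX
plot_sine(2, 1, 0, 0) # Example 2 : Y = 2SinX

Change A, B, C, and D to match each example and compare the output with the figures.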
Let’s have a look at different sine graphs! ☀️
Example : 1
Y = 1*Sin(1(X+0))+0
Y = SinX
A = 1
B = 1
C = 0
D = 0
Period = 2*pi/1 = 2*pi
Y = SinX
Here we can see that the sine wave has an amplitude of 1, and one cycle of the wave runs from 0 to 2pi.
Example 2 :
Y = 2*Sin(1(X+0))+0
Y = 2SinX
A = 2
B = 1
C = 0
D = 0
Period = 2*pi/1 = 2*pi
Y = 2SinX
Here we can see that the sine wave has an amplitude of 2. As we can see, the larger amplitude increases the height of our sine wave. One cycle of the wave still runs from 0 to 2pi.
Example 3 :
Y = 1*Sin(2(X+0))+0
Y = Sin2X
A = 1
B = 2
C = 0
D = 0
Period = 2*pi/2 = pi
Y = Sin2X
Here we can see that the sine wave has an amplitude of 1, and one cycle of the wave runs from 0 to pi.
Example 4:
Y = 2*Sin(2(X+0))+0
Y = 2Sin2X
A = 2
B = 2
C = 0
D = 0
Period = 2*pi/2 = pi
Y = 2Sin2X
Here we can see that the sine wave has an amplitude of 2, and one cycle of the wave runs from 0 to pi. As we can see from the graph, the larger amplitude has increased the height of our wave, and one cycle completes at pi.
Example 5:
Y = 2*Sin(1(X+1))+0
Y = 2Sin(X+1)
A = 2
B = 1
C = 1 (Shift to Left)
D =0
Period = 2*pi/1 = 2*pi
Y = 2Sin(X+1)
Here we have shifted our curve to the left by 1. We took the amplitude value as 2. Notice that here we have a period of 2*pi. That means one cycle has a length of 2*pi. Since we have shifted it to the left by one unit, the first cycle will be shifted 1 unit to the left from 2pi.
Example 6 :
Y = 2*Sin(1(X-1))+0
Y = 2Sin(X-1)
A = 2
B = 1
C = -1 (Shift to Right)
Period = 2*pi/1 = 2*pi
Y = 2Sin(X-1)
Here we have shifted our curve to the right by 1. We took the amplitude value as 2. Notice that here we have a period of 2*pi. That means one cycle has a length of 2*pi. Since we have shifted it to the right by one unit, the first cycle will be shifted 1 unit to the right from 2pi.
Example 7:
Y = 1*Sin(1(X+0))+2
Y = SinX +2
A = 1
B = 1
C = 0
D =2
Period = 2*pi/1 = 2*pi
Y = SinX +2
Here notice that we have shifted our curve 2 points on the positive y-axis. The amplitude of the curve is 1. The period as you can see is also 2*pi.
Example 8 :
Y = 1*Sin(1(X+0)) — 2
Y = SinX — 2
A = 1
B = 1
C =0
D =-2
Period = 2*pi/1 = 2*pi
Y = SinX — 2
Here notice that we have shifted our curve 2 points on the negative y-axis. The amplitude of the curve is 1. The period as you can see is also 2*pi.
Example 9:
Y = -1*Sin(1(X+0))+0
Y = -SinX
A = -1
B = 1
C = 0
D =0
Period = 2*pi/1 = 2*pi
Y = -SinX
Here we have changed the amplitude value to -1. From the illustration above, we can see that our graph is inverted compared to the version with an amplitude of 1: the curve is reflected about the x-axis, so it heads toward the negative y-axis first.
Example 10 :
Y = -2*Sin(1(X+0))+0
Y = -2SinX
A = -2
B = 1
C = 0
D = 0
Period = 2*pi/1 = 2*pi
Y = -2SinX
Here we are going to set the value of amplitude to -2. So it’s just like our previous graph, but the height of the sine curve is increased. Also notice that the period of the sine curve is 2*pi.
Example 11 :
Y = -2*Sin(1(X-1))+0
Y = -2Sin(X-1)
A = -2
B = 1
C = -1
D = 0
Period = 2*pi/1 = 2*pi
Y = -2Sin(X-1)
Here we have shifted the curve to the right by 1 point and also changed the amplitude value to -2. The period of the sine curve is 2*pi.
Example 12 :
Y = -2*Sin(1(X+1))+0
Y = -2Sin(X+1)
A = -2
B = 1
C = 1
D = 0
Period = 2*pi/1 = 2*pi
Y = -2Sin(X+1)
Here we have shifted the curve to the left by 1 point and also changed the amplitude value to -2. So the curve heads toward the negative y-axis first. The period of the sine curve is 2*pi.
Example 13 :
Y = 2*Sin(-1(X+1))+0
Y = 2Sin(-1(X+1))
A = 2
B = -1
C = 1
D = 0
Period = 2*pi/-1 = -2*pi
Y = 2Sin(-1(X+1))
Here we have shifted the curve to the left by 1 unit. One thing to notice is that B is negative. Since Sin(-θ) = -Sin(θ), a negative B reflects the curve about the x-axis, so it first heads toward the negative y-axis. Strictly speaking, the period is |2*pi/B| = 2*pi; the sign of B only controls the reflection.
Example :14
Y = -2*Sin(-1(X-1))+0
Y = -2Sin(-1(X-1))
A = -2
B = -1
C = -1
D = 0
Period = 2*pi/-1 = -2*pi
Y = -2Sin(-1(X-1))
Here we have shifted the curve to the right by 1 unit. One thing to notice is that both A and B are negative here, so the two reflections cancel out: -2*Sin(-1(X-1)) is the same curve as 2*Sin(X-1).
Example 15 :
Y = 1*Sin(1(X+1))+1
Y = 1*Sin(X+1) + 1
A =1
B = 1
C =1
D=1
Period = 2*pi/1 = 2*pi
Y = 1*Sin(X+1) + 1
Here we have the amplitude value of 1 and we have also shifted the curve to the left by 1 unit. Here notice that the period of our curve is 2*pi. One more thing to notice is that we have shifted our curve by 1 on the positive y-axis.
Example 16 :
Y = -1*Sin(-1(X-1))-1
A = -1
B = -1
C = -1
D =-1
Period = 2*pi/-1 = -2*pi
Y = -1*Sin(-1(X-1))-1
Here we have an amplitude value of -1, and we have also shifted the curve to the right by 1 unit. Notice that both A and B are negative, so the two reflections cancel and the curve behaves like Sin(X-1). One more thing to notice is that we have shifted our curve by 1 on the negative y-axis.
Credits : Unsplash
Let’s code :
(1) Import required libraries :
Here we are going to import four libraries.
numpy : for calculations.
matplotlib : to plot our dataset and curves.
curve_fit : to find the optimal parameter values for our sine curve.
r2_score : to calculate the accuracy of our model.
# Import required libraries :
import numpy as np
import matplotlib.pyplot as plt
from scipy.optimize import curve_fit # For curve fitting
from sklearn.metrics import r2_score # To check accuracy
(2) Generate Dataset :
Since we don’t have an actual dataset that represents a sine wave pattern, what we are going to do is make our own dataset.
Here we are going to use the linspace function to get the values of X. For Y, we’ll use sin(2*pi*X). Now, a real-life dataset isn’t going to follow the exact sine pattern, right? There will be some noise in the dataset. So we’re also going to add some noise to our dataset to make it look more realistic!
After then we are just going to scatter the X, Y points on the plane. That way we can visualize the dataset we have created.
# Generating dataset :
# Y = A*sin(B(X + C)) + D
# A = Amplitude
# Period = 2*pi/B
# Period = Length of One Cycle
# C = Phase Shift (In Radians)
# D = Vertical Shift

X = np.linspace(0,1,100) # (Start, End, Points)

# Here…
# A = 1
# B = 2*pi
# B = 2*pi/Period
# Period = 1
# C = 0
# D = 0
Y = 1*np.sin(2*np.pi*X)

# Adding some Noise :
Noise = 0.4*np.random.normal(size=100)
Y_data = Y + Noise

plt.scatter(X,Y_data,c="r")
(3) Finding the best fit line for our dataset :
Here I’m going to show you how we can fit a “Regression Line” to our dataset. We’ll calculate the error, and in the next part we’ll plot the sine curve that best fits our dataset. From the accuracy of both models, we can see why we should use sinusoidal regression in this case.
# Function to calculate the value :
def calc_line(X,m,b):
    return b + X*m

# curve_fit returns optimized parameters for our function :
# popt stores the optimal parameters
# pcov stores the covariance between the parameters.
popt,pcov = curve_fit(calc_line,X,Y_data)

# Plot the main data :
plt.scatter(X,Y_data)

# Plot the best fit line :
plt.plot(X,calc_line(X,*popt),c="r")

# Check the accuracy of the model :
Accuracy = r2_score(Y_data,calc_line(X,*popt))
print("Accuracy of Linear Model : ",Accuracy)
Here notice that for our data-set, which follows a sine wave pattern, we have found the best fit line. Also notice that the accuracy of our model is only around 40%. So we can conclude that for data-sets that follow a sine wave pattern, simple linear regression may not achieve high accuracy. That’s the reason to use sine wave regression.
(4) Finding the optimal sine curve that fits our data :
Now it’s time to find the best fit curve. Here we can see that our data follows a sine wave pattern, so we are going to find the optimal parameters for the sine wave using the curve_fit method. After that, we are going to plot the data with the best fit curve to visualize it.
# Calculate the value :
def calc_sine(x,a,b,c,d):
    return a * np.sin(b*(x + np.radians(c))) + d

# Finding optimal parameters :
popt,pcov = curve_fit(calc_sine,X,Y_data)

# Plot the main data :
plt.scatter(X,Y_data)

# Plot the best fit curve :
plt.plot(X,calc_sine(X,*popt),c="r")

# Check the accuracy :
Accuracy = r2_score(Y_data,calc_sine(X,*popt))
print(Accuracy)
Here notice that our best fit curve is in the shape of a sine wave. Also notice that the accuracy of our model has increased to around 79%. So we can conclude that sine regression helped us achieve higher accuracy.
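As a small follow-up of my own (reusing the calc_sine function and the popt parameters from the code above): because the fitted model is an explicit function, you can plug the optimal parameters back in to predict Y for X values that were never in the dataset. Since the sine is periodic, the fitted curve extends naturally beyond the original 0-to-1 range:

# Predict Y for unseen X values with the fitted parameters :
X_new = np.linspace(1, 2, 50)
Y_pred = calc_sine(X_new, *popt)
plt.plot(X_new, Y_pred, c="g")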
Putting it all together :
Okay. So I think that covers almost everything we are going to need in Machine Learning from sinusoidal waves. If you enjoyed reading this, then please hit the clap icon; that’ll motivate me to write such comprehensive articles on various machine learning algorithms.
Thank you for reading this article. I hope it helped!
I regularly post my articles on : patrickstar0110.blogspot.com
All my articles are available on: medium.com/@shuklapratik22
If you have any doubts then feel free to contact me: [email protected] | https://medium.com/nightingale/ocean-waves-sinusoidal-regression-5c46c8bd4e58 | ['Pratik Shukla'] | 2020-06-18 13:01:01.197000+00:00 | ['Machine Learning', 'Mathematics', 'Artificial Intelligence', 'Data Science', 'Data Visualization'] | Title Ocean Waves Sinusoidal RegressionContent Definition Sine wave sinusoidal wave mathematical curve describes smooth periodic oscillation Sine wave continuous wave go 0 360 degree Table Representing Sine Values Generation Sine Wave Sinusoidal function given Sine Function Formula period sine curve length one cycle curve natural period sine curve 2π coefficient b1 equivalent period 2π get period sine curve coefficient B divide 2π coefficient b get new period curve Real Life Application Sine Function 1 Generation music wave 2 Sound travel wave 3 Trigonometric function construction 4 Used space flight 5 GPS location calculation 6 Architecture 7 Electrical current 8 Radio broadcasting 9 Low high tide ocean 10 Buildings I’m going show different kind sine wave generated modifying parameter ultimate goal show modification parameter affect shape graph I’m going take example show implement sinusoidal regression python First going look different graph sine wave modifying parameter value going know data visualization major role data science working data regression need find best fit curve we’ll lot parameter function don’t know happens change parameter it’s going cumbersome journey go right we’ll take example understand happens change parameter value understand take main sine function we’ll modify parameter value graph visualize want take pen paper try plot sine graph going example think that’ll help understand better Let’s look different sine graph ☀️ Example 1 1Sin1X00 SinX 1 B 1 C 0 0 Period 2pi1 2pi SinX see sine wave amplitude 1 length cycle sine wave go 0 2pi Example 2 2Sin1X00 2SinX 2 B 1 C 0 0 Period 2pi1 2pi 2SinX see sine wave amplitude 2 see increase height sine wave length cycle sine wave go 0 2pi Example 3 1Sin2X00 Sin2X 1 B 2 C 0 0 Period 2pi2 pi Sin2X see sine wave amplitude 1 length cycle sine wave go 0 pi Example 4 2Sin2X00 2Sin2X 2 B 2 C 0 0 Period 2pi2 pi 2Sin2X see sine wave amplitude 2 length cycle sine wave go 0 pi see graph increased height wave one cycle completer pi Example 5 2Sin1X10 2SinX1 2 B 1 C 1 Shift Left 0 Period 2pi1 2pi 2SinX1 shifted curve left 1 took amplitude value 1 Notice period 2pi mean one cycle length 2pi Since shifted left one unit first cycle shifted 1 unit left 2pi Example 6 2Sin1X10 2SinX1 2 B 1 C 1 Shift Right Period 2pi1 2pi 2SinX1 shifted curve right 1 took amplitude value 1 Notice period 2pi mean one cycle length 2pi Since shifted right one unit first cycle shifted 1 unit right 2pi Example 7 1Sin1X02 SinX 2 1 B 1 C 0 2 Period 2pi1 2pi SinX 2 notice shifted curve 2 point positive yaxis amplitude curve 1 period see also 2pi Example 8 1Sin1X0 — 2 SinX — 2 1 B 1 C 0 2 Period 2pi1 2pi SinX — 2 notice shifted curve 2 point negative yaxis amplitude curve 1 period see also 2pi Example 9 1Sin1X00 SinX 1 B 1 C 0 0 Period 2pi1 2pi SinX changed amplitude value 1 illustration see graph inverted previous version amplitude 1 mean positive yaxis replaced negative yaxis Example 10 2Sin1X00 2SinX 2 B 1 C 0 0 Period 2pi1 2pi 2SinX going set value amplitude 2 it’s like previous graph height sine curve increased Also notice period sine curve 2pi Example 11 2Sin1X10 2SinX1 2 B 1 C 1 0 Period 2pi1 2pi 2SinX1 shifted curve right 1 point also changed amplitude value 1 
period sine curve 2pi Example 12 2Sin1X10 2SinX1 2 B 1 C 1 0 Period 2pi1 2pi 2SinX1 shifted curve left 1 point also changed amplitude value 1 it’s going negative yaxis first period sine curve 2pi Example 13 2Sin1X10 2Sin1X1 2 B 1 C 1 0 Period 2pi1 2pi 2Sin1X1 shifted curve left 1 unit One thing notice since period 2pi graph going left side say negative xaxis positive value period go positive xaxis Example 14 2Sin1X10 2Sin1X1 2 B 1 C 1 0 Period 2pi1 2pi 2Sin1X1 shifted curve right 1 unit One thing notice since period 2pi graph going left side say negative xaxis Example 15 1Sin1X11 1SinX1 1 1 B 1 C 1 D1 Period 2pi1 2pi 1SinX1 1 amplitude value 1 also shifted curve left 1 unit notice period curve 2pi One thing notice shifted curve 1 positive yaxis Example 16 1Sin1X11 1 B 1 C 1 1 Period 2pi1 2pi 1Sin1X11 amplitude value 1 also shifted curve right 1 unit notice period curve 2pi it’s going go left first One thing notice shifted curve 1 negative yaxis Credits Unsplash Let’s code 1 Import required library going import four library numpy calculation calculation matplotlib plot dataset curve plot dataset curve curvefit find optimal parameter value sine curve find optimal parameter value sine curve r2score calculate accuracy model Import required library import numpy np import matplotlibpyplot plt scipyoptimize import curvefit curve fitting sklearnmetrics import r2score check accuracy 2 Generate Dataset Since don’t actual dataset represents sine wave pattern going make dataset going use linspace function get value X value we’ll use 2piX reallife dataset isn’t going follow exact since pattern right noise dataset We’re also going add noise dataset make look realistic going scatter X point plane way visualize dataset created Generating dataset AsinBX C Amplitude Period 2pi0B Period Length One Cycle C Phase Shift Radian Vertical Shift X nplinspace01100 StartEndPoints Here… 1 B 2pi B 2piPeriod Period 1 C 0 0 1npsin2nppiX Adding Noise Noise 04nprandomnormalsize100 Ydata Noise pltscatterXYdatac”r” 3 Finding best fit line dataset I’m going show fit “Regression Line” dataset we’ll calculate error next part we’ll plot sine curve best fit dataset accuracy model sure use sinusoidal regression case Function calculate value def calclineXmb return b Xm return optimized parameter function popt store optimal parameter pcov store covariance parameter poptpcov curvefitcalclineXYdata Plot main data pltscatterXYdata Plot best fit line pltplotXcalclineXpoptc”r” Check accuracy model Accuracy r2scoreYdatacalclineXpopt print “Accuracy Linear Model “Accuracy notice dataset follows sine wave pattern found best fit line Also notice accuracy model around 40 conclude datasets follows sine wave pattern use simple linear regression may achieve higher accuracy That’s reason use sine wave regression 4 Finding optimal sine curve fit data it’s time find best fit curve see data follows sine wave pattern going find optimal parameter using curvefit method sine wave going plot data best fit curve visualize Calculate value def calcsinexabcd return npsinb x npradiansc Finding optimal parameter poptpcov curvefitcalcsineXYdata Plot main data pltscatterXYdata Plot best fit curve pltplotXcalcsineXpoptc”r” Check accuracy Accuracy r2scoreYdatacalcsineXpopt print Accuracy notice best fit curve shape sine wave Also notice accuracy model increased around 79 conclude sine regression helped u achieve higher accuracy Putting together Okay think cover almost everything going need Machine Learning sinusoidal wave enjoyed reading please hit clap icon 
that’ll motivate write comprehensive article various machine learning algorithm Thank reading article hope helped regularly post article patrickstar0110blogspotcom article available mediumcomshuklapratik22 doubt feel free contact shuklapratik22gmailcomTags Machine Learning Mathematics Artificial Intelligence Data Science Data Visualization |
4,109 | Thank You for Resisting the Cheeto-in-Chief | Thank You for Resisting the Cheeto-in-Chief
How Americans cheered one rogue government tweeter
by DAVID AXE
In the days following Donald Trump’s Jan. 20, 2017 inauguration as the 45th president of the United States, his administration moved quickly to remove all mentions of climate change from U.S. government websites and social media.
Everyday Americans were … not fans, if notes of encouragement that citizens sent to one rogue climate-change tweeter are any indication. “Thank you to the bad-ass,” one American wrote to the tweeter. “You fine people may be our nation’s last line of defense,” another commented.
Trump’s government-wide act of science-denial included a gag order targeting the Environmental Protection Agency. The White House barred the EPA and its employees from speaking to the press or posting on social media.
Famously, a former seasonal employee at Badlands National Park in South Dakota fought back. Taking advantage of their access to the park’s official Twitter account, on Jan. 24, 2017 the former employee tweeted several statements about climate change.
“Today, the amount of carbon dioxide in the atmosphere is higher than at any time in the last 650,000 years,” one tweet read.
The National Park Service quickly deleted the tweets — illegally, as official tweets are public records that the federal government is required to archive and make available to the public.
DEFIANT requested, under the Freedom of Information Act, copies of records regarding the tweet controversy. The trove of documents, which the park service made public on April 13, 2017, includes copies of emails that members of the public sent to the park service.
They’re pure gold. To quote a couple —
“If I could visit the Badlands right now, I would do it just to shake the hand of whoever updates your Twitter account. Roll on, NPS employees who believe in climate change.”
“Thank you to the bad-ass — I mean, Badlands social-media rep — who stood up to the Cheeto-in-chief regarding climate change. Every act of resistance is so important right now.”
And most powerfully —
Who would’ve thought it? National Park employees waging a digital guerilla war against an OCD moron who still insists climate change doesn’t exist. You fine people may be our nation’s last line of defense against destruction of not only our own national treasures but the natural world as a whole. Please rest assured the nation appreciates your courageous determination to protect our natural wonders. Don’t be cowered. Don’t be bullied. Don’t be silenced. The dark cloud that hangs over America at the moment will pass and a new day will most certainly dawn. Until then, stay strong and resolute. And thanks to all of you for your peerless service.
Stay defiant.
Follow DEFIANT on Facebook and Twitter. | https://medium.com/defiant/thank-you-for-resisting-the-cheeto-in-chief-e43dfdb8a07 | ['David Axe'] | 2017-04-25 06:55:42.608000+00:00 | ['National Parks', 'Defiant Science', 'Environment'] | Title Thank Resisting CheetoinChiefContent Thank Resisting CheetoinChief Americans cheered one rogue government tweeter DAVID AXE day following Donald Trump’s Jan 20 2017 inauguration 45th president United States administration moved quickly remove mention climate change US government website social medium Everyday Americans … fan note encouragement citizen sent one rogue climatechange tweeter indication “Thank badass” one American wrote tweeter “You fine people may nation’s last line defense” another commented Trump’s governmentwide act sciencedenial included gag order targeting Environmental Protection Agency White House barred EPA employee speaking press posting social medium Famously former seasonal employee Badlands National Park South Dakota fought back Taking advantage access park’s official Twitter account Jan 24 2017 former employee tweeted several statement climate change “Today amount carbon dioxide atmosphere higher time last 650000 years” one tweet read National Park Service quickly deleted tweet — illegally official tweet public record federal government required archive make available public DEFIANT requested Freedom Information Act copy record regarding tweet controversy trove document park service made public April 13 2017 includes copy email member public sent park service They’re pure gold quote couple — “If could visit Badlands right would shake hand whoever update Twitter account Roll NPS employee believe climate change” “Thank badass — mean Badlands socialmedia rep — stood Cheetoinchief regarding climate change Every act resistance important right now” powerfully — would’ve thought National Park employee waging digital guerilla war OCD moron still insists climate change doesn’t exist fine people may nation’s last line defense destruction national treasure natural world whole Please rest assured nation appreciates courageous determination protect natural wonder Don’t cowered Don’t bullied Don’t silenced dark cloud hang America moment pas new day certainly dawn stay strong resolute thanks peerless service Stay defiant Follow DEFIANT Facebook TwitterTags National Parks Defiant Science Environment |
4,110 | The Knowledge Triangle | The Knowledge Triangle — a graph technologies metaphor where raw data is converted into information about people, places and things and connected into a query-ready graph.
Although we use the term “knowledge” broadly in normal conversation, it has a specific meaning in the AI and graph database community. Even within computer science, it has many different meanings based on the context of a discussion. This article gives a suggested definition of the term “knowledge” and uses the Knowledge Triangle metaphor to explain our definition. We will then show some variations of the Knowledge Triangle and see how the word knowledge is used in information management and learning management systems. I have found that having a clear image of the knowledge triangle in your mind is essential to understanding the processes around modern database architectures.
Here is our definition of Knowledge in the context of AI and graph databases:
Knowledge is connected-information that is query ready.
This definition is much shorter than the one on the Wikipedia Knowledge page, which is:
…a familiarity, awareness, or understanding of someone or something, such as facts, information, descriptions, or skills, which is acquired through experience or education by perceiving, discovering, or learning.
The Wikipedia definition is longer, more general, and applicable to many domains like philosophy, learning, and cognitive science. Our definition is shorter and only intended for the context of computing. Our definition is also dependent on how we define “information”, “connected”, and “query ready”. To understand these terms, let’s reference the Knowledge Triangle figure above.
In the knowledge triangle diagram, let’s start at the bottom Data Layer. The data layer contains unprocessed raw information in the form of binary codes, numeric codes, dates, strings, and the full-text descriptions that we find in documents. The data layer can also include images (such as a jpeg file), speech (in the form of a sound file), and video data. You can imagine raw data as a stream of ones and zeros; it is a raw dump of data from your hard drive. Some types of raw data, such as an image, can be directly understood by a person just by viewing them. Raw data is not typically useful without additional processing. We call this processing of raw data enrichment.
Enrichment
Enrichment takes raw data, extracts the things we care about, and converts the data into Information. This Information consists of items we call business entities: people, places, events, and concepts. Information is the second layer of the Knowledge Triangle. Information is more useful than raw data, but Information itself consists of islands of disconnected items. When we start to link information together, it becomes part of the Knowledge layer. Knowledge is the top layer of the Knowledge Triangle. The knowledge layer puts information into context with the rest of the information in our system. It is this context that gives information structure. Structure gives us hints about how relevant information is for a given task.
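As one illustrative first pass at enrichment (spaCy is my choice of tool here, not something this article prescribes), a named-entity recognizer can pull business entities such as people, places, and dates out of free text:

import spacy # assumes the small English model is installed:
             # python -m spacy download en_core_web_sm

nlp = spacy.load("en_core_web_sm")
doc = nlp("Ada Lovelace visited London on 10 December 1842.")
for ent in doc.ents:
    print(ent.text, ent.label_) # e.g. PERSON, GPE (a place), DATE

Each extracted entity is still an island of disconnected information until the knowledge layer links it to everything else.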
Structure Informs Relevancy
How does structure inform relevance? Let’s take a search example. Let’s say we have a book on the topic of NoSQL. The word “NoSQL” should appear in the title of that book. There might also be other books on related topics, but they only mention NoSQL in a footnote. If the counts of the term NoSQL are the same in both books, then the book on NoSQL might be buried far down in the search results. A search engine that uses structural search knows that titles are essential to findability. Structural search engines boost hits of a keyword within the title of a document by a factor of 10 or 100. Many search engines (notably SharePoint) ignore the structure of a document when doing document retrieval, so they have a reputation for an inability to find documents.
The structured search example above shows how query readiness is enhanced in the knowledge layer. The fact that a keyword appears somewhere in a document reflects very little structure. The fact that a keyword appears in a title has much more value. And the fact that the keyword appears in a chapter title gives us some knowledge that the entire chapter is about that keyword.
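A toy version of that title boost might look like this sketch (the dictionary fields, the boost factor of 10, and the function itself are illustrative assumptions, not any particular search engine’s API):

def structural_score(doc, keyword):
    # Hits inside the title carry far more structural weight than body hits.
    title_hits = doc["title"].lower().count(keyword.lower())
    body_hits = doc["body"].lower().count(keyword.lower())
    return body_hits + 10 * title_hits

With equal raw counts, the book with NoSQL in its title now ranks well above the book that only mentions it in a footnote.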
Enrichment and Machine Learning
Today most enrichment is done using simple rule-based systems. The most basic rules are called data ingestion rules, where data transformation maps are created and executed when new data is loaded into our system. A typical map rule says: take data from the fourth column of the CSV file and assign it to the field PersonFamilyName. These rules are manually created and maintained. About 70% of the cost of building enterprise knowledge graphs is related to building and maintaining these mapping processes. These mapping steps are often the most tedious parts of building AI systems since they require attention to detail and validation. Source systems frequently change, and the meaning of codes may drift over time. Continuous testing and data quality staff are essential for these processes to be robust. The phrase garbage-in, garbage-out (GIGO) applies.
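The fourth-column rule from the paragraph above might look like this in code (a minimal sketch; the 0-based index and helper name are my assumptions):

import csv

# Manually maintained map rule: fourth CSV column -> PersonFamilyName.
FIELD_MAP = {3: "PersonFamilyName"}

def ingest(path):
    with open(path, newline="") as f:
        for row in csv.reader(f):
            yield {field: row[index] for index, field in FIELD_MAP.items()}

Every new source means another hand-written map like this one, which is a large part of the maintenance cost quoted above.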
What is revolutionary about the mapping process is that we are just starting to see machine learning play a role in it. These processes are often called automated schema mapping or algorithm-assisted mapping. To be automated, these processes involve keeping a careful log of prior mappings as a training set. New maps can then be predicted with up to 95% accuracy for new data sources. These algorithms leverage lexical names, field definitions, data profiles, and semantic links to predict matches. Automatic schema mapping is an active field of research for many organizations building knowledge graphs. Automated mapping will lower the cost of building enterprise knowledge graphs dramatically. Graph algorithms such as cosine similarity can be ideal for finding the right matches.
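To make the matching idea concrete, here is a toy sketch of cosine similarity driving a mapping suggestion (the feature vectors would be built from the lexical names, field definitions, and data profiles mentioned above; the function names are mine):

import numpy as np

def cosine_similarity(a, b):
    a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def suggest_mapping(new_column_vector, known_fields):
    # known_fields maps target names (e.g. "PersonFamilyName") to
    # feature vectors learned from the log of prior mappings.
    scores = {name: cosine_similarity(new_column_vector, vector)
              for name, vector in known_fields.items()}
    return max(scores, key=scores.get)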
Structure and Abstraction
We should also note that many things in the real world reflect the Knowledge Triangle architecture of raw data at the bottom and connected concepts at the top. One of my favorite examples is the multi-level architecture of the neural networks in animal brains, as depicted below.
Brains have multiple layers of neural networks. Data arrives at the bottom layers and travels upward, with each layer representing more abstract concepts. The human neocortex has up to six layers of processing. This figure is derived from Jeff Hawkins’s book On Intelligence.
Just like the Knowledge Triangle, raw data arrives at the bottom layer and travels upwards. But unlike our three-layer model, brains have up to six layers of complex non-linear data transformations. At the top of the stack, concepts such as the recognition of an object in a file, the detection of a specific object in an image, or the detection of a person’s face are turned to the “on” state. There are also feedback layers downward so that if the output of one layer has quality problems, new signals are sent back down to gain more insight into what objects are recognized.
Many people like to use brain metaphors when they explain knowledge graphs. Although some of these metaphors are useful, I urge you to use them cautiously. Brains typically have 10,000 connections per-vertex, and each connection does complex signal processing. So the architectures are very different in practice.
The last term we need to define is query readiness.
Query Readiness
Of the many ways we can store data, which forms are the most useful for general analysis? Which forms need the minimum of processing before we can look for insights? What are the queries, searches, and algorithms we can plug in to quickly find meaning in the data? The larger the number of these things you can use without modification, the more query ready your data is.
What the industry is finding is that the number of algorithms available to graph developers today is large and growing. The rise of distributed native labeled property graphs is making these algorithms available even for tens of billions of vertices. In summary, graphs are winning the algorithms race. The performance and scale-out abilities of modern graph databases are pushing them to the forefront of innovation.
Variations on the Knowledge Triangle
There are also many variations on the basic knowledge triangle metaphor that are useful in some situations. One of the most common is to add a Wisdom layer on top of the Knowledge layer. This four-layer triangle is known as the DIKW pyramid and is used frequently in information architecture discussions. I tend to downplay the role of wisdom in my early knowledge graph courses since the wisdom layer seems to be associated with touchy-feely topics or stories about visiting the guru on the mountain top for advice. That being said, there are some useful things to consider about the term wisdom.
For example, when you go to an experienced person for advice, you share with them your problem and the context of that problem. You expect them to use their knowledge to give you advice about your problem. You are expecting them to transfer a bit of their knowledge to you. We imagine the wisdom layer as a feedback layer to the structure of the knowledge layer. Wisdom can inform us how we can structure our knowledge in a way that it can be transferred from one context to another and still be valuable.
Stated in another way, can we take a small sub-graph out of an enterprise knowledge graph and port it to another graph and still be useful? For example, let’s suppose we have a sub-graph that stores information about geography. It might have postal codes, cities, counties, states, regions, countries, islands and continents in the graphs. Can we lift the geospatial subgraph out of one graph and drop into another graph? In neural networks and deep learning, taking a few layers of one neural network and dropping it into another network is called transfer learning. Transfer learning is frequently used in image and language models where training times are extensive. How you reconnect these networks into a new setting is a non-trivial problem.
These are the questions about the knowledge layer that you should be asking when you design your enterprise knowledge graph. If calling reuse issues the “Wisdom” layer helps you in your discussions, we encourage you to adopt this layer.
Data Science and Knowledge Science
In some of my prior articles, I discussed the trend of moving from data science to knowledge science. We can also use the Knowledge Triangle metaphor to explain this process. It is fundamentally about allowing staff direct access to a connected graph of your enterprise knowledge, thus saving them all the hassle of making meaning out of raw data in the data lake.
Data science staff can get faster time to insights using direct access to a knowledge graph.
To wrap up the post, I also want to suggest one other version of the knowledge triangle that has been mapped to an actual set of tools in a production knowledge graph. Instead of the abstract concept of raw data, we replace it with a diagram of a Data Lake or an object store such as Amazon S3. At the Information layer, we list the concepts we are looking for in the Data Lake, the definitions of these concepts, and the rules used to validate each of these atomic data elements. We also allow users to associate each business entity with a URI so the entities can be linked together at the next layer up. At the Knowledge Graph layer, we talk about making connections between the entities found in the information layer and the tools we use to connect data and find missing relationships automatically. These processes include entity resolution, master data management, deduplication and shape validation.
From Data Lakes to transfer learning. The Knowledge Triangle in practice.
This diagram also mentions that there is often a feedback layer that automatically sends alerts to the data enrichers that there might be missing data and clues on how this data can be found.
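For a feel of what a single information-layer record from that diagram might look like before it is linked into the graph (the URI scheme and the PostalCode rule are invented for illustration, reusing the PersonFamilyName example from earlier):

import re

person = {
    "uri": "https://example.org/person/12345", # handle for linking at the knowledge layer
    "PersonFamilyName": "Smith",
    "PostalCode": "55401",
}

# One validation rule for an atomic data element (US ZIP codes):
assert re.fullmatch(r"\d{5}(-\d{4})?", person["PostalCode"])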
Knowledge Spaces in Learning Management Systems
Lastly, we want to mention that modern AI-powered learning management systems (LMS) also use the term Knowledge Space. In the context of an LMS, a knowledge space is a set of concepts that must be mastered to achieve proficiency in a field. Each student has a Knowledge State that shows where they are in learning a topic. AI-powered LMS systems use recommendation engines to recommend learning content associated with the edges of known concepts in each student's Knowledge Space. I will be discussing the topic of AI in education and Knowledge Spaces in a future blog post.
Summary
In summary, the Knowledge Triangle is one of the most useful metaphors in our graph architecture toolkit. Along with The Neighborhood Walk, the Open World, and the Jenga Tower, it forms the basis for our introductory chapter on Knowledge Graph concepts.
I want to thank my friend Arun Batchu for introducing me to the Knowledge Triangle his willingness to transfer his wisdom to me. | https://dmccreary.medium.com/the-knowledge-triangle-c5124637d54c | ['Dan Mccreary'] | 2019-09-01 19:03:55.851000+00:00 | ['Dikw', 'Knowledge Triangle', 'Information Architecture', 'Artificial Intelligence', 'Graph Databases'] | Title Knowledge TriangleContent Knowledge Triangle — graph technology metaphor raw data converted information people place thing connected queryready graph Although use term “knowledge” broadly normal conversation specific meaning AI graph database community Even within computer science many different meaning based context discussion article give suggested definition term “knowledge” us Knowledge Triangle metaphor explain definition show variation Knowledge Triangle see word knowledge used information management learning management system found clear image knowledge triangle mind essential understanding process around modern database architecture definition Knowledge context AI graph database Knowledge connectedinformation query ready definition much shorter Wikipedia Knowledge page …a familiarity awareness understanding someone something fact information description skill acquired experience education perceiving discovering learning Wikipedia definition longer general applicable many domain like philosophy learning cognitive science definition shorter intended context computing definition also dependant define “information” “connected” “query ready” understand term let’s reference Knowledge Triangle figure knowledge triangle diagram let’s start bottom Data Layer data layer contains unprocessed raw information form binary code numeric code date string fulltext description find document data layer also include image jpeg file speech form sound file video data imagine raw data stream one zero raw dump data hard drive type raw data image — directly understood person viewing Usually raw data typically useful without additional processing call processing raw data enrichment Enrichment Enrichment take raw data extract thing care convert data Information Information consistest item call business entity people place event concept Information second layer Knowledge Triangle Information useful raw data Information consists island disconnected item start link information together becomes part Knowledge layer Knowledge top layer Knowledge Triangle knowledge layer put information context rest information system context give information structure Structure give u hint relevant information given task Structure Informs Relevancy structure inform relevance Let’s take search example Let’s say book topic NoSQL word “NoSQL” appear title book also might book related topic mention NoSQL footnote book count term NoSQL book book NoSQL might buried far search result search engine us structural search know title essential findability Structural search engine boost hit keyword within title document factor 10 100 Many search engine notability Sharepoint ignore structure document document retrieval reputation inability find document structured search example excellent example query readiness enhanced knowledge layer fact keyword appears somewhere document reflects little structure fact keyword appears title much value fact keyword appeared chapter title give u knowledge entire chapter keyword Enrichment Machine Learning Today enrichment done using simple rulebased system basic rule called data ingestion rule data transformation map created executed new data loaded 
system typical map rule say take data fourth column CSV file assign field PersonFamilyName rule manually created maintained 70 cost building enterprise knowledge graph related building maintaining mapping process mapping step often tedious part building AI system since require attention detail validation Source system frequently change meaning code may drift time Continuous testing data quality staff essential process robust phrase garbagein garbageout GIGO applies revolutionary mapping process starting see machine learning play role process process often called automated schema mapping algorithm assisted mapping automated process involve keeping careful log prior mapping training set New map predicted 95 accuracy new data source algorithm leverage lexical name field definition data profile semantic link predicting matching Automatic schema mapping active field research many organization building knowledge graph Automated mapping lower cost building enterprise knowledge graph dramatically Graph algorithm cosine similarity ideal finding right match Structure Abstraction also note many thing real world also reflect Knowledge Triangle architecture raw data bottom connected concept top One favorite example multilevel architecture neural network animal brain depicted Brains multiple layer neural network Data arrives bottom layer travel upward layer representing abstract concept human neocortex six layer processing figure derived Jeff Hawkin’s book Intelligence like Knowledge Triangle raw data arrives bottom layer travel upwards unlike threelayer model brain six layer complex nonlinear data transformation top stack concept recognition object file detection specific object image detection person’s face turned “on” state also feedback layer downward output one layer quality problem new signal sent back gain insight object recognized Many people like use brain metaphor explain knowledge graph Although metaphor useful urge use cautiously Brains typically 10000 connection pervertex connection complex signal processing architecture different practice last term need define query readiness Query Readiness many way store data form useful general analysis form need minimum processing look insight query search algorithm use plug quickly find meaning data larger number thing use without modification query ready data industry finding number algorithm available graph developer today large growing rise distributed native labeled property graph making algorithm available even ten billion vertex summary graph wining algorithm race performance scaleout ability modern graph database pushing forefront innovation Variations Knowledge Triangle also many variation basic knowledge triangle metaphor useful situation One common add Wisdom layer top Knowledge layer fourlayer triangle known DIKW pyramid used frequently information architecture discussion tend downplay role wisdom early knowledge graph course since wisdom layer seems associated touchyfeely topic story visiting guru mountain top advice said useful thing consider term wisdom example go experienced person advice share problem context problem expect use knowledge give advice problem expecting transfer bit knowledge imagine wisdom layer feedback layer structure knowledge layer Wisdom inform u structure knowledge way transferred one context another still valuable Stated another way take small subgraph enterprise knowledge graph port another graph still useful example let’s suppose subgraph store information geography might postal code city county state region 
country island continent graph lift geospatial subgraph one graph drop another graph neural network deep learning taking layer one neural network dropping another network called transfer learning Transfer learning frequently used image language model training time extensive reconnect network new setting nontrivial problem question knowledge layer asking design enterprise knowledge graph calling reuse issue “Wisdom” layer help discussion encourage adopt layer Data Science Knowlege Science prior article discussed trend moving data science knowledge science also use Knowledge Triangle metaphor explain process process fundamentally allowing staff direct access connected graph enterprise knowledge thus saving hassle making meaning raw data data lake Data science staff get faster time insight using direct access knowledge graph wrap post also want suggest one version knowledge triangle mapped actual set tool production knowledge graph Instead abstract concept raw data replace diagram Data Lake object store Amazon S3 Information layer list concept looking Data Lake definition concept rule validate atomic data element make valid also allow user associate business entity URI linked together next higher step Knowledge Graph layer talk making connection entity found information layer tool use connect data find missing relationship automatically process include entity resolution master data management deduplication shape validation Data Lakes transfer learning Knowledge Triangle practice diagram also mention often feedback layer automatically sends alert data enrichers might missing data clue data found Knowledge Spaces Learning Management Systems Lastly want mention modern AIpowered learning management system LMS also use term Knowledge Space context LMS knowledge space set concept must mastered achieve proficiency field student Knowledge State show learning topic AIpowered LMS system use recommendation engine recommend learning content associated edge known concept student Knowledge Space discussing topic AI education Knowledge Spaces future blog post Summary summary Knowlege Triangle one useful metaphor graph architecture toolkit Along Neighborhood Walk Open World Jenga Tower form basis introductory chapter Knowledge Graph concept want thank friend Arun Batchu introducing Knowledge Triangle willingness transfer wisdom meTags Dikw Knowledge Triangle Information Architecture Artificial Intelligence Graph Databases |
4,111 | Attitude Is Everything | How do you currently see Life?
After you wake each morning and push through the grogginess and grumpiness, how do you honestly feel about Life? Don’t pay much attention to the people and situations influencing you; focus rather on how you feel inside. Don’t try to impress me or some other person with your answer. Ask yourself from your heart, from your soul: do you like your current perspective of life?
Your Perspective is important because the one you hold has a hold over you
It affects the way you see the world and how you carry yourself through each moment. It chooses the words you speak and the actions you take. And as the world sees you expressing yourself, they are, in some way, influenced by you — some may even end up following you. So, you have to be responsible for yourself because everything you do could change someone’s life for better or worse.
Remember, you alone see your Perspective, the world feels only your Attitude
You are responsible for your Perspective. You need to be able to take everything weighing down on you and still carry yourself well. It doesn’t matter whether the day feels good or bad; you must never succumb to the people or situations influencing you the wrong way. You need to be your own person and hold firm to the belief that nothing is stronger than your belief in yourself.
And everything out there exists only to serve you in some way
You may not know how at first, but you know that at some point in the future, its meaning will reveal itself to you. Don’t consider yourself a victim who succumbs to the things facing you; see yourself as the one who benefits from and overcomes them. Don’t sit idle; hold strong, and keep moving forward. This Attitude can change the world in some way every day.
Keep with you the words of Arthur Gordon:
Be bold and mighty forces will come to your aid. In the past, whenever I had fallen short in almost any undertaking, it was seldom because I had tried and failed. It was because I had let fear of failure stop me from trying at all.
I would like for you to keep these with you:
Your Perspective Powers Your Attitude
Inky Johnson once said something quite interesting: perspective drives performance. I am paraphrasing here, but the adversity that finds you is not as important as your perspective of it. You may not be able to control the adversity that will find you, but you choose how you see it, and this will decide what you do with it.
Your Perspective will influence your Attitude
A destructive perspective can lead you towards a destructive attitude in the same way a constructive perspective can lead you towards a constructive attitude. The latter asks for more effort but do not let this discourage you because this brings with it a great deal of value. Embrace it. If you were to be consumed by anything, let it be this. You will not regret the value you find in the end.
“I am still determined to be cheerful and happy, in whatever situation I may be; for I have also learned from experience that the greater part of our happiness or misery depends upon our dispositions, and not upon our circumstances.” — Martha Washington
A Comprehensive Understanding Leads To A Better Perspective
Make room for Doubt and ask the questions you need to ask. Understanding is a journey and the questions you ask help you explore this journey. Look at the good and the bad, and try to draw value from them both. Be not afraid to explore, for the more ground you cover, the broader you will be able to think. And along the way, the understanding will reveal itself to you.
This helps you find and follow what makes sense to you
But be careful not to mislead yourself. The understanding you find is meant to help you choose a Perspective that will bring you value today, tomorrow, and each day after. Don’t choose one without considering the long-term value. But once you find the one that aligns with your soul, keep it. Let it be expressed through your Attitude and be the reason why the world remembers you.
“Carve your name on hearts, not tombstones. A legacy is etched into the minds of others and the stories they share about you.” ― Shannon Alder
This Is How You Brave The World
It is no secret that the world can feel harsh at times. Things often do not work in your favor and they may compound into an enormous and powerful weight resting itself on your shoulders. I can say, if you leave this weight alone, it will grow and hold a great influence over you and the life you live. It will do its best to overwhelm you.
But whatever you do, do not succumb
Don’t be influenced by everything living outside your control. They may feel strong, they may feel impossible, but know that you are capable, know that you are stronger. Stay true to your belief in yourself. You may feel confused, but you will understand it better at some point in the future. Don’t let go of your hope and the terrors of the world will hold no power over you.
“It takes courage to grow up and become who you really are.” ― E.E. Cummings
Photo by Ali Pazani on Unsplash
You are going to live your life each day and go through the time given to you. You may do it willingly or unwillingly, but this is an important decision you must make each day. I do say, choose the former because Life is far too interesting to live it unwillingly. Some days may be good and some days may be bad, but neither should be enough to sway your Perspective of life as a whole.
Your Perspective may be nourished by the outside world but it stems from within
Your Perspective belongs to you; it is yours to maintain and express through each thing that you do. Let it not be influenced but rather nourished by the people and situations living outside your control. Let everything serve you on your way. A well-rooted Perspective and an indomitable Will leave you with an empowering Attitude, one that can brave the world and change it at the same time.
“Life is not easy for any of us. But what of that? We must have perseverance and, above all, confidence in ourselves. We must believe that we are gifted for something, and that this thing, at whatever cost, must be attained” — Marie Curie
Invest In Your Existence, Kind Reader. | https://medium.com/live-your-life-on-purpose/attitude-is-everything-5d46cc4046c5 | ['René Chunilall'] | 2020-12-25 17:03:12.792000+00:00 | ['Life Lessons', 'Life', 'Self Improvement', 'Self-awareness', 'Self Mastery'] |
4,112 | Flask’s Latest Rival in Data Science | On the comparison between Flask and Streamlit: a reader noted that Flask has capabilities in excess of Streamlit. I appreciate this point and would encourage users to look at their use cases and use the right technology. For users who need a tool to deploy models for their team or clients, Streamlit is very efficient; however, for users who require more advanced solutions, Flask is probably better. Competitors of Streamlit would include Bokeh and Dash.
Streamlit
This is where Streamlit comes into its own, and why they just raised $6m to get the job done. They created a library off the back of an existing Python framework that allows users to deploy functional code. It is kind of similar to how TensorFlow works: Streamlit adds a new feature to its UI corresponding to each new function called in the Python script.
For example, take the following 6 lines of code, in which I call a “title” method, a “write” method, a “select” method, and another “write” method (all from Streamlit):
import streamlit as st
st.title('Hello World')
st.write('Pick an option')
keys = ['Normal', 'Uniform']
dist_key = st.selectbox('Which Distribution do you want?', keys)
st.write('You have chosen {}'.format(dist_key))
Save that into a file called “test.py”, then run “streamlit run test.py” and it produces the following in your browser at http://localhost:8501/:
Code above produced this. Fantastic how efficient Streamlit’s library makes UI programming.
Now this is awesome. It’s both clean to look at and clearly efficient to create.
Jupyter Notebooks is another successful “alternative,” but it’s a bit different. Notebooks are better as a framework for research or report writing; however, there’s little you can do in the way of actually letting someone else use your code, as it’s impractical to give someone else a notebook of code. Colab kind of bridges that gap, but it’s still not as clean.
Streamlit fills this void by giving the user the ability to deploy code in an easy manner so the client can use the product. For those of us who like making small things, this has always been an issue.
Ease of Use
Ok so let’s create something that we may actually want someone else to use.
Let’s say I want to teach my nephew about distributions. I want to make an app that he can use where he selects a distribution, and then it draws a line chart of it. Something as simple as the following:
The code provided below shows how to create this
In this example, the user has a choice between 2 items in a drop-down menu, and when he selects either, the line chart updates accordingly. Taking a step back, I’m providing the user with:
1. Some information about a problem
2. The user then has the ability to make a choice
3. The corresponding chart is then returned to the user
Now, in Flask, something like the above would easily require hundreds of lines of code (before even getting to the aesthetics); however, Streamlit achieves it in a negligible amount of code. Note that the above required the following ~11 lines of code:
import streamlit as st
import numpy as np

# Write a title and a bit of a blurb
st.title('Distribution Tester')
st.write('Pick a distribution from the list and we shall draw a line chart from a random sample from the distribution')

# Make some choices for a user to select
keys = ['Normal', 'Uniform']
dist_key = st.selectbox('Which Distribution do you want to plot?', keys)

# Logic of our program
if dist_key == 'Normal':
    nums = np.random.randn(1000)
elif dist_key == 'Uniform':
    nums = np.array([np.random.randint(100) for i in range(1000)])

# Display to the user
st.line_chart(nums)
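As an aside, Streamlit’s other input widgets follow the same pattern; for instance, a slider could let the user pick the sample size instead of hard-coding 1000 (the label, bounds, and variable name below are my own illustration, not part of the original app):

n = st.slider('Sample size', min_value=100, max_value=5000, value=1000)  # replaces the hard-coded 1000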
I find it amazing that the amount of code required to produce something that actually looks and works pretty well is so small.
For anyone who’s played around with UI before, you’ll know how difficult it is to achieve something of this quality. By having Streamlit produce an open-source framework for researchers and teams alike, development time has been immensely reduced. I cannot emphasize this point enough.
Given this, no Data Scientist or Machine Learning Researcher can ever complain about not being able to deploy work. Nor can they complain about getting an MVP running. Streamlit have done all the hard work.
Amazing job guys! | https://towardsdatascience.com/the-end-of-flask-in-data-science-738920090ed9 | ['Mohammad Ahmad'] | 2020-06-27 09:37:45.212000+00:00 | ['Programming', 'Artificial Intelligence', 'Data Science', 'UX', 'Machine Learning'] |
4,113 | Designing your Company Architecture on Google Cloud Platform | Photo by Austin Distel on Unsplash
Introduction
In this blog, I am going to cover the basic aspects of setting up your company architecture on Google Cloud. It is essential that the infrastructure you develop has high cohesion and low coupling; such an architecture helps you scale your target services and apps at an incredible speed without worrying about changes affecting your entire workflow. Also, a well-defined and structured architecture enables faster bug tracking and fixing and prevents the problem of single-point failure.
Google Cloud Platform Resource Hierarchy
Let’s first understand how the resource hierarchy needs to be set up on Google Cloud Platform. While designing your workflow, you can take a top-down or bottom-up approach, whichever you prefer. Understanding these concepts is easier if you take a bottom-up approach; however, I strongly recommend that you keep the top-down approach in mind as well when you actually begin setting up your architecture.
GCP Resource Hierarchy
If you have worked on any cloud platform before, such as AWS, Azure, GCP, or DigitalOcean, you must be familiar with Virtual Machines or EC2 instances; these comprise your CPU/GPU instances. Your Virtual Machines are organized in projects, and your projects are in turn organized in folders. Folders can be nested inside parent folders, and at the top, your folders come under the Organization Node, the root of your company architecture.

Folders are optional in GCP; however, I strongly recommend that you use folders inside an organization, as a well-defined folder structure will make your life a lot easier further down the line when you want to create teams and grant IAM permissions to your team members or virtual instances. Technically, you can assign permissions at any of three levels: the Organization Node, a folder, or a project. I personally prefer assigning IAM permissions at the folder level, as it makes the management of teams and resources much easier and more uniform in a team environment. Only the project Owners and Admins should have organization-level permissions. Because IAM permissions are inherited downwards, folders are the best place to assign them.

Often in a startup environment, companies end up setting up their entire tech stack inside a single project. Naturally, having everything in a single place makes life easier at first, as you won’t have to create VPCs or subnets to make shared resources available to your other projects. However, when you start growing from a startup toward an enterprise, you will realize that doing everything inside a single project was not a good idea at all. Creating separate projects and a folder structure might seem daunting and complex at first, but trust me, it is all worth it. The levels of the hierarchy provide trust boundaries and resource isolation in your organization.
Here’s an example of how you might organize your resources. There are three projects, each of which uses resources from several GCP services. Resources inherit the policies of their parent resource. For instance, if you set a policy at the organization level, it is automatically inherited by all the projects under it. And this inheritance is transitive, which means that all the resources in those projects inherit the policy too.
Google Cloud Platform Console
This is the place where you manage everything. Cloud Console is the place where you can switch between your various projects. All Google Cloud Platform resources belong to a Google Cloud Platform Console project. Thus, a project is the place where all your services and apps live.
Key Features of the Project are :
● Track resource and quota usage.
● Enable billing.
● Manage permissions and credentials.
● Enable services and APIs.
Some people think that creating numerous projects would result in higher bills, but this is a false assumption, as projects are billed and managed separately. Your bill depends on the resources you use, so it doesn’t matter which project they are in; the billing amount would be exactly the same. Moreover, separate projects might actually help you understand which project costs you how much.
Cloud Security
Security requires collaboration between the consumer and the provider. One of the major advantages of using any cloud platform is that you don’t have to worry about the physical security of your resources, i.e., VM instances, since setting up on-premise security for your data centers and servers is often not feasible, especially for startups and new companies. On-premise, you would also have to worry about power outages due to some calamity or other reason. Cloud servers are backed up across various regions and numerous continents, so a backup system is available even if one region goes down due to unprecedented circumstances. That being said, consumers need to be careful while handling customer-managed security responsibilities, especially the setup of IAM roles across the members and engineers of your company, limiting each engineer’s access to the specific tasks they need to perform. Also, network settings should be configured carefully when exposing your apps through external DNS or publicly accessible IPs. Negligence here can become catastrophic if a hacker gets wind of your network vulnerabilities.
IAM Roles
There are three types of IAM Roles:
1. Primitive Roles: IAM primitive roles apply across all GCP services in a project. Primitive roles are broad. You apply them to a GCP project, and they affect all resources in that project.
These are the Owner, Editor, and Viewer roles. If you’re a viewer on a given resource, you can examine it but not change its state. If you’re an editor, you can do everything a viewer can do plus change its state. And if you’re an owner, you can do everything an editor can do plus manage roles and permissions on the resource. The owner role on a project lets you do one more thing too: you can set up billing. Often companies want someone to be able to control the billing for a project without the right to change the resources in the project, and that’s why you can grant someone the billing administrator role.
Primitive roles provide segregation at a higher level, but I personally do not prefer primitive roles for all tasks, because real-life production scenarios are not as ideal as primitive roles make them seem. We often need fine-grained roles to create appropriate segregation of resources and access across all members of our organization.
2. Predefined Roles: These roles apply to a particular GCP service in a project. GCP services offer their own sets of predefined roles, and they define where those roles can be applied. This is the role management that any initial-stage startup should use. However, these roles are often better defined at the folder level rather than at the user level in an organization: since these are very fine-grained privileges, maintaining them at the user level becomes a tedious task as the organization grows and the number of employees increases; it is simply not a viable option.
3. Custom Roles: These roles let you define a precise set of permissions. What if you need something even finer-grained? That’s what custom roles permit. A lot of companies use a “least-privilege” model, in which each person in your organization the minimal amount of privilege needed to do his or her job. So, for example, maybe I want to define an “instanceOperator” role, to allow some users to stop and start Compute Engine virtual machines but not reconfigure them. Custom roles allow me to do that.
Using custom roles is a higher level of IAM role management, and early-phase startups should avoid them, as setting them up and maintaining them is a herculean task that needs a good, solid team. If you decide to use custom roles, you’ll need to manage the permissions that make them up. Some companies decide they’d rather stick with the predefined roles. Second, custom roles can only be used at the project or organization levels. They can’t be used at the folder level.
Service Accounts
Service accounts are used when you want to grant access to a resource rather than to a person. For instance, maybe you have an application running in a virtual machine that needs to store data in Google Cloud Storage. But you don’t want to let just anyone on the Internet have access to that data; only that virtual machine. So you’d create a service account to authenticate your VM to Cloud Storage. Service accounts are named with an email address, but instead of passwords, they use cryptographic keys to access resources.
In this simple example, a service account has been granted Compute Engine’s Instance Admin role. This would allow an application running in a VM with that service account to create, modify, and delete other VMs.
Here’s a more complex example. Say you have an application that’s implemented across a group of Compute Engine virtual machines. One component of your application needs to have an editor role on another project, but another component doesn’t. So you would create two different service accounts, one for each subgroup of virtual machines. Only the first service account has privilege on the other project. That reduces the potential impact of a miscoded application or a compromised virtual machine.
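To make the Cloud Storage scenario above concrete, here is a minimal Python sketch using the google-cloud-storage client library; the key file, bucket, and object names are placeholders, not real resources:

from google.cloud import storage

# Authenticate as the service account rather than as a person.
client = storage.Client.from_service_account_json('service-account-key.json')
bucket = client.bucket('my-app-data')  # placeholder bucket name
blob = bucket.blob('reports/output.csv')  # placeholder object path
blob.upload_from_filename('output.csv')

On a VM in production, you would typically attach the service account to the instance and let the client library pick up the credentials automatically rather than shipping a key file.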
Summary
Company Hierarchy: The hierarchical setup of your company is a very important task and should never be done in a rushed manner, as it is the base for your entire tech stack and your product; spend as much time as needed to set up the best infrastructure for your organization. An Organization Node > Folders > Projects > Resources architecture, customized for your organization, is a good option to begin with.
Security: Good collaboration needs to be maintained between the customer and the provider in order to keep your apps and services highly secure. Customers mainly need to work at the IAM role level and the network level to secure their organization.
Role Management: Startups should prefer using Predefined Roles allocated at a folder level or use service accounts for the role management purpose. | https://medium.com/swlh/designing-your-company-architecture-on-google-cloud-platform-be705de7eb64 | ['Arneesh Aima'] | 2020-05-19 14:48:17.482000+00:00 | ['Permission', 'Startup', 'Infrastructure', 'Google Cloud Platform', 'Cloud'] |
4,114 | Three Writers to Revisit (or Discover) as Black History Month Ends | Three Writers to Revisit (or Discover) as Black History Month Ends
Only one is famous, but all three broke new ground
Three very different writers . . .
Poet, playwright, activist, and educator Amiri Baraka (1934–2014) was famously controversial, under more than one name. Teacher, mentor, poet, and novelist Margaret Walker (1915–1998) broke new ground for women of color and changed the way black families were depicted in fiction. Attorney, poet, and storyteller Samuel Alfred Beadle (1857–1932) captured vignettes of black life in the south at the turn of the twentieth century.
Very different. But all three made significant contributions to modern American literature — and together, they represent a span of time that runs from the Civil War to the election of Barack Obama.
I have the privilege of knowing something about both Walker and Beadle because for a long time I wrote biographies of poets for Chadwyck-Healey’s reference series Literature Online. And LION strives to include in its database not only the most famous figures from every era, but also those who may be almost invisible to history.
One project involved profiling writers chosen for the Yale Younger Poets Series, which began in 1918. On that list was Margaret Walker—the first black woman ever to win a national literary prize in America.
In fact, she was the first person of color to be included in the Yale Younger Poets Series, which selects for publication only one outstanding poet under the age of thirty each year. In 1942, Walker’s impassioned collection For My People was chosen by the Series editor Stephen Vincent Benét (then one of America’s most popular poets).
Walker went on to become a respected educator and a successful writer, whose epic novel Jubilee was among the first works of fiction to present a realistic picture of black life in the time of slavery. Written over a period of nearly three decades, Jubilee was finally published in 1966, just after Walker earned a doctoral degree from the University of Iowa. And over the next twenty years, her pathbreaking novel was translated into seven languages, and sold more than a million copies.
Jubilee tells the story of Vyry, a character closely based on Walker’s maternal great-grandmother. The first part focuses on Vyry’s life as a slave and her complicated marriage to a free black man. The second depicts the destruction and violence of the Civil War, while the third follows Vyry’s struggles to establish a home for her family after their emancipation. The book’s fifty-eight chapters are rich with details of daily life, stories drawn from folklore, and scenes from history. Through all her trials, Vyry emerges as a resilient, even heroic woman who manages to maintain a strong spirit, but refuses to limit her freedom of mind with the burden of hate.
Throughout her life, Margaret Walker was an advocate for women of color and an outspoken commentator on issues of race and gender equality. After graduating from Northwestern University when she was just twenty, Walker worked for the Federal Writers Program (a project of the Depression-era Works Progress Administration), and became part of a politically engaged Chicago writing group led by controversial novelist Richard Wright. These experiences — in conjunction with a deep Christian faith — shaped not only her poetry, but her career as a teacher and mentor.
In 1949 Walker moved to Mississippi, where she taught for more than twenty years at Jackson State University, raised a family, and founded the Institute for the Study of the History, Life, and Culture of Black People.
By the time Walker arrived, Mississippi was very different from the state where Samuel Alfred Beadle lived for most of his life. He was among a number of young black men who studied law and became attorneys in the late nineteenth century, hoping to improve a justice system still severely biased by racism. He was also one of eight black writers in Mississippi whose works were published in the early twentieth century, and gained some recognition outside the state.
Despite the fact that he lived in such a politically charged time, Beadle focused most of his first volume — Sketches from Life in Dixie — on traditional themes such as love and courtship, spiritual reflection, and the follies of youth. It contained seven short stories and more than fifty poems, including a long heroic fantasy reminiscent of the Pre-Raphaelites. But there were also commentaries on the problems faced by black citizens, portrayed most notably in the poem “LINES. Suggested by the Assaults made on the Negro Soldiers as they passed through the south on their way to and from our war with Spain.”
“LINES” describes the experience of black soldiers who endured sometimes violent racial discrimination from their own countrymen. But with the poem’s refrain, Beadle returns always to patriotism and love of country.
For three decades, Beadle maintained a successful legal practice, but only by confronting many difficult challenges — and in the preface to his second poetry collection, Lyrics of the Underworld (1912), he expressed frustration with social conditions in the south. Featured in the book were sixteen striking photographs by Beadle’s son, who went on to become one of Mississippi’s best known black photographers.
In 1930, Beadle moved to Chicago — reversing the path taken by Margaret Walker — but lived there for only a short time before his death.
By contrast to those two writers, and their journeys between the deep south and the midwest, Amiri Baraka lived for most of eighty years in or near New York City. Born in 1934 as Leroy Jones, he grew up in Newark, New Jersey in a middle-class family, and earned a scholarship to Rutgers University. But within a year, as he began what would be a long-running quest for self-realization, Jones transferred to historically black Howard University.
By the time he graduated, Jones was disillusioned with what he saw as an emphasis on upward mobility at Howard, and decided to enlist in the Air Force rather than immediately pursuing a career. Along the way, he had changed the spelling of his name from Leroy to LeRoi.
While stationed in Puerto Rico, Jones began an ambitious self-directed reading program, focusing on literature (especially poetry), politics, and economics. These studies were conspicuously outside the Air Force mainstream, however, and raised suspicions that Jones might be a communist sympathizer — leading eventually to his dishonorable discharge.
But by then, LeRoi Jones had found his direction. In the years between 1958 and 1965, he became the only black writer to carve out a place in Manhattan’s frenetic, Beat-inspired literary scene. In addition to co-founding two small but influential publications, he attracted increasing attention as both a poet and a playwright. By 1964, when his controversial, racially charged play Dutchman won a prestigious Obie Award, Jones was one of the most talked-about writers in New York.
But a year later, outraged by the assassination of Malcolm X, he rejected the mostly-white Manhattan milieu, moved to Harlem, and started a short-lived black arts program. After that he returned to his home town of Newark, converted to Islam, and changed his name several times — ending up as Amiri Baraka. For a while he was involved with the Black Nationalist movement, but following a trip to revolutionary Cuba, he became an outspoken proponent of Third World Marxism.
These years of political activism and realignment were reflected in perhaps his most important poetry collection, Hard Facts, 1973–1975. And by the end of the 1970s, Baraka had established himself as both an important writer and an impassioned advocate for black identity.
In 1980, Baraka’s complex life took yet another turn. He joined the faculty of African Studies at SUNY-Stony Brook, where he would teach for the next two decades, and soon began a period of sustained literary accomplishment. His work garnered a Poetry Award from the National Endowment for the Arts (1981); a New Jersey Council for the Arts award (1982); an American Book Award from the Before Columbus Foundation (1984); a PEN-Faulkner Award (1989); the Langston Hughes Medal for outstanding contributions to literature (1989); a Foreign Poet Award from the Ferroni Foundation (1993); and the Playwright’s Award, Winston-Salem Black Drama Festival (1997).
After retiring from Stony Brook at the end of the century, Amiri Baraka once again became a controversial figure —expressing and later recanting anti-Semitic views about the World Trade Center bombing, then refusing public pressure to relinquish his appointment as poet laureate of New Jersey. But eventually the uproar died down, and although his literary work received much less attention in later years, Baraka was revered as a public figure. He continued writing until his death in 2014, and among his late works was an impassioned appreciation of Margaret Walker, who had passed away in 1998.
Looking back at these three figures, we see a continuing pattern of courage and determination. Samuel Alfred Beadle had the courage not only to pursue justice but to write poetry in a time and place still grappling with the very issues that had led to civil war. Margaret Walker had the courage to break through racial barriers in the American literary establishment, and the determination to transform her family’s experiences into a unique work of historical fiction. Amiri Baraka had the courage to express controversial ideas, and the creative persistence to reinvent himself several times over.
Through their work, each of these writers shed light on the experience of being black in America, and for that they deserve to be not only remembered but greatly appreciated.
Especially now. | https://medium.com/literally-literary/three-writers-to-revisit-or-discover-as-black-history-month-ends-33a77817fbfe | ['Cynthia Giles'] | 2020-02-27 02:26:51.676000+00:00 | ['Writing', 'Literature', 'History', 'Women', 'Essay'] |
4,115 | Learn NLP the Stanford Way — Lesson 2 | In the previous post, we introduced NLP. To find out word meanings with the Python programming language, we used the NLTK package and worked our way into word embeddings using the gensim package and Word2vec.
Since we only touched the Word2Vec technique from a 10,000-foot overview, we are now going to dive deeper into the training methods used to create a Word2vec model.
Word2vec family
Word2vec (Mikolov et al., 2013)[1][2] is not a single technique or algorithm. It’s actually a family of neural network architectures and optimization techniques that can produce good results when learning embeddings for large datasets.
The network architectures are shallow, composed of two layers, and are trained to produce vector representations of words given their context.
The two model variations that can be used are:
Continuous Bag of Words (CBOW), and
Skip-gram.
Training algorithms
Continuous Bag of Words
The CBOW model is based on trying to predict a central word from the context words around it.
We select a few words from a fixed-size window (the authors recommend a window size of around 5 for this technique), create a dictionary containing the words and their frequencies, and train the model by predicting the central word from the bag of words. The CBOW model doesn’t take the order of the words inside the “bag” into consideration.
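As a toy illustration of the windowing idea (a sketch, not any library’s internals), extracting the context for a central word with a window of size 2 could look like this in Python:

sentence = 'the quick brown fox jumps over the lazy dog'.split()
window, center = 2, 4  # central word: 'jumps'
context = sentence[center - window:center] + sentence[center + 1:center + 1 + window]
# context -> ['brown', 'fox', 'over', 'the']; CBOW predicts 'jumps' from this bag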
Skip-gram
With the Skip-gram model, we predict outside words given a central context word. It works in the opposite way of the CBOW model. With this method, the authors recommend using a window of size 10.
On performance and accuracy: The CBOW model is faster than the Skip-gram, but the Skip-gram architecture works better with infrequent words.
Training Techniques
While the word embeddings created by the network can express the relationships between words, the network itself presents scalability issues. Depending on the vocabulary size, the number of operations needed to calculate the network's output layer is huge. Here are some techniques that are frequently used with Word2vec networks:
Hierarchical Softmax
The hierarchical softmax technique, proposed by Morin and Bengio[1], is applied due to the sheer size of regular vocabularies. In a regular neural network output layer using the softmax function, the computing power needed to produce a probability distribution over a full-sized vocabulary in any given language would be extremely large. We can formalize this: given a vocabulary of size V, the complexity is O(V).
With hierarchical softmax, the complexity is O(log2(V)) instead of O(V). That is achieved through the use of a multi-layer binary tree to calculate the probability of each word.
Simple exercise
Imagine that we are working with the English vocabulary, which in some libraries is represented by roughly 2 million word embeddings, so V = 2,000,000. That implies a computational cost of O(V) => O(2,000,000). Using hierarchical softmax, we would instead work with O(log2(V)) => O(log2(2,000,000)) => O(~21).
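You can verify the arithmetic in a couple of lines of Python:

import math

V = 2_000_000
print(math.log2(V))  # ~20.9 operations per word, versus 2,000,000 for a full softmax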
If you are searching for a more technical, in-depth explanation, I recommend this blog post.
Negative Sampling
The intuition behind negative sampling, presented by Tomas Mikolov, Ilya Sutskever, Kai Chen, Greg Corrado, and Jeffrey Dean[3], is to update only a subset of weights in the training process, affecting only the target (positive) word and a few of the non-related (negative) words, chosen using a “unigram distribution” in which more frequent words are more likely to be selected as negative examples.
An illustration of the computations needed in a regular skip-gram model and one with negative sampling — Source
Using negative sampling, the computational cost is dramatically lower than with the regular softmax version, since it transforms one huge multi-class classification task into a few binary classification tasks.
Implementing different Word2vec models using gensim
You can duplicate my Deepnote notebook here and follow me as I walk through this project for the best experience.
We will use gensim, a Python library, to create different Word2vec models from the same corpus, just passing different parameters to the Word2Vec class constructor.
First, we import the necessary packages and download the corpus:
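The notebook’s code is not reproduced in this post, so the following is a sketch; the choice of the text8 corpus is my assumption, and gensim’s downloader module is used to fetch it:

import gensim.downloader as api
from gensim.models import Word2Vec

corpus = api.load('text8')  # an iterable of tokenized sentences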
Then we can create different Word2Vec models using the downloaded corpus and different parameters:
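A hedged sketch of what those calls could look like, varying only the sg, hs, and negative parameters described below:

cbow_hs = Word2Vec(corpus, sg=0, hs=1, negative=0)  # CBOW + hierarchical softmax
cbow_ns = Word2Vec(corpus, sg=0, hs=0, negative=5)  # CBOW + negative sampling
skipgram_hs = Word2Vec(corpus, sg=1, hs=1, negative=0)  # Skip-gram + hierarchical softmax
skipgram_ns = Word2Vec(corpus, sg=1, hs=0, negative=5)  # Skip-gram + negative sampling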
The following parameters are passed to the constructor to define the training algorithm and optimization technique used (source):
sg ({0, 1}, optional) — Training algorithm: 1 for skip-gram; otherwise CBOW.
hs ({0, 1}, optional) — If 1, hierarchical softmax will be used for model training. If 0, and negative is non-zero, negative sampling will be used.
negative (int, optional) — If > 0, negative sampling will be used; the int for negative specifies how many “noise words” should be drawn (usually between 5–20). If set to 0, no negative sampling is used.
GloVe
GloVe: Global Vectors for Word Representation, presented by Jeffrey Pennington, Richard Socher, and Christopher D. Manning, is another model for producing word embeddings.
GloVe is an unsupervised learning algorithm for obtaining vector representations for words. Training is performed on aggregated global word-word co-occurrence statistics from a corpus, and the resulting representations showcase interesting linear substructures of the word vector space. — Stanford GloVe
The main intuition is to scan through the whole corpus and compute the co-occurrence statistics for each word given a context. You can picture a matrix, with rows being the words and columns being the different contexts. Then you would reduce the dimensionality of each row to represent a word vector by factoring the matrix.
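For reference, the GloVe paper casts this idea as a weighted least-squares objective over the co-occurrence counts X_ij, where w and w̃ are the word and context vectors, b and b̃ their biases, and f a weighting function that caps very frequent pairs:

$$J = \sum_{i,j=1}^{V} f(X_{ij}) \left( w_i^{\top} \tilde{w}_j + b_i + \tilde{b}_j - \log X_{ij} \right)^2$$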
GloVe combines two model families: the local context window method and the global matrix factorization method.
The main difference between the two is that while the Word2vec model uses local contexts and a shallow neural network, the GloVe model is based on local and global word co-occurrence and uses the matrix factorization method.
Using GloVe with gensim
Using GloVe with gensim is really easy. You can use the api package to download a trained GloVe model.
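(A minimal sketch, assuming the pretrained "glove-wiki-gigaword-100" model from the gensim-data repository.)

```python
import gensim.downloader as api

# Download 100-dimensional GloVe vectors trained on Wikipedia + Gigaword.
glove = api.load("glove-wiki-gigaword-100")
print(glove.most_similar("coffee", topn=3))
```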
You can also convert a GloVe model to a Word2Vec model in gensim using the glove2word2vec script.
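(A sketch of the conversion; the file names are placeholders for a GloVe model downloaded locally.)

```python
from gensim.scripts.glove2word2vec import glove2word2vec
from gensim.models import KeyedVectors

# Prepend the word2vec-style header to the GloVe text file,
# then load the result as regular word2vec-format vectors.
glove2word2vec("glove.6B.100d.txt", "glove.6B.100d.w2v.txt")
vectors = KeyedVectors.load_word2vec_format("glove.6B.100d.w2v.txt")
```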
Word Senses
Now we know how to create and use word embeddings created by the Word2vec and GloVe models. But still — are those vectors enough to represent words accurately in different contexts?
Common or long-lived words can have several meanings. How can we create embeddings that can capture all meanings from a word?
Linear Algebraic Structure of Word Senses, with Applications to Polysemy
Sanjeev Arora, Yuanzhi Li, Yingyu Liang, Tengyu Ma, and Andrej Risteski [4] propose a solution: represent the different senses of the same word using a linear superposition — that is, build the word embedding as a weighted average of the sense embeddings, weighted by their frequencies.
The different senses of the word tie — Taken from the second class slides.
Considering that the vector embedding space is high dimensional and sparse, we can reconstruct the different sense vectors from just the weighted average — or the linear superposition — of the senses.
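Written out, with f_i the frequency of sense i, the word vector is (approximately) the frequency-weighted average of its sense vectors:

$$v_{\text{tie}} \approx \alpha_1 v_{\text{tie}_1} + \alpha_2 v_{\text{tie}_2} + \alpha_3 v_{\text{tie}_3}, \qquad \alpha_i = \frac{f_i}{\sum_j f_j}$$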
Conclusion
Next, we will discuss word window classification, neural networks, and PyTorch, topics of the Stanford course’s second lecture. I hope you enjoyed reading this post. If you have any questions, feel free to leave a comment.
Thank you for your time.
Take care, and keep coding!
References
[1] F. Morin and Y. Bengio, “Hierarchical Probabilistic Neural Network Language Model,” AISTATS, 2005.
[3] T. Mikolov, I. Sutskever, K. Chen, G. Corrado, and J. Dean, “Distributed Representations of Words and Phrases and their Compositionality,” NIPS, 2013.
[4] S. Arora, Y. Li, Y. Liang, T. Ma, and A. Risteski, “Linear Algebraic Structure of Word Senses, with Applications to Polysemy,” TACL, 2018.
Software and Libraries | https://towardsdatascience.com/learn-nlp-the-stanford-way-lesson-2-7447f2c12b36 | ['Thiago Candido'] | 2020-12-07 17:25:43.244000+00:00 | ['Programming', 'NLP', 'Python', 'Data Science', 'Machine Learning']
4,116 | Build Your First iOS App Using Xcode Storyboard (A COVID-19 App) | Journal about apps development for business and eCommerce from GITS Indonesia, a Google Certified Agency and Google Cloud Partner. | Website: gits.id
Follow | https://medium.com/gits-apps-insight/bangun-aplikasi-ios-pertamamu-menggunakan-xcode-storyboard-aplikasi-covid-19-866cbdd5ed14 | ['Muhammad Rahman'] | 2020-05-12 06:28:11.620000+00:00 | ['Mobile App Development', 'iOS App Development', 'Storyboard', 'iOS', 'Xcode']
4,117 | The Quality of Information You Consume Can Determine the Quality of Your Life | The Quality of Information You Consume Can Determine the Quality of Your Life
6 websites that can make you a smarter person.
The internet is teeming with websites. The majority of them are not worth your time. Think of websites like Buzzfeed, Bored Panda or any other site that is chock full of clickbait and low-quality posts.
Despite this, there are corners of the internet where you can find high-quality websites full of meaningful and insightful posts. The quality of the information you consume determines the quality of your life. If you only frequented sites that are full of conspiracy theories, you’d view the world very differently from someone who didn’t.
The information age has made knowledge more accessible than ever, but it also means we have to sift through the rubbish before we find something valuable.
Learning is a lifelong pursuit and one that doesn’t finish when you leave school or university. With information available at our fingertips, you’re doing yourself a disservice if you’re not seeking to learn more. Lifelong learning is the best way to improve your career chances and life.
The internet is arguably the best place to learn due to the resources it contains. Twenty years ago, if we wanted to learn about a specific topic, we had to turn to an encyclopedia or buy a book on the topic.
Now, we can hop on our phones and learn about the intricacies of the biases which populate the human mind, or about chaos theory, in a matter of seconds. When doing so, there are a few places you should turn to first before you go down the rabbit hole of a Google search.
These websites are among the best on the internet for providing insightful posts on a wide range of topics. By frequenting these sites not only will you learn a lot about many different things, but you’ll also become smarter too as you fill in the gaps in your knowledge and seek to plug the new gaps that arise. | https://medium.com/mind-cafe/the-quality-of-information-you-consume-can-determine-the-quality-of-your-life-2845c0f2b44c | ['Tom Stevenson'] | 2020-12-25 14:27:54.084000+00:00 | ['Life', 'Self Improvement', 'Education', 'Learning', 'Productivity']
4,118 | Jobs for Your Personality: How to Own Your INFJ Career | If you’re in the middle of a job hunt, you’re probably weighing up the usual considerations.
Money, travel, responsibility. The usual.
But there’s one thing we often overlook. Something that shouldn’t just affect our job choices, but shape our entire career.
I’m talking about our personality type.
According to the Myers-Briggs Type Indicator, the most popular personality test of its kind, there are 16 personality types. Not sure what type you are? Take the test for yourself.
The latest in our Jobs for Your Personality series, this article focuses on INFJ personalities or INFJs for short.
But what makes an INFJ? Well it’s defined by four key character traits:
Introversion
Intuition
Feeling
Judgement
Creative, ambitious and idealistic, INFJs recognise the need for change and take the necessary steps to make it happen, fighting tirelessly for their cause. As such, they are often called ‘Advocates’.
But Advocates are extremely rare. In fact, INFJ is the rarest personality type in the world.
So if you’re an INFJ, what skills can you offer? And what career path should you pursue? Let’s take a look.
INFJ Careers advice
While INFJs have an innate ability to perceive other people’s feelings, they are often misunderstood by those around them. So what makes them tick? And do any of these strengths and weaknesses resonate with you?
INFJ Strengths
Passionate — INFJs are fiercely determined. They will stop at nothing to support their cause, even if it means ruffling a few feathers along the way.
Decisive — Unlike some personality types like INFP, INFJs don’t let their inspiration go to waste. Blessed with great willpower, they make excellent decision makers.
Altruistic — INFJs fight for what’s right. They want positive change for everyone, not just themselves.
Creative — Compassionate and wildly imaginative, INFJs are naturally creative. They also tend to make excellent writers and orators.
INFJ Weaknesses
Perfectionist — As INFJs are so committed to their cause, work opportunities and relationships can suffer in their pursuit of perfection.
Private — Driven by a need to present their best possible selves, INFJs sometimes find it hard to let their guard down around friends, family and colleagues.
Exhaustion — Because INFJs give it their all, they can succumb to exhaustion if they don’t find a way to let off steam.
Best Jobs for INFJ
So what’s the bottom line?
Well, their intuitive and empathetic temperament makes INFJs a natural fit for careers in healthcare, education and public service.
It’s no surprise then that famous INFJs include Martin Luther King and Mother Teresa. And while many INFJs explore careers in charity work and advocacy, there are a range of other paths to consider. Here are just a few.
Psychologist
Wonderful listeners and deeply empathetic, INFJs can study and evaluate human behaviour like few others.
Counsellor
Driven by a desire to connect with others, INFJs make wonderful counsellors. Whether that’s in schools, hospitals or private practices.
Scientist
The solitary surroundings of the lab are perfect for INFJs. Here they can align their desire for change with their strong work ethic.
Teacher
Inspirational, motivational and compelling, INFJs have all the traits of a perfect teacher.
Writer
INFJs are wonderful communicators. That’s why careers in writing, both creative and professional, tend to appeal.
INFJ Careers in Business
You might be wondering.
Can INFJs thrive in a business environment? Of course they can!
But to really succeed, INFJs need to find the moral objectives in their work. That’s why high wages and seniority may not necessarily appeal.
And while the collaborative structure of a corporate environment may hamper their strong personal goals, there are a great selection of INFJ careers in business worth considering.
Entrepreneur
Advocates are more likely than other personality types to go it alone. Entrepreneurship allows INFJs to steer their business to their own moral compass.
HR
Aside from being good judges of character, INFJs have the organisational ability to manage the many facets of human resources.
Corporate trainer
Just like teaching, corporate training allows INFJs to exercise their inspirational qualities to bring about positive change.
INFJ Careers to Avoid
While INFJs are capable enough to succeed in any field, there are some careers that may jar with their personality.
Accounting
Routine work like accounting or data analysis may leave INFJs feeling unfulfilled.
Politics
The public scrutiny and regular conflict of politics may dilute their will for change.
Sales
High pressure and tight deadlines often feel unimportant to INFJs.
Putting INFJ to Good Use
Here’s the thing.
Any personality type can thrive in any job. But finding a profession that aligns with your personality type may help you achieve long term job satisfaction. So how can you make the most of your INFJ personality?
Find your cause
To really thrive, INFJs need a cause to get behind. Whether that’s environmental change or life coaching, look for ways you can make a difference.
Find a great team
It’s important for INFJs to be able to grow and learn alongside those they’re working with. So find a team that will help you to help them.
Seek independence
Alternatively, you may prefer to work alone. If so, find a role where you can be productive without being swamped by others.
Focus on your skills
Remember, INFJs are intuitive, empathetic and altruistic. So let these traits guide your career choices.
A Final Word
More than anything else, INFJs need to be able to flex their creativity and insightfulness.
However, they also need to know that what they’re doing is in line with their principles and helping other people. With all that to consider, finding the perfect job is easier said than done.
But here’s the good news.
INFJs are incredibly intelligent. And while some INFJs struggle to pick a career path for fear of missing out on other opportunities, their creativity and imagination are invaluable in modern business. Not only that, their ability to turn concepts into concrete plans is a skill cherished in every industry.
Why not take the test for yourself? Or for more careers advice, visit our insights page.
This article was originally published on Advance | https://medium.com/heyadvance/jobs-for-your-personality-how-to-own-your-infj-career-2de478d0103c | [] | 2018-11-03 19:06:01.587000+00:00 | ['Careers', 'Teaching', 'Writer', 'Psychology', 'HR']
4,119 | Book Review: “Twilight of Democracy” (by Anne Applebaum) | Book Review: “Twilight of Democracy” (by Anne Applebaum)
Anne Applebaum’s new book is a haunting reminder of just how fragile democracies are and how easily they can be dismantled from within.
I’ve been meaning to read this book for a while, ever since I saw that Anne Applebaum was going to expand the essay that she wrote for The Atlantic into a full-length book. I’m glad that I’ve read it, but it’s a haunting book, that’s for sure. It’s a potent, and somewhat apocalyptic, reminder that democracies are only as strong as the people who live in them, that there is nothing about them that is ontologically strong and secure. In order to continue to function as they should, they require buy-in and maintenance and support and, above all, faith. Without those things they are subject to corrosion and destruction from within.
Applebaum focuses on three different countries that have seen a rising anti-democratic tide: Poland, the UK, and the US. Of the three, it’s Poland that has suffered the most dramatic reversals, as the Law and Justice Party has slowly risen to power on a tide of falsehoods, conspiracy theories, and reactionary conservatism. In the UK, meanwhile, a strange sort of nostalgia took hold in the years leading up to the Brexit vote, a yearning for a time when Britain actually did things in the world. As foolish and mendacious as he could be (and still is), Boris Johnson and others like him were able to seize control of the narrative and create enough of a groundswell to achieve the Brexit vote they desired.
However, it’s important to point out that what’s happened in Poland and the UK isn’t confined to those nations, and Twilight also includes a discussion of various other nations in Europe that have begun to struggle against the rising tide of nationalism. She discusses Spain and Hungary in particular detail, and as someone who isn’t particularly plugged in to European politics — except in the most general sense — it was rather distressing to see this reactionary sort of ideology has taken root all over the continent, aided and abetted by technology, which makes it significantly easier to spread disinformation. In the hands of such menacing figures as Viktor Orbán of Hungary, such power is very dangerous indeed, especially since he seems committed to nothing less than the rewriting of history itself (a common tactic among authoritarians everywhere).
Her discussion of the US by necessity includes a substantial analysis of Fox News and its role in the decline of faith in democracy. Applebaum focuses in particular on Laura Ingraham, who has gone from a relative unknown to one of the most powerful, and most Trumpian, voices at Fox News. Like so many others on the right, she has given into a certain sort of apocalyptic despair, which means that she is both willing to carry water for an authoritarian figure (part of a whole class of such people that Applebaum refers to as clercs) and smother the contradictions that such an action necessitates.
Throughout the book, Applebaum is as concerned with people as she is with processes, in that she often focuses on the individuals whose choices and political actions have led to the current state of affairs. There’s a potent truth here, for it’s a fact that no authoritarian is able to rise to power without at least some of those in positions of cultural authority buying in: the people whom Applebaum, following in the footsteps of Julien Benda, refers to as clercs. These are the people who make authoritarianism palatable to the masses, whether through their positions in powerful media or by distorting museum exhibits to support a dominant agenda (which has happened in Hungary).
Now, it has to be said that Applebaum does sometimes engage in a bit of both-sidesism, particularly when it comes to highlighting the supposed excesses of the left. She doesn’t dwell on it too much, but she does call out what she sees as the problems with cancel culture, which she believes smothers rational political debate. Given that she contributes to The Atlantic (which has made the alleged power of cancel culture one of its most frequently reported-on phenomena), and that she has worked with and for a number of conservative groups and individuals, I’m not terribly surprised at this line of attack on her part, nor do I think it undercuts the validity of the book’s more substantial arguments. However, the very fact that she puts them into the conversation at all shows the extent to which many of the most prominent voices in conservatism still cling to their old pieties, refusing to take accountability for their own culpability for the state in which we find ourselves. Indeed, one of the frustrating things about this book is Applebaum’s lack of self-reflection, particularly when it comes to her friendships with some of the very people that she criticizes (one has to wonder what type of person would be friends with Boris Johnson in the first place).
Twilight of Democracy is part of a growing body of work devoted to the study and analysis of what it is that makes democracies work and why they are so fragile, and for that reason it’s necessary reading. Even though Trump has been defeated at the ballot box and is due to leave the White House on January 20, it’s important to remember that, as of this writing, his assault on the electoral system itself is ongoing, with a “final showdown” set to take place on January 6, when Congress is supposed to certify the results from the Electoral College.
Given this context, this book is thus something of an intellectual call to arms. It reaches out to each of us, asking us to do our part to ensure that democracy doesn’t go the way of so many other failed political systems. As she reminds us near the end of the book: “no political victory is ever permanent, no definition of ‘the nation’ is guaranteed to last, and no elite of any kind […] rules forever.” There’s something more than a little terrifying about the idea that history is one long cycle, that every political victory must be re-fought again and again and again. But such, alas, is the nature of modernity. | https://medium.com/reluctant-moderation/book-review-twilight-of-democracy-by-anne-applebaum-878ed0bd3677 | ['Dr. Thomas J. West Iii'] | 2020-12-28 18:07:39.716000+00:00 | ['Politics', 'Democracy', 'History', 'Authoritarianism', 'Books'] | Title Book Review “Twilight Democracy” Anne ApplebaumContent Book Review “Twilight Democracy” Anne Applebaum Anne Applebaum’s new book haunting reminder fragile democracy easily dismantled within I’ve meaning read book ever since saw Anne Applebaum going expand essay wrote Atlantic fulllength book I’m glad I’ve read it’s haunting book that’s sure It’s potent somewhat apocalyptic reminder democracy strong people live nothing ontologically strong secure order continue function require buyin maintenance support faith Without thing subject corrosion destruction within Applebaum focus three different country seen rising tide antidemocratic tide Poland UK US three it’s Poland suffered dramatic reversal Law Justice Party slowly risen power tide falsehood conspiracy theory reactionary conservatism UK meanwhile strange sort nostalgia took hold year leading Brexit vote yearning time Britain actually thing world foolish mendacious could still Boris Johnson others like able seize control narrative create enough groundswell achieve Brexit vote desired However it’s important point what’s happened Poland UK isn’t confined nation Twilight also includes discussion various nation Europe begun struggle rising tide nationalism discus Spain Hungary particular detail someone isn’t particularly plugged European politics — except general sense — rather distressing see reactionary sort ideology taken root continent aided abetted technology make significantly easier spread disinformation hand menacing figure Viktor Orbán Hungary power dangerous indeed especially since seems committed nothing le rewriting history common tactic among authoritarian everywhere discussion US necessity includes substantial analysis Fox News role decline faith democracy Applebaum focus particular Laura Ingraham gone relative unknown one powerful Trumpian voice Fox News Like many others right given certain sort apocalyptic despair mean willing carry water authoritarian figure part whole class people Applebaum refers clercs smother contradiction action necessitates Throughout book Applebaum concerned people process often focus individual whose choice political action led current state affair There’s potent truth it’s fact authoritarian able rise power without support least position cultural authority buy Applebaum following footstep Julien Benda refers clercs people make authoritarianism palatable mass whether position powerful medium distorting museum exhibit support dominant agenda happened Hungary said Applebaum sometimes play bit bothsidesism particularly come highlighting supposed excess left doesn’t dwell much call see problem cancel culture see smothering rational political debate Given 
contributes Atlantic made alleged power cancel culture one frequently reportedon phenomenon worked number conservative group individual I’m terribly surprised line attack part think undercut validity book’s substantial argument However fact put conversation show extent many prominent voice conservatism still cling old piety refusing take accountability culpability state find Indeed one frustrating thing book Applebaum’s lack selfreflection particularly come friendship people criticizes one wonder type person would friend Boris Johnson first place Twilight Democracy part growing body work devoted study analysis make democracy work fragile reason it’s necessary reading Even though Trump defeated ballot box due leave White House January 20 it’s important remember writing assault electoral system ongoing “final showdown” set take place January 6 Congress supposed certify result Electoral College Given context book thus something intellectual call arm reach u asking u part ensure democracy doesn’t go way many failed political system reminds u near end book “no political victory ever permanent definition ‘the nation’ guaranteed last elite kind … rule forever” There’s something little terrifying idea history one long cycle every political victory must refought ala nature modernityTags Politics Democracy History Authoritarianism Books |
4,120 | Why “Data Looks Better Naked” | Let’s explore from a historical standpoint. This will allow us to better understand how what was once seen as “well designed” now looks “overly detailed”, and how the teachings of ‘why data looks better naked’ came about.
The knowledge behind why data looks better naked comes from the teachings of Edward Tufte, an artist and statistician. As a statistics professor at Yale University, Tufte has written, designed and published four books dedicated to data visualization. In 1983, he published his first book, The Visual Display of Quantitative Information, which focused on the theories and practices behind designing data graphics (statistical graphs, charts and tables). It was in this book that Tufte introduced the concept of “data-ink”. Data-ink is “the non-erasable core of the graphic, the non-redundant ink arranged in response to variation in the numbers represented”. He goes on to explain that we should “remove all non-data-ink and redundant data-ink, within reason.” Doing so creates a more cohesive graphical design when it comes to data visualization.
The below GIFs, created by designer Joey Cherdarchuk, illustrate the step-by-step process of “stripping away the excess” in order to make a graph visually “naked”.
Column Chart
Table Chart
Let’s follow the teachings of Edward Tufte with the support of Joey Cherdarchuk’s visuals when it comes to data representation. Though we are accustomed to the old style way of formulating data, let’s push forward in the direction of minimalism. To ensure that your data comes off as clear as possible, let’s strip down the data (rather than dress it up). This will make the data more “effective, attractive and impactive” when the method of “less is more” is put to use. | https://medium.com/comms-planning/why-data-looks-better-naked-ac2adb872378 | ['Naja Bomani'] | 2016-08-15 16:06:17.734000+00:00 | ['Simplicity', 'Minimalism', 'Design', 'Data Visualization', 'Data']
4,121 | Ionic & Felgo: App Development Framework Comparison | Cross-platform development is making a lot of noise in today’s dev world and there is a reason why. A shared codebase can save a lot of time if you want to target multiple platforms. There are several approaches for creating cross-platform applications. But which one is better? This time you will see the comparison of Ionic and Felgo.
Differences between Cross-Platform Frameworks
Before we start, let’s take a peek at the history of cross-platform development. In the early days of cross-platform mobile app development, apps were displayed in a WebView. A WebView is nothing more than a native browser window without any extra interface.
The HTML engine of the browser took care of rendering all app elements. The idea was to create and run a web application with a native look and feel. This way developers could deploy to many platforms. The platform just had to provide the browser technology. This approach is still used by many frameworks, including Ionic.
On the other hand, a standard web app running inside a browser cannot access all the functionalities of a target device that a modern app needs. That is why tools like Cordova became popular. It provided a web-to-native bridge. The bridge granted access to functionalities like localization in a WebView. Ionic also provides such a bridge with Capacitor. But in reality, it is nothing more than the good old Cordova with some upgrades.
In summary, if you want to create an application using the Ionic framework, you will need to use a web technology stack: HTML, CSS, and JavaScript. Other frameworks, such as AngularJS or React, would also be useful to give the app the desired modern feel.
Hybrid Frameworks and Rendering with a WebView
Hybrid Frameworks, like Ionic, render their content within a WebView. This WebView is wrapped with APIs to access native device features.
However, this approach has some disadvantages like:
The performance of your app depends on the internal version of the WebView used in the targeted OS. This dependency can cause different behaviors and performance characteristics on different OS versions (e.g. Android 6.0 vs 9.0).
You will depend on Apple and Google to add features and improve the performance of the WebView.
There are features that depend on the web engines, like WebKit and Chromium, on both iOS and Android. Some CSS properties and JavaScript standard features are examples of such dependencies. This makes maintainability harder, as you need to support multiple WebView browser versions and types.
Web renderers were designed to display websites or multimedia content in a browser. They do not render user interfaces & animations very efficiently. Because of that, performance is significantly slower compared to native apps.
The Felgo Approach
Let’s focus now on how Felgo handles cross-platform rendering. Qt with Felgo compiles real native applications without the need for a WebView. Felgo renders its UI elements with the Qt rendering engine built on C++ & OpenGL ES / Vulkan / Metal. This so-called “scene graph renderer” is optimized for performance. It also guarantees that the UI will look the same on any device & platform. Furthermore, it is also possible to keep your existing native iOS, Android, or C++ code. You can simply reuse your own native code with Felgo thanks to its architecture.
The core language behind Qt & Felgo is C++, which is famous for its performance and stability. However, it is not ideal for creating a modern UI and cutting-edge applications. So Qt introduced a new language called QML.
QML is a declarative language that lets you compose your UI as a tree of visual items, very similar to HTML. For adding application logic, QML relies on JavaScript. Developers can easily get started if they are familiar with these web technologies. Felgo comes with everything you need to build stunning applications in record time. To achieve native performance, all QML items actually translate to performant C++ components in the backend.
Your QML and JavaScript get executed and visualized by a highly optimized C++ renderer. Qt also compiles all components Just in Time (JIT) or Ahead of Time (AOT) if configured. This way, QML can achieve native performance.
Qt & Felgo not only allow you to develop cross-platform for iOS and Android. You can also run your applications on desktop, web and embedded systems.
Inside the Frameworks
The devil is in the details and that is why it’s crucial to take a look inside the architecture of both frameworks. Let’s start with Ionic. The browser renders your code and Ionic needs a bridge to access OS functionalities like a camera:
You have to rely on this bridge to access native features. It is not possible to build an application that directly uses these platform APIs.
But what about Felgo?
You won’t need any additional bridge to access the OS functionalities. You have direct access to all platform features with the native code in your application. This also includes the highly performant QML Engine, which is part of your Qt application:
This architecture ensures a consistent performance on all target platforms and devices.
Framework Business Potential
When considering business potential, there are some things to keep in mind. First is, of course, current staff experience.
When developing with Ionic, you need a team with quite a lot of knowledge about web app development. If they are lacking some of these skills, the training will take some time.
When considering Felgo, the main skill your team should have is knowledge of JavaScript, because QML is derived from it. As JS is one of the most popular programming languages, the probability that your fellow programmers have such ability is quite high. If you already work with programmers who have JavaScript knowledge then it’s easy to reuse their skills in the new Felgo project.
Another aspect to consider is the supported platforms. Apart from Web, Ionic supports only iOS and Android. With Felgo, you can deploy also to Windows, Mac, Linux, and embedded devices. The variety of platforms is much bigger when using Felgo.
Framework Documentation
Many developers consider documentation one of the most important factors, not only for learning a new technology but also for reducing development time. When creating an app, you will sooner or later bump into issues that require some additional knowledge. Documentation is the best place to look for it. If it is high quality, you will solve the problem in no time. Otherwise, you will struggle, scrolling through many pages and hoping to find a detailed answer.
Both Felgo and Ionic offer great documentation, to browse APIs, examples and demos.
Learning Curve Comparison
When taking the first steps with Ionic, you need to learn quite a lot of technologies like HTML, Sassy CSS, and JavaScript. On top of that, you should also know a front-end framework like Angular. It uses the TypeScript language, which you will also need to be familiar with. You might also use React to give the app the desired modern look and feel.
There’s a lot to learn if you aren’t an expert in web development but would like to create mobile apps with a cross-platform framework. Besides, Angular and React are not known for being easy to learn.
To learn Felgo, you need some QML skills and know JavaScript to write functions in QML. QML, due to its JSON-like notation, is very friendly for new users.
The gap between Ionic and Felgo’s necessary technology stack is rather big — especially if you are not specialized in any kind of web app technology.
To summarize, the learning curve of Ionic can be much steeper than Felgo’s. Especially when learning the chosen front-end JS framework at the same time.
Framework Pricing and Licensing
For personal usage or “low-budget” developers, both of the frameworks are free. If you’d like to include additional services and tools into your app, you can get professional plans to ensure that you get the most out of the solution. Felgo offers advanced features like analytics and push notifications. Whereas Ionic gives you more than 100 live updates per month in their paid licenses.
Hello World Mobile App Comparison
Architecture and functionalities are one thing. But a technology’s simplicity and clarity when learning it are a completely different matter. How can we compare these factors? It’s quite simple — let’s write a simple app!
Proceeding with Ionic, you can see right away that creating the logic and the design requires two separate files for every page. You’ll also need to write the code in two different notations: HTML and TypeScript.
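As an illustration, a hypothetical minimal page (not the exact code from the original post) with the template and the logic in separate files:

```html
<!-- home.page.html: the page template -->
<ion-header>
  <ion-toolbar>
    <ion-title>Hello World</ion-title>
  </ion-toolbar>
</ion-header>

<ion-content>
  <p>{{ greeting }}</p>
</ion-content>
```

```typescript
// home.page.ts: the matching component logic
import { Component } from '@angular/core';

@Component({
  selector: 'app-home',
  templateUrl: 'home.page.html',
})
export class HomePage {
  greeting = 'Hello Ionic!';
}
```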
Now, let’s look at the Hello World app written with Felgo:
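(The snippet below is a minimal reconstruction, assuming the Felgo 3.0 QML modules App, NavigationStack, Page and AppText; the post’s original code may differ slightly.)

```qml
import Felgo 3.0
import QtQuick 2.0

App {
  NavigationStack {
    Page {
      title: "Hello World"

      AppText {
        anchors.centerIn: parent
        text: "Hello Felgo!"
      }
    }
  }
}
```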
Run this code on your iOS or Android device now, with Live Code Reloading
Here you can see how you can create the logic and design in the same QML file. This has a positive impact on the entry-level of technology. QML is also easier to read than HTML, with less syntax overhead. This especially matters when dealing with large projects where a single page can contain many objects.
At the same time, the application logic written in TypeScript and in QML is quite similar, because both languages build on JavaScript syntax.
Comparing Integrated Development Environments
When comparing frameworks, it is also worth taking a look at integrated development environments (IDE), and what they can offer you to make development more efficient.
Felgo isn’t just a framework for cross-platform development. It also offers a whole set of tools that you can use throughout the entire lifespan of the application. Felgo comes with the full-featured Qt Creator IDE. You also have access to QML Hot Reload, which lets you view edits of QML code in real time. This feature comes with a tool called Felgo Live Server. It lets you deploy apps to multiple, real devices via a network. In the IDE, you have access to built-in documentation. Here you can find info about Felgo types as well as about all Qt classes. Once you write some code, you can use an integrated debugger and profiler to analyze your app’s execution flow.
In this matter, Ionic falls behind as it has no dedicated IDE. Thus, you need to rely on tools that are not fully adjusted to this framework.
With Felgo you also get access to Cloud Builds. This service allows you to build and release cross-platform applications to app stores like Apple Store and Google Play. You can integrate it with your code repository and CI/CD system, so you don’t need to do so manually on every platform. With Cloud Builds you don’t even need to have a MacBook to release iOS applications.
Cross-Platform Framework Comparison Overview:
What is the best cross-platform framework?
The answer to this question does not really exist — there is no silver bullet. Instead, you should ask “What framework is best for me and my project?”. Several factors can help you decide on a particular technology. To ease the decision-making process, you should ask yourself a few questions:
What programming language do you or your team have experience in?
What are the requirements of your app?
What tooling helps you to work more efficiently?
What platforms do you want to support, now and also in the future?
Do you have existing code you want to reuse?
Who can help you if you run into problems?
Every technology has its pros and cons and your use-case matters. If you are looking for a reliable, efficient, and easy-to-learn framework, you should definitely consider having a look at Felgo & Qt.
Related Articles:
QML Tutorial for Beginners
3 Practical App Development Video Tutorials
Best Practices of Cross-Platform App Development on Mobile
More Posts Like This
Flutter, React Native & Felgo: The App Framework Comparison
Continuous Integration and Delivery (CI/CD) for Qt and Felgo
QML Hot Reload for Qt — Felgo | https://medium.com/the-innovation/ionic-felgo-app-development-framework-comparison-ba84de105a20 | ['Christian Feldbacher'] | 2020-07-08 10:16:51.360000+00:00 | ['Mobile App Development', 'Programming', 'Technology', 'Apps', 'Framework']
4,122 | Watson Text to Speech Releases 5 New Neural Voices! | We are pleased to announce that IBM Watson Text to Speech, a cloud service that enables users to convert text into natural-sounding audio, has introduced five new neural voices (four US English voices and a German voice). These new voices are now generally available in our public cloud offering.
Take A Listen!
Click on the names to listen to the new voice samples:
US English — Emily
“If you know your party’s extension number, you can enter it at any time. For Sales and Customer Service, press 1.”
US English — Kevin
“For all other inquiries, please stay on the line, and a representative will be happy to assist you.”
US English — Henry
“Our business hours are Monday through Friday from 8 am to 7 pm except on major holidays. Please leave a message with your name, contact information, and the nature of your call and someone from the appropriate department will contact you on the next business day.”
US English — Olivia
“All of our agents are currently busy. Please hold, and we will answer your call as soon as possible.”
German — Erika
“Alle unsere Mitarbeiter sind derzeit im Gespräch. Bitte bleiben Sie dran, wir werden Ihren Anruf so schnell wie möglich weiterleiten.”
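If you want to generate audio with one of these voices programmatically, below is a minimal Python sketch using the ibm-watson SDK. The credentials are placeholders, and the voice ID is an assumption based on Watson's V3 voice naming convention:

from ibm_watson import TextToSpeechV1
from ibm_cloud_sdk_core.authenticators import IAMAuthenticator

# Placeholder credentials from your IBM Cloud service instance
authenticator = IAMAuthenticator('YOUR_API_KEY')
text_to_speech = TextToSpeechV1(authenticator=authenticator)
text_to_speech.set_service_url('YOUR_SERVICE_URL')

# Synthesize one of the sample lines with the new Olivia voice
with open('olivia_sample.wav', 'wb') as audio_file:
    result = text_to_speech.synthesize(
        'All of our agents are currently busy. Please hold.',
        voice='en-US_OliviaV3Voice',  # assumed voice ID
        accept='audio/wav'
    ).get_result()
    audio_file.write(result.content)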
Learn More
Interested in discovering our TTS capabilities, languages and voice technologies? Click here to learn more.
Try out our TTS languages and voice technologies for yourself with this demo. Or read the science behind the technology of our new neural voices in our whitepaper: “High quality, lightweight and adaptable TTS using LPCNet”. | https://medium.com/ibm-watson/watson-text-to-speech-releases-5-new-neural-voices-2476863c5e23 | ['Vijay Ilankamban'] | 2020-03-14 14:01:00.991000+00:00 | ['Artificial Intelligence', 'Speech Recognition', 'Machine Learning', 'Announcements', 'Watson Text To Speech'] | Title Watson Text Speech Releases 5 New Neural VoicesContent pleased announce IBM Watson Text Speech cloud service enables user convert text naturalsounding audio introduced five new neural voice four US English voice German voice new voice generally available public cloud offering Take Listen Click name listen new voice sample US English — Emily “If know party’s extension number enter time Sales Customer Service press 1” US English — Kevin “For inquiry please stay line representative happy assist you” US English — Henry “Our business hour Monday Friday 8 7 pm except major holiday Please leave message name contact information nature call someone appropriate department contact next business day” US English — Olivia “All agent currently busy Please hold answer call soon possible” German— Erika “Alle unsere Mitarbeiter sind derzeit im Gespräch Bitte bleiben Sie dran wir werden Ihren Anruf schnell wie möglich weiterleiten” Learn Interested discovering TTS capability language voice technology Click learn Try TTS language voice technology demo read science behind technology new neural voice whitepaper “High quality lightweight adaptable TTS using LPCNet”Tags Artificial Intelligence Speech Recognition Machine Learning Announcements Watson Text Speech |
4,123 | How to Compare 200+ Cryptocurrencies with Open-Source CoinScraper Module | A bit dramatic, I know, but it’s a pretty big deal if you are an overbought/oversold kind-of-trader. Most traders use technical analysis to and their favorite indicators to make smart decisions in the market. This could range from moving averages or exponential moving average, depending on the trader. Some may prefer Moving Average Convergence Divergence (MACD) and others on-balance volume (OBV).
Everyone I know uses different tools, and we all use some of the same tools as well, but this post is not about Bollinger Bands, Fibonacci Retracements, or Ichimoku Clouds. This post is about the coinscraper client, and while those tools are nice, you need the data before you can use any tool!
The coinscraper client was designed to compare the top 200 supported assets on the KuCoin exchange. Each crypto asset’s historical and fundamental data is sourced from coinmarketcap.com. After all the data is collected, it is preprocessed before calculating the relative strength index (RSI). We end up with a summary table of the top 200 assets along with their relative strength.
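By way of illustration, the relative strength index compares average gains to average losses over a lookback window (14 periods is the common default). The sketch below shows that calculation with pandas; the function name and window are illustrative assumptions, not the client's actual internals:

import pandas as pd

def compute_rsi(close: pd.Series, window: int = 14) -> pd.Series:
    # Day-over-day price changes
    delta = close.diff()
    # Average gain and average loss over the lookback window
    avg_gain = delta.clip(lower=0).rolling(window).mean()
    avg_loss = -delta.clip(upper=0).rolling(window).mean()
    relative_strength = avg_gain / avg_loss
    return 100 - 100 / (1 + relative_strength)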
The client has a few requirements/dependencies; please see the requirements.txt file, or install the following:
import requests
import pandas as pd
import time
import random
import math
import numpy as np
from math import pi
import matplotlib.pyplot as plt
# %matplotlib inline
from os import mkdir
from os.path import exists
To install the client module, download the .py file and the demo notebook for Google Colab. You can download the files from the repo here.
Connecting to Client
Now that you have the client module installed, open the Demo notebook and run this cell. The demo will walk through some errors and show you how to fix them if they happen to you while running this client.
from coinscraper import coinscrapper
today = 'YYYYMMDD'  # replace with the current date string, e.g. '20200820'
client = coinscrapper(today)
The client module will require Google authentication and a Selenium web driver. I would suggest running the client in Google Colab to test it out. If you are experiencing any errors, please make sure you have uploaded the files to your Colab Notebooks folder on Google Drive.
Feel free to change the file path or change any functionality.
Pulling Summary
The coinscraper client is filled with various methods, but the all-purpose one is the .summary() method. This function was designed to process all the actions: getting the list of assets that are traded on KuCoin, creating links for historic data, converting HTML tables to dataframes, munging the data, and generating a .csv file and an HTML table with all the results.
client.summary()
Access Saved Datasets
Using the client, we can also access the datasets saved during the summary process. The datasets are saved in a Python list and will contain historic data for each asset. The fundamental data and the RSI datasets are separate Python lists of datasets.
client.technical_data
client.fundamental_data
client.rsi_data
Below is an example of how to access the saved datasets using the client:
# Historic Price Data
list_of_historic_data = client.technical_data
print('Historic Price Data: ')
display(list_of_historic_data[0].head())

# Fundamental Data
list_of_fundamental_data = client.fundamental_data
print('\nFundamental Data:\n')
display(list_of_fundamental_data[0].set_index(0).stack())

# RSI Data
list_of_RSI_data = client.rsi_data
print('\nRSI Data: ')
display(list_of_RSI_data[0].tail())

Historic Price Data:
           Date  Open*    High    Low  Close**     Volume  Market Cap
0  Aug 20, 2020  91.25  101.31  91.25   101.31  153441643  1790349014
1  Aug 19, 2020  93.47   94.49  89.55    91.28  109367601  1612931373
2  Aug 18, 2020  93.37   97.15  92.09    93.52  119998363  1652463399
3  Aug 17, 2020  91.22   94.65  89.79    93.36   83192011  1649521507
4  Aug 16, 2020  90.07   91.32  88.41    91.22   64177487  1611628957

Fundamental Data:
Monero Price               1    $93.81 USD
Monero ROI                 1    3,693.27%
Market Rank                1    #16
Market Cap                 1    $1,657,837,591 USD
24 Hour Volume             1    $176,797,734 USD
Circulating Supply         1    17,672,780 XMR
Total Supply               1    17,672,780 XMR
Max Supply                 1    No Data
All Time High              1    $495.84 USD (Jan 07, 2018)
All Time Low               1    $0.212967 USD (Jan 14, 2015)
52 Week High / Low         1    $105.52 USD / $26.70 USD
90 Day High / Low          1    $105.52 USD / $60.43 USD
30 Day High / Low          1    $105.52 USD / $70.89 USD
7 Day High / Low           1    $105.52 USD / $88.41 USD
24 Hour High / Low         1    $105.52 USD / $92.60 USD
Yesterday's High / Low     1    $101.31 USD / $91.25 USD
Yesterday's Open / Close   1    $91.25 USD / $101.31 USD
Yesterday's Change         1    $10.06 USD (11.02%)
Yesterday's Volume         1    $153,441,643 USD
dtype: object

RSI Data:
date_               Date  Open*    High    Low  Close**     Volume  Market Cap       date_        RSI
2020-08-16  Aug 16, 2020  90.07   91.32  88.41    91.22   64177487  1611628957  2020-08-16  59.675013
2020-08-17  Aug 17, 2020  91.22   94.65  89.79    93.36   83192011  1649521507  2020-08-17  62.440822
2020-08-18  Aug 18, 2020  93.37   97.15  92.09    93.52  119998363  1652463399  2020-08-18  62.647105
2020-08-19  Aug 19, 2020  93.47   94.49  89.55    91.28  109367601  1612931373  2020-08-19  57.856280
2020-08-20  Aug 20, 2020  91.25  101.31  91.25   101.31  153441643  1790349014  2020-08-20  69.210352
Access RSI Charts
Using the client, we can access the RSI charts saved during the summary process. The charts are saved in a Python list and also saved to your authenticated Google Drive. The plots are matplotlib objects that just require the .show() method.
client.plots
client.candle_sticks (coming soon)
Below is an example of how to access the saved charts using the client:
import os

os.listdir('drive/My Drive/CoinScraper/charts/monero/')
# Output: ['RSI-20200820.png']

img = plt.imread("/content/drive/My Drive/CoinScraper/charts/monero/RSI-20200820.png")
plt.figure(figsize=(32, 18))
plt.axis('off')
plt.imshow(img);
Access the Log File
Using the client we can also access the log file saved during the summary process. The log file is a text file that shows which processes are running, or errors that occur.
client.log

Below is an example of how to access the log:

client.log
Here is the table html code generated for web: | https://medium.com/the-innovation/how-to-compare-200-cryptocurrencies-with-open-source-coinscraper-module-269d5d2e1f15 | ['Jacob Tadesse'] | 2020-08-24 18:18:34.263000+00:00 | ['Web Scraping', 'Python', 'Pandas', 'Cryptocurrency', 'Data Science'] | Title Compare 200 Cryptocurrencies OpenSource CoinScraper ModuleContent bit dramatic know it’s pretty big deal overboughtoversold kindoftrader trader use technical analysis favorite indicator make smart decision market could range moving average exponential moving average depending trader may prefer Moving Average Convergence Divergence MACD others onbalance volume OBV Everyone know us different tool use tool well post Bollinger Bands Fibonacci Retracements Ichimoku Clouds post coinscrapper tool nice need data use tool coinscraper client designed compare top 200 supported asset Kucoin’s decentralized exchange crypto asset’s historical fundamental data sourced coinmarketcapcom data collected data preprocessed calculating relative strength index end summary table top 200 asset along relative strength client requirementsdependencies please see requirementstxt file install following import request import panda pd import time import random import math import numpy np math import pi import matplotlibpyplot plt matplotlib inline o import mkdir ospath import exists install client module download py file demo notebook Google Colab download file repo Connecting Client client module installed open Demo notebook run cell demo walk error show fix happen running client coinscraper import coinscrapper today YYYYMMDD client coinscrappertoday client module require google authentication also require selenium webdriver would suggest running client Google Colab test experiencing error please make sure uploaded file Colab Notebooks folder Google Drive Feel free change file path change functionality Pulling Summary coinscraper client filled various method purpose one summary method function designed process action getting list asset traded KuCoin creating link historic data converting html table dataframes munging data generating csv file html table result clientsummary Access Saved Datasets Using client also access datasets saved summary process datasets saved python list contain historic data asset fundamental data RSI dataset separte pythonic list datasets clienttechnicaldata clientfundamentaldata clientrsidata example access saved datasets using clientIn 10 Historic Price Data listofhistoricdata clienttechnicaldata printHistoric Price Data displaylistofhistoricdata0head Fundamental Data listoffundamentaldata clientfundamentaldata print Fundamental Data displaylistoffundamentaldata0setindex0stack RSI Data listofRSIdata clientrsidata print RSI Data displaylistofRSIdata0tail Historic Price Data DateOpenHighLowCloseVolumeMarket Cap0Aug 20 202091251013191251013115344164317903490141Aug 19 2020934794498955912810936760116129313732Aug 18 2020933797159209935211999836316524633993Aug 17 202091229465897993368319201116495215074Aug 16 20209007913288419122641774871611628957 Fundamental Data Monero Price 1 9381 USD Monero ROI 1 369327 Market Rank 1 16 Market Cap 1 1657837591 USD 24 Hour Volume 1 176797734 USD Circulating Supply 1 17672780 XMR Total Supply 1 17672780 XMR Max Supply 1 Data Time High 1 49584 USDJan 07 2018 Time Low 1 0212967 USDJan 14 2015 52 Week High Low 1 10552 USD 2670 USD 90 Day High Low 1 10552 USD 6043 USD 30 Day High Low 1 10552 USD 7089 USD 7 Day High Low 1 10552 USD 8841 USD 24 Hour High Low 1 10552 USD 9260 
USD Yesterdays High Low 1 10131 USD 9125 USD Yesterdays Open Close 1 9125 USD 10131 USD Yesterdays Change 1 1006 USD 1102 Yesterdays Volume 1 153441643 USD dtype object RSI Data DateOpenHighLowCloseVolumeMarket CapdateRSIdate 2020–08–16Aug 16 202090079132884191226417748716116289572020–08–16596750132020–08–17Aug 17 202091229465897993368319201116495215072020–08–17624408222020–08–18Aug 18 2020933797159209935211999836316524633992020–08–18626471052020–08–19Aug 19 2020934794498955912810936760116129313732020–08–19578562802020–08–20Aug 20 202091251013191251013115344164317903490142020–08–2069210352 Access RSI Charts Using client access RSI Charts saved summary process chart saved python list also saved authenticated google drive plot matplotlib object require show method clientplots clientcandlesticks coming soon example access saved datasets using clientIn 24 import o 29 oslistdirdriveMy DriveCoinScraperchartsmonero Out29 RSI20200820png 37 img pltimreadcontentdriveMy DriveCoinScraperchartsmoneroRSI20200820png pltfigurefigsize3218 pltaxisoff pltimshowimg Access Log File Using client also access log file saved summary process log file text file show process running error occur clientlog example access log 48 clientlog table html code generated webTags Web Scraping Python Pandas Cryptocurrency Data Science |
4,124 | The festival of families | This week in East Asia — when the moon is its roundest and brightest on the 15th day of the 8th month of the lunar calendar — we celebrate the Mid-Autumn Festival. Traditionally, the festival gives thanks for the harvest, but it is also a time to appreciate harmonious unions and families coming together.
A timely coincidence, because my in-laws are visiting from the UK. We’ve been experiencing Hong Kong tourist hotspots including the bright lights of Victoria Harbour, the Big Buddha that commands a view over Lantau Island, and a boat ride to the remote island of Po Toi.
During every adventure we have witnessed other families — smiling, arguing, laughing, but nonetheless spending time together. This same week, I attended the funeral of a talented friend who left us too soon. Her estranged family arrived from opposite sides of the world to mourn her loss, each in their own way but united in grief.
The tensions between the family members stretch vertically and horizontally through the ages
Families. They unite us and they tear us apart. My first novel is one of a trilogy covering four generations of familial shenanigans. The tensions between the family members stretch vertically and horizontally through the ages, like a delicate web that masks its strength. Families make for complicated dynamics, and I am grateful to Beth Miller — author of When We Were Sisters — for her thoughts on how to deal with families when writing fiction:
“The key thing I do when writing is to focus on the dynamics between each of the various members. If you have four people in a family, you have at least eleven possible configurations of relationship, all with their different complexities, secrets and tensions. How does what A say or do impact on B? How do things change if C comes on the scene? The writer here is like a family therapist. Both writer and therapist have to tease out the dynamics, work out how each pairing, each triad, each quartet, changes depending on who’s there, what new stuff they’re bringing, their shared and separate histories.”
Gotham Writers provides helpful guidance on determining which family member should be the main protagonist and how to write different POVs to tell the broader family story. At the same time, it warns of the risk of more than one character taking centre stage and diluting the focus and cohesiveness.
Over the centuries, brightly lit lanterns have become symbolic of the Mid Autumn Festival. Just like family members, lanterns come in all shapes, sizes and colours. As my family sat on the rooftop, and marvelled at the glorious full moon and the array of colourful lanterns bobbing in unison in the warm sea breeze, we chatted about everything and nothing, and were grateful for our differences and for our unity.
Originally published at www.rjverity.com on October 6, 2017. | https://medium.com/words-on-writing/the-festival-of-families-be2cde72b317 | ['Rj Verity'] | 2018-05-01 00:37:05.471000+00:00 | ['Rj Verity', 'Writing', 'Writing Tips', 'Writer', 'Words On Writing'] | Title festival familiesContent week East Asia — moon roundest brightest 15th day 8th month lunar calendar — celebrate MidAutumn Festival Traditionally festival give thanks harvest also time appreciate harmonious union family coming together timely coincidence inlaws visiting UK We’ve experiencing Hong Kong tourist hotspot including bright light Victoria Harbour Big Buddha command view Lantau Island boat ride remote island Po Toi every adventure witnessed family — smiling arguing laughing nonetheless spending time together week attended funeral talented friend left u soon estranged family arrived opposite side world mourn loss way united grief tension family member stretch vertically horizontally age Families unite u tear u apart first novel one trilogy covering four generation familial shenanigan tension family member stretch vertically horizontally age like delicate web mask strength Families make complicated dynamic grateful Beth Miller — author Sisters — thought deal family writing fiction “The key thing writing focus dynamic various member four people family least eleven possible configuration relationship different complexity secret tension say impact B thing change C come scene writer like family therapist writer therapist tease dynamic work pairing triad quartet change depending who’s new stuff they’re bringing shared separate histories” Gotham Writers provides helpful guidance determining family member main protagonist write different POVs tell broader family story time warns risk one character taking centre stage diluting focus cohesiveness century brightly lit lantern become symbolic Mid Autumn Festival like family member lantern come shape size colour family sat rooftop marvelled glorious full moon array colourful lantern bobbing unison warm sea breeze chatted everything nothing grateful difference unity Originally published wwwrjveritycom October 6 2017Tags Rj Verity Writing Writing Tips Writer Words Writing |
4,125 | Why Dropping Out of School Will Make Your Life Better | I always hated school, just like a lot of you I suppose.
So I quit two years ago, and I’m now attending a professional course. Something very far from the traditional school system.
But why?
Why is that a good idea that you should consider?
Well… there are a couple of reasons. Some of them are more related to the system itself, and some are about your mental health and time.
Let’s admit that school is useless, come on. | https://medium.com/illumination/why-dropping-out-of-school-will-make-your-life-better-77558eef68b6 | ['Alyssa Di Grazia'] | 2020-12-28 10:42:12.372000+00:00 | ['Life Lessons', 'Self Improvement', 'Writing', 'Life', 'Change'] | Title Dropping School Make Life BetterContent always hated school like lot suppose quit two year ago I’m attending professional course Something far traditional school system good idea consider Well… couple reason related system mental health time Let’s admit school useless come onTags Life Lessons Self Improvement Writing Life Change |
4,126 | assimilated agony | assimilated agony
how often have we
turned a blind eye
while others
begged
for us
to see? | https://medium.com/a-cornered-gurl/assimilated-agony-774c6f254167 | ['Tre L. Loadholt'] | 2017-10-05 23:33:50.674000+00:00 | ['Micropoetry', 'Love', 'Writing', 'Compassion', 'A Cornered Gurl'] | Title assimilated agonyContent assimilated agony often turned blind eye others begged u seeTags Micropoetry Love Writing Compassion Cornered Gurl |
4,127 | Why Sidewalk News is bringing local news amongst the people | One of my core journalistic beliefs is that, for a community to thrive, all of its members must have access to high quality local news. And that often isn’t the case — as a 2018 report by Fiona Morgan and James Hamilton determined, “Poor people get poor information, because income inequality generates information inequality.”
But I believe there’s a way to use public infrastructure that already exists almost everywhere in the country to bring the news amongst the people — outdoor advertising. There are 3.3 million out-of-home (OOH) advertising spaces in the United States, and the format already supports more than just ads — the FBI says that the use of digital billboards has played a part in arresting 50 of the country’s most wanted criminals in the last decade.
What Sidewalk News will do is help local news outlets use OOH advertising spaces like bus shelters and street furniture to engage with their community directly by putting their news onto these platforms.
Doing so serves three purposes. One is providing news to all members of a community without concern for their technological prowess or ability to pay. As media becomes more digitally-focused, lower-income and less-educated Americans are less likely to have access to high quality news than their wealthier, more-educated peers. Using OOH spaces levels the playing field by making all people equally able to consume this news.
Another purpose is to give community members a personal connection to a news story that may otherwise seem esoteric. Because each “news ad” will be tailored to its specific display point, the news can be “ultra hyper localized” to that particular spot. For instance, someone sitting in a bus shelter will learn from Sidewalk News about how a city-wide issue will affect the street she’s standing on, or the bus line she’s about to take. This will drive civic engagement as people become more aware about the issues surrounding them.
The third purpose is to advertise the media outlet by showing off what they do best — local news. By posting local news on outdoor displays, a reader on the street will see how the media outlet is covering news that is relevant to them. This also builds credibility and brand awareness of the outlet in the community, particularly with potential readers who may not be as familiar with their work.
With news outlets overstretched and under resourced, I don’t imagine that it would be realistic for my partner news outlets to fund this project. I had originally viewed this project as one that could only be funded through donations from journalism organizations or benevolent individuals interested in fostering community engagement. Increasingly, I think the model has the potential for multiple revenue streams.
One avenue for revenue will still be philanthropy from groups interested in local news, civic engagement, and public spaces. I believe these investments will be necessary to get projects started and build the infrastructure required to make them sustainable. But ultimately, there will be an option for sponsorship. Local companies and community organizations will be able to sponsor these “news ads” to show their commitment to supporting local news.
Out-of-home advertising exists almost everywhere; it’s already a part of our lives. I believe we have an opportunity to make it part of the way we consume the news. | https://medium.com/journalism-innovation/why-sidewalk-news-is-bringing-local-news-amongst-the-people-29bd0277c6a9 | ['Elise Czajkowski'] | 2019-04-04 16:08:59.940000+00:00 | ['Journalism', 'Sidewalk', 'Advertising'] | Title Sidewalk News bringing local news amongst peopleContent One core journalistic belief community thrive member must access high quality local news often isn’t case — 2018 report Fiona Morgan James Hamilton determined “Poor people get poor information income inequality generates information inequality” believe there’s way use public infrastructure already exists almost everywhere country bring news amongst people — outdoor advertising 33 million outofhome OOH advertising space United States format already support ad — FBI say use digital billboard played part arresting 50 country’s wanted criminal last decade Sidewalk News help local news outlet use OOH advertising space like bus shelter street furniture engage community directly putting news onto platform serf three purpose One providing news member community without concern technological prowess ability pay medium becomes digitallyfocused lowerincome lesseducated Americans le likely access high quality news wealthier moreeducated peer Using OOH space level playing field making people equally able consume news Another purpose give community member personal connection news story may otherwise seem esoteric “news ad” tailored specific display point news “ultra hyper localized” particular spot instance someone sitting bus shelter learn Sidewalk News citywide issue affect street she’s standing bus line she’s take drive civic engagement people become aware issue surrounding third purpose advertise medium outlet showing best — local news posting local news outdoor display reader street see medium outlet covering news relevant also build credibility brand awareness outlet community particularly potential reader may familiar work news outlet overstretched resourced don’t imagine would realistic partner news outlet fund project originally viewed project one could funded donation journalism organization benevolent individual interested fostering community engagement Increasingly think model potential multiple revenue stream One avenue revenue still philanthropy group interested local news civic engagement public space believe investment necessary get project started build infrastructure required make sustainable ultimately option sponsorship Local company community organization able sponsor “news ads” show commitment supporting local news Outofhome advertising exists almost everywhere it’s already part life believe opportunity make part way consume newsTags Journalism Sidewalk Advertising |
4,128 | Artificial Neural Network From Scratch Using Python Numpy | Finally, let’s build the ANN
ANN
So here we have:
Input node with some inputs (Real numbers; x1, x2, xn) with their weights (Real numbers; w1, w2, wn) and bias (Real number).
And these parameters (weights and bias) connect to our hidden nodes, where we compute the weighted sum (sigma or z) over all inputs and their weights, and then we apply a non-linear activation function (like sigmoid, tanh, etc.), which generates our final output (y).
Now, in our model we have 28 x 28 pixel images (784 pixels in total), and these pixels are our inputs. They go to the input node, then to the hidden nodes (a single hidden layer), and then generate an output (a single digit between 0 and 9).
Sigmoid Activation Function
Here our y-hat (the output of the node) is sigmoid(dot product of weights and input ‘x’ + bias).
Implementing Sigmoid Activation Function:
# activation: sigmoid
def sigmoid(x):
    return 1. / (1. + np.exp(-x))
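One property worth noting, since it reappears in the back-propagation code later: the sigmoid's derivative can be written in terms of the sigmoid itself,

\sigma'(x) = \sigma(x) \, (1 - \sigma(x))

which is exactly why the training loop below computes sigmoid(Z1) * (1 - sigmoid(Z1)).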
Cross-entropy Loss (a.k.a Cost, Error) Function
For ‘n’ classes and a single sample (for us, ‘n’ digits and a single image), we have the formula below:
For ’n’ classes and single samples
But for ‘n’ classes and multiple (m) samples (‘n’ digits and multiple images), we have the formula below:
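In LaTeX, for a single sample the loss is L = -\sum_{k=1}^{n} y_k \log(\hat{y}_k), and averaged over m samples (which is what compute_multiclass_loss below implements):

L = -\frac{1}{m} \sum_{i=1}^{m} \sum_{k=1}^{n} y_k^{(i)} \log\left(\hat{y}_k^{(i)}\right)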
Implementing Cross-entropy Loss Function:
# cross-entropy for our cost function
def compute_multiclass_loss(Y, Y_hat):
    L_sum = np.sum(np.multiply(Y, np.log(Y_hat)))
    m = Y.shape[1]
    L = -(1/m) * L_sum
    return L
Back-propagation Using Gradient Descent Algorithm
Back-propagation
Back-propagation is just a way of propagating the total loss back into the neural network to know how much of the loss every node is responsible for, and subsequently updating the weights in such a way that minimizes the loss by giving the nodes with higher error rates lower weights and vice versa.
Gradient Descent
Gradient descent is a first-order iterative optimization algorithm for finding a local minimum of a differentiable function. To find a local minimum of a function using gradient descent, we take steps proportional to the negative of the gradient of the function at the current point.
Formula: new_weight = previous_weight - learning_rate * gradient
Gradient Descent Algorithm
Computing Gradient
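In LaTeX, the standard gradient expressions for this architecture (softmax output with cross-entropy loss, sigmoid hidden layer) are the following; the training loop below implements them line by line:

dZ^{[2]} = A^{[2]} - Y
dW^{[2]} = \frac{1}{m} \, dZ^{[2]} A^{[1]\top}, \qquad db^{[2]} = \frac{1}{m} \sum_i dZ^{[2]}_{(i)}
dZ^{[1]} = \left(W^{[2]\top} dZ^{[2]}\right) \odot \sigma(Z^{[1]}) \left(1 - \sigma(Z^{[1]})\right)
dW^{[1]} = \frac{1}{m} \, dZ^{[1]} X^{\top}, \qquad db^{[1]} = \frac{1}{m} \sum_i dZ^{[1]}_{(i)}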
Finally, let's implement it and train our model:

n_x = X_train.shape[0]   # input layer size (784 pixels per image)
n_h = 64                 # hidden layer size
digits = 10              # output classes (digits 0-9)
learning_rate = 1
epochs = 2000
Initializing weights and biases:
W1 = np.random.randn(n_h, n_x)
b1 = np.zeros((n_h, 1))
W2 = np.random.randn(digits, n_h)
b2 = np.zeros((digits, 1))
X = X_train
Y = Y_train
m = X.shape[1]  # number of training samples (used to average the gradients below)
Now, the training starts:
for i in range(epochs):
    # Forward pass
    Z1 = np.matmul(W1, X) + b1
    A1 = sigmoid(Z1)
    Z2 = np.matmul(W2, A1) + b2
    A2 = np.exp(Z2) / np.sum(np.exp(Z2), axis=0)  # softmax output

    cost = compute_multiclass_loss(Y, A2)

    # Backward pass: output layer gradients
    dZ2 = A2 - Y
    dW2 = (1./m) * np.matmul(dZ2, A1.T)
    db2 = (1./m) * np.sum(dZ2, axis=1, keepdims=True)

    # Backward pass: hidden layer gradients
    dA1 = np.matmul(W2.T, dZ2)
    dZ1 = dA1 * sigmoid(Z1) * (1 - sigmoid(Z1))
    dW1 = (1./m) * np.matmul(dZ1, X.T)
    db1 = (1./m) * np.sum(dZ1, axis=1, keepdims=True)

    # Gradient descent update
    W2 = W2 - learning_rate * dW2
    b2 = b2 - learning_rate * db2
    W1 = W1 - learning_rate * dW1
    b1 = b1 - learning_rate * db1

    if (i % 100 == 0):
        print("Epoch", i, "cost: ", cost)

print("Final cost:", cost)
model loss
Generating our predictions and checking accuracy:
from sklearn.metrics import classification_report, confusion_matrix

Z1 = np.matmul(W1, X_test) + b1
A1 = sigmoid(Z1)
Z2 = np.matmul(W2, A1) + b2
A2 = np.exp(Z2) / np.sum(np.exp(Z2), axis=0)
predictions = np.argmax(A2, axis=0)
labels = np.argmax(Y_test, axis=0)
print(confusion_matrix(predictions, labels))
print(classification_report(predictions, labels))
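Overall accuracy can also be checked directly from the arrays above with a convenience one-liner:

accuracy = np.mean(predictions == labels)
print("Accuracy:", accuracy)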
Okay, we got 92% accuracy which is pretty good. | https://medium.com/analytics-vidhya/artificial-neural-network-from-scratch-using-python-numpy-580e9bacd67c | ['Madhav Mishra'] | 2020-09-08 01:15:42.030000+00:00 | ['Programming', 'Deep Learning', 'Artificial Intelligence', 'Data Science', 'Machine Learning'] | Title Artificial Neural Network Scratch Using Python NumpyContent Finally let’s build ANN ANN Input node input Real number x1 x2 xn weight Real number w1 w2 wn bias Real number parameter weight bias connects hidden node compute weighted sum sigma z input weight apply nonlinear activation function like sigmoid tanh etc generate final output model 28 x 28 pixel image total pixel 784 pixel pixel input go input node go hidden node single hidden layer generate output single digit 0 9 Sigmoid Activation Function yhat output node sigmoiddot product weight input ‘x’ bias Implementing Sigmoid Activation Function activation sigmoid def sigmoidx return 1 1npexpx Crossentropy Loss aka Cost Error Function ‘n’ class single sample ’n’ digit single image formula ’n’ class single sample ’n’ class multiplem sample ’n’ digit multiple single image formula Implementing Crossentropy Loss Function crossentropy cost function def computemulticlasslossY Yhat Lsum npsumnpmultiplyY nplogYhat Yshape1 L 1m Lsum return L Backpropagation Using Gradient Descent Algorithm Backpropagation Backpropagation way propagating total loss back neural network know much loss every node responsible subsequently updating weight way minimizes loss giving node higher error rate lower weight vice versa Gradient Descent Gradient descent firstorder iterative optimization algorithm finding local minimum differentiable function find local minimum function using gradient descent take step proportional negative gradient function current point Formula new weight prev weight — learning rategradient Gradient Descent Algorithm Computing Gradient Finally let’s implement train model nx Xtrainshape0 nh 64 digit 10 learningrate 1 epoch 2000 Initializing Weights bias W1 nprandomrandnnh nx b1 npzerosnh 1 W2 nprandomrandndigits nh b2 npzerosdigits 1 X Xtrain Ytrain training start rangeepochs Z1 npmatmulW1X b1 A1 sigmoidZ1 Z2 npmatmulW2A1 b2 A2 npexpZ2 npsumnpexpZ2 axis0 cost computemulticlasslossY A2 dZ2 A2Y dW2 1m npmatmuldZ2 A1T db2 1m npsumdZ2 axis1 keepdimsTrue dA1 npmatmulW2T dZ2 dZ1 dA1 sigmoidZ1 1 sigmoidZ1 dW1 1m npmatmuldZ1 XT db1 1m npsumdZ1 axis1 keepdimsTrue W2 W2 learningrate dW2 b2 b2 learningrate db2 W1 W1 learningrate dW1 b1 b1 learningrate db1 100 0 printEpoch cost cost printFinal cost cost model loss Generating prediction checking accuracy Z1 npmatmulW1 Xtest b1 A1 sigmoidZ1 Z2 npmatmulW2 A1 b2 A2 npexpZ2 npsumnpexpZ2 axis0 prediction npargmaxA2 axis0 label npargmaxYtest axis0 printconfusionmatrixpredictions label printclassificationreportpredictions label Okay got 92 accuracy pretty goodTags Programming Deep Learning Artificial Intelligence Data Science Machine Learning |
4,129 | Boost your marketing strategy with RFM | Okay, now that we combined these features into one dataframe, let’s visualize it! (code link)
This is nothing but a huge mess of customer pile which begs the question:
How are we going to segment them?
Most data scientists try to answer this question by implementing K-Means clustering, which is basically a machine learning algorithm used for seperating data into clusters based on data points’ distance to each other: (https://stanford.edu/~cpiech/cs221/handouts/kmeans.html)
However, I am an advocate of a manual approach, since ML algorithms do not consider any industry-specific dynamics. Accordingly, I will follow two essential steps to ace this challenge:
1- Quantile Transformation and Ranking
In simple terms, a quantile is where a sample is divided into equal-sized subgroups. Quartiles are also quantiles; they divide the distribution into four equal parts. In this case, I seperated RFM values into 4 quartiles and simply labeled them based on their values. (very bad-bad-good-very good). This is what the dataframe transformed into:
2- Segmentation by RFM rankings
This is the part, where the business understanding comes into play. I wrote a python function with a bunch of ‘if-else statements’ to define segments based on their RFM rankings. For instance, I extracted a customer segment with ‘very good’ levels of F-M, yet ‘very bad’ level of R. This means that the customer was once very active and valuable but it’s been a long time since the latest transaction. Hence, the company needs to win this group back urgently! The below-stated function is my personal approach and one can finetune these classes based on different market dynamics.
And here we go! We successfully created our customer segments to efficiently design upcoming marketing strategies! An elegant treemap comes in handy in terms of reflecting the big picture: (code link)
Customer Treemap by Segments
Below, is an analysis of the recently created customer segments through R-F-M. Notice how each group has its own behavioral pattern and differentiates meaningfully. | https://kerim-birgun.medium.com/boost-your-marketing-strategy-with-rfm-c737926fe621 | ['Kerim Birgun'] | 2020-11-05 22:51:04.981000+00:00 | ['Python', 'Data Science', 'Segmentation', 'Analytics', 'Marketing Strategies'] | Title Boost marketing strategy RFMContent Okay combined feature one dataframe let’s visualize code link nothing huge mess customer pile begs question going segment data scientist try answer question implementing KMeans clustering basically machine learning algorithm used seperating data cluster based data points’ distance httpsstanfordeducpiechcs221handoutskmeanshtml However advocate manual approach since ML algorithm consider industryspecific dynamic Accordingly follow two essential step ace challenge 1 Quantile Transformation Ranking simple term quantile sample divided equalsized subgroup Quartiles also quantiles divide distribution four equal part case seperated RFM value 4 quartile simply labeled based value badbadgoodvery good dataframe transformed 2 Segmentation RFM ranking part business understanding come play wrote python function bunch ‘ifelse statements’ define segment based RFM ranking instance extracted customer segment ‘very good’ level FM yet ‘very bad’ level R mean customer active valuable it’s long time since latest transaction Hence company need win group back urgently belowstated function personal approach one finetune class based different market dynamic go successfully created customer segment efficiently design upcoming marketing strategy elegant treemap come handy term reflecting big picture code link Customer Treemap Segments analysis recently created customer segment RFM Notice group behavioral pattern differentiates meaningfullyTags Python Data Science Segmentation Analytics Marketing Strategies |
4,130 | How to Run PostgreSQL Using Docker | Setup
First, we need to install Docker. We will use a Docker compose file, a SQL dump file containing bootstrap data, and macOS in this setup. You can download these two files separately. Just make sure to put both docker-compose.yml and infile in the same folder. Alternatively, you can get the repository from here. Now, let’s discuss docker-compose and SQL dump files briefly.
Docker Compose: It’s a YAML file, and we can define containers and their properties inside. These containers are called services. For example, if your application has multiple stacks, such as a web server and a database server, we can use a docker-compose file.
It’s a YAML file, and we can define containers and their properties inside. These containers are called services. For example, if your application has multiple stacks, such as a web server and a database server, we can use a docker-compose file. SQL-dump: A sql-dump contains SQL queries in plain text. PostgreSQL provides the command-line utility program pg_dump to create and read dump files.
Let’s break down the individual ingredients of the docker-compose.yml file.
version: '3.8' services: db:
container_name: pg_container
image: postgres
restart: always
environment:
POSTGRES_USER: root
POSTGRES_PASSWORD: root
POSTGRES_DB: test_db
ports:
- "5432:5432"
volumes:
- $HOME/Desktop/PostgreSql-Snippets/infile:/infile
- pg_data:/var/lib/postgresql/data/ volumes:
pg_data:
The first line defines the version of the Compose file, which is 3.8. There are other file formats — 1, 2, 2.x, and 3.x. Get more information on Compose file formats from Docker’s documentation here.
After that, we have the services hash, and it contains the services for an application. For our application, we only have one service called db.
Inside the db service, the first tag container_name is used to change the default container name to pg_container for our convenience. The second tag image is used to define the Docker image for the db service, and we are using the pre-built official image of PostgreSQL.
For the third tag restart, we have set the value always. What it does is it always automatically restarts the container by saving time. It restarts the container when either the Docker daemon restarts or the container itself is manually restarted. For example, every time you reboot your machine, you don’t have to manually start the container.
The fourth tag environment defines a set of environment variables. Later we will use these for database authentication purposes. Here we have POSTGRES_USER , POSTGRES_PASSWORD , and POSTGRES_DB . Among these three variables, the only required one is the POSTGRES_PASSWORD . The default value of POSTGRES_USER is postgres , and for POSTGRES_DB it’s the value of POSTGRES_USER . You can read more about these variables from here.
The fifth tag is the ports tag and is used to define both host and container ports. It maps port 5432 on the host to port 5432 on the container.
Finally, the volumes tag is used to mount a folder from the host machine to the container. It comprises two fields separated by a colon, the first part is the path in the host machine and the second part is the path in the container. Remove this portion if you don’t want to mount the sql-dump into the container. The second line of volumes tag is used to store the database data, the first part is the name of the volume, and the second part is the path in the container where the database data is stored. But how do we know what’s that path exactly? We can determine the path by running the following command using the psql . We will discuss how to use psql later in this post.
show data_directory;
Remove the second line if you don’t want to back up your container’s database data. If you choose to remove both lines under the volumes tag, remove the volumes tag.
At the end of the docker-compose file, you can see that we have defined the volume pg_data under the volumes tag. It allows us to reuse the volume across multiple services. Read more about volumes tag here.
The moment of truth. Let’s run the following command from the same directory where the docker-compose.yml file is located. We can see that it starts and runs our entire app.
docker-compose up
Inspection
We can check if the container is running or not using the docker ps command on the host machine. As we can see, we have a running container called pg_container .
docker ps
Moreover, we can see the image by running the docker images command.
docker images
Finally, we can see that a volume has been created by running the docker volume ls command.
docker volume ls
Connect to psql
What is psql? It’s a terminal-based interface to PostgreSQL, which allows us to run SQL queries interactively. First, let’s access our running container pg_container .
docker exec -it pg_container bash
I talked about how the above line works in the following article. Have a look.
Now we can connect to psql server using the hostname, database name, username, and password.
psql --host=pg_container --dbname=test_db --username=root
If you want to type less, use the following command. Find more options for PostgreSQL interactive terminal from here.
psql -h pg_container -d test_db -U root
Here, the password’s value is the root , which has been defined inside the docker-compose file earlier.
Load data from a file
Now we can load the dump file into our test_db database. In this case, infile . It is accessible inside the container because we have mounted it from the host machine.
psql -h pg_container -d test_db -U root -f infile
If we run the PostgreSQL command \dt , we can see two tables called marks and students inside our database test_db .
Did we miss something? Not really, but yes! Since our data is backed up in the volume called postgresql-snippets_pg_data, we can remove the container without losing the database data. Let’s try that now. First, delete the container and then create it again.
docker rm pg_container
docker-compose up
Now after accessing the container and psql we can still see our data!
docker exec -it pg_container bash
psql -h pg_container -d test_db -U root
\dt
In case you want to delete the backup volume, use the docker volume rm command. Read the documentation here.
docker volume rm postgresql-snippets_pg_data
Or you can use the docker-compose command. Read the documentation here. | https://towardsdatascience.com/how-to-run-postgresql-using-docker-15bf87b452d4 | ['Mahbub Zaman'] | 2020-12-30 13:21:45.957000+00:00 | ['Programming', 'Software Engineering', 'Postgresql', 'Data Science', 'Docker'] | Title Run PostgreSQL Using DockerContent Setup First need install Docker use Docker compose file SQL dump file containing bootstrap data macOS setup download two file separately make sure put dockercomposeyml infile folder Alternatively get repository let’s discus dockercompose SQL dump file briefly Docker Compose It’s YAML file define container property inside container called service example application multiple stack web server database server use dockercompose file It’s YAML file define container property inside container called service example application multiple stack web server database server use dockercompose file SQLdump sqldump contains SQL query plain text PostgreSQL provides commandline utility program pgdump create read dump file Let’s break individual ingredient dockercomposeyml file version 38 service db containername pgcontainer image postgres restart always environment POSTGRESUSER root POSTGRESPASSWORD root POSTGRESDB testdb port 54325432 volume HOMEDesktopPostgreSqlSnippetsinfileinfile pgdatavarlibpostgresqldata volume pgdata first line defines version Compose file 38 file format — 1 2 2x 3x Get information Compose file format Docker’s documentation service hash contains service application application one service called db Inside db service first tag containername used change default container name pgcontainer convenience second tag image used define Docker image db service using prebuilt official image PostgreSQL third tag restart set value always always automatically restarts container saving time restarts container either Docker daemon restarts container manually restarted example every time reboot machine don’t manually start container fourth tag environment defines set environment variable Later use database authentication purpose POSTGRESUSER POSTGRESPASSWORD POSTGRESDB Among three variable required one POSTGRESPASSWORD default value POSTGRESUSER postgres POSTGRESDB it’s value POSTGRESUSER read variable fifth tag port tag used define host container port map port 5432 host port 5432 container Finally volume tag used mount folder host machine container comprises two field separated colon first part path host machine second part path container Remove portion don’t want mount sqldump container second line volume tag used store database data first part name volume second part path container database data stored know what’s path exactly determine path running following command using psql discus use psql later post show datadirectory Remove second line don’t want back container’s database data choose remove line volume tag remove volume tag end dockercompose file see defined volume pgdata volume tag allows u reuse volume across multiple service Read volume tag moment truth Let’s run following command directory dockercomposeyml file located see start run entire app dockercompose Inspection check container running using docker p command host machine see running container called pgcontainer docker p Moreover see image running docker image command docker image Finally see volume created running docker volume l command docker volume l Connect psql psql It’s terminalbased interface PostgreSQL allows u run SQL query interactively First let’s access running container pgcontainer docker exec 
pgcontainer bash talked line work following article look connect psql server using hostname database name username password psql hostpgcontainer dbnametestdb usernameroot want type le use following command Find option PostgreSQL interactive terminal psql h pgcontainer testdb U root password’s value root defined inside dockercompose file earlier Load data file load dump file testdb database case infile accessible inside container mounted host machine psql h pgcontainer testdb U root f infile run PostgreSQL command dt see two table called mark student inside database testdb miss something really yes Since data backed volume called postgresqlsnippetspgdata remove container without losing database data Let’s try First delete container create docker rm pgcontainer dockercompose accessing container psql still see data docker exec pgcontainer bash psql h pgcontainer testdb U root dt case want delete backup volume use docker volume rm command Read documentation docker volume rm postgresqlsnippetspgdata use dockercompose command Read documentation hereTags Programming Software Engineering Postgresql Data Science Docker |
4,131 | 4 Useful JavaScript Books for Aspiring Developers | 4 Useful JavaScript Books for Aspiring Developers
Amazing books for JavaScript knowledge.
Photo by Thought Catalog on Unsplash
Introduction
Not all of us prefer learning online or with video tutorials, there are people that prefer books. Reading these books can benefit both your physical and mental health, and those benefits can last a lifetime. Coding books are very useful if you love reading them because they give you all the details and knowledge you need. Reading books is one of the best ways to learn JavaScript.
In this article, we will give you a list of some useful JavaScript books for developers. Let’s get right into it.
1. JavaScript and jQuery
This book was written for anyone who wants to make his websites a little more interesting, engaging, interactive, or usable. It was written by Jon Duckett in order to help beginners understand the basics of JavaScript and jQuery very well. All you need is just a basic understanding of HTML and CSS.
I recommend starting with this book if you are a beginner, but don’t rely too heavily on jQuery as it’s a bit outdated and most employers find this to be a deterrent.
You can check the book here if you are interested.
2. You Don’t Know JS
This is an awesome book series by Kyle Simpson exploring the parts of JavaScript that we all think we understand but don’t really know. All these books are free, which is incredible.
Here is the Github repository for the series if you are interested.
3. JavaScript Design Patterns
Design patterns are reusable solutions to commonly occurring problems in software design. They are both exciting and fascinating topics to explore in any programming language.
This book (“Learning JavaScript Design Patterns”) was written by Addy Osmani. By reading it, you will explore applying both classical and modern design patterns to the JavaScript programming language.
You can check the book here if you are interested.
4. JavaScript Allongé
JavaScript Allongé is a book about programming with functions. It’s written in JavaScript of course.
This book starts at the beginning, with values and expressions, and builds from there to discuss types, identity, functions, closures, scopes, collections, iterators, and many more subjects up to working with classes and instances. It also teaches you how to handle complex code, and how to simplify code without dumbing it down.
You can check it out here if you are interested.
Conclusion
As you can see, all these books are full of knowledge and value. They helped a lot of developers to improve their JavaScript skills. You can choose any one that fits you and start gaining useful JavaScript knowledge.
Thank you for reading this article, I hope you found it useful.
More Reading | https://medium.com/javascript-in-plain-english/4-useful-javascript-books-for-aspiring-developers-67d9de904ea9 | ['Mehdi Aoussiad'] | 2020-12-27 21:28:33.443000+00:00 | ['JavaScript', 'Web Development', 'Coding', 'Books', 'Programming'] | Title 4 Useful JavaScript Books Aspiring DevelopersContent 4 Useful JavaScript Books Aspiring Developers Amazing book JavaScript knowledge Photo Thought Catalog Unsplash Introduction u prefer learning online video tutorial people prefer book Reading book benefit physical mental health benefit last lifetime Coding book useful love reading give detail knowledge need Reading book one best way learn JavaScript article give list useful JavaScript book developer Let’s get right 1 JavaScript jQuery book written anyone want make website little interesting engaging interactive usable written Jon Duckett order help beginner understand basic JavaScript jQuery well need basic understanding HTML CSS recommend starting book beginner don’t rely heavily jQuery it’s bit outdated employer find deterrent check book interested 2 Don’t Know JS awesome book series Kyle Simpson exploring part JavaScript think understand don’t really know book free incredible Github repository series interested 3 JavaScript Design Patterns Design pattern reusable solution commonly occurring problem software design exciting fascinating topic explore programming language book “Learning JavaScript Design Patterns” written Addy Osmani reading explore applying classical modern design pattern JavaScript programming language check book interested 4 JavaScript Allongé JavaScript Allongé book programming function It’s written JavaScript course book start beginning value expression build discus type identity function closure scope collection iterators many subject working class instance also teach handle complex code simplify code without dumbing check interested Conclusion see book full knowledge value helped lot developer improve JavaScript skill choose one fit start gaining useful JavaScript knowledge Thank reading article hope found useful ReadingTags JavaScript Web Development Coding Books Programming |
4,132 | Start a React Project Truly from Scratch Using Webpack and Babel | What is Webpack and why is it used?
Webpack is a module bundler; as the name implies, it bundles every module that a project needs into one or more bundles that can be referenced in the primary html file.
For example, when building a JavaScript application that has JavaScript code separated into multiple files, each file must be loaded into the primary html file using the <script> tag.
<body>
...
<script src="libs/react.min.js"></script>
<script src='src/header.js'></script>
<script src='src/dashboard.js'></script>
<script src='src/api.js'></script>
<script src='src/something.js'></script>
</body>
By implementing the use of Webpack, these separate JavaScript files can be intelligently bundled into one file that can then be loaded into the primary html file.
<body>
...
<script src='dist/bundle.js'></script>
</body>
In this instance, using Webpack not only dramatically reduces the number of imports but also eliminates any issues that may arise if the scripts are not loaded in order. Besides module bundling, Webpack also offers Loaders and Plugins which can be used to transform files before, during, or after the bundling process. Loaders and Plugins are explored in further detail later on in this article. | https://joshiaawaj.medium.com/start-a-react-project-truly-from-scratch-using-webpack-and-babel-dbaaeea3f8da | ['Aawaj Joshi'] | 2020-12-07 11:19:59.609000+00:00 | ['React', 'ES6', 'Webpack', 'Jsx', 'Babel'] | Title Start React Project Truly Scratch Using Webpack BabelContent Webpack used Webpack module bundler name implies bundle every module project need one bundle referenced primary html file example building JavaScript application JavaScript code separated multiple file file must loaded primary html file using script tag body script srclibsreactminjsscript script srcsrcheaderjsscript script srcsrcdashboardjsscript script srcsrcapijsscript script srcsrcsomethingjsscript body implementing use Webpack separate JavaScript file intelligently bundled one file loaded primary html file body script srcdistbundlejsscript body instance using Webpack dramatically reduces number import also eliminates issue may arise script loaded order Besides module bundling Webpack also offer Loaders Plugins used transform file bundling process Loaders Plugins explored detail later articleTags React ES6 Webpack Jsx Babel |
4,133 | Please, Can Science and Faith Live in Unison? | Please, Can Science and Faith Live in Unison?
Gaining Power by Uniting the World
Photo by David Vázquez on Unsplash
I called to ask
How are you?
Instead your words
Accused
No, not me
Directly
My beliefs
My lifestyle
Everything that makes me
Me
Maybe
Maybe
I am wrong
You are right
COVID is a hoax
Then why did I watch
Dad die from outside
The hospital window?
Why does my granddaughter
Wear a tiny Paw Patrol mask
And ask,
Mommy do I still have germs?
Would a hoax
Require refrigerated
Boxes to house the
Bodies of dead?
I shared a friend’s tears
As she told me the names
Of two loved ones
Who died this week
You complain,
Your grandchild can’t
Attend his school
I cry because my daughter
Who teaches must attend
With substitutes in many classes
You believe Satan
Owns the virus
Or hoax as you call it
His goal is to isolate
Separate mankind
I refuse to grant him
That much power
God provides opportunities
To choose wisely
Human’s with faith
Grab onto the lifelines
Believing that God is good
Reach out and accept his gifts
Masks, hand sanitizer, vaccines
Science and faith can
Survive in unison
Unless you believe
The world is flat | https://medium.com/the-pom/can-science-and-faith-live-in-unison-2ea682bb74aa | ['Brenda Mahler'] | 2020-12-14 17:33:07.559000+00:00 | ['Faith and Life', 'Poetry', 'Faith', 'Science Fiction', 'Coronavirus'] | Title Please Science Faith Live UnisonContent Please Science Faith Live Unison Gaining Power Uniting World Photo David Vázquez Unsplash called ask Instead word Accused Directly belief lifestyle Everything make Maybe Maybe wrong right COVID hoax watch Dad die outside hospital window granddaughter Wear tiny Paw Patrol mask ask Mommy still germ Would hoax Require refrigerated Boxes house Bodies dead shared friend’s tear told name two loved one died week complain grandchild can’t Attend school cry daughter teach must attend substitute many class believe Satan Owns virus hoax call goal isolate Separate mankind refuse grant much power God provides opportunity choose wisely Human’s faith Grab onto lifeline Believing God good Reach accept gift Masks hand sanitizer vaccine Science faith Survive unison Unless believe world flatTags Faith Life Poetry Faith Science Fiction Coronavirus |
4,134 | Lessons in growth engineering: How we doubled sign ups from Pin landing pages | Jeff Chang | Pinterest engineer, Growth
A popular topic within growth hacking circles is improving conversion rates on landing pages. Everyone has seen those “10 tips to triple your conversion rate” articles that are littered with general tips (e.g. increase CTA size) and promise gains, which are usually small at best. Instead of trying those general tactics, we doubled page conversions and increased SEO traffic by doing one thing: leveraging data to better understand potential Pinners.
Improving Pin landing pages
The first step to improve landing page conversions was selecting the right page to work on. While the “Pin page” (a landing page for clicks to a Pin from another site) is one of our highest trafficked pages, it converted worse than other landing pages, so we invested more resources into it. At first, we didn’t have much data about which parts of the page were effective at convincing a new user to sign up, so we tried a simpler, more visual page layout.
After testing this new design in an A/B experiment, we learned it didn’t increase signups compared to the previous version (i.e. the control). It also was hard to extract learnings from this design because it was so different from any previous version. Was it because we replaced the board on the right with Related Pins? Was it because we didn’t show as much content after scrolling? In this case, we learned that by taking smaller steps, we could learn more from each new version.
So, we tried a new version more similar to the control, where we allowed the Pinner to swipe through a carousel of Related Pins at the top of the page.
This version also underperformed, but only slightly. The data showed few people clicked on Related Pins, possibly because they were small and difficult to distinguish.
Next, we tried making Related Pins bigger and added attribution so they looked more like regular Pins.
This was a success! We saw a lot of engagement with the Related Pins, which led to more signups. Our hypothesis was that this version performed better because it illustrated the related content on Pinterest and, in turn, showed the value of signing up. We shipped this version, and it became the control in future experiments.
However, we wanted to see if we could do even better at converting Pinners on Pin pages. Because Related Pins seemed enticing to new users, we wanted to further highlight them by adding them to the normally blank spaces on the left and right sides of a Pin.
We were surprised to find this version performed the same as the control. For our next iteration, we tried something simpler, where Related Pins were only on the right of the Pin.
We were excited to learn this version beat the new control. But, we wanted to do even better. We looked into the user action event-tracking funnels and found those who clicked through on the main Pin (and thus went to the external site) barely converted, but those who clicked on a Related Pin (and landed on the closeup for that Pin) converted at a much higher rate. So, we reduced the size of the main Pin to be the same as the Related Pins and gave the Related Pins grid more real estate on the page.
This iteration was a huge success and beat the previous control by over 25 percent (and that’s compounded on top of the gains of the previous versions!). Compared to our first Pin page, this iteration converted at twice the rate. Our first instinct was to ship this immediately, but instead we looked into the SEO experiment we ran alongside it and noticed that it dropped traffic by 10 percent. (Related post: SEO experiment framework.) If we shipped this Pin page, we’d get a net win (increased signups outweighed traffic losses), but we wanted to do better.
Conversions and SEO
When working on conversions for any page that gets a significant amount of traffic from search engines, you must consider SEO effects. For example, if an experiment increased signups by 20 percent but dropped traffic by 50 percent, the result is a net signup loss.
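To make the arithmetic concrete (the numbers here are purely illustrative): net signups are roughly traffic multiplied by signup rate, so a 20 percent conversion lift combined with a 50 percent traffic drop works out to 1.2 × 0.5 = 0.6, a 40 percent net signup loss.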
For this experiment, we segmented the traffic by various verticals, such as web traffic, image traffic and traffic by subdomain, and saw the biggest traffic drop in image search. We compared the images in the two designs, and found the big difference was we shrunk the size of the image. From previous experiments, we know when we change the images on the page, even just the size, image search traffic is impacted since search engines have to recrawl billions of our pages. We ran another SEO experiment where we used the same large-size image file as before, but sized it down to fit inside the smaller Pin.
This change increased the traffic difference from -10 percent to +10 percent, even though the design looks the same visually. Not only did this new layout increase conversions, it also increased traffic to the page. These effects multiply with each other to create a larger net signup gain.
Key lessons
By iterating quickly and thoughtfully, we were able to double Pin page conversions and increase SEO traffic. Here are the key lessons we learned along the way:
Learn more about users by analyzing the event tracking funnel data from experiments.
Use past experiment learnings to drive new iterations instead of trying “random” ideas. It’s best to have a hypothesis backed by data for why each new design will perform better.
The faster you iterate, the faster you learn and see gains.
If you’re working on converting a page that gets a significant amount of traffic from search engines, running an SEO experiment in conjunction with a conversion experiment is a must. Even if you increase conversions, you might also see a traffic loss resulting in an overall net signup loss.
If you’re interested in growth engineering and love experimenting, join our team!
Acknowledgements: These projects were a joint effort between engineering, PM and design on the User Acquisition team. | https://medium.com/pinterest-engineering/lessons-in-growth-engineering-how-we-doubled-sign-ups-from-pin-landing-pages-1c0bc400cdb9 | ['Pinterest Engineering'] | 2017-02-21 19:52:34.121000+00:00 | ['SEO', 'Growth', 'Klp', 'Engineering', 'Data']
4,135 | Fat Acceptance Is Self-Acceptance | I’ve wasted a lot of time waiting until I was thin to go after the things I wanted. I didn’t have the self-confidence to put myself out there because I was fat. Was it an excuse? In some ways, yes, but I did experience roadblocks because of my weight.
I’m getting older, and I no longer have the luxury to wait until I’m a perfect weight to go after my goals.
We have to love ourselves all the time, which is what acceptance is all about. If you’re fat, then fat-acceptance is self-acceptance, the same way every other kind of acceptance is.
Fat-acceptance doesn’t mean not growing, improving, or challenging one’s self.
Acceptance gives you a foundation that allows you to move past your emotional obstacles with less fear. Think of it this way: if you’re cutting an apple on an unstable cutting board, you run the risk of hurting yourself with the knife. We need a stable starting point to take chances, put ourselves out there, and take actionable steps.
How can you change if you loathe who you are at the start or seek help if you don’t feel you are worth it? Without acceptance, there’s nothing to keep you going.
The opposite of fat-acceptance is internalizing cruelty or mistreatment because you feel it’s justified.
You can’t defend yourself from fat-shaming, discrimination, and abuse if you’re convinced you deserve it because of your body-size.
Being fat isn’t a crime — though I’m sure some see it that way. Fat people should be allowed to be happy and accept themselves for their successes and failures.
Fat-acceptance isn’t the same thing as body-positivity.
I like to think the body-positive movement began with good intentions. They wanted people to feel good about their bodies, even when those bodies weren’t perfect.
However, somewhere along the line, as the movement grew, the idea of body-positivity began to apply to only those whose bodies were acceptable fat, and not unruly fat.
You could be positive about your body if it were curvy, thick in the right places, or voluptuous, but if it was obviously fat, then body-positive was something for you to aspire to.
In her article, Leaving Body Positivity Behind for Fat Acceptance, writer Rachael Hope writes:
I disagree with the idea that loving your body is a goal that sets us up for failure. Loving your body doesn’t have to mean that you don’t think you have flaws or that you don’t have bad days. The same way that when you love another human being you don’t like them every moment of every day. I have days where I feel down on myself or dislike the way my body looks or feels. There are specific parts of my body I like less than others. But I still love my body. I love that it is my home. I love that it lets me physically connect with people. I love that it lets me feel touch and pleasure. Accepting your body is something to be proud of. For me, accepting it was part of falling in love. I don’t love my body because it’s the BEST body or because it’s a BETTER body than someone else’s. I love it because it is my body, and I love myself.
As fat people, we need to accept, care for, and love our bodies — it’s vital to our feelings of worth, self-esteem, and the quality of our lives. We need it as armor to fight our battles and help protect us against shame and humiliation.
When you accept your body, you can start to heal.
Fat-acceptance allows us to be honest with ourselves and helps us to see both our limitations and talents. Without fat-acceptance, you may shut something down like working out or applying for a job because you’re starting from a shaky spot.
When you accept your body, you can start to heal. No matter what size it is, how healthy it’s perceived to be, or how it serves you — you’re alive and that’s thanks to your body.
The next time someone tries to shame you for having both self-acceptance and fat-acceptance, let them know that you’re not dependent on their approval, and that you don’t need their opinions about your relationship to your own body. | https://medium.com/fattitude/fat-acceptance-is-self-acceptance-b05f19edaaaf | ['Christine Schoenwald'] | 2020-12-04 08:38:01.287000+00:00 | ['Self Acceptance', 'Fat Acceptance', 'Mental Health', 'Culture', 'Feminism']
4,136 | The Curation of Our Little Library is Abysmal and I’d Like to Complain to a Manager | The Curation of Our Little Library is Abysmal and I’d Like to Complain to a Manager
Three copies of The Da Vinci Code? This is ridiculous.
Imagine my excitement when a little library appeared just blocks away from my house. Painted green with a glass door that swung open and shut, at last something Pinterest-worthy was happening in my part of the country. I was sure the mere proximity of the thing would make me feel smarter and more well-read. However, the little library has been sitting smugly on the corner for a year now and, I hate to complain, but the outcome of this experiment is downright embarrassing for all involved.
So, please direct me to the person managing this whole situation, because I have a few complaints. Mostly I’m annoyed by the contents of the little library. I assumed everyone would chip in to contribute only the most enlightening and thought-provoking reading materials from their personal libraries. As for myself, I added my copy of The Secret and the person who took it is sure to be hella actualized by now. So, I’ve done my part.
But, as with their lawn care, others in this neighborhood have chosen to do the bare minimum. I saw one woman slip in a pamphlet for her laser hair removal MLM and then walk away as if she’d just saved the world. This is unacceptable.
What if someone important comes to our neighborhood and sees a dog-eared copy of Fifty Shades of Grey sitting right next to an abridged (abridged!) version of The Three Musketeers? What would Alexandre Dumas think to find his masterful adventure novel cut into pieces and shelved next to Horny Twilight? He would condemn us all.
I’ve also been meaning to address the San Diego travel book from 2003. It’s been in the library for almost four months and no one has taken it. We all know what 2003 was like and none of us are interested in revisiting it. (Two words. Embellished. Camo.) Keep your dated travel books where they belong, in the background of your Instagram photos where they’ll impress your 23 followers.
Of Mice and Men would have been a decent choice for inclusion, if someone hadn’t ripped out the final ten pages. Sure it’s not the happiest of endings and we can all relate to the impulse, but books with parts missing look tacky.
And Sarah Palin’s autobiography? I can’t get too mad about it, because someone was probably trying to get a cursed item out of their house. But, by attempting to pass it on to another neighbor, they’re reenacting one of the most overdone horror tropes. Everybody please burn your evil items in your backyards beneath the light of a full moon instead of adding them to the little library.
Also, if your novel is not in English, what are you even doing? This is America and I live here mostly so I don’t have to be subjected to the German language. German people make up words for everything and it’s exhausting. We don’t need a word for the sad feeling you get after cutting your toenails and we certainly don’t need a word for how men are doomed to turn into their fathers, because none of us want to think about that.
I could go on and on and on about the selection of books available, but that wouldn’t leave me time to complain about the sketchy characters who have started hanging out around the little library, with their wire rim spectacles and their bookshop totes. If you can believe it, I walked over there the other day to donate my copy of Eat Pray Love, a revolutionary work that transformed my relationship with myself, and they sneered at me. They told me it was a reductive novel and I needed to expand my horizons. Then they criticized the contents of the little library, which, honestly, was way too confusing for me and I really need to complain to a manager if I’m going to sort out the complicated array of feelings I’m experiencing right now. | https://sarah-lofgren.medium.com/the-curation-of-our-little-library-is-abysmal-and-id-like-to-complain-to-a-manager-c3c60ec5a468 | ['Sarah Lofgren'] | 2020-06-29 22:43:35.848000+00:00 | ['Satire', 'Humor', 'Reading', 'Funny', 'Books']
4,137 | Get cracking with NativeScript | So you are a pro in Javascript OR a real good Angular, Vue developer and now want to explore building native apps on mobile. However, you are getting a migraine seeing the number of options! React Native, Dart, Kotlin which one should I choose? Well, take a sip of your favourite coffee and sit back. We got you, NativeScript!
NativeScript allows you to build native apps using Angular, TypeScript, or modern JavaScript and still gives you a truly native UI and native performance. It allows you to use a web framework to generate a native app. Sounds cool, doesn’t it?
So, let’s get cracking with it!
Architecture
NativeScript prominently uses the MVVM (Model-View-ViewModel) pattern, which enables two-way data binding, so the data gets instantly reflected in the view.
Another important advantage of this approach is that Models and View Models are reusable. This makes it possible to use NativeScript with the Vue and Angular frameworks, where most of the business logic can be shared with web components.
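As a minimal sketch of the view-model half of this pattern (the module path and Observable API follow the NativeScript 6.x core modules; treat the exact names as assumptions to verify against the docs):

import { EventData, Observable } from "tns-core-modules/data/observable";
import { Page } from "tns-core-modules/ui/page";

// The view model is just an Observable: setting a property raises a
// propertyChange event, so any UI element bound to {{ counter }}
// updates itself. That is the two-way binding described above.
const viewModel = new Observable();
viewModel.set("counter", 0);

export function onNavigatingTo(args: EventData) {
    const page = <Page>args.object;
    page.bindingContext = viewModel; // wire the view to the view model
}

export function onTap() {
    viewModel.set("counter", viewModel.get("counter") + 1);
}

Because the same view model can be bound to different views, the business logic stays in one reusable place.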
It also provides a rich set of JavaScript modules, categorized as UI modules, application modules, and core modules. These can be accessed at any time to write any complex application.
Native plugins are written in the platform-oriented languages (Swift and Java); they generally act as wrappers and can be used from a JavaScript plugin.
Write once run everywhere
NativeScript helps in building native applications in JavaScript; you can build mobile apps either with JavaScript/TypeScript or with Angular. Most of the code written in JS will remain the same for both platforms, allowing business logic and some UI styles to be shared between Android and iOS.
Performance
NativeScript is able to run animations at 60 frames per second, with virtualized scrolling and caching similar to native apps. Moreover, it can offload long-running processes to keep the frontend fast.
In the latest release, NS v6.7.8, the newly composed Webpack module has improved performance on Android considerably.
From NS v6.7.8 onwards, we can see the following improvements:
The build process for Android improves by 30%, while for iOS it improves by 10%.
A streamlined store approval process enables new versions to be rolled out faster.
Native device features
NativeScript provides the ability to write native expressions directly in JavaScript or TypeScript. This avoids unwanted JavaScript wrappers around the native APIs, so the developer can focus only on business logic. It lets us call native APIs from JavaScript directly, because the runtimes expose the same native objects.
For example, if you want to integrate the camera feature in an app, you can initialize it through JS as well, as the sketch below shows.
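A rough sketch of both ideas follows (the camera calls assume the community nativescript-camera plugin, and the platform flag comes from the 6.x core modules; verify the exact APIs against their docs):

import { requestPermissions, takePicture } from "nativescript-camera";
import { isAndroid } from "tns-core-modules/platform";

// Stand-ins for the platform globals the runtimes inject at startup
// (normally typed via tns-platform-declarations):
declare const java: any;
declare const UIDevice: any;

// 1) Drive the device camera entirely from TypeScript
requestPermissions()
    .then(() => takePicture({ width: 300, height: 300, keepAspectRatio: true }))
    .then(() => console.log("Picture captured as an ImageAsset"))
    .catch((err) => console.log("Camera error: " + err.message));

// 2) Call native APIs directly, with no wrapper in between
if (isAndroid) {
    const file = new java.io.File("/system/build.prop"); // Java API
    console.log("exists? " + file.exists());
} else {
    console.log(UIDevice.currentDevice.systemVersion); // Objective-C API
}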
In addition to this, NativeScript readily provides support for newly available iOS and Android APIs, so we can easily adopt new features rather than depending on a specific version.
Pre-styled UI components
There is a rich set of pre-styled components available with NativeScript. You can simply plug and play these components. There is also a clean separation between layouts and components, and you can customize the components quite easily.
For example: DatePicker, BottomNavigation, Slider, Tabs, etc.
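As a small sketch, two of these components composed entirely in code (module paths follow the 6.x tns-core-modules layout and are assumptions to verify):

import { StackLayout } from "tns-core-modules/ui/layouts/stack-layout";
import { DatePicker } from "tns-core-modules/ui/date-picker";
import { Slider } from "tns-core-modules/ui/slider";

// Pre-styled components render the platform's native widgets,
// so no extra styling is needed to look at home on each OS.
export function buildForm(): StackLayout {
    const layout = new StackLayout();

    layout.addChild(new DatePicker()); // native date picker

    const slider = new Slider();
    slider.minValue = 0;
    slider.maxValue = 100;
    layout.addChild(slider);

    return layout; // assign as a Page's content in a loaded handler
}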
Community support
Earlier, there was less community support available for NativeScript, but with time we are seeing a good number of developers digging into the framework. Also, many organizations are adopting the framework for app development, which automatically helps in building the community.
Ready to use plugins
The NativeScript plugins are building blocks that encapsulate some functionality and help developers build apps faster (just like the NativeScript Core Modules, which are themselves a plugin). Most are community-built, written in TypeScript/JavaScript. Some include native libraries, which are called from the TS/JS code thanks to the runtimes. NativeScript maintains an official marketplace of plugins for most of the native modules.
In addition to this, NS directly supports dependencies from npm, CocoaPods (iOS), and Gradle (Android), along with hundreds of verified NativeScript plugins.
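In practice a plugin is usually added with the NativeScript CLI, for example tns plugin add nativescript-camera (the command shape is an assumption based on the NS 6.x CLI); it installs the npm package and wires up any CocoaPods or Gradle dependencies the plugin declares.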
AR/VR capabilities
NativeScript lets you access iOS and Android APIs to build mobile apps using JavaScript, and ARKit is no exception. The releases of AR SDKs from Apple (ARKit) and Google (ARCore) have presented an opportunity for NativeScript to enable developers to create immersive cross-platform AR experiences. There is a plugin called nativescript-ar available on the marketplace for this.
Web support
As NativeScript comes with support for different web frameworks like Angular and Vue, it allows you to build web and mobile apps out of a single codebase. It doesn’t stop at sharing only services; you can also easily share the following (a short sketch follows this list):
Component class definition — that is the xyz.component.ts
Pipes
Router configuration
SCSS variables
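A rough sketch of that sharing in an Angular code-sharing workspace (the .tns.html convention comes from the NativeScript schematics; the file names are illustrative):

// item.component.ts -- one class definition shared by web and mobile
import { Component } from "@angular/core";

@Component({
    selector: "app-item",
    // The web build resolves item.component.html, while the mobile
    // build is configured (by the NativeScript webpack setup) to pick
    // the platform-specific item.component.tns.html template instead.
    templateUrl: "./item.component.html",
})
export class ItemComponent {
    // Shared state/business logic lives here once, for both targets
    items: string[] = ["alpha", "beta", "gamma"];
}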
With NativeScript 6.0 the amount of code reuse between web and mobile has increased. NativeScript can achieve 70% code reuse across web and mobile, including support for PWAs. This shortens development and testing cycles for both web and mobile apps in production while ensuring consistency across digital channels. It also lowers the cost of development and maintenance for deployed applications.
Learning curve
As NativeScript is based on JavaScript, you can use TypeScript, Angular or Vue to develop apps. It also supports a declarative coding style. So, being a web developer, you don’t need to learn new languages or syntax.
NativeScript bypasses the need to learn Objective-C (iOS) and Java/Kotlin (Android) for bridging concepts by injecting all iOS and Android APIs into the JavaScript virtual machines.
Language used
As mentioned earlier, NativeScript uses JavaScript to build native mobile apps. It comes in different flavors — pure JavaScript/TypeScript, with Angular, and with Vue.js. So you can choose any of these to start your app.
PWA support
You can create a PWA with NativeScript. Through the NativeScript and Angular integration, it’s quite easy to build a PWA (Progressive Web App).
From v6.0 onwards, NativeScript provides support for PWAs, which also enhances code reusability between mobile and web applications.
The new concept of HMR (Hot Module Replacement) lets developers see changes to JavaScript and CSS resources without reloading the application, which enables a better user experience for PWAs.
Current limitations
NativeScript does have some limitations, mostly related to app size, which tends to be large, but you can overcome this limitation by running Webpack and Uglify. Android performance was not up to market standard in the initial versions, but the latest release, v6.7.8, is claimed to have better performance along with support for the AndroidX library.
Closing Thoughts
As a web developer, when you start thinking about building mobile apps with cross-compiled platforms, it is definitely worth exploring NativeScript as one of the reliable options. As mentioned above, it comes with a lot of the capabilities, features and plugins which we need for any mobile app development. Cheers!! | https://medium.com/globant/get-cracking-with-nativescript-421b45e0d1b3 | ['Shreyas Upadhye'] | 2020-08-13 10:54:46.769000+00:00 | ['Mobile App Development', 'iOS App Development', 'Nativescript', 'Cross Compile', 'Android App Development']
4,138 | Skiing During the Pandemic | Skiing During the Pandemic
What’s changed, what’s working, and what needs improvement
Opening day at Big Sky Ski Resort, Big Sky, Montana. Photo by Tom Johnson.
As the holidays approach and the coronavirus pandemic enters a dangerous new phase, many skiers wonder whether it’s safe to return to the slopes.
Last week, my wife and I got a first hand look. If our experience is any indication, the industry could be in for a challenging season.
Thanksgiving Day marked the beginning of ski season at many resorts across the U.S. We spent the day skiing at Big Sky, Montana’s largest and best known ski resort.
Since September, we’ve lived at Big Sky Mountain Village, located at the base of the ski area. During our time here, we’ve been impressed by the absence of crowds. All of that changed on Thanksgiving morning, when thousands of people showed up to ring in the new season. Lifts opened at 9 am, but even before then, skiers gathered at the base area. Lines quickly formed at the lifts, and skiers packed the few runs with sufficient snow to open.
From a COVID-19 perspective, skiing has the reputation of being a relatively safe activity. Skiing is an outdoor sport. Skiers are spread out over vast areas and breathe unlimited quantities of fresh mountain air.
But it’s not the skiing that poses the greatest risk; it’s the congregation at the base, in lift lines, at mountain dining facilities, and in bars and restaurants at night. As in many places, enforcement of mitigation measures is key. The best laid plans can be derailed by lack of compliance.
Lift line at Big Sky on opening day. Photo by Tom Johnson.
Destination resorts like Big Sky attract guests from across the country and around the world. Those guests bring with them illnesses present in their home regions and then commingle in resort facilities and in surrounding communities. The resulting stew has the potential to feed disease outbreaks.
The risks posed by ski areas are well documented. Last winter, a coronavirus outbreak at the Austrian ski resort of Ischgl was linked to more than 6,000 infections in nearly 50 countries, an event that contributed to Europe’s initial coronavirus surge.
Europe is now wrestling with how to avoid a repeat of last year. Austria and Switzerland recently decided to open for the season, while other countries, such as Italy, Germany and France, vow to remain shut or operate under significant restrictions.
Safety first
Last spring, surging COVID cases caused the U.S. ski season to grind to a halt. Ski areas abruptly closed in March, shortening the season by as much as two months and closing many businesses dependent on winter tourism. Big Sky was no exception.
In an effort to get ahead of potential coronavirus-related setbacks, the National Ski Areas Association this fall published their “Ski Well, Be Well” guide to best practices for skiers and resorts alike. The association represents more than 300 alpine resorts that account for more than 90 percent of the skier/snowboarder visits nationwide.
“Ski industry leaders from across the country established these foundational best practices according to scientific guidelines put forth by infectious disease experts, including the CDC and WHO,” the organization says on its website. “Ski areas will comply with additional federal, state and local regulations as they are implemented.”
Both Boyne Mountain Resorts and Vail Resorts advised the creation of the safety document and endorsed its contents. Boyne owns Big Sky, as well as Washington’s Snoqualmie, Maine’s Sugarloaf and several others. Vail Resorts, the nation’s largest ski corporation, owns and operates 37 mountain resorts in three countries, including Vail, Beaver Creek, Breckenridge, Park City, and Whistler Blackcomb.
In an attempt to assert some control over resort capacity, Vail Resorts in August announced plans for its first reservation system, requiring skiers to make reservations to ski ahead of time. Other resorts are slowly adopting reservation systems, especially for skiers using partner passes. Skiers using the Ikon Pass will need to make reservations at many resorts, including Big Sky.
Big Sky is now actively rolling out its slate of best management practices intended to curtail risky behaviors known to facilitate transmission of the virus.
“Each of our teams have worked tirelessly to develop new operational practices with the goal of providing the safest experience possible for our guests and our teams,” Big Sky Public Relations Manager Stacie Mesuda told me in an email exchange. “Many things will be different this season — directional traffic in our F&B (food and beverage) locations, new lift-line configurations, social distancing guidelines, and most important, the requirement for all team members and guests to mask up while at the resort as it is mandatory in all public space.”
The resort’s publicized face covering requirements include wearing masks while at the base area, in lift lines, while riding and unloading chairlifts, and while indoors.
“Our efforts to wear masks and facial coverings consistently are a crucial factor in staying open all season,” Mesuda said.
In addition, Mesuda said, the resort has invested in weekly surveillance testing for both symptomatic and asymptomatic employees, beginning in early December. Separately, Big Sky is participating in a community-wide testing partnership.
Good intentions
Our experience on Thanksgiving suggests that despite good intentions, operating the resort safely remains a challenge. Throughout the day, we observed behaviors that were inconsistent with Big Sky’s published regulations and raised questions about whether the resort and the industry have the ability to operate in accordance with their own safety requirements.
In many ways, skiers and the resort behaved as if this season is no different from previous ones. Lift and ticket lines were long and social distancing was scant. Signage encouraging safe practices was present but not always visible in crowded areas, leaving skiers unsure where and how to queue safely at lifts. Absent were any resort employees roaming the lines to assist with directions or enforce social distancing and face covering requirements.
Opening day at Big Sky. Photo by Tom Johnson.
Mask usage was far from universal. The face coverings we observed among guests were likely to be worn below the nose, where they provided no protection, either for the individual or those around them. Even when worn properly, the masks we observed often consisted of coarse-woven gators or bandanas. Few masks we saw were constructed of materials thought to provide maximum protection.
Mesuda told me in a follow-up exchange that “we believe our guests can do much better and have empowered all our teammates to remind guests of our resort policies, while also educating them on proper wearing, acceptable forms of coverings, and ensuring everyone does their part. Most of our guests want to do the right thing — and this is all new to them. We are finding that once we provide some education to our guests about our expectations, they are happy to comply. However, several non-compliant guests were asked to leave the resort and we will maintain that approach in every similar instance.”
Uncomfortable moments
While actively skiing, my wife and I felt as safe as we would have during any other ski season. But throughout the course of the day, we found ourselves in situations—ticket lines, lift lines, riding on lifts—that caused us to accept risks that we have scrupulously avoided throughout the pandemic. This made us uncomfortable.
In the line to obtain our passes, a printer malfunction delayed processing of orders. We stood in line for more than 30 minutes. Most patrons around us wore masks, but social distancing was spotty. We saw no resort employees in the area enforcing compliance. There were no mazes to guide patrons, no marks on walkways suggesting safe distancing. The group behind us continually encroached on our personal space. At one point, a man in the group stood only inches behind my wife.
“Can you please scoot back?” she asked.
The man puffed up and glared at her as he took a step back. “Is this far enough?”
“Six feet,” my wife said, pointing to a resort sign posted next to her.
In the lift line, because of crowd density, we saw few posted signs encouraging compliance with the resort’s safety rules until we made it all the way to the front and were about to board the lift. By then, we had spent 20 minutes jammed together with other skiers awaiting a trip up the mountain.
“Are you having trouble enforcing social distancing and mask requirements?” I asked a lift operator as we approached the front of the line.
“It’s not that I’m having trouble,” she said. “They pull their masks up when I tell them to. But until then, they just do what they want.”
She asked whether we had seen many people without masks. We’d observed around seventy percent compliance, I told her, but that even those in compliance wore face coverings that offered little protection, or they wore them below the nose. She agreed.
“Did you receive much training on COVID?” my wife asked.
“It was very brief,” she said. “Very brief.”
Mesuda later told me that “every new and returning employee went through a mandatory orientation session which featured COVID-19 education, review of resort policies and expectations (for guests and employees), and complemented with department-specific training from department managers.”
I asked the lift operator whether the resort had people patrolling lift lines to ensure compliance with Big Sky’s face covering and social distancing requirements.
“We’re supposed to,” she said, looking futilely out at the line snaking into the distance.
Looking in the same direction, we saw hundreds of people packed together on a windless day, sharing space and air, breathing hard from runs just completed. Compliance tended to fall among groups: if one member of a group wore a mask, everyone wore a mask. In other groups, no one wore face coverings.
Skiers who went through the line unmasked weren’t asked by lift operators to mask up until just before loading onto the lift — at which point, a mask was arguably less helpful.
“Thanks for caring about this,” the lift operator said as we boarded. “Because nobody else seems to.”
Opening day at Big Sky. Photo by Tom Johnson.
Who’s in charge?
At least initially, Big Sky appears to have chosen to rely on guests to independently monitor their behavior and adhere to the resort’s safety requirements. Mesuda as much as confirmed this.
“While we do our part, we are also asking our community to do their part and use good judgement to be socially distant whenever possible,” Mesuda said.
And that’s part of the problem. Perhaps due to coronavirus fatigue or politicization of COVID mitigation measures, a segment of the population continues to abstain from mask wearing and social distancing requirements. Without enforcement, the resort and its guests are at the mercy of those who elect not to comply. And those who exhibit reckless behavior on the slopes are likely to conduct themselves recklessly in other aspects of the their lives, making them more likely to contract the virus and put everyone around them at risk.
Regardless of guest behavior, the resort appeared to have some trouble doing their own part. For instance, we were told to expect separate lines for those wanting to ride only with members of their parties and those willing to share lift rides with others.
“Our lift riding plan is a hybrid intended to maximize uphill capacity while respecting personal choice and space,” Mesuda said. “Guests can choose one of two lines — a “Friends & Family” line if they want to ride with the group they are traveling with (drive together, ride together) or a “normal” line, which would load lifts with unrelated parties as we have typically done in the past.”
On opening day, we saw no such options and no signage directing us. At the front of the line, we were loaded onto the lift with another party, and no one asked us whether that was okay.
Mesuda later explained that the option to ride only with members of one’s own party is limited.
“On select lifts, we are offering guests the option to ride with their party only; and if guests are comfortable, the ability to ride with other parties as well,” Mesuda said. “On lower volume days, we will accommodate our guest’s desire to ride alone or only with members of their party across all lifts that are running.”
On Big Sky’s website, the resort states it will not enforce maximum capacity on chairlifts but will allow groups traveling together to ride on their own chair when “operationally feasible.” In contrast, lifts will be loaded only with people in the same group in nearby Colorado.
Table for two
Perhaps the greatest danger on the mountain lies within the resort’s indoor dining facilities. States around the country are once again issuing stay-at-home orders and forbidding or limiting public gatherings indoors, including indoor dining at restaurants. Despite ranking in the top 10 per capita for test positivity rates, conservative-leaning Montana has resisted implementing aggressive restrictions.
“We are operating restaurants at 50 percent occupancy in compliance with state and county guidelines,” Mesuda wrote. “In addition to managing to a reduced capacity, we have implemented additional measures to minimize the risk of COVID-19 exposure which includes: requiring facial coverings in all indoor facilities unless seated and eating, directional specific entry/exits, increased frequency of sanitizing common area surfaces, as well as online ordering, dedicated pickup areas and even the introduction of a delivery service to lodging units by way of Swifty Delivery.”
Operating dining facilities at 50 percent capacity falls short of restrictions imposed by many cities and states with lower per capita infection rates, and in places where diners are more likely to come from nearby neighborhoods. At Big Sky and other resorts, dining facilities could be veritable melting pots — and 50 percent capacity is little different from an average day in mid-winter.
The CDC rates on-site dining with indoor seating as “Highest Risk” if seating capacity is not reduced and tables not spaced at least 6 feet apart.
Former FDA chief Scott Gottlieb said Monday on CNBC that he avoids indoor dining altogether.
“I will not eat indoors in a restaurant,” Gottlieb said on “Squawk Box.” “I’ve been eating outdoors since the summertime and wouldn’t eat indoors in a restaurant. I think the risk is too high to be in a confined space without a mask on with other people eating in that same location right now.”
While admittedly cautious, my wife and I haven’t set foot in an indoor restaurant in eight months. The prospect of sitting down with strangers from distant locales, regardless of how much surface cleaning is done, is unimaginable. For skiers with concerns about airborne transmission of COVID-19, that leaves few choices but to eat outside. Al fresco dining is fine when the weather is pleasant, but it’s a chilly prospect during a January blizzard.
The author and his wife on opening day at Big Sky. Photo by Tom Johnson.
Looking forward
Opening day at Big Sky reminded me of an episode of truTV’s educational comedy series Adam Ruins Everything. In an episode entitled “Adam Ruins Security,” Adam explains the concept of “security theater.” Security theater is the practice of enacting security measures that are intended to provide the feeling of improved security while doing little or nothing to achieve it. Examples include tightened airport security after a terrorist attack.
On opening day, I felt that Big Sky was to a degree practicing “safety theater.” Through their published guidelines, the resort talked a good game, and this made me feel safe. But on opening day, the resort failed to take some of the actions that would have actually kept me safe.
Since opening day, I’ve noticed some improvements. Mesuda acknowledged as much:
“The lift lane configuration has been extended for Ramcharger 8 and the same since the opening of Swift Current, with a recurring placement of messaging noting facial coverings are required and to socially distance at least 6 feet apart. While lift lanes may have seemed crowded, with most guests maintaining a “tip to tail” distance between skiers and riders, they are generally spaced 6 feet apart and adhering to the recommended spacing between parties. It’s also important to note, by providing our guests the opportunity to only ride with their party on select lifts, and encouraging additional spacing between groups — lift lines will appear longer this season. However, with reduced guest volume and our high-speed chairlift network, we believe guest wait times will be less impacted.”
Readers of this story can view the photos included and ascertain for themselves whether Mesuda’s “tip to tail” comment is accurate.
In the resort’s defense, Thursday was the first day of the season. The resort will no doubt iron out kinks and address shortcomings in their safety policies. The situation on opening day was made more difficult by virtue of the fact that snowfall has been light, meaning that only a couple of lifts and runs were open. With few choices, visitors are confined to a smaller footprint of skiable terrain. As the season progresses, guests will spread out. This may reduce congestion at the base and in lift lines.
But the resort only has a few weeks to get it right before the holiday crush, when many times the number of visitors — heralding from broad geographic areas — will descend on the mountain. By then, the pandemic’s grip on the nation is expected to tighten.
Big stakes
There’s a lot at stake, not just for Big Sky, but for the country’s nearly 500 ski resorts and the communities that depend on them. Snow sports tourism contributes around $20 billion to the U.S. economy each year, according to researchers at the University of New Hampshire and Colorado State University.
There’s a lot at stake for skiers, too. Between travel, accommodations, lift tickets and gear, skiing is an expensive sport. Having a ski vacation cut short by a COVID-19 infection is a difficult pill to swallow, even without considering the risk of ending up in the hospital. And if guests become infected, they may be forced to quarantine onsite.
“Ski areas have also been asked to message to their guests that they will be required to extend their stay and quarantine should they test positive for COVID-19 during their stay,” a spokeswoman from the Colorado Department of Public Health and Environment recently said.
If that were to come about, guests themselves may be on the hook financially.
“If you have to isolate, you are going to have to pay,” Aspen Chamber president Debbie Braun said. “People need to be very aware when they come to town, and we need to make sure they understand our public health orders.”
Mesuda said that in the event that guests become infected during their stay at Big Sky, the resort will “assist them on a case by case basis to ensure they are isolating safely and in compliance with the local health department’s best practices.” She declined to say who would pay for an extended stay.
Bottom line
In the weeks leading up to opening day, we were comforted by Big Sky’s aggressive messaging that suggested the organization is respectful of the coronavirus and is taking steps to mitigate the threat it poses. Big Sky’s website contains pages of information detailing the steps they have taken to ensure guest safety.
Our actual experience underscores the fact that regulations without enforcement are of little use. There will always be a subset of the population that ignores or even ridicules restrictions as onerous, overbearing, or an infringement on personal liberty.
Mesuda says that with the policies it has in place, the resort is confident it can make it through the season safely. If things go poorly, however, they are not afraid to adjust.
“We intend to use common sense and good practices to open safely and efficiently for the full duration of our ski season. Like last winter, we are not afraid to pivot or make hard choices once the season is underway; but remain confident that our current plan will allow us to have a full season of skiing.”
With little snow in the forecast, skiable terrain across the West will likely remain limited into the holiday season. Congestion at the base, in lift lines, and in restaurants will persist. Guests will continue to flock in from all points. With that in mind, it is incumbent on Big Sky and other resorts to do more to enforce their regulations and weed out noncompliant visitors who would put others at risk.
My wife and I intend to ski throughout the season, but like all things this year, we will modify our behavior. We will restrict our skiing to days when crowds are thinner. We’ll bring our own meals and eat them outside. We’ll avoid situations where crowds form and social distancing is inadequate. We’ll request to ride lifts without other parties. We may ski shorter days, since we won’t use the resort’s lodges and restaurants for meals or rest breaks.
By limiting our exposure to others, we believe we’ll feel safe enough to ski. And as long as we are skiing—actually gliding over snow in fresh mountain air—there is probably not a safer place to recreate. | https://medium.com/illumination/thinking-of-skiing-this-season-33a2fcb30ae | ['Tom Johnson'] | 2020-12-09 15:03:52.571000+00:00 | ['Sports', 'Health', 'Business', 'Skiing', 'Covid 19'] |
4,139 | I always wanted to write but I never felt good enough. | During my senior year of my undergraduate degree I really started to notice how much school and work had taken over my life. I was living life by reacting to every fire that appeared, instead of conquering my goals. Realizing that I was headed down the wrong path mentally forced me to re-examine what I wanted out of life. I started to read again, looking for meaning where I had lost it. I started to listen to great minds again, through YouTube and podcasts. I can’t stress enough how much listening to podcasts during my daily commute has improved my life.
If you aren’t listening to podcasts or audio-books instead of the same old songs and boring radio stations, then you need to get Castbox, Spotify, Soundcloud or whatever is easiest.
Taking those 10–30 minutes every morning and evening to think critically about new topics or listen to something more engaging than the newest hit song will help everyone in the long run. There are podcasts about anything you can think of, just like there are books about almost every topic imaginable. Find what suits you and upgrade your drive time.
Another flaw I am working on overcoming is procrastination. If there’s one thing I have learned in MBA school, it is that “what gets measured tends to get done”. That is to say, if you set a simple goal, you are much more likely to complete the task than if you keep that task locked up inside your head. Make a list on your phone of tasks you want or need to get done (I use Google Keep for lists and Google Tasks for specific goals). Since I started writing out my goals I have gotten a lot more done, and I feel more accomplished checking off the box on my phone app.
One of the newer techniques I am trying out to boost my fight against laziness is setting goals on my Google calendar (which I had never thought of before). I have mountains of books I want to read, but I fall into the habit of getting lost in the middle of books and just watching T.V. instead. To fix this I set a new goal on my calendar to read 30 minutes on at least 3 days a week. So far I like it, because it notifies me at the time of day I want to read and nudges me to get some reading done. The first week I only read on 1 of the 3 days, but this week I have all 3 already. People don’t like failure, and setting goals that remind us of when we don’t live up to our ideals nudges us to do better.
It seems trivial, but recognize your goals by defining them. Set some type of measurement for your definition of “success” towards that goal, and you will see a difference. Setting a realistic goal is half the battle. I could have easily set my goal as reading every night of the week, but that is too much to jump right into for me at the moment. That doesn’t mean I don’t want to read every night, because I do, but I am working slowly towards it. The rub is that there will be ups and downs along the path, but keep striving towards the top and you will see improvements.
I reset my life by realizing what my goals were. I wanted to improve my relationships, read more, learn to write better, and countless other goals, but I never would have made as much progress if I had left those goals floating around in my mind. Part of my procrastination is the feeling of not making significant progress, but setting measurements has helped me to see the changes I am making in my life. I am determined to keep climbing. | https://medium.com/the-ascent/how-i-upgraded-my-life-and-started-writing-again-bb5983ed229d | ['Michael Wentz'] | 2018-07-11 21:01:01.194000+00:00 | ['Life Lessons', 'Procrastination', 'Goals', 'Writing', 'Personal Development'] |
4,140 | iPad Air 4 vs. iPad Pro (2020) | iPad Air 4 vs. iPad Pro (2020)
…Did Apple create its own iPad Pro killer?
iPad Pro (2020) vs. iPad Air
Apple’s “Time Flies” event just wrapped up, and though there wasn’t an abundance of hardware showcased, leaving many disappointed, there is something big to take notice of. Of all the products we got a look at during the event, the one that generated the most hype is the brand new iPad Air, which now shares a lot of similarities with the 2020 iPad Pro. This raises a question that many will need answered when buying these iPads — which iPad should you buy? The iPad Air or the iPad Pro?
This is why we’re going to discuss the major differences between both as a buyer’s guide.
Design
The iPad Air has existed since 2013 as a middle-level iPad that shares a lot of the iPad Pro’s features but didn’t have that higher-end price tag. The two were quite similar for a while until 2018, when Apple modernized the iPad Pro with a squared-off design and edges, giving it a much more industrial look, smaller bezels, and Face ID. That also meant the removal of the headphone jack.
For those who wanted this stunning design, they looked up to the iPad Pro.
However, that is not the case anymore.
The 2020 iPad Air has received an all-new design in the same design language as the iPad Pro. The Air was not just granted a new design, but new colors as well! Both high-end tablets receive the “Space Gray” and Silver colors, default yet clean. But the new colors of the Air are a glowing Rose Gold, a tint of Green, and Sky Blue, which resembles 2020’s Color of the Year! My, the iPad Air’s colors are amazing.
iPad Air 4 colors.
The iPad Air gets a wider variety of colors than the Pro. But to be honest, more than half the people who own any iPad use a protective case, as these iPads come with a hefty price. So, if you’re one of those who use a case for your iPad, don’t be too surprised if you can’t see your beautiful new color!
Display
With the iPad Pro, you have the option to pick between an 11-inch display and a 12.9" one. With the iPad Air, there is just a 10.9-inch option. It’s the 11" Pro but with bigger bezels, decreasing the screen size. For those looking for the biggest screen a high-end iPad can provide, the 12.9-inch iPad Pro is the way to go. Many people want to turn their iPads into laptop replacements, so for those people this relatively big size is beneficial to get that laptop feel and size.
Both the iPad Air and the iPad Pro are True Tone capable.
The iPad Pro is a little brighter with 600 nits compared to 500 nits on the new Air. If you want a little more screen brightness, the iPad Pro is going to be the way to go.
The last major thing regarding the display is ProMotion. Having that fluid 120Hz display is one of the key features that stand out on the iPad Pro. Not only does it make the iPad feel super smooth, it also allows the refresh rate to drop down to match your content, so if you’re watching a movie, it runs at its native 24fps, saving battery life at the same time. ProMotion is also what makes the iPad Pro by far the best tablet to write on, with latency so low you won’t even feel it. Whether you’re gaming, writing, or just browsing the web.
On the writing note, both iPads do support the 2nd generation Apple Pencil, but the experience you’ll get on the iPad Pro is superior to what the iPad Air can offer, thanks to 120Hz.
The iPad Air’s 60Hz screen will have more than double the latency, but keep in mind Apple’s latency is down to 9 milliseconds! Not much of a downside to me…
Authentication
iPad Air 4 built-in Touch ID
Although both screens look very similar, there is a very important distinction to make, and that is authentication. On the iPad Pro, you have all the benefits of Face ID — for unlocking your iPad, making purchases, Apple Pay; all that stuff is handled by Face ID, the facial recognition system tucked into the bezel at the top of the iPad. With the iPad Air, you get Touch ID built into the Air’s power button. We’ve seen Touch ID in the home button but never integrated into the power button itself. This can be a benefit if you’re in a public space and required to wear a mask: you can unlock in an instant, all with your finger. Here Touch ID comes out victorious, and Face ID — not so much. It may not be “easier” than just glancing at your iPad, but it does get the job done.
As for the speed of each, we’ll see in October…
Processor
This is where the tables may turn in the iPad Air’s favor, because the new iPad Air is equipped with Apple’s all-new A14 Bionic processor: its latest and greatest silicon, and the successor to the already fast A13.
This is where it becomes tricky. This is the same chip that will go into the iPhone 12s, so it must be massive. Apple’s A12Z (the iPad Pro’s chip) is still the mightiest Apple silicon in intensive tasks, as seen in Geekbench 5’s multi-core test, while the A13 beats the A12X in the single-core test. So the A14 will certainly be even faster, exceeding the A13 by a notable amount. But Apple may hold back the intensive strength of the A14 so that it doesn’t beat out the iPad Pro’s chip; chips are the Pro’s specialty. The A14 has 6 CPU cores and a 4-core GPU, compared to the A12Z’s 8 CPU cores and 8-core GPU. Again, the A14 won’t outperform the A12Z.
Again, we’ll have to wait till October for the real fun to begin. Reviewers will help us find the definitive answer then. But know this: single-core wise, in light tasks like surfing the web and YouTube, the A14 will triumph. Maybe not in intensive work…
Etc.
Like the iPad Pro, the Air utilizes a USB-C port, which allows for connecting external storage, better power delivery, and faster data transfer speeds.
The iPad Pro has a 12-megapixel main camera plus an ultra-wide camera. The back of the iPad Air has just one single 12-megapixel camera and no ultra-wide. I honestly use my iPad camera for occasional selfies and to scan the odd document, so it doesn’t matter much to me, but if it does for you… well, now you know! The Pro also has a LiDAR scanner.
Speaker-wise, the iPad Pro has a four-speaker setup that delivers a very immersive stereo audio experience, while the iPad Air has two speakers for a landscape stereo audio experience. (It has four cutouts for four speakers like the Pro, but this iPad only has two, one on each side.)
RAM could stand in the way of many buying this iPad Air, because it only has 4 gigabytes compared to 6 on the Pros. Many people complain about 4 gigabytes on an iPhone, so on an iPad, would the problem be worse? And as Apple has positioned iPads like the Air to become laptop replacements, would 4 gigabytes of RAM suffice for computer-level multitasking? Ask yourself that, because if you are going to do heavy tasks, even just multitasking, the Pro is the best choice… for an iPad. Macs are just around the corner, mates!
Accessories won’t be a problem. The iPad Air is also compatible with all of the iPad Pro’s accessories, such as the Magic Keyboard and the 2nd generation Apple Pencil.
Storage capacity. The iPad Air is limited to an “okay” storage range of 64GB to a max of 256GB, whereas with the iPad Pro you have more flexibility, from 128GB up to 1TB.
Pricing
You can buy a base model 11-inch iPad Pro right now for $799. The newest iPad Air base model is $599. The base 11" Pro has 128GB of storage, while the $599 Air gets only 64GB. For $200 more, you get 128 gigabytes of storage, 120Hz, Face ID, etc. if you get the Pro instead. BUT, to be honest, the Air is already getting Apple’s next best chip, has THE same modern shape and design, and access to the Pro accessories and more. The only “big” thing you could miss out on from the Pro is ProMotion, but if you don’t care, that’s fine! Plus, if storage is a problem, you can up the Air’s storage to 256GB and still come in $50 under the base 11" Pro.
Closing
All in all, the iPad Air has in many ways become a better deal than the Pro. Personally, my dream iPad is an iPad that does the following:
A powerful chip
A top-notch display
A modern design
“Durable” in the long run
Access to the iOS/iPadOS App Store coming from an Intel Mac
Worth it all the way financially
The iPad Air is the definition of my dream iPad, and that title used to belong to the 11" Pro.
However, there are some great things that will make some people gravitate to the Pro, and that’s fine. For those, go for the Pro… but the Air is worth a chance! | https://medium.com/swlh/ipad-air-4-vs-ipad-pro-2020-97a76d396e40 | ['Doctor Marvel'] | 2020-10-22 06:38:42.232000+00:00 | ['iPad', 'Apple', 'Technology', 'Tech', 'Gadgets'] |
4,141 | by Jean de La Rochebrochard | Funnel, Model, Growth & Retention
Whatever your business is, you have to master those four sets of metrics.
Funnel of conversion
The destination does not matter if you fail at taking the right path towards it. The funnel of conversion is the path from business origination toward closing and post-closing. If you run an e-commerce business, for instance, it goes as follows:
Points of contact (SEO, SEM, Content, Social Sharing, Word of mouth…)
Converted into visitors
Converted into users
Converted into buyers
Converted into customers & ambassadors
This is a simple representation of what a conversion funnel looks like. What it also shows is that every single step matters and is connected to the next one. Visitors don’t become buyers, they become users at first: people who navigate within your website/application, interested in what you have to offer. They become buyers when they purchase something and they become real customers when you can build a relationship with them so they can buy from you again and talk about your service/products around them!
If you develop a mobile consumer app, it goes as follows: Points of contact (App Store featuring, media…) converted into Downloads converted into Signups converted into Active Users (DAU, MAU…) converted into Purchasing Active Users?
If you run a SaaS Business: Points of contact converted into Visitors > Trial Users / Demo Request > Converted Users > Up-selling rate / Churn rate
Define your funnel of conversion and observe where the bottlenecks and points of friction are. See where you fail to lead more people toward the next step and focus on improving each step, one after the other. Don’t overthink, keep it simple.
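To make this concrete, here is a minimal Python sketch (the step names and counts below are made up purely for illustration) that computes the step-by-step conversion rates of a funnel, so the bottleneck shows up at a glance:

# Hypothetical funnel: replace the steps and counts with your own data.
funnel = [
    ("Visitors", 100_000),
    ("Users", 30_000),
    ("Buyers", 3_000),
    ("Customers & ambassadors", 900),
]

# Conversion rate of each step to the next one
for (step, count), (next_step, next_count) in zip(funnel, funnel[1:]):
    print(f"{step} -> {next_step}: {next_count / count:.1%}")

# Overall conversion from the top to the bottom of the funnel
print(f"Overall: {funnel[-1][1] / funnel[0][1]:.2%}")

Rerun it after each improvement you ship and watch which ratio moves; the step with the lowest rate is usually the one to work on first.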
Business Model & Model Equilibrium
Who are your customers, what do you sell to them (product, service, ads…), and in what form (subscription, one shot)?
Your business model equilibrium is like your funnel of conversion, business-wise. You go from the revenues all the way down to the operational result. You must distinguish between the aggregated funnel of your business model and the detailed version of it. Let me explain it for an e-commerce business:
The aggregated business model equilibrium, monthly, looks as follow:
+ Average Basket per order
- Average Cost of Goods Sold (per order)
= Gross Margin (per order)
- Average Logistic Costs (…)
- Average Transportation Costs (…)
- Other Variable Costs Associated (in average of course, per order…)
= Contribution Margin (Pre Marketing Costs)
- Average Marketing Costs (Marketing Costs for the month / # of Orders)
= Net Contribution Margin
Net Contribution Margin * Number of orders = Available resources to cover your fixed costs. You see, it’s pretty simple. The only problem here is that we only cover aggregated data. We have no details whatsoever about the gross margin per product, the marketing costs, the rate of returning customers, the costs of logistics and transportation…
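As a sketch (all figures below are hypothetical), the same waterfall fits in a few lines of Python, which makes it easy to play with each metric and see its impact on the bottom line:

# Hypothetical monthly averages per order, in dollars.
avg_basket = 60.0
cogs = 25.0                # average cost of goods sold per order
logistics = 5.0            # average logistic costs per order
transport = 7.0            # average transportation costs per order
other_variable = 3.0       # other variable costs per order
marketing_month = 40_000   # total marketing costs for the month
orders = 5_000             # number of orders for the month

gross_margin = avg_basket - cogs
contribution_margin = gross_margin - logistics - transport - other_variable
net_contribution = contribution_margin - marketing_month / orders

print(f"Net contribution per order: ${net_contribution:.2f}")
# Resources available to cover your fixed costs this month
print(f"Available for fixed costs: ${net_contribution * orders:,.0f}")

With these numbers the net contribution margin is $12 per order, leaving $60,000 to cover fixed costs.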
You must take each of those metrics separately and observe their specificities, their min/max & standard deviation, how you can improve them individually in order to improve your overall business model equilibrium.
Focus on the ones with the higher impacts (usually on top, like the gross margin).
Growth
Growth is not gross! Let’s see it this way: if your startup does not grow, another one does. There are too many things entrepreneurs hide behind in order not to focus on growth: Product development, Branding, Team, Technical debt… And many others!
Growth is something you always run after, like almost everything else in a startup. Growth is a full time, scary, challenging everyday mission to achieve.
What matters is the Compound Growth Rate. You should focus on the most downstream metric of your startup and make sure it grows, week after week, month after month. For instance, to calculate the growth rate of your monthly active users, the formula is the following:
((Ending value / Beginning value) ^ (1/ # of periods)) - 1
Month 1: 100 000 users
Month 12: 200 000 users
Oh great! 2 times more users during the period?!… Except that ((200 000 / 100 000) ^ (1/12)) - 1 = 5.95% compound monthly growth, and let’s get this straight: this is not great! Look at the real, brutal impact of the compound growth rate over a 1 year period only:
5% growth weekly = Ending value 12x the beginning value
10% growth weekly = Ending value 129x the beginning value
30% growth monthly = Ending value 18x the beginning value
As you can see, if your compound weekly growth rate is not 5% but 10%, the impact is not 2 times but 10 times more important over a 1 year period only! Now you understand how empires like Uber, Airbnb or Snapchat can emerge only a few years after inception.
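For those who want to check these multiples, here is a minimal Python sketch of the formula above; note that it counts 51 weekly (respectively 11 monthly) compounding steps between the first and the last period of the year, which is how the figures above work out:

def compound_growth_rate(beginning, ending, periods):
    # Per-period compound growth rate, as in the formula above.
    return (ending / beginning) ** (1 / periods) - 1

# The example from the post: 100 000 -> 200 000 users
print(f"{compound_growth_rate(100_000, 200_000, 12):.2%}")  # 5.95% monthly

def ending_multiple(rate, periods):
    # How many times the beginning value you end up with after compounding.
    return (1 + rate) ** periods

print(f"{ending_multiple(0.05, 51):.0f}x")   # 5% weekly  -> ~12x
print(f"{ending_multiple(0.10, 51):.0f}x")   # 10% weekly -> ~129x
print(f"{ending_multiple(0.30, 11):.0f}x")   # 30% monthly -> ~18x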
Retention
Retention is the Holy Grail of all metrics. It is useless without growth, just as growth is stupid without retention. Retention has many forms:
If you sell subscriptions, how many of your customers remain after 1, 3, 6, 12, 24 months? It allows you to calculate both your churn rate (how many of them leave) and your Customer Lifetime Value (how much one customer generates on average during a 12-, 24- or 36-month period).
If you develop a mobile consumer app, what is the ratio between your monthly active users and your daily active users (DAU/MAU)? How many of your active users remain active after 1, 3, 6 months? How many friends do they invite, what is the virality effect of your app?
If you run an e-commerce business, what is the percentage of customers who buy 2 times or more every year? How many of your customers are returning ones? What is the average number of orders per customer per year?
Learn how to calculate those retention metrics and for those who struggle with the understanding of a cohort, here is a quick example:
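(The quick example appears to have been an image in the original post and did not survive extraction; as a stand-in, here is a minimal Python sketch with made-up numbers that builds a small retention cohort table.)

# Hypothetical cohorts: users acquired each month, and how many of them
# are still active 0, 1, 2, ... months later.
cohorts = {
    "Jan": [1000, 620, 480, 410],
    "Feb": [1200, 760, 590],
    "Mar": [1500, 940],
}

print("Cohort     M0     M1     M2     M3")
for month, actives in cohorts.items():
    base = actives[0]  # the cohort's starting size
    cells = "  ".join(f"{n / base:5.0%}" for n in actives)
    print(f"{month:9}{cells}")

Each row is a cohort (the users who signed up in the same month): reading across shows how that cohort decays over time, while reading down a column compares cohorts at the same age.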
Now that you’re getting more familiar with the Fantastic 4, apply them to your business. Again, Google, Quora or any valuable person of your industry can help you. Do not hesitate to ask.
If you liked this post, please share it with the rest of the community :) | https://medium.com/kima-ventures/the-fantastic-4-funnel-model-growth-retention-dc47f1c761cd | ['Jean De La Rochebrochard'] | 2015-09-05 11:52:33.456000+00:00 | ['Metrics', 'Startup', 'Analytics'] |
4,142 | Voice of an Angel | Voice of an Angel
A novel
Image courtesy of Conny Manero
Synopsis
Talent agent Jack Garrett hears the voice of an angel drifting down from a balcony in Greenwich Village. Frustrated, he spends nights walking the streets trying to find his angel. Jessie Green is in a dead-end job until she loses it, and quickly grabs an opportunity for a better life. With her best friend, Betty McGill, they both stumble into new but different careers with the help of serendipitous good luck. Through a web of unexpected circumstances, Jack and Jessie’s lives are about to collide with more than a few surprises. Will love get in the way of making their dreams come true? Jessie and Jack both have a lot to learn, but can they really trust each other? Voice of an Angel… where more than one dream can come true.
Chapter 1
June 1998
Jessie Green glanced at the red digital clock on the wall … 4:30 p.m. Another half hour and they could all go home. With a sigh, she reached for another shirt from a pile of freshly laundered linen and placed it on the press. In the five years she had worked for Muller’s Laundry & Dry Cleaning, she had pressed hundreds, maybe even thousands of garments: shirts, blouses, slacks, tablecloths, and bedsheets. It was not a bad job. She knew there were better jobs, but with no qualifications, working in a laundry was all she could do.
When Jenny Sullivan came to collect the work orders of the day for invoicing tomorrow, Jessie watched the girl with a mixture of admiration and envy. Jenny was Harry Muller’s assistant and always looked picture perfect. She never had a hair out of place, a smudge in her make-up, a wrinkle or stain on her clothes, a ladder in her stockings or dirt on her shoes. Jessie wondered how she did it, how she managed to always look so cucumber fresh.
Looking at Jenny made Jessie wish she had finished high school, and then she too could have gone to secretarial school and looked smart in cute little outfits, with cute little shoes. Instead, she wore jeans, T-shirts, and sneakers to work because being comfortable was important when you were on your feet eight hours a day.
She often regretted dropping out of school. If only she had stuck it out those last three months. But no, back then she was far too anxious to make her debut into the working world. She felt she was wasting her time in a classroom. She could not wait to get out into the real world and start earning money.
When Jessie heard that Muller’s Laundry & Dry Cleaning was looking for help she applied for a job and was hired on the spot. The following Monday, instead of going to school, she proudly went to work. At the time she was certain she was making the right decision, but now she was not so sure. If she had graduated she could have had her choice of careers. Instead, she worked in this laundry, this hot, steamy laundry, and was probably stuck here forever. Sure she was earning money, but Jenny probably made double if not triple what she was making.
At the sound of her name, Jessie looked up from her work and saw Betty McGill frantically tapping her wristwatch. She cast another glance at the wall clock and nodded at her friend. It was just after 5:00 p.m.
“Are you okay?” Betty asked as they walked home, noticing that her friend was not her usual talkative self.
Jessie gave a listless shrug. “Just thinking, you know.”
“About what?”
“The past. The future.”
Betty frowned. “That’s heavy thinking my friend.”
“Don’t you ever think about things?”
“Like what?”
“Like what the future holds for you.”
Betty shrugged her shoulders. “I suppose I’ll meet a nice guy, get married and have kids someday. What else is there?”
“A career.”
“A career!” Betty burst out laughing. “Jessie, you and I work in a laundry, I would hardly call that a career.”
“Don’t you ever wish you could do something else? Something a little more challenging, a little more sophisticated.”
Betty looked at her friend and smiled. “Sure I do. I would like to be a doctor or a lawyer or something else that earns me tons of money, but I’m not exactly qualified.”
Jessie hesitated before making the suggestion.
“We could go back to school.”
Betty laughed again. “Jess, it takes years to qualify as a doctor or a lawyer, and we didn’t even finish high school.”
Jessie waved an impatient hand. “I don’t mean that. I mean, we could take a course, a secretarial course.”
“You mean to learn to type and stuff?”
“That is exactly what I mean.”
Betty looked doubtful. “I don’t know Jess, I’m sure there’s more to being an assistant than just typing. I think you have to be smart for that sort of work.”
“We are smart, Betty,” Jessie retorted with a small edge in her voice.
Betty continued. “And there is the small problem with a decent wardrobe. You’ve seen the kind of outfits Jenny wears to work. I don’t know about you, but I don’t have those kinds of clothes.”
Jessie had to admit that Betty had a point. Their wardrobe was a potential problem. Both of them wore mainly jeans and T-shirts. Hardly appropriate office wear.
“Any plans for tonight?” Betty asked in an attempt to change the subject.
“Nothing special,” Jessie answered with a hint of boredom in her voice. “Same thing I do every Monday, Tuesday, and Thursday night … ironing.”
“You still iron for your neighbors?”
Jessie nodded. “Elizabeth and Clara are old, they can’t do their own ironing anymore and they are very grateful that I help them. I do Elizabeth’s laundry on Mondays, Clara’s on Tuesdays, and my own on Thursdays.”
Betty shook her head in wonder. “I don’t know how you do it girl. You iron all day long and then you go home to more ironing. Haven’t you ever suggested to them that they could send out their stuff to a laundry?”
“No,” Jessie said vehemently, “and I’m not about to, it’s extra money for me.”
That night after she finished dinner and washed the dishes, Jessie set up her ironing board and iron and collected the ironing from the storage room. She switched on the stereo, selected a CD, plugged in headphones and turned up the volume. She liked nothing better than to sing along with a CD.
Singing along with a CD was something Jessie loved to do while ironing. She sometimes worried that the neighbors might hear her, but thought this unlikely. She never heard a sound from them, so she figured they couldn’t hear her either. If her voice drifted down to the street through the wide-open balcony doors that was different. People on the street below couldn’t see her. They didn’t know where she was, didn’t know who she was.
When the last piece of clothing was ironed and folded, Jessie packed away the iron and the board, put the kettle on for a cup of coffee, and decided she would curl up with a book on the couch. She would slip between the pages and let herself be transported to a sleepy Irish village with some wide-awake citizens. She loved the little village in which the story was set, and she loved the people in it. They seemed so real. They were not the pretentious high society types with tons of money. They were not professionals with glamorous careers. They were ordinary people, with ordinary lives, who loved and cried, worked and struggled, and somehow made a success of what they were doing. Considering herself ordinary too, Jessie liked reading success stories. They gave her hope and courage for the future.
When the clock struck eleven she reluctantly closed her book and carried it with her to bed. She stopped to close the balcony door and switch off the lights. In bed, she would read another couple of pages and before falling asleep and dreaming of a wonderful future.
Chapter 2
But Jessie couldn’t sleep. She tossed and turned and imagined herself sitting behind a desk. She would be dressed in a stunning outfit, answering ringing phones with a smile on her face. A million thoughts scurried through her mind. She knew that completing the course would present many obstacles. She worried she might not qualify as an applicant due to her lack of education. If she was accepted it would have to be an evening class. Would she be able to manage working all day and attending school at night? She wanted this so badly she would just have to do it. She also wondered where such classes were held, how long each class was, how long a course was, and how much it would cost.
When a nearby church bell struck two o’clock, Jessie sat up and slipped out of bed. She would have some hot chocolate. Maybe that would help her sleep.
Sipping the hot drink at the kitchen table, she reached for yesterday’s newspaper and turned to the classifieds. She was surprised at the number of ads for secretaries, administrative assistants, and executive assistants. She wondered what the difference was between an executive assistant and an administrative assistant. She studied the requirements for each job listed: typing correspondence, typing financial statements, organizing meetings, scheduling appointments, booking flights and hotel accommodations, filing, and answering calls.
When she turned the page she saw a number of ads for private colleges. Some offered courses in drawing and painting, some in car mechanics, hairdressing, foot care, and massage. There were also some that offered secretarial courses. Jessie’s eyes widened when she saw the price … a thousand dollars for a three-month course, not exactly cheap. Somewhat disheartened she closed the paper, finished her hot chocolate, and went back to bed.
The next day at work she made some mental calculations. Half of her wages went to rent, a portion went to bills, another portion to groceries, and toiletries. That left precious little to spend on personal items or necessities for the apartment. How could she possibly save up a thousand dollars for a course?
At three o’clock, Betty indicated with a drinking gesture that it was time for a break.
“You look tired,” she commented as soon as she and Jessie sat down at one of the cafeteria tables. “Are you feeling okay?”
“Fine,” Jessie shrugged. “Just a little tired. I didn’t get much sleep last night.”
“Oh?”
“I kept thinking about taking that course, the secretarial course, and…”
“What is it suddenly with you wanting to be an assistant?” Betty demanded in an annoyed tone. “You’re a press operator. You have been for five years. You’ve always been happy with your work. At least I’ve never heard you complain. But now suddenly you got it in your head that you want to be an assistant. What’s wrong with being a press operator?”
At first, Jessie said nothing, she just stared at her coffee, but then slowly she started formulating her thoughts. “I’m tired of being in a steamy room all day Betty. I’m tired of being hot and sweaty, doing the same thing day after day after day. I’m tired of watching my life go by. I’ve been here five years and I’m doing today what I was doing on my first day. I’m tired of people looking down on me and they do you know. I met a guy the other day and we hit it off, right up to the point where he asked me what I did for a living, and then suddenly he changed. You know why he changed? I do, I wasn’t good enough for him. And this isn’t the first time it’s happened. There have been others I’ve gone out with, but who dumped me as soon as they found out I work in a laundry.”
“That’s stupid,” Betty spat. “Anyone who rates you by what you do, or how much money you have, isn’t worthy of you.”
“Well, that may be true, but that’s not even why I want to take the course. I want to do it for me because I want something better for myself.”
“And a secretarial course is the answer? You think you can be an assistant?”
Jessie stayed silent for a moment. If Betty didn’t believe in her, what chance did she have with strangers? But she wanted to try. She had to try. If it didn’t work out, it didn’t work out, but she had to try.
“Jessie.”
When Jessie looked up Jenny Sullivan was standing next to her.
“Yes.”
“Mr. Muller would like to see you in his office.”
A sense of panic flooded through Jessie. In all the years she had worked for the laundry service she had never been asked to go to the boss’ office. Whatever Mr. Muller had to say was relayed to the staff through memos Jenny pinned on the notice board in the cafeteria. There was only one occasion when Mr. Muller wanted to see an employee in person … to fire that employee.
But why he would want to fire her? Jessie had no idea. She was never late, she was dependable and she was good at her job.
She cast a worried glance at Betty, who looked just as worried.
Trembling Jessie got off her chair and followed Jenny up the stairs to the first floor where the offices were located.
“Wait here,” Jenny instructed when they arrived at her office. “Have a seat please.”
Jenny stepped into the adjoining office and closed the door. Jessie sat down and looked around her. So this was Jenny’s office. Somehow she had pictured it a little bit more glamorous. It had cream-colored walls, dark brown furniture, and a threadbare brown carpet. The only things that livened up the place a bit were two green potted plants on the windowsill, a pink teddy bear next to Jenny’s computer, and a red picture frame on the desk. But the office was bright with sunshine and Jessie thought how wonderful it must be to have natural light all day; to see the sun and the sky, the rain and the snow. In the laundry in the basement, they worked with harsh white tube lights and had no idea what the weather was like.
“Jessie, Mr. Muller will see you now.”
The door of the adjoining office had opened and Jenny motioned Jessie to step inside.
Jessie didn’t want to go in. She had the feeling that no good would come of this meeting.
Keeping her eyes downcast, Jessie couldn’t help but notice the changes as she entered Mr. Muller’s office. The dull brown carpet changed to a plush cream one, and when she looked up she found herself surrounded by luxury. She knew enough about wood to recognize that the numerous bookcases, credenza, and huge desk were oak. She didn’t have to touch the three-piece lounge suite to know that it was made from the softest leather, and she didn’t need to examine the decanter and glasses on the credenza to know they were crystal. There was a big difference between this office and Jenny’s but in comparison to the laundry area downstairs, this place was a palace.
“Jessie,” Harry Muller said rising from the high backed chair behind his desk, “please come in and have a seat.”
Wringing her hands Jessie perched on the indicated chair and waited for what was coming. She didn’t have to wait long.
“I’m afraid I have some bad news for you Jessie,” Harry Muller came straight to the point.
Yep, I’m fired, Jessie thought. She only half heard how her boss praised her work, thanked her for five years of loyal service, but explained that machines were taking over manual labor. Her mind was in such turmoil she only heard the end of his speech, “So I’m afraid I’m gonna have to let you go. I’m really sorry Jessie. It speaks for itself that I will give you an excellent reference and a month’s salary in advance.”
Jessie nodded, thanked her boss, and left the office. As she descended the stairs reality slowly settled in. She was unemployed. She didn’t have a job anymore. She wouldn’t be coming back here on Monday. What was she going to do? What was going to happen to her? She wouldn’t have an income anymore. How was she going to pay the rent? How was she going to pay for groceries? Hang on, don’t panic, she told herself, Mr. Muller had stated that she would get a month’s wages in advance. Surely she could find another job within a month. Yes, she could do that. Things would be all right. She might even find a better job. Who wanted to work in a steamy laundry anyway?
Chapter 3
At the bottom of the stairs, Betty anxiously awaited Jessie.
“And?” she said, inclining her head a little. “What did he want to see you for?”
“I just got fired,” Jessie said flatly.
“Fired!” Betty cried, not able to hide the outrage in her voice. “Why? What did you do? What did he fire you for?”
“Apparently a machine is going to do my job,” Jessie shrugged.
Betty was momentarily speechless. “I … I can’t believe it,” she eventually stammered. “How could he? And what do you mean a machine is going to do your job? How can a machine press shirts and blouses? It probably can do sheets and tablecloths and other flat things, but how can it do delicate things?”
Jessie merely shrugged.
“So where does that leave me?” Betty added as an afterthought. “Am I gonna be fired too?”
Jessie took a deep breath, shrugged again, and shook her head. She had no idea. She also had no idea as to what she was supposed to do now. Was she supposed to finish her day, or should she say goodbye to everyone and just leave?
“Jessie,” both Jessie and Betty looked up at the sound of Jenny Sullivan’s voice as she came hurrying down the stairs. “Can I talk to you for a moment?”
“I’ll talk to you later,” Betty said, sensing the two women needed some privacy.
“Wanna grab a cup of coffee?”
Jenny led the way to the cafeteria, poured two cups of coffee and took them over to a table by the window.
“What will you do now?”
“I don’t know,” Jessie said, cupping the coffee between her hands. “I was actually just thinking about that. Do I leave now, or do I finish the day?”
“You don’t have to finish the day,” Jenny shook her head. “You may leave right away if you like. But before you go I wanted to have a bit of a chat with you. What will you do now? What are your plans? I realize you haven’t had much time to consider your future and you’re probably still in shock, but…”
When Jenny stopped speaking, Jessie looked up. “But what?”
“Well, I wanted to make a suggestion.”
Jessie waited for what was to come.
“I’ve been watching you and listening to you for some time now,” Jenny started tentatively, “and you seem like a very intelligent person. Every morning I see you come in with The New York Times and you don’t just skim the pages, you read the articles. And you talk differently than the other workers around here. You seem to know a lot about politics and the economy in general, and you use words like exemplify, governance, and misconstrue. One would expect such language from a college graduate, not from a … laborer. Now, don’t get me wrong, I think it’s great that you’re so well-spoken, but you do seem a little out of place here. Behind a hot press, I mean.”
Jessie was temporarily at a loss for words. On the one hand, she felt slightly put off that Jenny was surprised she read the newspaper, took an interest in politics and the economy, and knew a few intellectual words. Just because she worked with her hands didn’t mean she didn’t have a mind. But on the other hand, she was flattered that Jenny was taking an interest in her, and she couldn’t wait to hear what she had to suggest.
“I think you can do better than working in a laundry,” Jenny went on. “I think by terminating your employment here, Mr. Muller might have done you the biggest favor.”
“So what do you suggest?” Jessie said, pinching her eyebrows together. “Are you saying that I should apply to work in a store?”
Jenny inclined her head. “Set your sights a little higher Jessie. Have you thought about going back to school? Perhaps take a course of some sort?”
“As a matter of fact, I have,” Jessie admitted hesitantly. “But…”
“But what?”
“Courses are expensive. It would have been difficult enough to pay for a course while I was earning a monthly paycheque, but now, now that I’ve lost my job…”
“On the contrary,” Jenny interrupted. “Now is the perfect time. While you were working it would have been hard to go to night school, but now that you’re not working you have the time to pursue a new career.”
“And what do you suggest I do for money?”
Jenny waved a dismissive hand. “Since it’s only a matter of money, take any job, any job at all. Be a waitress in a bar or a restaurant. It doesn’t pay much, but the tips can add up. Then once you’re finished with your course you can just walk out. Do something with your life, Jessie.”
Jessie was about to mention that she didn’t know anything about waitressing when Jenny handed her two envelopes.
Jessie recognized her pay packet, but she wondered about the second envelope. “What is this?”
“This one is your paycheque,” Jenny explained. “This week’s pay plus another four weeks as Mr. Muller promised. And this,” she tapped the second envelope, “is a gift from Mr. Muller himself. Invest it wisely.”
After Jenny had left her, Jessie reflected on the five years she had worked for Muller Laundry & Dry Cleaning Services. At age seventeen she had arrived at this building full of enthusiasm. She was going to be a working girl. No more classrooms and homework for her, she was a grown-up and she was joining the working force. She had quickly become friends with all the other workers, especially Betty, who had started working for the laundry a little over a year ago and had shown her the ropes. They had sought out each other’s company outside work too. They often went shopping together, went for walks in the park, or just visited each other at home. The years passed and when Jessie lost her parents in a car accident she suggested to Betty they become roommates, but as an only child, Betty wouldn’t leave her widowed mother. In time Jessie considered herself happy. She had her own apartment, the furnishings — although mainly second-hand stuff — were tasteful, and she loved her job. It wasn’t until she started dating and was repeatedly dumped after mentioning she was a press operator in a laundry service that she became unhappy with her job. Now her job had come to an unexpected end. According to Jenny, that was a blessing.
Jessie finished her coffee, went to the locker room to collect her handbag before heading for the exit. She knew she should say goodbye to everyone, but she couldn’t face them. She hated good-byes. She would see Betty tomorrow, and the others — when they heard the news — well, they would understand. Outside the gates, she turned around for one last look. For everyone else the weekend was about to begin, followed by another work week. She had no idea what she would be doing next week.
That night in her apartment Jessie opened the gift envelope. To her utter amazement inside was a cheque in the amount of three thousand dollars and a note that read:
“Please accept this as a token of my appreciation for the last five years of excellent service.
Have fun with it.
Harry Muller”.
Jessie knew right away what she would do with the windfall. Jenny had advised her to invest it wisely, Mr. Muller wrote to have fun with it. Well, she was going to do both. She was going to invest part of the money in herself and enroll in a secretarial course, and with the rest, she was going to go shopping, invest in a whole new wardrobe. Smiling she reached for the phone.
“Betty,” she said when the call was answered, “want to go to the mall with me tomorrow?” | https://medium.com/illumination/voice-of-an-angel-2231073d84d3 | ['Conny Manero'] | 2020-06-15 01:10:36.964000+00:00 | ['Books', 'Reading', 'Novel', 'Book Recommendations', 'Novel Excerpt'] | Title Voice AngelContent Voice Angel novel Image courtesy Conny Manero Synopsis Talent agent Jack Garrett hears voice angel drifting balcony Greenwich Village Frustrated spends night walking street trying find angel Jessie Green deadend job loses quickly grab opportunity better life best friend Betty McGill stumble new different career help serendipitous good luck web unexpected circumstance Jack Jessie’s life collide surprise love get way making dream come true Jessie Jack lot learn really trust Voice Angel…where one dream come true Chapter 1 June 1998 Jessie Green glanced red digital clock wall … 430 pm Another half hour could go home sigh reached another shirt pile freshly laundered linen placed press five year worked Muller’s Laundry Dry Cleaning pressed hundred maybe even thousand garment shirt blouse slack table cloth bedsheets bad job knew better job qualification working laundry could Jenny Sullivan came collect work order day invoicing tomorrow Jessie watched girl mixture admiration envy Jenny Harry Muller’s assistant always looked picture perfect never hair place smudge makeup wrinkle stain clothes ladder stocking dirt shoe Jessie wondered managed always look cucumber fresh Looking Jenny made Jessie wish finished high school could gone secretarial school looked smart cute little outfit cute little shoe Instead wore jean Tshirts sneaker work comfortable important foot eight hour day often regretted dropping school stuck last three month back far anxious make debut working world felt wasting time classroom could wait get real world start earning money Jessie heard Muller’s Laundry Dry Cleaning looking help applied job hired spot following Monday instead going school proudly went work time certain making right decision sure graduated could choice career Instead worked laundry hot steamy laundry probably stuck forever Sure earning money Jenny probably made double triple making sound name Jessie looked work saw Betty McGill frantically tapping wristwatch cast another glance wall clock nodded friend 500 pm “Are okay” Betty asked walked home noticing friend usual talkative self Jessie gave listless shrug “Just thinking know” “About what” “The past future” Betty frowned “That’s heavy thinking friend” “Don’t ever think things” “Like what” “Like future hold you” Betty shrugged shoulder “I suppose I’ll meet nice guy get married kid someday else there” “A career” “A career” Betty burst laughing “Jessie work laundry would hardly call career” “Don’t ever wish could something else Something little challenging little sophisticated” Betty looked friend smiled “Sure would like doctor lawyer something else earns ton money I’m exactly qualified” Jessie hesitated making suggestion “We could go back school” Betty laughed “Jess take year qualify doctor lawyer didn’t even finish high school” Jessie waved impatient hand “I don’t mean mean could take course secretarial course” “You mean learn type stuff” “That exactly mean” Betty looked doubtful “I don’t know Jess I’m sure there’s assistant typing think smart sort work” “We’ smart Betty” Jessie retorted small edge voice Betty continued “And small problem decent wardrobe You’ve seen kind outfit Jenny wear work don’t know don’t kind clothes” Jessie admit Betty point wardrobe 
potential problem wore mainly jean Tshirts Hardly appropriate office wear “Any plan tonight” Betty asked attempt change subject “Nothing special” Jessie answered hint boredom voice thing every Monday Tuesday Thursday night … ironing” “You still iron neighbors” Jessie nodded “Elizabeth Clara old can’t ironing anymore grateful help Elizabeth’s laundry Mondays Clara’s Tuesdays Thursdays” Betty shook head wonder “I don’t know girl iron day long go home ironing Haven’t ever suggested could send stuff laundry” “No” Jessie said vehemently “and I’m it’s extra money me” night finished dinner washed dish Jessie set ironing board iron collected ironing storage room switched stereo selected CD plugged headphone turned volume liked nothing better sing along CD Singing along CD something Jessie loved ironing sometimes worried neighbor might hear thought unlikely never heard sound figured couldn’t hear either voice drifted street wideopen balcony door different People street couldn’t see didn’t know didn’t know last piece clothing ironed folded Jessie packed away iron board put kettle cup coffee decided would curl book couch would slip page let transported sleepy Irish village wideawake citizen loved little village story set loved people seemed real pretentious high society type ton money professional glamorous career ordinary people ordinary life loved cried worked struggled somehow made success Considering ordinary Jessie liked reading success story gave hope courage future clock struck eleven reluctantly closed book carried bed stopped close balcony door switch light bed would read another couple page falling asleep dreaming wonderful future Chapter 2 Jessie couldn’t sleep tossed turned imagined sitting behind desk would dressed stunning outfit answering ringing phone smile face million thought scurried mind knew completing course would present many obstacle worried might qualify applicant due lack education accepted would evening class Would able manage work day attending school night wanted badly would also wondered class held long class long course much would cost nearby church bell struck two o’clock Jessie sat slipped bed would hot chocolate Maybe would help sleep Sipping hot drink kitchen table reached yesterday’s newspaper turned classified surprised number ad secretary administrative assistant executive assistant wondered difference executive assistant administrative assistant studied requirement job listed tying correspondence typing financial statement organizing meeting scheduling appointment booking flight hotel accommodation filing answering call turned page saw number ad private college offered course drawing painting car mechanic hairdressing foot care massage also offered secretarial course Jessie’s eye widened saw price … thousand dollar threemonth course exactly cheap Somewhat disheartened closed paper finished hot chocolate went back bed next day work made mental calculation Half wage went rent portion went bill another portion grocery toiletry left precious little spend personal item necessity apartment could possibly save thousand dollar course three o’clock Betty indicated drinking gesture time break “You look tired” commented soon Jessie sat one cafeteria table “Are feeling okay” “Fine” Jessie shrugged “Just little tired didn’t get much sleep last night” “Oh” “I kept thinking taking course secretarial course and…” “What suddenly wanting assistant” Betty demanded annoyed tone “You’re press operator five year You’ve always happy work least I’ve never heard complain suddenly got 
head want assistant What’s wrong press operator” first Jessie said nothing stared coffee slowly started formulating thought “I’m tired steamy room day Betty I’m tired hot sweaty thing day day day I’m tired watching life go I’ve five year I’m today first day I’m tired people looking know met guy day hit right point asked living suddenly changed know changed wasn’t good enough isn’t first time it’s happened others I’ve gone dumped soon found work laundry” “That’s stupid” Betty spat “Anyone rate much money isn’t worthy you” “Well may true that’s even want take course want want something better myself” “And secretarial course answer think assistant” Jessie stayed silent moment Betty didn’t believe chance stranger wanted try try didn’t work didn’t work try “Jessie” Jessie looked Jenny Sullivan standing next “Yes” “Mr Muller would like see office” sense panic flooded Jessie year worked laundry service never asked go boss’ office Whatever Mr Muller say relayed staff memo Jenny pinned notice board cafeteria one occasion Mr Muller wanted see employee person … fire employee would want fire Jessie idea never late dependable good job cast worried glance Betty looked worried Trembling Jessie got chair followed Jenny stair first floor office located “Wait here” Jenny instructed arrived office “Have seat please” Jenny stepped adjoining office closed door Jessie sat looked around Jenny’s office Somehow pictured little bit glamorous creamcolored wall dark brown furniture threadbare brown carpet thing livened place bit two green potted plant windowsill pink teddy bear next Jenny’s computer red picture frame desk office bright sunshine Jessie thought wonderful must natural light day see sun sky rain snow laundry basement worked harsh white tube light idea weather like “Jessie Mr Muller see now” door adjoining office opened Jenny motioned Jessie step inside Jessie didn’t want go feeling good would come meeting Keeping eye downcast Jessie couldn’t help notice change entered Mr Muller’s office dull brown carpet changed plush cream one looked found surrounded luxury knew enough wood recognize numerous bookcase credenza huge desk oak didn’t touch threepiece lounge suite know made softest leather didn’t need examine decanter glass credenza know crystal big difference office Jenny’s comparison laundry area downstairs place palace “Jessie” Harry Muller said rising high backed chair behind desk “please come seat” Wringing hand Jessie perched indicated chair waited coming didn’t wait long “I’m afraid bad news Jessie” Harry Muller came straight point Yep I’m fired Jessie thought half heard bos praised work thanked five year loyal service explained machine taking manual labor mind turmoil heard end speech “So I’m afraid I’m gonna let go I’m really sorry Jessie speaks give excellent reference month’s salary advance” Jessie nodded thanked bos left office descended stair reality slowly settled unemployed didn’t job anymore wouldn’t coming back Monday going going happen wouldn’t income anymore going pay rent going pay grocery Hang don’t panic told Mr Muller stated would get month’s wage advance Surely could find another job within month Yes could Things would right might even find better job wanted work steamy laundry anyway Chapter 3 bottom stair Betty anxiously awaited Jessie “And” said inclining head little “What want see for” “I got fired” Jessie said flatly “Fired” Betty cried able hide outrage voice “Why fire for” “Apparently machine going job” Jessie shrugged Betty momentarily speechless “I … can’t believe it” 
eventually stammered “How could mean machine going job machine press shirt blouse probably sheet tablecloth flat thing delicate things” Jessie merely shrugged “So leave me” Betty added afterthought “Am gonna fired too” Jessie took deep breath shrugged shook head idea also idea supposed supposed finish day say goodbye everyone leave “Jessie” Jessie Betty looked sound Jenny Sullivan’s voice came hurrying stair “Can talk moment” “I’ll talk later” Betty said sensing two woman needed privacy “Wanna grab cup coffee” Jenny led way cafeteria poured two cup coffee took table window “What now” “I don’t know” Jessie said cupping coffee hand “I actually thinking leave finish day” “You don’t finish day” Jenny shook head “You may leave right away like go wanted bit chat plan realize haven’t much time consider future you’re probably still shock but…” Jenny stopped speaking Jessie looked “But what” “Well wanted make suggestion” Jessie waited come “I’ve watching listening time now” Jenny started tentatively “and seem like intelligent person Every morning see come New York Times don’t skim page read article talk differently worker around seem know lot politics economy general use word like exemplify governance misconstrue One would expect language college graduate … laborer don’t get wrong think it’s great you’re wellspoken seem little place Behind hot press mean” Jessie temporarily loss word one hand felt slightly put Jenny surprised read newspaper took interest politics economy knew intellectual word worked hand didn’t mean didn’t mind hand flattered Jenny taking interest couldn’t wait hear suggest “I think better working laundry” Jenny went “I think terminating employment Mr Muller might done biggest favor” “So suggest” Jessie said pinching eyebrow together “Are saying apply work store” Jenny inclined head “Set sight little higher Jessie thought going back school Perhaps take course sort” “As matter fact have” Jessie admitted hesitantly “But…” “But what” “Courses expensive would difficult enough pay course earning monthly paycheque I’ve lost job…” “On contrary” Jenny interrupted “Now perfect time working would hard go night school you’re working time pursue new career” “And suggest money” Jenny waved dismissive hand “Since it’s matter money take job job waitress bar restaurant doesn’t pay much tip add you’re finished course walk something life Jessie” Jessie mention didn’t know anything waitressing Jenny handed two envelope Jessie recognized pay packet wondered second envelope “What this” “This one paycheque” Jenny explained “This week’s pay plus another four week Mr Muller promised this” tapped second envelope “is gift Mr Muller Invest wisely” Jenny left Jessie reflected five year worked Muller Laundry Dry Cleaning Services age seventeen arrived building full enthusiasm going working girl classroom homework grownup joining working force quickly become friend worker especially Betty started working laundry little year ago shown rope sought other’s company outside work often went shopping together went walk park visited home year passed Jessie lost parent car accident suggested Betty become roommate child Betty wouldn’t leave widowed mother time Jessie considered happy apartment furnishing — although mainly secondhand stuff — tasteful loved job wasn’t started dating repeatedly dumped mentioning press operator laundry service became unhappy job job come unexpected end According Jenny blessing Jessie finished coffee went locker room collect handbag heading exit knew say goodbye everyone couldn’t face hated 
goodbye would see Betty tomorrow others — heard news — well would understand Outside gate turned around one last look everyone else weekend begin followed another work week idea would next week night apartment Jessie opened gift envelope utter amazement inside cheque amount three thousand dollar note read “Please accept token appreciation last five year excellent service fun Harry Muller” Jessie knew right away would windfall Jenny advised invest wisely Mr Muller wrote fun Well going going invest part money enroll secretarial course rest going go shopping invest whole new wardrobe Smiling reached phone “Betty” said call answered “want go mall tomorrow”Tags Books Reading Novel Book Recommendations Novel Excerpt |
4,143 | ‘The Unexamined Life Is Not Worth Living’ | “The unexamined life is not worth living” Socrates said as he declared the essence of a good life. “The only good is knowledge”. With knowledge, a person could shape their own destiny and find true happiness.
During a time when the world was looking to the cosmos for understanding, Socrates looked inward to the human mind for the questions and answers to life. The way to wisdom was to be found through human dialogue.
Something in Socrates’s words and search for knowledge resonated with me as I thought about writing and the struggles we all face as writers.
I realized that the process itself is the true gift. Writing is the process of examining life.
Some days our words come out beautifully, other days not so much. But one thing remains no matter what happens — we are personally changed because we are actively learning, actively striving, and our minds are engaged and lit up as we struggle with our ideas and words through the writing process.
As writers, we’re exploring ideas that we’re passionate about, and experiences that will make an impact on other people and bring some sort of value into the world.
Sometimes the writing process seems overwhelming and frustrating. Self-motivation and confidence, even sense of purpose, go up and down in alternating waves of frustration and euphoria. Sometimes it’s just so difficult to get started. But when this happens, we should remind ourselves of the gift that writing gives us. | https://medium.com/the-brave-writer/the-unexamined-life-is-not-worth-living-e90573573e8f | ['Tania Miller'] | 2020-12-17 13:47:51.405000+00:00 | ['Writing', 'Self Improvement', 'Writing Tips', 'Philosophy', 'Personal Development'] | Title ‘The Unexamined Life Worth Living’Content “The unexamined life worth living” Socrates said declared essence good life “The good knowledge” knowledge person could shape destiny find true happiness time world looking cosmos understanding Socrates looked inward human mind question answer life way wisdom found human dialogue Something Socrates’s word search knowledge resonated thought writing struggle face writer realized process true gift Writing process examining life day word come beautifully day much one thing remains matter happens — personally changed actively learning actively striving mind engaged lit struggle idea word writing process writer we’re exploring idea we’re passionate experience make impact people bring sort value world Sometimes writing process seems overwhelming frustrating Selfmotivation confidence even sense purpose go alternating wave frustration euphoria Sometimes it’s difficult get started happens remind gift writing give usTags Writing Self Improvement Writing Tips Philosophy Personal Development |
4,144 | Building a scalable and available home feed | Dan Feng | Pinterest engineer, Discovery
We pride ourselves on being a company focused first and foremost on the user experience. In order to deliver a great experience, including showing related content in the home feed, we’re building a service that’s fast and highly available. From a Pinner’s point of view, availability means how often they’ll get errors. For service owners, availability means how many minutes the service can be down without violating SLA (service level agreement). We use number of nines to measure the availability of our site and each service.
The Pinterest home feed is a personalized collection of Pins for each person. One third of Pinterest’s traffic lands on home feed, which makes it one of our most critical pages. When building our new home feed, achieving four nines or higher was one of the metrics used for measuring the success of the project. The full discussion for the new home feed architecture can be found at ‘Building a smarter home feed’. Here, I’ll focus on the design decisions from behind the scenes.
Isolating challenges
The home feed system can be simplified to support three use cases: writing Pinners’ feed to a storage, serving feed from the storage and removing feed when it’s required. Writing feed can have a huge QPS (query per second). Fortunately it’s not user-facing and certain delay (e.g. seconds or even minutes) is tolerable. Serving has relatively small QPS when comparing the writing operation, but it’s user-facing and has a tight performance requirement.
A simple design can include writing all feed to a storage and serving and deleting from it. At our current scale, we keep hundreds of terabyte data and support millions of operations per second. We’ve had success with HBase in our past iterations of the home feed system. After evaluating all the options, we chose HBase as our backend storage. The problem with the design is it’s very challenging to tune the same storage to meet the requirements for both a high volume of writing and a high performance of reading and updating. For example, when a person creates a new Pin, we’ll fan out the Pin to all his or her followers. Followers are sharded across all HBase regions. When we fan out the same Pin to hundreds of Pinners, the write operation will hit multiple regions, lock the WAL (write ahead log) on each region server, update it and unlock it after use. Locking the WAL for each write/update/delete operation isn’t efficient and quickly becomes a bottleneck. A better approach is to batch operations and push the changes to HBase once in a while, which increases the throughput of the HBase cluster dramatically. But the latency of each operation can be as high as the flush interval. For user-facing operations, our latency requirement is at millisecond level and the approach will fail us miserably.
To satisfy the different requirements, we designed a system with two HBase clusters and save data to different HBase clusters at different stages (see the component diagram below).
Zen is a service that provides a graph data model on top of HBase and abstracts the details of HBase operations from data producer and consumer.
SmartFeed worker is pushing feed from all sources (we also reference sources as pools) to HBase through Zen, and called by PinLater, an asynchronous job execution system that can tolerate certain delays and failures.
HBase for materialized content saves the Pins that have potentially been shown in the home feed before, and its content is accessed through Zen.
SmartFeed content generator is in charge of selecting new Pins from the pools, scoring and ordering them.
SmartFeed service is indirectly retrieving feed (content) from both of the HBase clusters, and only talks to the pools cluster through SmartFeed content generator.
When a Pinner hits their home feed:
SmartFeed service calls content generator to get new Pins
Content generator decides how many Pins should be returned and how they should be ordered in the returned result
Simultaneously SmartFeed service retrieves saved Pins from HBase for materialized content
SmartFeed service will wait for the results from the above two steps, mix and return them. (If the calls to content generator fails or timeouts, the result from step 2 will still be returned.)
Offline, SmartFeed service will save the new result to HBase for materialized content and delete them from HBase for pools
With this design, we separate user-facing components from non user-facing components. Since different HBase clusters have different volumes of data and usage patterns, we can scale and configure them individually to meet their needs. In reality, we have far less Pins in materialized content cluster than the cluster for pools. We can make it more reliable and faster without too much cost.
Speculative execution
With the design above, the availability is as good as the HBase for materialized content since we’re serving content only when it’s available. From time to time, HBase cluster can have JVM (Java virtual machine) garbage collection, node failure, region movement, etc. With a single HBase cluster, the availability can occasionally drop below four nines.
To improve the availability over four nines, we implement something called speculative execution. We always keep a hot standby HBase cluster in a different EC2 availability zone to avoid losing Pinners’ data. Any changes made to the primary HBase cluster will be synced to the standby cluster within a few hundred milliseconds. In the event of a partial failure of the primary cluster we’ll serve the data from a standby cluster. This technique helps make the whole system four nines of read availability (not write) and provides a much better Pinner experience than failing the requests. The way that the speculative execution works is:
Make a call to the primary cluster to retrieve data
If the call fails or doesn’t return within certain time, make another call to the standby cluster
Return the data from the cluster which returns first
With this approach, SmartFeed service will be able to return data if either of the clusters is available and the overall availability is close to the combined availability of the two clusters. The tricky part is to pick a proper waiting time. Since syncing data from the primary cluster to the standby cluster has some delay, the data returned from the standby cluster can be stale. If the waiting time is too small, Pinners will have a higher chance of getting stale data. If the waiting time is too long, Pinners have to wait unnecessarily long even if we can return results from the standby cluster much earlier. For us, we find if a call doesn’t return within time x, it’ll eventually time out in most cases. The time x is also larger than the 99.9 percentile of the call’s latency. We decided to use this as the cutoff time, which means results may be returned from the standby cluster for one out of 1,000 calls.
Another interesting finding is that the latency of the standby cluster is higher than the primary cluster because so few calls fall back to the standby cluster, and it’s in a ‘cold’ state for most of the time. To warm up the pipeline and get it ready for use, we randomly forward x percent of calls to the standby cluster and drop the result.
One time the primary HBase was down for almost one hour because of some hardware issue. Thanks to speculative execution, all home feed requests automatically failover to the standby cluster. The performance and success rate of home feed was not impacted at all during the whole HBase incident.
Outcomes
Since the launch of SmartFeed project, we’ve been handling hundreds of millions of calls per day and haven’t had a major incident with the availability dropping below 95 percent for more than five minutes. Overall, our availability is better than four nines.
If you’re interested in tackling challenges and making improvements like this, join our team!
Dan Feng is a software engineer at Pinterest.
Acknowledgements: This technology was built in collaboration with Chris Pinchak, Xun Liu, Raghavendra Prabhu, Jeremy Carroll, Dmitry Chechik, Varun Sharma and Tian-Ying Chang. This team, as well as people from across the company, helped make this project a reality with their technical insights and invaluable feedback.
For Pinterest engineering news and updates, follow our engineering Pinterest, Facebook and Twitter. Interested in joining the team? Check out our Careers site. | https://medium.com/pinterest-engineering/building-a-scalable-and-available-home-feed-6a343766bb6 | ['Pinterest Engineering'] | 2017-02-17 21:59:32.137000+00:00 | ['Engineering', 'Pinterest', 'Hbase', 'Data', 'Qps'] | Title Building scalable available home feedContent Dan Feng Pinterest engineer Discovery pride company focused first foremost user experience order deliver great experience including showing related content home feed we’re building service that’s fast highly available Pinner’s point view availability mean often they’ll get error service owner availability mean many minute service without violating SLA service level agreement use number nine measure availability site service Pinterest home feed personalized collection Pins person One third Pinterest’s traffic land home feed make one critical page building new home feed achieving four nine higher one metric used measuring success project full discussion new home feed architecture found ‘Building smarter home feed’ I’ll focus design decision behind scene Isolating challenge home feed system simplified support three use case writing Pinners’ feed storage serving feed storage removing feed it’s required Writing feed huge QPS query per second Fortunately it’s userfacing certain delay eg second even minute tolerable Serving relatively small QPS comparing writing operation it’s userfacing tight performance requirement simple design include writing feed storage serving deleting current scale keep hundred terabyte data support million operation per second We’ve success HBase past iteration home feed system evaluating option chose HBase backend storage problem design it’s challenging tune storage meet requirement high volume writing high performance reading updating example person creates new Pin we’ll fan Pin follower Followers sharded across HBase region fan Pin hundred Pinners write operation hit multiple region lock WAL write ahead log region server update unlock use Locking WAL writeupdatedelete operation isn’t efficient quickly becomes bottleneck better approach batch operation push change HBase increase throughput HBase cluster dramatically latency operation high flush interval userfacing operation latency requirement millisecond level approach fail u miserably satisfy different requirement designed system two HBase cluster save data different HBase cluster different stage see component diagram Zen service provides graph data model top HBase abstract detail HBase operation data producer consumer SmartFeed worker pushing feed source also reference source pool HBase Zen called PinLater asynchronous job execution system tolerate certain delay failure HBase materialized content save Pins potentially shown home feed content accessed Zen SmartFeed content generator charge selecting new Pins pool scoring ordering SmartFeed service indirectly retrieving feed content HBase cluster talk pool cluster SmartFeed content generator Pinner hit home feed SmartFeed service call content generator get new Pins Content generator decides many Pins returned ordered returned result Simultaneously SmartFeed service retrieves saved Pins HBase materialized content SmartFeed service wait result two step mix return call content generator fails timeouts result step 2 still returned Offline SmartFeed service save new result HBase materialized content delete HBase pool design separate userfacing 
component non userfacing component Since different HBase cluster different volume data usage pattern scale configure individually meet need reality far le Pins materialized content cluster cluster pool make reliable faster without much cost Speculative execution design availability good HBase materialized content since we’re serving content it’s available time time HBase cluster JVM Java virtual machine garbage collection node failure region movement etc single HBase cluster availability occasionally drop four nine improve availability four nine implement something called speculative execution always keep hot standby HBase cluster different EC2 availability zone avoid losing Pinners’ data change made primary HBase cluster synced standby cluster within hundred millisecond event partial failure primary cluster we’ll serve data standby cluster technique help make whole system four nine read availability write provides much better Pinner experience failing request way speculative execution work Make call primary cluster retrieve data call fails doesn’t return within certain time make another call standby cluster Return data cluster return first approach SmartFeed service able return data either cluster available overall availability close combined availability two cluster tricky part pick proper waiting time Since syncing data primary cluster standby cluster delay data returned standby cluster stale waiting time small Pinners higher chance getting stale data waiting time long Pinners wait unnecessarily long even return result standby cluster much earlier u find call doesn’t return within time x it’ll eventually time case time x also larger 999 percentile call’s latency decided use cutoff time mean result may returned standby cluster one 1000 call Another interesting finding latency standby cluster higher primary cluster call fall back standby cluster it’s ‘cold’ state time warm pipeline get ready use randomly forward x percent call standby cluster drop result One time primary HBase almost one hour hardware issue Thanks speculative execution home feed request automatically failover standby cluster performance success rate home feed impacted whole HBase incident Outcomes Since launch SmartFeed project we’ve handling hundred million call per day haven’t major incident availability dropping 95 percent five minute Overall availability better four nine you’re interested tackling challenge making improvement like join team Dan Feng software engineer Pinterest Acknowledgements technology built collaboration Chris Pinchak Xun Liu Raghavendra Prabhu Jeremy Carroll Dmitry Chechik Varun Sharma TianYing Chang team well people across company helped make project reality technical insight invaluable feedback Pinterest engineering news update follow engineering Pinterest Facebook Twitter Interested joining team Check Careers siteTags Engineering Pinterest Hbase Data Qps |
4,145 | How to Choose the RIGHT Influencers for Your Brand! | One of the most common questions I get asked about and one of my strengths when it comes to creating an Influencer Marketing Strategy is choosing the right influencer. Here’s how to choose the right influencers for your brand in 6 simple steps!
Set an objective
One of the most common problems I see when executing campaigns for brands is that they want to achieve everything at once but that doesn’t work. You need to set a clear objective as it then defines what approach you’ll take to choosing influencers so ask yourself what am I trying to achieve from collaborating with influencers? Some examples could be product awareness, brand awareness, traffic to website or conversion/sales. Each tier of influencers serves a different purpose — in example, it’s recommended to use tier 1 influencers (influencers with a very high follower base) for a new product launch during the first phase of the launch and then move on to using tier 2 influencers afterwards.
2. Keep your opinion out of it as long as they fit into your brand
Ready for the ugly truth? Your opinion doesn’t matter, your customers won’t care about what you think of a certain influencer. Look at the end results — just because you don’t like the influencer personally doesn’t mean that they aren’t worthwhile and it certainly doesn’t mean that they won’t achieve results. As long as they fit into your brand and they have a similar target audience, work with them.
3. Analyize their follower base
Look at their followers. The best way to know if an influencer is going to get you the results you want is by ensuring that their followers are your target audience. If they aren’t then you should be working for them just for the heck of it. Make sure you have a certain TA set prior to even talking or engaging with any influencer.
4. Analyize their numbers
Take a look at their engagement rates. If an influencer has 100,000 followers with only a few hundred likes — something is fishy. At the same time keep in mind the nature of the platform for example, Instagram rolled out an algorithm a while ago and that has affected engagement rates. Take a look at the amount of likes vs amount of comments, etc. The best thing to do is to look at historical data and see the month on month follower growth or engagements to ensure an influencer has real followers and isn’t buying ‘likes or comments’. iconosquare.com is a good tool to use for such data.
5. Previous collaborations with other brands
What brands did they work with? How did that go? What results did they achieve from working with those brands? I know you’re not able to see all the results but you can get a sense of how their followers reacted to those collaborations. They’ll give you key insights especially if those brands are similar to yours. Look at the amount of engagement, quality of comments, interactions, etc.
6. What Do Others Say About Them?
This is the most underrated point in our industry. Brand Managers & Agency People, I BEG YOU — ask around before working with an influencer to avoid disappointments and nightmares. I’ve created my own database and algorithm on rating influencers based on various criteria such as their follower base, how easy they are to work with, their price vs ROI ratio, etc. Working with influencers who are considered celebrities in our modern day and influencers who are barely known, I’ve learned that some of them have been amazing while others have been a nightmare. They’ve allowed their ego to get to them and become quite a challenge to work with (to say the least!). Reality check — there’s so many influencers out there that I as a marketeer have constantly told influencers that this industry has become a competitive market to where they need to rely on word-of-mouth marketing themselves as no brand manager or agency guy wants to work with someone that adds stress to their lives.
Do you have any other tips on how to choose influencers or do you need help in running your next influencer marketing campaign? Tweet me @mikealnaji and let’s discuss! | https://medium.com/astrolabs/how-to-choose-the-right-influencers-for-your-brand-f610e9a22bf3 | ['Mike Alnaji'] | 2017-08-27 14:30:33.979000+00:00 | ['Marketing', 'Influencer Marketing', 'Digital', 'Digital Marketing', 'Social Media'] | Title Choose RIGHT Influencers BrandContent One common question get asked one strength come creating Influencer Marketing Strategy choosing right influencer Here’s choose right influencers brand 6 simple step Set objective One common problem see executing campaign brand want achieve everything doesn’t work need set clear objective defines approach you’ll take choosing influencers ask trying achieve collaborating influencers example could product awareness brand awareness traffic website conversionsales tier influencers serf different purpose — example it’s recommended use tier 1 influencers influencers high follower base new product launch first phase launch move using tier 2 influencers afterwards 2 Keep opinion long fit brand Ready ugly truth opinion doesn’t matter customer won’t care think certain influencer Look end result — don’t like influencer personally doesn’t mean aren’t worthwhile certainly doesn’t mean won’t achieve result long fit brand similar target audience work 3 Analyize follower base Look follower best way know influencer going get result want ensuring follower target audience aren’t working heck Make sure certain TA set prior even talking engaging influencer 4 Analyize number Take look engagement rate influencer 100000 follower hundred like — something fishy time keep mind nature platform example Instagram rolled algorithm ago affected engagement rate Take look amount like v amount comment etc best thing look historical data see month month follower growth engagement ensure influencer real follower isn’t buying ‘likes comments’ iconosquarecom good tool use data 5 Previous collaboration brand brand work go result achieve working brand know you’re able see result get sense follower reacted collaboration They’ll give key insight especially brand similar Look amount engagement quality comment interaction etc 6 Others Say underrated point industry Brand Managers Agency People BEG — ask around working influencer avoid disappointment nightmare I’ve created database algorithm rating influencers based various criterion follower base easy work price v ROI ratio etc Working influencers considered celebrity modern day influencers barely known I’ve learned amazing others nightmare They’ve allowed ego get become quite challenge work say least Reality check — there’s many influencers marketeer constantly told influencers industry become competitive market need rely wordofmouth marketing brand manager agency guy want work someone add stress life tip choose influencers need help running next influencer marketing campaign Tweet mikealnaji let’s discussTags Marketing Influencer Marketing Digital Digital Marketing Social Media |
4,146 | Meaning Eventually Finds Its Place In FKA Twigs’ Grandiose Artistic Vision | Meaning Eventually Finds Its Place In FKA Twigs’ Grandiose Artistic Vision
Her latest LP “MAGDALENE” is an interstellar voyage through the dust of the broken heart to the planet of self
During her long-lasting hiatus, the avant-garde pop princess FKA Twigs got crushed by the brutal hands of her troubled relationships — falling in love with the ex-blood sucker Robert Pattinson and splitting with him in a seemingly heart-wrecking manner. Two years following the end of this romance, twigs comes back feeling quite herself and ready to baptise her loyal fanbase in the career-defining shrift.
In the pre-“MAGDALENE” era, British-born singer and songwriter Taliah Barnett took on a cunning challenge to manufacture the new RnB sound — blending her fragile-sounding soprano and the spooky art-house opera inspired imagery with the precision and inanimation of electronic music. Her efforts were met with universal critical acclaim, but looking in retrospect, the alien world twigs built around herself felt overwhelmed by her pompous visionary yet lacking raw human experience.
In this sense, “MAGDALENE” comes through as a true revelation, being an album that runs on the fuel of imaginative lyricism and storytelling. Here, Twigs tries on the image of Mary Magdalene and tells us the love story with her very own Jesus, drawing inspiration from the chants of classical religious music and futuristic electronic sounds.
The album starts with “thousand eyes” and, in an instance, enswathes you with a death-bearing sound of acapella choral signing and thud but heavy background instrumental, serving as a reminder of memorial liturgy music stripped off to its most naked condition. Alongside this soul-shivering melody, Twigs begins her Skaespherean story of spiritual death and thorny resurrection.
If I walk out the door, it starts our last goodbye
If you don’t pull me back, it wakes a thousand eyes
Going forward, the narrative unfurls in a surprisingly cohesive manner, letting Twigs’ vocal arrangements and poetic talent shine the way they’ve never had before. No matter what it is — the antithesis of distorted chest voice and paper-light head voice on “home with you”, or the intimate insight into the secrets of womanhood on “mary magdalene” — Twigs serves us a bowl full of ripe-fruit she religiously gathered at the garden of Eden.
There are many great moments on “MAGDALEN” — either lyrically, sonically or vocally — but the times it ascends to the sky-high levels lie at the intersection of the experimental search Twigs underwent in her previous work and the lucidity of pop-sound she tamed with the help of her fellow co-writers and co-producers. The result of this artistic confluence materialized in “sad day”, which might be remembered as one of the best pop tracks of this decade.
Channelling Kate Bush ethereal timbre, Twigs starts the track by whispering simple yet beautiful lyrics into the isolation of her listener's auricles, and then the melody expands into this epic dance/electronic ballad where Skrillex’s production touch suddenly falls in place and electrifies the record.
Bearing in mind how great the rest of the album is, it’s at best puzzling how “holy terrain” featuring Future made it to the final cut. It’s a commercially-baked, mediocre RnB song made to, perhaps, please the wider audience. A sad but not criminal oversight which gets forgotten as soon as the record ends.
Despite being listed as the sixth track, “fallen alien” comes through as the culminating phase of the narrative. It’s the most experimental track on the album exposing Twigs in her most desperate state. She basically goes on a full-scale jihad against her former lover. Her vocals here are nothing less than transcending: incisive, hysterical, in a good sense of this word, and on the edge of breaking loose into a wild scream.
The last three songs reveal Twigs meditating on the aftermath of relationships and reconciling with the damage left after it. The album ends with the lead single “cellophane”, a beautiful piano ballad which wraps all the memories and feelings into a thin, transparent sheet of regenerated cellulose.
All wrapped in cellophane, the feelings that we had.
It might sound trivial but the beauty of “MAGDALENE” truly lies in the eyes of its beholder. You can try and disentangle it into separate pieces to only be left with the overwhelming layers of artificial noises and digitally-produced sounds. But looking at “MAGDALENE” in its entirety reveals a timeless piece of pop art which was born in a happy marriage of a self-aware artist and her multifaceted talent.
P.S.
You can find me on Twitter and Instagram. | https://tonysolovjov.medium.com/meaning-eventually-finds-its-place-in-fka-twigs-grandiose-artistic-vision-bdfc8627fe53 | ['Tony Solovjov'] | 2020-01-30 09:32:44.502000+00:00 | ['Review', 'Music', 'Art', 'Pop', 'Culture'] | Title Meaning Eventually Finds Place FKA Twigs’ Grandiose Artistic VisionContent Meaning Eventually Finds Place FKA Twigs’ Grandiose Artistic Vision latest LP “MAGDALENE” interstellar voyage dust broken heart planet self longlasting hiatus avantgarde pop princess FKA Twigs got crushed brutal hand troubled relationship — falling love exblood sucker Robert Pattinson splitting seemingly heartwrecking manner Two year following end romance twig come back feeling quite ready baptise loyal fanbase careerdefining shrift pre“MAGDALENE” era Britishborn singer songwriter Taliah Barnett took cunning challenge manufacture new RnB sound — blending fragilesounding soprano spooky arthouse opera inspired imagery precision inanimation electronic music effort met universal critical acclaim looking retrospect alien world twig built around felt overwhelmed pompous visionary yet lacking raw human experience sense “MAGDALENE” come true revelation album run fuel imaginative lyricism storytelling Twigs try image Mary Magdalene tell u love story Jesus drawing inspiration chant classical religious music futuristic electronic sound album start “thousand eyes” instance enswathes deathbearing sound acapella choral signing thud heavy background instrumental serving reminder memorial liturgy music stripped naked condition Alongside soulshivering melody Twigs begin Skaespherean story spiritual death thorny resurrection walk door start last goodbye don’t pull back wake thousand eye Going forward narrative unfurls surprisingly cohesive manner letting Twigs’ vocal arrangement poetic talent shine way they’ve never matter — antithesis distorted chest voice paperlight head voice “home you” intimate insight secret womanhood “mary magdalene” — Twigs serf u bowl full ripefruit religiously gathered garden Eden many great moment “MAGDALEN” — either lyrically sonically vocally — time ascends skyhigh level lie intersection experimental search Twigs underwent previous work lucidity popsound tamed help fellow cowriters coproducers result artistic confluence materialized “sad day” might remembered one best pop track decade Channelling Kate Bush ethereal timbre Twigs start track whispering simple yet beautiful lyric isolation listener auricle melody expands epic danceelectronic ballad Skrillex’s production touch suddenly fall place electrifies record Bearing mind great rest album it’s best puzzling “holy terrain” featuring Future made final cut It’s commerciallybaked mediocre RnB song made perhaps please wider audience sad criminal oversight get forgotten soon record end Despite listed sixth track “fallen alien” come culminating phase narrative It’s experimental track album exposing Twigs desperate state basically go fullscale jihad former lover vocal nothing le transcending incisive hysterical good sense word edge breaking loose wild scream last three song reveal Twigs meditating aftermath relationship reconciling damage left album end lead single “cellophane” beautiful piano ballad wrap memory feeling thin transparent sheet regenerated cellulose wrapped cellophane feeling might sound trivial beauty “MAGDALENE” truly lie eye beholder try disentangle separate piece left overwhelming layer artificial noise digitallyproduced sound looking “MAGDALENE” entirety reveals timeless 
piece pop art born happy marriage selfaware artist multifaceted talent PS find Twitter InstagramTags Review Music Art Pop Culture |
4,147 | How to Set the Mood for Maximum Productivity | How to Set the Mood for Maximum Productivity
Get more done in less time by implementing these little rituals
Many of us struggle with being consistently productive. We plan so much for the day and then we end up procrastinating instead. And often, even when we do start working, it doesn’t go the way we expected. We just can’t seem to get in the zone.
I used to struggle with this a lot. I would sit down, open a document and start working on the task at hand. Yet my mind would wander and wouldn't stay focused on the work I was supposed to be doing. Often I would give up, saying: “I’m just not in the mood. It’s not a productive day and there’s nothing I can do about it.”
But then I learned that this wasn’t true. There is always something you can do. All I needed to regain my focus was a consistent setting that I associated with work. | https://medium.com/live-your-life-on-purpose/how-to-set-the-mood-for-maximum-productivity-57d735fcc787 | ['Veronika Jel'] | 2020-06-16 13:01:01.324000+00:00 | ['Advice', 'Work From Home', 'Life Hacking', 'Self Improvement', 'Productivity'] | Title Set Mood Maximum ProductivityContent Set Mood Maximum Productivity Get done le time implementing little ritual Many u struggle consistently productive plan much day end procrastinating instead often even start working it’s doesn’t go way expected can’t seem get zone used struggle lot would sit open document start working task hand Yet mind would wander wouldnt stay focused work supposed Often would give saying “I’m mood It’s productive day there’s nothing it” learned wasn’t true always something needed regain focus consistent setting associated workTags Advice Work Home Life Hacking Self Improvement Productivity |
4,148 | THE OCEAN | THE OCEAN
Beautiful and unexplored
71% of the earth’s surface is covered by water, and large bodies of water are called oceans. The ocean provides many things for us; an article from ecology.com, “10 Things to Know About the Ocean”, lists benefits ranging from the oxygen we need to breathe to the jobs we need to survive. The majority of people think that rain forests are the main producers of oxygen, but this is a misconception. The Ocean Preneur posted an article that showed the ratio of oxygen production: rain forests produce only 28% of our oxygen, while the ocean produces 70% of it. The ocean also acts as a regulator of the earth’s climate, keeping our planet warm when the temperature sinks.
Image from earth.com
How does the ocean play such an important role in controlling our weather? Ocean Exploration and Research posted an article that shows the ocean’s big role in our climate system: the majority of radiation from the sun is absorbed by the ocean, particularly in tropical waters around the equator, where the ocean acts like a massive, heat-retaining solar panel. The ocean doesn’t just store solar radiation; it also helps to distribute heat around the globe. The earth’s water cycle is also influenced by the ocean. During evaporation, the ocean absorbs the sun’s heat and delivers it to the condensation process (the process that forms clouds). After this long process, rain finally falls from the skies through precipitation. That is how the ocean plays a role in controlling the weather.
The ocean also provides food for humans. More than a billion people depend on the sea as a source of protein, not only from fish and the other animals that live in the ocean but also from plants such as algae and seaweed.
Image from sciencenewsforstudents.org
These days people like to spend their holidays at the ocean, from simply sightseeing the wonderful views of the ocean floor to doing water sports. There are now so many water sports we can do in the ocean, such as scuba diving, snorkeling, surfing, parasailing, wakeboarding, sea kayaking, free diving, sea walking, cage diving, etc.
In 2016, based on FAO (Food and Agriculture Organization) data, 59.6 million people in the world were engaged in fisheries and aquaculture. At the European Union level alone, the blue sector represents 3,362,510 jobs across 9 subsectors. In the United States, almost three million jobs are directly dependent on the resources of the oceans and Great Lakes.
Image from boraborapearlbeachresort.com
There is even an Austronesian tribe that lives above the ocean: the Sama-Bajau Tribe, who lead a nomadic life on the water. Today, it is not only the Sama-Bajau who live above the ocean. The tourism sector has spread its wings to develop a new holiday trend, “living above the ocean”. Destinations for such luxury holidays include the Maldives, Bora-Bora, Derawan Island, and many more.
It is not only humans whose lives depend on the ocean; countless other creatures depend on it too, making the ocean their home and their place to live. Wallace wrote in his book that “being near, in, on, or underwater can make you happier, healthier, more connected, and better at what you do.” The ocean can affect our psychology in good ways.
That is what the ocean can provide for us, but what about us? Have we done something good for the ocean? Today there are many issues regarding pollution and ocean damage. Humans have lived on earth for about 200,000 years, and in that period we have depended on the ocean for our lives. If ocean pollution and damage continue, how long can we stay on earth and enjoy the ocean?
Image from dailymail.co.uk
Those issues include oil spills, seas of plastic garbage, sewage disposal, and toxic chemicals. Seas of plastic garbage have become the hottest issue today. National Geographic posted an article about the many threats the ocean faces today, one of which is plastic: 12.7 million tons of plastic garbage is found in the ocean every year. Seas of plastic garbage impact not only human life but also the ocean ecosystem and all the creatures that live there. If the ocean could talk, it would shout out loud and tell humans to stop destroying everything. The ocean gave us everything that we need, yet we destroy it.
As a human being that lives on the earth, we should realize our mistakes and try to fix them. Start from a small thing to big goals. Let’s start with the simplest thing like reducing the usage of plastic. Let’s start now! Give our contribution to a better life and a better future. | https://medium.com/tfi-student-community/the-ocean-2cefca773ce9 | ['Laurent Angelica Santoso'] | 2019-11-29 08:01:02.085000+00:00 | ['Environment', 'Sea', 'Beautiful', 'Life', 'Oceans'] | Title OCEANContent OCEAN Beautiful unexplored 71 earth’s surface consists water Large body water called ocean Ocean provides many thing u article ecologycom “10 Things Know Ocean” thing start oxygen need breath job need survive world majority people think rain forest main producer oxygen misconception Ocean Preneur posted article showed ratio oxygen production Rain forest produce 28 oxygen ocean produce 70 oxygen u Ocean also became regulator earth climate keeping planet warm temperature sink Image earthcom come ocean play important role control weather Ocean Exploration Research posted article show ocean taking big role climate system majority radiation sun absorbed ocean particularly tropical water around equator ocean act like massive heatretaining solar panel ocean doesn’t store solar radiation also help distribute heat around globe earth Water Cycle also influenced ocean evaporation process ocean take important role take sun heat make deliver condensation process process making cloud form long process finally rainfall sky precipitation process That’s ocean playing role control weather Ocean also provides food human billion people depend sea protein source fish animal creature live ocean also plant live ocean algae seaweed Image sciencenewsforstudentsorg day people like spend holiday ocean sigh seeing wonderful view ocean floor water sport ocean day many water sport ocean scuba diving snorkeling surfing parasailing wakeboarding sea kayaking free diving sea walking cage diving etc 2016 based FAO Food Agriculture Organization data 596 million people world engaged fishery aquaculture European Union level blue sector represents 3362510 job 9 subsectors United States almost three million job directly dependent resource ocean Great Lakes Image boraborapearlbeachresortcom Even tribe Austronesian life ocean called SamaBajau Tribe live nomadic ocean Today SamaBajau Tribe life ocean tourism sector spread wing develop new holiday trend “live ocean” destination place luxury holiday Maldives BoraBora Derawan Island many human depend life ocean many creature also depend life ocean Ocean became home place live Wallace wrote book “being near underwater make happier healthier connected better do” ocean affect psychology side good way That’s ocean provide u u something good ocean Today many issue regarding pollution ocean damage Humans live 200000 year earth period depended life ocean ocean pollution damage continue long stay earth enjoy ocean Image dailymailcouk issue oil spill sea plastic garbage sewage disposal toxic chemical Seas plastic garbage became hottest issue today National Geographic posted article many threat ocean need face today one threat plastic issue 127 million ton plastic garbage found Ocean every year Seas plastic garbage impact human’s life also ocean ecosystem creature live ocean talk shout loud tell human stop destroying everything ocean gave u everything need destroy human life earth realize mistake try fix Start small thing big goal Let’s start simplest thing like reducing usage plastic Let’s start Give 
contribution better life better futureTags Environment Sea Beautiful Life Oceans |
4,149 | Auto-Encoder: What Is It? And What Is It Used For? (Part 1) | Auto-Encoder: What Is It? And What Is It Used For? (Part 1)
A Gentle Introduction to Auto-Encoder and Some Of Its Common Use Cases With Python Code
Background:
Autoencoder is an unsupervised artificial neural network that learns how to efficiently compress and encode data, then learns how to reconstruct the data back from the reduced encoded representation to a representation that is as close to the original input as possible.
Autoencoder, by design, reduces data dimensions by learning how to ignore the noise in the data.
Here is an example of the input/output images from the MNIST dataset passed through an autoencoder.
Autoencoder for MNIST
Autoencoder Components:
Autoencoders consist of 4 main parts:
1- Encoder: In which the model learns how to reduce the input dimensions and compress the input data into an encoded representation.
2- Bottleneck: which is the layer that contains the compressed representation of the input data. This is the lowest possible dimensions of the input data.
3- Decoder: In which the model learns how to reconstruct the data from the encoded representation to be as close to the original input as possible.
4- Reconstruction Loss: This is the method that measures how well the decoder is performing and how close the output is to the original input.
The training then involves using backpropagation in order to minimize the network’s reconstruction loss.
You must be wondering why I would train a neural network just to output an image or data that is exactly the same as the input! This article will cover the most common use cases for the autoencoder. Let’s get started:
Autoencoder Architecture:
The network architecture for autoencoders can vary between a simple FeedForward network, LSTM network, or Convolutional Neural Network depending on the use case. We will explore some of those architectures in the next few lines.
1- Autoencoder for Anomaly Detection:
There are many ways and techniques to detect anomalies and outliers. I have covered this topic in a different post below:
However, if you have correlated input data, the autoencoder method will work very well because the encoding operation relies on the correlated features to compress the data.
Let’s say that we have trained an autoencoder on the MNIST dataset. Using a simple FeedForward neural network, we can achieve this by building a simple six-layer network as below:
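A minimal sketch of such a network (the layer sizes, optimizer, and loss here are assumptions rather than the author’s exact settings) could look like this:

import numpy as np
from keras.datasets import mnist
from keras.models import Sequential
from keras.layers import Dense

(x_train, _), (x_test, _) = mnist.load_data()

# Scale pixels to [0, 1] and flatten each 28x28 image into a 784-dim vector
x_train = x_train.astype('float32').reshape(len(x_train), 784) / 255.0
x_test = x_test.astype('float32').reshape(len(x_test), 784) / 255.0

# Six Dense layers: three for the encoder down to the bottleneck, three for the decoder
autoencoder = Sequential([
    Dense(128, activation='relu', input_shape=(784,)),  # encoder
    Dense(64, activation='relu'),
    Dense(32, activation='relu'),                       # bottleneck
    Dense(64, activation='relu'),                       # decoder
    Dense(128, activation='relu'),
    Dense(784, activation='sigmoid'),                   # reconstructed image
])
autoencoder.compile(optimizer='adam', loss='mean_squared_error')

# Input and target are the same images: the network learns to reconstruct its input
autoencoder.fit(x_train, x_train, epochs=10, batch_size=256,
                validation_data=(x_test, x_test))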
The output of the code above is:
Train on 60000 samples, validate on 10000 samples
Epoch 1/10
60000/60000 [==============================] - 6s 103us/step - loss: 0.0757 - val_loss: 0.0505
Epoch 2/10
60000/60000 [==============================] - 6s 96us/step - loss: 0.0420 - val_loss: 0.0355
Epoch 3/10
60000/60000 [==============================] - 6s 95us/step - loss: 0.0331 - val_loss: 0.0301
Epoch 4/10
60000/60000 [==============================] - 6s 96us/step - loss: 0.0287 - val_loss: 0.0266
Epoch 5/10
60000/60000 [==============================] - 6s 95us/step - loss: 0.0259 - val_loss: 0.0244
Epoch 6/10
60000/60000 [==============================] - 6s 96us/step - loss: 0.0240 - val_loss: 0.0228
Epoch 7/10
60000/60000 [==============================] - 6s 95us/step - loss: 0.0226 - val_loss: 0.0216
Epoch 8/10
60000/60000 [==============================] - 6s 97us/step - loss: 0.0215 - val_loss: 0.0207
Epoch 9/10
60000/60000 [==============================] - 6s 96us/step - loss: 0.0207 - val_loss: 0.0199
Epoch 10/10
60000/60000 [==============================] - 6s 96us/step - loss: 0.0200 - val_loss: 0.0193
As you can see in the output, the last reconstruction loss/error for the validation set is 0.0193, which is great. Now, if I pass any normal image from the MNIST dataset, the reconstruction loss will be very low (< 0.02), BUT if I try to pass any other, different image (an outlier or anomaly), we will get a high reconstruction loss value because the network fails to reconstruct an image/input that it considers an anomaly.
Notice that in the code above, you can use only the encoder part to compress some data or images, and you can also use only the decoder part to decompress the data by loading the decoder layers.
Now, let’s do some anomaly detection. The code below uses two different images to predict the anomaly score (reconstruction error) using the autoencoder network we trained above. The first image is from the MNIST dataset, and the result is 5.43209; this means that the image is not an anomaly. The second image I used is a completely random image that doesn’t belong to the training dataset, and the result was 6789.4907. This high error means that the image is an anomaly. The same concept applies to any type of dataset.
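A sketch of how such a score can be computed with the trained model above (the exact error metric used for the numbers quoted is an assumption; a summed squared error is shown here):

import numpy as np

def anomaly_score(model, image):
    """Reconstruction error for one flattened, scaled image of shape (784,)."""
    image = image.reshape(1, 784)
    reconstruction = model.predict(image)
    # Summed squared difference: small for 'normal' inputs, large for anomalies
    return np.sum((image - reconstruction) ** 2)

print(anomaly_score(autoencoder, x_test[0]))  # low for an MNIST digit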
2- Image Denoising: | https://towardsdatascience.com/auto-encoder-what-is-it-and-what-is-it-used-for-part-1-3e5c6f017726 | ['Will Badr'] | 2019-07-01 07:09:48.367000+00:00 | ['Artificial Intelligence', 'Machine Learning', 'Neural Networks', 'Data Science', 'Deep Learning'] | Title AutoEncoder Used Part 1Content AutoEncoder Used Part 1 Gentle Introduction AutoEncoder Common Use Cases Python Code Background Autoencoder unsupervised artificial neural network learns efficiently compress encode data learns reconstruct data back reduced encoded representation representation close original input possible Autoencoder design reduces data dimension learning ignore noise data example inputoutput image MNIST dataset autoencoder Autoencoder MNIST Autoencoder Components Autoencoders consists 4 main part 1 Encoder model learns reduce input dimension compress input data encoded representation 2 Bottleneck layer contains compressed representation input data lowest possible dimension input data 3 Decoder model learns reconstruct data encoded representation close original input possible 4 Reconstruction Loss method measure measure well decoder performing close output original input training involves using back propagation order minimize network’s reconstruction loss must wondering would train neural network output image data exactly input article cover common use case Autoencoder Let’s get started Autoencoder Architecture network architecture autoencoders vary simple FeedForward network LSTM network Convolutional Neural Network depending use case explore architecture new next line 1 Autoencoder Anomaly Detection many way technique detect anomaly outlier covered topic different post However correlated input data autoencoder method work well encoding operation relies correlated feature compress data Let’s say trained autoencoder MNIST dataset Using simple FeedForward neural network achieve building simple 6 layer network output code Train 60000 sample validate 10000 sample Epoch 110 6000060000 6 103usstep loss 00757 valloss 00505 Epoch 210 6000060000 6 96usstep loss 00420 valloss 00355 Epoch 310 6000060000 6 95usstep loss 00331 valloss 00301 Epoch 410 6000060000 6 96usstep loss 00287 valloss 00266 Epoch 510 6000060000 6 95usstep loss 00259 valloss 00244 Epoch 610 6000060000 6 96usstep loss 00240 valloss 00228 Epoch 710 6000060000 6 95usstep loss 00226 valloss 00216 Epoch 810 6000060000 6 97usstep loss 00215 valloss 00207 Epoch 910 6000060000 6 96usstep loss 00207 valloss 00199 Epoch 1010 6000060000 6 96usstep loss 00200 valloss 00193 see output last reconstruction losserror validation set 00193 great pas normal image MNIST dataset reconstruction loss low 002 tried pas different image outlier anomaly get high reconstruction loss value network failed reconstruct imageinput considered anomaly Notice code use encoder part compress data image also use decoder part decompress data loading decoder layer let’s anomaly detection code us two different image predict anomaly score reconstruction error using autoencoder network trained first image MNIST result 543209 mean image anomaly second image used completely random image doesn’t belong training dataset result 67894907 high error mean image anomaly concept applies type dataset 2 Image DenoisingTags Artificial Intelligence Machine Learning Neural Networks Data Science Deep Learning |
4,150 | A Warm Fuzzy Hug | #DecemberSelfCare
A Warm Fuzzy Hug
Adorable fuzzy PJs, this poem’s for you
Photo by Anastasia Zhenina on Unsplash
as the temperature dips
into the icy stages
the phase where the air
feels spicy against your skin
i celebrate the winter
indoors
by breaking out
the festive fuzzy PJs
the ones that make you feel
like you’re a walking live teddy bear
the PJs that envelop you
like a bear hug. | https://medium.com/the-brain-is-a-noodle/a-warm-fuzzy-hug-ad180ba399d1 | ['Lucy The Eggcademic', 'She Her'] | 2020-12-23 10:05:17.353000+00:00 | ['Poetry Prompt', 'Mental Health', 'Self Care', 'Poetry'] | Title Warm Fuzzy HugContent DecemberSelfCare Warm Fuzzy Hug Adorable fuzzy PJs poem’s Photo Anastasia Zhenina Unsplash temperature dip icy stage phase air feel spicy skin celebrate winter indoors breaking festive fuzzy PJs one make feel like you’re walking live teddy bear PJs envelope like bear hugTags Poetry Prompt Mental Health Self Care Poetry |
4,151 | Data Science for Everyone: Getting To Know Your Data — Part 1 | Data: Formulating the Concepts
Definitions
The word data is the plural form of the word datum, which has the meaning of a “single piece of information, as a fact, statistic, or code” [5]. Another definition is “something given or admitted especially as a basis for reasoning or inference” [6]
In simple terms, data can be defined as numbers, characters, words, sounds, or symbols that can be used to describe, quantify, or recognize physical or virtual entities.
For example, you can sufficiently describe a person with some data points (datums) such as name, date of birth, gender, appearance (colors and build), height, weight, etc. The same information can also be used to differentiate one person from another for recognition purposes.
Figure 3: Data field with a value assigned to it. (Image by author)
Let’s look at this conversation between two people: “That [tall] [boy] with [brown hair] working as a [barista] at [ABC Coffee Shop] helped me when my car broke down in front of his shop. I think his name is [James]”. The words within [] are the data points you may use to recognize the specific person in a normal conversation as well as in a systematic data application.
Data points are sometimes mentioned as features, data fields, characteristics, facts, and attributes, which should all be taken as the same concept at a high level.
A collection of data fields we can use to describe a person can be called a data model of a person. That becomes a record when the values are assigned to those fields. Similarly, we can represent other physical objects like vehicles, buildings, books using their data points which can describe their characteristics.
Figure 4: Data fields used to represent a person. (Image by author)
Several related records can be arranged into a structure such as a table or a list. Imagine a table containing data about 100 different people, with one row representing each person and each column used to store one data point.
Figure 5: Table
Many related data structures are combined into one larger structure that becomes a database. Depending on the application, there are multiple types of databases and database management systems to choose from.
Data and Information
We looked at the basic concepts of data everyone should know. Let’s quickly look at another related concept that is always mentioned along with data: information. Let’s try to understand the difference between information and data.
As we discussed above, data comes with two main components: structure and context. Without them, data has no meaning or value.
When data is taken with structure, context, and meaning, we call it information.
Here is an example: you got some data, a list of values with different color names. It is certainly data, but can it alone give you any context? Is that data meaningful or useful? The answer could be no to both questions.
Figure 6: Information (Photo by William Iven on Unsplash)
What if the same list is given with another data point for each color value: a car model and brand? The data now have some context and meaning. How about adding another data point: price? We have now added some context to the data, and it can be used to derive useful information.
In the person data example illustrated in Figure 4, all the attributes “name”, “date of birth”, “height”, “weight”, etc. have no use unless they are connected to the person entity (arrows in the figure).
Data Organization
Based on how they are arranged, data collections can be categorized as structured, unstructured, and semi-structured.
Figure 7: Illustrating structured, unstructured, and semi-structured data. (Image by author)
The most traditional form of data collection is structured, where the data is organized into tables that are easy for both humans and machines to handle. Structured data is easier to search and manage.
Unstructured data like images, videos, sound, and large text content like books, letters, and paragraphs have been used by humans for many centuries, even before the computer era. Managing and searching through unstructured data is quite cumbersome. According to a Forbes Technology Council post referring to Gartner, an estimated “upward of 80% of enterprise data today is unstructured” [7]. Therefore, the research and development efforts in data science are heavy in this area.
With the advent of the internet and the evolution of computer technology, semi-structured data forms like mark-up languages and hierarchical data structures have become popular for storing and transmitting data. Semi-structured data is considered self-describing, and it helps to store complex data that cannot easily be organized into tables.
Figure 8: Semi-structured data formats (XML and JSON). (Image by author)
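As a small illustration, a semi-structured record for the person described earlier can be built and serialized to JSON like this (the field values are invented for illustration):

import json

# A nested, self-describing record; the values are invented for illustration
person = {
    "name": "James",
    "date_of_birth": "1990-05-14",
    "appearance": {"hair": "brown", "height_cm": 185},
    "occupation": "barista",
}
print(json.dumps(person, indent=2))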
In the process of analyzing the data, unstructured data is converted into a structured or semi-structured form utilizing suitable data science methodologies.
Data types
Let’s now take a different perspective on understanding data.
Data can be represented in many different ways such as numbers, characters, symbols, pictograms, colors, signs, object arrangements, etc.
Figure 9: Different data representations and conversion into digital form. (Image by author)
In digital, computerized data representation, every other representation melts down into numbers and ends up in binary form when stored, transmitted, and computed.
We can find a system of data types used in computing, with boolean, integer, float, and character as the primary types.
The derived types of these are strings, arrays, lists, sets, vectors, matrices, tensors, complex numbers, structures, enumerators, dictionaries, tables, and objects, which have made possible the complex computing applications we all benefit from in this era.
A special data type, known as null, none, or void depending on the programming language, is used to represent “nothing”.
An in-depth discussion on the different data types and their uses is planned for a future article.
Figure 10: Data Types used in computing. (Image by author)
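A quick illustration of some of these types in Python terms (note that Python folds characters into strings, so a character is simply a length-1 string):

flag = True                         # boolean
count = 128                         # integer
weight = 72.5                       # float
initial = 'J'                       # character (a length-1 string in Python)

name = "James"                      # string
scores = [68, 65, 84]               # list/array
point = complex(3, 4)               # complex number
record = {"name": name, "age": 30}  # dictionary
nothing = None                      # the special "nothing" value

print(type(count), type(record), nothing)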
Data File Types
Traditionally you access data from printed materials, videos, display boards, etc. In a computer system, your data comes as files or streams. The common file types are text, binary, images/photos, audio, video, archive, compressed, database. A detailed discussion on these file formats and their uses is planned for a future article.
Data Encoding
The data you access does not always stay in the same format in which it is presented to you. We learned that the ultimate form of data is binary (1/0) in digital systems. However, there are intermediate representations used when storing and transmitting data.
When data is moving from one location to another, its representation can also change. We call that encoding and decoding with reference to one representation. One of the common encoding schemes is the American Standard Code for Information Interchange (ASCII) [8], which is used to represent characters (letters, digits, and special signs). Figure 11 illustrates the encoding of the text “DATA@8” into ASCII. The universally accepted encoding scheme for the same purpose is known as UNICODE [9].
Figure 11: Character Encoding Example (ASCII) (Image by author)
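A short snippet that reproduces the Figure 11 example, mapping each character of “DATA@8” to its ASCII code and 7-bit binary form:

text = "DATA@8"
codes = [ord(c) for c in text]             # ASCII code points
bits = [format(c, '07b') for c in codes]   # 7-bit binary strings
print(codes)  # [68, 65, 84, 65, 64, 56]
print(bits)   # ['1000100', '1000001', '1010100', '1000001', '1000000', '0111000']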
Encoding should not be confused with encryption, which hides data content from unintended parties. A detailed discussion on encoding-decoding, encryption-decryption, and their uses is planned for a future article.
Analog vs. Digital Data
In nature, data exists in analog form, and we need to convert it into a machine-recognizable binary form to be used with digital computing machines. The term “digitization” is used to name this conversion process [10]. In electronics, this is sometimes called analog-to-digital encoding.
Some examples of digitization are scanning a paper document to create its digital copy, recording a sound from your mobile phone mic, recording your walking track using GPS data.
When the digitally stored data needs to serve as an analog output, digital-to-analog conversion is used. You are using this analog-to-digital conversion and vice versa in your personal devices such as mobile phones, video displays, sound recorders, cameras, music players, etc.
Figure 12: Analogue and Digital conversion: digitization of a sound signal. (Image by author)
Qualitative and Quantitative nature of data
Each data measurement can also be classified as qualitative or quantitative, with further subclasses, by the nature of the values it can take.
Figure 13: Classifying data measurements by their Qualitative and Quantitative nature. (Image by author)
Data is a broad concept that can be examined from a variety of perspectives. The more you combine those different perspectives, the better grip you will get on the data you are dealing with. Therefore, the concepts we discussed above are crucial at every stage in the data science workflow. They are also important any time you engage with data or data-driven applications.
Measuring Data
Data is quantifiable. The smallest unit of digital data is called a bit, which is also used as a scale to measure data. A single bit can store a value of either 0 or 1.
In the data encoding example, we showed that ASCII uses 7 bits which makes 2⁷ = 128 possible combinations. In other words, 128 different characters can be represented using an ASCII value.
A group of 8 bits (an octet) is considered one byte, which is the fundamental unit for measuring data. The symbol defined by the International System of Units (SI) is B. To measure large quantities of data, SI prefixes such as kilo (K), mega (M), giga (G), etc. are used [11]. The use of these prefixes must not be confused with the binary interpretation of prefixes used in many applications, like Ki, Mi, Gi, etc. [12,13]
Figure 14: System of units for measuring digital Information (Image by author, information source: [12])
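For example, the same byte count reads differently under the SI (decimal) prefixes and their binary counterparts:

size_bytes = 5_000_000

print(size_bytes / 1000, 'KB')      # 5000.0 KB   (SI kilo = 10**3)
print(size_bytes / 1024, 'KiB')     # ~4882.8 KiB (binary kibi = 2**10)
print(size_bytes / 1000**2, 'MB')   # 5.0 MB
print(size_bytes / 1024**2, 'MiB')  # ~4.77 MiB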
Data visualization
You have seen charts, plots, and various infographics condensing data and information into graphical representations. Visualization is a very efficient way of communicating data. It is also important in the early stages of the data science workflow for understanding the data and for various quality control measures before moving into the later stages. Some argue data visualization is both an art and a science. An in-depth discussion on data visualization methods and their uses is planned for a future article.
It is also important in the early stages of data science workflow to understand the data, and various quality control measures before moving into the later stages. Some argue data visualization is both an art and science. An in-depth discussion on data visualization methods and their uses are planned for a future article. | https://medium.com/towards-artificial-intelligence/data-science-for-everyone-getting-to-know-your-data-part-1-bb8b6d7782b1 | ['Sumudu Tennakoon'] | 2020-12-24 01:03:36.911000+00:00 | ['Data Science', 'Machine Learning', 'Data Scientist', 'Artificial Intelligence', 'Education'] | Title Data Science Everyone Getting Know Data — Part 1Content Data Formulating Concepts Definitions word data plural form word datum meaning “single piece information fact statistic code” 5 Another definition “something given admitted especially basis reasoning inference” 6 simple term data defined number character word sound symbol use describe quantify recognize physical virtual entity example sufficiently describe person data point datum name date birth gender appearance color built height weight etc information also used differentiate one person another recognition purpose Figure 3 Data field value assigned Image author Let’s look conversation two people “That tall boy brown hair working barista ABC Cofee Shop helped car broke front shop think name James” word within data point may use recognize specific person normal conversation well systematic data application Data point sometimes mentioned feature data field characteristic fact attribute taken concept high level collection data field use describe person called data model person becomes record value assigned field Similarly represent physical object like vehicle building book using data point describe characteristic Figure 4 Data field used represent person Image author Several related record arranged structure table list Imagine table containing data 100 different people one row representing person column used store one data point Figure 5 Table Many related data structure combined one larger structure becomes database Depending application multiple type database database management system choose Data Information looked basic concept data everyone know Let’s quickly look another related concept always mention data Information Let’s try understand difference information Data discussed data come two main component structure context Without data meaning value data taken structure context meaning call information example got data list value different color name certainly data alone give context data meaningful useful answer could question Figure 6 Information Photo William Iven Unsplash list given another data point color value car model brand data context meaning adding another data point price added context data used derive useful information person data example illustrated Figure 4 attribute “name” “date birth” “height” “weight” etc use unless connected person entity arrow figure Data Organization Based arranged data collection categorized structured unstructured semistructured Figure 7 Illustrating structured unstructured semistructured data Image author traditional form data collection structured data organized table easy handle human machine Structured data easier search manage Unstructured data like image video sound large text content like book letter paragraph used human many century year even computer era Managing searching unstructured data quite cumbersome According Forbes Technology Council post referring Gartner “estimate 
upward 80 enterprise data today unstructured 7” Therefore research development effort data science heavy area advent internet evolution computer technology semistructured data form like markup language hierarchical data structure become popular storing transmitting data Semistructured data considered selfdescribing data help store complex data cannot easily organize table Figure 8 Semistructured data format XML JSON Image author process analyzing data unstructured data converted structured semistructured form utilizing suitable data science methodology Data type Let’s take different perspective understanding data Data represented many different way number character symbol pictograms color sign object arrangement etc Figure 9 Different data representation conversion digital form Image author digitally computerized data representation every representation melt number end binary form storing transmitting computing find system data type used computing boolean integer float character primary type derived type string array list set vector matrix tensor complex number structure enumerator dictionary table object made possible complex computing application benefit era special data type known null none void depending programming language used represent “nothing” indepth discussion different data type us planned future article Figure 10 Data Types used computing Image author Data File Types Traditionally access data printed material video display board etc computer system data come file stream common file type text binary imagesphotos audio video archive compressed database detailed discussion file format us planned future article Data Encoding data accessed always stayed format presented learned ultimate form data binary 10 digital system However intermediate representation used storing transmitting data data moving one location another representation also change call encoding decoding reference one representation One common encoding scheme American Standard Code Information Interchange ASCII 8 use represent character letter digit special sign Figure 10 illustrates encoding text “DATA8” ASCII universally accepted encoding scheme purpose known UNICODE 9 Figure 11 Character Encoding Example ASCII Image author Encoding confused encryption hide data content unintended party detailed discussion encodingdecoding encryptiondecryption us planned future article Analog v Digital Data nature data exist analog form need convert machine recognizable binary form used digital computing machine term “digitization” used name conversion process 10 electronics sometimes called analog digital encoding example digitization scanning paper document create digital copy recording sound mobile phone mic recording walking track using GPS data digitally stored data need serve analog output digital analog conversion used using analog digital conversion vise versa personal device mobile phone video display sound recorder camera music player etc Figure 12 Analogue Digital conversion digitization sound signal Image author Qualitative Quantitative nature data data measurement also classified qualitative quantitative subclass nature value take Figure 13 Classifying data measurement Qualitative Quantitative nature Image author Data broad concept examined variety perspective combine different perspective get better grip data dealing Therefore concept discussed crucial every stage data science workflow also important time engage data datadriven application Measuring Data Data quantifiable Data quantifiable smallest unit digital data 
called bit also used scale measure data single bit store value either 0 1 data encoding example showed ASCII us 7 bit make 2⁷ 128 possible combination word 128 different character represented using ASCII value group 8 bit octet considered one byte fundamental unit measuring data symbol defined International System Units SI B measure large quantity data SI prefix Kilo K Mega Giga G etc used 11 use prefix must confused binary interpretation prefix used many application like Ki Mi Gi etc1213 Figure 14 System unit measuring digital Information Image author information source 12 Data visualization seen chart plot various infographics condensing data information graphical representation Visualization efficient way communicating data also important early stage data science workflow understand data various quality control measure moving later stage argue data visualization art science indepth discussion data visualization method us planned future article also important early stage data science workflow understand data various quality control measure moving later stage argue data visualization art science indepth discussion data visualization method us planned future articleTags Data Science Machine Learning Data Scientist Artificial Intelligence Education |
4,152 | Ten Deep Learning Concepts You Should Know for Data Science Interviews | Deep learning and neural networks can get really complicated. When it comes to data science interviews, however, there are only so many concepts that interviewers test. After going through hundreds and hundreds of data science interview questions, I compiled 10 deep learning concepts that came up the most often.
In this article, I’m going to go over these 10 concepts, what they’re all about, and why they’re so important.
With that said, here we go! | https://towardsdatascience.com/ten-deep-learning-concepts-you-should-know-for-data-science-interviews-a77f10bb9662 | ['Terence Shin'] | 2020-12-10 04:04:50.070000+00:00 | ['Deep Learning', 'Artificial Intelligence', 'Machine Learning', 'Data Science', 'Work'] | Title Ten Deep Learning Concepts Know Data Science InterviewsContent Deep learning neural network get really complicated come data science interview however many concept interviewer test going hundred hundred data science interview question compiled 10 deep learning concept came often article I’m going go 10 concept they’re they’re important said goTags Deep Learning Artificial Intelligence Machine Learning Data Science Work |
4,153 | Getting Started with Python | Python is an amazing language that is used in a wide variety of applications. Did you know that Python is used in applications involving automation, data science and web apps? For example, Facebook uses Python to process images. Before we get started, let us first break down the concepts that we need to learn in order to become a Python ninja.
1. Python Syntax
2. Data Structures
3. Algorithms
The first thing we must be acquainted with is the syntax of the Python programming language. We must also learn the proper data structures that we want to use when we are solving a particular problem. Lastly, we must know which algorithm we would want to use to reach the solution of the problem.
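As a tiny illustration of all three working together (Python syntax, a list as the data structure, and a linear search as the algorithm):

# Syntax: a function; data structure: a list; algorithm: a linear search for the maximum
def find_max(numbers):
    largest = numbers[0]
    for n in numbers:   # walk through the list once
        if n > largest:
            largest = n
    return largest

print(find_max([3, 41, 12, 9, 74, 15]))  # 74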
Now you must be thinking, this is fine and all, but what do I need to install on my machine in order to run Python? If you are already using a Mac or Linux operating system, then Python is already pre-installed. If you are using a Windows 10 machine, Microsoft released an update that also pre-installs Python. However, if you do not have Python installed on your OS or are unsure if you have Python installed, here is a breakdown between a Mac OS and Windows OS for checking: | https://medium.com/quick-code/getting-started-with-python-313eb74915c8 | ['Rafay Syed'] | 2019-08-28 01:21:52.907000+00:00 | ['Programming', 'Computer Science', 'Python'] | Title Getting Started PythonContent Python amazing language used wide variety application know Python used application involving automation data science web apps example Facebook us Python process image get started let u first break concept need learn order become Python ninja Python Syntax Data Structures Algorithms first thing must acquainted syntax Python programming language must also learn proper data structure want use solving particular problem Lastly must know algorithm would want use reach solution problem must thinking fine need install machine order run Python already using Mac Linux operating system Python already preinstalled using Windows 10 machine Microsoft released update also preinstalls Python However Python installed OS unsure Python installed breakdown Mac OS Windows OS checkingTags Programming Computer Science Python |
4,154 | Spores on the cooling off corpse of data science | A few years ago, data scientism was a fresh shiny hot air balloon that took off due to the aristocratic arrogance of prestigious, predictive analytic vendors who simply milked their cows and watched with glass eyes as the era of big data and cloud computation began.
The burner in the balloon’s basket was fueled by the dropping cost of storing and manipulating massive amounts of data, thanks to the quality big data tools and sophisticated cloud infrastructure that have become widely available. These were hacked together by resident data scientists using mainly R/Python or other open source tools.
The marketing storm of “data driven business”, “BIG data” and “cloud computation”, together with the decades-long hypnotic education about the extraordinary value of predictive analytics, established a craving appetite in companies all around the world to have their own burner developed. This craving appetite delivered headlines stating that the sexiest job of the century would be data science. This belief is still quite fashionable, and data scientists need not worry about their jobs yet. However, there are voices and debates questioning the long-term future of data science.
I am also one of those who believe the bubble is leaking; however, I got to this conclusion in a different way, which I will now unfold…
Once upon a time…
The story goes back about fifteen or twenty years, when the classical vendors of predictive analytic software started spreading their products outside the classical application areas such as banking and insurance. They did this first in terms of slideware. Later, slideware evolved into expensive, yet exceptionally unstable, betas. Nevertheless, the evangelization of data mining led to extensive usage of predictive models in various business areas such as predictive marketing and CRM, financial services, telecommunication, retail, travel, healthcare, and pharmaceuticals.
The application areas of predictive models and the market for predictive analytics are still growing today, and all major research companies predict growth for the remainder of the decade. It is also expected that new application areas, such as predictive maintenance, will strongly emerge. However, I strongly believe that the landscape of predictive analytics is being revolutionized, and will be marked by an alternative both to the era of data scientists and to new releases of data mining software that still copy the data mining workflows invented twenty years ago for the classical applications, as aforementioned.
Today, the most widely used applications of predictive analytics are different from the classical applications for which the classical workflows were invented.
In the early ages of predictive analytics, when it was used strictly for banking and insurance purposes, a prediction carried a very high financial impact. A single prediction could earn or lose the company hundreds of thousands of dollars. This fact, unsurprisingly, shaped the workflow of the classical data mining applications, with their focus on the development and fine-tuning of a single model, done by a horde of highly trained mathematicians and statisticians, preferably with a PhD.
This culture of “rocket science” still shapes decisions about predictive analytics today. While the financial impact of predictions in the mass application areas such as predictive marketing, CRM, retail, or travel is tiny, most of these companies are still purchasing expensive predictive analytic tools that were built for the classical applications. Most of these companies still hire highly trained data miners to use these products, or struggle to recruit versatile data scientists who can build in-house tools to generate predictions whose financial impact can be measured in fractions of pennies.
I’m pretty much sure that most of the business cases of predictive analytics would fail if they were tested.
Back to the spores on the corpse
The key to the future of predictive analytics is ease of application and speed of deployability. The interpretability of a model, or the performance of an individual model, becomes less important, and this produces new challenges to be faced by the traditional data mining workflows and software packages. A majority of the applications of predictive analytics would benefit more from using Machine Learning as a Service (MLaaS) instead of owning and licensing a standalone product.
MLaaS products will be accessed through well-defined and fairly standardized APIs that open doors for continuous innovation. The most successful candidates among future MLaaS providers will feature the management of large numbers of models through advanced model monitoring capabilities. They will support automatic model development and, by closing the fact-feedback loop, they will also provide online learning, with models automatically re-trained when there is a drop in predictive performance.
In the very near future, the aforementioned features of MLaaS products will revolutionize the world of predictive analytics as we know it today. Hundreds of classical applications, products, serves and novel IoT applications will benefit from the adaptation capability or “plug and play intelligence” they provide. | https://medium.com/data-science-without-marketing-mystery/spores-on-the-cooling-off-corpse-of-data-science-fb7aef0fd715 | [] | 2017-09-08 10:19:34.307000+00:00 | ['Machine Learning', 'Predictive Analytics', 'Data Science', 'Big Data', 'CRM'] | Title Spores cooling corpse data scienceContent year ago data scientism fresh shiny hot air balloon took due aristocratic arrogance prestigious predictive analytic vendor simply milked cow watched glass eye era big data cloud computation began burner balloon’s basket fueled dropping cost storing manipulating massive amount data due quality big data tool sophisticated cloud infrastructure become widely available hacked together resident data scientist using mainly RPython open source tool marketing storm “data driven business” “BIG data” “cloud computation” decade long hypnotic education extraordinary value predictive analytic established craving appetite company around world burner developed craving appetite delivered headline stating sexiest job century data science belief still quite fashionable data scientist need worry job yet However voice debate question longtime future data science also one belief bubble leaking however got conclusion different way unfold… upon time… story go back fifteen twenty year classical vendor predictive analytic software started spreading product outside classical application area banking insurance first term slideware Later slideware evolved expensive yet exceptionally unstable beta Nevertheless evangelizations data mining yield extensive usage predictive model various business area predictive marketing CRM financial service telecommunication retail travel healthcare pharmaceutical application area predictive model market predictive analytics still growing today predicted grow remainder decade major research company also expected new application area strongly emerge predictive maintenance However strongly believe landscape predictive analytics revolutionized marked alternative era data scientist new release data mining software still copy data mining workflow invented twenty year ago classical application aforementioned Today widely used application predictive analytics different classical application classical workflow invented early age predictive analytics used strictly banking insurance purpose high financial impact relation prediction single prediction could earn lose company hundred thousand dollar fact unsurprisingly formed workflow classical data mining application well focusing development fine tuning single model done horde highly trained mathematician statistician preferably PhD culture “rocket science” still form decision predictive analytics today financial impact prediction mass application area predictive marketing CRM retail travel tiny company still purchasing expensive predictive analytic tool built classical application company still hire highly trained data miner use product struggle recruit versatile data scientist build inhouse tool generate prediction whose financial impact measured fraction penny I’m pretty much sure business case predictive analytics would fail tested Back spore corpse key future predictive analytics ease application speed deployability Interpretability model 
performance individual model becomes le important produce new challenge faced traditional data mining workflow software package majority application predictive analytics would benefit using Machine Learning Service MLaaS instead owning licensing standalone product MLaaS product accessed welldefined fairly standardized APIs open door continuous innovation successful candidate future MLaaS provider feature management large amount model advanced model monitoring capability support automatic model development closing factfeedbackloop also provide online learning model got automatically retrained drop predictive performance near future aforementioned feature MLaaS product revolutionize world predictive analytics know today Hundreds classical application product serf novel IoT application benefit adaptation capability “plug play intelligence” provideTags Machine Learning Predictive Analytics Data Science Big Data CRM |
4,155 | Build An Android App To Monitor and Convert “bitcoin and etherum” in 20 Local Currencies | In the era of the digital world, the monetary system is constantly changing, and things that have been popular are being replaced by improved technologies.
The payment industry is particularly affected by this digital era of cryptocurrencies, given the public acceptance they have received from many countries and payment platforms. Countries like Japan have already made cryptocurrency a legal means of payment, alongside many others.
A friend once wrote “ Our ecosystem is no longer just about the code but about people who build and use products.
Recently, I’ve realized that 50% of my time (including weekends) is distributed to VS code, the terminal, and Slack. This whole thing is becoming a lifestyle and of course, I’m embracing it — it’s what I love”.
I believe he’s not alone. A lot of us spend up to 50 hours a week on productivity tools. Why should we limit that to just code? Why not extend it to cover daily life utility tasks for us?
With that in mind, I’ve put up a developer tool to show the possibilities of monitoring these cryptocurrencies in real time on your Android devices. Not just that, you will also be able to convert them across 20 different local currencies.
So in this tutorial, we’ll walk through how you can build this application for yourself, leveraging the API we’ll be providing for this purpose.
DEMO
It’s always good practice to have a visual and practical idea of what you’re building and how it works, so take a look at this short clip to see how the app works; you can also access the source code on GitHub. Without further ado, let’s head on over to Android Studio and start building.
Side Knowledge
By virtue of this article, you will learn a few other Android Development skills like:
Making API calls
Processing nested Json objects with iterators
Making network requests with volley
Working with recyclerviews and cardviews
Mathematical conversions with formats
etc
Technologies
Before we go ahead and start building, it is wise to talk about the technologies we’ll be using so it won’t look confusing when you come across them as we progress.
Volley — a third-party library that allows us to make network requests seamlessly
RecyclerView/CardView — special Android layouts for better organizing content on screen
Now create a new Android Studio project called “CryptoCompare”. By now this should be a fairly basic step; however, if you’re just starting off, refer to any of my previous posts on how to set up a new AS project.
Once you’re done creating the new project, install the dependencies for the technologies we talked about. Open your app-level build.gradle file and add:
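The entries for Volley, RecyclerView, and CardView would look roughly like this (the exact artifact versions below are assumptions; match them to your SDK setup):

dependencies {
    // Volley for network requests (version is an assumption)
    implementation 'com.android.volley:volley:1.1.0'
    // RecyclerView and CardView support libraries (versions are assumptions)
    implementation 'com.android.support:recyclerview-v7:27.1.1'
    implementation 'com.android.support:cardview-v7:27.1.1'
}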
Click sync to install the dependencies.
MainActivity Layout
Then open activity_main.xml and set up the layout like so:
This is quite a simple layout with a toolbar and three TextView objects for the local currency, BTC, and ETH respectively; these primarily serve as headers for the values to be loaded remotely into the RecyclerView we defined below the TextView objects. This layout should look like this in your XML visualizer:
Hey, yours might not look exactly like this, but then it shouldn’t, because I used a custom background image which you probably don’t have. The important things to look out for are the three TextView objects showing up as expected and the blue lines denoting the area covered by your RecyclerView and probably the toolbar.
When we make an API call that returns the values for these TextView objects, we’ll simply pass the data into our CardView layout and then use the layout to populate the RecyclerView accordingly. Make sense? Okay, let’s continue.
CardView Layout
Talking about CardView, let’s create a new layout resource file called “card_items.xml”. This will be the CardView layout where we define the contents we’d like to display on the RecyclerView, i.e. the currency, BTC, and ETH. So create the new resource file and set it up like so:
This is a simple CardView layout with three TextView objects that we predefined with dummy values to serve as placeholders for the actual data we’ll be getting from our API. Just for the sake of clarity, your XML visualizer for this layout should look like this:
Now let’s head over to our MainActivity.java file and get interactive. Open MainActivity.java and initialize the RecyclerView object. Then we start making the API call. First we store the API endpoint inside a variable defined as private static final String URL_DATA and then use it to build our JSONObject request like so:
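A sketch of that setup (the view IDs are assumptions, and the URL below is only a placeholder since the real endpoint is not shown here):

import android.os.Bundle;
import android.support.v7.app.AppCompatActivity;
import android.support.v7.widget.LinearLayoutManager;
import android.support.v7.widget.RecyclerView;
import android.support.v7.widget.Toolbar;
import java.util.ArrayList;
import java.util.List;

public class MainActivity extends AppCompatActivity {

    // Placeholder only: the real endpoint is not shown in this article
    private static final String URL_DATA = "https://example.com/api/prices";

    private RecyclerView recyclerView;
    private MyAdapter adapter;
    private List<CardItems> cardItemsList;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);

        // Toolbar at the top of activity_main.xml (ID is an assumption)
        Toolbar toolbar = (Toolbar) findViewById(R.id.toolbar);
        setSupportActionBar(toolbar);

        recyclerView = (RecyclerView) findViewById(R.id.recyclerView);
        recyclerView.setHasFixedSize(true);
        recyclerView.setLayoutManager(new LinearLayoutManager(this));

        // Backing list for the cards; filled once the API call returns
        cardItemsList = new ArrayList<>();
        adapter = new MyAdapter(cardItemsList, this);
        recyclerView.setAdapter(adapter);

        loadURLData();
    }
}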
What we have done in the onCreate() method here is quite simple: we defined our API in a string variable and initialized our toolbar, texts, and RecyclerView. We also created an ArrayList from a CardItems class that we are yet to create but will do so soon. Notice we also called a method, loadURLData(). This is the method where we make the request to the API to return the values of bitcoin and etherum in 20 local currencies. If you copied this snippet into your studio and got errors, don’t fret, you’re not lost; we called a method and two classes we are yet to create:
loadURLData()
MyAdapter class
CardItems class.
So go back inside the MainActivity class and create the loadURLData() method and set it up like so:
loadURLData()
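A sketch of the method (the response shape, with currencies nested under “BTC” and “ETH” keys, is an assumption based on the description below; it requires the Volley, org.json, java.util.Iterator, and Toast imports):

private void loadURLData() {
    JsonObjectRequest request = new JsonObjectRequest(Request.Method.GET, URL_DATA, null,
            new Response.Listener<JSONObject>() {
                @Override
                public void onResponse(JSONObject response) {
                    try {
                        // Assumed shape: {"BTC": {"USD": ...}, "ETH": {"USD": ...}}
                        JSONObject btc = response.getJSONObject("BTC");
                        JSONObject eth = response.getJSONObject("ETH");
                        Iterator<?> keysBTC = btc.keys();
                        while (keysBTC.hasNext()) {
                            String currency = (String) keysBTC.next();
                            String btcVal = String.valueOf(btc.getDouble(currency));
                            String ethVal = String.valueOf(eth.getDouble(currency));
                            cardItemsList.add(new CardItems(currency, btcVal, ethVal));
                        }
                        adapter.notifyDataSetChanged();
                    } catch (JSONException e) {
                        e.printStackTrace();
                    }
                }
            },
            new Response.ErrorListener() {
                @Override
                public void onErrorResponse(VolleyError error) {
                    Toast.makeText(MainActivity.this, error.getMessage(),
                            Toast.LENGTH_SHORT).show();
                }
            });
    Volley.newRequestQueue(this).add(request);
}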
Here we are simply making an API call with Volley, passing in the variable that stores the API endpoint. The request returns a response from which we extract our BTC and ETH values into JSONObjects. Then we use an Iterator<?> to iterate through the nested objects and match the individual BTC and ETH values to their respective currency keys, keysBTC and keysETH. Next we create the MyAdapter class. So create a new Java class called MyAdapter and set it up like so:
MyAdapter class
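A sketch of the adapter along the lines described below (the view IDs inside ViewHolder are assumptions):

import android.content.Context;
import android.support.v7.widget.RecyclerView;
import android.view.LayoutInflater;
import android.view.View;
import android.view.ViewGroup;
import android.widget.LinearLayout;
import android.widget.TextView;
import java.util.List;

public class MyAdapter extends RecyclerView.Adapter<MyAdapter.ViewHolder> {

    private List<CardItems> cardItemsList;
    private Context context;

    public MyAdapter(List<CardItems> cardItemsList, Context context) {
        this.cardItemsList = cardItemsList;
        this.context = context;
    }

    @Override
    public ViewHolder onCreateViewHolder(ViewGroup parent, int viewType) {
        // Inflate the card layout we built earlier
        View v = LayoutInflater.from(parent.getContext())
                .inflate(R.layout.card_items, parent, false);
        return new ViewHolder(v);
    }

    @Override
    public void onBindViewHolder(ViewHolder holder, int position) {
        // Copy one CardItems record into the card's three TextViews
        CardItems cardItem = cardItemsList.get(position);
        holder.currency.setText(cardItem.getCurr());
        holder.btcValue.setText(cardItem.getBtcVal());
        holder.ethValue.setText(cardItem.getEthVal());
    }

    @Override
    public int getItemCount() {
        return cardItemsList.size();
    }

    public static class ViewHolder extends RecyclerView.ViewHolder {
        public LinearLayout linearLayout;
        public TextView currency, btcValue, ethValue;

        public ViewHolder(View itemView) {
            super(itemView);
            // View IDs are assumptions; match them to your card_items.xml
            linearLayout = (LinearLayout) itemView.findViewById(R.id.linearLayout);
            currency = (TextView) itemView.findViewById(R.id.currency);
            btcValue = (TextView) itemView.findViewById(R.id.btcValue);
            ethValue = (TextView) itemView.findViewById(R.id.ethValue);
        }
    }
}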
The MyAdapter class is associated with our recyclerview view object. We use it to organize the contents of the recyclerview. In this context, we simply inflated the card_items.xml layout and then used the implemented methods to create viewHolders and bind it’s contents to the inflated layout.
Okay let’s step through this for a bit, by the way if you see any red lines at this point, don’t worry you are not alone, i’ll explain why you got the red lines and how to overcome it. From the top, when we created the MyAdapter class, we extended RcyclerView.Adapter<MyAdapter.ViewHolder> and passed into it the the cardItemsList and the Context, which prompted us to implement it’s associating methods ( viewHolder() and onBindViewHolder() ). Inside the viewHolder() method we simply inflated the card_items.xml layout file and returned a new viewHolder(v) method.
Then in the onBindViewHolder() method, we created an instance of the CardItems class and stored the values of the cardItems object into its respective variables (curr, btcVal and ethVal). Then, to finally bind these variables to their respective positions on the ViewHolder, we set them on the holder with the help of our CardItems instance, where we defined the setters and getters.
Finally, notice that when we extended the MyAdapter class, we passed in <MyAdapter.ViewHolder>; hence, we created the ViewHolder class inside the MyAdapter class, where we simply initialized all the view objects in the card_items.xml file, including the LinearLayout.
CardItems class
Finally, to finish setting up our MainActivity, we create the CardItems class. The CardItems class will simply hold the setters and getters for the contents of our card_items.xml file, which we initialized earlier in the onCreate() method of the MainActivity class. So create a new java class called CardItems and set it up like so:
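A minimal sketch of the model class, assuming the getter and setter names used in the adapter sketch above:

public class CardItems {

    private String currency;
    private String btcValue;
    private String ethValue;

    public CardItems() {
        // Empty constructor
    }

    public CardItems(String currency, String btcValue, String ethValue) {
        this.currency = currency;
        this.btcValue = btcValue;
        this.ethValue = ethValue;
    }

    public String getCurrency() { return currency; }

    public void setCurrency(String currency) { this.currency = currency; }

    public String getBtcValue() { return btcValue; }

    public void setBtcValue(String btcValue) { this.btcValue = btcValue; }

    public String getEthValue() { return ethValue; }

    public void setEthValue(String ethValue) { this.ethValue = ethValue; }
}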
At this point everything is correctly set up. If you run the app, you should now see that the contents we passed from the JSON response into our card_items layout file will show up on the layout which in turn gets laid out on the recyclerview like so: | https://medium.com/quick-code/build-an-android-app-to-monitor-and-convert-bitcoin-and-etherum-in-20-local-currencies-6628a9058a29 | ['Ekene Eze'] | 2018-02-01 17:05:03.845000+00:00 | ['Mobile App Development', 'Cryptocurrency', 'Android', 'Android App Development', 'Bitcoin']
4,156 | Taking Data Visualization to Another Level | When you tend to use one library for a certain period of time, you get used to it. But you need to evolve and learn something new every day. If you are still stuck with Matplotlib (which is amazing), Seaborn (also amazing), Pandas (basic, yet easy visualization) and Bokeh, you need to move on and try something new. Many amazing visualization libraries are available in Python, and they turn out to be very versatile. Here, I'm going to discuss these amazing libraries:
Plotly
Cufflinks
Folium
Altair + Vega
D3.js (My best pick)
If you are aware of and use the libraries mentioned above, then you are on the right track of evolution. They can help in generating some amazing visualizations, and the syntax isn't difficult either. Generally, I prefer plotly+cufflinks and D3.js. Alright, let's get back to the basics:
Plotly
Plotly is an open-source, interactive and browser-based graphing library for Python. Plotly lets you create interactive plots that you can use in dashboards or websites (you can save them as HTML files or static images). Plotly is built on top of plotly.js, which in turn is built on D3.js, and it is a high-level charting library. Plotly comes with over 30 chart types, including scientific charts, statistical charts, 3D graphs, financial charts and more. The best thing about plotly is that you can use it in Jupyter Notebooks as well as standalone HTML pages. You can also use it online on their site, but I prefer to use it offline, and you can also save a visualization as an image. It's pretty simple to use and get working.
— Method to use it in Jupyter Notebook (Offline)
First, install the plotly library.
pip install plotly
Then open jupyter notebook and type this:
from plotly import __version__
from plotly.offline import download_plotlyjs, init_notebook_mode, plot, iplot
init_notebook_mode(connected=True)
The syntax is about as simple as it gets. In Pandas, you use dataframe.plot(); here, you use dataframe.iplot(). This "i" changes the whole definition of the visualization.
With just one line, I generated this scatter plot. You can customize it as you want. Remember to specify mode='markers' or you'll just get a cluster of lines.
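The one-liner itself isn't embedded in this text; a minimal sketch is below. Note that .iplot() on a DataFrame is provided by cufflinks (introduced in the next section), so it needs to be imported, and the sample data here is made up:

import pandas as pd
import numpy as np
import cufflinks as cf

cf.go_offline()  # enables df.iplot() locally, without a plotly account

# Any numeric DataFrame will do as sample data
df = pd.DataFrame(np.random.randn(200, 2), columns=['A', 'B'])

# One line: an interactive scatter plot; mode='markers' avoids connecting lines
df.iplot(kind='scatter', x='A', y='B', mode='markers')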
Scatter plot generated using plotly
Please note that as the data increases, plotly begins to choke. So, I would only use plotly when I have less than 500K data points. Try it all in your Jupyter Notebook.
Cufflinks
Cufflinks binds Plotly directly to pandas dataframes. The combination is just amazing: the power of plotly combined with the flexibility of Pandas. It is even more convenient than plotly alone, and the syntax is simpler too. With plotly's Python library, you describe figures from a DataFrame's series and index, but with cufflinks you can plot a DataFrame directly. Here is an example:
df = cf.datagen.lines()
py.iplot([{
'x': df.index,
'y': df[col],
'name': col
} for col in df.columns])
With Plotly
df.iplot(kind='scatter')
With Cufflinks
Cufflinks makes it much easier to plot stuff. You can also generate amazing 3D charts with cufflinks. I generated this 3D chart with just a couple of lines of code.
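The exact snippet isn't included here; below is a minimal sketch using cufflinks' bundled sample-data generator. The datagen call and its arguments follow the pattern from the cufflinks docs, so treat them as assumptions:

import cufflinks as cf

cf.go_offline()

# Generate a sample sine-wave surface and plot it as an interactive 3D chart
cf.datagen.sinwave(10, 0.25).iplot(kind='surface', colorscale='brbg')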
3D chart with Cufflinks
You can always try it out in your Jupyter Notebook.
— Quick Hack:
Set in the config:
c.NotebookApp.iopub_data_rate_limit = 1.0e10
Import it the following way:
import plotly.graph_objs as go
import plotly.plotly as py
import cufflinks as cf
from plotly.offline import iplot, init_notebook_mode
cf.go_offline()
# Set global theme
cf.set_config_file(world_readable=True, theme='pearl', offline=True)
init_notebook_mode()
And it works inline.
Next, I’m going to talk about yet another amazing Viz library.
Folium
Folium is built on the data wrangling strengths of the Python ecosystem and the mapping strengths of the Leaflet.js library. You can manipulate your data in Python, then visualize it in a Leaflet map via folium. Folium is turning out to be an amazing library for plotting spatial data. You can also generate heat maps and choropleth maps using folium. Let's learn something about folium:
Maps are defined as a folium.Map object, and other folium objects can be added on top of the folium.Map to improve the map rendered. You can use different map tiles for the map rendered by Folium, such as MapBox, OpenStreetMap, and several other tiles; for that you can visit this github repo folder or this documentation page. You can also select different map projections. Many projections are available out there.
Let’s generate a Choropleth map with Geojson of US unemployment. Here is the snippet:
import os
import folium

# Assumed inputs: `us_states` is a GeoJSON of US state boundaries and
# `state_data` is a pandas DataFrame with 'State' and 'Unemployment' columns
m = folium.Map([43, -100], zoom_start=4)

choropleth = folium.Choropleth(
    geo_data=us_states,
    data=state_data,
    columns=['State', 'Unemployment'],
    key_on='feature.id',
    fill_color='YlGn',
    name='Unemployment',
    show=False,
).add_to(m)

# The underlying GeoJson and StepColormap objects are reachable
print(type(choropleth.geojson))
print(type(choropleth.color_scale))

folium.LayerControl(collapsed=False).add_to(m)

m.save(os.path.join('results', 'GeoChoro.html'))
m
This is just a basic one, you can add markers, pop ups and a lot more to it. Here is how it would look like.
Map with leaflet and folium
Altair + Vega
Altair is a declarative statistical visualization library and it is based on Vega and Vega-Lite. Altair enables you to build a wide range of statistical visualizations quickly with a powerful and concise visualization grammar. You need to install it the following way, if you are using Jupyter Notebook. It also includes some example vega datasets.
pip install -U altair vega_datasets notebook vega
Altair's main dependency is Vega. In order to make the plots visible on the screen, you need to install it, and you also need to run this command for every new session.
alt.renderers.enable('notebook')
Data in Altair is built around the Pandas Dataframe. One of the defining characteristics of statistical visualization is that it begins with tidy Dataframes. You can also save a plot as an image or open it in the Vega editor for more options. It's definitely not the best one out there, but it's worth a try for the sake of the creators' hard work.
Here is an example; I'm using the cars dataset for this:
import altair as alt
from vega_datasets import data

source = data.cars()

brush = alt.selection(type='interval')

points = alt.Chart().mark_point().encode(
    x='Horsepower:Q',
    y='Miles_per_Gallon:Q',
    color=alt.condition(brush, 'Origin:N', alt.value('lightgray'))
).add_selection(
    brush
)

bars = alt.Chart().mark_bar().encode(
    y='Origin:N',
    color='Origin:N',
    x='count(Origin):Q'
).transform_filter(
    brush
)

alt.vconcat(points, bars, data=source)
Scatter plot and Histogram with Altair and Vega
You can try it out in your own Notebook and let me know if you get stuck anywhere!
D3.js (Data Driven Documents)
D3.js is a JavaScript library for the manipulation of documents based on data. You can bring data to life using HTML, SVG, and CSS. D3 does not require you to tie yourself to any proprietary framework, because modern browsers have everything that D3 needs; it is also used for combining powerful visualization components and for a data-driven approach to DOM manipulation.
D3.js is the best data visualization library out in the market. I prefer to use it almost every time. You can use it with Python as well as with R. Originally, it works with JavaScript, and that can be quite tough, because JS has a wide range of functions and requires a lot of learning and experience; but if you are a JS pro then you don't need to give it a second thought. Python and R have made it a bit simpler, just a bit! But you get the best stuff out there with this library.
D3py has three main dependencies:
Numpy
Pandas
NetworkX
I would suggest using it with JavaScript or R rather than Python, because the Python version is out of date; it was last updated in 2016, and it was only ever a thin Python wrapper for D3.js.
R has an interface for D3 visualizations. With r2d3, you can bind data from R to D3 visualizations. D3 visualizations created with r2d3 work just like R plots within RStudio, R Markdown documents, and Shiny applications. You can install the r2d3 package from CRAN as follows:
install.packages("r2d3")
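Once installed, usage is a one-liner. This sketch follows the pattern from the r2d3 documentation, where barchart.js is a D3 script you supply yourself (a hypothetical file here):

library(r2d3)

# Bind a numeric vector from R to a D3 visualization script
r2d3(data = c(0.3, 0.6, 0.8, 0.95, 0.40, 0.20), script = "barchart.js")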
You can make some amazing visualizations with this one. Let me show you a couple of them here.
Sequences Sunburst — Kerry Rodden’s Block (Source)
Activity Status of an Year — Kunal Dhariwal (Me, lol)
From basics to high end, you can build anything with D3.js. Don't forget to try it out.
If you encounter any error or need any help, you can always make a comment or ping me on LinkedIn. LinkedIn: https://bit.ly/2u4YPoF Github: https://bit.ly/2SQV7ss
P.S- Special Thanks to the creators and contributors of those amazing libraries. | https://medium.com/hackernoon/taking-data-visualization-to-another-level-4d1c47bb01a2 | ['Kunal Dhariwal'] | 2019-05-10 13:31:07.864000+00:00 | ['Python', 'Data Science', 'Data Analysis', 'Data Visualization', 'Hackernoon Top Story']
4,157 | Physicists create prototype superefficient memory for future computers | Illustration. Energy efficient memory. Credit: @tsarcyanide/MIPT Press Office
Researchers from the Moscow Institute of Physics and Technology and their colleagues from Germany and the Netherlands have achieved material magnetization switching on the shortest timescales, at a minimal energy cost. They have thus developed a prototype of energy-efficient data storage devices. The paper was published in the journal Nature.
The rapid development of information technology calls for data storage devices controlled by quantum mechanisms without energy losses. Maintaining data centers consumes over 3% of the power generated worldwide, and this figure is growing. While writing and reading information is a bottleneck for IT development, the fundamental laws of nature actually do not prohibit the existence of fast and energy-efficient data storage.
The most reliable way of storing data is to encode it as binary zeros and ones, which correspond to the orientations of the microscopic magnets, known as spins, in magnetic materials. This is how a computer hard drive stores information. To switch a bit between its two basic states, it is remagnetized via a magnetic field pulse. However, this operation requires much time and energy.
Back in 2016, Sebastian Baierl from the University of Regensburg in Germany, Anatoly Zvezdin from MIPT in Russia, Alexey Kimel from Radboud University Nijmegen in the Netherlands and Russian Technological University MIREA, along with other colleagues, proposed a way for rapid spin switching in thulium orthoferrite via T-rays. Their technique for remagnetizing memory bits proved faster and more efficient than using magnetic field pulses. This effect stems from a special connection between spin states and the electrical component of a T-ray pulse.
“The idea was to use the previously discovered spin switching mechanism as an instrument for efficiently driving spins out of equilibrium and studying the fundamental limitations on the speed and energy cost of writing information. Our research focused on the so-called fingerprints of the mechanism with the maximum possible speed and minimum energy dissipation,” commented study co-author Professor Alexey Kimel of Radboud University Nijmegen and MIREA.
In this study, we exposed spin states to specially tuned T-pulses. Their characteristic photon energies are on the order of the energy barrier between the spin states. The pulses last picoseconds, which corresponds to one light oscillation cycle. The team used a specially developed structure comprised by micrometer-sized gold antennas deposited on a thulium orthoferrite sample.
As a result, the researchers spotted the characteristic spectral signatures indicating successful spin switching with only the minimal energy losses imposed by the fundamental laws of thermodynamics. For the first time, a spin switch was complete in a mere 3 picoseconds and with almost no energy dissipation. This shows the enormous potential of magnetism for addressing the crucial problems in information technology. According to the researchers, their experimental findings agree with theoretical model predictions.
“The rare earth materials, which provided the basis for this discovery, are currently experiencing a sort of a renaissance,” said Professor Anatoly Zvezdin, who heads the Magnetic Heterostructures and Spintronics Lab at MIPT. “Their fundamental properties were studied half a century ago, with major contributions by Russian physicists, MSU and MIPT alumni. This is an excellent example of how fundamental research finds its way into practice decades after it was completed.”
The joint work of several research teams has led to the creation of a structure that is a promising prototype of future data storage devices. Such devices would be compact and capable of transferring data within picoseconds. Fitting this storage with antennas will make it compatible with on-chip T-ray sources. | https://mipt.medium.com/physicists-create-prototype-superefficient-memory-for-future-computers-moscow-institute-of-489c6c4f181c | ['Moscow Institute Of Physics'] | 2019-05-17 08:57:42.032000+00:00 | ['Science', 'Computers', 'Computer Memories', 'Efficiency']
4,158 | 3 Ways To Know If You’re On Track To Success | SELF
When you look back, what propelled you there will be obvious…
Photo by Razvan Chisu on Unsplash
Many of us live life day to day without recognizing how each day is bringing us closer to that which we want to achieve. I have been using triggering and reflective questions at the end of each day for years now. I found the need for such a checklist even before it was cool. Before influencers began raving about bullet journals and before ‘The Secret’ came out encouraging positive affirmation trends.
I’ve always chased the high of inspiration and believe there isn’t a greater feeling. Many people lose faith after not seeing results soon enough and feel the need for their efforts and patience to be proven by visible achievements.
Putting my progress into perspective by categorizing it has been a helpful way for me to tell whether each day is leading me to my desired success. It ensures I feel positive about the small tasks and insignificant little things that I do daily. I believe that the constant use of these questions is what has led me to be the ever-confident and clear-visioned individual that I am today.
The answers more often than not take me to a content and humble place of appreciation of my constant efforts. At times I apply them in reverse as I know the looming checklist that is ahead. It, therefore, encourages me to provide a reason for at least one of the categories to be ticked off that day.
Of course, success doesn't come overnight. So, why do so many of us feel demotivated and fail to recognize just how much we help our future selves with our daily progress? It isn’t due to one big action or sacrifice but due to thousands of decisions and efforts made every day that then coincide with luck and timing.
It’s an initial decision to change our mindset, ultimately leading to a change in attitude until it has seeped into our bones as a habit of nature. The little things eventually make up the big picture and this takes time. Hopefully, these three questions can calm your spirit and ensure that you see each day as a small success towards a greater goal.
Life is a juggling act. Cavemen never had this many errands to run! There’s no question about it — humans are overworking themselves to death. So let’s not forget to consider just how well we are doing. We are thrust into a world where our human traits long to be happy each day. Meanwhile, we are expected to make money just to survive, and the way in which we do isn’t usually fulfilling. According to research, an astronomical 85% of people are unhappy with their jobs.
Simply being human also means nurturing yourself as well as the relationships and friendships in your life. To fulfill all of these each day is nearly impossible. This can feel entirely overwhelming when we are chasing a dream or working towards success and being content really does come when we find that perfect balance. | https://medium.com/age-of-awareness/3-ways-to-know-if-youre-on-track-to-success-1469c2b68550 | ['Sandra Michelle'] | 2020-11-26 00:06:31.524000+00:00 | ['Productivity', 'Inspiration', 'Self', 'Self Improvement', 'Life Lessons']
4,159 | The Story That’s Not Being Told: Mimi Lok & Last of Her Name | Last of her Name by Mimi Lok. Kaya Press, 2019. 200 pp, prose.
Cheyenne Heckermann: Can you tell me about the journey toward publishing Last of Her Name?
Mimi Lok: It was long! I’d written earlier drafts of the stories over roughly a ten-year span, and spent about three years working on the collection in earnest — rewriting, discarding, organizing. I wanted to send it directly to presses I’d long admired, but every writer I knew told me to find an agent first. After sending the manuscript to various agencies, I learned that I could not get an agent without also having a novel in the works, and I had no novel at the time. So I went back to my original plan and sent it directly to editors. I was thrilled to sell the book to Kaya Press, who I’ve loved for years. Just over a year later, the book was released.
CH: What went into your decision to write “Wedding Night” with such distinct breaks and vignettes between sections?
ML: “Wedding Night” is a messed up love story between two very different people. It was written in a fragmentary way, with perspective shifts between the protagonists Wai Lan and Sing, and this sort of disembodied, omnipotent perspective. Since the nature of memory is key to this story, telling the story in fragments with a greater emphasis on mood and sensory details made more sense than a linear, smoothly coherent narrative.
CH: One of the pieces in Last of Her Name is a novella. Was there anything different in your process with “The Woman in the Closet?”
ML: With the novella I had a slightly clearer sense of the story than with the others, possibly because it was partly inspired by a real-life incident. The story follows Granny Ng, an elderly homeless woman who breaks into a young man’s home, and I was interested in following her closely over a substantial period of time and seeing how things unfold for her. I also knew how it would end on a surface level, but what it had in common with the other stories is that I still had to relinquish control of the story to the characters’ desires, needs, and impulses, and let things go where they needed to go in between. I knew the what but not the how.
CH: You make excellent use of perspective shifts in your short stories. What do you enjoy about having these shifts, and how do they influence your stories?
ML: Challenging what’s accepted as the default perspective, I hope, shakes up our idea of whose experiences and perspectives we privilege over others, whose we don’t consider but should, all of that. I’m always curious about the story that’s not being told, and even if we only get a glimpse of that, it reminds us of complexities and nuances beyond our immediate perception.
CH: What’s next for Mimi Lok? Is there anything that you’re working on that you can talk about?
ML: I am writing more stories, and also working on a novel.
Mimi Lok is the author of the story collection Last Of Her Name, published October 2019 by Kaya Press. Last of Her Name was recently shortlisted for the 2020 PEN/Robert W. Bingham prize for debut short story collection, and a 2020 Northern California Book Award. A story from the collection, "The Woman in the Closet," was nominated for a 2020 National Magazine Award in Fiction with McSweeney's Quarterly. Mimi is the recipient of a Smithsonian Ingenuity Award and an Ylvisaker Award for Fiction. Her work can be found in McSweeney's, Electric Literature, LitHub, Nimrod, Lucky Peach, Hyphen, the South China Morning Post, and elsewhere. She is currently working on a novel. Mimi is also the founding director and executive editor of Voice of Witness, an award-winning human rights/oral history nonprofit that amplifies marginalized voices through a book series and a national education program. | https://medium.com/anomalyblog/an-interview-with-mimi-lok-on-last-of-her-name-41fef835d9a2 | ['Cheyenne Heckermann'] | 2020-02-18 15:50:51.943000+00:00 | ['Publishing', 'Fiction', 'Interview', 'Featured', 'Books']
4,160 | Coin-o-graphy | For a child growing up in a middle class family of Bangladesh, one of the first financial lessons he or she learns is how to save. We were taught how important saving is, and how one should save. Considering our age, our parents often thought it was not the proper time to introduce saving in banking organizations. Instead we were taught to save inside 'banks' made of clay. An enclosed clay pot with only a thin slit as an opening — the slit was big enough that we could shove coins into it. The only way to get the coins back is to break the pot. We used to drop coins in whenever we could. Then, when a pot was full, we would break it. In my lifetime I have filled and broken around 6 of them. The latest demolition was performed today, 30th September 2017. The latest coins have a tendency to get rusty and we needed to clean them up. So we broke the bank.
This is how the clay banks look like. Image is collected from Google Image Search.
It felt like I opened up a Pandora's box. Each coin was carrying bits and pieces of my past, since childhood. Each coin can carve a story around it. And that was the exact moment when I thought of writing this piece. It is a collection of stories around these coins, and the way we used to save them. Stories from my parents, and their parents. Stories from my own life. | https://medium.com/the-moonwreckers-diary/for-a-child-growing-up-in-a-middle-class-family-of-bangladesh-one-of-the-first-financial-lesson-he-195dbfc1e13e | ['Akm Nivrito'] | 2017-09-30 09:58:38.566000+00:00 | ['Money', 'Stories', 'Saving', 'Memories', 'Storytelling']
4,161 | Text Generation With Python And GPT-2 | Results
We can get some great results with very little code. Here are a few examples that should give you a better understanding of the impact of each argument in the .generate method.
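This excerpt picks up at the Results section, so the model and tokenizer setup isn't shown. Below is a minimal sketch of the assumed setup with Hugging Face transformers; the prompt string is an assumption inferred from the outputs shown below:

from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained('gpt2')
model = GPT2LMHeadModel.from_pretrained('gpt2')

# Encode a prompt; `inputs` is the tensor of token IDs used in the calls below
inputs = tokenizer.encode(
    "He began his premiership by forming a five-man war cabinet",
    return_tensors='pt'
)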
outputs = model.generate(
inputs, max_length=200, do_sample=True
)
tokenizer.decode(outputs[0], skip_special_tokens=True)

[Out]: "He began his premiership by forming a five-man war cabinet which included Chamerlain as Lord President of the Council, Labour leader Clement Attlee as Lord Privy Seal (later as Deputy Prime Minister), Halifax as Foreign Secretary and Labour's Arthur Greenwood as a minister without portfolio. In practice, these cabinet officers were part of the National Security Council, then headed by Lord Chamberlain's secretary Arthur Hargreaves. A number of them became key cabinet secretaries, including Lord Hargreaves in 1948, Lord Butler as Justice Minister in 1949 and Lord Codds as justice minister until his death in 1975. After being replaced by Lord Hargreaves in 1955, there was speculation that the next general election would see Chamberlain and Howe try and avoid a hard line policy..."
We can add more randomness with temperature — the default value is 1, a high value like 5 will produce a pretty nonsensical output:
outputs = model.generate(
inputs, max_length=200, do_sample=True, temperature=5
)
tokenizer.decode(outputs[0], skip_special_tokens=True)

[Out]: "He began his premiership by forming a five-man war cabinet which included Chamerlain as Lord President of the Council, Labour leader Clement Attlee as Lord Privy Seal (later as Deputy Prime Minister), Halifax as Foreign Secretary and Labour's Arthur Greenwood as a minister without portfolio. In practice, his foreign secretaries generally assumed other duties during cabinet so his job fell less and smaller - sometimes twice his overall stature so long a day seemed manageable after he became Chief Arctic Advisor: Mr Wilson led one reshover where we've also done another three, despite taking responsibility over appointments including Prime (for both) his time here since 1901)[31],[38-4]. (These last had fewer staff as many than he is responsible..."
Turning the temperature down below 1 will produce more linear but less creative outputs.
We can also add the top_k parameter — which limits the sample tokens to a given number of the most probable tokens. This results in text that tends to stick to the same topic (or set of words) for a longer period of time. | https://towardsdatascience.com/text-generation-with-python-and-gpt-2-1fecbff1635b | ['James Briggs'] | 2020-12-28 14:47:13.167000+00:00 | ['Machine Learning', 'Data Science', 'Technology', 'Artificial Intelligence', 'Programming']
4,162 | How to Create an Animated Bar Chart With React and d3 | Photo by Markus Winkler on Unsplash
Have you ever looked at data visualizations and been wowed by all the effects and animations?
Have you ever wondered how to integrate visualizations with react?
In this article, we will talk about how to make an animated bar chart using d3 in React.
To understand how to create the bar chart, let’s understand what d3 is and how it works.
D3 is an open-source javascript library that is used to create custom interactive data visualizations. It is data-driven and generates visualizations from data that can come from arrays, objects, jsons, or data from a CSV or XML file.
It allows direct selection of elements/nodes in the DOM, and you can attach styles and attributes to them to generate visualizations.
Here is an example of a d3 bar chart:
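The gist that originally appeared here isn't embedded in this text, so below is a minimal sketch of a BarChart component covering the pieces walked through next (the margins, xscale/yscale, the "g" master node, the data join, and the duration/delay animation). The data shape, prop names, and class names are assumptions:

import React, { useEffect, useRef } from 'react';
import * as d3 from 'd3';

// Hypothetical data shape: [{ year: 1993, value: 2.7 }, ...]
export default function BarChart({ data, width, height, yAxisTitle }) {
  const chartRef = useRef(null);

  useEffect(() => {
    const margin = { top: 40, right: 20, bottom: 50, left: 60 };
    const innerWidth = width - margin.left - margin.right;
    const innerHeight = height - margin.top - margin.bottom;

    // xscale covers the range of years; yscale scales the values
    const xScale = d3.scaleBand()
      .domain(data.map(d => d.year))
      .range([0, innerWidth])
      .padding(0.2);
    const yScale = d3.scaleLinear()
      .domain([0, d3.max(data, d => d.value)])
      .range([innerHeight, 0]);

    // "g" is the master node for the whole bar chart
    const svg = d3.select(chartRef.current);
    svg.selectAll('*').remove();
    const g = svg.append('g')
      .attr('transform', `translate(${margin.left},${margin.top})`);

    // Join the data and animate the bars with duration and delay
    g.selectAll('rect')
      .data(data)
      .join('rect')
      .attr('x', d => xScale(d.year))
      .attr('width', xScale.bandwidth())
      .attr('y', innerHeight)
      .attr('height', 0)
      .attr('fill', 'steelblue')
      .transition()
      .duration(750)
      .delay((d, i) => i * 100) // controls how fast the bars show up
      .attr('y', d => yScale(d.value))
      .attr('height', d => innerHeight - yScale(d.value));

    // Axes, plus the y-axis title bound to the yAxisTitle prop
    g.append('g')
      .attr('transform', `translate(0,${innerHeight})`)
      .call(d3.axisBottom(xScale));
    g.append('g').call(d3.axisLeft(yScale));
    g.append('text')
      .attr('class', 'y-axis-title')
      .attr('transform', 'rotate(-90)')
      .attr('x', -innerHeight / 2)
      .attr('y', -margin.left + 15)
      .attr('text-anchor', 'middle')
      .attr('font-size', 12)
      .text(yAxisTitle);
  }, [data, width, height, yAxisTitle]);

  return <svg ref={chartRef} width={width} height={height} />;
}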
I know this is a bit long so let me break this down.
Above we set the margins for the graph and on line 28/29, you would see there is an xscale and yscale. The xscale determines our range on the x-axis and in our case, that would be the range of the years(1993, 1994, etc).
On the other hand, the yscale determines the scale depending on the height of the values.
Afterward, we select the current ref and initialize the bar this way:
we select the “g” element of the current SVG, which is the bar chart itself.
Over here, we start joining the data we get from another file. Normally, this will be data from a CSV or JSON file. Afterward, we initialize the chart.
Here is where it gets interesting. After setting the width attr, we call duration and delay to control how fast the bars show up.
Let's look at how the rest of the chart is set up:
Over here, we set up the bar labels first. Afterward, we determine the location of the x-axis and y-axis labels, which we attach to the element “g”. “g” is our master node for the whole barChart.
We also select x-axis-title and y-axis-title and bind their data attributes to the respective fields of year and yAxisTitle. We also dictate other attributes that come along with them, such as x and y position, transform, and font-size.
Pretty straightforward, right? Let’s take a look at how it’s being utilized inside App.js:
Over here, we have a bar chart, where we set the width and the height as well as the y-axis title. We also give radio options for users to select between us and japan data, which maps to a different set of values from the data JSON under ‘./utils/constant’.
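Again, the original App.js snippet isn't embedded here; this is a minimal sketch under the assumptions above (a data object keyed by country in ./utils/constant, and Material-UI radio buttons):

import React, { useState } from 'react';
import Radio from '@material-ui/core/Radio';
import BarChart from './components/BarChart';
import { data } from './utils/constant';

export default function App() {
  const [country, setCountry] = useState('us');

  return (
    <div>
      {/* Radio options toggle between the US and Japan datasets */}
      <Radio checked={country === 'us'} onChange={() => setCountry('us')} value="us" />
      US
      <Radio checked={country === 'japan'} onChange={() => setCountry('japan')} value="japan" />
      Japan

      {/* Width, height and the y-axis title are passed straight to the chart */}
      <BarChart data={data[country]} width={800} height={400} yAxisTitle="Economic growth (%)" />
    </div>
  );
}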
It’s hard to show the graph with the animation here but here is a brief overview of how it would actually look like:
That’s it! I know I talked a lot about the visualization but I will also provide the steps to set this out from scratch.
Step 1: install node on your machine and run the following command:
curl "https://nodejs.org/dist/latest/node-${VERSION:-$(wget -qO- https://nodejs.org/dist/latest/ | sed -nE 's|.*>node-(.*)\.pkg</a>.*|\1|p')}.pkg" > "$HOME/Downloads/node-latest.pkg" && sudo installer -store -pkg "$HOME/Downloads/node-latest.pkg" -target "/"
Step 2: run the following command:
npx create-react-app economic-growth-chart
Step 3: go to app.js and replace it with the following content (already shown once in this article):
Step 4: run the following command:
npm install --save d3 @material-ui/core
Step 5: create a utils folder under the src folder and create constant.js with the following content:
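A hypothetical sketch of ./src/utils/constant.js; the country keys match the radio options above, and the yearly values are made-up placeholders:

// Yearly growth values per country; the BarChart consumes one array at a time
export const data = {
  us: [
    { year: 1993, value: 2.7 },
    { year: 1994, value: 4.0 },
    { year: 1995, value: 2.7 },
  ],
  japan: [
    { year: 1993, value: 0.2 },
    { year: 1994, value: 0.9 },
    { year: 1995, value: 1.9 },
  ],
};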
Step 6: under the src folder, create a folder called components and create a class called BarChart.js(this is also mentioned in this article already):
Now go into your terminal and run npm start! Your project is all set up. | https://medium.com/weekly-webtips/how-to-create-an-animated-barchart-with-react-and-d3-b4fd3662633f | ['Michael Tong'] | 2020-09-23 06:10:31.808000+00:00 | ['D3js', 'Web Development', 'React', 'JavaScript', 'Data Visualization']
4,163 | NaNoWriMo Week 1: Engaging My Inner Trickster | Morning raven.
Day 2 of NaNoWriMo began when my alarm sang a merry tune at 5:30am. I hit snooze. Inside my imagination, my characters glared at me, rolled over and went back to bed.
When I rose, two snooze cycles later and sat in front of my open laptop, I expected my characters to flow through a scene just following a major confrontation. I wanted my main character to spill a tiny bit of her secrets but not all of them.
Nothing happened.
Fear raged while my creativity still slept. This is stupid. Pointless. You should have stayed in bed. Your characters are boring and pointless and you’re not going to have any pages for your writing group to critique in three weeks. Why do you even bother, you should quit your writing group. They’re all better writers than you anyway.
Then, for no reason except maybe my Creativity woke up, I remembered The Trickster¹. Instead of forcing my characters to do things on the page, I wrote all those fears, sucking them out of my brain and putting them right in the middle of the page where my characters were supposed to be talking.
After three paragraphs of dumping random thoughts and fears, my characters jumped in. They interrupted my boring monologue and took over. They skipped that conversation I was trying to force them to have and talked about something else. When it was time to get ready for work I had 1500 words.
And in case you were wondering, I do include my rambling thoughts as part of my word count. Because that’s what a Trickster would do. Because they are part of my novel writing process, especially in November.
What do you do, when fear whispers its vile poison into your ear? How do you engage with your Trickster during NaNoWriMo?
Respond, please, and let me know. My inner Trickster loves new tricks. | https://medium.com/nanowrimo/nanowrimo-week-1-engaging-my-inner-trickster-7b2f48d88081 | ['Julie Russell'] | 2018-11-02 16:50:15.578000+00:00 | ['NaNoWriMo', 'Writing'] | Title NaNoWriMo Week 1 Engaging Inner TricksterContent Morning raven Day 2 NaNoWriMo began alarm sang merry tune 530am hit snooze Inside imagination character glared rolled went back bed rose two snooze cycle later sat front open laptop expected character flow scene following major confrontation wanted main character spill tiny bit secret Nothing happened Fear raged creativity still slept stupid Pointless stayed bed character boring pointless you’re going page writing group critique three week even bother quit writing group They’re better writer anyway reason except maybe Creativity woke remembered Trickster¹ Instead forcing character thing page wrote fear sucking brain putting right middle page character supposed talking three paragraph dumping random thought fear character jumped interrupted boring monologue took skipped conversation trying force talked something else time get ready work 1500 word case wondering include rambling thought part word count that’s Trickster would part novel writing process especially November fear whisper vile poison ear engage Trickster NaNoWriMo Respond please let know inner Trickster love new tricksTags NaNoWriMo Writing |
4,164 | Understanding Big O Space Complexity | Most Common Types of Big O
O(n)
If n = some integer, the number of operations within the algorithm (the way of solving a given problem) increases roughly in proportion with n. This type of algorithm is not ideal! An example of an algorithm with an O(n) could be a function which takes an integer, ’n’, as an argument, and uses a ‘for’ loop to calculate the sum of all numbers up to and including ’n’. In this case, the number of operations increases in proportion to n because the larger n is, the more summations will need to be done within the ‘for’ loop to solve the problem.
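A minimal sketch of the summation example just described (the function name is my own illustration, not from the original article):

def sum_up_to(n):
    # One addition per loop iteration, so the operation count grows linearly with n: O(n).
    total = 0
    for i in range(1, n + 1):
        total += i
    return total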
O(1)
In the case of an algorithm with an O(1), the number of operations required to complete the problem stays consistent no matter the value of ’n’. This is the most preferable type of Big O and results in the best performance. Example: a function which takes an integer, ’n’, and simply performs some operations on n without any sort of ‘for’ loop, searching, etc. The reason this is O(1) is that no matter the size of n, the number of operations within the function will stay exactly the same.
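As a hypothetical contrast (the closed-form trick is my own illustration, not from the article), the summation from the previous example can be computed with a fixed amount of work no matter how large n is:

def sum_up_to_constant(n):
    # Two arithmetic operations regardless of the size of n: O(1).
    return n * (n + 1) // 2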
O(n²)
In this case, the number of operations within the function will increase quadratically with ’n’ (roughly n × n). An example of this could be a function involving a nested loop, where both the outer and inner loop run n times. This results in a rapid increase in the number of operations required to complete the problem. Of the types listed here, O(n²) is the slowest and least preferable!
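A small sketch of the nested-loop case (the pair-counting task is a stand-in example of my own):

def count_pairs(n):
    # The inner loop runs n times for each of the n outer iterations,
    # so roughly n * n operations are performed: O(n^2).
    pairs = 0
    for i in range(n):
        for j in range(n):
            pairs += 1
    return pairs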
O(log n)
An algorithm with an O(log n) grows quickly for small values of ’n’, but its growth flattens out as ’n’ increases. This is a high-performing algorithm and is the next best thing to O(1)! An example of this could involve a search algorithm where the answer space keeps getting split in half, over and over again, until the answer is found.
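Binary search is the classic instance of such a halving search; a minimal sketch:

def binary_search(sorted_items, target):
    # Each comparison discards half of the remaining answer space,
    # so about log2(n) comparisons are needed: O(log n).
    low, high = 0, len(sorted_items) - 1
    while low <= high:
        mid = (low + high) // 2
        if sorted_items[mid] == target:
            return mid
        if sorted_items[mid] < target:
            low = mid + 1
        else:
            high = mid - 1
    return -1  # target is not present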
Below is a helpful graph of common types of Big O Notation in relation to time complexity: | https://medium.com/datadriveninvestor/understanding-big-o-space-complexity-6826478e5a9f | ['Colton Kaiser'] | 2020-06-08 17:22:10.004000+00:00 | ['Software Engineering', 'Coding', 'Software Development', 'Big O Notation', 'Programming'] |
4,165 | The Movies Are Getting Better | I’m writing this in response to Rebecca Stevens A.’s article about how there is no such thing as Black Privilege.
Rebecca is right. From India, as a person who watches America through the movies, I’m going to talk about a few movies. I am not a movie buff, but I do have a subscription to an English Movie Pack and I watch part of an English movie everyday during my lunch-hour.
1. Pretty Woman
I recently re-watched this. Back in 1990 when I was 14, I watched this movie with a friend and both our moms. Our moms would clap their hands over our eyes in many scenes, so naturally being able to watch all of it was good fun in itself.
Screenshot from Apple TV trailer of 1990 movie Pretty Woman
Pretty Woman has one black person in it. Darryl, the limousine driver. There isn’t a single black person in the upper class party here.
Screenshot from trailer of 1990 movie Pretty Woman
So, as a movie promoting human equality, I’d say it was a #fail.
Don’t worry, Hollywood will fix everything! Here come, movie no.2 and 3.
2. The Mighty Ducks
Screenshot from trailer of 1992 movie The Mighty Ducks.
Coach is white, while the rag-tag team he teaches is almost all-white. Like in Quidditch, however, this game is gender-neutral, with one girl player.
Screenshot from trailer of 1992 movie The Mighty Ducks
There is one black player on the team, but his dad is always on the white coach’s case, along with a white mom of another player.
Screenshot from trailer of 1992 movie The Mighty Ducks
In this movie, I’d give Disney credit for trying, but not so hard that it looked artificial. The focus is on the Coach’s drinking and driving, and his desire to win even if he’s cheating, unlike the kids who are honest.
3. Hardball
Screenshot from 2001 movie Hard Ball
Hard Ball too deals with a white coach down on his luck who’s forced to coach a kids’ baseball team. I’m amazed they liked the format enough to make the movie twice.
Screenshot from 2001 movie Hard Ball
The thing is: in the older movie, The Mighty Ducks, 1992, most of the ice hockey team is white. There is one black kid on the team, that is all.
In the newer one, 2001, Coach is white, while the kids are all black. This movie isn’t as great as The Mighty Ducks because the Ducks’ coach works harder than the Kekambas’, and the game play is better thought out.
I liked the way the Ducks’ ice hockey coach has them have to use eggs for pucks without breaking them.
Keanu Reeves’ character doesn’t do much in the “this way and that’s how” of baseball.
4. A Time To Kill
Screenshot from trailer of 1996 movie A Time To Kill
Next up are two courtroom dramas. One is a dramatization of the novel, A Time to Kill, by John Grisham. Here, the lawyer is white, while the defendant is black.
The defendant, played by Samuel L. Jackson, deliberately picks a white lawyer to offset his all-white jury. I wouldn’t call this a racist movie, but it is a bit like a low-calorie, fat-free health drink. It reminds you of all the things you’re trying to avoid.
5. Marshall
Screenshot from trailer of 2017 movie Marshall.
The other courtroom drama is Marshall. This movie also has a black defendant, but here, the lawyer is black, too. He’s from the N-double-A-C-P. I deliberately didn’t say NAACP because they never said it that way in the movie; it was always N-double-A-C-P.
Screenshot from trailer of 2017 movie Marshall.
This movie, Marshall, is so great it makes your skin prickle. I wish such movies were released in the theaters in India, but all we get in the movie halls are the superhero movies. | https://medium.com/illumination-curated/the-movies-are-getting-better-b969b92c7909 | ['Tooth Truth Roopa Vikesh'] | 2020-12-15 19:06:43.221000+00:00 | ['Nonfiction', 'Parenting', 'Movies', 'Diversity', 'Perspective'] |
4,166 | NumPy: Stacking, Splitting, Array attributes | Stacking
Arrays can be stacked horizontally, depth-wise, or vertically. We can use, for that purpose, the vstack , dstack , hstack , column_stack , row_stack , and concatenate functions.
Time for action — stacking arrays
First, let’s set up some arrays:
In: a = arange(9).reshape(3,3)
In: a
Out:
array([[0, 1, 2],
[3, 4, 5],
[6, 7, 8]])
In: b = 2 * a
In: b
Out:
array([[ 0, 2, 4],
[ 6, 8, 10],
[12, 14, 16]])
Horizontal stacking: Starting with horizontal stacking, we will form a tuple of ndarrays and give it to the hstack function. It stacks arrays in sequence horizontally (column-wise). This is shown as follows:
In: hstack((a, b))
Out:
array([[ 0, 1, 2, 0, 2, 4],
[ 3, 4, 5, 6, 8, 10],
[ 6, 7, 8, 12, 14, 16]])
We can achieve the same with the concatenate function, concatenation along the second axis, except for 1-D arrays where it concatenates along the first axis, which is shown as follows:
In: concatenate((a, b), axis=1)
Out:
array([[ 0, 1, 2, 0, 2, 4],
[ 3, 4, 5, 6, 8, 10],
[ 6, 7, 8, 12, 14, 16]])
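The 1-D behaviour mentioned above is not demonstrated in the original, so here is a quick sketch, assuming the same from numpy import * session used throughout this chapter:

In: concatenate((arange(3), arange(3)))
Out: array([0, 1, 2, 0, 1, 2])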
2. Vertical stacking: The vstack function is used to stack a sequence of input arrays vertically to make a single array. With vertical stacking, again, a tuple is formed. This time, it is given to the vstack function. This can be seen as follows:
In: vstack((a, b))
Out:
array([[ 0, 1, 2],
[ 3, 4, 5],
[ 6, 7, 8],
[ 0, 2, 4],
[ 6, 8, 10],
[12, 14, 16]])
The concatenate function produces the same result with the axis set to 0. This is the default value for the axis argument.
In: concatenate((a, b), axis=0)
Out:
array([[ 0, 1, 2],
[ 3, 4, 5],
[ 6, 7, 8],
[ 0, 2, 4],
[ 6, 8, 10],
[12, 14, 16]])
3. Depth stacking: Stack arrays in sequence depth wise (along third axis). This is equivalent to concatenation along the third axis after 2-D arrays of shape (M,N) have been reshaped to (M,N,1) and 1-D arrays of shape (N,) have been reshaped to (1,N,1). Additionally, there is the depth-wise stacking using dstack and a tuple, of course. This means stacking of a list of arrays along the third axis (depth). For instance, we could stack 2D arrays of image data on top of each other.
In: dstack((a, b))
Out:
array([[[ 0,  0],
[ 1,  2],
[ 2,  4]],
[[ 3,  6],
[ 4,  8],
[ 5, 10]],
[[ 6, 12],
[ 7, 14],
[ 8, 16]]])
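The reshape-then-concatenate equivalence described above can be verified directly in the same session:

In: (dstack((a, b)) == concatenate((a.reshape(3,3,1), b.reshape(3,3,1)), axis=2)).all()
Out: True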
4. Column stacking: The column_stack function stacks 1D arrays column-wise. It’s shown as follows:
In: oned = arange(2)
In: oned
Out: array([0, 1])
In: twiceoned = 2 * oned
In: twiceoned
Out: array([0, 2])
In: column_stack((oned, twiceoned))
Out:
array([[0, 0],
[1, 2]])
2D arrays are stacked the way hstack stacks them:
In: column_stack((a, b))
Out:
array([[ 0, 1, 2, 0, 2, 4],
[ 3, 4, 5, 6, 8, 10],
[ 6, 7, 8, 12, 14, 16]])
In: column_stack((a, b)) == hstack((a, b))
Out:
array([[ True,  True,  True,  True,  True,  True],
[ True,  True,  True,  True,  True,  True],
[ True,  True,  True,  True,  True,  True]], dtype=bool)
Yes, you guessed it right! We compared two arrays with the == operator. Isn’t it beautiful?
5. Row stacking: NumPy, of course, also has a function that does row-wise stacking. It is called row_stack and, for 1D arrays, it just stacks the arrays in rows into a 2D array.
In: row_stack((oned, twiceoned))
Out:
array([[0, 1],
[0, 2]])
For 2D arrays, the row_stack function results are, you guessed it, exactly equal to the vstack function results.
In: row_stack((a, b))
Out:
array([[ 0, 1, 2],
[ 3, 4, 5],
[ 6, 7, 8],
[ 0, 2, 4],
[ 6, 8, 10],
[12, 14, 16]])
In: row_stack((a,b)) == vstack((a, b))
Out:
array([[ True, True, True],
[ True, True, True],
[ True, True, True],
[ True, True, True],
[ True, True, True],
[ True, True, True]], dtype=bool)
What just happened?
We stacked arrays horizontally, depth-wise, or vertically. We used the vstack , dstack ,hstack , column_stack , row_stack , and concatenate functions.
Splitting
Arrays can be split vertically, horizontally, or depth-wise. The functions involved are hsplit , vsplit , dsplit , and split . We can either split into arrays of the same shape or indicate the position after which the split should occur.
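Splitting at chosen positions is mentioned above but not demonstrated in the examples that follow; a quick sketch in the same session style, passing a list of indices instead of a section count:

In: split(arange(6), [2, 5])
Out: [array([0, 1]), array([2, 3, 4]), array([5])]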
Time for action — splitting arrays
Horizontal splitting: The ensuing code splits an array along its horizontal axis into three pieces of the same size and shape. This is shown as follows:
In: a
Out:
array([[0, 1, 2],
[3, 4, 5],
[6, 7, 8]])
In: hsplit(a, 3)
Out:
[array([[0],
[3],
[6]]),
array([[1],
[4],
[7]]),
array([[2],
[5],
[8]])]
Compare it with a call of the split function, with the extra parameter axis=1:
In: split(a, 3, axis=1)
Out:
[array([[0],
[3],
[6]]),
array([[1],
[4],
[7]]),
array([[2],
[5],
[8]])]
2. Vertical splitting: vsplit splits along the vertical axis:
In: vsplit(a, 3)
Out: [array([[0, 1, 2]]), array([[3, 4, 5]]), array([[6, 7, 8]])]
The split function, with axis=0 , also splits along the vertical axis:
In: split(a, 3, axis=0)
Out: [array([[0, 1, 2]]), array([[3, 4, 5]]), array([[6, 7, 8]])]
3. Depth-wise splitting: The dsplit function, unsurprisingly, splits depth-wise. We will need an array of rank 3 first:
In: c = arange(27).reshape(3, 3, 3)
In: c
Out:
array([[[ 0, 1, 2],
[ 3, 4, 5],
[ 6, 7, 8]],
[[ 9, 10, 11],
[12, 13, 14],
[15, 16, 17]],
[[18, 19, 20],
[21, 22, 23],
[24, 25, 26]]])
In: dsplit(c, 3)
Out:
[array([[[ 0],
[ 3],
[ 6]],
[[ 9],
[12],
[15]],
[[18],
[21],
[24]]]),
array([[[ 1],
[ 4],
[ 7]],
[[10],
[13],
[16]],
[[19],
[22],
[25]]]),
array([[[ 2],
[ 5],
[ 8]],
[[11],
[14],
[17]],
[[20],
[23],
[26]]])]
What just happened?
We split arrays using the hsplit , vsplit , dsplit , and split functions.
Array attributes
Besides the shape and dtype attributes, ndarray has a number of other attributes, as shown in the following list:
ndim gives the number of dimensions:
In: b = arange(24).reshape(2,12)
In: b
Out:
array([[ 0,  1,  2,  3,  4,  5,  6,  7,  8,  9, 10, 11],
[12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23]])
In: b.ndim
Out: 2
2. size contains the number of elements. This is shown as follows:
In: b.size
Out: 24
3. itemsize gives the number of bytes for each element in the array:
In: b.itemsize
Out: 8
4. If you want the total number of bytes the array requires, you can have a look at nbytes . This is just the product of the itemsize and size attributes (here, 24 elements of 8 bytes each, giving 192 bytes):
In: b.nbytes
Out: 192
Related is the T attribute, which gives the transpose of the array:
In: b.resize(6,4)
In: b
Out:
array([[ 0,  1,  2,  3],
[ 4,  5,  6,  7],
[ 8,  9, 10, 11],
[12, 13, 14, 15],
[16, 17, 18, 19],
[20, 21, 22, 23]])
In: b.T
Out:
array([[ 0,  4,  8, 12, 16, 20],
[ 1,  5,  9, 13, 17, 21],
[ 2,  6, 10, 14, 18, 22],
[ 3,  7, 11, 15, 19, 23]])
5. If the array has a rank lower than 2, the T attribute will just give us a view of the array:
In: b = arange(5)
In: b.ndim
Out: 1
In: b.T
Out: array([0, 1, 2, 3, 4])
Complex numbers in NumPy are written with a j suffix. For example, we can create an array with complex numbers:
In: b = array([1.j + 1, 2.j + 3])
In: b
Out: array([ 1.+1.j, 3.+2.j])
6. The real attribute gives us the real part of the array, or the array itself if it only contains real numbers:
In: b.real
Out: array([ 1.,3.])
7. The imag attribute contains the imaginary part of the array:
In: b.imag
Out: array([ 1.,2.])
8. If the array contains complex numbers, then the data type is automatically also complex:
In: b.dtype
Out: dtype('complex128')
In: b.dtype.str
Out: '<c16'
9. The flat attribute returns a numpy.flatiter object. This is the only way to acquire a flatiter — we do not have access to a flatiter constructor. The flat iterator enables us to loop through an array as if it is a flat array, as shown next:
In: b = arange(4).reshape(2,2)
In: b
Out:
array([[0, 1],
[2, 3]])
In: f = b.flat
In: f
Out: <numpy.flatiter object at 0x103013e00>
In: for item in f: print item
.....:
0
1
2
3
It is possible to directly get an element with the flatiter object:
In: b.flat[2]
Out: 2
or multiple elements:
In: b.flat[[1,3]]
Out: array([1, 3])
The flat attribute is settable. Setting the value of the flat attribute leads to overwriting the values of the whole array:
In: b.flat = 7
In: b
Out:
array([[7, 7],
[7, 7]])
or selected elements:
In: b.flat[[1,3]] = 1
In: b
Out:
array([[7, 1],
[7, 1]])
Time for action — converting arrays
We can convert a NumPy array to a Python list with the tolist function. This is shown as follows:
Convert to a list:
In: b
Out: array([ 1.+1.j, 3.+2.j])
In: b.tolist()
Out: [(1+1j), (3+2j)]
2. astype function: The astype function converts the array to an array of the specified type:
In: b
Out: array([ 1.+1.j, 3.+2.j])
In: b.astype(int)
/usr/local/bin/ipython:1: ComplexWarning: Casting complex values
to real discards the imaginary part
#!/usr/bin/python
Out: array([1, 3])
We are losing the imaginary part when casting from complex type to int. The astype function also accepts the name of a type as a string.
In: b.astype('complex')
Out: array([ 1.+1.j, 3.+2.j])
It won’t show any warning this time, because we used the proper data type.
What just happened?
We converted NumPy arrays to a list and to arrays of different data types.
Summary
We learned a lot: the shape of an array can be manipulated in many ways — stacking, resizing, reshaping, and splitting. | https://medium.com/python-in-plain-english/numpy-stacking-splitting-array-attributes-b3ad04b47646 | ['Bhanu Soni'] | 2020-12-23 08:48:14.495000+00:00 | ['Numpy', 'Python', 'Machine Learning', 'Data Science', 'Programming'] |
4,167 | The B2B Marketplace Stack | When people think about marketplaces, they usually assume it’s all about matching the demand and supply side. The reality is that it involves so much more than that, particularly when it comes to B2B.
In the following post, which we’ve written (in collaboration with our friends at Hokodo) off the back of working with several B2B marketplaces and interviewing many more, we’ll try to unpack the building blocks of these businesses. Hopefully, it will be useful to entrepreneurs that are in the process of building a B2B marketplace.
The B2B marketplace stack usually consists of the following 4 functions:
1. Curating the suppliers
2. Facilitating the transaction
3. Supporting the fulfilment of the orders
4. Providing value-added services
Before I dive into it, it’s worth noting that not every B2B marketplace offers every component included in this diagram. A service marketplace, for instance, will most likely not need to offer logistics or leveraged purchasing. Many of these functions are also key to B2C marketplaces and not exclusive to B2B.
1. Curating the suppliers
1.1 Credentialing
Ever ordered something online only to realise it’s some cheap knock-off and you’ve been massively ripped off? Well, that’s where credentialing comes into play, by ensuring the trustworthiness of suppliers on the platform.
Why it matters: Whilst it might sound basic, credentialing is paramount in B2B transactions where buyers might in some cases be taking a significant business risk in trying out a new supplier and need to be certain that all parties on the platform can deliver to a certain standard.
Who does it well: Metalshub*, a marketplace for trading metals, only allows suppliers onto its platform that meet certain compliance requirements and continuously checks that they have the relevant and up to date quality certificates. This not only builds trust but also saves purchasing departments from having to run their usual Total Quality Management (TQM) procedures, which in turn encourages them to keep using the platform.
1.2 Cataloguing and Searchability
This is all about making it as easy as possible for a buyer to find exactly what he or she is looking for in as few clicks as possible.
Why it matters: Unlike B2C customers that might enjoy scrolling endlessly to find their dream purchase, most B2B customers are strapped for time, and speed of transaction is vital.
Who does it well: Rekki*, a marketplace which connects restaurants to their suppliers, has developed a translation engine that allows chefs to search for inventory using different abbreviations and kitchen slang, significantly speeding up the ordering process. ManoMano, a marketplace for construction materials, allows its busy construction workers to order using voice enabling them to purchase on the go.
1.3 Leverage Purchasing
Once marketplaces reach a certain scale they can use their market power to secure better prices for their buyers, because who doesn’t love a good discount? This is especially true for marketplaces that are able to pool multiple small orders from buyers into a single large order.
Why it matters: When going up against the status quo, you ideally want to build something which is both 10x better and 10x cheaper than what’s out there already, particularly when going after business buyers who tend to be price sensitive. Guaranteeing customers competitive prices solves one part of the 10x equation.
Who does it well: Shippo, a logistics marketplace, pools demand for shipping services amongst small businesses and, in turn, is able to get up to 60% discounts with carriers such as UPS and FedEx, amongst others. Similarly, Famitoo, a marketplace for agricultural supplies, enables small farmers that were traditionally ripped off by large suppliers to purchase at similar rates to much larger farmers.
2. Organising the transaction
2.1 Matchmaking and price discovery
Connecting demand and supply and helping them transact at the right price is at the core of any marketplace. As I’ve written about in a previous blog, the matching between the demand and supply sides can be done in three different ways depending on marketplace dynamics: double commit (both buyers and sellers opt in), buyer-pick (sellers input their availability and buyers select a supplier) and marketplace-picks (a buyer is automatically matched with a seller).
Why it matters: Matchmaking and price discovery can be particularly hard to crack in B2B marketplaces, where in some cases you might have complex RFP- or bidding-based transactions and, in others, established buyer-supplier relationships leading to a reluctance to try out new suppliers.
Who does it well: This is table stakes for marketplaces and there are many ways of doing it well depending on which matching style you opt for. Laserhub, a marketplace for custom metal sheets, takes a marketplace-picks approach. They abstract away the identity of the supplier and standardise pricing so that the buyer always feels like he or she is transacting with a single party (Laserhub), removing the pressure of having to choose which supplier they should interact with and figuring out what should be the best price. Others, like Rekki, accept the fact that established relationships are a key part of the restaurant/supplier industry and focus on facilitating the connection between these parties, before pushing them to match with new ones.
2.2 Payment
Sounds simple, but it’s far easier said than done.
Why it matters: Enabling payments on your marketplaces is one of the key ways to reduce the risk of leakage. That being said, when it comes to B2B transactions it is a real challenge. Large transaction sizes often mean that credit card payments online are not an option. On top of that, buyers expect to be offered payment on credit terms (e.g. net 30 days). As a result, many B2B marketplaces need to give their customers the option to pay via invoice and manage the related collection process which tends to be complex. Due to the longer payment times, they often need to find ways of helping manage the credit risk and liquidity strain for suppliers that are not paid out immediately.
Who does it well: Marketplaces such as Rigup, Faire and Ankorstore make it part of their core proposition to grant 30 to 90 days of credit to (eligible) buyers whilst also allowing their suppliers to get paid right after the order. In doing so, they take on the credit risk of a buyer not paying in time in exchange for greater supplier loyalty. They also provide buyers with the flexibility to pay via invoice. Hokodo, who we collaborated with on this blog, is one of the key providers of these solutions — if payments are a pain, check them out.
2.3 Transaction Admin
This refers to any tasks which need to be done once an order is placed, from confirming the availability of the goods in stock and sending an invoice to the buyer, to organising the last compliance checks (if applicable) and orchestrating the various ancillary services (logistics, cargo and credit insurance, financing, etc.).
Why it matters: Let’s face it, nobody likes admin… Done well, this can become a unique selling point for the platform and can even drive suppliers to bring their whole portfolio of buyers onto the platform.
Who does it well: Many good marketplaces integrate with their suppliers ERP systems, which reduces the need to constantly update stocks. Privateaser, a marketplace for events providers, consolidates the invoices from multiple different suppliers into a single invoice — as if the buyers only had one supplier — significantly reducing the admin work which comes with managing multiple invoices.
3. Supporting the fulfilment
3.1 Shipping & logistics
This includes warehousing, packaging, customs handling, inspection services, delivery and returns processing. Many B2B goods marketplaces add this as a feature on top of their platform.
Why it matters: Taking on these additional functions allows marketplaces to entrench themselves in the supply chain of their users, reducing risk of disintermediation and justifying higher take-rates. Several marketplaces are also well positioned to negotiate better shipping rates than a small buyer would be able to.
Who does it well: Amazon’s fulfilment platform is a prime example of this in B2C, with many sellers outsourcing their entire post-sales operation to Amazon, even for goods which are not necessarily sold through Amazon. Another example is Ankorstore, which offers free shipping for orders over €300, even if the transaction is actually made up of several orders sourced from various suppliers. This saves costs for buyers and acts as an incentive for them to move their existing suppliers onto the Ankorstore marketplace.
3.2 After-sales
This refers to all of the support offered by a marketplace following the provision of a good or services.
Why it matters: Offering robust after-sales is critical to increasing customers’ satisfaction and stickiness. On top of this, it generates a virtuous cycle of positive customer reviews (reduces the risk of negative reviews), which in turn builds trust and reputation for the platform and attracts future customers.
Who does it well: ManoMano, a construction marketplace, offers a “Garantie Béton” (Concrete Guarantee) which goes above and beyond the industry standard by compensating customers (on their own books) for failed or late deliveries, damaged items or returns that have not yet been refunded. Faire, the B2B wholesale marketplace, is another great example. On top of net 60-day payment terms and bulk shipping, Faire offers free returns on unsold inventory, encouraging retailers to order more and test new products without having to take on inventory risk, you can read more about this here.
3.3 Dispute resolution
Once marketplaces reach a certain scale, disputes inevitably arise. In some cases, this might be because a buyer goes insolvent and can’t pay, goods were damaged, or there was an operational mistake. In other cases, it could be due to fraudulent actors, e.g. buyers that pretend the goods haven’t arrived or never had the intention of paying for the goods.
Why it matters: The economics of a marketplace can be upset by a very small percentage of dysfunctioning or fraudulent participants. For marketplaces with low margins, a single loss caused by a chargeback could require several additional transactions to be recouped. Unresolved disputes also have a big negative impact on NPS and, in many cases, result in churn.
Who does it well: Hectare, a livestock and agricultural marketplace, introduced an escrow payment facility whereby buyers pay funds into an escrow account prior to delivery, reducing the likelihood of a dispute. They also introduced credit insurance, using Hokodo, to soften the blow in cases of non-payment.
4. Providing value-added services
4.1 Data & Analytics
As they scale, B2B marketplaces accumulate huge amounts of data, which can be repackaged to enable better transparency across marketplace participants or sold to drive additional value.
Why it matters: By opening up access to data on prices, best-selling SKUs and industry dynamics, marketplaces can help their participants make better business decisions and provide them with an additional incentive to keep using the platform. In certain cases, data can even act as an additional revenue stream.
Who does it well: AdQuick, a marketplace that allows buyers to book out-of-home (OOH) advertising, views data as a key part of their value proposition. By gathering individual data points across their marketplace and integrating with various data sources e.g. mobile phones, AdQuick can provide advertisers with accurate attribution analytics, enabling them to measure the effectiveness of an outdoor campaign similarly to how the ROI of online campaigns is measured. This is something which was previously not possible.
Metalshub*, a trading platform for metals and ferroalloys, has leveraged the data they have accumulated on their marketplace to launch the first price indices for certain types of metals. This will not only be a huge differentiator given the opaque market they are operating in, but moving forward will also be a key revenue driver for them.
JOOR is a B2B marketplace in the fashion sector that connects more than 8,000 brands (sellers) with retailers. One of their main value propositions, aside from connecting the demand and supply sides, is their data exchange, which provides brands with a real-time view of the latest transactions, allowing them to spot emerging market trends and identify the best-selling styles so as to adjust their offering accordingly.
4.2 Industry-specific tools
Most B2B marketplaces these days offer some form of embedded software that goes beyond the pure matching of demand and supply. They are SaaS-enabled.
Why it matters: As I’ve written about in my Primer on B2B marketplaces, due to a combination of complex workflows, large AOVs and established buyer-supplier relationships in B2B transactions, it tends to be much harder for B2B marketplaces to capture the transaction on their platform compared to B2C marketplaces. As a result, they often need to build workflow tools to either streamline the complexity or get users more comfortable transacting large volumes online.
Who does it well: Faire, the wholesale marketplace, offers a whole suite of tools from invoice management, advance payments and a chat solution which helps suppliers streamline their ordering processes.
Privateaser, a marketplace that brings together event organisers with a community of vetted suppliers, built a booking system for its suppliers, similar to what OpenTable offers restaurants.
Lantum, a marketplace connecting healthcare organisations (clinics, GP practices) with temporary healthcare staff, built a platform which allowed healthcare organisations not only to find and book external staff but also to manage their internal staff. In parallel, Lantum provides software to freelance doctors to manage their admin and taxes and find new work opportunities.
That’s all folks!
Hopefully the above gives you a good view of some of the key components which make up B2B marketplaces. As mentioned previously, not all of these components will be relevant for all B2B marketplaces. The importance of each of the building blocks depends very much on the industry you are operating in and the market dynamics. Certain elements, such as supporting the fulfilment of an order, might be more relevant for goods marketplaces, whilst others, like vetting and credentialing, might be even more crucial for services marketplaces (e.g. healthcare staff) where suppliers might be relatively unknown.
If you have any feedback on the above stack we would love to hear from you. This post was written in collaboration with Hokodo, one of the leading providers of credit management solutions for B2B marketplaces, make sure to check them out once you get the chance :)
*P9 portfolio companies
Don’t miss out on any future P9 content by signing up to our ICYMI: newsletter! | https://medium.com/point-nine-news/the-b2b-marketplace-stack-fa5b650f09b0 | ['Julia Morrongiello'] | 2020-12-08 13:29:54.074000+00:00 | ['Startup', 'B2B', 'Marketplace', 'VC'] |
4,168 | This Book Made Me Feel Hopeful in a Way the Quran Never Did | This book is relevant to my experiences as a woman.
When I first held Feminist Theory: From Margin to Center by bell hooks in my hands, I did what I always do.
I cracked open the copy and smelled it. I love the scent of crisp pages.
Then, I sat down and began reading. Frankly, I couldn’t wait to start consuming the content. I have a habit of skipping the acknowledgments and the preface of most books I read, but with this one, I wanted to learn everything the author had to say.
The first few pages were so poignant and meaningful that I stopped to reflect. It was already an emotional experience, and I hadn’t even made much headway.
I could already see myself re-reading passages, filling the margins with annotations, and highlighting ideas that stood out to me. I would take ample notes in the process of gaining more knowledge. This was a book I could interact with.
It held promise and hope.
I saw my religious education as a tedious process that I would one day escape.
I had an active Muslim upbringing.
For most of my life, religious teachers and family members tried to instill the same excitement in me for the Quran. As a child, I went to an Islamic school, or madrasa, in addition to regular school. I was also home-schooled in the Quran. I was sent to Sunday school to study with a scholar. I even attended a Muslim summer camp.
I was deeply entrenched in the religion.
None of it worked. Instead of developing a passion for Islamic principles, I saw my religious education as a tedious process that I would one day escape. The readings were not relevant to me.
Frankly, as I get older, I am angered by how much of my valuable time was wasted by people trying to indoctrinate me.
One of the first indicators to me that Islam (and generally organized religion for that matter) was incompatible with my life was the way in which it brazenly supported the oppression of women. This oppression is implemented not just by the extremists toting weapons on news channels but also by people who would be called moderate Muslims.
The Quran does not speak to women. It speaks to men about what to do with women.
Unlike the Quran, Feminist Theory would give me tools to think about the world and my place in it. This book would not invalidate my experiences of oppression. Instead, it would help me better understand them.
I knew that after I finished reading it, I would be better equipped to challenge the thoughts that work against me and other women.
I felt my heart fill to the brim.
I have been starving for the autonomy of thought, but this too requires learning.
Of course, I am afraid to write this article.
Speaking out against organized religion and specifically doing that as a woman who disagrees with the messages of Islam takes a lot of courage.
I have been hesitant to write about my fraught relationship with Islam for four main reasons.
1. I was concerned about being ostracized by my family for my beliefs. I have begun to find my peace with that.
2. I am concerned about my actual physical safety because being a vocal religious dissenter is dangerous.
3. I do not want my work to be convoluted and misused by right-wing extremists to justify their hatred and bigotry.
4. I know that self-declared liberal White people become deeply uncomfortable when someone tells them that their “progressive” beliefs are uninformed.
It is fascinating to me that my anxiety arises in part because of the individuals who perform a self-gratifying form of acceptance without concerning themselves about the details of what it is they are accepting. | https://medium.com/an-amygdala/this-book-made-me-feel-hopeful-in-a-way-the-quran-never-did-9af3ef1e668d | ['Rebeca Ansar'] | 2020-06-29 00:37:41.308000+00:00 | ['Personal Growth', 'Women', 'Feminism', 'Self', 'Books'] |
4,169 | 5 Odd Jobs People Were Once Paid to Do | 5 Odd Jobs People Were Once Paid to Do
#4 Dog Whipping
Unemployed men in line at the soup kitchen (1931), from the US National Archives and Records Administration, Public Domain via Wikimedia Commons
Technology and artificial intelligence have been rendering more and more jobs obsolete. A volatile mix of automation and lockdowns has driven unemployment rates into double digits in many parts of the world. Last April alone, there were 23.1 million jobless Americans.
All of these factor into the many anxieties a regular laborer has to deal with day-to-day. Apart from a stagnant wage, they also wake up to uncertainty.
Still, the extinction of jobs isn’t something new. Sometimes, the innovations and circumstances that lead to obsolescence are welcomed by both clients and workers alike. That’s because some of these jobs were downright awful.
Here are five jobs people once did that have since been made obsolete by the passage of time.
1. Human Garden Ornaments
Human Garden Ornament (1795), By Johann Baptist Theobald Schmitt, Public Domain via Wikimedia Commons
Quarantine has made a lot of people discover the joys of landscaping and backyard gardening. Among the many garden decor items that have been sold due to this trend is the classic garden gnome. The precursor to the garden gnome was a garden hermit, and being paid to be an ornament in a rich person’s garden was a real job in the 18th century.
The worker was hired to look like a hermit — long nails, untidy hair, and a disheveled beard were all required by his employer. Some of them were even prevented from cleaning themselves to give a more “authentic hermit” appearance. When visitors arrived in the garden, these hermits read them poetry or lines from popular books as a form of entertainment.
These hermits-for-hire were not allowed to leave the garden until the end of their contract period. This often lasted for months and sometimes years, with failure resulting in forfeiture of pay.
With all these strict rules, it was common for garden hermits to quit their job midway, forcing nobles to replace them with a variation of the garden gnome we have today.
2. Human Alarm Clocks
Knocker Upper at work (1947), Public Domain via Wikimedia Commons
One thing we take for granted is the system of “date and time.” We don’t think about calendars, clocks, and schedules as innovative because most of us were born into the system — we can organize our days through our phones accordingly. But a unified system of time and date is a fairly recent invention in history, and so is the annoying alarm clock that comes with it.
The job of an alarm clock is to wake people up, and in 19th century Britain, that was done by a person called the “Knocker Upper.”
The human alarm clocks would use a long pole made of bamboo to reach the windows of their clients. With a small wire tied to its end, they would then tap on the window until their customers arose for work.
People hired knocker uppers on a subscription basis because they couldn’t afford expensive alarm clocks, a luxury we now carry for free on our phones. The practice was so ingrained in factory-work customs that the role persisted even up to the 1970s in some parts of Britain.
3. Poop Farmers
Toilet in Rosenborg Castle Copenhagen, by Zymurgy, CC BY-SA 4.0 via Wikimedia Commons
Another thing we rarely think about, and for understandable reasons, is how exactly our poop gets disposed of. There is a complex system that makes sure most of the modern world is free from foul odors and the diseases that come with them. But what is now largely automatic used to be a manual job.
Known commonly as gong farmers (“gong” comes from an Old English word meaning “to go”), these men worked in groups to save the town from a build-up of human waste. The man down in the pit, tasked with scooping poop into a bucket, was known as the “hole man.” The waste would then be passed on to the “rope men,” tasked with pulling the heavy pile out of the pit. Lastly, the “tub man” carted the waste away and disposed of it outside the town.
When more intricate systems of plumbing were developed, the need for gong farmers slowly waned. I’m on the fence about whether or not they were happy about that.
4. Dog Whippers
Dog Whipper Statue in the Netherlands, Public Domain via Wikimedia Commons
“Dog whipper” was an official title given by a church to men who controlled the behavior of bothersome animals during holy ceremonies. As the name implies, they literally whipped misbehaving dogs during a service.
At that time, churches had no effective way of regulating the entry of animals, so they did things manually. These early iterations of animal control officers were equipped with three-foot-long whips and a pair of tongs to catch and get rid of noisy, and often fornicating, cats and dogs.
They had a second role too. Dog whippers also sometimes forcefully poked dozing and sleeping mass goers awake!
5. Court Dwarfs
Portrait of the court dwarf Sebastián de Morra (1645), by Diego Velázquez, Public Domain via Wikimedia Commons
Court dwarfs have been recorded in the histories of Egypt, Rome, and China. They were often given as gifts to different ruling families and sometimes traded as property. In early modern Europe, court dwarfs gained a little more esteem, earning a wage and sometimes doubling as diplomats.
Most court dwarfs were given a ceremonial job in royal courts. They were placed beside the king or queen during public gatherings in order to make the royals look more powerful in comparison to their short stature. Other dwarfs also played the “natural fool,” often complementing a jester.
As the influence of royal families decreased, so did the employment of court dwarfs. Historians point to the reign of Charles XII of Sweden as the point when the practice stopped completely. | https://medium.com/history-of-yesterday/5-odd-jobs-people-were-once-paid-to-do-aa47194194fe | ['Ben Kageyama'] | 2020-12-25 09:01:09.792000+00:00 | ['History', 'Work', 'Nonfiction', 'Labor', 'Jobs'] |
4,170 | For many universities, the landscape is changing. | For many universities, the landscape is changing. Gone are the days when the throughput of students was the major measure of success. Now universities are evaluated on the outcomes for their students, and the extent to which those students have the right skills to join the workforce.
Perhaps the most challenging shift is that of expectations among students themselves. As digital natives, they live with information and opportunities at their fingertips. When they want to research something, book something or share an insight, they jump online looking for high quality and frictionless experience from their university.
Meeting these expectations is more complex than simply digitising course content. Instead, it must involve shaping all the factors, both online and offline, that influence whether students achieve their goals. Many factors are within the university’s control, such as campus culture, spatial design and course content, while other factors are harder to influence, such as students feeling lonely or trying to succeed at university with a physical or mental health condition.
Human-centred design can be a game-changer for universities
One factor determines the success of these efforts to shape the student experience: design, specifically human-centred design (HCD).
HCD involves gaining deep insight into the needs of all users of a system and using those insights as a base for ideas or solutions. These solutions are tested, and users provide feedback for further refinement before solutions are implemented. Ongoing user feedback and data drive continuous improvement; the design process never ends.
What we mean by design, inspired by Margaret Hagan, who specialises in HCD in legal services.
A traditional approach to rethinking student support services might have focussed on reorganising the student support team, recruiting to new roles or changing the performance metrics of the team.
These interventions may not be helpful if they are not implemented with students’ experience of the service front and centre. By starting with the student experience, a university can consider all the opportunities that might exist to support students. The university might even anticipate some risk factors and offer the right support mechanisms before crisis hits.
Putting HCD into practice
Nous Group has worked with a range of universities and other organisations to drive outcomes using HCD. From our experience, when thinking about system (re)design, the hardest part can be getting started. But breaking the process down to stepping stones can help create a clear path forward.
Universities are rich with data, which can be a blessing and a curse. As the results of research prompt further research questions, it is easy to fall into a research infinity loop and never actually take any action. This is understandable, given the human instinct to want to know more, but is not helpful. Instead, customer experience designers need to pursue progress rather than perfection. This involves doing enough research to be confident in the insights to take a first step. There is no right way to start, so be brave and start somewhere.
Customer experience designers should apply the same thoughtfulness to designing a project as to designing solutions. Every phase of the task — from research, to workshop, to a governance group conversation — is a chance to engage people in developing a shared language and understanding of the problem and can unite people in taking collective action later.
Genuine involvement from people across the university — including students — helps to break down barriers, develop understanding and bust assumptions, all of which makes embedding change simpler.
Involving students in design requires a different dynamic from the one that traditionally exists in university-student relationships. It needs to be more equitable, so that designers and students tap into each other’s skills and experiences to come up with answers that will work for everyone. This may mean accommodating differences in suitable hours, spaces, communications technology and timeframes.
To build engagement with people from across the university, find a small group willing to start using a new service, system or way of working. Then monitor their progress to create a buzz and build from there.
For many universities, this will be daunting, so having the right governance and mandate is important. Be creative about how engagement is done and always be sure to close the feedback loop. Let those you have engaged with know how their contributions have influenced the work.
To make a change sustainable it needs to go beyond changing the behaviour of individual users and instead needs to alter the organisation, culture or system. Start with short-term fixes to generate evidence of progress, celebrating successes and learning as you go, but always keep the longer-term goal in mind.
Finding the right blend of technical and design expertise can be difficult. Some universities have built strong linkages between services based on goodwill and communication, but now face the challenge of sustaining widespread commitment amid competing priorities. Other universities have developed effective IT systems to connect functions, but struggle to win commitment from users because students’ experience is poorly understood.
Part of the solution is to be deliberate about who you have in the project team so that you have the right expertise and all members are active contributors. Many customer experience designers find great success in creating a dedicated space where colleagues and customers can be immersed in the experience of using design-led approaches. This needs to be visual and show rather than tell.
Teams need to explore problems from multiple perspectives
Like a stone dropped in placid waters, the ripples caused by bad design can extend to the furthest reaches of a university’s performance.
It is the role of customer experience designers to locate the pain points in a process, and then do something about them. This requires using robust research techniques to diagnose the issues well and strong evaluation techniques to measure the impact of progress to inform continuous improvement and to demonstrate the value created by the work.
Good design requires a vast array of skills, from strategy to qualitative and quantitative research, advanced data analytics, knowledge of existing and emerging technology, and quality evaluation techniques. So more than ever we need to collaborate and create teams that can explore problems from multiple perspectives.
User expectations are not staying still, so design solutions cannot either. | https://medium.com/swlh/how-human-centred-design-can-help-universities-better-serve-students-e1d561a9f89c | ['Kirsty Elderton'] | 2019-06-08 11:33:06.307000+00:00 | ['Higher Education', 'Human Centred Design', 'University', 'Design', 'Student Experience'] |
4,171 | The Future of Podcasting is Subscription — Lessons from the History of Media | “History doesn’t repeat itself, but it often rhymes”
On April 24, 2019, Luminary Media officially launched its subscription podcasting platform to the public, and was quickly dismissed and even attacked by some in the media and on Twitter. Luminary, which raised nearly $100 million from investors prior to launch, offers a free podcast player, along with a catalog of 40+ exclusive ad-free original podcasts locked behind a $7.99 monthly subscription. To some, introducing a new business model was an affront to the relatively new medium, as demonstrated in the Fast Company article titled “Why podcast fans will always reject a “Netflix for podcasts,” in which the author deftly states “First, it’s annoying.” What these critics fail to understand is that this story has been told before — and in almost every case, the quality of content has increased, the consumer experience has improved, and creators have been more appropriately compensated for their talent.
Advertising, which to date has been the primary revenue channel for podcasts (bringing in a paltry $314 million in 2017), also initially supported nearly every new media format in its formative years. This was true for newspapers, radio, television, and early digital video platforms. Historically, it was unclear (and unlikely) that consumers would pay directly for new types of content enabled by new technologies and means of distribution, leaving sponsorships & advertising as the only viable way to monetize these mediums. To drive up the value of an advertisement, media companies needed wide distribution to drive circulation (which is why you may still get The Yellow Pages delivered to your home every year…). Wide distribution = more consumers = more advertising revenue. In this model, in order to invest more capital into quality content, content producers must either (A) reach a wider audience, or (B) insert more ads into the content.
However, as more consumers adopt the new distribution channels (radios, television sets, smartphones) and incorporate the content into their daily lives, things evolve. Often, new consumer propositions emerge, promising a better experience or higher quality content for a premium price.
Radio had been primarily free and ad-supported since its inception in the early 20th century, until Satellite radio (a whole new and expensive distribution system) launched in 2001. Subscription satellite radio still had a slow start without any standout content, until it secured the exclusive distribution rights for The Howard Stern Show in 2006, putting the previously free program behind its subscription paywall, and landing over 180,000 subscribers overnight. Today, Sirius XM has about 33 million subscribers.
Television programming was also born as sponsored content broadcast for free over the airwaves (ABC, CBS, and NBC), before pay-cable channels like Home Box Office (HBO) began to emerge in the 1970’s. HBO launched by transmitting popular films straight into subscribers homes, before evolving into producing HBO Original Films and eventually, HBO Original Series such as The Sopranos, The Wire, and Game of Thrones. This iconic content, which draws 140 million+ global subscribers to HBO, would not be possible on a purely advertising-supported channel.
While the podcast industry is still in its infancy, it has seen tremendous growth in both consumption and production volume over the past decade. In 2008, less than 10% of the U.S. population listened to podcasts on a monthly basis, while nearly 1/3rd of the country does today — about 90 million monthly listeners. Further, there are an estimated 700,000 podcasts and 29 million podcast episodes as of April 2019. Surely, some portion of those 90 million monthly U.S. listeners would be willing to pay for a premium ad-free podcast experience in which creators have the resources to innovate and create high-quality content.
The objection that some are raising in regards to Luminary’s paywall is that just because you can charge for content doesn’t mean that you should. The ignorance in this protest is that the pressure to create content that people are willing to pay for often drives up the quality of said content and the resources that producers can devote to it. The expense and investment in subscription based HBO’s Game of Thrones dwarfs the budget of any show on (primarily) ad-supported CBS. HBO viewers are willing to pay a monthly fee for access to this premium content, which in turn allows the network and talent to invest more heavily in quality content.
Further, talent is the backbone of any creative industry and deserves to be well compensated for the value it creates. While some podcast creators are able to reach a sizeable enough audience to court advertisers, others must turn to platforms like Patreon to ask listeners to donate on a recurring basis in order to fund their favorite shows. Even A-list talent and seasoned podcast professionals with hundreds of thousands of fans must rely on inserting advertisements for Casper into their content, which still leaves little room for investment in high production values, long-term projects, or dabbling with innovative formats. Subscription business models allow upfront investment directly in talent and content — such as Netflix’s astonishing overall deals for television creators Ryan Murphy ($300M) and Shonda Rhimes ($100M), who left Twentieth Century Fox TV and ABC Studios respectively, studios which were primarily focused on advertising-supported broadcast television and couldn’t afford to compete with Netflix’s offers.
We also know from other media platforms that consumers are generally pretty amenable to pay higher fees for ad-free experiences. Spotify and Hulu, for instance, both have ~50% of users paying for ad-free tiers of the same content.
Spotify has made its podcasting ambitions clear, spending $400 million to acquire Gimlet Media (a podcast studio with a significant catalog), Parcast (another podcast studio) and Anchor (tools for podcast creators). Spotify’s interest in podcasts is less about creating better content, a better listener experience, or rewarding creators — but instead about making the company’s basic economics work. Despite having 100 million paying subscribers and 217 million total monthly active users, the company’s deals with music publishers means it is still unprofitable due to the share of subscription and advertising revenue that Spotify must send to publishers based on users’ listening habits. If Spotify can get users to listen to more podcasts and less music, it can shift some of that revenue to its own pocket.
For its part, Luminary has the advantage of having a maniacal focus on a very specific type of content — just as Netflix has had and maintained for the better part of a decade. Despite its success with on-demand video, Netflix has resisted the urge to move into sports, news, live TV, gaming / eSports, or ad-supported content — which has provided the clarity and focus to build a dominant media company and beloved consumer brand in record time. Luminary has the opportunity to execute a similar playbook for the podcasting community.
To be clear — there will always be free, ad-supported podcasts in the world, just as there will always be terrestrial radio and broadcast television. Although some are upset at the potential disruption (and disaggregation) that Luminary will likely ignite in the podcast community, both creators and listeners are likely to be beneficiaries: creators will have the opportunity to experiment and invest in high-quality content they want to create and be more fairly compensated for their talent, while consumers will benefit from a better podcasting experience, with ad-free, high-quality content. This is a story that has been told before — and the podcast community should be excited. | https://medium.com/the-raabithole/the-future-of-podcasting-is-subscription-lessons-from-the-history-of-media-d486bd693141 | ['Mike Raab'] | 2019-06-08 04:57:45.250000+00:00 | ['Business', 'Podcast', 'Media', 'Future', 'Culture'] |
4,172 | How the Data Stole Christmas | by Anonymous
The door sprang open and our three little ones joyfully leaped onto the bed, waking my wife and me from peaceful dreams of slumbering sugarplum fairies. Oh gosh, it was 6:03 am, but who could blame them? The kids wait for Christmas day all year.
I sat up as giggles thrashed with reckless excitement on the bed. My feet found slippers and I lumbered down the hall in the direction of the coffee maker. I could hear the tearing of wrapping paper and whoops of joy coming from the living-room tree.
This scene was playing out in countless ways in households all across America. Christmas is a time for family, friends and a relaxing respite from the busy work calendar. Unfortunately, it was all about to come to an abrupt end as my cell phone interrupted our family time with an impatient buzz. “Who could be calling me today?”
I should have powered it off and tossed the damn thing into a snowbank, but as the manager of a data team in a global e-commerce company, I was used to taking calls at odd times. I pressed talk and reluctantly held the phone to my ear.
It was my co-worker. All hell was breaking loose. The back-end for the e-commerce site had crashed. It was all hands on deck to get it back online. Our VP wanted hourly status reports — the next one in 47 minutes. I jumped into some clothes, grabbed a coat and mumbled to my disbelieving wife that I would be back in a couple of hours. To make a long story short, I actually didn’t make it home until 2 am the next morning. Yes, we got the site back up, but I missed Christmas that year.
Lessons Learned
The most tragic part of this story is that it didn’t have to be this way. We had talented people on our team, but our approach to data operations in those days was based on a flawed methodology:
· We had one instance of the production environment. When the data team needed to make a change, such as updating a schema, we did so directly on the live operational system.
· Making changes was so fraught with risk that we instituted heavyweight procedures to control it. This slowed innovation to a crawl, and even with all the triple-checking, outages still occurred.
· We tested new changes to the best of our abilities, but since our development systems used a simplified tools environment, we would encounter unexpected errors when moving code into the more complex production environment.
· Our test data was perfect whereas production data is notoriously messy. Production data always threw unexpected anomalies at our code.
Managing a continuing succession of outages while trying to keep development projects on schedule and under budget is like trying to play whack-a-mole while simultaneously reading a book. Yet our approach in those days was mainly based on hope and heroism. We would release code and “hope” it didn’t break anything. When there was an outage, we would call in the technical experts (the “heroes”) to work around-the-clock to fix the problem. Looking back, this was no way to run a major operation. It’s not really a surprise that the head of our department got replaced about every two years. Executive management needed to pin the responsibility somewhere.
DataOps — Lessons Applied
It is tempting but simple-minded to blame outages on people. A robust business process eliminates errors and improves efficiency despite the fact that error-prone humans are involved.
When software companies (Netflix, Facebook, …) started executing millions of high-quality code releases per year, it offered an opportunity for data organizations to renew their approach to development and operations. The methodologies used by these software engineering organizations — Agile development, DevOps and lean manufacturing — apply equally well to analytics creation and operations. The data industry refers to this initiative as DataOps.
The fastest way to institute DataOps methods, even on top of a legacy toolchain and technical environment, is by aligning and automating workflows using a DataOps Platform. A DataOps Platform offers these capabilities:
· Minimizes cycle time — A DataOps Platform aligns production and development environments using virtualization and orchestrates the DataOps testing, qualification and release of new analytics code with a button push. Continuous deployment of new code is how software companies produce such a high volume of releases per year.
· Eliminates errors — The functional, unit and regression testing of code enables new analytics to be deployed with confidence that it will work as promised in operations. In addition, the data that flow through operations is tested and subject to controls at every step of the operations pipeline. Data errors are trapped and remediated before they corrupt charts and graphs.
· Fosters collaboration — DataOps integrates version control and workflow tools like Jira. The DataOps Platform enables team members to share analytics components, encouraging reuse and improving productivity. Geographically dispersed teams can use their own choice of toolchains while fitting into higher-level orchestrations.
The code and data quality supported by DataOps minimizes the unplanned work that can disrupt a data engineer’s weekend or holiday. With a DataOps Platform, enterprises can move away from relying on hope and heroism. Continuous deployment fully tests and deploys new analytics eliminating time-consuming and error-prone manual steps. Tests and statistical controls ensure that data is error-free before it flows into models and analytics.
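To make the testing idea concrete, below is a minimal sketch of the kind of automated data test a DataOps pipeline might run on each incoming batch. Everything here is illustrative: the column names, the baseline constant, and the thresholds are hypothetical, not taken from any particular platform.

    import pandas as pd

    EXPECTED_DAILY_ROWS = 4  # hypothetical baseline learned from past batches

    def check_orders_batch(df: pd.DataFrame) -> list:
        """Return the list of data tests this batch fails."""
        failures = []
        if df["order_id"].duplicated().any():
            failures.append("duplicate order_id values")
        if (df["amount"] <= 0).any():
            failures.append("non-positive order amounts")
        # Simple statistical control: flag batches far below recent norms.
        if len(df) < 0.5 * EXPECTED_DAILY_ROWS:
            failures.append("row count below half the expected volume")
        return failures

    batch = pd.DataFrame({"order_id": [1, 2, 2], "amount": [9.99, 0.0, 24.50]})
    print(check_orders_batch(batch))
    # ['duplicate order_id values', 'non-positive order amounts']

In a real pipeline, a failed check would quarantine the batch and alert the team before the bad data ever reaches a chart, which is exactly what keeps the pager quiet on Christmas morning.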
Have a DataOps Holiday Season
With DataOps in place, tests monitor the data flowing through operational systems 24x7x365. While data scientists are home sipping eggnog, DataOps works overtime to keep operational systems up and running. I may have missed a Christmas celebration with my family that one time, but with DataOps, never again. Happy holidays.
For more information about how a DataOps Platform can compress your new analytics cycle time and eliminate data errors, please give us a shout at datakitchen.io. | https://medium.com/data-ops/how-the-data-stole-christmas-78454531d0a8 | [] | 2019-12-24 13:22:33.143000+00:00 | ['Data Science', 'Big Data', 'Dataops', 'Analytics', 'DevOps'] |
4,173 | Revisiting Imperial College’s COVID-19 Spread Models | How to run open-source Tensorflow models on Kubernetes, and a review of how effective the COVID-19 spread model was in measuring the effect of interventions.
Photo by Brian McGowan on Unsplash
Earlier this month, the United Kingdom became the first European country to approve and administer the first doses of Pfizer/BioNTech’s COVID-19 vaccine. The United States quickly followed suit, with the FDA and CDC recently recommending Moderna’s vaccine as well as Pfizer’s, giving the world a glimmer of hope. Other international players, notably China and Russia, are also pushing to approve and produce their own vaccines. Even as COVID-19 continues to rage on, this news of vaccines signals a hopeful end in sight.
To that end, I wanted to revisit a study from the Imperial College COVID-19 Response Team, “Estimating the number of infections and the impact of non-pharmaceutical interventions on COVID-19 in 11 European countries”, published in March. The study used a semi-mechanistic Bayesian hierarchical model to estimate the impact of non-pharmaceutical interventions such as isolation, the closing of public spaces (e.g. schools, churches, sports arenas), as well as widescale social distancing measures.
The Tensorflow implementation used in the paper is open-sourced under the MIT License and available at Tensorflow.org and Github.
Code Setup
While Google provides a free, hosted Jupyter notebook service through Google Colab, I wanted to run the analysis on Kubernetes to practice running data science and machine learning projects on Kubernetes as well as to compare the developer experience for both. To replicate the managed notebook experience of Google Colab, I looked for a similar Kubernetes experience without needing to stand up a cluster myself. At the same time, I wanted some control over my Kubernetes environment and not a fully managed data science platform like the Google AI Platform.
I eventually found puzl.ee, a Kubernetes service provider with GPU support that charges per pod usage. Puzl.ee creates a unique namespace for my workloads and charges for resource usage similar to serverless Kubernetes offerings such as Google’s Cloud Run or AWS Fargate. The number of packaged applications are currently limited (Gitlab CI Runner, SSH Server, and Jupyter Notebook), but support for H2O.ai, PostgreSQL, Determined AI, Redis, Jupyter Hub, Drone CI, MongoDB, and Airflow is on the roadmap. Fortunately, puzl.ee had already published a quick start guide for setting up a Jupyter Notebook with GPU, so I provisioned my Jupyter Notebook after signing up for a free account.
I was given various options for predefined Docker images (as well as the option to use my own). I could also adjust my resource requests, including an NVIDIA RTX 2080Ti GPU, before installing my Jupyter Notebook.
Within a minute or so, my Jupyter Notebook with Tensorflow was installed. Afterward, I realized that the code required a Tensorflow 2+ image. Even with the reinstallation, the provisioning process was smooth and painless.
Unfortunately, I ran into several issues trying to run Tensorflow out of the box. I was surprised to see Pandas not included in the standard distribution of the provided Docker image, and I encountered several errors such as undefined symbol: _ZN10tensorflow8OpKernel11TraceStringEPNS_15OpKernelContextEb (typically a sign of mismatched Tensorflow binary versions). After a few unsuccessful solutions from StackOverflow, I decided to compile my own Docker image with the necessary Tensorflow components installed, which allowed me to move on. However, this highlighted a huge challenge in AIOps: version control and software compatibility are still hard, making these semi-hosted platforms less effective, since they require some DevOps work to untangle and fix dependencies.
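In hindsight, a quick sanity check at the top of the notebook would have surfaced the version problem immediately. The snippet below is generic (nothing puzl.ee-specific) and simply fails fast if the image shipped the wrong Tensorflow:

    import tensorflow as tf
    import tensorflow_probability as tfp  # this import fails first if TFP is absent

    # The model requires TF 2.x; bail out before any heavy work otherwise.
    assert tf.__version__.startswith("2."), f"Need TF 2.x, got {tf.__version__}"
    print("TF:", tf.__version__, "| TFP:", tfp.__version__,
          "| GPUs:", tf.config.list_physical_devices("GPU"))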
Model Setup & Data Preprocessing
After correctly installing all the necessary Python modules, I was able to follow the code to set up the model and load the data. The entire setup is posted on Github, but I’ll summarize the important sections below:
Mechanistic model for infections and deaths
The infection model takes in the timing and type of interventions, population size, and initial cases per country, along with the effectiveness of interventions and the rate of disease transmission as parameters, to simulate the number of infections in each European country over time. The model produces two key intermediate quantities: conv_serial_interval (the convolution of previous daily infections with the distribution over the time between becoming infected and infecting others) and conv_fatality_rate (the convolution of previous daily infections with the distribution over the time between infection and death).
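To ground those two quantities, here is a simplified numpy sketch of the renewal-equation step that conv_serial_interval feeds: expected new infections today are R_t times a weighted sum of past infections, with weights given by the serial-interval distribution. The numbers below are illustrative placeholders, not the paper’s fitted Gamma.

    import numpy as np

    # Discretized serial-interval distribution (illustrative, sums to 1).
    si = np.array([0.05, 0.15, 0.22, 0.20, 0.15, 0.10, 0.06, 0.04, 0.02, 0.01])

    def expected_new_infections(past_infections, r_t):
        """Renewal equation: E[I_t] = R_t * sum_s I_s * w_(t-s)."""
        t = len(past_infections)
        weights = si[:t][::-1]  # yesterday pairs with w_1, two days ago with w_2, ...
        return r_t * np.dot(past_infections, weights)

    past = np.array([10., 12., 15., 20., 26., 33., 41., 50., 60., 72.])
    print(expected_new_infections(past, r_t=1.3))  # expected infections today

conv_fatality_rate works the same way, replacing the serial interval with the infection-to-death distribution and R_t with the infection fatality rate.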
Parameters & Probabilities
Parameter values, which are assumed to be independent, include an exponential distribution over initial cases per country, a negative binomial distribution for the number of deaths, the infection rate per infected person, the effectiveness of each intervention type, and the noise in the infection fatality rate. Given these parameters, the likelihood of the observed deaths is calculated along with the probability of death given infection. Finally, the serial interval is assumed to be Gamma distributed, which is what turns conv_serial_interval into predict_infections.
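As a rough illustration of how those priors look in code, the tensorflow_probability sketch below mirrors the structure described above. Every hyperparameter here is a placeholder chosen for readability; consult the open-sourced notebook for the actual values.

    import tensorflow_probability as tfp
    tfd = tfp.distributions

    # Illustrative stand-ins for the model's priors (placeholder values).
    priors = {
        "initial_cases": tfd.Exponential(rate=0.03),        # seed infections per country
        "intervention_effect": tfd.Gamma(concentration=0.17, rate=1.0),
        "ifr_noise": tfd.Normal(loc=1.0, scale=0.1),        # noise on the fatality rate
    }
    samples = {name: dist.sample() for name, dist in priors.items()}

    # Observed deaths are modeled as negative binomial around the expected count.
    deaths_dist = tfd.NegativeBinomial(total_count=10.0, probs=0.5)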
Key Assumptions
The study aims to model the effectiveness of intervention measures by looking at infection and fatality rates. Here the model assumes that the decline in the number of COVID cases is a direct response to interventions rather than to gradual changes in behavior. The study also assumes that interventions have the same effect across all of the selected European countries, without accounting for each country’s size, population density, or the average age of its citizens, which we now know has a huge impact on fatality rates.
Replicating the Results
The dataset includes interventions enforced and infection/fatality rates from 11 European countries (Austria, Belgium, Denmark, France, Germany, Italy, Norway, Spain, Sweden, Switzerland, and the UK). After applying the Tensorflow model as described in the Github post, I was able to replicate the effectiveness-of-interventions graph as well as infections/deaths by country:
Effectiveness of interventions (same as Figure 4 on the original paper): shows no effects are significantly different since most measures when in effect around the same time
Infections, deaths, and R_t by country (same as Figure 2 on the original paper)
Comparing the Model with Actual Data
The paper estimated that various intervention measures were successful in curbing the rate of infections in Europe, with the caveat that, given the long incubation period of COVID-19 and the lag between transmission and mortality, the data collected at the time may have been too early to conclude effectiveness in certain countries where the pandemic was in its nascent phases.
Looking back, we now know that the initial measures were somewhat successful in slowing down the rate of infection in Europe. It took drastic measures such as country-wide lockdowns and enforcing large-scale social distancing guidelines, but the data shows a downward trend in infection rates until a recent uptick in cases.
Perhaps a more telling graph is a comparison of the results for Europe as a whole vs. the United States, where intervention measures were rolled out in a less coordinated manner:
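If you want to reproduce a comparison along these lines yourself, a few lines of pandas and matplotlib against a public dataset will do. The snippet below assumes Our World in Data’s CSV schema (their URL and column names, not anything from the paper):

    import pandas as pd
    import matplotlib.pyplot as plt

    url = "https://covid.ourworldindata.org/data/owid-covid-data.csv"
    df = pd.read_csv(url, parse_dates=["date"])

    # Aggregate all European countries vs. the United States.
    europe = df[df["continent"] == "Europe"].groupby("date")["new_cases_smoothed"].sum()
    usa = df[df["location"] == "United States"].set_index("date")["new_cases_smoothed"]

    europe.plot(label="Europe")
    usa.plot(label="United States")
    plt.ylabel("New cases (7-day smoothed)")
    plt.legend()
    plt.show()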
Now that we have more data, it would be interesting to see a follow-up study to include other parameters to reflect cultural or political factors that made intervention measures more or less successful across the globe.
You can create your own visualization using the link below:
Colab vs. Puzl.ee
Coming back to the data science side of things, this exercise of running the Tensorflow model reminded me of the challenges we still face before making Kubernetes the go-to platform for data science and machine learning tasks. As a free tool, Google Colab makes it easy to clone open-source notebooks and run experiments without any infrastructure setup. With Google Colab Pro, which is priced at $9.99/month, most workloads can be run in a managed manner without too many restrictions.
However, the initial provisioning process on puzl.ee was surprisingly smooth. As the team works on putting together more predefined Docker images, I expect some of the installation and configuration challenges I faced to diminish. I also liked the option of running my own Kubernetes pod to potentially extend the experiment to add other microservices to either fetch/post data or integrate it with other databases within the same Kubernetes cluster. Once puzl.ee adds native support for popular Helm charts via the dashboard, similar to how it provides pre-made Docker images, I plan to take a look again for some of my side projects. | https://medium.com/dev-genius/revisiting-imperial-colleges-covid-19-spread-models-daa7ac1a7862 | ['Yitaek Hwang'] | 2020-12-24 16:27:29.356000+00:00 | ['Data Science', 'Kubernetes', 'Machine Learning', 'Programming'] |
4,174 | Unlock honest feedback from your employees with this one word | Unlock honest feedback from your employees with this one word
Consider using this one word in your next one on one meeting…
A few years ago, a CEO told me how she was struggling to get honest feedback from her board.
No one seemed willing to be critical or give her pointers on things she could improve. After every board meeting, she would turn to them and ask directly:
“What feedback does anyone have for me?”
She’d hear crickets. Every single time.
No one would speak up. Even though they were board members — people who are supposed to hold her accountable as the CEO of the company — they shied away from offering their honest input.
This was so perplexing to the CEO. She felt like she was being very clear with what she wanted… Why weren’t they just giving her the feedback she was asking for?
One day, she decided to try something different.
Instead of asking, “What feedback does anyone have for me?”… she asked this:
“What advice does anyone have for me?”
All of a sudden, everyone started weighing in. “Well, I might try this…” and “The way you brought up this point could’ve been better…” and “You could try structuring the meeting like this…”
The word “advice” unlocked all the honest feedback that CEO needed.
Why? The word “feedback” carries a lot of baggage. Some people automatically associate it with a “critique” or something negative. It can seem scary and formal.
But “advice” is a much more welcoming word. Advice is about lending someone a hand. When someone gives you advice, they’re just looking out for you.
And when you ask for advice, it’s an invitation. You’re signaling that another person has expertise or knowledge that you find interesting and valuable. That person is often flattered you even asked for advice in the first place.
Who doesn’t love to give advice? :-)
The next time you’d like to get honest feedback, try asking for advice instead. Notice how much more people open up to you. See how swapping that one word makes a difference. | https://medium.com/signal-v-noise/unlock-honest-feedback-with-this-one-word-dcaf3839e7ee | ['Claire Lew'] | 2018-12-04 15:15:54.492000+00:00 | ['Leadership', 'Startup', 'Employee Engagement', 'Employee Feedback', 'Management'] | Title Unlock honest feedback employee one wordContent Unlock honest feedback employee one word Consider using one word next one one meeting… year ago CEO told struggling get honest feedback board one seemed willing critical give pointer thing could improve every board meeting would turn ask directly “What feedback anyone me” She’d hear cricket Every single time one would speak Even though board member — people supposed hold accountable CEO company — shied away offering honest input perplexing CEO felt like clear wanted… weren’t giving feedback asking One day decided try something different Instead asking “What feedback anyone me”… asked “What advice anyone me” sudden everyone started weighing “Well might try this…” “The way brought point could’ve better…” “You could try structuring meeting like this…” word “advice” unlocked honest feedback CEO needed word “feedback’” carry lot baggage automatically associate “critique” something negative seem scary formal “advice” much welcoming word Advice lending someone hand someone give advice they’re looking ask advice it’s invitation You’re signaling another person expertise knowledge find interesting valuable person often flattered even asked advice first place doesn’t love give advice next time you’d like get honest feedback try asking advice instead Notice much people open See swapping one word make differenceTags Leadership Startup Employee Engagement Employee Feedback Management |
4,175 | How we initiate point of sale transactions globally | The challenge
Most POS setups include a cash register, controlled by store staff; a payment terminal, where the shopper enters their card; and a serial connection between the two. A library is embedded in the cash register, facilitating communication between the cash register and the payment terminal. These libraries are typically created and maintained by the company that provides the terminals (such as Adyen).
Using libraries creates a number of challenges:
A tight integration between the cash register and the library means a significant amount of setup and development work is required, because the library will be part of the cash register software.
The cash register software — which is third party — is often updated as infrequently as once a year, meaning retailers are not able to immediately benefit from the latest library updates.
Cash registers differ significantly between vendors and platforms, creating a large maintenance burden on the development of the library for Adyen.
Data centers
Furthermore, many larger retailers prefer centrally-hosted solutions for their cash register software. This means the software needs to be configured to initiate a transaction on a payment terminal in the store, by routing requests from the data center into the store network. To do this, merchants need to use port forwarding to manage payments across multiple locations, a fixed IP for each terminal, or possibly a VPN setup for security. All these possibilities involve a complex network setup that drains operational resources.
Solving the library challenge with the Nexo protocol
Ideally, we needed a solution that would be independent of any specific platform, able to be used for serial connections, local network, and internet transports, and support a message format with advanced features such as asynchronous notifications.
To meet these criteria, we removed our need for libraries and created the Terminal API, adopting the Nexo protocol — a card payment standard that facilitates communication between the cash register and terminal.
Nexo’s basic interaction model is request/response JSON messaging. This means that making a payment with the Terminal API is a simple request-response, and all informational events, such as notifying where the terminal is in the payment process, are communicated via JSON webhooks that are optionally implemented.
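For illustration, this is roughly what initiating a payment over that model could look like from the cash register side. The payload is heavily simplified and the endpoint is hypothetical; a real nexo message carries a full MessageHeader, sale and transaction identifiers, and more:

import json
import urllib.request

# Simplified, illustrative nexo-style payment request (not the full schema).
payment_request = {
    "SaleToPOIRequest": {
        "MessageHeader": {"MessageCategory": "Payment", "SaleID": "register-1"},
        "PaymentRequest": {"AmountsReq": {"Currency": "EUR", "RequestedAmount": 10.99}},
    }
}

request = urllib.request.Request(
    "https://terminal.example.local/nexo",  # hypothetical in-store terminal endpoint
    data=json.dumps(payment_request).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(request) as response:
    outcome = json.load(response)  # synchronous response carrying the payment result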
Using this approach is advantageous because:
Supporting new programming languages is simpler, as the library required all potential events to be implemented as a callback and passed as part of the initial payment request.
Maintaining a JSON messaging format, rather than custom libraries, callbacks, and SDKs, makes it far easier for merchants to roll out and update the software.
Internally, this had the added benefit of us not needing to support multiple programming languages for the API.
Solving network setup complexity by routing through the cloud
Using the Terminal API over the store network was a great first step. However, it did not solve the challenge of initiating payments from a centralized place such as a data center.
To simplify the setup investment and remove the cost of all this complexity, we also adapted our Terminal API for the cloud. The in-store architecture relied on the merchant’s cash register and backend to communicate to the terminal, as below:
In the cloud version of the API, we added the ability for the merchant to initiate a terminal payment directly with Adyen’s backend.
Incorporating WebSockets
One advantage of serial connections is that they provide bidirectional communication, so both cash register and payment terminals can initiate communication and exchange data related to the status of the transaction. With our Terminal API over the network, transactions are https request-response. The cash register initiates a payment request by sending an https request to the terminal.
However, on the internet, having a communication channel where both parties can initiate communication is cumbersome, as the NATed terminals cannot be reached without opening the firewall and setting up port forwarding. We needed a solution to easily enable bidirectional communication.
We found this solution in WebSockets. This technology is used by a number of platforms for push notifications, such as in newsfeeds, and we leveraged it for communication between a terminal and the Adyen backend.
To enable bidirectional communication, we create a single https request from the payment terminal and add headers requesting an upgrade of the connection to a WebSocket, as sketched below. After that, a bidirectional communication channel is established between our backend and the payment terminal.
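As a rough sketch of that handshake from the terminal side (the URL is hypothetical, and the real client is terminal firmware rather than Python), using the third-party websockets library, which sends the Upgrade: websocket and Connection: Upgrade headers for you:

import asyncio
import websockets  # third-party: pip install websockets

async def keep_channel_open():
    # One outbound HTTPS request upgrades to a WebSocket; after the
    # handshake, both sides can initiate messages at any time.
    async with websockets.connect("wss://backend.example.com/terminal-channel") as channel:
        while True:
            incoming = await channel.recv()               # backend pushes a request
            await channel.send('{"status": "received"}')  # terminal replies on the same socket

asyncio.run(keep_channel_open())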
A standard flow is as follows:
1. As the cash register initiates a transaction, it sends an https request to the Adyen backend.
2. The Adyen backend looks up which WebSocket the terminal is using and routes the request to the terminal over it (more on this below under load balancing).
3. The terminal delivers its response to the Adyen backend over the WebSocket, and the backend subsequently delivers it as an https response to the cash register.
Load balancing and redundancy
Redundancy is a key consideration in our system architecture. During application updates, or when carrying out maintenance, transactions cannot be affected. Our payment acceptance layer is made up of multiple servers over multiple data centers around the world. This helps reduce latency and ensure redundancy. (Note: you can read more about our approach to redundancy and database setup here: Updating a 50 terabyte PostgreSQL database).
This infrastructure ensures redundancy and the ability to balance loads if we need to carry out maintenance. However, it does raise a new challenge: when a terminal opens a connection with Server A, and a cash register with Server B, what happens? We configured our setup so that if a terminal connects to Server A, a message is triggered from that server to other servers that says “I now have a connection with this terminal.” If a cash register then connects to Server B, Server B can look up which server owns the WebSocket connection, and route the message via that server.
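In toy form, the ownership lookup could be as simple as a replicated map from terminal to server; the names below are illustrative, not Adyen’s actual implementation:

# Toy sketch: in production this registry is replicated across servers
# rather than held in a single process.
socket_owner = {}  # terminal_id -> server_id

def register_connection(terminal_id, server_id):
    # Triggered when a terminal opens its WebSocket with some server.
    socket_owner[terminal_id] = server_id

def route_request(terminal_id):
    # A server receiving a cash-register request looks up which peer
    # owns the terminal's WebSocket and routes the message via it.
    owner = socket_owner.get(terminal_id)
    if owner is None:
        raise LookupError("terminal %s is not connected" % terminal_id)
    return owner

register_connection("terminal-42", "server-a")
assert route_request("terminal-42") == "server-a"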
If we need to carry out application updates or maintenance on one server, it sends a message to the terminal to reconnect with another server. Once all connections are closed we can begin.
Conclusion
Our Terminal API simplifies rollout and ensures merchants are able to stay abreast of the latest software updates. However, there are more innovative ways in which it may be used. For example, since in-store payments can be initiated remotely, merchants would be able to create an experience where a shopper initiates an order in-app, walks into a store, scans a QR code with their phone to initiate the payment on the terminal, and picks up their item. These kinds of possibilities make it very exciting for us to see how merchants use this technology.
For more information on our Terminal API, you can see our documentation: Terminal API Integration and a blog post on the commercial benefits: Introducing the Terminal API. | https://medium.com/adyen/how-we-initiate-point-of-sale-transactions-globally-7fad4786db16 | [] | 2020-07-08 05:16:38.864000+00:00 | ['Java', 'Retail Technology', 'Cloud', 'Payments', 'Point Of Sale Software'] | Title initiate point sale transaction globallyContent challenge POS setup include cash register controlled store staff payment terminal shopper enters card serial connection two library embedded cash register facilitating communication cash register payment terminal library typically created maintained company facilitates terminal Adyen Using library creates number challenge tight integration cash register library mean significant amount setup development work required library part cash register software cash register software — third party — often updated infrequently year meaning retailer able immediately benefit latest library update Cash register differ significantly vendor platform creating large maintenance burden development library Adyen Data center Furthermore many larger retailer prefer centrallyhosted solution cash register software mean software need configured initiate transaction payment terminal store routing request data center store network merchant need use port forwarding manage payment across multiple location fixed IP terminal possibly VPN setup security possibility involve complex network setup drain operational resource Solving library challenge Nexo protocol Ideally needed solution would independent specific platform able used serial connection local network internet transport support message format advanced feature asynchronous notification meet criterion removed need library created Terminal API adopting Nexo protocol — card payment standard facilitates communication cash register terminal Nexo’s basic interaction model requestresponse JSON messaging mean making payment Terminal API simple requestresponse informational event notifying terminal payment process communicated via JSON webhooks optionally implemented Using approach advantageous Supporting new programming language simpler library required potential event implemented callback passed part initial payment request Maintaining JSON messaging format rather custom library callback SDKs make far easier merchant roll update software Internally added benefit u needing support multiple programming language API Solving network setup complexity routing cloud Using Terminal API store network great first step However solve challenge initiating payment centralized place data center simplify setup investment remove cost complexity also adapted Terminal API cloud instore architecture relied merchant’s cash register backend communicate terminal cloud version API added ability merchant initiate terminal payment directly Adyen’s backend Incorporating WebSockets One advantage serial connection provide bidirectional communication cash register payment terminal initiate communication exchange data related status transaction Terminal API network transaction http requestresponse cash register initiate payment request sending http request terminal However internet communication channel party initiate communication cumbersome NATed terminal cannot reached without opening firewall setting port forwarding needed solution easily enable bidirectional communication found solution WebSockets technology used number platform push notification newsfeeds 
leveraged communication terminal Adyen backend enable bidirectional communication create single http request payment terminal added header request upgrade connection WebSocket displayed bidirectional communication channel established backend payment terminal standard flow follows cash register initiate transaction sends http request Adyen backend Adyen backend look WebSocket terminal using route request terminal load balancing terminal delivers response Adyen backend WebSocket backend subsequently delivers http response cash register Load balancing redundancy Redundancy key consideration system architecture application update carrying maintenance transaction cannot affected payment acceptance layer made multiple server multiple data center around world help reduce latency ensure redundancy Note read approach redundancy database setup Updating 50 terabyte PostgreSQL database infrastructure ensures redundancy possibility balance load need carry maintenance However raise new challenge — terminal open connection Server cash register server B happens configured setup terminal connects Server message triggered server server say “I connection terminal” cash register connects Server B Server B look server owns WebSocket connection route message via server need carry application update maintenance one server sends message terminal reconnect another server connection closed begin Conclusion Terminal API simplifies rollout ensures merchant able stay abreast latest software update However innovative way may used example since instore payment initiated remotely merchant would able create experience shopper initiate order inapp walk store scan QR code phone initiate payment terminal pick item kind possibility make exciting u see merchant use technology information Terminal API see documentation Terminal API Integration blog post commercial benefit Introducing Terminal APITags Java Retail Technology Cloud Payments Point Sale Software |
4,176 | Django REST API | Last week we talked about creating basic applications with Django. Today let’s try to design a RESTful API with Django.
Prerequisites
Before we start, let’s install the additional libraries which will help us design the API:

pip install djangorestframework
pip install markdown
pip install django-filter
Project Setup
Now let’s make sure that all the moving parts of the REST framework libraries are in place.

In the project folder we have the file settings.py. In it there is a list named INSTALLED_APPS. Add one more element, ‘rest_framework’, to this list. We also need to add this dictionary to the same file:
REST_FRAMEWORK = {
    'DEFAULT_PAGINATION_CLASS': 'rest_framework.pagination.PageNumberPagination',
    'PAGE_SIZE': 10
}
In the previous post we didn’t sync with the database. We can do it by running migrations:
python manage.py migrate
Also let’s create superuser:
python manage.py createsuperuser --email [email protected] --username admin
Now that our initial database and admin user are ready, we can start developing the API.
Serializers
Serialization is the process of turning data into a readable format. In our case we will use a serializer from the Django REST framework library. Let’s cd to the app directory and create the file serializers.py. In this file we will import the User model, import the serializer base class, and filter the data we are going to send out. This is what the serializer looks like:
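The embedded code block did not survive in this copy, so here is a minimal sketch of such a serializer (the exposed fields are an assumption; adjust them to the data you want to send out):

from django.contrib.auth.models import User
from rest_framework import serializers

class UserSerializer(serializers.HyperlinkedModelSerializer):
    class Meta:
        # Only these fields are serialized and sent out; everything else is filtered away.
        model = User
        fields = ['url', 'username', 'email', 'is_staff']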
Views
Next we will need to create a view where the user serializer will be used. Here we will query the database and sort the records:
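Again, the original snippet is missing here; a sketch consistent with the description (the ordering field is an assumption):

from django.contrib.auth.models import User
from rest_framework import viewsets
from .serializers import UserSerializer

class UserViewSet(viewsets.ModelViewSet):
    # Query the database and sort the records, newest users first.
    queryset = User.objects.all().order_by('-date_joined')
    serializer_class = UserSerializer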
You can see that we are not declaring individual views. Instead we use viewsets. We group common behavior into classes.
Urls
And finally we need to write some routes for our API. Let’s cd to the project (not app) directory and open urls.py. Using viewsets allows us to automatically generate the URL config for each registered viewset with a router class:
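A sketch of what that urls.py could look like (the app name in the import is a placeholder):

from django.urls import include, path
from rest_framework import routers
from app.views import UserViewSet  # replace "app" with your app's name

router = routers.DefaultRouter()
router.register(r'users', UserViewSet)  # generates the /users/ and /users/:id routes

urlpatterns = [
    path('', include(router.urls)),
]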
And our brand new API is ready to use.
API CRUD
Let’s run the server and see what we get:
python manage.py runserver
If we open localhost, we will see all the declared routes: in our case only one, because only the users route has been created.
And if we click on the users route we will see serialized data from the User table.
That is the Read operation; what about the rest of them? Easy! At the bottom of the page we can find an interface for sending POST requests, which will create a new record in the user table. After creating a new user, we can follow RESTful convention and go to the URL users/:id (in our case that id will be 2) to see all the details about this specific user, along with interfaces for the Update and Delete actions.
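If you prefer to exercise the same operations outside the browsable interface, here is a small standard-library sketch (the port and field values are assumptions, and the default open permissions are assumed):

import json
import urllib.request

base = "http://127.0.0.1:8000/users/"

# Create: POST a new record into the user table.
create = urllib.request.Request(
    base,
    data=json.dumps({"username": "reader", "email": "[email protected]"}).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
urllib.request.urlopen(create)

# Read: GET the detail view for the user with id 2.
with urllib.request.urlopen(base + "2/") as response:
    print(json.load(response))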
CRUD is complete!
Conclusion
Django is a very powerful framework and with additional libraries it allows us to create fully functioning APIs very fast.
Keep learning, keep growing!
Let’s connect on LinkedIn! | https://medium.com/datadriveninvestor/django-rest-api-1ab821e40733 | ['Pavel Ilin'] | 2020-11-10 09:14:50.636000+00:00 | ['Python', 'Django', 'Rest Api', 'Rest', 'Framework'] | Title Django REST APIContent Last week talked creating basic application Django Today let’s try design RESTful API Django Prerequisites start let’s install additional library help u design API pip install djangorestframework pip install markdown pip install djangofilter Project Setup let’s make sure moving part REST framework library place project folder file settingspy array INSTALLEDAPPS Add one element ‘restframework’ array Also need add dictionary file RESTFRAMEWORK ‘DEFAULTPAGINATIONCLASS’ ‘restframeworkpaginationPageNumberPagination’ ‘PAGESIZE’ 10 previous post didn’t sync database running migration python managepy migrate Also let’s create superuser python managepy createsuperuser — email adminexamplecom — username admin initial database admin user ready start developing API Serializers Serialization process turning data readable format case use serializer django rest framework library Let’s cd app directory create file serializerspy file import User model import serializer create filter data going send serializer look like Views Next need create view user serializer used query database sort record see declaring individual view Instead use viewsets group common behavior class Urls finally need write route API Let’s cd project app directory open urlspy Using viewsets allows automatically generating URL config registered viewset router class brand new API ready use API CRUD Let’s run server see got python managepy runserver open localhost declared route case one user created click user route see serialized data User table Read operation rest Easy bottom page find interface sending POST request create new record user table creating new user follow RESTful convention go URL usersid case id 2 see detail specific user interface Update Delete action CRUD complete Conclusion Django powerful framework additional library allows u create fully functioning APIs fast Keep learning keep growing Let’s connect LinkedInTags Python Django Rest Api Rest Framework |
4,177 | 6 Steps to Up the Sustainability Game for Your Business | Lessons from the most sustainably managed company in the world
Photo by Alexander Abero on Unsplash
While some business owners are still hanging onto a feast-or-famine mindset, others, as if tipped off by a prophet, have embraced a marathon runner’s attitude. Gone are the old days when Darwinism reigned in business; sustainability is the new law of the jungle.
However, unlike financial success, the ‘sustainability’ of a business is daunting to measure, to say the least. It’s like trying to predict human longevity: use all the health metrics you want, but only time will tell.
Yet The Wall Street Journal cracked the hard nut head-on. They recently published a ranking of the The 100 Most Sustainably Managed Companies in the World.
Who claims the top spot? Let me spare you the laborious effort of a click. It’s one of the largest Japanese conglomerates, with businesses in electronic products, video gaming, music and media: Sony.
Sony’s headquarter in Tokyo. Photo: Sony.net
To combat the pandemic alone, Sony set up a $100 million Sony Global Relief Fund for COVID-19 to support efforts in medicine (donating $10 million to the World Health Organization’s COVID-19 Solidarity Response Fund), education (joining UNICEF to roll out the digital learning platform “Learning Passport” in Latin America), and the creative industry (Play At Home for gamers, 500 ARTISTS WANTED for musicians, and free Sony cameras for visual artists).
All these efforts find their roots in Sony’s purpose statement revealed last year, “Fill the world with emotion, through the power of creativity and technology”, which to Sony is not a goal but a reason for existence.
Before the word “Covid-19” existed in any dictionary, Sony has been funding startups focusing on environmental technologies with plans to invest 1 billion yen ($9.46 million) over the course of three to five years and recoup in about a decade. It publicly announced its goal to achieve a “zero environmental footprint” by 2050 and use 100% renewable energy by 2040 — both are now very well on track.
If you are wondering how Sony became the most sustainably managed company in the world, here are six easily applicable steps we learned from Sony’s sustainability success.
4,178 | Fixing our Bug Problem | by Thomas Gomersall
Insects may not be what many would consider endangered species, but according to a devastating 2019 study in Biological Conservation, 41 per cent are in decline. An additional third are threatened and without immediate, radical action, most could be extinct within decades in what many have dubbed the ‘Insect Apocalypse’ (Sánchez-Bayo & Wyckhuys, 2019). This wouldn’t just be apocalyptic for insects, but for the countless animals and plants that they feed and pollinate, including our food crops. Yet in places such as Hong Kong, people continue to destroy their habitats for development and use pesticides that kill far more insects than the mosquitos they are intended for (Williams, 2019).
But if Swedish teen environmental activist Greta Thunberg proves anything, it’s that ordinary citizens can make a difference. Here are some ways Hongkongers can help protect local insects.
Citizen scientist programmes, such as City Nature Challenge, help familiarise people with insects. Photo credit: WWF-Hong Kong
Love the Bugs
Public support is integral to successful conservation. But widespread public ignorance of the importance of insects, and limited exposure to them, make insect conservation a tough sell (Tsang, 2019).
Better education programmes about insects for schools and the wider public can help address this, while citizen scientist programmes, such as City Nature Challenge, help familiarise people with insects.
“If you want to ask people to conserve insects, the most direct way is through [using] photography to make them realise that they can look very beautiful” says Toby Tsang, a post-doctoral urban ecology researcher at the University of Hong Kong. “If you can manage to somehow first promote [insect conservation] through photography, I think more people will start paying attention.”
Tropical and subtropical insects can only survive within a narrow temperature range. Photo credit: Thomas Gomersall
Tackle Climate Crisis
This is a particularly important measure for protecting Hong Kong insects, as tropical and subtropical insects can only survive within a narrow range of temperatures, making it especially hard for them to cope with rapid warming (Bonebrake & Mastrandrea, 2010), as shown by the mass bee die-offs in last summer’s heatwave (Williams, 2019) and the expected significant declines in butterfly diversity in country parks from current warming projections (Cheng & Bonebrake, 2017). Measures to reduce Hong Kong’s carbon footprint include buying more locally sourced foods*, eating less meat and using low-emission public transport** and most importantly, continuing to pressure the government to cut emissions.
Roofs of residential high rises provide plenty of space for communal rooftop gardens. Photo credit: Mathew Pryor
Bug Cities
While natural areas certainly provide better habitat overall, urban green spaces in Hong Kong (e.g. parks) are nonetheless surprisingly valuable for insect conservation. They support considerable insect diversity (including 58 of Hong Kong’s 250 butterfly species) and can help insects to move between the fragmented country parks, maintaining vital inter-population gene flow. However, habitat quality in parks is limited by frequent pesticide spraying and vegetation trimming (Tam & Bonebrake, 2016; Bonebrake, 2019).
Luckily, ordinary citizens can easily create more insect-friendly green spaces themselves. Those with conventional gardens can do this by leaving small sections of them untended. But roofs of residential high rises also provide plenty of space for communal rooftop gardens and getting building management permission to start one is usually fairly easy (Pryor, 2019). If individuals each add their own plants to the garden, this will lead to a greater plant abundance that can support more insect species (Tsang & Bonebrake, 2017), particularly if these include native species such as the Chinese ixora and rhododendrons (Burghardt et al, 2009; Tam & Bonebrake, 2016; Ng & Corlett, 2000).
Seasonally blooming plants, such as Chinese ixora are best for pollinators. Photo credit: Thomas Gomersall
In return, rooftop gardens and the insects they attract (including butterflies, bees and beetles) can bring unexpected benefits for gardeners (Pryor, 2019). For instance, a high abundance and diversity of pollinators have been linked to higher crop yields (Garibaldi et al, 2014), good news for those who like growing fruit and vegetables on the roof. Even the process of creating and tending to a rooftop garden has been found to have great psychological benefits for people.
“I do a lot of research into why people farm on the roof. [Roof gardens] produce huge amounts of happiness.” says Mathew Pryor, Head of the Division of Landscape Architecture at the University of Hong Kong. “Everybody who participates in a rooftop farm is blissfully happy. […] It’s a personal project.”
As for natural insect habitats, Hongkongers should lobby their legislators to do more to protect these areas and vote for those whose environmental policies go towards meeting such goals.
Growing flowers of varying lengths will help butterflies that specialise in feeding from long-bodied flowers. Photo credit: Thomas Gomersall
Food, Glorious Food
When creating insect-friendly green spaces, it’s also important to consider food sources, particularly to encourage butterfly breeding, as the caterpillars of many species will only feed from specific plants (Lo & Hui, 2004, p.64) ***. Seasonally blooming plants, such as Chinese ixora are best for pollinators as they grow quickly and produce lots of flowers and nectar. Growing several species that bloom at different times of the year ensures that pollinators have abundant food year-round (Pryor, 2019) while growing flowers of varying lengths will help butterflies that specialise in feeding from long-bodied flowers along with more generalist feeders (Kunte, 2007). Some butterflies will also feed from fluids produced by rotting fruit such as papaya and banana skins. (Lo & Hui, 2004, p.65; Bunker, 2019).
Pesticides are the second-biggest driver of global insect declines. Photo credit: Thomas Gomersall
Put down the Pesticide
Of course, not everyone who keeps plants may want to attract insects that could potentially strip them to the stem. But don’t reach for that bug-spray. Pesticides are the second-biggest driver of global insect declines (Sánchez-Bayo & Wyckhuys, 2019) and there are plenty of other means to keep insects away from plants without killing them.
Planting marigolds next to other plants is one environmentally friendly way to guard against insect infestation. Photo credit: Thomas Gomersall
One way is the plant-by-plant method, in which bad-smelling plants like marigolds are placed close to other plants to ‘guard’ them from insects. Another is to use non-toxic, homemade sprays like water-diluted vinegar or water boiled with garlic. If the odour generated by this method puts you off as much as the insects, a less smelly though no less effective, insect-friendly option would be to use Neem oil, which is available in most gardening shops. (Bunker, 2019).
External Links
* Locally sourced food in Hong Kong: https://medium.com/wwfhk-e/eating-sustainably-in-hong-kong-65b7f0dff961
** Advice on cutting personal carbon emissions: https://medium.com/wwfhk-e/when-the-lights-go-back-on-thoughts-on-earth-hour-96e70c5307fc
*** Specific caterpillar food plant preferences of the world’s butterflies and moths: https://www.nhm.ac.uk/our-science/data/hostplants/search/index.dsml
References
· Bonebrake, TC (PhD), interviewed by Thomas Gomersall, 2019, The University of Hong Kong.
· Bonebrake, T.C. and Mastrandrea, M.D. 2010. Tolerance adaptation and precipitation changes complicate latitudinal patterns of climate change impacts. PNAS. 091184107.
· Bunker, S, interviewed by Thomas Gomersall, 2019, World Wide Fund for Nature — Hong Kong.
· Burghardt, K.T., Tallamy, D.W. and W.G. Shriver. 2009. Impacts of native plants on bird and butterfly biodiversity in suburban landscapes. Conservation Biology, vol. 23 (1): 219pp–234pp.
· Cheng, W. and Bonebrake, T.C. 2017. Conservation effectiveness of protected areas for Hong Kong butterflies declines under climate change. Journal of Insect Conservation, vol. 21: 599pp-606pp.
· Garibaldi, L.A., Carvalheiro, L.G., Leonhardt, S.D., Aizen, M.A., Blaauw, B.R., Isaacs, R., Kuhlmann, M., Kleijn, D., Klein, A.M., Kremen, C., Morandin, L., Scheper, J. and R. Winfree. 2014. From research to action: enhancing crop yield through wild pollinators. Frontiers in Ecology and the Environment, vol. 12 (8): 439pp–447pp.
· Lo, P.Y.F. and Hui, W.L. 2004. Hong Kong Butterflies, 1st edn., Friends of the Country Parks, Hong Kong. 64pp–65pp.
· Ng, S.C. and Corlett, R.T. 2000. Comparative reproductive biology of the six species of Rhododendron (Ericaceae) in Hong Kong, South China. Canadian Journal of Botany, vol. 78 (2): 221pp–229pp.
· Pryor, M (PhD), interviewed by Thomas Gomersall, 2019, The University of Hong Kong.
· Sánchez-Bayo, F. and Wyckhuys, K.A.G. 2019. Worldwide declines of the entomofauna: A review of its drivers. Biological Conservation, vol. 232: 8pp–27pp.
· Tam, K.C. and Bonebrake, T.C. 2016. Butterfly diversity, habitat and vegetation usage in Hong Kong urban parks. Urban Ecosystems, vol. 19: 721pp–733pp.
· Tsang, TPN. (PhD), interviewed by Thomas Gomersall, 2019, The University of Hong Kong.
· Tsang, T.P.N. and Bonebrake, T.C. 2017. Contrasting roles of environmental and spatial processes for common and rare urban butterfly species compositions. Landscape Ecology, vol. 32 (1): 47pp–57pp.
· Williams, M., ‘The insect apocalypse is coming: Hong Kong moth study shows the threats and complexities’. South China Morning Post, 31 March 2019, https://www.scmp.com/magazines/post-magazine/long-reads/article/3003821/insect-apocalypse-coming-study-hong-kong-moth (Accessed: 1 April 2019) | https://medium.com/wwfhk-e/fixing-our-bug-problem-4a8041c7ebe0 | ['Wwf Hk'] | 2019-07-15 02:43:11.484000+00:00 | ['Biodiversity', 'Nature', 'Extinction', 'Insects', 'Environment'] | Title Fixing Bug ProblemContent Thomas Gomersall Insects may many would consider endangered specie according devastating 2019 study Biological Conservation 41 per cent decline additional third threatened without immediate radical action could extinct within decade many dubbed ‘Insect Apocalypse’ SánchezBayo Wyckhuys 2019 wouldn’t apocalyptic insect countless animal plant feed pollinate including food crop Yet place Hong Kong people continue destroy habitat development use pesticide kill far insect mosquito intended Williams 2019 Swedish teen environmental activist Greta Thunberg prof anything it’s ordinary citizen make difference way Hongkongers help protect local insect Citizen scientist programme City Nature Challenge help familiarise people insect Photo credit WWFHong Kong Love Bugs Public support integral successful conservation widespread public ignorance importance insect limited exposure make insect conservation tough sell Tsang 2019 Better education programme insect school wider public help address citizen scientist programme City Nature Challenge help familiarise people insect “If want ask people conserve insect direct way using photography make realise look beautiful” say Toby Tsang postdoctoral urban ecology researcher University Hong Kong “If manage somehow first promote insect conservation photography think people start paying attention” Tropical subtropical insect survive within narrow temperature range Photo credit Thomas Gomersall Tackle Climate Crisis particularly important measure protecting Hong Kong insect tropical subtropical insect survive within narrow range temperature making especially hard cope rapid warming Bonebrake Mastrandrea 2010 shown mass bee dieoffs last summer’s heatwave Williams 2019 expected significant decline butterfly diversity country park current warming projection Cheng Bonebrake 2017 Measures reduce Hong Kong’s carbon footprint include buying locally sourced food eating le meat using lowemission public transport importantly continuing pressure government cut emission Roofs residential high rise provide plenty space communal rooftop garden Photo credit Mathew Pryor Bug Cities natural area certainly provide better habitat overall urban green space Hong Kong eg park nonetheless surprisingly valuable insect conservation support considerable insect diversity including 58 Hong Kong’s 250 butterfly specie help insect move fragmented country park maintaining vital interpopulation gene flow However habitat quality park limited frequent pesticide spraying vegetation trimming Tam Bonebrake 2016 Bonebrake 2019 Luckily ordinary citizen easily create insectfriendly green space conventional garden leaving small section untended roof residential high rise also provide plenty space communal rooftop garden getting building management permission start one usually fairly easy Pryor 2019 individual add plant garden lead greater plant abundance support insect specie Tsang Bonebrake 2017 particularly include native specie Chinese ixora rhododendron Burghardt et al 2009 Tam Bonebrake 2016 Ng Corlett 
2000 Seasonally blooming plant Chinese ixora best pollinator Photo credit Thomas Gomersall return rooftop garden insect attract including butterfly bee beetle bring unexpected benefit gardener Pryor 2019 instance high abundance diversity pollinator linked higher crop yield Garibaldi et al 2014 good news like growing fruit vegetable roof Even process creating tending rooftop garden found great psychological benefit people “I lot research people farm roof Roof garden produce huge amount happiness” say Mathew Pryor Head Division Landscape Architecture University Hong Kong “Everybody participates rooftop farm blissfully happy … It’s personal project” natural insect habitat Hongkongers lobby legislator protect area vote whose environmental policy go towards meeting goal Growing flower varying length help butterfly specialise feeding longbodied flower Photo credit Thomas Gomersall Food Glorious Food creating insectfriendly green space it’s also important consider food source particularly encourage butterfly breeding caterpillar many specie feed specific plant Lo Hui 2004 p64 Seasonally blooming plant Chinese ixora best pollinator grow quickly produce lot flower nectar Growing several specie bloom different time year ensures pollinator abundant food yearround Pryor 2019 growing flower varying length help butterfly specialise feeding longbodied flower along generalist feeder Kunte 2007 butterfly also feed fluid produced rotting fruit papaya banana skin Lo Hui 2004 p65 Bunker 2019 Pesticides secondbiggest driver global insect decline Photo credit Thomas Gomersall Put Pesticide course everyone keep plant may want attract insect could potentially strip stem don’t reach bugspray Pesticides secondbiggest driver global insect decline SánchezBayo Wyckhuys 2019 plenty mean keep insect away plant without killing Planting marigold next plant one environmentally friendly way guard insect infestation Photo credit Thomas Gomersall One way plantbyplant method badsmelling plant like marigold placed close plant ‘guard’ insect Another use nontoxic homemade spray like waterdiluted vinegar water boiled garlic odour generated method put much insect le smelly though noless effective insectfriendly option would use Neem oil available gardening shop Bunker 2019 External Links Locally sourced food Hong Kong httpsmediumcomwwfhkeeatingsustainablyinhongkong65b7f0dff961 Advice cutting personal carbon emission httpsmediumcomwwfhkewhenthelightsgobackonthoughtsonearthhour96e70c5307fc Specific caterpillar food plant preference world’s butterfly moth httpswwwnhmacukoursciencedatahostplantssearchindexdsml References · Bonebrake TC PhD interviewed Thomas Gomersall 2019 University Hong Kong · Bonebrake TC Mastrandrea MD 2010 Tolerance adaptation precipitation change complicate latitudinal pattern climate change impact PNAS 091184107 · Bunker interviewed Thomas Gomersall 2019 World Wide Fund Nature — Hong Kong · Burghardt KT Tallamy DW WG Shriver 2009 Impacts native plant bird butterfly biodiversity suburban landscape Conservation Biology vol 23 1 219pp–234pp · Cheng W Bonebrake TC 2017 Conservation effectiveness protected area Hong Kong butterfly decline climate change Journal Insect Conservation vol 21 599pp606pp · Garibaldi LA Carvalheiro LG Leonhardt SD Aizen Blaauw BR Isaacs R Kuhlmann Kleijn Klein Kremen C Morandin L Scheper J R Winfree 2014 research action enhancing crop yield wild pollinator Frontiers Ecology Environment vol 12 8 439pp–447pp · Lo PYF Hui WL 2004 Hong Kong Butterflies 1st edn Friends Country Parks Hong Kong 
64pp–65pp · Ng SC Corlett RT 2000 Comparative reproductive biology six specie Rhododendron Ericaceae Hong Kong South China Canadian Journal Botany vol 78 2 221pp–229pp · Pryor PhD interviewed Thomas Gomersall 2019 University Hong Kong · SánchezBayo F Wyckhuys KAG 2019 Worldwide decline entomofauna review driver Biological Conservation vol 232 8pp–27pp · Tam KC Bonebrake TC 2016 Butterfly diversity habitat vegetation usage Hong Kong urban park Urban Ecosystems vol 19 721pp–733pp · Tsang TPN PhD interviewed Thomas Gomersall 2019 University Hong Kong · Tsang TPN Bonebrake TC 2017 Contrasting role environmental spatial process common rare urban butterfly specie composition Landscape Ecology vol 32 1 47pp–57pp · Williams ‘The insect apocalypse coming Hong Kong moth study show threat complexities’ South China Morning Post 31 March 2019 httpswwwscmpcommagazinespostmagazinelongreadsarticle3003821insectapocalypsecomingstudyhongkongmoth Accessed 1 April 2019Tags Biodiversity Nature Extinction Insects Environment |
4,179 | Netflix Is Now Worth More Than Disney — What’s Their Next Move? | BYTE/SIZE
Netflix Is Now Worth More Than Disney — What’s Their Next Move?
Four 3D chess moves that could make Netflix top dog
Created by Murto Hilali
My friends, who are 17–18-year-old males, have devoted their eyeballs to Netflix’s latest reality show: Too Hot To Handle, where attractive singles spend a month on a desert island trying not to advance the human species together (no sex):
These are guys who read self-help books and publish on Medium, guys who I admire. My point? Netflix has successfully stolen my friend group’s souls, AND I WANT THEM BACK. Slow clap, Netflix, slow clap.
Quarantine has people flocking to their screens and Joe Exotic’s mullet, and the streaming giant’s market cap is almost $187 billion now, just above Disney’s. With all this momentum, don’t be surprised if the firm starts making baller moves to grow its reach.
I got Netflix two months ago, so I’m already an expert — here are my unrequested (and probably unqualified) ideas for where Netflix could be heading next. | https://medium.com/swlh/netflix-next-move-9ea44b42150f | ['Murto Hilali'] | 2020-05-14 03:38:19.247000+00:00 | ['Finance', 'Technology', 'Marketing', 'Business', 'Data'] | Title Netflix Worth Disney — What’s Next MoveContent BYTESIZE Netflix Worth Disney — What’s Next Move Four 3D chess move could make Netflix top dog Created Murto Hilali friend 17–18yearold male devoted eyeball Netflix’s latest reality show Hot Handle attractive single spend month desert island trying advance human specie together sex guy read selfhelp book publish Medium guy admire point Netflix successfully stolen friend group’s soul WANT BACK Slow clap Netflix slow clap Quarantine people flocking screen Joe Exotic’ mullet streaming giant’s market cap almost 187 billion — Disney’s momentum don’t surprised firm start making baller move grow reach got Netflix two month ago I’m already expert — unrequested probably unqualified idea Netflix could heading nextTags Finance Technology Marketing Business Data |
4,180 | 15+ Binary Tree Coding Problems from FAANG Interview | Image by Omni Matryx from Pixabay
A Binary Tree is a hierarchical data structure in which each node has at most two children. Depending on how you store and arrange the nodes in a tree, the terminology differs (binary search trees, complete trees, balanced trees, and so on).
Hey guys, I have been sharing a lot about Tech Interview Questions asked in FAANG. I am currently working on the Tech Interview Questions asked at LinkedIn, Yahoo, and Oracle, and I have been researching a lot about these “interview problems”. When it comes to Binary Tree Problems, most of them can be solved if you have a strong foundation in certain types of problems.
This post is all about making you strong in the fundamental logic that is used to solve Binary Tree Problems.
So that when you are in your interview and you come across a Binary Tree problem, you will know which logic to use and how you could approach that problem!
Free For Kindle Readers
Whether you are preparing for your interview or are already settled in your job, keeping yourself up-to-date with the latest Interview Problems is essential for your career growth. Start your prep from here!
15+ Binary Tree Coding Problems from Programming Interviews
1. What is the Lowest Common Ancestor? How to find the Lowest Common Ancestor of Two Given Nodes? Solution
2. How to find out if a given tree is a subtree of another tree? Solution
3. How to Traverse the Binary Tree Iteratively? Solution
4. What is Breadth First Traversal? How to implement it? (see the sketch after this list) Solution
5. How to find out the Diameter of a Tree? Solution
6. How to Traverse the Tree in Zig-Zag fashion? Solution
7. What is Depth First Traversal? How to implement it? Solution
8. How to print the Right Side View of A Binary Tree? Solution
9. How to Construct BST from Preorder Traversal? Solution
10. How to find out if two given trees are mirror images of each other? Solution
11. How to find out the sum of the Deepest Leaves in a Binary Tree? Solution
12. How to Capture a Binary Tree into a 2D Array? Solution
13. How to Merge Two Binary Trees? Solution
14. How to find if a pair of Nodes in BST is equal to a target? Solution
15. How to Find the Minimum Distance Between Two Nodes in a given BST? Solution
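As a warm-up, here is a minimal sketch of breadth-first (level-order) traversal, the pattern behind problem 4 above; the Node class is a generic illustration, not tied to any particular Solution link:

from collections import deque

class Node:
    def __init__(self, val, left=None, right=None):
        self.val, self.left, self.right = val, left, right

def level_order(root):
    # Visit nodes level by level using a FIFO queue.
    result = []
    queue = deque([root] if root else [])
    while queue:
        node = queue.popleft()
        result.append(node.val)
        if node.left:
            queue.append(node.left)
        if node.right:
            queue.append(node.right)
    return result

print(level_order(Node(1, Node(2), Node(3))))  # [1, 2, 3]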
These are some of the most popular binary tree-based questions asked on Programming job interviews. You can solve them to become comfortable with tree-based problems.
Go Even Further
These are some of the most common questions about the binary tree data structure from coding interviews; practicing them helps you do really well in your interview.
I have also shared a lot of Coding Interview Questions asked in FAANG on my blog, so if you are really interested, you can always go there and read through them.
These challenges will improve your Dynamic Programming, Backtracking, Greedy Approaches, and Sorting and Searching techniques to help you do well in technical interviews.

Good knowledge of these different algorithms, and of the time and space complexities behind them, is a must for every interview. Focus on this the most.
Further Reading
4 Incredibly Useful Linked List Tips for Interview
Top 25 Amazon SDE Interview Questions
Do you think you really know about Fibonacci Numbers?
9 Best String Problems Solved using C Programming
One Does not Simply Solve 50 Hacker Rank Challenges
End of the Line
You have now reached the end of this article. Thank you for reading it. Good luck with your Programming Interview!
If you come across any of these questions in your interview, kindly share it in the comments section below. I will be thrilled to read them.
Before you go:
Want to become outstanding in java programming?
Free for Kindle Readers.
A compilation of 100 Java (interview) programming problems which have been solved. (HackerRank) 🐱💻 This is completely free 🆓 if you have an Amazon Kindle subscription.
If you like this article, then please share it with your friends and colleagues, and don’t forget to follow the house of codes on Twitter! | https://medium.com/dev-genius/15-binary-tree-coding-problems-from-faang-interview-2ba1ec67d077 | ['House Of Codes'] | 2020-06-22 12:54:02.762000+00:00 | ['Coding', 'Java', 'Software Development', 'Interview', 'Programming'] | Title 15 Binary Tree Coding Problems FAANG InterviewContent Image Omni Matryx Pixabay Binary Tree hierarchical Data Structure Depending store node tree Terminology differs Free Stories Friend Link Hey guy sharing lot Tech Interview Questions asked FAANG currently working Tech Interview Questions asked LinkedIn Yahoo Oracle researching lot “interview problems” come Binary Tree Problems solved strong foundation certain type problem post making strong fundamental Logic’s used Solve Binary Tree Problems Interview come across Binary Tree problem know logic use could approach problem Free Kindle Readers Preparing Interview Even settled job keeping uptodate latest Interview Problems essential career growth Start prep 15 Binary Tree Coding Problems Programming Interviews Lowest Common Ancestor find Lowest Common Ancestor Two Given Nodes Solution find given tree subtree another tree Solution Traverse Binary Tree Iteratively Solution Breadth First Traversal implement Solution find Diameter Tree Solution Traverse Tree ZigZag fashion Solution Depth First Traversal implement Solution print Right Side View Binary Tree Solution Construct BST Preorder Traversal Solution find two given tree mirror image Solution find sum Deepest Leaves Binary Tree Solution Capture Binary Tree 2D Array Solution Merge Two Binary Trees Solution find pair Nodes BST equal target Solution Find Minimum Distance Two Nodes given BST Solution popular binary treebased question asked Programming job interview solve become comfortable treebased problem Go Even common question binary tree data structure form coding interview help really well interview also shared lot Coding Interview Questions asked FAANG blog really interested always go read Challenges improve Dynamic Programming Back Tracking Greedy Approaches Sorting Searching Techniques help well Technical Interviews Good Knowledge Different Algorithms time space complexity behind mustknow every interview Focus Reading 4 Incredibly Useful Linked List Tips Interview Top 25 Amazon SDE Interview Questions think really know Fibonacci Numbers 9 Best String Problems Solved using C Programming One Simply Solve 50 Hacker Rank Challenges End Line reached end article Thank reading Good luck Programming Interview come across question interview Kindly Share comment section thrilled read go Want become outstanding java programming Free Kindle Readers compilation 100 JavaInterview Programming problem solved HackerRank 🐱💻 completely free 🆓 amazon kindle subscription like article please share friend colleague don’t forget follow house code TwitterTags Coding Java Software Development Interview Programming |
4,181 | Karma Enters Global Market: KYC Now Available For The Foreign Citizens | Hello, dear friends!
Foreign Investors Allowed
Finally, a citizen of any country can pass the KYC on our platform and become an accredited investor.
We’ve connected a popular KYC and AML provider, the Sumsub company. They provide a standard KYC procedure: one just needs to upload a passport, ID, or driver’s license photo, plus a selfie with this document.
There are also important rules for our Korean friends
After passing the KYC, an investor can join any existing loan offer on the Market.
For the moment, we are starting to work with foreign investors who have accounts in Russian banks, or Ruble accounts in local banks. Plus, we’re now integrating with an international payment system to allow any investor to join our platform from any bank account or credit card.
Automatic accreditation for the Russian investors
We’ve developed a scoring system which automatically checks investors’ data. It helps to reduce the average moderation time to 1 minute.
Notifications about virtual account updates
You can always monitor any incoming funds on your virtual account, such as payout amounts and new deposits. Email and SMS notifications will keep you aware of any operations affecting your balance.
“Zero commission” offer for investors extended till the end of June
The number of new Karma investors is increasing on a daily basis. We are really happy that the majority of investors are staying with us and reinvesting more after receiving their first income.
We are also working on attracting new investors. That’s why we decided not to charge the platform’s commission to any investor until the end of June.
Withdrawal fees
We’ve covered the banking fees for all operations ourselves for a long time, but as the number of users grows, we have started charging a small fee of 50 RUB for every virtual account withdrawal.
Cheers^_^ | https://medium.com/karmared/karma-enters-global-market-kyc-now-available-for-the-foreign-citizens-72373dcf2953 | ['Karma Project'] | 2019-04-09 07:13:51.356000+00:00 | ['Investing', 'P2p', 'Global', 'Development', 'Banking']
4,182 | The Feynman Technique Can Help You Remember Everything You Read | The 4 Steps You Need To Take
In essence, the Feynman technique consists of four steps: identify the subject, explain the content, identify your knowledge gaps, simplify your explanation. Here’s how it works for any book you read:
#1 Choose the book you want to remember
After you’ve finished a book worth remembering, take out a blank sheet. Title it with the book’s name.
Then, mentally recall all the principles and main points you want to keep in mind. Here, many people make the mistake of simply copying the table of contents or their highlights. By not recalling the information, they skip the learning part.
What you want to do instead is to retrieve the concepts and ideas from your own memory. Yes, this requires your brainpower. But by thinking about the concepts, you’re creating an effective learning experience.
While writing your key points, try to use the simplest language you can. Often, we use complicated jargon to mask what we don’t really know. Big words and fluffy “expert words” stop us from getting to the point.
“If you can’t explain it simply, you don’t understand it well enough.” — Albert Einstein
#2 Pretend you are explaining the content to a 12-year old
This sounds simpler than it is. In fact, explaining a concept as plainly as possible requires deep understanding.
Because when you explain an idea from start to finish to a 12-year-old, you force yourself to simplify the relationships and connections between concepts.
If you don’t have a 12-year-old around, find an interested friend, record a voice message for a mastermind group, or write down your explanation as a review on Amazon, Goodreads, or Quora.
#3 Identify your knowledge gaps and reread
Explaining a book’s key points helps you find out what you didn’t understand. There will be passages you’re crystal clear about. At other points, you will struggle. These struggles are valuable hints about where to dig deeper.
Only when you find knowledge gaps — where you omit an important aspect, search for words, or have trouble linking ideas to each other — can you really start learning.
When you know where you’re stuck, go back to your book and re-read the passage until you can explain it in your own simple language.
Filling your knowledge gaps is the extra step required to really remember what you read; skipping it leads to an illusion of knowledge.
#4 Simplify Your Explanation (optional)
Depending on a book’s complexity, you might be able to explain and remember the ideas after the previous steps. If you feel unsure, however, you can add an additional simplification layer.
Read your notes out loud and organize them into the simplest narrative possible. Once the explanation sounds simple, it’s a great indicator that you’ve done the proper work.
It’s only when you can explain in plain language what you read that you’ll know you truly understood the content. | https://medium.com/age-of-awareness/the-feynman-technique-will-make-you-remember-what-you-read-f0bce8cc4c43 | ['Eva Keiffenheim'] | 2020-10-21 15:03:23.461000+00:00 | ['Reading', 'Books', 'Education', 'Learning', 'Personal Development']
4,183 | Don’t Use Quarantine as an Excuse To Stop Having Boundaries | Don’t Use Quarantine as an Excuse To Stop Having Boundaries
You need to protect your mental health now more than ever
Adobe Stock Photo
There’s nothing like a global crisis to give the people in your life the excuse they have been looking for to touch base. We are scared, anxious, and nervous, and because we are emotionally vulnerable, we can end up allowing outreach from people that we wouldn’t normally tolerate.
In the past few weeks, I have seen and heard examples of the following types of what I would call “quarantine-inspired outreach.”
“I’m so bored, let’s shake things up.”
A coworker told me that she had become so bored while trapped indoors that she thought about texting her ex-boyfriend over the weekend, just to create a little excitement.
Although this may seem like a good idea after too much time and a little too much wine, don’t do it.
You need to respect people’s boundaries and remember that just because you think it’s funny to text your ex, his new girlfriend is probably not going to think the same thing. No one needs to be instigating fights while people are trapped together in a 500-square-foot apartment. | https://medium.com/fearless-she-wrote/dont-use-quarantine-as-an-excuse-to-stop-having-boundaries-32176664d263 | ['Carrie Wynn'] | 2020-04-09 17:10:00.311000+00:00 | ['Mindfulness', 'Relationships', 'Mental Health', 'Advice', 'Self']
4,184 | What Are the Rules for Lending Your Books to Friends | Photo by Євгенія Височина on Unsplash
When you start to collect books at home, your friends and guests will begin to envy you. Then every time you have visitors, you will live in dread of the moment when they want to borrow one of your books, and you will find yourself in an uncomfortable conversation.
Letting a visitor borrow a book is a complicated situation. Every reader like me wants to share the stories that make us happy; on the other hand, we do not want anyone to put their greasy fingers on our beautiful little books. The reason is that statistics tell us that if we lend a book, the probability of seeing it again is almost zero. Even if we are lucky enough to get our books back, most of the pages will have yellowed and will fall apart as we turn them. I am utterly sure about that because it is like a law of thermodynamics.
Once one of my books leaves the house door, it becomes Schrödinger’s cat, simultaneously located and lost. I call that situation “the uncertainty principle of borrowed books.”
I also want to add that yellowed or missing pages are not our friends’ fault, and it is not because they are careless. If we want our friends to return the books they borrow, we have to set some strict rules. Moreover, we should print a booklet of those rules and put it inside the books.
As for the rules, the first thing we need to make clear is that they have to return the books. That’s why we should keep a record of which books we lend and to whom, in a simple spreadsheet or a booklet. If we also want to set an approximate return date, we should do it. We could even write it down in our agenda and send them a reminder email. Yes, we need to be that heavy-handed!
You should also clarify that if they return the books stained with chocolate, wine, coffee, or any other liquid (not to mention with loose pages or damaged covers), they have to buy a new copy. That is the best way to make sure that they will be careful.
Finally, here is the most vital point to keep in mind: the books that we love shouldn’t be lent at all. We shouldn’t hand them over, because keeping them is the only way to be sure of seeing them again. If our favorite books pass to others, those readers will pass them on to strangers, and unfortunately, it is almost impossible for such books to find their way back home.
If you are still willing to lend your favorite books, there is nothing more to say: you are the kind of person who wants everyone to enjoy such special readings as much as you do. I sometimes buy a couple of second-hand copies and keep them available for everyone who asks to read them. If the books are returned to me, that’s fine. If they get lost in hyperspace, then nothing happens, and I don’t need to be sad. | https://mathladyhazel.medium.com/what-are-the-rules-for-lending-your-books-to-friends-d77ea84433f6 | ['Hazel Clementine'] | 2020-01-18 08:26:57.300000+00:00 | ['Books', 'Reading', 'Friendship', 'Relationships', 'Education']
4,185 | Why I Am Choosing to Stop Writing on Medium at This Point | I published my first post on Medium on October 25, 2019. and my 400th on October 24, 2020.
In that one year, I earned a total of $359.94, just shy of 90 cents per story.
While that is more than I’ve ever earned from my writing, it is less than what I expected to earn per month by the year’s end.
When things don’t go as planned, it is only prudent to analyze and evaluate your efforts.
“Regardless of how far you’ve traveled down the wrong path, turn around.”
~ Turkish Proverb
Looking Back
When I decided to publish on Medium, I planned to evaluate the ROI after the first six months of 2020. However, by the end of June, I was having fun making friends and picking up followers, so instead of doing a thorough analysis, I decided to continue.
I have learned that the journey of life is not linear. There are ups and downs and a lot of curves along the way. Sometimes, you even have to take a detour before getting back on the road.
Medium is no different. I learned from others that sometimes it takes one story to catch fire for your income to skyrocket. I have always believed in synchronicity. When you are committed to an idea or a goal, the Universe conspires for you. And, when that happens, it doesn’t always look like what you may have expected.
In October, as I approached my first anniversary of writing on Medium, the Universe gifted me with a sizable reward that had nothing to do with writing. It made me pause. I decided to take a look at where I was and where I was headed.
Freedom Lifestyle
Simply put, Freedom Lifestyle is organizing your affairs in such a way that you can spend your time doing what you love and enjoy. When I started writing on Medium, I had been enjoying Freedom Lifestyle because I was able to twist balloons and give them away. I learned that when I did that, others gave away money. It worked for me.
I approached writing on Medium in a similar vein. My goal was to build it up to the point where it will take care of my ongoing expenses and free up the time I spent on twisting balloons. I didn't expect that I would be forced to stop twisting balloons because of the way 2020 turned out. However, as I have always believed, the Universe had arranged to take care of me before I even knew that there might be a need for it.
Moving Forward
I have been receiving signs that while I am on the right path, a fork lies on the road ahead, where I may be able to change direction and have a more fulfilling opportunity.
I enjoy writing, and it enables me to pursue my purpose in life. However, I know that it is not something I feel passionate about. It is not the thing I do best. My preferred mode of communication is verbal rather than written.
It is time for me to explore that.
“If you can’t figure out your purpose, figure out your passion. For your passion will lead you right into your purpose.” ~ T. D. Jakes
As always, thank you for reading and responding.
Here are a couple of related stories:
Graphic created by Rasheed Hooda using Canva
Will you buy me some chai? | https://medium.com/narrative/why-i-am-choosing-to-stop-writing-on-medium-at-this-point-7f03398f3863 | ['Rasheed Hooda'] | 2020-11-08 13:45:55.755000+00:00 | ['Writing', 'Purpose', 'Passion', 'Options', 'Life Lessons']
4,186 | Large Scale Satellite Data Processing | Large Scale Satellite Data Processing
With more spatial data coming in than ever, we look at how to scale satellite data processing methods and improve their efficiency.
Source: Techspot
Spatial Data — What is it?
Spatial data, or geospatial data, can be thought of as any data containing location information about the Earth’s surface. Spatial data is present in every field and industry, especially in today’s “real-time” data-driven world. A few common terms better explain the language of spatial data.
Vector data is any data that contains points, lines, and/or polygons. It can be thought of as our “man-made” map view of the world, consisting of road networks, administrative boundaries, etc. Raster data, often referred to as imagery, is any pixelated data, such as satellite images. Often, raster data are photos taken from an aerial device like a satellite or drone. The resolution can vary greatly depending on device precision and other technological and aerial factors. A Geographic Coordinate System (GCS) is a projection of the Earth’s 3D surface used to define locations on Earth. It uses latitude, longitude, degrees (angles), and axes.
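To make these terms concrete, here is a minimal sketch in plain Python/NumPy, with no GIS libraries. The polygon, the grid size, and the north-up geotransform values are illustrative assumptions of mine; real rasters such as GeoTIFFs carry an equivalent affine transform in their metadata.

```python
import numpy as np

# Vector data: a polygon as a list of (longitude, latitude) vertices.
polygon = [(-120.0, 35.0), (-115.0, 35.0), (-115.0, 40.0), (-120.0, 40.0)]

# Raster data: a 2D grid of pixel values (say, temperature in degrees C).
raster = np.random.uniform(10, 35, size=(400, 400))

# GCS glue: a simple north-up geotransform that anchors the grid to Earth
# coordinates via a top-left origin and a pixel size in degrees.
origin_lon, origin_lat = -125.0, 45.0  # top-left corner of the raster
pixel_size = 0.025                     # degrees per pixel (square pixels assumed)

def world_to_pixel(lon, lat):
    """Map a (lon, lat) point to (row, col) indices in the raster grid."""
    col = int((lon - origin_lon) / pixel_size)
    row = int((origin_lat - lat) / pixel_size)  # rows grow southward
    return row, col

print(world_to_pixel(-120.0, 40.0))  # pixel under the polygon's northwest corner
```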
High-Resolution Earth Imagery is Now Made Publicly Available
Source: EOSDIS Imagery
There have been huge advancements in remote sensing technologies within the last decade alone. This has paved the way for petabytes of high-resolution satellite imagery being made publicly available. The use cases for this data apply to any industry, especially fields like atmospheric science, agriculture, ecology, and soil science. NASA EOSDIS provides public access to over 17 petabytes (and growing) of satellite imagery data. The European Space Agency (ESA) launched the Sentinel-1A satellite, which collected over 5 petabytes of data within the first two years of its launch alone.
These advancements, however, exposed vulnerabilities in processing such drastic amounts of data, and consequently paved the way for us to build new systems of spatial data processing, including SpatialHadoop, Simba, GeoSpark, RasDaMan, and more. These systems, however, focus solely on processing raster or image data. They perform poorly on queries that process both vector and raster data simultaneously. Additionally, these methods require converting raster to vector or vector to raster, which proves extremely inefficient when processing large amounts of data.
So what’s a solution that can handle both vector and raster data? One answer is zonal statistics.
Zonal Statistics
Zonal statistics is the operation that processes a combination of vector and raster data. It computes statistics over a raster layer for each polygon in a vector layer: all values from the raster layer that overlap a given polygon in the vector layer are aggregated.
For example, let’s say we are interested in finding the average temperature of U.S. states. We have polygons of the U.S. states in the vector layer and temperature data in the raster layer. Zonal statistics can compute the average temperature (or other statistics) for each state in the U.S. The technique this article uses to process the vector and raster layers simultaneously is called the scanline method.
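As a baseline, here is a deliberately naive zonal-statistics sketch: it visits every pixel in the polygon’s bounding box, runs a ray-casting point-in-polygon test on each pixel center, and averages the hits. It reuses the illustrative layers and the world_to_pixel() helper from the sketch above; its rough cost of O(pixels × edges) is exactly what the scanline method avoids.

```python
def point_in_polygon(lon, lat, poly):
    """Ray casting: count how many polygon edges a horizontal ray crosses."""
    inside = False
    n = len(poly)
    for i in range(n):
        (x1, y1), (x2, y2) = poly[i], poly[(i + 1) % n]
        if (y1 > lat) != (y2 > lat):  # this edge straddles the ray's latitude
            x_cross = x1 + (lat - y1) * (x2 - x1) / (y2 - y1)
            if lon < x_cross:
                inside = not inside
    return inside

def zonal_mean_naive(poly, raster):
    lons = [p[0] for p in poly]
    lats = [p[1] for p in poly]
    # Walk every pixel of the polygon's bounding box (its MBR).
    r_min, c_min = world_to_pixel(min(lons), max(lats))
    r_max, c_max = world_to_pixel(max(lons), min(lats))
    total, count = 0.0, 0
    for row in range(max(r_min, 0), min(r_max, raster.shape[0])):
        for col in range(max(c_min, 0), min(c_max, raster.shape[1])):
            lon = origin_lon + (col + 0.5) * pixel_size  # pixel center
            lat = origin_lat - (row + 0.5) * pixel_size
            if point_in_polygon(lon, lat, poly):
                total += raster[row, col]
                count += 1
    return total / count if count else None

print(zonal_mean_naive(polygon, raster))  # mean "temperature" inside the zone
```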
The Scanline Method
Figure 1
The scanline method tackles the zonal statistics problem directly by finding the intersections between horizontal scanlines and the respective polygon boundaries, as seen in Figure 2(c). In its latest version, the scanline method can process multiple adjacent polygons at a time, further reducing processing complexity. This means, for example, that if we wanted to find the average temperature of all the states in the west, we could now answer the query in one computation, as opposed to computing the average separately per state.
The Algorithm
Figure 2
Input: a set of polygons (vector layer) and a raster layer
Output: the desired aggregation (e.g., sum, minimum, maximum, count, or average of the pixels inside each polygon)
Step 1: As shown in Figure 2(b), first calculate the Minimum Bounding Rectangle (MBR) of the input polygon and map its two corners to the raster layer to locate the range of rows/scanlines to process in the raster layer. To include multiple polygons, we extend this step to all queried polygons.
Step 2: As shown in Figure 2(c), compute the intersections of each scanline with the polygon boundaries. Each scanline is converted to the vector space, and its y-coordinate is stored in a sorted list. Then, each polygon is matched to its corresponding range of scanlines, which are used to compute intersections with the polygon.
Step 3: As shown in Figure 2(d), the pixels that lie inside the polygons are processed. Rather than processing one polygon at a time, this step processes one scanline at a time, speeding up computation significantly.
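Below is a sketch of the same aggregation done scanline-style for a single polygon, mirroring the three steps above: the MBR bounds the scanline range, each scanline is intersected with the polygon edges, and only the pixel runs between crossing pairs are touched. It again reuses the illustrative layers and constants defined earlier, and note that no per-pixel point-in-polygon test is needed.

```python
def zonal_mean_scanline(poly, raster):
    lats = [p[1] for p in poly]
    # Step 1: use the MBR to find the range of raster rows (scanlines).
    row_start = max(int((origin_lat - max(lats)) / pixel_size), 0)
    row_end = min(int((origin_lat - min(lats)) / pixel_size), raster.shape[0] - 1)
    total, count = 0.0, 0
    n = len(poly)
    for row in range(row_start, row_end + 1):
        lat = origin_lat - (row + 0.5) * pixel_size  # latitude of this scanline
        # Step 2: intersect the scanline with every polygon edge.
        crossings = []
        for i in range(n):
            (x1, y1), (x2, y2) = poly[i], poly[(i + 1) % n]
            if (y1 > lat) != (y2 > lat):
                crossings.append(x1 + (lat - y1) * (x2 - x1) / (y2 - y1))
        crossings.sort()
        # Step 3: pixel runs between successive crossing pairs lie inside.
        for x_in, x_out in zip(crossings[0::2], crossings[1::2]):
            col_in = max(int((x_in - origin_lon) / pixel_size), 0)
            col_out = min(int((x_out - origin_lon) / pixel_size), raster.shape[1])
            run = raster[row, col_in:col_out]
            total += run.sum()
            count += run.size
    return total / count if count else None

print(zonal_mean_scanline(polygon, raster))  # should closely match the naive mean
```

Extending this to many adjacent polygons per scanline, as the article describes, amounts to merging each polygon’s sorted crossing list so that a row of pixels is read only once.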
Pros of the Scanline Method
This algorithm overcomes the limitations of raster- and vector-based methods. First, it requires only minimal intermediate storage for the intersection points. Second, it accesses only the pixels that are inside the polygon, which improves disk IO for very large raster layers. Third, it does not require any complicated point-in-polygon tests, which makes it faster than the vectorization methods. Finally, the scanline method is IO-bound, which makes it optimal from the processing perspective, since it requires one scan over the raster data to process all polygons.
Use-Cases
Source: ESA — Average Temperature in Arctic 1997–2008
As I mentioned above, the use cases for processing spatial data are endless. As we slowly but surely move towards a more “green” and environmentally friendly global community, it is important to be able to process and extract insights from Earth imagery. Processing vector data and satellite data simultaneously allows us not only to apply spatial data insights to our societies but also to foresee trends on the Earth’s surface. We can better monitor vegetation, temperature, ocean level changes, and more.
Visit my GitHub repository to view a full-stack application that uses the scanline method in the backend to process multiple polygons through a user interface. | https://medium.com/towards-artificial-intelligence/large-scale-satellite-data-processing-e963692380b8 | [] | 2020-12-25 01:02:58.806000+00:00 | ['Satellite', 'Satellite Technology', 'Spatial Analysis', 'Big Data', 'Remote Sensing']
4,187 | From Pandemic to Panopticon: Are We Habituating Aggressive Surveillance? | In Shoshana Zuboff’s 2019 book The Age of Surveillance Capitalism, she recalls the response to the launch of Google Glass in 2012. Zuboff describes public horror, as well as loud protestations from privacy advocates who were deeply concerned that the product’s undetectable recording of people and places threatened to eliminate “a person’s reasonable expectation of privacy and/or anonymity.”
Zuboff describes the product:
Google Glass combined computation, communication, photography, GPS tracking, data retrieval, and audio and video recording capabilities in a wearable format patterned on eyeglasses. The data it gathered — location, audio, video, photos, and other personal information — moved from the device to Google’s servers.
At the time, campaigners warned of a potential chilling effect on the population if Google Glass were to be married with new facial recognition technology, and in 2013 a congressional privacy caucus asked then Google CEO Larry Page for assurances on privacy safeguards for the product.
Eventually, after visceral public rejection, Google parked Glass in 2015 with a short blog announcing that they would be working on future versions. And although we never saw the relaunch of a follow-up consumer Glass, the product didn’t disappear into the sunset as some had predicted. Instead, Google took the opportunity to regroup and redirect, unwilling to turn its back on the chance of harvesting valuable swathes of what Zuboff terms “behavioral surplus data”, or cede this wearables turf to a rival.
Instead, as a next move, in 2017 Google publicly announced the Glass Enterprise Edition in what Zuboff calls a “tactical retreat into the workplace.” The workplace being the gold standard of environments in which invasive technologies are habituated and normalized. In workplaces, wearable technologies can be authentically useful points of reference (rather than luxury items), and are therefore treated with less scrutiny than the same technologies in the public space. As Zuboff quips: “Glass at work was most certainly the backdoor to Glass on our streets”, adding:
The lesson of Glass is that when one route to a supply source [of behavioral data] encounters obstacles, others are constructed to take up the slack and drive expansion.
This kind of expansionism should certainly be on our minds right now as we survey the ways in which government and the tech industry have responded to the COVID-19 pandemic. Most notably in asking if the current situation — one in which the public are prepared to forgo deep scrutiny in the hopes of some solution — presents a real opportunity for tech companies to habituate surveillance technologies at scale? Technologies that have been previously met with widespread repugnance.
Syndromic Surveillance
Over the last few days and weeks, the media have reported offers from tech companies looking to help governments stymy the spread of the coronavirus. Suggestions vary in content, but many (or most) could reasonably be classified as efforts to track and/or monitor the population in order to understand how the virus moves — known as “syndromic surveillance.”
On Monday, Facebook’s Data for Good team announced new tools for tracking how well we’re all social distancing by using our location data. Facebook were following hot on the heels of Google, who promised to do something very similar just last week. According to reports, the readouts from Google’s data stash will reveal phenomenal levels of detail, including “changes in visits to people’s homes, as determined by signals such as where users spend their time during the day and at night.”
This granular data is intended to inform government policy decisions, and ultimately influence public behavior to curtail the spread of the virus. This end purpose is, of course, an extremely noble one: saving human lives. This is a cause that legitimizes most methods. Nevertheless, we should not let our sheer desperation to stop this abominable disease blind us to some of the burgeoning concerns surrounding tech’s enthusiastic rollout of unprecedented intrusion.
Control Concerns
It’s almost reflexive now to look to China when discussing the excessive deployment of technological surveillance tools. Not unexpectedly, the Chinese government has turned the COVID-19 outbreak into an opportunity to flex their surveillance tech muscles, while baking ever more controls into the daily lives of citizens.
Authorities have been monitoring smartphones, using facial recognition technology to detect elevated temperatures in a crowd or those not wearing face masks, and obliging the public to consistently check and self report their medical condition for tracking purposes. The Guardian, further reported:
Getting into one’s apartment compound or workplace requires scanning a QR code, writing down one’s name and ID number, temperature and recent travel history. Telecom operators track people’s movements while social media platforms like WeChat and Weibo have hotlines for people to report others who may be sick. Some cities are offering people rewards for informing on sick neighbors.
But this is what we’ve come to expect from China. Perhaps more surprising is that similar pervasive tracking techniques have been adopted in so many other COVID-19 hotspots around the globe. This silent, yet penetrative policing is still unfamiliar to the public in most areas stricken by the coronavirus.
The New York Times reported that in Lombardy, Italy, local authorities are using mobile phone location data to determine whether citizens are obeying lockdown, and in Israel, Prime Minister Benjamin Netanyahu has authorized surveillance technology normally reserved for terrorists to be used on the broader population.
In countries like the UK and the US, the announcement of each new tracking technology has been accompanied by an avalanche of privacy assurances. Yet, we’ve already seen a number of worrying instances where the vigilant monitoring of the pandemic has tipped over into boundary-crossing privacy lapses — like this tweet from New York’s Mayor Bill de Blasio.
And in Mexico, when public health officials notified Uber about a passenger infected with the virus, the company suspended the accounts of two drivers who had given him rides, then tracked down and suspended the accounts of a further 200 passengers who had also ridden with those drivers (NY Times).
The pandemic has unleashed a fresh government enthusiasm for using tech to monitor, identify, and neutralize threats. And although this behavior might seem like a natural response to a crisis, authorities should be alive to the dehumanizing aspects of surveillance, as well as the point at which they start to view the rest of us as mere scientific subjects, rather than active participants in societal efforts.
A False Choice?
Of course, there are those who would willingly relinquish personal privacy in order to save lives. They believe that an end to this suffering justifies any action taken by governments and tech companies, even if it involves a rummage in our personal data cupboards. But what isn’t clear is the extent to which we can trust this as a straight transaction. After all, these are largely unproven technologies.
In the New York Times, Natasha Singer and Chloe Sang-Hun write:
The fast pace of the pandemic…is prompting governments to put in place a patchwork of digital surveillance measures in the name of their own interests, with little international coordination on how appropriate or effective they are.
And writing for NBC News’ THINK, Albert Fox Cahn and John Veiszlemlein similarly point out that the effectiveness of tech tracking pandemic outbreaks is “decidedly unclear”. They recount previous efforts, like Google Flu Trends, that were abandoned as failures.
In short, we could be giving up our most personal data for the sake of a largely ineffective mapping experiment.
Yuval Noah Harari argues that the choice between health and privacy is, in fact, a false one. He emphasizes the critical role of trust in achieving compliance and co-operation, and says that public faith is not built through the deployment of authoritarian surveillance technologies, but by encouraging the populace to use personal tech to evaluate their own health in a way that informs responsible personal choices.
Harari writes:
When people are told the scientific facts, and when people trust public authorities to tell them these facts, citizens can do the right thing even without a Big Brother watching over their shoulders. A self-motivated and well-informed population is usually far more powerful and effective than a policed, ignorant population.
He ends with a caution that we could be signing away personal freedoms, thinking it is the only choice.
The New (Ab)Normal
So, to return to our original question: has this dreadful pandemic provided legitimacy to an aggressive, pervasive surveillance that will carry on into the future? Are we witnessing the beginning of a new normal?
Nearly two decades after the 9/11 attacks, law enforcement agencies still have access to the high-powered surveillance systems that were instituted in response to imminent terror threats. Indeed, as Yuval Harari asserts, the nature of emergencies tends to be that the short-term measures they give rise to become fixtures of life on the premise that the next disaster is always lurking. He adds that, “immature and even dangerous technologies are pressed into service, because the risks of doing nothing are bigger.”
Whenever we eventually emerge from this difficult time, there is every chance that our collective tolerance for deep surveillance will be higher, and the barriers that previously prevented intrusive technologies taking hold will be lower. If we doubt this, it’s important to know that some tech companies are already openly talking about the pandemic in terms of an expansion opportunity.
Perhaps if our skins are thicker, and privacy becomes a sort of quaint, 20th century concern, we could worry less and enjoy greater security and convenience in a post-pandemic era?
If this seems appealing, then it’s worth remembering that the benefits of constant and penetrating surveillance, like disease tracking or crime detection, are offset in a range of different and troubling ways.
By allowing a permanent tech surveillance land grab, we simultaneously allow and embed a loss of anonymity, as well as an new onslaught of commercial and governmental profiling, cognitive exploitation, behavioral manipulation, and data-driven discrimination. To let this mission creep go on unchallenged would be to assent to a new status quo where we willingly play complacent lab rats for our information masters.
So, as we urgently will an end to this global devastation, let’s be attentive when it comes to the aftermath and clean-up, lest we immediately exchange one temporary nightmare scenario for another, more lasting one. | https://medium.com/swlh/from-pandemic-to-panopticon-are-we-habituating-aggressive-surveillance-f880ef754bc0 | ['Fiona J Mcevoy'] | 2020-04-10 00:09:49.499000+00:00 | ['Covid 19', 'Coronavirus', 'Technology', 'Surveillance', 'Government']
4,188 | Why Donald Trump Could Win Even Though He’s Losing | Since he was diagnosed with COVID-19, the president’s poll numbers definitely haven’t shifted for the better. According to recent polls, Trump is down double-digits and in every single battleground state except Georgia. Because there will likely be no other debates and the debacle that was the first one, it raises the question: can Donald Trump win this election?
To be honest, chances are that he will lose the popular vote by a wide margin. But just like last election, it’s still possible to have an electoral college/popular vote split.
To shift the election, Trump will likely have to gain voters in states like Florida, Michigan, Ohio, Iowa, and Pennsylvania to have a shot at winning the White House again. That’s going to take some more work by him and his team, and he’s going to have to change his campaign’s rhetoric.
But it’s still possible. As we well know by now, polls don’t give us the true picture of how the election is going to turn out, and things can change fast in the weeks preceding election day. This election is different, so here are a couple of scenarios.
Donald Trump wins the Electoral College
Though the most unlikely of these three scenarios, this result is still quite possible. Pundits put his chances at about one in seven, but Trump still has a chance to pull this election off. Here’s the most recent polling data for some battleground states:
Michigan: Biden 54, Trump 43
Ohio: Biden 44, Trump 43
Arizona: Biden 48, Trump 43
Wisconsin: Biden 47, Trump 42
Florida: Biden 53, Trump 43
If Trump managed to win all of those states, he’d still have to win through various combinations of North Carolina, Georgia, Pennsylvania, and Iowa. Long story short, this won’t be the easiest run for his campaign.
But assuming that statewide polls underrepresent Trump’s supporters, as they certainly did last election, Trump is within striking distance in Ohio, Arizona, and Wisconsin; if he carries those states, it wouldn’t be hard to conceive of a Trump win on November 3rd (or later).
The Electoral College Comes Out As a Tie
Assuming that Nevada, New Hampshire, and Michigan go blue, all of which are solid leads for Biden, and that Texas and Georgia go red, there are two likely, in my view, scenarios that could end up in a tie (the quick tally after this list checks the math):
Biden wins Arizona and Wisconsin; Trump wins Florida, Pennsylvania, Ohio, North Carolina, Iowa, Maine’s Second District, and Nebraska’s Second District.
Biden wins Pennsylvania and Maine’s Second District; Trump wins Florida, Arizona, Wisconsin, Ohio, North Carolina, Iowa, and Nebraska’s Second District.
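For anyone who wants to verify the arithmetic, here is a quick tally of both scenarios using the 2020 electoral vote counts for the contested states. The two “base” totals, 248 for Biden (his solid states plus Nevada, New Hampshire, and Michigan) and 179 for Trump (his solid states plus Texas and Georgia), are my own illustrative assumptions.

```python
# Electoral votes for the states contested in the two tie scenarios.
ev = {"FL": 29, "PA": 20, "OH": 18, "NC": 15, "AZ": 11, "WI": 10,
      "IA": 6, "ME-2": 1, "NE-2": 1}
BIDEN_BASE, TRUMP_BASE = 248, 179  # assumed totals for the uncontested states

scenarios = {1: ["AZ", "WI"], 2: ["PA", "ME-2"]}  # states Biden wins
for n, biden_states in scenarios.items():
    biden = BIDEN_BASE + sum(ev[s] for s in biden_states)
    trump = TRUMP_BASE + sum(v for s, v in ev.items() if s not in biden_states)
    print(f"Scenario {n}: Biden {biden}, Trump {trump}")  # 269 to 269 both times
```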
In this case, the election would go to the House of Representatives, where each state delegation would receive one vote; the candidate with the majority of the delegation votes would become president.
If the current congress voted, it would likely be a Trump victory: a majority of state delegations are Republican. But here’s another weird situation: if the Democrats take back the Senate and Republicans retain a majority of the state delegations in the House, we could end up with a President Trump and a Vice-President Harris.
Needless to say, this probably isn’t going to happen.
Trump Challenges the Legitimacy of the Election
Finally, there’s the situation that many fear: Trump refuses to back the results of the election. He won’t guarantee the peaceful transition of power that’s taken place in our country for centuries.
The problem with this election, however, is that there could be an illusory appearance of voter fraud. As results come in on election day, it’s likely that in-person voting could indicate a landslide victory for Trump: Republican voters are much more likely to vote in-person.
Here’s an interesting cartoon by David Horsey in the Seattle Times
However, as mail-in ballots are counted, more and more Democratic votes will be counted, causing a massive shift in the electoral college almost a week after election day.
Because Trump has been building a case of voter fraud since the onset of this pandemic, I don’t think any of us would be surprised if this did end up happening.
However, it’s likely that nothing would come out of a legal challenge by the president. As seen in 2000, the Constitution ensures that the presidential election goes through in the most efficient, if not fair, process possible, allowing for a continual transition of power: that’s probably not going to change just because one man decides to delegitimize a system that he once beat.
I know that each of the situations that I’ve listed is unlikely, but that’s just the nature of the race for our president. There’s no denying it: Donald Trump is losing this election. However angry it might make him or his supporters, electoral shift is a real thing.
But I have to emphasize one thing: this election is like no other. For the first time in American history, mail-in voting could make up an even larger percentage of the voting population, and election results may not be known even a week after the election. That just means that faith in our electoral system will undoubtedly be low.
Realistically, my point is this: Donald Trump isn’t out of this election. When some pundits thought that it was an almost guaranteed Clinton win last election, Trump beat the odds. Although from my perspective, that’s not likely to happen again, it still could. For the Democrats reading this, it’s okay to think that Biden will win; if the election were tomorrow, I think he would. But there’s a reason that the “October surprise” is a thing. Don’t be surprised if it doesn’t turn out the way you might hope it will. | https://medium.com/discourse/why-donald-trump-could-win-even-though-hes-losing-bb7f763251ab | ['Yash Rajpal'] | 2020-10-12 17:01:41.315000+00:00 | ['Politics', 'Trump', 'Election 2020', 'Biden', 'Coronavirus'] | Title Donald Trump Could Win Even Though He’s LosingContent Since diagnosed COVID19 president’s poll number definitely haven’t shifted better According recent poll Trump doubledigits every single battleground state except Georgia likely debate debacle first one raise question Donald Trump win election honest chance lose popular vote wide margin like last election it’s still possible electoral collegepopular vote split shift election Trump likely gain voter state like Florida Michigan Ohio Iowa Pennsylvania shot winning White House That’s going take work campaign he’s going change rhetoric campaign it’s still possible well know poll don’t give u true picture election going turn thing change fast week preceding election day election different here’s couple scenario Donald Trump win Electoral College Although unlikely three scenario result still quite possible Although pundit put chance oneinseven Trump still chance pull election Here’s recent polling data battleground state Michigan Biden 54 Trump 43 Ohio Biden 44 Trump 43 Arizona Biden 48 Trump 43 Wisconsin Biden 47 Trump 42 Florida Biden 53 Trump 43 Trump managed win state he’d still win various combination North Carolina Georgia Pennsylvania Iowa Long story short won’t easiest run campaign assuming poll underrepresent Trump’s supporter statewide poll certainly last election Trump within striking distance Ohio Arizona Wisconsin carry state wouldn’t hard conceive Trump win November 3rd later Electoral College Comes Tie Assuming Nevada New Hampshire Michigan go blue solid lead Biden Texas Georgia go red two likely view scenario could end tie Biden win Arizona Wisconsin Trump win Florida Pennsylvania Ohio North Carolina Iowa Maine’s Second District Nebraska’s Second District Biden win Pennsylvania Maine’s Second District Trump win Florida Arizona Wisconsin Ohio North Carolina Iowa Nebraska’s Second District case election would go House Representatives state delegation would receive one vote candidate majority delegation vote would become president current congress voted would likely Trump victory majority state delegation Republican here’s another weird situation Democrats take back Senate Republicans retain majority state delegation House could end President Trump VicePresident Harris Needless say probably isn’t going happen Trump Challenges Legitimacy Election Finally there’s situation many fear Trump refuse back result election won’t guarantee peaceful transition power that’s taken place country century problem election however could illusory appearance voter fraud result come election day it’s likely inperson voting could indicate landslide victory Trump Republican voter much likely vote inperson Here’s interesting cartoon David Horsey Seattle Times However mailin ballot counted Democratic vote counted causing massive 
4,189 | On The Art of Facing Things | It turns out facing things is not as hard, not nearly as hard, as resisting them. But to face things, especially forces that oppose us, we must go against every instinct we have to continue to believe and do what we believed and did before.
Facing things requires that we undo and unlearn the well-worn emotional habits we have repeated so often that we forget we can do anything else, and mistake them for cause and effect, for the way the world is and will always be.
Salmon have much to teach us about the art of facing things. In swimming up waterfalls, these remarkable creatures seem to defy gravity. It is an amazing thing to behold. A closer look reveals a wisdom for all beings who want to thrive. What the salmon somehow know is how to turn their underside — from center to tail — into the powerful current coming at them, which hits them squarely and the impact then launches them out and further up the waterfall; to which their reaction is, again, to turn their underside back into the powerful current that, of course, again hits them squarely; and this successive impact launches them further out and up the waterfall. Their leaning into what they face bounces them further and further along their unlikely journey. — Mark Nepo, The Book of Awakening
The salmon shows the raging waterfall its tender side, the part that is most defenseless, the part that a fisherman would gut, spilling out all the loose, pulsing life of its inner organs. To have guts spill out like that makes me think about how strong our skins must be to hold in all the life that we keep integrating: it looks haphazard, a mess, when our guts are spilled. But inside, there’s an unseen order that keeps salmon, and humans, living, that keeps us moving forward through the most unlikely of circumstances.
What does it mean to expose one’s tender belly to the elements, to face the strongest forces which are intent upon repelling us?
One would think such power, such force, would be impossible to resist, that the salmon, or the person, has no choice but to go with the flow, in the direction and with the momentum of the water, which appears to be the source of all power.
Water has the ability to adapt itself to any kind of container and is strong enough to bore through stone.
One must have a strong container to direct the flow of water, and this perhaps is where people, not salmon, think they must have a certain kind of power to be able to control the forces at work in their own lives.
But the salmon does not try to direct the water or change the direction it flows. Yet salmon defy the dams that direct the powerful flow of water. So what is it that allows them to go against the strongest and most powerful of currents, to defy the strength of a waterfall and gravity itself?
Could it be that tenderness is disarming because those who are in a race to shore up bigger defenses cannot anticipate those who refuse to fight?
Even the raw power of the water cannot overcome the salmon who go their own way, who follow the call to live and to cultivate all that is yet to be born, what only they can cultivate.
I have found in my own life that when I stopped trying to get the love and attention and recognition I so desperately wanted from people who couldn’t give it to me, the people who could see me then appeared.
They say that love is the most powerful force in the universe. When faced with actual forces almost no one believes it, except maybe the salmon who have learned something about the power of love to create a different way.
A note about reference points: Fred Rogers famously said we should “look for the helpers” in any situation where we are uncertain. Chances are good we’ll find them.
In my search through 331 images for salmon, not one photo pictured a living salmon. Many were photos of sushi. A few were salmon-colored rooms. A few of the waterfall photos featured women, some nude, or fishermen. Photos of dams showed neither the people nor the wildlife affected by them.
I wonder if perhaps our difficulties in facing things, in knowing how to face the strange changing circumstances of our lives have something to do with our reference points — what we look to when we are looking around for clues as to how we might handle a situation. When we surround ourselves with a world with human beings at the center, and other living things dead or absent, is it any wonder we find only the solutions we’ve already thought of?
What other ways might we develop to solve problems, or even understand the nature of our problems, were we to expand the scope of whom and what we include as reference points? | https://medium.com/the-philosophers-stone/on-the-art-of-facing-things-865ca66f1651 | ['Suzanne Lagrande'] | 2020-09-11 21:36:02.679000+00:00 | ['Self-awareness', 'Love', 'Life Lessons', 'Philosophy', 'Life'] |
4,190 | Database “Magic”: Running Huge High Throughput-Low Latency KV Database with No Data In Memory | Database “Magic”: Running Huge High Throughput-Low Latency KV Database with No Data In Memory Zohar Elkayam · Dec 20 · 4 min read
A couple of weeks ago I was talking to one of my oldest database colleagues (and a very dear friend of mine). We were chatting about how key/value stores and databases are evolving, and how they always seem to revolve around in-memory solutions and caches. The main rant was that this kind of thing doesn’t scale well while being expensive and complicated to maintain.
My friend’s background story was that they are running an application with a user-profile store of almost 700 million profiles (their total size was around 2TB, with a replication factor of 2). Since access to the user profiles is very random (meaning users are fetched and updated without the application being able to “guess” which user it will need next), they could not pre-warm the data into memory. Their main issue was that they sometimes get peaks of over 500k operations per second of this mixed workload, and that doesn’t scale very well.
User Profile use case summary
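To make that access pattern concrete, here is a minimal sketch of a random profile read and update using Aerospike’s official Python client. The host address, namespace, set, and bin names are illustrative assumptions, not details from my friend’s actual application.

```
import aerospike

# Connect to one seed node; the client discovers the rest of the cluster.
config = {'hosts': [('10.0.0.1', 3000)]}
client = aerospike.client(config).connect()

# A record is addressed by (namespace, set, user key); access is fully random.
key = ('profiles', 'users', 'user:48213377')

# Update (or create) a profile record.
client.put(key, {'country': 'ZA', 'segments': ['sports', 'news'], 'visits': 42})

# Fetch it back; 'bins' holds the record's fields.
(_, meta, bins) = client.get(key)
print(bins)

client.close()
```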
In my friend’s mind’s eye, the only thing they could do was use some kind of memory-based solution. They could either use an in-memory store — which, as we said before, doesn’t scale well and is hard to maintain — or use a traditional cache-first solution, but lose some of the low latency required, because most of the records are not cached.
I explained that Aerospike is different. In Aerospike we can store 700 million profiles, 2TB of data, and deliver the aforementioned 500k TPS (400k reads and 100k writes, concurrently) with sub-1ms latency, without storing any of the data in memory. The memory usage would then be minimal: under 5 percent of the data size for that use case.
My friend was suspicious: “ What kind of wizardry are you pulling here?!”
So since I am not a wizard yet (I am still convinced my Hogwarts acceptance letter is on its way — I’m almost sure it’s just the owl that’s delayed), I went ahead and created a modest demo cluster for them, just to show my “magic”.
Aerospike Cluster: Hybrid Memory Architecture
In the next screenshot, we can see the result: a 6-node cluster running 500k TPS (400k reads + 100k writes), storing 1.73TB of data but utilizing only 83.45GB of RAM.
Running a 6 node cluster: 1.73TB of data, 84 GB of RAM
This cluster doesn’t have specialized hardware of any kind. It’s using 6 nodes of AWS’s c5ad.4xlarge (a standard option for a node), which means a total of 192GB of RAM and 3.5TB of ephemeral devices, cluster-wide. From a pricing perspective it’s only about $1,900 a month, way less than what they pay now (and I price-tagged it before any discounts).
Obviously, if the cluster has a total of 192GB of DRAM, the data is not being stored fully in memory. In this case 0 percent of the data was fetched from any sort of cache — so, for 1.73TB of data, the memory usage was under 84GB. (The Linux kernel would still allow for some page caching if needed, which makes things even better with other access patterns like common records or read-after-write.)
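There is no wizardry here, just Aerospike’s hybrid memory design: the primary index lives in DRAM at roughly 64 bytes per record, while the records themselves sit on local NVMe devices. A quick back-of-the-envelope check: 700 million profiles, times a replication factor of 2, times 64 bytes, comes to about 90GB (roughly 84GiB) of index cluster-wide, which lines up neatly with the RAM usage in the screenshot above. A namespace along these lines might be configured as in the sketch below; the memory sizing and device path are assumptions for illustration, not the demo cluster’s exact settings.

```
namespace profiles {
    replication-factor 2
    memory-size 28G                # sized for the primary index only, plus headroom
    high-water-memory-pct 70

    storage-engine device {
        device /dev/nvme1n1        # local NVMe; the path is an assumption
        write-block-size 128K
        data-in-memory false       # records are served from SSD, not cached in RAM
    }
}
```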
The cool thing is the performance. Predictable performance is something every application needs — and for the peaks described earlier, we can see in the next screenshots a latency of under 1ms for both reads and writes! | https://medium.com/aerospike-developer-blog/database-magic-running-huge-high-throughput-low-latency-kv-database-with-no-data-in-memory-eb67ecdac851 | ['Zohar Elkayam'] | 2020-12-21 10:48:19.017000+00:00 | ['NoSQL', 'Redis', 'Database', 'Big Data', 'Aerospike'] |
4,191 | Welcome to the Bazaar of the Bizarre | Here are some quick notes pertaining to the tabs of this publication. Since tabs are tag-loaded, it is important to properly tag posts, so that they fall under the correct tabs. For posts to be loaded in the Poetry From The Soul tab, make sure that your main tag is “Poetry.” For posts to fall under the Musings tab, make sure to use “Writing” as your main tag. Although I have no problem with writers cross-tagging their work as “Poetry,” “Musings,” and/or “Writing,” it is my aim to have the bulk of “Poetry” posts fall under the Poetry From The Soul tab, whereas the Musings tab will be the general section solely for various kinds of prose such as contemplations, flash fiction, meditations, memoirs, reflections, short stories, and vignettes. For posts to fall under the Fibonacci & Other Weird Forms tab, please use the tag “Forms,” and anything in this section can be automatically cross-tagged as “Poetry.” Although all forms are welcome, I say the more experimental (and less traditional), the better, only because experimentation with various forms has always been instrumental in the honing of my skills as a poet.
Photo by Jr Korpa on Unsplash | https://medium.com/the-bazaar-of-the-bizarre/welcome-to-the-bazaar-of-the-bizarre-74d9aee0e1cf | [] | 2020-12-11 07:13:15.505000+00:00 | ['Musings', 'Writing', 'Bazaar Of The Bizarre', 'Mdshall', '21stenturygrio'] |
4,192 | Engineer Q&A: Jessica Chong, Frontend Engineer | I’m taking part in this Q&A as part of an effort to introduce the world to the Engineering team at Optimizely. If you’re interested in joining, we’re hiring Frontend Engineers in San Francisco and Austin!
Tell us about yourself! What’s your name, what do you do at Optimizely, and how long have you been here? Tell us a bit about your trajectory to get here.
My name is Jess, and I’m a Senior Software Engineer working on Product at Optimizely. I’ve been here for 2.5 years. I got here via the I/Own It scholarship program, which was originally conceived to grow our WomEng population. I cannot overstate how much this scholarship changed my life.
Last year everything came full circle when I ran the program for the second time. You can read more about it here.
How did you figure out that this was what you wanted to do?
I started making websites in 1999, when I was 13 years old and still using dial-up. I would spend hours poring through tutorials on htmlgoodies.com, painstakingly positioning tables and frames, exchanging design ideas and HTML tips with my Internet friends, and uploading my sites to domains that were owned by other teenagers. It was empowering, and I was part of a supportive community.
C:\Windows\Desktop\jess\yes\index.htm
I’ve actually blogged about my “Internetolescence”, a core part of my teenage identity, but it never crossed my mind that I could make a career out of making stuff on the Internet until I was well into my adulthood. In high school I was drawn to the arts and social sciences, and I studied Geography as an undergrad at Vassar because it addressed the core questions I had about the world, namely: “How/why does where you are born impact how you live and how you die?” I didn’t know of anyone who was pursuing software engineering in school, or, in fact, as a career. In truth, the only formal computer science education I’ve had was my seventh-grade computer class, where I made a calculator with Visual Basic. I was definitely most excited about styling the calculator (It was purple and yellow, and labeled ~*JeSsIcA’s fUnkEe caLcuLatOr!!*~).
Somehow I turned my middle-school hobby of making websites into a career. I freelanced for several years making websites before I landed at the job where I used Optimizely.
What’s it like to be a Frontend Engineer at Optimizely?
My day-to-day consists of doing code reviews, reviewing engineering design docs from my peers, scoping work, 1:1s, and of course, writing code.
Frontend work here ranges from technical infrastructure to product/feature work. Frontend engineers mostly write Javascript, but if we want, we can go into the Python app backend and update or write APIs there as well. I’ve also written code in the Optimizely client codebase (the stuff that gets loaded on our customers’ websites). Most of my recent work correlates to specific features in our product — I recently drove two features: mutually exclusive experiments and multivariate testing. I work closely with Product Managers like Jon and Whelan; Designers; and other Engineers.
As a Frontend Engineer, I see myself as the final gatekeeper before a feature reaches an audience. I have to ask myself many questions as I’m developing: Is the code I’m writing performant? Is the UI I’m building or reviewing for a coworker intuitive to use? How can I make this easy for the next person to build on?
Frontend engineers are distributed across different “squads” or teams, but we convene every two weeks at the Frontend Community of Practice (or Frontend COP), which is led by individual contributors. Anyone can put whatever they want on the meeting agenda. We’ve talked about things like ES6 conversion, security, data validations, tests, code coverage, code organization, and interview questions for incoming candidates. We’re in an interesting moment because we’re in the midst of shepherding a migration from VueJS to ReactJS. We handle application state using NuclearJS, an open-source project first developed here at Optimizely by Jordan Garcia. What I’m learning is that the engineering challenges are not exclusively technical; many of them are interpersonal. For example, how do you sell people on an idea? How do you convince people with competing (and often conflicting) agendas that refactoring is a good thing?
What have you been working on lately?
The last few quarters, I’ve been midwifing Multivariate Testing to completion. One of my squad’s main goals is to get Optimizely X to feature parity with Optimizely Classic, and Multivariate Testing is one of the remaining pieces. Multivariate Tests allow customers to test multiple different factors on their sites to see what combination of factors has the best outcome. We made full-factorial Multivariate Testing generally available in March, and are about to release partial-factorial testing. I hope that customers love it. I’ve been monitoring the usage dashboard for our product (we use a Chartio dashboard built by our in-house Chartio wiz Jon) and watching more and more customers use it.
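For anyone unfamiliar with the term, a full-factorial multivariate test simply tries every combination of the factors under test, so the number of variations multiplies quickly. Here is a toy illustration in Python; the factor values are invented for the example, and this is not Optimizely’s API.

```
from itertools import product

headlines = ['Free shipping', 'Save 20%']
button_colors = ['green', 'blue']
hero_images = ['lifestyle', 'product']

# Full-factorial: every combination becomes a variation (2 x 2 x 2 = 8).
# A partial-factorial test measures only a chosen subset of these.
for variation in product(headlines, button_colors, hero_images):
    print(variation)
```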
It’s been really cool to drive Multivariate Testing, especially because I used it in Optimizely’s Classic product, when I was a customer! It’s also been a cross-functional effort, requiring work on the backend, frontend, event infrastructure, QA, and client. By far the best thing about working on this project has been my team.
I’m also fresh off of a two-week rotation as a Support Engineer for my squad. As Support Eng, we’re required to basically drop everything that we’re doing and focus exclusively on resolving bugs. I love the adrenaline rush I get when I can reliably reproduce a bug and solve it… but I’m always relieved to come off the rotation because working with a constant sense of urgency (or panic) is exhausting and not sustainable.
Outside of the Eng org, I also co-chair the Diversity and Inclusion group here and ran the second iteration of the I/Own It scholarship. Last year I was the Ambassador for the ADEPT organization for Optimizely.org, our company’s social impact arm. I think John Leonard, who manages Optimizely.org, does very impactful work driving volunteer activities and giving us ownership to run programming ourselves. Last year I ran a day where we hosted high school CS students and ran a couple of clothing drives for St. Anthony’s. I also volunteered as a mentor with BUILD; we worked in small groups to help high schoolers build skills in marketing, technology, and entrepreneurship. The programming that we do in partnership with BUILD is run completely voluntarily by my colleagues — it’s really special.
What’s unique about engineering at Optimizely?
I am surrounded by smart, passionate, collaborative, and wonderful people who genuinely want their peers to succeed. I feel like my peers 110% have my back. We also have a tremendous amount of ownership over our own work.
I feel supported by my manager, Asa, and am constantly pushed to do things that I’m afraid to do. I was lucky to have the chance to be a tech lead/epic owner on two features and to work on the same team as some of the most generous, fun people here.
I hosted a Girl Geek dinner panel. L-R: Kelly, Neha, Elizabeth, Heather, Yours Truly.
This isn’t so much related to engineering, but I suppose it’s illustrative of how I’ve been encouraged to grow: I have a terrible fear of public speaking. In seventh grade, everyone had to give speeches in class; as soon as I opened my mouth to deliver my speech on dreams (wherein I kept pronouncing ‘Freud’ as ‘Frowd’), the piece of gum that had been marinating in my saliva for an hour fell onto the floor. Knowing about my fear of public speaking, my managers found opportunities for me to lead onboarding sessions for technical new hires, and I’ve had the chance to speak on several diversity-related panels here.
Another thing I love is that our leadership is incredibly open — we all have a direct line to Bill, our VP of Engineering; every time I meet with him, I come away with the sense that my opinions matter, and that my feedback will turn into action.
The Engineering organization is very democratic. I love the energy that fills the room any time we run an ADEPT-wide retrospective. It’s like a weird family reunion, but with lots of Post-Its and Sharpie fumes, and the knowledge that our feedback will be heard, considered, and acted upon! I like that we’re not dogmatic about our approaches to work, and that we are flexible.
Lastly, I love that engineers also have a lot of input into product. We’re encouraged to come up with test ideas and to dogfood our own product. (I hate that term. I prefer “drink our own champagne” even though that sounds very Marie Antoinette-ish).
🌸🌺🌼 The Inaugural Floral Jumpsuit Friday 🌸🌺🌼
Also, I’m part of a group called WomEng. We meet at least once a month for lunch and other activities. A few weeks ago we took a self defense class! I love that we have such varied interests — yoga, running, improv, skiing, sewing, art, pool.
What advice do you have for other engineers?
Debugging is your best friend. Have compassion! Realize that everyone comes from a different background and perspective, but that ultimately, everyone wants a good outcome at the end of the day. Pay it forward! I had a great experience onboarding an intern from a completely different background. I went to Hack Reactor, so my programming knowledge is almost 100% in JavaScript. I had to onboard an intern who had a CS degree but had never done web development and had never written JavaScript before. I had to figure out how to teach concepts that were still relatively new to me to someone who was coming from a different perspective. I learned a lot from teaching, especially in the process of figuring out how to take a complex idea and synthesize it down to the most important bits so other people can understand it.
What do you do outside of work?
Outside of work, I like to sew, knit, read, make Kombucha, bike, watch TV shows whose target audience is teenagers (cough Riverdale cough), and write incredibly lame limericks about work. Here are a few:
TODAY I LEARNED I TALK TO MYSELF.
“Excuse me,” says Matt, from my right,
“But I’ve overheard much of your plight.”
“I’m talking?” I ask.
“Yeah! Throughout your whole task!”
(I’d assumed my thoughts were out of ear sight!)
POOR YOYO
Johanna spent hours on the spreadsheet.
She filled it in all nice and neat.
But no autosave
Meant the outcome was grave…
Next time she’ll hit Save on repeat!
SUPPORT ENG ROTATION
The line changes were meant to be few
for the bug at the top of the queue.
It was very unpleasant
to find my assessment
was so poor, it made James stew! | https://medium.com/engineers-optimizely/engineer-q-a-jessica-chong-frontend-engineer-a62fa7994ecb | ['Jessica Chong'] | 2019-04-02 21:43:25.703000+00:00 | ['Interview', 'Front End Development', 'Women In Engineering', 'Software Engineering', 'Engineering Team'] |
4,193 | Can You Keep Google Out of Your Gmail? | Gmail is a great service, but not everyone is comfortable giving Google access to their email. Security expert Max Eddy explains what steps will (and won’t) help keep your messages private.
By Max Eddy
This week, I’m following up on a message from a reader who previously wrote in about how not to get locked out of your accounts when you’re using two-factor authentication, or 2FA. Jeremy from Cape Town also asked whether it’s possible to use 2FA to keep Google out of Gmail.
What Is Two-Factor Authentication?
Two-factor authentication is when you use two authentication factors from a list of a possible three: something you know, something you have, or something you are. A password, for example, is something you know, and a fingerprint is something you are. When you use the two together, you’re using 2FA.
In practical terms, 2FA involves an extra step you take after entering your password to absolutely prove you are who you say you are. This often involves using a one-time code generated from an app or sent via SMS, but there are many other options, including tap-to-login apps like Duo or hardware security keys like those from Yubico and other manufacturers.
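To make those app-generated codes a little less mysterious: most authenticator apps implement the TOTP standard (RFC 6238), which derives a short-lived code from a shared secret plus the current time. Here is a minimal sketch using the pyotp library; the secret is generated on the spot rather than provisioned from a real account’s QR code.

```
import pyotp  # pip install pyotp

# Normally provisioned once, via the QR code a site shows during 2FA setup.
secret = pyotp.random_base32()
totp = pyotp.TOTP(secret)

print(totp.now())               # six-digit code that rotates every 30 seconds
print(totp.verify(totp.now()))  # True; this is how the server side checks a code
```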
2FA is good. You should use it. It’s a great way to keep the bad guys out of your accounts, but it doesn’t appear that it will do much to keep out Google.
Who Sees What?
In general, Google does appear to have access to the content of your emails. Christopher Cuong Nguyen, who lists himself as a former Google employee, wrote on Quora in 2010 that a very small number of employees can access email content, and that a highly regulated path exists for information to be retrieved. Now, this information is almost a decade old, but it does demonstrate that at one point, there were people who could reach into your Gmail account.
Google says that as a law-abiding company, it is required to comply with legal requests for information from governments and law enforcement. This can include the contents of your email messages, although Google points out that it strives to narrow the scope of requests it receives and requires a search warrant before handing over your photos, documents, email messages and more.
There are other ways Google uses your Gmail information. While the company no longer scans messages to generate custom ad content, it famously did so for years. Even now, Gmail parses your messages enough to pull out and highlight travel information and generate type-ahead suggestions when you write messages. Depending on your level of comfort, this might be totally fine or wildly invasive.
Google does appear to encrypt your emails, but primarily while those messages are in transit. Even if those messages are encrypted while at rest on Google’s servers, if Google is managing the encryption keys—and what I have seen implies it does—Google could still conceivably access your messages.
2FA Isn’t the Answer
I can see where Jeremy is coming from with his question. Since I control my Yubikey, and Google doesn’t, if I enable 2FA, Google shouldn’t be able to access my Gmail account. Google can, however, effect changes to accounts that are secured with 2FA.
Firing up one of my non-work Gmail accounts, I clicked the Forgot My Password option. It immediately jumped to alternate options for signing in: sending a text to my phone, using my Yubikey, tapping an alert on a verified phone, sending an email to my recovery email address, answering a security question, entering the date I created my Gmail account, and then finally leaving an email address where I could be reached by Google to address my problem directly. If Google can grant me access to my own account without my password or second factor, that implies that Google can do the same itself.
Even Google’s Advanced Protection Program for Gmail has a kind of recovery option. When enabled, Advanced Protection requires that you enroll two different hardware security keys—one for login and another as a backup. If you lose both keys, Google says this about regaining control of your Advanced Protection Program account:
If you still have access to a logged-in session, you can visit account.google.com and register replacement keys in place of the lost keys. If you have lost both keys and do not have access to a logged-in session, you will need to submit a request to recover your account. It will take a few days for Google to verify it’s you and grant you access to your account.
On balance, it seems like 2FA—even the extreme version of it used in Advanced Protection—is not enough to keep Google itself out of your email. For most people, that’s probably a good thing. Email accounts are an incredibly important part of an individual’s security infrastructure. If you lose a password or have to change a password, an email sent to a verified account is usually part of the process. If an attacker gains access to your email account, they could go on to use the account recovery option on websites to gain access to even more accounts. It’s important that users have the means to regain control of their accounts.
Truly Private Messages
When we talk about what can and cannot be seen in messaging systems, we’re talking about encryption, not authentication. Most services use encryption at different points in the process of sending and storing a message. Gmail, for example, uses TLS when sending a message to ensure it’s not intercepted. When a messaging service of any kind retains the keys used to encrypt your messages when they’re resting on the server, it’s a safe assumption that the company can access those messages themselves.
If you want to keep your Gmail account, but want to make your messages unreadable, you could encrypt those messages yourself. There are numerous encryption plug-ins for Chrome, or you can configure Thunderbird to encrypt your messages with PGP, a commonly used encryption scheme for email. The more expensive Yubico models can also be configured to spit out your PGP key when needed.
I am going to be honest and say that while I am sure some of these work, I have never been able to understand them adequately. The creator of PGP famously said that even he finds the process too convoluted to understand.
What might be easier is using encryption tools to encrypt messages, then attaching or pasting the encrypted output into Gmail. You’d have to coordinate the decryption process on the other end, but the content of the email would not be readable to Google, or anyone else for that matter. Keybase.io is another service that can encrypt, decrypt, or sign text that can be used in an email.
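As one example of that workflow, here is a minimal sketch using the python-gnupg wrapper around a local GnuPG installation. It assumes GnuPG is installed and that your correspondent’s public key has already been imported; the address is a placeholder.

```
import gnupg  # pip install python-gnupg; requires a local GnuPG install

gpg = gnupg.GPG()

# Encrypt to the recipient's public key. The output is ASCII-armored by
# default, so it can be pasted straight into a Gmail compose window.
encrypted = gpg.encrypt('Meet me at noon.', ['friend@example.com'])
assert encrypted.ok, encrypted.status
print(str(encrypted))  # -----BEGIN PGP MESSAGE----- ...

# The recipient reverses this with their private key, for example:
# plaintext = gpg.decrypt(armored_text, passphrase='their key passphrase')
```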
If you absolutely must be sure that no one but you has access to your email, there are a few options. First and foremost would be to ditch Gmail. ProtonMail, from the creators of ProtonVPN, is a service intended to respect your privacy and does so by encrypting all your email messages—including those you send and receive from people using other email providers. Here’s how ProtonMail describes its operation:
All messages in your ProtonMail inbox are stored end-to-end encrypted. This means we cannot read any of your messages or hand them over to third parties. This includes messages sent to you by non-ProtonMail users, although keep in mind if an email is sent to you from Gmail, Gmail likely retains a copy of that message as well.
Another option is to look beyond email. The late 2010s brought about a glut of over-the-top messaging services, which use your data connection instead of your SMS plan to send messages between devices. In recent years, many of those services have adopted end-to-end encryption, meaning that only you and your intended recipient can read your messages. Signal is the best known, and an excellent app in its own right. WhatsApp adopted the Signal protocol, and now encrypts its messages end to end. Facebook Messenger, somewhat ironically, also uses the Signal protocol for its Secret Conversations mode.
Apple’s Messages platform is probably best known for its stickers and Animoji karaoke, but it’s also a remarkably secure messaging system. It’s also notable because, unlike other messaging services, you can send and receive messages on either your phone or your computer without granting Apple access to the content of your messages.
When it comes to using Gmail, I recommend people listen to their guts. If you’re deeply worried about your messages being read by humans or bots, try an alternative. If Gmail is really convenient for you, and you like the features it offers, stick with it. Trying to bend Gmail toward being totally secure is definitely possible, but there are so many easier alternatives. Lastly, 2FA is a great solution for keeping the bad guys out of your accounts, and that’s about it. I wouldn’t rely on it to lock out the owner of a service. | https://medium.com/pcmag-access/can-you-keep-google-out-of-your-gmail-3ec59d0d5e90 | [] | 2019-05-20 21:24:24.996000+00:00 | ['Privacy', 'Cybersecurity', 'Security', 'Google', 'Technology'] |
4,194 | How to Embrace Middle Age. When I got my first letter from AARP… | Photo by Aron Visuals @ Unsplash.com
I stood at my mailbox with the grocery bags hanging on my arm as time seemed to slow. This surreal moment, I presume, was to give my mind a few extra seconds to process what was happening. At first, seeing the unmistakable logo didn’t really affect me until I realized the letter from AARP was addressed to me.
“This is it. I’m here. I’m officially old.”
There are fewer years in front of me than behind me and the last 10 or so could have me eating from a spoon and making macaroni art.
But before allowing myself to spiral down a rabbit hole of depression while contemplating my mortality I thought, “I wonder if there are any cool discounts” and tore open the envelope.
There weren’t.
In the weeks that followed, my thoughts swirled like creamer in coffee. I wondered whether my life has made any kind of impact on the world. I reflected on the few life accomplishments in my 53 years of existence. I remembered how much I really want to travel. Is the Red Hat Society still around?
I figured it was time to get that pink vintage camper I always dreamed about and take those road trips I put off for the last 30 years. Almost instantly visiting my family became a priority.
It was also after receiving the letter that it became clear the entire world has been noticing my age even though I haven’t. Have I been in denial the entire time?
It feels like I woke up from a coma and am now on the other side of middle age.
I notice things that I haven’t before. Fine lines aren’t fine lines anymore. They’re wrinkles. People call me ma’am. I’m not as physically strong as I was and now use tools to do things like open pickle jars and remove bottle caps. I threw my shoulder out trying to start the lawnmower last summer. I could go on….and on…and on.
In order to make myself feel better, it only made sense to call my sister and invite her to my pity party. She’s 15 years older than me and always knows what to say to help me feel better.
After telling her of my life-altering ah-ha moment, she said, “You think that’s bad? I get mail from crematoriums and funeral homes!”
What the heck??!!
Her tone turned serious, yet warm and she simply said, “Sissy, make your memories NOW. You’re wondering where 30 years went? You have 10–15 good years left before you REALLY start slowing down. Enjoy the time you have left. Make these years count.”
She’s right.
This IS the best time of my life. The more I think about it, the more I realize there are a LOT of reasons why being over 50 is fabulous. It’s so freeing. All those ideas and beliefs I thought were so important and that I struggled with just aren’t that important. It didn’t take long to find the first 12 reasons why being over 50 is awesome.
1) Grandchildren! I should’ve had them first. Oh my heart! When you have kids you don’t think it’s possible for your heart to love any more. Then grandkids happen.
2) I’m at that age where I can date a 38-year-old or his father. Options baby, options! Just not both. That would be weird.
3) Some places will give me the senior discount anyway. Who knew?
4) My minivan days are over. Geez, I hated that thing. Yep, that’s me in the roadster. BTW I’ve also noticed that “oh yeah” nod from other middle-agers in their sports or luxury cars.
5) I’m the kooky/eccentric/hippie mom and I’m okay with that.
6) Mumus are being considered as a wardrobe option.
7) I’m starting to wonder if you can win enough at bingo to make a living.
8) No more hosting giant family holiday dinners. That baton has thankfully been passed to my children. As much as I enjoy entertaining, it simply got to be too much. Now I just show up with a dish and help clean up.
9) Vintage camper or converted school bus living is a very real possibility. Tiny home? RV?
10) I can enroll in college for whatever I WANT, without regard to whether or not my degree will help me in my career. Yarn dying, anthropology, hip hop dance, sociology, astrophysics, yak breeding.
11) No more periods! That alone makes aging awesome.
12) Errands consist of booze runs, craft stores, and yard sales not soccer practice, PTA meetings, and dry cleaners.
The more I think about it, the more I’m falling in love with my later years. It’s time to start checking off the items on my bucket list.
Oh yeah….I AM joining the Red Hat Society. | https://medium.com/crows-feet/embracing-middle-age-celebrating-and-enjoying-the-rest-of-my-life-554c020be1ec | ['Angelica Mordant'] | 2020-03-07 22:30:18.368000+00:00 | ['Life', 'Life Lessons', 'Positive Thinking', 'Psychology', 'Women'] |
4,195 | Robert Service — The Poet of the Yukon | A Short Biography of the Preston-born Balladeer, Poet and Novelist
Robert with Marlene Dietrich during the filming of The Spoilers, 1942. Image: wikipedia
Having written about Jack London recently, and the new film version of his novel, The Call of the Wild, I thought it was time to take another look at Robert Service, the poet, balladeer and novelist who, like London, made a fortune out of the Yukon and Klondike gold rushes through his chosen form of literature rather than through gold prospecting, which both men realised early on was a fool’s errand.
Both men were born within two years of each other (1874 and 1876), with Service the elder. Both writers grew up during a great flowering of American literature, with writers such as Frank Norris, Stephen Crane, Theodore Dreiser, Bret Harte, Ambrose Bierce, and Henry James.
Although Service read Burns at a young age, both men were influenced early by Rudyard Kipling: London discovered him in prison as a disruptive youth, Service out of boredom while working in a Scottish bank, his lunch breaks spent reading. Both men sought adventure and found it. Sadly, London died in 1916, aged only forty, during WWI, while Service died aged eighty-four at the height of Rock ’n’ Roll.
Jack London’s legacy lives on. Robert Service is almost forgotten. Surely, it must be time for a film about him: it’s a good story.
Service at his desk in the early 1930s. Image: Yukon Info
Robert Service’s father, also called Robert, was born in Glasgow in 1837 — the year of Queen Victoria’s succession to the throne — where, at the age of fourteen, he became a clerk at the Commercial Bank of Scotland. Eighteen years later, in 1869, with no prospect of ever becoming anything other than a clerk, he decided to move to Lancashire and the prosperous town of Preston, which had, with the end of the American Civil War, regained once more its place at the centre of the cotton spinning and weaving industries. It was also a centre for banking, having created its own bank, The Preston Bank, at 39 Fishergate in 1844. Robert was sure his prospects of promotion within the banking industry would be assured in Preston. He applied to The Preston Bank for the position of clerk and was accepted. The site is today occupied by the NatWest and the Abbey National.
Sadly, things were to be no different in Preston than they had been in Glasgow, and by 1873 Robert had given up any idea of promotion, settling down to the daily task of recording figures and serving customers. Although something of a loner, Robert senior nevertheless enjoyed the hustle and bustle of Preston and took every chance he could to make his way out into the countryside, where he’d walk for miles and perhaps read from a small volume of poetry he often kept in his coat pocket.
Emily Parker — Robert Service’s mother, whose family were originally from Liverpool — was born in 1854. Her father, James Parker, had been born in the Lancashire town of Clitheroe soon after the Battle of Waterloo, in 1815. Emily’s mother Ann and her father were both staunch Wesleyan Methodists who had met and fallen in love after a service at Clitheroe’s Methodist Chapel. They married in 1835, and moved to Preston in 1838, where James started a wholesale grocery and tea importing business, which had, by 1872, become hugely successful, with impressive business premises on Church Street and a large Georgian mansion in the prestigious Winckley Square.
The Parkers moved in the very best Preston social circles, with James becoming a Conservative councillor in the 1850s.
With the sudden death of Ann Parker, in 1872, Emily was at last free to look for a husband, and found him in The Preston Bank.
Emily was a pretty girl who had often been to The Preston Bank with her father, and it was probably on these visits that she noticed the rather portly but distinguished-looking Robert Service working behind the bank’s counter.
Emily set her sights on him and eventually won his heart, resulting in the couple eloping to Gretna Green to be married.
Robert Service was born in the Christian Road house, Preston, on January 16th 1874, but it would take his father six weeks to register the boy’s birth. Robert Service would later write of his father that there was an “…other-worldliness and irresponsibility about him…” that brought out both irritation and admiration in Robert Service the poet who would always have a soft spot for his old man.
On the 24th of November 1875, Robert’s maternal grandfather, James Parker, died of cancer, leaving, it was estimated by the Preston Guardian, anywhere between £50,000 and £100,000.
In fact, according to James Parker’s will, he only left £18,000, of which £4,000 went to his housekeeper, with £2,300 going to Emily, Robert’s mother. On the strength of this legacy, the Service family moved from Christian Road to 27 Latham Street, a slightly more genteel address just a tad closer to the prestigious Winckley Square.
Robert’s father then gave up his position at the bank, setting himself up as an independent insurance agent. As one might imagine, things didn’t work out, and in the spring of 1878 the family packed their belongings and caught a train to Glasgow, Robert senior’s home town, settling in the select and elegant Lansdowne Crescent, where Robert’s father had another go at selling insurance.
Elegant it may have been, but 29 Lansdowne Crescent was a very small apartment indeed, with the consequence that, with Emily again pregnant (the Services already had five children), it was decided to off-load the two older boys, Robert and John, onto John Service (their paternal grandfather) and his family, which included three maiden aunts, in the small town of Kilwinning, where the boys would remain for the next few years.
As Service biographer, James Mackay, writes:
“…Kilwinning, a small burgh and market town of some five thousand souls, situated on the right bank of the River Garnock in north Ayrshire, about twenty-four miles south-west of Glasgow.
“ An account of Kilwinning in 1851 dismisses it as comprising ‘one street, a few lanes and a square called the Green…”
It would seem, even with a good deal of house building in the years from 1851 to 1878, the place still felt “…like a village…” and was full of Service family off-shoots, not least grandfather John Service — who was also postmaster of Kilwinning — and his wife Agnes, who raised the two brothers in the Post Office, as well as looking after the aunts whose company Robert seemed to enjoy.
The postmaster often talked of his own grandfather and how he’d been a friend of the poet Robert Burns, which, because of the age difference, seems unlikely; nevertheless, as these things often do, the story stuck, and it was something of a fortuitous myth for an up-and-coming poet. Maybe it also encouraged Service to read Burns, and write his first poem — a grace — at the age of six?
The postmaster was an easy-going sort of chap until Sunday came along, when he metamorphosed into a very strict adherent to the Sabbath. The Post Office was closed, with silence demanded about the house, especially at breakfast. There must be no reading of newspapers or books, and no singing of hymns in the house. The family then waited for the church bells to ring, and as the black-coated and frocked worshippers made their way down the long street, others would join them in silence from their homes. And although Robert Service, as a child, found the whole thing tiresome, he was already observing people and their ways. It would all go into his work in later years.
When Robert was nine, he and his younger brother John moved back to Glasgow and their parents. The boys attended Hillhead School, leaving aged fifteen. Robert, like his father, found work in a bank.
In the late 1890s Robert realised banking was not for him and left to find adventure in the US and Canada (he was a great reader of adventure stories). He tried his hand at various jobs, even working in a bank again, but he couldn’t settle. And, like London before him, he realised he also wanted to be a writer.
But what sort of writer?
Like Jack London, Robert Service took to the west coast roads, living rough, roads that eventually took him to the Yukon (a mighty long walk), the furthest point of north-west Canada.
It was to be the making of him: he realised quickly during this vagabond period that he could write a much looser, rhyming, Kiplingesque ballad-style long-form poetry that, during the deep-frozen Yukon winters, could keep a bar room full of hard-living, hard-drinking gold prospectors entertained.
Robert Service was a hit, with his first book of ballads, Songs of a Sourdough, published in 1907 to huge success.
By 1908, still in the Yukon surrounded by miners, Service became rather chained to his desk as he got stuck into his second volume. As his biographer, James Mackay, writes:
“ With the onset of the winter of 1908 Robert got down to serious writing, producing his second book in four months, working from midnight till three in the morning. Any other hours were impossible because of the rumpus about him. Robert’s colleagues whooped it up every evening, but he would retire to bed at nine and sleep till twelve, then make a pot of strong, black tea and begin to write.”
When his publishers received his manuscript they were rather perturbed about certain poems, their violence and vulgarity, and couldn’t promise to publish unless they were removed, at which point Service threatened to take the MSS to another publisher. Eventually a compromise was reached with Briggs the publisher agreeing to pay Service an extra 5% in royalties for the removal of just one offending poem. When Ballads of a Cheechako was published later in 1908 it was another huge success, with Robert receiving a cheque for $3,000 within days of its publication. Robert Service was the best agent he ever had.
Thirteen more volumes of ballads and poetry followed, along with six novels, three volumes of non-fiction, several popular songs, numerous articles, plus fifteen collections of his verse. Several of his novels were made into movies, all of which earned him a great deal of money, allowing him to work just four months of the year, with the rest of his time spent relaxing, ice skating and bob-sleighing, and travelling. He had achieved the goal he’d set himself after the publication of Sourdough.
Service moved to Paris in 1913, living in the Latin Quarter and posing as an artist. Then, in June 1913, he married Parisienne Germaine Bourgoin, daughter of a distillery owner in France. She was thirteen years younger than Robert.
With the onset of WWI, Service worked briefly as a war correspondent for the Toronto Star (later Hemingway’s paper), then as an ambulance driver with the American Red Cross, as would Hemingway.
During the winter of 1917 Service moved his family to the south of France, and it was in Menton that Doris, one of his twin daughters, caught scarlet fever and died. Fearing that their other twin daughter, Iris, may catch the disease, Robert moved his wife and daughter to their summer home in Lancieux. Then, after hearing about a devastating Zeppelin raid on Canadian troops, Robert offered to help the war effort in any way he could (he was forty-one), resulting in an attachment to the Canadian Expeditionary Force “…with a commission to tour France, reporting back on the activities of the troops.”
As a result of that attachment, and the brutality of the fighting he witnessed, Service wrote a series of war poems that are amongst the very best of his work.
After the war the Service family lived in the south of France before returning to Paris.
And although he lived in Paris at the same time as Hemingway, Scott Fitzgerald, Ezra Pound and James Joyce, he never met them and, because of the generational gap, had probably never heard of them. It is certain they had heard of him, and may even have read his work.
In 1920, Robert Service was worth some $600,000 (around $90m today), with a good deal of it invested in stocks and shares. But that same year saw a sudden drop (50%) in share prices that reduced his investment values hugely. The poet didn’t hesitate and re-invested his money in life annuities with some of the biggest insurance companies in the US. It was a wise move that kept him in comfort for the rest of his life. Had he not done so, he would have been wiped out in the crash of 1929.
Of the six novels he wrote, three were thrillers, written in Paris in the 1920s, all of which were turned into silent movies.
When wintering in Nice during the twenties, Robert would often be seen dining with Somerset Maugham and H. G. Wells, who were of the same generation and equally rich.
Throughout the interwar years Service and his family travelled widely in Europe and often, like Somerset Maugham, spent many months in the Far East, usually ending up in California, where they mixed with the Hollywood crowd.
During WWII, the family settled in the US, with Robert working for the government raising War Bonds.
After the war they returned to France, eventually settling back at their home in Lancieux, a home that had been turned into a German gun emplacement during WWII, with Robert’s precious library utterly destroyed. Robert Service died there, from a “…wonky heart…”, on September 11th, 1958.
Germaine Service survived him by thirty-one years, dying aged one hundred and two on December 26th, 1989, in Monte Carlo.
Interestingly, Iris Service married the manager of Lloyd’s Bank in Monte Carlo, in 1952.
Robert Service is perhaps best known now for “The Shooting of Dan McGrew”, which would have been okay with him as he thought of himself as a writer of verse, and not as a poet. He was much more than that.
And James Mackay’s 1996 brilliant biography of Service, Vagabond of Verse, sets the record straight. | https://stevenewmanwriter.medium.com/robert-service-the-poet-of-the-yukon-e48a44113251 | ['Steve Newman Writer'] | 2020-02-20 17:39:10.016000+00:00 | ['Poetry', 'Books', 'Biography', 'Literature', 'History'] | Title Robert Service — Poet YukonContent Short Biography Preston born Balladeer Poet Novelist Robert Marlene Dietrich filming Spoiler 1942 Image wikipedia written Jack London recently new film version novel Call Wild thought time take another look Robert Service poet balladeer novelist like London made fortune Yukon Klondike gold rush chosen form literature gold prospecting realised early fool errand men born within two year 1874 1876 Service elder writer grew great flowering American literature writer Frank Norris Stephen Crane Theodore Dreiser Bret Harte Ambrose Bierce Henry James Although Service read Burns young age influenced Rudyard Kipling early London discovering prison disruptive youth Service way boredom working Scottish bank lunch break spent reading men sought adventure found Sadly London died aged fifty WWI Service dying aged eightyfour height Rock ’n’ Roll Jack London’s legacy life Robert Service almost forgotten Surely must time film it’s good story Service desk early 1930s Image Yukon Info Robert Service’s father also called Robert born Glasgow 1837 — year Queen Victoria’s succession throne — age fourteen became clerk Commercial Bank Scotland Eighteen year later 1869 prospect ever becoming anything clerk decided move Lancashire prosperous town Preston end American Civil War regained place centre cotton spinning weaving industry also centre banking created bank Preston Bank 39 Fishergate 1844 Robert sure prospect promotion within banking industry would assured Preston applied Preston Bank position clerk accepted site today occupied NatWest Abbey National Sadly thing different Preston Glasgow 1873 Robert given idea promotion settling daily task recording figure serving customer Although something loner Robert senior nevertheless enjoyed hustle bustle Preston took every chance could make way countryside he’d walk mile perhaps read small volume poetry often kept coat pocket Emily Parker — Robert Service’s mother whose family originally Liverpool — born 1854 father James Parker born Lancashire town Clitheroe soon Battle Waterloo 1815 Emily’s mother Ann father staunch Wesleyan Methodists met fallen love service Clitheroe’s Methodist Chapel married 1835 moved Preston 1838 James started wholesale grocery tea importing business 1872 become hugely successful impressive business premise Church Street large Georgian mansion prestigious Winckley Square Parkers’ moved best Preston social circle James becoming Conservative councillor 1850s sudden death Ann Parker 1872 Emily last free look husband found Preston Bank Emily pretty girl often Preston Bank father probably visit noticed rather portly rather distinguished looking Robert Service working behind bank’s counter Emily set sight eventually heart resulting couple eloping Gretna Green married Robert Service born Christian Road house Preston January 16th 1874 would take father six week register boy’s birth Robert Service would later write father “…otherworldliness irresponsibility him…” brought irritation admiration Robert Service poet would always soft spot old man 24th November 1875 Robert’s maternal grandfather James Parker died cancer leaving estimated Preston Guardian anywhere £50000 — £100000 fact according James Parker’s 
left £18000 £4000 went housekeeper £2300 going Emily Robert’s mother strength legacy Service family moved Christian Road 27 Latham Street slightly gentile address tad closer prestigious Winckley Square Robert’s father gave position bank setting independent insurance agent one might imagine thing didn’t work spring 1878 family packed belonging caught train Glasgow Robert senior’s home town settling select elegant Lansdowne Crescent Robert’s father another go selling insurance Elegant may 29 Lansdowne Crescent small apartment indeed consequence Emily pregnant Services already five child decided offload two older boy Robert John onto John Service paternal grandfatherand family included three maiden aunt small town Kilwinning boy would remain next year Service biographer James Mackay writes “…Kilwinning small burgh market town five thousand soul situated right bank River Garnock north Ayrshire twentyfour mile southwest Glasgow “ account Kilwinning 1851 dismisses comprising ‘one street lane square called Green…” would seem even good deal house building year 1851 1878 place still felt “…like village…” full Service family offshoot least grandfather John Service — also postmaster Kilwinning — wife Agnes raised two brother Post Office well looking aunt whose company Robert seemed enjoy postmaster often talked grandfather he’d friend poet Robert Burns age difference seems unlikely nevertheless thing often story stuck something fortuitous myth coming poet Maybe also encouraged Service read Burns write first poem — grace — age six postmaster easy going sort chap Sunday came along metamorphosed strict adherent Sabbath Post Office closed silence demanded house especially breakfast must reading newspaper book singing hymn house family waited church bell ring black coated frocked worshiper made way long street others would join silence home although Robert Service child found whole thing tiresome already observing people way would go work later year Robert nine younger brother John moved back Glasgow parent boy attended Hillhead School leaving aged fifteen Robert like father found work bank late1890s Robert realised banking left find adventure great reader adventure story must included Jack London Famous Fantastic Mysteries US Canada tried hand various job even working bank couldn’t settle like London realised also wanted writer sort writer Like Jack London Robert Service took west coast road living rough road eventually took Yukon mighty long walk furthest point northwest Canada making realising quickly vagabond period could write much looser rhyming balladstyle Kiplingesque long form poetry deep frozen Yukon winter could keep bar room full hard living hard drinking gold prospector entertained Robert Service hit first book ballad Songs Sourdough published 1907 huge success 1908 still Yukon surrounded miner Service became rather chained desk got stuck second volume biographer James Mackay writes “ onset winter 1908 Robert got serious writing producing second book four month working midnight till three morning hour impossible rumpus Robert’s colleague whooped every evening would retire bed nine sleep till twelve make pot strong black tea begin write” publisher received manuscript rather perturbed certain poem violence vulgarity couldn’t promise publish unless removed point Service threatened take MSS another publisher Eventually compromise reached Briggs publisher agreeing pay Service extra 5 royalty removal one offending poem Ballads Cheechako published later 1908 another huge success Robert receiving 
cheque 3000 within day publication Robert Service best agent ever Thirteen volume ballad poetry followed along six novel three volume nonfiction several popular song numerous article plus fifteen collection verse Several novel made movie earned great deal money allowing work four month year rest time spent relaxing ice skating bobsleighing travelling achieved goal he’d set publication Sourdough Service moved Paris 1913 living Latin Quarter posing artist June 1913 married Parisienne Germaine Bourgoin daughter distillery owner France thirteen year younger Robert onset WWI Service worked briefly war correspondent Toronto Star later Hemingway’s paper ambulance driver American Red Cross would Hemingway winter 1917 Service moved family south France Menton Doris one twin daughter caught scarlet fever died Fearing twin daughter Iris may catch disease Robert moved wife daughter summer home Lancieux hearing devastating Zeppelin raid Canadian troop Robert offered help war effort way could fortyone resulting attachment Canadian Expeditionary Force “…with commission tour France reporting back activity troops” result attachment brutality fighting witnessed Service wrote series war poem amongst best work war Service family lived south France returning Paris although lived Paris time Hemingway Scott Fitzgerald Ezra Pound James Joyce never met generational gap probably never heard certain heard may even read work 1920 Robert Service worth 600000 around 90m today good deal invested stock share year saw sudden drop 50 share price reduced investment value hugely poet didn’t hesitate reinvested money life annuity biggest insurance company US wise move kept comfort rest life done would wiped crash 1929 six novel wrote three thriller written Paris 1920s turned silent movie wintering Nice twenty Robert would often seen dining Somerset Maugham H G Wells generation equally rich Throughout interwar year Service family travelled widely Europe often Somerset Maugham spending many month fareast usually ending California mixed Hollywood crowd WWII family settled US Robert working government raising War Bonds war returned France eventually settling back home Lancieux home turned German gun emplacement WWII Robert’s precious library utterly destroyed Robert Service died “…wonky heart…”on September 11th 1958 Germaine Service survived thirtyone year dying aged onehundred andtwo December 26th 1989 Monte Carlo Interestingly Iris Service married manager Lloyd’s Bank Monte Carlo 1952 Robert Service perhaps best known “The Shooting Dan McGrew” would okay thought writer verse poet much James Mackay’s 1996 brilliant biography Service Vagabond Verse set record straightTags Poetry Books Biography Literature History |
4,196 | In search of better agriculture and food sector outcomes in Punjab Province | * A five-year program seeks to empower small-scale farmers and strengthen markets in Punjab province in Pakistan
* The transformation process is essential to boost sustainable growth and tackle persistent malnutrition in a province where about 40% are employed in agriculture and about 40% of children under age 5 are stunted
* A recent visit to Punjab provides snapshots of the opportunities and challenges involved
What if public expenditure and regulations could be designed to deliver more results-per-rupee in the agriculture and food sector of Punjab province in Pakistan? What if government spending resulted in more poverty reduction, higher resilience, more business opportunities, and better nutrition? What would a smarter food economy look like? Who would benefit and who would stand to lose?
A year and a half after the start of a five-year Punjab Agriculture and Rural Transformation Program, the answers to these questions are still being formulated, as reforms and modernization attempts are being made in the fields, market lanes and offices of Punjab, Pakistan’s largest province. But one thing is clear: there is appetite for change.
Although public support for agriculture totaled about US$ 1.3 billion in 2017, growth has been low and erratic in the last few years, holding back a sector that provides about 40 percent of employment and contributes more than 20 percent of provincial GDP. Nor is the sector providing adequate nutrition: a survey found 39.2 percent of children under 5 to be stunted in Punjab.
The program known by its acronym SMART, supported by a World Bank program-for-results loan, seeks to remove some of the obstacles to growth by introducing policy and regulatory changes, and technological innovations.
A visit to the province in July 2019 provided multiple snapshots of the opportunities and challenges involved in the transformation process. | https://medium.com/world-of-opportunity/in-search-of-better-agriculture-and-food-sector-outcomes-in-punjab-province-428ffeb261c3 | ['World Bank'] | 2019-09-06 15:02:45.177000+00:00 | ['Health', 'Poverty', 'Agriculture', 'Data', 'Food'] | Title search better agriculture food sector outcome Punjab ProvinceContent fiveyear program seek empower small scale farmer strengthen market Punjab province Pakistan transformation process essential boost sustainable growth tackle persistent malnutrition province 40 employed agriculture 40 child age 5 stunted recent visit Punjab provides snapshot opportunity challenge involved public expenditure regulation could designed deliver resultsperrupee agriculture food sector Punjab province Pakistan government spending resulted poverty reduction higher resilience business opportunity better nutrition would smarter food economy look like would benefit would stand lose year half start fiveyear Punjab Agriculture Rural Transformation Program answer question still formulated reform modernization attempt made field market lane office Punjab Pakistan’s largest province one thing clear appetite change Although public support agriculture totaled US 13 billion 2017 growth low erratic last year holding back sector provides 40 percent employment contributes 20 percent provincial GDP sector providing adequate nutrition survey found 392 percent child 5 stunted Punjab program known acronym SMART supported World Bank programforresults loan seek remove obstacle growth introducing policy regulatory change technological innovation visit province July 2019 provided multiple snapshot opportunity challenge involved transformation processTags Health Poverty Agriculture Data Food |
4,197 | What’s on Mind ? | Whats on your mind?
They asked
Different worlds
Or happy thoughts.
There is a dark tunnel
Which leads to hell
And this is all
Left to imagine.
A slideshow flash before my eyes
Of people I love and hate
The happy thoughts of future
Are far behind
Fear takes it all away
Does it happen to you too?
The trembles pass down my spine
And the hair stands erect at their end
Negative thoughts have filled me up
And the tunnel will take me to hell.
On the other side,
I saw a shrub blooming
Among the dark clouds
It gave courage and strength
Maybe that is all what I want. | https://medium.com/poets-unlimited/whats-on-mind-54aa222641eb | ['Nalini Gupta'] | 2017-09-21 21:17:04.661000+00:00 | ['Deep Thoughts', 'Poetry', 'Fiction', 'Writing', 'Poetry On Medium'] | Title What’s Mind Content Whats mind asked Different world happy thought dark tunnel lead hell Left imagine slideshow flash eye people love hate happy thought future far behind Fear take away happen tremble pas spine hair stand erect end Negative thought filled tunnel take hell side saw shrub blooming Among dark cloud gave courage strength Maybe wantTags Deep Thoughts Poetry Fiction Writing Poetry Medium |
4,198 | Why I Stopped Forgiving People… and Maybe You Should, Too | Why I Stopped Forgiving People… and Maybe You Should, Too
Forgiveness was turning me into a chump
Image by Timisu
Forgive and forget, right? I used to think that. I don’t anymore.
Now look, I know I’m up against some real heavy hitters here when I say maybe you should stop forgiving people. After all, isn’t forgiveness a cornerstone of the major religions?
The Tanakh says that one who forgives an insult keeps a friend (Proverbs 17:9). The Christian New Testament says to forgive, if you have anything against anyone (Mark 11:25). The Qur’an says that one who forgives shall have reward with God (42.40). The Vedas say that forgiveness is the greatest strength (Mahābhārata 5.33.48).
Well, I tried that. For decades, I forgave people who did me wrong. And you know what? It made me feel good about myself. But then one day, I woke up. And I realized that forgiving people was turning me into a chump.
No Good Deed Goes Unpunished
What changed my mind was Frankie. We met in high school, and despite our differences became friends. But at some point, he started going down roads I didn’t particularly want to be on. It wasn’t just all the New Age stuff he and his girlfriend Hailey were into. It was their physically abusive relationship, which he made no apologies for, along with his white suburban Marxism and what I considered an abuse of psychedelics.
I reached my breaking point one night shortly after his marriage (not to Hailey) when he invited me up to his new place, saying we’d go out and shoot some pool. Turns out, he hadn’t bothered to tell his new wife about these plans, and I found myself cooling my heels out in the hallway while overhearing a knock-down-drag-out argument. Eventually he walked out, cool as a cucumber, and drove us to a pool hall somewhere in Atlanta. Almost as soon as we arrived he excused himself and slipped into the back to score some acid. I ended up playing 9-ball with total strangers for over an hour.
I’d just had the bartender call me a cab, hoping the driver could find Frankie’s place from my memory of the directions, which were on a scrap of paper in my car (this was before cell phones and GPS), when Frankie reappeared, all smiles. His guy didn’t have the stuff, so they’d had to go get it. After that night, I didn’t see Frankie for two or three years.
Until the day he showed up on my front porch. His wife was divorcing him. All his stuff was in his car.
It wasn’t much stuff. He’d sold most of it, including his guitars. “What’d you do,” I asked, “have a yard sale?” Frankie let out a quick “Ha!” Apparently, he had other venues for selling things.
So he stayed a couple weeks.
Back then, I was using the “envelope method” of budgeting, and a couple of times I could swear I’d had more cash in this or that envelope than was now there. But since I didn’t write down the running amounts, I couldn’t be sure. I told myself I was being paranoid, misjudging my friend, letting old scores make me suspicious. Then again, why was it that I, of all people, was seemingly the only option he had to turn to?
A few days after he split, I went to get my checkbook out of my desk. And didn’t see it. Uh-oh. Rummaging around, I found it, and felt ashamed of myself. Until I thought, “Wait, where’s the watch?”
My grandfather’s silver pocket watch. (Yes, like in Pulp Fiction — except my grandfather carried his in his vest and never took it to war.) I knew the watch was in that drawer. Except now, it wasn’t. I searched the house. No dice. It was gone. I had no idea where Frankie had lit out to. And I’ve never seen him since.
The Trap of Forgiveness
It took a while, but I forgave Frankie. For the night at the pool hall, the cash in the envelopes, even the watch, which was irreplaceable. Being angry wasn’t doing me any good. And the odds of getting restitution were pretty much zero. It felt like the right thing to do, like it made me a better person.
Then came the day, many years later, when I decided to finally go through all my old photographs, put dates and names on the backs, organize them, and toss out the ones I didn’t want to keep. I pulled out all the photo albums and Fotomat envelopes. And there, in the back of the drawer, was the watch.
That’s when it hit me. Forgiveness had made me a chump.
I realized then that forgiving Frankie hadn’t really made me a better person. It had only made me feel like one. But not just better than who I’d been before. Better than Frankie. It was a way of permanently casting Frankie as the offender and myself as the victim. I got to be blameless, and he got to be the villain.
Truth was, if I was honest with myself, I’d had my own problems in relationships. I wasn’t physically abusive, but I knew how to turn the emotional screws when I wanted to. And come to think of it, I’d pulled a vanishing act on very close friends a couple of times myself, and for no more noble reasons than he had. And, too, I’d taken my own flights of fantasy into strange philosophies and mystical nonsense. And I could be a downright arrogant sumbich.
I was no better than him. After all, we became friends for a reason, didn’t we? But by “forgiving” him, I got to pretend to myself that I was. That he was down there and I was up here. My “forgiveness” had never been about him. It had been about me the whole time. It was a shaming moment. And it changed my attitude.
If Not Forgiveness, What?
There are lots of stories told about the Buddha, to illustrate his teachings. In one of them, a man decides to test the Buddha by insulting him. If he were to react with anger, he would show himself to be a fraud. If he did nothing, he would reveal himself as a coward.
So the man found the Buddha sitting with his disciples, walked up to him, and spat right in his face.
The Buddha wiped off the spittle with the hem of his garment, looked up at the man, and said, “What now? What else do you have to say?”
The man was not prepared for this question. He turned and left in silence and went home. And that night, he could not sleep for shame at what he had done.
The next day, he again found the Buddha sitting with his disciples, and he bowed to him and said, “Sir, please forgive me for what I did to you yesterday.”
“I’m afraid that’s not possible,” responded the Buddha. “I cannot forgive you. Because I have no grudge against you. Please, sit down, and let us talk of something else.”
Returning to Here and Now
There is a technique in Buddhist counseling to ask the person seeking help to focus on what is going on at the moment. If a person is angry about an argument they have had with their spouse, they might be asked “So where is your spouse right now?” And then they might be asked “Where is your argument?”
The argument no longer exists. What exists is simply who we are at this moment.
We do not need to “let go of the argument” because there is no argument to let go of. There is only who we are now, where we are now. Once we see this, we can get to the truly important question: What next? What do I choose to do at this time? What karma, what result, do I intend to create?
Doing this, we can escape the trap of forgiveness, the self-serving urge to cast ourselves as the victim and the other as the offender, ourselves as the good guy and the other as the bad guy. We can recognize our responsibility to decide how we are going to act, and let go of our desire to protect our own ego. And believe it or not, we can do this for offenses a lot more heinous than petty larceny (real or imagined). Just ask the Vietnamese monk Thích Nhất Hạnh. For me, it has turned out to be the key to getting past things I am not yet ready to write about, and maybe never will be. I don’t have to carry them anymore.
I have no idea where Frankie is today. I’ll probably never see him again. But wherever you are, Frankie, I owe you one. | https://medium.com/illumination-curated/why-i-stopped-forgiving-people-and-maybe-you-should-too-37fc61e00a22 | ['Paul Thomas Zenki'] | 2020-11-22 02:20:56.949000+00:00 | ['Psychology', 'Personal Growth', 'Zen', 'Buddhism', 'Forgiveness'] | Title Stopped Forgiving People… Maybe TooContent Stopped Forgiving People… Maybe Forgiveness turning chump Image Timisu Forgive forget right used think don’t anymore look know I’m real heavy hitter say maybe stop forgiving people isn’t forgiveness cornerstone major religion Tanakh say one forgives insult keep friend Proverbs 179 Christian New Testament say forgive anything anyone Mark 1125 Qur’an say one forgives shall reward God 4240 Vedas say forgiveness greatest strength Mahābhārata 53348 Well tried decade forgave people wrong know made feel good one day woke realized forgiving people turning chump Good Deed Goes Unpunished changed mind Frankie met high school despite difference became friend point started going road didn’t particularly want wasn’t New Age stuff girlfriend Hailey physically abusive relationship made apology along white suburban Marxism considered abuse psychedelics reached breaking point one night shortly marriage Hailey invited new place saying we’d go shoot pool Turns hadn’t bothered tell new wife plan found cooling heel hallway overhearing knockdowndragout argument Eventually walked cool cucumber drove u pool hall somewhere Atlanta Almost soon arrived excused slipped back score acid ended playing 9ball total stranger hour I’d bartender call cab hoping could find Frankie’s place memory direction house mine scrap paper car cell phone GPS Frankie reappeared smile guy didn’t stuff they’d go get night didn’t see Frankie two three year day showed front porch wife divorcing stuff car wasn’t much stuff He’d sold including guitar “What’d do” asked “have yard sale” Frankie let quick “Ha” Apparently venue selling thing stayed couple week Back using “envelope method” budgeting couple time could swear I’d cash envelope since didn’t write running amount couldn’t sure told paranoid misjudging friend letting old score make suspicious people seemingly option turn day split went get checkbook desk didn’t see Uhoh Rummaging around found felt ashamed thought “Wait where’s watch” grandfather’s silver pocket watch Yes like Pulp Fiction — except grandfather carried vest never took war knew watch drawer Except wasn’t searched house dice gone idea Frankie lit I’ve never seen since Trap Forgiveness took forgave Frankie night pool hall cash envelope even watch irreplaceable angry wasn’t good odds getting restitution pretty much zero felt like right thing like made better person came day many year later decided finally go old photograph put date name back organize toss one didn’t want keep pulled photo album Fotomat envelope back drawer watch That’s hit Forgiveness made chump realized forgiving Frankie hadn’t really made better person made feel like one better I’d Better Frankie way permanently casting Frankie offender victim got blameless got villain Truth honest I’d problem relationship wasn’t physically abusive knew turn emotional screw wanted come think I’d pulled vanishing act close friend couple time noble reason I’d taken flight fantasy strange philosophy mystical nonsense could downright arrogant sumbich better became friend reason didn’t “forgiving” got pretend “forgiveness” never whole time shaming moment changed attitude 
Forgiveness lot story told Buddha illustrate teaching one man decides test Buddha insulting react anger would show fraud nothing would reveal coward man found Buddha sitting disciple walked spat right face Buddha wiped spittle hem garment looked man said “What else say” man prepared question turned left silence went home night could sleep shame done next day found Buddha sitting disciple bowed said “Sir please forgive yesterday” “I’m afraid that’s possible” responded Buddha “I cannot forgive grudge Please sit let u talk something else” Returning technique Buddhist counseling ask person seeking help focus going moment person angry argument spouse might asked “So spouse right now” might asked “Where argument” argument longer exists exists simply moment need “let go argument” argument let go see get truly important question next choose time karma result intend create escape trap forgiveness selfserving urge cast victim offender good guy bad guy recognize responsibility decide going act let go desire protect ego believe offense lot heinous petty larceny real imagined ask Vietnamese monk Thích Nhất Hạnh turned key getting past thing yet ready write maybe never don’t carry anymore idea Frankie today I’ll probably never see wherever Frankie owe oneTags Psychology Personal Growth Zen Buddhism Forgiveness |
4,199 | What if you could draw your thoughts in technical interviews? | A lot of people like drawing out their approach as they work through coding challenges, and that usually involves a pen and paper or an iPad. Even some of the best tech-interview prep products, like AlgoExpert (which I highly recommend), end up showing you how to work through problems by drawing on a (digital) whiteboard. This got me thinking: why don’t we have a single place to code, draw, and video chat with another person? I decided to build that.
Just share your sandbox link!
Interview Sandbox is an app that came out of the desire for a place to practice, pencil, and perform without having to split your screen or spend time drawing on a piece of paper and holding it up to show your interviewer your thoughts. The app is pretty simple: you create your sandbox (no login needed!) and share your link with anyone else to get them on the same page. They can see your code in real time, see your drawings in real time, and you can chat with them too. After you’re done, just save the link for future reference. That’s it.
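The post doesn’t describe the implementation, but the behavior it describes (one shared link, with code and drawings mirrored to everyone who opens it) maps naturally onto link-keyed broadcast rooms over WebSockets. Below is a minimal sketch of that idea, assuming Socket.IO; the event names, payload shapes, and port are illustrative guesses, not Interview Sandbox’s actual API.

// server.ts -- hypothetical sketch: the id in a shared sandbox link doubles as a room name
import { Server } from "socket.io";

const io = new Server(3000, { cors: { origin: "*" } });

io.on("connection", (socket) => {
  // A visitor opening a link like /s/abc123 joins the room "abc123".
  socket.on("join", (sandboxId: string) => {
    socket.join(sandboxId);
  });

  // Relay editor changes to everyone else in the same sandbox.
  socket.on("code-change", (sandboxId: string, code: string) => {
    socket.to(sandboxId).emit("code-change", code);
  });

  // Relay whiteboard strokes the same way.
  socket.on("draw", (sandboxId: string, stroke: Array<{ x: number; y: number }>) => {
    socket.to(sandboxId).emit("draw", stroke);
  });
});

Plain relaying like this is last-write-wins, which is fine for a sketch; merging truly concurrent edits, and making “save the link for future reference” work, would additionally need server-side persistence and something like operational transforms or a CRDT.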
The app is currently in v1 so there may be some quirks and some bugs, but I will be ironing those out and it should get better with each successive use! Oh and if you have any feedback on how it can get better, I would love to hear. Just leave a comment. | https://medium.com/hackernoon/what-if-you-could-draw-your-thoughts-in-technical-interviews-47e1ff87bf33 | ['Sagar Desai'] | 2020-05-27 21:08:54.178000+00:00 | ['Technical Interview', 'Software Engineering', 'Software Development', 'Interview', 'Coding'] | Title could draw thought technical interviewsContent lot people like drawing approach work coding challenge usually involves using penpaper iPad Even best tech interview related product like AlgoExpert highly recommend using end showing work problem drawing digitalwhiteboard got thinking dont single place code draw video chat another person decided build share sandbox link Interview Sandbox app came desire place practice pencil perform without split screen spend time drawing piece paper showing interviewer thought app pretty simple create sandbox login needed share link anyone else get page see code real time see drawing real time chat you’re done save link future reference That’s app currently v1 may quirk bug ironing get better successive use Oh feedback get better would love hear leave commentTags Technical Interview Software Engineering Software Development Interview Coding |