Dataset fields: title (string, 1–200 chars), text (string, 10–100k chars), url (string, 32–829 chars), authors (string, 2–392 chars), timestamp (string, 19–32 chars), tags (string, 6–263 chars)
5 Essential YouTube Channels For Coders
1. Fun Fun Function: Now defunct. Its creator, MPJ, took his excellent Quora question-answering abilities and translated them into a YouTube channel that currently has around 250k subscribers. Don’t let its defunct-ness stop you from checking this channel out, as there’s about 5 years’ worth of content on there for you to trawl through. When he explains a concept, he explains it deeply. For example, MPJ has a video series on the concept of Testing, which is about 100 minutes long, spread across 7 videos. He doesn’t just show people how to write tests (which is where most tutorials stop, missing the point of such concepts), but rather explains why we should write tests, what methodologies we can use, and the pros and cons of each, along with writing the most basic of tests and then adding in complexity, step by step.
https://medium.com/javascript-in-plain-english/5-great-youtube-channels-for-teaching-coding-concepts-ad0aab003d20
['Sunil Sandhu']
2020-10-27 18:19:57.897000+00:00
['JavaScript', 'Web Development', 'Software Development', 'Computer Science', 'Programming']
Chrome’s console.log is the slowest
JavaScript is great. You can slice & dice huge datasets on the fly, and play with them, and explore them, and turn them into pictures. All live in the browser. Not “big data” huge, more like “Holy shit, 5 years ago this would crash my browser” huge. Like, you’re playing with a dataset of 177,830 salaries in the US software industry, and your computer doesn’t even notice until you do something stupid. Something like debug your code, reload a bunch of times, and maybe console.log the whole dataset to make sure it parsed. That’s, what…? Like, 10 seconds of high CPU load for one console.log? Something like that. The more you do it, the longer it takes. As long as that CPU load is high, you can’t do much. After a few tries, even your reload button stops working and you have to forcefully crash the browser tab to get back to work. It’s so annoying because everything else is so damn fast! Want to .map your dataset? No biggie, 10ms max. Want to .filter? 10ms. Want to filter and map and flush React updates to DOM and do all the things? You won’t even notice it does anything. But debugging your code… eesh. Can’t console.log very well, React dev tools are glacial, and god forbid you try to inspect element if there are thousands of DOM nodes on the page. Not to mention the initial page setup. Loading and parsing the dataset (CSV) from cache takes about 3 seconds, running the initial setState takes about a second. That’s up to 5 seconds between every full page reload and seeing output. 5 seconds might not sound like a lot, but it adds up. I don’t know about you, but I run my code at least once every minute. That’s 5 minutes of waiting every hour, 40 minutes in a work day, and 3 working hours in a week. “Don’t console.log your whole dataset then.” Thank you, Captain Obvious. console.log(data[0]) is your best friend in these situations. And yet, you can do a lot by changing your browser. As far as presentation quality goes: Safari is shiniest, Firefox is practicalest, and Chrome is spartaniest. Safari makes logs useful right away. It even shows property types. That was a nice surprise. Firefox makes the output easy to work with, too. But it struggles if you quickly traverse into a deep part of the output. Oh, and Firefox wins the overall “How long do I have to wait before I can test my code?” challenge as well. You can shave a few milliseconds off the initial setState({data: foo}) call. Fascinating. I’m a total nerd. 🤔 🤓
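A quick way to see the difference the author is describing is to time a whole-dataset log against logging a single row. This is an illustrative sketch with synthetic data standing in for the salary dataset (field names and sizes are made up); paste it into a browser console as plain JS or run it with ts-node:

```ts
// Synthetic stand-in for the ~180k-row salary dataset described above
const salaries = Array.from({ length: 177_830 }, (_, i) => ({
  company: `Company ${i % 500}`,
  baseSalary: 50_000 + (i % 150_000),
}));

// Cheap: one row is usually enough to confirm the data parsed correctly
console.log(salaries[0]);

// Expensive: this is the call that pegs the CPU in devtools
console.time('log whole dataset');
console.log(salaries);
console.timeEnd('log whole dataset');

// A readable compromise: inspect a small slice as a table
console.table(salaries.slice(0, 5));
```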
https://medium.com/swizec-a-geek-with-a-hat/chromes-console-log-is-the-slowest-a263a75efdfb
['Swizec Teller']
2016-10-04 21:58:48.129000+00:00
['JavaScript', 'Programming', 'Front End Development', 'Web Development', 'Software Development']
Due Diligence
We limn the darkness as best we can, trying to shine light into corners where the scariest monsters await. Such shining demands deft courage. Shaky hands hold the illuminating torch. Fear is a wise and constant companion. The darkest corners are in the mind. Some monsters despise the light. Look hard, but be wary what you loose.
https://medium.com/geezer-speaks/due-diligence-9679f902a3a0
['Mike Essig']
2017-09-16 07:34:22.633000+00:00
['Poetry', 'Consciousness', 'Dreams', 'Fear', 'Psychology']
Ignore The United Front In Belfast, It’s All For Show
After three years of suspension, the Northern Ireland Assembly finally reconvened and much-needed governance was restored. With the initial housekeeping of getting the Assembly’s ancillary structures set up, the first big debate was scheduled for Monday 20th January. What was this debate about? The dire situation of our public services? No. The priorities for the Executive’s new Programme for Government? Not that either. Instead, the Executive decided to debate a motion withholding the Assembly’s consent for the Prime Minister’s Withdrawal Agreement. Never mind the fact that such a matter is outside the competence of the devolved administrations; the debate was rendered doubly pointless by the unfortunate detail that the legislation putting this agreement into UK law had cleared the House of Commons mere hours before they decided to return from their needless 3 year sabbatical. It was gesture politics at its finest, the only problem being that gestures are the last thing that Northern Ireland needs. This shameless piece of political theatre did, though, result in a rare occurrence: total unanimity from the local political parties. The motion was carried without a division, with not a single member shouting “No” when called upon. However, once we dig into the rationale for each of the local parties’ objections, the newfound consensus gets slightly shaky. The Nationalists, Sinn Fein and the SDLP, were both thoroughly against it, partly due to their europhilic bent but also because in their mind Brexit, no matter the form, would ruin Northern Ireland’s economy. Exactly how, or by what mechanism, wasn’t entered into; the presumably imminent economic ruin is simply an article of faith that doesn’t require explanation. Sinn Fein were more vague in their concerns, but the SDLP brought up prophecies of doom that had a distinct March 2019 vintage, such as the return of a hard land border, even though the agreement they’re rejecting specifically prevents that. The Lib Dem-esque Alliance Party spoke out against it in similar terms but had a tacit acceptance that Brexit was actually happening and that the supposed damage must be mitigated. It was the same doom-mongering that requires no elucidation, but with a hint of practicality; a more cynical person could call it triangulation. Last but not least, the two Unionist parties also rounded on the Withdrawal Agreement. The Ulster Unionist Party condemned it in line with their most recent remain stance, having been pro-Brexit until a few months ago. Their contribution was the most ill-defined of the lot, but it was broadly clear that they were against it. The DUP, technically one of the few pro-Brexit parties in the Assembly, went on to condemn the agreement but made clear they thought leaving the EU was still a good idea. The crux of their objection was that the PM’s deal would result in a “Border in the Irish Sea” despite the fact that Articles 4 and 6 of the Revised Northern Ireland Protocol say otherwise. Political parties trying to rhetorically have it both ways is not an uncommon thing in Northern Ireland. The faux outrage and fulmination went on for hours. We heard all the same lines and second-hand arguments that we’ve heard the past three years. I even felt slightly nostalgic as I got to play Brexit Buzzword Bingo for the first time in a good while. In the end the motion passed and nothing was achieved, apart from giving journalists something to write about on a subject the public has long stopped caring about.
A political spectacle had been successfully orchestrated, and a very hollow one at that. While the assembly members got to feel important denouncing the PM and “English Nationalism”, the same problems that beset Northern Ireland, problems that they let fester for three years, remain. NHS waiting times remain a disgrace, the infrastructure inadequate, and the education system still segregated and unreformed. Real work needs to be done and it really can’t wait. The general opinion of politicians may be at rock bottom in England, but in Northern Ireland it’s even lower. When the assembly returned there was widespread elation, but the honeymoon period didn’t even last a week as the new ministers immediately started complaining that the additional £2bn Stormont is set to receive wasn’t enough. People are greatly disillusioned, and this sentiment is continually exacerbated by the actions of a political class that seems to be in another universe, with priorities totally alien to their own. I saw this first-hand in the General Election, where a surprising number of people couldn’t even bring themselves to vote, and it showed in my constituency of East Antrim, which had the 4th-lowest turnout in Northern Ireland. Everyone has been waiting a long time for even the most basic semblance of a government to return. This is a time for decisive action, not grandstanding; as the Secretary of State Julian Smith so succinctly put it, “Let’s get on with it”. This article was originally published on www.aaronrankin.com on 20th January 2020
https://medium.com/@Aaron_R_Rankin/ignore-the-united-front-in-belfast-its-all-for-show-2d0caca0a3f9
['Aaron Rankin']
2020-03-03 18:06:51.611000+00:00
['Northern Ireland', 'Politics', 'UK Politics', 'Brexit', 'Ni Politics']
Clear the Runway to Let Your Art Take Off!
Do you ever have plans to create art, but you find that you have to take care of other matters first? You may find yourself saying: “I just need to eat first”; “Let me just check my email/Facebook/Instagram first”; “I better pay some bills before I get deep into artmaking”; “Oh! I need to call so-and-so back. I’ll do that first”; “I better walk the dog first.” Sometimes, you may find that you’ve run out of time to do your artmaking thanks to all of these other tasks! I think it helps to clear the runway of any possible “speed bumps” so you can quickly take off into your art practice. What can you do that will allow you to fly into your studio and get working with no friction or delays? First, you may want to spend time identifying your speed bumps. For example, if you find yourself losing precious time while selecting a reference photo, create a practice whereby you can easily print photos for reference and then put them in a specific storage location. Choose your reference photo the day before your artmaking session, so that decision is made and won’t slow you down. That brings up a great idea: Create an assignment for your artmaking session ahead of time. Don’t go into your studio and poke around, wondering what to make (if this is a typical speed bump for you). Select your task ahead of time and write it down. You may want to write a list of the steps, too. This way, you’ll be guided and kept on track as you work through the project. Try using “mise en place,” as chefs do — lay out all of your materials ahead of time, so you can fly into your studio and begin with velocity! What about other speed bumps? Clearly our mobile devices are speed bumps for all of us. What’s your plan to avoid getting sucked into your email or social media? Are you willing to turn off your phone’s volume beginning a half-hour before your art time? Some speed bumps are important things for us to do, such as bills, paperwork, phone calls, and home/life management. I suggest that you move these speed bumps off the runway in either direction: to a specified time that is either before or after your artmaking time. Set an appointment with yourself to do these admin tasks, and keep your appointment! During that time, do nothing else but power through your tasks and responsibilities. I wake up at 4am some workday mornings to get my admin tasks behind me! By being more intentional about your time and adopting a belief that your artmaking time is sacred and important, you will be able to fly down the runway and fly into your artmaking flow!
https://medium.com/@annamieka/clear-the-runway-to-let-your-art-take-off-31e1b7935db8
['Annamieka Hopps Davidson']
2021-08-19 20:47:30.259000+00:00
['Habit Building', 'Creativity', 'Mentoring Programs']
Hi:)
Hi:) This is the disclaimer that this will not be a very serious dispensary of writing. The grammar will be poor and the rhetoric just alright. I am giving this warning because if anyone reads this…well good luck. Now, since that has been established, let me introduce myself. My name is H. I’m a 19-year-old average American college student. A sorority girl even. I know you’re probably picturing the stereotypical blonde bombshell, but trust me, that is the farthest thing from the truth. Recently, I have ruined my life in multiple ways, which I will eventually explain. It involves boys, best friends, a worldwide pandemic and crippling depression. The perfect combination for a fuck up storm to blow up your whole life. Or my whole life. The reason I’m writing is to use this as an outlet to make sense of my life, or more so how it got fucked up. I figured this may be the only way to fix it. There is a beautiful Charles Dickens quote that goes “suffering has been stronger than all other teaching, and has taught me to understand what your heart used to be. I have been bent and broken, but — I hope — into a better shape.” This quote makes me determined to turn things around. Because I have hope, my past has not won. Whoever you are, I hope you enjoy the shit show that was my life. Welcome ;) H
https://medium.com/@rachaelhassiepen/hi-fbe99d11531a
['H']
2020-12-24 06:25:26.776000+00:00
['First Post', 'College', 'New Writers', 'Journal']
A Home in Ruin: The Work of Mohamad Hafez at the Lanoue Gallery
Originally published to the Core Blog on April 6, 2017 at 1:19 pm. “We Have Won!” by Mohamad Hafez. (Maher Mahmoud via The New Yorker) Last Sunday, April 2nd, a solo show opened at the Lanoue Gallery, located not far from us [at Boston University] on Harrison Ave. Featuring artwork by Syrian artist Mohamad Hafez, currently an architect in New Haven, Connecticut, the gallery is a window into a miniature world in ruin that reflects the modern-day destruction in Syria. “He wanted to make full, three-dimensional cities, and he wanted them to look real: gritty, rusted, lived-in, and partially destroyed,” describes writer Jake Halpern for The New Yorker. “Junk became his medium.” The concept took shape thirteen years ago in his college days at Iowa State University. Prevented by his visa from returning home, he took to rebuilding his city of Damascus in miniature. His first work was a recreation of a facade of an ancient wall that was pictured on a Syrian candy bar wrapper. Today, his constructions are far less idyllic than his early models. Hafez tried to explain the turn his art had taken. “How do you watch what’s happening in Aleppo and not go nuts?” he asked. “How do you watch thousands of years bombed out of existence? How do you go on with your life, having your morning coffee, when a bunch of your relatives and friends are under constant bombing? How do you not snap and yell out? You have to remain composed and carry on with your day job, don’t you?” He paused for a moment, as if lost in thought. “Well,” he said, finally, “the way that I stayed composed is that I come here and I let the models do the yelling for me. In that sense, it relieves me. It is grim. And I take no pride in this work. I feel no ownership in it. It’s as though I am 3-D printing what’s inside of me.” These unpeopled cities, flickering with the lights of tiny lamps, chattering with sound recordings from happier times that emanate from hidden speakers, are just familiar enough to be recognizable as home. Hafez draws out in the viewer the same pangs of homesickness and of loss that he channels into his work. And in [the CAS Core Curriculum], we are no strangers to homesickness and loss; the enslavement of the Israelites in Egypt and Odysseus’s twenty-year journey back to Ithaca come to mind, not to mention our own personal experiences here at BU. Witness Mohamad Hafez’s take on displacement, home, and ruin in his solo gallery show HOMELAND inSECURITY at the Lanoue Gallery on 450 Harrison Ave #31, from April 2 until the end of the month [April 2017]. Read the rest of The New Yorker article here. About the Author: Cat Dossett is a Boston-based illustrator and writer. She is the author of two comics: Laika, on the first dog in space, and Vessel, on watching Adam Driver movies in the bathtub. Her poetry has been published in a chapbook entitled Odysseus & Eden (Pen & Anvil 2019). Selections of her illustrations and writing appear in NERObooks, BURN, Hawk & Whippoorwill, and Sobotka Literary Magazine.
https://medium.com/@aboutadaughter/a-home-in-ruin-the-work-of-mohamad-hafez-at-the-lanoue-gallery-a3d2ca0f3f64
['Cat Dossett']
2020-07-15 15:42:25.697000+00:00
['Syria', 'Mohamad Hafez', 'Contemporary Art', 'Middle East', 'Boston']
The Dangers of Rolling Our Eyes at Others
Photo by Max-Jakob Beer on Unsplash A Devotional on John 1:43–51 (focus on verses 43–46) In the musical Les Miserables, a group of students revolt against the corrupt Paris government. The group is led by Enjolras, a passionate man willing to sacrifice everything to see change. He leads his fellow revolutionaries in the rousing anthem, “Do You Hear The People Sing?” (Go ahead, put it up at full volume and belt it out!) In the first chapter of the gospel of John, we read about an encounter between Jesus and a young man named Nathanael. When I think of Nathanael, I imagine someone like Enjolras. Like Enjolras, Nathanael hung out with a group of like-minded young men who dreamed of revolution. Israel was under the thumb of Roman rule, and they longed to be set free. Nathanael and his friends knew God had promised to send Israel a savior. They all had theories on what that savior — the rightful king of Israel — would be like. But when Philip suggested that Jesus from Nazareth was the promised one, Nathanael scoffed. “Nazareth! Can anything good come from there?” Nathanael asked. (Verse 46.) Nazareth was a small, backwater district where nothing exciting ever happened. Nathanael quickly dismissed Jesus because he came from “the wrong side of town.” In his book Encounters with Jesus, Tim Keller says: Dismissiveness kills all creativity and problem-solving. How quick are we to shut down a conversation because the other person “just doesn’t get it”? Nathanael pounced on Philip’s idea that Jesus could be the savior. The Message version of the Bible puts his response this way: “Nazareth? You’ve got to be kidding.” That sounds like a reply you might see on a Facebook post! Keller goes on to say there is one surefire way to tell if you’re being dismissive: check for the eye roll. If I’m rolling my eyes at someone, I’m dismissing their ideas, and maybe even dismissing them as a person. That’s a dangerous place to be. It means I’ve shut myself off from learning about them. I can’t roll my eyes and show compassion at the same time. I’ve seen both Christians and atheists have dismissive attitudes towards each other. Christians dismiss atheists for ridiculous ideas like evolution (because there’s no way God could use evolution as a method of creating life!). Atheists dismiss Christians for ridiculous ideas like creation (because there’s no way evolution would need God’s hand to guide it!). Soon we’re all rolling our eyes at each other. So how do we fight off dismissiveness when we feel it creeping into our lives? One way is to think the best of people! Assume they’ve thought through their ideas. When someone holds a view that goes against mine, I can either dismiss them or ask why they hold that opinion. I can either roll my eyes or choose to take their point of view seriously. It doesn’t mean I will suddenly agree with their view. But I might learn something from them, and (gasp!) maybe even begin to see them as a fellow human being and not simply as a set of “wrong” ideas. Curiosity invites conversation and builds friendships with people who are different from me. Questions to Ponder: Why do you think Nathanael dismissed Jesus so quickly? What (or whose) ideas are you most likely to dismiss? How can you invite conversation instead?
https://medium.com/nobody-left-out/the-dangers-of-rolling-our-eyes-at-others-12e2d8dbdda2
['Michael Murray']
2019-10-15 16:44:34.052000+00:00
['Spirituality', 'Jesus', 'Bible', 'Christianity', 'Life']
Making MAST Meaningful; Bitcoin Atomic Swaps Become Private
This is still not private, since the hash of x appears in both chains. They would use different pubkeys on the different chains, so the pink key on chain W would differ from the pink key on Y, but the x value must be the same on both chains. MAST allows the first transaction to be rewritten so that only one of the three redeem conditions needs to be revealed. In the cooperative case, neither x nor hash(x) needs to be revealed publicly on Blockchain Y. Both Alice and Bob will know all the redeem conditions to protect themselves, but won’t reveal them unless needed. Here is how the first transaction would be constructed: Merkle tree of possible spending options MAST allows us to instead represent this script as a tree. The combining of colors represents hashing. The brown hash at the top is what locks up the coins. The coins can be spent by presenting a proof that the colors (hashes) combine, starting from a leaf color that matches a spend condition, up to that top hash. Also, the tree can be unbalanced, where a shorter proof (a smaller, cheaper transaction) is used in the common cooperative case. In the below example, they show that blue and pink combine to make a violet multisig. They then show violet and orange make brown. Thus, after proving pink and blue are allowed, 2 of 2 signatures from Alice and Bob can move the coins. This looks like any ordinary MAST-based 2 of 2 multisig on the blockchain. The public sees none of the fancy protection parts, just the orange hash. The important part is neither x nor hash(x) appears on Blockchain Y, increasing privacy. The public can’t even see that it’s a swap transaction.
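To make the hashing step concrete, here is a minimal TypeScript sketch of the idea (runnable with Node). The key strings and branch contents are placeholders, and real MAST/Taproot constructions use tagged hashes and actual Bitcoin scripts, so this only illustrates how revealing one branch plus a sibling hash reconstructs the committed root without exposing the other conditions:

```ts
import { createHash } from 'node:crypto';

// Plain SHA-256 helper; treat this purely as a shape-of-the-idea demo.
const sha256 = (data: string | Buffer): Buffer =>
  createHash('sha256').update(data).digest();

// Hypothetical key material, mirroring the colors in the article
const pink = 'Alice pubkey on this chain (placeholder)';
const blue = 'Bob pubkey on this chain (placeholder)';

// blue + pink -> violet: the 2-of-2 multisig leaf
const violet = sha256(pink + blue);

// orange: the hash covering the other redeem conditions (timeout path, hash(x) path)
const orange = sha256('other redeem conditions: timeout branch, hash(x) branch');

// violet + orange -> brown: the top hash that actually locks the coins on-chain
const brown = sha256(Buffer.concat([violet, orange]));

// Cooperative spend: reveal only the multisig leaf and the orange sibling hash.
// A verifier recombines them and checks against the published brown hash;
// neither x nor hash(x) ever has to appear on Blockchain Y.
function verifiesAgainstRoot(leaf: Buffer, sibling: Buffer, root: Buffer): boolean {
  return sha256(Buffer.concat([leaf, sibling])).equals(root);
}

console.log('brown (root) hash:', brown.toString('hex'));
console.log('cooperative branch verifies:', verifiesAgainstRoot(violet, orange, brown));
```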
https://medium.com/hackernoon/making-mast-meaningful-bitcoin-atomic-swaps-become-private-ff003f7c2b7a
['Brian Deery']
2017-10-06 15:43:08.236000+00:00
['Atomic Swaps', 'Bitcoin', 'Privacy', 'Blockchain', 'Mast']
5 BOOK RECOMMENDATIONS
5 BOOK RECOMMENDATIONS The Picture of Dorian Gray A novel by Oscar Wilde. It caused a lot of reactions when it was published. Still, it is one of the most popular philosophical books. A Clockwork Orange A devastating, intense story. Anthony Burgess wrote it and Stanley Kubrick filmed it in 1971. Both are amazing pieces of work. You won’t regret it. Finishing this one with a quote: “Is it better for a man to have chosen evil than to have good imposed upon him?” Master Zacharius We all know Jules Verne, but most of us haven’t heard of this little story of his. If you are interested in the concept of time, this book is a good one for you. The Da Vinci Code Looking for more adventure? Here’s the perfect choice. Be ready for a journey with conspiracy theories and mysteries. Hunger Hunger has a dramatic story. Knut Hamsun wrote this novel mostly based upon his own life. Enjoy it!
https://medium.com/@varsinemel/5-book-recommendati%CC%87ons-20b576fb6cb5
[]
2020-12-27 17:30:04.262000+00:00
['2020', 'Book Recommendations', 'Books', 'Da Vinci Code', 'Dorian Gray']
How we built an easy-to-use image segmentation tool with transfer learning
How we built an easy-to-use image segmentation tool with transfer learning Label images, predict new images, and visualize the neural network, all in a single Jupyter notebook (and share it all using Docker Hub!) Authors: Jenny Huang, Ian Hunt-Isaak, William Palmer GitHub Repo Introduction Training an image segmentation model on new images can be daunting, especially when you need to label your own data. To make this task easier and faster, we built a user-friendly tool that lets you build this entire process in a single Jupyter notebook. In the sections below, we will show you how our tool lets you: Manually label your own images Build an effective segmentation model through transfer learning Visualize the model and its results Share your project as a Docker image The main benefits of this tool are that it is easy-to-use, all in one platform, and well-integrated with existing data science workflows. Through interactive widgets and command prompts, we built a user-friendly way to label images and train the model. On top of that, everything can run in a single Jupyter notebook, making it quick and easy to spin up a model, without much overhead. Lastly, by working in a Python environment and using standard libraries like Tensorflow and Matplotlib, this tool can be well-integrated into existing data science workflows, making it ideal for uses like scientific research. For instance, in microbiology, it can be very useful to segment microscopy images of cells. However, tracking cells over time can easily result in the need to segment hundreds of images, which can be very difficult to do manually. In this article, we will use microscopy images of yeast cells as our dataset and show how we built our tool to differentiate between the background, mother cells, and daughter cells. 1. Labelling There are many existing tools to create labelled masks for images, including Labelme, ImageJ, and even the graphics editor GIMP. While these are all great tools, they can’t be integrated within a Jupyter notebook, making them harder to use with many existing workflows. Fortunately, Jupyter Widgets make it easy for us to make interactive components and connect them with the rest of our Python code. To create training masks in the notebook, we have two problems to solve: Select parts of an image with a mouse Easily switch between images and select the class to label To solve the first problem, we used the Matplotlib widget backend and the built-in LassoSelector. The LassoSelector handles drawing a line to show what you are selecting, but we need a little bit of custom code to draw the masks as an overlay: Class to manage a Lasso Selector for Matplotlib in a Jupyter notebook For the second problem, we added nice looking buttons and other controls using ipywidgets: We combined these elements (along with improvements like scroll to zoom) to make a single labelling controller object. Now we can take microscopy images of yeast and segment the mother cells and daughter cells: Demo of lasso selection image labeler You can check out the full object, which lets you scroll to zoom, right click to pan, and select multiple classes here. Now we can label a small number of images in the notebook, save them into the correct folder structure, and start to train CNN! 2. Model Training The Model U-net is a convolutional neural network that was initially designed to segment biomedical images but has been successful for many other types of images. 
It builds upon existing convolutional networks to work better with very few training images and make more precise segmentations. It is a state-of-the-art model that is also easy to implement using the segmentation_models library. Image from https://arxiv.org/pdf/1505.04597.pdf U-net is unique because it combines an encoder and a decoder using cross-connections (the gray arrows in the figure above). These skip connections cross from the same sized part in the downsampling path to the upsampling path. This creates awareness of the original pixels inputted into the model when you upsample, which has been shown to improve performance on segmentation tasks. As great as U-net is, it won’t work well if we don’t give it enough training examples. And given how tedious it is to manually segment images, we only manually labelled 13 images. With so few training examples, it seems impossible to train a neural network with millions of parameters. To overcome this, we need both Data Augmentation and Transfer Learning. Data Augmentation Naturally, if your model has a lot of parameters, you would need a proportional amount of training examples to get good performance. Using our small dataset of images and masks, we can create new images that will be as insightful and useful to our model as our original images. How do we do that? We can flip the image, rotate it at an angle, scale it inward or outward, crop it, translate it, or even blur the image by adding noise, but most importantly, we can do a combination of those operations to create many new training examples. Examples of augmented images Image data augmentation has one more complication in segmentation compared to classification. For classification, you just need to augment the image as the label will remain the same (0 or 1 or 2…). However, for segmentation, the label (which is a mask) needs to also be transformed in sync with the image. To do this, we used the albumentations library with a custom data generator since, to our knowledge, the Keras ImageDataGenerator does not currently support the combination “Image + mask”. Custom data generator for image segmentation using albumentations Transfer Learning Even though we have now created 100 or more images, this still isn’t enough as the U-net model has more than 6 million parameters. This is where transfer learning comes into play. Transfer Learning lets you take a model trained on one task and reuse it for another similar task. It reduces your training time drastically and more importantly, it can lead to effective models even with a small training set like ours. For example, neural networks like MobileNet, Inception, and DeepNet, learn a feature space, shapes, colors, texture, and more, by training on a great number of images. We can then transfer what was learned by taking these model weights and modifying them slightly to activate for patterns in our own training images. Now how do we use transfer learning with U-net? We used the segmentation_models library to do this. We use the layers of a deep neural network of your choosing (MobileNet, Inception, ResNet) and the parameters found training on image classification (ImageNet) and use them as the first half (encoder) of your U-net. Then, you train the decoder layers with your own augmented dataset. Putting it Together We put this all together in a Segmentation model class that you can find here. 
When creating your model object, you get an interactive command prompt where you can customize aspects of your U-net like the loss function, backbone, and more: Segmentation model customization demo After 30 epochs of training, we achieved 95% accuracy. Note that it is important to choose a good loss function. We first tried cross-entropy loss, but the model was unable to distinguish between the similar looking mother and daughter cells and had poor performance due to the class imbalance of seeing many more non-yeast pixels than yeast pixels. We found that using dice loss gave us much better results. The dice loss is linked to the Intersection over Union Score (IOU) and is usually better adapted to segmentation tasks as it gives incentive to maximize the overlap between the predicted and ground truth masks. Example predictions by our model compared to true masks 3. Visualization Now that our model is trained, let’s use some visualization techniques to see how it works. We follow Ankit Paliwal’s tutorial to do so. You can find the implementation in his corresponding GitHub repository. In this section, we will visualize two of his techniques, Intermediate Layer Activations and Heatmaps of Class Activations, on our yeast cell segmentation model. Intermediate Layer Activations This first technique shows the output of intermediate layers in a forward pass of the network on a test image. This lets us see what features of the input image are highlighted at each layer. After inputting a test image, we visualized the first few outputs for some convolutional layers in our network: Outputs for some encoder layers Outputs for some decoder layers In the encoder layers, filters close to the input detect more detail and those close to the output of the model detect more general features, which is to be expected. In the decoder layers, we see the opposite pattern, of going from abstract to more specific details, which is also to be expected. Heatmaps of Class Activations Next, we look at class activation maps. These heat maps let you see how important each location of the image is for predicting an output class. Here, we visualize the final layer of our yeast cell model, since the class prediction label will largely depend on it. Heatmaps of class activations on a few sample images We see from the heat maps that the cell locations are correctly activated, along with parts of the image border, which is somewhat surprising. We also looked at the last technique in the tutorial, which shows what images each convolutional filter maximally responds to, but the visualizations were not very informative for our specific yeast cell model. 4. Making and Sharing a Docker Image Finding an awesome model and trying to run it, only to find that it doesn’t work in your environment due to mysterious dependency issues, is very frustrating. We addressed this by creating a Docker image for our tool. This allows us to completely define the environment that the code is run in, all the way down to the operating system. For this project, we based our Docker image off of the jupyter/tensorflow-notebook image from Jupyter Docker Stacks. Then we just added a few lines to install the libraries we needed and to copy the contents of our GitHub repository into the Docker image. If you’re curious, you can see our final Dockerfile here. Finally, we pushed this image to Docker Hub for easy distribution. 
You can try it out by running: sudo docker run -p 8888:8888 -e JUPYTER_LAB_ENABLE=yes \ ianhuntisaak/ac295-final-project:v3 Conclusion and Future Work This tool lets you easily train a segmentation model on new images in a user-friendly way. While it works, there is still room for improvement in usability, customization, and model performance. In the future, we hope to: Improve the lasso tool by building a custom Jupyter Widget using the html5 canvas to reduce lag when manually segmenting Explore new loss functions and models (like this U-net pre-trained on broad nucleus dataset) as a basis for transfer learning Make it easier to interpret visualizations and suggest methods of improving the results to the user Acknowledgements We would like to thank our professor Pavlos Protopapas and the Harvard Applied Computation 295 course teaching staff for their guidance and support.
https://towardsdatascience.com/how-we-built-an-easy-to-use-image-segmentation-tool-with-transfer-learning-546efb6ae98
['Jenny Huang']
2020-08-06 00:11:03.936000+00:00
['Transfer Learning', 'Visualization', 'Image Segmentation', 'Unet', 'Editors Pick']
Easy integration testing of GraphQL APIs with Jest
In modern software development, you know for sure that automated tests are crucial for the success of your software project. But writing unit tests is only part of the story, as you also need to check if all the code you’ve written works well together and that your service as a whole actually does what it’s supposed to from a business logic perspective. This article shows you one (of many) ways to write integration tests for your GraphQL API. I’ll also give you some introduction to what a good test looks like and why I’ve chosen this particular toolset. This approach can of course also be used for any other microservice or API variant. Integration testing When writing unit tests you always test only the logic in the current method. All calls to external methods should be mocked, so you actually do not write into your database or call an external service for real. Integration tests come in different flavors: you might want to mock away some calls to external services (e.g. a payment API that does not offer you a sandbox server); however, I try to keep the integration testing system as close as possible to reality. Your goal when writing integration tests is to test the typical scenarios (and edge cases) of your service in the way your service will be used in the production environment later. Good integration tests are: Isolated: Testing one scenario has no effects on another scenario and it does not matter in which order the different integration tests are executed. Repeatable: When code is not changed, the test result is always the same. Understandable: It is easy to understand what business case the particular test covers and, if applicable, why the expected outcome is important. Structured: Test files should not have too many lines; a more complex scenario should be covered in its own test file. Easy: The tests should be easy to implement and easily runnable by the developer team (testing is not the most favoured task for many developers, so keep the barrier as low as possible). Test database and seed data (example MongoDB) As said, you need to ensure that your tests can run isolated and are repeatable. Therefore you need a test database that has the same dataset every time before a new test scenario is executed, a dataset where you know that e.g. a specific shop item exists, so you can write a test which tries to retrieve this item with ID “xyz”. It may sound like a lot of work to create this setup, but it is easy to achieve. First you need an empty test database, by booting up a local docker container or an additional database instance at your database provider. Second you create a database dump from your production database with some example data. In the case of MongoDB it can be done with one simple command: just replace the parameters according to your database setup. Take particular note of the query argument to select just a subset for the export if your production database contains a lot of data. Importing the data into your test database is as easy as creating the seed file.
So finally we can create a shell script we can call anytime to create exactly the same dataset in our database again like this: Caution: Always double check the server address or use database logins that do not exist in the production database to prevent accidental deletion of your production data! Jest integration testing with snapshots First of all, you might ask why we are using Jest for integration testing when there are a lot of other tools out there. I see a big benefit in re-using a framework your developers might already know from unit testing, so they do not have to learn another framework. This can keep the entry barrier (also for new developers) low and save a lot of time. Additionally, Jest, with its snapshots, provides a really easy and fast way to create integration tests. So let’s say we have a GraphQL endpoint that gives back all the items available in our store. Thanks to our seed data we know what to expect: Request: Response: Unless we change our code or our seed data, the API should return this exact response to this exact request. Jest snapshots give us an easy way to check the complete response object. Additionally, we need to ensure the dataset is correct before running our test, using the mentioned seed script. Altogether, our first test scenario could look like this: The first run First run creates the snapshot file When running this test the first time, Jest will automatically create a __snapshots__ folder and a snapshot file inside of it. Just because the console shows you a green test, it does not mean you are done yet! You have to check if the auto-generated snapshot contains what you expect, else you would assume a false state to be correct. This check is necessary every time you’ve written a new test and the console is showing you that new snapshots have been written. In our case, the snapshot shows the correct data and we can commit the snapshot file together with our tests into our repository. The n-th run (passing) Running our test again without changing any code is the actual integration test. Jest validates that the response from the API matches exactly what was stored in the snapshot file. The n-th run (failing) The more interesting case is when the snapshot does not match anymore and your test is failing: Failing snapshot test after code changes As the diff shows, the order of the response seems to have changed and is therefore not matching our expected response anymore. This can have two reasons. First, the snapshot is correct and you introduced a bug with your latest code changes; in this case, the test prevented the go-live of a bug and you can adapt your code until the test is green again. Second, the snapshot is wrong because the business requirement has changed (e.g. the default sorting should be by name ASC now); in this case, you have to update the snapshot to confirm the change is correct by running Jest with the -u flag and committing the updated snapshot file into your repository. Updating a snapshot after confirming the change is correct Learnings from daily life Using Jest snapshots is a fast way to write integration tests for our GraphQL API. However, if you do your research you will find a lot of valid arguments against using Jest snapshots, like in the blog post “What’s wrong with snapshot tests”.
While some points are also true for this use case, I think that testing APIs with Jest snapshots is a bit different and some downsides can be prevented. Here are my learnings from using snapshots in a real project: Jest expects the snapshot to match exactly. As GraphQL gives us the power to request only a subset of parameters, you should only request those parameters that are relevant for your test. This prevents the test from failing for the wrong reasons, and it will also make your snapshot errors easier to read, as the diff contains less data. Jest runs in parallel mode by default. If you have two test scenarios (two spec.ts files) writing into the same database collection or table you might get side effects. To ensure isolation of tests, run Jest in sequential mode via the --runInBand flag. Most importantly: make sure your team understands snapshots! The tests can be written really fast, but this comes at the price that snapshots contain a lot of implicit knowledge. Your team needs to think about the real reason for a failing snapshot, its impact, etc. before blindly running jest -u to update the snapshot. Integration tests have become an important part of our toolchain, preventing the release of bugs that previously went undiscovered when we relied on unit testing alone. Feel free to leave your experiences with Jest snapshots or other integration testing tools in the comments.
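To round this off, here is a sketch of what the test scenario described above might look like in code. The module paths, the seed script location, and the GraphQL query are illustrative assumptions, not taken from the article:

```ts
// store-items.spec.ts -- illustrative sketch; paths, names and the query are assumptions
import { execSync } from 'node:child_process';
import request from 'supertest';
import { app } from '../src/server'; // hypothetical export of the GraphQL HTTP app

describe('store items query', () => {
  beforeAll(() => {
    // Re-seed the test database so every run starts from the same known dataset
    execSync('./scripts/seed-test-db.sh', { stdio: 'inherit' });
  });

  it('returns the seeded store items', async () => {
    const response = await request(app)
      .post('/graphql')
      // Only request the fields this test cares about, so unrelated schema
      // changes don't break the snapshot for the wrong reasons
      .send({ query: '{ items { id name price } }' })
      .expect(200);

    // The first run writes the __snapshots__ file; later runs compare against it
    expect(response.body.data).toMatchSnapshot();
  });
});
```

Run it with jest --runInBand so that spec files touching the same collections execute sequentially, as noted in the learnings above.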
https://medium.com/swlh/easy-integration-testing-of-graphql-apis-with-jest-63288d0ad8d7
['Philipp Schmiedel']
2019-06-09 16:09:28.446000+00:00
['Testing', 'Integration', 'GraphQL', 'Jest', 'Software Development']
Media: The Truth About Pastor Kimberly Brown Preaching Rant Vs Brian Carn
Media: The Truth About Pastor Kimberly Brown Preaching Rant Vs Brian Carn The old black church and black gospel news with Hood Evangelist. Click the links below to check out her reactions to celebrity pastors and false prophets like: Gino Jennings, Bishop Bernard Jordan, Larry Reid, Pastor Creflo Dollar, TD Jakes, Pastor John Gray, Prophet Brian Carn, Marcus Rogers, Pastor Kimberly Brown, Joel Osteen and more… Click Link to Video Below https://youtu.be/_hkP2sc6c_M Hood Evangelist Website http://hood-evangelist.ml Subscribe to Hood Evangelist YouTube Channel https://youtube.com/channel/UCHQIEPzrkqumQfGcht2X8wQ
https://medium.com/@hood-evangelist/media-the-truth-about-pastor-kimberly-brown-preaching-rant-vs-brian-carn-18c78bf558a6
['Hood Evangelist']
2020-12-27 00:45:00.640000+00:00
['Christianity', 'Mental Health', 'BlackLivesMatter', 'Church', 'Wellbeing']
PIP 14 — Divergent Oracle Prices. Proposal for adding more sources of…
PIP 14 — Divergent Oracle Prices Summary of Proposal This proposal was put forward by an anonymous white hat developer. I’ve updated it below to reflect what is possible in the current PegNet software. 1. Add more diversity to the OPR generating parties (meaning who can submit OPRs): 50% weight to Miners, 50% weight to Stakers. 2. Then run the grading algorithm on these. Set the deviation for each asset. If the deviation between OPRs is too high (say 0.5%), pause conversions on that pAsset for that block. Advantages Of This Approach The first big advantage of this approach is that PegNet can detect attacks without slowing down the conversions over many blocks. For example, in the case of a fast market change, such as the recent drop in the price of oil, PegNet users can still follow the price change and take advantage of it with their conversions, because this approach compares the different price records of Miners / Stakers within the same block, rather than comparing previous block prices to current block prices. A potential attacker needs to control both a majority of the mining power and a majority of the staked pAssets / PEG. If they only control 1 group, the cost to maintain the attack is higher than what they earn. Example: Scenario 1: attacker wants to push the price of gold down to zero and buy it cheaply. normal (no attack): - Miners say GOLD: 1000 - Stakers say GOLD: 1003 ______ deviation: 1.5 attacker controls 1 group: - Miners say GOLD: 1 - Stakers say GOLD: 1000 _____ deviation: 499.5 As you can see, there’s a big difference and it is easy to detect automatically. Scenario 2: the attacker is trickier; they push the price down a little (say 1%) to buy it cheaper. It's undetected by humans. This is the hardest situation. 1.01^72 still roughly doubles their assets per day; 1.005^72 is roughly +40% per day. Solution: Tight deviation. Example: last block (no attack): - miners of current block say GOLD: 1000 - peg holders say GOLD: 1003 ______ deviation: 1.5 Scenario 2A: attacker changes the price by 1% this block, attacker controls 1 group: - Miners say GOLD: 990 - Stakers say GOLD: 1000 _____ deviation: 5.0 Scenario 2B: attacker changes the price by 0.5% this block, attacker controls 1 group: - Miners say GOLD: 995 - Stakers say GOLD: 1000 _____ deviation: 2.5 The deviation of the attacked block is almost double that of the last block, so it can still be detected by code. Going lower is not worth the attack, and unpredictable, since asset prices can genuinely change 0.5% within 10 minutes. Future Iteration With Past + Current Miners When Digital IDs (DIDs) are fully deployed on Factom, the approach can be improved to involve Miners from past blocks and the current block: 25% by miners of the current block 25% by miners of the last 100 blocks (weighted by percent of total hashrate) 25% by miners of the last 1,000 blocks (also weighted) 25% by Stakers Then run the grading algorithm on all of these. Set the deviation for each asset. A potential attacker needs to control all 4 to alter prices. If they only control 3 groups, the cost to maintain the attack is higher than what they earn. Example: Scenario 1: attacker wants to push the price of gold down to zero and buy it cheaply.
normal (no hack): - miners of current block say GOLD: 1000 - miners of last 100 blocks say GOLD: 1001 - miners of last 1000 blocks say GOLD: 998 - peg holders say GOLD: 1003 ______ deviation: 1.802775637732 attacker controls 3 groups: - miners of current block say GOLD: 1 - miners of last 100 blocks say GOLD: 1 - miners of last 1000 blocks say GOLD: 1 - peg holders say GOLD: 1050 _____ deviation: 454.23032428494 attacker controls 2 groups: - miners of current block say GOLD: 1 - miners of last 100 blocks say GOLD: 1 - miners of last 1000 blocks say GOLD: 1000 - peg holders say GOLD: 1001 _____ deviation: 499.75012506252 attacker controls 1 group: - miners of current block say GOLD: 1 - miners of last 100 blocks say GOLD: 999 - miners of last 1000 blocks say GOLD: 1000 - peg holders say GOLD: 1001 _____ deviation: 432.58026711814 In this example, if someone decides to attack PegNet, they must have a hashrate 9,900% higher than the current network's and would have to mine PegNet for 7 days to reach 74% voting power. In the meantime, the devs can figure it out and do something to counter the attack. That’s a huge barrier for any attacker. Even if they manage to do so, 25% of the voting power is held by PegNet Stakers who still report honest prices. Conclusion Having multiple groups of parties submit oracle price records vastly increases the cost to an attacker, and implementing tight controls on price deviation reduces the benefit of the attack.
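For reference, the worked deviation figures above are the population standard deviation of the submitted prices (e.g. prices of 1000 and 1003 give 1.5). A minimal sketch of the per-asset check, with an illustrative threshold rather than anything specified by the proposal:

```ts
// Population standard deviation of the prices submitted for one asset in one block.
function deviation(prices: number[]): number {
  const mean = prices.reduce((sum, p) => sum + p, 0) / prices.length;
  const variance = prices.reduce((sum, p) => sum + (p - mean) ** 2, 0) / prices.length;
  return Math.sqrt(variance);
}

// Illustrative threshold; the proposal's idea is simply to pause conversions on a
// pAsset for the block whenever the submitted prices diverge too far from each other.
const MAX_DEVIATION = 2.0;

function pauseConversions(submittedPrices: number[]): boolean {
  return deviation(submittedPrices) > MAX_DEVIATION;
}

console.log(deviation([1000, 1003]));            // 1.5    -> normal block
console.log(deviation([990, 1000]));             // 5.0    -> 1% push by one group
console.log(deviation([1000, 1001, 998, 1003])); // ~1.803 -> the 4-group example above
console.log(pauseConversions([990, 1000]));      // true: conversions paused that block
```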
https://medium.com/@djohnstonec/pip-14-divergent-oracle-prices-6b8e0ac4396c
['David A. Johnston']
2020-05-05 18:12:16.716000+00:00
['Ethereum', 'Pegnet', 'Staking', 'Mining', 'Blockchain']
They’re all jumping on this bandwagon while they act like cowards and admit that Biden won.
They’re all jumping on this bandwagon while they act like cowards and admit that Biden won. Ernst and Gaetz are both ones that jumped quickly on this bandwagon. I think it’s ridiculous.
https://medium.com/@adamebadenhorst/theyre-all-jumping-on-this-bandwagon-while-they-act-like-cowards-and-admit-that-biden-one-c0f6cec142a3
['Adam E. Badenhorst']
2020-12-06 19:26:29.154000+00:00
['2020 Presidential Race', 'US Politics', 'USA', 'Politics']
10 Selenium Webdriver Tips and Tricks
Many times we face some simple issues while working with Selenium WebDriver. A few simple steps can increase the stability and execution speed of the automated tests. I have listed various use cases faced in real projects while working on Selenium with Java, along with code snippets for these use cases which can be used directly. 1. Taking a screenshot of a webpage (for failures/issues) It is sometimes required to take a screenshot of a webpage. It helps to identify problems and debug test cases. A screenshot is taken using the following code snippet: File scrFile; String pathofscreenshot; scrFile = ((TakesScreenshot)driver).getScreenshotAs(OutputType.FILE); FileUtils.copyFile(scrFile, new File(pathofscreenshot + "\\screenshot.png")); 2. Take a partial screenshot Sometimes it is required to take a screenshot of a part of the screen. For example, we may need to take a screenshot of a popup or a frame in the webpage. To take a partial screenshot use the following code snippet: String screenShot = System.getProperty("user.dir") + "\\screenShot.png"; File screen = ((TakesScreenshot)driver).getScreenshotAs(OutputType.FILE); Point p = webelement.getLocation(); int width = webelement.getSize().getWidth(); int height = webelement.getSize().getHeight(); BufferedImage img = ImageIO.read(screen); BufferedImage dest = img.getSubimage(p.getX(), p.getY(), width, height); ImageIO.write(dest, "png", screen); FileUtils.copyFile(screen, new File(pathofscreenshot + "\\screenshot.png")); 3. Execute JavaScript (.js) We can execute JavaScript to do custom synchronizations, change some values, enable/disable elements or hide/show elements. Use the following code to execute JavaScript using Selenium WebDriver: JavascriptExecutor js = ((JavascriptExecutor) driver); js.executeScript("alert('hello world');"); 4. Select/get a drop-down option from a list Use the Select class to identify the dropdown element: Select selectValue = new Select(webelement1); To select an option by index: selectValue.selectByIndex(<index in integer>); To select an option by value: selectValue.selectByValue(<string value of dropdown option>); 5. Using the drag-and-drop feature Sometimes it is required to move an element from one position to another. Dragging and dropping is not available in the basic methods provided for a WebElement. To do this, use the Actions class provided by Selenium. Use the following code snippet to drag from element webElementFrom to WebelementTo: Actions action = new Actions(driver); Action dragAndDrop = action.clickAndHold(webElementFrom) .moveToElement(WebelementTo) .release(WebelementTo) .build(); dragAndDrop.perform(); 6. Scrolling the webpage (vertical & horizontal) In Selenium WebDriver, scrolling can be done using the JavascriptExecutor. Call the JavaScript function window.scrollBy in the JS executor. ((JavascriptExecutor)driver).executeScript("window.scrollBy(0,250);"); This will scroll the webpage vertically by 250 pixels. ((JavascriptExecutor)driver).executeScript("window.scrollBy(250,0);"); This will scroll the webpage horizontally by 250 pixels. 7. Open a new tab Sometimes it is required to open a new tab in the same browser window. To open a new tab use the following code snippet: driver.findElement(By.cssSelector("body")).sendKeys(Keys.CONTROL + "t"); After running the test on the new tab, use the following code to switch to the first tab again: ArrayList<String> tab_list = new ArrayList<String>(driver.getWindowHandles()); driver.switchTo().window(tab_list.get(0)); 8. Set page load timeout Sometimes a webpage takes some time to load completely.
To set the page loading timeout use the following code snippet: driver.manage().timeouts().pageLoadTimeout(40, TimeUnit.SECONDS); This will throw a timeout if the page takes longer than 40 seconds to load. 9. Refreshing the webpage Sometimes elements on a webpage are not loaded and the page needs to be refreshed to load all elements. In most cases, elements that don’t load on the first attempt are loaded if we load the page again. There are many ways to refresh a webpage using Selenium. Some of the ways are as below: driver.navigate().refresh(); driver.navigate().to(driver.getCurrentUrl()); Let’s say the current application URL is google.co.in: driver.get("https://www.google.co.in"); We can send the F5 key to any element: driver.findElement(By.id("id1")).sendKeys(Keys.F5); driver.findElement(By.id("id1")).sendKeys("\uE035"); Here, \uE035 is the Unicode value WebDriver uses for the F5 key. We can also execute the location.reload function using the JavaScript executor to reload the current webpage: JavascriptExecutor js = (JavascriptExecutor) driver; js.executeScript("location.reload()"); 10. Switch between browser windows Sometimes, on clicking an element, the webpage opens a new popup window and it is required to switch the driver instance to the new window and perform some operation. Use the window handles for this, as in tip 7: String parentWindow = driver.getWindowHandle(); for (String handle : driver.getWindowHandles()) { if (!handle.equals(parentWindow)) { driver.switchTo().window(handle); } } After performing operations on the popup, switch back using driver.switchTo().window(parentWindow);
https://medium.com/@ankit-s-agrawal/10-selenium-webdriver-tips-and-tricks-514f5ab928ec
['Ankit Agrawal']
2021-04-26 16:37:03.478000+00:00
['Selenium', 'Automation Testing', 'Selenium Test Automation', 'Quality Assurance', 'Selenium Webdriver']
Vice-President Harris’s tone-deaf cookie choice threatens to derail Biden agenda
Vice President Harris speaking at a Press Conference in Guatemala While traveling to Central America on a tour to improve diplomatic relations in the region and tackle the border crisis, Vice President Kamala Harris handed out cookies decorated in her likeness to members of the press pool. This move quickly generated sharp backlash from members of the online right and even drew the ire of Ronna McDaniel, the Chairwoman of the Republican Party, who commented that it was the “modern-day equivalent of ‘let them eat cake.” Image of a cookie handed out by the VP Even disregarding the immediate political fallout, some have speculated that this move could harm Biden’s agenda, which has faced growing opposition in recent days. While already struggling to gain the support of moderate Democrats, this blunder may push them even further away from the embattled administration. In fact, the growing scandal, now known by many as ‘Cookiegate,’ may have caused Senator Joe Manchin to withdraw support from the For The People Act, a bill he had previously co-sponsored. For some Democrats, this latest misstep by the administration has brought back memories of Hillary Clinton’s cookie troubles. During her husband’s 1992 Presidential Campaign, Clinton stated “I suppose I could have stayed home and baked cookies and had teas, but what I decided to do was to fulfill my profession, which I entered before my husband was in public life.” This statement received immediate backlash and was a significant blow to the Clinton Campaign. Democrats today can only hope that voters will soon forget about Cookiegate, so that it will not hurt their chances in the 2022 midterm elections. Even so, many on the left have disregarded Cookiegate, noting that it will most likely have very little impact on Biden's political capital. However, even they cannot predict the future, and only time will tell what comes of this latest cookietastrophe.
https://medium.com/@mfox99/vice-president-harriss-tone-deaf-cookie-choice-threatens-to-derail-biden-agenda-20da6f102426
[]
2021-06-08 22:03:27.331000+00:00
['Joe Biden', 'Cookies', 'Joe Manchin', 'Kamala Harris', 'Politics']
Running productive distributed teams
At Distributed we manage distributed teams that deliver work in hard-to-hire-for and expensive areas of digital marketing and tech on behalf of our clients: from content creation, through API design and analytics integrations, all the way up to the development of Artificial Intelligence. We deliver this work at a higher quality, in less time, and at a lower cost than centralised agencies. It's not been easy building Distributed, but it's working incredibly well, and we're now in a position to scale and add a few more clients to our roster. I'm not going to bore you with the story of how we built Distributed over the past 5 years; if you want to read more about us, please check out this post that I wrote a little while back. I am, however, going to detail a few of the lessons we've learned about building and running distributed teams, and hopefully offer some valuable insights that may help you reap the benefits of building your own distributed team.
https://medium.com/distributed/running-productive-distributed-teams-6c68fbd15d41
['Callum Adamson']
2017-06-30 07:29:19.095000+00:00
['Remote Working', 'Health', 'Distributed Teams', 'Future Of Work', 'Future']
Common Medications May Increase Risk of Alzheimer’s Dementia
Common drugs such as Advil, Tylenol, Benadryl, Sominex and many others may increase the risk of cognitive decline and Alzheimer's disease, according to a study published this fall in the journal Neurology. Known as anticholinergics, this broad class of drugs includes over-the-counter and prescription medications used to treat a wide range of conditions — everything from allergies, sleep issues and mood disorders to high blood pressure, bladder problems, chronic obstructive pulmonary disease, asthma and Parkinson's. Anticholinergics work by blocking the neurotransmitter acetylcholine, which is responsible for learning and memory as well as for muscle contractions. More than 600 medications have some degree of anticholinergic activity. Researchers from the University of California San Diego School of Medicine tracked the cognitive function of 688 older individuals, all of whom had normal memory and cognitive function at the start of the study, over a ten-year period. Participants had an average age of 74 and were equally divided by sex. Nearly one-third of the participants were taking anticholinergics; among those using them, the average was 4.7 anticholinergic drugs per person. Researchers administered cognitive tests annually to participants and checked for genetic risk factors for Alzheimer's as well as biomarkers (biological signs) indicating the presence of the disease.

Who's at greatest risk for Alzheimer's disease?

Anticholinergics are metabolized by the liver and normally have short-term effects on the body. However, older adults metabolize medications more slowly, which puts them at an increased risk of adverse drug reactions. Researchers also found that genetic and biological factors can increase the risk of cognitive impairment. Among anticholinergic drug users, those with a higher genetic risk for Alzheimer's disease were 2.5 times more likely to experience cognitive impairments, while those with biomarkers for Alzheimer's disease were four times more likely to experience cognitive impairments. Study authors noted that 57% of the participants were taking twice the recommended dosage of anticholinergics, while 18% were taking four times the amount. In addition to the relatively small sample size, one of the limitations of the study was that participants were predominantly white and well-educated.

Safety recommendations for anticholinergic drug users

It's important to talk to your health care professional about the risks and benefits of anticholinergic medications based on your personal medical situation and history, and to have them regularly review all your medications for potential interactions and overall safety. As well, many pharmacists will review medication lists for free, offering their recommendations on dose adjustments or the need to discontinue medications if there are safety concerns.
https://medium.com/@medtruth/common-medications-may-increase-risk-of-alzheimers-dementia-1e9637973f39
[]
2020-12-19 01:13:32.187000+00:00
['Patient Safety', 'Research', 'Public Health', 'Dementia', 'Alzheimers']
D2D Universal Book Links — and why I won’t use them
In a previous article, I waxed lyrical about ‘going wide’ with Draft2Digital [D2D]. I still think it’s a good service, but there’s one feature I won’t be using, despite its convenience, and that feature is their Universal Book Link [UBL]. The UBL works like this: when you click on the link to a book, you’re taken to the D2D sister website, Books2Read. There, you’re presented with this screen: Example of a Books2Read screen showing all the available retail links to Miira Notice the Kindle icon on the far left? If you click on it, you’ll be taken to a second screen where you can choose to make Amazon your preferred e-tailer: Uncheck the tick if you don’t want Amazon as your preferred e-tailer. Clicking the ‘Continue’ button [with or without setting Amazon as your preferred e-tailer] takes you, finally, to the Amazon page for the book. Handy, right? Except there’s one problem, when I click the Kindle icon, I’m taken to Amazon Australia, not Amazon.com: Amazon.com.au is the Australian version of Amazon Why is this a problem? It’s a problem because I set the booklink to the US website, not the Australian one. At first I thought I’d made a mistake, so I create the booklink again. Same result. What the…? The following screenshot displays the response from D2D support: A copy of an email from D2D The important part is this: “Hello — Our UBLs geolocate….” That means anyone who clicks on that UBL will be tracked, without their knowledge or consent by a company they have not chosen to share that information with…i.e. Draft2Digital via its sister company Books2Read. Now, geolocation is not new. Every time I log into Amazon, it tries to get me to change my account to Amazon Australia because it knows that’s where I live. If you click on one of my book links, you’ll be taken to the appropriate Amazon store for your location. Again, because Amazon knows where you live…i.e. it’s tracking you too. But at least there’s an implicit choice there because you choose to visit the Amazon store, wherever it may be. However, when Books2Read tracks your geolocation, it oversteps a subtle line in the sand because it is simply an intermediary, and you did not choose to give it that information. There’s nothing malicious about Books2Read tracking your geolocation when you use a feature on their website. But that’s not the point. Everyone should have the right to choose whether to hand over that kind of information or not. And that’s why, despite the convenience of using UBLs for my books, I can’t do it. I know a lot of people reading this don’t see the issue of privacy as important. Others don’t like the lack of privacy but will live with it in exchange for some perceived benefit to themselves. Either way, it’s a choice. Their choice. But I have a choice too, and it’s a moral one. I can’t talk the talk about privacy while allowing potential readers to be tracked via my books. I can’t make an exception for my case simply because it’s expedient. This is a choice every company on the internet has to face. So while I’m just a lowly sole trader, the choice is still mine to make: there will be no UBLs associated with my books. Each link will lead exactly where it’s supposed to go. acflory
https://medium.com/tikh-tokh/d2d-universal-book-links-and-why-i-wont-use-them-f185ed585c12
['A.C. Flory']
2018-04-21 01:49:02.485000+00:00
['Geolocation', 'Tracking', 'Self Publishing', 'Universal Book Link', 'Draft2digital']
Soulmates
Soulmates A Journey Together Photo by Sharon McCutcheon on Unsplash He feels so close to me Yet so faraway from reach But my heart is comforted Solely by just that proximity Created by my thoughts Slowly and steadily we shall One day sooner or one day later Grasp in our hands and squeeze With pleasure and accomplishment Those coarse sheets that announces… Only then I know shall this feeling Culminate into a real expression And the beautiful fantasy that comforts Be more than our imaginations As we may now walk hand in hand… Our joining transcending the physical And our souls resonating with power The highest compassion ever found The magic of being manifesting for years… In a litter of beings made of me and him In our arduous journeys we share our strength And as our brood prospers and find their paths The seasons too gradually herald the greys As our spirits ebbs with the night we hold on still Till together alas we bid our final adieus!…..
https://medium.com/illumination/soulmates-e91644b73fef
['Lady Foxx']
2020-10-24 15:11:47.027000+00:00
['Poetry On Medium', 'Relationships', 'Illumination', 'Soulmates', 'Poetry']
Surgical Microscopes Industry Development And Business Trends
surgical microscopes market Market growth is influenced by the following factors: increasing use of fluorescence image-guided surgery (FIGS); an increase in the number of surgeries and growing demand for MIS; and advancements in healthcare facilities along with technological advancements. Global market size: according to the research report, the surgical microscopes market is poised to reach $915.6 million by 2021, at a CAGR of 12.5% from 2016 to 2021. Leading players: the surgical microscopes market is highly competitive in nature. Some of the major players in this market include Novartis AG (Switzerland), Danaher Corporation (U.S.), Topcon Corporation (Japan), and Carl Zeiss AG (Germany). The strong position of these companies in the market can primarily be attributed to their global presence and broad product portfolios. Other major players in the market include KARL KAPS GMBH & Co. KG (Germany), Alltion (Wuzhou) Co. Ltd. (China), and ARRI Medical (ARRI Group) (Germany). Top market segments: on the basis of application, the operating microscopes market is segmented into neuro and spine surgery, plastic and reconstructive surgery, ophthalmology, gynecology and urology, oncology, dentistry, ENT surgery, and documentation. In 2016, the neuro and spine surgery segment is expected to account for the largest share of the market due to the increasing demand for surgical/operating microscopes in neurosurgery. On the basis of end users, the operating microscopes market is classified into hospitals and out-patient facilities. Hospitals are the major end users in the surgical microscopes market, owing to the increase in minimally invasive surgery (MIS) procedures and the need for high-speed diagnostics. Geographic overview: North America and Europe are established markets for surgical microscopes, due to the broad technical applications of surgical microscopes and improved healthcare facilities in those regions. However, Asia-Pacific is expected to show a high growth rate over the next few years in the global operating microscopes market. This is due to the immense potential of healthcare services in the region and increased investment in R&D. Increasing healthcare awareness is also fuelling the growth of the surgical microscopes market in emerging countries. Download PDF Brochure: https://www.marketsandmarkets.com/pdfdownloadNew.asp?id=179225920
https://medium.com/@sara-keller/surgical-microscopes-industry-development-and-business-trends-d6bbc9573831
['Sara Keller']
2021-06-17 10:29:04.197000+00:00
['Surgical Technology', 'Medical Devices', 'Healthcare Technology', 'Microscope', 'Business Development']
Analysis interactivity: Automation jobs by the ONS
With that in mind, I thought about another story that used a chatbot to interact with the reader. A few months back I read a data analysis from the ONS about the jobs at higher risk of being automated. And for it they used a chatbot, which felt a bit ironic but at the same time was on point. Before being introduced to the robot, the reader is given a general explanation of the results of their analysis, such as how many jobs are at a high risk of being automated. And then comes in the chatbot who allows you to “play” with him and you’re given two options at the beginning: “give me a specific occupation” or “general facts”. Once that first choice made, you’re given other options: age, gender, or place in England. It’s an interesting way to interact with the reader and catch your curiosity about a subject that may affect their profession. Different types of interactivity But the interesting part is that the bot serves more as an addendum to the story, you can either choose to ask it about information, or scroll down and look at the interactive charts/map. This could be considered as Jensen’s “Consultational” type of interactivity. You can interact with the same data in different ways. Giving you freedom to choose how you want to interact with the information. If you are reading it via your computer, you can easily skip through the section you are most interested in knowing more. And each part has a succinct explanation on what they were able to extract from the data they got. Similar to what you may have been told by the bot. Back to Jensen’s types of interactivity, those charts are “Transmissional” since you can hover with the mouse and get extra information. Or even search for a particular job you have in mind. You can either type at the top a profession, or hover between each category and compare who’s more at risk The first interactive chart is about the jobs (see picture above), and they are reunited in 9 categories, and ordered through least threatened to highest chance of being automated. Each dot is a profession; and if you point at it, you’ll see which one it is; and the percentage of risk from automation. It’s a simple and easy way to understand the information and involve the reader in looking the data. Presentation The structure of the article reminds the Martini Glass Structure, where the reader is first given a brief explanation about the data analysis made by the ONS and with the bot as an introduction to it. Nothing forces you to use it, but it is implied to at least ask the robot a question to later have liberty to go through the different sections and have a different interactivity. Another great feature is the fact that the ONS allows you to use their content, with a simple click you can either download the data or embed the code of the chatbot or the graph; this way you can publish it on your website, or your news article. After each graph, there’s a link to either download the data or copy the chart to your website Is it news for everyone or journalists The ONS is not a news organisation, and thus you can sense it was not necessarily written for that format; despite not following the inverted pyramid, you still are receiving valuable information. The content is not complicated to understand for any reader, but at the same time, there’s plenty data material for journalists to work with it and write a story about a particular subject (e.g. “Which city in England is the most likely to be exposed to automation?”).
https://medium.com/narrative-from-linear-media-to-interactive-media/analysis-interactivity-automation-jobs-by-the-ons-9f1d10927629
['Jonathan Touriño Jacobo']
2020-11-02 11:16:49.334000+00:00
['Data Journalism', 'Interactivity', 'Job Automation']
My 2020 S.M.A.R.T. Goals
I had a great plan to kick off 2020. Since Meghan and I couldn’t be together at midnight (she was performing a holiday show with Second City and I was performing with Hot Seat at iO [R.I.P.]), I cooked up a plot to host a 3 A.M. celebratory feast. Before I left for the theater, I prepped the ingredients for Alison Roman’s “butcher’s steak”, and stuck a bottle of champagne on ice. How romantic! However, I failed to account for one variable that every improviser worth their free show drink should have seen coming from a mile away: How hammered drunk I’d get after the show. We bombed and then got bombed. We ate shit in front of a half full Del Close Theater (again, R.I.P) for a group of boozed up regular Joes and Janes who realized at about 10:35 P.M. that the last thing they wanted to do on New Year’s Eve was watch a 10:30 P.M. improv show. I can’t blame them. But, improv is about nothing if not collaboration, so we laughed it off by working together to find the bottom of a bottle of Maker’s Mark. Three drinks past two drinks too many, I came home ready to put the finishing touches on my romantic twilight dinner. I popped the champagne, threw the steak on the stove, and then, nearing the finish line, plunged a chef’s knife straight into a hunk of my thumb that appeared to be masquerading as a bit of jalapeno. My thumb bled…a lot. I panicked. While the steak overcooked, Meghan and I debated whether I should go to the E.R. (I said yes [I’m Jewish], Meghan said no [she’s normal]). By the time the bleeding subsided, the feast was a calamity. Little did we know, it was less incident than omen. For those who have read my S.M.A.R.T. goals recaps in year’s past, welcome back to this now annual exercise. For those who are confusedly reading this for the first time, I’ll briefly explain the purpose of this post. Starting four years ago, I began to use the corporate goal-setting framework known as “S.M.A.R.T.” (Specific, Measureable, Attainable, Relevant, Time-sensitive) in my personal life. At the end of 2016, which had seen me spend the better part of a year wading depressingly through the cesspool of political Twitter, I felt like I needed more structure to motivate less toxic behaviors. So I used the S.M.A.R.T. framework to prompt actions related to things like reading, cooking, and exercise that I felt would prove more satisfying than night after night spent gawking at strangers’ Twitter arguments. It worked! While I still overindulge in the internet (particularly Twitter and Gchat), having other focal points accompanied by specific, behavioral goals has been a boon for me. Each year, I issue myself a public report card here. This comes from both my instinct for performance and attention-seeking (what do you think got me on that improv stage!?), as well as a real hope that what has helped me might help others in similar ways. Needless to say, 2020 has been the most challenging year since I started employing S.M.A.R.T. goals. The virus’ spread rendered some goals impossible, and others unimportant. Things that seemed relevant on January 1 felt trivial three months later. I set more goals than ever before, but they were designed to suit a lifestyle that has changed in some fundamental ways. I had originally grouped my goals in what I felt was a clever and broadly-encompassing set of categories: Mind, Body, Spirit, Community, and Finances. Based on the events that actually transpired in 2020, I’m going to ultimately present them in a different manner: The Good, The Bad, and The Obsolete. 
Before I do that, I usually dedicate a few paragraphs to a general annual life recap. This is normally the space where I recap travel highlights and other significant life developments that occurred over the previous 12 months. In the past, those updates have included things like getting engaged, getting married, adopting a dog, buying a home, and travelling to various parts of the world to see friends, celebrate their milestones, or just generally explore. This year…um …how do I say this…there’s been…well, let’s see…not much happened. I’ve spent most of the last nine months shuttling back and forth between my home office and living room, occasionally interrupting that routine with dog walks, dinner and rounds of golf. It’s not a routine they write novels or make action movies about. So, rather than a broad life recap, in the spirit of gratitude, I’d like to just share a few things and people from 2020 that I am especially grateful for. 1. Meghan — What kind of monster would I be if I didn’t start with gratitude to my wife, with whom I spent approximately 10,000 of 2020’s 8,760 available hours. This was an inordinately hard year — especially Second City’s suspension of live performances — but she has often made me laugh so hard I cried. Especially when she described her desire to create a “National Doppelganger Registry” where people worldwide could search for their doppelgangers as a way of, “uniting the country.” She presents this idea repeatedly and with earnest conviction. Thank God for her. 2. Thome — What kind of monster would I be if I didn’t follow gratitude to my wife with gratitude to my dog. She’s legitimately crazy (something’s not right in her head), but she’s my weirdo and I love her immensely. I don’t know what I’d have done this year if I couldn’t go upstairs 100x each day to give her short, spastic bursts of pets. 3. Winnemac Park & the Winnemac Park Crew — This year has greatly deepened my appreciation for the importance of public parks, especially in city environments. We’re fortunate to have one of Chicago’s best and most underrated parks a few blocks away, and doubly lucky to have several close friends who also live in walking distance. I wish I had more pictures from Winnemac Park or had a data point I could site that captured how often I visited this space in 2020. Between dog walks and social visits, there were many days I found myself at Winnemac three or four times. I’d guess I entered the park at least 500 times this year, and likely far, far more. Here’s a picture of our lil’ Winny gang. We spent many a summer evening on these laws, and I treasured those little respites from the house. Even when the pandemic ends, I hope that the park maintains its place in our lives. I love it there, and all these people here very much. 4. Golf — The awfulness of this pandemic experience improved meaningfully for me when Illinois lifted its closure of golf courses on May 1st. I cannot say enough how lucky I feel that a thing I love doing so intensely was one of the few hobbies permitted to thrive throughout this pandemic. I played a LOT of golf this year. 53 full 18 hole rounds, plus a couple of partial rounds that didn’t make y spreadsheet. I played some of the best golf of my life, and a lot that left much to be desired. I started the year as an 8 handicap and ended it as one too. But the journey back to where I started was spectacular and I’m thankful it occurred. 5. 
My golf pals — A strange feature of my psyche is that at the end of every major phase of my life, I have been sure that I’d never make any more meaningful friendships. When high school ended, I was certain my high school friends would be the only true friends I’d ever know. When college ended, I was equally certain that I’d NOW permanently filled out my friend roster. Comedy proved my hunch wrong again, but I suspected this time, it was truly the end of the line. And for a while, I finally seemed to be correct. Over the last several years, as I fell into a familiar routine, my high school, college and improv friends have made up my full and complete social universe. For better or worse, I believed I had at last settled on my final set of friendships. Then this year, thanks to my involvement in the NewClub Golf Society in Chicago, for the first time in a long time, new friendships thundered into my life. In a year like 2020, having new friendships sprout up, deepen, and expand has been a massive bright spot. 6. Baker Miller and RuffHaus — When your world shrinks in size, the things that remain take on added significance. Having local businesses where you feel a #morethantransactional connection can be a real source of joy and positivity. For us, that’s been Baker Miller, a bakery, restaurant and coffee shop in Lincoln Square, and the nearby pet store, RuffHaus. Baker Miller has been a shining example of a small business that’s been both resilient and safe. They turned their front window into a takeout counter for both live and pre-ordered pickups, they created a “bread line” delivery service, and they keep a stock of biscuit bites as dog treats for visiting pups. Similarly, RuffHaus always remembers Thome by name and has treats for her ready at hand. Having two welcoming businesses we can visit regularly has given us a small routine we’ve relied on and treasured this year. Here’s a picture of Thome at the Baker Miller window awaiting her treat. 7. My job — If 2020 didn’t make you appreciate your job, what will? The spring months at work were surreal and painful. I’ve never experienced anything quite like the daily need to find ways to simply put one foot in front of the other while the world sorted itself out. I’ve learned a lot this year and have a much deeper appreciation for all the things a job provides. 8. Joe Biden — People use the term Trump Derangement Syndrome jokingly, but it’s very real. I know that, because I have it. I’m not being dramatic or hyperbolic in saying that my loathing for Trump has altered me in permanent ways. This entire S.M.A.R.T. goals program was born out of the sense of emptiness and despair I felt following the 2016 election. Over the course of the last four years, Trump has cast a shadow over everything. Just his looming, awful Trumpness has soured my feeling about many things I have loved. I am aware of Biden’s shortcomings as a candidate and President-elect. He was not necessarily my first (second, or even third) choice in the primary field (#Klobuchar2020). And I have struggled at points to fully celebrate his victory because I find Trump’s enduring support utterly depressing, distressing and disgusting (the Three Ds of Derangement). However, I will be grateful to Joe Biden forever for merely pushing Trump a bit further away from my second-by-second consciousness. 9. My nephew, Holden. When 2020 started, my nephew Holden was a 16 month old with just the first inklings of verbal ability. Today, he’s a 28 month old who talks ceaselessly. 
Here’s a picture of Holden being a blissfully unaware that he has no idea who Donald Trump is. Ok, 1,786 words later, let’s get into the goals themselves. Below is a snapshot of the goals I set for this year and what transpired: The Good, The Bad and the Obsolete. THE GOOD Adopting a more meatless diet When I think about where my goals program has really yielded considerable change in my life over the last four years, it’s primarily in the following areas: it prompted me to learn to cook, it has helped me slowly but methodically lose weight, and it has transformed my diet to be more environmentally responsible. When this goals program started, I ate meat so mindlessly, I could barely quantify the amount I was consuming. If you had asked me at the time, I would have labeled myself a “moderate” meat eater. Then if you asked me what I ate that day, I’d be likely to say a ham, egg & cheese sandwich from Starbucks for breakfast, a turkey sandwich from Potbelly’s for lunch, and roasted chicken legs from Sukur’s Place for dinner. (Remember, at this point I stored sweaters in my oven and spent approximately $1.2M a year on takeout.) (Also, please order delivery from Sukur’s Place. If they go out of business during the pandemic I’ll be inconsolable and I can’t afford to keep them going by myself.) I was stunned to realize how much of our personal carbon footprint owes to meat consumption. While I don’t expect I’ll ever convert to full vegetarianism, prior to 2017, I frequently ate meat I barely liked out of nothing but force of habit and lack of imagination. My goal in this has always been to become more environmentally friendly by simply replacing mindless meat consumption with appealing meatless alternatives. At first I focused simply on breakfasts and lunches, where most of my “mindless meat” was consumed. Over the last two years, I’ve expanded that to increasingly include dinners. This year, I set out to have 250 meatless day times (roughly five days a week), and 150 fully meatless days (roughly three days a week). I ultimately had 252 meatless day times (meaning pre-dinner), earning an A grade, and 148 meatless dinners, earning a B grade. The B grade may seem harsh, but mostly reflects my loss of pace on the goal throughout the year. Over the four quarters, my meatless days declined from 47 to 38 to 32 to 31 days, respectively. Overall, I’m pleased with my progress here. I really have transformed my diet over the last 4 years. I’d guess I’ve conservatively reduced my meat consumption by about 75 percent. In addition to the environmental benefits that inspired this goal, it has helped me lower my cholesterol, lose weight, and feel better. My struggles with meatlessness remain the same year over year. For instance, should I suffer an accidental death, it’s likely to be because I choked on a six ounce bite of spaghetti while a sobbing Meghan pleads, “Smaller, so you can chew!” Pasta is my true weakness. Nevertheless, I’ve discovered several staple dinners that I find truly delicious and can make over and over again without complaint. Here’s a few of those. · Punjabi chana · Spicy white bean stew with broccoli rabe · Broccoli and egg fried rice · Baked feta with broccolini · Ramen with charred scallions and green beans Reading 20 Books One row on my book shelf is dedicated to my absolute favorite books. By just a weird run of luck, 2019 saw five (five!!) 
additions to that shelf: Middlesex, The Unbearable Lightness of Being, The Goldfinch, A Visit from the Goon Squad, and The Elegance of the Hedgehog. It was unlikely a year like that would repeat itself and, well, it didn’t. I read several books this year that I liked very, very much, but probably none that will find a place on “The Shelf”. Nevertheless, I set out to read 20 books this year and landed on that number exactly. Some were long, dense, and heavy. Others were thin, light and fun. Most importantly, there was never a sustained stretch this year where I wasn’t reading something. If were to recommend one book I read from 2020, it’d be… · Exhalation, by Ted Chiang. I am not normally a science fiction person, but this is the best example of the genre I’ve ever encountered. While it is wildly imaginative in terms of envisioning future technologies, it handles those technologies with a matter of fact tone that I greatly enjoyed (as opposed to the sort of nerdy tech worship you often see if other form of sci fi). More importantly, the book does an amazing job conceiving of the types of really difficult moral and emotional situations these technologies would cause us to confront. There are three particular stories in here I don’t think I’ll ever shake or forget. If I were to recommend three books I read from 2020, they’d be… · Pachinko, by Min Jin Lee. I love multi-generational immigrant stories. When they’re told well, there’s nothing quite like experiencing historical progression through the eyes of well-constructed characters. This is why I loved Middlesex last year, and why I loved Pachinko this year. I had very little sense of what early 20th century Japanese colonial rule in Korea consisted of before reading this book, nor how badly Koreans have been treated in Japan historically. While no individual character was as compelling in Pachinko as Callie, Desdemona or Milton were for me in Middlesex, it’s a must read. · Americanah, by Chimamanda Ngozi Adiche. It took me about 75 pages to get my bearings in this book, especially the early portion of the story set in Nigeria, but once the story shifted to the U.S. and London, I absolutely devoured this story. Whereas Pachinko tells the type of multi-generational story I love, Ifemelu provides a character akin to Callie in Middlesex who is just portrayed in an emotionally pitch perfect way. There are two particular scenes from this book that are burned into my brain. Other books I enjoyed and would recommend unequivocally … · The Secret History, by Donna Tartt · Trick Mirror, by Jia Tolentino · A Gentleman in Moscow, by Amor Towles · White Fragility, by Robin DiAngelo · The Great Believers, by Rebecca Makkai · Deacon King Kong, by James McBride Other books I read that I might or might not recommend, depending on the situation… · The Shell Collector, by Anthony Doerr · The Privileges, by Jonathan Dee · Eat Joy, by various authors · And the Band Played On, by Randy Shilts · Out East, by John Glynn · The Lehman Trilogy, by Stefano Masini · The Virgin Suicides, by Jeffrey Eugenides · Never Let Me Go, by Kazuo Ishiguro · The Vanishing Half, by Brit Bennett · A Long Way Down, by Nick Hornby Breaking 80 in Golf As mentioned earlier, golf was a tonic to so many of 2020’s struggles. I played a LOT, and a lot of it poorly. I started the season with a trip to Florida in which I played absolutely awfully, then picked that shitty form right back up when the weather warmed here. 
I actually got so dejected, I decided to take my first golf lesson in 20 years. After shooting an 89 that felt a lot worse during a NewClub tournament at Ravisloe in July, I stuck my tail between my legs and dragged myself to see Max Evans. Max made some tweaks, and more importantly, just gave me something new — and inherently less negative — to think about. The next day, I drove up to Kenosha to play Kenosha Country Club, a great, recently restored Donald Ross course in Wisconsin. Each year, I aim to bank at least one sub-80 round. At that point, I wasn’t even sniffing those kinds of numbers. Over the previous month, my scores had been 83–92–84–89–85–82–85–89. The 82 was from the grandpa tees. Things were feeling bleak, and my expectations couldn’t be lower. Luckily, I was playing with great guys on a course other friends had been intensely hyping. I figured it’d be nothing worse than fun, if perhaps a bit demoralizing. On the first hole. I hit a weak spinning drive into the right rough. From there, I chunked a 9 iron that failed to clear a tree and landed 100 yards short of the green. The familiar feeling of defeat was already encroaching. But whattya know, just as hope was abating, I hit a three quarter pitching wedge to about a foot from the hole and saved an unlikely par. After that, I can’t remotely describe quite what happened. It felt out of body. I’m not exaggerating when I say I walked off the 18th green feeling delirious. Over the next 10 minutes, I sat in my car and sent somewhere between 2,000 and 7,000 text messages. It’s not an exaggeration to say that about 30 people were informed of this round within 15 minutes of its completion. I don’t feel even a shred of shame about that. I mean this very earnestly when I say that I hope every person who loves golf has a similarly surreal day at some point. I focus so much mental energy on the quest to break 80, and nearly always find ways to come up just short (I shot 80 or 81 seven times after this round, never again breaking 80). With so many near misses, I’ve genuinely wondered if I had a round like this in me, or if I had already played the best golf I ever would. To score in the mid-70s just once feels like a payoff for years of heartbreaks. I know those of you who don’t play golf are rolling your eyes so hard your brain might break, but at least a few of you are nodding knowingly. Walking This post is already outrageously long, and we’ve still got a long ways to go, so I won’t try to contrive a grand point out of my experience walking this year. Last year, I averaged a little more than 10,000 steps a day. This year, I set out to average closer to 11,000 steps a day, which totals approximately 4,015,000 over the course of 12 months. With gyms shut down, regular golf, and a dog who loves to sniff and pee, I blew past this fairly easily. I averaged more than 11,700 steps a day, totaling 4.28 million steps. Grade A here. Haircuts and Dentist When I was a kid, my mom insisted I get my haircut at ladies salons. As result, during my haircuts, I was typically surrounded by middle aged Jewish women discussing their husbands’ pain-in-the-assness and their kids’ failures to appreciate that they just want what’s best for them. The haircuts dragged on for nearly an hour, and typically involved several prolonged breaks for the hairdressers and all nearby clients to discuss various romantic entanglements for several minutes at a time. 
The phrase I remember hearing most was, “Honey, there are some things about him you just can’t change.” While my friends went to cool, old timey barber shops to get $15 cuts in 20 minutes while staring at Playboys and talking about the Indians, I got to hear how Esther’s daughter Rachel’s new boyfriend is being non-committal about Hannukah. As result, I hate getting my haircut. Which is unfortunate, because I really need frequent haircuts. My hair grows fast and in strange manners. Finally, in 2019, I found a great barber who I genuinely don’t mind (my highest compliment). Shout out to Tara at Esquire Barber Shop in Andersonville. I strive to bite the bullet five times a year, and landed right on the number. Similarly, as someone who finds small talk exhausting and doesn’t love being touched, the dentist is a miserable experience for me. But go, I must, so go, I did. Moving along to “The Bad” now. THE BAD Completing one major creative endeavor For those of you returning readers, this is becoming a now annual “F” in my grading. In some ways, failing to complete a creative undertaking in 2020 was both understandable and excusable. With live performances shut down, the possibility of staging a new show was eliminated. And with work thrown into so much tumult (particularly in the spring and summer), it made logical sense to focus my energies more exclusively in that direction. While I wish I could gladly accept those excuses, I think my failure to generate creative momentum this year goes deeper, sadly. For several years, improv has been a creative “quick fix”. Two or three times a week I could show up with 15 minutes to spare and not an ounce of preparation, rev up the engine, take her for a spin, get a jolt of satisfaction from the sound of audience laughter, and head home feeling juuuuuuust enough catharsis. If ever there were a year where I needed to find a new creative outlet, it was this one. After doing 100 to 150 improv shows a year for a decade, doing zero for nine months (and counting) has certainly left a gap. You can explain the gap a million ways over, but it’s still there. Unfortunately, I think a lot of my creative stagnancy comes from a psychological uneasiness about my age and whether I’m too old to be messing around with this stuff. I think that’s an anxiety every creative person over 30 who doesn’t make a living from their art feels on a constant basis. I wish I knew how to quiet that voice in my head, but I don’t. For 2021, I might be better off focusing on how to make peace with the reality — and acceptability — of being a non-professional creative type. Rationally, there’s no shame in it. I am surrounded by many, many people who cheer me on in making stuff (even this wildly self-indulgent annual post), and I know it’s on me to distance myself from the few people who don’t. But rationality is an underdog against insecurity, so we’ll see where I’m at a year from now. Losing 16 pounds This is a goal that straddles the line between the good and the bad. I started the year aiming to see a number on a scale I haven’t laid eyes on since I was probably 23. After four years of slowly and methodically losing weight, the idea of crossing that threshold excited me. And entering Q4, I had a shot! I was down 8 pounds for the year, meaning that with a fall push, I could still achieve my goal. Instead, I ate my weight in stuffing over Thanksgiving, and developed a nasty habit of grabbing a pint of Ben & Jerry’s “Americone Dream” on each trip to the grocery store. 
I ended Q4 exactly where I started. I’m a slow, steady type. I’ve lost 30 pounds over the last 48 months. It’s an amazing combination of significant but invisible progress, like watching a snail walk itself to a bus station. I’d love to finally get across that magic number, but I’m afraid that if I deviate from the way I’ve approached things, I’ll risk a backslide. Maybe this is the year! Meditate 200 times When I close my eyes and picture the man I want to be, I’m a guy who meditates. I’m seated on the floor of a wood paneled room full of cool looking books and my eyes are closed. I weigh 180 pounds. Last night, the Indians swept the Twins. It’s exactly 6:15 in the morning and I’ve already worked out and showered. I just had my morning tea (in this scenario, I’m a tea drinker, even though I currently drink between zero and one cups of tea a year). Now, it’s time for me to meditate. I close my eyes and drift into a place of elevated consciousness. My mind is clear. I’m not worried about whether I left the shower running (did I?) or whether I forgot to turn the burner off under the tea kettle (totally possible, I should go check). I’m not wondering whether my first meeting is at 9, even though I already checked and confirmed three times. I am on an elevated plane. This is why I started 2020 by spending $59.99 purchasing a subscription to a meditation app. My goal was to adopt a habit of meditating at least four times a week, or roughly 200 times for the year. This may seem ambitious for a guy who had never before successfully completed a single mediation, but what am I if not an ambitious guy!? So, I set off on my journey…which proved to be a spectacular failure. Within a minute of every attempt I made at meditating, I began to wonder: Is the stove on? Is the shower running? Why can’t the Indians find a single decent left fielder? What’s that sound upstairs? Is Trump more stupid or evil? Does the dog need to go out? I wonder what’s happening on Twitter… Now, the meditators in my life are quick to tell me that it’s a process of slowly re-training the brain. I believe them. But honestly, I just don’t think it’s for me. F for 2020, eliminated for 2021. Make 200 to-do lists I find to-do lists incredibly helpful. I know that about myself. When I make a to-do list, I’m less likely to forget about work assignments and more likely to stay get things done quickly around the house. The problem is, I’ve never formed a habit around utilizing to-do lists. I’ll make them each morning for a week, then not make one for two months. My hope this year was to finally adopt a real habit around this. I didn’t. While I did make more to-do lists than ever before, I didn’t do it with anything remotely resembling routine cadence or discipline. The easy thing is to think I simply haven’t cracked the code technologically. It’s easy to think that there’s a needle in haystack waiting for me in the app store that will grease this habit into formation. More likely, the app will work when I can sort myself out to let it. Will definitely attempt this again this year. Make 30 birdies On a trip to Florida last January, I found myself about 210 yards from the hole on a par 4 with a large pond to the left of the green. It’s a distance I could certainly reach the green from, but as I’m holding a hybrid in my hand, all I can see in my head are snap hooks into the water and push-fades into the trees on the right. So, I grabbed a 6 iron and laid the ball up to a safe area about 40 yards short of the green. 
My friend J., genuinely disappointed, turned, shook his head solemnly and said, “Come on, man. You’re too young to lay up.” He’s right. I’m too young to lay up on par 4s. On the flight back to Chicago (this is back when flying places and having fun were still a thing) I resolved to make 2020 the year of aggressive play. I know that one of the (many) reasons I struggle to lower my scores is because I just don’t make nearly enough birdies. A big part of that is that I’m too worried about protecting bogey. It’s a great way to shoot 84 but a terrible way to shoot 77. So, this was the year of more aggressive play. I tried hitting driver just about everywhere, and I hit a lot of terrible 3-woods trying to force the issue on many par 5s. The results were awful for a long time. Atrocious. Like, amazingly so. From the time golf season opened in Chicago on May 1, it took me seven weeks — 12 rounds (12!!) — to make a single god forsaken birdie. Just terrible. Luckily, once I made one, they started to trickle in. I started to hit the ball better, play better, and give myself some more chances. After going 12 rounds without a birdie, I made 24 over my next 38 rounds. Not where I want to be but a step in the right direction. I’d love to reliably make one a round. Join a satisfying board Becoming involved with some sort of civic or advocacy organization has been a goal for me for several years. Initially, I was involved with the Illinois Environmental Council. The organization is great, but the Young Professionals focus was mostly on networking for people who work in the environmental sector with a bit of fundraising. I switched to the Young Professionals Board for the Greater Chicago Food Depository. I love the organization and want to stay involved, but it’s a behemoth and it’s hard to feel very connected. This was a hard year to get involved with a new organization, given that things like meetings currently don’t exist. For this reason, this sort of straddled the line between “The Bad” and “The Obsolete”. This year, I’m especially focused on trying to find an organization that works to clean and improve public parks, especially those in struggling neighborhoods. More to come! THE OBSOLETE I won’t belabor The Obsolete, since they’re, well…obsolete. It’s hard to learn much from something that’s been rendered irrelevant. Visit the gym 105 times I’m honestly not sure whether my gym is currently open. It was open, then it was closed, then it was open again, then it closed again I think? Who knows. Wake up by 6:30 200 times In normal circumstances, I have learned that I need to be up by 6:30 A.M. in order to work through a morning routine that gets me on a train by 8:15 and feeling good about the day. Since I’ve only been in my office once since March, the pressures on my morning routine have, um, dissipated? Is that the right word? I’m trying to say, time is a construct we no longer need, it’s just us and the virus. Spend less than $X on my credit card We changed how we managed our finances so while we actually achieved a lot of our financial goals this particular one became impossible. Dial back Gchat Just as I imagine it’d be hard to tell a smoker to give up cigarettes during a breakup, the idea that I’d quit Gchat while stuck in my basement office for nine months (and counting) is farfetched. I’ll revisit this when I’m vaccinated. CONCLUSION My whole life, it’s felt like my generation has been waiting for The Event That Would Define Us. 
As powerfully and lastingly as the financial crisis of 2008 has influenced us, I never viewed it as definitive. The pandemic has been. Between COVID, the Black Lives Matter movement, and the election, this truly uttered a reckoning. We’ve learned tremendously about ourselves as individuals — what we want out of life, what we value, who really constitutes our most valued relationships. And, sadly, we learned about ourselves as a society. In developing vaccines and mobilizing our frontline healthcare workers, we’ve shown we can do incredible things. In refusing to wear masks, practice responsible social behaviors, or sacrifice traditional ideas of things like vacations, holidays, and sports, we’ve also learned that our collective will to accomplish anything together is depressingly lacking. For better and worse, it’s certainly been instructive. As always, how these goals have stood up to unforeseen circumstances has taught me a lot. Successes and failures both gave me a clearer sense of who I am and brought me a few steps closer to my best self. If you employed similar goals last year, or will this coming year, I hope they have the same impact on you. Have a great year. And as always, thank you for reading (or skimming) this. Chandler
https://medium.com/@chandlergoodman/my-2020-s-m-a-r-t-goals-264471c929ad
['Chandler Goodman']
2021-01-01 22:33:20.402000+00:00
['Resolutions', 'Smart Goals', 'Goals', 'Self Improvement', '2020']
LIVE STREAM — American Music Awards of 2021 [Full Show]
American Music Awards of 2021, Sunday November 21, 2021 at 08 PM >> WATCH LIVE HERE << LINK : https://cutt.ly/eTnWZr0 Details : Event : American Music Awards of 2021 Date/Time : Sunday, November 21, 2021 at 08 PM Venue : Microsoft Theater, USA ✓ I do not own this song or the Image, all credit goes, It’s so Awesome. Subscribe and Share with your friends! to my channel. See for more videos!!. I want to say ‘thank you’ for being the friend!! A television show (often simply TV show) is any content produced for broadcast via over-the-air, satellite, cable, or internet and typically viewed on a television set, excluding breaking news, advertisements, or trailers that are typically placed between shows. Television shows are most often scheduled well ahead of time and appear on electronic guides or other TV listings. A television show might also be called a television program (British English: programme), especially if it lacks a narrative structure. A television series is usually released in episodes that follow a narrative, and are usually divided into seasons (US and Canada) or series (UK) — yearly or semiannual sets of new episodes. A show with a limited number of episodes may be called a miniseries, serial, or limited series. A one-time show may be called a “special”. A television film (“made-for-TV movie” or “television movie”) is a film that is initially broadcast on television rather than released in theaters or direct-to-video. Television shows can be viewed as they are broadcast in real time (live), be recorded on home video or a digital video recorder for later viewing, or be viewed on demand via a set-top box or streamed over the internet. ♕ CREDITS ♕ The first television shows were experimental, sporadic broadcasts viewable only within a very short range from the broadcast tower starting in the 2021s. Televised events such as the 2021 Summer Olympics in Germany, the 2021 coronation of King George VI in the UK, and David Sarnoff’s famous introduction at the 2021 New York World’s Fair in the US spurred a growth in the medium, but World War II put a halt to development until after the war. The 2021 World Series inspired many Americans to buy their first television set and then in 2021, the popular radio show Texaco Star Theater made the move and became the first weekly televised variety show, earning host Milton Berle the name “Mr Television” and demonstrating that the medium was a stable, modern form of entertainment which could attract advertisers. The first national live television broadcast in the US took place on September 231, 2021 when President Harry Truman’s speech at the Japanese Peace Treaty Conference in San Francisco was transmitted over AT&T’s transcontinental cable and microwave radio relay system to broadcast stations in local markets. The first national color broadcast (the 2021 Tournament of Roses Parade) in the US occurred on January 231, 2021. During the following ten years most network broadcasts, and nearly all local programming, continued to be in black-and-white. A color transition was announced for the fall of 2021, during which over half of all network prime-time programming would be broadcast in color. The first all-color prime-time season came just one year later. In 2021, the last holdout among daytime network shows converted to color, resulting in the first completely all-color network season. ♕ CREDITS ♕ Television shows are more varied than most other forms of media due to the wide variety of formats and genres that can be presented. 
A show may be fictional (as in comedies and dramas), or non-fictional (as in documentary, news, and reality television). It may be topical (as in the case of a local newscast and some made-for-television films), or historical (as in the case of many documentaries and fictional series). They could be primarily instructional or educational, or entertaining as is the case in situation comedy and game shows.[citation needed] A drama program usually features a set of actors playing characters in a historical or contemporary setting. The program follows their lives and adventures. Before the 2021, shows (except for soap opera-type serials) typically remained static without story arcs, and the main characters and premise changed little.[citation needed] If some change happened to the characters’ lives during the episode, it was usually undone by the end. Because of this, the episodes could be broadcast in any order.[citation needed] Since the 2021, many series feature progressive change in the plot, the characters, or both. For instance, Hill Street Blues and St. Elsewhere were two of the first American prime time drama television series to have this kind of dramatic structure,[231][better source needed] while the later series Supernaturallon 231 further exemplifies such structure in that it had a predetermined story running over its intended five-season run.[citation needed] In 2021, it was reported that television was growing into a larger component of major media companies’ revenues than film.[231] Some also noted the increase in quality of some television programs. In 2021, Academy-Award-winning film director Steven Soderbergh, commenting on ambiguity and complexity of character and narrative, stated: “I think those qualities are now being seen on television and that people who want to see stories that have those kinds of qualities are watching television. On January 02, 2021, WHO announced an outbreak of a coronavirus new (COVID-19) as a Concerning Public Health Emergency World. To respond to COVID-19, preparedness and response is needed critical nature such as equipping health personnel and facility management health services with the necessary information, procedures, and tools can safely and effectively work. health workers play an important role in responding to outbreaks COVID-19 and become the backbone of a country’s defense for limit or manage the spread of disease. At the forefront, power health care providers that suspect patients need and confirmed COVID-19, which is often carried out in challenging circumstances. Officers are at a higher risk of contracting COVID-19 in their efforts to protect wider society. Officers can be exposed to hazards such as psychological stress, fatigue, mental exhaustion or stigma. WHO is aware of their duties and responsibilities this big responsibility and the importance of protecting health care facility personnel. ♕ Aim This material aims to protect health workers from infection and prevent it possible spread of COVID-19 in health care facilities. This material contains a series of simple messages and reminders based on technical guidelines WHO is more comprehensive about infection prevention and control in facilities health services in the context of COVID-19: “Prevention and control infection in health services when the new coronavirus (nCoV) infection is suspected “(031 January 2021). Further information can be found in the WHO technical manual. 
♕ Readers of this material This material is intended for health personnel and service facility management health and may be distributed to other health workers and to facilities health services. The Ministry of Health can provide this material to all hospitals and government health service facilities. Copy this material needs to be provided to private physician networks, medical associations, medical, nursing and midwifery to be shared and fitted accordingly necessity. The contents of this material can be adapted into local languages ​​and placed in places in the service facility.
https://medium.com/@AmericanMusicAwards-2021/live-stream-american-music-awards-of-2021-full-show-c0fcf5ddfb47
['American Music Awards', 'Live Stream']
2021-11-17 06:05:58.079000+00:00
['Awards', 'Concerts', 'Festivals', 'Music', 'American Music Awards']
Percona Series / XtraDB Cluster, 5.7
This post will walk you through how to set up Percona XtraDB Cluster 5.7. There are 2 nodes available, which I created previously. The first one is a CentOS 8 server and the second one is a Debian 10. Based on these servers I can explain both installation processes. Quick note: creating clusters with an even number of nodes can increase the chance of cluster failure, because the consensus algorithm may fail to reach agreement (a 50/50 split of the majority vote).

Install Percona XtraDB Cluster on CentOS

I didn't want to spend more time on this, since there is a proper guide. These are the commands I applied to get a working MySQL:

# Setup repository
sudo yum install https://repo.percona.com/yum/percona-release-latest.noarch.rpm

# Turns a release location on
sudo percona-release enable pxc-57 release

# Setup Percona XtraDB Cluster 5.7
sudo percona-release setup -y pxc-57
# * Enabling the Percona XtraDB Cluster 5.7 repository
# * Enabling the Percona XtraBackup 2.4 repository

sudo yum install -y Percona-XtraDB-Cluster-57

mysql --version
# mysql Ver 14.14 Distrib 5.7.31-34, for Linux (x86_64) using 7.0

Before we do anything, let's analyse the systemd file. Here is the Service part of systemctl cat mysql; I just removed the comments. Note: I'm not an expert, and personally I didn't want to play with the "problems" of SELinux, so I typed setenforce 0. If you want to persist this state, you can find guidelines here.

Analyse systemd file

[Service]
# Needed to create system tables etc.
ExecStartPre=/usr/bin/mysql-systemd start-pre
EnvironmentFile=-/etc/sysconfig/mysql
ExecStart=/usr/bin/mysqld_safe --basedir=/usr
ExecStartPost=/usr/bin/mysql-systemd start-post $MAINPID
ExecStop=/usr/bin/mysql-systemd stop
ExecStopPost=/usr/bin/mysql-systemd stop-post
ExecReload=/usr/bin/mysql-systemd reload
TimeoutStartSec=0
TimeoutStopSec=900
PrivateTmp=false

The mysql-systemd helper is a shell script of roughly 300 lines. The start-pre step checks whether another MySQL instance is already running and whether we are in bootstrap mode. It then initiates an install_db, which basically sets everything up for starting a MySQL instance, and runs restorecon, an SELinux-specific binary (restore SELinux context), on the newly created/initialised directories. Finally, it initialises the database if there isn't a suitable mysql directory at the specified location. ExecStart starts the actual service. We can see that it uses mysqld_safe, which is again a shell script, of roughly 1300 lines. More about this script. The basedir is the path to the MySQL installation. I didn't understand why /usr is enough to set it, so I dug deeper and found this description: basically, that location has everything we need for a proper mysqld startup. ExecStartPost calls mysql-systemd again; however, it only checks whether an error happened during MySQL startup.

Start the service temporarily with systemctl start mysql (or mysqld), then get the temporary root password with sudo grep 'temporary password' /var/log/mysqld.log and change it (a minimal example is sketched at the end of this section). Since this will be the first node of the cluster, let's stop it for now: systemctl stop mysql.
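The post mentions changing the temporary root password but doesn't show the statement, so here is a minimal sketch using standard MySQL 5.7 syntax. The new password is a placeholder; note that a default 5.7 install may have the validate_password plugin enabled, which rejects weak passwords:

mysql -u root -p
-- enter the temporary password found in /var/log/mysqld.log, then:
mysql> ALTER USER 'root'@'localhost' IDENTIFIED BY 'MyNewPassw0rd!';
mysql> exit

Run this before stopping the service; afterwards, log in with the new password to confirm it took effect.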
Since this will be the first node of the cluster, let's stop it for now: systemctl stop mysql. Install Percona XtraDB Cluster on Debian You can also find the proper installation guide here. I will show you only my commands. # Setup repository sudo apt-get install -y wget gnupg2 lsb-release wget https://repo.percona.com/apt/percona-release_latest.generic_all.deb sudo dpkg -i percona-release_latest.generic_all.deb # Turns a release location on sudo percona-release enable pxc-57 release # Setup Percona XtraDB Cluster 5.7 percona-release setup -y pxc-57 # * Enabling the Percona XtraDB Cluster 5.7 repository # * Enabling the Percona XtraBackup 2.4 repository # You have to pass the password here sudo apt-get install -y percona-xtradb-cluster-57 mysql --version # mysql Ver 14.14 Distrib 5.7.31-34, for debian-linux-gnu (x86_64) using 7.0 In this case the systemd file is a bit different, because it's using the old init.d scripts. I won't go into details now. NOTE: Since I don't want to play with AppArmor, just like SELinux, I completely removed it. sudo apt-get remove apparmor Setup the cluster — Configuration details wsrep_provider : Sets the path of the Galera cluster's binary. This is what makes Percona XtraDB Cluster able to do multi-master replication. It implements a consensus layer for the nodes. wsrep_cluster_address : Basically the entry point to the cluster for the node. We have to specify at least one member of the cluster which is alive, but the best practice is to provide all of the available nodes. For safety reasons I recommend using more: in case of the failure of that one specified node, this node won't be able to re-join the cluster after a restart. Example: gcomm://X.X.X.11,X.X.X.12 binlog_format: Specifies the format of the binary logging. I don't want to go into details; here is a post about how to improve the performance of the replication with the MIXED format. default_storage_engine : Defines the storage engine behind the replication. It's InnoDB based on the default configuration. I didn't find any source of knowledge about how the cluster's behaviour changes if we change it to something else. wsrep_slave_threads : Defines the threads for parallel replication. More information. Doesn't matter in our case. wsrep_log_conflicts : If switched on, the cluster sends additional information about conflicts. I switched it ON, just to see if conflicts occur. innodb_autoinc_lock_mode : There is a post about what it really is. I don't want to deal with this part, since I prefer not to use auto increment. wsrep_node_address : Specifies the network address of the node. I prefer to set it for consistency reasons. wsrep_cluster_name : As it tells us, it is the name of the cluster. It MUST be identical on all nodes. wsrep_node_name : Unique name of the node. We can use it as an alternative to the node address. pxc_strict_mode : Controls the PXC Strict Mode. I don't want to deal with it, since it's not in the scope of this post. wsrep_sst_auth : Authentication information for SST. More information about SST later. We have to create this user during bootstrapping. It's really insecure to have the password in the file as plaintext, so here is more information about making it more secure with SSL. wsrep_sst_method : Defines which method we want to use for SST. It should preferably be xtrabackup-v2 , because in this case we use the features of the Percona toolkit. SST I don't want to write about SST, since there is already a post about it. Briefly: we have to use xtrabackup-v2 because it's the least blocking state transfer. As I wrote previously, we have to set up a user for that purpose.
mysql> CREATE USER 'sstuser'@'localhost' IDENTIFIED BY 'passw0rd'; mysql> GRANT RELOAD, LOCK TABLES, PROCESS, REPLICATION CLIENT ON *.* TO 'sstuser'@'localhost'; mysql> FLUSH PRIVILEGES; After we have created that user we can check whether it works or not. innobackupex --user=sstuser --password=passw0rd /tmp/ Setup the cluster — Node 1 Based on the configuration we checked previously, we have to change the /etc/percona-xtradb-cluster.conf.d/wsrep.cnf file. Mostly everything is configured, and it's just a test cluster, so the default values should be good to go. I only changed the following values: wsrep_cluster_address=gcomm://X.X.X.11,X.X.X.12 ... wsrep_node_address=X.X.X.11 ... wsrep_sst_auth="sstuser:passw0rd" Setup the cluster — Bootstrapping the first node systemctl start mysql@bootstrap systemctl status mysql@bootstrap --- mysql@bootstrap.service - Percona XtraDB Cluster with config /etc/sysconfig/mysql.bootstrap Loaded: loaded (/usr/lib/systemd/system/mysql@bootstrap.service; disabled; vendor preset: disabled) Active: active (running) since Mon 2020-11-09 15:43:32 UTC; 2min 26s ago ... More information about the bootstrap mechanism can be found here. It starts the MySQL instance by overriding the wsrep configuration: it sets wsrep_cluster_address to its initial value gcomm:// , which means there isn't an established cluster yet. It also sets wsrep_cluster_conf_id , which should normally be dynamic information, to 1. This tells MySQL that, for the duration of the bootstrap, this is the main instance. The work on the CentOS instance is finished for now; let's configure the Debian node as well. Setup the cluster — Configure the second node We can find the configuration in a different location, so open /etc/mysql/percona-xtradb-cluster.conf.d/wsrep.cnf . Modify the same values as we did on the first node. If we changed anything else, we have to modify those values as well, based on the first node, if necessary. wsrep_cluster_address=gcomm://X.X.X.11,X.X.X.12 ... wsrep_node_address=X.X.X.12 wsrep_node_name=pxc-cluster-node-2 ... wsrep_sst_auth="sstuser:passw0rd" Setup the cluster — Start the second node systemctl start mysql systemctl status mysql --- mysql.service - Percona XtraDB Cluster Loaded: loaded (/usr/lib/systemd/system/mysql.service; enabled; vendor preset: disabled) Active: active (running) since Mon 2020-11-09 15:43:32 UTC; 2min 26s ago So it's started, and we can check the wsrep_cluster_size with the following query: mysql> show status like 'wsrep%'; Setup the cluster — Start the first node Stop the bootstrap instance on the first node, and start mysql again. systemctl stop mysql@bootstrap systemctl start mysql systemctl status mysql --- mysql.service - Percona XtraDB Cluster Loaded: loaded (/usr/lib/systemd/system/mysql.service; enabled; vendor preset: disabled) Active: active (running) since Mon 2020-11-09 16:23:27 UTC; 6s ago Check the wsrep_cluster_size on the first node as well. Create some test database/table/data to make sure the replication works just fine. Next steps Add more nodes to the cluster because, as I have told you, an even number of nodes can cause non-deterministic failures.
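Whenever a node joins or rejoins, a quick sanity check I like to run (a small sketch of my own, not part of the official steps) is to look at the wsrep status variables on any node:
mysql> SHOW STATUS LIKE 'wsrep_cluster_size';         -- should report 2 while both nodes are up
mysql> SHOW STATUS LIKE 'wsrep_cluster_status';       -- should be 'Primary'
mysql> SHOW STATUS LIKE 'wsrep_local_state_comment';  -- 'Synced' once SST/IST has finished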
https://medium.com/swlh/percona-xtradb-cluster-5-7-b17e2aa55bbe
[]
2020-11-26 07:38:12.807000+00:00
['Sql', 'Database', 'Percona', 'Database Administration', 'MySQL']
Apache Kafka Fundamentals Part-1
Understanding the Basic Principles of Messaging Systems: You would not have come here if you did not know about applications whose data gets processed by other external applications, or applications consuming data from one or more external data sources. A messaging system acts as an integration channel for data exchange between such applications. In application integration systems design, there are a few important principles that should be kept in mind: • Loose coupling: Implementing this concept ensures minimal dependencies between applications. Tightly coupled applications are coded against the predefined specifications of other applications, so any change in those specifications will break the chain between the applications. • Common interface definition: Ensure a commonly agreed-upon data format for exchange between applications. This helps to establish message exchange standards among applications and ensures that some of the best practices of information exchange can be enforced easily. Apache Avro is a good choice for message exchange as it serializes data in a compact binary format and supports schema evolution. • Latency: The time taken by messages to be transferred between the sender and the receiver. High latency is not desirable, even in an asynchronous mode of communication. • Reliability: Making sure that the temporary unavailability of applications does not affect other dependent applications that need to exchange data.
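To make the loose-coupling principle a bit more tangible, here is a minimal sketch using Kafka's bundled console tools (assuming a broker is already running on localhost:9092; exact flag names vary slightly between Kafka versions). The producer and the consumer only agree on a topic name, never on each other:
# terminal 1 — the producing application writes messages to the topic
bin/kafka-console-producer.sh --bootstrap-server localhost:9092 --topic orders
# terminal 2 — a consuming application reads from the same topic, knowing nothing about the producer
bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic orders --from-beginning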
https://medium.com/analytics-vidhya/apache-kafka-fundementials-part-1-70baedf80662
['Yamen Şaban']
2020-11-19 17:28:17.489000+00:00
['Analytics', 'Apache Kafka', 'Kafka', 'Big Data', 'Data Science']
“It-Just-Doesn’t-Feel-Right” Bugs
Product Craftsmanship Everything I know, I try, and I Fail About Making A Product…
https://medium.com/product-craftsmanship/it-just-doesn-t-feel-right-bugs-bc0584a5065
[]
2016-03-14 11:03:30.526000+00:00
['Software Development', 'Product Management', 'Testing']
Podcast — Designing Shambala Festival
Photo credit: Jenna Foxton Shambala is a four-day festival that takes place in Northamptonshire, England with over 200 musical, comedy, and dance acts across dozens of venues and stages. It is a multiple award winner, including the Innovation Award at the 2018 UK Festival Awards, in recognition of its pioneering sustainability practices. We first met Shambala’s Creative Director and co-founder Sid Sharma back in 2019 when he attended our Design Thinkers Bootcamp. He loved that experience so much, he’s since been back on our Online Facilitation, Advanced Facilitation and Prototyping courses. In this edition of our podcast we spoke to Sid about the origins of Shambala festival, their drive for sustainability, why he wants a Design Thinking tent at the festival and why he’s also signed up for our upcoming Design Sprint Camp. Listen via: Photo credit: Andrew Whitton Photo credit: Andrew Whitton Design Sprint Camp If you’re interested in joining Sid and getting involved in our Climate Challenge, then do get in touch about our upcoming Design Sprint Camp. It is a unique opportunity to join world-class experts in design, innovation and sustainability on a real-world challenge as together we share techniques, develop skills and co-create responsible revolutions. Learn to apply Design Thinking for sustainability, develop new solutions to a brief set with The Carbon Trust and meet like minded innovators from around the world. The last day to sign up is 12/11/2021 so don’t miss out. We are delighted to announce our exciting lineup of speakers for the Sprint: Gillian Tett Gillian Tett is a British author and journalist at the Financial Times, where she is chair of the editorial board and editor-at-large, US. She writes weekly columns, covering a range of economic, financial, political and social issues. In 2014, she was named Columnist of the Year in the British Press Awards and was the first recipient of the Royal Anthropological Institute Marsh Award. In June 2009 her book Fool’s Gold won Financial Book of the Year at the inaugural Spear’s Book Awards. John Thackara John Thackara is a writer and curator working in the realms of social, ecological and relational design. He curated the celebrated Doors of Perception conference for 20 years, first in Amsterdam, later across India; he was commissioner of the social innovation biennial Dott 07 in the UK, and the French design biennial, City Eco Lab. In 2019 he curated the Urban-Rural expo in Shanghai. Now, as visiting professor at Tongji University in China, he is developing an event called LIfeworlds about ecological restoration and multi-species design. Gemma Curtin Gemma Curtin is the co-curator of Waste Age — What can design do? running at the Design Museum from 23 October 2021. The exhibition exposes our ‘take, make and waste’ economy, which has created an environmental crisis. It explores what design can do to rethink the way we produce and consume goods, how waste can be transformed into a valuable resource and how new materials and systems can reduce waste and the impact on our planet. Gemma is a Curator at the Design Museum where she is responsible for many exhibitions covering contemporary architecture, product design and fashion. John Bielenberg John Bielenberg is a designer, entrepreneur, and imaginative advocate for a better world. He is the founder of Project M, and co-founder of Future, Common, Thinknado and 3rdActivist. 
John has won more than 250 design awards in his career, including the 2013 AIGA Gold Medal for leadership in the design for good movement. In 2003, John Bielenberg created Project M, an immersive program designed to inspire and educate young designers, writers, photographers, and filmmakers by proving that their work, especially their "wrongest" thinking, can have significant impact on communities. In 2016, John co-wrote a book called Think Wrong that inspires people to conquer the status quo and do work that matters, and in 2021 he will launch Thinknado, a "tornado thinking" toolkit. Tim Ogilvie Tim Ogilvie is the Founder of Peer Insight, the US-based innovation consultancy, and is a Partner in PX Venture Studio, its venture-building arm. His practice merges creative design thinking with savvy business strategy to help clients explore new services and business models. Tim has launched innovative products, services, and businesses for companies such as Nike, Johnson & Johnson, AARP, Hewlett-Packard, Intel, and Procter & Gamble. He is also a Visiting Lecturer at the University of Virginia Darden School of Business, where he teaches design thinking. Becky Rowe is a research specialist and the owner and Head of Research at the award-winning strategic research agency, Revealing Reality.
https://medium.com/@designthinkersacademylondon/podcast-designing-shambala-festival-design-thinkers-academy-990725f6012a
['Design Thinkers Academy London']
2021-11-23 11:36:55.113000+00:00
['Music Festivals', 'Festivals', 'Music', 'Shambala', 'Design Thinking']
We’re Living in a Failed Moral State
We’re Living in a Failed Moral State The two defining tragedies of the Trump age — family separation and the disastrous pandemic response — show how far America’s moral authority has fallen. There’s been quite a lot of chatter recently about how Americans are living in a failed state. It’s not hard to see why. In addition to completely botching the response to the pandemic — with the result that we’re now seeing over 200,000 infections and nearly 3,000 deaths a day — it’s recently been revealed that Russia was able to hack into various federal agencies. While we don’t yet know the extent of the damage, it’s pretty staggering that such a thing was even possible, and even more so that there doesn’t seem, as of yet, to be a concerted response to it (certainly not from Trump, who’s far too busy ranting and spewing lies about the election to pay close attention to a lowly matter like a hostile foreign power’s cyberattacks). It seems equally important, to me at least, to recognize that we are also living in a failed moral state, both in the sense that the United States under Trump has become a profoundly, one might even say proudly, immoral nation and in the sense that, as a collective, we seem to have lost any sense of moral purpose. Though I am of course not ignorant of the fact that the US has perpetrated many horrendous crimes during the course of its long history, I’m one of those who believes that, at the very least, we have always sought to become better, that at our best we truly could become the better angels of our nature. We may not have always agreed about the best way to go about it, but at least we seemed to agree that there was a moral code that we ought to follow to the best of our abilities. However, the two great crises of the Trump Era have shaken my belief. One is, of course, the pandemic and Trump’s objectively disastrous handling of it. The other is the equally disastrous child separation policy of 2018, a humanitarian disaster whose full contours and effects will, no doubt, only be completely illuminated in the harsh light of history. In his new book, Separated: Inside an American Tragedy, journalist Jacob Soboroff gives a detailed history of that policy. His account is a haunting and sober reminder of what transpired two years ago, and for that reason alone it needs to be required reading for everyone. Furthermore, as I was reading it, I was struck by how many parallels there were between that atrocious and ill-advised policy and the federal response to the pandemic. Both times, Trump and his government tried to cover up the true extent of the problem, often by denying that there was a problem at all. Both times, it was the work of journalists that helped the public to be aware of the scale of the crisis. And both times Trump showed his absolute inability to assume a position of moral authority and leadership, no matter how many times he was presented with the opportunity to do so. In fact, both times both the president and members of the GOP showed themselves remarkably callous — at times infuriatingly so — to the suffering of others. At the same time, I was also dismayed about how quickly child separations faded from public view, even as they were still going on. More to the point, I’m also horrified that Trump came within a stone’s throw of winning re-election this year, winning the second-most number of votes of any candidate in history (Biden being first). 
How is it possible, I ask myself, that so many of my fellow Americans would be willing to forget about what he had done to those children, inflicting needless (and probably permanent) trauma on them just so that he could fulfill his xenophobic campaign promise to be tough on immigration? How could they still vote for this man, even though he literally put children in cages? How could they still call themselves bastions of morality — which the American right, particularly evangelicals simply love to do — while ignoring and brushing under the rug the heartbreaking cries of children separated from their parents? After all, it’s not as if there haven’t been periodic reminders of the horror show that unfolded in 2018. The issue even came up at one of the debates between Trump and Biden, when it was revealed that the federal government — Trump’s federal government — might not be able to reunite over 500 children with their parents. Not that Trump showed any remorse; he is, after all, a man who refuses to apologize for literally anything. I also can’t help but wonder how it is that so many people could still vote for a man, even knowing how badly he mangled the pandemic response. More sinisterly, I also can’t help but fear that the pandemic, as horrible as it was and is, will all too soon fade from our collective memory. We’ve become so inured to outrage, it seems to me, that within a year most will have moved on, and few moral lessons will be learned. It’s almost certain that Trump hasn’t learned anything, and it’s just fortunate for all of us that he will no longer be in charge come January 20. This sad state of moral affairs is, unfortunately, what happens when you elect a man who, in his entire life in the public eye, has never shown that he has even the slightest bit of compassion or empathy to offer. To anyone. Even his own family. Though evangelicals might like to worship at the altar of Trump and remain one of his most steadfast groups of supporters, the truth is that he is one of the most amoral men that have ever occupied the Oval Office. And, as James Poniewozik noted in his book Audience of One, Trump really doesn’t seem to see people as people. He fundamentally sees them as assets, as tools to be used and then discarded when their purpose is fulfilled. Clearly, many of his supporters feel the same way. Is this who we are as America? Is this really what we expect of our elected leaders, this turning a blind eye to the suffering of children, this throwing of the elderly and the disabled to the ravages of COVID-19, this total abrogation of even the pretense of moral authority? Surely, I say to myself, we are better than this, or at least we should try to be. There is hope on the horizon, of course, now that Biden is poised to take office . Whether one agrees with all of his policy positions of his Cabinet choices or not, there’s no question that he’s a man motivated by a deep sense of personal morality. A lot of this stems from his well-documented personal losses — first his wife and daughter in a car accident, then his son Beau to brain cancer. As a result, he has profound empathy for the suffering of others, and this in turn allows him to have a moral compass that Trump simply cannot and does not have. I truly hope that Biden can move us back into a more moral age, in which we can once again assume that America is a nation of morals. Given that Republicans are, even as I write this, still planning on protesting Biden’s win on January 6, I don’t have much hope.
https://medium.com/reluctant-moderation/were-living-in-a-failed-moral-state-86ca7809da90
['Dr. Thomas J. West Iii']
2020-12-19 17:20:39.092000+00:00
['Politics', 'Child Separation', 'Society', 'Morality', 'Trump']
Top 8 Must Buy Flight Simulator Games In PlayStation 4
Top 8 Must Buy Flight Simulator Games In PlayStation 4 Ogreatgames Mar 8·8 min read Experience flying in the air with PlayStation 4 flight simulator video games! PlayStation 4 offers a variety of simulation video games such as life simulation and flight simulation. Today, we are going to give you some amazing recommendations to check out when you buy flight simulation games. If you love to play as a pilot and control a helicopter or an airplane, you should buy flight simulator games. What are the best PlayStation 4 flight simulator video games to play? We got 8 breathtaking flight simulator games that you should check out right now. Take a look! Bee Simulator Dogfighter — WW2 Eagle Flight Elite Dangerous InnerSpace Island Flight Simulator Strike Suit Zero: Director’s Cut War Thunder Aside from controlling an air vehicle such as an airplane, helicopter, or jet, the player can also take the role of a bee or an eagle and soar high above in the clouds. If you want to have a fresh and fantastic gaming experience, let’s go ahead and check out these must-buy flight simulator games that you can play on PlayStation 4. Best Flight Simulator Video Games In PS4 1. Bee Simulator Genre: Flight Simulation / Action-Adventure When you buy flight simulator games, you should check out Bee Simulator. In this game, you are not going to control an aircraft but you are going to control a charming bee that can perform air acrobatics. The player can also experience navigating from the micro perspective of a bee. The game takes place in an inspired Central Park location. The environment is magnificently beautiful. The game plot revolves around a bee that is doing missions tasked by the Queen Bee. Since the bee can perform air acrobatics, its bee sisters wanted to try it as well by copying the bee. The game also features a split-screen setting in which the player can cooperate or compete with someone else. What makes this game more interesting is that it lacks violence and can be played by children. 2. Dogfighter — WW2 Genre: Flight Simulation / Shooter / Battle Royale The combination of simulation and battle royale genre in Dogfighter — WW2 is incredibly perfect. In this game, the player will assume to play the role of a pilot who will control a warplane. There are a plethora of warplanes that you can use and customize. All these warplanes were used back in the era of World War 2. The player’s main mission is to be the final survivor. There are about 40 players that you need to take down in order to be the last man “flying”. In order to be the sole survivor in Dogfighter — WW2, you have to make use of the best warplane. Some of the warplanes that the player can use are Mustang, Messerschmitt, and Spitfire. In this game, the player can also make a choice which places to spawn at the beginning of the game. This feature makes the experience unique every time the game starts. The difficulty level of controlling a warplane might be challenging but once you get used to it, you will be fine. You should definitely check this game out when you buy flight simulator games. 3. Eagle Flight Genre: Flight Simulation / Racing / Action In this game, the player will assume to play the role of an eagle and navigate the environment from a first-person perspective. Eagle Flight takes place in Paris during a post-apocalyptic setting. The game features several tasks and challenges that the player should surpass. One of the tasks that the player should accomplish is to visit the five districts and build nests on different landmarks. 
Eagle Flight also features a well-written story narrated to the player. Aside from building nets on different landmarks, a racing game element is also present. In this gameplay mode, the player should pass through obstacles such as rings. There is an added challenge in this gameplay mode. The player should evade the attacks of the enemies including bats, vultures, and crows. The eagle character has the ability to perform sonic waves to defeat the enemies and successfully pass through obstacles. Check out Eagle Flight when you buy flight simulator games. 4. Elite Dangerous Genre: Flight Simulation / Shooter / Action If you are planning to buy flight simulator games and looking for a PlayStation 4 video game that allows you to control a ship, Elite Dangerous is what you are looking for. In this game, the player has the freedom to explore the galaxy by riding a well-designed spaceship. Elite Dangerous is set in outer space in the year 3300. Experience out-of-this-world gameplay with Elite Dangerous. There are tons of activities featured in Elite Dangerous. Aside from battling with lots of players, the game also features trading, mining, and bounty-hunting. The player can acquire money and use it for upgrading the spaceship. Some of the upgrades would be cosmetic parts. If your spaceship was destroyed during battles, you can repair it or just purchase another one. 5. InnerSpace Genre: Flight Simulation / Puzzle / Adventure Innerspace will bring you an ethereal experience like no other flight simulation video game. If you want to buy flight simulator games, you should not miss out on this game. In Innerspace, the player will assume to take the role of a survivor who wanders the abandoned skies and oceans. The game is set in a fantasy world called Inverse. Unlike any other world, Inverse is unique because the gravity in this world pulls outward instead of in. It sounds intriguing and the player’s mission is to unravel the past of Inverse. The player can use a glider for flying in the abandoned skies. The glider is very useful because it can turn into a submarine too which is perfect for exploring the ocean. Aside from the sole survivor, there are also demigods left in the world of Inverse. The demigods are hoarding all the power being left. Do the demigods have something to do with the death of civilization? It is for you to find out. 6. Island Flight Simulator Genre: Flight Simulation / Sports If you are looking for a video game with open-ended gameplay, Island Flight Simulator is one of the best options you may have when you buy flight simulator games. In this game, the player will assume to play the role of a freight pilot. The player’s main mission is to transport loads of freights into different islands. The game takes place in 12 astonishing islands. Aside from transporting freights, there are tons of exhilarating missions that the player should complete when traveling on each island. When your plane is running out of fuel, there are stopovers presented in the game in which the player can lay off and refuel the plane. There are different plane selections in the game which vary in fuel capacity, speed, and sturdiness. Choose the best plane and carry out missions. 7. Strike Suit Zero: Director’s Cut Genre: Flight Simulation / Action / Shooter If you are looking for another otherworldly video game option, check out Strike Suit Zero: Director’s Cut when you buy flight simulator games. In this game, the player will control a ship. 
The game takes place in the universe where an interstellar war occurred. The game is set in the distant future in the year 2299. The main mission of the player revolves around saving the Earth from destruction by fighting the enemies using dogfighting skills. Aside from defeating the enemies, there are also 17 different missions that the player should complete. Back with competing against other ship pilots, the gameplay involves targeting the weak points of the opponent’s ships. Strike Suit Zero: Director’s Cut will take your space combat action to another level. 8. War Thunder Genre: Flight Simulation / Action / Shooter If you are looking for a thrilling shooter flight simulator video game, War Thunder is waving at you. In this game, the player will not just experience battles on air because in War Thunder, there are also several battles set on land and sea. The player is given the chance to experience three different gameplay aspects in just one game. War Thunder is indeed a total package. This game is worth checking out when you buy flight simulator games. In this game, the player can control an array list of war vehicles including aircraft, ground vehicles, and warships. These war vehicles came from different parts of the world such as the USA, Russia, Japan, and a lot more. There are more than 1,700 vehicles to choose from. There are also 100 maps to explore. War Thunder is indeed an impressive simulator videogame. Final Remarks Launching in 3…2…1! There you have it! It is now the time to choose which flight simulator video game is the best from the selection above. Each flight simulator video game is different from one another. The game plot, gameplay, and even the characters are unique. You can actually have them all and play them on your PlayStation 4. Soar high above with these breathtaking PlayStation 4 flight simulator video games.
https://medium.com/@ogreatgames/top-8-must-buy-flight-simulator-games-in-playstation-4-7f24b923c3df
[]
2021-03-08 09:36:26.906000+00:00
['Shooter', 'Pilot', 'Simulation', 'Flight', 'Videogames']
Boot a Laravel Project on Docker
Have you ever walked on the beach watching the beautiful sunset and wondered how you can boot a Laravel project on Docker? Me neither, but here we are. Since you are already here, you might already have some idea about Docker. In short, Docker is an open source tool which uses containers to create, build, and run applications. Containers hold all the necessary parts for the application, such as dependencies and libraries. Developers don't have to worry about which system/platform their application is running on. So now to the main objective: how can we boot a Laravel project in Docker? You will need these installed in your system: Docker Docker Compose You can follow these instructions to install Docker in Ubuntu. Or you can select another platform and follow those instructions. Next, let's create a Laravel project named laravel-docker. At the time of writing this article the latest Laravel version is 5.8. I will not delve deep into how to create a Laravel project in this article, since it is more about Laravel and Docker integration. Step 1: We need to create the Dockerfile first. Usually the Dockerfile is kept in the root folder, but I personally prefer a different folder because it makes things more manageable. Here you can see I put the Dockerfile in .docker/apache2/Dockerfile . If we open the file we can see the code FROM php:7.2.7-apache ARG uid RUN useradd -G www-data,root -u $uid -d /home/testuser testuser RUN mkdir -p /home/testuser/.composer && \ chown -R testuser:testuser /home/testuser RUN apt-get update --fix-missing -q RUN apt-get install -y curl mcrypt gnupg build-essential software-properties-common wget vim zip unzip RUN docker-php-ext-install pdo pdo_mysql RUN curl -sSL https://getcomposer.org/installer | php -- --install-dir=/usr/local/bin --filename=composer ENV APACHE_DOCUMENT_ROOT=/var/www/html/public ENV APACHE_LOG_DIR /var/log/apache2 ENV APACHE_RUN_DIR /var/run/apache2 ENV APACHE_LOCK_DIR /var/lock/apache2 ENV APACHE_PID_FILE /var/run/apache2/apache2.pid RUN sed -ri -e 's!/var/www/html!${APACHE_DOCUMENT_ROOT}!g' /etc/apache2/sites-available/*.conf RUN sed -ri -e 's!/var/www/!${APACHE_DOCUMENT_ROOT}!g' /etc/apache2/apache2.conf /etc/apache2/conf-available/*.conf RUN a2enmod rewrite COPY . /var/www/html/ CMD ["/usr/sbin/apache2", "-DFOREGROUND"] Explanation of Dockerfile: Here we are using the php:7.2.7-apache image from the php repository on Docker Hub. It has a configurable Apache web server. We will also need some other extensions, which we install in a later part. ARG uid RUN useradd -G www-data,root -u $uid -d /home/testuser testuser RUN mkdir -p /home/testuser/.composer && \ chown -R testuser:testuser /home/testuser We are creating a user with the name of testuser. You can create any user with any name. I am adding testuser to the user groups www-data and root. If we don't have the same user as the host user, any command we execute from the CLI will create root-owned files and directories. RUN apt-get update --fix-missing -q RUN apt-get install -y curl mcrypt gnupg build-essential software-properties-common wget vim zip unzip RUN docker-php-ext-install pdo pdo_mysql Next we execute some commands to update and install necessary packages for our web server. RUN curl -sSL https://getcomposer.org/installer | php -- --install-dir=/usr/local/bin --filename=composer Next we install Composer.
ENV APACHE_DOCUMENT_ROOT=/var/www/html/public ENV APACHE_LOG_DIR /var/log/apache2 ENV APACHE_RUN_DIR /var/run/apache2 ENV APACHE_LOCK_DIR /var/lock/apache2 ENV APACHE_PID_FILE /var/run/apache2/apache2.pid RUN sed -ri -e 's!/var/www/html!${APACHE_DOCUMENT_ROOT}!g' /etc/apache2/sites-available/*.conf RUN sed -ri -e 's!/var/www/!${APACHE_DOCUMENT_ROOT}!g' /etc/apache2/apache2.conf /etc/apache2/conf-available/*.conf RUN a2enmod rewrite Here we set some basic Apache configuration. Notice that we set our document root to /var/www/html/public, because that's where our Laravel index.php file is. COPY . /var/www/html/ CMD ["/usr/sbin/apache2", "-DFOREGROUND"] Then we copy everything from our root folder to /var/www/html/ for Apache to serve our application, and then run the next command to run Apache in the foreground. Step 2: We will create a new file in the project root directory named docker-compose.yml. Next we will write the following code version: '3' services: db: image: mysql:5.7 ports: - "3306:3306" environment: MYSQL_DATABASE: ${DB_DATABASE} MYSQL_ROOT_PASSWORD: rootp MYSQL_USER: ${DB_USERNAME} MYSQL_PASSWORD: ${DB_PASSWORD} phpmyadmin: image: phpmyadmin/phpmyadmin ports: - "81:80" depends_on: - db app: build: context: . dockerfile: .docker/apache2/Dockerfile args: uid: ${UID} working_dir: /var/www/html environment: - APACHE_RUN_USER=#${UID} - APACHE_RUN_GROUP=#${UID} ports: - "80:80" volumes: - .:/var/www/html depends_on: - db Explanation: version: 3 We state the version of Docker Compose. Version 3 is the latest one. services We will specify the containers we need and the params to build them. We will be using MySQL for our database, so we create a container named db. Here we specify the params needed. image: mysql:5.7 ports: - "3306:3306" environment: MYSQL_DATABASE: ${DB_DATABASE} MYSQL_ROOT_PASSWORD: rootp MYSQL_USER: ${DB_USERNAME} MYSQL_PASSWORD: ${DB_PASSWORD} This will pull the image for MySQL version 5.7 and expose the default port 3306. In the environment section we can specify the password for the root user or configure other users. Here we are fetching the values for MYSQL_DATABASE , MYSQL_USER, and MYSQL_PASSWORD from the env file. phpmyadmin: image: phpmyadmin/phpmyadmin ports: - "81:80" depends_on: - db Here we are configuring phpMyAdmin. The new configuration you see here is depends_on. This specifies which services need to be up before this service can be activated. In this case our MySQL service needs to be up first before we initiate the phpMyAdmin service. We can put multiple services in the depends_on field. app: build: context: . dockerfile: .docker/apache2/Dockerfile args: uid: ${UID} working_dir: /var/www/html environment: - APACHE_RUN_USER=#${UID} - APACHE_RUN_GROUP=#${UID} ports: - "80:80" volumes: - .:/var/www/html depends_on: - db Here is the configuration for our app service. The build section refers to our Dockerfile and sets any argument in the args param. Our app also depends on the MySQL service. The notable configuration is volumes. It directs Docker to mount the host's source code into the directory we specify, in this case /var/www/html . By doing this we can see the changes made on the host inside the container. We will need to specify the UID in our environment file. Just add UID at the end of the .env file. UID=1000 Now save the .env, Dockerfile and docker-compose.yml. Go to your terminal and run docker-compose up --build Note that you will need the Docker service running in your system before you execute that command.
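If you prefer to keep your terminal free while the stack is up, a detached run works just as well. This is only a small sketch of commands I find handy here, using the service and project names defined above:
docker-compose up -d --build   # build the images and start db, phpmyadmin and app in the background
docker-compose ps              # confirm all three containers are running
docker-compose logs -f app     # follow the Apache/Laravel logs of the app service
docker-compose down            # stop and remove the containers when you are done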
After the build is done, go to your browser, open localhost, and voila: Laravel is running on Docker. Laravel homepage Now, to execute commands as your recently created testuser, run this command in your terminal docker exec -ti -u testuser laravel-docker_app_1 bash This will take you into the container's bash as testuser. It will look similar to the picture below. It will be laravel-docker instead of laravel-dockerization. From here you can run all the php artisan commands for the project. So this is it for the setup of a basic Laravel project on Docker. If you want the source code and immediately want to run the project then you can go to and follow the readme file: In case you want to learn more about Docker you can read this awesome article. Also you can read this article to understand the whole setup from a different perspective. Check out other articles from our engineering team: https://medium.com/monstar-lab-bangladesh-engineering Visit our website to learn more about us: www.monstar-lab.co.bd
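As a closing sketch, a typical first session inside the container might look like this (it assumes DB_HOST in your .env points at the db service name from docker-compose.yml; the commands themselves are standard Laravel artisan commands, not anything specific to this setup):
docker exec -ti -u testuser laravel-docker_app_1 bash   # enter the app container as testuser
composer install                                        # install PHP dependencies if you haven't yet
php artisan key:generate                                # set the application key in .env
php artisan migrate                                     # run migrations against the db service
exit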
https://medium.com/monstar-lab-bangladesh-engineering/boot-a-laravel-project-on-docker-72501d3487e1
['Faisal Islam Raju']
2019-08-02 03:59:21.565000+00:00
['PHP', 'MySQL', 'Apache', 'Docker', 'Laravel']
Implications of Being Taken
Despite the fact that the UFO abduction phenomenon shatters the Western worldview and reveals us to be utterly helpless, the average populace accepts it more readily than the most culturally educated or intellectually developed among us. It's not quite apparent to me why we grow so wedded to our perspectives on the world. Perhaps, like any ideology, a complete scientific worldview provides a sensation of mastery and power. Mystery and the sensation of not knowing are diametrically opposed to the desire to retain control, and they appear to generate such horror that we fear blowing apart like Neil DeGrasse Tyson in a classified UAP Task Force briefing when confronted with a cosmos too large to understand. This might explain why our culture's intellectual and political elite appear to be most committed to maintaining the materialist view of reality. The contemporary western worldview has had tremendous success in its investigations of the physical world, discovering many of its mysteries and applying this knowledge to human goals. We have survived the harshness of winter, decreased suffering via medical breakthroughs, and learnt to connect electronically with people who live far away. Simultaneously, we have used our knowledge to develop weapons of devastation that may now easily destroy life as we know it. Our use of modern technologies to extract resources from the planet is threatening the ecosystem. We are a species that has lost touch with nature, gone berserk in the pursuit of its goals at the expense of other living creatures and the land that has provided us with life. Reversing this tendency will be a monumental job. Even as we acknowledge the danger we have created, the entrenched interests that stand in the way of finding a balance in our connection with the environment remain strong. Massive corporate, scientific, educational, and military organizations consume billions of dollars in material commodities and maintain a paralyzing stagnation that is impossible to reverse. For international business, the globe appears to be nothing more than a massive market to be split among the most astute entrepreneurs. However, there are psycho-spiritual vested interests that oppose change and may be much more powerful than financial ones. These interests are mirrored in the belief that the physical principles we know explain everything, and that if other creatures exist in the universe, they would act similarly to us. The SETI (Search for Extraterrestrial Intelligence) project, which works on the premise that extraterrestrial intelligence may be discovered by sending radio waves out into the cosmos, exemplifies this bias. The idea that superior intelligences may not choose to connect with us via such a small or restricted technical aperture, preferring instead a larger opening of our awareness, does not seem to have occurred to its creators. According to philosopher Terence McKenna, "to seek eagerly for a radio signal from an alien source is probably as culture-bound an assumption as searching the cosmos for a decent Italian restaurant". POLITICS The UFO abduction phenomenon, which strikes at the core of the Western worldview and shows us to be completely powerless, is more easily accepted by the general public than by the most culturally educated or intellectually evolved among us.
For it is, to a great extent, the scientific and governmental elite, as well as the media outlets under their control, that decide what we are to think is true, since these monoliths are the primary benefactors of the prevailing ideology. The main arena in which the reality and relevance of the UFO abduction phenomena must be addressed is therefore the “politics of ontology.” Before its full significance for our individual and communal lives can be fulfilled, it must be taken seriously and pushed out of the sensationalized tabloids and into the mainstream of society, so that the sophisticated media is free to drop their supercilious tone. The abduction phenomena poses a unique challenge for all governments across the globe. It is, after all, the job of government to safeguard its citizens, and authorities must recognize that weird creatures from radar-defying ships invading our houses and abducting individuals, seemingly in violation of the rules of gravity and space/time itself, would cause special difficulties. This may explain why official policy on UFOs has been so muddled from the start, a sort of muddled combination of denial and cover-up that only feeds conspiracy theories. Other political ramifications of the abduction phenomena exist. After all, politics, whether municipal, national, or worldwide, is a game of power. We want power in order to rule, control, or influence a certain field of activity. The abduction experience, on the other hand, encourages us to find the meaning of our “power” in a deeper, spiritual sense by demonstrating that control is unachievable, if not ridiculous, and by revealing our larger identity in the cosmos. Ethnonational conflict, which stems ultimately from the fact that we identify ourselves solely in parochial regional terms (what Erik Erikson referred to as “pseudospeciation”), is a cause of enormous misery and a major danger to human existence. The more global, even cosmically, linked identity implied by the UFO abduction phenomena may, at the at least, provide a diversion from our never-ending battles for ownership and control of the planet. At best, it has the ability to pull us out of ourselves and into potentially endless cosmic experiences. All of this, however, is contingent on taking the phenomena and its consequences seriously. ECONOMICS The economic and political ramifications of the abduction phenomena are inextricably linked. The loss of a feeling of the holy, as well as the devaluation of intelligence and awareness in nature beyond ourselves, has allowed the most powerful among us to plunder the earth’s resources without concern for future generations. Growth without restriction has become a goal in itself, as reports of economic “indicators” constantly intone, disregarding the eventual collapse that cannot be far off if human population growth continues unabated and planet pillaging continues unabated. Furthermore, if the acquisitive urge (generically referred to as “market forces”) is not reined in, inequalities in the distribution of food and other commodities that do exist may worsen, giving birth to possible instability and limitless conflict. The phenomena of UFO abductions does not directly address this problem. It does not, and will not, “rescue” us. However, it seems to be inextricably linked with the nature of human greed, the origins of our destructiveness, and the long-term repercussions of our collective conduct. The experiences may be extremely “enlightening” in the broadest sense for the abductees. 
RELIGION Some established faiths are particularly troubled by the UFO abduction phenomenon. Recognizing the strength and potential dangers of spirit entities "out there," organizations of human beings have taken on the job of leading us through the "ultimate issues" of existence from the beginning of time. Religious leaders teach us about the nature of God and decide what spirit beings or other things may exist in the universe. A variety of tiny but strong homely creatures that dispense a strange combination of pain and transcendence without apparent respect for any established religious hierarchy or dogma may have little place, particularly within the Judeo-Christian tradition. It's one thing to admit that "spirit" exists in the cosmos and that "we are not alone." It's quite another for "spirit" to appear in such an unusual and frightening shape, partly constructed in our own image. At best, this seems perplexing and difficult to assimilate. At worst, in the polarized vision of Christian dualism, these dark-eyed creatures must seem to be the Devil's playmates (ask Lue Elizondo why he resigned). Eastern religious traditions, such as Tibetan Buddhism, which have traditionally acknowledged a wide variety of spirit beings in the universe, seem to have less trouble accepting the reality of the UFO abduction phenomenon than more dualistic monotheisms, which provide strong opposition to acceptance.
https://medium.com/@528hz/implications-of-being-taken-98b8b07072a6
['Think Tank']
2021-09-14 02:50:16.285000+00:00
['Religion', 'Space', 'Aliens', 'News', 'Politics']
Capitalism and Democracy are Mutually Exclusive
This short Q+A was originally published at Balkans Post: http://balkanspost.com/article/876/capitalism-and-democracy-are-mutually-exclusive BP: How would you describe the effects of a highly capitalist economic system, namely that of the United States, on the livelihood of ordinary people living in that country? William Hawes: Well, the median individual income in the U.S. is about thirty-one thousand dollars a year. That number is high compared to most people around the world, relatively speaking. The problem in the U.S. is the high cost of living for everything — rent, food, health care, etc., which eats away at any chance for savings, investing, and retirement. Rent or mortgage payments can eat up a third or even half of wages or salary depending upon where you live, because real estate is so expensive. Ordinary workers and citizens feel demoralized, taken for granted, and powerless. People have checked out from public life and it’s understandable why: there is little community or social safety net here. There are failing schools and crumbling infrastructure in nearly every large city and rural poverty remains endemic. People feel and sense that things are going from bad to worse, but for the most part try to ignore or escape from even talking about key issues, and when they do it’s distorted, mediated, and filtered through media narratives or the spectacle of U.S. elections. It’s painful and depressing to witness. The combination of all these effects have created one of the most lonely, homogenous, conformist, authoritarian, and boring societies in all of history, in my opinion. In terms of distribution of wealth and resources, capitalism creates a system of artificial scarcity where goods and services are withheld from the population unless one can pay for them. Therefore citizens become dependent on a system where they must work or go homeless and starve. BP: What’s your take on inequality, injustice and corporate power in capitalist countries? How do these problems affect a society? William Hawes: These features are all part of capitalist economies. Their antecedents are visible in previous systems like feudalism and ancient empires: there have been strict divisions of labor, caste systems and hierarchies, and monopolistic power for thousands of years. What used to be the commons, like public land for growing food or grazing livestock, or communal forests for hunting and gathering, has been stolen and then privatized by the ruling classes, who then use private property law to enforce the theft of land and resources. The difference in contemporary times is that globalization has spread the myth across the world that free markets can always deliver basic goods and services better than every other model. The effects cannot be overstated, really. When you have master-servant relationships over generations splitting the population into a ruling class and working class, what ends up happening is the ruling class becomes numb and indifferent to the fate of those below them. A sense of superiority appears and then appeals to race, nation, caste, or tribe/clan are used to enforce inequality, cloaked in patriotic propaganda. On the other hand, the working classes become too poor, disorganized, sick, or disabled, or disillusioned enduring the drudgery of their lives to organize and fight back regularly; although as we know through history non-violent direct action as well as violent revolt and revolution by the oppressed are most often the key drivers towards making society more equitable. 
Another difference with capitalism compared to older systems is that it has managed to create large middle classes, who have managed to create greater systems of production, a reduction in the most severe poverty in some cases, and a semblance of political democracy. However, there is no economic democracy in our society, simply capitalist business owners who make all the decisions in the workplace, and their puppet politicians whose interests are diametrically opposed to those of working people. Unfortunately, the middle classes in the U.S., who fundamentally are workers and do not own the means of production, identify to varying degrees with the ruling classes due to authoritarian, hierarchical divisions of labor and class — as well as materialist and consumerist trends in our society. This happens over generations: economic and political elites use gaslighting and jingoistic propaganda to indoctrinate their citizens and claim that other nations and minorities are the enemy. Over time, 6,000 years of civilization really but intensified in late capitalism, this creates a sort of collective Stockholm syndrome effect; with most people not having any more say than medieval serfs and peasants, yet who still view their ruling class overlords as heroes and saviors: those wonderful and benevolent “job creators”. The ruling classes shatter hopes and dreams, diminish expectations, get the working classes to do their bidding, all the while harping that capitalistic “liberal democracy” is the best system ever constructed. BP: What are the links between capitalism and global warming? Is it correct to say that capitalism is the main factor behind such phenomenon? William Hawes: Capitalism relies on endless growth to continue. To maintain growth and profits, corporations rely on the cheapest forms of energy — for most of the past 200 years, fossil fuels — to run their factories, transport goods, heat and light their warehouses and office buildings, etc. Capitalism claims all the gains of corporate enterprises — profits — for those who own businesses, while dumping the risks and costs onto the public and the environment. Ever since the Industrial Revolution got underway, capitalist firms sold energy in the form of coal, gas, and oil, and reaped all the profit. Meanwhile, the impact of strip mining, oil spills, fracking, polluted air and waterways, and increase in carbon dioxide, are left for the public to deal with. Sometimes the corporations pay piddling fines or implement meager plans to restore damaged habitat, but it never can repay the true costs, because there is no way to account for the health of an ecosystem in monetary value. The pseudoscience we call economics confirms this: they claim it just and wise that a private corporation can take all the gains, while the costs of doing business — pollution, atmospheric warming, etc. — are “externalities” that cannot be accounted for in their models. Other mechanisms fossil fuel corporations use are to bribe, lobby, infiltrate, and coerce the state to subsidize their businesses, offer tax breaks, loosen regulations, block legislation and judicial oversight, and break up or if necessary murder opposing activist groups. Absolutely, capitalism is to blame and it is the key factor. That is why capitalist elites distance themselves at every chance — they live in gated communities amongst their own kind, employ private security, mercenary armies — in general they live in separate spheres of existence compared to us commoners. Why? 
Because they know they are responsible for creating and/or inflaming all these interconnected crises. BP: Is capitalism in contrast with democracy? William Hawes: Yes, capitalism and democracy are mutually exclusive. Ask yourself, in large corporations, do workers get to vote or have a say in who, what, when, where, how, and why products are made the way they are? Or are the decisions handed down by the ownership and a small cast of executives and managers? Capitalism fundamentally alienates and exploits workers and treats them as interchangeable disposable objects. In the U.S. we have an oligarchy. In other societies with high inequality the same problems appear, oligarchs fight and win concessions from governments and deregulation, privatizations, and harsh austerity measures are implemented on the poor, with tax breaks and loopholes for the rich. So all over the globe economies are teetering on the brink, and authoritarian, demagogic faux-populism in on the rise: the best examples being Trump, Orban, Duterte, Bolsonaro, Erdogan, and Modi. Capitalist societies privilege the rights of private property above all else. The limited political rights and relatively decent wages Westerners enjoy are akin to the carrot; but when workers start agitating for economic rights and racial and gender equality, those in power are quick to bring out the stick. Capitalism is inherently authoritarian. To give a quick foreign policy example, every excuse capitalist powers use to harm others is a projection of their own malevolence. For instance, in Iraq in 2003 Saddam was accused of having WMDs. Well, the U.S. used WMDs in Iraq — bombs, missiles, white phosphorus, and depleted uranium. In Afghanistan the Taliban was accused of harboring terrorists. The U.S. is the world’s greatest threat to peace and harbors, employs, arms, and uses terrorists and black operations in the majority of the world’s nations. There are somewhere between 800–1000 (the true number is classified) U.S. bases outside its territory in over 130 nations. One of the most frustrating things about living in the U.S. is how the two-party system has duped nearly everyone — class is never addressed as the fundamental issue. Liberals blame conservatives for everything gone wrong, and conservatives blame liberals and cast them as “radical socialists”. It has been extremely difficult to expand the bounds of discourse to show that the real problem is capitalism. Both sides agree that we “live in a democracy” but that’s just not the case. Americans have this strange idea that only a totalitarian dictatorship can be undemocratic, the fact that we’re allowed to choose our jobs and go shopping makes this a “free country.” The fact is we do live in a dictatorship: a dictatorship of money. Americans still view the concepts of authoritarianism and dictatorship through the lens of Cold War propaganda: any nation that has a centrally-planned economy and a Politburo is “undemocratic”, but the U.S. with “free markets” and “fair elections” is a liberal democratic country. The fact is we have a rotating cast of plutocrats, military and intelligence leaders, and politicians who all serve the Almighty Dollar. That rotating cast provides the illusion of democracy — because no one group remains in charge, no one assumes responsibility for the system — and the simulacrum of a “free society” endures. So, because no one group is permanently at the helm, there is little recognition of the scale of the problem and the source: capitalism. 
The president cannot fundamentally control or be held responsible for the economy (they can tinker with taxes, tariffs, interest rates, but only to limited degrees), corporate elites cannot outright control the three branches of government (they do so covertly through donating, lobbying, and bribery), and the military cannot dominate either (unlike with coups and juntas in other parts of the world), so the public becomes indoctrinated that there actually are checks and balances and diverse interest groups which govern America. Eventually, the public becomes fatigued at finding the true source of the problem (industrial-based capitalism) and resorts to two-party bickering, as well as nationalistic and racist propaganda to assuage its anxieties. All this being said, there are reasons to be positive — working class Westerners are awakening and mobilizing and the taboo against taking socialism seriously has been broken in the U.S. and Europe. Anti-capitalist critiques are becoming more widely accepted, and the political rights we still enjoy in the West along with a relatively uncensored digital media provide the possibilities for Leftist mass movements and organization to gain serious momentum very soon. BP: How has U.S. imperial meddling impacted the targeted countries and the whole world? William Hawes: Well, this is one of those questions where I hope a new generation of activists and scholars can help answer because there are so many areas here ripe for investigation. To start, readers may want to check out some of the most popular and accessible North American authors on this subject: Noam Chomsky, Michael Parenti, William Blum, Michel Chossudovsky, Naomi Klein, and John Perkins. The U.S. has always been an empire from its foundation. Post-WWII the U.S. of course became the world’s hegemonic power and used its military, economic, and political powers to expand its influence to every corner of the world. As the global power with unparalleled reach and nearly inexhaustible resources, the U.S. used its many forms of leverage to coerce, bribe, and steal as well as to manage and interfere in the affairs of other nations. Like ancient empires, today the U.S. extracts tribute from its vassals in various forms: fossil fuels, debt payments, mineral resources, using foreign banks to purchase our debt and prop up our currency, etc. The impacts have just been tragic and devastating: consider all the dead and wounded from our imperial wars, all those driven from their homes, all of the slave labor and inhumane working conditions that span the globe to serve Western multinational corporations. Western capitalists and our military empire are also responsible for untold environmental devastation which inevitably has caused many world health crises involving rising rates of cancer, diabetes, and various chronic illnesses; U.S. imperialism has made the world a poorer, sicker, and more dangerous place.
https://medium.com/datadriveninvestor/capitalism-and-democracy-are-mutually-exclusive-240bc5033c24
['William Hawes']
2020-03-15 09:28:58.206000+00:00
['Economics', 'Finance', 'Capitalism', 'Democracy', 'Politics']
The Data Science Internship Hunt: A Fortune 500 Story
The Data Science Internship Hunt: A Fortune 500 Story How to go about getting a Data Science Internship in the United States? It's that time of the year again when all the Master's students in the United States are either in search of internships or have bagged one. It's at times confusing and frustrating, but every student has to go through the internship hunting process. A variety of students come to the US, including both breeds: freshers and experienced candidates. Data Scientist is the sexiest job of the 21st century, and every student from the domains of Computer Science, Information Management, Engineering Management, Business Analytics, and obviously Data Science aspires to be a Data Scientist. There are a couple of postings for Data Science Internships which accept applications from the undergraduate level to the Ph.D. level. There are a lot of Ph.D. students currently enrolled in Data Science related fields (as it's super hot right now!), and thus there is fierce competition for MS students out there. Two months back, being in the fresher breed, I bagged two Data Science internship offers from Fortune 500 companies, and the journey is worth sharing. Disclaimer: This is not for you if you are a super exceptional candidate and are confident about nailing your first interview. When did I start my internship search? Well, it's always advisable to start early, and I mean really early. My graduate studies began on 27th August 2018, and I made my first internship application on 13th September 2018. Big names in the industry start the hiring process early, so this is your time to hit hard and give your best shot. There won't be too many internship openings, but there will be a couple to keep yourself busy with one or two applications a day. What all do you need for your applications? The first and foremost thing is a fine-tuned resume. Start early; since you have just entered grad school, there is not much you have to change on your resume apart from adding your new education. Let me tell you that building your resume is an iterative process, as you push changes into your resume with experience and guidance. The second important thing would be your cover letter. People say that the cover letter is what recruiters see first; that may not be true, but why take a chance? There are basically two types of cover letters: 1. story-telling and 2. skill-focused. You can go either way, as there is no perfect cover letter. Other things you need to keep ready are transcripts and internship certificates (optional). You should get all your application material reviewed by your university's career services (as you are paying for it :p) and use their expertise in crafting your application materials. OH! I forgot to tell you about the most important thing you need: TIME. I know it's a new country for you, so it's always challenging to stay focused on your goals, and people generally tend to live their American Dream from the very first day. You should enjoy every moment of your stay here in the States, but while prioritizing the things that are important for a better future. Thanks to my undergrad seniors and relatives, especially Heet Sheth, Vishal Ghadia, and Nilmani Bhanderi, who went through the same phases and guided me along the way. How to search for open internship positions? There are a couple of sources: LinkedIn, Glassdoor, Indeed, AngelList, etc. The easiest way to start off would be through LinkedIn.
Set up daily email alerts for relevant positions on these platforms as that would be an essential feature for your internship search. It’s time that you should scroll these professional network portals feed instead of your Facebook or Instagram feed! You should spend some time daily on these platforms and save relevant jobs which you would apply to at night in your free time probably. Another possible way as the title suggests is to go through the Fortune 500 list and look for open internship positions at each one of them. Data Science Positions? Data Science is a vast field so, there are a variety of positions in which you could apply: Data Science Intern, Data Engineering Intern, Machine Learning Intern, Deep Learning Intern, Data Analyst Intern, Business Analyst Intern, Analytics Intern, etc. You never know which positions you could fit in based on your profile. Never be restricted to one profile as you always want to keep your options open. You can craft your resumes based on these positions. Iterations, Iterations, and Iterations! There is no perfect resume and cover letter till the time you are not getting any calls(This doesn’t include generic calls of coding tests and automated video interviews), you should iterate over your application materials. Once you start getting REAL calls, at that point, you can think of stagnating the iteration process of your application materials. Utilize your winter breaks! All the Universities generally have a one-month-long winter break, and I feel this is the best time to bump up your applications counter. No doubt you should enjoy during winter break along with making time for applications. Even I was in Florida for winter break at my uncle’s place where I used to travel and make the applications in my free time. Till this time, most of the major players are in the market with their internship positions so, you should try to make at least 15 applications a day (At least that’s what your target should be). Rejects, Rejects and Rejects! To digest the rejects that you would be getting every day is an integral part of this whole process. Rejects are nothing but a motivation for you to apply more and look out for THE POSITION at another company where you would be the right fit! Whenever you apply, you generally get a confirmation of application email which says ‘A recruiter would contact you if you are a right fit for this position with further steps’. You won’t hear back from all the applications that you make so, all you can do is apply. I have got more than 70 rejects from 205 applications that I did during my internship hunting process. Seniors had always told me to be ready for making a thousand applications, and I was prepared for that before I came to the United States. Power of Linkedin and Networking Linkedin is the go-to portal when it comes to internship or job search. The other way around the application process is via getting referrals from someone who works at your desired company. First thing, build your network with people working in your industry. So, whenever you send a connection request, don’t forget to add a note to the person mentioning why you would like to connect with him (Don’t ask for a job in the note as it looks very desperate). Go smooth! Connect with your undergrad and grad alumni residing in the states and ask them for guidance regarding a position at their company. Don’t hesitate! Imagine some junior of yours asking for help 5 years after. Won’t you help? 
They would have gone through the same phase and are always happy to help. This technique has worked out for a lot of my friends and is tried and tested. It is simply the other route around resume shortlisting to get a call for an interview. In most cases, referrals can directly get you interviews. The Call! So, once you get shortlisted based on your resume and cover letter, most of the time you either get a coding challenge or an interview. Coding challenges may test your Python, R, and SQL in terms of technologies. They generally test your data manipulation and analysis skills. The next step then would be n technical interviews (n varies by company and can range from 1 to 5). For technical interviews, you need to be thorough with your resume (each and every centimeter of it) and know the basics of the position you are applying for. You may not know everything written in the job description, but when asked in an interview about something you do not know, you should always be positive and reply with something like 'I am presently studying this in my X coursework and will be good at it by the end of this semester'. There is a scorecard continuously running in your interviewer's mind, and you have to make sure you don't go negative on the scorecard until the last moment when he/she is open for questions. Now, this portion is the game changer! You should ask one or two questions like "What has your experience of working at this company been like?", "Can you tell me about an interesting project your team is currently working on?", etc. No matter how your interview goes, always end your interview on a positive note by telling the interviewer 'why you?' and selling yourself, finishing off the game by bumping up your score on his/her scorecard. Also, do ask when you will hear back from them. The Tech Interview Data Science being a technical position, there are technical rounds, which include coding tests, data challenges, or technical interviews. So, these are the types of questions you should be prepared to answer: Resume-based: questions on your projects and experiences. Modeling: testing your ability to choose and build a machine learning model based on situational variables: training data, feature set, test data, computing power, prediction time, training time, etc. System Design: testing you on building the entire pipeline, from extraction and pre-processing to prediction and insights. Database: questions on SQL and concepts of relational and NoSQL databases. Coding: expect a live coding round if you are applying for a data engineering position. These include concepts of recursion, back-tracking, divide and conquer, greedy algorithms, and dynamic programming (these are a rare breed). After the Interview? Now, this is the time when you have to control your mind. Your mind overthinks how your interview went and what will happen if you get the offer, and you read about the company and its locations (LOL! That's what I used to do :P), but it's better to get over that experience and get back on your application track until you finally hear something back from them. Do send a courtesy email to the recruiter who set up the interview, and ask about your application status once it's a day past the hear-back time your interviewer gave you. The Offer! A recruiter will generally either call you directly when they have an update or set up some time to talk. This call is THE CALL when the recruiter makes you an internship offer.
You should patiently listen to what he/she has to say about the offer instead of jumping out of your chair in excitement. You should surely express your excitement over the call and overwhelm the recruiter. Internship Offers and Salary Negotiations? When it's an internship, you are generally not in a position to negotiate the compensation, as this is real-world experience that the company is providing you while you are still learning, studying, and completing your degree. But when you already have an offer in hand from another company, you can surely try to negotiate, as the company may match the offer if it doesn't break their norms, because THEY WANT YOU (this worked out in my case!). Choosing the right opportunity! In case you have multiple offers, just imagine yourself doing the work mentioned in the job description at each company, and you will be able to choose wisely (compensation is secondary! 🤑). What about me? Being honest, this is the track I followed during my entire application process. I was ready to make a thousand applications to get the right internship, but because I started early, I could stop at 205 applications. I received a Data Science Internship offer from American Family Insurance Group and a Data Engineering Internship offer from CBS. I have finally decided to spend my summer with the CBS Interactive business unit of CBS Corporation in Fort Lauderdale, Florida. Why did I choose CBSi? From the first year of my undergrad studies, I had a dream to work for a media and entertainment company (because I loved to animate and design stuff), and what better opportunity could there be than working at America's most-watched network and a global top-10 web property. This summer, I will probably be writing big data pipelines for processing and modeling the data of CBS, CBS News, CBS Sports, CNET, Gamespot, Comicvine, Download.com, MaxPreps, TV Guide, last.fm, Metacritic, MetroLyrics, Chowhound, TechRepublic, TV.com, and more :p. That's all, folks! Shoot your questions regarding Data Science and Data Engineering Internships on LinkedIn or at [myLastName][myFirstName] at gmail dot com. Super excited for this summer and will keep you posted on my internship experience. All the best!
https://towardsdatascience.com/the-data-science-internship-hunt-a-fortune-500-story-534e21bae055
['Jay Kachhadia']
2019-05-01 21:34:15.078000+00:00
['Data Engineering', 'Media', 'Internships', 'Entertainment', 'Data Science']
Starting off Modern: State of Congressional Modernization Keynote with Rep.
Virtual hearings and orientation sessions are just part of the modernization program Reps. Kilmer and Davis have helped steer in the 116th Congress In the opening remarks for the First Branch Tech, Science, and Data New Member Orientation, POPVOX co-founder Marci Harris and Congressional Management Foundation CEO Brad Fitch are joined by Representative Derek Kilmer, Chairman of the Select Committee on the Modernization of Congress, and Representative Rodney Davis, Ranking Member of the Committee on House Administration, for a wide-ranging conversation on the state of Modernization in the House, advice to new Members of Congress on choosing staff wisely, and prioritizing their limited time in COVID. For a full transcript of remarks, please see here. For a summary, see below. Tech, Science, and Data Keynote on Congressional Modernization Rep. Kilmer: Despite reports to the contrary, bipartisanship is alive and well — just in hiding For example, Mr. Kilmer highlighted that the Select Committee on the Modernization of Congress was a truly bipartisan committee, and functioned in a way that prioritized members coming together to find common ground. For that to work, committees/working groups/caucuses have to set the intention to be bipartisan from the beginning, and put in the work to stick to it. Part of that is setting clear goals and keeping them centered through a Congress — again, something the Select Committee excelled at. Rep. Davis: You are now part of the institution you ran against, and are empowered to make it better Mr. Kilmer also noted that previous efforts to improve the institution have had tangible impacts that you are now going to benefit from — like electronically adding your name to bills as cosponsors, rather than sending your staff running all over the hill to do it in person, or the remote work we’ve been able to shift to in the pandemic. These have also in part come from the smart observations and ‘fresh eyes’ of new Members — so keep your eyes peeled for ways Congress can improve, and find who you can work with (including the two on this panel) to make it happen. Rep. Davis: Don’t just inherit the technology from your predecessor — make it part of your vision for how you want your office to run and act on it. And on that note, as Mr. Davis noted, CHA is a resource for you to navigate the House bureaucracy to get the tech you need! Rep. Davis: Make sure you have one person besides you in your office with decision-making authority This will not only make it so that you don’t become the complaint resolution department in your office, but you can do double the planning in the transition period for what you want your office to look like on day one. Rep. Kilmer: You might not realize this yet, but especially during COVID, your time is going to be even more limited. Set your priorities early to make sure you’re spending it wisely. COVID has changed almost everything about Congress’s day-to-day operations. Mr. Kilmer noted that this is the time to let your staff be creative with finding new ways to fulfill the traditional functions of Congress, from outreach to casework, in remote work. And, again, as Mr. Davis echoed, touching on a theme that will repeat through the conference, the lower volume of traffic through your offices is one more reason to hire your staff slowly and strategically as you get a sense for what you’re going to need.
https://medium.com/capitol-hill-tsd/starting-off-modern-state-of-congressional-modernization-keynote-with-rep-492f5155de66
['Anne Meeker']
2020-12-18 21:07:24.770000+00:00
['Democracy', 'Article', 'Congress', 'Politics']
The analysis of Forex market (10th July)
This article is a daily analysis from analyst Ray. The dollar index rebounded modestly, with Federal Reserve Chairman Jerome Powell due to give his semiannual monetary policy testimony to the House Financial Services Committee and appear before the Senate Banking Committee on Thursday. In the early hours of the morning, the Fed will release minutes from its June monetary policy meeting, which will provide deeper insight into the views of FOMC members as investors watch for further signals from the central bank that it will cut interest rates in July. Upper pressure is in the vicinity of 97.80, and support below is at 97.25. The index is expected to pull back from its highs. GBP/USD The pound hit its lowest level against the dollar since April 2017 at 1.2440, as jitters over Britain's exit from the European Union and growing expectations of a rate cut by the Bank of England weakened the currency further. An M-shaped top has formed, and the MACD indicator favors the bears: the two moving averages have crossed below the 0 axis, which signals a major decline. The pair is looking at targets below 1.2250. USD/JPY USD/JPY reached the 109 level, and the strength of the late rally weakened. Today, the upper pressure was around 109.20, while the lower support was at 108.50. On the 4-hour chart the MACD's bullish momentum continues to shrink, but the price has not fallen back; this divergence suggests some retreat today. XAU/USD As the market bets on the Federal Reserve to cut interest rates this month, gold bulls are ready to move. Yesterday gold was again supported by strong buying, rebounding strongly from around 1385 to the 1400 level, but it ultimately came under pressure and is trading around 1392. The upper pressure is still at the 1400 level, and the higher pressure is around 1405. The low support is around 1385, and the lower support is at 1377. USO/USD U.S. crude oil stocks fell 8.13 million barrels in the week ended July 5, according to data released by the American Petroleum Institute (API). Today the price is below the 200-period moving average at 58; pressure above is in the vicinity of 59.50, and a wait-and-see approach is recommended in the short term. Special note: the above information is for reference only and is not a basis for investors' operations in the markets; trade at your own risk.
https://medium.com/@cptmarketsofficial/the-analysis-of-forex-market-10th-july-654f882217b0
['Cpt Markets']
2019-07-10 11:06:34.439000+00:00
['Trade', 'Cpt Markets', 'Analysis', 'Macroeconomics', 'Finance']
How to Earn a STEM Degree Online
Tip 1. Ask questions. Teaching yourself anything, let alone at the university level, can be frustrating at best. When you’re taking STEM subjects, you compound the matter tenfold. The key to success is to ask questions. All of the questions. No matter how “dumb” you think they are, they need to be asked. Classes move along at such a fast pace that there isn’t time to waste beating your head against the wall trying to figure something out. Furthermore, don’t stop at your instructor if they can’t give you an answer to your question. Keep pursuing the answer. If it means asking every other professor who teaches the same class, so be it. It’s important to remember that your time is valuable. Don’t spend hours of it on a question that could be answered in two minutes. Furthermore, STEM subjects specifically build on concepts learned over time. If you don’t understand something now, chances are it will reappear and will have been built upon. Don’t be shy, ask the question. Tip 2. Try new learning methods. My learning methods have changed quite a bit over the course of getting my degree. Naturally, yours might change now that courses are online for the foreseeable future. Luckily, many study methods lend themselves well to learning STEM subject objectives. The more active and passive learning methods you can incorporate into your system, the better. Passive learning methods: reading the textbook/lecture notes/PowerPoint slides taking notes or annotating listening to lectures/seminars/instructional videos Active learning methods: teaching what you have learned to someone else (useful for all STEM majors — the trick is to find someone willing to sit and let you drone on for a few hours while remaining relatively interested) experiments and lab work active recall (useful for biology and geology majors when memorization is required) peer discussion building projects to implement and expand on what you have learned (especially useful for software engineering majors) completing sets of study questions (useful for mathematics, physics, engineering, and chemistry majors) Tip 3. Study smart. During my time in an online university, my study methods have changed drastically. My first set of courses were a real wake-up call that my methods needed to change. This also isn’t to say that they won’t change in the future. However, this is what works for now. I adopted this med school method of studying that suggests taking three passes at the information. This means that for a specific chapter or unit, you go over the material three times (hence the three passes). Each time, you go over the material in a different way to solidify the learning. I’ve altered this method to roughly accommodate ~6 passes: Pass 1: I carefully read through the unit objectives. Pass 2: I skim through the chapter looking at any headings, images, and graphs. This gives me a rough idea of what’s in the chapter. Pass 3: If the chapter has study questions or a synopsis of the chapter at the back, I read these thoroughly. In a synopsis or study questions, the author will pick out the most important stuff in the chapter. This gives me insight on what to specifically pay attention to. Pass 4: I skim through the chapter and make flashcards for all of the vocabulary words that pop up. I personally like to use Quizlet so I can have the cards handy on my phone whenever I get a minute to study. 
This also helps prepare me for the next pass of the material, because I will understand the keywords and how they generally apply to the context of the related passage. Pass 5: I read the chapter. While this is a really passive way of learning, reading, or speed-reading the chapter gives a good solid starting point for understanding the meat of the unit. Pass 6: I complete any study questions assigned to the chapter. If no study questions are assigned, I answer the objectives of the unit as if they were questions. This is just an example of a method that works for me, but something completely different may work for you. It’s important to tune in to how you learn best and to use several different active and passive learning methods to solidify your understanding of concepts. Tip 4. Work ahead. Work ahead, but don’t become obsessed with work. This is a specifically tricky thing to try to balance if you’re anything like me. Basically, since starting at an online university, I’ve become a workaholic. I’m hyperfocused when it comes to school, and that means working ahead whenever I can. I won’t lie, it’s not the healthiest of behaviors. However, that’s a story for another time. The point is, being a STEM student means you have an insane workload to try to get through, not to mention pages and pages of outcomes to thoroughly know for each class for exam season. My best tip for dealing with this is to work ahead when you can. Getting one or two extra things done the week before can make your load lighter the next week. This accumulation of free time will become useful around exam season or whenever a large project is due. Tip 5. Treat every day like a workday. Some of the best advice I’ve ever received is to treat school like your job. If you do that one simple thing, you’re likely to succeed. While this is easy to do when you have the routine of attending class every day, it can sometimes get difficult to maintain when you now roll out of bed to your desk and begin the day. Instead, treat it as an actual school day. Get up at 6 or 7 am, put on clothes that aren’t pajamas, make breakfast, have coffee, and start your work. Keep regular working hours, take a break for lunch, and work until the afternoon. Then, you’re done for the day. But let’s be honest. The pandemic has changed everyone’s schedules. And that’s totally fine. If you now function between the hours of 3 pm and 3 am, make a schedule between then and go for it. The key is to keep consistent hours that will allow you to be your most productive self during this time. Tip 6. Challenge your instructors. The school system has so ingrained in us a sense of subordination under teachers that the thought of challenging them is obscene. Ever since we were little, we had to ask to go to the bathroom. So now, to be placed in a situation where we may need to challenge our instructors is nigh on heresy. Here’s an example: When I was going to college, I had a class with an instructor who had just started his own consulting company on the side. While he had been teaching for over 20 years, he had never run a business at the same time. He got so caught up in his business that he wasn’t marking assignments or giving feedback, such that we were entering exams not knowing what our grades were. This class is the reason why I had my first experience with anxiety. I and many of my classmates found this to be an unfair situation. However, none of them were willing to challenge the instructor. So, I pushed the instructor myself to give us feedback and marks. 
When that didn’t work, I started climbing the ranks and going over his head. I kept going until I reached the associate chair of the department (a wonderfully kind man to who I owe some of my success in that program). Because I challenged the system, I got myself and my classmates out of a bad situation where many of us would have failed not through our own doing, but from the incompetence of another. The moral of the story: challenge your instructors if you feel you are being unfairly treated. There is no reason that because you are now taking classes online that you should be subjected to unfair tests, assignments, or marking. Rattle as many cages as you need to. You’re paying thousands of dollars to attend this institution. You deserve to receive a fair education. Tip 7. Set up a distinct study area. This far into the pandemic it can be difficult to see the line where your workspace stops and your home begins. While it’s tempting to work from bed or the couch, it won’t have a productive effect on your day. Through repetition and routine, our brains are trained to associate activities with certain spaces. It’ll become a lot harder to sleep if your brain associates your bed with both sleep and work. Therefore, it’s incredibly important to create a dedicated workspace. This way, when you sit down in that chair, your brain tunes in and makes it easier for you to get down to work and be productive. Decorate, fill it with pretty stationery, make it comfortable, and make it inspiring. You’re going to be spending a lot of hours in this space so you might as well make it as perfect for you as possible. Tip 8. Treat yourself gently. You already challenged yourself by enrolling in a STEM degree so it’s no wonder you feel like you’re getting steamrolled with this extra pressure of having to take courses online. It’s often said, but often unheard among STEM students. Be kind to yourself. Tip 9. Allow yourself to acknowledge that it may take longer than expected. When I was attending a traditional college, I was completing 4–5 courses a semester. Suffice it to say I was horrified when I started online university and I was only able to complete 3–4 courses every 4–6 months. The problem stems from the fact that it’s much quicker and easier to learn from a prepared lecture than to teach yourself from a completely unknown material. While I understand that traditional universities that have transitioned to online learning temporarily haven’t changed much and the expectations for course load are the same, it’s important to know when you’re feeling overwhelmed. Your mental health is not worth trying to complete 5 STEM courses in 4 months because that’s what’s “expected of you”. Instead, focus on what you can reasonably manage.
https://medium.com/age-of-awareness/how-to-earn-a-stem-degree-online-80bf1c023719
['Madison Hunter']
2020-11-15 00:13:43.655000+00:00
['Education', 'STEM', 'University', 'Student Life', 'Self Improvement']
6 Good Habits I Learned in Building Software
https://medium.com/teknomuslim/6-kebiasaan-baik-yang-saya-pelajari-dalam-membangun-software-99b28a680881
['Didik Tri Susanto']
2020-12-25 05:32:22.059000+00:00
['Startup', 'Coding', 'Software Development', 'Tips And Tricks', 'Software Engineering']
Docker container as an executable to process images using Go (golang)
Image Processing Program In this lesson, we are going to write a Go program that takes a source image on the file system (using the path of the image) and crops the image into a square shape. It also tweaks the contrast and brightness. Then it finally saves the grayscale version of the image on disk. The results look like below. We won't be writing a program with complicated logic to manipulate every aspect of the source image to achieve this functionality. Instead, we are going to use the sweet little imaging library written in Go that provides us functions to manipulate the image and save it to disk.
docker-image-processor/
├── .dockerignore
├── .gitignore
├── Dockerfile
├── bin/
|   └── avatar
├── go.mod
├── go.sum
├── main.go
├── process.go
├── process_test.go
├── shared/
├── test.jpg
├── tmp/
|   ├── out/
|   |   ├── cmd_out.jpg
|   |   ├── model-1.jpg
|   |   └── model-2.jpg
|   └── src/
|   |   ├── model-1.jpg
|   |   └── model-2.jpg
└── utils.go
Our project structure will look something like the above. Do not worry about creating these files just now; some of them will be generated afterward. First of all, let's convert this project to a Go module so that we can install and track dependencies.
$ go mod init docker-image-processor
This command will create the go.mod file in the project directory. We are not concerned about publishing this Go module, so the docker-image-processor module name will be alright for us. Now, let's install the imaging module.
$ go get -u github.com/disintegration/imaging
This command will download the imaging module from GitHub and register it as a dependency inside the go.mod file. It will also generate the go.sum file to hold cryptographic hashes of the installed modules and their own dependencies. The process.go file contains the actual logic to transform the source image and save the output image to disk. It has the makeAvatar function that does the job of producing the image.
func makeAvatar(srcPath string, outPath string) error
This function accepts two arguments. The first argument srcPath is the path of the source image on the disk, and the second argument outPath is the path at which the output image will be saved. The package main declaration suggests that this Go project will be compiled into a binary executable file. We will discuss more while working on the Dockerfile and the Docker image. The utils.go file contains some utility functions, such as fileExists, to aid other programs in the project. Now that we have the makeAvatar() function ready, let's test it out. For this, the best approach would be to create a unit test and execute the makeAvatar() function to produce some sample outputs. A unit test file in Go ends with the _test suffix, and the unit test function must start with the Test prefix. In the process_test.go test file, we have written the TestMakeAvatar function that tests the makeAvatar function on sample images located in the ./tmp/src directory and saves the output inside the ./tmp/out directory. It then checks if the output files exist to conclude the results of the test. Let's see the result of this test.
$ go test -v
The go test -v command instructs Go to look for test files ( *_test.go ) in the module directory and execute the test functions. From the result above, the TestMakeAvatar test has passed. After running this test, you would be able to see the output images in the ./tmp/out directory, which would look something like below (right). Now let's work on the main.go file, which would be the entrypoint of the application.
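The original post embeds these source files as gists, which are not reproduced in this text. Before turning to main.go, here is a minimal sketch of what process.go, utils.go, and process_test.go might contain. Only the imaging calls, the makeAvatar signature, and the test behavior described above come from the article; the 512x512 crop size and the contrast/brightness values are assumptions.
// process.go: a minimal sketch (crop size and adjustment values are assumed)
package main

import "github.com/disintegration/imaging"

func makeAvatar(srcPath string, outPath string) error {
    // Load the source image from disk.
    src, err := imaging.Open(srcPath)
    if err != nil {
        return err
    }

    // Crop/resize to a centered square, tweak contrast and brightness,
    // then convert to grayscale.
    img := imaging.Fill(src, 512, 512, imaging.Center, imaging.Lanczos)
    img = imaging.AdjustContrast(img, 20)
    img = imaging.AdjustBrightness(img, 10)
    img = imaging.Grayscale(img)

    // Save the processed image at the output path.
    return imaging.Save(img, outPath)
}

// utils.go: a sketch of the fileExists helper
package main

import "os"

func fileExists(path string) bool {
    info, err := os.Stat(path)
    return err == nil && !info.IsDir()
}

// process_test.go: a sketch of the TestMakeAvatar test described above
package main

import "testing"

func TestMakeAvatar(t *testing.T) {
    if err := makeAvatar("./tmp/src/model-1.jpg", "./tmp/out/model-1.jpg"); err != nil {
        t.Fatalf("makeAvatar failed: %v", err)
    }
    if !fileExists("./tmp/out/model-1.jpg") {
        t.Errorf("expected an output image at ./tmp/out/model-1.jpg")
    }
}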
The application we are trying to build is a binary executable that accepts two arguments, as shown below.
$ ./avatar <srcPath> <outPath>
The ./avatar file is the executable we will build from this Go project. The srcPath and outPath arguments are the paths to the source image and the output image on the disk. When we execute this command, the main() function inside the main.go file will be executed with these arguments. But before building an executable file, we need to work on the main.go file and make a call to the makeAvatar() function with the srcPath and outPath received in the command. We can use go run . to run the module, which executes the main() function just like an executable would.
$ go run . <srcPath> <outPath>
We can access the arguments passed to the command that executed the main() function using the os.Args variable. os.Args is a slice (array) of strings, and the first element of this slice contains the path of the executable.
// main.go
package main

import "fmt"
import "os"

func main() {
    fmt.Println(os.Args)
}
This simple program produces the following result when executed with the command $ go run . ./test.jpg ./test_out.jpg .
[ /var/folders/xx/..../b001/exe/docker-image-processor ./test.jpg ./test_out.jpg ]
When we run a Go program, or an entire module in this case, Go first creates a binary executable file on the fly, stores it at a temporary location, and then executes that binary executable file. The binary executable's execution starts with the main() function. The command-line arguments received from os.Args that we are interested in are the 2nd and 3rd ones. So let's modify the main() function and make a call to the makeAvatar() function with these two values. In the modified main() function (a sketch of it appears at the end of this section), we first check if the length of os.Args is less than 3 and exit the program with an error if that's the case. Then we extract the 2nd and 3rd arguments, which become the srcPath and outPath for the makeAvatar() function.
$ go build -o ./bin/avatar .
The command above will build a binary executable file avatar from this module and put it inside the ./bin directory of the project. Then we can execute this binary executable using the ./bin/avatar <args> command.
$ ./bin/avatar ./tmp/src/model-1.jpg ./tmp/out/cmd_out.jpg
Success! Image has been generated at ./tmp/out/cmd_out.jpg.
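The modified main() walked through above is embedded as a gist in the original post; a minimal sketch of it might look like the following. The usage and error messages are assumptions, while the success message mirrors the output shown above.
// main.go: a sketch of the modified main() (exact messages are assumptions)
package main

import (
    "fmt"
    "os"
)

func main() {
    // os.Args[0] is the executable path, so we need at least two more arguments.
    if len(os.Args) < 3 {
        fmt.Println("usage: avatar <srcPath> <outPath>")
        os.Exit(1)
    }

    srcPath := os.Args[1]
    outPath := os.Args[2]

    if err := makeAvatar(srcPath, outPath); err != nil {
        fmt.Printf("could not generate avatar: %v\n", err)
        os.Exit(1)
    }

    fmt.Printf("Success! Image has been generated at %s.\n", outPath)
}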
https://medium.com/sysf/docker-container-as-an-executable-to-process-images-using-go-golang-5233f9bd3bf7
['Uday Hiwarale']
2020-12-18 08:14:16.825000+00:00
['Docker Image', 'Go', 'Docker', 'Image Processing', 'Dockerfiles']
I Am Made of Ghosts
I am made of ghosts — the small, wide-eyed girl who clutched a juicy pear and smiled for the camera with ribbons in her hair. I am the lanky, not-quite-teen passing notes and braving dares. I am the young musician, studying boys and dyeing hair. I am the college woman, nose always in books, who never seemed to notice all the newly-minted men and their singeing, lingering looks. I am the nervous bride thinking this, this day, this day will last until I’m old and gray, who paid no mind to coming storms nor saw the battles for the fray. I am the empty tomb where no life would ever grow and I’m the hand-made crib that welcomed home the greatest love I’ll ever know. I am made of ghosts and love and memories cut fine. I am made of fleeting thoughts and the ceaseless march of time.
https://medium.com/where-my-poems-go/i-am-made-of-ghosts-45e56a89b7f8
['Elle Rogers']
2020-01-06 17:01:01.230000+00:00
['Family', 'Love', 'Time', 'Poetry', 'Life']
8 Global Cryptocurrency Market Trends in 2020
1. China will continue to influence the industry China arguably became the most influential international actor in the cryptocurrency market in 2019, to a large part displacing even the United States in this regard. However, we believe that in 2020 the “boy who cried wolf” effect will gradually reduce market reactions to news of Chinese state reactions to the industry. Especially after the recent story when the mention of bitcoin in the main newspaper of the country turned out to be nothing more than a provocation. It seems to us that the process itself will force many participants to reformat the industry, both on the China territory and beyond. Changes to the regulatory environment will especially affect miners. Although Beijing has declined to ban mining outright, a national cryptocurrency policy will encourage miners to relocate their operations to other jurisdictions. Incidentally, we already saw this in part in 2019. Even Iran joined the list of countries now attractive to cryptocurrency miners, and the United States has grown its own mining capacity. Presumably, the US will support the continuation of this trend as it attempts to compete with blockchain and state currency developments in China. Many of the advantages offered by cryptocurrencies and blockchain, in general, are taking hold in China, which is investing more into the industry than all other countries. In 2020, we will with no doubt begin to see the first fruits of this focus. At the same time, these technologies will contribute to the Chinese state’s control of its people and economy, although this will not negate their other benefits such as speed, reduced costs, and solving the problem of trust. 2. Central Bank Digital Currencies Along the same lines, we believe the most anticipated currency of our day is the digital yuan. Given China’s leading position in the world economy, a digital currency will be a significant change in the global financial system. China has been actively introducing numerous modern technologies into its citizens’ daily lives, so we have no doubt that the party’s plans will come to fruition as the crypto yuan spreads rapidly. 2020 also may be the year other countries release their own digital currencies, but the crypto yuan will most likely appear first. The addition of Central Bank Digital Currencies to states’ agendas is largely due to the increased recognition of the potential of cryptocurrencies in general and stablecoins in particular. International organizations such as the International Monetary Fund and the Bank for International Settlements (BIS) have repeatedly expressed their opinions that central banks should begin developing their own analogs to modern cryptocurrencies. Interestingly, digital currencies for certain regions have also been discussed — in particular for the BRICS countries (Brazil, Russia, India, China, and South Africa). China can probably offer its national cryptocurrency for this role, in a move which is fully consistent with Chinese plans for its use in international settlement. Of course, this does not mean that the other countries in BRICS will agree to this suggestion. The European Central Bank is also preparing to develop its stablecoin and has already set a working group. So it is likely that soon we will find out about the project, which will be a kind of hybrid cryptocurrency with the features of stablecoins and regional currency. In the further future, we can expect a single global digital currency. 
In October 2019, the Bank for International Settlements held the first G20 Central Bank Governors Summit which discussed the idea of ​​a global cryptocurrency. In Switzerland, Benchmarks Regulation and the Swiss Central Bank have already created a center for the development of a single settlement token among the world’s central banks. All of these facts indicate that the transition of states to digital currencies is simply part of the next stage in mankind’s development — and its appearance is rather a matter of time. 3. Strengthened Ties Between the Cryptocurrency Market and the Global Economy and System The cryptocurrency market was previously considered a separate, closed system, with inscrutable trends and movements. Now, we are witnessing the gradual integration of these markets into the global financial system, as well as deeper repercussions in the crypto industry from economic and political events. Of course, we are far from seeing cryptocurrencies able to exert an equally powerful effect in the other direction, as their overall market capitalization is still much too small. In 2019, against the backdrop of a trade war between the United States and China, Bitcoin began to be regarded as a “safe haven” asset. It was, however, only one of several such assets, which also included gold. Bitcoin was generally used more to diversify a portfolio than to confidently store a large percentage of wealth. Economic and political instability in certain countries — Venezuela, Argentina, Hong Kong, etc. — also contributed to the strengthening of this role for Bitcoin. Many negative predictions have surfaced for 2020, foretelling a crisis for the economic and financial system that will be many times more serious than the 2008 crisis. In particular, the introduction of negative interest rates by central banks, the growth of global debt, and the slowdown of world production are named as reasons for concern. However, these challenges may prove opportunities for Bitcoin and other cryptocurrencies, which can establish themselves as a means to preserve accumulated capital. In addition, a series of events is expected in 2020 that will have rippling effects across the globe. Among these is Brexit, the United Kingdom’s departure from the European Union. There is some speculation that, after Brexit, the UK will turn to cryptocurrencies to solve difficulties with international financial transactions. Finally, the US presidential election is coming in November 2020, and the result could affect that country’s policy on cryptocurrency. Even in 2019, brief Twitter statements by the President of the United States significantly influenced the mood of the cryptocurrency market. 4. Messengers and Social Networks as Players in the Cryptocurrency Market Today, messaging apps are striving to become more universal, offering the functionality of both marketplaces and digital wallets. The most notable contenders are: The TON blockchain network and GRAM cryptocurrency created by Telegram The developing blockchain ecosystem of Korean giant Kakao, which includes its own Klaytn blockchain network, its own Klay cryptocurrency, decentralized applications (dApps) on the Klaytn network, the largest Korean messenger KakaoTalk, and KlaytnPhone smartphones based on the flagship Samsung model — and another model probably soon to be released based on LG. 
In the future, the blockchain component will be able to integrate with other Kakao projects, in particular, the KakaoPay payment system and the online banking service Kakao Bank Japanese messaging app Viber and its stablecoin, Rakuten Coin, which will be exchangeable for fiat on the Rakuten Wallet exchange and is maintained by the messaging app’s owner, Rakuten. The company will likely announce long-term plans to integrate this stablecoin into the recently launched Moneytou payment service; The ecosystem of the Canadian messaging app Kik, based on the Kin token (which runs on the Stellar blockchain). An even more grandiose development was conceived by Facebook, which presented the project of the universal stablecoin Libra this year, access to which could be available to all its users, that is, more than 2 billion people around the world. However, this announcement ultimately played against Facebook. Most countries made it clear that they would not allow the launch of the currency within their borders. As a result, Facebook decided to move away from the project, as CEO Mark Zuckerberg recently informed the United States Congress. It is noteworthy that Facebook, Telegram, and Kik each had to deal, in one form or another, with US authorities in 2019, and their projects’ prospects were significantly hindered as a result. This indicates how seriously the American government takes the capabilities of these projects. Thus far, prospects look more positive for Asian messaging apps: Japan’s Viber and South Korea’s Kakao. Both messengers are clearly planning to develop their projects at an international level, and we will see new stages of development for both in 2020. 5. The Development of Decentralized Finance (DeFi) In 2019, decentralized finance (DeFi) began gaining popularity. The essence of DeFi is to provide financial services built on blockchain and smart contracts. Most such services today are cryptocurrency lending platforms, as well as derivatives markets, decentralized exchanges, and payment solutions. The algorithmic stablecoin DAI is in this segment as well. However, this is only a small part of the financial services and tools which can be transferred to the blockchain. Based on DeFi Pulse’s data gathered directly from the blockchain, DeFi on the whole doubled in volume across 2019. According to Genesis Global Trading, $870 million in cryptocurrency loans were issued in Q3 2019 — a 250% increase over loans in Q3 2018. The total volume of loans has already exceeded $3 billion. The volume of interest from investors confirms the industry’s confidence in DeFi, which is the second most popular investment category for the largest funds focused on cryptocurrencies and blockchains, according to The Block analyst Ryan Todd. One of DeFi’s defining characteristics is a strong preference for the Ethereum network, where it has taken the place of the collapsed ICO market. The success of Ethereum’s update roadmap in 2020 and its ability to withstand growing congestion will determine the development path of many DeFi projects. However, Ethereum is not the only game in town. Ripple has announced a new DeFi focus, including a new project that bills itself as a “bankless” bank account. We can expect active DeFi development on other blockchains in 2020, perhaps even on blockchains which are little known to the public. Some new projects may be associated with large banks and corporations. 
Forbes recently ranked DeFi as one of the major trends in the fintech sector in 2020, indicating how much of the market the sector might capture. 6. Staking May Revitalize the Market and Repurpose Major Players Proof of Stake (PoS) mining algorithms were proposed back in 2011, and the first implementation was completed in 2012. Proof of Stake was suggested as an alternative to Proof of Work (PoW). In Proof of Stake, the chances that a block is formed by a network participant depends not on that participant’s computing power but on the amount of the network’s cryptocurrency — “stake” — that participant holds. In 2019, staking gained significant popularity, while at the same time gaining new centralized characteristics. Cryptocurrency exchanges and custody solutions began to offer interest on the storage of cryptocurrencies using the PoS algorithm. As a result, large exchanges have already become leading nodes on some Proof of Stake blockchains. The most popular ways to quickly earn money on cryptocurrency were once mining and trading, but now there is a simpler alternative: staking on an exchange. In this case, the user does not need to deal with the technical details of mining, nor master the complicated techniques of trading. One of the main problems associated with staking is the increase in centralization it causes, which has already drawn criticism of Proof of Stake algorithms from the crypto community. At the same time, staking essentially turns cryptocurrency exchanges and custody services into investment banks of a sort, increasing the number of functions they serve in the industry and, in parallel, their influence. This was one of the main reasons that cryptocurrency exchanges in 2019 led the industry in total funds raised. Judging by the growing popularity of centralized staking, we can expect that next year it will become one of the main objects of discussion and regulation in many countries. 7. The Legal Status of Stablecoins will be Determined The interest in stablecoins that arose in 2018 returned with renewed vigor this past year. During 2019, various companies announced the issuance of their own cryptocurrencies backed by fiat currency or commodities such as precious metals or gemstones. The climax of this trend was the Libra project from Facebook, which we mentioned above. It was Libra that made national governments consider the possible consequences of the widespread use of stablecoins — namely, the potential impact on established financial, economic, and social systems. It was stablecoins issued by commercial organizations, not Bitcoin or other cryptocurrencies, that were recognized as a threat to the financial system in the United States, Europe, and East Asia. In 2019, legal and economic assessments made up the extent of regulatory feedback on stablecoins, including recommendations from international organizations. However, it is now obvious that in 2020, or shortly thereafter, laws will be adopted in the leading countries of the world to regulate fiat-backed cryptocurrencies. If Libra is any indication, stablecoins will be subjected to severe restrictions. Bills have already been written in the European Union and the United States to this effect. Still, it is worth noting that the government of Bermuda has allowed taxes to be paid in the stablecoin USDC, despite the negative assessment of stablecoins made by most states. In any case, stablecoins are being actively developed. In 2018 and 2019, venture capital funds invested $200 million in the industry. 
Since the start of 2019, the capitalization of stablecoins has nearly doubled, mainly due to Tether (USDT). Tether remains the undisputed leader in market capitalization and in trading volume, where it often leads even Bitcoin. But in addition to use on exchanges, Tether began to gain popularity as a settlement currency for purchasing goods and services. The advantage of using USDT is its simplicity and speed for cross-border transfers. Tether demonstrates that it is easier for stablecoins to gain mass distribution than it is for Bitcoin or altcoins. By pegging value to a fiat currency, the problem of unpredictable volatility characteristic to Bitcoin and other cryptocurrencies is removed. It is often noted that stablecoins could reduce the dominance of the US dollar and stimulate international trade relations. While Tether still dominates the stablecoin market, it has serious problems within the US judicial system, where two lawsuits have been filed against it. Proceedings will take place this year. This fact, and the trend towards strict legislation on stablecoins, gives us reason to expect that Tether will be supplanted by alternatives issued either by large corporations — which will have all necessary licensing from governments — or as Central Bank Digital Currencies by the states themselves. The presence of CBDCs would essentially eliminate the need for USDT and similar assets. In fact, USDT and other stablecoins may even be banned or at least confined to the ecosystems of the entities which issue them. 8. The regulation will Grow Stronger In 2019, most countries were actively developing legislation to regulate the cryptocurrency market. In many of these lands, the new laws begin to take effect in 2020. In addition, the Financial Action Task Force’s anti-money laundering requirements were released and should be integrated into government legislation around the world by the summer of next year. Thus, in 2020 there will be tightening regulation on the activities of cryptocurrency companies. “Know your customer” (KYC) and anti-money laundering (AML) requirements will become widespread, and user verification will be required of all legally operating cryptocurrency sites. If any platforms continue to not require verification, then will likely lose any fiat gateways, making it extremely difficult for users to withdraw digital assets from them into fiat or to use those assets as a means of payment. And given the blockchain’s auditability, crypto assets from these platforms may simply not be accepted by other sites. All of this may lead to the disappearance of those who use cryptocurrencies precisely because of the anonymity and freedom from state control that they offer. However, this new regulation is one of the necessary conditions for the arrival of institutional players and the integration of cryptocurrency into the real-world economy.
https://medium.com/exmo-official/8-global-cryptocurrency-market-trends-in-2020-d381ced608d8
[]
2020-01-24 14:28:34.997000+00:00
['Trading', 'Crypto', 'Cryptocurrency', 'Ethereum', 'Bitcoin']
Bitcoin Miners are Escaping China
Tracking what happens to newly issued bitcoins can yield meaningful insights into the collective behavior of both mining pools and the individual miners operating within them. In order to discern these two very different actors, Coin Metrics has produced a set of aggregate metrics that serve as proxies. As a proxy for mining pool behavior, we aggregate data from all “coinbase” transactions; the first transaction of every Bitcoin block (not to be confused with the exchange). As a proxy for individual miner behavior, we aggregate data one hop from that transaction, meaning all transactions that received funds from the coinbase. At a microscopic level, if you track what happens after one hop, the notion that Bitcoin mining is centralized is shattered. In fact, there are many transactions beyond one hop that have dozens of recipients, which may be indicative of layered structures even at the individual miner level. One theory is that several mining operations are joint ventures where partners may have complex payout structures. As such, measuring anything over one hop becomes more challenging and subjective. Now that we have covered how aggregate miner behavior can be measured on-chain, let’s take a look at the data. What is the on-chain data telling us? Before individual miners can effectively sell their coins, they must create a transaction that sends funds to a crypto exchange, an OTC desk, or even directly to the buyer (albeit in rare circumstances). In any of these scenarios, we would see an increase in the flow of funds being sent from individual miners (one hop) to other addresses. The chart below shows exactly that. Aggregate flows sent by miners are at the highest levels since March 2020, when markets crashed at the onset of the COVID-19 pandemic. This supports the hypothesis that the latest sell-off was driven by Chinese miners who sold part of their holdings in order to escape the latest wave of enforcement actions by the CCP. Although what they receive on a daily basis is small compared to global BTC volume, the data showcased above suggests that when miners are likely selling (an increase of Flows Sent), markets respond negatively. Remember that miners are also speculators. Even though what they receive in miner rewards is small, in USD terms, relative to the volume of global BTC markets, they do hold BTC on their balance sheets. In times of uncertainty, when they expect to need cash, their collective actions affect the market. Now put yourself in the shoes of a Chinese miner that might have to move to a different country. Regardless of the scale of your operations, you will likely need cash to finance that move. The good news is that this is a temporary phenomenon. As with previous spikes in Flows Sent, the market impact was short-term and close to coincidental.
Another interesting on-chain behavior worth highlighting is miners’ potential concerns about centralized exchanges in light of the CCP’s crackdown. The current sell-off coincides with thousands of bitcoins being withdrawn from major exchanges and deposited to miner addresses, as shown below. Interestingly, the CCP’s current crackdown on mining also coincides with a time of the year when some Chinese miners move their operations from Inner Mongolia to Sichuan. This 2,000km migration is motivated by the beginning of the rainy season in Sichuan, which increases the capacity of its hydroelectric power plants, thereby decreasing electricity costs. It has been observed that the rainy season in China contributes to an increase in Hash Rate, a metric that indirectly tracks the resources being allocated to Bitcoin. However, if the CCP’s hawkish comments on mining in fact translate to enforcement actions that further motivate miners to emigrate from China, this seasonal migration might be impacted, and we might see a contraction in Hash Rate from current levels. While it is still unclear how the Chinese mining community is tactically responding to this development, the market has reacted negatively in light of a potential decrease in Hash Rate. But if a decrease were to occur, how would it impact Bitcoin? What if Hash Rate Crashes? Another gigantic misconception about mining is that daily Hash Rate figures can provide an authoritative view of when miners are pulling the plug. This frequently generates panic, as people struggle with the notion that a large portion of miners have suddenly gone offline. Another Elon quote illustrates this misconception well: he claimed that when “A single coal mine in Xinjiang flooded, almost killing miners, […] Bitcoin hash rate dropped 35%.” In reality, Hash Rate is not a precise metric. Hash Rate formulas were designed to estimate how many computational resources are being allocated to a network on a given day. But there is a keyword that is often omitted in the metric’s name: implied. It’s called Implied Hash Rate because it is impossible to get a precise daily change figure by solely looking at on-chain data. If you look at average daily Bitcoin Implied Hash Rate on Coin Metrics’ dashboard (what people usually just call hash rate), you will see that large (35%+) fluctuations occur frequently. Source: Coin Metrics Crypto media outlets often take advantage of Hash Rate fluctuations with sensationalist “BTC HASH RATE DROPS X%” headlines, but daily Implied Hash Rate is, by its very design, a volatile metric that is not suitable for tracking lasting changes in the mining landscape. The reason for this volatility is that all daily Hash Rate formulas are highly sensitive to how long blocks have been taking to be mined over a given lookup window. Since mining is an unpredictable process (a Poisson process, to be precise), there is a probability that a Bitcoin block could take an hour to be mined without miners necessarily having gone offline (albeit a low-probability event).
In the example above, a probable event would push daily Hash Rate estimates down considerably, even though no changes in the mining landscape have actually occurred. If you want to understand this more deeply, take a look at the formula we created at Coin Metrics to attempt to calculate daily Implied Hash Rate figures, in trillions of hashes per second (TH/s).
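For intuition, here is a minimal sketch of the widely used approximation for estimating hash rate from on-chain data (expected work per block of difficulty × 2^32, averaged over a lookup window). This is not necessarily the exact Coin Metrics formula, and the inputs in the example are assumed figures, but it shows why short windows make the estimate volatile: a lucky or unlucky run of block times swings blocks_found and, with it, the whole figure.

```python
# Hypothetical sketch of an implied hash rate estimate in TH/s, based on the
# common approximation hash_rate ≈ difficulty * 2^32 / average block time.
# Not Coin Metrics' exact formula; names and example inputs are illustrative.

def implied_hash_rate_ths(difficulty: float, blocks_found: int, window_seconds: float) -> float:
    """Estimate the average hash rate over a lookup window, in TH/s."""
    expected_hashes_per_block = difficulty * 2**32        # expected work to find one block
    total_hashes = expected_hashes_per_block * blocks_found
    hashes_per_second = total_hashes / window_seconds     # average over the window
    return hashes_per_second / 1e12                       # convert H/s to TH/s

# Example: 144 blocks found in one day at a difficulty of ~21 trillion (assumed figures)
print(implied_hash_rate_ths(difficulty=21_000_000_000_000, blocks_found=144, window_seconds=86_400))
```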
https://medium.com/coinmetrics/bitcoin-miners-are-escaping-china-d3937e8f018c
['Lucas Nuzzi']
2021-05-25 16:51:01.001000+00:00
['Bitcoin', 'Bitcoin Mining', 'Mining', 'Cryptocurrency']
Dreaming of a White Christmas
The Importance of Socioeconomic Privilege in “Home Alone” An eight year old boy is left behind by his family during the holidays. Alone and vulnerable, he is forced to fight off the self-proclaimed Wet Bandits — a duo of robbers who will go to ridiculous extents to loot his house. For most of us, including myself, the film Home Alone is a Christmas classic. Home Alone’s ability to create whimsy and humor around a rather dire situation is almost effortless. It is easy to laugh as Kevin shops for groceries and tries on his dad’s aftershave. The chaotic scene where the thieves are mercilessly attacked by Kevin’s booby traps is a family favorite. When this little boy finally reunites with his family, we are too ecstatic to remember that they forgot him in the first place. This film stars an all white cast that play characters who live in an upper-middle class, American dream of a Chicago neighborhood. The presence of cops is welcome, evident at the beginning of the film when a “police officer” — really a Wet Bandit — walks into the McCallister household without an invitation, and is received with appreciation. When Kevin goes missing, his mom calls the Chicago PD when they land in Paris and is met with a disinterested response. Eventually, the police send a disgruntled officer to check on Kevin, and assumes no one is in the house when Kevin — who is frightened and hides — doesn’t come to the door. We finally see the police in all their heroic glory when they finally arrest the robbers and allow us to forgive their previous indifference towards the plight of the McCallister family. As I watched Home Alone for the umpteenth time, I realized that the film’s reliance on the cop-savior narrative indirectly creates the moral fabric of this film. Frankly, the plot of Home Alone is heartwarming because the characters are allowed to make mistakes, legally and socially speaking. Kevin’s family, despite their blunders, is a “good” family. The situation was justified. Mom and dad were frazzled. Kevin was being annoying. And of course, they did the “right” thing by reaching out to law enforcement. After all, it was an accident! Home Alone builds compassion around a situation that would have immediately sparked conversations around parental responsibility and accountability if the family was working class and/or people of color. I am positive that a black or brown Home Alone would have ended with Kevin in CPS custody. The parents would be considered antagonists — cruel, careless, and not worthy of having children. Joe Pesci dressed as a cop in the middle of the house would have caused panic rather than comfort. The McCallister’s whiteness is what allows this film to be lighthearted. It is generally well known that cops and other legal institutions are more sympathetic towards white people than people of color. White children make up almost 79% of the child population, but only about 51% of the child welfare system. As I watched the movie, I was amazed that Kevin’s family never once feared losing him, instinctively reaching out to law enforcement for help. The cops seem blissfully ignorant that the thieves they eventually arrest were robbing a house occupied only by a child — who was close to getting fatally injured but was luckily saved by an older neighbor. This movie is a perfect depiction of what the legal system would consider reckless endangerment. Given the law-bound carceral society we live in, it is shocking that the McCallisters did not face any consequences. 
The Chicago PD barely bat an eye at Kevin’s situation, simply happy to have arrested two dimwitted robbers. Despite what happens, the McCallister family reunion at the end was joyful and free from legal issues, a luxury that seems to be retained only for the most privileged of families. As I looked beyond the moments of comedic relief and one liners, this fun-filled classic is now a fascinating piece of social commentary. A film almost worthy of being a Law & Order: SVU episode, Home Alone has accidentally become a reflection of racial and economic privilege. The McCallisters unknowingly hide in their whiteness to evade any legal and social consequences of their actions. This family is yet another symbol of how American socioeconomic structures benefits the financially-secure white population. I have been a religious viewer of Home Alone for as long as I can remember. As an adult who is aware about the systemic inequities in the U.S., I find it difficult to overlook the issues that this film inadvertently highlights. It is unfortunate that even the most timeless of Christmas movies can not escape the brutal truths of American society. Merry (late) Christmas, ya filthy animals.
https://medium.com/@abhajoshi/dreaming-of-a-white-christmas-f83b8124325e
['Abha Joshi', 'She Her']
2021-03-03 19:34:15.859000+00:00
['White Privilege', 'Home Alone', 'Race', 'Christmas', 'Class']
The US notoriously has a problem with popularity contests but, over and above that, your students…
The US notoriously has a problem with popularity contests but, over and above that, your students live in the Peeple app era — you can’t afford to be anything less than Duracell Bunny bright and cheerful … or your social credit might wane
https://medium.com/@whereangelsfeartotread/the-us-has-notoriously-an-problem-with-popularity-contests-but-over-and-above-that-your-students-daa516715b46
['Where Angels Fear']
2020-12-21 22:09:13.926000+00:00
['Cinema', 'Film', 'Movie Review', 'Movies', 'Film Reviews']
Finding the sweet spot in Georgia’s mountains
Vano is only 19 years old, but his ambition is to become the largest producer of honey in his region. “Not only the largest,” he says with a broad smile. “On our farm, we produce the best, purest mountain honey, made by a very special kind of endemic species — the Georgian grey mountain bee, which has been in my family for generations.” Vano has recently passed a UNDP-supported professional beekeeping business course. “This training helped me understand how to keep the balance between having profit in beekeeping and taking care of the bee population. Without bees we can’t grow any crops, let alone enjoy a spoonful of good honey.” The Georgian grey mountain bee is just one example of the endemic species that make up the unique biodiversity of the Caucasus mountains. Georgia is located in one of the biologically richest regions of the planet. One of the World Wildlife Fund’s 35 “priority places” covers the Caucasus. Moreover, Georgia is located within two of the 34 “biodiversity hotspots” identified by Conservation International: the Caucasus and Irano-Anatolia. Forest massifs surviving in the Georgian mountains are among the world’s last untouched forests in the temperate climate zone. Georgia is currently home to over 16,000 species of fauna and 6,500 species of vascular plants. Twenty-five percent of these are unique to the region, including the Georgian grey mountain bee.
https://medium.com/@undpgeorgia/finding-the-sweet-spot-in-georgias-mountains-f8dcdaba0374
['Undp Georgia']
2020-12-10 20:43:38.382000+00:00
['Agriculture', 'Farming', 'Mountains', 'Undp', 'Georgia Caucasus']
High School Graduates From Low-Income Families Report taking a Gap Year Due to COVID-19: STUDY
High School Graduates From Low-Income Families Report Taking a Gap Year Due to COVID-19: STUDY We surveyed 300+ high school grads in Washington State. More students from lower-income and/or BIPOC families reported delaying postsecondary enrollment to join the workforce compared to their white and wealthier peers. By Jennifer Rubin As U.S. universities transition to remote learning for the 2020–2021 academic year, many students have adjusted their educational plans due to the coronavirus pandemic. Some students intended to delay enrollment (West, 2020), while others have selected a university closer to home due to financial concerns (Hartocollis, 2020). In Washington State, many colleges and universities have shifted most of their classes online and closed a variety of services (e.g., libraries, student learning centers) on campus. Especially for those starting college for the first time, these changes have left many students with questions about their future opportunities as they begin their journey in higher education. Research has found that roughly 1 in 6 high school seniors in the U.S. reported that they intended to change their plans to attend college in the fall because of the pandemic (Art&Science Group, 2020), with graduates from lower-income backgrounds experiencing different barriers than their wealthier peers (Art&Science Group, 2020). Of those students who intended to change their plans, 16 percent say that they will take a gap year (i.e., a semester or year off before enrolling in school). The decision to delay enrollment compares to fewer than 3 percent of first-year students who previously took off a year or more before starting college, according to a 2018 report by the Higher Education Research Institute at the University of California, Los Angeles. Changes in educational plans raise questions about the types of barriers that first-year students face when deciding to delay or enroll in university for the 2020–2021 academic year. For example, due to many families’ precarious financial situation, low-income students are significantly at risk when navigating COVID-19’s impact on education — particularly if they don’t have enough money for an enrollment deposit or if they don’t have access to remote learning tools. With COVID-19 rapidly changing the educational landscape, more information is needed regarding first-year student experiences with transitioning to (or delaying) college as well as barriers to furthering their education. The current research examined the impact of COVID-19 on Washington State high school graduates’ intentions to attend college in the 2020–2021 academic year. We assessed plans for college attendance, barriers to college attendance, and students’ sense of identity as they transition to young adulthood in the COVID-19 era. Additionally, we assessed potential equity differences in students’ educational plans (e.g., income, race/ethnicity, first-generation student status). We Address Five Central Questions in this Study: How has COVID-19 impacted students’ educational plans for the 2020–2021 academic year? What concerns do students have regarding COVID-19 and college attendance? Did students take a gap year for the 2020–2021 academic year? If so, what are their plans for gap year activities? What are barriers to remote learning in higher education? How has COVID-19 impacted students’ emotions and sense of identity?
Read the Results here: FOUNDRY10 GAP YEAR 2020 STUDY
https://medium.com/the-foundry10-voice/high-school-graduates-from-low-income-families-report-taking-a-gap-year-due-to-covid-19-study-401f45638b74
[]
2020-12-15 22:34:14.130000+00:00
['Equity', 'Research', 'Gap Year', 'Education', 'Access']
Ikigai and the 4 Questions to know your purpose in life
What is Ikigai Iki means to live and Gai means reason. Ikigai thus means a reason to live. Most people believe that life is a checklist to be completed: go to school, go to college, get a stable job, and build your career. That is like the single-track path that society dictates you follow. Robert Kiyosaki called it a rat race in his book (Rich Dad Poor Dad). After living a life like this, some people feel remorse creeping in. It makes them feel regret since they worked so hard for something that now they don’t even want to continue doing. Ikigai is a philosophy from Japan. It is something that gets you out of bed in the morning. It’s something you never stop working on. It states that if you aren’t working on something that is your Ikigai you will live a life that’s filled with despair. Why we need Ikigai The feeling of the monotony of the work will start weighing upon your soul, weighing you down at the prospect of waking up each day. When you wake up in the morning working on your Ikigai you are filled with wonder and joy. This is because you start thinking in terms of the things you get to work on rather than the things you have to work on. Ikigai is being happy in the process. Most people define happiness by the results they got or the achievements that they accomplished. Ikigai, on the other hand, aims to find happiness in the work for work’s sake. Professionals in competitive environments find happiness by consistently working on their craft. After winning a trophy they celebrate for a day and then go right back to their practice regimen the next day. That is what it takes to succeed at the highest levels of competition. Ikigai is thus being present in the moment, doing the work that you look forward to each day. What stops us When you are inspired by something that calls to you, immediately all sorts of thoughts come to your mind and tell you the reasons why you can’t pursue it. Steven Pressfield in his book (The War of Art) calls these thoughts the resistance. Resistance tells you that you lack the resources or the experience to pursue your callings. We begin to think about all the other people that are much more capable than us that are doing the thing we are drawn towards. Our mind begins to fear, and that fear tells us all the reasons why it can lead to a giant failure. You can understand fear by its acronym ‘False evidence appearing real’. It makes us stick to the known and familiar. Everything our fear draws on comes from our own past experiences. Instead of trying to fight the paralyzing fear or trying to distract ourselves, we should learn to accept it. Instead of letting it control the conversation, just accept the fear as a part of you. When we accept the fear that we feel and still act towards our calling, fear loses its grip on us. By working on your Ikigai you leave behind the unfulfilling life and instead start being in the moment, improving on your true calling. Your meaningful life isn’t a destination you need to reach. Your meaningful life is something you can do right now and anytime you want. How do you find your Ikigai? Do what you love, what the world needs, what you can be rewarded for, and what you are good at. Any work you can do where all those things intersect forms your Ikigai. 1) Doing what we love We all have things that take us completely into the zone or flow state. The moment we start doing them, we get completely engrossed in the task at hand. It comes out naturally from the inside and we don’t realize as the hours pass by.
2) Doing what the world needs This component of Ikigai was easier to achieve in the past since whenever you put in your work, you were rewarded with immediate feedback. The way you saw your work affecting other people made you realize the worth of your work. Nowadays people are given a small role in their jobs. Even if they put in a tremendous amount of effort, they can rarely see the impact of that work. It feels like even if they left today no change would happen. 3) Doing what we can be rewarded for This component talks about money. Without being paid enough to lead a comfortable life you will never find happiness even if you are pursuing something you love. You can’t be happy if you are struggling for survival. Thus you need to earn enough to lead a comfortable life. 4) Doing what we are good at This means that you will keep facing challenges and improving your craft. Overcoming the challenges in front of you makes you feel the growth and your efforts paying off. You can guess what you need to work on by looking at the picture above. What we are in this moment If you have what you love and are good at it, you have Passion. You aren’t getting paid enough for doing what you want. If you are doing something that you love and are doing something the world needs, you have a Mission. You aren’t yet good at what you do, and by overcoming obstacles you will get good enough over time. If you get paid well and you are good at it, then you have something most people have, that is, a Profession. You don’t love what you do and you don’t feel how what you do makes any change in the world. If you are doing something that the world needs and you are getting paid enough to do it, you have a Vocation. You feel stagnant doing what you do, or the monotonous life begins to clamp down on your growth, and you lack the love for the work you do. You must find a way to constantly get better at what you do. In the book titled Ikigai, written by Héctor García and Francesc Miralles, they refer to Okinawa in Japan. It has the highest number of people over the age of 100, and they never retire and live a life of purpose following their Ikigai. The book mentions 10 rules people in Okinawa follow to stay in tune with their Ikigai but that’s a topic for another blog :P. In Conclusion This week reinterpret your current position in life and find out which of the four questions of your Ikigai you are in line with. It’s a difficult and challenging path to finding your Ikigai, but it will lead to a life worth living. We urge you to explore what your callings are. Do you possess the courage to chase them? We provide a free ebook called The Perfect Beginning that might help you in finding those callings. We post every week and if you like what you read, you may follow us on our social handles on Facebook, Instagram as well as Pinterest to be up to date on our posts. Well, that’s all for this post. Have a good day and a good life.
https://medium.com/@theperfectadvise/ikigai-and-the-4-questions-to-know-your-purpose-in-life-54689ea7d009
['The Perfect Advise']
2020-12-23 13:44:35.878000+00:00
['Philosophy Of Mind', 'Purpose', 'Self Actualization', 'Self-awareness', 'Ikigai']
Apple One: Are Apple’s new subscription bundles worth it?
Screenshot: Apple Apple recently released a new subscription service that bundles many of its already popular services. It seems reasonable for just a few bucks a month, but it has many of us asking: is it really worth it? Apple One is separated into three tiers: Individual, Family, and Premier. To see if Apple One is worth it, let's break down the math. Screenshot: Apple Individual Plan: Apple’s individual plan includes Apple Music, Apple TV+, Apple Arcade, and 50GB of iCloud storage. If you bought these services separately, they would total about $21, and you would end up saving about $6. Family Plan: Apple’s family plan can be shared with up to five other people and includes Apple Music (family plan), Apple TV+, Apple Arcade, and 200GB of iCloud storage. In the end, this plan would save $8. That savings figure is calculated based on the Apple Music Family Plan; if no one else in the family is actually using Apple Music, you aren’t getting the best value. Premier Plan: The Premier plan includes all of Apple’s services: Apple Music (family plan), Apple TV+, Apple Arcade, 2TB of iCloud storage, Apple News+, and Apple Fitness+. This plan saves you $25 if you fully take advantage of all it has to offer. One thing that isn’t available with Apple One is a student discount. If you are a student, it makes more sense to just stick with your existing student subscriptions. Apple has made no mention of whether it plans to implement student pricing in the future. Aside from the potential savings, do the services included in Apple One actually bring value? Spotify already has a major grip on many listeners, and the bundle would mean switching over to Apple Music. Also, Apple TV+ still seems to be in its infancy, and it doesn’t have much to stand up against other streaming services like Hulu or Disney+. So is it worth it? If you are already using all of these services or are a die-hard Apple lover, then you should definitely consider it. It’s a good way to save money on things you are already purchasing. Also, if you are a family, this may be a good alternative, and you will probably end up saving money. But for the average individual who doesn’t already use all of these Apple services, Apple One doesn’t really seem worth it.
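To make the math explicit, here is a small sketch of the Individual-tier calculation using the standalone US prices at the time the bundle launched. The per-service prices are assumptions drawn from memory rather than from the article, so check Apple’s current pricing before relying on them.

```python
# Rough sketch of the Individual plan savings, using assumed US launch prices.
standalone = {
    "Apple Music": 9.99,
    "Apple TV+": 4.99,
    "Apple Arcade": 4.99,
    "iCloud 50GB": 0.99,
}
apple_one_individual = 14.95  # assumed launch price of the Individual tier

separate_total = sum(standalone.values())          # ~20.96, i.e. about $21
savings = separate_total - apple_one_individual    # ~6.01, i.e. about $6 per month
print(f"Separately: ${separate_total:.2f}, bundled: ${apple_one_individual:.2f}, saving ${savings:.2f}/month")
```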
https://medium.com/@jakewebb02/apple-one-are-apples-new-subscription-bundles-worth-it-409d8b2ce923
['Jacob Webb']
2020-12-27 23:02:51.533000+00:00
['Apple Music', 'Apple TV', 'Spotify', 'Apple', 'iPhone']
Jenkins — How Can We Get Started Jobs in Jenkins (Part 3)
In the previous article, we learned how to create Users + Manage + Assign Roles in Jenkins. In this article, I will be highlighting how can we create some basic JOBS in Jenkins. Furthermore, these are the points which I am going to explain. How to create a basic Job in Jenkins. Basic Job Configurations. How to run the Job remotely. How to chain Job Execution How to create a basic Job in Jenkins? First of all click on New Item on the Jenkins Dashboard. Then Give a name to your project. Then you can select a Freestyle or Multi-configuration project. For this tutorial please go with the Freestyle project. Then click on the Okay button. As soon as you complete these simple steps you will be taken to the configuration window. Here you can provide different details about Jobs. There are 5 sections available. General, Source Code Management, Build Triggers, Build, and Post-build Actions. General Section Here you can provide a quick description of your project or Job. You can preview it also. You can also use HTML tags inside the description. Source Code Management Section So here in case you want to take the build from any source code system like Git, Bitbucket, or any other source code management portals. We need to add plugins to those source code repositories. But in the coming tutorials, we will see how exactly we can get our build and trigger our Job from a source code management system. Build Triggers Section Here you can trigger builds remotely (e.g., from scripts), We can build after other projects are built and we can build periodically or poll SCM. Poll SCM means that we have a Sourcecode management system and incase we have configured any SCM like get repository. As soon as a build takes place in that repository our job should get executed. We will discuss this later and now we will discuss what is build periodically means. Build Periodically means that we can define an expression and based on that expression our job will be executed at some particular interval. If you want more information on that just click on the help icon on the right side corner. Then it will show you exactly what all the parameters. Just tick the Build periodically option and see. So you have to give your syntax in this format. MINUTE HOUR DOM MONTH DOW Here you can assign a fixed periodic interval for your jobs using expressions. Example : Here it says that Would last have run at Saturday, July 11, 2020 9:06:19 AM IST; would next run at Saturday, July 11, 2020 9:06:19 AM IST. So here last run and the next run is at exactly the same time which means we are making to run our Jenkins to run in every minute. If you just want another time interval use this. # every fifteen minutes (perhaps at :07, :22, :37, :52) H/15 * * * * # every ten minutes in the first half of every hour (three times, perhaps at :04, :14, :24) H(0-29)/10 * * * * # once every two hours at 45 minutes past the hour starting at 9:45 AM and finishing at 3:45 PM every weekday. 45 9-16/2 * * 1-5 # once in every two hours slot between 9 AM and 5 PM every weekday (perhaps at 10:38 AM, 12:38 PM, 2:38 PM, 4:38 PM) H H(9-16)/2 * * 1-5 # once a day on the 1st and 15th of every month except December H H 1,15 1-11 * Build Section Here you can add any build step. If you are using windows then you can give some commands like dir or date or anything. If you are a Mac user you can give any bash command. Just select option form dropdown. 
Post-build Actions Section This is very important because after you build your project you will need to do something like: build another project after this project’s build is completed, send out some reports or notifications, or trigger some acceptance tests or performance tests. How to build a Job in Jenkins and get the Details of a Job? Go to the Dashboard and you can see all the projects that you have already created. I will explain the Dashboard projects table: Status of the last build (1st Column) — You can see the status of the last build that you have already done. Red for a failed build, blue for a successful build, and grey for a build that has not been done yet. Weather report (2nd Column) — This shows the aggregated status of recent builds. This icon changes based on the last 5 runs. Sunny icon for no recent builds failed (Success 100%). Cloudy sun icon for 1 out of the last 5 builds failed (Success 75%). Rainy cloudy icon for 3 out of the last 5 builds failed (Success 40%). Rainy thunder icon for 4 out of the last 5 builds failed (Success 20%). Last Success (4th Column) — Details about the last successful build, the build number, and how much time it took to execute. Last Failure (6th Column) — Details about the last unsuccessful build, the build number, and how much time it took to execute. Last Duration (7th Column) — How long your last build took. How to get the execution details of the Project? Here just click on the project name and then you can see the history of the last builds on the left side of the Project Dashboard. Build History How to trigger a Job Remotely? We discussed how to build a Job using the Jenkins Dashboard, but now let’s discuss how to execute a Job from a remote place. Go to the project which you have already created, go to the Configure section, and open up the Build Triggers section. Now I am going to trigger the build remotely (just put a tick on the Trigger builds remotely option). As soon as you click it, Jenkins gives you a URL. Just copy it and paste it in your browser. JENKINS_URL/job/TestProject1/build?token=TOKEN_NAME My JENKINS_URL is http://localhost:8080/ and you can provide a token here. It is a kind of password. I have added 1234 as my Authentication token. Now just click Apply and Save. Now you can go to any other system or any other browser, give your IP address instead of localhost, and execute this command from any other machine as well. Then you can connect to this Jenkins machine. Before going to this URL: Look at TestProject1’s Last Success build time and build number. After going to this URL: Now you can see that build number 2 has already been successfully executed 6.7 seconds ago. So we triggered this build from outside. We did not trigger it within Jenkins but from outside. So this is how you trigger your Jobs from outside; you just need a simple URL (see the script sketch at the end of this walkthrough). How to chain Job executions? First of all, you need a couple of projects in Jenkins. I have created 3 projects called TestProject1, TestProject2, and TestProject3. We are going to create an execution chain here. Go to Project 2, which is TestProject2, and go to the Configure -> Build Triggers section. Then select Build after other projects are built. So the trigger for this particular project will come from another project. As soon as that project gets completed, this project will get built. Now here you can add multiple project names as you wish.
You can just select the Trigger only if the build is stable, Trigger even if the build is unstable, and Trigger even if the build fails options too. Here I am selecting the Trigger only if the build is stable option. It says to build TestProject2 only if the build of the TestProject1 project is stable. After that, go to the Post-build Actions section and select the Build another project option. This is a post-build action that says after this particular project is built, I want to build another project, which is TestProject3. You can give multiple projects here and select the stable-build options as I have explained previously. Click Apply and Save. Now let’s execute the first project and see whether the chain is actually working or not. As soon as our TestProject1 is completed, TestProject2 should get executed. As soon as our TestProject2 is executed, TestProject3 should get executed. This is how we chained these 3 projects. Before Execution: After Execution: Here you can see the chain of the TestProjects’ execution. So this is all about Jobs in Jenkins. In the next tutorial, we will go deeper and see exactly how we make different configurations, how we pull code from the repository, and how we can trigger builds from it. Thank You!
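As a follow-up to the remote-trigger section above, here is a minimal sketch of firing that same URL from a script instead of a browser. It assumes the Python requests library, the TestProject1 job and the 1234 token configured earlier; depending on your Jenkins security settings you may also need user credentials and/or a CSRF crumb.

```python
# Minimal sketch: trigger the TestProject1 job remotely with its build token.
import requests

JENKINS_URL = "http://localhost:8080"   # replace localhost with the Jenkins machine's IP for remote calls
JOB, TOKEN = "TestProject1", "1234"     # the job and token configured in the article

resp = requests.post(
    f"{JENKINS_URL}/job/{JOB}/build",
    params={"token": TOKEN},
    auth=("admin", "api-token-here"),   # hypothetical credentials; omit if anonymous builds are allowed
)
print(resp.status_code)                 # 201 means the build was queued
```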
https://medium.com/swlh/jenkins-how-can-we-get-started-jobs-in-jenkins-part-3-8db3567e4981
['Kasun Dissanayake']
2020-07-12 22:19:16.586000+00:00
['Jenkins', 'Jenkins Pipeline', 'Jenkins Job Builder']
Implementing Observability in a Service Mesh
Implementing Observability Envoy is just awesome! It has a comprehensive set of configuration options and has tons of metrics to share. It is also the component that can give you rich access logs and HTTP Request Tracing. As discussed above, Consul is the control plane through which you can configure Envoy with these options. In this section I will zoom into the implementation details of how this can be done. Implementing Observability in a Consul Connect Service Mesh using Prometheus, AWS CloudWatch and AWS X-Ray HTTP Layer 7 Metrics In this setup I’ve used Prometheus to scrape metrics from all the sidecar proxies. By setting the envoy_prometheus_bind_addr field in the proxy.config service definition, Envoy can be configured easily to listen for scrape requests on port 9102. Further, Prometheus can discover the proxies automatically using its Consul service discovery mechanism as show in the example config below: Prometheus config that discovers Envoy Proxies via Consul SD These metrics can then be visualized in Grafana. Screenshot for Overview of Basic Service Health Layer7 Metrics Access Logs Access Logs can also be configured in the Envoy proxy using the Escape-Hatch Override option. This option allows you to define a custom listener configuration that enables access logging by setting the envoy_public_listener_json field as part of the proxy.config definition. You can also define it as part of the global proxy-defaults configuration entry as shown below: Consul Connect `proxy-defaults.hcl` config for HTTP Tracing and Access Logs in Envoy With this config (skipping over the tracing related information which will be covered in the next section), access logs will be written directly to /dev/stdout and automatically collected by AWS CloudWatch Logs driver and pushed into AWS CloudWatch Logs. You should then see logs similar to the following and of-course you can add more fields to it depending on your requirements: { "method": "GET", "request_id": "420e2950-50be-43c3-8d41-9b8e9ac9937b", "bytes_sent": "188", "origin_path": "/", "authority": "dashboard.ingress", "x_forward_for": "10.140.21.77", "protocol": "HTTP/1.1", "upstream_service_time": "10", "duration": "10", "downstream_remote_addr_without_port": "10.140.21.77", "user_agent": "curl/7.61.1", "upstream": "127.0.0.1:8080", "response_code": "200", "bytes_recv": "0", "response_flags": "-", "start_time": "2020-09-03T11:47:01.662Z" } The request_id field is important here as with this field you can correlate specific traces of request for example in tools like AWS X-Ray and even in Application Logs. HTTP Request Tracing An instance of an AWS X-Ray HTTP Trace b/w the Dashboard and Counter service communication Finally, HTTP request tracing can be configured in Envoy by setting the envoy_tracing_json and envoy_extra_static_clusters_json fields of the proxy.config service definition. The former defines the tracing provider, in my case it’s AWS X-Ray, and the later defines the corresponding static cluster configuration that contains AWS X-Ray’s endpoint so that HTTP request tracing data can be sent to it. In order to preserve end-to-end tracing between all services in the service mesh, the services themselves require code that extracts tracing data from the request headers on every inbound connection and forward that on every outbound connection. Additionally, tracing data, in particular the trace_id and request_id must also be included in every log message for traceability and better debugging. 
With AWS X-Ray, this is possible using their language-specific SDKs. In the case of Python, you can achieve this using the aws-xray-sdk (a minimal sketch follows at the end of this section). Similarly, there is an SDK for Go, Node.js, Java, Ruby and .NET. Putting it all together For a single service running in AWS ECS, at least 3 sidecar containers are needed for it to be fully integrated into the Consul Service Mesh: the Consul Agent, the Envoy Proxy and the AWS X-Ray daemon. The Dockerfile for each of these can be found in the linked Github repo above. An example list of ECS Container Definitions may look like the one below: AWS ECS Container Definitions for a Service to adapt to the Consul Service Mesh For simplicity, in this setup, these containers are configured to run on AWS ECS Fargate with AWS VPC Task Networking enabled so that all containers get access to the same localhost and can communicate with each other over that interface.
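As mentioned above, the services themselves must extract and forward trace context. Below is a minimal sketch for a Python (Flask) service using the aws-xray-sdk; the service name, daemon address and route are assumptions, not part of the original setup.

```python
# Minimal sketch of propagating X-Ray trace context in a Python (Flask) service.
from flask import Flask
from aws_xray_sdk.core import xray_recorder, patch_all
from aws_xray_sdk.ext.flask.middleware import XRayMiddleware

app = Flask(__name__)

xray_recorder.configure(
    service="dashboard",              # assumed service name
    daemon_address="127.0.0.1:2000",  # X-Ray daemon sidecar in the same task
)
XRayMiddleware(app, xray_recorder)    # reads trace headers on every inbound request
patch_all()                           # patches libraries such as requests so outbound calls forward the trace header

@app.route("/")
def index():
    # Any outbound HTTP call made here will carry the X-Amzn-Trace-Id header,
    # preserving the end-to-end trace through the Envoy proxies.
    return "ok"
```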
https://medium.com/izettle-engineering/implementing-observability-in-a-service-mesh-273c7409283d
['Jude Dsouza']
2020-09-28 14:22:24.191000+00:00
['Consul', 'Site Reliability', 'Service Mesh', 'Envoy Proxy', 'Observability']
How Does Becoming a Parent Change Your Life?
Let us take you to the journey where this life-changing stuff happened. And, we totally believe that by end of the read; parenthood is going to inspire life within you! So here goes a list of the top seven reasons (out of the endless list), how your life can also transform: 1. You’re going to find your best friend after a long time. While growing as adults all of us have missed many get-togethers and parties with old friends in time. We’ve lost touch with many folks – sometimes due to work, sometimes due to the distance and time zones. And, there are some like us, who keep themselves burdened with an emotional backlog because we do not open up much with everyone around! Your child will free you from emotional load. Your son or daughter will be the go to person, with whom you could be yourself. Without caring, without pretending, without thinking. You’ll care and love for each other, without expecting. And, that’s where you’ll find true friendship. 2. You’re going to live and laugh at the moment for no reason. When was the last time, we laughed our hearts out? Was it that great stand-up? Was it that hilarious comedy show? But, how long did that fun last? Pfff… Do we remain upset even at that time when one must be elated? Well, that’s where our kid made the difference. We realized that babies have this power of captivation. Why? Because, their soul is so pure, and without any bias. With your baby around, you’ll feel the bliss that you’ve never experienced before. Just being able to spend quality time with your son and daughter, is the ultimate happiness. It’s an unforgettable pleasure — to look into those naughty eyes, seeing the smile when you let them win, the first time they call you Mom and Dad – it’s music to the ears. You’ll just come alive. 3. Your mornings are never going to be dull again. We know how beautiful the day goes if you get up early and get things sorted. As a couple, we’ve generally struggled to be the morning folks. But we’ve always found this part very difficult because our mornings were really stressful – the pressure of household chores along with managing the office has been the reason why we used to hate getting up into reality. But believe me, for the last year there has been no day when we haven’t laughed out loud remembering the night and the early morning playful time with our boy. The child takes away the heavy work pressure and converts it into his daily activity. He just jumps from his crib and gives you the best hug possible. The presence, the beauty, the calm, the energy — all of it is so positive. It has brought us joy and has attracted so much more positivity in our lives. 4. You’re going to find new interests, sports, and hobbies. A daily routine for an urban couple mostly revolves around work and home. We start acting like machines, which just exist to work. At times you feel that your life has no purpose, other than to earn bread and butter. Leisure means spending time passively on your couch. The only goal remains a future where you could find time for yourself. Develop new interests, follow your passion, pursue your hobbies – we keep reading this but how many of us can follow it wholeheartedly. In the end, we keep postponing the precious moments in life for tomorrow. Your child is your teacher. Your baby will help you learn how to find new interests — knowingly or unknowingly. Your son and daughter will bring you back to live in the moment lifestyle. Keep your fingers crossed. 
The child will engage you and motivate you in ways you’ve not seen yourself before. 5. You’re going to dance and sing like never before. Well, some of us may still be singing and dancing right now, but many don’t, especially if you are a workaholic couple like we were. We used to hesitate over a lot of things because the opinion of others made us shy. When we cared too much about the opinion of others, that’s when we realized that we had started ignoring our own opinions as a couple. The child is going to break that vicious cycle. You’ll see the example in front of you. How to care less and live more? That’s when the inner joy will start flowing out. And, see, I told you — now, you’re going to dance and sing like never before. 6. You’re going to be more loving and grateful. Striving for success is a toxic dream. Not because such striving is bad, but because it takes away our focus from the present. And, when we are absent from the present, we tend to miss out on life. We stop being grateful. We lose that feeling of loving someone and being loved. The child brings that back. You’ll be the whole world for someone and someone is going to be your whole world. You’ll be loved unconditionally. You’ll feel blessed. You’ll be thankful. And, that’s when you don’t want to ask for anything more in life. 7. You’re going to be more caring and giving. Our societal relationships have always been conditional. We are not guilty about it, but we were always in a barter mode – give this and take that. We never realized that we even started trading our emotions as well. At times the practical behavior of work took away our compassion – there was a time when there was no empathy, no truthfulness. But we found what we lost.
https://medium.com/home-sweet-home/how-does-becoming-a-parent-change-your-life-6338312694e
['Aditi K']
2020-10-10 06:31:15.411000+00:00
['Life Lessons', 'Workaholic', 'Confessions', 'Work Life Balance', 'Parenting']
Collect and Store Streaming TimeSeries data into Amazon TimeStream DB
Photo by Ben White on Unsplash Introduction This implementation guide discusses serverless architectural considerations and configuration steps for deploying the Streaming TimeSeries Data Solution for Amazon TimeStream DB in the Amazon Web Services (AWS) Cloud. It includes links to a code repository that can be used as a base to deploy this solution by following AWS best practices for security and availability. The code snippets used in this demo have been placed in GitHub for reference. Components Basics AWS Lambda Lambda is a serverless compute service that lets you run code without provisioning or managing servers. Amazon Timestream Amazon Timestream is a fast, scalable, and serverless time series database service for IoT and other operational applications. AWS Kinesis Amazon Kinesis Data Streams ingests a large amount of data in real-time, durably stores the data, and makes the data available for consumption. The streaming data pipeline will look as shown below after the deployment: TimeStream DB Pipeline Getting Started AWS Kinesis Setup Create a timeseries-stream data stream with a shard as given below. Kinesis DataStream Amazon TimeStream Setup Create a database named ecomm in the same region as the Kinesis data stream, and a table named inventory in the ecomm database, using the gists shared on GitHub. TimeStream DB TimeStream Table AWS Lambda Setup Create a Kinesis producer to create and ingest time series data into the Kinesis data stream, which then has to be read in the same order. To do so, create a Kinesis consumer. The Python SDK examples used for this article have been kept in the TimeStream GitHub repository (a minimal write sketch follows below). Producer and Consumer logs will be available in CloudWatch. Producer Logs Consumer Logs The written results can be queried using the query editor of Amazon TimeStream. select * from “ecomm”.”inventory” limit 10 TimeStream DB Output Deployment The Serverless Framework makes deployment and development faster. Deploy Lambda Producers and Consumers into AWS and schedule them to run based on time-series event triggers or any schedule. In a typical production scenario, the producers might be outside of the cloud region and events might arrive through the API Gateway. Conclusion In most organizations, time-series data points are written once and read multiple times. It is clear that time-series data can be collected and stored using serverless services. Though Timestream can be integrated with various AWS services, Kinesis was chosen since it has data retention and replay features. The next article about time-series data will have a use case using the kappa data processing architecture.
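For readers who want a feel for the write path before opening the repository, here is a minimal sketch of what the Lambda consumer might do when it lands one record into the ecomm.inventory table with boto3. The dimension and measure names are illustrative assumptions; the full producer and consumer examples live in the linked GitHub repo.

```python
# Minimal sketch: write one time-series record into the ecomm.inventory table.
import time
import boto3

write_client = boto3.client("timestream-write", region_name="us-east-1")  # assumed region

record = {
    "Dimensions": [
        {"Name": "sku", "Value": "ITEM-001"},         # illustrative dimension
        {"Name": "warehouse", "Value": "eu-central"}, # illustrative dimension
    ],
    "MeasureName": "stock_level",
    "MeasureValue": "42",
    "MeasureValueType": "BIGINT",
    "Time": str(int(time.time() * 1000)),  # current time in milliseconds
}

write_client.write_records(
    DatabaseName="ecomm",
    TableName="inventory",
    Records=[record],
)
```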
https://medium.com/cloud-data-analytics/collect-and-store-streaming-timeseries-data-into-amazon-timestream-db-6178bee7434a
['Prasanth Mathesh']
2021-01-11 06:51:50.439000+00:00
['Lambda', 'Data Engineering', 'Kinesis', 'Time Series Data', 'AWS']
Editor’s Faves — Top 10: Why Coding Knowledge Is the Future of Literacy
Here is a list of our top 10 writers who are telling you different things from learning to code to object-oriented programming — but it includes less technical pieces as well: 10. How To Code With SoloLearn Agnes Laurens is a radio journalist, violinist, a mother, and a painter. She is a fabulous writer from the Netherlands. She is an editor for Illumination as well. In this interesting story, she is sharing how she learned to code and found that it was not difficult. Don’t miss this one. “A girl can’t code”, people say often to me, but I think this is very presumptuous. People assume all sorts of things that might not be true. For example that men are not good for the fashion industry. Look at black people now, they have been put in corners they don’t want to be, and they don’t need to be. There are always groups that are willing to belittle other groups of people. Since a few years ago, I have been trying to learn code. Every attempt failed. 9. The Best Programming Languages to Learn First — A Roadmap for the Indecisive Beginner Mark Basson loves to write about programming. You’ll love his writing style. Anyone can learn to code, but what programming languages should absolute beginners learn first? There are so many choices! If you are suffering decision paralysis, then I encourage you to stop researching and instead use the time to learn something that will get you off to a flying start. The following is a roadmap that will get you into the world of programming in the most frictionless, time-efficient, and cost-effective way possible. I won’t give you a multitude of choices. I will show you the exact courses to take so you don’t have to do any more searching. 8. OOP Concepts explained with real world scenario + (java) Nisal Sudila is a programmer and a writer. He is telling the basic concepts of object oriented programming to our readers. If you are new to OOP, don’t miss this one. OOP concepts cannot be grasped without understanding these 6 core concepts in programming. They are, Class Object Encapsulation Inheritance Abstraction Polymorphism 7. Data Literacy — A New Skill to Fuel Career Growth, Especially in a Non-Tech Company Andy Teoh is an advanced analytics professional. He is an excellent writer and he knows what he is talking about. Understanding data is becoming a major skill in an age where data is abundant. Do check his other work as well. Data literacy is the ability to read, analyze, and communicate to others with data in context, including an understanding of data origins and constructs. In this golden era of digital and data, we do not need to be Facebook or Walmart to build a data lake in our company. With the advancement in both hardware and cloud-based systems, this opportunity is now readily available to every company for a modest cost. As organizations amass much more data at an unprecedented rate, it becomes increasingly important for all employees — and not just data scientists — to be data literate so that we can contribute better in our roles and help our companies sharpen their competitive edge in today’s aggressive global economy. According to a survey sponsored by Qilk, 94% of respondents indicated that data literacy is key to developing professional credibility to boost career growth. 6. Eight Ways to Identify a Fake New Facebook Friend Request Tom Handy is an investor and an excellent writer. His style is simple, direct, and engaging. If you like this story, do check his other work. You just received a friend request on Facebook from someone you don’t know. 
The most likely thing you do is accept the friend request. The friend request was from a pretty woman who you think is harmless. Wrong! This could be an online scammer using a fake profile trying to connect with you so they could ruin your life. These days it is very easy for someone to create a fake profile and then scam you. I have figured out a few clues that I will share with you that you need to know. 5. How Is Technology Changing the Way We Read? Jason Ward is a freelance journalist, author, and writer. His writing style is simple, direct, and engaging. You’ll fall in love with his work. Don’t miss this one. For several millennia reading was limited to a very select few. Books, scrolls and parchments were written by hand and were both prized and rare. Only a small percentage of people were able to read. Then, in the 15th century, that began to change when Johannes Guttenberg invented the printing press. Literacy rates began to rise but progress was slow. However, the Industrial Revolution and the ability to mass-produce paper soon changed that. Education, news, and the rise of popular novels and literature soon became mainstream, leading to a correlating growth in things like libraries and bookshops. People discovered the joy of the written word. 4. Finding Order In The New World Stuart Englander loves to write inspiring stories. He is an excellent writer. You’ll love his short story. They arrived as a group of two dozen adventurers to find a barren and desolate plain, and there was no turning back now. Within a few years of sweat and toil, they turned the soil into a burgeoning landscape, and ultimately, it became a fully functioning ecosystem. This group of like-minded pioneers had much to be proud of, not the least of which was the creation of a new community, a garden of prosperity. Marvin Stafford perched on his favourite boulder, a pinkish-red block just outside his door. He stared across the still, rough landscape outside the compound, reminiscing over the past fifty years. He’d been here from the beginning, an unlikely leader who became the driving force behind the village’s success. 3. Worlds in The Magic of a Gamer’s Brain Tree Langdon is a writer and a poet. She is very intelligent and her writing style is simple but thought-provoking. She is a superb writers. In this story, she shares an experience when she attended a writing course with some young people who played a lot of games. Read to learn what happened next. Recently, I was surprised to discover how computers and gaming can enhance your creative writing skills. I enrolled in writing course at our local college and our first assignment was to write a short story using a scene to create emotion. As the students read their stories in class, I realized I approached writing from a different angle than the other students. In one way, that made sense. I have a lot more years to draw from, so I pulled from those experiences. But that wasn’t quite what was going on. 2. Review on Li-Fi: An Advancement of Wireless Network Arslan Mirza is a freelance writer. He is interested in new technologies and he wants to share his information. The concept of visible light wireless communication (Li-Fi) technology was proposed by Professor Hass in the United Kingdom at the TED (Global Technology and Entertainment Design) conference in 2011, and the preliminary application of optical transmission data was successfully realized in 2012. 
In October 2013, the Fudan University experiment also successfully realized the wireless transmission of visible light indoors. Through this technology, one LED light can be used for the normal Internet access of 4 computers, with a maximum rate of 3.25 Gbps and an average Internet rate of 150 Mbps 1. A Hackathon Gave Birth To One of The Most Influential Company Shubham Pathania is a coder and a writer. He lives between fiction and reality — his words, because he is a great writer. Do encourage him by reading his story.
https://medium.com/technology-hits/editors-faves-top-10-why-coding-knowledge-is-the-future-of-literacy-a0d6864e4ad2
['Dew Langrial']
2020-12-16 21:48:24.417000+00:00
['Writing Tips', 'Readinglist', 'Self Improvement', 'Reading', 'Writing']
Wartime
Image Credit: Columbia Pictures I ate a salad at lunch today. But more importantly, I overheard a conversation between the ladies sitting next to me. They both worked for a small startup and one of them was onboarding the new employee. At some point, one of them brought up Zoom ($ZM) and *very quickly* it was agreed upon that their team should sign up for a subscription. Last month, as Zoom was approaching its IPO around the same time as Uber ($UBER), I was singing the videoconferencing company’s praises. Meanwhile, also today, Uber announced another $1 billion loss for this past quarter: “Uber’s performance on the public market has been a letdown. Investors, even Wall Street experts, had anticipated an initial market cap in the ballpark of $100 billion. Instead, Uber currently sits at a valuation of about $67 billion, or $5 billion lower than the $72 billion valuation it earned with its last private financing. Uber’s core business, ride-hailing, is growing much faster than other segments of the massive business. While revenues grew 20 percent from the same period last year, revenues in the company’s ride-hail department grew only 9 percent. Uber Eats revenue shot up 89 percent while its gross bookings grew 108 percent.” Zoom had all the criteria that would make me break my own rule about not investing in IPOs, which is why I set a limit order to buy on the first day. Unsurprisingly, Zoom shot up immediately from $36 to $65 in its first minute of trading and then even reached over $90 per share in the past month. Sadly, I still own zero shares. The Wall Street Journal pointed out how software is going strong: “In the first quarter, American companies for the first time invested more in software than in information-technology equipment. Indeed, outside of buildings and other structures, software surpassed every type of investment, including transportation equipment such as trucks and industrial equipment such as machine tools. Software spending is even higher if the cost of writing original software programs, now classified as research and development, is included.” One downside to reading and listening to a lot of finance is hearing talking heads worry about the “long bull market” and its inevitable end. It’s been over 10 years since the financial crisis made the stock market hit a generational bottom and everyone is trying to predict the next downturn. The Novel Investor blog reminded me of this Benjamin Graham quote: “I am convinced that an individual investor with sound principles, and soundly advised, can do distinctly better over the long pull than a large institution.” I have a lot of good friends who right now are at a desk tasked with the job of calculating a company’s discounted cash flow. There’s nothing wrong with that, but there is certainly more to the prospects of a company than one spreadsheet. Brad Feld has read Ben Horowitz’s book The Hard Thing About Hard Things and likes Ben’s comparison between a peacetime CEO and a wartime CEO: “Peacetime CEO knows that proper protocol leads to winning. Wartime CEO violates protocol in order to win. Peacetime CEO always has a contingency plan. Wartime CEO knows that sometimes you gotta roll a hard six. Peacetime CEO strives for broad-based buy-in. Wartime CEO neither indulges consensus-building nor tolerates disagreements. Peacetime CEO sets big, hairy audacious goals. 
Wartime CEO is too busy fighting the enemy to read management books written by consultants who have never managed a fruit stand.” Even in this “long bull market” good companies are still battling for attention, market share, and investors’ dollars. Someday, for some reason, global crises will cause the U.S. stock market to collapse by many trillions. When that time comes, invest with leaders who know how to act like a wartime CEO.
https://medium.com/@johnbonini/wartime-11efce0d58bd
['John Bonini']
2019-05-31 03:52:41.871000+00:00
['Bull Market', 'Zoom', 'Software', 'Startup', 'Information Technology']
Enums (Enumeration Objects) in Python
Let's say you are a Pokémon developer in 1997 and you have to develop the first Pokémon Yellow Edition. You have already created the basic Pokémon class, and the subtypes like AQUA, FIRE and GRASS, which inherit from the base Pokémon class, are also ready. Let's see how we could do that without using enumerations and why this is bad! Photo by Elias Castillo on Unsplash Now, you want to create a Squirtle! You know Squirtle? I hope you do! Note that the following class is a simplified example in order to show the necessity of enum objects … class Squirtle(AquaPokemon): def __init__(self, name, level): self.name = name self.level = level self.type = "AQUA" You see the self.type = "AQUA"? We could do that. Like this, we defined that Squirtle is an AQUA Pokémon, which clearly is true. Now imagine the following. You have 100 AQUA Pokémon! Each one of them is designed like Squirtle. Now, your boss tells you: "Hey dude, I don't like the name AQUA anymore…. Let's call it WATER!" ;) Hell, no. Now you need to change every class. You need to go to every Pokémon file, check whether it is an AQUA Pokémon and change it manually. Wish that life could be easier … but … wait a minute! It can! Enumerations from enum import Enum # built-in library from Python class PokeType(Enum): AQUA = 0 FIRE = 1 GRASS = 2 That's it! Since Python is not a statically typed language like Java or C#, we need to import the Enum object and pass it to our class (we inherit from it) in order to make our class behave like an Enum. # let's play a little bit to understand the enum a = PokeType(1) >> a >> <PokeType.FIRE: 1> >> a.name >> 'FIRE' >> a.value >> 1 So we can construct an enum member from its value and then read its name and value attributes. What would our Squirtle class look like with the Enum type? # import the PokeType enum defined above class Squirtle(AquaPokemon): def __init__(self, name, level): self.name = name self.level = level self.type = PokeType(0) If we keep doing type assignments in this way, we can easily change Pokémon AQUA to WATER without much effort. Furthermore, our code is so much cleaner now, since we do not depend on a hard-coded string ("AQUA"), right? When should I use an Enum? Use an Enum when you need a predefined list of values which represent some kind of numeric or textual data, especially when you want to define a variable/class that can only take one out of a small set of possible values. Example: In Machine Learning, if you train a neural network to identify multiple classes for Natural Language Processing, you will need to transform text labels into integers. Use an Enum in production to automatically transform the predicted integer back into your class label (see the sketch below)! Hope you liked this introduction, please have a look at my youtube-channel and subscribe.
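To make that last Machine Learning point concrete, here is a minimal sketch of using an Enum to map a classifier's integer prediction back to a readable label. The Sentiment labels, the class index and the decode_prediction helper are illustrative assumptions, not part of the original tutorial:

from enum import Enum

class Sentiment(Enum):
    NEGATIVE = 0
    NEUTRAL = 1
    POSITIVE = 2

def decode_prediction(class_index: int) -> str:
    # Map a model's integer output back to a readable label.
    return Sentiment(class_index).name

# e.g. a hypothetical model predicts class 2 for some input text
print(decode_prediction(2))  # POSITIVE

Because the mapping lives in one place, renaming a label only requires touching the Enum definition, exactly like the AQUA-to-WATER example above.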
https://medium.com/@allahueggbert/custom-enumeration-objects-in-python-98dd1bd4c47f
[]
2020-12-12 16:45:22.218000+00:00
['Enum', 'Object Oriented', 'Enumeration', 'Oop', 'Python']
Humans as political point scorers
The end justifies the means? If we want to stop refugees from paying profiteers stupid sums of money that they do not have to risk their lives to cross an ocean to arrive in a new land seeking simple dignity and opportunity, we incarcerate people for years and years. We treat them all as criminals. We remove any form of dignity. To make a point. And the people who do this to refugees and perhaps the occasional opportunist go to a Church on Sunday, pretending to be of the Christian faith. A faith built on love, especially love of the downtrodden. (I am writing specifically about Australian politics here.) The end justifies the means? Oh ye of heartless, cruel faith, how lacking is your imagination? Surely if you applied a little more imagination you, we, us, might come up with a better solution? The end justifies the means is a failure of human ingenuity. It lives in a world where we must put up strong walls to prevent the have-nots from accessing our treasures. Where, if we really tell the truth, the haves (that would be you and I) have agreed to a global economic system that keeps the have-nots in the have-not class. Where short term wickedness to grab all the spoils keeps the majority broken. They are humans too… Too long have we made the innocent political point scorers. Like some Mad Max game of rugby, where the ball is humans, and the kicking is as cavalier as if the ball was a ball. Such a lack of imagination, a lack of truth, a lack of all-human-Christ-like compassion. Clinging like those refugees caught in a storm to a broken boat, our politicians and their media and corporate enablers hunker down, the sunk cost, forgiveness and human decency, too much for them to stomach. As if dignifying humanity is a form of capitulation, failure? How did we get to this? The cruelty. The winner takes all. The hypocrisy. We must look long and hard in the mirror. We the people might remember that citizenship is a right AND a responsibility. To sit back and point, to throw rocks…this is not to be an active citizen. We must write, speak, protest, name, call out, call, shout, participate, vote… If we want a more beautiful world. Call your local MP. Call the Department of Home Affairs. Speak. Write. Shout. Do it every day. And consider next time you hear, “The end justifies the means?” if this is indeed true or a convenient truism that enables people to do terrible things. Humanity can do better. I, me, you, we, are humanity. Photo taken June 9th 2021 #worldwithafuture #businessreimagined #syntropicworld #syntropicenterprise #syntropy
https://medium.com/@christine-mcd/humans-as-political-point-scorers-5a5a2ffc9c71
['Christine Mcdougall']
2021-06-08 20:46:05.685000+00:00
['Imagination', 'Syntropic World', 'Syntropic Enterprise', 'Politics', 'Syntropy']
Dalton Tanonaka | Tanonaka wins supreme court review in investment case
TANONAKA WINS SUPREME COURT REVIEW IN INVESTMENT CASE The Supreme Court of Indonesia on October 27, 2020, ruled in favor of Dalton Ichiro Tanonaka in his role as Director of P.T. Melia Media International, doing business as The Indonesia Channel. The Peninjauan Kembali (PK) judicial review reaffirmed a Jakarta High Court decision on May 23, 2018, that said no criminal actions were committed in a business agreement with investor Prem Harjani Ramchand. “Finally I’m able to focus on my company’s work, which is to help this country tell its story to the world through The Indonesia Channel,” said the former Metro TV and CNN International news anchor. “I ask everyone now to join us in our mission, especially when we enter the post-coronavirus recovery period.” Tanonaka was represented by attorney Marthen Pongrekun.
https://medium.com/@daltontanonaka/dalton-tanonaka-tanonaka-wins-supreme-court-review-in-investment-case-91be49a090e9
['Dalton Tanonaka']
2020-12-13 14:08:33.298000+00:00
['Supreme Court', 'Dalton Tanonaka', 'Investment Case']
The Paradox and Instability of Our Preferences
One of my recent articles, like some before it, got a lot of feedback. I was both glad about it and confused. The thought, “I wish they didn’t,” started circling in my head each time a like or a response appeared in various places where I shared the story. It was interesting to observe it because when I wrote it, and even before that — when I had the idea for it — I hoped many would like it and let me know about that. But when they did, my preferences seemed to flip to the opposite of my initial wish. Some people might recognize an impostor syndrome in such thought processes and behavior. For me, it felt simply and utterly confusing. Looking at it as an anthropologist would — non-judgmentally — helped tremendously. Writing about it here, too! Thank you for reading and thus helping me. I hope this little story will help you when your preferences seemingly decide to flip to the opposite of your wish or dream.
https://medium.com/rule-of-one/the-paradox-and-instability-of-our-preferences-cc07001bffb6
['Victoria Ichizli-Bartels']
2020-12-13 13:00:41.583000+00:00
['Ambition', 'Self-awareness', 'Writing', 'Mindfulness', 'Feedback']
Web Redesign Tips For Small Business
Numerous small businesses have outdated websites because the process of updating their website seems so painstaking, even when they’ve found a great Dallas web design agency like our own with WordPress designers and web developers who are highly skilled at small business web design. To help you stay up to date, we’ve assembled this blog post of our best web redesign tips. Web Redesign Tip #1 — Always Work With Your Web Developer or WordPress Designer on a Strategy for Your Small Business Web Redesign: While you should always be updating your website, and testing those changes, there should always be a strategic purpose for a web redesign. When you work with our Dallas web design agency, we help you define your web redesign strategy. Web Redesign Tip #2 — Embrace Evolutionary Site Redesign (ESR): Numerous small businesses have outdated websites because they simply don’t have the time to do a complete website overhaul. That’s okay. As a Dallas web design agency, we really don’t recommend complete web overhauls, but rather evolutionary website redesign. The world is constantly changing, SEO is an ongoing process, and consumer behavior can change overnight. The images, headlines, and other parts of your website that worked six months ago may be irrelevant now. Web Redesign Tip #3 — A/B Test Everything You Change: One of the best web design tips we can give you is to test every one of your changes. After all, if you don’t run a test to check whether your proposed change really does drive more sales or leads, then why make it? By having your web developer perform A/B testing, you’re gathering real evidence to figure out which pages, headlines, images, videos, and other parts of your website lead to desired actions like purchases or form completions.
https://medium.com/@cooperativecomputingdallas/web-redesign-tips-for-small-business-d15e7e9c9f52
[]
2020-10-28 11:33:45.404000+00:00
['Web Design', 'SEO', 'Web Development', 'PPC Marketing', 'Website Design']
Infinite Scroll With Photo Gallery in React
Photo by kimi lee on Unsplash To understand the context of what infinite scroll is and how it works, you can read the following article: https://medium.com/dev-genius/infinite-scroll-with-photo-gallery-in-vanilla-javascript-9dc1d3896cf7 This article is pretty much a react version of the article I just included. The focus of this article is to understand how infinite scroll works as well as how to display a loading indicator when images are being retrieved. By reading this article, I assume you have background and experience coding in react and javascript in general. As mentioned in the article provided, we have to understand three things: HTML that contains the images and a loading indicator a CSS file that controls the animation of the loading indicator and how big the images are a javascript file that contains the logic to get images, the logic to detect when the user scrolls to the bottom of the page, as well as timeout logic to give time to load new images when the user scrolls down to get more images. If this assignment is done in plain vanilla javascript, we have to use opacity to hide/display the loading indicator: Scroll down to the bottom and you will see the .loading and .loading.show CSS rules. This is the part where, if the div holding the loading indicator has a subclass of show (added when the user scrolls to the bottom of the page), the loading icons show up. With react, we no longer need to use opacity; instead, we can use JSX to conditionally display the loading indicator. Photo by Ferenc Almasi on Unsplash React allows us to modularize a lot of our code, and what that means is we can also modularize the loading indicator as well as the photo gallery. Based on that, we can divide our code into the following: photo gallery component loading indicator component Wow! The structure looks so different from what we have in the vanilla javascript implementation, which has an HTML, a js, and a CSS file. Technically, we could separate the js and CSS into files for the photo gallery and files for the loading indicator. However, we can accidentally mix up CSS rules if there are CSS rules from the photo gallery component and the loading indicator component named the same. What will happen is one CSS rule overwrites another, and that’s why I kept everything in one HTML, one js, and one CSS file only. To start, you need to run npx create-react-app infinite-gallery. Afterward, under the src folder create a components folder and create a PhotoGallery.js: Let’s break it down into chunks that we can understand. In our constructor, we set our initial state to have a placeholder for a list of photos, the current page to fetch photos from, as well as a loading boolean. This boolean helps us determine whether to show the loading indicator or not. The boolean only gets set to true when the user scrolls to the bottom of the page. Also within the constructor, we added an onscroll event listener. We get scrollTop, clientHeight, and scrollHeight, which are used to determine whether or not we are at the bottom of the page: If we are at the bottom of the page, the loading state of the photo gallery gets set to true, effectively rendering the loading indicator. Afterward, we increment the page count so on our API call we will be getting different images every time. After we retrieve the images, we add our images on top of the ones we have and set the loading state to false. On top of all this, we wrap the API call and setState in a debounce function. 
What this will do is wait for 1000 ms (it can be as long as you want) before the API gets called. This prevents excessive API calls from being made to get images, since the user can scroll to the bottom of the page again before the new images are retrieved. How about when we initially load the page? That’s where componentDidMount comes in. With the componentDidMount lifecycle method, we can fetch photos right after the component first mounts and populate the state before the user starts scrolling. Once the page renders, we populate the photos we have and if the loading state is turned on, then the loading indicator will show up. Photo by Jen Theodore on Unsplash We have talked enough about the photo gallery. How about the loading indicator? How does that look? Under the src/components folder, create a file for the loading indicator component with the following contents: Similar to what we have seen in the vanilla javascript implementation, we have keyframes that dictate how the dots within the loading indicator move. However, we need to wrap the keyframes into global CSS in order for other CSS rules that use the animation keyframes to understand what they are. Within the ball CSS, we set its width and height as well as the nature and the duration of the animation. animationTimingFunction determines the nature, animationDuration determines how long it takes, and animationIterationCount determines how many times the animation repeats. Photo by monika karaivanova on Unsplash One might ask… why create this in a react application rather than a simple vanilla javascript application? First of all, react provides an amazing mechanism to deal with the initial loading of data in the page. With vanilla javascript, you have to create a function to retrieve the data from an API and append it to a specific element in the HTML via jquery. That’s not necessarily the cleanest way of initializing some piece of data in a page. What if there are changes to the class name or the id of the particular div? Now your code is all broken and you have to go into the javascript file and change the id/class that you used to query the div for the photos. With react, you don’t need to worry about any of this! All you have to worry about is figuring out the right lifecycle method to use and making the API call to dump the data into the local state. If the data is not ready, all you need to do is set the loading state variable to true and show a loading indicator! That’s it! You don’t have to worry about using jquery to attach data to a specific node in the dom. With that being said, understanding how the infinite gallery is implemented in vanilla javascript will help us understand how DOM manipulation works. Furthermore, it helps us understand the downside of mixing HTML, javascript, and CSS logic when all of this can be modularized into components. That’s it! Happy coding!
https://medium.com/weekly-webtips/infinite-scroll-with-photo-gallery-in-react-c7219b8be2a0
['Michael Tong']
2020-12-02 08:46:54.242000+00:00
['Animation', 'CSS', 'Web Development', 'React', 'JavaScript']
Drug Discovery With Neural Networks
A summary of the Mechanisms of Action (MoA) prediction competition on kaggle, where we used deep learning algorithms to predict the MoA of new drugs. Figure 1. Overview (1) Human cells are treated with a drug. (2) Gene expression and cell viability measurements. (3) The data is fed to the neural network. (4) The NN predicts if a drug has a MoA or not. Discovering a new drug has always been a long process that takes years. With the recent advances of AI and the accumulation of research data in biological databases, the drug discovery process and the research pace are getting faster than ever. Researchers at the Laboratory for Innovation Science at Harvard are working on the Connectivity Map project [1] with the goal of advancing drug development through improvements to drug MoA prediction algorithms. This challenge was launched as a kaggle competition [2] in order to build machine learning models to predict the MoA of unknown drugs. 1- Dataset: We start by understanding the competition’s dataset: We have a dataset with gene expression and cell viability data as features and 206 MoAs as targets. 1–1 Gene expression features: Figure 2. Gene expression assay: The gene expression assay undergoes a series of steps starting from the selection of cell lines, drug treatment, and incubation, to mRNA extraction and quantification of a subset of genes. The gene expression is measured for 772 genes in this dataset. Figure 3. Distribution of 5 genes: g-1, g-2, g-3, g-400 and g-771 (out of 772 genes) Gene expression was measured with the L1000 assay. You can learn more about this new technology on the Connectivity Map webpage and in the research paper: “A Next Generation Connectivity Map: L1000 Platform and the First 1,000,000 Profiles,” Cell, 2017.[3] For the sake of simplicity, just one experimental condition was explained. In fact, one single drug was profiled several times at different dosages (low and high) and different treatment times (24H, 48H and 72H). 1-2 Cell viability features: Along with the gene expression data of 772 genes, cell viability data was provided for 100 cell lines. The cell viability assay is based on PRISM (Profiling Relative Inhibition Simultaneously in Mixture).[4] Figure 4. Cell viability assay: Cells are pooled and treated with the drug, incubated, and dead cells are counted per cell line. Figure 5. Cell viability distribution of 5 cell lines (out of 100) The cell viability assessment is based on PRISM. You can learn more about this new technology on the Connectivity Map webpage and in the research paper: Discovering the anticancer potential of non-oncology drugs by systematic viability profiling. Unlike the gene expression values that represent the mixture of the 100 cell lines, the cell viability values are per cell line. In other words: Gene-1 values are the average of the gene-1 expression over 100 cell lines as explained in figure 2, step 4. Cell-1 value is the viability of the cells belonging to cell line 1 as explained in figure 4. So far, we’ve seen the gene expression features and the cell viability features after the treatment with the drugs. The only missing puzzle piece is the drug’s mechanism of action, which is the target to predict. 
1-3 Targets: Drugs MoA In pharmacology, the term mechanism of action (MoA) refers to the specific biochemical interaction through which a drug substance produces its pharmacological effect.[5] Let’s make this definition simpler. For example, the drug aspirin reduces pain and inflammation, so for the MoA of aspirin: MoA’s function: Reducing pain and inflammation. MoA’s biochemical function: Involves irreversible inhibition of the enzyme cyclooxygenase, therefore suppressing the production of prostaglandins and thromboxanes, thus reducing pain and inflammation. This function or MoA is just one of the possible functions/MoAs that the drug aspirin can have, so one drug can have more than one mechanism of action. This makes drug MoA prediction a multi-label problem. We were provided 206 MoA targets per drug, labeled as (0: No MoA, 1: MoA). The table below displays 4 targets (out of the 206 targets provided in this dataset). sig_id: is the sample containing the mixture of 100 cell lines treated with a drug X. (step 1) 5-alpha_reductase_inhibitor, 11-beta-hsd1_inhibitor, acat_inhibitor… are the target mechanisms of action Table 1. Example of 4 MoA targets out of 206, labelled (0: No MoA) and (1: MoA). Let’s take the first row: sig_id ‘id_d00440fe6’ is a mixture of 100 cell lines (see step 1); it was treated with a drug X (see step 2); this drug X doesn’t have the MoA 5-alpha_reductase_inhibitor so it’s labeled as 0, but it has the MoA ‘acat_inhibitor’ so it’s labeled as 1. Problem statement: 100 cell lines are treated with a drug. Gene expression and cell viability data are collected to understand the biological activity of this drug. The task is to predict the MoA of new drugs based on the gene expression and cell viability features. (See figure 1) You can learn more about the competition’s data and the feature interactions (genes, cells and drugs) in my kaggle notebook: Drugs MoA classification: EDA. 2- Drugs MoA prediction: We arrive at the most exciting part of this analysis, the prediction of the mechanism of action of new drugs based on their gene expression and cell viability features. While deep learning is dominating computer vision and natural language processing tasks, tree-based algorithms (random forests, decision trees…) and gradient boosting machines (XGBoost, LGBM, CatBoost…) are still the way to go with tabular data. However, this is not the case here: deep learning algorithms outperformed gradient boosting machines. Why is that? Because we have a multi-label problem with 206 targets to predict. Shallow machine learning algorithms don’t natively support multi-label tasks; in other words, they do not make use of the correlations and co-occurrences among the 206 targets to improve the accuracy of their predictions. To get a better idea, let’s compare the performance of Ridge, LGBM, XGBoost and 3 deep learning models in this competition. Figure 6. Model performance in the MoA prediction competition on Kaggle. In red, shallow machine learning models and in green deep learning models. Those scores are approximate; to get a better idea of how those models performed you can check out the notebooks training them on kaggle: Ridge, LGBM, XGBoost, ResNet, 4 layers NN and TabNet. The take-away from the figure above is the huge gap between shallow ML models and deep learning models. Deep learning models outperformed in this competition because of their ability to extract information from the connections among the 206 targets. 
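To make the multi-label setup concrete, here is a minimal PyTorch sketch of a network with 206 outputs trained with a binary cross-entropy loss over all targets at once. The layer sizes loosely mirror the 4-layer MLP described in the next section; the feature count, the random batch and the hyper-parameters are illustrative assumptions, not the competition code:

import torch
import torch.nn as nn

N_FEATURES = 872  # assumption: 772 gene expression + 100 cell viability features
N_TARGETS = 206   # one output per MoA target

class MoAMLP(nn.Module):
    def __init__(self):
        super().__init__()
        # blocks of batch norm, dropout, dense layer and ReLU,
        # roughly following the 2048/2048/1048/1048 description below
        self.net = nn.Sequential(
            nn.BatchNorm1d(N_FEATURES),
            nn.Dropout(0.2),
            nn.Linear(N_FEATURES, 2048), nn.ReLU(),
            nn.BatchNorm1d(2048), nn.Dropout(0.3),
            nn.Linear(2048, 2048), nn.ReLU(),
            nn.BatchNorm1d(2048), nn.Dropout(0.3),
            nn.Linear(2048, 1048), nn.ReLU(),
            nn.BatchNorm1d(1048), nn.Dropout(0.3),
            nn.Linear(1048, 1048), nn.ReLU(),
            nn.BatchNorm1d(1048), nn.Dropout(0.3),
            nn.Linear(1048, N_TARGETS),  # raw logits, one per MoA target
        )

    def forward(self, x):
        return self.net(x)

model = MoAMLP()
criterion = nn.BCEWithLogitsLoss()  # sigmoid + binary cross-entropy for every target
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

# illustrative batch: 64 random samples with multi-hot MoA labels
x = torch.randn(64, N_FEATURES)
y = torch.randint(0, 2, (64, N_TARGETS)).float()
loss = criterion(model(x), y)
loss.backward()
optimizer.step()

Because BCEWithLogitsLoss treats each of the 206 outputs as its own binary problem while all targets share the same hidden layers, the network can exploit correlations and co-occurrences among targets, which is exactly the advantage over shallow models described above.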
In the following sections, I would like to talk about 3 neural network architectures that performed really well on tabular data with multi-label targets. 2–1. Multi-layer perceptron: The MLP, or simple feed-forward neural network, is the simplest neural network architecture; with 4 dense layers (the first 2 layers with 2048 neurons and the last two with 1048 neurons) along with dropout, batch normalization layers and the ReLU activation function, it performed surprisingly well. Figure 7. 4 fully connected layers neural network architecture. Each block consists of batchNorm, dropout, dense layers and a ReLU activation function. The score achieved with this model was very competitive with an Adam optimizer, a reduce-on-plateau scheduler and a binary cross-entropy (BCEWithLogitsLoss) function that includes a sigmoid activation. CODE: 4 layers MLP code Github repository. [9] 2–2 TabNet: TabNet was introduced in 2019 by Google Cloud AI in the paper: TabNet: Attentive Interpretable Tabular Learning. It’s a deep learning model for tabular data. TabNet combines the properties of neural networks and tree-based algorithms: [6] It has the power of neural networks to fit and learn complex functions with a high number of parameters. And it has a feature selection mechanism similar to tree-based algorithms. It also uses the attention mechanism in feature selection. Figure 8. TabNet architecture. Source: https://arxiv.org/pdf/1908.07442.pdf The pytorch implementation of TabNet was done by dreamquark-ai in their tabnet github repository and introduced to kaggle by optimo in his TabNet Regressor notebook. Tuning and understanding the hyper-parameters can lead to a very powerful model. In fact, TabNet was the strongest single model in the MoA prediction competition, outperforming all the other models. CODE: TabNet code in the Github repository. [9] 2–3 DeepInsight CNN: DeepInsight is a methodology to transform non-image data into an image for convolutional neural network architectures [7]; this enables taking advantage of strong pretrained CNN models like EfficientNets. This approach was published in Nature’s Scientific Reports in 2019 and introduced to kaggle by Mark Peng in his image transformation tutorial and inference notebooks. Converting tabular data to image data starts by allocating the features in a feature matrix, where the location of features depends on the similarity of features, so we end up with a feature matrix with several clusters; in each cluster, similar and highly correlated features are grouped together (figure below). Figure 9. DeepInsight pipeline. (a) Transform a feature vector to a feature matrix. (b) Transform a feature vector to image pixels. Source: https://www.nature.com/articles/s41598-019-47765-6#Fig1 The power of this methodology with gene expression data lies in the arrangement of similar genes into clusters, which makes the differences more accessible and allows for more robust identification of hidden mechanisms than dealing with elements individually. Feeding those feature matrix images to CNNs helps to catch the small variations in genomic data with the power of the convolution and pooling layers of the CNN. To better understand how the DeepInsight transformation [8] works with our data, let’s plot the feature matrices representing the gene expression and cell viability data of 2 targets (MoA): proteasome inhibitor and DNA inhibitor. Figure 10. 
Feature matrices representing the transformed gene expression and cell viability data The difference between those 2 images is clear: the sample treated with a drug having an active proteasome inhibitor has a different feature distribution and correlation than the sample treated with a drug with a DNA inhibitor. This allows the pretrained convolutional neural networks to learn patterns that other models fed with tabular data cannot catch. Training pretrained EfficientNet B3 and B4 models with the DeepInsight-transformed images achieved competitive results, and better than that, it gave a huge boost to the final ensemble with the other neural network models since it learned new patterns only accessible in the images. CODE: Image transformation + efficientNet B4 code in the github repo. [9] Conclusion: Along with the models mentioned in this article, other models performed well in this competition with tabular data and multi-label targets, such as LSTM and GRU, which are sequential models. The advances of AI are getting us to the point of solving a tabular data problem with an ensemble of CNNs and RNNs. References: 1- Connectivity map project: https://clue.io/cmap 2- MoA prediction competition: https://www.kaggle.com/c/lish-moa 3- L1000 assay: Subramanian et al. “A Next Generation Connectivity Map: L1000 Platform and the First 1,000,000 Profiles,” Cell, 2017, https://doi.org/10.1016/j.cell.2017.10.049 4- PRISM: Corsello et al. “Discovering the anticancer potential of non-oncology drugs by systematic viability profiling,” Nature Cancer, 2020, https://doi.org/10.1038/s43018-019-0018-6 5- MoA definition: https://en.wikipedia.org/wiki/Mechanism_of_action 6- TabNet: https://arxiv.org/abs/1908.07442 7- DeepInsight: https://www.nature.com/articles/s41598-019-47765-6 8- DeepInsight transformation tutorial: https://www.kaggle.com/markpeng/deepinsight-transforming-non-image-data-to-images 9- MoA github repository: https://github.com/Amiiney/MoA_competition The diagrams, graphs and illustrations were made by:
https://medium.com/swlh/drug-discovery-with-neural-networks-a6a68c76bb53
['Amin Yamlahi']
2020-12-17 16:19:51.516000+00:00
['Drug Discovery', 'Kaggle Competition', 'Artificial Intelligence', 'Machine Learning', 'Deep Learning']
The Rise of Consumer Activism
Digital Activism More Rampant Among Young Americans While commodity activism isn’t new, it seems to be more popular than ever, especially among younger people. Cause & Social Influence, a research group focused on young Americans and their involvement in social movements, recently released a report on how young Americans (ages 18–30) helped others in 2020. The report found that the top 3 actions young Americans took were: Changing the way they shop Posting/sharing content on social media Signing petitions Note that these actions have been trending upward even prior to the pandemic, proving that these habits are not temporary. By November, half of the respondents said they were getting their information on issues of racial inequality and social justice from online content creators and influencers. We can glean a lot from these results: it’s clear that online influencers have a significant impact on young Americans and their shopping habits. And as opposed to generations preceding them, young Americans today largely seek to create change within the parameters of the capitalist status quo, as opposed to taking action outside of it. This raises a number of questions on whether systemic change is even possible when one is fighting it within the constraints of an inherently unequal system. The re-emergence of commodity activism reflects a turning point in the way we consume — and from where. The Origins of Mass Consumption Consumerism has a long history. The end of WWII marked the beginning of America’s consumer society: jobs were abundant, wages were good, upward mobility was not uncommon — and Americans were eager to spend. And spend they did. The American consumer was considered a patriot, a sentiment that continues today in Western countries. Suburban life took off and many Americans were, by all accounts, living the good life. Entire industries sprang up, transforming the way we consume, no longer out of pragmatism, but out of desire. Edward Bernays — the Father of Public Relations — played an influential role in manipulating public opinion to boost profits. He was inspired by his uncle Sigmund Freud, who maintained that irrational forces drive human behavior. By subconsciously manipulating behavior, he would make people buy things they did not need. Public relations and advertisement skyrocketed in popularity. Of course, for every action, there is an equal and opposite reaction. The counterculture of the 1960s emerged as a way to resist mass consumption and capitalism. But then, counterculture itself became mainstream. Capitalism entered a new phase in which individual identities were shaped by production, which meant that people’s identities were, in turn, shaped by what they consumed. Then in the late 1990s/early 2000s, the rise of anti-consumer rhetoric once again took hold, reflecting a constant tug of war between consumption and anti-consumption in modern society. It seems the anti-consumerist movement made some headway, and then petered out, just as quickly as it emerged. The Rise of Hyper-Commodity Activism Fast forward to today. Instead of avoiding consumption, many companies and activists alike are promoting consumption as a form of resistance. Companies have long tried to profit off of social justice. This isn’t new. Indeed, even at the height of the anti-consumer movement, companies like Nike were embedding feel-good campaigns into their products as a way to entice the socially conscious consumer. 
Nike has a long history of political proselytization and has created an entire culture around social justice branding (or ‘brand culture’ as it is often called). In 1987, Nike released a major campaign with the help of Wieden+Kennedy, a Portland-based creative agency. In the advertisement, The Beatles’ 1968 hit song Revolution played in the background, while images of professional athletes and ordinary people of all races, sexes, and ages displayed on the screen, conveying the message that there was a revolution in the way people exercised. Nike profits soared. Soon after, Nike would be embedding social justice issues into many of their future campaigns, fusing anti-establishment ideas into their product messaging. Despite its continued criticism (and the criticism is abundant), Nike has seen great success profiting off of this marketing strategy. Supporting Local is a Red Herring As we have seen, it’s even more popular to promote consumption as an act of resistance — and the pandemic has set the stage for this trend to take off under the banner of ‘supporting local.’ Supporting local BIPOC-owned businesses is seen as the more ethical alternative to the faceless, exploitative company. And while this is certainly true, I’m still reminded of the consumer=patriot rhetoric that is often pushed in times of social decay and unrest. The messaging is that if people are going to consume anyway, then why not support local BIPOC businesses? I see the logic, though I also see this rhetoric as a red herring that absolves the consumer of the duty to meaningfully challenge the status quo in other ways. To be clear, I’m not critical of small businesses — it’s important that we support local, but I am critical of businesses and influencers that use emotional tactics — no matter what the cause — to manipulate people into purchasing things they don’t need — and especially influencers who stand to gain more social capital for commodifying activism. Buying an Experience Perhaps there was a time, in certain circumstances when commodity activism would have been an effective tool to incite change. But today, I largely see it as a way to make political statements within the parameters of neoliberal dogma. In many ways, it’s as if we are making negotiations with ourselves, that we can still do something within the capitalist constraints. Commodity activism, especially for young people, is about buying an identity or an experience. And ironically, it too depends on Bernays’s own propaganda techniques, but this time, it’s under the guise of doing good.
https://medium.com/free-thinkr/the-rise-of-consumer-activism-96a32277909b
['Rozali Telbis']
2020-12-15 16:27:23.827000+00:00
['Consumerism', 'Activism', 'Commodity Activism', 'Race', 'Politics']
Stably Announcement for USDS Consolidation + Upcoming Stably Enterprise Token Launches
Stably is building a more efficient global economy — faster, cheaper, and accessible to all — through USDS, a regulated stablecoin. Learn more at www.stably.io
https://medium.com/stably-blog/stably-announcement-for-usds-consolidation-upcoming-stably-enterprise-token-launches-9ce27c17bfd4
[]
2020-06-24 18:16:09.855000+00:00
['Announcements', 'Stably', 'Enterprise Technology', 'Tokenization', 'Fintech']
Truth = Utility
Truth = Utility Definition and implications of a new kind of epistemology Source: Todd Quackenbush at Unsplash We have officially entered the post-truth era. With the rise of deep-fakes, lying politicians, and Surkovian disinformation campaigns, it’s hard to get a handle on what truth even is. For a few months I was deep in a skeptical hole where I had truly lost grip on what I considered “real”, and I had to claw my way out by getting real silly and coming up with a formal definition that we might all agree with. Truth, I propose, is given by this expression: That’s it. That’s truth. I’ll define terms shortly and it’ll be clear that I’m abstracting away many messy details, but I will try to convince you that this basic structure agrees with many of our informal intuitions and is useful for decision-making — potentially even serving as an objective function for the automated scientists of the near future. Even better: perhaps it can be used as an objective function for search-systems or newsfeeds, only returning the top results as ranked by their truth-value. If you buy the definition wholesale, I’ll show that it has interesting implications that aren’t immediately obvious. If you don’t buy the definition at all, I invite you to put all critiques, counterexamples, and challenges in the comments so I can update accordingly. The tau truth-function The first thing you’ll notice is that truth is a function. This basic framing already aligns with our intuition in important ways: truth isn’t a value that floats out in the ether all by its lonesome, it’s a property of a statement — which we write here as s. Statements themselves don’t have a truth-value on their own, either: they are always grounded by some set of contexts, D. Some basic validation by intuition: the statement “the boy runs” doesn’t have any truth-value assigned to it until you give it a context. If the context involves a boy running, then the statement is true. If the context does not include a boy running, then it is false. You need both a statement s and a set of contexts D to evaluate truth, even if D means “all possible contexts” (for example, tautologies are true in all contexts!). It can be argued that we can do away with the notion of contexts if statements were fully qualified (i.e “The boy runs and the time and place is such where there is a boy running”), but since contexts are an easy-to-use shorthand, it’s what we’ll go with. The context set, D D is a Domain: a set of contexts d. d is a set of self-consistent true statements. This captures the idea that the truth of s, in part, is a function of its consistency with established true statements. However, it is possible for some d₀ to contain statements that disagree with statements in a d₁, even though an s is consistent with both. For example, take d₀ to be all of Newtonian classical mechanics and d₁ to be Einstein’s General Relativity. The s “Gravity exists” is consistent with both contexts and its truth-value should be evaluated in each of them piecemeal, even though the two d fundamentally disagree elsewhere. Truth-values are numbers It has likely not escaped you that the result of the 𝛕 function is a sum. This implies some weird things: Truth-values are composed of subcomponents. Truth-values can be operated on by addition. I want to enforce one other constraint, too: truth-values are numbers. This implies that when you add a positive truth-value to another positive truth-value, the result is something more true. Negative truth-values make something less true. 
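For reference, the expression promised at the opening can be written out from the definitions in this piece, assuming the truth-function is read exactly as the sum of marginal utilities over the domain (a reconstruction, so treat the precise notation as an assumption):

\tau_D(s) = \sum_{d \in D} \Delta U_d(s, d)

where D is the chosen domain of contexts, s is the statement being evaluated, and \Delta U_d is the marginal utility that the "Truth is marginal utility" section defines below.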
Why is this a useful notion? Well, we’ve already established that a statement s can be evaluated within the contexts d₀ and d₁, your chosen D. Let’s say we have three statements s₀, s₁, and s₂, with the following properties: s₀ is not consistent with d₀ nor d₁ s₁ is consistent with d₀ but not with d₁ s₂ is consistent with both d₀ and d₁ It seems natural to say that 𝛕ᴅ(s₀) < 𝛕ᴅ(s₁) < 𝛕ᴅ(s₂) where D={d₀, d₁}. To get a concrete example, let’s once again borrow from physics. Luminiferous aether, courtesy of Skitterphoto. Let’s take the statements to be: s₀: light is a wave in the luminiferous aether s₁: light is a particle s₂: light is a wave and a particle With the domain D composed of: d₀: {“The Michelson-Morley experiment was conducted”, “Light excites electrons in discrete quanta”} d₁: {“The Michelson-Morley experiment was conducted”, “Light excites electrons in discrete quanta”, “Light produces diffraction patterns”} In this case, s₀ is inconsistent with the Michelson-Morley experiment in both contexts, s₁ is consistent with d₀ but not the “diffraction pattern” statement in d₁, and s₂ is consistent with both. Again, it feels natural to me to say 𝛕ᴅ(s₀) < 𝛕ᴅ(s₁) < 𝛕ᴅ(s₂) where D={d₀, d₁}. This ordering couldn’t happen if truth-values were booleans — there’s something more numeric (or at the very least, ordinal) at play, here. As the next section will make clear, I’ll assert that truth-values are not only numbers but that they are real numbers. Truth is marginal utility This one’s probably going to be controversial but bear with me. The truth value of s within a context d is determined, fundamentally, by how much more utility we get than if it were not included. Utility here is defined by a utility function (AKA objective function) U𝐝. Baked into our notation is the idea that U𝐝 is specified by a context d, meaning, not only is d a collection of statements, but also a task that the statements in d and s should be used to perform with a goal to optimize U𝐝. It’s worth emphasizing that we aren’t concerned with absolute utility U𝐝 as much as we are with marginal utility, ΔU𝐝. This definition has counterfactual thinking buried at its core, because we have to estimate what Ud would have been in a world where s was (or wasn’t) used for some task and compare it to the measured Ud in the actual case. Again, let’s go through an example. Let’s say we’re evaluating the statement s=“You should wear a mask” in the context d={“There is a global viral pandemic”, “Masks mitigate the spread”, “Failure to mitigate spread leads to deaths”}. The context d also comes with a task to maximize the utility function Ud = -VirusDeaths(s,d). Seriously though, wear a mask. Source: CDC If Ud(d)=-220,000, but Ud(s,d)=-100,000, then we can say that ΔU𝐝(s,d)=120,000. If d is the only context in our Domain D, we can say that 𝛕ᴅ(s) = 120,000 — a positive, and thus true, value. You should wear a mask. I remember being very uncomfortable with the idea that truth was something necessarily task-relevant. 
But there is no better test of a statement’s truth than if it can be productively used for some task, whether or not that’s predicting the position of the Moon or launching a successful social media campaign. The implication that truth must be useful might remind you of angry old men yelling about National Science Foundation budgets in Congress and how we’re “wasting money” studying mantis shrimp or whatever. But it’s worth noting that the domain D in which we estimate truth values can be vast and future-looking: there is certainly some future context where studies on shrimp unlock enhanced performance in materials science or medicine, making blue skies research an engine of truth even if utility is low in present contexts. True statements don’t have to be useful right now — but they must be useful in a plausible world, past or imagined future. Where’s the “reality” term?! Let’s reiterate our formulation of the truth. There is a term for a domain of contexts, a utility function, and a statement to evaluate — but there’s no mention of “facts” or “reality”. But isn’t truth “fidelity to reality”? How can anything be true if it isn’t… real? First of all, “reality” is ill-defined and I know of precisely 0 ways to formulate a distance metric between a statement and reality. Most definitions of reality also invoke truth within them, so framing truth as a function of reality would be circular at best. Secondly, we never explicitly seek out to represent “reality” as accurately as possible in our day-to-day, anyway. It is overwhelmingly likely that our own sensory and perceptual system builds our representation of the world by means of heuristics and approximations — just good enough to help us survive and reproduce. Third: assuming there were a precise distance metric to quantify the fidelity of a statement to reality, you could bake that distance metric into a domain d as part of a utility function. The task, in this case, might be “represent reality as accurately as possible”, making the “fidelity to reality” definition of the truth a special case of the marginal utility definition I present here. I’ll repeat, though: representing reality as accurately as possible isn’t something we ever want to actually do. It’s just not helpful. Remember that a map is not the territory and that a map is only useful because it’s a compact representation of a land that is too vast to comprehend. We use maps precisely because they are not accurate: they are useful. Photo by Ylanite Koppens from Pexels Consensus does not matter, except for conventions It is true that you should stop when you see this sign. Photo by Davis Sanchez from Pexels Another thing you’ll notice is that there is no term for how much a statement agrees with the beliefs of others. The standard sophomoric answer for what the “truth” is from freshmen in Epistemology 101 is rebutted pretty handily here. This should be a satisfying result: cults and re-education camps can be quite marvelous at manufacturing consensus, but very few people would make the claim that they are truth-makers. There is a case where consensus matters: but it is baked entirely into the utility function Ud. When it is useful for many people to be in agreement about something otherwise arbitrary, it becomes true. 
For example, if your objective function Ud is to stay alive as long as possible in the contexts of city-streets while driving a vehicle, it is true that you should stop at stop signs. In New York, it is also true that you should drive on the right side of the road. A little deeper: if your objective function is to run a functioning economy, it is true that fiat currency has value. Because it is useful for us all to agree on that statement. Wide truths are better truths The truth-function involves summing marginal utilities over several separate and potentially unrelated domains. The implication here is that if a statement causes enhanced performance in many separate areas under scrutiny, it is true-er than statements that only enhance performance in one or two domains. Intuitively I wouldn’t have come up with this, but on deeper reflection, it actually maps on pretty well to how we rank propositions in science. Think about the difference between a Law and a Theory. A Law is a straightforward rendering of a deterministic relationship. For example, the Ideal Gas Law relates moles of a gas to its temperature, pressure, volume, and so on. The Ideal Gas Law is true and useful in the many contexts of predicting how various properties of a gas will change. But what about the Kinetic Molecular Theory of gas? It is also useful, but I’d argue it is more useful than the strict functional recording of the Ideal Gas Law, mainly because its insights can now be used in contexts other than predicting the changes in various properties of a gas. It can now be used to predict how diffusion might occur, how dense the atmosphere is at various altitudes, etc. — all applications that transcend the strict recording of relationships in the original context of gases in a closed container. I would argue that the Theory is more true than the Law because it is an explanation that can be ported to multiple contexts beyond where it was developed. To take it to another extreme — instead of scientific laws, let’s instead consider the weights learned by a neural network when it’s presented with a large volume of inputs and outputs. I’d call those weights true, insofar as they actually do model relationships between those inputs and outputs. But if an alternative approach were to simply write a small dense equation to model that same relationship, I would call that alternative more true — simply because it is portable across multiple different contexts and directly manipulable, whereas the neural network’s weights would only be useful in the context it was deployed, on a specific operating system, in a particular format, etc. etc. Deep truths are the truest truths We’ve seen that a statement is true if widely useful in a variety of contexts, which speaks to portability and manipulability being a heavy driver of truth-value. Another insight I’d like to consider needs me to impose some structure. Let’s say that contexts — collections of consistent true statements, remember — don’t exist as islands, but rather can be related by dependencies. The reason why you’d want to have this structure is because at each branching level, you’d want contexts to be able to disagree, even though they have some shared parent context that they both are consistent with. Now let’s evaluate the truth-value of a statement that can be added to d₀. If it turns out to enhance utility and is still consistent with child nodes, then it should also enhance utility or at the very least be 0-valued for those child nodes. 
Now if we instead insert the statement in d₃, our upper bound for truth-value is much smaller. It can only ever hope to enhance the utility of itself and child contexts. This strikes me as an explanation for why we find certain statements to be profound or fundamental. When something is a “deep truth”, it is actually deeper down the dependency tree of a particular Domain. So, if you were looking for a way to maximize the leverage of your truth-seeking: go deep. Be humble: truth lives on a surface, not a point Another consequence of this framing is that there are no privileged statements. A statement s₀ could produce some positive truth value, but a statement s₁ that contradicts it could produce an equivalent truth value if it had the same impact on marginal utility in as many contexts. Source: Jacopo Bertolotti, Wikimedia Commons A ready example of this is the right-hand rule when describing chirality or electromagnetism — there’s no particular reason we couldn’t have changed this convention in some way and achieved just as much with science and technology. A more controversial example: the Copenhagen interpretation of material objects — which says they do not have certain properties until they are measured and a wave-function collapses — is well and good and can certainly be useful in thinking through problems. But the de Broglie-Bohm pilot-wave theory is probably equivalent in its usefulness in a variety of contexts and thus equivalently true. Neither of them has a monopoly on truth except in contexts where a single unified convention is itself the main added value. There is a full surface of statements mapping onto truth-values, and plenty of them will be on a level given the same context, even if they disagree with each other.
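As a toy illustration of the arithmetic in the mask example above, here is a minimal Python sketch; the function names and the dictionary layout are illustrative assumptions, and the utility numbers are the ones quoted earlier:

def delta_utility(utility_with_s, utility_without_s):
    # Marginal utility of including statement s in a context d.
    return utility_with_s - utility_without_s

def tau(statement, domain):
    # Truth-value of a statement: the sum of marginal utilities over every context in the domain.
    return sum(
        delta_utility(ctx["utility_with_s"], ctx["utility_without_s"])
        for ctx in domain
    )

# The pandemic context from earlier, where utility is -VirusDeaths
mask_domain = [
    {"utility_without_s": -220_000, "utility_with_s": -100_000},
]

print(tau("You should wear a mask", mask_domain))  # 120000: positive, hence true

With a single context the sum collapses to one marginal utility, 120,000, matching the worked example; adding more contexts to the list would simply accumulate their marginal utilities, which is all the "wide truths" and "deep truths" sections are exploiting.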
https://medium.com/towards-artificial-intelligence/truth-utility-a17e1410ccec
['Naim Kabir']
2020-10-27 12:02:38.376000+00:00
['Science', 'Philosophy', 'Mathematics', 'Truth', 'Knowledge']
Pest Control Coconut Creek. Does the presence of pests in your home…
Finding Pest Control Coconut Creek is easy and simple: just call Optimus Pest Control to get rid of all the annoying pests and enjoy pest-free homes and offices. Does the presence of pests in your home irritate you? Are you suffering because of them? Optimus Pest Control is here to provide Pest Control Coconut Creek services and keep clients happy by removing all the pests from their residential and commercial properties. In our daily routine, we see different kinds of pests around us; we do not want them in our homes and keep looking for ways to remove them. The option that comes to everyone's mind is to hire a professional company that will help defeat the pests in the house and provide quality services as per your requirements and needs. We are here to give you outstanding Pest Control Coconut Creek services. Insects are small, and many different species exist around the world; some are hazardous and dangerous to humans and animals, and these insects spread infectious diseases that people suffer from. Pests can damage valuable belongings, premises, and many other things in different ways, and people are left stressed, wondering how to get rid of these insects. When searching for the best company to eliminate pests, choose one that is well reputed and experienced in providing Pest Control Coconut Creek services and that gives you peace of mind by removing the pests from your place and making it pest-free and safe. These tiny creatures make a huge mess wherever they live, and once pests decide to enter a place, nothing will stop them. They are found everywhere in the house and office once they establish an infestation; their large numbers affect the lives of humans and animals, and some pests even feed on humans and animals by sucking their blood. To avoid their presence, it is essential to call the professionals for Pest Control Coconut Creek services. If the pests are controlled at the initial stage, you may not suffer great damage; if you ignore their presence in your home, you may suffer huge damage that is not cheap to repair. So you need services provided by experts who can save you from a big loss. We are here to give people a variety of services on demand, because your satisfaction is important to us. Whenever you are suffering from a pest infestation, contact us; we will surely help you in the best possible way with Pest Control Coconut Creek services. Pests are a basic problem for everyone, and everyone tries hard to eliminate them from the houses and offices where they make their infestations. Trying to remove them on your own is the wrong approach, and it can be a dangerous step to take. Just call us and we will help you remove the pests the right way. Optimus Pest Control is one of the best pest control companies providing different pest control services; when pests create an infestation, Pest Control Coconut Creek is essential. We have professional and qualified staff engaged in delivering the best services and saving you from the evils of pests. Pick up your phone and give us a call.
https://medium.com/@optimuspest5/finding-pest-control-coconut-creek-is-easy-and-simple-just-call-optimus-pest-control-services-to-9b79ef7ba5e7
[]
2021-03-22 07:13:19.817000+00:00
['Writing', 'Blogging', 'Freelancing', 'Marketing', 'Médium']
Monitoring effectiveness of diabetic treatment using Control charts
If you've ever taken a statistics class in a graduate-level program, your instructor most probably began the course with the Central Limit Theorem (CLT). Before we get started, a refresher on the CLT: Regardless of the distribution of the population, the sampling distribution of sample means is normal, provided that the samples are randomly picked with replacement and the sample size is sufficiently large (n ≥ 30). In case the population is normally distributed, the sampling distribution of the sample means is normally distributed even for lower sample sizes. As we increase our sample size, the mean of the sampling distribution approaches the true mean (the mean of the population). The standard deviation of the sampling distribution of the sample mean is inversely proportional to the square root of the sample size, i.e., as the sample size increases, the standard deviation of the sampling distribution shrinks. Well, this was textbookish. To understand it better, let's pick up a dataset from Kaggle and try to replicate the results. We are choosing a dataset of diabetic patients and assuming the following: a. The dataset chosen is the population. b. We are only concerned with the number of days a diabetic patient spends in the hospital. All the other fields/features/characteristics in the dataset are irrelevant for this article. c. Hospitalization details of all the patients cannot be obtained in reality. Without further ado, let's get started! The distribution of the population is as below: Right skewed, implying that fewer patients spend a lot of days in the hospital(s). The mean of the population is 4.4 days and the standard deviation is 2.99 days. In reality, one might never be able to capture the population data. Even if one could, would it be cost-effective? When that is the case, how does one estimate measures of central tendency (such as the mean) or of spread (such as the standard deviation)? This is where the Central Limit Theorem comes in handy. Before we implement our project, we are going to confirm our understanding of the CLT point by point. Point 1: Normal distribution We are going to randomly capture 1000 samples from the population with sample sizes n = 2, 10, 20, and 40. The histogram of the sample means across the different sample sizes is as below: As the sample size increases from 2 to 40, the sample means exhibit a normal distribution. One can notice that in each case, the mean (solid red line) is close to 4.4 days (the population mean) across the samples of different sizes. What changes as we increase the sample size is the distribution of the sample means. As is visually evident from the histogram, the distribution when n = 40 is normal. Another graphical method of checking for normality is the Quantile-Quantile Plot, or Q-Q Plot. 
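A rough sketch of how the simulation above, and the Q-Q check that follows, could be reproduced (the file name and column name are assumptions on my part; the article does not state them explicitly):

```python
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
from scipy import stats

# File and column names are assumed, not taken from the article
df = pd.read_csv("diabetic_data.csv")
population = df["time_in_hospital"].to_numpy()

def sample_means(population, n, num_samples=1000, seed=42):
    """Means of `num_samples` random samples of size n, drawn with replacement."""
    rng = np.random.default_rng(seed)
    return np.array([rng.choice(population, size=n, replace=True).mean()
                     for _ in range(num_samples)])

fig, axes = plt.subplots(1, 4, figsize=(16, 3))
for ax, n in zip(axes, [2, 10, 20, 40]):
    means = sample_means(population, n)
    ax.hist(means, bins=30)
    ax.axvline(means.mean(), color="red")         # sits close to the population mean
    ax.set_title(f"n={n}, sd={means.std():.2f}")   # spread shrinks roughly as sigma/sqrt(n)

# Q-Q plot of the n = 40 sampling distribution as a normality check
stats.probplot(sample_means(population, 40), dist="norm", plot=plt.figure().gca())
plt.show()
```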
The Q-Q plot for 1000 samples drawn when n = 40 is: Points lying on the straight line indicate the distribution is normal. Point 2: Estimated mean tends to the population mean Now that we know that the sampling distribution of the sample mean is normal, let's check how the estimated mean performs when compared to the population: The dashed red line is the population mean, which in reality might not be known. Point 3: Standard deviation is inversely proportional to the square root of the sample size As per the CLT, the standard deviation reduces with an increase in sample size; let us check on that: The spread of the distribution reduces with an increase in the sample size. According to the CLT, the standard error of the mean is σ/√n (source: Wikipedia). From this equation, 2.99 / √40 ≈ 0.47, and this is in line with the results we have obtained. OK, agreed! We all knew that before; so what can we do with this information? Application of the information from the Central Limit Theorem: For starters, if the population distribution were normally distributed, one could use a cumulative probability distribution plot as below: From the plot above, it could also be inferred that almost 100% of the patients need hospitalization for less than 12.5 days. However, we know that the population is not normally distributed. So, what else have we got? Well, we can monitor the average number of days a patient spends hospitalized to maintain the quality of the treatment. What? How? Control Charts Control charts are an effective visual tool for monitoring a required metric for consistent performance. In our case, the average number of days a patient spends hospitalized is an indirect measure of the effectiveness of the treatment. The way we will build and use them is by setting up a null and an alternative hypothesis. Imagine that you are a medical administrator and the task at hand is to ensure that the average time of hospitalization remains at the existing duration or reduces (which would be ideal). If hospitalization days drift either above or below 4.4 days, we must be vigilant: efforts must be made either to rectify inefficiencies or to sustain what is working. Control charts can help you monitor this. Below is how you build them: Take random subgroup samples (patients in our case). We are choosing a sample size of 10 and taking 40 samples for our study (from the population, in this case). We then calculate the mean and standard deviation and plot them with control limits. Control limits can either be set by the process or be specified by the customer or industry. In our case, we use the process control limits. The mean and standard deviation are plotted against the subgroup number to check whether there is any pattern or a number of points going beyond the control limits. The plot is checked for irregularities such as trends, high variance, and points beyond the control limits. For a very consistent and stable process, the subgroup means lie within the first standard deviation. Since very few points are beyond the control limits, it doesn't hurt to perform a t-test. While the subgroup sample means do show variation, there does not seem to be any trend. Also, 2 out of 40 samples have means beyond the specified control limits. It makes sense for us to check whether the mean obtained from this process is statistically significant before investigating further. 
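A minimal sketch of how such an X-bar control chart could be assembled, reusing the population array from the earlier snippet (the subgroup scheme and process limits here are illustrative, not the author's exact code):

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(7)
num_subgroups, subgroup_size = 40, 10

# Draw 40 subgroups of 10 patients each and record the subgroup means
subgroup_means = np.array([
    rng.choice(population, size=subgroup_size, replace=True).mean()
    for _ in range(num_subgroups)
])

# Process-based control limits: centre line +/- 3 standard errors
grand_mean = subgroup_means.mean()
std_error = population.std(ddof=1) / np.sqrt(subgroup_size)
ucl, lcl = grand_mean + 3 * std_error, grand_mean - 3 * std_error

plt.plot(subgroup_means, marker="o")
plt.axhline(grand_mean, color="green", label="center line")
plt.axhline(ucl, color="red", linestyle="--", label="UCL")
plt.axhline(lcl, color="red", linestyle="--", label="LCL")
plt.xlabel("Subgroup number")
plt.ylabel("Mean days hospitalized")
plt.legend()
plt.show()
```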
Performing a t-test for statistical significance We can perform a one-sample two-tailed t-test, where the t-statistic is obtained as below: The value of the t-statistic is -0.48122. The p-value of 0.3165 shows that the mean obtained is not statistically significant at the 5% significance level. From the results above, we fail to reject the null hypothesis and conclude that the treatment of diabetic patients is as effective as before. Summary: We explored the central limit theorem using a dataset that captures the number of days a diabetic patient stays hospitalized. We developed a control chart to monitor whether the average number of days remains the same. We performed a t-test to check whether the results obtained are statistically significant, indicating the effectiveness of the treatment. Source code for the project can be accessed using the link below: https://github.com/rohitTheSupplyChainGuy/The-one-with-the-master-stroke-theorem
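For completeness, the one-sample two-tailed t-test described above can be run with scipy along these lines (a sketch reusing the variables from the control-chart snippet; the resulting numbers will differ from the article's because the samples here are random):

```python
import numpy as np
from scipy import stats

# H0: the mean hospitalization time is still 4.4 days; H1: it has changed
sample = np.concatenate([
    rng.choice(population, size=subgroup_size, replace=True)
    for _ in range(num_subgroups)
])

t_stat, p_value = stats.ttest_1samp(sample, popmean=4.4)
print(f"t = {t_stat:.4f}, p = {p_value:.4f}")

if p_value < 0.05:
    print("Reject H0: the average stay has shifted.")
else:
    print("Fail to reject H0: treatment looks as effective as before.")
```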
https://medium.com/analytics-vidhya/monitoring-effectiveness-of-diabetic-treatment-using-contol-charts-c35d011349d7
['Rohit Tolawat']
2020-11-10 12:58:47.547000+00:00
['Qq Plot', 'Control Chart', 'Hypothesis Testing', 'Central Limit Theorem', 'T Test']
Smart contracts: how do we use them?
In the MBI DeFi project we use smart contracts in order to remove both the human factor and the possible influence of third parties that might affect the deal. It is thanks to smart contract technology that we can guarantee the following benefits: Automation: The use of smart contracts simplifies the process and brings speed, clarity, and precision. Cost efficiency: Smart contracts execute immediately once the terms that bind the contract have been fulfilled. Autonomy: Smart contracts are self-sufficient, safe, and secure. The use of smart contracts allows us to ensure transparency both during stock splits and during any other internal transaction. Users can always track and verify any transaction and see where their money went. In addition to fully automated operation that doesn't involve third parties, smart contracts guarantee that deals are only made when all the prerequisites are fulfilled by all participants. All of that will allow us to strengthen the public's trust in our project! Join MBI DeFi now and experience all of the advantages provided by smart contracts!
https://medium.com/mbi-defi/smart-contracts-how-do-we-use-it-df9f63a951ea
['Mbi Defi']
2020-09-23 09:22:44.489000+00:00
['Mbi Defi', 'Smart Contracts']
6 Morning Habits to Help You Seize the Day, Every Day
6 Morning Habits to Help You Seize the Day, Every Day Your morning routine is the single most constructive part of your day — invest in it. Photo by Jeremy Bishop on Unsplash There are days where I manage to check everything off my to-do list. And there are days where I don’t even come close. You’ve probably experienced that feeling. It gets to about 4:00 pm and you realize there’s no way you have the energy left to complete everything you have to do today. You’ve had a slow day. The truth is that you might be setting yourself up to ‘fail’ from the very start of the day without even realizing it. A series of small, counter-productive habits at the start of your day can have a big impact on your mindset. Instead of giving you the energy you need for a productive day, they’re eating away at your motivation. Carving out time first thing in the morning to get yourself in the right frame of mind, could be the single most constructive thing you do all day. And the best part is, that these small, seemingly insignificant acts can reap huge rewards. In time, you’ll benefit from the compound effect. As Darren Hardy explains: “Small, Smart Choices + Consistency + Time = RADICAL DIFFERENCE” — The Compound Effect: Jumpstart Your Income, Your Life, Your Success’ The more you practice the same morning habits, the easier they become. They become habitual; you’ll soon start to do them without much conscious effort. They’ll form part of a healthy morning routine that sets you up for the day on the right foot. So here are 6 simple morning habits that have been tried, tested, and adopted by several highly successful people.
https://medium.com/the-ascent/6-morning-habits-to-help-you-seize-the-day-every-day-f6c0e0d5540a
['Laura Izquierdo']
2020-12-13 16:02:37.865000+00:00
['Work', 'Productivity', 'Success', 'Creativity', 'Psychology']
On Integrity
I had a heated debate today about integrity. The person I was speaking with couldn’t comprehend what I was saying about something so simple, so fundamental and so formative that I am actually concerned and bewildered enough to address it here. I think it’s incredibly telling that there are so many people out there that don’t live by a code of honor, nor do they understand what it means or why it’s so important. Just look around. This is precisely why our planet and society are in a world of shit right now — because of people choosing to be self-serving over doing what’s right. From a very young age, I made an oath to myself to always live a life of honor and integrity. And while it has not been the easiest path — I can tell you without a shadow of a doubt the lack of guilt and the assurance that I have made choices in right action for the highest good has made the difficulties worth it in spades. I look back with a clear conscience and those times when I was in the wrong, I take responsibility for my part and try to learn as much as I can from those experiences. That’s just who I am. I am always striving to be a better person and I honestly wish more people were like that. It sure as hell would make communicating and relating to others much less frustrating! When you have done as much self-work and spiritual development as I have, karma is very real, so anytime I step off or deviate, I get a swift kick in the ass. It’s just not worth it and I would rather suffer some relatively minor discomfort or inconvenience now than to knowingly hurt or take advantage of others, which ripples out with incalculable collateral damage. No one knows how much one slight of word, one egregious act, one lie, and the hurt caused can directly influence situations and outcomes. One bad action not only harms all of the people immediately involved, but it could set off a tsunami of negativity. That said, this power can also — and SHOULD — be used for good. Hence the need for integrity and honor. It’s so, so simple and it makes you and others feel good and cared for. Why is this so hard for some to understand!? We are all truly connected and in the wake of Covid, we should know that now more than ever. Aries have a saying that they believe in karma because we ARE the karma — and this is very true. We are the warriors of the Zodiac who fight for the underdog, noble causes, and those who are taken advantage of — so integrity is a huge part of our modus operendi. Integrity certainly is a prerequisite to be in my life because I will not participate in dishonorable behavior directed at me or others. Anyone who knows me knows I am sweet as pie and I do prefer to be chill and pick my battles, but once pushed too far, that is the last thing you will be thinking — so just do the right thing and there won’t be an issue. It’s really not too much ask. Mess with the bull (or ram) and you will get the horns. Comprende? So overall — since there seems to be some confusion about what integrity means and how to instill it in your life — here is an article explaining. We should all be working together towards solutions, so if you have a less-than-stellar track record of standing by your word, then know that you are part of the problem and this is for you. This is bedrock foundational common sense stuff, so I hope this helps you understand why honor and integrity are absolutely everything. Please teach your kids, nieces, nephews, grand and godchildren. 
If they have these traits, they will be set up for building good character and being a productive, contributing member of society and not a narcissistic parasite that just takes advantage without concern or remorse. Bottom line — time’s up on the BS. People need to stop hurting others for their own selfish, fear-based aims. They don’t realize they are also hurting themselves and keeping us all stuck in this whirlpool of crap. I’m so over all of it. For the love of God and all things reasonable and sane, it’s time to step it up!
https://medium.com/@leahcone/on-integrity-c61b94604c2d
['Leah Cone']
2020-07-07 17:30:40.215000+00:00
['Morality', 'Ethics', 'Integrity', 'Narcissism', 'Values']
E-commerce Beyond E-commerce
When we look at the word e-commerce, what generally strikes our mind in most situations is the term online shopping, which, as I like to say, is "kisi tareef ka mohtaaj nahi" (needs no introduction). Such is the beauty of this concept, or this way of life, or more than a way of life. Arguably, the need of the hour! But what is this e-commerce? Do we really understand it? Is it just the process of choosing what we like from options that almost every time seem to stretch beyond infinity, and getting those items delivered to our doorstep? Have you ever wondered how all of this gets done, and at such a pace that the turnaround time for a complete order in some locations is literally a day? How is all of this being achieved? These are some of the questions I think the general public should at least have a gist of, even if they don't know the whole process: the process of e-commerce, how products become visible to an online customer, what happens when they select a product, what happens when they share the address to which they want the product delivered, and finally what happens when they make the payment for an order. And not to forget the best part: how do we get discounts during the big flash sales, which in some ways seem to be around the corner 365 days a year? I will be coming up with some cool ways and examples to showcase how we are all moving in the direction of this e-commerce world. For now, I will leave you with these questions to ponder. P.S.: This is my first attempt to articulate my thoughts on Medium. Please excuse me if I am grammatically incorrect somewhere. I promise I will get better soon. Cheers, Divyansh Khandelwal
https://medium.com/@its-divyansh/e-commerce-beyond-e-commerce-8a15a245ffa6
['Divyansh Khandelwal']
2020-12-15 11:45:38.556000+00:00
['Ecommerce', 'Ecommerce Solution', 'Retail', 'Online Shopping']
Drivers Declare War on Uber, and Rydzz Is the Big Winner
We all (RYDZZ inclusive) know how we got here, this fight between workers, employees (or independent third party contractors as Uber would have them go by) and their employers, over their work data. In a classic man vs. machine setup, drivers may well be the Connors and tech giants, the T-1000. Gone are the days when employees trusted the little pop-up that told them how well and carefully companies will handle their person data once they clicked/tapped on “I Agree” (consumers stopped trusting that a long time ago). Tech companies collect a bulk of data on both their clients and employees, which serve as precious metrics to measure the quality of their services, client tendencies, and employee performance. Of course these companies say they protect this data, but given Facebook’s recent revelations (just to name but one case), nothing is certain anymore. It comes there as no surprise that Uber drivers are going at the rideshare giant with their pitchforks, demanding that their data be released. RYDZZ, a new ridesharing company about to hit the market will be watching with keen interest and taking rookie notes as best as it can, as Uber struggles with a slew of bad decisions and legal battles. Uber, much like RYDZZ, collects driver data to measure performance through rider ratings, reviews and other information. This data is, however, not released to the drivers, something which former Uber driver, James Farrar, believes to be unfair. According to the General Data Protection Regulation (GDPR) workers are entitled to demand data collected on them from their companies, and the European Union is in debate over how much data companies release to employees, researchers and entrepreneurs, their stance a demand for more freedom, something which big firms like Uber totally oppose. Farrar has over 100 Uber Drivers with him as the case goes to court. Uber has said that the GDPR does not demand them to create an extra database, which will be what has to happen if they have to release employee data. RYDZZ is closely following the developments laden with opportunities to best its competitor.
https://medium.com/@ringnyushalom/drivers-declare-war-on-uber-and-rydzz-is-the-big-winner-d85dd7873a46
['Robert Dwight']
2019-11-14 23:17:31.208000+00:00
['Rydzz', 'Ridesharing', 'Rideshare Drivers', 'Uber', 'Lyft Vs Uber']
Dynamic Programming with OpenCV : Implementing Kadane’s Algorithm to find brightest area on Astronomical images and videos.
Basic Astronomical Image We all have come across various OpenCV libraries and have been using them to find the brightest areas or spots in images. We already have cv2.minMaxLoc(…) in OpenCV to do this and get the location. But have you ever wondered whether we could employ a dynamic programming method for this instead? This is a problem that's called the maximum contiguous subarray sum in algorithms. We usually have a 1D array and are asked to find the maximum sum possible out of all subarrays of that array. We can employ a brute-force method to solve it in O(n²) time or a divide-and-conquer approach to get it in O(n log n) time. We will use dynamic programming in our approach to get it in O(n) time. Coming to our problem, we have an image which we can read using an OpenCV method, and we process the image to get a 2D array out of it. We can't directly feed the array into Kadane's Algorithm: right after reading, the image is not yet a 2D array, and its resolution is high enough that Kadane's Algorithm would run very slowly on it. Hence we first resize it to a smaller matrix of, say, (150, 150) shape. Please note that the more we shrink it, the faster the process runs. With 150 it took about 2 seconds on average; if we go for 100 we can get it done in about 0.7 seconds, but the accuracy is reduced. Higher resolution is better for accuracy, so we have to balance time against pixel dimensions; 150 seems to work fine and gives results pretty fast. Next we grayscale the image, which turns our array into a 2D shape. It is important to note that the matrix is of an unsigned 8-bit type, so its values are never negative, but Kadane's Algorithm needs negative values too in order to isolate the maximum-sum area properly; otherwise we may end up detecting the full array as the answer. So we convert the values to int16. We then smooth the array using a Gaussian blur, subtract the array's maximum value from every element, and add 1. This helps a lot when we want to get the exact area of the brightest spot. Next we pass this array to a function for our Kadane's Algorithm processing: a function called findMaxSum, which gives us the highest sum as well as the coordinates corresponding to it. This function works by passing matrix columns into the Kadane function and comparing the results to get the maximum sum. The code is given below: For a deeper understanding of how the function works, I have generated an example on a 3 × 3 matrix. This link should guide you, step by step, through how the code works: Now, once we have run Kadane's Algorithm, we have obtained our highest sum along with the coordinates corresponding to it. Since we want to indicate the brightest area on our image, we need to circle it. To find that coordinate, we take the average of finalbottom and finaltop, and of finalleft and finalright. This gives us the center. One idea would be to draw directly on the resized image, but we know that once we resize, the image quality is reduced by a huge margin. So the trick here is to use ratios and get the coordinates in the original image's larger pixel space. We basically load the original image once again and find the coordinate with respect to the one generated on the resized image. Once this is completed, we use the cv2.circle function to draw the circle on the brightest area of our image. Thus we end up with an image like this: We can also use the same concept to find the brightest area in a video. 
The code for both the image and video versions is given below, so feel free to star the repo and help me out if you can find better optimizations or have doubts:
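The gists referenced above are not included in this text, so here is a rough, self-contained sketch of the approach described (the file name, resize target, and circle size are illustrative choices, not necessarily the author's):

```python
import cv2
import numpy as np

def kadane(arr):
    """1-D Kadane: best contiguous sum plus its start and end indices."""
    best = current = arr[0]
    start = best_start = best_end = 0
    for i in range(1, len(arr)):
        if current + arr[i] < arr[i]:
            current, start = arr[i], i
        else:
            current += arr[i]
        if current > best:
            best, best_start, best_end = current, start, i
    return best, best_start, best_end

def find_max_sum(mat):
    """Maximum-sum sub-rectangle via column-accumulated Kadane passes."""
    rows, cols = mat.shape
    best, coords = -np.inf, (0, 0, 0, 0)  # (top, bottom, left, right)
    for left in range(cols):
        running = np.zeros(rows, dtype=np.int64)
        for right in range(left, cols):
            running += mat[:, right]
            s, top, bottom = kadane(running)
            if s > best:
                best, coords = s, (top, bottom, left, right)
    return best, coords

original = cv2.imread("astro.jpg")        # illustrative file name
small = cv2.resize(original, (150, 150))  # shrink so the O(cols^2 * rows) search stays fast
gray = cv2.cvtColor(small, cv2.COLOR_BGR2GRAY)
gray = cv2.GaussianBlur(gray, (5, 5), 0)
# Shift values so only the brightest pixels stay positive (Kadane needs negatives)
shifted = gray.astype(np.int16) - int(gray.max()) + 1

_, (top, bottom, left, right) = find_max_sum(shifted)

# Map the centre of the brightest rectangle back to the original resolution
h, w = original.shape[:2]
cx = int((left + right) / 2 * w / 150)
cy = int((top + bottom) / 2 * h / 150)

cv2.circle(original, (cx, cy), 40, (0, 0, 255), 3)
cv2.imwrite("brightest_spot.jpg", original)
```

The same loop can be run per frame (for example with cv2.VideoCapture) to track the brightest region in a video, as the article suggests.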
https://medium.com/analytics-vidhya/dynamic-programming-with-opencv-implementing-kadanes-algorithm-to-find-brightest-area-on-9fe6e9af38d4
['Siddharth M']
2020-12-30 10:18:08.426000+00:00
['Python', 'Dynamic Programming', 'Kadanes Algorithm', 'Data Structures', 'Opencv']
Introduction to “reflect-metadata” package and its ECMAScript proposal
Metadata, in a nutshell, is extra information about the actual data. For example, if a variable represents an array, the length of that array is metadata. Similarly, each element in that array is data, but the data type of these elements is metadata. Loosely speaking, metadata is not the actual concern of a program, but it can help us achieve things more quickly. Let's take a small example. If you need to design a function that prints information about other functions, what information would you print? In the above example, the funcInfo function takes a function as an argument and returns a string that contains the function's name and the number of arguments the function accepts. This information is contained in the function itself; however, we almost never use it in practice. Therefore, func.name and func.length can be considered metadata. In the Property Descriptors lesson, we learned about property descriptors of an object's properties. A property descriptor is an object that configures how an object's property behaves. For example, you can set an object's property to be read-only (non-writable) or non-enumerable. A property descriptor is like metadata of the object's property. It doesn't show up unless you look for it, perhaps using an Object.getOwnPropertyDescriptor() or Reflect.getOwnPropertyDescriptor() method call. You can customize this metadata to change how the object and its properties behave, and we learned that in the Property Descriptors lesson. Metadata is what makes metaprogramming possible. Metadata is necessary for reflection, especially for introspection. For example, you can change the program's behavior based on the number of arguments a function is designed to receive. So now you can understand the power metadata has. It opens all kinds of interesting possibilities for metaprogramming. However, we are restricted by what JavaScript has to offer for metaprogramming, which is not much. We have explored these features in the earlier lessons. If only there were a feature that could let us add custom metadata to objects. That would solve almost all the problems 🥺. Let's welcome Reflect's metadata extension, which does exactly that. There is a proposal to extend Reflect's functionality to add custom metadata to objects and object properties. Wait!!! I would not be so excited at this very moment, because it is still a proposal and it hasn't even been submitted to ECMAScript. However, you can find a detailed proposal spec here, and if you are looking for the reason why it wasn't submitted to TC39, this GitHub issue thread may help you. If you are a TypeScript developer and you have been working with Decorators, then you might have heard about the reflect-metadata package. This package lets you add custom metadata to classes, class fields, etc. But in this lesson, we are not going to talk about TypeScript or decorators. Perhaps we can cover them in separate lessons. In this lesson, we are going to look at what the reflect-metadata package offers and what we can do with it. In the Reflect lesson, we learned that the Reflect API is a great tool to inspect objects and add metadata to change their behavior. For example, the Reflect.has method works like the in operator to check for the existence of a property on the object or its prototype chain. The Reflect.setPrototypeOf method adds a custom prototype on the object, thereby changing its behavior. Reflect provides many such methods that are used for reflection. 
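(The funcInfo example referred to near the top of this section was an embedded gist in the original lesson and is not included in this text; a minimal reconstruction, consistent with the description above, might look like this.)

```javascript
// funcInfo describes another function using metadata the function
// already carries: its name and its arity (number of declared arguments).
function funcInfo(func) {
  return `The ${func.name} function takes ${func.length} argument(s).`;
}

function add(a, b) {
  return a + b;
}

console.log(funcInfo(add)); // "The add function takes 2 argument(s)."
```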
The Metadata Proposal (spec here) proposes some new methods to extend Reflect's capability. These methods, however, are only meant to augment the metaprogramming support of JavaScript. Let's have a quick look. // define metadata on a target object Reflect.defineMetadata( metadataKey, metadataValue, target ); // define metadata on a target's property Reflect.defineMetadata( metadataKey, metadataValue, target, propertyKey ); The Reflect.defineMetadata method lets you add a custom metadata value metadataValue, which can be any JavaScript value, to the target object (a descendant of Object) or to the target object's propertyKey property. You can add as many metadata values as you want, as each metadata value is identified by a metadataKey. // get metadata associated with target let result = Reflect.getMetadata( metadataKey, target ); // get metadata associated with target's property let result = Reflect.getMetadata( metadataKey, target, propertyKey ); Using the Reflect.getMetadata method, you can extract the same metadata associated with the target object or its properties via the metadataKey. In the previous lesson, we also learned about internal slots and internal methods. These are the internal properties and methods of the object that hold the data and the logic to operate on that data. This metadata proposal proposes a [[Metadata]] internal slot for all ordinary objects. This internal slot can either be null, which means the target object doesn't have any metadata, or it can be a Map that holds the metadata under different keys for the target or its properties. Reflect.defineMetadata calls the [[DefineMetadata]] internal method, and Reflect.getMetadata uses the [[GetMetadata]] internal method to fetch it. Let's see this in practice. But before we do that, we need to install the reflect-metadata package. You can find the instructions to install it in this GitHub repository. You can install it using the npm install reflect-metadata command. The require('reflect-metadata') import statement is special. It imports the reflect-metadata package, which adds the proposed methods such as defineMetadata to the Reflect object at runtime, and we can verify this by looking at the type of the Reflect.defineMetadata function. Therefore, this package is technically a polyfill. Then we defined a target object onto which we added some metadata with the keys version, info and is. The version and info values were added directly on the target while is was added on the name property. The metadata value can be any possible JavaScript value. Reflect.getMetadata returns the metadata associated with the target or its property. If no metadata is found, it returns undefined. Using these methods, you can associate any metadata with any object or its properties. The target is not mutated while registering metadata on itself or on its properties, as you can see from the logs. In reality, this metadata will be stored in the [[Metadata]] internal slot, but this polyfill uses a WeakMap to hold the metadata for the target. You can check for the existence of a metadata value using the hasMetadata method, and getMetadataKeys returns the keys of the metadata values registered on a target or its properties. If you want to get rid of a metadata value, you can use the deleteMetadata method. 
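(The worked example described above was an embedded gist in the original article; a reconstruction consistent with the keys named in the text, with illustrative values of my own, could be:)

```javascript
require('reflect-metadata'); // polyfills Reflect.defineMetadata, Reflect.getMetadata, ...

const target = { name: 'reflect-metadata demo' };

// metadata attached directly to the target object
Reflect.defineMetadata('version', '1.0.0', target);
Reflect.defineMetadata('info', { author: 'someone' }, target);

// metadata attached to the target's `name` property
Reflect.defineMetadata('is', 'a string', target, 'name');

console.log(Reflect.getMetadata('version', target));     // '1.0.0'
console.log(Reflect.getMetadata('info', target));        // { author: 'someone' }
console.log(Reflect.getMetadata('is', target, 'name'));  // 'a string'
console.log(Reflect.getMetadata('missing', target));     // undefined

// the target itself is not mutated by any of this
console.log(target);                                      // { name: 'reflect-metadata demo' }
```

With that example in hand, the signatures of the remaining lookup and removal methods are listed below.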
// check for presence of a metadata key (returns a boolean) let result = Reflect.hasMetadata(key, target); let result = Reflect.hasMetadata(key, target, property); // get all metadata keys (returns an Array) let result = Reflect.getMetadataKeys(target); let result = Reflect.getMetadataKeys(target, property); // delete metadata with a key (returns a boolean) let result = Reflect.deleteMetadata(key, target); let result = Reflect.deleteMetadata(key, target, property); By default, getMetadata, hasMetadata, and getMetadataKeys also look up the prototype chain of the target for metadata associated with the particular metadata key. Therefore, we also have getOwnMetadata, hasOwnMetadata and getOwnMetadataKeys methods, which basically do the same thing but operate solely on the target. // get metadata value of an own metadata key let result = Reflect.getOwnMetadata(key, target); let result = Reflect.getOwnMetadata(key, target, property); // check for presence of an own metadata key let result = Reflect.hasOwnMetadata(key, target); let result = Reflect.hasOwnMetadata(key, target, property); // get all own metadata keys let result = Reflect.getOwnMetadataKeys(target); let result = Reflect.getOwnMetadataKeys(target, property); In the above example, the proto object is the prototype of the target and therefore all the metadata values defined on the proto would be accessible on the target. However, the *Own methods do not search for a metadata key on the proto. The result of the above program would be as follows. This proposal also proposes the Reflect.metadata method, but this is not an ordinary Reflect method like the ones we have seen above. This method is a decorator factory, which means that when it is invoked with some arguments, it returns a decorator function that can be used to decorate a class or a class field. We learned about JavaScript decorators in the "A minimal guide to JavaScript (ECMAScript) Decorators" lesson. JavaScript decorators are not a part of the ECMAScript standard at the moment and the proposal is still in stage 2. Therefore, the decorator proposal is still under active development. @Reflect.metadata(metadataKey, metadataValue) class MyClass { @Reflect.metadata(metadataKey, metadataValue) methodName(){ // ... } } In the above snippet, the @Reflect.metadata method call decorates the MyClass class and the methodName method (property). Basically, it adds the metadataValue metadata to these entities with the metadataKey key. If you are wondering how this works, it's actually pretty simple. The Reflect.metadata method call returns a decorator function. This decorator function internally calls Reflect.defineMetadata, which adds metadata to the entity it is decorating, such as the class or its property. The problem here is that the reflect-metadata package (polyfill) implements the legacy version of the decorator proposal. TypeScript also implements the same version of decorators. You can follow my article on decorators to understand the legacy pattern for designing decorators. Since we can't implement the decorator pattern provided by this package in JavaScript natively, I would need to demonstrate it using TypeScript. But you can ignore the types in this program so that it looks syntactically similar to a JavaScript program. 💡 You can also use this babel plugin to transpile JavaScript with legacy decorators to vanilla JavaScript code that works natively. 
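(The Person example that the next paragraph refers to was also an embedded gist; a TypeScript sketch of it, assuming experimentalDecorators is enabled and with metadata keys of my own choosing, could look like this.)

```typescript
import 'reflect-metadata';

// a custom decorator factory that simply delegates to Reflect.metadata
function myDecorator(key: string, value: any) {
  return Reflect.metadata(key, value);
}

@Reflect.metadata('version', '1.0.0')
class Person {
  @myDecorator('is', 'a method')
  sayHello(): string {
    return 'hello';
  }
}

console.log(Reflect.getMetadata('version', Person));                  // '1.0.0'
console.log(Reflect.getMetadata('is', Person.prototype, 'sayHello')); // 'a method'
```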
As you can see in the above example, the @Reflect.metadata was successfully able to add metadata on the Person class which we later simply extracted using Reflect.getMetadata(<key>, Person) method. You can also create your own decorator factory such as myDecorator which returns the decorator returned by the Reflect.metadata call.
https://medium.com/jspoint/introduction-to-reflect-metadata-package-and-its-ecmascript-proposal-8798405d7d88
['Uday Hiwarale']
2020-09-01 06:29:53.029000+00:00
['JavaScript', 'Nodejs', 'Ecmascript 6', 'Ecmascript', 'ES6']
Debugging Best Practices in JavaScript
Photo by Kelly Sikkema on Unsplash JavaScript is a very forgiving language. It's easy to write code that runs but has issues in it. In this article, we'll look at how to make debugging JavaScript programs easier. Debugging Aids To fix issues that arise in our program, we can introduce debugging aids into our JavaScript program that are active in the development version of the program. In JavaScript, there are the debugger statement, console methods, debuggers, and the assert module. The debugger statement pauses a JavaScript program so that we can inspect the values of variables. console methods can log values of variables in the ways we like, including with colors, in a table, and even with styling. Some editors and IDEs, like Visual Studio Code, also have a built-in debugger for JavaScript. The assert Node.js module is also great for checking that assertions are met and throwing errors if they aren't. Logging also helps a lot with tracing sources of errors. Introduce Debugging Aids Early We should introduce them early so that we can use them to look for error sources as much as possible. Use Offensive Programming We shouldn't make ignoring errors easy, and we shouldn't let developers bypass errors, so that problems get checked out and fixed. Also, we can make case statements or else blocks shut down our program so that we won't miss error values that are encountered. Error logs can also be emailed to developers so they'll be looked at. Error reporting tools can be used to check for errors that are raised in any environment. Remove Debugging Aids Before releasing our program to production, we have to remove debugging aids. We can do that with environment variables that toggle them on and off depending on the environment. Just make sure that the aids are on in development and off in production. Use Debugging Stubs We can also add code that only runs in a development environment to help debug our code more easily. Determining How Much Defensive Programming to Leave in Production Code We may want to remove some error-checking code when our code is released to a production environment. There're a few guidelines to help us determine whether we want to remove it. Leave in Code that Checks for Important Errors If an error is too important to ignore, then we should check for it. Code that checks for invalid inputs and the like should stay in so that we won't have bad data saved. We have to add checks to make sure that our outputs are correct. Remove Code that Checks for Trivial Errors If there's code that checks for errors that don't have many consequences, then we can remove it. We can use version control, environment variables, etc. to toggle it on and off. Remove Code that Results in Hard Crashes Hard crashes don't provide users with a good experience. Therefore, we should remove them and handle those situations gracefully rather than crashing our program. Leave in Code that Helps the Program Crash Gracefully The corollary of the previous point is that we should leave in code that handles errors gracefully. It handles errors in a way that users wouldn't perceive as a bad experience, so we can keep it in. Log Errors for Our Technical Support Personnel Logging is important since we'll run into production issues eventually. We should make sure that we have logs that we can check to trace errors back to their original source. Morgan is a good logger for Node.js apps. Make Sure that the Error Messages we Leave in are Friendly Cryptic error messages are bad. 
We should make sure that they're useful so that we can use them to fix the errors we encounter. Also, we can add a contact phone number or email so users can reach out for help. Conclusion We should have aids to help us debug in the development environment so that we can use them to troubleshoot. We can also take out error handling code that handles trivial errors or that would cause a bad user experience. Logging is always useful so that we can trace errors back to their sources without looking too hard.
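As a small illustration of the environment-variable toggle and the offensive-programming advice above (the function and variable names here are illustrative, not from the article):

```javascript
const assert = require('assert');

// Debugging aids are on unless we are running the production build
const DEBUG = process.env.NODE_ENV !== 'production';

function debugLog(...args) {
  if (DEBUG) console.log('[debug]', ...args);
}

function setPrice(price) {
  if (DEBUG) {
    // development: fail loudly so bad data is impossible to ignore
    assert(price > 0, `price must be positive, got ${price}`);
  } else if (price <= 0) {
    // production: degrade gracefully and leave a trail for support staff
    console.error('Invalid price received:', price);
    return;
  }
  debugLog('price set to', price);
  // ... persist the price ...
}

setPrice(42); // logs in development, silent in production
setPrice(-1); // throws in development, logs an error in production
```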
https://medium.com/javascript-in-plain-english/javascript-best-practices-debugging-7581efbd3f2f
['John Au-Yeung']
2020-05-04 14:49:48.406000+00:00
['JavaScript', 'Programming', 'Web Development', 'Technology', 'Software Development']
Pony.ai strikes autonomous deal with FAW
The Chinese automaker is just the latest partner for the supplier to explore autonomous vehicle technology, specifically involving SAE Level 4 systems for “pre-installation mass production and business operations models.” Chinese automaker FAW has announced a strategic investment in Pony.ai, marking the second time this year that the autonomous technology company has received funding from a leading car manufacturer. In February, Pony.ai announced that it received a strategic investment from Toyota Motor Corp. According to an FAW statement, the two companies will give full play to their respective technical and product advantages in vehicle design and manufacturing, V2X, and mobility-as-a-service. They will work to integrate their advanced autonomous driving systems and vehicle platforms to explore SAE Level 4 systems for “pre-installation mass production and business operations models.” The agreement involves a Red Flag EV passenger vehicle platform and a Jiefang Truck commercial-vehicle platform, according to Pony.ai. Pony.ai was founded in Silicon Valley in December 2016 and is now co-headquartered in Silicon Valley and Guangzhou, China. It has daily testing and operations in Fremont and Irvine, CA, and Beijing and Guangzhou, China, exceeding 1.5 million autonomous kilometers in 2019. Under CEO and Cofounder Dr. James Peng’s leadership, the company has quickly become an industry leader and launched multiple autonomous driving fleets to the public across the globe. Prior to cofounding Pony.ai, he worked at Baidu and Google for 11 years. The company’s CTO and Cofounder, Dr. Click on the link below to read the full post online.
https://medium.com/@futurride/pony-ai-strikes-autonomous-deal-with-faw-97137d05f97e
[]
2020-11-04 21:49:38.815000+00:00
['Autonomousvehicle', '5ycapital']
10 Christmas Photography Ideas and Tips (Updated)
Photography and Christmas seem to go hand in hand. With all of your family and friends together, you have a chance to create imagery that will be shared online, physically printed out, mailed across the globe and cherished for generations with the power of digital imaging. So, here are 10 Christmas Photography Tips to make sure those photos turn out great. 1) Bump Your ISO Find the right ambient-to-flash balance, even if that means bumping the ISO to 1600, 3200 and even 6400 depending on your camera body. On full frame professional cameras, like the Canon 5D Mark IV, Nikon D850, or Sony a9, don’t be afraid to go up to 6400. On entry level cameras like the Canon Rebel or Sony a6500, keep your ISOs at 1600 and below to avoid too much image degradation. The images below were taken at ISO 3200 on Canon 5D Mark IV. 2) Use Lower Apertures Create beautiful bokeh in your Christmas background lights by dropping your apertures. The image below is shot on a Canon 5D Mark III with a canon 50mm f1.2 Lens at F2.0. Of course, you have to be very accurate with your focus at such shallow depth of fields, but if you can get your subjects sharp, the low apertures will really make your images pop and bring out those beautiful background Christmas lights. If your lens is capable, try staying at f2.8 or below. 3) Use Natural Light (Avoid Flash) If you’re looking to create natural, warm and moody images, make sure you turn off your flash for some (if not most) of your photos. This allows the ambient light of the Christmas lights or fireplace to show up in your images and allows parts of the scene to fall off into shadow. Image Courtesy of Darryl Wong Photography. 4) Use Christmas Props For Christmas portraits, consider including fun Christmas props like Santa hats and cute holiday outfits. It’s tempting to lounge around in pajamas all day, but a little bit of extra planning will go a very long way. The following image is from a Bokeh Overlay tutorial by Tanya Smith. 5) Get Close Putting the subject of your photo closer to the camera can help increase the amount of blur in the out-of-focus parts of the image, which works great for small details like Christmas ornaments. Be sure to keep your apertures low and consider using a macro lens to get in even closer while still being able to focus. Again, consider turning off your flash to maintain the natural, moody feel. Image Courtesy of Darryl Wong Photography. 6) Use Fun and Silly Poses The Holiday season is a time of fun and cheer. Be sure to step away from those classic smile-into-the-camera poses and get silly. As the photographer, whether you’re hired professionally to capture a holiday event or you’re simply photographing your family, it’s your job to get great reactions, smiles, and poses from your subjects. 7) Consider a Photobooth Photobooths are getting easier and easier to set up. There are free Photobooth apps on your iPad as well as a variety of simple, inexpensive Photobooth Software. For large holiday parties, these are a great addition to the festivities; and they even work well for small family gatherings. 8) Stay Active and Ready for Great Reactions Stay ready and actively anticipate smiles and laughs. Keep your camera up as your family and friends open their gifts and snap away at the reactions. Some of your best shots from Christmas will be the non-posed, in-the-moment images. 9) Consider Creating a GIF GIFs are great for showing a series of events in a sequence. 
Combining images from these moments can result in hilarious GIFs that your family will love. See the example below from our recent Lin and Jirsa Holiday Party. 10) Consider Creating a Collage Rather than posting 10 individual photos, consider creating a collage. In our world of social media, it's easier to Instagram, Facebook, and Tweet one picture that sums up the party than to clutter up newsfeeds and timelines with multiple images. 11) Use Advanced Techniques like Long Exposures and Composites To create beautiful starry effects in the Christmas lights, consider using small apertures and long exposures. This conflicts with our earlier tip of dropping down your aperture, but this technique is used to create an entirely different look. With small apertures, you're utilizing diffraction to create star bursts. If you're including people or other moving subjects in your photo, consider merging multiple images in a composite. See this Christmas Composite Tutorial Here. Christmas Photo Tips Conclusion The most important rule for Christmas photos is to have fun. It's a wonderful time to play with the bokeh of Christmas lights and utilize elaborate, festive sets. It's also a great time to get meaningful family photos, as it seems harder and harder to get all of our loved ones in one place at one time. Be sure to take advantage of this wonderful time of year and please share your results with us below in the comments or over on our Facebook Page.
https://medium.com/slr-lounge/10-christmas-photography-ideas-and-tips-updated-d6484a7b29c7
['Slr Lounge Staff']
2020-12-23 22:45:04.154000+00:00
['Christmas', 'Tips', 'Photography', 'Holidays', 'Portraits']
Ministry of Education Youth Development Administration YOUNG Fly Global Action Plan (2020) - FlyPhi
I love international affairs, and this is where I share my reflections on travel, study tours, and talks. A girl needs to wear two things to look great: confidence and a smile.
https://medium.com/@peixuan5/109%E5%B9%B4%E6%95%99%E8%82%B2%E9%83%A8%E9%9D%92%E5%B9%B4%E7%BD%B2young%E9%A3%9B%E5%85%A8%E7%90%83%E8%A1%8C%E5%8B%95%E8%A8%88%E7%95%AB-flyphi-81897e028d10
['Emily Wu']
2020-12-21 11:47:53.063000+00:00
['Young', 'Exchange', 'International', 'Taiwan', 'Southeast Asia']
Recently the NowJersey Digital Magazine crew made a trek… nay..
Our hope is that we were able to capture even a slight bit of the magic on film. We worked hard trying to film the right shots, angles, and moments that would convey what we were experiencing. If the sunflower clips in our video 'This Is Our Garden..', or this article, inspires anyone to get outside and get to the middle of a sunflower field.. or even better, to plant your own, large or small, then we've accomplished everything we hoped for in our visit to a NJ sunflower field. For more articles spotlighting New Jersey gems, please check out NowJersey on Medium.com. NowJersey is New Jersey, curated. Please do follow us on Facebook & Twitter for curated NJ scoop, and on Instagram for beautiful photos of the Garden State from some of NJ's best photographers, as well as some serious foodie pics! We are passionate about life in New Jersey. If you love New Jersey, 'like' us!
https://medium.com/nowjersey-digital-magazine/recently-the-nowjersey-digital-magazine-crew-made-a-trek-nay-c4fb4141050
['Mike Decastro']
2017-10-27 13:03:32.722000+00:00
['Nature', 'Newjersey', 'Family', 'Sunflower', 'Holisitc']
Simple Contract With Remix
Create a new contract called "SimpleStorage.sol" inside the file explorer. This simple contract defines a string named value that gets initialized in the constructor and saved in storage. The string is updated with the method setValue. When the value is updated, it emits a ValueChanged event. But… it doesn't compile??? And for that matter, where do you click a button to compile it??? Remix is now divided into Plugins that can be loaded individually. This makes Remix more customizable. So, for example, if you want to use Vyper, you don't need the Solidity compiler. Compile your Contract Click on the "PluginManager" icon on the left bar (it's the one that looks like a plug). Here you'll see all available Plugins inside Remix. Activate "Solidity Compiler". Click on the new Solidity icon to open the Solidity Compiler panel. Hit "Compile", or just click inside the editor and hit "Ctrl/Cmd+S". Test your Contract Now that your contract is compiled, you'll want to verify that it works. Let's use the Remix Test Plugin for that. Under the Plugin Manager panel, activate "Solidity Unit Testing". You should now have two plugins activated: Inside the "Solidity Unit Testing" panel you will find the documentation and examples to build your Solidity tests. Let's write our Solidity test file from scratch. Go back to the File Explorer and create a SimpleStorage_test contract (the name needs to end with _test to be considered a testing contract): First we import the remix_tests.sol library as well as our contract ./SimpleStorage.sol. Then we create the contract and define a variable testContract that will be reassigned before each test. It's important to use beforeEach because the library doesn't run the tests in order of definition, but in alphabetical order. In this case shouldChangeValue will be tested before shouldHaveAValue. The library exposes the method equal, which checks whether the first two parameters are equal and displays the third one (the message) if not. Run the tests Go back to the Solidity Unit Testing panel. Check your contract, and hit "Run Tests": As you can see in this example, the first test is "Should change value", and the second one is "Should have a value".
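The contract and the test file walked through above were embedded as gists in the original post and are not reproduced in this text; a minimal sketch consistent with the description (the initial string value and assertion messages are my own guesses) could be:

```solidity
pragma solidity ^0.5.0;

contract SimpleStorage {
    string public value;

    event ValueChanged(string newValue);

    constructor() public {
        value = "initial value";
    }

    function setValue(string memory newValue) public {
        value = newValue;
        emit ValueChanged(newValue);
    }
}
```

And the matching test contract:

```solidity
pragma solidity ^0.5.0;

import "remix_tests.sol"; // injected by Remix when the tests run
import "./SimpleStorage.sol";

contract SimpleStorage_test {
    SimpleStorage testContract;

    // runs before every test, so each test gets a fresh contract
    function beforeEach() public {
        testContract = new SimpleStorage();
    }

    // alphabetically first, so the runner executes it before shouldHaveAValue
    function shouldChangeValue() public {
        testContract.setValue("changed");
        Assert.equal(testContract.value(), "changed", "value should be updated");
    }

    function shouldHaveAValue() public {
        Assert.equal(testContract.value(), "initial value", "value should be initialized");
    }
}
```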
https://medium.com/remix-ide/simple-contract-with-remix-7285af713d99
[]
2019-06-18 11:56:18.342000+00:00
['Remix', 'Ide', 'Solidity', 'Ethereum']
NIGHT TALES 14: Living the Dream. Give me a desk job. Take your…
NIGHT TALES 14: Living the Dream nice and steady Give me a desk job. Take your adventures and keep it far away from me. Am I hard to understand? Give me a desk job. Give me lower back pains from sitting for far too long. Give me stress only endless paperwork can provide. Get your sick fascinations with fresh air far away from me. Give me a sturdy desk. A comfortable chair. If I have to stand up for it, take it away. Fulfill my dreams. Of sitting for eight hours straight in a small, air-conditioned space. Fulfill my dreams. Of pale skin brought by sunless days and artificial lights as I ride the train home at night. Fulfill my dreams. Of never leaving my comfort space. Fulfill my dreams of staying in one place. Give me a desk job. It's all I ever wanted. I don't care if it doesn't pay well. I just know that I will take that any day, Over money made from sweat and movement, From drinking and driving, From talking and walking, From sharing and laughing. From being far from home. And being fine with it. Leave me here. On my chair, With my hands on a desk, And an irremovable manic smile on my face. Leave me here. I am me here.
https://medium.com/literally-literary/night-tales-14-living-the-dream-cc4e2152a0f4
[]
2018-09-06 18:47:48.631000+00:00
['Art', 'Poetry', 'Night Tales', 'Fiction', 'Literally Literary']
Makers
Photo by Mike Tinnion on Unsplash I was told, recently, Humans evolved as storytellers. From the mythical to the mundane, The full span of any life is bound In the act of transmitting meaning. Meaning that must be first found In an indifferent universe. Humans search, eyes peeled, For even the slightest shade of purpose. All is entropy, solution and dissolution, Change and decay and growth, Directionless yet uncompromisingly driven, And we monkeys seek for truth in it. For beauty and goodness and hope And right and wrong, opposites And attractions and distillations Of order to scrape into some rock Press into vinyl, encode into a signal To pass on to some other. All of us are excited toddlers Screaming, “Look what I found!” But we are not explorers, In these moments of discovery, Bringing back nuggets or grains Of a forgotten or not yet uncovered reality I was told, incorrectly, Humans evolved as storytellers. That we find and share Gifts of truth and meaning and inspiration. Instead, we are the authors of all things. The first human to explain to another The first constellation Did not take the other’s hand to point out A pre-existing bear or scorpion Drawn by the gods or a beautiful accident. They held the other’s hand, Forefingers lying gently together, And applied the first paint brush to the night sky. Humans searched for meaning In an indifferent universe And created it when none was found To the delight of all.
https://medium.com/an-idea/makers-2d749bf86ecb
['Alan Parley Buys']
2020-12-28 06:07:12.629000+00:00
['Poetry', 'Art', 'Creativity', 'Storytelling', 'Life']
A golden opportunity for women to step into the conversation
A golden opportunity for women to step into the conversation Debates about gender parity and gender politics continue to be a hot topic globally and have kept the subject of women and leadership at the forefront of business conversations. There is now growing interest in the Middle East with regards to how women can take more advantage of the new business opportunities that are opening up to them. The pace of change in the region has been fast, and as more women come into the workforce, they are beginning to find their collective voice; demanding equal opportunities on a much larger platform than previously. The growing potential for female leaders against the backdrop of the region’s rich and vibrant culture is explored in Annabel Harper’s new book, ‘Shujaa’ah, Bold Leadership for Women of the Middle East’, in which she reveals how women can take better advantage of the openings which are emerging. Annabel is an Executive Leadership Coach and Facilitator with a deep interest in the development of women in leadership in the Middle East. Drawing on her vast experiences of working in the region, Annabel delves into the meeting of traditionalism and modernism, exploring how women can create their own style of leadership by blending their skills with the unique culture of the Middle East. Annabel also addresses the prerequisites for organisations in the region to maximise their female talent pipeline, reminding the reader that progress is not one-sided. She argues that it is imperative for both women and men to collaborate in order to build more inclusive and diverse organisations which are fit for the future. Women’s voices are at the heart of the book and ‘Shujaa’ah’ speaks not only to young women of the Middle East who have career ambitions but cannot move forward, but also to organisations to help them understand what they could be doing more of. It also urges Middle Eastern men to step up to the challenge, because ultimately, none of the changes that need to happen will be effective if they aren’t involved too. Shujaa’ah is an essential read for anyone who is looking to better understand the evolving role of female leaders in the Middle East. Described as both a ‘must-read for women in leadership’ and an ‘enlightening read for men’, Annabel hopes that Shujaa’ah will help to challenge long-held unconscious bias towards gender inequality in the Arab world. Publication date: 24th November 2020 ISBN: 9781784529277 £12.99 / $17.95
https://medium.com/bold-ambition/a-golden-opportunity-for-women-to-step-into-the-conversation-9eae367068e4
['Jen Owen']
2020-11-24 15:24:35.953000+00:00
['Leadership', 'Women In Leadership', 'Female Leadership', 'Women In Business', 'Middle East']
Sonos announces new artists contributing to its Sonos Radio services
Sonos announces new artists contributing to its Sonos Radio services Sonos has announced new offerings in its Sonos Radio music-streaming services. The ad-free Sonos Sound System will launch three new shows: Object of Sound, a weekly music and culture podcast; Black is Black, a monthly radio show “examining the black diaspora’s impact on modern music;” and Unsung, a biweekly show hosted by the independent British music publication Crack Magazine. Subscribers to Sonos Radio HD ($7.99 per month) will gain access to new stations curated by Icelandic singer Björk, British electronica duo The Chemical Brothers, American singer and producer D’Angelo, and English singer/songwriter FKA Twigs. These same artists will contribute one-hour segments to the Sonos Radio Hour on the free version of Sonos Radio. Back on the paid side, subscribers will also gain a new hip-hop and R&B station, Blacksmith Radio, hosted by artist manager Corey Smyth.
https://medium.com/@amy29923735/sonos-announces-new-artists-contributing-to-its-sonos-radio-services-96b3efa263ab
[]
2021-01-16 01:21:51.489000+00:00
['Mobile Accessories', 'Chromecast']
European Lorry driver
Green and Pleasant Land Day 3. Dear S- It’s a fucking nightmare and I’m cold. The lettuce in my lorry is as rotten as the fools who sent us here. Who, in their right mind, eats lettuce in December when they’re not on holiday a thousand miles south of this pin prick of an obnoxious island? Remember the “rehearsal”? Well here we are with Covid19 playing the tune. It’s true. These idiots still believe class equals intelligent superiority. Worse than that, maybe, is enough people have profited from lazy details to believe this shit. They actually believe they’ve earned the right to live under a tap attached to an aristocratic arsehole in the hope — hope for fuck sake — of catching a trickle of wealth. It’s incredible how stupid some of these people are. You’re right when you say how some of these hopeless thin-necked fuckers are worse than the American tourists they shouted about twenty years ago. These people are fucking idiots, clinging on to an age of barbaric murderous theft as if that’s a source of pride. Don’t they know Hitler fucked up more than they defeated him? I guess not. There’s nothing great about this place anymore. Some people are welcoming, obviously, but they’re accused of sucking Stalin’s balls by slick bellends growing fat from the Norman conquest. The irony is pathetic. These filthy worms have spent a lifetime fucking Marx’s philosophy of the working class, and now the veil of shit is gold. I’ll be home soon.
https://medium.com/@andrewreeveart/european-lorry-driver-17276f352e67
['Andrew Reeve']
2020-12-24 04:35:49.348000+00:00
['Nationalism', 'Wealth', 'Europe', 'Brexit', 'Communism']
Docker — A Beginner’s guide to Dockerfile with a sample project
Docker — A Beginner’s guide to Dockerfile with a sample project A step by step guide to understanding the Dockerfile Photo by Roger Hoyles on Unsplash Automatic creation of Docker images through Dockerfile Dockerfile is used to automate the Docker image creation. Docker builds images by reading instructions from the Dockerfile. We will understand Dockerfile instructions by building a sample project. clone the below repo for all the examples. Here are all the commands that we can use in the Dockerfile. Comments FROM CMD ENTRYPOINT WORKDIR ENV COPY LABEL RUN ADD .dockerignore ARG EXPOSE USER VOLUME Comments Comments in the dockerfile start with # and you can put anywhere those comments. # from base image node FROM This is the first command in the Dockerfile. Without this, we can’t build an image. We can build the image just with this command. when we build just with FROM, we are actually taking the base image CMD whenever the image is instantiated. Dockerfile // build the image docker build -t first-dockerfile -f Dockerfile1 . // list image docker images // run the image docker run -it -d first-dockerfile // use exec for interaction docker exec -it f1edbfca3eac bash Docker image built just with FROM command CMD CMD command is used to give the default commands when the image is instantiated, it doesn’t execute while build stage. There should be only one CMD per Dockerfile, you can list multiple but the last one will be executed. multiple CMD’s but the last one is executed // build the image docker build -t dockerfile2 -f Dockerfile2 . // list image docker images // run the image docker run -it -d dockerfile2 // use exec for interaction docker exec -it <conatiner id> bash build the image and running it if you notice it, the last command is executed when we ran the container. There are a couple of points to notice here. if we specify executable in ENTRYPOINT, we can use CMD to pass default parameters to it (look at ENTRYPOINT section) if not, we can specify executable and default params in the CMD. we can override the default command given in CMD while running the container ENTRYPOINT ENTRYPOINT is used as an executable for the container. Let’s look at the below example we are using ENTRYPOINT for executable command and using CMD command to pass some default commands to the executable. Dockerfile example with ENTRYPOINT // build the image docker build -t dockerfile3 -f Dockerfile3 . // run the container docker run -it dockerfile3 WORKDIR WORKDIR sets the working directory for all the consecutive commands. we can have multiple WORKDIR commands and will be appended with a relative path. Consider the following example where we have two WORKDIR commands leads to /usr/node/app Dockerfile for WORKDIR // build the image docker build -t dockerfile4 -f Dockerfile4 . // run the container docker run -it dockerfile4 // open in another terminal and do exec docker exec -it <container id> bash // see the current working directory pwd build and run the image in one terminal exec in another terminal ENV ENV sets the environment variables for the subsequent instructions in the build stage. Consider the below example where we define the environment variable workdirectory and we used that later with $. There are two forms: single and multiple values // form1 ENV param value // form2 ENV param1=value1,param2=value2 ENV example // build the image docker build -t dockerfile5 -f Dockerfile5 . 
// run the container docker run -it dockerfile5 // open in another terminal and do exec docker exec -it <container id> bash // see the current working directory pwd ENV in action COPY COPY is used to copy files or directories from source host filesystem to a destination in the container file system. consider this example where we are copying package.json from our system to container file system. we can verify that while building with the RUN command ls -l. Dockerfile with the COPY command RUN ls -l command lists package.json LABEL LABEL is used to add some metadata to the image. if we use the same label as the base image and the most recent label value is applied. Dockerfile LABEL example // build the image docker build -t dockerfile7 -f Dockerfile7 . // inspect the image docker image inspect dockerfile7 we can see the LABEL in json output of inspect RUN RUN executes the instructions in a new layer on top of the existing image and commit those layers and the resulted layer will be used for the next instructions in the Dockerfile. consider this example we are doing npm install and ls -l with one RUN to avoid any additional layers. // build the image docker build -t dockerfile8 -f Dockerfile8 . npm install takes the package.json and created package-lock.json ADD ADD is used to add files or directories and remote files from URL from source host filesystem to a destination in the container file system. Consider this example where we are adding index.js from our system to container file system. We can verify that while building with the RUN command ls -l. Dockerfile with ADD command // build the image docker build -t dockerfile9 -f Dockerfile9 . we can see index.js and other files in the container filesystem .dockerignore Whenever we build the image at the root level, the entire context is sent to the Docker daemon. Sometimes we don’t need to send all the content to Docker daemon, those files or directories should be added to this .dockerignore file. for example, we have node_modules in the context and we have added that to .dockerignore file. node_modules shouldn’t be sent to Docker daemon ARG ARG is used to pass some arguments to consecutive instructions and this is only command other than a comment can be used before FROM. We can see ARG usage in the below file and also we can pass that with the build command. ARG example Dockerfile // build command docker build -t dockerfile10 -f Dockerfile10 . // build-arg usage docker build -t dockerfile10 --build-arg NODE_VERSION=8.11.4 -f Dockerfile10 . Dockerfile is building by taking ARG EXPOSE EXPOSE is used as documentation for the port. This is just a communication between the person who builds the image and the person who runs the container. It doesn’t serve any other purpose other than documentation. For example, we have used port 3070 in the index.js file. So, we are letting people know who runs the container by using EXPOSE instruction in the Dockerfile. USER USER instruction sets the user name and optionally the user group to use when running the image and for any instructions that follow it in the Dockerfile usage of USER in Dockerfile VOLUME VOLUME is used to create a mount point with the specified name. Following are the examples of Dockerfile and running instructions. volume usage in the Dockerfile // build command docker build -t dockerfile13 -f Dockerfile13 . volume directory is listed in the container filesystem Here is an article for a more detailed explanation of volumes. 
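To tie the individual instructions together, here is a small, hedged sketch of a consolidated Dockerfile for the sample Node.js project described in this guide; the base image tag and the label value are assumptions, while the working directory, port 3070, package.json and index.js come from the sections above.

# Build-time argument; can be overridden with: docker build --build-arg NODE_VERSION=8.11.4 .
ARG NODE_VERSION=12
FROM node:${NODE_VERSION}

# Metadata about the image (the value here is just an example)
LABEL maintainer="you@example.com"

# All following instructions run relative to this directory
WORKDIR /usr/node/app

# Copy the dependency manifest first so the npm install layer can be cached
COPY package.json .
RUN npm install

# Copy the rest of the source; node_modules is excluded via .dockerignore
COPY . .

# Documents the port index.js listens on; publish it with -p when running
EXPOSE 3070

# Default command when a container starts from this image
CMD ["node", "index.js"]

To build and run it: docker build -t sample-node-app . and then docker run -d -p 3070:3070 sample-node-app.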
Conclusion We built the project step by step with the Dockerfile and covered most of the common instructions. Please refer to the Docker docs if you need more detail.
https://medium.com/bb-tutorials-and-thoughts/docker-a-beginners-guide-to-dockerfile-with-a-sample-project-6c1ac1f17490
['Bhargav Bachina']
2020-01-05 16:04:02.451000+00:00
['Software Development', 'Docker', 'Web Development', 'Programming', 'Dockerfiles']
New Dynamics of the Right-Left Conflict (1): The Debate on Political Correctness and Freedom of Expression
https://medium.com/t%C3%BCrkiye/politik-do%C4%9Fruculuk-ve-i%CC%87fade-%C3%B6zg%C3%BCrl%C3%BC%C4%9F%C3%BC-tart%C4%B1%C5%9Fmas%C4%B1-%C3%A7%C4%B1kar%C4%B1lacak-dersler-fd7a0958efc7
['Engin Onuk']
2018-08-19 11:37:54.653000+00:00
['Türkçe', 'Sol', 'Sağ', 'Ifade Özgürlüğü', 'Politika']
Home Deep Cleaning Services in Mundhwa, Pune
Cleaning services in Mundhwa S3 Care Services — All types of Cleaning Solutions for your Home and Office For being healthy, a hygienic environment is essential in your home. With the help of cleaning professionals, you can deep clean your house correctly and creates a completely sterile environment. If you are in Pune and looking for the best top-class home cleaning services, there are a lot of professional cleaning services available in Pune. S3 Care Services is a renowned company in cleaning services and registered as the top-rated cleaning services company in Pune for many years. Components of house/home cleaning service Cleaning of Bathroom including tub, mirrors, toilets, bathroom baseboard, hand wipe, shower tracks, and the exterior part of washrooms Mop down the surface, vacuum all rooms, clean doors, windows, and sliding glass doors. Kitchen cleaning includes cleaning the sink, stovetop, table, interior and exterior of microwave, cleaning kitchen walls and surfaces that become oily and dirty, and body of refrigerator and oven. Cleaning of ceiling fans, walls, and floor surfaces of the whole house. Services Covered by us in Home Cleaning. Floor Cleaning: We use disinfecting chemicals and machinery equipment to remove the deep layer of dust from the floor surface. We use disinfecting chemicals and machinery equipment to remove the deep layer of dust from the floor surface. Cleaning of Toilet and Bathroom: We do deep scrubbing of bathrooms, disinfect & sanitize it properly. We do deep scrubbing of bathrooms, disinfect & sanitize it properly. Clean All Surfaces: Its surface, including steel, wooden, and glass of furniture and floor. Its surface, including steel, wooden, and glass of furniture and floor. Walls & Ceiling: In the entire house, we remove dust from all the walls and ceiling. In the entire house, we remove dust from all the walls and ceiling. Kitchen Cleaning: Clean all the parts of the kitchen, including all furniture and electrical textures. Clean all the parts of the kitchen, including all furniture and electrical textures. Window Cleaning: All of all the glass doors and windows in the house with proper care. All of all the glass doors and windows in the house with proper care. Other Cleaning:Cleaning of electrical appliances, ceiling fans, switchboards, etc. https://www.s3deepcleaningpune.com/home-and-office-deep-cleaning-services-in-mundhwa.php
https://medium.com/@jayeshos00/home-deep-cleaning-services-in-mundhwa-pune-f8bdc9c50afb
[]
2021-12-24 13:03:45.572000+00:00
['Cleaning', 'Cleaning Services', 'Cleaning Company']
The New Neighbor
Through her window she could see two men stepping out of a moving truck. they went to the back, opened the truck and started un-loading. Suddenly a girl came running out of the car that had just arrived. She dashed to the truck and grabbed a small box out of one of the men's hands. Lou could see the girl yelling at the two men but she couldn’t hear what they where saying. she was about to step away from the window when a man and a woman both of them wearing hoods came running out of the car and garbed the girl’s arm. the girl started crying as the woman dragged her to the old house next to Lou’s. the man apologized and helped the two men move. Lou had never seen the man or woman's face. come to think of it- she hadn't seen any of their faces! Lou had the strangest feeling something was going on. Brushing it off she walked to the kitchen to make some cookies. Her hobby was baking. she always baked when she was stressed. Lou lived in a little blue house. it had a first story but it also had a small up-stairs witch was her bedroom. she was a red-head. she had long hair that fell right at the middle of her rig cage. today she was wearing a big baggy yellow hoodie with fish-nets and a black skirt that fell right above her knees. an hour later she went over to the white house next-door. It had what looked like two stories when it was really three. there was a large middle floor and a large basement but a normal sized second floor. Lou rang the doorbell and waited. a few seconds later a woman opened the door a crack. she was wearing a fancy tight dress. she had straight black hair that went right down to her shoulder blades “What.” She growled “Hi! I am your new neighbor. My name is Lou and I brought home-made cookies!” Lou said anxiously… trembling. Lou was never good at talking to anyone new. “okay.” the woman said looking Lou up and down. She handed the woman the box of cookies, and the woman took them! “give me a second…” the woman said still eyeing Lou. She shut the door quietly. Lou was a curious girl and wasn’t raised in a ‘Normal’ environment she had grown up in a old shabby house because right after she was born her mom left. years later her father died so she was left there as a small child of only eight soon after on her ninth birth day she was found by police who heard people talking about always hearing sounds from her house. She was sent to an orphan edge. all the kids there where very kind to her witch is what hade her herself. so, being the curios girl she was she looked through the window and saw the woman opening the lid of the cookie box and taking a bite. the woman's face lit up and turned to the door. Lou rushed to the door and pretended nothing happened. The woman opened the door so wide and smiled a huge smile. “Welcome to our house!” the woman said sweetly “W-What?” Lou said nervously “Come in side!” the woman said pushing Lou in “Agatha! Tomas! Get down here!” Lou was standing there, wondering “what is going on?” A man with a black suit on came from the other room then from upstairs a girl -who seemed to be the same age as Lou- came down in a long dress she had blue eyes and long black braided hair just like her mother. “Dear, please welcome Lou to our house.” the woman said smiling at the girl. The girl said “But mother. Might we want to introduce ourselves to the guest?” the woman looked at the girl “Uh- Yes, Yes. Of cores. my name is Emma.” The woman said “I’m Agatha.” the girl said “I’m Tomas.” the man said “I’m Lou- Wait wait wait wait wait. 
Can someone explain what's going on?” Lou said confused “oh uh- yes of cores.” Emma said “So- we are having guests over tonight but-” Agatha interrupted “-we don’t have a baker witch we need to make dessert.” Agatha looked very proud for knowing what her mother was going to say “I-I don’t know how to make a big meal!” Lou spitted out. “Oh we know. Agatha said in a snooty sounding voice “We just wanted you to make desert.” “Oh well I could…” Lou said “-Great! Agatha! Go get her changed and show her the baking area!” Agatha pushed Lou toward a closet and shoved her in. “Get changed into this” she grabbed a outfit from the stool next to her and gave it to Lou. then shut the closet door. Lou came out a few minutes later wearing tight jeans and a black tank-top. Her hair was in a loose bun. “Okay! now. Bring her to the room.” Emma said “Right.” Agatha grabbed Lou’s arm and brought her to a normal looking kitchen. “In there…” Agatha pointed to a small door that Lou -if she wanted to go in- would have to crouch down. “is where you will find everything you need.” Agatha threw the piece of paper she had in her hand to Lou. “Take this” she said “what is it?” Lou asked “a list of what you need to make-” Agatha looked at Lou “D’ya think you gonna be able to make it all?” She said in a sweet sounding voice when Lou knew she meant it in a mean way. “I think I’ll be okay.” Lou said through the gap between her teeth that was clenched. before she Left she said “Careful not to mess it up.” Agatha walked off with a smile on her face. “‘Careful not to mess it up’ yeah, yeah I know.” Lou though as she looked at the list “okay so for first cores of desert… FIVE CORSES OF DESERT?! well lets get started…” Lou first decided to make the final part first so that it could bake. she made chocolate, vanilla, and strawberry flavored cake mix- “a three layered cake- Jesus.” She kept going as she put the biggest pan on the oven with the strawberry flavored cake mix. she got out a slightly smaller cake pan and put the vanilla in it and left it to bake right before putting in the chocolate cake mix she took out the strawberry cake. it looked beautiful! she put the chocolate cake mix in and started to make the icing and just then the oven beeped- she took out the chocolate and vanilla cake and they where so beautiful. She made the icing and let the cakes cool then put the icing in the fridge and worked on the other cores. course one was what looked like a donut but no hole with Jelly filling. desert two was the best one in Lou’s opinion. it was small root beer floats in wine glasses. the third was small cakes with small sweets on them. the fourth was simple cookies. and the fifth was a giant three-layered cake with icing all over it. all throughout baking Lou would hear the doorbell ring and then hear Emma say “Coming!” one time she said “Agatha please open the door!” Lou worked and worked and when she was finally done Agatha came in with three kids hanging onto her. “I need help-” she said “What happened?” Lou said rushing over “I told them they could have some sweets” Agatha admitted “‘thank god I hid all the deserts’” Lou though. “Well- maybe I can get some candy for them give me a sec” Lou went through the door and looked through the candy filled shelves and dispensers then saw a bowl of sweets that had a note on it Lou picked it up and read it. Dear Lou. This is Agatha speaking. I knew that the kids would want sweets so I want you to take this bowl. 
yeah I did put this in here while you where changing yes I know its kind of weird but whatever. just bring these out. Lou put the note down and went out to give the kids the bowl of sweets. “Here you three go.” the three took the bowl and walked of calmly for this was an adult party that they had to come to. “Thanks” Agatha said “no problem” Lou said trying to sound like she wasn’t mad at Agatha for earlier “So looks like you haven't started.” Agatha said looking over at the three ovens. “Oh no i finished. i just hid them.” Lou said but Agatha looked puzzled. “What is it?” Lou asked with a concerned look on her face “Why would you hide them? wouldn’t you want to show it off?” Agatha asked “Wa?” Lou said puzzled “I guess your not like our other bakers” Agatha said her ears where slightly pink. “Sorry.” Agatha said quickly looking at the ground her ears turning pinker “I forgive you.” Lou said lifting up Agatha’s chin “Uh- Lou-” Lou looked away and heard foot steps, whispered to Agatha “stay here” “Okay?” Agatha said confused. by the time the footsteps got there the two where side by side waiting for the door to open “Hello girls.” Emma said “Dinner will be ready soon-” She stopped, looked Lou up and down and made a disgusted face “You- you aren’t wearing that-” she pointed it Lou’s clothes “-to dinner- are you?” Lou looked down at her now flour and stain filled clothes “Uh- No she isn’t!” Agatha stepped forward “I have something she can wear-” Agatha said “Fine just get ready- Dinner is in ten” Emma said Agatha pushed Lou up the stairs and into her room it had a bed like in harry potter With one bed side table and a trunk. There was a small bookshelf with full shelfs and on top the box that Lou had seen the girl taking out of the truck. Lou looked to the other side of the room a balcony- a desk right next to the door with what looks like school supplies- “Okay” Agatha says interrupting Lou’s stare “Sit down- give me a second” Lou sat down on the bed and watched Agatha go to the trunk and take out a high wasted pair of black suit pants. next she took out a button up shirt and set it down next to the pants. Finally she took out a vest, closed the trunk and walked over to Lou with everything in hand and gave it to her “Okay so i think you know how to put this stuff on I’ll be on the balcony.” Agatha walked to the balcony and Lou got changed. “Agatha I-” Lou said as the walked onto the balcony “No before you start” Agatha interrupted “I know we just met but-” Agatha cut off “I LIKE YOU-” Agatha and Lou said and the same time they both looked shocked but understood at that moment it t’was meant to be. the devised a plan- Agatha would pretend to be dead, escape through her bedroom window and wait at Lou’s house. Later that night when Lou went home she would run away with Agatha. Lou and Agatha went down stairs right when Emma was going up “Oh- Agatha, Lou- Go to dinner” Emma said. Agatha and Lou looked at each other but kept walking They sat down and after dinner Emma said “Lou- Go get the desserts.” Lou stood up but before she squeezed Agatha’s hand and went to the room. A little while later she came back with the first course. Everyone loved it. After the fourth course Agatha knew it was almost her time Lou served the fifth course and they where all so happy Emma even offered Lou a job- but Lou refused for their plan would not work if she had a job in that house. Agatha pushed out her chair and went to the bathroom. 
after a few minuets Emma got worried and went to check on Agatha She screamed and everyone came running. they all saw Agatha’s dead body in the shower. Lou stepped forward and Emma said “You. YOU KILLED HER” she screamed. josh held her back and Lou said “Why would I kill her?” I loved her!” Lou blurted out everyone fell silent Emma said “Oh- I’m- I’m sorry” She flopped to her knees and said “I- I’m going to go lie down” she stood up and headed upstairs. Lou picked up Agatha’s body and placed it down on her bed. When everyone had gone Agatha squeezed Lou's hand. A while later Josh got up “I am going to go check on my wife.” he said as he walked away. we heard a thump and ran upstairs and saw Josh on his knees in front of his Lifeless wife.
https://medium.com/@murdermystery/the-new-neighbor-863948e18b71
['Louie Pingree']
2021-02-22 19:33:02.170000+00:00
['Murder Mystery', 'Suspense', 'Lovestory', 'LGBTQ']
Linear Regression Using Boston Housing Dataset
Overview Boston Dataset is the information collected by U.S census Service concerning housing in Boston city.The data was originally published in 1978 containing nearly 500 samples.We can easily access this data with the help of sklearn library .Our main aim would be to predict housing price based on features present in dataset.let’s get started…… Step 1. Importing Libraries and Acquiring Dataset Importing libraries import numpy as np import pandas as pd import matplotlib.pyplot as plt import seaborn as sns from sklearn.datasets import load_boston from sklearn.preprocessing import StandardScaler from sklearn.model_selection import train_test_split Acquiring Dataset boston=load_boston() type(boston) [0ut]>> sklearn.utils.Bunch The type of boston data is utils.Bunch . sklearn stores data in the form of dictionary like object and Bunch consists of four keys through which we can understand data more precisely i.e:-{data,target,feature_names,DESCR}. i. data: It gives information about features through which we can do our prediction it excludes target variable. -------------------------------------------------------------------- boston.data[0] #printing first row of dataset [out]>>array([6.320e-03, 1.800e+01, 2.310e+00, 0.000e+00, 5.380e-01, 6.575e+00,6.520e+01, 4.090e+00, 1.000e+00, 2.960e+02, 1.530e+01, 3.969e+02,4.980e+00]) -------------------------------------------------------------------- boston.data.shape [out]>> (506, 13) ii. target: It will print data present in target variable. ------------------------------------------------------------------- #reshaping the target into 2d array so that it would be easily #converted into dataframe further target=np.array(boston.target).reshape(-1,1) target.shape [out]>> (506,1) -------------------------------------------------------------------- iii. feature_names :- It will print all the features(excluding target variables) present in dataset. boston.feature_names [out]>> array(['CRIM', 'ZN', 'INDUS', 'CHAS', 'NOX', 'RM', 'AGE', 'DIS', 'RAD','TAX', 'PTRATIO', 'B', 'LSTAT'], dtype='<U7') iv. DESCR: It gives complete description about dataset including basic definition of features, what does each feature means which feature would act like target variable , is there any missing value present in dataset,Source of dataset ,author of dataset etc. Figure 1 Next step would be to convert dataset into dataframe so that we can preprocess and visualize data and will find suitable feature for prediction. #Converting dataset into dataframe boston_df=pd.DataFrame(boston.data,columns=boston.feature_names) boston_df["Price"]=boston.target #changing target variable name as #price boston_df.head() #printing first five rows of dataframe Figure 2 Step 2. Data Pre-Processing i. Printing shape of dataframe boston_df.shape #checking number of rows and columns present in data [out]>> (506, 14) ii. Printing all the features present in dataframe & data type of each column ------------------------------------------------------------------- boston_df.columns [out]>>Index(['CRIM', 'ZN', 'INDUS', 'CHAS', 'NOX', 'RM', 'AGE', 'DIS', 'RAD', 'TAX','PTRATIO', 'B', 'LSTAT','Price'],dtype='object') ------------------------------------------------------------------- boston_df.dtypes #Printing data type of each columns Figure 3 iii. Is there any null value present in dataset? boston_df.isnull().values.any() [out]>> False #There is no any null value present in dataset iv. Statistical Description boston_df.describe() Figure 4 Step 3. Data Visualization i. 
We will first look at the distribution of the target variable boston_df.Price to check whether it is approximately normally distributed. plt.figure(figsize=(12,6)) sns.set('notebook') sns.distplot(boston_df.Price,bins=20) plt.show() Figure 5 The figure above shows that boston_df.Price is roughly normally distributed, with some outliers. We can also see that most houses were sold within the price range of $20000-$24000. ii. Now we will plot each feature against the target variable to see whether it has a linear relationship with the target or not. for i in x.columns: #plotting each feature against the target plt.scatter(x[i],y) plt.xlabel(i) plt.ylabel("Price") plt.show() figure 6(a) figure 6(b) Analysing the figures above, we can conclude that NOX, RM, DIS, LSTAT, and AGE show roughly linear behaviour. Hence we can say that these features are important for predicting the housing price. iii. Checking multicollinearity using a heat map: a. While solving a linear regression problem, the features should be independent of one another; if there is strong correlation between two independent variables (features), it can lead to overfitting, so with the help of the heat map we will eliminate features that are strongly correlated with one another. b. We also try to extract the features that have a strong correlation with the target variable, because a feature strongly correlated with the target should give better predictions. plt.figure(figsize=(12,6)) corr_val=boston_df.corr() sns.heatmap(data=corr_val,annot=True) Heatmap Extracting all the features that are highly correlated (threshold value=0.5) with the target variable. #Extracting all features which are highly correlated with the target def HighlyCorrelated(data,threshold): feature=[] values=[] for index in data.index: if abs(data[index])> threshold: feature.append(index) values.append(data[index]) df=pd.DataFrame(data=values,index=feature,columns= ["Correlation"]) return df threshold=0.5 corr_df=HighlyCorrelated(corr_val.Price,threshold) corr_df output Here RM, PTRATIO, and LSTAT are highly correlated with the target variable Price. Hence these variables should give better predictions. Step 4. Detailed overview of the model using O.L.S O.L.S does not do prediction but gives a summary of the model. By calling summary() we get to know the various parameters that drive the linear regression model, i.e. β0, β1, p-values, R squared, etc. #As ols is statistical modeling hence first of all we need to import #it from statsmodels import statsmodels.formula.api as sms reg=sms.ols(formula="Price~CRIM+ZN+INDUS+CHAS+NOX+RM+AGE+DIS+RAD+TAX+PTRATIO+LSTAT",data=boston_df).fit() print(reg.conf_int()) print(reg.summary()) O.L.S a. In the summary table we have coef, which contains the Intercept (β0) and a β1 coefficient for each variable we include. Let's understand this with an example; I would like to plot the prediction of ŷ based on LSTAT. Here we have β0 (taken from the confidence interval table) = 31.6173 and β1 = -0.5520 (the coef of LSTAT). Now we will build the model from these inputs, i.e. y_hat = β0 + β1*LSTAT = 31.6173+(-0.5520)*LSTAT #Plotting linear regression LSTAT=boston_df["LSTAT"] plt.scatter(LSTAT,boston_df["Price"]) y_hat=31.6173+(-0.5520)*LSTAT plt.plot(LSTAT,y_hat,lw=5,c="green") plt.xlabel("LSTAT",fontsize=20) plt.ylabel("Price",fontsize=20) Figure 7 The next thing in the table is the p-value.
If the p-value is 0.000 (i.e. very close to zero), the result is statistically significant and we reject the null hypothesis that the coefficient is zero. b. Here we have R squared and adjusted R squared; with the help of O.L.S we can see which is the better measure of the model's fit. c. t represents the Student's t-statistic of each feature and z represents the z-statistic. print(reg.rsquared," ","*"*100," ",reg.rsquared_adj) [out]>> 0.7343070437613076 ******************************************************************** 0.7278398724126984 #here r squared is a better fit to check the accuracy of the model instead of adjusted r squared Step 5. Modeling and Prediction i. RM, LSTAT, and PTRATIO are strongly correlated with Price, hence we will build our model on these features only. x =pd.DataFrame(boston_df.loc[:,["RM","LSTAT","PTRATIO"]]) y=boston_df.loc[:,"Price"] y=np.array(y).reshape(-1,1) print(x.shape) print(y.shape) [out]>> (506, 3) (506, 1) ii. Splitting the dataset into training and testing sets Here I have taken 80% of the dataset as the training set, and with test_size=0.2 the testing set is 20%. random_state takes an integer that seeds the split, meaning that if we use the same dataset on a different system with the same train and test set sizes we will get the same split and hence the same accuracy. x_train,x_test,y_train,y_test=train_test_split(x,y,test_size=0.2,random_state=1) print("shape of training set of X:",x_train.shape) print("shape of training set of y:",y_train.shape) print("shape of testing set of x:",x_test.shape) print("shape of testing set of y:",y_test.shape) [out]>> shape of training set of X: (404, 3) shape of training set of y: (404, 1) shape of testing set of x: (102, 3) shape of testing set of y: (102, 1) iii. Training and Testing We will use LinearRegression to fit the model on the training set and then evaluate it on both sets. from sklearn.linear_model import LinearRegression lrm=LinearRegression() lrm.fit(x_train,y_train) Training set: y_train_pred=lrm.predict(x_train) df1=pd.DataFrame({"Actual_train":y_train.ravel(),"Predicted_train":y_train_pred.ravel()}) df2=df1.head() df2 Figure 8 df2.plot(kind="bar") plt.show() figure 9 Test set: y_test_pred=lrm.predict(x_test) df3=pd.DataFrame({"Actual_test":y_test.ravel(),"Predicted_test":y_test_pred.ravel()}) df4=df3.head() df4 Figure 10 Now we will plot the actual values and the predicted values. After plotting we will see the variation between them. df4.plot(kind="bar") plt.show() Figure 11 From the figure above we can see there is not much variation between the predicted values and the actual values, hence we can say our model predicts reasonably well. iv. Model Evaluation Model evaluation estimates how well the model generalises to unseen data. It also tells us whether our model is overfitted or not. An overfitted model works well on the training dataset but does not give good predictions on unseen, real-world data, so if there is not a large difference between the accuracy on the train and test sets we can say that our model is ready for deployment.
Training set: from sklearn.metrics import r2_score from sklearn import metrics print("MSE:",metrics.mean_squared_error(y_train,y_train_pred)) print("MAE:",metrics.mean_absolute_error(y_train,y_train_pred)) print("RMSE:",np.sqrt(metrics.mean_squared_error(y_train,y_train_pred))) print("R_squared:",r2_score(y_train,y_train_pred)) Figure 12 Test Set: print("MSE:",metrics.mean_squared_error(y_test,y_test_pred)) print("MAE:",metrics.mean_absolute_error(y_test,y_test_pred)) print("RMSE:",np.sqrt(metrics.mean_squared_error(y_test,y_test_pred))) print("R_squared:",r2_score(y_test,y_test_pred)) Figure 13 R squared is fairly close to 1 and there is not much difference between the R squared values of the train set and the test set, hence we can say the model is not overfitted. Conclusion Drop a comment if you have any queries or suggestions. Keep visiting for more machine learning content.
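As a quick follow-up to the conclusion, here is a short sketch (not part of the original walkthrough) of how the fitted lrm model above could be used to price a new listing; the feature values for the hypothetical house are invented purely for illustration.

# A hypothetical house: rooms per dwelling (RM), % lower-status population (LSTAT),
# and pupil-teacher ratio (PTRATIO). These numbers are made up for the example.
new_house = pd.DataFrame({"RM": [6.5], "LSTAT": [8.0], "PTRATIO": [17.0]})

# lrm is the LinearRegression model fitted on x_train, y_train above
predicted = lrm.predict(new_house)

# The Boston target is the median home value in $1000s
print("Predicted median value: $", round(predicted.item() * 1000, 2))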
https://medium.com/ai-in-plain-english/linear-regression-using-boston-housing-dataset-e4c2ee891b02
['Akhil Anand']
2020-11-21 13:00:36.110000+00:00
['Programming', 'Data Manipulation', 'Machine Learning', 'Linear Regression', 'AI']
There Can Be Only One: The Singleton Design Pattern
There Can Be Only One: The Singleton Design Pattern When to use the Singleton Pattern? The Singleton Pattern is used when you only want one instance of an object at a time. An example of when this pattern is useful is an access point to a database 💾. In an app, it is beneficial to have one entry point to the database as opposed to multiple entry points (which is what would happen with a regular class with a public constructor). In the image above, we have three users asking for a box 📦; our box service gives them pointers to the same instance instead of creating new ones for each of them. This is useful if creating a “box” is an expensive or long task ⏱️; with a singleton, we only need to create one. Case Study We will create a small app with a DBAccessor class, of which we only want one instance to exist at a time. We will have 2 clients request the accessor and confirm that they are both using the same one. The Singleton Here is an example of a Singleton class: we get rid of the default constructor by explicitly declaring a private constructor so it cannot be called 🚫 outside the class. If a client wants to get the instance of DBAccessor, they will have to use the getInstance() method. Here we create two dbAccessors which actually point to the same object. Query number 1 Query number 2 When we query the two “different” objects, we are actually querying the same one, so it increments to 1 and then 2. If we got different instances, we would have seen… Query number 1 Query number 1 The Singleton Pattern is one of the simplest and most easily understood design patterns: it is clear how to implement it, and when and why you would want to use a singleton over a regular object with a public constructor. A minimal sketch of what the DBAccessor class described above might look like follows below.
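Since the original post presents its code as images, here is a small, hedged sketch of what the DBAccessor singleton described above could look like in Java; the class and method names follow the article, while the query-counting body and the SQL strings are assumptions made for the example.

public class DBAccessor {
    // The single shared instance, created lazily on first use
    private static DBAccessor instance;
    private int queryCount = 0;

    // Private constructor: the class cannot be instantiated from outside
    private DBAccessor() { }

    // The only way for clients to obtain the accessor
    public static DBAccessor getInstance() {
        if (instance == null) {
            instance = new DBAccessor();
        }
        return instance;
    }

    // Illustrative query method that counts how many times it has been called
    public void query(String sql) {
        queryCount++;
        System.out.println("Query number " + queryCount + ": " + sql);
    }

    public static void main(String[] args) {
        DBAccessor a = DBAccessor.getInstance();
        DBAccessor b = DBAccessor.getInstance();
        a.query("SELECT * FROM boxes");  // prints "Query number 1 ..."
        b.query("SELECT * FROM users");  // prints "Query number 2 ..." because state is shared
        System.out.println(a == b);      // true: both references point to the same instance
    }
}

Note that this lazy getInstance() is not thread-safe; in real code a synchronized accessor or an eagerly initialised instance is a common refinement.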
https://levelup.gitconnected.com/there-can-be-only-one-the-singleton-design-pattern-bf1d47e43a24
['Matthew Macfarquhar']
2020-11-16 04:36:49.871000+00:00
['Programming', 'Software Development', 'Java', 'Design Patterns', 'Singleton']
What does it take to be Vulnerable Again?
Once bitten, twice shy. Right? Well, maybe. Have you ever had a time in a relationship where you’ve put in everything you’ve got, only to be pushed out of it every time you try to settle in? Have you ever had to end a relationship with a silent fight, without tears & all the yelling, and pick up the pieces of yourself shattered on the ground that the relationship broke you into? And then maybe, a million sunrises later, you find someone again who you want to see in a light like you saw someone in your life before, but not quite that sort of a light. But similar, though. And now that you’re finding someone else again, real enough to build trust again, you’re questioning the whole thing? Well, Ryan Tedder (frontman of OneRepublic) wrote a beautiful song called Halo, just about all of this. And this is our SongOfTheWeek. Beyoncé included this song in her album I Am... Sasha Fierce and helped this song be seen in a new light. But today we’re more concerned with the content of the song than the one who sang it.
https://medium.com/songoftheweek/what-does-it-take-to-be-vulnerable-again-510103e96021
['Kalpesh Mange']
2018-05-10 20:21:22.098000+00:00
['Hope', 'Love', 'Music', 'Halo', 'Pop']
5 Tips for Giving Effective Feedback to Students
Giving feedback to students is tricky. That’s why I began researching how to give effective feedback. What did I find? Here are five helpful feedback tips for my fellow educators. 1. If in doubt about timing, give immediate feedback When you give feedback is important. Generally, there’s immediate feedback and delayed feedback. “Immediate feedback” must directly follow the completion of a task. And this kind of feedback works best for the retention of verbal, procedural, and motor skills. It’s also best if the task is new, particularly difficult, or you’re working with lower-achieving students. 2. Use a variety of feedback methods Students prefer verbal feedback, but there are unique benefits to written and video feedback. That’s why it’s a good idea to mix up your feedback methods. And don’t forget to experiment with new feedback technology. In a comprehensive literature review of feedback research, Dr. Valerie J. Shute encourages professors to “exploit the potential of multimedia . . . consider alternative modes of presentations (e.g., acoustic, visual).” 3. Reference student outcomes. Always try to reference assignment goals or course outcomes in your feedback. Doing so makes your feedback more specific and minimizes any signs of bias. It also gives students a macro view of their learning and allows them to track their progress. 4. Don’t compare students in your feedback. Whenever you get egos involved, learning suffers. That’s why educators should never compare students when giving feedback. 5. Give feedback first and delay the accompanying grade. An interesting study by Dr. Ruth Butler in 1998 suggests that feedback is most effective on its own — sans grade. So consider giving feedback first and delaying the accompanying grade. Or just do more feedback-focused exercises with drafts or activities.
https://medium.com/goreact-easy-video-feedback/5-tips-for-giving-effective-feedback-to-students-d77b1a83d0a9
['Chad Jardine']
2019-07-01 22:46:01.166000+00:00
['Teaching', 'Edtech', 'Education', 'Feedback', 'Videos']
Introduction to MongoDB: A NoSQL Database
What is a Database? At home you have a refrigerator or fridge to store food, veggies and other items; in the electronic/digital world the food is the data and the refrigerator is known as the database. What is a NoSQL database? Before NoSQL, let's understand what a SQL database is. A SQL database has a defined structure to store data or information in the form of a table (rows and columns), just like your refrigerator's cabinets (well-defined spaces). SQL Database example From the figure above it can be noticed that all the data is in row-column format and none of the data is null. If any value is missing or empty we have to write null there. In a NoSQL database there is no fixed structure; that is, there is no fixed number of columns, and the number of fields can be different for each individual entry or row. MongoDB is a NoSQL database. Here are some points to note: Just like a "table" in a SQL database, MongoDB has "collections" where all the information is stored. Where a table has rows and columns, a collection has "documents" that contain key-value pairs. The column is called a key here. For example: the column name is "City" and for the first row the value is "Kanpur". This can be written as { "City" : "Kanpur" } for the first document (first row). We will talk about it later. The document is stored in JSON format. brief To download MongoDB, click here, and for an installation video tutorial, click here. Let's do some practical work: To start MongoDB, first set the path in cmd, then type "mongod". Then open another cmd and type "mongo". First Step Second Step Don't close any window till our operation completes. We can create multiple documents at a time using: db.datatype.insert([{document1},{document2},{documentN}]) OPEN THE IMAGES AND ANALYSE WHAT IS HAPPENING. ALSO READ THE PARAGRAPHS AND LINES WRITTEN OVER THERE. We learnt how to insert documents into a collection. Now we will see how to find a required document in the list of documents. What if we have a condition like less than, not equal, or greater than? Then we have to use operators: take the first letter of both words in "less than", i.e. l and t, so we write $lt, and so on: greater than = gt, less than = lt, not equal = ne, less than or equal to = lte, etc. Now, how to delete collections, documents and the database too: before deleting the database, first run the command "db" to check the current db. A consolidated example shell session covering insert, find and delete is sketched below. Next time we will cover methods and functions. If something is wrong please comment down below. Thank you for reading!!!
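Since the original walkthrough relies on screenshots, here is a small, hedged mongo shell session illustrating the operations described above; the database name, collection name and field names are assumptions chosen just for this example.

// switch to (and implicitly create) a database
use cityDB

// insert multiple documents at once; the fields can differ per document
db.places.insert([
  { "City": "Kanpur", "Population": 2900000 },
  { "City": "Pune" },
  { "City": "Delhi", "Population": 19000000, "Capital": true }
])

// find all documents, then find with a condition ($lt = less than)
db.places.find()
db.places.find({ "Population": { "$lt": 5000000 } })

// delete matching documents, drop the collection, then drop the whole database
db.places.remove({ "City": "Pune" })
db.places.drop()
db              // check the current database before dropping it
db.dropDatabase()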
https://medium.com/@sunnykkc13/introduction-to-mongodb-a-nosql-database-b076cb8bb188
['Naveen Singh']
2020-12-25 10:46:49.218000+00:00
['Nosql Database', 'Database', 'Mongodb Tutorial', 'Cli', 'Mongodb']
Arguing with Edward Snowden
Arguing with Edward Snowden A Data Scientist’s take on defending Machine Learning models Introduction I’ve recently read Edward Snowden’s Permanent Record during my holiday. I think it is a great book that I highly recommend for basically anyone, however it is particularly interesting for IT-folks for the obvious reasons. It is a great story about a guy growing up together with the internet, starts to serve his country in a patriotic fervour after 9/11, and becomes a whistleblower when he notices the US has gone too far violating privacy in the name of security. Moreover, a paradox I found most interesting is something a Data Scientist can easily relate to. The systems that collect data about one’s browsing on the internet (basically anything you do on the internet) was an engineering masterpiece. It surely did something the NSA has no mandate for, but when building something brilliant, it can be easy to miss the big picture, and help malignant actors by handing over great tools for them. Think about this in terms of machine learning! I am quite sure — although I cannot know — that the Chinese Social Credit System’s mass surveillance network makes use of some state-of-the-art Deep Learning concepts. Or they even do some things more brilliantly, than publicly available research can suggest. But the system it is used for at best raises some super serious questions about individual rights. Being IT professionals, we cannot miss the big picture and have to be mindful of what is the consequence of our work! When discussing the threats of massive private data collection done by governments, Mr Snowden makes some controversial statements about data science as well. Firstly, he incorrectly suggests that machine learning models in general are total black boxes, and their decisions cannot be explained afterwards — thereby making a point that algorithms make obscure decisions that people should make in a transparent way. Secondly, he states that recommendations are just ways to put pressure on an individual to buy popular products. I aim to argue against both of these statements. Model explainability There is an example in the book about COMPAS, a widely used risk-assessment algorithm in the judicial system of the USA. In this case, the point is that an algorithm made a decision having a substantial effect on someone’s life — and neither we nor the algorithm can even explain why so. I think this is an inherently wrong and ill-disposed point of view. There are models which are explainable by nature which is one of the main reasons practitioners use them. If you think about linear regression for a regression problem: the product of the feature value and the corresponding beta gives you the amount this feature contributed to the prediction of the target value. In a classification problem, widely used logistic regressions behave almost the same way, as they are a special linear regression. A decision tree produces the exact series of decisions the algorithm learned to be useful determining the target value. However, bagging and boosting algorithms make use of numerous trees built simultaneously or sequentially (Random Forest, Xgboost, Extremely Randomized Trees, LightGBM, etc). These voting trees, or high-dimensional data in case of Support Vector Machines are harder to interpret or visualize concisely. Moreover, in a deep Neural Network millions of matrix multiplication take place — keeping track of it sounds intimidating of course. 
However, there are several methods that make these algorithms more transparent. This article shows how deep neural networks can be more interpretable in breast cancer research. Another great article elaborates on different model explainability techniques: permutation importance, partial dependence plots and SHAP-values. There is room for improvement in non-technical human readable interpretations of complex machine learning models, but there are techniques to explain why a model predicted such an output. Consequently, if a model is not explained well, it is almost certainly arising from an omission or failure of a human actor. On top of that, algorithms being biased in terms of socio-economic factors is an accusation appearing increasingly often. It is important to note again, that this a failure of the modeler not the model. The data these models are trained on are reported to contain bias — accounting for that is a challenging task we as modellers surely have to overcome. Luckily, the theoretical foundations and tools are there to assess the “fairness” of algorithms for example, between two racial groups. Advertisements and recommendations A second argument I did not like in the book was about recommendations in general. The author states that recommendations are just about softly pressuring the customer to buy what others did buy. I think this argument misses the real point here. There are advertisements everywhere. I would certainly agree that advertisements — apart from conveying information about a product — are means of putting some pressure on the subjects to buy a product. Nonetheless advertisements are natural and necessary in a market-driven economy, and in a world packed with so many products and services. But if we accept the premise, that some sort of advertising is going to exist, which one would you prefer? The one with no personalization whatsoever, or one where the advertisers’ goal to make you buy happens to cause that you are getting more relevant advertisement about products that you may actually need? I’d prefer the latter. A sophisticated recommender system takes your history into account, along with other people’s history that have a similar record to you. If done right, they are much more than just popular product recommendations. Conclusions In general, I really liked the book. I also admire the bravery of Mr. Snowden that started a discussion about privacy, and the trade-off between privacy invasion and crime prevention. But I also think that the book expresses a negative attitude towards everything in connection with using large amounts of data. Opposing this, I believe that statistical models built on top of massive datasets can greatly benefit humanity — if used for the right purposes, transparently and responsibly.
https://medium.com/starschema-blog/arguing-with-edward-snowden-2e21d553c056
['Mor Kapronczay']
2020-03-12 14:42:16.869000+00:00
['Machine Learning', 'Recommendation System', 'Data Science', 'Edward Snowden']
How to connect vMix
Yellow Duck is an application that will allow you to live stream to Instagram from your PC machine via vMix encoder. Yellow Duck gives you the ability to get your Instagram RTMP URL and stream key, which allows you to stream from external devices. Launch vMix and click the gear icon next to Stream button in the bottom of main window. 2. Use “Yellow Duck” as a profile name. Choose “Custom RTMP Server” as Destination. Copy your RTMP URL and stream key from Yellow Duck application to your vMix URL and Stream Name or Key fields accordingly. 3. Click the gear icon next to Quality field. Set Bitrate to 2000. Change scene resolution in Encoder Size field to 720x1080 as it’s default resolution of Instagram streams. Set Audio Bitrate to 128. Set Profile to Main. Save all changes and close all dialogue windows. 4. Make sure to rotate all your sources clockwise once so it’s more convenient to watch your stream on a mobile device. For doing this, click Configure various input settings button with the gear icon next to your source. Set rotation to “-1.5708463” and zoon to “0.565”. To start stream click Stream button in the main vMix window. You will need to update the stream key in stream settings before each broadcast and start stream only while Yellow Duck application is running.
https://medium.com/yellowducktv/how-to-connect-vmix-dbb3c43aa7da
['Yellow Duck']
2019-07-24 14:05:30.284000+00:00
['Services', 'Instagram', 'Live Streaming', 'Vmix', 'Facebook']
The Difference Between Brand Ambassador and Promoter
Brands and businesses at all levels, from a small local brand to the current global economic giants, need effective advertising and the right marketing strategies to promote their brand and sales of their products. Today, with the rapid expansion of technology and the use of cyberspace, marketing, and advertising methods have become even wider. And in a situation where many people prefer to make their purchases with the advice and use of the experience of others they trust, brands are becoming more and more willing to use more people in their marketing strategy who have the power to lead and influence others. And we are seeing the proliferation of brands using people with influential traits such as social media influencers, bloggers, brand ambassadors and brand promoters to promote themselves and increase sales of their products. Brands can choose their best methods based on their marketing strategy. Here we are going to introduce you to two common types of brand advertising methods Brand ambassadors Promoters And examine their differences and similarities with each other. What is a brand ambassador? Brand ambassadors educate people about the brand benefits they are working with. They must be enthusiastic, energetic, and intimate, and they must be able to best describe the brand to their target customers and the community. To this end, the brand ambassador must have enough information about the product he wants to present, whether through training, he sees from companies or he is a real consumer of the brand product and speak to its audience honestly about the benefits and features of the product. The best examples of brand ambassadors are celebrities who support and endorse a particular brand, thus using their popularity and acceptance among the people to increase the popularity of a brand or product. Brand ambassadors do not only include celebrities with a large number of followers on social media, but also a bodybuilder on social media platforms with a much smaller number of followers than celebrities when it offers a specific supplement to its audience and tells them about the benefits of the product and encouraging them to use this brand, he is working exactly as a brand ambassador. Big brands usually use celebrities as their brand ambassadors and spend a lot of their marketing budget. But smaller brands and businesses often use micro-influencers to promote their products, which may be for money or even to use the brand’s products for free and become a special customer and use brand discounts. In any case and with any financial agreement between the two sides, brand ambassadors can use their popularity and power to influence others and their audience to be one of the best choices to promote your brand in the market and increase your popularity among the people. What is a promoter? Promoters are the people who formally promote your brand and your products. They are the ones who express the features of your products to customers in every possible place from the street to marketing events and in the big stores that sell your products and try to persuade customers to use your brand and sell products. The main task of promoters is to sell products. They communicate directly with the target customer of the brand, and they have the task of turning these potential customers into buyers with the techniques and training they have seen. Their success is based on selling as many products as possible. 
Although brands usually use regular and trained people as promoters about all the features of their product, there are also promoters who are models and work with modeling agencies, but in special events, brands use them to influence the audience as much as possible. Using brand promoters is one of the fastest ways to make your brand famous and will lead to brand awareness and ultimately loyalty to your brand. People usually use branded products that they know and trust. Advertising your brand by promoters will increase your sales and ultimately more profit for you. Difference between brand ambassadors and promoters Brand ambassadors and promoters have very similar tasks and both of these groups are looking to increase sales and promote a brand. However, there are differences between them, and by knowing these differences, you can better understand which of them are more suitable for working with your brand. The most important difference between promoters and brand ambassadors is that promoters seek to increase sales more than anything else by consulting and explaining the product to customers and it can be said that their most important task is to persuade more people to use the brand while brand ambassadors seek to establish deeper and friendlier relationships with their audiences and turn them into customers and even fans of a brand. Brand promoters often communicate with customers through face-to-face conversations at a marketing event or in stores, but brand ambassadors mostly communicate with their audience through social media or by participating in an advertising teaser. The relationship between promoters and brand customers is often short and limited to a few minutes that the customer takes to become more familiar with the product, but a brand ambassador can persuade his audience to use and try a product and brand for a long time or even years. conclusion Advertising a brand means delivering the message of your products to your audience and customers. You can use the right marketing strategy such as using promoters and brand ambassadors to determine exactly what features and advantages your brand products have over your competitors so that you can surpass others and become a leader in your niche.
https://medium.com/@unamclean/the-difference-between-brand-ambassador-and-promoter-e2a8ab43505c
[]
2020-12-06 16:15:17.211000+00:00
['Influencers', 'Marketing', 'Advertising']
The Most Powerful Leadership Empowers
Leadership is often associated with power: Power Over. Power Over others is how power is commonly understood. Power Over is at the root of authority, hierarchy, and command-and-control leadership.
Power Over is Problematic
When Power Over others defines a relationship, it can be used in coercive, oppressive, and abusive ways. When Power Over is exerted on people, it leaves them feeling powerless. Yet Power Over is limited power. It may seem effective in the short term, but Power Over eventually weakens as friction and resistance increase. And when the Power Over dynamic fades, so often does the change it was driving, as followers embrace the freedom to choose a different path.
Embracing Trust-Centered Power
But other types of power are more powerful than Power Over, more enduring, and at the core of more effective leadership. These types of power unleash, include, and empower others, build trust, and foster the enrollment needed for long-term, sustainable change-making. These are the types of power harnessed by Trust-Centered leaders:
Power To: This power is productive and generative. It's the Power To embrace possibility and make change happen, to make a difference. It's the power to achieve goals, to create something new, to generously move forward together.
Power With: This is shared power born of trust, togetherness, and collaboration. It is the Power To be greater than the sum of our parts, to harness diversity and transcend differences to forge the path to collective action. Power With is anchored in mutual understanding, respect, dignity, support, empowerment, and co-creation.
Power Within: This involves trusting in oneself, recognizing the gifts each of us has within, and courageously sharing them with the world. It involves self-awareness and self-acceptance. It also means recognizing and celebrating the unique Power Within others so as to fully engage Power With. It's believing that each and every one of us has the Power To make a difference.
Practicing Leadership that Empowers
Reflecting on your leadership, what types of power are you most frequently exercising? As Trust-Centered leaders, our work is not to maximize Power Over. The most powerful leadership empowers others. How might you more readily empower others by leading in a way that cultivates Power To, Power With, and Power Within?
https://medium.com/spotlight-trust/the-most-powerful-leadership-empowers-8b9b6a9270c6
['Lisa Lambert', 'Co-Founder At Spotlight Trust']
2020-07-08 14:35:48.052000+00:00
['Leadership', 'Diversity And Inclusion', 'Organizational Culture', 'Change', 'Leadership Development']
A short intro to becoming a web hosting company
Have you ever wondered how you can start your own web hosting company? Are you a web designer who needs more features or more control over the web hosting you resell to your clients, but doesn't yet need a dedicated server? Then a private-label virtual server is your solution. With a private-label virtual server you can set your own prices to control your profits and earn unlimited income.
There are a lot of companies on the internet where you can register for a hosting reseller account, but do your homework well, otherwise you might regret it later on. Do some research before you register with any hosting company: check how long they have been in business and whether they are reliable. You do not want your customers to join today and leave tomorrow.
One big issue: PLEASE REGISTER YOUR OWN DOMAIN NAME. You must have full control over it. YOU must be the owner of the domain, YOU must be its administrative contact, and YOU must be its technical contact.
Running your own reseller hosting company is like running a normal website; believe me, not much experience is required. Since you are responsible for billing your customers, you ultimately determine the profit margin, and you manage all of your customers' accounts through one main control panel. You are responsible for marketing, sales, billing, and support for your customers. You retain ownership of your customers; the hosting company has no contact with them.
Dedicated Server or Reseller Account?
Dedicated server or reseller account? That is the question most people ask when they are ready to start a web hosting business, and each solution has different benefits. You should choose a reseller account if you are a new business owner who does not know how to manage a dedicated server. Managing a dedicated server can be time-consuming if you are not proficient at server management, whereas with a reseller account a team of experts manages the server your websites are hosted on. A reseller account is also a fraction of the cost of a dedicated server, and it is usually a good place to start because you can upgrade to a dedicated server at any time.
Another benefit of a reseller hosting account is that you have no software or hardware maintenance costs. In-house managed solutions require the purchase of hardware (and more as your business grows), maintenance contracts, and software (with its related maintenance). Using private-label virtual servers, you don't have to purchase hardware or software, regardless of the number of customers you have.
You buy wholesale and sell retail, thereby ensuring your monthly profit margin. There are no unpredictable costs like professional fees or hardware replacements (should something go wrong with an in-house service), and when your business becomes successful, there won't be any additional data center, network, hardware, or software costs.
A virtual private server can run its own full-fledged operating system, and each server can be independently rebooted. This usually gives you root-level access to the hosting company's machine, so you can install almost any software that runs on the chosen OS.
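To make the "buy wholesale, sell retail" point concrete, here is a minimal sketch in Python of the margin math a reseller might run before setting prices. Every number in it (the wholesale server cost, the per-customer retail price, the customer count) is a made-up assumption for illustration only, not real pricing from any hosting provider.

# Minimal sketch of reseller margin math (illustration only).
# All prices and counts below are assumed values, not real
# wholesale or retail rates from any particular provider.

WHOLESALE_SERVER_COST = 40.0   # assumed monthly cost of the private-label server, in USD
RETAIL_PRICE_PER_SITE = 8.0    # assumed monthly price charged to each hosting customer
CUSTOMERS = 25                 # assumed number of customers hosted on that one server

def monthly_profit(wholesale_cost: float, retail_price: float, customers: int) -> float:
    """Revenue from all customers minus the fixed wholesale server cost."""
    return customers * retail_price - wholesale_cost

if __name__ == "__main__":
    profit = monthly_profit(WHOLESALE_SERVER_COST, RETAIL_PRICE_PER_SITE, CUSTOMERS)
    # With the assumed numbers: 25 * 8 - 40 = 160.0 USD per month
    print(f"Estimated monthly profit: ${profit:.2f}")

The point of the sketch is simply that the wholesale cost is fixed while retail revenue scales with the number of customers, which is why the reseller, not the hosting company, controls the margin.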
https://medium.com/@web-hosting-seo/a-short-intro-to-becoming-a-web-hosting-company-79ebecdc4353
['Web Hosting Seo']
2021-12-15 14:26:30.949000+00:00
['Web Hosting Services', 'Web Hosting', 'Web Hosting Company', 'Best Web Hosting', 'Web Hosting Provider']
Personal Guilty Pleasures
Music
Music, like art, is very much an individual taste, but some things, like admitting to still liking Elvis, are a bit funky. My big weakness is … Abba — I love the sappy, easy-to-hum tunes of Abba.
Movies
Sappy 1930s and 1940s comedy romances — too many to list them all, but they include “It Happened One Night” with Clark Gable and Claudette Colbert, and “Blonde Bombshell” with Jean Harlow.
Mamma Mia! — Okay, this one isn’t perfect, but I loved it. Great music, lots of fun. Leaves me smiling every time.
Ishtar — This is one of the silliest movies ever, but it is actually a bit of a political statement by the director, making fun of the U.S. government and the secret service, and also of the music industry, as a totally talentless pair of songwriters manage to basically blackmail the U.S. government into paying to record an album. Warren Beatty and Dustin Hoffman are perfect as the two musician hacks, and Charles Grodin is super as the sneaky CIA agent. The director is Elaine May.
Mean Girls — Okay, so this is really a teen movie. Don’t care — love it, and have watched it many times.
Burn After Reading — This one is a Coen Brothers creation. I love it. It doesn’t matter how many times I see it, I laugh: John Malkovich as the paranoid ex-secret-service agent, George Clooney as the uncontrollable womanizer, Brad Pitt as the idiotic health-club trainer … just to mention a few.
School of Rock — Love this. I know a lot of folks do not like Jack Black, but he is PERFECT in this. The performances by the kids are great and the whole movie is very entertaining.
Oklahoma! — A sucky musical with some amazing songs. The story is a bit weak, but it has some fantastic performances and great show tunes.
Television
WKRP in Cincinnati. I worked for three radio stations, and in every single one, all the WKRP characters were found. Yes, even Les Nessman, bandages and all.
Computer Games
Solitaire on the computer. When I am feeling brain-dead, uninspired, or upset about anything, I find that the mindless computer solitaire program is great. I have actually had to ration myself, since I tend to keep at it, obsessively attempting to beat my last score.
Food
You may have guessed, from the lead image, that fresh-baked scones with butter and jam are a weakness for me. Add a little fresh clotted cream and I will fistfight for them. But my most yearned-for dessert is Pavlova: an amazing creation made with a slightly crispy meringue topping, whipped cream, and fresh strawberries or raspberries.
https://medium.com/weeds-wildflowers/personal-guilty-pleasures-a58a8b480009
['Louise Peacock']
2020-12-22 18:07:53.019000+00:00
['Guilty Pleasure', 'Photography', 'Food', 'Movies']