Self Improvement, Communication

…struggle to be heard.

Your own feelings — Your ability to listen to other people can easily be affected by how you're feeling at a particular moment. For example, if you're in a good mood you might feel more inclined to listen actively and offer your best advice based on what you've just heard. In contrast, if you're in a bad mood, the last thing you might want to do is listen to someone else's thoughts and offer advice in response.

The wrong time and wrong place — These are the physical factors that influence whether you're willing or able to actively listen to what you're being told. For example, having a heart-to-heart conversation in a busy coffee shop is unlikely to positively affect your ability to listen actively.

4 components of active listening

The book points out four components of active listening, and the onus is on the listener to develop these components:
Acceptance — Acceptance is about respecting the person you're talking to, irrespective of what they have to say, purely because you're talking to another human being. Accepting means trying to avoid expressing agreement or disagreement with what the other person is saying, at least initially. I've often made this mistake: being too keen to express my views and thus encouraging the speaker to take a very defensive stance in the conversation.

Honesty — Honesty comes down to being open about your reactions to what you've heard. As with the acceptance component, honest reactions given too soon can easily stifle further explanation on the part of the speaker.

Empathy — Empathy is about your ability to understand the speaker's situation on an emotional level, based on your own view. Basing your understanding on your own view, instead of on a sense of what should be felt, creates empathy instead of sympathy. Empathy can also be defined as your desire to feel the speaker's emotions, regardless of your own experience.

Specifics — Specifics refers to the need to deal in details rather than generalities. The point here is that for communication to be worthwhile, you should ask the speaker to be more specific, encouraging the speaker to open up more or "own" the problem they're trying to raise.

Tips to improve your active listening skills

The book provides some useful pointers on how you can best improve your active listening skills, explaining the essence of each tip:
- Minimise external distractions
- Face the speaker
- Maintain eye contact
- Focus on the speaker
- Be open-minded
- Be sincerely interested
- Have sympathy, feel empathy
- Assess the emotion, not just the words
- Respond appropriately
- Minimise internal distractions
- Avoid "me" stories
- Don't be scared of silence
- Take notes
- Practice emotional intelligence
- Check your understanding

The main principles of reflective listening

Once you've listened actively, "reflective listening" is what comes next. Reflective listening is concerned with how you process what you've heard. The four components of active listening — acceptance, honesty, empathy and specifics — all work towards creating reflective responses in the listener. The main principles of reflective listening are:

- Listen more than you talk.
- Deal with personal specifics, not impersonal generalities.
- Decipher the feelings behind the words, to create a better understanding of the issues.
- Restate and clarify what you have heard.
- Understand the speaker's point of view and avoid responding from your own viewpoint.
- Respond with acceptance and empathy, not coldly or with fake concern.

Main learning point:
Understanding more about the common barriers to active listening — and how to best overcome these — was my biggest takeaway from reading "The Art of Active Listening". The book does a great job of offering practical tips on how to listen actively and how to better process the things you've heard.
Technology, AI, Machine Learning

Saul Kohn, AI/ML Solution Architect at Code and Theory

What should clients be focusing on most right now?

There are more machine learning and AI tools available to you than ever before. These represent incredible opportunities and open the doors to new ways of doing business. We can apply AI tools and methods strategically, with thoughtful, creative and robust data architectures optimized for your needs — cost, speed, and precision. But with these new possibilities come new pitfalls. You need to differentiate between the companies and teams approaching you that have genuinely innovative technologies and those peddling vaporware. And some are just wrapping the good stuff that you could get better value and control from at the source!

Okay, so you've selected your best-value, best-fit tool. How do you use it? What are the best methods to extract the maximal value from it? Answering these questions is a process of understanding how the underlying technology works. A deeper knowledge of how data flows through these tools can help everyone better achieve their goals when implementing them. A great technology agency (hint hint) can be a really helpful co-navigator here.
What's the best way to keep staff motivated and working at a high level?

Machine learning and AI projects are, at heart, scientific investigations. You're trying to figure out whether you can make a computer understand a pile of data at a deep enough level to enable it to start making inferences on new, unseen data. This can be a daunting task! So one of my strongly held beliefs is that "no one walks alone" in machine learning engineering (MLE). What this translates into is fostering partnership. On any given project, I make sure that there are two MLEs. That way there is always someone to bounce ideas off of, reflect with and interpret results alongside. It also means that when presenting your results or reporting status, you're never on the spot or holding everything alone. This is especially important when dealing with data: sometimes it just doesn't let you do what you want to! It's no one's fault, but if you're alone you can often fall into the trap of blaming yourself.

Okay — so teamwork makes the dream work, but here's something I hold in tension with the above: some of the best skills, confidence and learnings are built by giving individuals the autonomy to solve problems their own way. The way I try to resolve this tension is to get to know my team members very well, so I know what they can do now and what they want to build towards. That lets me think very, very carefully about who is paired together on a project.
In a perfect world, they'll do 110% of what we need and teach each other a bunch along the way. In reality, if we get the best result for our clients and everyone learns just one new thing, I'm happy.

What is unique about having a culture that is a balance of 50% engineers and 50% creatives?

I had been looking for Code and Theory for a long time before I found it. Finding links between what I do — mathematics, physics and computer science — and art and literature has always been a pet project of mine, and when I found a good connection, it was always a special delight. Now it's kind of my job! How cool is that?! We're talking about Large Language Models and Wittgenstein. We're considering the Human vs. the Algorithmic Gaze. We're thinking about Generative Imaging and what it means for artistic expression. You don't get to have these conversations just anywhere: it takes a very special mix of technological and creative talent. I'm glad to be a small piece of it!
What's your biggest source of inspiration?

There's a sense of awe I can reach by using mathematics to describe the world, and it inspires me to dive ever deeper into understanding the workings of things. It's a satisfying, beautiful cleanliness — a romantic notion of the work, I know — to use numbers and logic to create a working, mechanical system. I think it's what drove me to do physics for my PhD, it's definitely what made me fall in love with coding, and it's a passion and an inspiration that I'm driven to share with all of my peers (and anyone who will listen!).

Our world is full of intricate and interwoven systems. The arrangement of cells, the hum of a crowd on 14th St., the fractal nature of a broccoli floret — all of these phenomena embody principles that can be translated into designs and algorithms. Not only can they be, but they have been! This fusion of natural beauty and scientific inquiry not only motivates me to push the boundaries of what's possible with computer science, AI, and machine learning, but also reinforces the importance of sustainability, harmony, and quiet observation in technological advancement.
What's the most exciting project you've worked on?

Last year I was part of a team that built a precision face-scanning mobile app for a medical device company. This thing had everything: machine learning in the frontend, machine learning in the backend, augmented reality to guide the user's scan, 3D graphics — you name it. And at its peak, 30,000 monthly active users! Working in the overlap of all those different skill sets and technologies was a thrill and a joy. I actually try to encourage and broaden those fuzzy boundaries
(between backend/frontend, creative/technological, AI/Human, … you get the idea), since I think it makes everyone cooperate more and generates better ideas. It's fun to work in a place that encourages that encouragement.

What is the one thing you'd tell your younger self?

Slow down. Pace yourself. Give yourself enough space to see the whole project. It lets you think more clearly about what you're accomplishing in service of your work, or your client, or even yourself — rather than the singular component your gaze is on right now.

Saul Kohn is an AI/ML Solution Architect at Code and Theory.
Snowflake, Sql, Data Science, Data Engineering, Optimization

Redundant join elimination is a new feature that helps some SQL queries run much faster. To get these benefits, you will need to give Snowflake some hints. Let's figure out these constraints here.

A quick review of redundant join elimination

Sometimes people (and tools) write joins in SQL that end up being redundant. If these joins were eliminated, the queries could run significantly faster, saving time and complexity. Snowflake is ready to do this automatically, but to do it safely Snowflake might need some extra information from you.

For example, let's look at this query based on the tables of TPC-H-SF10:

select l_orderkey, sum(l_extendedprice*l_quantity), current_timestamp()
from lineitem_10
join orders_10
    on l_orderkey = o_orderkey
group by 1
order by 2 desc
limit 10

You can see that there's a join there to ensure that every row in lineitem belongs to an order in orders. If you could guarantee that's already the case, then Snowflake could eliminate that join automatically, and run the query in half the time. So how can you tell Snowflake that the constraints within your tables and data can be trusted? That's the experiment we'll run in the rest of this post.
Setting up data

First let's copy data out of TPC-H-SF10 into our own schema, so we can play with its constraints. Here's how to copy the tables lineitem and orders. I added an order by, as is my usual practice, to make sure the data is well micro-partitioned.

create table orders_10 as
select * from snowflake_sample_data.tpch_sf10.orders
order by o_orderdate -- 8.9s S
;

create table lineitem_10 as
select * from snowflake_sample_data.tpch_sf10.lineitem
order by l_shipdate -- 24s S
;

Now let's run our baseline query:

select l_orderkey, sum(l_extendedprice*l_quantity)
    , current_timestamp() -- avoids using the query results cache
from lineitem_10
join orders_10
    on l_orderkey = o_orderkey
group by 1
order by 2 desc
limit 10
-- 5.2s
;

[Screenshot: the execution graph shows a join between the tables.]

5.2 seconds on a Small warehouse — now we can help Snowflake optimize it by adding constraints.
Adding constraints

Constraints like primary and foreign keys are common in the OLTP database world. You can set them in Snowflake, but most people don't bother, as Snowflake doesn't enforce them nor use them to optimize. That's changing: with the upcoming hybrid tables you'll get enforcement of primary keys, and with redundant join elimination, Snowflake can use your constraints to optimize.

This is how we can add a primary key to the table orders, and then add a foreign key constraint to lineitem to show Snowflake the relationship between these tables:

alter table orders_10
    add primary key (o_orderkey)
;

alter table lineitem_10
    add constraint lineitem_fkey_orders
    foreign key (l_orderkey) references orders_10 (o_orderkey)
;

And that could be enough, but it turns out it's not. Run the query again, and you'll find the same execution graph:

[Screenshot: the execution graph still shows a join between the tables.]
Making sure Snowflake can rely on the constraints

So why didn't the constraints between these tables help Snowflake optimize this join? It turns out you also need to tell Snowflake it can rely on these constraints. To do so:

alter table lineitem_10
    alter constraint lineitem_fkey_orders rely;

alter table orders_10
    alter primary key rely;

That's it: now our query runs in less than half the time, with a much simpler execution graph:

[Screenshot: the execution graph shows the join has been automatically eliminated for faster results.]

If you want to check the state of your constraints, you can find that information in your information_schema:

select constraint_name, table_name, constraint_type, enforced, rely
from information_schema.table_constraints
where table_name in ('LINEITEM_10', 'ORDERS_10')
;

[Screenshot: the existing constraints in these tables, including whether they can be relied on.]
Additional notes

By the way, our base query:

select l_orderkey, sum(l_extendedprice*l_quantity)
    , current_timestamp()
from lineitem_10
join orders_10
    on l_orderkey = o_orderkey
group by 1
order by 2 desc
limit 10

is very similar in intention to one like:

select l_orderkey, sum(l_extendedprice*l_quantity)
    , current_timestamp()
from lineitem_10
where l_orderkey in (select o_orderkey from orders_10)
group by 1
order by 2 desc
limit 10

But Snowflake did not eliminate that redundant semi-join in my experiments. Don't worry too much about it: the team is aware and working on it. For background, see "Inside Snowflake — Optimizing the Query Optimizer: Redundant Semi-Joins and Anti-Joins" (medium.com): "Shruti Sekaran is one of the amazing interns I met at Snowflake during the summer of 2022. I was immediately impressed…"
Snowflake, Sql, Data Science, Data Engineering, Optimization. your results with. Read “Constraining the centipede” by Serge Gershkovich. Constraining the centipede Snowflake join elimination can save you time and money. Learn how to use it to your advantage.medium.com Introduce data quality testing into your project, to make sure unique and primary keys
medium
1,394
Snowflake, Sql, Data Science, Data Engineering, Optimization. restrictions are reliable for the query optimizer. If you are using dbt, the package constraints can help with setting these constraints. although Snowflake doesn’t enforce most constraints, the query optimizer can consider primary key, unique key, and foreign key constraints during query rewrite
medium
1,395
Snowflake, Sql, Data Science, Data Engineering, Optimization. if the constraint is set to RELY. Since dbt can test that the data in the table complies with the constraints, this package creates constraints on Snowflake with the RELY property to improve query performance Check out the docs for join elimination to learn more — which include examples for
medium
1,396
Snowflake, Sql, Data Science, Data Engineering, Optimization. “Eliminating an Unnecessary Left Outer Join”, “Eliminating an Unnecessary Self-Join”, and “Eliminating an Unnecessary Join on a Primary Key and Foreign Key”. Understanding How Snowflake Can Eliminate Redundant Joins | Snowflake Documentation In some cases, a join on a key column can refer to tables
medium
1,397
Snowflake, Sql, Data Science, Data Engineering, Optimization. that are not needed for the join. If your tables have key…docs.snowflake.com Image generated by AI Want more? Try this out with a Snowflake free trial account — you only need an email address to get started. I’m Felipe Hoffa, Data Cloud Advocate for Snowflake. Thanks for joining me on this
medium
1,398
Deep Learning, Natural Language Process, Machine Learning, Markov Chains, Nlp Tutorial

Natural language processing (NLP) and deep learning are growing in popularity for their use in ML technologies like self-driving cars and speech recognition software. As more companies begin to implement deep learning components and other machine learning practices, the demand for software developers and data scientists with proficiency in deep learning is skyrocketing.

Today, we will introduce you to a popular deep learning project, the Text Generator, to familiarize you with important, industry-standard NLP concepts, including Markov chains. By the end of this article, you'll understand how to build a Text Generator component for search engine systems and know how to implement Markov chains for faster predictive models.

Here's what we'll cover today:

- Introduction to the Text Generator Project
- What are Markov chains?
- Text Generation Project Implementation
- What to learn next

Introduction to the Text Generator Project

Text generation is popular across the board and in every industry, especially for mobile, app, and data science. Even journalism uses text generation to aid writing processes. You've probably encountered text generation technology in your day-to-day life.
iMessage text completion, Google search, and Google's Smart Compose on Gmail are just a few examples. These skills are valuable for any aspiring data scientist.

Today, we are going to build a text generator using Markov chains. This will be a character-based model that takes the previous character of the chain and generates the next letter in the sequence. By training our program with sample words, our text generator will learn common patterns in character order. The text generator will then apply these patterns to the input, an incomplete word, and output the character with the highest probability of completing that word.

Let's suppose we have a string, monke. We need to find the character that is best suited after the character e in monke, based on our training corpus. Our text generator would determine that y often follows e and would form a completed word: monkey.
In other words, we are going to generate the next character for that given string. The text generator project relies on text generation, a subdivision of natural language processing that predicts and generates the next characters based on previously observed patterns in language.

Without NLP, we'd have to create a table of all words in the English language and match the passed string to an existing word. There are two problems with this approach: it would be very slow to search thousands of words, and the generator could only complete words that it had seen before.
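To make those two problems concrete, here is a minimal sketch of that lookup-table approach; the small ENGLISH_WORDS list is a hypothetical stand-in for a full table of English words, not part of the project code:

# A tiny stand-in for a table of all English words.
ENGLISH_WORDS = ["monkey", "monsoon", "month", "mention"]

def naive_complete(prefix):
    # Scan the entire table for a word starting with the prefix:
    # linear in the vocabulary size, and helpless on unseen words.
    for word in ENGLISH_WORDS:
        if word.startswith(prefix):
            return word
    return None

print(naive_complete("monke"))  # 'monkey'
print(naive_complete("monka"))  # None: never seen before, so no completion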
NLP allows us to dramatically cut runtime and increase versatility, because the generator can complete words it hasn't even encountered before. NLP can be expanded to predict words, phrases, or sentences if needed!

For this project, we will specifically be using Markov chains to complete our text.
Markov processes are the basis for many NLP projects involving written language and simulating samples from complex distributions. Markov processes are so powerful that they can be used to generate superficially real-looking text with only a sample document.

What are Markov Chains?

A Markov chain is a stochastic process that models a sequence of events in which the probability of each event depends on the state of the previous event. The model requires a finite set of states with fixed conditional probabilities of moving from one state to another. The probability of each shift depends only on the previous state of the model, not the entire history of events.
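Stated formally (a standard way of writing the Markov property, added here for reference), the next state depends only on the current state:

P(X_{n+1} = x \mid X_n = x_n, \ldots, X_1 = x_1) = P(X_{n+1} = x \mid X_n = x_n)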
For example, imagine you wanted to build a Markov chain model to predict weather conditions. We have two states in this model, sunny or rainy. There is a higher probability (70%) that it'll be sunny tomorrow if we've been in the sunny state today. The same is true for rainy: if it has been rainy, it will most likely continue to rain. However, it's possible (30%) that the weather will shift states, so we also include that in our Markov chain model.
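To make this concrete, here is a minimal sketch of that two-state weather chain; the 70%/30% transition probabilities come from the example above, while the state encoding and the simulate_weather helper are illustrative additions:

import numpy as np

# Transition probabilities from the example: each row is the current state,
# each column the next state, ordered as [sunny, rainy].
STATES = ["sunny", "rainy"]
TRANSITIONS = np.array([
    [0.7, 0.3],  # from sunny: 70% stay sunny, 30% shift to rainy
    [0.3, 0.7],  # from rainy: 30% shift to sunny, 70% stay rainy
])

def simulate_weather(start="sunny", days=7):
    # Walk the chain: each step depends only on the current state.
    state = STATES.index(start)
    forecast = [STATES[state]]
    for _ in range(days - 1):
        state = np.random.choice(2, p=TRANSITIONS[state])
        forecast.append(STATES[state])
    return forecast

print(simulate_weather())  # e.g. ['sunny', 'sunny', 'rainy', ...]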
The Markov chain is a perfect model for our text generator because our model will predict the next character using only the previous character. The advantage of using a Markov chain is that it's accurate, light on memory (it stores only one previous state), and fast to execute.

Text Generation Project Implementation

We'll complete our text generator project in 6 steps:

1. Generate the lookup table: Create a table to record word frequency
2. Convert frequency to probability: Convert our findings to a usable form
3. Load the dataset: Load and utilize a training set
4. Build the Markov chains: Use probabilities to create chains for each word and character
5. Sample our data: Create a function to sample individual sections of the corpus
6. Generate text: Test our model

1. Generate the lookup table

First, we'll create a table that records the occurrences of each character state within our training corpus. We will take the last K characters and the (K+1)th character from the training corpus and save them in a lookup table.
For example, imagine our training corpus contained "the man was, they, then, the, the". Then the number of occurrences by word would be:

"the" — 3
"then" — 1
"they" — 1
"man" — 1

Here's what that would look like in a lookup table (showing the entries for X = "the"):

X      Y      Frequency
the    " "    3
the    n      1
the    y      1

In the example above, we have taken K = 3. Therefore, we'll consider 3 characters at a time and take the next character (K+1) as our output character. In the above lookup table, we have the word (X) as the and the output character (Y) as a single space (" "). We have also calculated how many times this sequence occurs in our dataset: 3, in this case. We'll find this data for each word in the corpus to generate all possible pairs of X and Y within the dataset.
Here's how we'd generate a lookup table in code:

def generateTable(data, k=4):
    T = {}
    for i in range(len(data) - k):
        X = data[i:i+k]
        Y = data[i+k]
        # print("X %s and Y %s " % (X, Y))
        if T.get(X) is None:
            T[X] = {}
            T[X][Y] = 1
        else:
            if T[X].get(Y) is None:
                T[X][Y] = 1
            else:
                T[X][Y] += 1
    return T

T = generateTable("hello hello helli")
print(T)

--> {'llo ': {'h': 2}, 'ello': {' ': 2}, 'o he': {'l': 2}, 'lo h': {'e': 2}, 'hell': {'i': 1, 'o': 2}, ' hel': {'l': 2}}

Explanation: We first create a dictionary, T, that is going to store each X with its corresponding Y values and their frequencies. Try running the above code and see the output. The if/else block then checks for the occurrence of X and Y: if we already have the X and Y pair in our lookup dictionary, we just increment its count by 1.
2. Convert frequencies to probabilities

Once we have this table and the occurrences, we'll generate the probability that an occurrence of Y will appear after an occurrence of a given X. Our equation for this is:

P(Y | X) = (frequency of Y following X) / (total frequency of X)

For example, if X = the and Y = n, our equation would look like this:

Frequency that Y = n when X = the: 2
Total frequency of X = the in the table: 8
Therefore: P = 2/8 = 0.125 = 12.5%

Here's how we'd apply this equation to convert our lookup table to probabilities usable with Markov chains (this operates on the lookup table T from the previous section):

def convertFreqIntoProb(T):
    for kx in T.keys():
        s = float(sum(T[kx].values()))
        for k in T[kx].keys():
            T[kx][k] = T[kx][k] / s
    return T

T = convertFreqIntoProb(T)
print(T)

--> {'llo ': {'h': 1.0}, 'ello': {' ': 1.0}, 'o he': {'l': 1.0}, 'lo h': {'e': 1.0}, 'hell': {'i': 0.3333333333333333, 'o': 0.6666666666666666}, ' hel': {'l': 1.0}}
Deep Learning, Natural Language Process, Machine Learning, Markov Chains, Nlp Tutorial. 0.6666666666666666}, ' hel': {'l': 1.0}} Explanation: We summed up the frequency values for a particular key and then divided each frequency value of that key by that summed value to get our probabilities. Simple logic! 3. Load the dataset Next, we’ll load our real training corpus. You can use the
medium
1,425
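As a quick sanity check (an optional snippet, not from the original tutorial, using the T produced above), every context's outgoing probabilities should now sum to 1:

# Each value of T is now a probability distribution over next characters.
for kx, dist in T.items():
    assert abs(sum(dist.values()) - 1.0) < 1e-9, kx
print("All probability distributions sum to 1.")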
Deep Learning, Natural Language Process, Machine Learning, Markov Chains, Nlp Tutorial. long text (.txt) doc that you want. For our tutorial purposes, we’ll use a political speech to provide enough words to teach our model. text_path = "train_corpus.txt" def load_text(filename): with open(filename,encoding='utf8') as f: return f.read().lower() text = load_text(text_path) print('Loaded
medium
1,426
Deep Learning, Natural Language Process, Machine Learning, Markov Chains, Nlp Tutorial. the dataset.') This data set will give our generator enough occurrences to make reasonably accurate predictions. As with all machine learning, larger training corpuses will result in more accurate predictions. 4. Build the Markov chains Now let’s construct our Markov chains and associate the
medium
1,427
4. Build the Markov chains

Now let's construct our Markov chains and associate the probabilities with each character. We'll use the generateTable() and convertFreqIntoProb() functions created in step 1 and step 2 to build the Markov models.

def MarkovChain(text, k=4):
    T = generateTable(text, k)
    T = convertFreqIntoProb(T)
    return T

model = MarkovChain(text)
print('Model Created Successfully!')

--> Loaded the dataset.
Model Created Successfully!

Explanation: MarkovChain() accepts the text corpus and the value of K, which tells the model to consider K characters and predict the next character. It first generates our lookup table by passing the corpus and K to generateTable(), which we created in step 1, and then converts the frequencies into probabilistic values using convertFreqIntoProb() from step 2.
Deep Learning, Natural Language Process, Machine Learning, Markov Chains, Nlp Tutorial. previous lesson. 5. Sample the text Now, we’ll create a sampling function that takes the unfinished word (ctx), the Markov chains model from step 4 (model), and the number of characters used to form the word's base (k). We’ll use this function to sample passed context and return the next likely
medium
1,431
Deep Learning, Natural Language Process, Machine Learning, Markov Chains, Nlp Tutorial. character with the probability it is the correct character. import numpy as np def sample_next(ctx,model,k): ctx = ctx[-k:] if model.get(ctx) is None: return " " possible_Chars = list(model[ctx].keys()) possible_values = list(model[ctx].values()) print(possible_Chars) print(possible_values) return
medium
1,432
Deep Learning, Natural Language Process, Machine Learning, Markov Chains, Nlp Tutorial. np.random.choice(possible_Chars,p=possible_values) sample_next("commo",model,4) --> Loaded the dataset. Model Created Successfully! ['n'] [1.0] Explanation: The function, sample_next(ctx,model,k), accepts three parameters: the context, the model, and the value of K. The ctx is nothing but the text
medium
1,433
Explanation: The function sample_next(ctx, model, k) accepts three parameters: the context, the model, and the value of K. The ctx is nothing but the text that will be used to generate some new text; however, only the last K characters from the context are used by the model to predict the next character in the sequence. For example, we passed the context commo with K = 4, so the context the model will look at to generate the next character is K characters long, and hence it will be ommo, because Markov models only take the previous history. You can see the value of the context variable by printing it, too. The two print calls show the possible characters and their probability values, which are also present in our model. We got the next predicted character as n, with a probability of 1.0. This makes sense, because after generating the next character, commo is most likely to become the word common. Finally, the function returns a character sampled according to those probability values, as discussed above.
6. Generate text

Finally, we'll combine all the above functions to generate some text.

def generateText(starting_sent, k=4, maxLen=1000):
    sentence = starting_sent
    ctx = starting_sent[-k:]
    for ix in range(maxLen):
        next_prediction = sample_next(ctx, model, k)
        sentence += next_prediction
        ctx = sentence[-k:]
    return sentence

print("Function Created Successfully!")

text = generateText("dear", k=4, maxLen=2000)
print(text)

--> Loaded the dataset.
Model Created Successfully!
Function Created Successfully!
dear country brought new consciousness. i heartily great service of their lives, our country, many of tricoloring a color flag on their lives independence today. my devoted to be oppression of independence. these day the obc common many country, millions of oppression of massacrifice of indian whom everest. my dear country is not in the sevents went was demanding and nights by plowing in the message of the country is crossed, oppressed, women, to overcrowding for years of the south, it is like the ashok chakra of constitutional states crossed, deprived, oppressions of freedom, i bow my heart to proud of our country. my dear country, millions under to be a hundred years of the south, it is going their heroes.
Deep Learning, Natural Language Process, Machine Learning, Markov Chains, Nlp Tutorial. of the south, it is going their heroes. Explanation: The above function takes in three parameters: the starting word from which you want to generate the text, the value of K, and the maximum length of characters up to which you need the text. If you run the code, you’ll get a speech that starts
medium
1,441
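As a follow-up experiment (a sketch, not part of the original tutorial; it assumes the functions above are defined and that the debugging print calls inside sample_next have been removed to keep the output readable), you can vary K to see the trade-off: a smaller K gives noisier, less word-like output, while a larger K follows the training text more closely:

# Note: generateText above overwrote the variable `text` with its output,
# so we reload the training corpus before rebuilding the models.
corpus = load_text(text_path)

# The seed passed to generateText must be at least K characters long.
for k in (2, 3, 4):
    model = MarkovChain(corpus, k=k)
    print(f"K={k}:", generateText("dear", k=k, maxLen=200), "\n")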
What to learn next

Congratulations on completing this text generation project. You now have hands-on experience with Natural Language Processing and Markov chain models to use as you continue your deep learning journey. Your next steps are to adapt the project to produce more understandable output, learn a tool like GPT-3, or try some more awesome machine learning projects like:

- Pokemon classification system
- Emoji predictor using NLP
- Text decryption using recurrent neural network

To walk you through these projects and more, Educative has created Building Advanced Deep Learning and NLP Projects. This course gives you the chance to practice advanced deep learning concepts as you complete interesting and unique projects like the one we did today. By the end, you'll have the experience necessary to use any of the top deep learning algorithms on your own projects. Happy learning!

Continue reading about NLP and Machine Learning on Educative

- Data Science Simplified: top 5 NLP tasks that use Hugging…