Dataset columns (dtype, value or length range):

repo_name: stringclasses (1 value)
pr_number: int64 (4.12k to 11.2k)
pr_title: stringlengths (9 to 107)
pr_description: stringlengths (107 to 5.48k)
author: stringlengths (4 to 18)
date_created: unknown
date_merged: unknown
previous_commit: stringlengths (40 to 40)
pr_commit: stringlengths (40 to 40)
query: stringlengths (118 to 5.52k)
before_content: stringlengths (0 to 7.93M)
after_content: stringlengths (0 to 7.93M)
label: int64 (-1 to 1)
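The example rows below all come from the same pull request (TheAlgorithms/Python #8732, "Correct ruff failures") paired against different files: before_content and after_content hold the full file text, and label is an integer in [-1, 1]. A minimal sketch of how such a dump could be consumed follows, assuming the data is published as a Hugging Face dataset; the dataset ID and split name in the code are hypothetical placeholders, not identifiers taken from this page.

```python
# Minimal sketch of loading and inspecting rows of this PR/file-content dataset.
# Assumption: the data is hosted as a Hugging Face dataset; the ID and split
# below are placeholders, not the dataset's real identifiers.
from datasets import load_dataset

ds = load_dataset("user/pr-file-relevance", split="train")  # hypothetical ID/split

# Each row pairs one PR-level query with the before/after text of one file,
# plus an integer label (observed range: -1 to 1).
for row in ds.select(range(3)):
    print(row["repo_name"], row["pr_number"], row["pr_title"])
    print("label:", row["label"])
    print("query chars:", len(row["query"]))
    print("before/after chars:", len(row["before_content"]), len(row["after_content"]))
```

The same columns could equally be read from a local JSONL or Parquet export; only the field names are taken from the schema above.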
TheAlgorithms/Python
8,732
Correct ruff failures
### Describe your change: Fixes #8723 * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
CaedenPH
"2023-05-14T18:14:30Z"
"2023-05-14T21:03:13Z"
793e564e1d4bd6e00b6e2f80869c5fd1fd2872b3
1faf10b5c2dff8cef3f5d59f60a126bd19bb1c44
Correct ruff failures. ### Describe your change: Fixes #8723 * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
""" Demonstrates implementation of SHA1 Hash function in a Python class and gives utilities to find hash of string or hash of text from a file. Usage: python sha1.py --string "Hello World!!" python sha1.py --file "hello_world.txt" When run without any arguments, it prints the hash of the string "Hello World!! Welcome to Cryptography" Also contains a Test class to verify that the generated Hash is same as that returned by the hashlib library SHA1 hash or SHA1 sum of a string is a cryptographic function which means it is easy to calculate forwards but extremely difficult to calculate backwards. What this means is, you can easily calculate the hash of a string, but it is extremely difficult to know the original string if you have its hash. This property is useful to communicate securely, send encrypted messages and is very useful in payment systems, blockchain and cryptocurrency etc. The Algorithm as described in the reference: First we start with a message. The message is padded and the length of the message is added to the end. It is then split into blocks of 512 bits or 64 bytes. The blocks are then processed one at a time. Each block must be expanded and compressed. The value after each compression is added to a 160bit buffer called the current hash state. After the last block is processed the current hash state is returned as the final hash. Reference: https://deadhacker.com/2006/02/21/sha-1-illustrated/ """ import argparse import hashlib # hashlib is only used inside the Test class import struct class SHA1Hash: """ Class to contain the entire pipeline for SHA1 Hashing Algorithm >>> SHA1Hash(bytes('Allan', 'utf-8')).final_hash() '872af2d8ac3d8695387e7c804bf0e02c18df9e6e' """ def __init__(self, data): """ Inititates the variables data and h. h is a list of 5 8-digit Hexadecimal numbers corresponding to (1732584193, 4023233417, 2562383102, 271733878, 3285377520) respectively. We will start with this as a message digest. 0x is how you write Hexadecimal numbers in Python """ self.data = data self.h = [0x67452301, 0xEFCDAB89, 0x98BADCFE, 0x10325476, 0xC3D2E1F0] @staticmethod def rotate(n, b): """ Static method to be used inside other methods. Left rotates n by b. >>> SHA1Hash('').rotate(12,2) 48 """ return ((n << b) | (n >> (32 - b))) & 0xFFFFFFFF def padding(self): """ Pads the input message with zeros so that padded_data has 64 bytes or 512 bits """ padding = b"\x80" + b"\x00" * (63 - (len(self.data) + 8) % 64) padded_data = self.data + padding + struct.pack(">Q", 8 * len(self.data)) return padded_data def split_blocks(self): """ Returns a list of bytestrings each of length 64 """ return [ self.padded_data[i : i + 64] for i in range(0, len(self.padded_data), 64) ] # @staticmethod def expand_block(self, block): """ Takes a bytestring-block of length 64, unpacks it to a list of integers and returns a list of 80 integers after some bit operations """ w = list(struct.unpack(">16L", block)) + [0] * 64 for i in range(16, 80): w[i] = self.rotate((w[i - 3] ^ w[i - 8] ^ w[i - 14] ^ w[i - 16]), 1) return w def final_hash(self): """ Calls all the other methods to process the input. Pads the data, then splits into blocks and then does a series of operations for each block (including expansion). For each block, the variable h that was initialized is copied to a,b,c,d,e and these 5 variables a,b,c,d,e undergo several changes. After all the blocks are processed, these 5 variables are pairwise added to h ie a to h[0], b to h[1] and so on. This h becomes our final hash which is returned. 
""" self.padded_data = self.padding() self.blocks = self.split_blocks() for block in self.blocks: expanded_block = self.expand_block(block) a, b, c, d, e = self.h for i in range(0, 80): if 0 <= i < 20: f = (b & c) | ((~b) & d) k = 0x5A827999 elif 20 <= i < 40: f = b ^ c ^ d k = 0x6ED9EBA1 elif 40 <= i < 60: f = (b & c) | (b & d) | (c & d) k = 0x8F1BBCDC elif 60 <= i < 80: f = b ^ c ^ d k = 0xCA62C1D6 a, b, c, d, e = ( self.rotate(a, 5) + f + e + k + expanded_block[i] & 0xFFFFFFFF, a, self.rotate(b, 30), c, d, ) self.h = ( self.h[0] + a & 0xFFFFFFFF, self.h[1] + b & 0xFFFFFFFF, self.h[2] + c & 0xFFFFFFFF, self.h[3] + d & 0xFFFFFFFF, self.h[4] + e & 0xFFFFFFFF, ) return ("{:08x}" * 5).format(*self.h) def test_sha1_hash(): msg = b"Test String" assert SHA1Hash(msg).final_hash() == hashlib.sha1(msg).hexdigest() # noqa: S324 def main(): """ Provides option 'string' or 'file' to take input and prints the calculated SHA1 hash. unittest.main() has been commented because we probably don't want to run the test each time. """ # unittest.main() parser = argparse.ArgumentParser(description="Process some strings or files") parser.add_argument( "--string", dest="input_string", default="Hello World!! Welcome to Cryptography", help="Hash the string", ) parser.add_argument("--file", dest="input_file", help="Hash contents of a file") args = parser.parse_args() input_string = args.input_string # In any case hash input should be a bytestring if args.input_file: with open(args.input_file, "rb") as f: hash_input = f.read() else: hash_input = bytes(input_string, "utf-8") print(SHA1Hash(hash_input).final_hash()) if __name__ == "__main__": main() import doctest doctest.testmod()
""" Demonstrates implementation of SHA1 Hash function in a Python class and gives utilities to find hash of string or hash of text from a file. Usage: python sha1.py --string "Hello World!!" python sha1.py --file "hello_world.txt" When run without any arguments, it prints the hash of the string "Hello World!! Welcome to Cryptography" Also contains a Test class to verify that the generated Hash is same as that returned by the hashlib library SHA1 hash or SHA1 sum of a string is a cryptographic function which means it is easy to calculate forwards but extremely difficult to calculate backwards. What this means is, you can easily calculate the hash of a string, but it is extremely difficult to know the original string if you have its hash. This property is useful to communicate securely, send encrypted messages and is very useful in payment systems, blockchain and cryptocurrency etc. The Algorithm as described in the reference: First we start with a message. The message is padded and the length of the message is added to the end. It is then split into blocks of 512 bits or 64 bytes. The blocks are then processed one at a time. Each block must be expanded and compressed. The value after each compression is added to a 160bit buffer called the current hash state. After the last block is processed the current hash state is returned as the final hash. Reference: https://deadhacker.com/2006/02/21/sha-1-illustrated/ """ import argparse import hashlib # hashlib is only used inside the Test class import struct class SHA1Hash: """ Class to contain the entire pipeline for SHA1 Hashing Algorithm >>> SHA1Hash(bytes('Allan', 'utf-8')).final_hash() '872af2d8ac3d8695387e7c804bf0e02c18df9e6e' """ def __init__(self, data): """ Inititates the variables data and h. h is a list of 5 8-digit Hexadecimal numbers corresponding to (1732584193, 4023233417, 2562383102, 271733878, 3285377520) respectively. We will start with this as a message digest. 0x is how you write Hexadecimal numbers in Python """ self.data = data self.h = [0x67452301, 0xEFCDAB89, 0x98BADCFE, 0x10325476, 0xC3D2E1F0] @staticmethod def rotate(n, b): """ Static method to be used inside other methods. Left rotates n by b. >>> SHA1Hash('').rotate(12,2) 48 """ return ((n << b) | (n >> (32 - b))) & 0xFFFFFFFF def padding(self): """ Pads the input message with zeros so that padded_data has 64 bytes or 512 bits """ padding = b"\x80" + b"\x00" * (63 - (len(self.data) + 8) % 64) padded_data = self.data + padding + struct.pack(">Q", 8 * len(self.data)) return padded_data def split_blocks(self): """ Returns a list of bytestrings each of length 64 """ return [ self.padded_data[i : i + 64] for i in range(0, len(self.padded_data), 64) ] # @staticmethod def expand_block(self, block): """ Takes a bytestring-block of length 64, unpacks it to a list of integers and returns a list of 80 integers after some bit operations """ w = list(struct.unpack(">16L", block)) + [0] * 64 for i in range(16, 80): w[i] = self.rotate((w[i - 3] ^ w[i - 8] ^ w[i - 14] ^ w[i - 16]), 1) return w def final_hash(self): """ Calls all the other methods to process the input. Pads the data, then splits into blocks and then does a series of operations for each block (including expansion). For each block, the variable h that was initialized is copied to a,b,c,d,e and these 5 variables a,b,c,d,e undergo several changes. After all the blocks are processed, these 5 variables are pairwise added to h ie a to h[0], b to h[1] and so on. This h becomes our final hash which is returned. 
""" self.padded_data = self.padding() self.blocks = self.split_blocks() for block in self.blocks: expanded_block = self.expand_block(block) a, b, c, d, e = self.h for i in range(0, 80): if 0 <= i < 20: f = (b & c) | ((~b) & d) k = 0x5A827999 elif 20 <= i < 40: f = b ^ c ^ d k = 0x6ED9EBA1 elif 40 <= i < 60: f = (b & c) | (b & d) | (c & d) k = 0x8F1BBCDC elif 60 <= i < 80: f = b ^ c ^ d k = 0xCA62C1D6 a, b, c, d, e = ( self.rotate(a, 5) + f + e + k + expanded_block[i] & 0xFFFFFFFF, a, self.rotate(b, 30), c, d, ) self.h = ( self.h[0] + a & 0xFFFFFFFF, self.h[1] + b & 0xFFFFFFFF, self.h[2] + c & 0xFFFFFFFF, self.h[3] + d & 0xFFFFFFFF, self.h[4] + e & 0xFFFFFFFF, ) return ("{:08x}" * 5).format(*self.h) def test_sha1_hash(): msg = b"Test String" assert SHA1Hash(msg).final_hash() == hashlib.sha1(msg).hexdigest() # noqa: S324 def main(): """ Provides option 'string' or 'file' to take input and prints the calculated SHA1 hash. unittest.main() has been commented because we probably don't want to run the test each time. """ # unittest.main() parser = argparse.ArgumentParser(description="Process some strings or files") parser.add_argument( "--string", dest="input_string", default="Hello World!! Welcome to Cryptography", help="Hash the string", ) parser.add_argument("--file", dest="input_file", help="Hash contents of a file") args = parser.parse_args() input_string = args.input_string # In any case hash input should be a bytestring if args.input_file: with open(args.input_file, "rb") as f: hash_input = f.read() else: hash_input = bytes(input_string, "utf-8") print(SHA1Hash(hash_input).final_hash()) if __name__ == "__main__": main() import doctest doctest.testmod()
-1
TheAlgorithms/Python
8,732
Correct ruff failures
### Describe your change: Fixes #8723 * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
CaedenPH
"2023-05-14T18:14:30Z"
"2023-05-14T21:03:13Z"
793e564e1d4bd6e00b6e2f80869c5fd1fd2872b3
1faf10b5c2dff8cef3f5d59f60a126bd19bb1c44
Correct ruff failures. ### Describe your change: Fixes #8723 * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
""" Problem: Comparing two numbers written in index form like 2'11 and 3'7 is not difficult, as any calculator would confirm that 2^11 = 2048 < 3^7 = 2187. However, confirming that 632382^518061 > 519432^525806 would be much more difficult, as both numbers contain over three million digits. Using base_exp.txt, a 22K text file containing one thousand lines with a base/exponent pair on each line, determine which line number has the greatest numerical value. NOTE: The first two lines in the file represent the numbers in the example given above. """ import os from math import log10 def solution(data_file: str = "base_exp.txt") -> int: """ >>> solution() 709 """ largest: float = 0 result = 0 for i, line in enumerate(open(os.path.join(os.path.dirname(__file__), data_file))): a, x = list(map(int, line.split(","))) if x * log10(a) > largest: largest = x * log10(a) result = i + 1 return result if __name__ == "__main__": print(solution())
""" Problem: Comparing two numbers written in index form like 2'11 and 3'7 is not difficult, as any calculator would confirm that 2^11 = 2048 < 3^7 = 2187. However, confirming that 632382^518061 > 519432^525806 would be much more difficult, as both numbers contain over three million digits. Using base_exp.txt, a 22K text file containing one thousand lines with a base/exponent pair on each line, determine which line number has the greatest numerical value. NOTE: The first two lines in the file represent the numbers in the example given above. """ import os from math import log10 def solution(data_file: str = "base_exp.txt") -> int: """ >>> solution() 709 """ largest: float = 0 result = 0 for i, line in enumerate(open(os.path.join(os.path.dirname(__file__), data_file))): a, x = list(map(int, line.split(","))) if x * log10(a) > largest: largest = x * log10(a) result = i + 1 return result if __name__ == "__main__": print(solution())
-1
TheAlgorithms/Python
8,732
Correct ruff failures
### Describe your change: Fixes #8723 * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
CaedenPH
"2023-05-14T18:14:30Z"
"2023-05-14T21:03:13Z"
793e564e1d4bd6e00b6e2f80869c5fd1fd2872b3
1faf10b5c2dff8cef3f5d59f60a126bd19bb1c44
Correct ruff failures. ### Describe your change: Fixes #8723 * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
#!/bin/sh # # An example hook script to make use of push options. # The example simply echoes all push options that start with 'echoback=' # and rejects all pushes when the "reject" push option is used. # # To enable this hook, rename this file to "pre-receive". if test -n "$GIT_PUSH_OPTION_COUNT" then i=0 while test "$i" -lt "$GIT_PUSH_OPTION_COUNT" do eval "value=\$GIT_PUSH_OPTION_$i" case "$value" in echoback=*) echo "echo from the pre-receive-hook: ${value#*=}" >&2 ;; reject) exit 1 esac i=$((i + 1)) done fi
#!/bin/sh # # An example hook script to make use of push options. # The example simply echoes all push options that start with 'echoback=' # and rejects all pushes when the "reject" push option is used. # # To enable this hook, rename this file to "pre-receive". if test -n "$GIT_PUSH_OPTION_COUNT" then i=0 while test "$i" -lt "$GIT_PUSH_OPTION_COUNT" do eval "value=\$GIT_PUSH_OPTION_$i" case "$value" in echoback=*) echo "echo from the pre-receive-hook: ${value#*=}" >&2 ;; reject) exit 1 esac i=$((i + 1)) done fi
-1
TheAlgorithms/Python
8,732
Correct ruff failures
### Describe your change: Fixes #8723 * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
CaedenPH
"2023-05-14T18:14:30Z"
"2023-05-14T21:03:13Z"
793e564e1d4bd6e00b6e2f80869c5fd1fd2872b3
1faf10b5c2dff8cef3f5d59f60a126bd19bb1c44
Correct ruff failures. ### Describe your change: Fixes #8723 * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
""" Minimalist file that allows pytest to find and run the Test unittest. For details, see: https://doc.pytest.org/en/latest/goodpractices.html#conventions-for-python-test-discovery """ from .prime_check import Test Test()
""" Minimalist file that allows pytest to find and run the Test unittest. For details, see: https://doc.pytest.org/en/latest/goodpractices.html#conventions-for-python-test-discovery """ from .prime_check import Test Test()
-1
TheAlgorithms/Python
8,732
Correct ruff failures
### Describe your change: Fixes #8723 * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
CaedenPH
"2023-05-14T18:14:30Z"
"2023-05-14T21:03:13Z"
793e564e1d4bd6e00b6e2f80869c5fd1fd2872b3
1faf10b5c2dff8cef3f5d59f60a126bd19bb1c44
Correct ruff failures. ### Describe your change: Fixes #8723 * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
from PIL import Image """ Mean thresholding algorithm for image processing https://en.wikipedia.org/wiki/Thresholding_(image_processing) """ def mean_threshold(image: Image) -> Image: """ image: is a grayscale PIL image object """ height, width = image.size mean = 0 pixels = image.load() for i in range(width): for j in range(height): pixel = pixels[j, i] mean += pixel mean //= width * height for j in range(width): for i in range(height): pixels[i, j] = 255 if pixels[i, j] > mean else 0 return image if __name__ == "__main__": image = mean_threshold(Image.open("path_to_image").convert("L")) image.save("output_image_path")
from PIL import Image """ Mean thresholding algorithm for image processing https://en.wikipedia.org/wiki/Thresholding_(image_processing) """ def mean_threshold(image: Image) -> Image: """ image: is a grayscale PIL image object """ height, width = image.size mean = 0 pixels = image.load() for i in range(width): for j in range(height): pixel = pixels[j, i] mean += pixel mean //= width * height for j in range(width): for i in range(height): pixels[i, j] = 255 if pixels[i, j] > mean else 0 return image if __name__ == "__main__": image = mean_threshold(Image.open("path_to_image").convert("L")) image.save("output_image_path")
-1
TheAlgorithms/Python
8,732
Correct ruff failures
### Describe your change: Fixes #8723 * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
CaedenPH
"2023-05-14T18:14:30Z"
"2023-05-14T21:03:13Z"
793e564e1d4bd6e00b6e2f80869c5fd1fd2872b3
1faf10b5c2dff8cef3f5d59f60a126bd19bb1c44
Correct ruff failures. ### Describe your change: Fixes #8723 * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
B64_CHARSET = "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/" def base64_encode(data: bytes) -> bytes: """Encodes data according to RFC4648. The data is first transformed to binary and appended with binary digits so that its length becomes a multiple of 6, then each 6 binary digits will match a character in the B64_CHARSET string. The number of appended binary digits would later determine how many "=" signs should be added, the padding. For every 2 binary digits added, a "=" sign is added in the output. We can add any binary digits to make it a multiple of 6, for instance, consider the following example: "AA" -> 0010100100101001 -> 001010 010010 1001 As can be seen above, 2 more binary digits should be added, so there's 4 possibilities here: 00, 01, 10 or 11. That being said, Base64 encoding can be used in Steganography to hide data in these appended digits. >>> from base64 import b64encode >>> a = b"This pull request is part of Hacktoberfest20!" >>> b = b"https://tools.ietf.org/html/rfc4648" >>> c = b"A" >>> base64_encode(a) == b64encode(a) True >>> base64_encode(b) == b64encode(b) True >>> base64_encode(c) == b64encode(c) True >>> base64_encode("abc") Traceback (most recent call last): ... TypeError: a bytes-like object is required, not 'str' """ # Make sure the supplied data is a bytes-like object if not isinstance(data, bytes): raise TypeError( f"a bytes-like object is required, not '{data.__class__.__name__}'" ) binary_stream = "".join(bin(byte)[2:].zfill(8) for byte in data) padding_needed = len(binary_stream) % 6 != 0 if padding_needed: # The padding that will be added later padding = b"=" * ((6 - len(binary_stream) % 6) // 2) # Append binary_stream with arbitrary binary digits (0's by default) to make its # length a multiple of 6. binary_stream += "0" * (6 - len(binary_stream) % 6) else: padding = b"" # Encode every 6 binary digits to their corresponding Base64 character return ( "".join( B64_CHARSET[int(binary_stream[index : index + 6], 2)] for index in range(0, len(binary_stream), 6) ).encode() + padding ) def base64_decode(encoded_data: str) -> bytes: """Decodes data according to RFC4648. This does the reverse operation of base64_encode. We first transform the encoded data back to a binary stream, take off the previously appended binary digits according to the padding, at this point we would have a binary stream whose length is multiple of 8, the last step is to convert every 8 bits to a byte. >>> from base64 import b64decode >>> a = "VGhpcyBwdWxsIHJlcXVlc3QgaXMgcGFydCBvZiBIYWNrdG9iZXJmZXN0MjAh" >>> b = "aHR0cHM6Ly90b29scy5pZXRmLm9yZy9odG1sL3JmYzQ2NDg=" >>> c = "QQ==" >>> base64_decode(a) == b64decode(a) True >>> base64_decode(b) == b64decode(b) True >>> base64_decode(c) == b64decode(c) True >>> base64_decode("abc") Traceback (most recent call last): ... 
AssertionError: Incorrect padding """ # Make sure encoded_data is either a string or a bytes-like object if not isinstance(encoded_data, bytes) and not isinstance(encoded_data, str): raise TypeError( "argument should be a bytes-like object or ASCII string, not " f"'{encoded_data.__class__.__name__}'" ) # In case encoded_data is a bytes-like object, make sure it contains only # ASCII characters so we convert it to a string object if isinstance(encoded_data, bytes): try: encoded_data = encoded_data.decode("utf-8") except UnicodeDecodeError: raise ValueError("base64 encoded data should only contain ASCII characters") padding = encoded_data.count("=") # Check if the encoded string contains non base64 characters if padding: assert all( char in B64_CHARSET for char in encoded_data[:-padding] ), "Invalid base64 character(s) found." else: assert all( char in B64_CHARSET for char in encoded_data ), "Invalid base64 character(s) found." # Check the padding assert len(encoded_data) % 4 == 0 and padding < 3, "Incorrect padding" if padding: # Remove padding if there is one encoded_data = encoded_data[:-padding] binary_stream = "".join( bin(B64_CHARSET.index(char))[2:].zfill(6) for char in encoded_data )[: -padding * 2] else: binary_stream = "".join( bin(B64_CHARSET.index(char))[2:].zfill(6) for char in encoded_data ) data = [ int(binary_stream[index : index + 8], 2) for index in range(0, len(binary_stream), 8) ] return bytes(data) if __name__ == "__main__": import doctest doctest.testmod()
B64_CHARSET = "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/" def base64_encode(data: bytes) -> bytes: """Encodes data according to RFC4648. The data is first transformed to binary and appended with binary digits so that its length becomes a multiple of 6, then each 6 binary digits will match a character in the B64_CHARSET string. The number of appended binary digits would later determine how many "=" signs should be added, the padding. For every 2 binary digits added, a "=" sign is added in the output. We can add any binary digits to make it a multiple of 6, for instance, consider the following example: "AA" -> 0010100100101001 -> 001010 010010 1001 As can be seen above, 2 more binary digits should be added, so there's 4 possibilities here: 00, 01, 10 or 11. That being said, Base64 encoding can be used in Steganography to hide data in these appended digits. >>> from base64 import b64encode >>> a = b"This pull request is part of Hacktoberfest20!" >>> b = b"https://tools.ietf.org/html/rfc4648" >>> c = b"A" >>> base64_encode(a) == b64encode(a) True >>> base64_encode(b) == b64encode(b) True >>> base64_encode(c) == b64encode(c) True >>> base64_encode("abc") Traceback (most recent call last): ... TypeError: a bytes-like object is required, not 'str' """ # Make sure the supplied data is a bytes-like object if not isinstance(data, bytes): raise TypeError( f"a bytes-like object is required, not '{data.__class__.__name__}'" ) binary_stream = "".join(bin(byte)[2:].zfill(8) for byte in data) padding_needed = len(binary_stream) % 6 != 0 if padding_needed: # The padding that will be added later padding = b"=" * ((6 - len(binary_stream) % 6) // 2) # Append binary_stream with arbitrary binary digits (0's by default) to make its # length a multiple of 6. binary_stream += "0" * (6 - len(binary_stream) % 6) else: padding = b"" # Encode every 6 binary digits to their corresponding Base64 character return ( "".join( B64_CHARSET[int(binary_stream[index : index + 6], 2)] for index in range(0, len(binary_stream), 6) ).encode() + padding ) def base64_decode(encoded_data: str) -> bytes: """Decodes data according to RFC4648. This does the reverse operation of base64_encode. We first transform the encoded data back to a binary stream, take off the previously appended binary digits according to the padding, at this point we would have a binary stream whose length is multiple of 8, the last step is to convert every 8 bits to a byte. >>> from base64 import b64decode >>> a = "VGhpcyBwdWxsIHJlcXVlc3QgaXMgcGFydCBvZiBIYWNrdG9iZXJmZXN0MjAh" >>> b = "aHR0cHM6Ly90b29scy5pZXRmLm9yZy9odG1sL3JmYzQ2NDg=" >>> c = "QQ==" >>> base64_decode(a) == b64decode(a) True >>> base64_decode(b) == b64decode(b) True >>> base64_decode(c) == b64decode(c) True >>> base64_decode("abc") Traceback (most recent call last): ... 
AssertionError: Incorrect padding """ # Make sure encoded_data is either a string or a bytes-like object if not isinstance(encoded_data, bytes) and not isinstance(encoded_data, str): raise TypeError( "argument should be a bytes-like object or ASCII string, not " f"'{encoded_data.__class__.__name__}'" ) # In case encoded_data is a bytes-like object, make sure it contains only # ASCII characters so we convert it to a string object if isinstance(encoded_data, bytes): try: encoded_data = encoded_data.decode("utf-8") except UnicodeDecodeError: raise ValueError("base64 encoded data should only contain ASCII characters") padding = encoded_data.count("=") # Check if the encoded string contains non base64 characters if padding: assert all( char in B64_CHARSET for char in encoded_data[:-padding] ), "Invalid base64 character(s) found." else: assert all( char in B64_CHARSET for char in encoded_data ), "Invalid base64 character(s) found." # Check the padding assert len(encoded_data) % 4 == 0 and padding < 3, "Incorrect padding" if padding: # Remove padding if there is one encoded_data = encoded_data[:-padding] binary_stream = "".join( bin(B64_CHARSET.index(char))[2:].zfill(6) for char in encoded_data )[: -padding * 2] else: binary_stream = "".join( bin(B64_CHARSET.index(char))[2:].zfill(6) for char in encoded_data ) data = [ int(binary_stream[index : index + 8], 2) for index in range(0, len(binary_stream), 8) ] return bytes(data) if __name__ == "__main__": import doctest doctest.testmod()
-1
TheAlgorithms/Python
8,732
Correct ruff failures
### Describe your change: Fixes #8723 * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
CaedenPH
"2023-05-14T18:14:30Z"
"2023-05-14T21:03:13Z"
793e564e1d4bd6e00b6e2f80869c5fd1fd2872b3
1faf10b5c2dff8cef3f5d59f60a126bd19bb1c44
Correct ruff failures. ### Describe your change: Fixes #8723 * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
""" == Juggler Sequence == Juggler sequence start with any positive integer n. The next term is obtained as follows: If n term is even, the next term is floor value of square root of n . If n is odd, the next term is floor value of 3 time the square root of n. https://en.wikipedia.org/wiki/Juggler_sequence """ # Author : Akshay Dubey (https://github.com/itsAkshayDubey) import math def juggler_sequence(number: int) -> list[int]: """ >>> juggler_sequence(0) Traceback (most recent call last): ... ValueError: Input value of [number=0] must be a positive integer >>> juggler_sequence(1) [1] >>> juggler_sequence(2) [2, 1] >>> juggler_sequence(3) [3, 5, 11, 36, 6, 2, 1] >>> juggler_sequence(5) [5, 11, 36, 6, 2, 1] >>> juggler_sequence(10) [10, 3, 5, 11, 36, 6, 2, 1] >>> juggler_sequence(25) [25, 125, 1397, 52214, 228, 15, 58, 7, 18, 4, 2, 1] >>> juggler_sequence(6.0) Traceback (most recent call last): ... TypeError: Input value of [number=6.0] must be an integer >>> juggler_sequence(-1) Traceback (most recent call last): ... ValueError: Input value of [number=-1] must be a positive integer """ if not isinstance(number, int): raise TypeError(f"Input value of [number={number}] must be an integer") if number < 1: raise ValueError(f"Input value of [number={number}] must be a positive integer") sequence = [number] while number != 1: if number % 2 == 0: number = math.floor(math.sqrt(number)) else: number = math.floor( math.sqrt(number) * math.sqrt(number) * math.sqrt(number) ) sequence.append(number) return sequence if __name__ == "__main__": import doctest doctest.testmod()
""" == Juggler Sequence == Juggler sequence start with any positive integer n. The next term is obtained as follows: If n term is even, the next term is floor value of square root of n . If n is odd, the next term is floor value of 3 time the square root of n. https://en.wikipedia.org/wiki/Juggler_sequence """ # Author : Akshay Dubey (https://github.com/itsAkshayDubey) import math def juggler_sequence(number: int) -> list[int]: """ >>> juggler_sequence(0) Traceback (most recent call last): ... ValueError: Input value of [number=0] must be a positive integer >>> juggler_sequence(1) [1] >>> juggler_sequence(2) [2, 1] >>> juggler_sequence(3) [3, 5, 11, 36, 6, 2, 1] >>> juggler_sequence(5) [5, 11, 36, 6, 2, 1] >>> juggler_sequence(10) [10, 3, 5, 11, 36, 6, 2, 1] >>> juggler_sequence(25) [25, 125, 1397, 52214, 228, 15, 58, 7, 18, 4, 2, 1] >>> juggler_sequence(6.0) Traceback (most recent call last): ... TypeError: Input value of [number=6.0] must be an integer >>> juggler_sequence(-1) Traceback (most recent call last): ... ValueError: Input value of [number=-1] must be a positive integer """ if not isinstance(number, int): raise TypeError(f"Input value of [number={number}] must be an integer") if number < 1: raise ValueError(f"Input value of [number={number}] must be a positive integer") sequence = [number] while number != 1: if number % 2 == 0: number = math.floor(math.sqrt(number)) else: number = math.floor( math.sqrt(number) * math.sqrt(number) * math.sqrt(number) ) sequence.append(number) return sequence if __name__ == "__main__": import doctest doctest.testmod()
-1
TheAlgorithms/Python
8,732
Correct ruff failures
### Describe your change: Fixes #8723 * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
CaedenPH
"2023-05-14T18:14:30Z"
"2023-05-14T21:03:13Z"
793e564e1d4bd6e00b6e2f80869c5fd1fd2872b3
1faf10b5c2dff8cef3f5d59f60a126bd19bb1c44
Correct ruff failures. ### Describe your change: Fixes #8723 * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
""" Three distinct points are plotted at random on a Cartesian plane, for which -1000 ≤ x, y ≤ 1000, such that a triangle is formed. Consider the following two triangles: A(-340,495), B(-153,-910), C(835,-947) X(-175,41), Y(-421,-714), Z(574,-645) It can be verified that triangle ABC contains the origin, whereas triangle XYZ does not. Using triangles.txt (right click and 'Save Link/Target As...'), a 27K text file containing the coordinates of one thousand "random" triangles, find the number of triangles for which the interior contains the origin. NOTE: The first two examples in the file represent the triangles in the example given above. """ from __future__ import annotations from pathlib import Path def vector_product(point1: tuple[int, int], point2: tuple[int, int]) -> int: """ Return the 2-d vector product of two vectors. >>> vector_product((1, 2), (-5, 0)) 10 >>> vector_product((3, 1), (6, 10)) 24 """ return point1[0] * point2[1] - point1[1] * point2[0] def contains_origin(x1: int, y1: int, x2: int, y2: int, x3: int, y3: int) -> bool: """ Check if the triangle given by the points A(x1, y1), B(x2, y2), C(x3, y3) contains the origin. >>> contains_origin(-340, 495, -153, -910, 835, -947) True >>> contains_origin(-175, 41, -421, -714, 574, -645) False """ point_a: tuple[int, int] = (x1, y1) point_a_to_b: tuple[int, int] = (x2 - x1, y2 - y1) point_a_to_c: tuple[int, int] = (x3 - x1, y3 - y1) a: float = -vector_product(point_a, point_a_to_b) / vector_product( point_a_to_c, point_a_to_b ) b: float = +vector_product(point_a, point_a_to_c) / vector_product( point_a_to_c, point_a_to_b ) return a > 0 and b > 0 and a + b < 1 def solution(filename: str = "p102_triangles.txt") -> int: """ Find the number of triangles whose interior contains the origin. >>> solution("test_triangles.txt") 1 """ data: str = Path(__file__).parent.joinpath(filename).read_text(encoding="utf-8") triangles: list[list[int]] = [] for line in data.strip().split("\n"): triangles.append([int(number) for number in line.split(",")]) ret: int = 0 triangle: list[int] for triangle in triangles: ret += contains_origin(*triangle) return ret if __name__ == "__main__": print(f"{solution() = }")
""" Three distinct points are plotted at random on a Cartesian plane, for which -1000 ≤ x, y ≤ 1000, such that a triangle is formed. Consider the following two triangles: A(-340,495), B(-153,-910), C(835,-947) X(-175,41), Y(-421,-714), Z(574,-645) It can be verified that triangle ABC contains the origin, whereas triangle XYZ does not. Using triangles.txt (right click and 'Save Link/Target As...'), a 27K text file containing the coordinates of one thousand "random" triangles, find the number of triangles for which the interior contains the origin. NOTE: The first two examples in the file represent the triangles in the example given above. """ from __future__ import annotations from pathlib import Path def vector_product(point1: tuple[int, int], point2: tuple[int, int]) -> int: """ Return the 2-d vector product of two vectors. >>> vector_product((1, 2), (-5, 0)) 10 >>> vector_product((3, 1), (6, 10)) 24 """ return point1[0] * point2[1] - point1[1] * point2[0] def contains_origin(x1: int, y1: int, x2: int, y2: int, x3: int, y3: int) -> bool: """ Check if the triangle given by the points A(x1, y1), B(x2, y2), C(x3, y3) contains the origin. >>> contains_origin(-340, 495, -153, -910, 835, -947) True >>> contains_origin(-175, 41, -421, -714, 574, -645) False """ point_a: tuple[int, int] = (x1, y1) point_a_to_b: tuple[int, int] = (x2 - x1, y2 - y1) point_a_to_c: tuple[int, int] = (x3 - x1, y3 - y1) a: float = -vector_product(point_a, point_a_to_b) / vector_product( point_a_to_c, point_a_to_b ) b: float = +vector_product(point_a, point_a_to_c) / vector_product( point_a_to_c, point_a_to_b ) return a > 0 and b > 0 and a + b < 1 def solution(filename: str = "p102_triangles.txt") -> int: """ Find the number of triangles whose interior contains the origin. >>> solution("test_triangles.txt") 1 """ data: str = Path(__file__).parent.joinpath(filename).read_text(encoding="utf-8") triangles: list[list[int]] = [] for line in data.strip().split("\n"): triangles.append([int(number) for number in line.split(",")]) ret: int = 0 triangle: list[int] for triangle in triangles: ret += contains_origin(*triangle) return ret if __name__ == "__main__": print(f"{solution() = }")
-1
TheAlgorithms/Python
8,732
Correct ruff failures
### Describe your change: Fixes #8723 * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
CaedenPH
"2023-05-14T18:14:30Z"
"2023-05-14T21:03:13Z"
793e564e1d4bd6e00b6e2f80869c5fd1fd2872b3
1faf10b5c2dff8cef3f5d59f60a126bd19bb1c44
Correct ruff failures. ### Describe your change: Fixes #8723 * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
4445,2697,5115,718,2209,2212,654,4348,3079,6821,7668,3276,8874,4190,3785,2752,9473,7817,9137,496,7338,3434,7152,4355,4552,7917,7827,2460,2350,691,3514,5880,3145,7633,7199,3783,5066,7487,3285,1084,8985,760,872,8609,8051,1134,9536,5750,9716,9371,7619,5617,275,9721,2997,2698,1887,8825,6372,3014,2113,7122,7050,6775,5948,2758,1219,3539,348,7989,2735,9862,1263,8089,6401,9462,3168,2758,3748,5870 1096,20,1318,7586,5167,2642,1443,5741,7621,7030,5526,4244,2348,4641,9827,2448,6918,5883,3737,300,7116,6531,567,5997,3971,6623,820,6148,3287,1874,7981,8424,7672,7575,6797,6717,1078,5008,4051,8795,5820,346,1851,6463,2117,6058,3407,8211,117,4822,1317,4377,4434,5925,8341,4800,1175,4173,690,8978,7470,1295,3799,8724,3509,9849,618,3320,7068,9633,2384,7175,544,6583,1908,9983,481,4187,9353,9377 9607,7385,521,6084,1364,8983,7623,1585,6935,8551,2574,8267,4781,3834,2764,2084,2669,4656,9343,7709,2203,9328,8004,6192,5856,3555,2260,5118,6504,1839,9227,1259,9451,1388,7909,5733,6968,8519,9973,1663,5315,7571,3035,4325,4283,2304,6438,3815,9213,9806,9536,196,5542,6907,2475,1159,5820,9075,9470,2179,9248,1828,4592,9167,3713,4640,47,3637,309,7344,6955,346,378,9044,8635,7466,5036,9515,6385,9230 7206,3114,7760,1094,6150,5182,7358,7387,4497,955,101,1478,7777,6966,7010,8417,6453,4955,3496,107,449,8271,131,2948,6185,784,5937,8001,6104,8282,4165,3642,710,2390,575,715,3089,6964,4217,192,5949,7006,715,3328,1152,66,8044,4319,1735,146,4818,5456,6451,4113,1063,4781,6799,602,1504,6245,6550,1417,1343,2363,3785,5448,4545,9371,5420,5068,4613,4882,4241,5043,7873,8042,8434,3939,9256,2187 3620,8024,577,9997,7377,7682,1314,1158,6282,6310,1896,2509,5436,1732,9480,706,496,101,6232,7375,2207,2306,110,6772,3433,2878,8140,5933,8688,1399,2210,7332,6172,6403,7333,4044,2291,1790,2446,7390,8698,5723,3678,7104,1825,2040,140,3982,4905,4160,2200,5041,2512,1488,2268,1175,7588,8321,8078,7312,977,5257,8465,5068,3453,3096,1651,7906,253,9250,6021,8791,8109,6651,3412,345,4778,5152,4883,7505 1074,5438,9008,2679,5397,5429,2652,3403,770,9188,4248,2493,4361,8327,9587,707,9525,5913,93,1899,328,2876,3604,673,8576,6908,7659,2544,3359,3883,5273,6587,3065,1749,3223,604,9925,6941,2823,8767,7039,3290,3214,1787,7904,3421,7137,9560,8451,2669,9219,6332,1576,5477,6755,8348,4164,4307,2984,4012,6629,1044,2874,6541,4942,903,1404,9125,5160,8836,4345,2581,460,8438,1538,5507,668,3352,2678,6942 4295,1176,5596,1521,3061,9868,7037,7129,8933,6659,5947,5063,3653,9447,9245,2679,767,714,116,8558,163,3927,8779,158,5093,2447,5782,3967,1716,931,7772,8164,1117,9244,5783,7776,3846,8862,6014,2330,6947,1777,3112,6008,3491,1906,5952,314,4602,8994,5919,9214,3995,5026,7688,6809,5003,3128,2509,7477,110,8971,3982,8539,2980,4689,6343,5411,2992,5270,5247,9260,2269,7474,1042,7162,5206,1232,4556,4757 510,3556,5377,1406,5721,4946,2635,7847,4251,8293,8281,6351,4912,287,2870,3380,3948,5322,3840,4738,9563,1906,6298,3234,8959,1562,6297,8835,7861,239,6618,1322,2553,2213,5053,5446,4402,6500,5182,8585,6900,5756,9661,903,5186,7687,5998,7997,8081,8955,4835,6069,2621,1581,732,9564,1082,1853,5442,1342,520,1737,3703,5321,4793,2776,1508,1647,9101,2499,6891,4336,7012,3329,3212,1442,9993,3988,4930,7706 9444,3401,5891,9716,1228,7107,109,3563,2700,6161,5039,4992,2242,8541,7372,2067,1294,3058,1306,320,8881,5756,9326,411,8650,8824,5495,8282,8397,2000,1228,7817,2099,6473,3571,5994,4447,1299,5991,543,7874,2297,1651,101,2093,3463,9189,6872,6118,872,1008,1779,2805,9084,4048,2123,5877,55,3075,1737,9459,4535,6453,3644,108,5982,4437,5213,1340,6967,9943,5815,669,8074,1838,6979,9132,9315,715,5048 
3327,4030,7177,6336,9933,5296,2621,4785,2755,4832,2512,2118,2244,4407,2170,499,7532,9742,5051,7687,970,6924,3527,4694,5145,1306,2165,5940,2425,8910,3513,1909,6983,346,6377,4304,9330,7203,6605,3709,3346,970,369,9737,5811,4427,9939,3693,8436,5566,1977,3728,2399,3985,8303,2492,5366,9802,9193,7296,1033,5060,9144,2766,1151,7629,5169,5995,58,7619,7565,4208,1713,6279,3209,4908,9224,7409,1325,8540 6882,1265,1775,3648,4690,959,5837,4520,5394,1378,9485,1360,4018,578,9174,2932,9890,3696,116,1723,1178,9355,7063,1594,1918,8574,7594,7942,1547,6166,7888,354,6932,4651,1010,7759,6905,661,7689,6092,9292,3845,9605,8443,443,8275,5163,7720,7265,6356,7779,1798,1754,5225,6661,1180,8024,5666,88,9153,1840,3508,1193,4445,2648,3538,6243,6375,8107,5902,5423,2520,1122,5015,6113,8859,9370,966,8673,2442 7338,3423,4723,6533,848,8041,7921,8277,4094,5368,7252,8852,9166,2250,2801,6125,8093,5738,4038,9808,7359,9494,601,9116,4946,2702,5573,2921,9862,1462,1269,2410,4171,2709,7508,6241,7522,615,2407,8200,4189,5492,5649,7353,2590,5203,4274,710,7329,9063,956,8371,3722,4253,4785,1194,4828,4717,4548,940,983,2575,4511,2938,1827,2027,2700,1236,841,5760,1680,6260,2373,3851,1841,4968,1172,5179,7175,3509 4420,1327,3560,2376,6260,2988,9537,4064,4829,8872,9598,3228,1792,7118,9962,9336,4368,9189,6857,1829,9863,6287,7303,7769,2707,8257,2391,2009,3975,4993,3068,9835,3427,341,8412,2134,4034,8511,6421,3041,9012,2983,7289,100,1355,7904,9186,6920,5856,2008,6545,8331,3655,5011,839,8041,9255,6524,3862,8788,62,7455,3513,5003,8413,3918,2076,7960,6108,3638,6999,3436,1441,4858,4181,1866,8731,7745,3744,1000 356,8296,8325,1058,1277,4743,3850,2388,6079,6462,2815,5620,8495,5378,75,4324,3441,9870,1113,165,1544,1179,2834,562,6176,2313,6836,8839,2986,9454,5199,6888,1927,5866,8760,320,1792,8296,7898,6121,7241,5886,5814,2815,8336,1576,4314,3109,2572,6011,2086,9061,9403,3947,5487,9731,7281,3159,1819,1334,3181,5844,5114,9898,4634,2531,4412,6430,4262,8482,4546,4555,6804,2607,9421,686,8649,8860,7794,6672 9870,152,1558,4963,8750,4754,6521,6256,8818,5208,5691,9659,8377,9725,5050,5343,2539,6101,1844,9700,7750,8114,5357,3001,8830,4438,199,9545,8496,43,2078,327,9397,106,6090,8181,8646,6414,7499,5450,4850,6273,5014,4131,7639,3913,6571,8534,9703,4391,7618,445,1320,5,1894,6771,7383,9191,4708,9706,6939,7937,8726,9382,5216,3685,2247,9029,8154,1738,9984,2626,9438,4167,6351,5060,29,1218,1239,4785 192,5213,8297,8974,4032,6966,5717,1179,6523,4679,9513,1481,3041,5355,9303,9154,1389,8702,6589,7818,6336,3539,5538,3094,6646,6702,6266,2759,4608,4452,617,9406,8064,6379,444,5602,4950,1810,8391,1536,316,8714,1178,5182,5863,5110,5372,4954,1978,2971,5680,4863,2255,4630,5723,2168,538,1692,1319,7540,440,6430,6266,7712,7385,5702,620,641,3136,7350,1478,3155,2820,9109,6261,1122,4470,14,8493,2095 1046,4301,6082,474,4974,7822,2102,5161,5172,6946,8074,9716,6586,9962,9749,5015,2217,995,5388,4402,7652,6399,6539,1349,8101,3677,1328,9612,7922,2879,231,5887,2655,508,4357,4964,3554,5930,6236,7384,4614,280,3093,9600,2110,7863,2631,6626,6620,68,1311,7198,7561,1768,5139,1431,221,230,2940,968,5283,6517,2146,1646,869,9402,7068,8645,7058,1765,9690,4152,2926,9504,2939,7504,6074,2944,6470,7859 4659,736,4951,9344,1927,6271,8837,8711,3241,6579,7660,5499,5616,3743,5801,4682,9748,8796,779,1833,4549,8138,4026,775,4170,2432,4174,3741,7540,8017,2833,4027,396,811,2871,1150,9809,2719,9199,8504,1224,540,2051,3519,7982,7367,2761,308,3358,6505,2050,4836,5090,7864,805,2566,2409,6876,3361,8622,5572,5895,3280,441,7893,8105,1634,2929,274,3926,7786,6123,8233,9921,2674,5340,1445,203,4585,3837 
5759,338,7444,7968,7742,3755,1591,4839,1705,650,7061,2461,9230,9391,9373,2413,1213,431,7801,4994,2380,2703,6161,6878,8331,2538,6093,1275,5065,5062,2839,582,1014,8109,3525,1544,1569,8622,7944,2905,6120,1564,1839,5570,7579,1318,2677,5257,4418,5601,7935,7656,5192,1864,5886,6083,5580,6202,8869,1636,7907,4759,9082,5854,3185,7631,6854,5872,5632,5280,1431,2077,9717,7431,4256,8261,9680,4487,4752,4286 1571,1428,8599,1230,7772,4221,8523,9049,4042,8726,7567,6736,9033,2104,4879,4967,6334,6716,3994,1269,8995,6539,3610,7667,6560,6065,874,848,4597,1711,7161,4811,6734,5723,6356,6026,9183,2586,5636,1092,7779,7923,8747,6887,7505,9909,1792,3233,4526,3176,1508,8043,720,5212,6046,4988,709,5277,8256,3642,1391,5803,1468,2145,3970,6301,7767,2359,8487,9771,8785,7520,856,1605,8972,2402,2386,991,1383,5963 1822,4824,5957,6511,9868,4113,301,9353,6228,2881,2966,6956,9124,9574,9233,1601,7340,973,9396,540,4747,8590,9535,3650,7333,7583,4806,3593,2738,8157,5215,8472,2284,9473,3906,6982,5505,6053,7936,6074,7179,6688,1564,1103,6860,5839,2022,8490,910,7551,7805,881,7024,1855,9448,4790,1274,3672,2810,774,7623,4223,4850,6071,9975,4935,1915,9771,6690,3846,517,463,7624,4511,614,6394,3661,7409,1395,8127 8738,3850,9555,3695,4383,2378,87,6256,6740,7682,9546,4255,6105,2000,1851,4073,8957,9022,6547,5189,2487,303,9602,7833,1628,4163,6678,3144,8589,7096,8913,5823,4890,7679,1212,9294,5884,2972,3012,3359,7794,7428,1579,4350,7246,4301,7779,7790,3294,9547,4367,3549,1958,8237,6758,3497,3250,3456,6318,1663,708,7714,6143,6890,3428,6853,9334,7992,591,6449,9786,1412,8500,722,5468,1371,108,3939,4199,2535 7047,4323,1934,5163,4166,461,3544,2767,6554,203,6098,2265,9078,2075,4644,6641,8412,9183,487,101,7566,5622,1975,5726,2920,5374,7779,5631,3753,3725,2672,3621,4280,1162,5812,345,8173,9785,1525,955,5603,2215,2580,5261,2765,2990,5979,389,3907,2484,1232,5933,5871,3304,1138,1616,5114,9199,5072,7442,7245,6472,4760,6359,9053,7876,2564,9404,3043,9026,2261,3374,4460,7306,2326,966,828,3274,1712,3446 3975,4565,8131,5800,4570,2306,8838,4392,9147,11,3911,7118,9645,4994,2028,6062,5431,2279,8752,2658,7836,994,7316,5336,7185,3289,1898,9689,2331,5737,3403,1124,2679,3241,7748,16,2724,5441,6640,9368,9081,5618,858,4969,17,2103,6035,8043,7475,2181,939,415,1617,8500,8253,2155,7843,7974,7859,1746,6336,3193,2617,8736,4079,6324,6645,8891,9396,5522,6103,1857,8979,3835,2475,1310,7422,610,8345,7615 9248,5397,5686,2988,3446,4359,6634,9141,497,9176,6773,7448,1907,8454,916,1596,2241,1626,1384,2741,3649,5362,8791,7170,2903,2475,5325,6451,924,3328,522,90,4813,9737,9557,691,2388,1383,4021,1609,9206,4707,5200,7107,8104,4333,9860,5013,1224,6959,8527,1877,4545,7772,6268,621,4915,9349,5970,706,9583,3071,4127,780,8231,3017,9114,3836,7503,2383,1977,4870,8035,2379,9704,1037,3992,3642,1016,4303 5093,138,4639,6609,1146,5565,95,7521,9077,2272,974,4388,2465,2650,722,4998,3567,3047,921,2736,7855,173,2065,4238,1048,5,6847,9548,8632,9194,5942,4777,7910,8971,6279,7253,2516,1555,1833,3184,9453,9053,6897,7808,8629,4877,1871,8055,4881,7639,1537,7701,2508,7564,5845,5023,2304,5396,3193,2955,1088,3801,6203,1748,3737,1276,13,4120,7715,8552,3047,2921,106,7508,304,1280,7140,2567,9135,5266 6237,4607,7527,9047,522,7371,4883,2540,5867,6366,5301,1570,421,276,3361,527,6637,4861,2401,7522,5808,9371,5298,2045,5096,5447,7755,5115,7060,8529,4078,1943,1697,1764,5453,7085,960,2405,739,2100,5800,728,9737,5704,5693,1431,8979,6428,673,7540,6,7773,5857,6823,150,5869,8486,684,5816,9626,7451,5579,8260,3397,5322,6920,1879,2127,2884,5478,4977,9016,6165,6292,3062,5671,5968,78,4619,4763 
9905,7127,9390,5185,6923,3721,9164,9705,4341,1031,1046,5127,7376,6528,3248,4941,1178,7889,3364,4486,5358,9402,9158,8600,1025,874,1839,1783,309,9030,1843,845,8398,1433,7118,70,8071,2877,3904,8866,6722,4299,10,1929,5897,4188,600,1889,3325,2485,6473,4474,7444,6992,4846,6166,4441,2283,2629,4352,7775,1101,2214,9985,215,8270,9750,2740,8361,7103,5930,8664,9690,8302,9267,344,2077,1372,1880,9550 5825,8517,7769,2405,8204,1060,3603,7025,478,8334,1997,3692,7433,9101,7294,7498,9415,5452,3850,3508,6857,9213,6807,4412,7310,854,5384,686,4978,892,8651,3241,2743,3801,3813,8588,6701,4416,6990,6490,3197,6838,6503,114,8343,5844,8646,8694,65,791,5979,2687,2621,2019,8097,1423,3644,9764,4921,3266,3662,5561,2476,8271,8138,6147,1168,3340,1998,9874,6572,9873,6659,5609,2711,3931,9567,4143,7833,8887 6223,2099,2700,589,4716,8333,1362,5007,2753,2848,4441,8397,7192,8191,4916,9955,6076,3370,6396,6971,3156,248,3911,2488,4930,2458,7183,5455,170,6809,6417,3390,1956,7188,577,7526,2203,968,8164,479,8699,7915,507,6393,4632,1597,7534,3604,618,3280,6061,9793,9238,8347,568,9645,2070,5198,6482,5000,9212,6655,5961,7513,1323,3872,6170,3812,4146,2736,67,3151,5548,2781,9679,7564,5043,8587,1893,4531 5826,3690,6724,2121,9308,6986,8106,6659,2142,1642,7170,2877,5757,6494,8026,6571,8387,9961,6043,9758,9607,6450,8631,8334,7359,5256,8523,2225,7487,1977,9555,8048,5763,2414,4948,4265,2427,8978,8088,8841,9208,9601,5810,9398,8866,9138,4176,5875,7212,3272,6759,5678,7649,4922,5422,1343,8197,3154,3600,687,1028,4579,2084,9467,4492,7262,7296,6538,7657,7134,2077,1505,7332,6890,8964,4879,7603,7400,5973,739 1861,1613,4879,1884,7334,966,2000,7489,2123,4287,1472,3263,4726,9203,1040,4103,6075,6049,330,9253,4062,4268,1635,9960,577,1320,3195,9628,1030,4092,4979,6474,6393,2799,6967,8687,7724,7392,9927,2085,3200,6466,8702,265,7646,8665,7986,7266,4574,6587,612,2724,704,3191,8323,9523,3002,704,5064,3960,8209,2027,2758,8393,4875,4641,9584,6401,7883,7014,768,443,5490,7506,1852,2005,8850,5776,4487,4269 4052,6687,4705,7260,6645,6715,3706,5504,8672,2853,1136,8187,8203,4016,871,1809,1366,4952,9294,5339,6872,2645,6083,7874,3056,5218,7485,8796,7401,3348,2103,426,8572,4163,9171,3176,948,7654,9344,3217,1650,5580,7971,2622,76,2874,880,2034,9929,1546,2659,5811,3754,7096,7436,9694,9960,7415,2164,953,2360,4194,2397,1047,2196,6827,575,784,2675,8821,6802,7972,5996,6699,2134,7577,2887,1412,4349,4380 4629,2234,6240,8132,7592,3181,6389,1214,266,1910,2451,8784,2790,1127,6932,1447,8986,2492,5476,397,889,3027,7641,5083,5776,4022,185,3364,5701,2442,2840,4160,9525,4828,6602,2614,7447,3711,4505,7745,8034,6514,4907,2605,7753,6958,7270,6936,3006,8968,439,2326,4652,3085,3425,9863,5049,5361,8688,297,7580,8777,7916,6687,8683,7141,306,9569,2384,1500,3346,4601,7329,9040,6097,2727,6314,4501,4974,2829 8316,4072,2025,6884,3027,1808,5714,7624,7880,8528,4205,8686,7587,3230,1139,7273,6163,6986,3914,9309,1464,9359,4474,7095,2212,7302,2583,9462,7532,6567,1606,4436,8981,5612,6796,4385,5076,2007,6072,3678,8331,1338,3299,8845,4783,8613,4071,1232,6028,2176,3990,2148,3748,103,9453,538,6745,9110,926,3125,473,5970,8728,7072,9062,1404,1317,5139,9862,6496,6062,3338,464,1600,2532,1088,8232,7739,8274,3873 2341,523,7096,8397,8301,6541,9844,244,4993,2280,7689,4025,4196,5522,7904,6048,2623,9258,2149,9461,6448,8087,7245,1917,8340,7127,8466,5725,6996,3421,5313,512,9164,9837,9794,8369,4185,1488,7210,1524,1016,4620,9435,2478,7765,8035,697,6677,3724,6988,5853,7662,3895,9593,1185,4727,6025,5734,7665,3070,138,8469,6748,6459,561,7935,8646,2378,462,7755,3115,9690,8877,3946,2728,8793,244,6323,8666,4271 
6430,2406,8994,56,1267,3826,9443,7079,7579,5232,6691,3435,6718,5698,4144,7028,592,2627,217,734,6194,8156,9118,58,2640,8069,4127,3285,694,3197,3377,4143,4802,3324,8134,6953,7625,3598,3584,4289,7065,3434,2106,7132,5802,7920,9060,7531,3321,1725,1067,3751,444,5503,6785,7937,6365,4803,198,6266,8177,1470,6390,1606,2904,7555,9834,8667,2033,1723,5167,1666,8546,8152,473,4475,6451,7947,3062,3281 2810,3042,7759,1741,2275,2609,7676,8640,4117,1958,7500,8048,1757,3954,9270,1971,4796,2912,660,5511,3553,1012,5757,4525,6084,7198,8352,5775,7726,8591,7710,9589,3122,4392,6856,5016,749,2285,3356,7482,9956,7348,2599,8944,495,3462,3578,551,4543,7207,7169,7796,1247,4278,6916,8176,3742,8385,2310,1345,8692,2667,4568,1770,8319,3585,4920,3890,4928,7343,5385,9772,7947,8786,2056,9266,3454,2807,877,2660 6206,8252,5928,5837,4177,4333,207,7934,5581,9526,8906,1498,8411,2984,5198,5134,2464,8435,8514,8674,3876,599,5327,826,2152,4084,2433,9327,9697,4800,2728,3608,3849,3861,3498,9943,1407,3991,7191,9110,5666,8434,4704,6545,5944,2357,1163,4995,9619,6754,4200,9682,6654,4862,4744,5953,6632,1054,293,9439,8286,2255,696,8709,1533,1844,6441,430,1999,6063,9431,7018,8057,2920,6266,6799,356,3597,4024,6665 3847,6356,8541,7225,2325,2946,5199,469,5450,7508,2197,9915,8284,7983,6341,3276,3321,16,1321,7608,5015,3362,8491,6968,6818,797,156,2575,706,9516,5344,5457,9210,5051,8099,1617,9951,7663,8253,9683,2670,1261,4710,1068,8753,4799,1228,2621,3275,6188,4699,1791,9518,8701,5932,4275,6011,9877,2933,4182,6059,2930,6687,6682,9771,654,9437,3169,8596,1827,5471,8909,2352,123,4394,3208,8756,5513,6917,2056 5458,8173,3138,3290,4570,4892,3317,4251,9699,7973,1163,1935,5477,6648,9614,5655,9592,975,9118,2194,7322,8248,8413,3462,8560,1907,7810,6650,7355,2939,4973,6894,3933,3784,3200,2419,9234,4747,2208,2207,1945,2899,1407,6145,8023,3484,5688,7686,2737,3828,3704,9004,5190,9740,8643,8650,5358,4426,1522,1707,3613,9887,6956,2447,2762,833,1449,9489,2573,1080,4167,3456,6809,2466,227,7125,2759,6250,6472,8089 3266,7025,9756,3914,1265,9116,7723,9788,6805,5493,2092,8688,6592,9173,4431,4028,6007,7131,4446,4815,3648,6701,759,3312,8355,4485,4187,5188,8746,7759,3528,2177,5243,8379,3838,7233,4607,9187,7216,2190,6967,2920,6082,7910,5354,3609,8958,6949,7731,494,8753,8707,1523,4426,3543,7085,647,6771,9847,646,5049,824,8417,5260,2730,5702,2513,9275,4279,2767,8684,1165,9903,4518,55,9682,8963,6005,2102,6523 1998,8731,936,1479,5259,7064,4085,91,7745,7136,3773,3810,730,8255,2705,2653,9790,6807,2342,355,9344,2668,3690,2028,9679,8102,574,4318,6481,9175,5423,8062,2867,9657,7553,3442,3920,7430,3945,7639,3714,3392,2525,4995,4850,2867,7951,9667,486,9506,9888,781,8866,1702,3795,90,356,1483,4200,2131,6969,5931,486,6880,4404,1084,5169,4910,6567,8335,4686,5043,2614,3352,2667,4513,6472,7471,5720,1616 8878,1613,1716,868,1906,2681,564,665,5995,2474,7496,3432,9491,9087,8850,8287,669,823,347,6194,2264,2592,7871,7616,8508,4827,760,2676,4660,4881,7572,3811,9032,939,4384,929,7525,8419,5556,9063,662,8887,7026,8534,3111,1454,2082,7598,5726,6687,9647,7608,73,3014,5063,670,5461,5631,3367,9796,8475,7908,5073,1565,5008,5295,4457,1274,4788,1728,338,600,8415,8535,9351,7750,6887,5845,1741,125 3637,6489,9634,9464,9055,2413,7824,9517,7532,3577,7050,6186,6980,9365,9782,191,870,2497,8498,2218,2757,5420,6468,586,3320,9230,1034,1393,9886,5072,9391,1178,8464,8042,6869,2075,8275,3601,7715,9470,8786,6475,8373,2159,9237,2066,3264,5000,679,355,3069,4073,494,2308,5512,4334,9438,8786,8637,9774,1169,1949,6594,6072,4270,9158,7916,5752,6794,9391,6301,5842,3285,2141,3898,8027,4310,8821,7079,1307 
8497,6681,4732,7151,7060,5204,9030,7157,833,5014,8723,3207,9796,9286,4913,119,5118,7650,9335,809,3675,2597,5144,3945,5090,8384,187,4102,1260,2445,2792,4422,8389,9290,50,1765,1521,6921,8586,4368,1565,5727,7855,2003,4834,9897,5911,8630,5070,1330,7692,7557,7980,6028,5805,9090,8265,3019,3802,698,9149,5748,1965,9658,4417,5994,5584,8226,2937,272,5743,1278,5698,8736,2595,6475,5342,6596,1149,6920 8188,8009,9546,6310,8772,2500,9846,6592,6872,3857,1307,8125,7042,1544,6159,2330,643,4604,7899,6848,371,8067,2062,3200,7295,1857,9505,6936,384,2193,2190,301,8535,5503,1462,7380,5114,4824,8833,1763,4974,8711,9262,6698,3999,2645,6937,7747,1128,2933,3556,7943,2885,3122,9105,5447,418,2899,5148,3699,9021,9501,597,4084,175,1621,1,1079,6067,5812,4326,9914,6633,5394,4233,6728,9084,1864,5863,1225 9935,8793,9117,1825,9542,8246,8437,3331,9128,9675,6086,7075,319,1334,7932,3583,7167,4178,1726,7720,695,8277,7887,6359,5912,1719,2780,8529,1359,2013,4498,8072,1129,9998,1147,8804,9405,6255,1619,2165,7491,1,8882,7378,3337,503,5758,4109,3577,985,3200,7615,8058,5032,1080,6410,6873,5496,1466,2412,9885,5904,4406,3605,8770,4361,6205,9193,1537,9959,214,7260,9566,1685,100,4920,7138,9819,5637,976 3466,9854,985,1078,7222,8888,5466,5379,3578,4540,6853,8690,3728,6351,7147,3134,6921,9692,857,3307,4998,2172,5783,3931,9417,2541,6299,13,787,2099,9131,9494,896,8600,1643,8419,7248,2660,2609,8579,91,6663,5506,7675,1947,6165,4286,1972,9645,3805,1663,1456,8853,5705,9889,7489,1107,383,4044,2969,3343,152,7805,4980,9929,5033,1737,9953,7197,9158,4071,1324,473,9676,3984,9680,3606,8160,7384,5432 1005,4512,5186,3953,2164,3372,4097,3247,8697,3022,9896,4101,3871,6791,3219,2742,4630,6967,7829,5991,6134,1197,1414,8923,8787,1394,8852,5019,7768,5147,8004,8825,5062,9625,7988,1110,3992,7984,9966,6516,6251,8270,421,3723,1432,4830,6935,8095,9059,2214,6483,6846,3120,1587,6201,6691,9096,9627,6671,4002,3495,9939,7708,7465,5879,6959,6634,3241,3401,2355,9061,2611,7830,3941,2177,2146,5089,7079,519,6351 7280,8586,4261,2831,7217,3141,9994,9940,5462,2189,4005,6942,9848,5350,8060,6665,7519,4324,7684,657,9453,9296,2944,6843,7499,7847,1728,9681,3906,6353,5529,2822,3355,3897,7724,4257,7489,8672,4356,3983,1948,6892,7415,4153,5893,4190,621,1736,4045,9532,7701,3671,1211,1622,3176,4524,9317,7800,5638,6644,6943,5463,3531,2821,1347,5958,3436,1438,2999,994,850,4131,2616,1549,3465,5946,690,9273,6954,7991 9517,399,3249,2596,7736,2142,1322,968,7350,1614,468,3346,3265,7222,6086,1661,5317,2582,7959,4685,2807,2917,1037,5698,1529,3972,8716,2634,3301,3412,8621,743,8001,4734,888,7744,8092,3671,8941,1487,5658,7099,2781,99,1932,4443,4756,4652,9328,1581,7855,4312,5976,7255,6480,3996,2748,1973,9731,4530,2790,9417,7186,5303,3557,351,7182,9428,1342,9020,7599,1392,8304,2070,9138,7215,2008,9937,1106,7110 7444,769,9688,632,1571,6820,8743,4338,337,3366,3073,1946,8219,104,4210,6986,249,5061,8693,7960,6546,1004,8857,5997,9352,4338,6105,5008,2556,6518,6694,4345,3727,7956,20,3954,8652,4424,9387,2035,8358,5962,5304,5194,8650,8282,1256,1103,2138,6679,1985,3653,2770,2433,4278,615,2863,1715,242,3790,2636,6998,3088,1671,2239,957,5411,4595,6282,2881,9974,2401,875,7574,2987,4587,3147,6766,9885,2965 3287,3016,3619,6818,9073,6120,5423,557,2900,2015,8111,3873,1314,4189,1846,4399,7041,7583,2427,2864,3525,5002,2069,748,1948,6015,2684,438,770,8367,1663,7887,7759,1885,157,7770,4520,4878,3857,1137,3525,3050,6276,5569,7649,904,4533,7843,2199,5648,7628,9075,9441,3600,7231,2388,5640,9096,958,3058,584,5899,8150,1181,9616,1098,8162,6819,8171,1519,1140,7665,8801,2632,1299,9192,707,9955,2710,7314 
1772,2963,7578,3541,3095,1488,7026,2634,6015,4633,4370,2762,1650,2174,909,8158,2922,8467,4198,4280,9092,8856,8835,5457,2790,8574,9742,5054,9547,4156,7940,8126,9824,7340,8840,6574,3547,1477,3014,6798,7134,435,9484,9859,3031,4,1502,4133,1738,1807,4825,463,6343,9701,8506,9822,9555,8688,8168,3467,3234,6318,1787,5591,419,6593,7974,8486,9861,6381,6758,194,3061,4315,2863,4665,3789,2201,1492,4416 126,8927,6608,5682,8986,6867,1715,6076,3159,788,3140,4744,830,9253,5812,5021,7616,8534,1546,9590,1101,9012,9821,8132,7857,4086,1069,7491,2988,1579,2442,4321,2149,7642,6108,250,6086,3167,24,9528,7663,2685,1220,9196,1397,5776,1577,1730,5481,977,6115,199,6326,2183,3767,5928,5586,7561,663,8649,9688,949,5913,9160,1870,5764,9887,4477,6703,1413,4995,5494,7131,2192,8969,7138,3997,8697,646,1028 8074,1731,8245,624,4601,8706,155,8891,309,2552,8208,8452,2954,3124,3469,4246,3352,1105,4509,8677,9901,4416,8191,9283,5625,7120,2952,8881,7693,830,4580,8228,9459,8611,4499,1179,4988,1394,550,2336,6089,6872,269,7213,1848,917,6672,4890,656,1478,6536,3165,4743,4990,1176,6211,7207,5284,9730,4738,1549,4986,4942,8645,3698,9429,1439,2175,6549,3058,6513,1574,6988,8333,3406,5245,5431,7140,7085,6407 7845,4694,2530,8249,290,5948,5509,1588,5940,4495,5866,5021,4626,3979,3296,7589,4854,1998,5627,3926,8346,6512,9608,1918,7070,4747,4182,2858,2766,4606,6269,4107,8982,8568,9053,4244,5604,102,2756,727,5887,2566,7922,44,5986,621,1202,374,6988,4130,3627,6744,9443,4568,1398,8679,397,3928,9159,367,2917,6127,5788,3304,8129,911,2669,1463,9749,264,4478,8940,1109,7309,2462,117,4692,7724,225,2312 4164,3637,2000,941,8903,39,3443,7172,1031,3687,4901,8082,4945,4515,7204,9310,9349,9535,9940,218,1788,9245,2237,1541,5670,6538,6047,5553,9807,8101,1925,8714,445,8332,7309,6830,5786,5736,7306,2710,3034,1838,7969,6318,7912,2584,2080,7437,6705,2254,7428,820,782,9861,7596,3842,3631,8063,5240,6666,394,4565,7865,4895,9890,6028,6117,4724,9156,4473,4552,602,470,6191,4927,5387,884,3146,1978,3000 4258,6880,1696,3582,5793,4923,2119,1155,9056,9698,6603,3768,5514,9927,9609,6166,6566,4536,4985,4934,8076,9062,6741,6163,7399,4562,2337,5600,2919,9012,8459,1308,6072,1225,9306,8818,5886,7243,7365,8792,6007,9256,6699,7171,4230,7002,8720,7839,4533,1671,478,7774,1607,2317,5437,4705,7886,4760,6760,7271,3081,2997,3088,7675,6208,3101,6821,6840,122,9633,4900,2067,8546,4549,2091,7188,5605,8599,6758,5229 7854,5243,9155,3556,8812,7047,2202,1541,5993,4600,4760,713,434,7911,7426,7414,8729,322,803,7960,7563,4908,6285,6291,736,3389,9339,4132,8701,7534,5287,3646,592,3065,7582,2592,8755,6068,8597,1982,5782,1894,2900,6236,4039,6569,3037,5837,7698,700,7815,2491,7272,5878,3083,6778,6639,3589,5010,8313,2581,6617,5869,8402,6808,2951,2321,5195,497,2190,6187,1342,1316,4453,7740,4154,2959,1781,1482,8256 7178,2046,4419,744,8312,5356,6855,8839,319,2962,5662,47,6307,8662,68,4813,567,2712,9931,1678,3101,8227,6533,4933,6656,92,5846,4780,6256,6361,4323,9985,1231,2175,7178,3034,9744,6155,9165,7787,5836,9318,7860,9644,8941,6480,9443,8188,5928,161,6979,2352,5628,6991,1198,8067,5867,6620,3778,8426,2994,3122,3124,6335,3918,8897,2655,9670,634,1088,1576,8935,7255,474,8166,7417,9547,2886,5560,3842 6957,3111,26,7530,7143,1295,1744,6057,3009,1854,8098,5405,2234,4874,9447,2620,9303,27,7410,969,40,2966,5648,7596,8637,4238,3143,3679,7187,690,9980,7085,7714,9373,5632,7526,6707,3951,9734,4216,2146,3602,5371,6029,3039,4433,4855,4151,1449,3376,8009,7240,7027,4602,2947,9081,4045,8424,9352,8742,923,2705,4266,3232,2264,6761,363,2651,3383,7770,6730,7856,7340,9679,2158,610,4471,4608,910,6241 
4417,6756,1013,8797,658,8809,5032,8703,7541,846,3357,2920,9817,1745,9980,7593,4667,3087,779,3218,6233,5568,4296,2289,2654,7898,5021,9461,5593,8214,9173,4203,2271,7980,2983,5952,9992,8399,3468,1776,3188,9314,1720,6523,2933,621,8685,5483,8986,6163,3444,9539,4320,155,3992,2828,2150,6071,524,2895,5468,8063,1210,3348,9071,4862,483,9017,4097,6186,9815,3610,5048,1644,1003,9865,9332,2145,1944,2213 9284,3803,4920,1927,6706,4344,7383,4786,9890,2010,5228,1224,3158,6967,8580,8990,8883,5213,76,8306,2031,4980,5639,9519,7184,5645,7769,3259,8077,9130,1317,3096,9624,3818,1770,695,2454,947,6029,3474,9938,3527,5696,4760,7724,7738,2848,6442,5767,6845,8323,4131,2859,7595,2500,4815,3660,9130,8580,7016,8231,4391,8369,3444,4069,4021,556,6154,627,2778,1496,4206,6356,8434,8491,3816,8231,3190,5575,1015 3787,7572,1788,6803,5641,6844,1961,4811,8535,9914,9999,1450,8857,738,4662,8569,6679,2225,7839,8618,286,2648,5342,2294,3205,4546,176,8705,3741,6134,8324,8021,7004,5205,7032,6637,9442,5539,5584,4819,5874,5807,8589,6871,9016,983,1758,3786,1519,6241,185,8398,495,3370,9133,3051,4549,9674,7311,9738,3316,9383,2658,2776,9481,7558,619,3943,3324,6491,4933,153,9738,4623,912,3595,7771,7939,1219,4405 2650,3883,4154,5809,315,7756,4430,1788,4451,1631,6461,7230,6017,5751,138,588,5282,2442,9110,9035,6349,2515,1570,6122,4192,4174,3530,1933,4186,4420,4609,5739,4135,2963,6308,1161,8809,8619,2796,3819,6971,8228,4188,1492,909,8048,2328,6772,8467,7671,9068,2226,7579,6422,7056,8042,3296,2272,3006,2196,7320,3238,3490,3102,37,1293,3212,4767,5041,8773,5794,4456,6174,7279,7054,2835,7053,9088,790,6640 3101,1057,7057,3826,6077,1025,2955,1224,1114,6729,5902,4698,6239,7203,9423,1804,4417,6686,1426,6941,8071,1029,4985,9010,6122,6597,1622,1574,3513,1684,7086,5505,3244,411,9638,4150,907,9135,829,981,1707,5359,8781,9751,5,9131,3973,7159,1340,6955,7514,7993,6964,8198,1933,2797,877,3993,4453,8020,9349,8646,2779,8679,2961,3547,3374,3510,1129,3568,2241,2625,9138,5974,8206,7669,7678,1833,8700,4480 4865,9912,8038,8238,782,3095,8199,1127,4501,7280,2112,2487,3626,2790,9432,1475,6312,8277,4827,2218,5806,7132,8752,1468,7471,6386,739,8762,8323,8120,5169,9078,9058,3370,9560,7987,8585,8531,5347,9312,1058,4271,1159,5286,5404,6925,8606,9204,7361,2415,560,586,4002,2644,1927,2824,768,4409,2942,3345,1002,808,4941,6267,7979,5140,8643,7553,9438,7320,4938,2666,4609,2778,8158,6730,3748,3867,1866,7181 171,3771,7134,8927,4778,2913,3326,2004,3089,7853,1378,1729,4777,2706,9578,1360,5693,3036,1851,7248,2403,2273,8536,6501,9216,613,9671,7131,7719,6425,773,717,8803,160,1114,7554,7197,753,4513,4322,8499,4533,2609,4226,8710,6627,644,9666,6260,4870,5744,7385,6542,6203,7703,6130,8944,5589,2262,6803,6381,7414,6888,5123,7320,9392,9061,6780,322,8975,7050,5089,1061,2260,3199,1150,1865,5386,9699,6501 3744,8454,6885,8277,919,1923,4001,6864,7854,5519,2491,6057,8794,9645,1776,5714,9786,9281,7538,6916,3215,395,2501,9618,4835,8846,9708,2813,3303,1794,8309,7176,2206,1602,1838,236,4593,2245,8993,4017,10,8215,6921,5206,4023,5932,6997,7801,262,7640,3107,8275,4938,7822,2425,3223,3886,2105,8700,9526,2088,8662,8034,7004,5710,2124,7164,3574,6630,9980,4242,2901,9471,1491,2117,4562,1130,9086,4117,6698 2810,2280,2331,1170,4554,4071,8387,1215,2274,9848,6738,1604,7281,8805,439,1298,8318,7834,9426,8603,6092,7944,1309,8828,303,3157,4638,4439,9175,1921,4695,7716,1494,1015,1772,5913,1127,1952,1950,8905,4064,9890,385,9357,7945,5035,7082,5369,4093,6546,5187,5637,2041,8946,1758,7111,6566,1027,1049,5148,7224,7248,296,6169,375,1656,7993,2816,3717,4279,4675,1609,3317,42,6201,3100,3144,163,9530,4531 
7096,6070,1009,4988,3538,5801,7149,3063,2324,2912,7911,7002,4338,7880,2481,7368,3516,2016,7556,2193,1388,3865,8125,4637,4096,8114,750,3144,1938,7002,9343,4095,1392,4220,3455,6969,9647,1321,9048,1996,1640,6626,1788,314,9578,6630,2813,6626,4981,9908,7024,4355,3201,3521,3864,3303,464,1923,595,9801,3391,8366,8084,9374,1041,8807,9085,1892,9431,8317,9016,9221,8574,9981,9240,5395,2009,6310,2854,9255 8830,3145,2960,9615,8220,6061,3452,2918,6481,9278,2297,3385,6565,7066,7316,5682,107,7646,4466,68,1952,9603,8615,54,7191,791,6833,2560,693,9733,4168,570,9127,9537,1925,8287,5508,4297,8452,8795,6213,7994,2420,4208,524,5915,8602,8330,2651,8547,6156,1812,6271,7991,9407,9804,1553,6866,1128,2119,4691,9711,8315,5879,9935,6900,482,682,4126,1041,428,6247,3720,5882,7526,2582,4327,7725,3503,2631 2738,9323,721,7434,1453,6294,2957,3786,5722,6019,8685,4386,3066,9057,6860,499,5315,3045,5194,7111,3137,9104,941,586,3066,755,4177,8819,7040,5309,3583,3897,4428,7788,4721,7249,6559,7324,825,7311,3760,6064,6070,9672,4882,584,1365,9739,9331,5783,2624,7889,1604,1303,1555,7125,8312,425,8936,3233,7724,1480,403,7440,1784,1754,4721,1569,652,3893,4574,5692,9730,4813,9844,8291,9199,7101,3391,8914 6044,2928,9332,3328,8588,447,3830,1176,3523,2705,8365,6136,5442,9049,5526,8575,8869,9031,7280,706,2794,8814,5767,4241,7696,78,6570,556,5083,1426,4502,3336,9518,2292,1885,3740,3153,9348,9331,8051,2759,5407,9028,7840,9255,831,515,2612,9747,7435,8964,4971,2048,4900,5967,8271,1719,9670,2810,6777,1594,6367,6259,8316,3815,1689,6840,9437,4361,822,9619,3065,83,6344,7486,8657,8228,9635,6932,4864 8478,4777,6334,4678,7476,4963,6735,3096,5860,1405,5127,7269,7793,4738,227,9168,2996,8928,765,733,1276,7677,6258,1528,9558,3329,302,8901,1422,8277,6340,645,9125,8869,5952,141,8141,1816,9635,4025,4184,3093,83,2344,2747,9352,7966,1206,1126,1826,218,7939,2957,2729,810,8752,5247,4174,4038,8884,7899,9567,301,5265,5752,7524,4381,1669,3106,8270,6228,6373,754,2547,4240,2313,5514,3022,1040,9738 2265,8192,1763,1369,8469,8789,4836,52,1212,6690,5257,8918,6723,6319,378,4039,2421,8555,8184,9577,1432,7139,8078,5452,9628,7579,4161,7490,5159,8559,1011,81,478,5840,1964,1334,6875,8670,9900,739,1514,8692,522,9316,6955,1345,8132,2277,3193,9773,3923,4177,2183,1236,6747,6575,4874,6003,6409,8187,745,8776,9440,7543,9825,2582,7381,8147,7236,5185,7564,6125,218,7991,6394,391,7659,7456,5128,5294 2132,8992,8160,5782,4420,3371,3798,5054,552,5631,7546,4716,1332,6486,7892,7441,4370,6231,4579,2121,8615,1145,9391,1524,1385,2400,9437,2454,7896,7467,2928,8400,3299,4025,7458,4703,7206,6358,792,6200,725,4275,4136,7390,5984,4502,7929,5085,8176,4600,119,3568,76,9363,6943,2248,9077,9731,6213,5817,6729,4190,3092,6910,759,2682,8380,1254,9604,3011,9291,5329,9453,9746,2739,6522,3765,5634,1113,5789 5304,5499,564,2801,679,2653,1783,3608,7359,7797,3284,796,3222,437,7185,6135,8571,2778,7488,5746,678,6140,861,7750,803,9859,9918,2425,3734,2698,9005,4864,9818,6743,2475,132,9486,3825,5472,919,292,4411,7213,7699,6435,9019,6769,1388,802,2124,1345,8493,9487,8558,7061,8777,8833,2427,2238,5409,4957,8503,3171,7622,5779,6145,2417,5873,5563,5693,9574,9491,1937,7384,4563,6842,5432,2751,3406,7981
4445,2697,5115,718,2209,2212,654,4348,3079,6821,7668,3276,8874,4190,3785,2752,9473,7817,9137,496,7338,3434,7152,4355,4552,7917,7827,2460,2350,691,3514,5880,3145,7633,7199,3783,5066,7487,3285,1084,8985,760,872,8609,8051,1134,9536,5750,9716,9371,7619,5617,275,9721,2997,2698,1887,8825,6372,3014,2113,7122,7050,6775,5948,2758,1219,3539,348,7989,2735,9862,1263,8089,6401,9462,3168,2758,3748,5870 1096,20,1318,7586,5167,2642,1443,5741,7621,7030,5526,4244,2348,4641,9827,2448,6918,5883,3737,300,7116,6531,567,5997,3971,6623,820,6148,3287,1874,7981,8424,7672,7575,6797,6717,1078,5008,4051,8795,5820,346,1851,6463,2117,6058,3407,8211,117,4822,1317,4377,4434,5925,8341,4800,1175,4173,690,8978,7470,1295,3799,8724,3509,9849,618,3320,7068,9633,2384,7175,544,6583,1908,9983,481,4187,9353,9377 9607,7385,521,6084,1364,8983,7623,1585,6935,8551,2574,8267,4781,3834,2764,2084,2669,4656,9343,7709,2203,9328,8004,6192,5856,3555,2260,5118,6504,1839,9227,1259,9451,1388,7909,5733,6968,8519,9973,1663,5315,7571,3035,4325,4283,2304,6438,3815,9213,9806,9536,196,5542,6907,2475,1159,5820,9075,9470,2179,9248,1828,4592,9167,3713,4640,47,3637,309,7344,6955,346,378,9044,8635,7466,5036,9515,6385,9230 7206,3114,7760,1094,6150,5182,7358,7387,4497,955,101,1478,7777,6966,7010,8417,6453,4955,3496,107,449,8271,131,2948,6185,784,5937,8001,6104,8282,4165,3642,710,2390,575,715,3089,6964,4217,192,5949,7006,715,3328,1152,66,8044,4319,1735,146,4818,5456,6451,4113,1063,4781,6799,602,1504,6245,6550,1417,1343,2363,3785,5448,4545,9371,5420,5068,4613,4882,4241,5043,7873,8042,8434,3939,9256,2187 3620,8024,577,9997,7377,7682,1314,1158,6282,6310,1896,2509,5436,1732,9480,706,496,101,6232,7375,2207,2306,110,6772,3433,2878,8140,5933,8688,1399,2210,7332,6172,6403,7333,4044,2291,1790,2446,7390,8698,5723,3678,7104,1825,2040,140,3982,4905,4160,2200,5041,2512,1488,2268,1175,7588,8321,8078,7312,977,5257,8465,5068,3453,3096,1651,7906,253,9250,6021,8791,8109,6651,3412,345,4778,5152,4883,7505 1074,5438,9008,2679,5397,5429,2652,3403,770,9188,4248,2493,4361,8327,9587,707,9525,5913,93,1899,328,2876,3604,673,8576,6908,7659,2544,3359,3883,5273,6587,3065,1749,3223,604,9925,6941,2823,8767,7039,3290,3214,1787,7904,3421,7137,9560,8451,2669,9219,6332,1576,5477,6755,8348,4164,4307,2984,4012,6629,1044,2874,6541,4942,903,1404,9125,5160,8836,4345,2581,460,8438,1538,5507,668,3352,2678,6942 4295,1176,5596,1521,3061,9868,7037,7129,8933,6659,5947,5063,3653,9447,9245,2679,767,714,116,8558,163,3927,8779,158,5093,2447,5782,3967,1716,931,7772,8164,1117,9244,5783,7776,3846,8862,6014,2330,6947,1777,3112,6008,3491,1906,5952,314,4602,8994,5919,9214,3995,5026,7688,6809,5003,3128,2509,7477,110,8971,3982,8539,2980,4689,6343,5411,2992,5270,5247,9260,2269,7474,1042,7162,5206,1232,4556,4757 510,3556,5377,1406,5721,4946,2635,7847,4251,8293,8281,6351,4912,287,2870,3380,3948,5322,3840,4738,9563,1906,6298,3234,8959,1562,6297,8835,7861,239,6618,1322,2553,2213,5053,5446,4402,6500,5182,8585,6900,5756,9661,903,5186,7687,5998,7997,8081,8955,4835,6069,2621,1581,732,9564,1082,1853,5442,1342,520,1737,3703,5321,4793,2776,1508,1647,9101,2499,6891,4336,7012,3329,3212,1442,9993,3988,4930,7706 9444,3401,5891,9716,1228,7107,109,3563,2700,6161,5039,4992,2242,8541,7372,2067,1294,3058,1306,320,8881,5756,9326,411,8650,8824,5495,8282,8397,2000,1228,7817,2099,6473,3571,5994,4447,1299,5991,543,7874,2297,1651,101,2093,3463,9189,6872,6118,872,1008,1779,2805,9084,4048,2123,5877,55,3075,1737,9459,4535,6453,3644,108,5982,4437,5213,1340,6967,9943,5815,669,8074,1838,6979,9132,9315,715,5048 
3327,4030,7177,6336,9933,5296,2621,4785,2755,4832,2512,2118,2244,4407,2170,499,7532,9742,5051,7687,970,6924,3527,4694,5145,1306,2165,5940,2425,8910,3513,1909,6983,346,6377,4304,9330,7203,6605,3709,3346,970,369,9737,5811,4427,9939,3693,8436,5566,1977,3728,2399,3985,8303,2492,5366,9802,9193,7296,1033,5060,9144,2766,1151,7629,5169,5995,58,7619,7565,4208,1713,6279,3209,4908,9224,7409,1325,8540 6882,1265,1775,3648,4690,959,5837,4520,5394,1378,9485,1360,4018,578,9174,2932,9890,3696,116,1723,1178,9355,7063,1594,1918,8574,7594,7942,1547,6166,7888,354,6932,4651,1010,7759,6905,661,7689,6092,9292,3845,9605,8443,443,8275,5163,7720,7265,6356,7779,1798,1754,5225,6661,1180,8024,5666,88,9153,1840,3508,1193,4445,2648,3538,6243,6375,8107,5902,5423,2520,1122,5015,6113,8859,9370,966,8673,2442 7338,3423,4723,6533,848,8041,7921,8277,4094,5368,7252,8852,9166,2250,2801,6125,8093,5738,4038,9808,7359,9494,601,9116,4946,2702,5573,2921,9862,1462,1269,2410,4171,2709,7508,6241,7522,615,2407,8200,4189,5492,5649,7353,2590,5203,4274,710,7329,9063,956,8371,3722,4253,4785,1194,4828,4717,4548,940,983,2575,4511,2938,1827,2027,2700,1236,841,5760,1680,6260,2373,3851,1841,4968,1172,5179,7175,3509 4420,1327,3560,2376,6260,2988,9537,4064,4829,8872,9598,3228,1792,7118,9962,9336,4368,9189,6857,1829,9863,6287,7303,7769,2707,8257,2391,2009,3975,4993,3068,9835,3427,341,8412,2134,4034,8511,6421,3041,9012,2983,7289,100,1355,7904,9186,6920,5856,2008,6545,8331,3655,5011,839,8041,9255,6524,3862,8788,62,7455,3513,5003,8413,3918,2076,7960,6108,3638,6999,3436,1441,4858,4181,1866,8731,7745,3744,1000 356,8296,8325,1058,1277,4743,3850,2388,6079,6462,2815,5620,8495,5378,75,4324,3441,9870,1113,165,1544,1179,2834,562,6176,2313,6836,8839,2986,9454,5199,6888,1927,5866,8760,320,1792,8296,7898,6121,7241,5886,5814,2815,8336,1576,4314,3109,2572,6011,2086,9061,9403,3947,5487,9731,7281,3159,1819,1334,3181,5844,5114,9898,4634,2531,4412,6430,4262,8482,4546,4555,6804,2607,9421,686,8649,8860,7794,6672 9870,152,1558,4963,8750,4754,6521,6256,8818,5208,5691,9659,8377,9725,5050,5343,2539,6101,1844,9700,7750,8114,5357,3001,8830,4438,199,9545,8496,43,2078,327,9397,106,6090,8181,8646,6414,7499,5450,4850,6273,5014,4131,7639,3913,6571,8534,9703,4391,7618,445,1320,5,1894,6771,7383,9191,4708,9706,6939,7937,8726,9382,5216,3685,2247,9029,8154,1738,9984,2626,9438,4167,6351,5060,29,1218,1239,4785 192,5213,8297,8974,4032,6966,5717,1179,6523,4679,9513,1481,3041,5355,9303,9154,1389,8702,6589,7818,6336,3539,5538,3094,6646,6702,6266,2759,4608,4452,617,9406,8064,6379,444,5602,4950,1810,8391,1536,316,8714,1178,5182,5863,5110,5372,4954,1978,2971,5680,4863,2255,4630,5723,2168,538,1692,1319,7540,440,6430,6266,7712,7385,5702,620,641,3136,7350,1478,3155,2820,9109,6261,1122,4470,14,8493,2095 1046,4301,6082,474,4974,7822,2102,5161,5172,6946,8074,9716,6586,9962,9749,5015,2217,995,5388,4402,7652,6399,6539,1349,8101,3677,1328,9612,7922,2879,231,5887,2655,508,4357,4964,3554,5930,6236,7384,4614,280,3093,9600,2110,7863,2631,6626,6620,68,1311,7198,7561,1768,5139,1431,221,230,2940,968,5283,6517,2146,1646,869,9402,7068,8645,7058,1765,9690,4152,2926,9504,2939,7504,6074,2944,6470,7859 4659,736,4951,9344,1927,6271,8837,8711,3241,6579,7660,5499,5616,3743,5801,4682,9748,8796,779,1833,4549,8138,4026,775,4170,2432,4174,3741,7540,8017,2833,4027,396,811,2871,1150,9809,2719,9199,8504,1224,540,2051,3519,7982,7367,2761,308,3358,6505,2050,4836,5090,7864,805,2566,2409,6876,3361,8622,5572,5895,3280,441,7893,8105,1634,2929,274,3926,7786,6123,8233,9921,2674,5340,1445,203,4585,3837 
5759,338,7444,7968,7742,3755,1591,4839,1705,650,7061,2461,9230,9391,9373,2413,1213,431,7801,4994,2380,2703,6161,6878,8331,2538,6093,1275,5065,5062,2839,582,1014,8109,3525,1544,1569,8622,7944,2905,6120,1564,1839,5570,7579,1318,2677,5257,4418,5601,7935,7656,5192,1864,5886,6083,5580,6202,8869,1636,7907,4759,9082,5854,3185,7631,6854,5872,5632,5280,1431,2077,9717,7431,4256,8261,9680,4487,4752,4286 1571,1428,8599,1230,7772,4221,8523,9049,4042,8726,7567,6736,9033,2104,4879,4967,6334,6716,3994,1269,8995,6539,3610,7667,6560,6065,874,848,4597,1711,7161,4811,6734,5723,6356,6026,9183,2586,5636,1092,7779,7923,8747,6887,7505,9909,1792,3233,4526,3176,1508,8043,720,5212,6046,4988,709,5277,8256,3642,1391,5803,1468,2145,3970,6301,7767,2359,8487,9771,8785,7520,856,1605,8972,2402,2386,991,1383,5963 1822,4824,5957,6511,9868,4113,301,9353,6228,2881,2966,6956,9124,9574,9233,1601,7340,973,9396,540,4747,8590,9535,3650,7333,7583,4806,3593,2738,8157,5215,8472,2284,9473,3906,6982,5505,6053,7936,6074,7179,6688,1564,1103,6860,5839,2022,8490,910,7551,7805,881,7024,1855,9448,4790,1274,3672,2810,774,7623,4223,4850,6071,9975,4935,1915,9771,6690,3846,517,463,7624,4511,614,6394,3661,7409,1395,8127 8738,3850,9555,3695,4383,2378,87,6256,6740,7682,9546,4255,6105,2000,1851,4073,8957,9022,6547,5189,2487,303,9602,7833,1628,4163,6678,3144,8589,7096,8913,5823,4890,7679,1212,9294,5884,2972,3012,3359,7794,7428,1579,4350,7246,4301,7779,7790,3294,9547,4367,3549,1958,8237,6758,3497,3250,3456,6318,1663,708,7714,6143,6890,3428,6853,9334,7992,591,6449,9786,1412,8500,722,5468,1371,108,3939,4199,2535 7047,4323,1934,5163,4166,461,3544,2767,6554,203,6098,2265,9078,2075,4644,6641,8412,9183,487,101,7566,5622,1975,5726,2920,5374,7779,5631,3753,3725,2672,3621,4280,1162,5812,345,8173,9785,1525,955,5603,2215,2580,5261,2765,2990,5979,389,3907,2484,1232,5933,5871,3304,1138,1616,5114,9199,5072,7442,7245,6472,4760,6359,9053,7876,2564,9404,3043,9026,2261,3374,4460,7306,2326,966,828,3274,1712,3446 3975,4565,8131,5800,4570,2306,8838,4392,9147,11,3911,7118,9645,4994,2028,6062,5431,2279,8752,2658,7836,994,7316,5336,7185,3289,1898,9689,2331,5737,3403,1124,2679,3241,7748,16,2724,5441,6640,9368,9081,5618,858,4969,17,2103,6035,8043,7475,2181,939,415,1617,8500,8253,2155,7843,7974,7859,1746,6336,3193,2617,8736,4079,6324,6645,8891,9396,5522,6103,1857,8979,3835,2475,1310,7422,610,8345,7615 9248,5397,5686,2988,3446,4359,6634,9141,497,9176,6773,7448,1907,8454,916,1596,2241,1626,1384,2741,3649,5362,8791,7170,2903,2475,5325,6451,924,3328,522,90,4813,9737,9557,691,2388,1383,4021,1609,9206,4707,5200,7107,8104,4333,9860,5013,1224,6959,8527,1877,4545,7772,6268,621,4915,9349,5970,706,9583,3071,4127,780,8231,3017,9114,3836,7503,2383,1977,4870,8035,2379,9704,1037,3992,3642,1016,4303 5093,138,4639,6609,1146,5565,95,7521,9077,2272,974,4388,2465,2650,722,4998,3567,3047,921,2736,7855,173,2065,4238,1048,5,6847,9548,8632,9194,5942,4777,7910,8971,6279,7253,2516,1555,1833,3184,9453,9053,6897,7808,8629,4877,1871,8055,4881,7639,1537,7701,2508,7564,5845,5023,2304,5396,3193,2955,1088,3801,6203,1748,3737,1276,13,4120,7715,8552,3047,2921,106,7508,304,1280,7140,2567,9135,5266 6237,4607,7527,9047,522,7371,4883,2540,5867,6366,5301,1570,421,276,3361,527,6637,4861,2401,7522,5808,9371,5298,2045,5096,5447,7755,5115,7060,8529,4078,1943,1697,1764,5453,7085,960,2405,739,2100,5800,728,9737,5704,5693,1431,8979,6428,673,7540,6,7773,5857,6823,150,5869,8486,684,5816,9626,7451,5579,8260,3397,5322,6920,1879,2127,2884,5478,4977,9016,6165,6292,3062,5671,5968,78,4619,4763 
9905,7127,9390,5185,6923,3721,9164,9705,4341,1031,1046,5127,7376,6528,3248,4941,1178,7889,3364,4486,5358,9402,9158,8600,1025,874,1839,1783,309,9030,1843,845,8398,1433,7118,70,8071,2877,3904,8866,6722,4299,10,1929,5897,4188,600,1889,3325,2485,6473,4474,7444,6992,4846,6166,4441,2283,2629,4352,7775,1101,2214,9985,215,8270,9750,2740,8361,7103,5930,8664,9690,8302,9267,344,2077,1372,1880,9550 5825,8517,7769,2405,8204,1060,3603,7025,478,8334,1997,3692,7433,9101,7294,7498,9415,5452,3850,3508,6857,9213,6807,4412,7310,854,5384,686,4978,892,8651,3241,2743,3801,3813,8588,6701,4416,6990,6490,3197,6838,6503,114,8343,5844,8646,8694,65,791,5979,2687,2621,2019,8097,1423,3644,9764,4921,3266,3662,5561,2476,8271,8138,6147,1168,3340,1998,9874,6572,9873,6659,5609,2711,3931,9567,4143,7833,8887 6223,2099,2700,589,4716,8333,1362,5007,2753,2848,4441,8397,7192,8191,4916,9955,6076,3370,6396,6971,3156,248,3911,2488,4930,2458,7183,5455,170,6809,6417,3390,1956,7188,577,7526,2203,968,8164,479,8699,7915,507,6393,4632,1597,7534,3604,618,3280,6061,9793,9238,8347,568,9645,2070,5198,6482,5000,9212,6655,5961,7513,1323,3872,6170,3812,4146,2736,67,3151,5548,2781,9679,7564,5043,8587,1893,4531 5826,3690,6724,2121,9308,6986,8106,6659,2142,1642,7170,2877,5757,6494,8026,6571,8387,9961,6043,9758,9607,6450,8631,8334,7359,5256,8523,2225,7487,1977,9555,8048,5763,2414,4948,4265,2427,8978,8088,8841,9208,9601,5810,9398,8866,9138,4176,5875,7212,3272,6759,5678,7649,4922,5422,1343,8197,3154,3600,687,1028,4579,2084,9467,4492,7262,7296,6538,7657,7134,2077,1505,7332,6890,8964,4879,7603,7400,5973,739 1861,1613,4879,1884,7334,966,2000,7489,2123,4287,1472,3263,4726,9203,1040,4103,6075,6049,330,9253,4062,4268,1635,9960,577,1320,3195,9628,1030,4092,4979,6474,6393,2799,6967,8687,7724,7392,9927,2085,3200,6466,8702,265,7646,8665,7986,7266,4574,6587,612,2724,704,3191,8323,9523,3002,704,5064,3960,8209,2027,2758,8393,4875,4641,9584,6401,7883,7014,768,443,5490,7506,1852,2005,8850,5776,4487,4269 4052,6687,4705,7260,6645,6715,3706,5504,8672,2853,1136,8187,8203,4016,871,1809,1366,4952,9294,5339,6872,2645,6083,7874,3056,5218,7485,8796,7401,3348,2103,426,8572,4163,9171,3176,948,7654,9344,3217,1650,5580,7971,2622,76,2874,880,2034,9929,1546,2659,5811,3754,7096,7436,9694,9960,7415,2164,953,2360,4194,2397,1047,2196,6827,575,784,2675,8821,6802,7972,5996,6699,2134,7577,2887,1412,4349,4380 4629,2234,6240,8132,7592,3181,6389,1214,266,1910,2451,8784,2790,1127,6932,1447,8986,2492,5476,397,889,3027,7641,5083,5776,4022,185,3364,5701,2442,2840,4160,9525,4828,6602,2614,7447,3711,4505,7745,8034,6514,4907,2605,7753,6958,7270,6936,3006,8968,439,2326,4652,3085,3425,9863,5049,5361,8688,297,7580,8777,7916,6687,8683,7141,306,9569,2384,1500,3346,4601,7329,9040,6097,2727,6314,4501,4974,2829 8316,4072,2025,6884,3027,1808,5714,7624,7880,8528,4205,8686,7587,3230,1139,7273,6163,6986,3914,9309,1464,9359,4474,7095,2212,7302,2583,9462,7532,6567,1606,4436,8981,5612,6796,4385,5076,2007,6072,3678,8331,1338,3299,8845,4783,8613,4071,1232,6028,2176,3990,2148,3748,103,9453,538,6745,9110,926,3125,473,5970,8728,7072,9062,1404,1317,5139,9862,6496,6062,3338,464,1600,2532,1088,8232,7739,8274,3873 2341,523,7096,8397,8301,6541,9844,244,4993,2280,7689,4025,4196,5522,7904,6048,2623,9258,2149,9461,6448,8087,7245,1917,8340,7127,8466,5725,6996,3421,5313,512,9164,9837,9794,8369,4185,1488,7210,1524,1016,4620,9435,2478,7765,8035,697,6677,3724,6988,5853,7662,3895,9593,1185,4727,6025,5734,7665,3070,138,8469,6748,6459,561,7935,8646,2378,462,7755,3115,9690,8877,3946,2728,8793,244,6323,8666,4271 
6430,2406,8994,56,1267,3826,9443,7079,7579,5232,6691,3435,6718,5698,4144,7028,592,2627,217,734,6194,8156,9118,58,2640,8069,4127,3285,694,3197,3377,4143,4802,3324,8134,6953,7625,3598,3584,4289,7065,3434,2106,7132,5802,7920,9060,7531,3321,1725,1067,3751,444,5503,6785,7937,6365,4803,198,6266,8177,1470,6390,1606,2904,7555,9834,8667,2033,1723,5167,1666,8546,8152,473,4475,6451,7947,3062,3281 2810,3042,7759,1741,2275,2609,7676,8640,4117,1958,7500,8048,1757,3954,9270,1971,4796,2912,660,5511,3553,1012,5757,4525,6084,7198,8352,5775,7726,8591,7710,9589,3122,4392,6856,5016,749,2285,3356,7482,9956,7348,2599,8944,495,3462,3578,551,4543,7207,7169,7796,1247,4278,6916,8176,3742,8385,2310,1345,8692,2667,4568,1770,8319,3585,4920,3890,4928,7343,5385,9772,7947,8786,2056,9266,3454,2807,877,2660 6206,8252,5928,5837,4177,4333,207,7934,5581,9526,8906,1498,8411,2984,5198,5134,2464,8435,8514,8674,3876,599,5327,826,2152,4084,2433,9327,9697,4800,2728,3608,3849,3861,3498,9943,1407,3991,7191,9110,5666,8434,4704,6545,5944,2357,1163,4995,9619,6754,4200,9682,6654,4862,4744,5953,6632,1054,293,9439,8286,2255,696,8709,1533,1844,6441,430,1999,6063,9431,7018,8057,2920,6266,6799,356,3597,4024,6665 3847,6356,8541,7225,2325,2946,5199,469,5450,7508,2197,9915,8284,7983,6341,3276,3321,16,1321,7608,5015,3362,8491,6968,6818,797,156,2575,706,9516,5344,5457,9210,5051,8099,1617,9951,7663,8253,9683,2670,1261,4710,1068,8753,4799,1228,2621,3275,6188,4699,1791,9518,8701,5932,4275,6011,9877,2933,4182,6059,2930,6687,6682,9771,654,9437,3169,8596,1827,5471,8909,2352,123,4394,3208,8756,5513,6917,2056 5458,8173,3138,3290,4570,4892,3317,4251,9699,7973,1163,1935,5477,6648,9614,5655,9592,975,9118,2194,7322,8248,8413,3462,8560,1907,7810,6650,7355,2939,4973,6894,3933,3784,3200,2419,9234,4747,2208,2207,1945,2899,1407,6145,8023,3484,5688,7686,2737,3828,3704,9004,5190,9740,8643,8650,5358,4426,1522,1707,3613,9887,6956,2447,2762,833,1449,9489,2573,1080,4167,3456,6809,2466,227,7125,2759,6250,6472,8089 3266,7025,9756,3914,1265,9116,7723,9788,6805,5493,2092,8688,6592,9173,4431,4028,6007,7131,4446,4815,3648,6701,759,3312,8355,4485,4187,5188,8746,7759,3528,2177,5243,8379,3838,7233,4607,9187,7216,2190,6967,2920,6082,7910,5354,3609,8958,6949,7731,494,8753,8707,1523,4426,3543,7085,647,6771,9847,646,5049,824,8417,5260,2730,5702,2513,9275,4279,2767,8684,1165,9903,4518,55,9682,8963,6005,2102,6523 1998,8731,936,1479,5259,7064,4085,91,7745,7136,3773,3810,730,8255,2705,2653,9790,6807,2342,355,9344,2668,3690,2028,9679,8102,574,4318,6481,9175,5423,8062,2867,9657,7553,3442,3920,7430,3945,7639,3714,3392,2525,4995,4850,2867,7951,9667,486,9506,9888,781,8866,1702,3795,90,356,1483,4200,2131,6969,5931,486,6880,4404,1084,5169,4910,6567,8335,4686,5043,2614,3352,2667,4513,6472,7471,5720,1616 8878,1613,1716,868,1906,2681,564,665,5995,2474,7496,3432,9491,9087,8850,8287,669,823,347,6194,2264,2592,7871,7616,8508,4827,760,2676,4660,4881,7572,3811,9032,939,4384,929,7525,8419,5556,9063,662,8887,7026,8534,3111,1454,2082,7598,5726,6687,9647,7608,73,3014,5063,670,5461,5631,3367,9796,8475,7908,5073,1565,5008,5295,4457,1274,4788,1728,338,600,8415,8535,9351,7750,6887,5845,1741,125 3637,6489,9634,9464,9055,2413,7824,9517,7532,3577,7050,6186,6980,9365,9782,191,870,2497,8498,2218,2757,5420,6468,586,3320,9230,1034,1393,9886,5072,9391,1178,8464,8042,6869,2075,8275,3601,7715,9470,8786,6475,8373,2159,9237,2066,3264,5000,679,355,3069,4073,494,2308,5512,4334,9438,8786,8637,9774,1169,1949,6594,6072,4270,9158,7916,5752,6794,9391,6301,5842,3285,2141,3898,8027,4310,8821,7079,1307 
8497,6681,4732,7151,7060,5204,9030,7157,833,5014,8723,3207,9796,9286,4913,119,5118,7650,9335,809,3675,2597,5144,3945,5090,8384,187,4102,1260,2445,2792,4422,8389,9290,50,1765,1521,6921,8586,4368,1565,5727,7855,2003,4834,9897,5911,8630,5070,1330,7692,7557,7980,6028,5805,9090,8265,3019,3802,698,9149,5748,1965,9658,4417,5994,5584,8226,2937,272,5743,1278,5698,8736,2595,6475,5342,6596,1149,6920 8188,8009,9546,6310,8772,2500,9846,6592,6872,3857,1307,8125,7042,1544,6159,2330,643,4604,7899,6848,371,8067,2062,3200,7295,1857,9505,6936,384,2193,2190,301,8535,5503,1462,7380,5114,4824,8833,1763,4974,8711,9262,6698,3999,2645,6937,7747,1128,2933,3556,7943,2885,3122,9105,5447,418,2899,5148,3699,9021,9501,597,4084,175,1621,1,1079,6067,5812,4326,9914,6633,5394,4233,6728,9084,1864,5863,1225 9935,8793,9117,1825,9542,8246,8437,3331,9128,9675,6086,7075,319,1334,7932,3583,7167,4178,1726,7720,695,8277,7887,6359,5912,1719,2780,8529,1359,2013,4498,8072,1129,9998,1147,8804,9405,6255,1619,2165,7491,1,8882,7378,3337,503,5758,4109,3577,985,3200,7615,8058,5032,1080,6410,6873,5496,1466,2412,9885,5904,4406,3605,8770,4361,6205,9193,1537,9959,214,7260,9566,1685,100,4920,7138,9819,5637,976 3466,9854,985,1078,7222,8888,5466,5379,3578,4540,6853,8690,3728,6351,7147,3134,6921,9692,857,3307,4998,2172,5783,3931,9417,2541,6299,13,787,2099,9131,9494,896,8600,1643,8419,7248,2660,2609,8579,91,6663,5506,7675,1947,6165,4286,1972,9645,3805,1663,1456,8853,5705,9889,7489,1107,383,4044,2969,3343,152,7805,4980,9929,5033,1737,9953,7197,9158,4071,1324,473,9676,3984,9680,3606,8160,7384,5432 1005,4512,5186,3953,2164,3372,4097,3247,8697,3022,9896,4101,3871,6791,3219,2742,4630,6967,7829,5991,6134,1197,1414,8923,8787,1394,8852,5019,7768,5147,8004,8825,5062,9625,7988,1110,3992,7984,9966,6516,6251,8270,421,3723,1432,4830,6935,8095,9059,2214,6483,6846,3120,1587,6201,6691,9096,9627,6671,4002,3495,9939,7708,7465,5879,6959,6634,3241,3401,2355,9061,2611,7830,3941,2177,2146,5089,7079,519,6351 7280,8586,4261,2831,7217,3141,9994,9940,5462,2189,4005,6942,9848,5350,8060,6665,7519,4324,7684,657,9453,9296,2944,6843,7499,7847,1728,9681,3906,6353,5529,2822,3355,3897,7724,4257,7489,8672,4356,3983,1948,6892,7415,4153,5893,4190,621,1736,4045,9532,7701,3671,1211,1622,3176,4524,9317,7800,5638,6644,6943,5463,3531,2821,1347,5958,3436,1438,2999,994,850,4131,2616,1549,3465,5946,690,9273,6954,7991 9517,399,3249,2596,7736,2142,1322,968,7350,1614,468,3346,3265,7222,6086,1661,5317,2582,7959,4685,2807,2917,1037,5698,1529,3972,8716,2634,3301,3412,8621,743,8001,4734,888,7744,8092,3671,8941,1487,5658,7099,2781,99,1932,4443,4756,4652,9328,1581,7855,4312,5976,7255,6480,3996,2748,1973,9731,4530,2790,9417,7186,5303,3557,351,7182,9428,1342,9020,7599,1392,8304,2070,9138,7215,2008,9937,1106,7110 7444,769,9688,632,1571,6820,8743,4338,337,3366,3073,1946,8219,104,4210,6986,249,5061,8693,7960,6546,1004,8857,5997,9352,4338,6105,5008,2556,6518,6694,4345,3727,7956,20,3954,8652,4424,9387,2035,8358,5962,5304,5194,8650,8282,1256,1103,2138,6679,1985,3653,2770,2433,4278,615,2863,1715,242,3790,2636,6998,3088,1671,2239,957,5411,4595,6282,2881,9974,2401,875,7574,2987,4587,3147,6766,9885,2965 3287,3016,3619,6818,9073,6120,5423,557,2900,2015,8111,3873,1314,4189,1846,4399,7041,7583,2427,2864,3525,5002,2069,748,1948,6015,2684,438,770,8367,1663,7887,7759,1885,157,7770,4520,4878,3857,1137,3525,3050,6276,5569,7649,904,4533,7843,2199,5648,7628,9075,9441,3600,7231,2388,5640,9096,958,3058,584,5899,8150,1181,9616,1098,8162,6819,8171,1519,1140,7665,8801,2632,1299,9192,707,9955,2710,7314 
1772,2963,7578,3541,3095,1488,7026,2634,6015,4633,4370,2762,1650,2174,909,8158,2922,8467,4198,4280,9092,8856,8835,5457,2790,8574,9742,5054,9547,4156,7940,8126,9824,7340,8840,6574,3547,1477,3014,6798,7134,435,9484,9859,3031,4,1502,4133,1738,1807,4825,463,6343,9701,8506,9822,9555,8688,8168,3467,3234,6318,1787,5591,419,6593,7974,8486,9861,6381,6758,194,3061,4315,2863,4665,3789,2201,1492,4416 126,8927,6608,5682,8986,6867,1715,6076,3159,788,3140,4744,830,9253,5812,5021,7616,8534,1546,9590,1101,9012,9821,8132,7857,4086,1069,7491,2988,1579,2442,4321,2149,7642,6108,250,6086,3167,24,9528,7663,2685,1220,9196,1397,5776,1577,1730,5481,977,6115,199,6326,2183,3767,5928,5586,7561,663,8649,9688,949,5913,9160,1870,5764,9887,4477,6703,1413,4995,5494,7131,2192,8969,7138,3997,8697,646,1028 8074,1731,8245,624,4601,8706,155,8891,309,2552,8208,8452,2954,3124,3469,4246,3352,1105,4509,8677,9901,4416,8191,9283,5625,7120,2952,8881,7693,830,4580,8228,9459,8611,4499,1179,4988,1394,550,2336,6089,6872,269,7213,1848,917,6672,4890,656,1478,6536,3165,4743,4990,1176,6211,7207,5284,9730,4738,1549,4986,4942,8645,3698,9429,1439,2175,6549,3058,6513,1574,6988,8333,3406,5245,5431,7140,7085,6407 7845,4694,2530,8249,290,5948,5509,1588,5940,4495,5866,5021,4626,3979,3296,7589,4854,1998,5627,3926,8346,6512,9608,1918,7070,4747,4182,2858,2766,4606,6269,4107,8982,8568,9053,4244,5604,102,2756,727,5887,2566,7922,44,5986,621,1202,374,6988,4130,3627,6744,9443,4568,1398,8679,397,3928,9159,367,2917,6127,5788,3304,8129,911,2669,1463,9749,264,4478,8940,1109,7309,2462,117,4692,7724,225,2312 4164,3637,2000,941,8903,39,3443,7172,1031,3687,4901,8082,4945,4515,7204,9310,9349,9535,9940,218,1788,9245,2237,1541,5670,6538,6047,5553,9807,8101,1925,8714,445,8332,7309,6830,5786,5736,7306,2710,3034,1838,7969,6318,7912,2584,2080,7437,6705,2254,7428,820,782,9861,7596,3842,3631,8063,5240,6666,394,4565,7865,4895,9890,6028,6117,4724,9156,4473,4552,602,470,6191,4927,5387,884,3146,1978,3000 4258,6880,1696,3582,5793,4923,2119,1155,9056,9698,6603,3768,5514,9927,9609,6166,6566,4536,4985,4934,8076,9062,6741,6163,7399,4562,2337,5600,2919,9012,8459,1308,6072,1225,9306,8818,5886,7243,7365,8792,6007,9256,6699,7171,4230,7002,8720,7839,4533,1671,478,7774,1607,2317,5437,4705,7886,4760,6760,7271,3081,2997,3088,7675,6208,3101,6821,6840,122,9633,4900,2067,8546,4549,2091,7188,5605,8599,6758,5229 7854,5243,9155,3556,8812,7047,2202,1541,5993,4600,4760,713,434,7911,7426,7414,8729,322,803,7960,7563,4908,6285,6291,736,3389,9339,4132,8701,7534,5287,3646,592,3065,7582,2592,8755,6068,8597,1982,5782,1894,2900,6236,4039,6569,3037,5837,7698,700,7815,2491,7272,5878,3083,6778,6639,3589,5010,8313,2581,6617,5869,8402,6808,2951,2321,5195,497,2190,6187,1342,1316,4453,7740,4154,2959,1781,1482,8256 7178,2046,4419,744,8312,5356,6855,8839,319,2962,5662,47,6307,8662,68,4813,567,2712,9931,1678,3101,8227,6533,4933,6656,92,5846,4780,6256,6361,4323,9985,1231,2175,7178,3034,9744,6155,9165,7787,5836,9318,7860,9644,8941,6480,9443,8188,5928,161,6979,2352,5628,6991,1198,8067,5867,6620,3778,8426,2994,3122,3124,6335,3918,8897,2655,9670,634,1088,1576,8935,7255,474,8166,7417,9547,2886,5560,3842 6957,3111,26,7530,7143,1295,1744,6057,3009,1854,8098,5405,2234,4874,9447,2620,9303,27,7410,969,40,2966,5648,7596,8637,4238,3143,3679,7187,690,9980,7085,7714,9373,5632,7526,6707,3951,9734,4216,2146,3602,5371,6029,3039,4433,4855,4151,1449,3376,8009,7240,7027,4602,2947,9081,4045,8424,9352,8742,923,2705,4266,3232,2264,6761,363,2651,3383,7770,6730,7856,7340,9679,2158,610,4471,4608,910,6241 
4417,6756,1013,8797,658,8809,5032,8703,7541,846,3357,2920,9817,1745,9980,7593,4667,3087,779,3218,6233,5568,4296,2289,2654,7898,5021,9461,5593,8214,9173,4203,2271,7980,2983,5952,9992,8399,3468,1776,3188,9314,1720,6523,2933,621,8685,5483,8986,6163,3444,9539,4320,155,3992,2828,2150,6071,524,2895,5468,8063,1210,3348,9071,4862,483,9017,4097,6186,9815,3610,5048,1644,1003,9865,9332,2145,1944,2213 9284,3803,4920,1927,6706,4344,7383,4786,9890,2010,5228,1224,3158,6967,8580,8990,8883,5213,76,8306,2031,4980,5639,9519,7184,5645,7769,3259,8077,9130,1317,3096,9624,3818,1770,695,2454,947,6029,3474,9938,3527,5696,4760,7724,7738,2848,6442,5767,6845,8323,4131,2859,7595,2500,4815,3660,9130,8580,7016,8231,4391,8369,3444,4069,4021,556,6154,627,2778,1496,4206,6356,8434,8491,3816,8231,3190,5575,1015 3787,7572,1788,6803,5641,6844,1961,4811,8535,9914,9999,1450,8857,738,4662,8569,6679,2225,7839,8618,286,2648,5342,2294,3205,4546,176,8705,3741,6134,8324,8021,7004,5205,7032,6637,9442,5539,5584,4819,5874,5807,8589,6871,9016,983,1758,3786,1519,6241,185,8398,495,3370,9133,3051,4549,9674,7311,9738,3316,9383,2658,2776,9481,7558,619,3943,3324,6491,4933,153,9738,4623,912,3595,7771,7939,1219,4405 2650,3883,4154,5809,315,7756,4430,1788,4451,1631,6461,7230,6017,5751,138,588,5282,2442,9110,9035,6349,2515,1570,6122,4192,4174,3530,1933,4186,4420,4609,5739,4135,2963,6308,1161,8809,8619,2796,3819,6971,8228,4188,1492,909,8048,2328,6772,8467,7671,9068,2226,7579,6422,7056,8042,3296,2272,3006,2196,7320,3238,3490,3102,37,1293,3212,4767,5041,8773,5794,4456,6174,7279,7054,2835,7053,9088,790,6640 3101,1057,7057,3826,6077,1025,2955,1224,1114,6729,5902,4698,6239,7203,9423,1804,4417,6686,1426,6941,8071,1029,4985,9010,6122,6597,1622,1574,3513,1684,7086,5505,3244,411,9638,4150,907,9135,829,981,1707,5359,8781,9751,5,9131,3973,7159,1340,6955,7514,7993,6964,8198,1933,2797,877,3993,4453,8020,9349,8646,2779,8679,2961,3547,3374,3510,1129,3568,2241,2625,9138,5974,8206,7669,7678,1833,8700,4480 4865,9912,8038,8238,782,3095,8199,1127,4501,7280,2112,2487,3626,2790,9432,1475,6312,8277,4827,2218,5806,7132,8752,1468,7471,6386,739,8762,8323,8120,5169,9078,9058,3370,9560,7987,8585,8531,5347,9312,1058,4271,1159,5286,5404,6925,8606,9204,7361,2415,560,586,4002,2644,1927,2824,768,4409,2942,3345,1002,808,4941,6267,7979,5140,8643,7553,9438,7320,4938,2666,4609,2778,8158,6730,3748,3867,1866,7181 171,3771,7134,8927,4778,2913,3326,2004,3089,7853,1378,1729,4777,2706,9578,1360,5693,3036,1851,7248,2403,2273,8536,6501,9216,613,9671,7131,7719,6425,773,717,8803,160,1114,7554,7197,753,4513,4322,8499,4533,2609,4226,8710,6627,644,9666,6260,4870,5744,7385,6542,6203,7703,6130,8944,5589,2262,6803,6381,7414,6888,5123,7320,9392,9061,6780,322,8975,7050,5089,1061,2260,3199,1150,1865,5386,9699,6501 3744,8454,6885,8277,919,1923,4001,6864,7854,5519,2491,6057,8794,9645,1776,5714,9786,9281,7538,6916,3215,395,2501,9618,4835,8846,9708,2813,3303,1794,8309,7176,2206,1602,1838,236,4593,2245,8993,4017,10,8215,6921,5206,4023,5932,6997,7801,262,7640,3107,8275,4938,7822,2425,3223,3886,2105,8700,9526,2088,8662,8034,7004,5710,2124,7164,3574,6630,9980,4242,2901,9471,1491,2117,4562,1130,9086,4117,6698 2810,2280,2331,1170,4554,4071,8387,1215,2274,9848,6738,1604,7281,8805,439,1298,8318,7834,9426,8603,6092,7944,1309,8828,303,3157,4638,4439,9175,1921,4695,7716,1494,1015,1772,5913,1127,1952,1950,8905,4064,9890,385,9357,7945,5035,7082,5369,4093,6546,5187,5637,2041,8946,1758,7111,6566,1027,1049,5148,7224,7248,296,6169,375,1656,7993,2816,3717,4279,4675,1609,3317,42,6201,3100,3144,163,9530,4531 
7096,6070,1009,4988,3538,5801,7149,3063,2324,2912,7911,7002,4338,7880,2481,7368,3516,2016,7556,2193,1388,3865,8125,4637,4096,8114,750,3144,1938,7002,9343,4095,1392,4220,3455,6969,9647,1321,9048,1996,1640,6626,1788,314,9578,6630,2813,6626,4981,9908,7024,4355,3201,3521,3864,3303,464,1923,595,9801,3391,8366,8084,9374,1041,8807,9085,1892,9431,8317,9016,9221,8574,9981,9240,5395,2009,6310,2854,9255 8830,3145,2960,9615,8220,6061,3452,2918,6481,9278,2297,3385,6565,7066,7316,5682,107,7646,4466,68,1952,9603,8615,54,7191,791,6833,2560,693,9733,4168,570,9127,9537,1925,8287,5508,4297,8452,8795,6213,7994,2420,4208,524,5915,8602,8330,2651,8547,6156,1812,6271,7991,9407,9804,1553,6866,1128,2119,4691,9711,8315,5879,9935,6900,482,682,4126,1041,428,6247,3720,5882,7526,2582,4327,7725,3503,2631 2738,9323,721,7434,1453,6294,2957,3786,5722,6019,8685,4386,3066,9057,6860,499,5315,3045,5194,7111,3137,9104,941,586,3066,755,4177,8819,7040,5309,3583,3897,4428,7788,4721,7249,6559,7324,825,7311,3760,6064,6070,9672,4882,584,1365,9739,9331,5783,2624,7889,1604,1303,1555,7125,8312,425,8936,3233,7724,1480,403,7440,1784,1754,4721,1569,652,3893,4574,5692,9730,4813,9844,8291,9199,7101,3391,8914 6044,2928,9332,3328,8588,447,3830,1176,3523,2705,8365,6136,5442,9049,5526,8575,8869,9031,7280,706,2794,8814,5767,4241,7696,78,6570,556,5083,1426,4502,3336,9518,2292,1885,3740,3153,9348,9331,8051,2759,5407,9028,7840,9255,831,515,2612,9747,7435,8964,4971,2048,4900,5967,8271,1719,9670,2810,6777,1594,6367,6259,8316,3815,1689,6840,9437,4361,822,9619,3065,83,6344,7486,8657,8228,9635,6932,4864 8478,4777,6334,4678,7476,4963,6735,3096,5860,1405,5127,7269,7793,4738,227,9168,2996,8928,765,733,1276,7677,6258,1528,9558,3329,302,8901,1422,8277,6340,645,9125,8869,5952,141,8141,1816,9635,4025,4184,3093,83,2344,2747,9352,7966,1206,1126,1826,218,7939,2957,2729,810,8752,5247,4174,4038,8884,7899,9567,301,5265,5752,7524,4381,1669,3106,8270,6228,6373,754,2547,4240,2313,5514,3022,1040,9738 2265,8192,1763,1369,8469,8789,4836,52,1212,6690,5257,8918,6723,6319,378,4039,2421,8555,8184,9577,1432,7139,8078,5452,9628,7579,4161,7490,5159,8559,1011,81,478,5840,1964,1334,6875,8670,9900,739,1514,8692,522,9316,6955,1345,8132,2277,3193,9773,3923,4177,2183,1236,6747,6575,4874,6003,6409,8187,745,8776,9440,7543,9825,2582,7381,8147,7236,5185,7564,6125,218,7991,6394,391,7659,7456,5128,5294 2132,8992,8160,5782,4420,3371,3798,5054,552,5631,7546,4716,1332,6486,7892,7441,4370,6231,4579,2121,8615,1145,9391,1524,1385,2400,9437,2454,7896,7467,2928,8400,3299,4025,7458,4703,7206,6358,792,6200,725,4275,4136,7390,5984,4502,7929,5085,8176,4600,119,3568,76,9363,6943,2248,9077,9731,6213,5817,6729,4190,3092,6910,759,2682,8380,1254,9604,3011,9291,5329,9453,9746,2739,6522,3765,5634,1113,5789 5304,5499,564,2801,679,2653,1783,3608,7359,7797,3284,796,3222,437,7185,6135,8571,2778,7488,5746,678,6140,861,7750,803,9859,9918,2425,3734,2698,9005,4864,9818,6743,2475,132,9486,3825,5472,919,292,4411,7213,7699,6435,9019,6769,1388,802,2124,1345,8493,9487,8558,7061,8777,8833,2427,2238,5409,4957,8503,3171,7622,5779,6145,2417,5873,5563,5693,9574,9491,1937,7384,4563,6842,5432,2751,3406,7981
-1
TheAlgorithms/Python
8,732
Correct ruff failures
### Describe your change: Fixes #8723 * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
CaedenPH
"2023-05-14T18:14:30Z"
"2023-05-14T21:03:13Z"
793e564e1d4bd6e00b6e2f80869c5fd1fd2872b3
1faf10b5c2dff8cef3f5d59f60a126bd19bb1c44
Correct ruff failures. ### Describe your change: Fixes #8723 * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
[pytest]
markers =
    mat_ops: tests for matrix operations
[pytest]
markers =
    mat_ops: tests for matrix operations
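Editorial note, not part of the dataset row above: the pytest configuration cells register a custom `mat_ops` marker, which keeps pytest from warning about an unknown mark and lets those tests be selected with `pytest -m mat_ops`. The sketch below is a hypothetical test (the test name and matrix values are invented for illustration) showing how such a marker is applied:

```python
import pytest


@pytest.mark.mat_ops  # run only these tests with: pytest -m mat_ops
def test_identity_multiplication():
    # Multiplying by the 2x2 identity matrix should leave the matrix unchanged.
    identity = [[1, 0], [0, 1]]
    matrix = [[2, 3], [4, 5]]
    product = [
        [sum(a * b for a, b in zip(row, col)) for col in zip(*identity)]
        for row in matrix
    ]
    assert product == matrix
```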
-1
TheAlgorithms/Python
8,732
Correct ruff failures
### Describe your change: Fixes #8723 * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
CaedenPH
"2023-05-14T18:14:30Z"
"2023-05-14T21:03:13Z"
793e564e1d4bd6e00b6e2f80869c5fd1fd2872b3
1faf10b5c2dff8cef3f5d59f60a126bd19bb1c44
Correct ruff failures. ### Describe your change: Fixes #8723 * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
from __future__ import annotations from collections.abc import Iterator from typing import Any class Node: def __init__(self, data: Any): self.data: Any = data self.next: Node | None = None class CircularLinkedList: def __init__(self): self.head = None self.tail = None def __iter__(self) -> Iterator[Any]: node = self.head while self.head: yield node.data node = node.next if node == self.head: break def __len__(self) -> int: return sum(1 for _ in self) def __repr__(self): return "->".join(str(item) for item in iter(self)) def insert_tail(self, data: Any) -> None: self.insert_nth(len(self), data) def insert_head(self, data: Any) -> None: self.insert_nth(0, data) def insert_nth(self, index: int, data: Any) -> None: if index < 0 or index > len(self): raise IndexError("list index out of range.") new_node = Node(data) if self.head is None: new_node.next = new_node # first node points itself self.tail = self.head = new_node elif index == 0: # insert at head new_node.next = self.head self.head = self.tail.next = new_node else: temp = self.head for _ in range(index - 1): temp = temp.next new_node.next = temp.next temp.next = new_node if index == len(self) - 1: # insert at tail self.tail = new_node def delete_front(self): return self.delete_nth(0) def delete_tail(self) -> Any: return self.delete_nth(len(self) - 1) def delete_nth(self, index: int = 0) -> Any: if not 0 <= index < len(self): raise IndexError("list index out of range.") delete_node = self.head if self.head == self.tail: # just one node self.head = self.tail = None elif index == 0: # delete head node self.tail.next = self.tail.next.next self.head = self.head.next else: temp = self.head for _ in range(index - 1): temp = temp.next delete_node = temp.next temp.next = temp.next.next if index == len(self) - 1: # delete at tail self.tail = temp return delete_node.data def is_empty(self) -> bool: return len(self) == 0 def test_circular_linked_list() -> None: """ >>> test_circular_linked_list() """ circular_linked_list = CircularLinkedList() assert len(circular_linked_list) == 0 assert circular_linked_list.is_empty() is True assert str(circular_linked_list) == "" try: circular_linked_list.delete_front() raise AssertionError() # This should not happen except IndexError: assert True # This should happen try: circular_linked_list.delete_tail() raise AssertionError() # This should not happen except IndexError: assert True # This should happen try: circular_linked_list.delete_nth(-1) raise AssertionError() except IndexError: assert True try: circular_linked_list.delete_nth(0) raise AssertionError() except IndexError: assert True assert circular_linked_list.is_empty() is True for i in range(5): assert len(circular_linked_list) == i circular_linked_list.insert_nth(i, i + 1) assert str(circular_linked_list) == "->".join(str(i) for i in range(1, 6)) circular_linked_list.insert_tail(6) assert str(circular_linked_list) == "->".join(str(i) for i in range(1, 7)) circular_linked_list.insert_head(0) assert str(circular_linked_list) == "->".join(str(i) for i in range(0, 7)) assert circular_linked_list.delete_front() == 0 assert circular_linked_list.delete_tail() == 6 assert str(circular_linked_list) == "->".join(str(i) for i in range(1, 6)) assert circular_linked_list.delete_nth(2) == 3 circular_linked_list.insert_nth(2, 3) assert str(circular_linked_list) == "->".join(str(i) for i in range(1, 6)) assert circular_linked_list.is_empty() is False if __name__ == "__main__": import doctest doctest.testmod()
from __future__ import annotations from collections.abc import Iterator from typing import Any class Node: def __init__(self, data: Any): self.data: Any = data self.next: Node | None = None class CircularLinkedList: def __init__(self): self.head = None self.tail = None def __iter__(self) -> Iterator[Any]: node = self.head while self.head: yield node.data node = node.next if node == self.head: break def __len__(self) -> int: return sum(1 for _ in self) def __repr__(self): return "->".join(str(item) for item in iter(self)) def insert_tail(self, data: Any) -> None: self.insert_nth(len(self), data) def insert_head(self, data: Any) -> None: self.insert_nth(0, data) def insert_nth(self, index: int, data: Any) -> None: if index < 0 or index > len(self): raise IndexError("list index out of range.") new_node = Node(data) if self.head is None: new_node.next = new_node # first node points itself self.tail = self.head = new_node elif index == 0: # insert at head new_node.next = self.head self.head = self.tail.next = new_node else: temp = self.head for _ in range(index - 1): temp = temp.next new_node.next = temp.next temp.next = new_node if index == len(self) - 1: # insert at tail self.tail = new_node def delete_front(self): return self.delete_nth(0) def delete_tail(self) -> Any: return self.delete_nth(len(self) - 1) def delete_nth(self, index: int = 0) -> Any: if not 0 <= index < len(self): raise IndexError("list index out of range.") delete_node = self.head if self.head == self.tail: # just one node self.head = self.tail = None elif index == 0: # delete head node self.tail.next = self.tail.next.next self.head = self.head.next else: temp = self.head for _ in range(index - 1): temp = temp.next delete_node = temp.next temp.next = temp.next.next if index == len(self) - 1: # delete at tail self.tail = temp return delete_node.data def is_empty(self) -> bool: return len(self) == 0 def test_circular_linked_list() -> None: """ >>> test_circular_linked_list() """ circular_linked_list = CircularLinkedList() assert len(circular_linked_list) == 0 assert circular_linked_list.is_empty() is True assert str(circular_linked_list) == "" try: circular_linked_list.delete_front() raise AssertionError() # This should not happen except IndexError: assert True # This should happen try: circular_linked_list.delete_tail() raise AssertionError() # This should not happen except IndexError: assert True # This should happen try: circular_linked_list.delete_nth(-1) raise AssertionError() except IndexError: assert True try: circular_linked_list.delete_nth(0) raise AssertionError() except IndexError: assert True assert circular_linked_list.is_empty() is True for i in range(5): assert len(circular_linked_list) == i circular_linked_list.insert_nth(i, i + 1) assert str(circular_linked_list) == "->".join(str(i) for i in range(1, 6)) circular_linked_list.insert_tail(6) assert str(circular_linked_list) == "->".join(str(i) for i in range(1, 7)) circular_linked_list.insert_head(0) assert str(circular_linked_list) == "->".join(str(i) for i in range(0, 7)) assert circular_linked_list.delete_front() == 0 assert circular_linked_list.delete_tail() == 6 assert str(circular_linked_list) == "->".join(str(i) for i in range(1, 6)) assert circular_linked_list.delete_nth(2) == 3 circular_linked_list.insert_nth(2, 3) assert str(circular_linked_list) == "->".join(str(i) for i in range(1, 6)) assert circular_linked_list.is_empty() is False if __name__ == "__main__": import doctest doctest.testmod()
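As a hedged illustration of the CircularLinkedList cells above (not part of the dataset row), the sketch below uses the circular wrap-around for a simple round-robin walk. It assumes the class from the listing above is in scope; the item names are invented:

```python
# Assumes the CircularLinkedList class from the listing above is in scope.
def round_robin(items, turns):
    """Yield `turns` items by cycling over a circular linked list."""
    cll = CircularLinkedList()
    for item in items:
        cll.insert_tail(item)
    node = cll.head
    for _ in range(turns):
        yield node.data
        node = node.next  # wraps back to the head because the list is circular


if __name__ == "__main__":
    print(list(round_robin(["alice", "bob", "carol"], 7)))
    # ['alice', 'bob', 'carol', 'alice', 'bob', 'carol', 'alice']
```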
-1
TheAlgorithms/Python
8,732
Correct ruff failures
### Describe your change: Fixes #8723 * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
CaedenPH
"2023-05-14T18:14:30Z"
"2023-05-14T21:03:13Z"
793e564e1d4bd6e00b6e2f80869c5fd1fd2872b3
1faf10b5c2dff8cef3f5d59f60a126bd19bb1c44
Correct ruff failures. ### Describe your change: Fixes #8723 * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
""" python/black : True """ from __future__ import annotations def prime_factors(n: int) -> list[int]: """ Returns prime factors of n as a list. >>> prime_factors(0) [] >>> prime_factors(100) [2, 2, 5, 5] >>> prime_factors(2560) [2, 2, 2, 2, 2, 2, 2, 2, 2, 5] >>> prime_factors(10**-2) [] >>> prime_factors(0.02) [] >>> x = prime_factors(10**241) # doctest: +NORMALIZE_WHITESPACE >>> x == [2]*241 + [5]*241 True >>> prime_factors(10**-354) [] >>> prime_factors('hello') Traceback (most recent call last): ... TypeError: '<=' not supported between instances of 'int' and 'str' >>> prime_factors([1,2,'hello']) Traceback (most recent call last): ... TypeError: '<=' not supported between instances of 'int' and 'list' """ i = 2 factors = [] while i * i <= n: if n % i: i += 1 else: n //= i factors.append(i) if n > 1: factors.append(n) return factors if __name__ == "__main__": import doctest doctest.testmod()
""" python/black : True """ from __future__ import annotations def prime_factors(n: int) -> list[int]: """ Returns prime factors of n as a list. >>> prime_factors(0) [] >>> prime_factors(100) [2, 2, 5, 5] >>> prime_factors(2560) [2, 2, 2, 2, 2, 2, 2, 2, 2, 5] >>> prime_factors(10**-2) [] >>> prime_factors(0.02) [] >>> x = prime_factors(10**241) # doctest: +NORMALIZE_WHITESPACE >>> x == [2]*241 + [5]*241 True >>> prime_factors(10**-354) [] >>> prime_factors('hello') Traceback (most recent call last): ... TypeError: '<=' not supported between instances of 'int' and 'str' >>> prime_factors([1,2,'hello']) Traceback (most recent call last): ... TypeError: '<=' not supported between instances of 'int' and 'list' """ i = 2 factors = [] while i * i <= n: if n % i: i += 1 else: n //= i factors.append(i) if n > 1: factors.append(n) return factors if __name__ == "__main__": import doctest doctest.testmod()
-1
TheAlgorithms/Python
8,732
Correct ruff failures
### Describe your change: Fixes #8723 * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
CaedenPH
"2023-05-14T18:14:30Z"
"2023-05-14T21:03:13Z"
793e564e1d4bd6e00b6e2f80869c5fd1fd2872b3
1faf10b5c2dff8cef3f5d59f60a126bd19bb1c44
Correct ruff failures. ### Describe your change: Fixes #8723 * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
# https://en.wikipedia.org/wiki/Coulomb%27s_law from __future__ import annotations COULOMBS_CONSTANT = 8.988e9 # units = N * m^s * C^-2 def couloumbs_law( force: float, charge1: float, charge2: float, distance: float ) -> dict[str, float]: """ Apply Coulomb's Law on any three given values. These can be force, charge1, charge2, or distance, and then in a Python dict return name/value pair of the zero value. Coulomb's Law states that the magnitude of the electrostatic force of attraction or repulsion between two point charges is directly proportional to the product of the magnitudes of charges and inversely proportional to the square of the distance between them. Reference ---------- Coulomb (1785) "Premier mémoire sur l’électricité et le magnétisme," Histoire de l’Académie Royale des Sciences, pp. 569–577. Parameters ---------- force : float with units in Newtons charge1 : float with units in Coulombs charge2 : float with units in Coulombs distance : float with units in meters Returns ------- result : dict name/value pair of the zero value >>> couloumbs_law(force=0, charge1=3, charge2=5, distance=2000) {'force': 33705.0} >>> couloumbs_law(force=10, charge1=3, charge2=5, distance=0) {'distance': 116112.01488218177} >>> couloumbs_law(force=10, charge1=0, charge2=5, distance=2000) {'charge1': 0.0008900756564307966} >>> couloumbs_law(force=0, charge1=0, charge2=5, distance=2000) Traceback (most recent call last): ... ValueError: One and only one argument must be 0 >>> couloumbs_law(force=0, charge1=3, charge2=5, distance=-2000) Traceback (most recent call last): ... ValueError: Distance cannot be negative """ charge_product = abs(charge1 * charge2) if (force, charge1, charge2, distance).count(0) != 1: raise ValueError("One and only one argument must be 0") if distance < 0: raise ValueError("Distance cannot be negative") if force == 0: force = COULOMBS_CONSTANT * charge_product / (distance**2) return {"force": force} elif charge1 == 0: charge1 = abs(force) * (distance**2) / (COULOMBS_CONSTANT * charge2) return {"charge1": charge1} elif charge2 == 0: charge2 = abs(force) * (distance**2) / (COULOMBS_CONSTANT * charge1) return {"charge2": charge2} elif distance == 0: distance = (COULOMBS_CONSTANT * charge_product / abs(force)) ** 0.5 return {"distance": distance} raise ValueError("Exactly one argument must be 0") if __name__ == "__main__": import doctest doctest.testmod()
# https://en.wikipedia.org/wiki/Coulomb%27s_law from __future__ import annotations COULOMBS_CONSTANT = 8.988e9 # units = N * m^s * C^-2 def couloumbs_law( force: float, charge1: float, charge2: float, distance: float ) -> dict[str, float]: """ Apply Coulomb's Law on any three given values. These can be force, charge1, charge2, or distance, and then in a Python dict return name/value pair of the zero value. Coulomb's Law states that the magnitude of the electrostatic force of attraction or repulsion between two point charges is directly proportional to the product of the magnitudes of charges and inversely proportional to the square of the distance between them. Reference ---------- Coulomb (1785) "Premier mémoire sur l’électricité et le magnétisme," Histoire de l’Académie Royale des Sciences, pp. 569–577. Parameters ---------- force : float with units in Newtons charge1 : float with units in Coulombs charge2 : float with units in Coulombs distance : float with units in meters Returns ------- result : dict name/value pair of the zero value >>> couloumbs_law(force=0, charge1=3, charge2=5, distance=2000) {'force': 33705.0} >>> couloumbs_law(force=10, charge1=3, charge2=5, distance=0) {'distance': 116112.01488218177} >>> couloumbs_law(force=10, charge1=0, charge2=5, distance=2000) {'charge1': 0.0008900756564307966} >>> couloumbs_law(force=0, charge1=0, charge2=5, distance=2000) Traceback (most recent call last): ... ValueError: One and only one argument must be 0 >>> couloumbs_law(force=0, charge1=3, charge2=5, distance=-2000) Traceback (most recent call last): ... ValueError: Distance cannot be negative """ charge_product = abs(charge1 * charge2) if (force, charge1, charge2, distance).count(0) != 1: raise ValueError("One and only one argument must be 0") if distance < 0: raise ValueError("Distance cannot be negative") if force == 0: force = COULOMBS_CONSTANT * charge_product / (distance**2) return {"force": force} elif charge1 == 0: charge1 = abs(force) * (distance**2) / (COULOMBS_CONSTANT * charge2) return {"charge1": charge1} elif charge2 == 0: charge2 = abs(force) * (distance**2) / (COULOMBS_CONSTANT * charge1) return {"charge2": charge2} elif distance == 0: distance = (COULOMBS_CONSTANT * charge_product / abs(force)) ** 0.5 return {"distance": distance} raise ValueError("Exactly one argument must be 0") if __name__ == "__main__": import doctest doctest.testmod()
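A brief, hedged usage note for the couloumbs_law cells above (not part of the dataset row): passing exactly one zero argument selects which quantity is solved for, by rearranging F = k * q1 * q2 / d**2. The charge and distance values below are arbitrary illustration figures:

```python
# Assumes couloumbs_law() from the listing above is in scope.

# Force between two 1 C charges held 1 m apart: k * q1 * q2 / d**2.
print(couloumbs_law(force=0, charge1=1, charge2=1, distance=1))
# {'force': 8988000000.0}

# Separation giving a 1 N force between the same charges: sqrt(k * q1 * q2 / F).
print(couloumbs_law(force=1, charge1=1, charge2=1, distance=0))
# {'distance': 94805.06...}
```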
-1
TheAlgorithms/Python
8,732
Correct ruff failures
### Describe your change: Fixes #8723 * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
CaedenPH
"2023-05-14T18:14:30Z"
"2023-05-14T21:03:13Z"
793e564e1d4bd6e00b6e2f80869c5fd1fd2872b3
1faf10b5c2dff8cef3f5d59f60a126bd19bb1c44
Correct ruff failures. ### Describe your change: Fixes #8723 * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
-1
TheAlgorithms/Python
8,732
Correct ruff failures
### Describe your change: Fixes #8723 * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
CaedenPH
"2023-05-14T18:14:30Z"
"2023-05-14T21:03:13Z"
793e564e1d4bd6e00b6e2f80869c5fd1fd2872b3
1faf10b5c2dff8cef3f5d59f60a126bd19bb1c44
Correct ruff failures. ### Describe your change: Fixes #8723 * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
""" Project Euler problem 145: https://projecteuler.net/problem=145 Author: Vineet Rao, Maxim Smolskiy Problem statement: Some positive integers n have the property that the sum [ n + reverse(n) ] consists entirely of odd (decimal) digits. For instance, 36 + 63 = 99 and 409 + 904 = 1313. We will call such numbers reversible; so 36, 63, 409, and 904 are reversible. Leading zeroes are not allowed in either n or reverse(n). There are 120 reversible numbers below one-thousand. How many reversible numbers are there below one-billion (10^9)? """ EVEN_DIGITS = [0, 2, 4, 6, 8] ODD_DIGITS = [1, 3, 5, 7, 9] def reversible_numbers( remaining_length: int, remainder: int, digits: list[int], length: int ) -> int: """ Count the number of reversible numbers of given length. Iterate over possible digits considering parity of current sum remainder. >>> reversible_numbers(1, 0, [0], 1) 0 >>> reversible_numbers(2, 0, [0] * 2, 2) 20 >>> reversible_numbers(3, 0, [0] * 3, 3) 100 """ if remaining_length == 0: if digits[0] == 0 or digits[-1] == 0: return 0 for i in range(length // 2 - 1, -1, -1): remainder += digits[i] + digits[length - i - 1] if remainder % 2 == 0: return 0 remainder //= 10 return 1 if remaining_length == 1: if remainder % 2 == 0: return 0 result = 0 for digit in range(10): digits[length // 2] = digit result += reversible_numbers( 0, (remainder + 2 * digit) // 10, digits, length ) return result result = 0 for digit1 in range(10): digits[(length + remaining_length) // 2 - 1] = digit1 if (remainder + digit1) % 2 == 0: other_parity_digits = ODD_DIGITS else: other_parity_digits = EVEN_DIGITS for digit2 in other_parity_digits: digits[(length - remaining_length) // 2] = digit2 result += reversible_numbers( remaining_length - 2, (remainder + digit1 + digit2) // 10, digits, length, ) return result def solution(max_power: int = 9) -> int: """ To evaluate the solution, use solution() >>> solution(3) 120 >>> solution(6) 18720 >>> solution(7) 68720 """ result = 0 for length in range(1, max_power + 1): result += reversible_numbers(length, 0, [0] * length, length) return result if __name__ == "__main__": print(f"{solution() = }")
""" Project Euler problem 145: https://projecteuler.net/problem=145 Author: Vineet Rao, Maxim Smolskiy Problem statement: Some positive integers n have the property that the sum [ n + reverse(n) ] consists entirely of odd (decimal) digits. For instance, 36 + 63 = 99 and 409 + 904 = 1313. We will call such numbers reversible; so 36, 63, 409, and 904 are reversible. Leading zeroes are not allowed in either n or reverse(n). There are 120 reversible numbers below one-thousand. How many reversible numbers are there below one-billion (10^9)? """ EVEN_DIGITS = [0, 2, 4, 6, 8] ODD_DIGITS = [1, 3, 5, 7, 9] def reversible_numbers( remaining_length: int, remainder: int, digits: list[int], length: int ) -> int: """ Count the number of reversible numbers of given length. Iterate over possible digits considering parity of current sum remainder. >>> reversible_numbers(1, 0, [0], 1) 0 >>> reversible_numbers(2, 0, [0] * 2, 2) 20 >>> reversible_numbers(3, 0, [0] * 3, 3) 100 """ if remaining_length == 0: if digits[0] == 0 or digits[-1] == 0: return 0 for i in range(length // 2 - 1, -1, -1): remainder += digits[i] + digits[length - i - 1] if remainder % 2 == 0: return 0 remainder //= 10 return 1 if remaining_length == 1: if remainder % 2 == 0: return 0 result = 0 for digit in range(10): digits[length // 2] = digit result += reversible_numbers( 0, (remainder + 2 * digit) // 10, digits, length ) return result result = 0 for digit1 in range(10): digits[(length + remaining_length) // 2 - 1] = digit1 if (remainder + digit1) % 2 == 0: other_parity_digits = ODD_DIGITS else: other_parity_digits = EVEN_DIGITS for digit2 in other_parity_digits: digits[(length - remaining_length) // 2] = digit2 result += reversible_numbers( remaining_length - 2, (remainder + digit1 + digit2) // 10, digits, length, ) return result def solution(max_power: int = 9) -> int: """ To evaluate the solution, use solution() >>> solution(3) 120 >>> solution(6) 18720 >>> solution(7) 68720 """ result = 0 for length in range(1, max_power + 1): result += reversible_numbers(length, 0, [0] * length, length) return result if __name__ == "__main__": print(f"{solution() = }")
-1
TheAlgorithms/Python
8,732
Correct ruff failures
### Describe your change: Fixes #8723 * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
CaedenPH
"2023-05-14T18:14:30Z"
"2023-05-14T21:03:13Z"
793e564e1d4bd6e00b6e2f80869c5fd1fd2872b3
1faf10b5c2dff8cef3f5d59f60a126bd19bb1c44
Correct ruff failures. ### Describe your change: Fixes #8723 * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
-1
TheAlgorithms/Python
8,732
Correct ruff failures
### Describe your change: Fixes #8723 * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
CaedenPH
"2023-05-14T18:14:30Z"
"2023-05-14T21:03:13Z"
793e564e1d4bd6e00b6e2f80869c5fd1fd2872b3
1faf10b5c2dff8cef3f5d59f60a126bd19bb1c44
Correct ruff failures. ### Describe your change: Fixes #8723 * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
def binary_count_setbits(a: int) -> int: """ Take in 1 integer, return a number that is the number of 1's in binary representation of that number. >>> binary_count_setbits(25) 3 >>> binary_count_setbits(36) 2 >>> binary_count_setbits(16) 1 >>> binary_count_setbits(58) 4 >>> binary_count_setbits(4294967295) 32 >>> binary_count_setbits(0) 0 >>> binary_count_setbits(-10) Traceback (most recent call last): ... ValueError: Input value must be a positive integer >>> binary_count_setbits(0.8) Traceback (most recent call last): ... TypeError: Input value must be a 'int' type >>> binary_count_setbits("0") Traceback (most recent call last): ... TypeError: '<' not supported between instances of 'str' and 'int' """ if a < 0: raise ValueError("Input value must be a positive integer") elif isinstance(a, float): raise TypeError("Input value must be a 'int' type") return bin(a).count("1") if __name__ == "__main__": import doctest doctest.testmod()
def binary_count_setbits(a: int) -> int: """ Take in 1 integer, return a number that is the number of 1's in binary representation of that number. >>> binary_count_setbits(25) 3 >>> binary_count_setbits(36) 2 >>> binary_count_setbits(16) 1 >>> binary_count_setbits(58) 4 >>> binary_count_setbits(4294967295) 32 >>> binary_count_setbits(0) 0 >>> binary_count_setbits(-10) Traceback (most recent call last): ... ValueError: Input value must be a positive integer >>> binary_count_setbits(0.8) Traceback (most recent call last): ... TypeError: Input value must be a 'int' type >>> binary_count_setbits("0") Traceback (most recent call last): ... TypeError: '<' not supported between instances of 'str' and 'int' """ if a < 0: raise ValueError("Input value must be a positive integer") elif isinstance(a, float): raise TypeError("Input value must be a 'int' type") return bin(a).count("1") if __name__ == "__main__": import doctest doctest.testmod()
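For the bit-counting cells above, a quick hedged cross-check (not part of the dataset row): Brian Kernighan's trick gives the same answer without converting to a string, and on Python 3.10+ the result also matches the built-in int.bit_count(). The sample values are arbitrary:

```python
def kernighan_popcount(n: int) -> int:
    """Count set bits by repeatedly clearing the lowest set bit."""
    count = 0
    while n:
        n &= n - 1  # clears the lowest set bit
        count += 1
    return count


if __name__ == "__main__":
    for value in (25, 36, 4294967295):
        assert kernighan_popcount(value) == bin(value).count("1")
        # On Python 3.10+ this also equals value.bit_count()
    print("popcount checks passed")
```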
-1
TheAlgorithms/Python
8,732
Correct ruff failures
### Describe your change: Fixes #8723 * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
CaedenPH
"2023-05-14T18:14:30Z"
"2023-05-14T21:03:13Z"
793e564e1d4bd6e00b6e2f80869c5fd1fd2872b3
1faf10b5c2dff8cef3f5d59f60a126bd19bb1c44
Correct ruff failures. ### Describe your change: Fixes #8723 * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
# Computer Vision

Computer vision is a field of computer science that enables computers to see, identify and process images in much the same way that humans do, and to produce appropriate output. It is like imparting human intelligence and instincts to a computer.

Image processing and computer vision are closely related but not identical. Image processing applies algorithms that transform an image from one form to another, such as smoothing, contrast adjustment or stretching. Computer vision builds on image processing with machine learning: it applies learned patterns to interpret images, much like the visual reasoning of human vision.

* <https://en.wikipedia.org/wiki/Computer_vision>
* <https://www.algorithmia.com/blog/introduction-to-computer-vision>
# Computer Vision

Computer vision is a field of computer science that enables computers to see, identify and process images in much the same way that humans do, and to produce appropriate output. It is like imparting human intelligence and instincts to a computer.

Image processing and computer vision are closely related but not identical. Image processing applies algorithms that transform an image from one form to another, such as smoothing, contrast adjustment or stretching. Computer vision builds on image processing with machine learning: it applies learned patterns to interpret images, much like the visual reasoning of human vision.

* <https://en.wikipedia.org/wiki/Computer_vision>
* <https://www.algorithmia.com/blog/introduction-to-computer-vision>
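To make the image-processing side of the distinction above concrete, here is a minimal, hedged sketch of one such transform, a 3x3 box blur for smoothing, written with NumPy; the array values are invented and nothing here comes from the README itself:

```python
import numpy as np


def box_blur(image: np.ndarray) -> np.ndarray:
    """Smooth a 2-D grayscale image with a 3x3 mean (box) filter."""
    padded = np.pad(image, 1, mode="edge")
    rows, cols = image.shape
    out = np.zeros((rows, cols), dtype=float)
    for dr in range(3):
        for dc in range(3):
            out += padded[dr : dr + rows, dc : dc + cols]
    return out / 9.0


if __name__ == "__main__":
    img = np.array([[0, 0, 0], [0, 9, 0], [0, 0, 0]], dtype=float)
    print(box_blur(img))  # the bright centre pixel is averaged into its neighbourhood
```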
-1
TheAlgorithms/Python
8,685
Solving the `Top k most frequent words` problem using a max-heap
Contributed by @aparibocci in #8125 * [x] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
cclauss
"2023-04-23T22:13:44Z"
"2023-04-27T17:32:07Z"
c1b3ea5355266bb47daba378ca10200c4d359453
4c1f876567673db0934ba65d662ea221465ec921
Solving the `Top k most frequent words` problem using a max-heap. Contributed by @aparibocci in #8125 * [x] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
## Arithmetic Analysis * [Bisection](arithmetic_analysis/bisection.py) * [Gaussian Elimination](arithmetic_analysis/gaussian_elimination.py) * [In Static Equilibrium](arithmetic_analysis/in_static_equilibrium.py) * [Intersection](arithmetic_analysis/intersection.py) * [Jacobi Iteration Method](arithmetic_analysis/jacobi_iteration_method.py) * [Lu Decomposition](arithmetic_analysis/lu_decomposition.py) * [Newton Forward Interpolation](arithmetic_analysis/newton_forward_interpolation.py) * [Newton Method](arithmetic_analysis/newton_method.py) * [Newton Raphson](arithmetic_analysis/newton_raphson.py) * [Newton Raphson New](arithmetic_analysis/newton_raphson_new.py) * [Secant Method](arithmetic_analysis/secant_method.py) ## Audio Filters * [Butterworth Filter](audio_filters/butterworth_filter.py) * [Iir Filter](audio_filters/iir_filter.py) * [Show Response](audio_filters/show_response.py) ## Backtracking * [All Combinations](backtracking/all_combinations.py) * [All Permutations](backtracking/all_permutations.py) * [All Subsequences](backtracking/all_subsequences.py) * [Coloring](backtracking/coloring.py) * [Combination Sum](backtracking/combination_sum.py) * [Hamiltonian Cycle](backtracking/hamiltonian_cycle.py) * [Knight Tour](backtracking/knight_tour.py) * [Minimax](backtracking/minimax.py) * [Minmax](backtracking/minmax.py) * [N Queens](backtracking/n_queens.py) * [N Queens Math](backtracking/n_queens_math.py) * [Rat In Maze](backtracking/rat_in_maze.py) * [Sudoku](backtracking/sudoku.py) * [Sum Of Subsets](backtracking/sum_of_subsets.py) * [Word Search](backtracking/word_search.py) ## Bit Manipulation * [Binary And Operator](bit_manipulation/binary_and_operator.py) * [Binary Count Setbits](bit_manipulation/binary_count_setbits.py) * [Binary Count Trailing Zeros](bit_manipulation/binary_count_trailing_zeros.py) * [Binary Or Operator](bit_manipulation/binary_or_operator.py) * [Binary Shifts](bit_manipulation/binary_shifts.py) * [Binary Twos Complement](bit_manipulation/binary_twos_complement.py) * [Binary Xor Operator](bit_manipulation/binary_xor_operator.py) * [Count 1S Brian Kernighan Method](bit_manipulation/count_1s_brian_kernighan_method.py) * [Count Number Of One Bits](bit_manipulation/count_number_of_one_bits.py) * [Gray Code Sequence](bit_manipulation/gray_code_sequence.py) * [Highest Set Bit](bit_manipulation/highest_set_bit.py) * [Index Of Rightmost Set Bit](bit_manipulation/index_of_rightmost_set_bit.py) * [Is Even](bit_manipulation/is_even.py) * [Is Power Of Two](bit_manipulation/is_power_of_two.py) * [Numbers Different Signs](bit_manipulation/numbers_different_signs.py) * [Reverse Bits](bit_manipulation/reverse_bits.py) * [Single Bit Manipulation Operations](bit_manipulation/single_bit_manipulation_operations.py) ## Blockchain * [Chinese Remainder Theorem](blockchain/chinese_remainder_theorem.py) * [Diophantine Equation](blockchain/diophantine_equation.py) * [Modular Division](blockchain/modular_division.py) ## Boolean Algebra * [And Gate](boolean_algebra/and_gate.py) * [Nand Gate](boolean_algebra/nand_gate.py) * [Norgate](boolean_algebra/norgate.py) * [Not Gate](boolean_algebra/not_gate.py) * [Or Gate](boolean_algebra/or_gate.py) * [Quine Mc Cluskey](boolean_algebra/quine_mc_cluskey.py) * [Xnor Gate](boolean_algebra/xnor_gate.py) * [Xor Gate](boolean_algebra/xor_gate.py) ## Cellular Automata * [Conways Game Of Life](cellular_automata/conways_game_of_life.py) * [Game Of Life](cellular_automata/game_of_life.py) * [Nagel Schrekenberg](cellular_automata/nagel_schrekenberg.py) * 
[One Dimensional](cellular_automata/one_dimensional.py) ## Ciphers * [A1Z26](ciphers/a1z26.py) * [Affine Cipher](ciphers/affine_cipher.py) * [Atbash](ciphers/atbash.py) * [Autokey](ciphers/autokey.py) * [Baconian Cipher](ciphers/baconian_cipher.py) * [Base16](ciphers/base16.py) * [Base32](ciphers/base32.py) * [Base64](ciphers/base64.py) * [Base85](ciphers/base85.py) * [Beaufort Cipher](ciphers/beaufort_cipher.py) * [Bifid](ciphers/bifid.py) * [Brute Force Caesar Cipher](ciphers/brute_force_caesar_cipher.py) * [Caesar Cipher](ciphers/caesar_cipher.py) * [Cryptomath Module](ciphers/cryptomath_module.py) * [Decrypt Caesar With Chi Squared](ciphers/decrypt_caesar_with_chi_squared.py) * [Deterministic Miller Rabin](ciphers/deterministic_miller_rabin.py) * [Diffie](ciphers/diffie.py) * [Diffie Hellman](ciphers/diffie_hellman.py) * [Elgamal Key Generator](ciphers/elgamal_key_generator.py) * [Enigma Machine2](ciphers/enigma_machine2.py) * [Hill Cipher](ciphers/hill_cipher.py) * [Mixed Keyword Cypher](ciphers/mixed_keyword_cypher.py) * [Mono Alphabetic Ciphers](ciphers/mono_alphabetic_ciphers.py) * [Morse Code](ciphers/morse_code.py) * [Onepad Cipher](ciphers/onepad_cipher.py) * [Playfair Cipher](ciphers/playfair_cipher.py) * [Polybius](ciphers/polybius.py) * [Porta Cipher](ciphers/porta_cipher.py) * [Rabin Miller](ciphers/rabin_miller.py) * [Rail Fence Cipher](ciphers/rail_fence_cipher.py) * [Rot13](ciphers/rot13.py) * [Rsa Cipher](ciphers/rsa_cipher.py) * [Rsa Factorization](ciphers/rsa_factorization.py) * [Rsa Key Generator](ciphers/rsa_key_generator.py) * [Shuffled Shift Cipher](ciphers/shuffled_shift_cipher.py) * [Simple Keyword Cypher](ciphers/simple_keyword_cypher.py) * [Simple Substitution Cipher](ciphers/simple_substitution_cipher.py) * [Trafid Cipher](ciphers/trafid_cipher.py) * [Transposition Cipher](ciphers/transposition_cipher.py) * [Transposition Cipher Encrypt Decrypt File](ciphers/transposition_cipher_encrypt_decrypt_file.py) * [Vigenere Cipher](ciphers/vigenere_cipher.py) * [Xor Cipher](ciphers/xor_cipher.py) ## Compression * [Burrows Wheeler](compression/burrows_wheeler.py) * [Huffman](compression/huffman.py) * [Lempel Ziv](compression/lempel_ziv.py) * [Lempel Ziv Decompress](compression/lempel_ziv_decompress.py) * [Lz77](compression/lz77.py) * [Peak Signal To Noise Ratio](compression/peak_signal_to_noise_ratio.py) * [Run Length Encoding](compression/run_length_encoding.py) ## Computer Vision * [Cnn Classification](computer_vision/cnn_classification.py) * [Flip Augmentation](computer_vision/flip_augmentation.py) * [Harris Corner](computer_vision/harris_corner.py) * [Horn Schunck](computer_vision/horn_schunck.py) * [Mean Threshold](computer_vision/mean_threshold.py) * [Mosaic Augmentation](computer_vision/mosaic_augmentation.py) * [Pooling Functions](computer_vision/pooling_functions.py) ## Conversions * [Astronomical Length Scale Conversion](conversions/astronomical_length_scale_conversion.py) * [Binary To Decimal](conversions/binary_to_decimal.py) * [Binary To Hexadecimal](conversions/binary_to_hexadecimal.py) * [Binary To Octal](conversions/binary_to_octal.py) * [Decimal To Any](conversions/decimal_to_any.py) * [Decimal To Binary](conversions/decimal_to_binary.py) * [Decimal To Binary Recursion](conversions/decimal_to_binary_recursion.py) * [Decimal To Hexadecimal](conversions/decimal_to_hexadecimal.py) * [Decimal To Octal](conversions/decimal_to_octal.py) * [Excel Title To Column](conversions/excel_title_to_column.py) * [Hex To Bin](conversions/hex_to_bin.py) * [Hexadecimal To 
Decimal](conversions/hexadecimal_to_decimal.py) * [Length Conversion](conversions/length_conversion.py) * [Molecular Chemistry](conversions/molecular_chemistry.py) * [Octal To Decimal](conversions/octal_to_decimal.py) * [Prefix Conversions](conversions/prefix_conversions.py) * [Prefix Conversions String](conversions/prefix_conversions_string.py) * [Pressure Conversions](conversions/pressure_conversions.py) * [Rgb Hsv Conversion](conversions/rgb_hsv_conversion.py) * [Roman Numerals](conversions/roman_numerals.py) * [Speed Conversions](conversions/speed_conversions.py) * [Temperature Conversions](conversions/temperature_conversions.py) * [Volume Conversions](conversions/volume_conversions.py) * [Weight Conversion](conversions/weight_conversion.py) ## Data Structures * Arrays * [Permutations](data_structures/arrays/permutations.py) * [Prefix Sum](data_structures/arrays/prefix_sum.py) * Binary Tree * [Avl Tree](data_structures/binary_tree/avl_tree.py) * [Basic Binary Tree](data_structures/binary_tree/basic_binary_tree.py) * [Binary Search Tree](data_structures/binary_tree/binary_search_tree.py) * [Binary Search Tree Recursive](data_structures/binary_tree/binary_search_tree_recursive.py) * [Binary Tree Mirror](data_structures/binary_tree/binary_tree_mirror.py) * [Binary Tree Node Sum](data_structures/binary_tree/binary_tree_node_sum.py) * [Binary Tree Path Sum](data_structures/binary_tree/binary_tree_path_sum.py) * [Binary Tree Traversals](data_structures/binary_tree/binary_tree_traversals.py) * [Diff Views Of Binary Tree](data_structures/binary_tree/diff_views_of_binary_tree.py) * [Distribute Coins](data_structures/binary_tree/distribute_coins.py) * [Fenwick Tree](data_structures/binary_tree/fenwick_tree.py) * [Inorder Tree Traversal 2022](data_structures/binary_tree/inorder_tree_traversal_2022.py) * [Is Bst](data_structures/binary_tree/is_bst.py) * [Lazy Segment Tree](data_structures/binary_tree/lazy_segment_tree.py) * [Lowest Common Ancestor](data_structures/binary_tree/lowest_common_ancestor.py) * [Maximum Fenwick Tree](data_structures/binary_tree/maximum_fenwick_tree.py) * [Merge Two Binary Trees](data_structures/binary_tree/merge_two_binary_trees.py) * [Non Recursive Segment Tree](data_structures/binary_tree/non_recursive_segment_tree.py) * [Number Of Possible Binary Trees](data_structures/binary_tree/number_of_possible_binary_trees.py) * [Red Black Tree](data_structures/binary_tree/red_black_tree.py) * [Segment Tree](data_structures/binary_tree/segment_tree.py) * [Segment Tree Other](data_structures/binary_tree/segment_tree_other.py) * [Treap](data_structures/binary_tree/treap.py) * [Wavelet Tree](data_structures/binary_tree/wavelet_tree.py) * Disjoint Set * [Alternate Disjoint Set](data_structures/disjoint_set/alternate_disjoint_set.py) * [Disjoint Set](data_structures/disjoint_set/disjoint_set.py) * Hashing * [Bloom Filter](data_structures/hashing/bloom_filter.py) * [Double Hash](data_structures/hashing/double_hash.py) * [Hash Map](data_structures/hashing/hash_map.py) * [Hash Table](data_structures/hashing/hash_table.py) * [Hash Table With Linked List](data_structures/hashing/hash_table_with_linked_list.py) * Number Theory * [Prime Numbers](data_structures/hashing/number_theory/prime_numbers.py) * [Quadratic Probing](data_structures/hashing/quadratic_probing.py) * Tests * [Test Hash Map](data_structures/hashing/tests/test_hash_map.py) * Heap * [Binomial Heap](data_structures/heap/binomial_heap.py) * [Heap](data_structures/heap/heap.py) * [Heap 
Generic](data_structures/heap/heap_generic.py) * [Max Heap](data_structures/heap/max_heap.py) * [Min Heap](data_structures/heap/min_heap.py) * [Randomized Heap](data_structures/heap/randomized_heap.py) * [Skew Heap](data_structures/heap/skew_heap.py) * Linked List * [Circular Linked List](data_structures/linked_list/circular_linked_list.py) * [Deque Doubly](data_structures/linked_list/deque_doubly.py) * [Doubly Linked List](data_structures/linked_list/doubly_linked_list.py) * [Doubly Linked List Two](data_structures/linked_list/doubly_linked_list_two.py) * [From Sequence](data_structures/linked_list/from_sequence.py) * [Has Loop](data_structures/linked_list/has_loop.py) * [Is Palindrome](data_structures/linked_list/is_palindrome.py) * [Merge Two Lists](data_structures/linked_list/merge_two_lists.py) * [Middle Element Of Linked List](data_structures/linked_list/middle_element_of_linked_list.py) * [Print Reverse](data_structures/linked_list/print_reverse.py) * [Singly Linked List](data_structures/linked_list/singly_linked_list.py) * [Skip List](data_structures/linked_list/skip_list.py) * [Swap Nodes](data_structures/linked_list/swap_nodes.py) * Queue * [Circular Queue](data_structures/queue/circular_queue.py) * [Circular Queue Linked List](data_structures/queue/circular_queue_linked_list.py) * [Double Ended Queue](data_structures/queue/double_ended_queue.py) * [Linked Queue](data_structures/queue/linked_queue.py) * [Priority Queue Using List](data_structures/queue/priority_queue_using_list.py) * [Queue By Two Stacks](data_structures/queue/queue_by_two_stacks.py) * [Queue On List](data_structures/queue/queue_on_list.py) * [Queue On Pseudo Stack](data_structures/queue/queue_on_pseudo_stack.py) * Stacks * [Balanced Parentheses](data_structures/stacks/balanced_parentheses.py) * [Dijkstras Two Stack Algorithm](data_structures/stacks/dijkstras_two_stack_algorithm.py) * [Evaluate Postfix Notations](data_structures/stacks/evaluate_postfix_notations.py) * [Infix To Postfix Conversion](data_structures/stacks/infix_to_postfix_conversion.py) * [Infix To Prefix Conversion](data_structures/stacks/infix_to_prefix_conversion.py) * [Next Greater Element](data_structures/stacks/next_greater_element.py) * [Postfix Evaluation](data_structures/stacks/postfix_evaluation.py) * [Prefix Evaluation](data_structures/stacks/prefix_evaluation.py) * [Stack](data_structures/stacks/stack.py) * [Stack With Doubly Linked List](data_structures/stacks/stack_with_doubly_linked_list.py) * [Stack With Singly Linked List](data_structures/stacks/stack_with_singly_linked_list.py) * [Stock Span Problem](data_structures/stacks/stock_span_problem.py) * Trie * [Radix Tree](data_structures/trie/radix_tree.py) * [Trie](data_structures/trie/trie.py) ## Digital Image Processing * [Change Brightness](digital_image_processing/change_brightness.py) * [Change Contrast](digital_image_processing/change_contrast.py) * [Convert To Negative](digital_image_processing/convert_to_negative.py) * Dithering * [Burkes](digital_image_processing/dithering/burkes.py) * Edge Detection * [Canny](digital_image_processing/edge_detection/canny.py) * Filters * [Bilateral Filter](digital_image_processing/filters/bilateral_filter.py) * [Convolve](digital_image_processing/filters/convolve.py) * [Gabor Filter](digital_image_processing/filters/gabor_filter.py) * [Gaussian Filter](digital_image_processing/filters/gaussian_filter.py) * [Local Binary Pattern](digital_image_processing/filters/local_binary_pattern.py) * [Median 
Filter](digital_image_processing/filters/median_filter.py) * [Sobel Filter](digital_image_processing/filters/sobel_filter.py) * Histogram Equalization * [Histogram Stretch](digital_image_processing/histogram_equalization/histogram_stretch.py) * [Index Calculation](digital_image_processing/index_calculation.py) * Morphological Operations * [Dilation Operation](digital_image_processing/morphological_operations/dilation_operation.py) * [Erosion Operation](digital_image_processing/morphological_operations/erosion_operation.py) * Resize * [Resize](digital_image_processing/resize/resize.py) * Rotation * [Rotation](digital_image_processing/rotation/rotation.py) * [Sepia](digital_image_processing/sepia.py) * [Test Digital Image Processing](digital_image_processing/test_digital_image_processing.py) ## Divide And Conquer * [Closest Pair Of Points](divide_and_conquer/closest_pair_of_points.py) * [Convex Hull](divide_and_conquer/convex_hull.py) * [Heaps Algorithm](divide_and_conquer/heaps_algorithm.py) * [Heaps Algorithm Iterative](divide_and_conquer/heaps_algorithm_iterative.py) * [Inversions](divide_and_conquer/inversions.py) * [Kth Order Statistic](divide_and_conquer/kth_order_statistic.py) * [Max Difference Pair](divide_and_conquer/max_difference_pair.py) * [Max Subarray Sum](divide_and_conquer/max_subarray_sum.py) * [Mergesort](divide_and_conquer/mergesort.py) * [Peak](divide_and_conquer/peak.py) * [Power](divide_and_conquer/power.py) * [Strassen Matrix Multiplication](divide_and_conquer/strassen_matrix_multiplication.py) ## Dynamic Programming * [Abbreviation](dynamic_programming/abbreviation.py) * [All Construct](dynamic_programming/all_construct.py) * [Bitmask](dynamic_programming/bitmask.py) * [Catalan Numbers](dynamic_programming/catalan_numbers.py) * [Climbing Stairs](dynamic_programming/climbing_stairs.py) * [Combination Sum Iv](dynamic_programming/combination_sum_iv.py) * [Edit Distance](dynamic_programming/edit_distance.py) * [Factorial](dynamic_programming/factorial.py) * [Fast Fibonacci](dynamic_programming/fast_fibonacci.py) * [Fibonacci](dynamic_programming/fibonacci.py) * [Fizz Buzz](dynamic_programming/fizz_buzz.py) * [Floyd Warshall](dynamic_programming/floyd_warshall.py) * [Integer Partition](dynamic_programming/integer_partition.py) * [Iterating Through Submasks](dynamic_programming/iterating_through_submasks.py) * [K Means Clustering Tensorflow](dynamic_programming/k_means_clustering_tensorflow.py) * [Knapsack](dynamic_programming/knapsack.py) * [Longest Common Subsequence](dynamic_programming/longest_common_subsequence.py) * [Longest Common Substring](dynamic_programming/longest_common_substring.py) * [Longest Increasing Subsequence](dynamic_programming/longest_increasing_subsequence.py) * [Longest Increasing Subsequence O(Nlogn)](dynamic_programming/longest_increasing_subsequence_o(nlogn).py) * [Longest Sub Array](dynamic_programming/longest_sub_array.py) * [Matrix Chain Order](dynamic_programming/matrix_chain_order.py) * [Max Non Adjacent Sum](dynamic_programming/max_non_adjacent_sum.py) * [Max Product Subarray](dynamic_programming/max_product_subarray.py) * [Max Sub Array](dynamic_programming/max_sub_array.py) * [Max Sum Contiguous Subsequence](dynamic_programming/max_sum_contiguous_subsequence.py) * [Min Distance Up Bottom](dynamic_programming/min_distance_up_bottom.py) * [Minimum Coin Change](dynamic_programming/minimum_coin_change.py) * [Minimum Cost Path](dynamic_programming/minimum_cost_path.py) * [Minimum Partition](dynamic_programming/minimum_partition.py) * [Minimum 
Size Subarray Sum](dynamic_programming/minimum_size_subarray_sum.py) * [Minimum Squares To Represent A Number](dynamic_programming/minimum_squares_to_represent_a_number.py) * [Minimum Steps To One](dynamic_programming/minimum_steps_to_one.py) * [Minimum Tickets Cost](dynamic_programming/minimum_tickets_cost.py) * [Optimal Binary Search Tree](dynamic_programming/optimal_binary_search_tree.py) * [Palindrome Partitioning](dynamic_programming/palindrome_partitioning.py) * [Rod Cutting](dynamic_programming/rod_cutting.py) * [Subset Generation](dynamic_programming/subset_generation.py) * [Sum Of Subset](dynamic_programming/sum_of_subset.py) * [Viterbi](dynamic_programming/viterbi.py) * [Word Break](dynamic_programming/word_break.py) ## Electronics * [Apparent Power](electronics/apparent_power.py) * [Builtin Voltage](electronics/builtin_voltage.py) * [Carrier Concentration](electronics/carrier_concentration.py) * [Circular Convolution](electronics/circular_convolution.py) * [Coulombs Law](electronics/coulombs_law.py) * [Electric Conductivity](electronics/electric_conductivity.py) * [Electric Power](electronics/electric_power.py) * [Electrical Impedance](electronics/electrical_impedance.py) * [Ind Reactance](electronics/ind_reactance.py) * [Ohms Law](electronics/ohms_law.py) * [Real And Reactive Power](electronics/real_and_reactive_power.py) * [Resistor Equivalence](electronics/resistor_equivalence.py) * [Resonant Frequency](electronics/resonant_frequency.py) ## File Transfer * [Receive File](file_transfer/receive_file.py) * [Send File](file_transfer/send_file.py) * Tests * [Test Send File](file_transfer/tests/test_send_file.py) ## Financial * [Equated Monthly Installments](financial/equated_monthly_installments.py) * [Interest](financial/interest.py) * [Price Plus Tax](financial/price_plus_tax.py) ## Fractals * [Julia Sets](fractals/julia_sets.py) * [Koch Snowflake](fractals/koch_snowflake.py) * [Mandelbrot](fractals/mandelbrot.py) * [Sierpinski Triangle](fractals/sierpinski_triangle.py) ## Fuzzy Logic * [Fuzzy Operations](fuzzy_logic/fuzzy_operations.py) ## Genetic Algorithm * [Basic String](genetic_algorithm/basic_string.py) ## Geodesy * [Haversine Distance](geodesy/haversine_distance.py) * [Lamberts Ellipsoidal Distance](geodesy/lamberts_ellipsoidal_distance.py) ## Graphics * [Bezier Curve](graphics/bezier_curve.py) * [Vector3 For 2D Rendering](graphics/vector3_for_2d_rendering.py) ## Graphs * [A Star](graphs/a_star.py) * [Articulation Points](graphs/articulation_points.py) * [Basic Graphs](graphs/basic_graphs.py) * [Bellman Ford](graphs/bellman_ford.py) * [Bi Directional Dijkstra](graphs/bi_directional_dijkstra.py) * [Bidirectional A Star](graphs/bidirectional_a_star.py) * [Bidirectional Breadth First Search](graphs/bidirectional_breadth_first_search.py) * [Boruvka](graphs/boruvka.py) * [Breadth First Search](graphs/breadth_first_search.py) * [Breadth First Search 2](graphs/breadth_first_search_2.py) * [Breadth First Search Shortest Path](graphs/breadth_first_search_shortest_path.py) * [Breadth First Search Shortest Path 2](graphs/breadth_first_search_shortest_path_2.py) * [Breadth First Search Zero One Shortest Path](graphs/breadth_first_search_zero_one_shortest_path.py) * [Check Bipartite Graph Bfs](graphs/check_bipartite_graph_bfs.py) * [Check Bipartite Graph Dfs](graphs/check_bipartite_graph_dfs.py) * [Check Cycle](graphs/check_cycle.py) * [Connected Components](graphs/connected_components.py) * [Depth First Search](graphs/depth_first_search.py) * [Depth First Search 
2](graphs/depth_first_search_2.py) * [Dijkstra](graphs/dijkstra.py) * [Dijkstra 2](graphs/dijkstra_2.py) * [Dijkstra Algorithm](graphs/dijkstra_algorithm.py) * [Dijkstra Alternate](graphs/dijkstra_alternate.py) * [Dinic](graphs/dinic.py) * [Directed And Undirected (Weighted) Graph](graphs/directed_and_undirected_(weighted)_graph.py) * [Edmonds Karp Multiple Source And Sink](graphs/edmonds_karp_multiple_source_and_sink.py) * [Eulerian Path And Circuit For Undirected Graph](graphs/eulerian_path_and_circuit_for_undirected_graph.py) * [Even Tree](graphs/even_tree.py) * [Finding Bridges](graphs/finding_bridges.py) * [Frequent Pattern Graph Miner](graphs/frequent_pattern_graph_miner.py) * [G Topological Sort](graphs/g_topological_sort.py) * [Gale Shapley Bigraph](graphs/gale_shapley_bigraph.py) * [Graph List](graphs/graph_list.py) * [Graph Matrix](graphs/graph_matrix.py) * [Graphs Floyd Warshall](graphs/graphs_floyd_warshall.py) * [Greedy Best First](graphs/greedy_best_first.py) * [Greedy Min Vertex Cover](graphs/greedy_min_vertex_cover.py) * [Kahns Algorithm Long](graphs/kahns_algorithm_long.py) * [Kahns Algorithm Topo](graphs/kahns_algorithm_topo.py) * [Karger](graphs/karger.py) * [Markov Chain](graphs/markov_chain.py) * [Matching Min Vertex Cover](graphs/matching_min_vertex_cover.py) * [Minimum Path Sum](graphs/minimum_path_sum.py) * [Minimum Spanning Tree Boruvka](graphs/minimum_spanning_tree_boruvka.py) * [Minimum Spanning Tree Kruskal](graphs/minimum_spanning_tree_kruskal.py) * [Minimum Spanning Tree Kruskal2](graphs/minimum_spanning_tree_kruskal2.py) * [Minimum Spanning Tree Prims](graphs/minimum_spanning_tree_prims.py) * [Minimum Spanning Tree Prims2](graphs/minimum_spanning_tree_prims2.py) * [Multi Heuristic Astar](graphs/multi_heuristic_astar.py) * [Page Rank](graphs/page_rank.py) * [Prim](graphs/prim.py) * [Random Graph Generator](graphs/random_graph_generator.py) * [Scc Kosaraju](graphs/scc_kosaraju.py) * [Strongly Connected Components](graphs/strongly_connected_components.py) * [Tarjans Scc](graphs/tarjans_scc.py) * Tests * [Test Min Spanning Tree Kruskal](graphs/tests/test_min_spanning_tree_kruskal.py) * [Test Min Spanning Tree Prim](graphs/tests/test_min_spanning_tree_prim.py) ## Greedy Methods * [Fractional Knapsack](greedy_methods/fractional_knapsack.py) * [Fractional Knapsack 2](greedy_methods/fractional_knapsack_2.py) * [Optimal Merge Pattern](greedy_methods/optimal_merge_pattern.py) ## Hashes * [Adler32](hashes/adler32.py) * [Chaos Machine](hashes/chaos_machine.py) * [Djb2](hashes/djb2.py) * [Elf](hashes/elf.py) * [Enigma Machine](hashes/enigma_machine.py) * [Hamming Code](hashes/hamming_code.py) * [Luhn](hashes/luhn.py) * [Md5](hashes/md5.py) * [Sdbm](hashes/sdbm.py) * [Sha1](hashes/sha1.py) * [Sha256](hashes/sha256.py) ## Knapsack * [Greedy Knapsack](knapsack/greedy_knapsack.py) * [Knapsack](knapsack/knapsack.py) * [Recursive Approach Knapsack](knapsack/recursive_approach_knapsack.py) * Tests * [Test Greedy Knapsack](knapsack/tests/test_greedy_knapsack.py) * [Test Knapsack](knapsack/tests/test_knapsack.py) ## Linear Algebra * Src * [Conjugate Gradient](linear_algebra/src/conjugate_gradient.py) * [Lib](linear_algebra/src/lib.py) * [Polynom For Points](linear_algebra/src/polynom_for_points.py) * [Power Iteration](linear_algebra/src/power_iteration.py) * [Rayleigh Quotient](linear_algebra/src/rayleigh_quotient.py) * [Schur Complement](linear_algebra/src/schur_complement.py) * [Test Linear Algebra](linear_algebra/src/test_linear_algebra.py) * [Transformations 
2D](linear_algebra/src/transformations_2d.py) ## Machine Learning * [Astar](machine_learning/astar.py) * [Data Transformations](machine_learning/data_transformations.py) * [Decision Tree](machine_learning/decision_tree.py) * [Dimensionality Reduction](machine_learning/dimensionality_reduction.py) * Forecasting * [Run](machine_learning/forecasting/run.py) * [Gradient Descent](machine_learning/gradient_descent.py) * [K Means Clust](machine_learning/k_means_clust.py) * [K Nearest Neighbours](machine_learning/k_nearest_neighbours.py) * [Knn Sklearn](machine_learning/knn_sklearn.py) * [Linear Discriminant Analysis](machine_learning/linear_discriminant_analysis.py) * [Linear Regression](machine_learning/linear_regression.py) * Local Weighted Learning * [Local Weighted Learning](machine_learning/local_weighted_learning/local_weighted_learning.py) * [Logistic Regression](machine_learning/logistic_regression.py) * Lstm * [Lstm Prediction](machine_learning/lstm/lstm_prediction.py) * [Multilayer Perceptron Classifier](machine_learning/multilayer_perceptron_classifier.py) * [Polymonial Regression](machine_learning/polymonial_regression.py) * [Scoring Functions](machine_learning/scoring_functions.py) * [Self Organizing Map](machine_learning/self_organizing_map.py) * [Sequential Minimum Optimization](machine_learning/sequential_minimum_optimization.py) * [Similarity Search](machine_learning/similarity_search.py) * [Support Vector Machines](machine_learning/support_vector_machines.py) * [Word Frequency Functions](machine_learning/word_frequency_functions.py) * [Xgboost Classifier](machine_learning/xgboost_classifier.py) * [Xgboost Regressor](machine_learning/xgboost_regressor.py) ## Maths * [3N Plus 1](maths/3n_plus_1.py) * [Abs](maths/abs.py) * [Add](maths/add.py) * [Addition Without Arithmetic](maths/addition_without_arithmetic.py) * [Aliquot Sum](maths/aliquot_sum.py) * [Allocation Number](maths/allocation_number.py) * [Arc Length](maths/arc_length.py) * [Area](maths/area.py) * [Area Under Curve](maths/area_under_curve.py) * [Armstrong Numbers](maths/armstrong_numbers.py) * [Automorphic Number](maths/automorphic_number.py) * [Average Absolute Deviation](maths/average_absolute_deviation.py) * [Average Mean](maths/average_mean.py) * [Average Median](maths/average_median.py) * [Average Mode](maths/average_mode.py) * [Bailey Borwein Plouffe](maths/bailey_borwein_plouffe.py) * [Basic Maths](maths/basic_maths.py) * [Binary Exp Mod](maths/binary_exp_mod.py) * [Binary Exponentiation](maths/binary_exponentiation.py) * [Binary Exponentiation 2](maths/binary_exponentiation_2.py) * [Binary Exponentiation 3](maths/binary_exponentiation_3.py) * [Binomial Coefficient](maths/binomial_coefficient.py) * [Binomial Distribution](maths/binomial_distribution.py) * [Bisection](maths/bisection.py) * [Carmichael Number](maths/carmichael_number.py) * [Catalan Number](maths/catalan_number.py) * [Ceil](maths/ceil.py) * [Check Polygon](maths/check_polygon.py) * [Chudnovsky Algorithm](maths/chudnovsky_algorithm.py) * [Collatz Sequence](maths/collatz_sequence.py) * [Combinations](maths/combinations.py) * [Decimal Isolate](maths/decimal_isolate.py) * [Decimal To Fraction](maths/decimal_to_fraction.py) * [Dodecahedron](maths/dodecahedron.py) * [Double Factorial Iterative](maths/double_factorial_iterative.py) * [Double Factorial Recursive](maths/double_factorial_recursive.py) * [Entropy](maths/entropy.py) * [Euclidean Distance](maths/euclidean_distance.py) * [Euclidean Gcd](maths/euclidean_gcd.py) * [Euler 
Method](maths/euler_method.py) * [Euler Modified](maths/euler_modified.py) * [Eulers Totient](maths/eulers_totient.py) * [Extended Euclidean Algorithm](maths/extended_euclidean_algorithm.py) * [Factorial](maths/factorial.py) * [Factors](maths/factors.py) * [Fermat Little Theorem](maths/fermat_little_theorem.py) * [Fibonacci](maths/fibonacci.py) * [Find Max](maths/find_max.py) * [Find Max Recursion](maths/find_max_recursion.py) * [Find Min](maths/find_min.py) * [Find Min Recursion](maths/find_min_recursion.py) * [Floor](maths/floor.py) * [Gamma](maths/gamma.py) * [Gamma Recursive](maths/gamma_recursive.py) * [Gaussian](maths/gaussian.py) * [Gaussian Error Linear Unit](maths/gaussian_error_linear_unit.py) * [Gcd Of N Numbers](maths/gcd_of_n_numbers.py) * [Greatest Common Divisor](maths/greatest_common_divisor.py) * [Greedy Coin Change](maths/greedy_coin_change.py) * [Hamming Numbers](maths/hamming_numbers.py) * [Hardy Ramanujanalgo](maths/hardy_ramanujanalgo.py) * [Hexagonal Number](maths/hexagonal_number.py) * [Integration By Simpson Approx](maths/integration_by_simpson_approx.py) * [Is Ip V4 Address Valid](maths/is_ip_v4_address_valid.py) * [Is Square Free](maths/is_square_free.py) * [Jaccard Similarity](maths/jaccard_similarity.py) * [Juggler Sequence](maths/juggler_sequence.py) * [Kadanes](maths/kadanes.py) * [Karatsuba](maths/karatsuba.py) * [Krishnamurthy Number](maths/krishnamurthy_number.py) * [Kth Lexicographic Permutation](maths/kth_lexicographic_permutation.py) * [Largest Of Very Large Numbers](maths/largest_of_very_large_numbers.py) * [Largest Subarray Sum](maths/largest_subarray_sum.py) * [Least Common Multiple](maths/least_common_multiple.py) * [Line Length](maths/line_length.py) * [Liouville Lambda](maths/liouville_lambda.py) * [Lucas Lehmer Primality Test](maths/lucas_lehmer_primality_test.py) * [Lucas Series](maths/lucas_series.py) * [Maclaurin Series](maths/maclaurin_series.py) * [Manhattan Distance](maths/manhattan_distance.py) * [Matrix Exponentiation](maths/matrix_exponentiation.py) * [Max Sum Sliding Window](maths/max_sum_sliding_window.py) * [Median Of Two Arrays](maths/median_of_two_arrays.py) * [Miller Rabin](maths/miller_rabin.py) * [Mobius Function](maths/mobius_function.py) * [Modular Exponential](maths/modular_exponential.py) * [Monte Carlo](maths/monte_carlo.py) * [Monte Carlo Dice](maths/monte_carlo_dice.py) * [Nevilles Method](maths/nevilles_method.py) * [Newton Raphson](maths/newton_raphson.py) * [Number Of Digits](maths/number_of_digits.py) * [Numerical Integration](maths/numerical_integration.py) * [Perfect Cube](maths/perfect_cube.py) * [Perfect Number](maths/perfect_number.py) * [Perfect Square](maths/perfect_square.py) * [Persistence](maths/persistence.py) * [Pi Generator](maths/pi_generator.py) * [Pi Monte Carlo Estimation](maths/pi_monte_carlo_estimation.py) * [Points Are Collinear 3D](maths/points_are_collinear_3d.py) * [Pollard Rho](maths/pollard_rho.py) * [Polynomial Evaluation](maths/polynomial_evaluation.py) * Polynomials * [Single Indeterminate Operations](maths/polynomials/single_indeterminate_operations.py) * [Power Using Recursion](maths/power_using_recursion.py) * [Prime Check](maths/prime_check.py) * [Prime Factors](maths/prime_factors.py) * [Prime Numbers](maths/prime_numbers.py) * [Prime Sieve Eratosthenes](maths/prime_sieve_eratosthenes.py) * [Primelib](maths/primelib.py) * [Print Multiplication Table](maths/print_multiplication_table.py) * [Pronic Number](maths/pronic_number.py) * [Proth Number](maths/proth_number.py) * 
[Pythagoras](maths/pythagoras.py) * [Qr Decomposition](maths/qr_decomposition.py) * [Quadratic Equations Complex Numbers](maths/quadratic_equations_complex_numbers.py) * [Radians](maths/radians.py) * [Radix2 Fft](maths/radix2_fft.py) * [Relu](maths/relu.py) * [Runge Kutta](maths/runge_kutta.py) * [Segmented Sieve](maths/segmented_sieve.py) * Series * [Arithmetic](maths/series/arithmetic.py) * [Geometric](maths/series/geometric.py) * [Geometric Series](maths/series/geometric_series.py) * [Harmonic](maths/series/harmonic.py) * [Harmonic Series](maths/series/harmonic_series.py) * [Hexagonal Numbers](maths/series/hexagonal_numbers.py) * [P Series](maths/series/p_series.py) * [Sieve Of Eratosthenes](maths/sieve_of_eratosthenes.py) * [Sigmoid](maths/sigmoid.py) * [Sigmoid Linear Unit](maths/sigmoid_linear_unit.py) * [Signum](maths/signum.py) * [Simpson Rule](maths/simpson_rule.py) * [Sin](maths/sin.py) * [Sock Merchant](maths/sock_merchant.py) * [Softmax](maths/softmax.py) * [Square Root](maths/square_root.py) * [Sum Of Arithmetic Series](maths/sum_of_arithmetic_series.py) * [Sum Of Digits](maths/sum_of_digits.py) * [Sum Of Geometric Progression](maths/sum_of_geometric_progression.py) * [Sum Of Harmonic Series](maths/sum_of_harmonic_series.py) * [Sumset](maths/sumset.py) * [Sylvester Sequence](maths/sylvester_sequence.py) * [Test Prime Check](maths/test_prime_check.py) * [Trapezoidal Rule](maths/trapezoidal_rule.py) * [Triplet Sum](maths/triplet_sum.py) * [Twin Prime](maths/twin_prime.py) * [Two Pointer](maths/two_pointer.py) * [Two Sum](maths/two_sum.py) * [Ugly Numbers](maths/ugly_numbers.py) * [Volume](maths/volume.py) * [Weird Number](maths/weird_number.py) * [Zellers Congruence](maths/zellers_congruence.py) ## Matrix * [Binary Search Matrix](matrix/binary_search_matrix.py) * [Count Islands In Matrix](matrix/count_islands_in_matrix.py) * [Count Paths](matrix/count_paths.py) * [Cramers Rule 2X2](matrix/cramers_rule_2x2.py) * [Inverse Of Matrix](matrix/inverse_of_matrix.py) * [Largest Square Area In Matrix](matrix/largest_square_area_in_matrix.py) * [Matrix Class](matrix/matrix_class.py) * [Matrix Operation](matrix/matrix_operation.py) * [Max Area Of Island](matrix/max_area_of_island.py) * [Nth Fibonacci Using Matrix Exponentiation](matrix/nth_fibonacci_using_matrix_exponentiation.py) * [Pascal Triangle](matrix/pascal_triangle.py) * [Rotate Matrix](matrix/rotate_matrix.py) * [Searching In Sorted Matrix](matrix/searching_in_sorted_matrix.py) * [Sherman Morrison](matrix/sherman_morrison.py) * [Spiral Print](matrix/spiral_print.py) * Tests * [Test Matrix Operation](matrix/tests/test_matrix_operation.py) ## Networking Flow * [Ford Fulkerson](networking_flow/ford_fulkerson.py) * [Minimum Cut](networking_flow/minimum_cut.py) ## Neural Network * [2 Hidden Layers Neural Network](neural_network/2_hidden_layers_neural_network.py) * [Back Propagation Neural Network](neural_network/back_propagation_neural_network.py) * [Convolution Neural Network](neural_network/convolution_neural_network.py) * [Input Data](neural_network/input_data.py) * [Perceptron](neural_network/perceptron.py) * [Simple Neural Network](neural_network/simple_neural_network.py) ## Other * [Activity Selection](other/activity_selection.py) * [Alternative List Arrange](other/alternative_list_arrange.py) * [Davisb Putnamb Logemannb Loveland](other/davisb_putnamb_logemannb_loveland.py) * [Dijkstra Bankers Algorithm](other/dijkstra_bankers_algorithm.py) * [Doomsday](other/doomsday.py) * [Fischer Yates Shuffle](other/fischer_yates_shuffle.py) 
* [Gauss Easter](other/gauss_easter.py) * [Graham Scan](other/graham_scan.py) * [Greedy](other/greedy.py) * [Least Recently Used](other/least_recently_used.py) * [Lfu Cache](other/lfu_cache.py) * [Linear Congruential Generator](other/linear_congruential_generator.py) * [Lru Cache](other/lru_cache.py) * [Magicdiamondpattern](other/magicdiamondpattern.py) * [Maximum Subarray](other/maximum_subarray.py) * [Nested Brackets](other/nested_brackets.py) * [Password](other/password.py) * [Quine](other/quine.py) * [Scoring Algorithm](other/scoring_algorithm.py) * [Sdes](other/sdes.py) * [Tower Of Hanoi](other/tower_of_hanoi.py) ## Physics * [Archimedes Principle](physics/archimedes_principle.py) * [Casimir Effect](physics/casimir_effect.py) * [Centripetal Force](physics/centripetal_force.py) * [Grahams Law](physics/grahams_law.py) * [Horizontal Projectile Motion](physics/horizontal_projectile_motion.py) * [Hubble Parameter](physics/hubble_parameter.py) * [Ideal Gas Law](physics/ideal_gas_law.py) * [Kinetic Energy](physics/kinetic_energy.py) * [Lorentz Transformation Four Vector](physics/lorentz_transformation_four_vector.py) * [Malus Law](physics/malus_law.py) * [N Body Simulation](physics/n_body_simulation.py) * [Newtons Law Of Gravitation](physics/newtons_law_of_gravitation.py) * [Newtons Second Law Of Motion](physics/newtons_second_law_of_motion.py) * [Potential Energy](physics/potential_energy.py) * [Rms Speed Of Molecule](physics/rms_speed_of_molecule.py) * [Shear Stress](physics/shear_stress.py) ## Project Euler * Problem 001 * [Sol1](project_euler/problem_001/sol1.py) * [Sol2](project_euler/problem_001/sol2.py) * [Sol3](project_euler/problem_001/sol3.py) * [Sol4](project_euler/problem_001/sol4.py) * [Sol5](project_euler/problem_001/sol5.py) * [Sol6](project_euler/problem_001/sol6.py) * [Sol7](project_euler/problem_001/sol7.py) * Problem 002 * [Sol1](project_euler/problem_002/sol1.py) * [Sol2](project_euler/problem_002/sol2.py) * [Sol3](project_euler/problem_002/sol3.py) * [Sol4](project_euler/problem_002/sol4.py) * [Sol5](project_euler/problem_002/sol5.py) * Problem 003 * [Sol1](project_euler/problem_003/sol1.py) * [Sol2](project_euler/problem_003/sol2.py) * [Sol3](project_euler/problem_003/sol3.py) * Problem 004 * [Sol1](project_euler/problem_004/sol1.py) * [Sol2](project_euler/problem_004/sol2.py) * Problem 005 * [Sol1](project_euler/problem_005/sol1.py) * [Sol2](project_euler/problem_005/sol2.py) * Problem 006 * [Sol1](project_euler/problem_006/sol1.py) * [Sol2](project_euler/problem_006/sol2.py) * [Sol3](project_euler/problem_006/sol3.py) * [Sol4](project_euler/problem_006/sol4.py) * Problem 007 * [Sol1](project_euler/problem_007/sol1.py) * [Sol2](project_euler/problem_007/sol2.py) * [Sol3](project_euler/problem_007/sol3.py) * Problem 008 * [Sol1](project_euler/problem_008/sol1.py) * [Sol2](project_euler/problem_008/sol2.py) * [Sol3](project_euler/problem_008/sol3.py) * Problem 009 * [Sol1](project_euler/problem_009/sol1.py) * [Sol2](project_euler/problem_009/sol2.py) * [Sol3](project_euler/problem_009/sol3.py) * Problem 010 * [Sol1](project_euler/problem_010/sol1.py) * [Sol2](project_euler/problem_010/sol2.py) * [Sol3](project_euler/problem_010/sol3.py) * Problem 011 * [Sol1](project_euler/problem_011/sol1.py) * [Sol2](project_euler/problem_011/sol2.py) * Problem 012 * [Sol1](project_euler/problem_012/sol1.py) * [Sol2](project_euler/problem_012/sol2.py) * Problem 013 * [Sol1](project_euler/problem_013/sol1.py) * Problem 014 * [Sol1](project_euler/problem_014/sol1.py) * 
[Sol2](project_euler/problem_014/sol2.py) * Problem 015 * [Sol1](project_euler/problem_015/sol1.py) * Problem 016 * [Sol1](project_euler/problem_016/sol1.py) * [Sol2](project_euler/problem_016/sol2.py) * Problem 017 * [Sol1](project_euler/problem_017/sol1.py) * Problem 018 * [Solution](project_euler/problem_018/solution.py) * Problem 019 * [Sol1](project_euler/problem_019/sol1.py) * Problem 020 * [Sol1](project_euler/problem_020/sol1.py) * [Sol2](project_euler/problem_020/sol2.py) * [Sol3](project_euler/problem_020/sol3.py) * [Sol4](project_euler/problem_020/sol4.py) * Problem 021 * [Sol1](project_euler/problem_021/sol1.py) * Problem 022 * [Sol1](project_euler/problem_022/sol1.py) * [Sol2](project_euler/problem_022/sol2.py) * Problem 023 * [Sol1](project_euler/problem_023/sol1.py) * Problem 024 * [Sol1](project_euler/problem_024/sol1.py) * Problem 025 * [Sol1](project_euler/problem_025/sol1.py) * [Sol2](project_euler/problem_025/sol2.py) * [Sol3](project_euler/problem_025/sol3.py) * Problem 026 * [Sol1](project_euler/problem_026/sol1.py) * Problem 027 * [Sol1](project_euler/problem_027/sol1.py) * Problem 028 * [Sol1](project_euler/problem_028/sol1.py) * Problem 029 * [Sol1](project_euler/problem_029/sol1.py) * Problem 030 * [Sol1](project_euler/problem_030/sol1.py) * Problem 031 * [Sol1](project_euler/problem_031/sol1.py) * [Sol2](project_euler/problem_031/sol2.py) * Problem 032 * [Sol32](project_euler/problem_032/sol32.py) * Problem 033 * [Sol1](project_euler/problem_033/sol1.py) * Problem 034 * [Sol1](project_euler/problem_034/sol1.py) * Problem 035 * [Sol1](project_euler/problem_035/sol1.py) * Problem 036 * [Sol1](project_euler/problem_036/sol1.py) * Problem 037 * [Sol1](project_euler/problem_037/sol1.py) * Problem 038 * [Sol1](project_euler/problem_038/sol1.py) * Problem 039 * [Sol1](project_euler/problem_039/sol1.py) * Problem 040 * [Sol1](project_euler/problem_040/sol1.py) * Problem 041 * [Sol1](project_euler/problem_041/sol1.py) * Problem 042 * [Solution42](project_euler/problem_042/solution42.py) * Problem 043 * [Sol1](project_euler/problem_043/sol1.py) * Problem 044 * [Sol1](project_euler/problem_044/sol1.py) * Problem 045 * [Sol1](project_euler/problem_045/sol1.py) * Problem 046 * [Sol1](project_euler/problem_046/sol1.py) * Problem 047 * [Sol1](project_euler/problem_047/sol1.py) * Problem 048 * [Sol1](project_euler/problem_048/sol1.py) * Problem 049 * [Sol1](project_euler/problem_049/sol1.py) * Problem 050 * [Sol1](project_euler/problem_050/sol1.py) * Problem 051 * [Sol1](project_euler/problem_051/sol1.py) * Problem 052 * [Sol1](project_euler/problem_052/sol1.py) * Problem 053 * [Sol1](project_euler/problem_053/sol1.py) * Problem 054 * [Sol1](project_euler/problem_054/sol1.py) * [Test Poker Hand](project_euler/problem_054/test_poker_hand.py) * Problem 055 * [Sol1](project_euler/problem_055/sol1.py) * Problem 056 * [Sol1](project_euler/problem_056/sol1.py) * Problem 057 * [Sol1](project_euler/problem_057/sol1.py) * Problem 058 * [Sol1](project_euler/problem_058/sol1.py) * Problem 059 * [Sol1](project_euler/problem_059/sol1.py) * Problem 062 * [Sol1](project_euler/problem_062/sol1.py) * Problem 063 * [Sol1](project_euler/problem_063/sol1.py) * Problem 064 * [Sol1](project_euler/problem_064/sol1.py) * Problem 065 * [Sol1](project_euler/problem_065/sol1.py) * Problem 067 * [Sol1](project_euler/problem_067/sol1.py) * [Sol2](project_euler/problem_067/sol2.py) * Problem 068 * [Sol1](project_euler/problem_068/sol1.py) * Problem 069 * [Sol1](project_euler/problem_069/sol1.py) * Problem 
070 * [Sol1](project_euler/problem_070/sol1.py) * Problem 071 * [Sol1](project_euler/problem_071/sol1.py) * Problem 072 * [Sol1](project_euler/problem_072/sol1.py) * [Sol2](project_euler/problem_072/sol2.py) * Problem 073 * [Sol1](project_euler/problem_073/sol1.py) * Problem 074 * [Sol1](project_euler/problem_074/sol1.py) * [Sol2](project_euler/problem_074/sol2.py) * Problem 075 * [Sol1](project_euler/problem_075/sol1.py) * Problem 076 * [Sol1](project_euler/problem_076/sol1.py) * Problem 077 * [Sol1](project_euler/problem_077/sol1.py) * Problem 078 * [Sol1](project_euler/problem_078/sol1.py) * Problem 079 * [Sol1](project_euler/problem_079/sol1.py) * Problem 080 * [Sol1](project_euler/problem_080/sol1.py) * Problem 081 * [Sol1](project_euler/problem_081/sol1.py) * Problem 082 * [Sol1](project_euler/problem_082/sol1.py) * Problem 085 * [Sol1](project_euler/problem_085/sol1.py) * Problem 086 * [Sol1](project_euler/problem_086/sol1.py) * Problem 087 * [Sol1](project_euler/problem_087/sol1.py) * Problem 089 * [Sol1](project_euler/problem_089/sol1.py) * Problem 091 * [Sol1](project_euler/problem_091/sol1.py) * Problem 092 * [Sol1](project_euler/problem_092/sol1.py) * Problem 094 * [Sol1](project_euler/problem_094/sol1.py) * Problem 097 * [Sol1](project_euler/problem_097/sol1.py) * Problem 099 * [Sol1](project_euler/problem_099/sol1.py) * Problem 100 * [Sol1](project_euler/problem_100/sol1.py) * Problem 101 * [Sol1](project_euler/problem_101/sol1.py) * Problem 102 * [Sol1](project_euler/problem_102/sol1.py) * Problem 104 * [Sol1](project_euler/problem_104/sol1.py) * Problem 107 * [Sol1](project_euler/problem_107/sol1.py) * Problem 109 * [Sol1](project_euler/problem_109/sol1.py) * Problem 112 * [Sol1](project_euler/problem_112/sol1.py) * Problem 113 * [Sol1](project_euler/problem_113/sol1.py) * Problem 114 * [Sol1](project_euler/problem_114/sol1.py) * Problem 115 * [Sol1](project_euler/problem_115/sol1.py) * Problem 116 * [Sol1](project_euler/problem_116/sol1.py) * Problem 117 * [Sol1](project_euler/problem_117/sol1.py) * Problem 119 * [Sol1](project_euler/problem_119/sol1.py) * Problem 120 * [Sol1](project_euler/problem_120/sol1.py) * Problem 121 * [Sol1](project_euler/problem_121/sol1.py) * Problem 123 * [Sol1](project_euler/problem_123/sol1.py) * Problem 125 * [Sol1](project_euler/problem_125/sol1.py) * Problem 129 * [Sol1](project_euler/problem_129/sol1.py) * Problem 131 * [Sol1](project_euler/problem_131/sol1.py) * Problem 135 * [Sol1](project_euler/problem_135/sol1.py) * Problem 144 * [Sol1](project_euler/problem_144/sol1.py) * Problem 145 * [Sol1](project_euler/problem_145/sol1.py) * Problem 173 * [Sol1](project_euler/problem_173/sol1.py) * Problem 174 * [Sol1](project_euler/problem_174/sol1.py) * Problem 180 * [Sol1](project_euler/problem_180/sol1.py) * Problem 187 * [Sol1](project_euler/problem_187/sol1.py) * Problem 188 * [Sol1](project_euler/problem_188/sol1.py) * Problem 191 * [Sol1](project_euler/problem_191/sol1.py) * Problem 203 * [Sol1](project_euler/problem_203/sol1.py) * Problem 205 * [Sol1](project_euler/problem_205/sol1.py) * Problem 206 * [Sol1](project_euler/problem_206/sol1.py) * Problem 207 * [Sol1](project_euler/problem_207/sol1.py) * Problem 234 * [Sol1](project_euler/problem_234/sol1.py) * Problem 301 * [Sol1](project_euler/problem_301/sol1.py) * Problem 493 * [Sol1](project_euler/problem_493/sol1.py) * Problem 551 * [Sol1](project_euler/problem_551/sol1.py) * Problem 587 * [Sol1](project_euler/problem_587/sol1.py) * Problem 686 * 
[Sol1](project_euler/problem_686/sol1.py) * Problem 800 * [Sol1](project_euler/problem_800/sol1.py) ## Quantum * [Bb84](quantum/bb84.py) * [Deutsch Jozsa](quantum/deutsch_jozsa.py) * [Half Adder](quantum/half_adder.py) * [Not Gate](quantum/not_gate.py) * [Q Fourier Transform](quantum/q_fourier_transform.py) * [Q Full Adder](quantum/q_full_adder.py) * [Quantum Entanglement](quantum/quantum_entanglement.py) * [Quantum Random](quantum/quantum_random.py) * [Quantum Teleportation](quantum/quantum_teleportation.py) * [Ripple Adder Classic](quantum/ripple_adder_classic.py) * [Single Qubit Measure](quantum/single_qubit_measure.py) * [Superdense Coding](quantum/superdense_coding.py) ## Scheduling * [First Come First Served](scheduling/first_come_first_served.py) * [Highest Response Ratio Next](scheduling/highest_response_ratio_next.py) * [Job Sequencing With Deadline](scheduling/job_sequencing_with_deadline.py) * [Multi Level Feedback Queue](scheduling/multi_level_feedback_queue.py) * [Non Preemptive Shortest Job First](scheduling/non_preemptive_shortest_job_first.py) * [Round Robin](scheduling/round_robin.py) * [Shortest Job First](scheduling/shortest_job_first.py) ## Searches * [Binary Search](searches/binary_search.py) * [Binary Tree Traversal](searches/binary_tree_traversal.py) * [Double Linear Search](searches/double_linear_search.py) * [Double Linear Search Recursion](searches/double_linear_search_recursion.py) * [Fibonacci Search](searches/fibonacci_search.py) * [Hill Climbing](searches/hill_climbing.py) * [Interpolation Search](searches/interpolation_search.py) * [Jump Search](searches/jump_search.py) * [Linear Search](searches/linear_search.py) * [Quick Select](searches/quick_select.py) * [Sentinel Linear Search](searches/sentinel_linear_search.py) * [Simple Binary Search](searches/simple_binary_search.py) * [Simulated Annealing](searches/simulated_annealing.py) * [Tabu Search](searches/tabu_search.py) * [Ternary Search](searches/ternary_search.py) ## Sorts * [Bead Sort](sorts/bead_sort.py) * [Bitonic Sort](sorts/bitonic_sort.py) * [Bogo Sort](sorts/bogo_sort.py) * [Bubble Sort](sorts/bubble_sort.py) * [Bucket Sort](sorts/bucket_sort.py) * [Circle Sort](sorts/circle_sort.py) * [Cocktail Shaker Sort](sorts/cocktail_shaker_sort.py) * [Comb Sort](sorts/comb_sort.py) * [Counting Sort](sorts/counting_sort.py) * [Cycle Sort](sorts/cycle_sort.py) * [Double Sort](sorts/double_sort.py) * [Dutch National Flag Sort](sorts/dutch_national_flag_sort.py) * [Exchange Sort](sorts/exchange_sort.py) * [External Sort](sorts/external_sort.py) * [Gnome Sort](sorts/gnome_sort.py) * [Heap Sort](sorts/heap_sort.py) * [Insertion Sort](sorts/insertion_sort.py) * [Intro Sort](sorts/intro_sort.py) * [Iterative Merge Sort](sorts/iterative_merge_sort.py) * [Merge Insertion Sort](sorts/merge_insertion_sort.py) * [Merge Sort](sorts/merge_sort.py) * [Msd Radix Sort](sorts/msd_radix_sort.py) * [Natural Sort](sorts/natural_sort.py) * [Odd Even Sort](sorts/odd_even_sort.py) * [Odd Even Transposition Parallel](sorts/odd_even_transposition_parallel.py) * [Odd Even Transposition Single Threaded](sorts/odd_even_transposition_single_threaded.py) * [Pancake Sort](sorts/pancake_sort.py) * [Patience Sort](sorts/patience_sort.py) * [Pigeon Sort](sorts/pigeon_sort.py) * [Pigeonhole Sort](sorts/pigeonhole_sort.py) * [Quick Sort](sorts/quick_sort.py) * [Quick Sort 3 Partition](sorts/quick_sort_3_partition.py) * [Radix Sort](sorts/radix_sort.py) * [Random Normal Distribution Quicksort](sorts/random_normal_distribution_quicksort.py) * 
[Random Pivot Quick Sort](sorts/random_pivot_quick_sort.py) * [Recursive Bubble Sort](sorts/recursive_bubble_sort.py) * [Recursive Insertion Sort](sorts/recursive_insertion_sort.py) * [Recursive Mergesort Array](sorts/recursive_mergesort_array.py) * [Recursive Quick Sort](sorts/recursive_quick_sort.py) * [Selection Sort](sorts/selection_sort.py) * [Shell Sort](sorts/shell_sort.py) * [Shrink Shell Sort](sorts/shrink_shell_sort.py) * [Slowsort](sorts/slowsort.py) * [Stooge Sort](sorts/stooge_sort.py) * [Strand Sort](sorts/strand_sort.py) * [Tim Sort](sorts/tim_sort.py) * [Topological Sort](sorts/topological_sort.py) * [Tree Sort](sorts/tree_sort.py) * [Unknown Sort](sorts/unknown_sort.py) * [Wiggle Sort](sorts/wiggle_sort.py) ## Strings * [Aho Corasick](strings/aho_corasick.py) * [Alternative String Arrange](strings/alternative_string_arrange.py) * [Anagrams](strings/anagrams.py) * [Autocomplete Using Trie](strings/autocomplete_using_trie.py) * [Barcode Validator](strings/barcode_validator.py) * [Boyer Moore Search](strings/boyer_moore_search.py) * [Can String Be Rearranged As Palindrome](strings/can_string_be_rearranged_as_palindrome.py) * [Capitalize](strings/capitalize.py) * [Check Anagrams](strings/check_anagrams.py) * [Credit Card Validator](strings/credit_card_validator.py) * [Detecting English Programmatically](strings/detecting_english_programmatically.py) * [Dna](strings/dna.py) * [Frequency Finder](strings/frequency_finder.py) * [Hamming Distance](strings/hamming_distance.py) * [Indian Phone Validator](strings/indian_phone_validator.py) * [Is Contains Unique Chars](strings/is_contains_unique_chars.py) * [Is Isogram](strings/is_isogram.py) * [Is Palindrome](strings/is_palindrome.py) * [Is Pangram](strings/is_pangram.py) * [Is Spain National Id](strings/is_spain_national_id.py) * [Is Srilankan Phone Number](strings/is_srilankan_phone_number.py) * [Jaro Winkler](strings/jaro_winkler.py) * [Join](strings/join.py) * [Knuth Morris Pratt](strings/knuth_morris_pratt.py) * [Levenshtein Distance](strings/levenshtein_distance.py) * [Lower](strings/lower.py) * [Manacher](strings/manacher.py) * [Min Cost String Conversion](strings/min_cost_string_conversion.py) * [Naive String Search](strings/naive_string_search.py) * [Ngram](strings/ngram.py) * [Palindrome](strings/palindrome.py) * [Prefix Function](strings/prefix_function.py) * [Rabin Karp](strings/rabin_karp.py) * [Remove Duplicate](strings/remove_duplicate.py) * [Reverse Letters](strings/reverse_letters.py) * [Reverse Long Words](strings/reverse_long_words.py) * [Reverse Words](strings/reverse_words.py) * [Snake Case To Camel Pascal Case](strings/snake_case_to_camel_pascal_case.py) * [Split](strings/split.py) * [Text Justification](strings/text_justification.py) * [Upper](strings/upper.py) * [Wave](strings/wave.py) * [Wildcard Pattern Matching](strings/wildcard_pattern_matching.py) * [Word Occurrence](strings/word_occurrence.py) * [Word Patterns](strings/word_patterns.py) * [Z Function](strings/z_function.py) ## Web Programming * [Co2 Emission](web_programming/co2_emission.py) * [Convert Number To Words](web_programming/convert_number_to_words.py) * [Covid Stats Via Xpath](web_programming/covid_stats_via_xpath.py) * [Crawl Google Results](web_programming/crawl_google_results.py) * [Crawl Google Scholar Citation](web_programming/crawl_google_scholar_citation.py) * [Currency Converter](web_programming/currency_converter.py) * [Current Stock Price](web_programming/current_stock_price.py) * [Current Weather](web_programming/current_weather.py) 
* [Daily Horoscope](web_programming/daily_horoscope.py) * [Download Images From Google Query](web_programming/download_images_from_google_query.py) * [Emails From Url](web_programming/emails_from_url.py) * [Fetch Anime And Play](web_programming/fetch_anime_and_play.py) * [Fetch Bbc News](web_programming/fetch_bbc_news.py) * [Fetch Github Info](web_programming/fetch_github_info.py) * [Fetch Jobs](web_programming/fetch_jobs.py) * [Fetch Quotes](web_programming/fetch_quotes.py) * [Fetch Well Rx Price](web_programming/fetch_well_rx_price.py) * [Get Amazon Product Data](web_programming/get_amazon_product_data.py) * [Get Imdb Top 250 Movies Csv](web_programming/get_imdb_top_250_movies_csv.py) * [Get Imdbtop](web_programming/get_imdbtop.py) * [Get Top Hn Posts](web_programming/get_top_hn_posts.py) * [Get User Tweets](web_programming/get_user_tweets.py) * [Giphy](web_programming/giphy.py) * [Instagram Crawler](web_programming/instagram_crawler.py) * [Instagram Pic](web_programming/instagram_pic.py) * [Instagram Video](web_programming/instagram_video.py) * [Nasa Data](web_programming/nasa_data.py) * [Open Google Results](web_programming/open_google_results.py) * [Random Anime Character](web_programming/random_anime_character.py) * [Recaptcha Verification](web_programming/recaptcha_verification.py) * [Reddit](web_programming/reddit.py) * [Search Books By Isbn](web_programming/search_books_by_isbn.py) * [Slack Message](web_programming/slack_message.py) * [Test Fetch Github Info](web_programming/test_fetch_github_info.py) * [World Covid19 Stats](web_programming/world_covid19_stats.py)
Size Subarray Sum](dynamic_programming/minimum_size_subarray_sum.py) * [Minimum Squares To Represent A Number](dynamic_programming/minimum_squares_to_represent_a_number.py) * [Minimum Steps To One](dynamic_programming/minimum_steps_to_one.py) * [Minimum Tickets Cost](dynamic_programming/minimum_tickets_cost.py) * [Optimal Binary Search Tree](dynamic_programming/optimal_binary_search_tree.py) * [Palindrome Partitioning](dynamic_programming/palindrome_partitioning.py) * [Rod Cutting](dynamic_programming/rod_cutting.py) * [Subset Generation](dynamic_programming/subset_generation.py) * [Sum Of Subset](dynamic_programming/sum_of_subset.py) * [Viterbi](dynamic_programming/viterbi.py) * [Word Break](dynamic_programming/word_break.py) ## Electronics * [Apparent Power](electronics/apparent_power.py) * [Builtin Voltage](electronics/builtin_voltage.py) * [Carrier Concentration](electronics/carrier_concentration.py) * [Circular Convolution](electronics/circular_convolution.py) * [Coulombs Law](electronics/coulombs_law.py) * [Electric Conductivity](electronics/electric_conductivity.py) * [Electric Power](electronics/electric_power.py) * [Electrical Impedance](electronics/electrical_impedance.py) * [Ind Reactance](electronics/ind_reactance.py) * [Ohms Law](electronics/ohms_law.py) * [Real And Reactive Power](electronics/real_and_reactive_power.py) * [Resistor Equivalence](electronics/resistor_equivalence.py) * [Resonant Frequency](electronics/resonant_frequency.py) ## File Transfer * [Receive File](file_transfer/receive_file.py) * [Send File](file_transfer/send_file.py) * Tests * [Test Send File](file_transfer/tests/test_send_file.py) ## Financial * [Equated Monthly Installments](financial/equated_monthly_installments.py) * [Interest](financial/interest.py) * [Price Plus Tax](financial/price_plus_tax.py) ## Fractals * [Julia Sets](fractals/julia_sets.py) * [Koch Snowflake](fractals/koch_snowflake.py) * [Mandelbrot](fractals/mandelbrot.py) * [Sierpinski Triangle](fractals/sierpinski_triangle.py) ## Fuzzy Logic * [Fuzzy Operations](fuzzy_logic/fuzzy_operations.py) ## Genetic Algorithm * [Basic String](genetic_algorithm/basic_string.py) ## Geodesy * [Haversine Distance](geodesy/haversine_distance.py) * [Lamberts Ellipsoidal Distance](geodesy/lamberts_ellipsoidal_distance.py) ## Graphics * [Bezier Curve](graphics/bezier_curve.py) * [Vector3 For 2D Rendering](graphics/vector3_for_2d_rendering.py) ## Graphs * [A Star](graphs/a_star.py) * [Articulation Points](graphs/articulation_points.py) * [Basic Graphs](graphs/basic_graphs.py) * [Bellman Ford](graphs/bellman_ford.py) * [Bi Directional Dijkstra](graphs/bi_directional_dijkstra.py) * [Bidirectional A Star](graphs/bidirectional_a_star.py) * [Bidirectional Breadth First Search](graphs/bidirectional_breadth_first_search.py) * [Boruvka](graphs/boruvka.py) * [Breadth First Search](graphs/breadth_first_search.py) * [Breadth First Search 2](graphs/breadth_first_search_2.py) * [Breadth First Search Shortest Path](graphs/breadth_first_search_shortest_path.py) * [Breadth First Search Shortest Path 2](graphs/breadth_first_search_shortest_path_2.py) * [Breadth First Search Zero One Shortest Path](graphs/breadth_first_search_zero_one_shortest_path.py) * [Check Bipartite Graph Bfs](graphs/check_bipartite_graph_bfs.py) * [Check Bipartite Graph Dfs](graphs/check_bipartite_graph_dfs.py) * [Check Cycle](graphs/check_cycle.py) * [Connected Components](graphs/connected_components.py) * [Depth First Search](graphs/depth_first_search.py) * [Depth First Search 
2](graphs/depth_first_search_2.py) * [Dijkstra](graphs/dijkstra.py) * [Dijkstra 2](graphs/dijkstra_2.py) * [Dijkstra Algorithm](graphs/dijkstra_algorithm.py) * [Dijkstra Alternate](graphs/dijkstra_alternate.py) * [Dinic](graphs/dinic.py) * [Directed And Undirected (Weighted) Graph](graphs/directed_and_undirected_(weighted)_graph.py) * [Edmonds Karp Multiple Source And Sink](graphs/edmonds_karp_multiple_source_and_sink.py) * [Eulerian Path And Circuit For Undirected Graph](graphs/eulerian_path_and_circuit_for_undirected_graph.py) * [Even Tree](graphs/even_tree.py) * [Finding Bridges](graphs/finding_bridges.py) * [Frequent Pattern Graph Miner](graphs/frequent_pattern_graph_miner.py) * [G Topological Sort](graphs/g_topological_sort.py) * [Gale Shapley Bigraph](graphs/gale_shapley_bigraph.py) * [Graph List](graphs/graph_list.py) * [Graph Matrix](graphs/graph_matrix.py) * [Graphs Floyd Warshall](graphs/graphs_floyd_warshall.py) * [Greedy Best First](graphs/greedy_best_first.py) * [Greedy Min Vertex Cover](graphs/greedy_min_vertex_cover.py) * [Kahns Algorithm Long](graphs/kahns_algorithm_long.py) * [Kahns Algorithm Topo](graphs/kahns_algorithm_topo.py) * [Karger](graphs/karger.py) * [Markov Chain](graphs/markov_chain.py) * [Matching Min Vertex Cover](graphs/matching_min_vertex_cover.py) * [Minimum Path Sum](graphs/minimum_path_sum.py) * [Minimum Spanning Tree Boruvka](graphs/minimum_spanning_tree_boruvka.py) * [Minimum Spanning Tree Kruskal](graphs/minimum_spanning_tree_kruskal.py) * [Minimum Spanning Tree Kruskal2](graphs/minimum_spanning_tree_kruskal2.py) * [Minimum Spanning Tree Prims](graphs/minimum_spanning_tree_prims.py) * [Minimum Spanning Tree Prims2](graphs/minimum_spanning_tree_prims2.py) * [Multi Heuristic Astar](graphs/multi_heuristic_astar.py) * [Page Rank](graphs/page_rank.py) * [Prim](graphs/prim.py) * [Random Graph Generator](graphs/random_graph_generator.py) * [Scc Kosaraju](graphs/scc_kosaraju.py) * [Strongly Connected Components](graphs/strongly_connected_components.py) * [Tarjans Scc](graphs/tarjans_scc.py) * Tests * [Test Min Spanning Tree Kruskal](graphs/tests/test_min_spanning_tree_kruskal.py) * [Test Min Spanning Tree Prim](graphs/tests/test_min_spanning_tree_prim.py) ## Greedy Methods * [Fractional Knapsack](greedy_methods/fractional_knapsack.py) * [Fractional Knapsack 2](greedy_methods/fractional_knapsack_2.py) * [Optimal Merge Pattern](greedy_methods/optimal_merge_pattern.py) ## Hashes * [Adler32](hashes/adler32.py) * [Chaos Machine](hashes/chaos_machine.py) * [Djb2](hashes/djb2.py) * [Elf](hashes/elf.py) * [Enigma Machine](hashes/enigma_machine.py) * [Hamming Code](hashes/hamming_code.py) * [Luhn](hashes/luhn.py) * [Md5](hashes/md5.py) * [Sdbm](hashes/sdbm.py) * [Sha1](hashes/sha1.py) * [Sha256](hashes/sha256.py) ## Knapsack * [Greedy Knapsack](knapsack/greedy_knapsack.py) * [Knapsack](knapsack/knapsack.py) * [Recursive Approach Knapsack](knapsack/recursive_approach_knapsack.py) * Tests * [Test Greedy Knapsack](knapsack/tests/test_greedy_knapsack.py) * [Test Knapsack](knapsack/tests/test_knapsack.py) ## Linear Algebra * Src * [Conjugate Gradient](linear_algebra/src/conjugate_gradient.py) * [Lib](linear_algebra/src/lib.py) * [Polynom For Points](linear_algebra/src/polynom_for_points.py) * [Power Iteration](linear_algebra/src/power_iteration.py) * [Rayleigh Quotient](linear_algebra/src/rayleigh_quotient.py) * [Schur Complement](linear_algebra/src/schur_complement.py) * [Test Linear Algebra](linear_algebra/src/test_linear_algebra.py) * [Transformations 
2D](linear_algebra/src/transformations_2d.py) ## Machine Learning * [Astar](machine_learning/astar.py) * [Data Transformations](machine_learning/data_transformations.py) * [Decision Tree](machine_learning/decision_tree.py) * [Dimensionality Reduction](machine_learning/dimensionality_reduction.py) * Forecasting * [Run](machine_learning/forecasting/run.py) * [Gradient Descent](machine_learning/gradient_descent.py) * [K Means Clust](machine_learning/k_means_clust.py) * [K Nearest Neighbours](machine_learning/k_nearest_neighbours.py) * [Knn Sklearn](machine_learning/knn_sklearn.py) * [Linear Discriminant Analysis](machine_learning/linear_discriminant_analysis.py) * [Linear Regression](machine_learning/linear_regression.py) * Local Weighted Learning * [Local Weighted Learning](machine_learning/local_weighted_learning/local_weighted_learning.py) * [Logistic Regression](machine_learning/logistic_regression.py) * Lstm * [Lstm Prediction](machine_learning/lstm/lstm_prediction.py) * [Multilayer Perceptron Classifier](machine_learning/multilayer_perceptron_classifier.py) * [Polymonial Regression](machine_learning/polymonial_regression.py) * [Scoring Functions](machine_learning/scoring_functions.py) * [Self Organizing Map](machine_learning/self_organizing_map.py) * [Sequential Minimum Optimization](machine_learning/sequential_minimum_optimization.py) * [Similarity Search](machine_learning/similarity_search.py) * [Support Vector Machines](machine_learning/support_vector_machines.py) * [Word Frequency Functions](machine_learning/word_frequency_functions.py) * [Xgboost Classifier](machine_learning/xgboost_classifier.py) * [Xgboost Regressor](machine_learning/xgboost_regressor.py) ## Maths * [3N Plus 1](maths/3n_plus_1.py) * [Abs](maths/abs.py) * [Add](maths/add.py) * [Addition Without Arithmetic](maths/addition_without_arithmetic.py) * [Aliquot Sum](maths/aliquot_sum.py) * [Allocation Number](maths/allocation_number.py) * [Arc Length](maths/arc_length.py) * [Area](maths/area.py) * [Area Under Curve](maths/area_under_curve.py) * [Armstrong Numbers](maths/armstrong_numbers.py) * [Automorphic Number](maths/automorphic_number.py) * [Average Absolute Deviation](maths/average_absolute_deviation.py) * [Average Mean](maths/average_mean.py) * [Average Median](maths/average_median.py) * [Average Mode](maths/average_mode.py) * [Bailey Borwein Plouffe](maths/bailey_borwein_plouffe.py) * [Basic Maths](maths/basic_maths.py) * [Binary Exp Mod](maths/binary_exp_mod.py) * [Binary Exponentiation](maths/binary_exponentiation.py) * [Binary Exponentiation 2](maths/binary_exponentiation_2.py) * [Binary Exponentiation 3](maths/binary_exponentiation_3.py) * [Binomial Coefficient](maths/binomial_coefficient.py) * [Binomial Distribution](maths/binomial_distribution.py) * [Bisection](maths/bisection.py) * [Carmichael Number](maths/carmichael_number.py) * [Catalan Number](maths/catalan_number.py) * [Ceil](maths/ceil.py) * [Check Polygon](maths/check_polygon.py) * [Chudnovsky Algorithm](maths/chudnovsky_algorithm.py) * [Collatz Sequence](maths/collatz_sequence.py) * [Combinations](maths/combinations.py) * [Decimal Isolate](maths/decimal_isolate.py) * [Decimal To Fraction](maths/decimal_to_fraction.py) * [Dodecahedron](maths/dodecahedron.py) * [Double Factorial Iterative](maths/double_factorial_iterative.py) * [Double Factorial Recursive](maths/double_factorial_recursive.py) * [Entropy](maths/entropy.py) * [Euclidean Distance](maths/euclidean_distance.py) * [Euclidean Gcd](maths/euclidean_gcd.py) * [Euler 
Method](maths/euler_method.py) * [Euler Modified](maths/euler_modified.py) * [Eulers Totient](maths/eulers_totient.py) * [Extended Euclidean Algorithm](maths/extended_euclidean_algorithm.py) * [Factorial](maths/factorial.py) * [Factors](maths/factors.py) * [Fermat Little Theorem](maths/fermat_little_theorem.py) * [Fibonacci](maths/fibonacci.py) * [Find Max](maths/find_max.py) * [Find Max Recursion](maths/find_max_recursion.py) * [Find Min](maths/find_min.py) * [Find Min Recursion](maths/find_min_recursion.py) * [Floor](maths/floor.py) * [Gamma](maths/gamma.py) * [Gamma Recursive](maths/gamma_recursive.py) * [Gaussian](maths/gaussian.py) * [Gaussian Error Linear Unit](maths/gaussian_error_linear_unit.py) * [Gcd Of N Numbers](maths/gcd_of_n_numbers.py) * [Greatest Common Divisor](maths/greatest_common_divisor.py) * [Greedy Coin Change](maths/greedy_coin_change.py) * [Hamming Numbers](maths/hamming_numbers.py) * [Hardy Ramanujanalgo](maths/hardy_ramanujanalgo.py) * [Hexagonal Number](maths/hexagonal_number.py) * [Integration By Simpson Approx](maths/integration_by_simpson_approx.py) * [Is Ip V4 Address Valid](maths/is_ip_v4_address_valid.py) * [Is Square Free](maths/is_square_free.py) * [Jaccard Similarity](maths/jaccard_similarity.py) * [Juggler Sequence](maths/juggler_sequence.py) * [Kadanes](maths/kadanes.py) * [Karatsuba](maths/karatsuba.py) * [Krishnamurthy Number](maths/krishnamurthy_number.py) * [Kth Lexicographic Permutation](maths/kth_lexicographic_permutation.py) * [Largest Of Very Large Numbers](maths/largest_of_very_large_numbers.py) * [Largest Subarray Sum](maths/largest_subarray_sum.py) * [Least Common Multiple](maths/least_common_multiple.py) * [Line Length](maths/line_length.py) * [Liouville Lambda](maths/liouville_lambda.py) * [Lucas Lehmer Primality Test](maths/lucas_lehmer_primality_test.py) * [Lucas Series](maths/lucas_series.py) * [Maclaurin Series](maths/maclaurin_series.py) * [Manhattan Distance](maths/manhattan_distance.py) * [Matrix Exponentiation](maths/matrix_exponentiation.py) * [Max Sum Sliding Window](maths/max_sum_sliding_window.py) * [Median Of Two Arrays](maths/median_of_two_arrays.py) * [Miller Rabin](maths/miller_rabin.py) * [Mobius Function](maths/mobius_function.py) * [Modular Exponential](maths/modular_exponential.py) * [Monte Carlo](maths/monte_carlo.py) * [Monte Carlo Dice](maths/monte_carlo_dice.py) * [Nevilles Method](maths/nevilles_method.py) * [Newton Raphson](maths/newton_raphson.py) * [Number Of Digits](maths/number_of_digits.py) * [Numerical Integration](maths/numerical_integration.py) * [Perfect Cube](maths/perfect_cube.py) * [Perfect Number](maths/perfect_number.py) * [Perfect Square](maths/perfect_square.py) * [Persistence](maths/persistence.py) * [Pi Generator](maths/pi_generator.py) * [Pi Monte Carlo Estimation](maths/pi_monte_carlo_estimation.py) * [Points Are Collinear 3D](maths/points_are_collinear_3d.py) * [Pollard Rho](maths/pollard_rho.py) * [Polynomial Evaluation](maths/polynomial_evaluation.py) * Polynomials * [Single Indeterminate Operations](maths/polynomials/single_indeterminate_operations.py) * [Power Using Recursion](maths/power_using_recursion.py) * [Prime Check](maths/prime_check.py) * [Prime Factors](maths/prime_factors.py) * [Prime Numbers](maths/prime_numbers.py) * [Prime Sieve Eratosthenes](maths/prime_sieve_eratosthenes.py) * [Primelib](maths/primelib.py) * [Print Multiplication Table](maths/print_multiplication_table.py) * [Pronic Number](maths/pronic_number.py) * [Proth Number](maths/proth_number.py) * 
[Pythagoras](maths/pythagoras.py) * [Qr Decomposition](maths/qr_decomposition.py) * [Quadratic Equations Complex Numbers](maths/quadratic_equations_complex_numbers.py) * [Radians](maths/radians.py) * [Radix2 Fft](maths/radix2_fft.py) * [Relu](maths/relu.py) * [Runge Kutta](maths/runge_kutta.py) * [Segmented Sieve](maths/segmented_sieve.py) * Series * [Arithmetic](maths/series/arithmetic.py) * [Geometric](maths/series/geometric.py) * [Geometric Series](maths/series/geometric_series.py) * [Harmonic](maths/series/harmonic.py) * [Harmonic Series](maths/series/harmonic_series.py) * [Hexagonal Numbers](maths/series/hexagonal_numbers.py) * [P Series](maths/series/p_series.py) * [Sieve Of Eratosthenes](maths/sieve_of_eratosthenes.py) * [Sigmoid](maths/sigmoid.py) * [Sigmoid Linear Unit](maths/sigmoid_linear_unit.py) * [Signum](maths/signum.py) * [Simpson Rule](maths/simpson_rule.py) * [Sin](maths/sin.py) * [Sock Merchant](maths/sock_merchant.py) * [Softmax](maths/softmax.py) * [Square Root](maths/square_root.py) * [Sum Of Arithmetic Series](maths/sum_of_arithmetic_series.py) * [Sum Of Digits](maths/sum_of_digits.py) * [Sum Of Geometric Progression](maths/sum_of_geometric_progression.py) * [Sum Of Harmonic Series](maths/sum_of_harmonic_series.py) * [Sumset](maths/sumset.py) * [Sylvester Sequence](maths/sylvester_sequence.py) * [Test Prime Check](maths/test_prime_check.py) * [Trapezoidal Rule](maths/trapezoidal_rule.py) * [Triplet Sum](maths/triplet_sum.py) * [Twin Prime](maths/twin_prime.py) * [Two Pointer](maths/two_pointer.py) * [Two Sum](maths/two_sum.py) * [Ugly Numbers](maths/ugly_numbers.py) * [Volume](maths/volume.py) * [Weird Number](maths/weird_number.py) * [Zellers Congruence](maths/zellers_congruence.py) ## Matrix * [Binary Search Matrix](matrix/binary_search_matrix.py) * [Count Islands In Matrix](matrix/count_islands_in_matrix.py) * [Count Paths](matrix/count_paths.py) * [Cramers Rule 2X2](matrix/cramers_rule_2x2.py) * [Inverse Of Matrix](matrix/inverse_of_matrix.py) * [Largest Square Area In Matrix](matrix/largest_square_area_in_matrix.py) * [Matrix Class](matrix/matrix_class.py) * [Matrix Operation](matrix/matrix_operation.py) * [Max Area Of Island](matrix/max_area_of_island.py) * [Nth Fibonacci Using Matrix Exponentiation](matrix/nth_fibonacci_using_matrix_exponentiation.py) * [Pascal Triangle](matrix/pascal_triangle.py) * [Rotate Matrix](matrix/rotate_matrix.py) * [Searching In Sorted Matrix](matrix/searching_in_sorted_matrix.py) * [Sherman Morrison](matrix/sherman_morrison.py) * [Spiral Print](matrix/spiral_print.py) * Tests * [Test Matrix Operation](matrix/tests/test_matrix_operation.py) ## Networking Flow * [Ford Fulkerson](networking_flow/ford_fulkerson.py) * [Minimum Cut](networking_flow/minimum_cut.py) ## Neural Network * [2 Hidden Layers Neural Network](neural_network/2_hidden_layers_neural_network.py) * [Back Propagation Neural Network](neural_network/back_propagation_neural_network.py) * [Convolution Neural Network](neural_network/convolution_neural_network.py) * [Input Data](neural_network/input_data.py) * [Perceptron](neural_network/perceptron.py) * [Simple Neural Network](neural_network/simple_neural_network.py) ## Other * [Activity Selection](other/activity_selection.py) * [Alternative List Arrange](other/alternative_list_arrange.py) * [Davisb Putnamb Logemannb Loveland](other/davisb_putnamb_logemannb_loveland.py) * [Dijkstra Bankers Algorithm](other/dijkstra_bankers_algorithm.py) * [Doomsday](other/doomsday.py) * [Fischer Yates Shuffle](other/fischer_yates_shuffle.py) 
* [Gauss Easter](other/gauss_easter.py) * [Graham Scan](other/graham_scan.py) * [Greedy](other/greedy.py) * [Least Recently Used](other/least_recently_used.py) * [Lfu Cache](other/lfu_cache.py) * [Linear Congruential Generator](other/linear_congruential_generator.py) * [Lru Cache](other/lru_cache.py) * [Magicdiamondpattern](other/magicdiamondpattern.py) * [Maximum Subarray](other/maximum_subarray.py) * [Nested Brackets](other/nested_brackets.py) * [Password](other/password.py) * [Quine](other/quine.py) * [Scoring Algorithm](other/scoring_algorithm.py) * [Sdes](other/sdes.py) * [Tower Of Hanoi](other/tower_of_hanoi.py) ## Physics * [Archimedes Principle](physics/archimedes_principle.py) * [Casimir Effect](physics/casimir_effect.py) * [Centripetal Force](physics/centripetal_force.py) * [Grahams Law](physics/grahams_law.py) * [Horizontal Projectile Motion](physics/horizontal_projectile_motion.py) * [Hubble Parameter](physics/hubble_parameter.py) * [Ideal Gas Law](physics/ideal_gas_law.py) * [Kinetic Energy](physics/kinetic_energy.py) * [Lorentz Transformation Four Vector](physics/lorentz_transformation_four_vector.py) * [Malus Law](physics/malus_law.py) * [N Body Simulation](physics/n_body_simulation.py) * [Newtons Law Of Gravitation](physics/newtons_law_of_gravitation.py) * [Newtons Second Law Of Motion](physics/newtons_second_law_of_motion.py) * [Potential Energy](physics/potential_energy.py) * [Rms Speed Of Molecule](physics/rms_speed_of_molecule.py) * [Shear Stress](physics/shear_stress.py) ## Project Euler * Problem 001 * [Sol1](project_euler/problem_001/sol1.py) * [Sol2](project_euler/problem_001/sol2.py) * [Sol3](project_euler/problem_001/sol3.py) * [Sol4](project_euler/problem_001/sol4.py) * [Sol5](project_euler/problem_001/sol5.py) * [Sol6](project_euler/problem_001/sol6.py) * [Sol7](project_euler/problem_001/sol7.py) * Problem 002 * [Sol1](project_euler/problem_002/sol1.py) * [Sol2](project_euler/problem_002/sol2.py) * [Sol3](project_euler/problem_002/sol3.py) * [Sol4](project_euler/problem_002/sol4.py) * [Sol5](project_euler/problem_002/sol5.py) * Problem 003 * [Sol1](project_euler/problem_003/sol1.py) * [Sol2](project_euler/problem_003/sol2.py) * [Sol3](project_euler/problem_003/sol3.py) * Problem 004 * [Sol1](project_euler/problem_004/sol1.py) * [Sol2](project_euler/problem_004/sol2.py) * Problem 005 * [Sol1](project_euler/problem_005/sol1.py) * [Sol2](project_euler/problem_005/sol2.py) * Problem 006 * [Sol1](project_euler/problem_006/sol1.py) * [Sol2](project_euler/problem_006/sol2.py) * [Sol3](project_euler/problem_006/sol3.py) * [Sol4](project_euler/problem_006/sol4.py) * Problem 007 * [Sol1](project_euler/problem_007/sol1.py) * [Sol2](project_euler/problem_007/sol2.py) * [Sol3](project_euler/problem_007/sol3.py) * Problem 008 * [Sol1](project_euler/problem_008/sol1.py) * [Sol2](project_euler/problem_008/sol2.py) * [Sol3](project_euler/problem_008/sol3.py) * Problem 009 * [Sol1](project_euler/problem_009/sol1.py) * [Sol2](project_euler/problem_009/sol2.py) * [Sol3](project_euler/problem_009/sol3.py) * Problem 010 * [Sol1](project_euler/problem_010/sol1.py) * [Sol2](project_euler/problem_010/sol2.py) * [Sol3](project_euler/problem_010/sol3.py) * Problem 011 * [Sol1](project_euler/problem_011/sol1.py) * [Sol2](project_euler/problem_011/sol2.py) * Problem 012 * [Sol1](project_euler/problem_012/sol1.py) * [Sol2](project_euler/problem_012/sol2.py) * Problem 013 * [Sol1](project_euler/problem_013/sol1.py) * Problem 014 * [Sol1](project_euler/problem_014/sol1.py) * 
[Sol2](project_euler/problem_014/sol2.py) * Problem 015 * [Sol1](project_euler/problem_015/sol1.py) * Problem 016 * [Sol1](project_euler/problem_016/sol1.py) * [Sol2](project_euler/problem_016/sol2.py) * Problem 017 * [Sol1](project_euler/problem_017/sol1.py) * Problem 018 * [Solution](project_euler/problem_018/solution.py) * Problem 019 * [Sol1](project_euler/problem_019/sol1.py) * Problem 020 * [Sol1](project_euler/problem_020/sol1.py) * [Sol2](project_euler/problem_020/sol2.py) * [Sol3](project_euler/problem_020/sol3.py) * [Sol4](project_euler/problem_020/sol4.py) * Problem 021 * [Sol1](project_euler/problem_021/sol1.py) * Problem 022 * [Sol1](project_euler/problem_022/sol1.py) * [Sol2](project_euler/problem_022/sol2.py) * Problem 023 * [Sol1](project_euler/problem_023/sol1.py) * Problem 024 * [Sol1](project_euler/problem_024/sol1.py) * Problem 025 * [Sol1](project_euler/problem_025/sol1.py) * [Sol2](project_euler/problem_025/sol2.py) * [Sol3](project_euler/problem_025/sol3.py) * Problem 026 * [Sol1](project_euler/problem_026/sol1.py) * Problem 027 * [Sol1](project_euler/problem_027/sol1.py) * Problem 028 * [Sol1](project_euler/problem_028/sol1.py) * Problem 029 * [Sol1](project_euler/problem_029/sol1.py) * Problem 030 * [Sol1](project_euler/problem_030/sol1.py) * Problem 031 * [Sol1](project_euler/problem_031/sol1.py) * [Sol2](project_euler/problem_031/sol2.py) * Problem 032 * [Sol32](project_euler/problem_032/sol32.py) * Problem 033 * [Sol1](project_euler/problem_033/sol1.py) * Problem 034 * [Sol1](project_euler/problem_034/sol1.py) * Problem 035 * [Sol1](project_euler/problem_035/sol1.py) * Problem 036 * [Sol1](project_euler/problem_036/sol1.py) * Problem 037 * [Sol1](project_euler/problem_037/sol1.py) * Problem 038 * [Sol1](project_euler/problem_038/sol1.py) * Problem 039 * [Sol1](project_euler/problem_039/sol1.py) * Problem 040 * [Sol1](project_euler/problem_040/sol1.py) * Problem 041 * [Sol1](project_euler/problem_041/sol1.py) * Problem 042 * [Solution42](project_euler/problem_042/solution42.py) * Problem 043 * [Sol1](project_euler/problem_043/sol1.py) * Problem 044 * [Sol1](project_euler/problem_044/sol1.py) * Problem 045 * [Sol1](project_euler/problem_045/sol1.py) * Problem 046 * [Sol1](project_euler/problem_046/sol1.py) * Problem 047 * [Sol1](project_euler/problem_047/sol1.py) * Problem 048 * [Sol1](project_euler/problem_048/sol1.py) * Problem 049 * [Sol1](project_euler/problem_049/sol1.py) * Problem 050 * [Sol1](project_euler/problem_050/sol1.py) * Problem 051 * [Sol1](project_euler/problem_051/sol1.py) * Problem 052 * [Sol1](project_euler/problem_052/sol1.py) * Problem 053 * [Sol1](project_euler/problem_053/sol1.py) * Problem 054 * [Sol1](project_euler/problem_054/sol1.py) * [Test Poker Hand](project_euler/problem_054/test_poker_hand.py) * Problem 055 * [Sol1](project_euler/problem_055/sol1.py) * Problem 056 * [Sol1](project_euler/problem_056/sol1.py) * Problem 057 * [Sol1](project_euler/problem_057/sol1.py) * Problem 058 * [Sol1](project_euler/problem_058/sol1.py) * Problem 059 * [Sol1](project_euler/problem_059/sol1.py) * Problem 062 * [Sol1](project_euler/problem_062/sol1.py) * Problem 063 * [Sol1](project_euler/problem_063/sol1.py) * Problem 064 * [Sol1](project_euler/problem_064/sol1.py) * Problem 065 * [Sol1](project_euler/problem_065/sol1.py) * Problem 067 * [Sol1](project_euler/problem_067/sol1.py) * [Sol2](project_euler/problem_067/sol2.py) * Problem 068 * [Sol1](project_euler/problem_068/sol1.py) * Problem 069 * [Sol1](project_euler/problem_069/sol1.py) * Problem 
070 * [Sol1](project_euler/problem_070/sol1.py) * Problem 071 * [Sol1](project_euler/problem_071/sol1.py) * Problem 072 * [Sol1](project_euler/problem_072/sol1.py) * [Sol2](project_euler/problem_072/sol2.py) * Problem 073 * [Sol1](project_euler/problem_073/sol1.py) * Problem 074 * [Sol1](project_euler/problem_074/sol1.py) * [Sol2](project_euler/problem_074/sol2.py) * Problem 075 * [Sol1](project_euler/problem_075/sol1.py) * Problem 076 * [Sol1](project_euler/problem_076/sol1.py) * Problem 077 * [Sol1](project_euler/problem_077/sol1.py) * Problem 078 * [Sol1](project_euler/problem_078/sol1.py) * Problem 079 * [Sol1](project_euler/problem_079/sol1.py) * Problem 080 * [Sol1](project_euler/problem_080/sol1.py) * Problem 081 * [Sol1](project_euler/problem_081/sol1.py) * Problem 082 * [Sol1](project_euler/problem_082/sol1.py) * Problem 085 * [Sol1](project_euler/problem_085/sol1.py) * Problem 086 * [Sol1](project_euler/problem_086/sol1.py) * Problem 087 * [Sol1](project_euler/problem_087/sol1.py) * Problem 089 * [Sol1](project_euler/problem_089/sol1.py) * Problem 091 * [Sol1](project_euler/problem_091/sol1.py) * Problem 092 * [Sol1](project_euler/problem_092/sol1.py) * Problem 094 * [Sol1](project_euler/problem_094/sol1.py) * Problem 097 * [Sol1](project_euler/problem_097/sol1.py) * Problem 099 * [Sol1](project_euler/problem_099/sol1.py) * Problem 100 * [Sol1](project_euler/problem_100/sol1.py) * Problem 101 * [Sol1](project_euler/problem_101/sol1.py) * Problem 102 * [Sol1](project_euler/problem_102/sol1.py) * Problem 104 * [Sol1](project_euler/problem_104/sol1.py) * Problem 107 * [Sol1](project_euler/problem_107/sol1.py) * Problem 109 * [Sol1](project_euler/problem_109/sol1.py) * Problem 112 * [Sol1](project_euler/problem_112/sol1.py) * Problem 113 * [Sol1](project_euler/problem_113/sol1.py) * Problem 114 * [Sol1](project_euler/problem_114/sol1.py) * Problem 115 * [Sol1](project_euler/problem_115/sol1.py) * Problem 116 * [Sol1](project_euler/problem_116/sol1.py) * Problem 117 * [Sol1](project_euler/problem_117/sol1.py) * Problem 119 * [Sol1](project_euler/problem_119/sol1.py) * Problem 120 * [Sol1](project_euler/problem_120/sol1.py) * Problem 121 * [Sol1](project_euler/problem_121/sol1.py) * Problem 123 * [Sol1](project_euler/problem_123/sol1.py) * Problem 125 * [Sol1](project_euler/problem_125/sol1.py) * Problem 129 * [Sol1](project_euler/problem_129/sol1.py) * Problem 131 * [Sol1](project_euler/problem_131/sol1.py) * Problem 135 * [Sol1](project_euler/problem_135/sol1.py) * Problem 144 * [Sol1](project_euler/problem_144/sol1.py) * Problem 145 * [Sol1](project_euler/problem_145/sol1.py) * Problem 173 * [Sol1](project_euler/problem_173/sol1.py) * Problem 174 * [Sol1](project_euler/problem_174/sol1.py) * Problem 180 * [Sol1](project_euler/problem_180/sol1.py) * Problem 187 * [Sol1](project_euler/problem_187/sol1.py) * Problem 188 * [Sol1](project_euler/problem_188/sol1.py) * Problem 191 * [Sol1](project_euler/problem_191/sol1.py) * Problem 203 * [Sol1](project_euler/problem_203/sol1.py) * Problem 205 * [Sol1](project_euler/problem_205/sol1.py) * Problem 206 * [Sol1](project_euler/problem_206/sol1.py) * Problem 207 * [Sol1](project_euler/problem_207/sol1.py) * Problem 234 * [Sol1](project_euler/problem_234/sol1.py) * Problem 301 * [Sol1](project_euler/problem_301/sol1.py) * Problem 493 * [Sol1](project_euler/problem_493/sol1.py) * Problem 551 * [Sol1](project_euler/problem_551/sol1.py) * Problem 587 * [Sol1](project_euler/problem_587/sol1.py) * Problem 686 * 
[Sol1](project_euler/problem_686/sol1.py) * Problem 800 * [Sol1](project_euler/problem_800/sol1.py) ## Quantum * [Bb84](quantum/bb84.py) * [Deutsch Jozsa](quantum/deutsch_jozsa.py) * [Half Adder](quantum/half_adder.py) * [Not Gate](quantum/not_gate.py) * [Q Fourier Transform](quantum/q_fourier_transform.py) * [Q Full Adder](quantum/q_full_adder.py) * [Quantum Entanglement](quantum/quantum_entanglement.py) * [Quantum Random](quantum/quantum_random.py) * [Quantum Teleportation](quantum/quantum_teleportation.py) * [Ripple Adder Classic](quantum/ripple_adder_classic.py) * [Single Qubit Measure](quantum/single_qubit_measure.py) * [Superdense Coding](quantum/superdense_coding.py) ## Scheduling * [First Come First Served](scheduling/first_come_first_served.py) * [Highest Response Ratio Next](scheduling/highest_response_ratio_next.py) * [Job Sequencing With Deadline](scheduling/job_sequencing_with_deadline.py) * [Multi Level Feedback Queue](scheduling/multi_level_feedback_queue.py) * [Non Preemptive Shortest Job First](scheduling/non_preemptive_shortest_job_first.py) * [Round Robin](scheduling/round_robin.py) * [Shortest Job First](scheduling/shortest_job_first.py) ## Searches * [Binary Search](searches/binary_search.py) * [Binary Tree Traversal](searches/binary_tree_traversal.py) * [Double Linear Search](searches/double_linear_search.py) * [Double Linear Search Recursion](searches/double_linear_search_recursion.py) * [Fibonacci Search](searches/fibonacci_search.py) * [Hill Climbing](searches/hill_climbing.py) * [Interpolation Search](searches/interpolation_search.py) * [Jump Search](searches/jump_search.py) * [Linear Search](searches/linear_search.py) * [Quick Select](searches/quick_select.py) * [Sentinel Linear Search](searches/sentinel_linear_search.py) * [Simple Binary Search](searches/simple_binary_search.py) * [Simulated Annealing](searches/simulated_annealing.py) * [Tabu Search](searches/tabu_search.py) * [Ternary Search](searches/ternary_search.py) ## Sorts * [Bead Sort](sorts/bead_sort.py) * [Bitonic Sort](sorts/bitonic_sort.py) * [Bogo Sort](sorts/bogo_sort.py) * [Bubble Sort](sorts/bubble_sort.py) * [Bucket Sort](sorts/bucket_sort.py) * [Circle Sort](sorts/circle_sort.py) * [Cocktail Shaker Sort](sorts/cocktail_shaker_sort.py) * [Comb Sort](sorts/comb_sort.py) * [Counting Sort](sorts/counting_sort.py) * [Cycle Sort](sorts/cycle_sort.py) * [Double Sort](sorts/double_sort.py) * [Dutch National Flag Sort](sorts/dutch_national_flag_sort.py) * [Exchange Sort](sorts/exchange_sort.py) * [External Sort](sorts/external_sort.py) * [Gnome Sort](sorts/gnome_sort.py) * [Heap Sort](sorts/heap_sort.py) * [Insertion Sort](sorts/insertion_sort.py) * [Intro Sort](sorts/intro_sort.py) * [Iterative Merge Sort](sorts/iterative_merge_sort.py) * [Merge Insertion Sort](sorts/merge_insertion_sort.py) * [Merge Sort](sorts/merge_sort.py) * [Msd Radix Sort](sorts/msd_radix_sort.py) * [Natural Sort](sorts/natural_sort.py) * [Odd Even Sort](sorts/odd_even_sort.py) * [Odd Even Transposition Parallel](sorts/odd_even_transposition_parallel.py) * [Odd Even Transposition Single Threaded](sorts/odd_even_transposition_single_threaded.py) * [Pancake Sort](sorts/pancake_sort.py) * [Patience Sort](sorts/patience_sort.py) * [Pigeon Sort](sorts/pigeon_sort.py) * [Pigeonhole Sort](sorts/pigeonhole_sort.py) * [Quick Sort](sorts/quick_sort.py) * [Quick Sort 3 Partition](sorts/quick_sort_3_partition.py) * [Radix Sort](sorts/radix_sort.py) * [Random Normal Distribution Quicksort](sorts/random_normal_distribution_quicksort.py) * 
[Random Pivot Quick Sort](sorts/random_pivot_quick_sort.py) * [Recursive Bubble Sort](sorts/recursive_bubble_sort.py) * [Recursive Insertion Sort](sorts/recursive_insertion_sort.py) * [Recursive Mergesort Array](sorts/recursive_mergesort_array.py) * [Recursive Quick Sort](sorts/recursive_quick_sort.py) * [Selection Sort](sorts/selection_sort.py) * [Shell Sort](sorts/shell_sort.py) * [Shrink Shell Sort](sorts/shrink_shell_sort.py) * [Slowsort](sorts/slowsort.py) * [Stooge Sort](sorts/stooge_sort.py) * [Strand Sort](sorts/strand_sort.py) * [Tim Sort](sorts/tim_sort.py) * [Topological Sort](sorts/topological_sort.py) * [Tree Sort](sorts/tree_sort.py) * [Unknown Sort](sorts/unknown_sort.py) * [Wiggle Sort](sorts/wiggle_sort.py) ## Strings * [Aho Corasick](strings/aho_corasick.py) * [Alternative String Arrange](strings/alternative_string_arrange.py) * [Anagrams](strings/anagrams.py) * [Autocomplete Using Trie](strings/autocomplete_using_trie.py) * [Barcode Validator](strings/barcode_validator.py) * [Boyer Moore Search](strings/boyer_moore_search.py) * [Can String Be Rearranged As Palindrome](strings/can_string_be_rearranged_as_palindrome.py) * [Capitalize](strings/capitalize.py) * [Check Anagrams](strings/check_anagrams.py) * [Credit Card Validator](strings/credit_card_validator.py) * [Detecting English Programmatically](strings/detecting_english_programmatically.py) * [Dna](strings/dna.py) * [Frequency Finder](strings/frequency_finder.py) * [Hamming Distance](strings/hamming_distance.py) * [Indian Phone Validator](strings/indian_phone_validator.py) * [Is Contains Unique Chars](strings/is_contains_unique_chars.py) * [Is Isogram](strings/is_isogram.py) * [Is Palindrome](strings/is_palindrome.py) * [Is Pangram](strings/is_pangram.py) * [Is Spain National Id](strings/is_spain_national_id.py) * [Is Srilankan Phone Number](strings/is_srilankan_phone_number.py) * [Jaro Winkler](strings/jaro_winkler.py) * [Join](strings/join.py) * [Knuth Morris Pratt](strings/knuth_morris_pratt.py) * [Levenshtein Distance](strings/levenshtein_distance.py) * [Lower](strings/lower.py) * [Manacher](strings/manacher.py) * [Min Cost String Conversion](strings/min_cost_string_conversion.py) * [Naive String Search](strings/naive_string_search.py) * [Ngram](strings/ngram.py) * [Palindrome](strings/palindrome.py) * [Prefix Function](strings/prefix_function.py) * [Rabin Karp](strings/rabin_karp.py) * [Remove Duplicate](strings/remove_duplicate.py) * [Reverse Letters](strings/reverse_letters.py) * [Reverse Long Words](strings/reverse_long_words.py) * [Reverse Words](strings/reverse_words.py) * [Snake Case To Camel Pascal Case](strings/snake_case_to_camel_pascal_case.py) * [Split](strings/split.py) * [Text Justification](strings/text_justification.py) * [Top K Frequent Words](strings/top_k_frequent_words.py) * [Upper](strings/upper.py) * [Wave](strings/wave.py) * [Wildcard Pattern Matching](strings/wildcard_pattern_matching.py) * [Word Occurrence](strings/word_occurrence.py) * [Word Patterns](strings/word_patterns.py) * [Z Function](strings/z_function.py) ## Web Programming * [Co2 Emission](web_programming/co2_emission.py) * [Convert Number To Words](web_programming/convert_number_to_words.py) * [Covid Stats Via Xpath](web_programming/covid_stats_via_xpath.py) * [Crawl Google Results](web_programming/crawl_google_results.py) * [Crawl Google Scholar Citation](web_programming/crawl_google_scholar_citation.py) * [Currency Converter](web_programming/currency_converter.py) * [Current Stock 
Price](web_programming/current_stock_price.py) * [Current Weather](web_programming/current_weather.py) * [Daily Horoscope](web_programming/daily_horoscope.py) * [Download Images From Google Query](web_programming/download_images_from_google_query.py) * [Emails From Url](web_programming/emails_from_url.py) * [Fetch Anime And Play](web_programming/fetch_anime_and_play.py) * [Fetch Bbc News](web_programming/fetch_bbc_news.py) * [Fetch Github Info](web_programming/fetch_github_info.py) * [Fetch Jobs](web_programming/fetch_jobs.py) * [Fetch Quotes](web_programming/fetch_quotes.py) * [Fetch Well Rx Price](web_programming/fetch_well_rx_price.py) * [Get Amazon Product Data](web_programming/get_amazon_product_data.py) * [Get Imdb Top 250 Movies Csv](web_programming/get_imdb_top_250_movies_csv.py) * [Get Imdbtop](web_programming/get_imdbtop.py) * [Get Top Hn Posts](web_programming/get_top_hn_posts.py) * [Get User Tweets](web_programming/get_user_tweets.py) * [Giphy](web_programming/giphy.py) * [Instagram Crawler](web_programming/instagram_crawler.py) * [Instagram Pic](web_programming/instagram_pic.py) * [Instagram Video](web_programming/instagram_video.py) * [Nasa Data](web_programming/nasa_data.py) * [Open Google Results](web_programming/open_google_results.py) * [Random Anime Character](web_programming/random_anime_character.py) * [Recaptcha Verification](web_programming/recaptcha_verification.py) * [Reddit](web_programming/reddit.py) * [Search Books By Isbn](web_programming/search_books_by_isbn.py) * [Slack Message](web_programming/slack_message.py) * [Test Fetch Github Info](web_programming/test_fetch_github_info.py) * [World Covid19 Stats](web_programming/world_covid19_stats.py)
1
TheAlgorithms/Python
8,685
Solving the `Top k most frequent words` problem using a max-heap
Contributed by @aparibocci in #8125 * [x] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
cclauss
"2023-04-23T22:13:44Z"
"2023-04-27T17:32:07Z"
c1b3ea5355266bb47daba378ca10200c4d359453
4c1f876567673db0934ba65d662ea221465ec921
Solving the `Top k most frequent words` problem using a max-heap. Contributed by @aparibocci in #8125 * [x] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
from __future__ import annotations

from collections.abc import Iterable


class Heap:
    """A Max Heap Implementation

    >>> unsorted = [103, 9, 1, 7, 11, 15, 25, 201, 209, 107, 5]
    >>> h = Heap()
    >>> h.build_max_heap(unsorted)
    >>> h
    [209, 201, 25, 103, 107, 15, 1, 9, 7, 11, 5]
    >>>
    >>> h.extract_max()
    209
    >>> h
    [201, 107, 25, 103, 11, 15, 1, 9, 7, 5]
    >>>
    >>> h.insert(100)
    >>> h
    [201, 107, 25, 103, 100, 15, 1, 9, 7, 5, 11]
    >>>
    >>> h.heap_sort()
    >>> h
    [1, 5, 7, 9, 11, 15, 25, 100, 103, 107, 201]
    """

    def __init__(self) -> None:
        self.h: list[float] = []
        self.heap_size: int = 0

    def __repr__(self) -> str:
        return str(self.h)

    def parent_index(self, child_idx: int) -> int | None:
        """return the parent index of given child"""
        if child_idx > 0:
            return (child_idx - 1) // 2
        return None

    def left_child_idx(self, parent_idx: int) -> int | None:
        """
        return the left child index if the left child exists.
        if not, return None.
        """
        left_child_index = 2 * parent_idx + 1
        if left_child_index < self.heap_size:
            return left_child_index
        return None

    def right_child_idx(self, parent_idx: int) -> int | None:
        """
        return the right child index if the right child exists.
        if not, return None.
        """
        right_child_index = 2 * parent_idx + 2
        if right_child_index < self.heap_size:
            return right_child_index
        return None

    def max_heapify(self, index: int) -> None:
        """
        correct a single violation of the heap property in a subtree's root.
        """
        if index < self.heap_size:
            violation: int = index
            left_child = self.left_child_idx(index)
            right_child = self.right_child_idx(index)
            # check which child is larger than its parent
            if left_child is not None and self.h[left_child] > self.h[violation]:
                violation = left_child
            if right_child is not None and self.h[right_child] > self.h[violation]:
                violation = right_child
            # if violation indeed exists
            if violation != index:
                # swap to fix the violation
                self.h[violation], self.h[index] = self.h[index], self.h[violation]
                # fix the subsequent violation recursively if any
                self.max_heapify(violation)

    def build_max_heap(self, collection: Iterable[float]) -> None:
        """build max heap from an unsorted array"""
        self.h = list(collection)
        self.heap_size = len(self.h)
        if self.heap_size > 1:
            # max_heapify from right to left but exclude leaves (last level)
            for i in range(self.heap_size // 2 - 1, -1, -1):
                self.max_heapify(i)

    def extract_max(self) -> float:
        """get and remove max from heap"""
        if self.heap_size >= 2:
            me = self.h[0]
            self.h[0] = self.h.pop(-1)
            self.heap_size -= 1
            self.max_heapify(0)
            return me
        elif self.heap_size == 1:
            self.heap_size -= 1
            return self.h.pop(-1)
        else:
            raise Exception("Empty heap")

    def insert(self, value: float) -> None:
        """insert a new value into the max heap"""
        self.h.append(value)
        idx = (self.heap_size - 1) // 2
        self.heap_size += 1
        while idx >= 0:
            self.max_heapify(idx)
            idx = (idx - 1) // 2

    def heap_sort(self) -> None:
        size = self.heap_size
        for j in range(size - 1, 0, -1):
            self.h[0], self.h[j] = self.h[j], self.h[0]
            self.heap_size -= 1
            self.max_heapify(0)
        self.heap_size = size


if __name__ == "__main__":
    import doctest

    # run doc test
    doctest.testmod()

    # demo
    for unsorted in [
        [0],
        [2],
        [3, 5],
        [5, 3],
        [5, 5],
        [0, 0, 0, 0],
        [1, 1, 1, 1],
        [2, 2, 3, 5],
        [0, 2, 2, 3, 5],
        [2, 5, 3, 0, 2, 3, 0, 3],
        [6, 1, 2, 7, 9, 3, 4, 5, 10, 8],
        [103, 9, 1, 7, 11, 15, 25, 201, 209, 107, 5],
        [-45, -2, -5],
    ]:
        print(f"unsorted array: {unsorted}")

        heap = Heap()
        heap.build_max_heap(unsorted)
        print(f"after build heap: {heap}")

        print(f"max value: {heap.extract_max()}")
        print(f"after max value removed: {heap}")

        heap.insert(100)
        print(f"after new value 100 inserted: {heap}")

        heap.heap_sort()
        print(f"heap-sorted array: {heap}\n")
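For comparison with the hand-rolled max heap above: the Python standard library only ships a min-heap (`heapq`), so the usual idiom for max-heap behaviour is to push negated values. The following is a minimal sketch under that assumption, not part of the repository file; the variable names are illustrative.

import heapq

# heapq maintains a min-heap, so store negated values to simulate a max heap.
unsorted = [103, 9, 1, 7, 11, 15, 25, 201, 209, 107, 5]
negated = [-x for x in unsorted]
heapq.heapify(negated)  # O(n) bottom-up build, the analogue of build_max_heap

max_value = -heapq.heappop(negated)  # 209, the analogue of extract_max()
heapq.heappush(negated, -100)  # the analogue of insert(100)

# nlargest answers a top-k query directly, no negation needed
top_three = heapq.nlargest(3, unsorted)  # [209, 201, 107]
print(max_value, top_three)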
from __future__ import annotations

from abc import abstractmethod
from collections.abc import Iterable
from typing import Generic, Protocol, TypeVar


class Comparable(Protocol):
    @abstractmethod
    def __lt__(self: T, other: T) -> bool:
        pass

    @abstractmethod
    def __gt__(self: T, other: T) -> bool:
        pass

    @abstractmethod
    def __eq__(self: T, other: object) -> bool:
        pass


T = TypeVar("T", bound=Comparable)


class Heap(Generic[T]):
    """A Max Heap Implementation

    >>> unsorted = [103, 9, 1, 7, 11, 15, 25, 201, 209, 107, 5]
    >>> h = Heap()
    >>> h.build_max_heap(unsorted)
    >>> h
    [209, 201, 25, 103, 107, 15, 1, 9, 7, 11, 5]
    >>>
    >>> h.extract_max()
    209
    >>> h
    [201, 107, 25, 103, 11, 15, 1, 9, 7, 5]
    >>>
    >>> h.insert(100)
    >>> h
    [201, 107, 25, 103, 100, 15, 1, 9, 7, 5, 11]
    >>>
    >>> h.heap_sort()
    >>> h
    [1, 5, 7, 9, 11, 15, 25, 100, 103, 107, 201]
    """

    def __init__(self) -> None:
        self.h: list[T] = []
        self.heap_size: int = 0

    def __repr__(self) -> str:
        return str(self.h)

    def parent_index(self, child_idx: int) -> int | None:
        """return the parent index of given child"""
        if child_idx > 0:
            return (child_idx - 1) // 2
        return None

    def left_child_idx(self, parent_idx: int) -> int | None:
        """
        return the left child index if the left child exists.
        if not, return None.
        """
        left_child_index = 2 * parent_idx + 1
        if left_child_index < self.heap_size:
            return left_child_index
        return None

    def right_child_idx(self, parent_idx: int) -> int | None:
        """
        return the right child index if the right child exists.
        if not, return None.
        """
        right_child_index = 2 * parent_idx + 2
        if right_child_index < self.heap_size:
            return right_child_index
        return None

    def max_heapify(self, index: int) -> None:
        """
        correct a single violation of the heap property in a subtree's root.
        """
        if index < self.heap_size:
            violation: int = index
            left_child = self.left_child_idx(index)
            right_child = self.right_child_idx(index)
            # check which child is larger than its parent
            if left_child is not None and self.h[left_child] > self.h[violation]:
                violation = left_child
            if right_child is not None and self.h[right_child] > self.h[violation]:
                violation = right_child
            # if violation indeed exists
            if violation != index:
                # swap to fix the violation
                self.h[violation], self.h[index] = self.h[index], self.h[violation]
                # fix the subsequent violation recursively if any
                self.max_heapify(violation)

    def build_max_heap(self, collection: Iterable[T]) -> None:
        """build max heap from an unsorted array"""
        self.h = list(collection)
        self.heap_size = len(self.h)
        if self.heap_size > 1:
            # max_heapify from right to left but exclude leaves (last level)
            for i in range(self.heap_size // 2 - 1, -1, -1):
                self.max_heapify(i)

    def extract_max(self) -> T:
        """get and remove max from heap"""
        if self.heap_size >= 2:
            me = self.h[0]
            self.h[0] = self.h.pop(-1)
            self.heap_size -= 1
            self.max_heapify(0)
            return me
        elif self.heap_size == 1:
            self.heap_size -= 1
            return self.h.pop(-1)
        else:
            raise Exception("Empty heap")

    def insert(self, value: T) -> None:
        """insert a new value into the max heap"""
        self.h.append(value)
        idx = (self.heap_size - 1) // 2
        self.heap_size += 1
        while idx >= 0:
            self.max_heapify(idx)
            idx = (idx - 1) // 2

    def heap_sort(self) -> None:
        size = self.heap_size
        for j in range(size - 1, 0, -1):
            self.h[0], self.h[j] = self.h[j], self.h[0]
            self.heap_size -= 1
            self.max_heapify(0)
        self.heap_size = size


if __name__ == "__main__":
    import doctest

    # run doc test
    doctest.testmod()

    # demo
    for unsorted in [
        [0],
        [2],
        [3, 5],
        [5, 3],
        [5, 5],
        [0, 0, 0, 0],
        [1, 1, 1, 1],
        [2, 2, 3, 5],
        [0, 2, 2, 3, 5],
        [2, 5, 3, 0, 2, 3, 0, 3],
        [6, 1, 2, 7, 9, 3, 4, 5, 10, 8],
        [103, 9, 1, 7, 11, 15, 25, 201, 209, 107, 5],
        [-45, -2, -5],
    ]:
        print(f"unsorted array: {unsorted}")

        heap: Heap[int] = Heap()
        heap.build_max_heap(unsorted)
        print(f"after build heap: {heap}")

        print(f"max value: {heap.extract_max()}")
        print(f"after max value removed: {heap}")

        heap.insert(100)
        print(f"after new value 100 inserted: {heap}")

        heap.heap_sort()
        print(f"heap-sorted array: {heap}\n")
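The PR recorded in this row targets the `Top k most frequent words` problem (the directory above lists the resulting [Top K Frequent Words](strings/top_k_frequent_words.py) entry). As an illustration only, not the implementation in that file, here is one way the now-generic `Heap[T]` could hold comparable `(count, word)` tuples to answer a top-k query; it assumes the Heap class above is in scope, and `top_k_frequent_words` and its parameters are names invented for this sketch.

from collections import Counter


def top_k_frequent_words(words: list[str], k: int) -> list[str]:
    """Return the k words with the highest counts, most frequent first."""
    # Tuples compare element-wise, so (count, word) pairs satisfy Comparable;
    # ties on count are broken by the word's own ordering.
    heap: Heap[tuple[int, str]] = Heap()
    heap.build_max_heap([(count, word) for word, count in Counter(words).items()])
    return [heap.extract_max()[1] for _ in range(min(k, heap.heap_size))]


print(top_k_frequent_words(["a", "b", "a", "c", "b", "a"], 2))  # ['a', 'b']

Building the heap once costs O(n) and each extraction O(log n), so a top-k query answered this way runs in O(n + k log n) overall.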
1
TheAlgorithms/Python
8,685
Solving the `Top k most frequent words` problem using a max-heap
Contributed by @aparibocci in #8125 * [x] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
cclauss
"2023-04-23T22:13:44Z"
"2023-04-27T17:32:07Z"
c1b3ea5355266bb47daba378ca10200c4d359453
4c1f876567673db0934ba65d662ea221465ec921
Solving the `Top k most frequent words` problem using a max-heap. Contributed by @aparibocci in #8125 * [x] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
""" Linear Discriminant Analysis Assumptions About Data : 1. The input variables has a gaussian distribution. 2. The variance calculated for each input variables by class grouping is the same. 3. The mix of classes in your training set is representative of the problem. Learning The Model : The LDA model requires the estimation of statistics from the training data : 1. Mean of each input value for each class. 2. Probability of an instance belong to each class. 3. Covariance for the input data for each class Calculate the class means : mean(x) = 1/n ( for i = 1 to i = n --> sum(xi)) Calculate the class probabilities : P(y = 0) = count(y = 0) / (count(y = 0) + count(y = 1)) P(y = 1) = count(y = 1) / (count(y = 0) + count(y = 1)) Calculate the variance : We can calculate the variance for dataset in two steps : 1. Calculate the squared difference for each input variable from the group mean. 2. Calculate the mean of the squared difference. ------------------------------------------------ Squared_Difference = (x - mean(k)) ** 2 Variance = (1 / (count(x) - count(classes))) * (for i = 1 to i = n --> sum(Squared_Difference(xi))) Making Predictions : discriminant(x) = x * (mean / variance) - ((mean ** 2) / (2 * variance)) + Ln(probability) --------------------------------------------------------------------------- After calculating the discriminant value for each class, the class with the largest discriminant value is taken as the prediction. Author: @EverLookNeverSee """ from collections.abc import Callable from math import log from os import name, system from random import gauss, seed from typing import TypeVar # Make a training dataset drawn from a gaussian distribution def gaussian_distribution(mean: float, std_dev: float, instance_count: int) -> list: """ Generate gaussian distribution instances based-on given mean and standard deviation :param mean: mean value of class :param std_dev: value of standard deviation entered by usr or default value of it :param instance_count: instance number of class :return: a list containing generated values based-on given mean, std_dev and instance_count >>> gaussian_distribution(5.0, 1.0, 20) # doctest: +NORMALIZE_WHITESPACE [6.288184753155463, 6.4494456086997705, 5.066335808938262, 4.235456349028368, 3.9078267848958586, 5.031334516831717, 3.977896829989127, 3.56317055489747, 5.199311976483754, 5.133374604658605, 5.546468300338232, 4.086029056264687, 5.005005283626573, 4.935258239627312, 3.494170998739258, 5.537997178661033, 5.320711100998849, 7.3891120432406865, 5.202969177309964, 4.855297691835079] """ seed(1) return [gauss(mean, std_dev) for _ in range(instance_count)] # Make corresponding Y flags to detecting classes def y_generator(class_count: int, instance_count: list) -> list: """ Generate y values for corresponding classes :param class_count: Number of classes(data groupings) in dataset :param instance_count: number of instances in class :return: corresponding values for data groupings in dataset >>> y_generator(1, [10]) [0, 0, 0, 0, 0, 0, 0, 0, 0, 0] >>> y_generator(2, [5, 10]) [0, 0, 0, 0, 0, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1] >>> y_generator(4, [10, 5, 15, 20]) # doctest: +NORMALIZE_WHITESPACE [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 1, 1, 1, 1, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3] """ return [k for k in range(class_count) for _ in range(instance_count[k])] # Calculate the class means def calculate_mean(instance_count: int, items: list) -> float: """ Calculate given class mean :param 
instance_count: Number of instances in class :param items: items that related to specific class(data grouping) :return: calculated actual mean of considered class >>> items = gaussian_distribution(5.0, 1.0, 20) >>> calculate_mean(len(items), items) 5.011267842911003 """ # the sum of all items divided by number of instances return sum(items) / instance_count # Calculate the class probabilities def calculate_probabilities(instance_count: int, total_count: int) -> float: """ Calculate the probability that a given instance will belong to which class :param instance_count: number of instances in class :param total_count: the number of all instances :return: value of probability for considered class >>> calculate_probabilities(20, 60) 0.3333333333333333 >>> calculate_probabilities(30, 100) 0.3 """ # number of instances in specific class divided by number of all instances return instance_count / total_count # Calculate the variance def calculate_variance(items: list, means: list, total_count: int) -> float: """ Calculate the variance :param items: a list containing all items(gaussian distribution of all classes) :param means: a list containing real mean values of each class :param total_count: the number of all instances :return: calculated variance for considered dataset >>> items = gaussian_distribution(5.0, 1.0, 20) >>> means = [5.011267842911003] >>> total_count = 20 >>> calculate_variance([items], means, total_count) 0.9618530973487491 """ squared_diff = [] # An empty list to store all squared differences # iterate over number of elements in items for i in range(len(items)): # for loop iterates over number of elements in inner layer of items for j in range(len(items[i])): # appending squared differences to 'squared_diff' list squared_diff.append((items[i][j] - means[i]) ** 2) # one divided by (the number of all instances - number of classes) multiplied by # sum of all squared differences n_classes = len(means) # Number of classes in dataset return 1 / (total_count - n_classes) * sum(squared_diff) # Making predictions def predict_y_values( x_items: list, means: list, variance: float, probabilities: list ) -> list: """This function predicts new indexes(groups for our data) :param x_items: a list containing all items(gaussian distribution of all classes) :param means: a list containing real mean values of each class :param variance: calculated value of variance by calculate_variance function :param probabilities: a list containing all probabilities of classes :return: a list containing predicted Y values >>> x_items = [[6.288184753155463, 6.4494456086997705, 5.066335808938262, ... 4.235456349028368, 3.9078267848958586, 5.031334516831717, ... 3.977896829989127, 3.56317055489747, 5.199311976483754, ... 5.133374604658605, 5.546468300338232, 4.086029056264687, ... 5.005005283626573, 4.935258239627312, 3.494170998739258, ... 5.537997178661033, 5.320711100998849, 7.3891120432406865, ... 5.202969177309964, 4.855297691835079], [11.288184753155463, ... 11.44944560869977, 10.066335808938263, 9.235456349028368, ... 8.907826784895859, 10.031334516831716, 8.977896829989128, ... 8.56317055489747, 10.199311976483754, 10.133374604658606, ... 10.546468300338232, 9.086029056264687, 10.005005283626572, ... 9.935258239627313, 8.494170998739259, 10.537997178661033, ... 10.320711100998848, 12.389112043240686, 10.202969177309964, ... 9.85529769183508], [16.288184753155463, 16.449445608699772, ... 15.066335808938263, 14.235456349028368, 13.907826784895859, ... 
15.031334516831716, 13.977896829989128, 13.56317055489747, ... 15.199311976483754, 15.133374604658606, 15.546468300338232, ... 14.086029056264687, 15.005005283626572, 14.935258239627313, ... 13.494170998739259, 15.537997178661033, 15.320711100998848, ... 17.389112043240686, 15.202969177309964, 14.85529769183508]] >>> means = [5.011267842911003, 10.011267842911003, 15.011267842911002] >>> variance = 0.9618530973487494 >>> probabilities = [0.3333333333333333, 0.3333333333333333, 0.3333333333333333] >>> predict_y_values(x_items, means, variance, ... probabilities) # doctest: +NORMALIZE_WHITESPACE [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2] """ # An empty list to store generated discriminant values of all items in dataset for # each class results = [] # for loop iterates over number of elements in list for i in range(len(x_items)): # for loop iterates over number of inner items of each element for j in range(len(x_items[i])): temp = [] # to store all discriminant values of each item as a list # for loop iterates over number of classes we have in our dataset for k in range(len(x_items)): # appending values of discriminants for each class to 'temp' list temp.append( x_items[i][j] * (means[k] / variance) - (means[k] ** 2 / (2 * variance)) + log(probabilities[k]) ) # appending discriminant values of each item to 'results' list results.append(temp) return [result.index(max(result)) for result in results] # Calculating Accuracy def accuracy(actual_y: list, predicted_y: list) -> float: """ Calculate the value of accuracy based-on predictions :param actual_y:a list containing initial Y values generated by 'y_generator' function :param predicted_y: a list containing predicted Y values generated by 'predict_y_values' function :return: percentage of accuracy >>> actual_y = [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 1, 1, ... 1, 1 ,1 ,1 ,1 ,1 ,1] >>> predicted_y = [0, 0, 0, 1, 1, 1, 0, 0, 1, 1, 0, 0, ... 0, 0, 1, 1, 1, 0, 1, 1, 1] >>> accuracy(actual_y, predicted_y) 50.0 >>> actual_y = [0, 0, 0, 0, 0, 0, 0, 0, 1, 1, 1, 1, 1, 1, ... 1, 1, 1, 1, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2] >>> predicted_y = [0, 0, 0, 0, 0, 0, 0, 0, 1, 1, 1, 1, 1, ... 1, 1, 1, 1, 1, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2] >>> accuracy(actual_y, predicted_y) 100.0 """ # iterate over one element of each list at a time (zip mode) # prediction is correct if actual Y value equals to predicted Y value correct = sum(1 for i, j in zip(actual_y, predicted_y) if i == j) # percentage of accuracy equals to number of correct predictions divided by number # of all data and multiplied by 100 return (correct / len(actual_y)) * 100 num = TypeVar("num") def valid_input( input_type: Callable[[object], num], # Usually float or int input_msg: str, err_msg: str, condition: Callable[[num], bool] = lambda x: True, default: str | None = None, ) -> num: """ Ask for user value and validate that it fulfill a condition. :input_type: user input expected type of value :input_msg: message to show user in the screen :err_msg: message to show in the screen in case of error :condition: function that represents the condition that user input is valid. 
:default: Default value in case the user does not type anything :return: user's input """ while True: try: user_input = input_type(input(input_msg).strip() or default) if condition(user_input): return user_input else: print(f"{user_input}: {err_msg}") continue except ValueError: print( f"{user_input}: Incorrect input type, expected {input_type.__name__!r}" ) # Main Function def main(): """This function starts execution phase""" while True: print(" Linear Discriminant Analysis ".center(50, "*")) print("*" * 50, "\n") print("First of all we should specify the number of classes that") print("we want to generate as training dataset") # Trying to get number of classes n_classes = valid_input( input_type=int, condition=lambda x: x > 0, input_msg="Enter the number of classes (Data Groupings): ", err_msg="Number of classes should be positive!", ) print("-" * 100) # Trying to get the value of standard deviation std_dev = valid_input( input_type=float, condition=lambda x: x >= 0, input_msg=( "Enter the value of standard deviation" "(Default value is 1.0 for all classes): " ), err_msg="Standard deviation should not be negative!", default="1.0", ) print("-" * 100) # Trying to get number of instances in classes and theirs means to generate # dataset counts = [] # An empty list to store instance counts of classes in dataset for i in range(n_classes): user_count = valid_input( input_type=int, condition=lambda x: x > 0, input_msg=(f"Enter The number of instances for class_{i+1}: "), err_msg="Number of instances should be positive!", ) counts.append(user_count) print("-" * 100) # An empty list to store values of user-entered means of classes user_means = [] for a in range(n_classes): user_mean = valid_input( input_type=float, input_msg=(f"Enter the value of mean for class_{a+1}: "), err_msg="This is an invalid value.", ) user_means.append(user_mean) print("-" * 100) print("Standard deviation: ", std_dev) # print out the number of instances in classes in separated line for i, count in enumerate(counts, 1): print(f"Number of instances in class_{i} is: {count}") print("-" * 100) # print out mean values of classes separated line for i, user_mean in enumerate(user_means, 1): print(f"Mean of class_{i} is: {user_mean}") print("-" * 100) # Generating training dataset drawn from gaussian distribution x = [ gaussian_distribution(user_means[j], std_dev, counts[j]) for j in range(n_classes) ] print("Generated Normal Distribution: \n", x) print("-" * 100) # Generating Ys to detecting corresponding classes y = y_generator(n_classes, counts) print("Generated Corresponding Ys: \n", y) print("-" * 100) # Calculating the value of actual mean for each class actual_means = [calculate_mean(counts[k], x[k]) for k in range(n_classes)] # for loop iterates over number of elements in 'actual_means' list and print # out them in separated line for i, actual_mean in enumerate(actual_means, 1): print(f"Actual(Real) mean of class_{i} is: {actual_mean}") print("-" * 100) # Calculating the value of probabilities for each class probabilities = [ calculate_probabilities(counts[i], sum(counts)) for i in range(n_classes) ] # for loop iterates over number of elements in 'probabilities' list and print # out them in separated line for i, probability in enumerate(probabilities, 1): print(f"Probability of class_{i} is: {probability}") print("-" * 100) # Calculating the values of variance for each class variance = calculate_variance(x, actual_means, sum(counts)) print("Variance: ", variance) print("-" * 100) # Predicting Y values # storing 
predicted Y values in 'pre_indexes' variable pre_indexes = predict_y_values(x, actual_means, variance, probabilities) print("-" * 100) # Calculating Accuracy of the model print(f"Accuracy: {accuracy(y, pre_indexes)}") print("-" * 100) print(" DONE ".center(100, "+")) if input("Press any key to restart or 'q' for quit: ").strip().lower() == "q": print("\n" + "GoodBye!".center(100, "-") + "\n") break system("clear" if name == "posix" else "cls") # noqa: S605 if __name__ == "__main__": main()
""" Linear Discriminant Analysis Assumptions About Data : 1. The input variables has a gaussian distribution. 2. The variance calculated for each input variables by class grouping is the same. 3. The mix of classes in your training set is representative of the problem. Learning The Model : The LDA model requires the estimation of statistics from the training data : 1. Mean of each input value for each class. 2. Probability of an instance belong to each class. 3. Covariance for the input data for each class Calculate the class means : mean(x) = 1/n ( for i = 1 to i = n --> sum(xi)) Calculate the class probabilities : P(y = 0) = count(y = 0) / (count(y = 0) + count(y = 1)) P(y = 1) = count(y = 1) / (count(y = 0) + count(y = 1)) Calculate the variance : We can calculate the variance for dataset in two steps : 1. Calculate the squared difference for each input variable from the group mean. 2. Calculate the mean of the squared difference. ------------------------------------------------ Squared_Difference = (x - mean(k)) ** 2 Variance = (1 / (count(x) - count(classes))) * (for i = 1 to i = n --> sum(Squared_Difference(xi))) Making Predictions : discriminant(x) = x * (mean / variance) - ((mean ** 2) / (2 * variance)) + Ln(probability) --------------------------------------------------------------------------- After calculating the discriminant value for each class, the class with the largest discriminant value is taken as the prediction. Author: @EverLookNeverSee """ from collections.abc import Callable from math import log from os import name, system from random import gauss, seed from typing import TypeVar # Make a training dataset drawn from a gaussian distribution def gaussian_distribution(mean: float, std_dev: float, instance_count: int) -> list: """ Generate gaussian distribution instances based-on given mean and standard deviation :param mean: mean value of class :param std_dev: value of standard deviation entered by usr or default value of it :param instance_count: instance number of class :return: a list containing generated values based-on given mean, std_dev and instance_count >>> gaussian_distribution(5.0, 1.0, 20) # doctest: +NORMALIZE_WHITESPACE [6.288184753155463, 6.4494456086997705, 5.066335808938262, 4.235456349028368, 3.9078267848958586, 5.031334516831717, 3.977896829989127, 3.56317055489747, 5.199311976483754, 5.133374604658605, 5.546468300338232, 4.086029056264687, 5.005005283626573, 4.935258239627312, 3.494170998739258, 5.537997178661033, 5.320711100998849, 7.3891120432406865, 5.202969177309964, 4.855297691835079] """ seed(1) return [gauss(mean, std_dev) for _ in range(instance_count)] # Make corresponding Y flags to detecting classes def y_generator(class_count: int, instance_count: list) -> list: """ Generate y values for corresponding classes :param class_count: Number of classes(data groupings) in dataset :param instance_count: number of instances in class :return: corresponding values for data groupings in dataset >>> y_generator(1, [10]) [0, 0, 0, 0, 0, 0, 0, 0, 0, 0] >>> y_generator(2, [5, 10]) [0, 0, 0, 0, 0, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1] >>> y_generator(4, [10, 5, 15, 20]) # doctest: +NORMALIZE_WHITESPACE [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 1, 1, 1, 1, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3] """ return [k for k in range(class_count) for _ in range(instance_count[k])] # Calculate the class means def calculate_mean(instance_count: int, items: list) -> float: """ Calculate given class mean :param 
instance_count: Number of instances in class :param items: items that related to specific class(data grouping) :return: calculated actual mean of considered class >>> items = gaussian_distribution(5.0, 1.0, 20) >>> calculate_mean(len(items), items) 5.011267842911003 """ # the sum of all items divided by number of instances return sum(items) / instance_count # Calculate the class probabilities def calculate_probabilities(instance_count: int, total_count: int) -> float: """ Calculate the probability that a given instance will belong to which class :param instance_count: number of instances in class :param total_count: the number of all instances :return: value of probability for considered class >>> calculate_probabilities(20, 60) 0.3333333333333333 >>> calculate_probabilities(30, 100) 0.3 """ # number of instances in specific class divided by number of all instances return instance_count / total_count # Calculate the variance def calculate_variance(items: list, means: list, total_count: int) -> float: """ Calculate the variance :param items: a list containing all items(gaussian distribution of all classes) :param means: a list containing real mean values of each class :param total_count: the number of all instances :return: calculated variance for considered dataset >>> items = gaussian_distribution(5.0, 1.0, 20) >>> means = [5.011267842911003] >>> total_count = 20 >>> calculate_variance([items], means, total_count) 0.9618530973487491 """ squared_diff = [] # An empty list to store all squared differences # iterate over number of elements in items for i in range(len(items)): # for loop iterates over number of elements in inner layer of items for j in range(len(items[i])): # appending squared differences to 'squared_diff' list squared_diff.append((items[i][j] - means[i]) ** 2) # one divided by (the number of all instances - number of classes) multiplied by # sum of all squared differences n_classes = len(means) # Number of classes in dataset return 1 / (total_count - n_classes) * sum(squared_diff) # Making predictions def predict_y_values( x_items: list, means: list, variance: float, probabilities: list ) -> list: """This function predicts new indexes(groups for our data) :param x_items: a list containing all items(gaussian distribution of all classes) :param means: a list containing real mean values of each class :param variance: calculated value of variance by calculate_variance function :param probabilities: a list containing all probabilities of classes :return: a list containing predicted Y values >>> x_items = [[6.288184753155463, 6.4494456086997705, 5.066335808938262, ... 4.235456349028368, 3.9078267848958586, 5.031334516831717, ... 3.977896829989127, 3.56317055489747, 5.199311976483754, ... 5.133374604658605, 5.546468300338232, 4.086029056264687, ... 5.005005283626573, 4.935258239627312, 3.494170998739258, ... 5.537997178661033, 5.320711100998849, 7.3891120432406865, ... 5.202969177309964, 4.855297691835079], [11.288184753155463, ... 11.44944560869977, 10.066335808938263, 9.235456349028368, ... 8.907826784895859, 10.031334516831716, 8.977896829989128, ... 8.56317055489747, 10.199311976483754, 10.133374604658606, ... 10.546468300338232, 9.086029056264687, 10.005005283626572, ... 9.935258239627313, 8.494170998739259, 10.537997178661033, ... 10.320711100998848, 12.389112043240686, 10.202969177309964, ... 9.85529769183508], [16.288184753155463, 16.449445608699772, ... 15.066335808938263, 14.235456349028368, 13.907826784895859, ... 
15.031334516831716, 13.977896829989128, 13.56317055489747, ... 15.199311976483754, 15.133374604658606, 15.546468300338232, ... 14.086029056264687, 15.005005283626572, 14.935258239627313, ... 13.494170998739259, 15.537997178661033, 15.320711100998848, ... 17.389112043240686, 15.202969177309964, 14.85529769183508]] >>> means = [5.011267842911003, 10.011267842911003, 15.011267842911002] >>> variance = 0.9618530973487494 >>> probabilities = [0.3333333333333333, 0.3333333333333333, 0.3333333333333333] >>> predict_y_values(x_items, means, variance, ... probabilities) # doctest: +NORMALIZE_WHITESPACE [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2] """ # An empty list to store generated discriminant values of all items in dataset for # each class results = [] # for loop iterates over number of elements in list for i in range(len(x_items)): # for loop iterates over number of inner items of each element for j in range(len(x_items[i])): temp = [] # to store all discriminant values of each item as a list # for loop iterates over number of classes we have in our dataset for k in range(len(x_items)): # appending values of discriminants for each class to 'temp' list temp.append( x_items[i][j] * (means[k] / variance) - (means[k] ** 2 / (2 * variance)) + log(probabilities[k]) ) # appending discriminant values of each item to 'results' list results.append(temp) return [result.index(max(result)) for result in results] # Calculating Accuracy def accuracy(actual_y: list, predicted_y: list) -> float: """ Calculate the value of accuracy based-on predictions :param actual_y:a list containing initial Y values generated by 'y_generator' function :param predicted_y: a list containing predicted Y values generated by 'predict_y_values' function :return: percentage of accuracy >>> actual_y = [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 1, 1, ... 1, 1 ,1 ,1 ,1 ,1 ,1] >>> predicted_y = [0, 0, 0, 1, 1, 1, 0, 0, 1, 1, 0, 0, ... 0, 0, 1, 1, 1, 0, 1, 1, 1] >>> accuracy(actual_y, predicted_y) 50.0 >>> actual_y = [0, 0, 0, 0, 0, 0, 0, 0, 1, 1, 1, 1, 1, 1, ... 1, 1, 1, 1, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2] >>> predicted_y = [0, 0, 0, 0, 0, 0, 0, 0, 1, 1, 1, 1, 1, ... 1, 1, 1, 1, 1, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2] >>> accuracy(actual_y, predicted_y) 100.0 """ # iterate over one element of each list at a time (zip mode) # prediction is correct if actual Y value equals to predicted Y value correct = sum(1 for i, j in zip(actual_y, predicted_y) if i == j) # percentage of accuracy equals to number of correct predictions divided by number # of all data and multiplied by 100 return (correct / len(actual_y)) * 100 num = TypeVar("num") def valid_input( input_type: Callable[[object], num], # Usually float or int input_msg: str, err_msg: str, condition: Callable[[num], bool] = lambda x: True, default: str | None = None, ) -> num: """ Ask for user value and validate that it fulfill a condition. :input_type: user input expected type of value :input_msg: message to show user in the screen :err_msg: message to show in the screen in case of error :condition: function that represents the condition that user input is valid. 
:default: Default value in case the user does not type anything :return: user's input """ while True: try: user_input = input_type(input(input_msg).strip() or default) if condition(user_input): return user_input else: print(f"{user_input}: {err_msg}") continue except ValueError: print( f"{user_input}: Incorrect input type, expected {input_type.__name__!r}" ) # Main Function def main(): """This function starts execution phase""" while True: print(" Linear Discriminant Analysis ".center(50, "*")) print("*" * 50, "\n") print("First of all we should specify the number of classes that") print("we want to generate as training dataset") # Trying to get number of classes n_classes = valid_input( input_type=int, condition=lambda x: x > 0, input_msg="Enter the number of classes (Data Groupings): ", err_msg="Number of classes should be positive!", ) print("-" * 100) # Trying to get the value of standard deviation std_dev = valid_input( input_type=float, condition=lambda x: x >= 0, input_msg=( "Enter the value of standard deviation" "(Default value is 1.0 for all classes): " ), err_msg="Standard deviation should not be negative!", default="1.0", ) print("-" * 100) # Trying to get number of instances in classes and theirs means to generate # dataset counts = [] # An empty list to store instance counts of classes in dataset for i in range(n_classes): user_count = valid_input( input_type=int, condition=lambda x: x > 0, input_msg=(f"Enter The number of instances for class_{i+1}: "), err_msg="Number of instances should be positive!", ) counts.append(user_count) print("-" * 100) # An empty list to store values of user-entered means of classes user_means = [] for a in range(n_classes): user_mean = valid_input( input_type=float, input_msg=(f"Enter the value of mean for class_{a+1}: "), err_msg="This is an invalid value.", ) user_means.append(user_mean) print("-" * 100) print("Standard deviation: ", std_dev) # print out the number of instances in classes in separated line for i, count in enumerate(counts, 1): print(f"Number of instances in class_{i} is: {count}") print("-" * 100) # print out mean values of classes separated line for i, user_mean in enumerate(user_means, 1): print(f"Mean of class_{i} is: {user_mean}") print("-" * 100) # Generating training dataset drawn from gaussian distribution x = [ gaussian_distribution(user_means[j], std_dev, counts[j]) for j in range(n_classes) ] print("Generated Normal Distribution: \n", x) print("-" * 100) # Generating Ys to detecting corresponding classes y = y_generator(n_classes, counts) print("Generated Corresponding Ys: \n", y) print("-" * 100) # Calculating the value of actual mean for each class actual_means = [calculate_mean(counts[k], x[k]) for k in range(n_classes)] # for loop iterates over number of elements in 'actual_means' list and print # out them in separated line for i, actual_mean in enumerate(actual_means, 1): print(f"Actual(Real) mean of class_{i} is: {actual_mean}") print("-" * 100) # Calculating the value of probabilities for each class probabilities = [ calculate_probabilities(counts[i], sum(counts)) for i in range(n_classes) ] # for loop iterates over number of elements in 'probabilities' list and print # out them in separated line for i, probability in enumerate(probabilities, 1): print(f"Probability of class_{i} is: {probability}") print("-" * 100) # Calculating the values of variance for each class variance = calculate_variance(x, actual_means, sum(counts)) print("Variance: ", variance) print("-" * 100) # Predicting Y values # storing 
predicted Y values in 'pre_indexes' variable pre_indexes = predict_y_values(x, actual_means, variance, probabilities) print("-" * 100) # Calculating Accuracy of the model print(f"Accuracy: {accuracy(y, pre_indexes)}") print("-" * 100) print(" DONE ".center(100, "+")) if input("Press any key to restart or 'q' for quit: ").strip().lower() == "q": print("\n" + "GoodBye!".center(100, "-") + "\n") break system("cls" if name == "nt" else "clear") # noqa: S605 if __name__ == "__main__": main()
1
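For reference alongside the LDA row above: a minimal, self-contained sketch of the discriminant rule its docstring describes, discriminant(x) = x * (mean / variance) - mean**2 / (2 * variance) + ln(probability). The class means, shared variance and equal priors below are made-up toy values, not taken from the dataset row.

```python
from math import log


def discriminant(x: float, mean: float, variance: float, probability: float) -> float:
    """Linear discriminant score for one class, per the formula in the LDA docstring above."""
    return x * (mean / variance) - mean**2 / (2 * variance) + log(probability)


# Toy setup (assumed values): two classes with means 5.0 and 10.0, shared variance 1.0,
# equal priors 0.5. The predicted class is the one with the largest score.
means = [5.0, 10.0]
scores = [discriminant(6.2, mean, 1.0, 0.5) for mean in means]
assert scores.index(max(scores)) == 0  # x = 6.2 lies closer to the class with mean 5.0
```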
TheAlgorithms/Python
8,685
Solving the `Top k most frequent words` problem using a max-heap
Contributed be @aparibocci in #8125 * [x] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
cclauss
"2023-04-23T22:13:44Z"
"2023-04-27T17:32:07Z"
c1b3ea5355266bb47daba378ca10200c4d359453
4c1f876567673db0934ba65d662ea221465ec921
Solving the `Top k most frequent words` problem using a max-heap. Contributed be @aparibocci in #8125 * [x] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
def double_factorial(n: int) -> int: """ Compute double factorial using recursive method. Recursion can be costly for large numbers. To learn about the theory behind this algorithm: https://en.wikipedia.org/wiki/Double_factorial >>> import math >>> all(double_factorial(i) == math.prod(range(i, 0, -2)) for i in range(20)) True >>> double_factorial(0.1) Traceback (most recent call last): ... ValueError: double_factorial() only accepts integral values >>> double_factorial(-1) Traceback (most recent call last): ... ValueError: double_factorial() not defined for negative values """ if not isinstance(n, int): raise ValueError("double_factorial() only accepts integral values") if n < 0: raise ValueError("double_factorial() not defined for negative values") return 1 if n <= 1 else n * double_factorial(n - 2) if __name__ == "__main__": import doctest doctest.testmod()
def double_factorial(n: int) -> int: """ Compute double factorial using recursive method. Recursion can be costly for large numbers. To learn about the theory behind this algorithm: https://en.wikipedia.org/wiki/Double_factorial >>> import math >>> all(double_factorial(i) == math.prod(range(i, 0, -2)) for i in range(20)) True >>> double_factorial(0.1) Traceback (most recent call last): ... ValueError: double_factorial() only accepts integral values >>> double_factorial(-1) Traceback (most recent call last): ... ValueError: double_factorial() not defined for negative values """ if not isinstance(n, int): raise ValueError("double_factorial() only accepts integral values") if n < 0: raise ValueError("double_factorial() not defined for negative values") return 1 if n <= 1 else n * double_factorial(n - 2) if __name__ == "__main__": import doctest doctest.testmod()
-1
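As a quick cross-check of the recursive double_factorial in the row above, here is an iterative version under the same definition n!! = n * (n-2) * (n-4) * ...; the function name is my own and not part of the repository file.

```python
def double_factorial_iterative(n: int) -> int:
    """Iterative double factorial: multiply n, n-2, n-4, ... down to 1 or 2."""
    result = 1
    for factor in range(n, 0, -2):
        result *= factor
    return result


assert double_factorial_iterative(9) == 9 * 7 * 5 * 3 * 1 == 945
assert double_factorial_iterative(8) == 8 * 6 * 4 * 2 == 384
assert double_factorial_iterative(0) == 1  # empty product, matching the recursive base case
```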
TheAlgorithms/Python
8,685
Solving the `Top k most frequent words` problem using a max-heap
Contributed be @aparibocci in #8125 * [x] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
cclauss
"2023-04-23T22:13:44Z"
"2023-04-27T17:32:07Z"
c1b3ea5355266bb47daba378ca10200c4d359453
4c1f876567673db0934ba65d662ea221465ec921
Solving the `Top k most frequent words` problem using a max-heap. Contributed be @aparibocci in #8125 * [x] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
""" Author Anurag Kumar | [email protected] | git/anuragkumarak95 Simple example of fractal generation using recursion. What is the Sierpiński Triangle? The Sierpiński triangle (sometimes spelled Sierpinski), also called the Sierpiński gasket or Sierpiński sieve, is a fractal attractive fixed set with the overall shape of an equilateral triangle, subdivided recursively into smaller equilateral triangles. Originally constructed as a curve, this is one of the basic examples of self-similar sets—that is, it is a mathematically generated pattern that is reproducible at any magnification or reduction. It is named after the Polish mathematician Wacław Sierpiński, but appeared as a decorative pattern many centuries before the work of Sierpiński. Usage: python sierpinski_triangle.py <int:depth_for_fractal> Credits: The above description is taken from https://en.wikipedia.org/wiki/Sierpi%C5%84ski_triangle This code was written by editing the code from https://www.riannetrujillo.com/blog/python-fractal/ """ import sys import turtle def get_mid(p1: tuple[float, float], p2: tuple[float, float]) -> tuple[float, float]: """ Find the midpoint of two points >>> get_mid((0, 0), (2, 2)) (1.0, 1.0) >>> get_mid((-3, -3), (3, 3)) (0.0, 0.0) >>> get_mid((1, 0), (3, 2)) (2.0, 1.0) >>> get_mid((0, 0), (1, 1)) (0.5, 0.5) >>> get_mid((0, 0), (0, 0)) (0.0, 0.0) """ return (p1[0] + p2[0]) / 2, (p1[1] + p2[1]) / 2 def triangle( vertex1: tuple[float, float], vertex2: tuple[float, float], vertex3: tuple[float, float], depth: int, ) -> None: """ Recursively draw the Sierpinski triangle given the vertices of the triangle and the recursion depth """ my_pen.up() my_pen.goto(vertex1[0], vertex1[1]) my_pen.down() my_pen.goto(vertex2[0], vertex2[1]) my_pen.goto(vertex3[0], vertex3[1]) my_pen.goto(vertex1[0], vertex1[1]) if depth == 0: return triangle(vertex1, get_mid(vertex1, vertex2), get_mid(vertex1, vertex3), depth - 1) triangle(vertex2, get_mid(vertex1, vertex2), get_mid(vertex2, vertex3), depth - 1) triangle(vertex3, get_mid(vertex3, vertex2), get_mid(vertex1, vertex3), depth - 1) if __name__ == "__main__": if len(sys.argv) != 2: raise ValueError( "Correct format for using this script: " "python fractals.py <int:depth_for_fractal>" ) my_pen = turtle.Turtle() my_pen.ht() my_pen.speed(5) my_pen.pencolor("red") vertices = [(-175, -125), (0, 175), (175, -125)] # vertices of triangle triangle(vertices[0], vertices[1], vertices[2], int(sys.argv[1]))
""" Author Anurag Kumar | [email protected] | git/anuragkumarak95 Simple example of fractal generation using recursion. What is the Sierpiński Triangle? The Sierpiński triangle (sometimes spelled Sierpinski), also called the Sierpiński gasket or Sierpiński sieve, is a fractal attractive fixed set with the overall shape of an equilateral triangle, subdivided recursively into smaller equilateral triangles. Originally constructed as a curve, this is one of the basic examples of self-similar sets—that is, it is a mathematically generated pattern that is reproducible at any magnification or reduction. It is named after the Polish mathematician Wacław Sierpiński, but appeared as a decorative pattern many centuries before the work of Sierpiński. Usage: python sierpinski_triangle.py <int:depth_for_fractal> Credits: The above description is taken from https://en.wikipedia.org/wiki/Sierpi%C5%84ski_triangle This code was written by editing the code from https://www.riannetrujillo.com/blog/python-fractal/ """ import sys import turtle def get_mid(p1: tuple[float, float], p2: tuple[float, float]) -> tuple[float, float]: """ Find the midpoint of two points >>> get_mid((0, 0), (2, 2)) (1.0, 1.0) >>> get_mid((-3, -3), (3, 3)) (0.0, 0.0) >>> get_mid((1, 0), (3, 2)) (2.0, 1.0) >>> get_mid((0, 0), (1, 1)) (0.5, 0.5) >>> get_mid((0, 0), (0, 0)) (0.0, 0.0) """ return (p1[0] + p2[0]) / 2, (p1[1] + p2[1]) / 2 def triangle( vertex1: tuple[float, float], vertex2: tuple[float, float], vertex3: tuple[float, float], depth: int, ) -> None: """ Recursively draw the Sierpinski triangle given the vertices of the triangle and the recursion depth """ my_pen.up() my_pen.goto(vertex1[0], vertex1[1]) my_pen.down() my_pen.goto(vertex2[0], vertex2[1]) my_pen.goto(vertex3[0], vertex3[1]) my_pen.goto(vertex1[0], vertex1[1]) if depth == 0: return triangle(vertex1, get_mid(vertex1, vertex2), get_mid(vertex1, vertex3), depth - 1) triangle(vertex2, get_mid(vertex1, vertex2), get_mid(vertex2, vertex3), depth - 1) triangle(vertex3, get_mid(vertex3, vertex2), get_mid(vertex1, vertex3), depth - 1) if __name__ == "__main__": if len(sys.argv) != 2: raise ValueError( "Correct format for using this script: " "python fractals.py <int:depth_for_fractal>" ) my_pen = turtle.Turtle() my_pen.ht() my_pen.speed(5) my_pen.pencolor("red") vertices = [(-175, -125), (0, 175), (175, -125)] # vertices of triangle triangle(vertices[0], vertices[1], vertices[2], int(sys.argv[1]))
-1
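The Sierpiński row above draws with turtle, which needs a display; a small non-graphical sketch of the same recursion (one outline per call, then three sub-calls at depth - 1) shows how quickly the number of drawn outlines grows. The helper name and the closed-form comment are assumptions of mine based on the recursion structure shown in the row, not code from the repository.

```python
def outlines_drawn(depth: int) -> int:
    """Outlines drawn by the recursive triangle() above: 1 at this level plus 3 calls at depth - 1."""
    if depth == 0:
        return 1
    return 1 + 3 * outlines_drawn(depth - 1)


# Closed form of the geometric series: (3 ** (depth + 1) - 1) // 2
assert [outlines_drawn(d) for d in range(5)] == [1, 4, 13, 40, 121]
assert all(outlines_drawn(d) == (3 ** (d + 1) - 1) // 2 for d in range(10))
```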
TheAlgorithms/Python
8,685
Solving the `Top k most frequent words` problem using a max-heap
Contributed be @aparibocci in #8125 * [x] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
cclauss
"2023-04-23T22:13:44Z"
"2023-04-27T17:32:07Z"
c1b3ea5355266bb47daba378ca10200c4d359453
4c1f876567673db0934ba65d662ea221465ec921
Solving the `Top k most frequent words` problem using a max-heap. Contributed be @aparibocci in #8125 * [x] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
""" Project Euler Problem 129: https://projecteuler.net/problem=129 A number consisting entirely of ones is called a repunit. We shall define R(k) to be a repunit of length k; for example, R(6) = 111111. Given that n is a positive integer and GCD(n, 10) = 1, it can be shown that there always exists a value, k, for which R(k) is divisible by n, and let A(n) be the least such value of k; for example, A(7) = 6 and A(41) = 5. The least value of n for which A(n) first exceeds ten is 17. Find the least value of n for which A(n) first exceeds one-million. """ def least_divisible_repunit(divisor: int) -> int: """ Return the least value k such that the Repunit of length k is divisible by divisor. >>> least_divisible_repunit(7) 6 >>> least_divisible_repunit(41) 5 >>> least_divisible_repunit(1234567) 34020 """ if divisor % 5 == 0 or divisor % 2 == 0: return 0 repunit = 1 repunit_index = 1 while repunit: repunit = (10 * repunit + 1) % divisor repunit_index += 1 return repunit_index def solution(limit: int = 1000000) -> int: """ Return the least value of n for which least_divisible_repunit(n) first exceeds limit. >>> solution(10) 17 >>> solution(100) 109 >>> solution(1000) 1017 """ divisor = limit - 1 if divisor % 2 == 0: divisor += 1 while least_divisible_repunit(divisor) <= limit: divisor += 2 return divisor if __name__ == "__main__": print(f"{solution() = }")
""" Project Euler Problem 129: https://projecteuler.net/problem=129 A number consisting entirely of ones is called a repunit. We shall define R(k) to be a repunit of length k; for example, R(6) = 111111. Given that n is a positive integer and GCD(n, 10) = 1, it can be shown that there always exists a value, k, for which R(k) is divisible by n, and let A(n) be the least such value of k; for example, A(7) = 6 and A(41) = 5. The least value of n for which A(n) first exceeds ten is 17. Find the least value of n for which A(n) first exceeds one-million. """ def least_divisible_repunit(divisor: int) -> int: """ Return the least value k such that the Repunit of length k is divisible by divisor. >>> least_divisible_repunit(7) 6 >>> least_divisible_repunit(41) 5 >>> least_divisible_repunit(1234567) 34020 """ if divisor % 5 == 0 or divisor % 2 == 0: return 0 repunit = 1 repunit_index = 1 while repunit: repunit = (10 * repunit + 1) % divisor repunit_index += 1 return repunit_index def solution(limit: int = 1000000) -> int: """ Return the least value of n for which least_divisible_repunit(n) first exceeds limit. >>> solution(10) 17 >>> solution(100) 109 >>> solution(1000) 1017 """ divisor = limit - 1 if divisor % 2 == 0: divisor += 1 while least_divisible_repunit(divisor) <= limit: divisor += 2 return divisor if __name__ == "__main__": print(f"{solution() = }")
-1
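A compact sanity check for the repunit function A(n) described in the Project Euler 129 row above, using the same modular recurrence (extend the repunit by one digit, reduce mod n). This is an independent sketch, assuming gcd(n, 10) == 1 as the problem statement requires.

```python
def a(n: int) -> int:
    """Least k such that the repunit R(k) = 111...1 (k ones) is divisible by n."""
    remainder, k = 1 % n, 1
    while remainder != 0:
        remainder = (remainder * 10 + 1) % n  # append one more '1' digit, modulo n
        k += 1
    return k


assert a(7) == 6 and a(41) == 5  # examples from the problem statement
assert a(17) == 16               # 17 is the least n with A(n) exceeding ten
```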
TheAlgorithms/Python
8,685
Solving the `Top k most frequent words` problem using a max-heap
Contributed be @aparibocci in #8125 * [x] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
cclauss
"2023-04-23T22:13:44Z"
"2023-04-27T17:32:07Z"
c1b3ea5355266bb47daba378ca10200c4d359453
4c1f876567673db0934ba65d662ea221465ec921
Solving the `Top k most frequent words` problem using a max-heap. Contributed be @aparibocci in #8125 * [x] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
def calculate_pi(limit: int) -> str: """ https://en.wikipedia.org/wiki/Leibniz_formula_for_%CF%80 Leibniz Formula for Pi The Leibniz formula is the special case arctan 1 = 1/4 Pi . Leibniz's formula converges extremely slowly: it exhibits sublinear convergence. Convergence (https://en.wikipedia.org/wiki/Leibniz_formula_for_%CF%80#Convergence) We cannot try to prove against an interrupted, uncompleted generation. https://en.wikipedia.org/wiki/Leibniz_formula_for_%CF%80#Unusual_behaviour The errors can in fact be predicted; but those calculations also approach infinity for accuracy. Our output will always be a string since we can defintely store all digits in there. For simplicity' sake, let's just compare against known values and since our outpit is a string, we need to convert to float. >>> import math >>> float(calculate_pi(15)) == math.pi True Since we cannot predict errors or interrupt any infinite alternating series generation since they approach infinity, or interrupt any alternating series, we are going to need math.isclose() >>> math.isclose(float(calculate_pi(50)), math.pi) True >>> math.isclose(float(calculate_pi(100)), math.pi) True Since math.pi-constant contains only 16 digits, here some test with preknown values: >>> calculate_pi(50) '3.14159265358979323846264338327950288419716939937510' >>> calculate_pi(80) '3.14159265358979323846264338327950288419716939937510582097494459230781640628620899' To apply the Leibniz formula for calculating pi, the variables q, r, t, k, n, and l are used for the iteration process. """ q = 1 r = 0 t = 1 k = 1 n = 3 l = 3 decimal = limit counter = 0 result = "" """ We will avoid using yield since we otherwise get a Generator-Object, which we can't just compare against anything. We would have to make a list out of it after the generation, so we will just stick to plain return logic: """ while counter != decimal + 1: if 4 * q + r - t < n * t: result += str(n) if counter == 0: result += "." if decimal == counter: break counter += 1 nr = 10 * (r - n * t) n = ((10 * (3 * q + r)) // t) - 10 * n q *= 10 r = nr else: nr = (2 * q + r) * l nn = (q * (7 * k) + 2 + (r * l)) // (t * l) q *= k t *= l l += 2 k += 1 n = nn r = nr return result def main() -> None: print(f"{calculate_pi(50) = }") import doctest doctest.testmod() if __name__ == "__main__": main()
def calculate_pi(limit: int) -> str: """ https://en.wikipedia.org/wiki/Leibniz_formula_for_%CF%80 Leibniz Formula for Pi The Leibniz formula is the special case arctan 1 = 1/4 Pi . Leibniz's formula converges extremely slowly: it exhibits sublinear convergence. Convergence (https://en.wikipedia.org/wiki/Leibniz_formula_for_%CF%80#Convergence) We cannot try to prove against an interrupted, uncompleted generation. https://en.wikipedia.org/wiki/Leibniz_formula_for_%CF%80#Unusual_behaviour The errors can in fact be predicted; but those calculations also approach infinity for accuracy. Our output will always be a string since we can defintely store all digits in there. For simplicity' sake, let's just compare against known values and since our outpit is a string, we need to convert to float. >>> import math >>> float(calculate_pi(15)) == math.pi True Since we cannot predict errors or interrupt any infinite alternating series generation since they approach infinity, or interrupt any alternating series, we are going to need math.isclose() >>> math.isclose(float(calculate_pi(50)), math.pi) True >>> math.isclose(float(calculate_pi(100)), math.pi) True Since math.pi-constant contains only 16 digits, here some test with preknown values: >>> calculate_pi(50) '3.14159265358979323846264338327950288419716939937510' >>> calculate_pi(80) '3.14159265358979323846264338327950288419716939937510582097494459230781640628620899' To apply the Leibniz formula for calculating pi, the variables q, r, t, k, n, and l are used for the iteration process. """ q = 1 r = 0 t = 1 k = 1 n = 3 l = 3 decimal = limit counter = 0 result = "" """ We will avoid using yield since we otherwise get a Generator-Object, which we can't just compare against anything. We would have to make a list out of it after the generation, so we will just stick to plain return logic: """ while counter != decimal + 1: if 4 * q + r - t < n * t: result += str(n) if counter == 0: result += "." if decimal == counter: break counter += 1 nr = 10 * (r - n * t) n = ((10 * (3 * q + r)) // t) - 10 * n q *= 10 r = nr else: nr = (2 * q + r) * l nn = (q * (7 * k) + 2 + (r * l)) // (t * l) q *= k t *= l l += 2 k += 1 n = nn r = nr return result def main() -> None: print(f"{calculate_pi(50) = }") import doctest doctest.testmod() if __name__ == "__main__": main()
-1
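The docstring in the pi row above stresses how slowly the plain alternating series pi/4 = 1 - 1/3 + 1/5 - ... converges (the row's own code iterates digit by digit rather than summing terms). A short partial-sum sketch makes that slowness concrete; the term count and tolerances below are illustrative choices, not values from the row.

```python
import math


def leibniz_partial_sum(terms: int) -> float:
    """4 * (1 - 1/3 + 1/5 - 1/7 + ...) truncated after the given number of terms."""
    return 4 * sum((-1) ** k / (2 * k + 1) for k in range(terms))


# Sublinear convergence: even 100_000 terms leave an error on the order of 1e-5.
approx = leibniz_partial_sum(100_000)
assert math.isclose(approx, math.pi, abs_tol=1e-4)
assert not math.isclose(approx, math.pi, abs_tol=1e-6)
```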
TheAlgorithms/Python
8,685
Solving the `Top k most frequent words` problem using a max-heap
Contributed be @aparibocci in #8125 * [x] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
cclauss
"2023-04-23T22:13:44Z"
"2023-04-27T17:32:07Z"
c1b3ea5355266bb47daba378ca10200c4d359453
4c1f876567673db0934ba65d662ea221465ec921
Solving the `Top k most frequent words` problem using a max-heap. Contributed be @aparibocci in #8125 * [x] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
""" Implementation of an auto-balanced binary tree! For doctests run following command: python3 -m doctest -v avl_tree.py For testing run: python avl_tree.py """ from __future__ import annotations import math import random from typing import Any class MyQueue: def __init__(self) -> None: self.data: list[Any] = [] self.head: int = 0 self.tail: int = 0 def is_empty(self) -> bool: return self.head == self.tail def push(self, data: Any) -> None: self.data.append(data) self.tail = self.tail + 1 def pop(self) -> Any: ret = self.data[self.head] self.head = self.head + 1 return ret def count(self) -> int: return self.tail - self.head def print_queue(self) -> None: print(self.data) print("**************") print(self.data[self.head : self.tail]) class MyNode: def __init__(self, data: Any) -> None: self.data = data self.left: MyNode | None = None self.right: MyNode | None = None self.height: int = 1 def get_data(self) -> Any: return self.data def get_left(self) -> MyNode | None: return self.left def get_right(self) -> MyNode | None: return self.right def get_height(self) -> int: return self.height def set_data(self, data: Any) -> None: self.data = data def set_left(self, node: MyNode | None) -> None: self.left = node def set_right(self, node: MyNode | None) -> None: self.right = node def set_height(self, height: int) -> None: self.height = height def get_height(node: MyNode | None) -> int: if node is None: return 0 return node.get_height() def my_max(a: int, b: int) -> int: if a > b: return a return b def right_rotation(node: MyNode) -> MyNode: r""" A B / \ / \ B C Bl A / \ --> / / \ Bl Br UB Br C / UB UB = unbalanced node """ print("left rotation node:", node.get_data()) ret = node.get_left() assert ret is not None node.set_left(ret.get_right()) ret.set_right(node) h1 = my_max(get_height(node.get_right()), get_height(node.get_left())) + 1 node.set_height(h1) h2 = my_max(get_height(ret.get_right()), get_height(ret.get_left())) + 1 ret.set_height(h2) return ret def left_rotation(node: MyNode) -> MyNode: """ a mirror symmetry rotation of the left_rotation """ print("right rotation node:", node.get_data()) ret = node.get_right() assert ret is not None node.set_right(ret.get_left()) ret.set_left(node) h1 = my_max(get_height(node.get_right()), get_height(node.get_left())) + 1 node.set_height(h1) h2 = my_max(get_height(ret.get_right()), get_height(ret.get_left())) + 1 ret.set_height(h2) return ret def lr_rotation(node: MyNode) -> MyNode: r""" A A Br / \ / \ / \ B C LR Br C RR B A / \ --> / \ --> / / \ Bl Br B UB Bl UB C \ / UB Bl RR = right_rotation LR = left_rotation """ left_child = node.get_left() assert left_child is not None node.set_left(left_rotation(left_child)) return right_rotation(node) def rl_rotation(node: MyNode) -> MyNode: right_child = node.get_right() assert right_child is not None node.set_right(right_rotation(right_child)) return left_rotation(node) def insert_node(node: MyNode | None, data: Any) -> MyNode | None: if node is None: return MyNode(data) if data < node.get_data(): node.set_left(insert_node(node.get_left(), data)) if ( get_height(node.get_left()) - get_height(node.get_right()) == 2 ): # an unbalance detected left_child = node.get_left() assert left_child is not None if ( data < left_child.get_data() ): # new node is the left child of the left child node = right_rotation(node) else: node = lr_rotation(node) else: node.set_right(insert_node(node.get_right(), data)) if get_height(node.get_right()) - get_height(node.get_left()) == 2: right_child = node.get_right() assert 
right_child is not None if data < right_child.get_data(): node = rl_rotation(node) else: node = left_rotation(node) h1 = my_max(get_height(node.get_right()), get_height(node.get_left())) + 1 node.set_height(h1) return node def get_right_most(root: MyNode) -> Any: while True: right_child = root.get_right() if right_child is None: break root = right_child return root.get_data() def get_left_most(root: MyNode) -> Any: while True: left_child = root.get_left() if left_child is None: break root = left_child return root.get_data() def del_node(root: MyNode, data: Any) -> MyNode | None: left_child = root.get_left() right_child = root.get_right() if root.get_data() == data: if left_child is not None and right_child is not None: temp_data = get_left_most(right_child) root.set_data(temp_data) root.set_right(del_node(right_child, temp_data)) elif left_child is not None: root = left_child elif right_child is not None: root = right_child else: return None elif root.get_data() > data: if left_child is None: print("No such data") return root else: root.set_left(del_node(left_child, data)) else: # root.get_data() < data if right_child is None: return root else: root.set_right(del_node(right_child, data)) if get_height(right_child) - get_height(left_child) == 2: assert right_child is not None if get_height(right_child.get_right()) > get_height(right_child.get_left()): root = left_rotation(root) else: root = rl_rotation(root) elif get_height(right_child) - get_height(left_child) == -2: assert left_child is not None if get_height(left_child.get_left()) > get_height(left_child.get_right()): root = right_rotation(root) else: root = lr_rotation(root) height = my_max(get_height(root.get_right()), get_height(root.get_left())) + 1 root.set_height(height) return root class AVLtree: """ An AVL tree doctest Examples: >>> t = AVLtree() >>> t.insert(4) insert:4 >>> print(str(t).replace(" \\n","\\n")) 4 ************************************* >>> t.insert(2) insert:2 >>> print(str(t).replace(" \\n","\\n").replace(" \\n","\\n")) 4 2 * ************************************* >>> t.insert(3) insert:3 right rotation node: 2 left rotation node: 4 >>> print(str(t).replace(" \\n","\\n").replace(" \\n","\\n")) 3 2 4 ************************************* >>> t.get_height() 2 >>> t.del_node(3) delete:3 >>> print(str(t).replace(" \\n","\\n").replace(" \\n","\\n")) 4 2 * ************************************* """ def __init__(self) -> None: self.root: MyNode | None = None def get_height(self) -> int: return get_height(self.root) def insert(self, data: Any) -> None: print("insert:" + str(data)) self.root = insert_node(self.root, data) def del_node(self, data: Any) -> None: print("delete:" + str(data)) if self.root is None: print("Tree is empty!") return self.root = del_node(self.root, data) def __str__( self, ) -> str: # a level traversale, gives a more intuitive look on the tree output = "" q = MyQueue() q.push(self.root) layer = self.get_height() if layer == 0: return output cnt = 0 while not q.is_empty(): node = q.pop() space = " " * int(math.pow(2, layer - 1)) output += space if node is None: output += "*" q.push(None) q.push(None) else: output += str(node.get_data()) q.push(node.get_left()) q.push(node.get_right()) output += space cnt = cnt + 1 for i in range(100): if cnt == math.pow(2, i) - 1: layer = layer - 1 if layer == 0: output += "\n*************************************" return output output += "\n" break output += "\n*************************************" return output def _test() -> None: import doctest doctest.testmod() if 
__name__ == "__main__": _test() t = AVLtree() lst = list(range(10)) random.shuffle(lst) for i in lst: t.insert(i) print(str(t)) random.shuffle(lst) for i in lst: t.del_node(i) print(str(t))
""" Implementation of an auto-balanced binary tree! For doctests run following command: python3 -m doctest -v avl_tree.py For testing run: python avl_tree.py """ from __future__ import annotations import math import random from typing import Any class MyQueue: def __init__(self) -> None: self.data: list[Any] = [] self.head: int = 0 self.tail: int = 0 def is_empty(self) -> bool: return self.head == self.tail def push(self, data: Any) -> None: self.data.append(data) self.tail = self.tail + 1 def pop(self) -> Any: ret = self.data[self.head] self.head = self.head + 1 return ret def count(self) -> int: return self.tail - self.head def print_queue(self) -> None: print(self.data) print("**************") print(self.data[self.head : self.tail]) class MyNode: def __init__(self, data: Any) -> None: self.data = data self.left: MyNode | None = None self.right: MyNode | None = None self.height: int = 1 def get_data(self) -> Any: return self.data def get_left(self) -> MyNode | None: return self.left def get_right(self) -> MyNode | None: return self.right def get_height(self) -> int: return self.height def set_data(self, data: Any) -> None: self.data = data def set_left(self, node: MyNode | None) -> None: self.left = node def set_right(self, node: MyNode | None) -> None: self.right = node def set_height(self, height: int) -> None: self.height = height def get_height(node: MyNode | None) -> int: if node is None: return 0 return node.get_height() def my_max(a: int, b: int) -> int: if a > b: return a return b def right_rotation(node: MyNode) -> MyNode: r""" A B / \ / \ B C Bl A / \ --> / / \ Bl Br UB Br C / UB UB = unbalanced node """ print("left rotation node:", node.get_data()) ret = node.get_left() assert ret is not None node.set_left(ret.get_right()) ret.set_right(node) h1 = my_max(get_height(node.get_right()), get_height(node.get_left())) + 1 node.set_height(h1) h2 = my_max(get_height(ret.get_right()), get_height(ret.get_left())) + 1 ret.set_height(h2) return ret def left_rotation(node: MyNode) -> MyNode: """ a mirror symmetry rotation of the left_rotation """ print("right rotation node:", node.get_data()) ret = node.get_right() assert ret is not None node.set_right(ret.get_left()) ret.set_left(node) h1 = my_max(get_height(node.get_right()), get_height(node.get_left())) + 1 node.set_height(h1) h2 = my_max(get_height(ret.get_right()), get_height(ret.get_left())) + 1 ret.set_height(h2) return ret def lr_rotation(node: MyNode) -> MyNode: r""" A A Br / \ / \ / \ B C LR Br C RR B A / \ --> / \ --> / / \ Bl Br B UB Bl UB C \ / UB Bl RR = right_rotation LR = left_rotation """ left_child = node.get_left() assert left_child is not None node.set_left(left_rotation(left_child)) return right_rotation(node) def rl_rotation(node: MyNode) -> MyNode: right_child = node.get_right() assert right_child is not None node.set_right(right_rotation(right_child)) return left_rotation(node) def insert_node(node: MyNode | None, data: Any) -> MyNode | None: if node is None: return MyNode(data) if data < node.get_data(): node.set_left(insert_node(node.get_left(), data)) if ( get_height(node.get_left()) - get_height(node.get_right()) == 2 ): # an unbalance detected left_child = node.get_left() assert left_child is not None if ( data < left_child.get_data() ): # new node is the left child of the left child node = right_rotation(node) else: node = lr_rotation(node) else: node.set_right(insert_node(node.get_right(), data)) if get_height(node.get_right()) - get_height(node.get_left()) == 2: right_child = node.get_right() assert 
right_child is not None if data < right_child.get_data(): node = rl_rotation(node) else: node = left_rotation(node) h1 = my_max(get_height(node.get_right()), get_height(node.get_left())) + 1 node.set_height(h1) return node def get_right_most(root: MyNode) -> Any: while True: right_child = root.get_right() if right_child is None: break root = right_child return root.get_data() def get_left_most(root: MyNode) -> Any: while True: left_child = root.get_left() if left_child is None: break root = left_child return root.get_data() def del_node(root: MyNode, data: Any) -> MyNode | None: left_child = root.get_left() right_child = root.get_right() if root.get_data() == data: if left_child is not None and right_child is not None: temp_data = get_left_most(right_child) root.set_data(temp_data) root.set_right(del_node(right_child, temp_data)) elif left_child is not None: root = left_child elif right_child is not None: root = right_child else: return None elif root.get_data() > data: if left_child is None: print("No such data") return root else: root.set_left(del_node(left_child, data)) else: # root.get_data() < data if right_child is None: return root else: root.set_right(del_node(right_child, data)) if get_height(right_child) - get_height(left_child) == 2: assert right_child is not None if get_height(right_child.get_right()) > get_height(right_child.get_left()): root = left_rotation(root) else: root = rl_rotation(root) elif get_height(right_child) - get_height(left_child) == -2: assert left_child is not None if get_height(left_child.get_left()) > get_height(left_child.get_right()): root = right_rotation(root) else: root = lr_rotation(root) height = my_max(get_height(root.get_right()), get_height(root.get_left())) + 1 root.set_height(height) return root class AVLtree: """ An AVL tree doctest Examples: >>> t = AVLtree() >>> t.insert(4) insert:4 >>> print(str(t).replace(" \\n","\\n")) 4 ************************************* >>> t.insert(2) insert:2 >>> print(str(t).replace(" \\n","\\n").replace(" \\n","\\n")) 4 2 * ************************************* >>> t.insert(3) insert:3 right rotation node: 2 left rotation node: 4 >>> print(str(t).replace(" \\n","\\n").replace(" \\n","\\n")) 3 2 4 ************************************* >>> t.get_height() 2 >>> t.del_node(3) delete:3 >>> print(str(t).replace(" \\n","\\n").replace(" \\n","\\n")) 4 2 * ************************************* """ def __init__(self) -> None: self.root: MyNode | None = None def get_height(self) -> int: return get_height(self.root) def insert(self, data: Any) -> None: print("insert:" + str(data)) self.root = insert_node(self.root, data) def del_node(self, data: Any) -> None: print("delete:" + str(data)) if self.root is None: print("Tree is empty!") return self.root = del_node(self.root, data) def __str__( self, ) -> str: # a level traversale, gives a more intuitive look on the tree output = "" q = MyQueue() q.push(self.root) layer = self.get_height() if layer == 0: return output cnt = 0 while not q.is_empty(): node = q.pop() space = " " * int(math.pow(2, layer - 1)) output += space if node is None: output += "*" q.push(None) q.push(None) else: output += str(node.get_data()) q.push(node.get_left()) q.push(node.get_right()) output += space cnt = cnt + 1 for i in range(100): if cnt == math.pow(2, i) - 1: layer = layer - 1 if layer == 0: output += "\n*************************************" return output output += "\n" break output += "\n*************************************" return output def _test() -> None: import doctest doctest.testmod() if 
__name__ == "__main__": _test() t = AVLtree() lst = list(range(10)) random.shuffle(lst) for i in lst: t.insert(i) print(str(t)) random.shuffle(lst) for i in lst: t.del_node(i) print(str(t))
-1
TheAlgorithms/Python
8,685
Solving the `Top k most frequent words` problem using a max-heap
Contributed by @aparibocci in #8125 * [x] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
cclauss
"2023-04-23T22:13:44Z"
"2023-04-27T17:32:07Z"
c1b3ea5355266bb47daba378ca10200c4d359453
4c1f876567673db0934ba65d662ea221465ec921
Solving the `Top k most frequent words` problem using a max-heap. Contributed by @aparibocci in #8125 * [x] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
-1
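The record above describes PR #8685, "Solving the `Top k most frequent words` problem using a max-heap", but does not quote the algorithm itself. As a minimal illustrative sketch only — the function name `top_k_frequent_words` and its signature are assumptions made for this example, not the PR's actual implementation — the idea can be expressed with `collections.Counter` plus Python's `heapq` (a min-heap, so counts are negated to emulate a max-heap):

```python
import heapq
from collections import Counter


def top_k_frequent_words(words: list[str], k: int) -> list[str]:
    """
    Return the k most frequent words, most frequent first.

    >>> top_k_frequent_words(["a", "b", "a", "c", "b", "a"], 2)
    ['a', 'b']
    """
    counts = Counter(words)
    # heapq implements a min-heap, so push negated counts in order to pop
    # the most frequent word first (i.e. emulate a max-heap).
    heap = [(-frequency, word) for word, frequency in counts.items()]
    heapq.heapify(heap)
    return [heapq.heappop(heap)[1] for _ in range(min(k, len(heap)))]
```

Heapifying the counts costs O(n) and each of the k pops costs O(log n), which is the usual reason a heap is preferred over fully sorting the frequency table when k is much smaller than the vocabulary size.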
TheAlgorithms/Python
8,685
Solving the `Top k most frequent words` problem using a max-heap
Contributed by @aparibocci in #8125 * [x] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
cclauss
"2023-04-23T22:13:44Z"
"2023-04-27T17:32:07Z"
c1b3ea5355266bb47daba378ca10200c4d359453
4c1f876567673db0934ba65d662ea221465ec921
Solving the `Top k most frequent words` problem using a max-heap. Contributed by @aparibocci in #8125 * [x] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
""" Multiple image resizing techniques """ import numpy as np from cv2 import destroyAllWindows, imread, imshow, waitKey class NearestNeighbour: """ Simplest and fastest version of image resizing. Source: https://en.wikipedia.org/wiki/Nearest-neighbor_interpolation """ def __init__(self, img, dst_width: int, dst_height: int): if dst_width < 0 or dst_height < 0: raise ValueError("Destination width/height should be > 0") self.img = img self.src_w = img.shape[1] self.src_h = img.shape[0] self.dst_w = dst_width self.dst_h = dst_height self.ratio_x = self.src_w / self.dst_w self.ratio_y = self.src_h / self.dst_h self.output = self.output_img = ( np.ones((self.dst_h, self.dst_w, 3), np.uint8) * 255 ) def process(self): for i in range(self.dst_h): for j in range(self.dst_w): self.output[i][j] = self.img[self.get_y(i)][self.get_x(j)] def get_x(self, x: int) -> int: """ Get parent X coordinate for destination X :param x: Destination X coordinate :return: Parent X coordinate based on `x ratio` >>> nn = NearestNeighbour(imread("digital_image_processing/image_data/lena.jpg", ... 1), 100, 100) >>> nn.ratio_x = 0.5 >>> nn.get_x(4) 2 """ return int(self.ratio_x * x) def get_y(self, y: int) -> int: """ Get parent Y coordinate for destination Y :param y: Destination X coordinate :return: Parent X coordinate based on `y ratio` >>> nn = NearestNeighbour(imread("digital_image_processing/image_data/lena.jpg", ... 1), 100, 100) >>> nn.ratio_y = 0.5 >>> nn.get_y(4) 2 """ return int(self.ratio_y * y) if __name__ == "__main__": dst_w, dst_h = 800, 600 im = imread("image_data/lena.jpg", 1) n = NearestNeighbour(im, dst_w, dst_h) n.process() imshow( f"Image resized from: {im.shape[1]}x{im.shape[0]} to {dst_w}x{dst_h}", n.output ) waitKey(0) destroyAllWindows()
""" Multiple image resizing techniques """ import numpy as np from cv2 import destroyAllWindows, imread, imshow, waitKey class NearestNeighbour: """ Simplest and fastest version of image resizing. Source: https://en.wikipedia.org/wiki/Nearest-neighbor_interpolation """ def __init__(self, img, dst_width: int, dst_height: int): if dst_width < 0 or dst_height < 0: raise ValueError("Destination width/height should be > 0") self.img = img self.src_w = img.shape[1] self.src_h = img.shape[0] self.dst_w = dst_width self.dst_h = dst_height self.ratio_x = self.src_w / self.dst_w self.ratio_y = self.src_h / self.dst_h self.output = self.output_img = ( np.ones((self.dst_h, self.dst_w, 3), np.uint8) * 255 ) def process(self): for i in range(self.dst_h): for j in range(self.dst_w): self.output[i][j] = self.img[self.get_y(i)][self.get_x(j)] def get_x(self, x: int) -> int: """ Get parent X coordinate for destination X :param x: Destination X coordinate :return: Parent X coordinate based on `x ratio` >>> nn = NearestNeighbour(imread("digital_image_processing/image_data/lena.jpg", ... 1), 100, 100) >>> nn.ratio_x = 0.5 >>> nn.get_x(4) 2 """ return int(self.ratio_x * x) def get_y(self, y: int) -> int: """ Get parent Y coordinate for destination Y :param y: Destination X coordinate :return: Parent X coordinate based on `y ratio` >>> nn = NearestNeighbour(imread("digital_image_processing/image_data/lena.jpg", ... 1), 100, 100) >>> nn.ratio_y = 0.5 >>> nn.get_y(4) 2 """ return int(self.ratio_y * y) if __name__ == "__main__": dst_w, dst_h = 800, 600 im = imread("image_data/lena.jpg", 1) n = NearestNeighbour(im, dst_w, dst_h) n.process() imshow( f"Image resized from: {im.shape[1]}x{im.shape[0]} to {dst_w}x{dst_h}", n.output ) waitKey(0) destroyAllWindows()
-1
TheAlgorithms/Python
8,685
Solving the `Top k most frequent words` problem using a max-heap
Contributed by @aparibocci in #8125 * [x] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
cclauss
"2023-04-23T22:13:44Z"
"2023-04-27T17:32:07Z"
c1b3ea5355266bb47daba378ca10200c4d359453
4c1f876567673db0934ba65d662ea221465ec921
Solving the `Top k most frequent words` problem using a max-heap. Contributed by @aparibocci in #8125 * [x] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
-1
TheAlgorithms/Python
8,685
Solving the `Top k most frequent words` problem using a max-heap
Contributed by @aparibocci in #8125 * [x] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
cclauss
"2023-04-23T22:13:44Z"
"2023-04-27T17:32:07Z"
c1b3ea5355266bb47daba378ca10200c4d359453
4c1f876567673db0934ba65d662ea221465ec921
Solving the `Top k most frequent words` problem using a max-heap. Contributed by @aparibocci in #8125 * [x] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
#!/usr/bin/env python3 import hashlib import importlib.util import json import os import pathlib from types import ModuleType import pytest import requests PROJECT_EULER_DIR_PATH = pathlib.Path.cwd().joinpath("project_euler") PROJECT_EULER_ANSWERS_PATH = pathlib.Path.cwd().joinpath( "scripts", "project_euler_answers.json" ) with open(PROJECT_EULER_ANSWERS_PATH) as file_handle: PROBLEM_ANSWERS: dict[str, str] = json.load(file_handle) def convert_path_to_module(file_path: pathlib.Path) -> ModuleType: """Converts a file path to a Python module""" spec = importlib.util.spec_from_file_location(file_path.name, str(file_path)) module = importlib.util.module_from_spec(spec) # type: ignore spec.loader.exec_module(module) # type: ignore return module def all_solution_file_paths() -> list[pathlib.Path]: """Collects all the solution file path in the Project Euler directory""" solution_file_paths = [] for problem_dir_path in PROJECT_EULER_DIR_PATH.iterdir(): if problem_dir_path.is_file() or problem_dir_path.name.startswith("_"): continue for file_path in problem_dir_path.iterdir(): if file_path.suffix != ".py" or file_path.name.startswith(("_", "test")): continue solution_file_paths.append(file_path) return solution_file_paths def get_files_url() -> str: """Return the pull request number which triggered this action.""" with open(os.environ["GITHUB_EVENT_PATH"]) as file: event = json.load(file) return event["pull_request"]["url"] + "/files" def added_solution_file_path() -> list[pathlib.Path]: """Collects only the solution file path which got added in the current pull request. This will only be triggered if the script is ran from GitHub Actions. """ solution_file_paths = [] headers = { "Accept": "application/vnd.github.v3+json", "Authorization": "token " + os.environ["GITHUB_TOKEN"], } files = requests.get(get_files_url(), headers=headers).json() for file in files: filepath = pathlib.Path.cwd().joinpath(file["filename"]) if ( filepath.suffix != ".py" or filepath.name.startswith(("_", "test")) or not filepath.name.startswith("sol") ): continue solution_file_paths.append(filepath) return solution_file_paths def collect_solution_file_paths() -> list[pathlib.Path]: if os.environ.get("CI") and os.environ.get("GITHUB_EVENT_NAME") == "pull_request": # Return only if there are any, otherwise default to all solutions if filepaths := added_solution_file_path(): return filepaths return all_solution_file_paths() @pytest.mark.parametrize( "solution_path", collect_solution_file_paths(), ids=lambda path: f"{path.parent.name}/{path.name}", ) def test_project_euler(solution_path: pathlib.Path) -> None: """Testing for all Project Euler solutions""" # problem_[extract this part] and pad it with zeroes for width 3 problem_number: str = solution_path.parent.name[8:].zfill(3) expected: str = PROBLEM_ANSWERS[problem_number] solution_module = convert_path_to_module(solution_path) answer = str(solution_module.solution()) # type: ignore answer = hashlib.sha256(answer.encode()).hexdigest() assert ( answer == expected ), f"Expected solution to {problem_number} to have hash {expected}, got {answer}"
#!/usr/bin/env python3 import hashlib import importlib.util import json import os import pathlib from types import ModuleType import pytest import requests PROJECT_EULER_DIR_PATH = pathlib.Path.cwd().joinpath("project_euler") PROJECT_EULER_ANSWERS_PATH = pathlib.Path.cwd().joinpath( "scripts", "project_euler_answers.json" ) with open(PROJECT_EULER_ANSWERS_PATH) as file_handle: PROBLEM_ANSWERS: dict[str, str] = json.load(file_handle) def convert_path_to_module(file_path: pathlib.Path) -> ModuleType: """Converts a file path to a Python module""" spec = importlib.util.spec_from_file_location(file_path.name, str(file_path)) module = importlib.util.module_from_spec(spec) # type: ignore spec.loader.exec_module(module) # type: ignore return module def all_solution_file_paths() -> list[pathlib.Path]: """Collects all the solution file path in the Project Euler directory""" solution_file_paths = [] for problem_dir_path in PROJECT_EULER_DIR_PATH.iterdir(): if problem_dir_path.is_file() or problem_dir_path.name.startswith("_"): continue for file_path in problem_dir_path.iterdir(): if file_path.suffix != ".py" or file_path.name.startswith(("_", "test")): continue solution_file_paths.append(file_path) return solution_file_paths def get_files_url() -> str: """Return the pull request number which triggered this action.""" with open(os.environ["GITHUB_EVENT_PATH"]) as file: event = json.load(file) return event["pull_request"]["url"] + "/files" def added_solution_file_path() -> list[pathlib.Path]: """Collects only the solution file path which got added in the current pull request. This will only be triggered if the script is ran from GitHub Actions. """ solution_file_paths = [] headers = { "Accept": "application/vnd.github.v3+json", "Authorization": "token " + os.environ["GITHUB_TOKEN"], } files = requests.get(get_files_url(), headers=headers).json() for file in files: filepath = pathlib.Path.cwd().joinpath(file["filename"]) if ( filepath.suffix != ".py" or filepath.name.startswith(("_", "test")) or not filepath.name.startswith("sol") ): continue solution_file_paths.append(filepath) return solution_file_paths def collect_solution_file_paths() -> list[pathlib.Path]: if os.environ.get("CI") and os.environ.get("GITHUB_EVENT_NAME") == "pull_request": # Return only if there are any, otherwise default to all solutions if filepaths := added_solution_file_path(): return filepaths return all_solution_file_paths() @pytest.mark.parametrize( "solution_path", collect_solution_file_paths(), ids=lambda path: f"{path.parent.name}/{path.name}", ) def test_project_euler(solution_path: pathlib.Path) -> None: """Testing for all Project Euler solutions""" # problem_[extract this part] and pad it with zeroes for width 3 problem_number: str = solution_path.parent.name[8:].zfill(3) expected: str = PROBLEM_ANSWERS[problem_number] solution_module = convert_path_to_module(solution_path) answer = str(solution_module.solution()) # type: ignore answer = hashlib.sha256(answer.encode()).hexdigest() assert ( answer == expected ), f"Expected solution to {problem_number} to have hash {expected}, got {answer}"
-1
TheAlgorithms/Python
8,685
Solving the `Top k most frequent words` problem using a max-heap
Contributed by @aparibocci in #8125 * [x] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
cclauss
"2023-04-23T22:13:44Z"
"2023-04-27T17:32:07Z"
c1b3ea5355266bb47daba378ca10200c4d359453
4c1f876567673db0934ba65d662ea221465ec921
Solving the `Top k most frequent words` problem using a max-heap. Contributed by @aparibocci in #8125 * [x] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
-1
TheAlgorithms/Python
8,685
Solving the `Top k most frequent words` problem using a max-heap
Contributed by @aparibocci in #8125 * [x] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
cclauss
"2023-04-23T22:13:44Z"
"2023-04-27T17:32:07Z"
c1b3ea5355266bb47daba378ca10200c4d359453
4c1f876567673db0934ba65d662ea221465ec921
Solving the `Top k most frequent words` problem using a max-heap. Contributed by @aparibocci in #8125 * [x] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
# Contributing guidelines ## Before contributing Welcome to [TheAlgorithms/Python](https://github.com/TheAlgorithms/Python)! Before sending your pull requests, make sure that you __read the whole guidelines__. If you have any doubt on the contributing guide, please feel free to [state it clearly in an issue](https://github.com/TheAlgorithms/Python/issues/new) or ask the community in [Gitter](https://gitter.im/TheAlgorithms/community). ## Contributing ### Contributor We are very happy that you are considering implementing algorithms and data structures for others! This repository is referenced and used by learners from all over the globe. Being one of our contributors, you agree and confirm that: - You did your work - no plagiarism allowed - Any plagiarized work will not be merged. - Your work will be distributed under [MIT License](LICENSE.md) once your pull request is merged - Your submitted work fulfils or mostly fulfils our styles and standards __New implementation__ is welcome! For example, new solutions for a problem, different representations for a graph data structure or algorithm designs with different complexity but __identical implementation__ of an existing implementation is not allowed. Please check whether the solution is already implemented or not before submitting your pull request. __Improving comments__ and __writing proper tests__ are also highly welcome. ### Contribution We appreciate any contribution, from fixing a grammar mistake in a comment to implementing complex algorithms. Please read this section if you are contributing your work. Your contribution will be tested by our [automated testing on GitHub Actions](https://github.com/TheAlgorithms/Python/actions) to save time and mental energy. After you have submitted your pull request, you should see the GitHub Actions tests start to run at the bottom of your submission page. If those tests fail, then click on the ___details___ button try to read through the GitHub Actions output to understand the failure. If you do not understand, please leave a comment on your submission page and a community member will try to help. Please help us keep our issue list small by adding fixes: #{$ISSUE_NO} to the commit message of pull requests that resolve open issues. GitHub will use this tag to auto-close the issue when the PR is merged. #### What is an Algorithm? An Algorithm is one or more functions (or classes) that: * take one or more inputs, * perform some internal calculations or data manipulations, * return one or more outputs, * have minimal side effects (Ex. `print()`, `plot()`, `read()`, `write()`). Algorithms should be packaged in a way that would make it easy for readers to put them into larger programs. Algorithms should: * have intuitive class and function names that make their purpose clear to readers * use Python naming conventions and intuitive variable names to ease comprehension * be flexible to take different input values * have Python type hints for their input parameters and return values * raise Python exceptions (`ValueError`, etc.) on erroneous input values * have docstrings with clear explanations and/or URLs to source materials * contain doctests that test both valid and erroneous input values * return all calculation results instead of printing or plotting them Algorithms in this repo should not be how-to examples for existing Python packages. Instead, they should perform internal calculations or manipulations to convert input values into different output values. 
Those calculations or manipulations can use data types, classes, or functions of existing Python packages but each algorithm in this repo should add unique value. #### Pre-commit plugin Use [pre-commit](https://pre-commit.com/#installation) to automatically format your code to match our coding style: ```bash python3 -m pip install pre-commit # only required the first time pre-commit install ``` That's it! The plugin will run every time you commit any changes. If there are any errors found during the run, fix them and commit those changes. You can even run the plugin manually on all files: ```bash pre-commit run --all-files --show-diff-on-failure ``` #### Coding Style We want your work to be readable by others; therefore, we encourage you to note the following: - Please write in Python 3.11+. For instance: `print()` is a function in Python 3 so `print "Hello"` will *not* work but `print("Hello")` will. - Please focus hard on the naming of functions, classes, and variables. Help your reader by using __descriptive names__ that can help you to remove redundant comments. - Single letter variable names are *old school* so please avoid them unless their life only spans a few lines. - Expand acronyms because `gcd()` is hard to understand but `greatest_common_divisor()` is not. - Please follow the [Python Naming Conventions](https://pep8.org/#prescriptive-naming-conventions) so variable_names and function_names should be lower_case, CONSTANTS in UPPERCASE, ClassNames should be CamelCase, etc. - We encourage the use of Python [f-strings](https://realpython.com/python-f-strings/#f-strings-a-new-and-improved-way-to-format-strings-in-python) where they make the code easier to read. - Please consider running [__psf/black__](https://github.com/python/black) on your Python file(s) before submitting your pull request. This is not yet a requirement but it does make your code more readable and automatically aligns it with much of [PEP 8](https://www.python.org/dev/peps/pep-0008/). There are other code formatters (autopep8, yapf) but the __black__ formatter is now hosted by the Python Software Foundation. To use it, ```bash python3 -m pip install black # only required the first time black . ``` - All submissions will need to pass the test `ruff .` before they will be accepted so if possible, try this test locally on your Python file(s) before submitting your pull request. ```bash python3 -m pip install ruff # only required the first time ruff . ``` - Original code submission require docstrings or comments to describe your work. - More on docstrings and comments: If you used a Wikipedia article or some other source material to create your algorithm, please add the URL in a docstring or comment to help your reader. The following are considered to be bad and may be requested to be improved: ```python x = x + 2 # increased by 2 ``` This is too trivial. Comments are expected to be explanatory. For comments, you can write them above, on or below a line of code, as long as you are consistent within the same piece of code. We encourage you to put docstrings inside your functions but please pay attention to the indentation of docstrings. The following is a good example: ```python def sum_ab(a, b): """ Return the sum of two integers a and b. """ return a + b ``` - Write tests (especially [__doctests__](https://docs.python.org/3/library/doctest.html)) to illustrate and verify your work. We highly encourage the use of _doctests on all functions_. 
```python def sum_ab(a, b): """ Return the sum of two integers a and b >>> sum_ab(2, 2) 4 >>> sum_ab(-2, 3) 1 >>> sum_ab(4.9, 5.1) 10.0 """ return a + b ``` These doctests will be run by pytest as part of our automated testing so please try to run your doctests locally and make sure that they are found and pass: ```bash python3 -m doctest -v my_submission.py ``` The use of the Python builtin `input()` function is __not__ encouraged: ```python input('Enter your input:') # Or even worse... input = eval(input("Enter your input: ")) ``` However, if your code uses `input()` then we encourage you to gracefully deal with leading and trailing whitespace in user input by adding `.strip()` as in: ```python starting_value = int(input("Please enter a starting value: ").strip()) ``` The use of [Python type hints](https://docs.python.org/3/library/typing.html) is encouraged for function parameters and return values. Our automated testing will run [mypy](http://mypy-lang.org) so run that locally before making your submission. ```python def sum_ab(a: int, b: int) -> int: return a + b ``` Instructions on how to install mypy can be found [here](https://github.com/python/mypy). Please use the command `mypy --ignore-missing-imports .` to test all files or `mypy --ignore-missing-imports path/to/file.py` to test a specific file. - [__List comprehensions and generators__](https://docs.python.org/3/tutorial/datastructures.html#list-comprehensions) are preferred over the use of `lambda`, `map`, `filter`, `reduce` but the important thing is to demonstrate the power of Python in code that is easy to read and maintain. - Avoid importing external libraries for basic algorithms. Only use those libraries for complicated algorithms. - If you need a third-party module that is not in the file __requirements.txt__, please add it to that file as part of your submission. #### Other Requirements for Submissions - If you are submitting code in the `project_euler/` directory, please also read [the dedicated Guideline](https://github.com/TheAlgorithms/Python/blob/master/project_euler/README.md) before contributing to our Project Euler library. - The file extension for code files should be `.py`. Jupyter Notebooks should be submitted to [TheAlgorithms/Jupyter](https://github.com/TheAlgorithms/Jupyter). - Strictly use snake_case (underscore_separated) in your file_name, as it will be easy to parse in future using scripts. - Please avoid creating new directories if at all possible. Try to fit your work into the existing directory structure. - If possible, follow the standard *within* the folder you are submitting to. - If you have modified/added code work, make sure the code compiles before submitting. - If you have modified/added documentation work, ensure your language is concise and contains no grammar errors. - Do not update the README.md or DIRECTORY.md file which will be periodically autogenerated by our GitHub Actions processes. - Add a corresponding explanation to [Algorithms-Explanation](https://github.com/TheAlgorithms/Algorithms-Explanation) (Optional but recommended). - All submissions will be tested with [__mypy__](http://www.mypy-lang.org) so we encourage you to add [__Python type hints__](https://docs.python.org/3/library/typing.html) where it makes sense to do so. - Most importantly, - __Be consistent in the use of these guidelines when submitting.__ - __Join__ us on [Discord](https://discord.com/invite/c7MnfGFGa6) and [Gitter](https://gitter.im/TheAlgorithms/community) __now!__ - Happy coding! 
Writer [@poyea](https://github.com/poyea), Jun 2019.
# Contributing guidelines ## Before contributing Welcome to [TheAlgorithms/Python](https://github.com/TheAlgorithms/Python)! Before sending your pull requests, make sure that you __read the whole guidelines__. If you have any doubt on the contributing guide, please feel free to [state it clearly in an issue](https://github.com/TheAlgorithms/Python/issues/new) or ask the community in [Gitter](https://gitter.im/TheAlgorithms/community). ## Contributing ### Contributor We are very happy that you are considering implementing algorithms and data structures for others! This repository is referenced and used by learners from all over the globe. Being one of our contributors, you agree and confirm that: - You did your work - no plagiarism allowed - Any plagiarized work will not be merged. - Your work will be distributed under [MIT License](LICENSE.md) once your pull request is merged - Your submitted work fulfils or mostly fulfils our styles and standards __New implementation__ is welcome! For example, new solutions for a problem, different representations for a graph data structure or algorithm designs with different complexity but __identical implementation__ of an existing implementation is not allowed. Please check whether the solution is already implemented or not before submitting your pull request. __Improving comments__ and __writing proper tests__ are also highly welcome. ### Contribution We appreciate any contribution, from fixing a grammar mistake in a comment to implementing complex algorithms. Please read this section if you are contributing your work. Your contribution will be tested by our [automated testing on GitHub Actions](https://github.com/TheAlgorithms/Python/actions) to save time and mental energy. After you have submitted your pull request, you should see the GitHub Actions tests start to run at the bottom of your submission page. If those tests fail, then click on the ___details___ button try to read through the GitHub Actions output to understand the failure. If you do not understand, please leave a comment on your submission page and a community member will try to help. Please help us keep our issue list small by adding fixes: #{$ISSUE_NO} to the commit message of pull requests that resolve open issues. GitHub will use this tag to auto-close the issue when the PR is merged. #### What is an Algorithm? An Algorithm is one or more functions (or classes) that: * take one or more inputs, * perform some internal calculations or data manipulations, * return one or more outputs, * have minimal side effects (Ex. `print()`, `plot()`, `read()`, `write()`). Algorithms should be packaged in a way that would make it easy for readers to put them into larger programs. Algorithms should: * have intuitive class and function names that make their purpose clear to readers * use Python naming conventions and intuitive variable names to ease comprehension * be flexible to take different input values * have Python type hints for their input parameters and return values * raise Python exceptions (`ValueError`, etc.) on erroneous input values * have docstrings with clear explanations and/or URLs to source materials * contain doctests that test both valid and erroneous input values * return all calculation results instead of printing or plotting them Algorithms in this repo should not be how-to examples for existing Python packages. Instead, they should perform internal calculations or manipulations to convert input values into different output values. 
Those calculations or manipulations can use data types, classes, or functions of existing Python packages but each algorithm in this repo should add unique value. #### Pre-commit plugin Use [pre-commit](https://pre-commit.com/#installation) to automatically format your code to match our coding style: ```bash python3 -m pip install pre-commit # only required the first time pre-commit install ``` That's it! The plugin will run every time you commit any changes. If there are any errors found during the run, fix them and commit those changes. You can even run the plugin manually on all files: ```bash pre-commit run --all-files --show-diff-on-failure ``` #### Coding Style We want your work to be readable by others; therefore, we encourage you to note the following: - Please write in Python 3.11+. For instance: `print()` is a function in Python 3 so `print "Hello"` will *not* work but `print("Hello")` will. - Please focus hard on the naming of functions, classes, and variables. Help your reader by using __descriptive names__ that can help you to remove redundant comments. - Single letter variable names are *old school* so please avoid them unless their life only spans a few lines. - Expand acronyms because `gcd()` is hard to understand but `greatest_common_divisor()` is not. - Please follow the [Python Naming Conventions](https://pep8.org/#prescriptive-naming-conventions) so variable_names and function_names should be lower_case, CONSTANTS in UPPERCASE, ClassNames should be CamelCase, etc. - We encourage the use of Python [f-strings](https://realpython.com/python-f-strings/#f-strings-a-new-and-improved-way-to-format-strings-in-python) where they make the code easier to read. - Please consider running [__psf/black__](https://github.com/python/black) on your Python file(s) before submitting your pull request. This is not yet a requirement but it does make your code more readable and automatically aligns it with much of [PEP 8](https://www.python.org/dev/peps/pep-0008/). There are other code formatters (autopep8, yapf) but the __black__ formatter is now hosted by the Python Software Foundation. To use it, ```bash python3 -m pip install black # only required the first time black . ``` - All submissions will need to pass the test `ruff .` before they will be accepted so if possible, try this test locally on your Python file(s) before submitting your pull request. ```bash python3 -m pip install ruff # only required the first time ruff . ``` - Original code submission require docstrings or comments to describe your work. - More on docstrings and comments: If you used a Wikipedia article or some other source material to create your algorithm, please add the URL in a docstring or comment to help your reader. The following are considered to be bad and may be requested to be improved: ```python x = x + 2 # increased by 2 ``` This is too trivial. Comments are expected to be explanatory. For comments, you can write them above, on or below a line of code, as long as you are consistent within the same piece of code. We encourage you to put docstrings inside your functions but please pay attention to the indentation of docstrings. The following is a good example: ```python def sum_ab(a, b): """ Return the sum of two integers a and b. """ return a + b ``` - Write tests (especially [__doctests__](https://docs.python.org/3/library/doctest.html)) to illustrate and verify your work. We highly encourage the use of _doctests on all functions_. 
```python def sum_ab(a, b): """ Return the sum of two integers a and b >>> sum_ab(2, 2) 4 >>> sum_ab(-2, 3) 1 >>> sum_ab(4.9, 5.1) 10.0 """ return a + b ``` These doctests will be run by pytest as part of our automated testing so please try to run your doctests locally and make sure that they are found and pass: ```bash python3 -m doctest -v my_submission.py ``` The use of the Python builtin `input()` function is __not__ encouraged: ```python input('Enter your input:') # Or even worse... input = eval(input("Enter your input: ")) ``` However, if your code uses `input()` then we encourage you to gracefully deal with leading and trailing whitespace in user input by adding `.strip()` as in: ```python starting_value = int(input("Please enter a starting value: ").strip()) ``` The use of [Python type hints](https://docs.python.org/3/library/typing.html) is encouraged for function parameters and return values. Our automated testing will run [mypy](http://mypy-lang.org) so run that locally before making your submission. ```python def sum_ab(a: int, b: int) -> int: return a + b ``` Instructions on how to install mypy can be found [here](https://github.com/python/mypy). Please use the command `mypy --ignore-missing-imports .` to test all files or `mypy --ignore-missing-imports path/to/file.py` to test a specific file. - [__List comprehensions and generators__](https://docs.python.org/3/tutorial/datastructures.html#list-comprehensions) are preferred over the use of `lambda`, `map`, `filter`, `reduce` but the important thing is to demonstrate the power of Python in code that is easy to read and maintain. - Avoid importing external libraries for basic algorithms. Only use those libraries for complicated algorithms. - If you need a third-party module that is not in the file __requirements.txt__, please add it to that file as part of your submission. #### Other Requirements for Submissions - If you are submitting code in the `project_euler/` directory, please also read [the dedicated Guideline](https://github.com/TheAlgorithms/Python/blob/master/project_euler/README.md) before contributing to our Project Euler library. - The file extension for code files should be `.py`. Jupyter Notebooks should be submitted to [TheAlgorithms/Jupyter](https://github.com/TheAlgorithms/Jupyter). - Strictly use snake_case (underscore_separated) in your file_name, as it will be easy to parse in future using scripts. - Please avoid creating new directories if at all possible. Try to fit your work into the existing directory structure. - If possible, follow the standard *within* the folder you are submitting to. - If you have modified/added code work, make sure the code compiles before submitting. - If you have modified/added documentation work, ensure your language is concise and contains no grammar errors. - Do not update the README.md or DIRECTORY.md file which will be periodically autogenerated by our GitHub Actions processes. - Add a corresponding explanation to [Algorithms-Explanation](https://github.com/TheAlgorithms/Algorithms-Explanation) (Optional but recommended). - All submissions will be tested with [__mypy__](http://www.mypy-lang.org) so we encourage you to add [__Python type hints__](https://docs.python.org/3/library/typing.html) where it makes sense to do so. - Most importantly, - __Be consistent in the use of these guidelines when submitting.__ - __Join__ us on [Discord](https://discord.com/invite/c7MnfGFGa6) and [Gitter](https://gitter.im/TheAlgorithms/community) __now!__ - Happy coding! 
Writer [@poyea](https://github.com/poyea), Jun 2019.
-1
TheAlgorithms/Python
8,685
Solving the `Top k most frequent words` problem using a max-heap
Contributed by @aparibocci in #8125 * [x] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
cclauss
"2023-04-23T22:13:44Z"
"2023-04-27T17:32:07Z"
c1b3ea5355266bb47daba378ca10200c4d359453
4c1f876567673db0934ba65d662ea221465ec921
Solving the `Top k most frequent words` problem using a max-heap. Contributed by @aparibocci in #8125 * [x] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
-1
TheAlgorithms/Python
8,685
Solving the `Top k most frequent words` problem using a max-heap
Contributed be @aparibocci in #8125 * [x] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
cclauss
"2023-04-23T22:13:44Z"
"2023-04-27T17:32:07Z"
c1b3ea5355266bb47daba378ca10200c4d359453
4c1f876567673db0934ba65d662ea221465ec921
Solving the `Top k most frequent words` problem using a max-heap. Contributed be @aparibocci in #8125 * [x] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
from decimal import Decimal, getcontext from math import ceil, factorial def pi(precision: int) -> str: """ The Chudnovsky algorithm is a fast method for calculating the digits of PI, based on Ramanujan’s PI formulae. https://en.wikipedia.org/wiki/Chudnovsky_algorithm PI = constant_term / ((multinomial_term * linear_term) / exponential_term) where constant_term = 426880 * sqrt(10005) The linear_term and the exponential_term can be defined iteratively as follows: L_k+1 = L_k + 545140134 where L_0 = 13591409 X_k+1 = X_k * -262537412640768000 where X_0 = 1 The multinomial_term is defined as follows: 6k! / ((3k)! * (k!) ^ 3) where k is the k_th iteration. This algorithm correctly calculates around 14 digits of PI per iteration >>> pi(10) '3.14159265' >>> pi(100) '3.14159265358979323846264338327950288419716939937510582097494459230781640628620899862803482534211706' >>> pi('hello') Traceback (most recent call last): ... TypeError: Undefined for non-integers >>> pi(-1) Traceback (most recent call last): ... ValueError: Undefined for non-natural numbers """ if not isinstance(precision, int): raise TypeError("Undefined for non-integers") elif precision < 1: raise ValueError("Undefined for non-natural numbers") getcontext().prec = precision num_iterations = ceil(precision / 14) constant_term = 426880 * Decimal(10005).sqrt() exponential_term = 1 linear_term = 13591409 partial_sum = Decimal(linear_term) for k in range(1, num_iterations): multinomial_term = factorial(6 * k) // (factorial(3 * k) * factorial(k) ** 3) linear_term += 545140134 exponential_term *= -262537412640768000 partial_sum += Decimal(multinomial_term * linear_term) / exponential_term return str(constant_term / partial_sum)[:-1] if __name__ == "__main__": n = 50 print(f"The first {n} digits of pi is: {pi(n)}")
from decimal import Decimal, getcontext from math import ceil, factorial def pi(precision: int) -> str: """ The Chudnovsky algorithm is a fast method for calculating the digits of PI, based on Ramanujan’s PI formulae. https://en.wikipedia.org/wiki/Chudnovsky_algorithm PI = constant_term / ((multinomial_term * linear_term) / exponential_term) where constant_term = 426880 * sqrt(10005) The linear_term and the exponential_term can be defined iteratively as follows: L_k+1 = L_k + 545140134 where L_0 = 13591409 X_k+1 = X_k * -262537412640768000 where X_0 = 1 The multinomial_term is defined as follows: 6k! / ((3k)! * (k!) ^ 3) where k is the k_th iteration. This algorithm correctly calculates around 14 digits of PI per iteration >>> pi(10) '3.14159265' >>> pi(100) '3.14159265358979323846264338327950288419716939937510582097494459230781640628620899862803482534211706' >>> pi('hello') Traceback (most recent call last): ... TypeError: Undefined for non-integers >>> pi(-1) Traceback (most recent call last): ... ValueError: Undefined for non-natural numbers """ if not isinstance(precision, int): raise TypeError("Undefined for non-integers") elif precision < 1: raise ValueError("Undefined for non-natural numbers") getcontext().prec = precision num_iterations = ceil(precision / 14) constant_term = 426880 * Decimal(10005).sqrt() exponential_term = 1 linear_term = 13591409 partial_sum = Decimal(linear_term) for k in range(1, num_iterations): multinomial_term = factorial(6 * k) // (factorial(3 * k) * factorial(k) ** 3) linear_term += 545140134 exponential_term *= -262537412640768000 partial_sum += Decimal(multinomial_term * linear_term) / exponential_term return str(constant_term / partial_sum)[:-1] if __name__ == "__main__": n = 50 print(f"The first {n} digits of pi is: {pi(n)}")
-1
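The before/after content in the record above is a Chudnovsky-series `pi(precision)` implementation. As a minimal, self-contained sketch (none of the names below come from the record), the k = 0 term of the series alone already reproduces pi to roughly 13 digits, which is why the stored code only needs `ceil(precision / 14)` iterations:

```python
# Hedged sketch: a single term of the Chudnovsky series gives ~13-14 correct digits.
from decimal import Decimal, getcontext
from math import pi as math_pi

getcontext().prec = 30
constant_term = 426880 * Decimal(10005).sqrt()  # equals 640320**1.5 / 12
one_term = constant_term / Decimal(13591409)    # k = 0: multinomial and exponential terms are both 1
print(str(one_term)[:15])  # 3.1415926535897...
print(str(math_pi)[:15])   # the same ~14-character prefix
```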
TheAlgorithms/Python
8,685
Solving the `Top k most frequent words` problem using a max-heap
Contributed be @aparibocci in #8125 * [x] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
cclauss
"2023-04-23T22:13:44Z"
"2023-04-27T17:32:07Z"
c1b3ea5355266bb47daba378ca10200c4d359453
4c1f876567673db0934ba65d662ea221465ec921
Solving the `Top k most frequent words` problem using a max-heap. Contributed be @aparibocci in #8125 * [x] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
""" Project Euler Problem 115: https://projecteuler.net/problem=115 NOTE: This is a more difficult version of Problem 114 (https://projecteuler.net/problem=114). A row measuring n units in length has red blocks with a minimum length of m units placed on it, such that any two red blocks (which are allowed to be different lengths) are separated by at least one black square. Let the fill-count function, F(m, n), represent the number of ways that a row can be filled. For example, F(3, 29) = 673135 and F(3, 30) = 1089155. That is, for m = 3, it can be seen that n = 30 is the smallest value for which the fill-count function first exceeds one million. In the same way, for m = 10, it can be verified that F(10, 56) = 880711 and F(10, 57) = 1148904, so n = 57 is the least value for which the fill-count function first exceeds one million. For m = 50, find the least value of n for which the fill-count function first exceeds one million. """ from itertools import count def solution(min_block_length: int = 50) -> int: """ Returns for given minimum block length the least value of n for which the fill-count function first exceeds one million >>> solution(3) 30 >>> solution(10) 57 """ fill_count_functions = [1] * min_block_length for n in count(min_block_length): fill_count_functions.append(1) for block_length in range(min_block_length, n + 1): for block_start in range(n - block_length): fill_count_functions[n] += fill_count_functions[ n - block_start - block_length - 1 ] fill_count_functions[n] += 1 if fill_count_functions[n] > 1_000_000: break return n if __name__ == "__main__": print(f"{solution() = }")
""" Project Euler Problem 115: https://projecteuler.net/problem=115 NOTE: This is a more difficult version of Problem 114 (https://projecteuler.net/problem=114). A row measuring n units in length has red blocks with a minimum length of m units placed on it, such that any two red blocks (which are allowed to be different lengths) are separated by at least one black square. Let the fill-count function, F(m, n), represent the number of ways that a row can be filled. For example, F(3, 29) = 673135 and F(3, 30) = 1089155. That is, for m = 3, it can be seen that n = 30 is the smallest value for which the fill-count function first exceeds one million. In the same way, for m = 10, it can be verified that F(10, 56) = 880711 and F(10, 57) = 1148904, so n = 57 is the least value for which the fill-count function first exceeds one million. For m = 50, find the least value of n for which the fill-count function first exceeds one million. """ from itertools import count def solution(min_block_length: int = 50) -> int: """ Returns for given minimum block length the least value of n for which the fill-count function first exceeds one million >>> solution(3) 30 >>> solution(10) 57 """ fill_count_functions = [1] * min_block_length for n in count(min_block_length): fill_count_functions.append(1) for block_length in range(min_block_length, n + 1): for block_start in range(n - block_length): fill_count_functions[n] += fill_count_functions[ n - block_start - block_length - 1 ] fill_count_functions[n] += 1 if fill_count_functions[n] > 1_000_000: break return n if __name__ == "__main__": print(f"{solution() = }")
-1
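The fill-count function F(m, n) in the record above can also be computed with a direct recurrence: a row either starts with a black cell, or with a red block of length L >= m followed by one mandatory black separator. A hedged, independent sketch (function names are illustrative only, not taken from the stored solution):

```python
# Minimal sketch of the recurrence:
#   F(m, n) = F(m, n - 1)                                  (first cell left black)
#           + sum over block lengths L in [m, n] of
#               1            if the block fills the whole row (L == n)
#               F(m, n-L-1)  otherwise (block plus one black separator cell)
from functools import lru_cache


def fill_count(min_block_length: int, row_length: int) -> int:
    @lru_cache(maxsize=None)
    def f(n: int) -> int:
        if n < min_block_length:
            return 1  # too short for any red block: only the all-black row
        total = f(n - 1)
        for block_length in range(min_block_length, n + 1):
            total += 1 if block_length == n else f(n - block_length - 1)
        return total

    return f(row_length)


print(fill_count(3, 7))   # 17, the count quoted in Project Euler problem 114
print(fill_count(3, 29))  # expected 673135 per the docstring above
print(fill_count(3, 30))  # expected 1089155
```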
TheAlgorithms/Python
8,685
Solving the `Top k most frequent words` problem using a max-heap
Contributed be @aparibocci in #8125 * [x] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
cclauss
"2023-04-23T22:13:44Z"
"2023-04-27T17:32:07Z"
c1b3ea5355266bb47daba378ca10200c4d359453
4c1f876567673db0934ba65d662ea221465ec921
Solving the `Top k most frequent words` problem using a max-heap. Contributed be @aparibocci in #8125 * [x] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
""" This is pure Python implementation of counting sort algorithm For doctests run following command: python -m doctest -v counting_sort.py or python3 -m doctest -v counting_sort.py For manual testing run: python counting_sort.py """ def counting_sort(collection): """Pure implementation of counting sort algorithm in Python :param collection: some mutable ordered collection with heterogeneous comparable items inside :return: the same collection ordered by ascending Examples: >>> counting_sort([0, 5, 3, 2, 2]) [0, 2, 2, 3, 5] >>> counting_sort([]) [] >>> counting_sort([-2, -5, -45]) [-45, -5, -2] """ # if the collection is empty, returns empty if collection == []: return [] # get some information about the collection coll_len = len(collection) coll_max = max(collection) coll_min = min(collection) # create the counting array counting_arr_length = coll_max + 1 - coll_min counting_arr = [0] * counting_arr_length # count how much a number appears in the collection for number in collection: counting_arr[number - coll_min] += 1 # sum each position with it's predecessors. now, counting_arr[i] tells # us how many elements <= i has in the collection for i in range(1, counting_arr_length): counting_arr[i] = counting_arr[i] + counting_arr[i - 1] # create the output collection ordered = [0] * coll_len # place the elements in the output, respecting the original order (stable # sort) from end to begin, updating counting_arr for i in reversed(range(0, coll_len)): ordered[counting_arr[collection[i] - coll_min] - 1] = collection[i] counting_arr[collection[i] - coll_min] -= 1 return ordered def counting_sort_string(string): """ >>> counting_sort_string("thisisthestring") 'eghhiiinrsssttt' """ return "".join([chr(i) for i in counting_sort([ord(c) for c in string])]) if __name__ == "__main__": # Test string sort assert counting_sort_string("thisisthestring") == "eghhiiinrsssttt" user_input = input("Enter numbers separated by a comma:\n").strip() unsorted = [int(item) for item in user_input.split(",")] print(counting_sort(unsorted))
""" This is pure Python implementation of counting sort algorithm For doctests run following command: python -m doctest -v counting_sort.py or python3 -m doctest -v counting_sort.py For manual testing run: python counting_sort.py """ def counting_sort(collection): """Pure implementation of counting sort algorithm in Python :param collection: some mutable ordered collection with heterogeneous comparable items inside :return: the same collection ordered by ascending Examples: >>> counting_sort([0, 5, 3, 2, 2]) [0, 2, 2, 3, 5] >>> counting_sort([]) [] >>> counting_sort([-2, -5, -45]) [-45, -5, -2] """ # if the collection is empty, returns empty if collection == []: return [] # get some information about the collection coll_len = len(collection) coll_max = max(collection) coll_min = min(collection) # create the counting array counting_arr_length = coll_max + 1 - coll_min counting_arr = [0] * counting_arr_length # count how much a number appears in the collection for number in collection: counting_arr[number - coll_min] += 1 # sum each position with it's predecessors. now, counting_arr[i] tells # us how many elements <= i has in the collection for i in range(1, counting_arr_length): counting_arr[i] = counting_arr[i] + counting_arr[i - 1] # create the output collection ordered = [0] * coll_len # place the elements in the output, respecting the original order (stable # sort) from end to begin, updating counting_arr for i in reversed(range(0, coll_len)): ordered[counting_arr[collection[i] - coll_min] - 1] = collection[i] counting_arr[collection[i] - coll_min] -= 1 return ordered def counting_sort_string(string): """ >>> counting_sort_string("thisisthestring") 'eghhiiinrsssttt' """ return "".join([chr(i) for i in counting_sort([ord(c) for c in string])]) if __name__ == "__main__": # Test string sort assert counting_sort_string("thisisthestring") == "eghhiiinrsssttt" user_input = input("Enter numbers separated by a comma:\n").strip() unsorted = [int(item) for item in user_input.split(",")] print(counting_sort(unsorted))
-1
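The record above stores a counting sort. A minimal sketch of its two intermediate arrays for the docstring input `[0, 5, 3, 2, 2]` (variable names are illustrative, not taken from the stored file):

```python
# Hedged sketch: the counting array and its prefix sums behind counting_sort([0, 5, 3, 2, 2]).
collection = [0, 5, 3, 2, 2]
coll_min, coll_max = min(collection), max(collection)

counts = [0] * (coll_max + 1 - coll_min)
for number in collection:
    counts[number - coll_min] += 1
print(counts)  # [1, 0, 2, 1, 0, 1]  -> one 0, two 2s, one 3, one 5

prefix = counts[:]
for i in range(1, len(prefix)):
    prefix[i] += prefix[i - 1]
print(prefix)  # [1, 1, 3, 4, 4, 5]  -> prefix[v] = how many elements are <= v + coll_min
```

Scanning the input right to left and placing each element at index `prefix[value - coll_min] - 1` is what keeps equal elements in their original order, i.e. what makes the sort stable.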
TheAlgorithms/Python
8,685
Solving the `Top k most frequent words` problem using a max-heap
Contributed be @aparibocci in #8125 * [x] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
cclauss
"2023-04-23T22:13:44Z"
"2023-04-27T17:32:07Z"
c1b3ea5355266bb47daba378ca10200c4d359453
4c1f876567673db0934ba65d662ea221465ec921
Solving the `Top k most frequent words` problem using a max-heap. Contributed be @aparibocci in #8125 * [x] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
""" This is a Python implementation of the levenshtein distance. Levenshtein distance is a string metric for measuring the difference between two sequences. For doctests run following command: python -m doctest -v levenshtein-distance.py or python3 -m doctest -v levenshtein-distance.py For manual testing run: python levenshtein-distance.py """ def levenshtein_distance(first_word: str, second_word: str) -> int: """Implementation of the levenshtein distance in Python. :param first_word: the first word to measure the difference. :param second_word: the second word to measure the difference. :return: the levenshtein distance between the two words. Examples: >>> levenshtein_distance("planet", "planetary") 3 >>> levenshtein_distance("", "test") 4 >>> levenshtein_distance("book", "back") 2 >>> levenshtein_distance("book", "book") 0 >>> levenshtein_distance("test", "") 4 >>> levenshtein_distance("", "") 0 >>> levenshtein_distance("orchestration", "container") 10 """ # The longer word should come first if len(first_word) < len(second_word): return levenshtein_distance(second_word, first_word) if len(second_word) == 0: return len(first_word) previous_row = list(range(len(second_word) + 1)) for i, c1 in enumerate(first_word): current_row = [i + 1] for j, c2 in enumerate(second_word): # Calculate insertions, deletions and substitutions insertions = previous_row[j + 1] + 1 deletions = current_row[j] + 1 substitutions = previous_row[j] + (c1 != c2) # Get the minimum to append to the current row current_row.append(min(insertions, deletions, substitutions)) # Store the previous row previous_row = current_row # Returns the last element (distance) return previous_row[-1] if __name__ == "__main__": first_word = input("Enter the first word:\n").strip() second_word = input("Enter the second word:\n").strip() result = levenshtein_distance(first_word, second_word) print(f"Levenshtein distance between {first_word} and {second_word} is {result}")
""" This is a Python implementation of the levenshtein distance. Levenshtein distance is a string metric for measuring the difference between two sequences. For doctests run following command: python -m doctest -v levenshtein-distance.py or python3 -m doctest -v levenshtein-distance.py For manual testing run: python levenshtein-distance.py """ def levenshtein_distance(first_word: str, second_word: str) -> int: """Implementation of the levenshtein distance in Python. :param first_word: the first word to measure the difference. :param second_word: the second word to measure the difference. :return: the levenshtein distance between the two words. Examples: >>> levenshtein_distance("planet", "planetary") 3 >>> levenshtein_distance("", "test") 4 >>> levenshtein_distance("book", "back") 2 >>> levenshtein_distance("book", "book") 0 >>> levenshtein_distance("test", "") 4 >>> levenshtein_distance("", "") 0 >>> levenshtein_distance("orchestration", "container") 10 """ # The longer word should come first if len(first_word) < len(second_word): return levenshtein_distance(second_word, first_word) if len(second_word) == 0: return len(first_word) previous_row = list(range(len(second_word) + 1)) for i, c1 in enumerate(first_word): current_row = [i + 1] for j, c2 in enumerate(second_word): # Calculate insertions, deletions and substitutions insertions = previous_row[j + 1] + 1 deletions = current_row[j] + 1 substitutions = previous_row[j] + (c1 != c2) # Get the minimum to append to the current row current_row.append(min(insertions, deletions, substitutions)) # Store the previous row previous_row = current_row # Returns the last element (distance) return previous_row[-1] if __name__ == "__main__": first_word = input("Enter the first word:\n").strip() second_word = input("Enter the second word:\n").strip() result = levenshtein_distance(first_word, second_word) print(f"Levenshtein distance between {first_word} and {second_word} is {result}")
-1
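The record above stores the row-by-row Levenshtein implementation. As a hedged illustration (independent of the stored function), the same answer can be read off the full dynamic-programming table, shown here for the doctest pair `("book", "back")`:

```python
# Minimal sketch: build and print the full edit-distance table for "book" vs "back".
first_word, second_word = "book", "back"
rows, cols = len(first_word) + 1, len(second_word) + 1
table = [[0] * cols for _ in range(rows)]
for i in range(rows):
    table[i][0] = i  # i deletions turn a prefix of first_word into ""
for j in range(cols):
    table[0][j] = j  # j insertions turn "" into a prefix of second_word
for i in range(1, rows):
    for j in range(1, cols):
        cost = 0 if first_word[i - 1] == second_word[j - 1] else 1
        table[i][j] = min(
            table[i - 1][j] + 1,         # deletion
            table[i][j - 1] + 1,         # insertion
            table[i - 1][j - 1] + cost,  # substitution (or free match)
        )
for row in table:
    print(row)
print(table[-1][-1])  # 2, matching the doctest above
```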
TheAlgorithms/Python
8,685
Solving the `Top k most frequent words` problem using a max-heap
Contributed be @aparibocci in #8125 * [x] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
cclauss
"2023-04-23T22:13:44Z"
"2023-04-27T17:32:07Z"
c1b3ea5355266bb47daba378ca10200c4d359453
4c1f876567673db0934ba65d662ea221465ec921
Solving the `Top k most frequent words` problem using a max-heap. Contributed be @aparibocci in #8125 * [x] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
import math class SegmentTree: def __init__(self, a): self.N = len(a) self.st = [0] * ( 4 * self.N ) # approximate the overall size of segment tree with array N self.build(1, 0, self.N - 1) def left(self, idx): return idx * 2 def right(self, idx): return idx * 2 + 1 def build(self, idx, l, r): # noqa: E741 if l == r: self.st[idx] = A[l] else: mid = (l + r) // 2 self.build(self.left(idx), l, mid) self.build(self.right(idx), mid + 1, r) self.st[idx] = max(self.st[self.left(idx)], self.st[self.right(idx)]) def update(self, a, b, val): return self.update_recursive(1, 0, self.N - 1, a - 1, b - 1, val) def update_recursive(self, idx, l, r, a, b, val): # noqa: E741 """ update(1, 1, N, a, b, v) for update val v to [a,b] """ if r < a or l > b: return True if l == r: self.st[idx] = val return True mid = (l + r) // 2 self.update_recursive(self.left(idx), l, mid, a, b, val) self.update_recursive(self.right(idx), mid + 1, r, a, b, val) self.st[idx] = max(self.st[self.left(idx)], self.st[self.right(idx)]) return True def query(self, a, b): return self.query_recursive(1, 0, self.N - 1, a - 1, b - 1) def query_recursive(self, idx, l, r, a, b): # noqa: E741 """ query(1, 1, N, a, b) for query max of [a,b] """ if r < a or l > b: return -math.inf if l >= a and r <= b: return self.st[idx] mid = (l + r) // 2 q1 = self.query_recursive(self.left(idx), l, mid, a, b) q2 = self.query_recursive(self.right(idx), mid + 1, r, a, b) return max(q1, q2) def show_data(self): show_list = [] for i in range(1, N + 1): show_list += [self.query(i, i)] print(show_list) if __name__ == "__main__": A = [1, 2, -4, 7, 3, -5, 6, 11, -20, 9, 14, 15, 5, 2, -8] N = 15 segt = SegmentTree(A) print(segt.query(4, 6)) print(segt.query(7, 11)) print(segt.query(7, 12)) segt.update(1, 3, 111) print(segt.query(1, 15)) segt.update(7, 8, 235) segt.show_data()
import math class SegmentTree: def __init__(self, a): self.N = len(a) self.st = [0] * ( 4 * self.N ) # approximate the overall size of segment tree with array N self.build(1, 0, self.N - 1) def left(self, idx): return idx * 2 def right(self, idx): return idx * 2 + 1 def build(self, idx, l, r): # noqa: E741 if l == r: self.st[idx] = A[l] else: mid = (l + r) // 2 self.build(self.left(idx), l, mid) self.build(self.right(idx), mid + 1, r) self.st[idx] = max(self.st[self.left(idx)], self.st[self.right(idx)]) def update(self, a, b, val): return self.update_recursive(1, 0, self.N - 1, a - 1, b - 1, val) def update_recursive(self, idx, l, r, a, b, val): # noqa: E741 """ update(1, 1, N, a, b, v) for update val v to [a,b] """ if r < a or l > b: return True if l == r: self.st[idx] = val return True mid = (l + r) // 2 self.update_recursive(self.left(idx), l, mid, a, b, val) self.update_recursive(self.right(idx), mid + 1, r, a, b, val) self.st[idx] = max(self.st[self.left(idx)], self.st[self.right(idx)]) return True def query(self, a, b): return self.query_recursive(1, 0, self.N - 1, a - 1, b - 1) def query_recursive(self, idx, l, r, a, b): # noqa: E741 """ query(1, 1, N, a, b) for query max of [a,b] """ if r < a or l > b: return -math.inf if l >= a and r <= b: return self.st[idx] mid = (l + r) // 2 q1 = self.query_recursive(self.left(idx), l, mid, a, b) q2 = self.query_recursive(self.right(idx), mid + 1, r, a, b) return max(q1, q2) def show_data(self): show_list = [] for i in range(1, N + 1): show_list += [self.query(i, i)] print(show_list) if __name__ == "__main__": A = [1, 2, -4, 7, 3, -5, 6, 11, -20, 9, 14, 15, 5, 2, -8] N = 15 segt = SegmentTree(A) print(segt.query(4, 6)) print(segt.query(7, 11)) print(segt.query(7, 12)) segt.update(1, 3, 111) print(segt.query(1, 15)) segt.update(7, 8, 235) segt.show_data()
-1
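The segment tree in the record above reads the module-level `A` inside `build` and takes 1-based query indices, so importing it directly is awkward. The sketch below (independent, with illustrative names) only demonstrates the implicit 1-indexed layout it relies on, where node `idx` has children `2 * idx` and `2 * idx + 1` and stores the maximum of its range:

```python
# Hedged sketch of the 1-indexed array layout used by the stored SegmentTree.
a = [1, 2, -4, 7]
st = [0] * (4 * len(a))


def build(idx: int, lo: int, hi: int) -> None:
    if lo == hi:
        st[idx] = a[lo]
    else:
        mid = (lo + hi) // 2
        build(2 * idx, lo, mid)
        build(2 * idx + 1, mid + 1, hi)
        st[idx] = max(st[2 * idx], st[2 * idx + 1])


build(1, 0, len(a) - 1)
print(st[1])         # 7 -> the root holds max(a)
print(st[2], st[3])  # 2, 7 -> maxima of a[0:2] and a[2:4]
```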
TheAlgorithms/Python
8,685
Solving the `Top k most frequent words` problem using a max-heap
Contributed be @aparibocci in #8125 * [x] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
cclauss
"2023-04-23T22:13:44Z"
"2023-04-27T17:32:07Z"
c1b3ea5355266bb47daba378ca10200c4d359453
4c1f876567673db0934ba65d662ea221465ec921
Solving the `Top k most frequent words` problem using a max-heap. Contributed be @aparibocci in #8125 * [x] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
""" Simulate the evolution of a highway with only one road that is a loop. The highway is divided in cells, each cell can have at most one car in it. The highway is a loop so when a car comes to one end, it will come out on the other. Each car is represented by its speed (from 0 to 5). Some information about speed: -1 means that the cell on the highway is empty 0 to 5 are the speed of the cars with 0 being the lowest and 5 the highest highway: list[int] Where every position and speed of every car will be stored probability The probability that a driver will slow down initial_speed The speed of the cars a the start frequency How many cells there are between two cars at the start max_speed The maximum speed a car can go to number_of_cells How many cell are there in the highway number_of_update How many times will the position be updated More information here: https://en.wikipedia.org/wiki/Nagel%E2%80%93Schreckenberg_model Examples for doctest: >>> simulate(construct_highway(6, 3, 0), 2, 0, 2) [[0, -1, -1, 0, -1, -1], [-1, 1, -1, -1, 1, -1], [-1, -1, 1, -1, -1, 1]] >>> simulate(construct_highway(5, 2, -2), 3, 0, 2) [[0, -1, 0, -1, 0], [0, -1, 0, -1, -1], [0, -1, -1, 1, -1], [-1, 1, -1, 0, -1]] """ from random import randint, random def construct_highway( number_of_cells: int, frequency: int, initial_speed: int, random_frequency: bool = False, random_speed: bool = False, max_speed: int = 5, ) -> list: """ Build the highway following the parameters given >>> construct_highway(10, 2, 6) [[6, -1, 6, -1, 6, -1, 6, -1, 6, -1]] >>> construct_highway(10, 10, 2) [[2, -1, -1, -1, -1, -1, -1, -1, -1, -1]] """ highway = [[-1] * number_of_cells] # Create a highway without any car i = 0 initial_speed = max(initial_speed, 0) while i < number_of_cells: highway[0][i] = ( randint(0, max_speed) if random_speed else initial_speed ) # Place the cars i += ( randint(1, max_speed * 2) if random_frequency else frequency ) # Arbitrary number, may need tuning return highway def get_distance(highway_now: list, car_index: int) -> int: """ Get the distance between a car (at index car_index) and the next car >>> get_distance([6, -1, 6, -1, 6], 2) 1 >>> get_distance([2, -1, -1, -1, 3, 1, 0, 1, 3, 2], 0) 3 >>> get_distance([-1, -1, -1, -1, 2, -1, -1, -1, 3], -1) 4 """ distance = 0 cells = highway_now[car_index + 1 :] for cell in range(len(cells)): # May need a better name for this if cells[cell] != -1: # If the cell is not empty then return distance # we have the distance we wanted distance += 1 # Here if the car is near the end of the highway return distance + get_distance(highway_now, -1) def update(highway_now: list, probability: float, max_speed: int) -> list: """ Update the speed of the cars >>> update([-1, -1, -1, -1, -1, 2, -1, -1, -1, -1, 3], 0.0, 5) [-1, -1, -1, -1, -1, 3, -1, -1, -1, -1, 4] >>> update([-1, -1, 2, -1, -1, -1, -1, 3], 0.0, 5) [-1, -1, 3, -1, -1, -1, -1, 1] """ number_of_cells = len(highway_now) # Beforce calculations, the highway is empty next_highway = [-1] * number_of_cells for car_index in range(number_of_cells): if highway_now[car_index] != -1: # Add 1 to the current speed of the car and cap the speed next_highway[car_index] = min(highway_now[car_index] + 1, max_speed) # Number of empty cell before the next car dn = get_distance(highway_now, car_index) - 1 # We can't have the car causing an accident next_highway[car_index] = min(next_highway[car_index], dn) if random() < probability: # Randomly, a driver will slow down next_highway[car_index] = max(next_highway[car_index] - 1, 0) return 
next_highway def simulate( highway: list, number_of_update: int, probability: float, max_speed: int ) -> list: """ The main function, it will simulate the evolution of the highway >>> simulate([[-1, 2, -1, -1, -1, 3]], 2, 0.0, 3) [[-1, 2, -1, -1, -1, 3], [-1, -1, -1, 2, -1, 0], [1, -1, -1, 0, -1, -1]] >>> simulate([[-1, 2, -1, 3]], 4, 0.0, 3) [[-1, 2, -1, 3], [-1, 0, -1, 0], [-1, 0, -1, 0], [-1, 0, -1, 0], [-1, 0, -1, 0]] """ number_of_cells = len(highway[0]) for i in range(number_of_update): next_speeds_calculated = update(highway[i], probability, max_speed) real_next_speeds = [-1] * number_of_cells for car_index in range(number_of_cells): speed = next_speeds_calculated[car_index] if speed != -1: # Change the position based on the speed (with % to create the loop) index = (car_index + speed) % number_of_cells # Commit the change of position real_next_speeds[index] = speed highway.append(real_next_speeds) return highway if __name__ == "__main__": import doctest doctest.testmod()
""" Simulate the evolution of a highway with only one road that is a loop. The highway is divided in cells, each cell can have at most one car in it. The highway is a loop so when a car comes to one end, it will come out on the other. Each car is represented by its speed (from 0 to 5). Some information about speed: -1 means that the cell on the highway is empty 0 to 5 are the speed of the cars with 0 being the lowest and 5 the highest highway: list[int] Where every position and speed of every car will be stored probability The probability that a driver will slow down initial_speed The speed of the cars a the start frequency How many cells there are between two cars at the start max_speed The maximum speed a car can go to number_of_cells How many cell are there in the highway number_of_update How many times will the position be updated More information here: https://en.wikipedia.org/wiki/Nagel%E2%80%93Schreckenberg_model Examples for doctest: >>> simulate(construct_highway(6, 3, 0), 2, 0, 2) [[0, -1, -1, 0, -1, -1], [-1, 1, -1, -1, 1, -1], [-1, -1, 1, -1, -1, 1]] >>> simulate(construct_highway(5, 2, -2), 3, 0, 2) [[0, -1, 0, -1, 0], [0, -1, 0, -1, -1], [0, -1, -1, 1, -1], [-1, 1, -1, 0, -1]] """ from random import randint, random def construct_highway( number_of_cells: int, frequency: int, initial_speed: int, random_frequency: bool = False, random_speed: bool = False, max_speed: int = 5, ) -> list: """ Build the highway following the parameters given >>> construct_highway(10, 2, 6) [[6, -1, 6, -1, 6, -1, 6, -1, 6, -1]] >>> construct_highway(10, 10, 2) [[2, -1, -1, -1, -1, -1, -1, -1, -1, -1]] """ highway = [[-1] * number_of_cells] # Create a highway without any car i = 0 initial_speed = max(initial_speed, 0) while i < number_of_cells: highway[0][i] = ( randint(0, max_speed) if random_speed else initial_speed ) # Place the cars i += ( randint(1, max_speed * 2) if random_frequency else frequency ) # Arbitrary number, may need tuning return highway def get_distance(highway_now: list, car_index: int) -> int: """ Get the distance between a car (at index car_index) and the next car >>> get_distance([6, -1, 6, -1, 6], 2) 1 >>> get_distance([2, -1, -1, -1, 3, 1, 0, 1, 3, 2], 0) 3 >>> get_distance([-1, -1, -1, -1, 2, -1, -1, -1, 3], -1) 4 """ distance = 0 cells = highway_now[car_index + 1 :] for cell in range(len(cells)): # May need a better name for this if cells[cell] != -1: # If the cell is not empty then return distance # we have the distance we wanted distance += 1 # Here if the car is near the end of the highway return distance + get_distance(highway_now, -1) def update(highway_now: list, probability: float, max_speed: int) -> list: """ Update the speed of the cars >>> update([-1, -1, -1, -1, -1, 2, -1, -1, -1, -1, 3], 0.0, 5) [-1, -1, -1, -1, -1, 3, -1, -1, -1, -1, 4] >>> update([-1, -1, 2, -1, -1, -1, -1, 3], 0.0, 5) [-1, -1, 3, -1, -1, -1, -1, 1] """ number_of_cells = len(highway_now) # Beforce calculations, the highway is empty next_highway = [-1] * number_of_cells for car_index in range(number_of_cells): if highway_now[car_index] != -1: # Add 1 to the current speed of the car and cap the speed next_highway[car_index] = min(highway_now[car_index] + 1, max_speed) # Number of empty cell before the next car dn = get_distance(highway_now, car_index) - 1 # We can't have the car causing an accident next_highway[car_index] = min(next_highway[car_index], dn) if random() < probability: # Randomly, a driver will slow down next_highway[car_index] = max(next_highway[car_index] - 1, 0) return 
next_highway def simulate( highway: list, number_of_update: int, probability: float, max_speed: int ) -> list: """ The main function, it will simulate the evolution of the highway >>> simulate([[-1, 2, -1, -1, -1, 3]], 2, 0.0, 3) [[-1, 2, -1, -1, -1, 3], [-1, -1, -1, 2, -1, 0], [1, -1, -1, 0, -1, -1]] >>> simulate([[-1, 2, -1, 3]], 4, 0.0, 3) [[-1, 2, -1, 3], [-1, 0, -1, 0], [-1, 0, -1, 0], [-1, 0, -1, 0], [-1, 0, -1, 0]] """ number_of_cells = len(highway[0]) for i in range(number_of_update): next_speeds_calculated = update(highway[i], probability, max_speed) real_next_speeds = [-1] * number_of_cells for car_index in range(number_of_cells): speed = next_speeds_calculated[car_index] if speed != -1: # Change the position based on the speed (with % to create the loop) index = (car_index + speed) % number_of_cells # Commit the change of position real_next_speeds[index] = speed highway.append(real_next_speeds) return highway if __name__ == "__main__": import doctest doctest.testmod()
-1
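The Nagel-Schreckenberg simulation in the record above encodes an empty cell as -1 and a car as its current speed. A small hedged helper (not part of the record) makes the doctest output above easier to read:

```python
# Illustrative helper: render one highway state with "." for empty cells.
def render(row: list[int]) -> str:
    return "".join("." if cell == -1 else str(cell) for cell in row)


# The history below is the result quoted in the simulate() doctest above.
history = [[-1, 2, -1, -1, -1, 3], [-1, -1, -1, 2, -1, 0], [1, -1, -1, 0, -1, -1]]
for state in history:
    print(render(state))
# .2...3
# ...2.0
# 1..0..
```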
TheAlgorithms/Python
8,685
Solving the `Top k most frequent words` problem using a max-heap
Contributed be @aparibocci in #8125 * [x] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
cclauss
"2023-04-23T22:13:44Z"
"2023-04-27T17:32:07Z"
c1b3ea5355266bb47daba378ca10200c4d359453
4c1f876567673db0934ba65d662ea221465ec921
Solving the `Top k most frequent words` problem using a max-heap. Contributed be @aparibocci in #8125 * [x] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
import os import sys from . import rsa_key_generator as rkg DEFAULT_BLOCK_SIZE = 128 BYTE_SIZE = 256 def get_blocks_from_text( message: str, block_size: int = DEFAULT_BLOCK_SIZE ) -> list[int]: message_bytes = message.encode("ascii") block_ints = [] for block_start in range(0, len(message_bytes), block_size): block_int = 0 for i in range(block_start, min(block_start + block_size, len(message_bytes))): block_int += message_bytes[i] * (BYTE_SIZE ** (i % block_size)) block_ints.append(block_int) return block_ints def get_text_from_blocks( block_ints: list[int], message_length: int, block_size: int = DEFAULT_BLOCK_SIZE ) -> str: message: list[str] = [] for block_int in block_ints: block_message: list[str] = [] for i in range(block_size - 1, -1, -1): if len(message) + i < message_length: ascii_number = block_int // (BYTE_SIZE**i) block_int = block_int % (BYTE_SIZE**i) block_message.insert(0, chr(ascii_number)) message.extend(block_message) return "".join(message) def encrypt_message( message: str, key: tuple[int, int], block_size: int = DEFAULT_BLOCK_SIZE ) -> list[int]: encrypted_blocks = [] n, e = key for block in get_blocks_from_text(message, block_size): encrypted_blocks.append(pow(block, e, n)) return encrypted_blocks def decrypt_message( encrypted_blocks: list[int], message_length: int, key: tuple[int, int], block_size: int = DEFAULT_BLOCK_SIZE, ) -> str: decrypted_blocks = [] n, d = key for block in encrypted_blocks: decrypted_blocks.append(pow(block, d, n)) return get_text_from_blocks(decrypted_blocks, message_length, block_size) def read_key_file(key_filename: str) -> tuple[int, int, int]: with open(key_filename) as fo: content = fo.read() key_size, n, eor_d = content.split(",") return (int(key_size), int(n), int(eor_d)) def encrypt_and_write_to_file( message_filename: str, key_filename: str, message: str, block_size: int = DEFAULT_BLOCK_SIZE, ) -> str: key_size, n, e = read_key_file(key_filename) if key_size < block_size * 8: sys.exit( "ERROR: Block size is {} bits and key size is {} bits. The RSA cipher " "requires the block size to be equal to or greater than the key size. " "Either decrease the block size or use different keys.".format( block_size * 8, key_size ) ) encrypted_blocks = [str(i) for i in encrypt_message(message, (n, e), block_size)] encrypted_content = ",".join(encrypted_blocks) encrypted_content = f"{len(message)}_{block_size}_{encrypted_content}" with open(message_filename, "w") as fo: fo.write(encrypted_content) return encrypted_content def read_from_file_and_decrypt(message_filename: str, key_filename: str) -> str: key_size, n, d = read_key_file(key_filename) with open(message_filename) as fo: content = fo.read() message_length_str, block_size_str, encrypted_message = content.split("_") message_length = int(message_length_str) block_size = int(block_size_str) if key_size < block_size * 8: sys.exit( "ERROR: Block size is {} bits and key size is {} bits. The RSA cipher " "requires the block size to be equal to or greater than the key size. 
" "Did you specify the correct key file and encrypted file?".format( block_size * 8, key_size ) ) encrypted_blocks = [] for block in encrypted_message.split(","): encrypted_blocks.append(int(block)) return decrypt_message(encrypted_blocks, message_length, (n, d), block_size) def main() -> None: filename = "encrypted_file.txt" response = input(r"Encrypt\Decrypt [e\d]: ") if response.lower().startswith("e"): mode = "encrypt" elif response.lower().startswith("d"): mode = "decrypt" if mode == "encrypt": if not os.path.exists("rsa_pubkey.txt"): rkg.make_key_files("rsa", 1024) message = input("\nEnter message: ") pubkey_filename = "rsa_pubkey.txt" print(f"Encrypting and writing to {filename}...") encrypted_text = encrypt_and_write_to_file(filename, pubkey_filename, message) print("\nEncrypted text:") print(encrypted_text) elif mode == "decrypt": privkey_filename = "rsa_privkey.txt" print(f"Reading from {filename} and decrypting...") decrypted_text = read_from_file_and_decrypt(filename, privkey_filename) print("writing decryption to rsa_decryption.txt...") with open("rsa_decryption.txt", "w") as dec: dec.write(decrypted_text) print("\nDecryption:") print(decrypted_text) if __name__ == "__main__": main()
import os import sys from . import rsa_key_generator as rkg DEFAULT_BLOCK_SIZE = 128 BYTE_SIZE = 256 def get_blocks_from_text( message: str, block_size: int = DEFAULT_BLOCK_SIZE ) -> list[int]: message_bytes = message.encode("ascii") block_ints = [] for block_start in range(0, len(message_bytes), block_size): block_int = 0 for i in range(block_start, min(block_start + block_size, len(message_bytes))): block_int += message_bytes[i] * (BYTE_SIZE ** (i % block_size)) block_ints.append(block_int) return block_ints def get_text_from_blocks( block_ints: list[int], message_length: int, block_size: int = DEFAULT_BLOCK_SIZE ) -> str: message: list[str] = [] for block_int in block_ints: block_message: list[str] = [] for i in range(block_size - 1, -1, -1): if len(message) + i < message_length: ascii_number = block_int // (BYTE_SIZE**i) block_int = block_int % (BYTE_SIZE**i) block_message.insert(0, chr(ascii_number)) message.extend(block_message) return "".join(message) def encrypt_message( message: str, key: tuple[int, int], block_size: int = DEFAULT_BLOCK_SIZE ) -> list[int]: encrypted_blocks = [] n, e = key for block in get_blocks_from_text(message, block_size): encrypted_blocks.append(pow(block, e, n)) return encrypted_blocks def decrypt_message( encrypted_blocks: list[int], message_length: int, key: tuple[int, int], block_size: int = DEFAULT_BLOCK_SIZE, ) -> str: decrypted_blocks = [] n, d = key for block in encrypted_blocks: decrypted_blocks.append(pow(block, d, n)) return get_text_from_blocks(decrypted_blocks, message_length, block_size) def read_key_file(key_filename: str) -> tuple[int, int, int]: with open(key_filename) as fo: content = fo.read() key_size, n, eor_d = content.split(",") return (int(key_size), int(n), int(eor_d)) def encrypt_and_write_to_file( message_filename: str, key_filename: str, message: str, block_size: int = DEFAULT_BLOCK_SIZE, ) -> str: key_size, n, e = read_key_file(key_filename) if key_size < block_size * 8: sys.exit( "ERROR: Block size is {} bits and key size is {} bits. The RSA cipher " "requires the block size to be equal to or greater than the key size. " "Either decrease the block size or use different keys.".format( block_size * 8, key_size ) ) encrypted_blocks = [str(i) for i in encrypt_message(message, (n, e), block_size)] encrypted_content = ",".join(encrypted_blocks) encrypted_content = f"{len(message)}_{block_size}_{encrypted_content}" with open(message_filename, "w") as fo: fo.write(encrypted_content) return encrypted_content def read_from_file_and_decrypt(message_filename: str, key_filename: str) -> str: key_size, n, d = read_key_file(key_filename) with open(message_filename) as fo: content = fo.read() message_length_str, block_size_str, encrypted_message = content.split("_") message_length = int(message_length_str) block_size = int(block_size_str) if key_size < block_size * 8: sys.exit( "ERROR: Block size is {} bits and key size is {} bits. The RSA cipher " "requires the block size to be equal to or greater than the key size. 
" "Did you specify the correct key file and encrypted file?".format( block_size * 8, key_size ) ) encrypted_blocks = [] for block in encrypted_message.split(","): encrypted_blocks.append(int(block)) return decrypt_message(encrypted_blocks, message_length, (n, d), block_size) def main() -> None: filename = "encrypted_file.txt" response = input(r"Encrypt\Decrypt [e\d]: ") if response.lower().startswith("e"): mode = "encrypt" elif response.lower().startswith("d"): mode = "decrypt" if mode == "encrypt": if not os.path.exists("rsa_pubkey.txt"): rkg.make_key_files("rsa", 1024) message = input("\nEnter message: ") pubkey_filename = "rsa_pubkey.txt" print(f"Encrypting and writing to {filename}...") encrypted_text = encrypt_and_write_to_file(filename, pubkey_filename, message) print("\nEncrypted text:") print(encrypted_text) elif mode == "decrypt": privkey_filename = "rsa_privkey.txt" print(f"Reading from {filename} and decrypting...") decrypted_text = read_from_file_and_decrypt(filename, privkey_filename) print("writing decryption to rsa_decryption.txt...") with open("rsa_decryption.txt", "w") as dec: dec.write(decrypted_text) print("\nDecryption:") print(decrypted_text) if __name__ == "__main__": main()
-1
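The rsa_cipher record above converts ASCII text into large block integers with a base-256 positional encoding before applying pow(block, e, n). As a rough, self-contained sketch of just that packing step (the helper names pack_block/unpack_block are illustrative, not taken from the record):

BYTE_SIZE = 256


def pack_block(message: str) -> int:
    # Byte i contributes message[i] * 256**i, matching the record's encoding.
    data = message.encode("ascii")
    return sum(byte * BYTE_SIZE**index for index, byte in enumerate(data))


def unpack_block(block: int, length: int) -> str:
    # Peel bytes back off in little-endian order.
    chars = []
    for _ in range(length):
        block, byte = divmod(block, BYTE_SIZE)
        chars.append(chr(byte))
    return "".join(chars)


assert unpack_block(pack_block("hello"), 5) == "hello"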
TheAlgorithms/Python
8,685
Solving the `Top k most frequent words` problem using a max-heap
Contributed by @aparibocci in #8125 * [x] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
cclauss
"2023-04-23T22:13:44Z"
"2023-04-27T17:32:07Z"
c1b3ea5355266bb47daba378ca10200c4d359453
4c1f876567673db0934ba65d662ea221465ec921
Solving the `Top k most frequent words` problem using a max-heap. Contributed by @aparibocci in #8125 * [x] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
-1
TheAlgorithms/Python
8,685
Solving the `Top k most frequent words` problem using a max-heap
Contributed by @aparibocci in #8125 * [x] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
cclauss
"2023-04-23T22:13:44Z"
"2023-04-27T17:32:07Z"
c1b3ea5355266bb47daba378ca10200c4d359453
4c1f876567673db0934ba65d662ea221465ec921
Solving the `Top k most frequent words` problem using a max-heap. Contributed by @aparibocci in #8125 * [x] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
def get_highest_set_bit_position(number: int) -> int: """ Returns position of the highest set bit of a number. Ref - https://graphics.stanford.edu/~seander/bithacks.html#IntegerLogObvious >>> get_highest_set_bit_position(25) 5 >>> get_highest_set_bit_position(37) 6 >>> get_highest_set_bit_position(1) 1 >>> get_highest_set_bit_position(4) 3 >>> get_highest_set_bit_position(0) 0 >>> get_highest_set_bit_position(0.8) Traceback (most recent call last): ... TypeError: Input value must be an 'int' type """ if not isinstance(number, int): raise TypeError("Input value must be an 'int' type") position = 0 while number: position += 1 number >>= 1 return position if __name__ == "__main__": import doctest doctest.testmod()
def get_highest_set_bit_position(number: int) -> int: """ Returns position of the highest set bit of a number. Ref - https://graphics.stanford.edu/~seander/bithacks.html#IntegerLogObvious >>> get_highest_set_bit_position(25) 5 >>> get_highest_set_bit_position(37) 6 >>> get_highest_set_bit_position(1) 1 >>> get_highest_set_bit_position(4) 3 >>> get_highest_set_bit_position(0) 0 >>> get_highest_set_bit_position(0.8) Traceback (most recent call last): ... TypeError: Input value must be an 'int' type """ if not isinstance(number, int): raise TypeError("Input value must be an 'int' type") position = 0 while number: position += 1 number >>= 1 return position if __name__ == "__main__": import doctest doctest.testmod()
-1
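For non-negative integers, the shifting loop in get_highest_set_bit_position above computes the same value as Python's built-in int.bit_length(); a quick cross-check sketch:

def highest_set_bit_position(number: int) -> int:
    # Same right-shift counting loop as in the record above.
    position = 0
    while number:
        position += 1
        number >>= 1
    return position


# int.bit_length() agrees for every non-negative input tried here.
for value in (0, 1, 4, 25, 37, 2**31):
    assert highest_set_bit_position(value) == value.bit_length()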
TheAlgorithms/Python
8,685
Solving the `Top k most frequent words` problem using a max-heap
Contributed by @aparibocci in #8125 * [x] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
cclauss
"2023-04-23T22:13:44Z"
"2023-04-27T17:32:07Z"
c1b3ea5355266bb47daba378ca10200c4d359453
4c1f876567673db0934ba65d662ea221465ec921
Solving the `Top k most frequent words` problem using a max-heap. Contributed by @aparibocci in #8125 * [x] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
def permute(nums: list[int]) -> list[list[int]]: """ Return all permutations. >>> from itertools import permutations >>> numbers= [1,2,3] >>> all(list(nums) in permute(numbers) for nums in permutations(numbers)) True """ result = [] if len(nums) == 1: return [nums.copy()] for _ in range(len(nums)): n = nums.pop(0) permutations = permute(nums) for perm in permutations: perm.append(n) result.extend(permutations) nums.append(n) return result if __name__ == "__main__": import doctest doctest.testmod()
def permute(nums: list[int]) -> list[list[int]]: """ Return all permutations. >>> from itertools import permutations >>> numbers= [1,2,3] >>> all(list(nums) in permute(numbers) for nums in permutations(numbers)) True """ result = [] if len(nums) == 1: return [nums.copy()] for _ in range(len(nums)): n = nums.pop(0) permutations = permute(nums) for perm in permutations: perm.append(n) result.extend(permutations) nums.append(n) return result if __name__ == "__main__": import doctest doctest.testmod()
-1
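The recursive permute above can be sanity-checked against itertools.permutations, which produces the same orderings (as tuples, possibly in a different order); a brief sketch:

from itertools import permutations


def permute(nums: list[int]) -> list[list[int]]:
    # Pop each element off the front, recurse on the rest, then rotate it to the back.
    if len(nums) == 1:
        return [nums.copy()]
    result = []
    for _ in range(len(nums)):
        first = nums.pop(0)
        for perm in permute(nums):
            perm.append(first)
            result.append(perm)
        nums.append(first)
    return result


assert {tuple(p) for p in permute([1, 2, 3])} == set(permutations([1, 2, 3]))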
TheAlgorithms/Python
8,685
Solving the `Top k most frequent words` problem using a max-heap
Contributed by @aparibocci in #8125 * [x] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
cclauss
"2023-04-23T22:13:44Z"
"2023-04-27T17:32:07Z"
c1b3ea5355266bb47daba378ca10200c4d359453
4c1f876567673db0934ba65d662ea221465ec921
Solving the `Top k most frequent words` problem using a max-heap. Contributed by @aparibocci in #8125 * [x] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
""" python/black : True """ from __future__ import annotations def prime_factors(n: int) -> list[int]: """ Returns prime factors of n as a list. >>> prime_factors(0) [] >>> prime_factors(100) [2, 2, 5, 5] >>> prime_factors(2560) [2, 2, 2, 2, 2, 2, 2, 2, 2, 5] >>> prime_factors(10**-2) [] >>> prime_factors(0.02) [] >>> x = prime_factors(10**241) # doctest: +NORMALIZE_WHITESPACE >>> x == [2]*241 + [5]*241 True >>> prime_factors(10**-354) [] >>> prime_factors('hello') Traceback (most recent call last): ... TypeError: '<=' not supported between instances of 'int' and 'str' >>> prime_factors([1,2,'hello']) Traceback (most recent call last): ... TypeError: '<=' not supported between instances of 'int' and 'list' """ i = 2 factors = [] while i * i <= n: if n % i: i += 1 else: n //= i factors.append(i) if n > 1: factors.append(n) return factors if __name__ == "__main__": import doctest doctest.testmod()
""" python/black : True """ from __future__ import annotations def prime_factors(n: int) -> list[int]: """ Returns prime factors of n as a list. >>> prime_factors(0) [] >>> prime_factors(100) [2, 2, 5, 5] >>> prime_factors(2560) [2, 2, 2, 2, 2, 2, 2, 2, 2, 5] >>> prime_factors(10**-2) [] >>> prime_factors(0.02) [] >>> x = prime_factors(10**241) # doctest: +NORMALIZE_WHITESPACE >>> x == [2]*241 + [5]*241 True >>> prime_factors(10**-354) [] >>> prime_factors('hello') Traceback (most recent call last): ... TypeError: '<=' not supported between instances of 'int' and 'str' >>> prime_factors([1,2,'hello']) Traceback (most recent call last): ... TypeError: '<=' not supported between instances of 'int' and 'list' """ i = 2 factors = [] while i * i <= n: if n % i: i += 1 else: n //= i factors.append(i) if n > 1: factors.append(n) return factors if __name__ == "__main__": import doctest doctest.testmod()
-1
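Two properties pin down the trial-division prime_factors above for integers greater than 1: the returned factors multiply back to n, and each factor is prime. A small verification sketch:

from math import prod


def prime_factors(n: int) -> list[int]:
    # Trial division, as in the record above.
    i, factors = 2, []
    while i * i <= n:
        if n % i:
            i += 1
        else:
            n //= i
            factors.append(i)
    if n > 1:
        factors.append(n)
    return factors


def is_prime(p: int) -> bool:
    return p > 1 and all(p % d for d in range(2, int(p**0.5) + 1))


for number in (2, 97, 100, 360, 2560):
    factors = prime_factors(number)
    assert prod(factors) == number
    assert all(is_prime(p) for p in factors)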
TheAlgorithms/Python
8,685
Solving the `Top k most frequent words` problem using a max-heap
Contributed by @aparibocci in #8125 * [x] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
cclauss
"2023-04-23T22:13:44Z"
"2023-04-27T17:32:07Z"
c1b3ea5355266bb47daba378ca10200c4d359453
4c1f876567673db0934ba65d662ea221465ec921
Solving the `Top k most frequent words` problem using a max-heap. Contributed by @aparibocci in #8125 * [x] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
from .stack import Stack def balanced_parentheses(parentheses: str) -> bool: """Use a stack to check if a string of parentheses is balanced. >>> balanced_parentheses("([]{})") True >>> balanced_parentheses("[()]{}{[()()]()}") True >>> balanced_parentheses("[(])") False >>> balanced_parentheses("1+2*3-4") True >>> balanced_parentheses("") True """ stack: Stack[str] = Stack() bracket_pairs = {"(": ")", "[": "]", "{": "}"} for bracket in parentheses: if bracket in bracket_pairs: stack.push(bracket) elif bracket in (")", "]", "}"): if stack.is_empty() or bracket_pairs[stack.pop()] != bracket: return False return stack.is_empty() if __name__ == "__main__": from doctest import testmod testmod() examples = ["((()))", "((())", "(()))"] print("Balanced parentheses demonstration:\n") for example in examples: not_str = "" if balanced_parentheses(example) else "not " print(f"{example} is {not_str}balanced")
from .stack import Stack def balanced_parentheses(parentheses: str) -> bool: """Use a stack to check if a string of parentheses is balanced. >>> balanced_parentheses("([]{})") True >>> balanced_parentheses("[()]{}{[()()]()}") True >>> balanced_parentheses("[(])") False >>> balanced_parentheses("1+2*3-4") True >>> balanced_parentheses("") True """ stack: Stack[str] = Stack() bracket_pairs = {"(": ")", "[": "]", "{": "}"} for bracket in parentheses: if bracket in bracket_pairs: stack.push(bracket) elif bracket in (")", "]", "}"): if stack.is_empty() or bracket_pairs[stack.pop()] != bracket: return False return stack.is_empty() if __name__ == "__main__": from doctest import testmod testmod() examples = ["((()))", "((())", "(()))"] print("Balanced parentheses demonstration:\n") for example in examples: not_str = "" if balanced_parentheses(example) else "not " print(f"{example} is {not_str}balanced")
-1
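The checker above depends on the package-local Stack class; the same idea works with a plain Python list used as the stack, which keeps the sketch self-contained:

def balanced_parentheses(text: str) -> bool:
    # A list works as the stack: push with append(), pop from the end.
    pairs = {"(": ")", "[": "]", "{": "}"}
    stack: list[str] = []
    for char in text:
        if char in pairs:
            stack.append(char)
        elif char in ")]}":
            if not stack or pairs[stack.pop()] != char:
                return False
    return not stack


assert balanced_parentheses("[()]{}{[()()]()}")
assert not balanced_parentheses("[(])")
assert balanced_parentheses("1+2*3-4")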
TheAlgorithms/Python
8,685
Solving the `Top k most frequent words` problem using a max-heap
Contributed by @aparibocci in #8125 * [x] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
cclauss
"2023-04-23T22:13:44Z"
"2023-04-27T17:32:07Z"
c1b3ea5355266bb47daba378ca10200c4d359453
4c1f876567673db0934ba65d662ea221465ec921
Solving the `Top k most frequent words` problem using a max-heap. Contributed by @aparibocci in #8125 * [x] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
-1
TheAlgorithms/Python
8,685
Solving the `Top k most frequent words` problem using a max-heap
Contributed by @aparibocci in #8125 * [x] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
cclauss
"2023-04-23T22:13:44Z"
"2023-04-27T17:32:07Z"
c1b3ea5355266bb47daba378ca10200c4d359453
4c1f876567673db0934ba65d662ea221465ec921
Solving the `Top k most frequent words` problem using a max-heap. Contributed by @aparibocci in #8125 * [x] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
""" Prize Strings Problem 191 A particular school offers cash rewards to children with good attendance and punctuality. If they are absent for three consecutive days or late on more than one occasion then they forfeit their prize. During an n-day period a trinary string is formed for each child consisting of L's (late), O's (on time), and A's (absent). Although there are eighty-one trinary strings for a 4-day period that can be formed, exactly forty-three strings would lead to a prize: OOOO OOOA OOOL OOAO OOAA OOAL OOLO OOLA OAOO OAOA OAOL OAAO OAAL OALO OALA OLOO OLOA OLAO OLAA AOOO AOOA AOOL AOAO AOAA AOAL AOLO AOLA AAOO AAOA AAOL AALO AALA ALOO ALOA ALAO ALAA LOOO LOOA LOAO LOAA LAOO LAOA LAAO How many "prize" strings exist over a 30-day period? References: - The original Project Euler project page: https://projecteuler.net/problem=191 """ cache: dict[tuple[int, int, int], int] = {} def _calculate(days: int, absent: int, late: int) -> int: """ A small helper function for the recursion, mainly to have a clean interface for the solution() function below. It should get called with the number of days (corresponding to the desired length of the 'prize strings'), and the initial values for the number of consecutive absent days and number of total late days. >>> _calculate(days=4, absent=0, late=0) 43 >>> _calculate(days=30, absent=2, late=0) 0 >>> _calculate(days=30, absent=1, late=0) 98950096 """ # if we are absent twice, or late 3 consecutive days, # no further prize strings are possible if late == 3 or absent == 2: return 0 # if we have no days left, and have not failed any other rules, # we have a prize string if days == 0: return 1 # No easy solution, so now we need to do the recursive calculation # First, check if the combination is already in the cache, and # if yes, return the stored value from there since we already # know the number of possible prize strings from this point on key = (days, absent, late) if key in cache: return cache[key] # now we calculate the three possible ways that can unfold from # this point on, depending on our attendance today # 1) if we are late (but not absent), the "absent" counter stays as # it is, but the "late" counter increases by one state_late = _calculate(days - 1, absent, late + 1) # 2) if we are absent, the "absent" counter increases by 1, and the # "late" counter resets to 0 state_absent = _calculate(days - 1, absent + 1, 0) # 3) if we are on time, this resets the "late" counter and keeps the # absent counter state_ontime = _calculate(days - 1, absent, 0) prizestrings = state_late + state_absent + state_ontime cache[key] = prizestrings return prizestrings def solution(days: int = 30) -> int: """ Returns the number of possible prize strings for a particular number of days, using a simple recursive function with caching to speed it up. >>> solution() 1918080160 >>> solution(4) 43 """ return _calculate(days, absent=0, late=0) if __name__ == "__main__": print(solution())
""" Prize Strings Problem 191 A particular school offers cash rewards to children with good attendance and punctuality. If they are absent for three consecutive days or late on more than one occasion then they forfeit their prize. During an n-day period a trinary string is formed for each child consisting of L's (late), O's (on time), and A's (absent). Although there are eighty-one trinary strings for a 4-day period that can be formed, exactly forty-three strings would lead to a prize: OOOO OOOA OOOL OOAO OOAA OOAL OOLO OOLA OAOO OAOA OAOL OAAO OAAL OALO OALA OLOO OLOA OLAO OLAA AOOO AOOA AOOL AOAO AOAA AOAL AOLO AOLA AAOO AAOA AAOL AALO AALA ALOO ALOA ALAO ALAA LOOO LOOA LOAO LOAA LAOO LAOA LAAO How many "prize" strings exist over a 30-day period? References: - The original Project Euler project page: https://projecteuler.net/problem=191 """ cache: dict[tuple[int, int, int], int] = {} def _calculate(days: int, absent: int, late: int) -> int: """ A small helper function for the recursion, mainly to have a clean interface for the solution() function below. It should get called with the number of days (corresponding to the desired length of the 'prize strings'), and the initial values for the number of consecutive absent days and number of total late days. >>> _calculate(days=4, absent=0, late=0) 43 >>> _calculate(days=30, absent=2, late=0) 0 >>> _calculate(days=30, absent=1, late=0) 98950096 """ # if we are absent twice, or late 3 consecutive days, # no further prize strings are possible if late == 3 or absent == 2: return 0 # if we have no days left, and have not failed any other rules, # we have a prize string if days == 0: return 1 # No easy solution, so now we need to do the recursive calculation # First, check if the combination is already in the cache, and # if yes, return the stored value from there since we already # know the number of possible prize strings from this point on key = (days, absent, late) if key in cache: return cache[key] # now we calculate the three possible ways that can unfold from # this point on, depending on our attendance today # 1) if we are late (but not absent), the "absent" counter stays as # it is, but the "late" counter increases by one state_late = _calculate(days - 1, absent, late + 1) # 2) if we are absent, the "absent" counter increases by 1, and the # "late" counter resets to 0 state_absent = _calculate(days - 1, absent + 1, 0) # 3) if we are on time, this resets the "late" counter and keeps the # absent counter state_ontime = _calculate(days - 1, absent, 0) prizestrings = state_late + state_absent + state_ontime cache[key] = prizestrings return prizestrings def solution(days: int = 30) -> int: """ Returns the number of possible prize strings for a particular number of days, using a simple recursive function with caching to speed it up. >>> solution() 1918080160 >>> solution(4) 43 """ return _calculate(days, absent=0, late=0) if __name__ == "__main__": print(solution())
-1
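The Problem 191 solution above memoises its recursion with an explicit cache dictionary; the same count can also be written with functools.cache, tracking consecutive absences and total late days directly. A sketch of that variant (not the repository code):

from functools import cache


@cache
def prize_strings(days: int, consecutive_absent: int = 0, total_late: int = 0) -> int:
    # A string forfeits on three consecutive absences or on a second late day.
    if consecutive_absent == 3 or total_late == 2:
        return 0
    if days == 0:
        return 1
    on_time = prize_strings(days - 1, 0, total_late)
    absent = prize_strings(days - 1, consecutive_absent + 1, total_late)
    late = prize_strings(days - 1, 0, total_late + 1)
    return on_time + absent + late


assert prize_strings(4) == 43
assert prize_strings(30) == 1918080160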
TheAlgorithms/Python
8,685
Solving the `Top k most frequent words` problem using a max-heap
Contributed by @aparibocci in #8125 * [x] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
cclauss
"2023-04-23T22:13:44Z"
"2023-04-27T17:32:07Z"
c1b3ea5355266bb47daba378ca10200c4d359453
4c1f876567673db0934ba65d662ea221465ec921
Solving the `Top k most frequent words` problem using a max-heap. Contributed by @aparibocci in #8125 * [x] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
""" If we are presented with the first k terms of a sequence it is impossible to say with certainty the value of the next term, as there are infinitely many polynomial functions that can model the sequence. As an example, let us consider the sequence of cube numbers. This is defined by the generating function, u(n) = n3: 1, 8, 27, 64, 125, 216, ... Suppose we were only given the first two terms of this sequence. Working on the principle that "simple is best" we should assume a linear relationship and predict the next term to be 15 (common difference 7). Even if we were presented with the first three terms, by the same principle of simplicity, a quadratic relationship should be assumed. We shall define OP(k, n) to be the nth term of the optimum polynomial generating function for the first k terms of a sequence. It should be clear that OP(k, n) will accurately generate the terms of the sequence for n ≤ k, and potentially the first incorrect term (FIT) will be OP(k, k+1); in which case we shall call it a bad OP (BOP). As a basis, if we were only given the first term of sequence, it would be most sensible to assume constancy; that is, for n ≥ 2, OP(1, n) = u(1). Hence we obtain the following OPs for the cubic sequence: OP(1, n) = 1 1, 1, 1, 1, ... OP(2, n) = 7n-6 1, 8, 15, ... OP(3, n) = 6n^2-11n+6 1, 8, 27, 58, ... OP(4, n) = n^3 1, 8, 27, 64, 125, ... Clearly no BOPs exist for k ≥ 4. By considering the sum of FITs generated by the BOPs (indicated in red above), we obtain 1 + 15 + 58 = 74. Consider the following tenth degree polynomial generating function: 1 - n + n^2 - n^3 + n^4 - n^5 + n^6 - n^7 + n^8 - n^9 + n^10 Find the sum of FITs for the BOPs. """ from __future__ import annotations from collections.abc import Callable Matrix = list[list[float | int]] def solve(matrix: Matrix, vector: Matrix) -> Matrix: """ Solve the linear system of equations Ax = b (A = "matrix", b = "vector") for x using Gaussian elimination and back substitution. We assume that A is an invertible square matrix and that b is a column vector of the same height. >>> solve([[1, 0], [0, 1]], [[1],[2]]) [[1.0], [2.0]] >>> solve([[2, 1, -1],[-3, -1, 2],[-2, 1, 2]],[[8], [-11],[-3]]) [[2.0], [3.0], [-1.0]] """ size: int = len(matrix) augmented: Matrix = [[0 for _ in range(size + 1)] for _ in range(size)] row: int row2: int col: int col2: int pivot_row: int ratio: float for row in range(size): for col in range(size): augmented[row][col] = matrix[row][col] augmented[row][size] = vector[row][0] row = 0 col = 0 while row < size and col < size: # pivoting pivot_row = max((abs(augmented[row2][col]), row2) for row2 in range(col, size))[ 1 ] if augmented[pivot_row][col] == 0: col += 1 continue else: augmented[row], augmented[pivot_row] = augmented[pivot_row], augmented[row] for row2 in range(row + 1, size): ratio = augmented[row2][col] / augmented[row][col] augmented[row2][col] = 0 for col2 in range(col + 1, size + 1): augmented[row2][col2] -= augmented[row][col2] * ratio row += 1 col += 1 # back substitution for col in range(1, size): for row in range(col): ratio = augmented[row][col] / augmented[col][col] for col2 in range(col, size + 1): augmented[row][col2] -= augmented[col][col2] * ratio # round to get rid of numbers like 2.000000000000004 return [ [round(augmented[row][size] / augmented[row][row], 10)] for row in range(size) ] def interpolate(y_list: list[int]) -> Callable[[int], int]: """ Given a list of data points (1,y0),(2,y1), ..., return a function that interpolates the data points. 
We find the coefficients of the interpolating polynomial by solving a system of linear equations corresponding to x = 1, 2, 3... >>> interpolate([1])(3) 1 >>> interpolate([1, 8])(3) 15 >>> interpolate([1, 8, 27])(4) 58 >>> interpolate([1, 8, 27, 64])(6) 216 """ size: int = len(y_list) matrix: Matrix = [[0 for _ in range(size)] for _ in range(size)] vector: Matrix = [[0] for _ in range(size)] coeffs: Matrix x_val: int y_val: int col: int for x_val, y_val in enumerate(y_list): for col in range(size): matrix[x_val][col] = (x_val + 1) ** (size - col - 1) vector[x_val][0] = y_val coeffs = solve(matrix, vector) def interpolated_func(var: int) -> int: """ >>> interpolate([1])(3) 1 >>> interpolate([1, 8])(3) 15 >>> interpolate([1, 8, 27])(4) 58 >>> interpolate([1, 8, 27, 64])(6) 216 """ return sum( round(coeffs[x_val][0]) * (var ** (size - x_val - 1)) for x_val in range(size) ) return interpolated_func def question_function(variable: int) -> int: """ The generating function u as specified in the question. >>> question_function(0) 1 >>> question_function(1) 1 >>> question_function(5) 8138021 >>> question_function(10) 9090909091 """ return ( 1 - variable + variable**2 - variable**3 + variable**4 - variable**5 + variable**6 - variable**7 + variable**8 - variable**9 + variable**10 ) def solution(func: Callable[[int], int] = question_function, order: int = 10) -> int: """ Find the sum of the FITs of the BOPS. For each interpolating polynomial of order 1, 2, ... , 10, find the first x such that the value of the polynomial at x does not equal u(x). >>> solution(lambda n: n ** 3, 3) 74 """ data_points: list[int] = [func(x_val) for x_val in range(1, order + 1)] polynomials: list[Callable[[int], int]] = [ interpolate(data_points[:max_coeff]) for max_coeff in range(1, order + 1) ] ret: int = 0 poly: Callable[[int], int] x_val: int for poly in polynomials: x_val = 1 while func(x_val) == poly(x_val): x_val += 1 ret += poly(x_val) return ret if __name__ == "__main__": print(f"{solution() = }")
""" If we are presented with the first k terms of a sequence it is impossible to say with certainty the value of the next term, as there are infinitely many polynomial functions that can model the sequence. As an example, let us consider the sequence of cube numbers. This is defined by the generating function, u(n) = n3: 1, 8, 27, 64, 125, 216, ... Suppose we were only given the first two terms of this sequence. Working on the principle that "simple is best" we should assume a linear relationship and predict the next term to be 15 (common difference 7). Even if we were presented with the first three terms, by the same principle of simplicity, a quadratic relationship should be assumed. We shall define OP(k, n) to be the nth term of the optimum polynomial generating function for the first k terms of a sequence. It should be clear that OP(k, n) will accurately generate the terms of the sequence for n ≤ k, and potentially the first incorrect term (FIT) will be OP(k, k+1); in which case we shall call it a bad OP (BOP). As a basis, if we were only given the first term of sequence, it would be most sensible to assume constancy; that is, for n ≥ 2, OP(1, n) = u(1). Hence we obtain the following OPs for the cubic sequence: OP(1, n) = 1 1, 1, 1, 1, ... OP(2, n) = 7n-6 1, 8, 15, ... OP(3, n) = 6n^2-11n+6 1, 8, 27, 58, ... OP(4, n) = n^3 1, 8, 27, 64, 125, ... Clearly no BOPs exist for k ≥ 4. By considering the sum of FITs generated by the BOPs (indicated in red above), we obtain 1 + 15 + 58 = 74. Consider the following tenth degree polynomial generating function: 1 - n + n^2 - n^3 + n^4 - n^5 + n^6 - n^7 + n^8 - n^9 + n^10 Find the sum of FITs for the BOPs. """ from __future__ import annotations from collections.abc import Callable Matrix = list[list[float | int]] def solve(matrix: Matrix, vector: Matrix) -> Matrix: """ Solve the linear system of equations Ax = b (A = "matrix", b = "vector") for x using Gaussian elimination and back substitution. We assume that A is an invertible square matrix and that b is a column vector of the same height. >>> solve([[1, 0], [0, 1]], [[1],[2]]) [[1.0], [2.0]] >>> solve([[2, 1, -1],[-3, -1, 2],[-2, 1, 2]],[[8], [-11],[-3]]) [[2.0], [3.0], [-1.0]] """ size: int = len(matrix) augmented: Matrix = [[0 for _ in range(size + 1)] for _ in range(size)] row: int row2: int col: int col2: int pivot_row: int ratio: float for row in range(size): for col in range(size): augmented[row][col] = matrix[row][col] augmented[row][size] = vector[row][0] row = 0 col = 0 while row < size and col < size: # pivoting pivot_row = max((abs(augmented[row2][col]), row2) for row2 in range(col, size))[ 1 ] if augmented[pivot_row][col] == 0: col += 1 continue else: augmented[row], augmented[pivot_row] = augmented[pivot_row], augmented[row] for row2 in range(row + 1, size): ratio = augmented[row2][col] / augmented[row][col] augmented[row2][col] = 0 for col2 in range(col + 1, size + 1): augmented[row2][col2] -= augmented[row][col2] * ratio row += 1 col += 1 # back substitution for col in range(1, size): for row in range(col): ratio = augmented[row][col] / augmented[col][col] for col2 in range(col, size + 1): augmented[row][col2] -= augmented[col][col2] * ratio # round to get rid of numbers like 2.000000000000004 return [ [round(augmented[row][size] / augmented[row][row], 10)] for row in range(size) ] def interpolate(y_list: list[int]) -> Callable[[int], int]: """ Given a list of data points (1,y0),(2,y1), ..., return a function that interpolates the data points. 
We find the coefficients of the interpolating polynomial by solving a system of linear equations corresponding to x = 1, 2, 3... >>> interpolate([1])(3) 1 >>> interpolate([1, 8])(3) 15 >>> interpolate([1, 8, 27])(4) 58 >>> interpolate([1, 8, 27, 64])(6) 216 """ size: int = len(y_list) matrix: Matrix = [[0 for _ in range(size)] for _ in range(size)] vector: Matrix = [[0] for _ in range(size)] coeffs: Matrix x_val: int y_val: int col: int for x_val, y_val in enumerate(y_list): for col in range(size): matrix[x_val][col] = (x_val + 1) ** (size - col - 1) vector[x_val][0] = y_val coeffs = solve(matrix, vector) def interpolated_func(var: int) -> int: """ >>> interpolate([1])(3) 1 >>> interpolate([1, 8])(3) 15 >>> interpolate([1, 8, 27])(4) 58 >>> interpolate([1, 8, 27, 64])(6) 216 """ return sum( round(coeffs[x_val][0]) * (var ** (size - x_val - 1)) for x_val in range(size) ) return interpolated_func def question_function(variable: int) -> int: """ The generating function u as specified in the question. >>> question_function(0) 1 >>> question_function(1) 1 >>> question_function(5) 8138021 >>> question_function(10) 9090909091 """ return ( 1 - variable + variable**2 - variable**3 + variable**4 - variable**5 + variable**6 - variable**7 + variable**8 - variable**9 + variable**10 ) def solution(func: Callable[[int], int] = question_function, order: int = 10) -> int: """ Find the sum of the FITs of the BOPS. For each interpolating polynomial of order 1, 2, ... , 10, find the first x such that the value of the polynomial at x does not equal u(x). >>> solution(lambda n: n ** 3, 3) 74 """ data_points: list[int] = [func(x_val) for x_val in range(1, order + 1)] polynomials: list[Callable[[int], int]] = [ interpolate(data_points[:max_coeff]) for max_coeff in range(1, order + 1) ] ret: int = 0 poly: Callable[[int], int] x_val: int for poly in polynomials: x_val = 1 while func(x_val) == poly(x_val): x_val += 1 ret += poly(x_val) return ret if __name__ == "__main__": print(f"{solution() = }")
-1
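Independently of the Gaussian-elimination solver above, the worked cube example (FIT sum 1 + 15 + 58 = 74) can be reproduced with exact Lagrange interpolation over fractions; a short cross-check sketch in which op_value is an illustrative helper name:

from fractions import Fraction


def op_value(terms: list[int], n: int) -> Fraction:
    # Lagrange interpolation through (1, terms[0]), ..., (k, terms[k-1]),
    # evaluated at x = n with exact rational arithmetic.
    k = len(terms)
    total = Fraction(0)
    for i in range(1, k + 1):
        basis = Fraction(1)
        for j in range(1, k + 1):
            if j != i:
                basis *= Fraction(n - j, i - j)
        total += terms[i - 1] * basis
    return total


cubes = [n**3 for n in range(1, 5)]
# First incorrect terms of the bad OPs for the cube sequence: OP(k, k + 1).
fits = [op_value(cubes[:k], k + 1) for k in range(1, 4)]
assert fits == [1, 15, 58]
assert sum(fits) == 74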
TheAlgorithms/Python
8,685
Solving the `Top k most frequent words` problem using a max-heap
Contributed by @aparibocci in #8125 * [x] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
cclauss
"2023-04-23T22:13:44Z"
"2023-04-27T17:32:07Z"
c1b3ea5355266bb47daba378ca10200c4d359453
4c1f876567673db0934ba65d662ea221465ec921
Solving the `Top k most frequent words` problem using a max-heap. Contributed by @aparibocci in #8125 * [x] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
import os import random import sys from . import cryptomath_module as cryptomath from . import rabin_miller min_primitive_root = 3 # I have written my code naively same as definition of primitive root # however every time I run this program, memory exceeded... # so I used 4.80 Algorithm in # Handbook of Applied Cryptography(CRC Press, ISBN : 0-8493-8523-7, October 1996) # and it seems to run nicely! def primitive_root(p_val: int) -> int: print("Generating primitive root of p") while True: g = random.randrange(3, p_val) if pow(g, 2, p_val) == 1: continue if pow(g, p_val, p_val) == 1: continue return g def generate_key(key_size: int) -> tuple[tuple[int, int, int, int], tuple[int, int]]: print("Generating prime p...") p = rabin_miller.generate_large_prime(key_size) # select large prime number. e_1 = primitive_root(p) # one primitive root on modulo p. d = random.randrange(3, p) # private_key -> have to be greater than 2 for safety. e_2 = cryptomath.find_mod_inverse(pow(e_1, d, p), p) public_key = (key_size, e_1, e_2, p) private_key = (key_size, d) return public_key, private_key def make_key_files(name: str, key_size: int) -> None: if os.path.exists(f"{name}_pubkey.txt") or os.path.exists(f"{name}_privkey.txt"): print("\nWARNING:") print( f'"{name}_pubkey.txt" or "{name}_privkey.txt" already exists. \n' "Use a different name or delete these files and re-run this program." ) sys.exit() public_key, private_key = generate_key(key_size) print(f"\nWriting public key to file {name}_pubkey.txt...") with open(f"{name}_pubkey.txt", "w") as fo: fo.write(f"{public_key[0]},{public_key[1]},{public_key[2]},{public_key[3]}") print(f"Writing private key to file {name}_privkey.txt...") with open(f"{name}_privkey.txt", "w") as fo: fo.write(f"{private_key[0]},{private_key[1]}") def main() -> None: print("Making key files...") make_key_files("elgamal", 2048) print("Key files generation successful") if __name__ == "__main__": main()
import os import random import sys from . import cryptomath_module as cryptomath from . import rabin_miller min_primitive_root = 3 # I have written my code naively same as definition of primitive root # however every time I run this program, memory exceeded... # so I used 4.80 Algorithm in # Handbook of Applied Cryptography(CRC Press, ISBN : 0-8493-8523-7, October 1996) # and it seems to run nicely! def primitive_root(p_val: int) -> int: print("Generating primitive root of p") while True: g = random.randrange(3, p_val) if pow(g, 2, p_val) == 1: continue if pow(g, p_val, p_val) == 1: continue return g def generate_key(key_size: int) -> tuple[tuple[int, int, int, int], tuple[int, int]]: print("Generating prime p...") p = rabin_miller.generate_large_prime(key_size) # select large prime number. e_1 = primitive_root(p) # one primitive root on modulo p. d = random.randrange(3, p) # private_key -> have to be greater than 2 for safety. e_2 = cryptomath.find_mod_inverse(pow(e_1, d, p), p) public_key = (key_size, e_1, e_2, p) private_key = (key_size, d) return public_key, private_key def make_key_files(name: str, key_size: int) -> None: if os.path.exists(f"{name}_pubkey.txt") or os.path.exists(f"{name}_privkey.txt"): print("\nWARNING:") print( f'"{name}_pubkey.txt" or "{name}_privkey.txt" already exists. \n' "Use a different name or delete these files and re-run this program." ) sys.exit() public_key, private_key = generate_key(key_size) print(f"\nWriting public key to file {name}_pubkey.txt...") with open(f"{name}_pubkey.txt", "w") as fo: fo.write(f"{public_key[0]},{public_key[1]},{public_key[2]},{public_key[3]}") print(f"Writing private key to file {name}_privkey.txt...") with open(f"{name}_privkey.txt", "w") as fo: fo.write(f"{private_key[0]},{private_key[1]}") def main() -> None: print("Making key files...") make_key_files("elgamal", 2048) print("Key files generation successful") if __name__ == "__main__": main()
-1
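The ElGamal key generation above relies on cryptomath.find_mod_inverse; an equivalent extended-Euclidean sketch (illustrative, not the module's implementation) can be cross-checked against the built-in pow(a, -1, m) available since Python 3.8:

def mod_inverse(a: int, m: int) -> int:
    # Extended Euclidean algorithm; assumes gcd(a, m) == 1.
    u1, u2, u3 = 1, 0, a
    v1, v2, v3 = 0, 1, m
    while v3 != 0:
        q = u3 // v3
        u1, u2, u3, v1, v2, v3 = v1, v2, v3, u1 - q * v1, u2 - q * v2, u3 - q * v3
    return u1 % m


for a, m in ((7, 26), (17, 3120), (5, 117)):
    inverse = mod_inverse(a, m)
    assert (a * inverse) % m == 1
    assert inverse == pow(a, -1, m)  # built-in modular inverse, Python 3.8+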
TheAlgorithms/Python
8,685
Solving the `Top k most frequent words` problem using a max-heap
Contributed by @aparibocci in #8125 * [x] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
cclauss
"2023-04-23T22:13:44Z"
"2023-04-27T17:32:07Z"
c1b3ea5355266bb47daba378ca10200c4d359453
4c1f876567673db0934ba65d662ea221465ec921
Solving the `Top k most frequent words` problem using a max-heap. Contributed by @aparibocci in #8125 * [x] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
""" This is pure Python implementation of tree traversal algorithms """ from __future__ import annotations import queue class TreeNode: def __init__(self, data): self.data = data self.right = None self.left = None def build_tree(): print("\n********Press N to stop entering at any point of time********\n") check = input("Enter the value of the root node: ").strip().lower() or "n" if check == "n": return None q: queue.Queue = queue.Queue() tree_node = TreeNode(int(check)) q.put(tree_node) while not q.empty(): node_found = q.get() msg = f"Enter the left node of {node_found.data}: " check = input(msg).strip().lower() or "n" if check == "n": return tree_node left_node = TreeNode(int(check)) node_found.left = left_node q.put(left_node) msg = f"Enter the right node of {node_found.data}: " check = input(msg).strip().lower() or "n" if check == "n": return tree_node right_node = TreeNode(int(check)) node_found.right = right_node q.put(right_node) return None def pre_order(node: TreeNode) -> None: """ >>> root = TreeNode(1) >>> tree_node2 = TreeNode(2) >>> tree_node3 = TreeNode(3) >>> tree_node4 = TreeNode(4) >>> tree_node5 = TreeNode(5) >>> tree_node6 = TreeNode(6) >>> tree_node7 = TreeNode(7) >>> root.left, root.right = tree_node2, tree_node3 >>> tree_node2.left, tree_node2.right = tree_node4 , tree_node5 >>> tree_node3.left, tree_node3.right = tree_node6 , tree_node7 >>> pre_order(root) 1,2,4,5,3,6,7, """ if not isinstance(node, TreeNode) or not node: return print(node.data, end=",") pre_order(node.left) pre_order(node.right) def in_order(node: TreeNode) -> None: """ >>> root = TreeNode(1) >>> tree_node2 = TreeNode(2) >>> tree_node3 = TreeNode(3) >>> tree_node4 = TreeNode(4) >>> tree_node5 = TreeNode(5) >>> tree_node6 = TreeNode(6) >>> tree_node7 = TreeNode(7) >>> root.left, root.right = tree_node2, tree_node3 >>> tree_node2.left, tree_node2.right = tree_node4 , tree_node5 >>> tree_node3.left, tree_node3.right = tree_node6 , tree_node7 >>> in_order(root) 4,2,5,1,6,3,7, """ if not isinstance(node, TreeNode) or not node: return in_order(node.left) print(node.data, end=",") in_order(node.right) def post_order(node: TreeNode) -> None: """ >>> root = TreeNode(1) >>> tree_node2 = TreeNode(2) >>> tree_node3 = TreeNode(3) >>> tree_node4 = TreeNode(4) >>> tree_node5 = TreeNode(5) >>> tree_node6 = TreeNode(6) >>> tree_node7 = TreeNode(7) >>> root.left, root.right = tree_node2, tree_node3 >>> tree_node2.left, tree_node2.right = tree_node4 , tree_node5 >>> tree_node3.left, tree_node3.right = tree_node6 , tree_node7 >>> post_order(root) 4,5,2,6,7,3,1, """ if not isinstance(node, TreeNode) or not node: return post_order(node.left) post_order(node.right) print(node.data, end=",") def level_order(node: TreeNode) -> None: """ >>> root = TreeNode(1) >>> tree_node2 = TreeNode(2) >>> tree_node3 = TreeNode(3) >>> tree_node4 = TreeNode(4) >>> tree_node5 = TreeNode(5) >>> tree_node6 = TreeNode(6) >>> tree_node7 = TreeNode(7) >>> root.left, root.right = tree_node2, tree_node3 >>> tree_node2.left, tree_node2.right = tree_node4 , tree_node5 >>> tree_node3.left, tree_node3.right = tree_node6 , tree_node7 >>> level_order(root) 1,2,3,4,5,6,7, """ if not isinstance(node, TreeNode) or not node: return q: queue.Queue = queue.Queue() q.put(node) while not q.empty(): node_dequeued = q.get() print(node_dequeued.data, end=",") if node_dequeued.left: q.put(node_dequeued.left) if node_dequeued.right: q.put(node_dequeued.right) def level_order_actual(node: TreeNode) -> None: """ >>> root = TreeNode(1) >>> tree_node2 = TreeNode(2) >>> 
tree_node3 = TreeNode(3) >>> tree_node4 = TreeNode(4) >>> tree_node5 = TreeNode(5) >>> tree_node6 = TreeNode(6) >>> tree_node7 = TreeNode(7) >>> root.left, root.right = tree_node2, tree_node3 >>> tree_node2.left, tree_node2.right = tree_node4 , tree_node5 >>> tree_node3.left, tree_node3.right = tree_node6 , tree_node7 >>> level_order_actual(root) 1, 2,3, 4,5,6,7, """ if not isinstance(node, TreeNode) or not node: return q: queue.Queue = queue.Queue() q.put(node) while not q.empty(): list_ = [] while not q.empty(): node_dequeued = q.get() print(node_dequeued.data, end=",") if node_dequeued.left: list_.append(node_dequeued.left) if node_dequeued.right: list_.append(node_dequeued.right) print() for node in list_: q.put(node) # iteration version def pre_order_iter(node: TreeNode) -> None: """ >>> root = TreeNode(1) >>> tree_node2 = TreeNode(2) >>> tree_node3 = TreeNode(3) >>> tree_node4 = TreeNode(4) >>> tree_node5 = TreeNode(5) >>> tree_node6 = TreeNode(6) >>> tree_node7 = TreeNode(7) >>> root.left, root.right = tree_node2, tree_node3 >>> tree_node2.left, tree_node2.right = tree_node4 , tree_node5 >>> tree_node3.left, tree_node3.right = tree_node6 , tree_node7 >>> pre_order_iter(root) 1,2,4,5,3,6,7, """ if not isinstance(node, TreeNode) or not node: return stack: list[TreeNode] = [] n = node while n or stack: while n: # start from root node, find its left child print(n.data, end=",") stack.append(n) n = n.left # end of while means current node doesn't have left child n = stack.pop() # start to traverse its right child n = n.right def in_order_iter(node: TreeNode) -> None: """ >>> root = TreeNode(1) >>> tree_node2 = TreeNode(2) >>> tree_node3 = TreeNode(3) >>> tree_node4 = TreeNode(4) >>> tree_node5 = TreeNode(5) >>> tree_node6 = TreeNode(6) >>> tree_node7 = TreeNode(7) >>> root.left, root.right = tree_node2, tree_node3 >>> tree_node2.left, tree_node2.right = tree_node4 , tree_node5 >>> tree_node3.left, tree_node3.right = tree_node6 , tree_node7 >>> in_order_iter(root) 4,2,5,1,6,3,7, """ if not isinstance(node, TreeNode) or not node: return stack: list[TreeNode] = [] n = node while n or stack: while n: stack.append(n) n = n.left n = stack.pop() print(n.data, end=",") n = n.right def post_order_iter(node: TreeNode) -> None: """ >>> root = TreeNode(1) >>> tree_node2 = TreeNode(2) >>> tree_node3 = TreeNode(3) >>> tree_node4 = TreeNode(4) >>> tree_node5 = TreeNode(5) >>> tree_node6 = TreeNode(6) >>> tree_node7 = TreeNode(7) >>> root.left, root.right = tree_node2, tree_node3 >>> tree_node2.left, tree_node2.right = tree_node4 , tree_node5 >>> tree_node3.left, tree_node3.right = tree_node6 , tree_node7 >>> post_order_iter(root) 4,5,2,6,7,3,1, """ if not isinstance(node, TreeNode) or not node: return stack1, stack2 = [], [] n = node stack1.append(n) while stack1: # to find the reversed order of post order, store it in stack2 n = stack1.pop() if n.left: stack1.append(n.left) if n.right: stack1.append(n.right) stack2.append(n) while stack2: # pop up from stack2 will be the post order print(stack2.pop().data, end=",") def prompt(s: str = "", width=50, char="*") -> str: if not s: return "\n" + width * char left, extra = divmod(width - len(s) - 2, 2) return f"{left * char} {s} {(left + extra) * char}" if __name__ == "__main__": import doctest doctest.testmod() print(prompt("Binary Tree Traversals")) node = build_tree() print(prompt("Pre Order Traversal")) pre_order(node) print(prompt() + "\n") print(prompt("In Order Traversal")) in_order(node) print(prompt() + "\n") print(prompt("Post Order Traversal")) 
post_order(node) print(prompt() + "\n") print(prompt("Level Order Traversal")) level_order(node) print(prompt() + "\n") print(prompt("Actual Level Order Traversal")) level_order_actual(node) print("*" * 50 + "\n") print(prompt("Pre Order Traversal - Iteration Version")) pre_order_iter(node) print(prompt() + "\n") print(prompt("In Order Traversal - Iteration Version")) in_order_iter(node) print(prompt() + "\n") print(prompt("Post Order Traversal - Iteration Version")) post_order_iter(node) print(prompt())
""" This is pure Python implementation of tree traversal algorithms """ from __future__ import annotations import queue class TreeNode: def __init__(self, data): self.data = data self.right = None self.left = None def build_tree(): print("\n********Press N to stop entering at any point of time********\n") check = input("Enter the value of the root node: ").strip().lower() or "n" if check == "n": return None q: queue.Queue = queue.Queue() tree_node = TreeNode(int(check)) q.put(tree_node) while not q.empty(): node_found = q.get() msg = f"Enter the left node of {node_found.data}: " check = input(msg).strip().lower() or "n" if check == "n": return tree_node left_node = TreeNode(int(check)) node_found.left = left_node q.put(left_node) msg = f"Enter the right node of {node_found.data}: " check = input(msg).strip().lower() or "n" if check == "n": return tree_node right_node = TreeNode(int(check)) node_found.right = right_node q.put(right_node) return None def pre_order(node: TreeNode) -> None: """ >>> root = TreeNode(1) >>> tree_node2 = TreeNode(2) >>> tree_node3 = TreeNode(3) >>> tree_node4 = TreeNode(4) >>> tree_node5 = TreeNode(5) >>> tree_node6 = TreeNode(6) >>> tree_node7 = TreeNode(7) >>> root.left, root.right = tree_node2, tree_node3 >>> tree_node2.left, tree_node2.right = tree_node4 , tree_node5 >>> tree_node3.left, tree_node3.right = tree_node6 , tree_node7 >>> pre_order(root) 1,2,4,5,3,6,7, """ if not isinstance(node, TreeNode) or not node: return print(node.data, end=",") pre_order(node.left) pre_order(node.right) def in_order(node: TreeNode) -> None: """ >>> root = TreeNode(1) >>> tree_node2 = TreeNode(2) >>> tree_node3 = TreeNode(3) >>> tree_node4 = TreeNode(4) >>> tree_node5 = TreeNode(5) >>> tree_node6 = TreeNode(6) >>> tree_node7 = TreeNode(7) >>> root.left, root.right = tree_node2, tree_node3 >>> tree_node2.left, tree_node2.right = tree_node4 , tree_node5 >>> tree_node3.left, tree_node3.right = tree_node6 , tree_node7 >>> in_order(root) 4,2,5,1,6,3,7, """ if not isinstance(node, TreeNode) or not node: return in_order(node.left) print(node.data, end=",") in_order(node.right) def post_order(node: TreeNode) -> None: """ >>> root = TreeNode(1) >>> tree_node2 = TreeNode(2) >>> tree_node3 = TreeNode(3) >>> tree_node4 = TreeNode(4) >>> tree_node5 = TreeNode(5) >>> tree_node6 = TreeNode(6) >>> tree_node7 = TreeNode(7) >>> root.left, root.right = tree_node2, tree_node3 >>> tree_node2.left, tree_node2.right = tree_node4 , tree_node5 >>> tree_node3.left, tree_node3.right = tree_node6 , tree_node7 >>> post_order(root) 4,5,2,6,7,3,1, """ if not isinstance(node, TreeNode) or not node: return post_order(node.left) post_order(node.right) print(node.data, end=",") def level_order(node: TreeNode) -> None: """ >>> root = TreeNode(1) >>> tree_node2 = TreeNode(2) >>> tree_node3 = TreeNode(3) >>> tree_node4 = TreeNode(4) >>> tree_node5 = TreeNode(5) >>> tree_node6 = TreeNode(6) >>> tree_node7 = TreeNode(7) >>> root.left, root.right = tree_node2, tree_node3 >>> tree_node2.left, tree_node2.right = tree_node4 , tree_node5 >>> tree_node3.left, tree_node3.right = tree_node6 , tree_node7 >>> level_order(root) 1,2,3,4,5,6,7, """ if not isinstance(node, TreeNode) or not node: return q: queue.Queue = queue.Queue() q.put(node) while not q.empty(): node_dequeued = q.get() print(node_dequeued.data, end=",") if node_dequeued.left: q.put(node_dequeued.left) if node_dequeued.right: q.put(node_dequeued.right) def level_order_actual(node: TreeNode) -> None: """ >>> root = TreeNode(1) >>> tree_node2 = TreeNode(2) >>> 
tree_node3 = TreeNode(3) >>> tree_node4 = TreeNode(4) >>> tree_node5 = TreeNode(5) >>> tree_node6 = TreeNode(6) >>> tree_node7 = TreeNode(7) >>> root.left, root.right = tree_node2, tree_node3 >>> tree_node2.left, tree_node2.right = tree_node4 , tree_node5 >>> tree_node3.left, tree_node3.right = tree_node6 , tree_node7 >>> level_order_actual(root) 1, 2,3, 4,5,6,7, """ if not isinstance(node, TreeNode) or not node: return q: queue.Queue = queue.Queue() q.put(node) while not q.empty(): list_ = [] while not q.empty(): node_dequeued = q.get() print(node_dequeued.data, end=",") if node_dequeued.left: list_.append(node_dequeued.left) if node_dequeued.right: list_.append(node_dequeued.right) print() for node in list_: q.put(node) # iteration version def pre_order_iter(node: TreeNode) -> None: """ >>> root = TreeNode(1) >>> tree_node2 = TreeNode(2) >>> tree_node3 = TreeNode(3) >>> tree_node4 = TreeNode(4) >>> tree_node5 = TreeNode(5) >>> tree_node6 = TreeNode(6) >>> tree_node7 = TreeNode(7) >>> root.left, root.right = tree_node2, tree_node3 >>> tree_node2.left, tree_node2.right = tree_node4 , tree_node5 >>> tree_node3.left, tree_node3.right = tree_node6 , tree_node7 >>> pre_order_iter(root) 1,2,4,5,3,6,7, """ if not isinstance(node, TreeNode) or not node: return stack: list[TreeNode] = [] n = node while n or stack: while n: # start from root node, find its left child print(n.data, end=",") stack.append(n) n = n.left # end of while means current node doesn't have left child n = stack.pop() # start to traverse its right child n = n.right def in_order_iter(node: TreeNode) -> None: """ >>> root = TreeNode(1) >>> tree_node2 = TreeNode(2) >>> tree_node3 = TreeNode(3) >>> tree_node4 = TreeNode(4) >>> tree_node5 = TreeNode(5) >>> tree_node6 = TreeNode(6) >>> tree_node7 = TreeNode(7) >>> root.left, root.right = tree_node2, tree_node3 >>> tree_node2.left, tree_node2.right = tree_node4 , tree_node5 >>> tree_node3.left, tree_node3.right = tree_node6 , tree_node7 >>> in_order_iter(root) 4,2,5,1,6,3,7, """ if not isinstance(node, TreeNode) or not node: return stack: list[TreeNode] = [] n = node while n or stack: while n: stack.append(n) n = n.left n = stack.pop() print(n.data, end=",") n = n.right def post_order_iter(node: TreeNode) -> None: """ >>> root = TreeNode(1) >>> tree_node2 = TreeNode(2) >>> tree_node3 = TreeNode(3) >>> tree_node4 = TreeNode(4) >>> tree_node5 = TreeNode(5) >>> tree_node6 = TreeNode(6) >>> tree_node7 = TreeNode(7) >>> root.left, root.right = tree_node2, tree_node3 >>> tree_node2.left, tree_node2.right = tree_node4 , tree_node5 >>> tree_node3.left, tree_node3.right = tree_node6 , tree_node7 >>> post_order_iter(root) 4,5,2,6,7,3,1, """ if not isinstance(node, TreeNode) or not node: return stack1, stack2 = [], [] n = node stack1.append(n) while stack1: # to find the reversed order of post order, store it in stack2 n = stack1.pop() if n.left: stack1.append(n.left) if n.right: stack1.append(n.right) stack2.append(n) while stack2: # pop up from stack2 will be the post order print(stack2.pop().data, end=",") def prompt(s: str = "", width=50, char="*") -> str: if not s: return "\n" + width * char left, extra = divmod(width - len(s) - 2, 2) return f"{left * char} {s} {(left + extra) * char}" if __name__ == "__main__": import doctest doctest.testmod() print(prompt("Binary Tree Traversals")) node = build_tree() print(prompt("Pre Order Traversal")) pre_order(node) print(prompt() + "\n") print(prompt("In Order Traversal")) in_order(node) print(prompt() + "\n") print(prompt("Post Order Traversal")) 
post_order(node) print(prompt() + "\n") print(prompt("Level Order Traversal")) level_order(node) print(prompt() + "\n") print(prompt("Actual Level Order Traversal")) level_order_actual(node) print("*" * 50 + "\n") print(prompt("Pre Order Traversal - Iteration Version")) pre_order_iter(node) print(prompt() + "\n") print(prompt("In Order Traversal - Iteration Version")) in_order_iter(node) print(prompt() + "\n") print(prompt("Post Order Traversal - Iteration Version")) post_order_iter(node) print(prompt())
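Both copies of the traversal module above build their test tree interactively via `input()`. As a purely illustrative sketch (the `build_tree_from_list` helper below is hypothetical and not part of the file), the same kind of tree can be constructed non-interactively from a level-order list, which makes the traversal functions easy to exercise from scripts or doctests.

```python
# Hypothetical helper, not part of the traversal module above: build a tree
# from a level-order list of values, using None to mark missing children.
from __future__ import annotations

import queue


class TreeNode:  # mirrors the TreeNode class defined in the module above
    def __init__(self, data):
        self.data = data
        self.right = None
        self.left = None


def build_tree_from_list(values: list[int | None]) -> TreeNode | None:
    """
    >>> root = build_tree_from_list([1, 2, 3, 4, 5, 6, 7])
    >>> root.left.right.data
    5
    """
    if not values or values[0] is None:
        return None
    root = TreeNode(values[0])
    nodes: queue.Queue = queue.Queue()
    nodes.put(root)
    index = 1
    while not nodes.empty() and index < len(values):
        node = nodes.get()
        if values[index] is not None:  # attach the left child, if present
            node.left = TreeNode(values[index])
            nodes.put(node.left)
        index += 1
        if index < len(values) and values[index] is not None:  # right child
            node.right = TreeNode(values[index])
            nodes.put(node.right)
        index += 1
    return root
```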
-1
TheAlgorithms/Python
8,551
New gitter link added or replaced
### Describe your change: Fixes #8546 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
KaixLina
"2023-03-26T04:26:40Z"
"2023-03-26T15:19:19Z"
3f9150c1b2dd15808a4962e03a1455f8d825512c
7cdb011ba440a07768179bfaea190bddefc890d8
New gitter link added or replaced. ### Describe your change: Fixes #8546 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
# Configuration for probot-stale - https://github.com/probot/stale # Number of days of inactivity before an Issue or Pull Request becomes stale daysUntilStale: 30 # Number of days of inactivity before an Issue or Pull Request with the stale label is closed. # Set to false to disable. If disabled, issues still need to be closed manually, but will remain marked as stale. daysUntilClose: 7 # Only issues or pull requests with all of these labels are check if stale. Defaults to `[]` (disabled) onlyLabels: [] # Issues or Pull Requests with these labels will never be considered stale. Set to `[]` to disable exemptLabels: - "Status: on hold" # Set to true to ignore issues in a project (defaults to false) exemptProjects: false # Set to true to ignore issues in a milestone (defaults to false) exemptMilestones: false # Set to true to ignore issues with an assignee (defaults to false) exemptAssignees: false # Label to use when marking as stale staleLabel: stale # Limit the number of actions per hour, from 1-30. Default is 30 limitPerRun: 5 # Comment to post when removing the stale label. # unmarkComment: > # Your comment here. # Optionally, specify configuration settings that are specific to just 'issues' or 'pulls': pulls: # Comment to post when marking as stale. Set to `false` to disable markComment: > This pull request has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions. # Comment to post when closing a stale Pull Request. closeComment: > Please reopen this pull request once you commit the changes requested or make improvements on the code. If this is not the case and you need some help, feel free to seek help from our [Gitter](https://gitter.im/TheAlgorithms) or ping one of the reviewers. Thank you for your contributions! issues: # Comment to post when marking as stale. Set to `false` to disable markComment: > This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions. # Comment to post when closing a stale Issue. closeComment: > Please reopen this issue once you add more information and updates here. If this is not the case and you need some help, feel free to seek help from our [Gitter](https://gitter.im/TheAlgorithms) or ping one of the reviewers. Thank you for your contributions!
# Configuration for probot-stale - https://github.com/probot/stale # Number of days of inactivity before an Issue or Pull Request becomes stale daysUntilStale: 30 # Number of days of inactivity before an Issue or Pull Request with the stale label is closed. # Set to false to disable. If disabled, issues still need to be closed manually, but will remain marked as stale. daysUntilClose: 7 # Only issues or pull requests with all of these labels are check if stale. Defaults to `[]` (disabled) onlyLabels: [] # Issues or Pull Requests with these labels will never be considered stale. Set to `[]` to disable exemptLabels: - "Status: on hold" # Set to true to ignore issues in a project (defaults to false) exemptProjects: false # Set to true to ignore issues in a milestone (defaults to false) exemptMilestones: false # Set to true to ignore issues with an assignee (defaults to false) exemptAssignees: false # Label to use when marking as stale staleLabel: stale # Limit the number of actions per hour, from 1-30. Default is 30 limitPerRun: 5 # Comment to post when removing the stale label. # unmarkComment: > # Your comment here. # Optionally, specify configuration settings that are specific to just 'issues' or 'pulls': pulls: # Comment to post when marking as stale. Set to `false` to disable markComment: > This pull request has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions. # Comment to post when closing a stale Pull Request. closeComment: > Please reopen this pull request once you commit the changes requested or make improvements on the code. If this is not the case and you need some help, feel free to seek help from our [Gitter](https://app.gitter.im/#/room/#TheAlgorithms_community:gitter.im) or ping one of the reviewers. Thank you for your contributions! issues: # Comment to post when marking as stale. Set to `false` to disable markComment: > This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions. # Comment to post when closing a stale Issue. closeComment: > Please reopen this issue once you add more information and updates here. If this is not the case and you need some help, feel free to seek help from our [Gitter](https://app.gitter.im/#/room/#TheAlgorithms_community:gitter.im) or ping one of the reviewers. Thank you for your contributions!
1
TheAlgorithms/Python
8,551
New gitter link added or replaced
### Describe your change: Fixes #8546 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
KaixLina
"2023-03-26T04:26:40Z"
"2023-03-26T15:19:19Z"
3f9150c1b2dd15808a4962e03a1455f8d825512c
7cdb011ba440a07768179bfaea190bddefc890d8
New gitter link added or replaced. ### Describe your change: Fixes #8546 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
repos: - repo: https://github.com/pre-commit/pre-commit-hooks rev: v4.4.0 hooks: - id: check-executables-have-shebangs - id: check-toml - id: check-yaml - id: end-of-file-fixer types: [python] - id: trailing-whitespace - id: requirements-txt-fixer - repo: https://github.com/MarcoGorelli/auto-walrus rev: v0.2.2 hooks: - id: auto-walrus - repo: https://github.com/charliermarsh/ruff-pre-commit rev: v0.0.257 hooks: - id: ruff - repo: https://github.com/psf/black rev: 23.1.0 hooks: - id: black - repo: https://github.com/codespell-project/codespell rev: v2.2.4 hooks: - id: codespell additional_dependencies: - tomli - repo: https://github.com/tox-dev/pyproject-fmt rev: "0.9.2" hooks: - id: pyproject-fmt - repo: local hooks: - id: validate-filenames name: Validate filenames entry: ./scripts/validate_filenames.py language: script pass_filenames: false - repo: https://github.com/abravalheri/validate-pyproject rev: v0.12.1 hooks: - id: validate-pyproject - repo: https://github.com/pre-commit/mirrors-mypy rev: v1.1.1 hooks: - id: mypy args: - --ignore-missing-imports - --install-types # See mirrors-mypy README.md - --non-interactive additional_dependencies: [types-requests]
repos: - repo: https://github.com/pre-commit/pre-commit-hooks rev: v4.4.0 hooks: - id: check-executables-have-shebangs - id: check-toml - id: check-yaml - id: end-of-file-fixer types: [python] - id: trailing-whitespace - id: requirements-txt-fixer - repo: https://github.com/MarcoGorelli/auto-walrus rev: v0.2.2 hooks: - id: auto-walrus - repo: https://github.com/charliermarsh/ruff-pre-commit rev: v0.0.259 hooks: - id: ruff - repo: https://github.com/psf/black rev: 23.1.0 hooks: - id: black - repo: https://github.com/codespell-project/codespell rev: v2.2.4 hooks: - id: codespell additional_dependencies: - tomli - repo: https://github.com/tox-dev/pyproject-fmt rev: "0.9.2" hooks: - id: pyproject-fmt - repo: local hooks: - id: validate-filenames name: Validate filenames entry: ./scripts/validate_filenames.py language: script pass_filenames: false - repo: https://github.com/abravalheri/validate-pyproject rev: v0.12.1 hooks: - id: validate-pyproject - repo: https://github.com/pre-commit/mirrors-mypy rev: v1.1.1 hooks: - id: mypy args: - --ignore-missing-imports - --install-types # See mirrors-mypy README.md - --non-interactive additional_dependencies: [types-requests]
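The only change in this record is a version bump of the ruff hook (v0.0.257 to v0.0.259). As a hedged aside, such bumps and spot-checks are typically done with standard pre-commit commands (shown below; they are not part of the config file itself):

```bash
pre-commit autoupdate            # rewrites each hook's `rev:` to its latest tag
pre-commit run ruff --all-files  # run only the ruff hook across the whole repo
```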
1
TheAlgorithms/Python
8,551
New gitter link added or replaced
### Describe your change: Fixes #8546 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
KaixLina
"2023-03-26T04:26:40Z"
"2023-03-26T15:19:19Z"
3f9150c1b2dd15808a4962e03a1455f8d825512c
7cdb011ba440a07768179bfaea190bddefc890d8
New gitter link added or replaced. ### Describe your change: Fixes #8546 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
# Contributing guidelines ## Before contributing Welcome to [TheAlgorithms/Python](https://github.com/TheAlgorithms/Python)! Before sending your pull requests, make sure that you __read the whole guidelines__. If you have any doubt on the contributing guide, please feel free to [state it clearly in an issue](https://github.com/TheAlgorithms/Python/issues/new) or ask the community in [Gitter](https://gitter.im/TheAlgorithms). ## Contributing ### Contributor We are very happy that you are considering implementing algorithms and data structures for others! This repository is referenced and used by learners from all over the globe. Being one of our contributors, you agree and confirm that: - You did your work - no plagiarism allowed - Any plagiarized work will not be merged. - Your work will be distributed under [MIT License](LICENSE.md) once your pull request is merged - Your submitted work fulfils or mostly fulfils our styles and standards __New implementation__ is welcome! For example, new solutions for a problem, different representations for a graph data structure or algorithm designs with different complexity but __identical implementation__ of an existing implementation is not allowed. Please check whether the solution is already implemented or not before submitting your pull request. __Improving comments__ and __writing proper tests__ are also highly welcome. ### Contribution We appreciate any contribution, from fixing a grammar mistake in a comment to implementing complex algorithms. Please read this section if you are contributing your work. Your contribution will be tested by our [automated testing on GitHub Actions](https://github.com/TheAlgorithms/Python/actions) to save time and mental energy. After you have submitted your pull request, you should see the GitHub Actions tests start to run at the bottom of your submission page. If those tests fail, then click on the ___details___ button try to read through the GitHub Actions output to understand the failure. If you do not understand, please leave a comment on your submission page and a community member will try to help. Please help us keep our issue list small by adding fixes: #{$ISSUE_NO} to the commit message of pull requests that resolve open issues. GitHub will use this tag to auto-close the issue when the PR is merged. #### What is an Algorithm? An Algorithm is one or more functions (or classes) that: * take one or more inputs, * perform some internal calculations or data manipulations, * return one or more outputs, * have minimal side effects (Ex. `print()`, `plot()`, `read()`, `write()`). Algorithms should be packaged in a way that would make it easy for readers to put them into larger programs. Algorithms should: * have intuitive class and function names that make their purpose clear to readers * use Python naming conventions and intuitive variable names to ease comprehension * be flexible to take different input values * have Python type hints for their input parameters and return values * raise Python exceptions (`ValueError`, etc.) on erroneous input values * have docstrings with clear explanations and/or URLs to source materials * contain doctests that test both valid and erroneous input values * return all calculation results instead of printing or plotting them Algorithms in this repo should not be how-to examples for existing Python packages. Instead, they should perform internal calculations or manipulations to convert input values into different output values. 
Those calculations or manipulations can use data types, classes, or functions of existing Python packages but each algorithm in this repo should add unique value. #### Pre-commit plugin Use [pre-commit](https://pre-commit.com/#installation) to automatically format your code to match our coding style: ```bash python3 -m pip install pre-commit # only required the first time pre-commit install ``` That's it! The plugin will run every time you commit any changes. If there are any errors found during the run, fix them and commit those changes. You can even run the plugin manually on all files: ```bash pre-commit run --all-files --show-diff-on-failure ``` #### Coding Style We want your work to be readable by others; therefore, we encourage you to note the following: - Please write in Python 3.11+. For instance: `print()` is a function in Python 3 so `print "Hello"` will *not* work but `print("Hello")` will. - Please focus hard on the naming of functions, classes, and variables. Help your reader by using __descriptive names__ that can help you to remove redundant comments. - Single letter variable names are *old school* so please avoid them unless their life only spans a few lines. - Expand acronyms because `gcd()` is hard to understand but `greatest_common_divisor()` is not. - Please follow the [Python Naming Conventions](https://pep8.org/#prescriptive-naming-conventions) so variable_names and function_names should be lower_case, CONSTANTS in UPPERCASE, ClassNames should be CamelCase, etc. - We encourage the use of Python [f-strings](https://realpython.com/python-f-strings/#f-strings-a-new-and-improved-way-to-format-strings-in-python) where they make the code easier to read. - Please consider running [__psf/black__](https://github.com/python/black) on your Python file(s) before submitting your pull request. This is not yet a requirement but it does make your code more readable and automatically aligns it with much of [PEP 8](https://www.python.org/dev/peps/pep-0008/). There are other code formatters (autopep8, yapf) but the __black__ formatter is now hosted by the Python Software Foundation. To use it, ```bash python3 -m pip install black # only required the first time black . ``` - All submissions will need to pass the test `ruff .` before they will be accepted so if possible, try this test locally on your Python file(s) before submitting your pull request. ```bash python3 -m pip install ruff # only required the first time ruff . ``` - Original code submission require docstrings or comments to describe your work. - More on docstrings and comments: If you used a Wikipedia article or some other source material to create your algorithm, please add the URL in a docstring or comment to help your reader. The following are considered to be bad and may be requested to be improved: ```python x = x + 2 # increased by 2 ``` This is too trivial. Comments are expected to be explanatory. For comments, you can write them above, on or below a line of code, as long as you are consistent within the same piece of code. We encourage you to put docstrings inside your functions but please pay attention to the indentation of docstrings. The following is a good example: ```python def sum_ab(a, b): """ Return the sum of two integers a and b. """ return a + b ``` - Write tests (especially [__doctests__](https://docs.python.org/3/library/doctest.html)) to illustrate and verify your work. We highly encourage the use of _doctests on all functions_. 
```python def sum_ab(a, b): """ Return the sum of two integers a and b >>> sum_ab(2, 2) 4 >>> sum_ab(-2, 3) 1 >>> sum_ab(4.9, 5.1) 10.0 """ return a + b ``` These doctests will be run by pytest as part of our automated testing so please try to run your doctests locally and make sure that they are found and pass: ```bash python3 -m doctest -v my_submission.py ``` The use of the Python builtin `input()` function is __not__ encouraged: ```python input('Enter your input:') # Or even worse... input = eval(input("Enter your input: ")) ``` However, if your code uses `input()` then we encourage you to gracefully deal with leading and trailing whitespace in user input by adding `.strip()` as in: ```python starting_value = int(input("Please enter a starting value: ").strip()) ``` The use of [Python type hints](https://docs.python.org/3/library/typing.html) is encouraged for function parameters and return values. Our automated testing will run [mypy](http://mypy-lang.org) so run that locally before making your submission. ```python def sum_ab(a: int, b: int) -> int: return a + b ``` Instructions on how to install mypy can be found [here](https://github.com/python/mypy). Please use the command `mypy --ignore-missing-imports .` to test all files or `mypy --ignore-missing-imports path/to/file.py` to test a specific file. - [__List comprehensions and generators__](https://docs.python.org/3/tutorial/datastructures.html#list-comprehensions) are preferred over the use of `lambda`, `map`, `filter`, `reduce` but the important thing is to demonstrate the power of Python in code that is easy to read and maintain. - Avoid importing external libraries for basic algorithms. Only use those libraries for complicated algorithms. - If you need a third-party module that is not in the file __requirements.txt__, please add it to that file as part of your submission. #### Other Requirements for Submissions - If you are submitting code in the `project_euler/` directory, please also read [the dedicated Guideline](https://github.com/TheAlgorithms/Python/blob/master/project_euler/README.md) before contributing to our Project Euler library. - The file extension for code files should be `.py`. Jupyter Notebooks should be submitted to [TheAlgorithms/Jupyter](https://github.com/TheAlgorithms/Jupyter). - Strictly use snake_case (underscore_separated) in your file_name, as it will be easy to parse in future using scripts. - Please avoid creating new directories if at all possible. Try to fit your work into the existing directory structure. - If possible, follow the standard *within* the folder you are submitting to. - If you have modified/added code work, make sure the code compiles before submitting. - If you have modified/added documentation work, ensure your language is concise and contains no grammar errors. - Do not update the README.md or DIRECTORY.md file which will be periodically autogenerated by our GitHub Actions processes. - Add a corresponding explanation to [Algorithms-Explanation](https://github.com/TheAlgorithms/Algorithms-Explanation) (Optional but recommended). - All submissions will be tested with [__mypy__](http://www.mypy-lang.org) so we encourage you to add [__Python type hints__](https://docs.python.org/3/library/typing.html) where it makes sense to do so. - Most importantly, - __Be consistent in the use of these guidelines when submitting.__ - __Join__ us on [Discord](https://discord.com/invite/c7MnfGFGa6) and [Gitter](https://gitter.im/TheAlgorithms) __now!__ - Happy coding! 
Writer [@poyea](https://github.com/poyea), Jun 2019.
# Contributing guidelines ## Before contributing Welcome to [TheAlgorithms/Python](https://github.com/TheAlgorithms/Python)! Before sending your pull requests, make sure that you __read the whole guidelines__. If you have any doubt on the contributing guide, please feel free to [state it clearly in an issue](https://github.com/TheAlgorithms/Python/issues/new) or ask the community in [Gitter](https://app.gitter.im/#/room/#TheAlgorithms_community:gitter.im). ## Contributing ### Contributor We are very happy that you are considering implementing algorithms and data structures for others! This repository is referenced and used by learners from all over the globe. Being one of our contributors, you agree and confirm that: - You did your work - no plagiarism allowed - Any plagiarized work will not be merged. - Your work will be distributed under [MIT License](LICENSE.md) once your pull request is merged - Your submitted work fulfils or mostly fulfils our styles and standards __New implementation__ is welcome! For example, new solutions for a problem, different representations for a graph data structure or algorithm designs with different complexity but __identical implementation__ of an existing implementation is not allowed. Please check whether the solution is already implemented or not before submitting your pull request. __Improving comments__ and __writing proper tests__ are also highly welcome. ### Contribution We appreciate any contribution, from fixing a grammar mistake in a comment to implementing complex algorithms. Please read this section if you are contributing your work. Your contribution will be tested by our [automated testing on GitHub Actions](https://github.com/TheAlgorithms/Python/actions) to save time and mental energy. After you have submitted your pull request, you should see the GitHub Actions tests start to run at the bottom of your submission page. If those tests fail, then click on the ___details___ button try to read through the GitHub Actions output to understand the failure. If you do not understand, please leave a comment on your submission page and a community member will try to help. Please help us keep our issue list small by adding fixes: #{$ISSUE_NO} to the commit message of pull requests that resolve open issues. GitHub will use this tag to auto-close the issue when the PR is merged. #### What is an Algorithm? An Algorithm is one or more functions (or classes) that: * take one or more inputs, * perform some internal calculations or data manipulations, * return one or more outputs, * have minimal side effects (Ex. `print()`, `plot()`, `read()`, `write()`). Algorithms should be packaged in a way that would make it easy for readers to put them into larger programs. Algorithms should: * have intuitive class and function names that make their purpose clear to readers * use Python naming conventions and intuitive variable names to ease comprehension * be flexible to take different input values * have Python type hints for their input parameters and return values * raise Python exceptions (`ValueError`, etc.) on erroneous input values * have docstrings with clear explanations and/or URLs to source materials * contain doctests that test both valid and erroneous input values * return all calculation results instead of printing or plotting them Algorithms in this repo should not be how-to examples for existing Python packages. Instead, they should perform internal calculations or manipulations to convert input values into different output values. 
Those calculations or manipulations can use data types, classes, or functions of existing Python packages but each algorithm in this repo should add unique value. #### Pre-commit plugin Use [pre-commit](https://pre-commit.com/#installation) to automatically format your code to match our coding style: ```bash python3 -m pip install pre-commit # only required the first time pre-commit install ``` That's it! The plugin will run every time you commit any changes. If there are any errors found during the run, fix them and commit those changes. You can even run the plugin manually on all files: ```bash pre-commit run --all-files --show-diff-on-failure ``` #### Coding Style We want your work to be readable by others; therefore, we encourage you to note the following: - Please write in Python 3.11+. For instance: `print()` is a function in Python 3 so `print "Hello"` will *not* work but `print("Hello")` will. - Please focus hard on the naming of functions, classes, and variables. Help your reader by using __descriptive names__ that can help you to remove redundant comments. - Single letter variable names are *old school* so please avoid them unless their life only spans a few lines. - Expand acronyms because `gcd()` is hard to understand but `greatest_common_divisor()` is not. - Please follow the [Python Naming Conventions](https://pep8.org/#prescriptive-naming-conventions) so variable_names and function_names should be lower_case, CONSTANTS in UPPERCASE, ClassNames should be CamelCase, etc. - We encourage the use of Python [f-strings](https://realpython.com/python-f-strings/#f-strings-a-new-and-improved-way-to-format-strings-in-python) where they make the code easier to read. - Please consider running [__psf/black__](https://github.com/python/black) on your Python file(s) before submitting your pull request. This is not yet a requirement but it does make your code more readable and automatically aligns it with much of [PEP 8](https://www.python.org/dev/peps/pep-0008/). There are other code formatters (autopep8, yapf) but the __black__ formatter is now hosted by the Python Software Foundation. To use it, ```bash python3 -m pip install black # only required the first time black . ``` - All submissions will need to pass the test `ruff .` before they will be accepted so if possible, try this test locally on your Python file(s) before submitting your pull request. ```bash python3 -m pip install ruff # only required the first time ruff . ``` - Original code submission require docstrings or comments to describe your work. - More on docstrings and comments: If you used a Wikipedia article or some other source material to create your algorithm, please add the URL in a docstring or comment to help your reader. The following are considered to be bad and may be requested to be improved: ```python x = x + 2 # increased by 2 ``` This is too trivial. Comments are expected to be explanatory. For comments, you can write them above, on or below a line of code, as long as you are consistent within the same piece of code. We encourage you to put docstrings inside your functions but please pay attention to the indentation of docstrings. The following is a good example: ```python def sum_ab(a, b): """ Return the sum of two integers a and b. """ return a + b ``` - Write tests (especially [__doctests__](https://docs.python.org/3/library/doctest.html)) to illustrate and verify your work. We highly encourage the use of _doctests on all functions_. 
```python def sum_ab(a, b): """ Return the sum of two integers a and b >>> sum_ab(2, 2) 4 >>> sum_ab(-2, 3) 1 >>> sum_ab(4.9, 5.1) 10.0 """ return a + b ``` These doctests will be run by pytest as part of our automated testing so please try to run your doctests locally and make sure that they are found and pass: ```bash python3 -m doctest -v my_submission.py ``` The use of the Python builtin `input()` function is __not__ encouraged: ```python input('Enter your input:') # Or even worse... input = eval(input("Enter your input: ")) ``` However, if your code uses `input()` then we encourage you to gracefully deal with leading and trailing whitespace in user input by adding `.strip()` as in: ```python starting_value = int(input("Please enter a starting value: ").strip()) ``` The use of [Python type hints](https://docs.python.org/3/library/typing.html) is encouraged for function parameters and return values. Our automated testing will run [mypy](http://mypy-lang.org) so run that locally before making your submission. ```python def sum_ab(a: int, b: int) -> int: return a + b ``` Instructions on how to install mypy can be found [here](https://github.com/python/mypy). Please use the command `mypy --ignore-missing-imports .` to test all files or `mypy --ignore-missing-imports path/to/file.py` to test a specific file. - [__List comprehensions and generators__](https://docs.python.org/3/tutorial/datastructures.html#list-comprehensions) are preferred over the use of `lambda`, `map`, `filter`, `reduce` but the important thing is to demonstrate the power of Python in code that is easy to read and maintain. - Avoid importing external libraries for basic algorithms. Only use those libraries for complicated algorithms. - If you need a third-party module that is not in the file __requirements.txt__, please add it to that file as part of your submission. #### Other Requirements for Submissions - If you are submitting code in the `project_euler/` directory, please also read [the dedicated Guideline](https://github.com/TheAlgorithms/Python/blob/master/project_euler/README.md) before contributing to our Project Euler library. - The file extension for code files should be `.py`. Jupyter Notebooks should be submitted to [TheAlgorithms/Jupyter](https://github.com/TheAlgorithms/Jupyter). - Strictly use snake_case (underscore_separated) in your file_name, as it will be easy to parse in future using scripts. - Please avoid creating new directories if at all possible. Try to fit your work into the existing directory structure. - If possible, follow the standard *within* the folder you are submitting to. - If you have modified/added code work, make sure the code compiles before submitting. - If you have modified/added documentation work, ensure your language is concise and contains no grammar errors. - Do not update the README.md or DIRECTORY.md file which will be periodically autogenerated by our GitHub Actions processes. - Add a corresponding explanation to [Algorithms-Explanation](https://github.com/TheAlgorithms/Algorithms-Explanation) (Optional but recommended). - All submissions will be tested with [__mypy__](http://www.mypy-lang.org) so we encourage you to add [__Python type hints__](https://docs.python.org/3/library/typing.html) where it makes sense to do so. - Most importantly, - __Be consistent in the use of these guidelines when submitting.__ - __Join__ us on [Discord](https://discord.com/invite/c7MnfGFGa6) and [Gitter](https://app.gitter.im/#/room/#TheAlgorithms_community:gitter.im) __now!__ - Happy coding! 
Writer [@poyea](https://github.com/poyea), Jun 2019.
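The guidelines above ask for descriptive names, type hints, docstrings with source URLs, `ValueError` on erroneous input, and doctests covering both valid and invalid values. A minimal sketch of a submission that ticks those boxes (a hypothetical example, not taken from CONTRIBUTING.md):

```python
def factorial(number: int) -> int:
    """
    Return number! -- https://en.wikipedia.org/wiki/Factorial

    >>> factorial(5)
    120
    >>> factorial(0)
    1
    >>> factorial(-1)
    Traceback (most recent call last):
        ...
    ValueError: factorial() not defined for negative values
    """
    if number < 0:
        raise ValueError("factorial() not defined for negative values")
    result = 1
    for factor in range(2, number + 1):
        result *= factor
    return result
```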
1
TheAlgorithms/Python
8,551
New gitter link added or replaced
### Describe your change: Fixes #8546 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
KaixLina
"2023-03-26T04:26:40Z"
"2023-03-26T15:19:19Z"
3f9150c1b2dd15808a4962e03a1455f8d825512c
7cdb011ba440a07768179bfaea190bddefc890d8
New gitter link added or replaced. ### Describe your change: Fixes #8546 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
<div align="center"> <!-- Title: --> <a href="https://github.com/TheAlgorithms/"> <img src="https://raw.githubusercontent.com/TheAlgorithms/website/1cd824df116b27029f17c2d1b42d81731f28a920/public/logo.svg" height="100"> </a> <h1><a href="https://github.com/TheAlgorithms/">The Algorithms</a> - Python</h1> <!-- Labels: --> <!-- First row: --> <a href="https://gitpod.io/#https://github.com/TheAlgorithms/Python"> <img src="https://img.shields.io/badge/Gitpod-Ready--to--Code-blue?logo=gitpod&style=flat-square" height="20" alt="Gitpod Ready-to-Code"> </a> <a href="https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md"> <img src="https://img.shields.io/static/v1.svg?label=Contributions&message=Welcome&color=0059b3&style=flat-square" height="20" alt="Contributions Welcome"> </a> <img src="https://img.shields.io/github/repo-size/TheAlgorithms/Python.svg?label=Repo%20size&style=flat-square" height="20"> <a href="https://discord.gg/c7MnfGFGa6"> <img src="https://img.shields.io/discord/808045925556682782.svg?logo=discord&colorB=7289DA&style=flat-square" height="20" alt="Discord chat"> </a> <a href="https://gitter.im/TheAlgorithms"> <img src="https://img.shields.io/badge/Chat-Gitter-ff69b4.svg?label=Chat&logo=gitter&style=flat-square" height="20" alt="Gitter chat"> </a> <!-- Second row: --> <br> <a href="https://github.com/TheAlgorithms/Python/actions"> <img src="https://img.shields.io/github/actions/workflow/status/TheAlgorithms/Python/build.yml?branch=master&label=CI&logo=github&style=flat-square" height="20" alt="GitHub Workflow Status"> </a> <a href="https://github.com/pre-commit/pre-commit"> <img src="https://img.shields.io/badge/pre--commit-enabled-brightgreen?logo=pre-commit&logoColor=white&style=flat-square" height="20" alt="pre-commit"> </a> <a href="https://github.com/psf/black"> <img src="https://img.shields.io/static/v1?label=code%20style&message=black&color=black&style=flat-square" height="20" alt="code style: black"> </a> <!-- Short description: --> <h3>All algorithms implemented in Python - for education</h3> </div> Implementations are for learning purposes only. They may be less efficient than the implementations in the Python standard library. Use them at your discretion. ## Getting Started Read through our [Contribution Guidelines](CONTRIBUTING.md) before you contribute. ## Community Channels We are on [Discord](https://discord.gg/c7MnfGFGa6) and [Gitter](https://gitter.im/TheAlgorithms)! Community channels are a great way for you to ask questions and get help. Please join us! ## List of Algorithms See our [directory](DIRECTORY.md) for easier navigation and a better overview of the project.
<div align="center"> <!-- Title: --> <a href="https://github.com/TheAlgorithms/"> <img src="https://raw.githubusercontent.com/TheAlgorithms/website/1cd824df116b27029f17c2d1b42d81731f28a920/public/logo.svg" height="100"> </a> <h1><a href="https://github.com/TheAlgorithms/">The Algorithms</a> - Python</h1> <!-- Labels: --> <!-- First row: --> <a href="https://gitpod.io/#https://github.com/TheAlgorithms/Python"> <img src="https://img.shields.io/badge/Gitpod-Ready--to--Code-blue?logo=gitpod&style=flat-square" height="20" alt="Gitpod Ready-to-Code"> </a> <a href="https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md"> <img src="https://img.shields.io/static/v1.svg?label=Contributions&message=Welcome&color=0059b3&style=flat-square" height="20" alt="Contributions Welcome"> </a> <img src="https://img.shields.io/github/repo-size/TheAlgorithms/Python.svg?label=Repo%20size&style=flat-square" height="20"> <a href="https://discord.gg/c7MnfGFGa6"> <img src="https://img.shields.io/discord/808045925556682782.svg?logo=discord&colorB=7289DA&style=flat-square" height="20" alt="Discord chat"> </a> <a href="https://app.gitter.im/#/room/#TheAlgorithms_community:gitter.im"> <img src="https://img.shields.io/badge/Chat-Gitter-ff69b4.svg?label=Chat&logo=gitter&style=flat-square" height="20" alt="Gitter chat"> </a> <!-- Second row: --> <br> <a href="https://github.com/TheAlgorithms/Python/actions"> <img src="https://img.shields.io/github/actions/workflow/status/TheAlgorithms/Python/build.yml?branch=master&label=CI&logo=github&style=flat-square" height="20" alt="GitHub Workflow Status"> </a> <a href="https://github.com/pre-commit/pre-commit"> <img src="https://img.shields.io/badge/pre--commit-enabled-brightgreen?logo=pre-commit&logoColor=white&style=flat-square" height="20" alt="pre-commit"> </a> <a href="https://github.com/psf/black"> <img src="https://img.shields.io/static/v1?label=code%20style&message=black&color=black&style=flat-square" height="20" alt="code style: black"> </a> <!-- Short description: --> <h3>All algorithms implemented in Python - for education</h3> </div> Implementations are for learning purposes only. They may be less efficient than the implementations in the Python standard library. Use them at your discretion. ## Getting Started Read through our [Contribution Guidelines](CONTRIBUTING.md) before you contribute. ## Community Channels We are on [Discord](https://discord.gg/c7MnfGFGa6) and [Gitter](https://app.gitter.im/#/room/#TheAlgorithms_community:gitter.im)! Community channels are a great way for you to ask questions and get help. Please join us! ## List of Algorithms See our [directory](DIRECTORY.md) for easier navigation and a better overview of the project.
1
TheAlgorithms/Python
8,551
New gitter link added or replaced
### Describe your change: Fixes #8546 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
KaixLina
"2023-03-26T04:26:40Z"
"2023-03-26T15:19:19Z"
3f9150c1b2dd15808a4962e03a1455f8d825512c
7cdb011ba440a07768179bfaea190bddefc890d8
New gitter link added or replaced. ### Describe your change: Fixes #8546 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
""" Demonstrates implementation of SHA1 Hash function in a Python class and gives utilities to find hash of string or hash of text from a file. Usage: python sha1.py --string "Hello World!!" python sha1.py --file "hello_world.txt" When run without any arguments, it prints the hash of the string "Hello World!! Welcome to Cryptography" Also contains a Test class to verify that the generated Hash is same as that returned by the hashlib library SHA1 hash or SHA1 sum of a string is a cryptographic function which means it is easy to calculate forwards but extremely difficult to calculate backwards. What this means is, you can easily calculate the hash of a string, but it is extremely difficult to know the original string if you have its hash. This property is useful to communicate securely, send encrypted messages and is very useful in payment systems, blockchain and cryptocurrency etc. The Algorithm as described in the reference: First we start with a message. The message is padded and the length of the message is added to the end. It is then split into blocks of 512 bits or 64 bytes. The blocks are then processed one at a time. Each block must be expanded and compressed. The value after each compression is added to a 160bit buffer called the current hash state. After the last block is processed the current hash state is returned as the final hash. Reference: https://deadhacker.com/2006/02/21/sha-1-illustrated/ """ import argparse import hashlib # hashlib is only used inside the Test class import struct class SHA1Hash: """ Class to contain the entire pipeline for SHA1 Hashing Algorithm >>> SHA1Hash(bytes('Allan', 'utf-8')).final_hash() '872af2d8ac3d8695387e7c804bf0e02c18df9e6e' """ def __init__(self, data): """ Inititates the variables data and h. h is a list of 5 8-digit Hexadecimal numbers corresponding to (1732584193, 4023233417, 2562383102, 271733878, 3285377520) respectively. We will start with this as a message digest. 0x is how you write Hexadecimal numbers in Python """ self.data = data self.h = [0x67452301, 0xEFCDAB89, 0x98BADCFE, 0x10325476, 0xC3D2E1F0] @staticmethod def rotate(n, b): """ Static method to be used inside other methods. Left rotates n by b. >>> SHA1Hash('').rotate(12,2) 48 """ return ((n << b) | (n >> (32 - b))) & 0xFFFFFFFF def padding(self): """ Pads the input message with zeros so that padded_data has 64 bytes or 512 bits """ padding = b"\x80" + b"\x00" * (63 - (len(self.data) + 8) % 64) padded_data = self.data + padding + struct.pack(">Q", 8 * len(self.data)) return padded_data def split_blocks(self): """ Returns a list of bytestrings each of length 64 """ return [ self.padded_data[i : i + 64] for i in range(0, len(self.padded_data), 64) ] # @staticmethod def expand_block(self, block): """ Takes a bytestring-block of length 64, unpacks it to a list of integers and returns a list of 80 integers after some bit operations """ w = list(struct.unpack(">16L", block)) + [0] * 64 for i in range(16, 80): w[i] = self.rotate((w[i - 3] ^ w[i - 8] ^ w[i - 14] ^ w[i - 16]), 1) return w def final_hash(self): """ Calls all the other methods to process the input. Pads the data, then splits into blocks and then does a series of operations for each block (including expansion). For each block, the variable h that was initialized is copied to a,b,c,d,e and these 5 variables a,b,c,d,e undergo several changes. After all the blocks are processed, these 5 variables are pairwise added to h ie a to h[0], b to h[1] and so on. This h becomes our final hash which is returned. 
""" self.padded_data = self.padding() self.blocks = self.split_blocks() for block in self.blocks: expanded_block = self.expand_block(block) a, b, c, d, e = self.h for i in range(0, 80): if 0 <= i < 20: f = (b & c) | ((~b) & d) k = 0x5A827999 elif 20 <= i < 40: f = b ^ c ^ d k = 0x6ED9EBA1 elif 40 <= i < 60: f = (b & c) | (b & d) | (c & d) k = 0x8F1BBCDC elif 60 <= i < 80: f = b ^ c ^ d k = 0xCA62C1D6 a, b, c, d, e = ( self.rotate(a, 5) + f + e + k + expanded_block[i] & 0xFFFFFFFF, a, self.rotate(b, 30), c, d, ) self.h = ( self.h[0] + a & 0xFFFFFFFF, self.h[1] + b & 0xFFFFFFFF, self.h[2] + c & 0xFFFFFFFF, self.h[3] + d & 0xFFFFFFFF, self.h[4] + e & 0xFFFFFFFF, ) return "%08x%08x%08x%08x%08x" % tuple(self.h) def test_sha1_hash(): msg = b"Test String" assert SHA1Hash(msg).final_hash() == hashlib.sha1(msg).hexdigest() # noqa: S324 def main(): """ Provides option 'string' or 'file' to take input and prints the calculated SHA1 hash. unittest.main() has been commented because we probably don't want to run the test each time. """ # unittest.main() parser = argparse.ArgumentParser(description="Process some strings or files") parser.add_argument( "--string", dest="input_string", default="Hello World!! Welcome to Cryptography", help="Hash the string", ) parser.add_argument("--file", dest="input_file", help="Hash contents of a file") args = parser.parse_args() input_string = args.input_string # In any case hash input should be a bytestring if args.input_file: with open(args.input_file, "rb") as f: hash_input = f.read() else: hash_input = bytes(input_string, "utf-8") print(SHA1Hash(hash_input).final_hash()) if __name__ == "__main__": main() import doctest doctest.testmod()
""" Demonstrates implementation of SHA1 Hash function in a Python class and gives utilities to find hash of string or hash of text from a file. Usage: python sha1.py --string "Hello World!!" python sha1.py --file "hello_world.txt" When run without any arguments, it prints the hash of the string "Hello World!! Welcome to Cryptography" Also contains a Test class to verify that the generated Hash is same as that returned by the hashlib library SHA1 hash or SHA1 sum of a string is a cryptographic function which means it is easy to calculate forwards but extremely difficult to calculate backwards. What this means is, you can easily calculate the hash of a string, but it is extremely difficult to know the original string if you have its hash. This property is useful to communicate securely, send encrypted messages and is very useful in payment systems, blockchain and cryptocurrency etc. The Algorithm as described in the reference: First we start with a message. The message is padded and the length of the message is added to the end. It is then split into blocks of 512 bits or 64 bytes. The blocks are then processed one at a time. Each block must be expanded and compressed. The value after each compression is added to a 160bit buffer called the current hash state. After the last block is processed the current hash state is returned as the final hash. Reference: https://deadhacker.com/2006/02/21/sha-1-illustrated/ """ import argparse import hashlib # hashlib is only used inside the Test class import struct class SHA1Hash: """ Class to contain the entire pipeline for SHA1 Hashing Algorithm >>> SHA1Hash(bytes('Allan', 'utf-8')).final_hash() '872af2d8ac3d8695387e7c804bf0e02c18df9e6e' """ def __init__(self, data): """ Inititates the variables data and h. h is a list of 5 8-digit Hexadecimal numbers corresponding to (1732584193, 4023233417, 2562383102, 271733878, 3285377520) respectively. We will start with this as a message digest. 0x is how you write Hexadecimal numbers in Python """ self.data = data self.h = [0x67452301, 0xEFCDAB89, 0x98BADCFE, 0x10325476, 0xC3D2E1F0] @staticmethod def rotate(n, b): """ Static method to be used inside other methods. Left rotates n by b. >>> SHA1Hash('').rotate(12,2) 48 """ return ((n << b) | (n >> (32 - b))) & 0xFFFFFFFF def padding(self): """ Pads the input message with zeros so that padded_data has 64 bytes or 512 bits """ padding = b"\x80" + b"\x00" * (63 - (len(self.data) + 8) % 64) padded_data = self.data + padding + struct.pack(">Q", 8 * len(self.data)) return padded_data def split_blocks(self): """ Returns a list of bytestrings each of length 64 """ return [ self.padded_data[i : i + 64] for i in range(0, len(self.padded_data), 64) ] # @staticmethod def expand_block(self, block): """ Takes a bytestring-block of length 64, unpacks it to a list of integers and returns a list of 80 integers after some bit operations """ w = list(struct.unpack(">16L", block)) + [0] * 64 for i in range(16, 80): w[i] = self.rotate((w[i - 3] ^ w[i - 8] ^ w[i - 14] ^ w[i - 16]), 1) return w def final_hash(self): """ Calls all the other methods to process the input. Pads the data, then splits into blocks and then does a series of operations for each block (including expansion). For each block, the variable h that was initialized is copied to a,b,c,d,e and these 5 variables a,b,c,d,e undergo several changes. After all the blocks are processed, these 5 variables are pairwise added to h ie a to h[0], b to h[1] and so on. This h becomes our final hash which is returned. 
""" self.padded_data = self.padding() self.blocks = self.split_blocks() for block in self.blocks: expanded_block = self.expand_block(block) a, b, c, d, e = self.h for i in range(0, 80): if 0 <= i < 20: f = (b & c) | ((~b) & d) k = 0x5A827999 elif 20 <= i < 40: f = b ^ c ^ d k = 0x6ED9EBA1 elif 40 <= i < 60: f = (b & c) | (b & d) | (c & d) k = 0x8F1BBCDC elif 60 <= i < 80: f = b ^ c ^ d k = 0xCA62C1D6 a, b, c, d, e = ( self.rotate(a, 5) + f + e + k + expanded_block[i] & 0xFFFFFFFF, a, self.rotate(b, 30), c, d, ) self.h = ( self.h[0] + a & 0xFFFFFFFF, self.h[1] + b & 0xFFFFFFFF, self.h[2] + c & 0xFFFFFFFF, self.h[3] + d & 0xFFFFFFFF, self.h[4] + e & 0xFFFFFFFF, ) return ("{:08x}" * 5).format(*self.h) def test_sha1_hash(): msg = b"Test String" assert SHA1Hash(msg).final_hash() == hashlib.sha1(msg).hexdigest() # noqa: S324 def main(): """ Provides option 'string' or 'file' to take input and prints the calculated SHA1 hash. unittest.main() has been commented because we probably don't want to run the test each time. """ # unittest.main() parser = argparse.ArgumentParser(description="Process some strings or files") parser.add_argument( "--string", dest="input_string", default="Hello World!! Welcome to Cryptography", help="Hash the string", ) parser.add_argument("--file", dest="input_file", help="Hash contents of a file") args = parser.parse_args() input_string = args.input_string # In any case hash input should be a bytestring if args.input_file: with open(args.input_file, "rb") as f: hash_input = f.read() else: hash_input = bytes(input_string, "utf-8") print(SHA1Hash(hash_input).final_hash()) if __name__ == "__main__": main() import doctest doctest.testmod()
1
TheAlgorithms/Python
8,551
New gitter link added or replaced
### Describe your change: Fixes #8546 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
KaixLina
"2023-03-26T04:26:40Z"
"2023-03-26T15:19:19Z"
3f9150c1b2dd15808a4962e03a1455f8d825512c
7cdb011ba440a07768179bfaea190bddefc890d8
New gitter link added or replaced. ### Describe your change: Fixes #8546 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
""" Implementation of sequential minimal optimization (SMO) for support vector machines (SVM). Sequential minimal optimization (SMO) is an algorithm for solving the quadratic programming (QP) problem that arises during the training of support vector machines. It was invented by John Platt in 1998. Input: 0: type: numpy.ndarray. 1: first column of ndarray must be tags of samples, must be 1 or -1. 2: rows of ndarray represent samples. Usage: Command: python3 sequential_minimum_optimization.py Code: from sequential_minimum_optimization import SmoSVM, Kernel kernel = Kernel(kernel='poly', degree=3., coef0=1., gamma=0.5) init_alphas = np.zeros(train.shape[0]) SVM = SmoSVM(train=train, alpha_list=init_alphas, kernel_func=kernel, cost=0.4, b=0.0, tolerance=0.001) SVM.fit() predict = SVM.predict(test_samples) Reference: https://www.microsoft.com/en-us/research/wp-content/uploads/2016/02/smo-book.pdf https://www.microsoft.com/en-us/research/wp-content/uploads/2016/02/tr-98-14.pdf """ import os import sys import urllib.request import numpy as np import pandas as pd from matplotlib import pyplot as plt from sklearn.datasets import make_blobs, make_circles from sklearn.preprocessing import StandardScaler CANCER_DATASET_URL = ( "https://archive.ics.uci.edu/ml/machine-learning-databases/" "breast-cancer-wisconsin/wdbc.data" ) class SmoSVM: def __init__( self, train, kernel_func, alpha_list=None, cost=0.4, b=0.0, tolerance=0.001, auto_norm=True, ): self._init = True self._auto_norm = auto_norm self._c = np.float64(cost) self._b = np.float64(b) self._tol = np.float64(tolerance) if tolerance > 0.0001 else np.float64(0.001) self.tags = train[:, 0] self.samples = self._norm(train[:, 1:]) if self._auto_norm else train[:, 1:] self.alphas = alpha_list if alpha_list is not None else np.zeros(train.shape[0]) self.Kernel = kernel_func self._eps = 0.001 self._all_samples = list(range(self.length)) self._K_matrix = self._calculate_k_matrix() self._error = np.zeros(self.length) self._unbound = [] self.choose_alpha = self._choose_alphas() # Calculate alphas using SMO algorithm def fit(self): k = self._k state = None while True: # 1: Find alpha1, alpha2 try: i1, i2 = self.choose_alpha.send(state) state = None except StopIteration: print("Optimization done!\nEvery sample satisfy the KKT condition!") break # 2: calculate new alpha2 and new alpha1 y1, y2 = self.tags[i1], self.tags[i2] a1, a2 = self.alphas[i1].copy(), self.alphas[i2].copy() e1, e2 = self._e(i1), self._e(i2) args = (i1, i2, a1, a2, e1, e2, y1, y2) a1_new, a2_new = self._get_new_alpha(*args) if not a1_new and not a2_new: state = False continue self.alphas[i1], self.alphas[i2] = a1_new, a2_new # 3: update threshold(b) b1_new = np.float64( -e1 - y1 * k(i1, i1) * (a1_new - a1) - y2 * k(i2, i1) * (a2_new - a2) + self._b ) b2_new = np.float64( -e2 - y2 * k(i2, i2) * (a2_new - a2) - y1 * k(i1, i2) * (a1_new - a1) + self._b ) if 0.0 < a1_new < self._c: b = b1_new if 0.0 < a2_new < self._c: b = b2_new if not (np.float64(0) < a2_new < self._c) and not ( np.float64(0) < a1_new < self._c ): b = (b1_new + b2_new) / 2.0 b_old = self._b self._b = b # 4: update error value,here we only calculate those non-bound samples' # error self._unbound = [i for i in self._all_samples if self._is_unbound(i)] for s in self.unbound: if s in (i1, i2): continue self._error[s] += ( y1 * (a1_new - a1) * k(i1, s) + y2 * (a2_new - a2) * k(i2, s) + (self._b - b_old) ) # if i1 or i2 is non-bound,update there error value to zero if self._is_unbound(i1): self._error[i1] = 0 if 
self._is_unbound(i2): self._error[i2] = 0 # Predict test samples def predict(self, test_samples, classify=True): if test_samples.shape[1] > self.samples.shape[1]: raise ValueError( "Test samples' feature length does not equal to that of train samples" ) if self._auto_norm: test_samples = self._norm(test_samples) results = [] for test_sample in test_samples: result = self._predict(test_sample) if classify: results.append(1 if result > 0 else -1) else: results.append(result) return np.array(results) # Check if alpha violate KKT condition def _check_obey_kkt(self, index): alphas = self.alphas tol = self._tol r = self._e(index) * self.tags[index] c = self._c return (r < -tol and alphas[index] < c) or (r > tol and alphas[index] > 0.0) # Get value calculated from kernel function def _k(self, i1, i2): # for test samples,use Kernel function if isinstance(i2, np.ndarray): return self.Kernel(self.samples[i1], i2) # for train samples,Kernel values have been saved in matrix else: return self._K_matrix[i1, i2] # Get sample's error def _e(self, index): """ Two cases: 1:Sample[index] is non-bound,Fetch error from list: _error 2:sample[index] is bound,Use predicted value deduct true value: g(xi) - yi """ # get from error data if self._is_unbound(index): return self._error[index] # get by g(xi) - yi else: gx = np.dot(self.alphas * self.tags, self._K_matrix[:, index]) + self._b yi = self.tags[index] return gx - yi # Calculate Kernel matrix of all possible i1,i2 ,saving time def _calculate_k_matrix(self): k_matrix = np.zeros([self.length, self.length]) for i in self._all_samples: for j in self._all_samples: k_matrix[i, j] = np.float64( self.Kernel(self.samples[i, :], self.samples[j, :]) ) return k_matrix # Predict test sample's tag def _predict(self, sample): k = self._k predicted_value = ( np.sum( [ self.alphas[i1] * self.tags[i1] * k(i1, sample) for i1 in self._all_samples ] ) + self._b ) return predicted_value # Choose alpha1 and alpha2 def _choose_alphas(self): locis = yield from self._choose_a1() if not locis: return None return locis def _choose_a1(self): """ Choose first alpha ;steps: 1:First loop over all sample 2:Second loop over all non-bound samples till all non-bound samples does not voilate kkt condition. 3:Repeat this two process endlessly,till all samples does not voilate kkt condition samples after first loop. """ while True: all_not_obey = True # all sample print("scanning all sample!") for i1 in [i for i in self._all_samples if self._check_obey_kkt(i)]: all_not_obey = False yield from self._choose_a2(i1) # non-bound sample print("scanning non-bound sample!") while True: not_obey = True for i1 in [ i for i in self._all_samples if self._check_obey_kkt(i) and self._is_unbound(i) ]: not_obey = False yield from self._choose_a2(i1) if not_obey: print("all non-bound samples fit the KKT condition!") break if all_not_obey: print("all samples fit the KKT condition! Optimization done!") break return False def _choose_a2(self, i1): """ Choose the second alpha by using heuristic algorithm ;steps: 1: Choose alpha2 which gets the maximum step size (|E1 - E2|). 2: Start in a random point,loop over all non-bound samples till alpha1 and alpha2 are optimized. 3: Start in a random point,loop over all samples till alpha1 and alpha2 are optimized. 
""" self._unbound = [i for i in self._all_samples if self._is_unbound(i)] if len(self.unbound) > 0: tmp_error = self._error.copy().tolist() tmp_error_dict = { index: value for index, value in enumerate(tmp_error) if self._is_unbound(index) } if self._e(i1) >= 0: i2 = min(tmp_error_dict, key=lambda index: tmp_error_dict[index]) else: i2 = max(tmp_error_dict, key=lambda index: tmp_error_dict[index]) cmd = yield i1, i2 if cmd is None: return for i2 in np.roll(self.unbound, np.random.choice(self.length)): cmd = yield i1, i2 if cmd is None: return for i2 in np.roll(self._all_samples, np.random.choice(self.length)): cmd = yield i1, i2 if cmd is None: return # Get the new alpha2 and new alpha1 def _get_new_alpha(self, i1, i2, a1, a2, e1, e2, y1, y2): k = self._k if i1 == i2: return None, None # calculate L and H which bound the new alpha2 s = y1 * y2 if s == -1: l, h = max(0.0, a2 - a1), min(self._c, self._c + a2 - a1) else: l, h = max(0.0, a2 + a1 - self._c), min(self._c, a2 + a1) if l == h: return None, None # calculate eta k11 = k(i1, i1) k22 = k(i2, i2) k12 = k(i1, i2) # select the new alpha2 which could get the minimal objectives if (eta := k11 + k22 - 2.0 * k12) > 0.0: a2_new_unc = a2 + (y2 * (e1 - e2)) / eta # a2_new has a boundary if a2_new_unc >= h: a2_new = h elif a2_new_unc <= l: a2_new = l else: a2_new = a2_new_unc else: b = self._b l1 = a1 + s * (a2 - l) h1 = a1 + s * (a2 - h) # way 1 f1 = y1 * (e1 + b) - a1 * k(i1, i1) - s * a2 * k(i1, i2) f2 = y2 * (e2 + b) - a2 * k(i2, i2) - s * a1 * k(i1, i2) ol = ( l1 * f1 + l * f2 + 1 / 2 * l1**2 * k(i1, i1) + 1 / 2 * l**2 * k(i2, i2) + s * l * l1 * k(i1, i2) ) oh = ( h1 * f1 + h * f2 + 1 / 2 * h1**2 * k(i1, i1) + 1 / 2 * h**2 * k(i2, i2) + s * h * h1 * k(i1, i2) ) """ # way 2 Use objective function check which alpha2 new could get the minimal objectives """ if ol < (oh - self._eps): a2_new = l elif ol > oh + self._eps: a2_new = h else: a2_new = a2 # a1_new has a boundary too a1_new = a1 + s * (a2 - a2_new) if a1_new < 0: a2_new += s * a1_new a1_new = 0 if a1_new > self._c: a2_new += s * (a1_new - self._c) a1_new = self._c return a1_new, a2_new # Normalise data using min_max way def _norm(self, data): if self._init: self._min = np.min(data, axis=0) self._max = np.max(data, axis=0) self._init = False return (data - self._min) / (self._max - self._min) else: return (data - self._min) / (self._max - self._min) def _is_unbound(self, index): return bool(0.0 < self.alphas[index] < self._c) def _is_support(self, index): return bool(self.alphas[index] > 0) @property def unbound(self): return self._unbound @property def support(self): return [i for i in range(self.length) if self._is_support(i)] @property def length(self): return self.samples.shape[0] class Kernel: def __init__(self, kernel, degree=1.0, coef0=0.0, gamma=1.0): self.degree = np.float64(degree) self.coef0 = np.float64(coef0) self.gamma = np.float64(gamma) self._kernel_name = kernel self._kernel = self._get_kernel(kernel_name=kernel) self._check() def _polynomial(self, v1, v2): return (self.gamma * np.inner(v1, v2) + self.coef0) ** self.degree def _linear(self, v1, v2): return np.inner(v1, v2) + self.coef0 def _rbf(self, v1, v2): return np.exp(-1 * (self.gamma * np.linalg.norm(v1 - v2) ** 2)) def _check(self): if self._kernel == self._rbf and self.gamma < 0: raise ValueError("gamma value must greater than 0") def _get_kernel(self, kernel_name): maps = {"linear": self._linear, "poly": self._polynomial, "rbf": self._rbf} return maps[kernel_name] def __call__(self, v1, v2): return 
self._kernel(v1, v2) def __repr__(self): return self._kernel_name def count_time(func): def call_func(*args, **kwargs): import time start_time = time.time() func(*args, **kwargs) end_time = time.time() print(f"smo algorithm cost {end_time - start_time} seconds") return call_func @count_time def test_cancel_data(): print("Hello!\nStart test svm by smo algorithm!") # 0: download dataset and load into pandas' dataframe if not os.path.exists(r"cancel_data.csv"): request = urllib.request.Request( CANCER_DATASET_URL, headers={"User-Agent": "Mozilla/4.0 (compatible; MSIE 5.5; Windows NT)"}, ) response = urllib.request.urlopen(request) content = response.read().decode("utf-8") with open(r"cancel_data.csv", "w") as f: f.write(content) data = pd.read_csv(r"cancel_data.csv", header=None) # 1: pre-processing data del data[data.columns.tolist()[0]] data = data.dropna(axis=0) data = data.replace({"M": np.float64(1), "B": np.float64(-1)}) samples = np.array(data)[:, :] # 2: dividing data into train_data data and test_data data train_data, test_data = samples[:328, :], samples[328:, :] test_tags, test_samples = test_data[:, 0], test_data[:, 1:] # 3: choose kernel function,and set initial alphas to zero(optional) mykernel = Kernel(kernel="rbf", degree=5, coef0=1, gamma=0.5) al = np.zeros(train_data.shape[0]) # 4: calculating best alphas using SMO algorithm and predict test_data samples mysvm = SmoSVM( train=train_data, kernel_func=mykernel, alpha_list=al, cost=0.4, b=0.0, tolerance=0.001, ) mysvm.fit() predict = mysvm.predict(test_samples) # 5: check accuracy score = 0 test_num = test_tags.shape[0] for i in range(test_tags.shape[0]): if test_tags[i] == predict[i]: score += 1 print(f"\nall: {test_num}\nright: {score}\nfalse: {test_num - score}") print(f"Rough Accuracy: {score / test_tags.shape[0]}") def test_demonstration(): # change stdout print("\nStart plot,please wait!!!") sys.stdout = open(os.devnull, "w") ax1 = plt.subplot2grid((2, 2), (0, 0)) ax2 = plt.subplot2grid((2, 2), (0, 1)) ax3 = plt.subplot2grid((2, 2), (1, 0)) ax4 = plt.subplot2grid((2, 2), (1, 1)) ax1.set_title("linear svm,cost:0.1") test_linear_kernel(ax1, cost=0.1) ax2.set_title("linear svm,cost:500") test_linear_kernel(ax2, cost=500) ax3.set_title("rbf kernel svm,cost:0.1") test_rbf_kernel(ax3, cost=0.1) ax4.set_title("rbf kernel svm,cost:500") test_rbf_kernel(ax4, cost=500) sys.stdout = sys.__stdout__ print("Plot done!!!") def test_linear_kernel(ax, cost): train_x, train_y = make_blobs( n_samples=500, centers=2, n_features=2, random_state=1 ) train_y[train_y == 0] = -1 scaler = StandardScaler() train_x_scaled = scaler.fit_transform(train_x, train_y) train_data = np.hstack((train_y.reshape(500, 1), train_x_scaled)) mykernel = Kernel(kernel="linear", degree=5, coef0=1, gamma=0.5) mysvm = SmoSVM( train=train_data, kernel_func=mykernel, cost=cost, tolerance=0.001, auto_norm=False, ) mysvm.fit() plot_partition_boundary(mysvm, train_data, ax=ax) def test_rbf_kernel(ax, cost): train_x, train_y = make_circles( n_samples=500, noise=0.1, factor=0.1, random_state=1 ) train_y[train_y == 0] = -1 scaler = StandardScaler() train_x_scaled = scaler.fit_transform(train_x, train_y) train_data = np.hstack((train_y.reshape(500, 1), train_x_scaled)) mykernel = Kernel(kernel="rbf", degree=5, coef0=1, gamma=0.5) mysvm = SmoSVM( train=train_data, kernel_func=mykernel, cost=cost, tolerance=0.001, auto_norm=False, ) mysvm.fit() plot_partition_boundary(mysvm, train_data, ax=ax) def plot_partition_boundary( model, train_data, ax, resolution=100, colors=("b", "k", 
"r") ): """ We can not get the optimum w of our kernel svm model which is different from linear svm. For this reason, we generate randomly distributed points with high desity and prediced values of these points are calculated by using our trained model. Then we could use this prediced values to draw contour map. And this contour map can represent svm's partition boundary. """ train_data_x = train_data[:, 1] train_data_y = train_data[:, 2] train_data_tags = train_data[:, 0] xrange = np.linspace(train_data_x.min(), train_data_x.max(), resolution) yrange = np.linspace(train_data_y.min(), train_data_y.max(), resolution) test_samples = np.array([(x, y) for x in xrange for y in yrange]).reshape( resolution * resolution, 2 ) test_tags = model.predict(test_samples, classify=False) grid = test_tags.reshape((len(xrange), len(yrange))) # Plot contour map which represents the partition boundary ax.contour( xrange, yrange, np.mat(grid).T, levels=(-1, 0, 1), linestyles=("--", "-", "--"), linewidths=(1, 1, 1), colors=colors, ) # Plot all train samples ax.scatter( train_data_x, train_data_y, c=train_data_tags, cmap=plt.cm.Dark2, lw=0, alpha=0.5, ) # Plot support vectors support = model.support ax.scatter( train_data_x[support], train_data_y[support], c=train_data_tags[support], cmap=plt.cm.Dark2, ) if __name__ == "__main__": test_cancel_data() test_demonstration() plt.show()
""" Implementation of sequential minimal optimization (SMO) for support vector machines (SVM). Sequential minimal optimization (SMO) is an algorithm for solving the quadratic programming (QP) problem that arises during the training of support vector machines. It was invented by John Platt in 1998. Input: 0: type: numpy.ndarray. 1: first column of ndarray must be tags of samples, must be 1 or -1. 2: rows of ndarray represent samples. Usage: Command: python3 sequential_minimum_optimization.py Code: from sequential_minimum_optimization import SmoSVM, Kernel kernel = Kernel(kernel='poly', degree=3., coef0=1., gamma=0.5) init_alphas = np.zeros(train.shape[0]) SVM = SmoSVM(train=train, alpha_list=init_alphas, kernel_func=kernel, cost=0.4, b=0.0, tolerance=0.001) SVM.fit() predict = SVM.predict(test_samples) Reference: https://www.microsoft.com/en-us/research/wp-content/uploads/2016/02/smo-book.pdf https://www.microsoft.com/en-us/research/wp-content/uploads/2016/02/tr-98-14.pdf """ import os import sys import urllib.request import numpy as np import pandas as pd from matplotlib import pyplot as plt from sklearn.datasets import make_blobs, make_circles from sklearn.preprocessing import StandardScaler CANCER_DATASET_URL = ( "https://archive.ics.uci.edu/ml/machine-learning-databases/" "breast-cancer-wisconsin/wdbc.data" ) class SmoSVM: def __init__( self, train, kernel_func, alpha_list=None, cost=0.4, b=0.0, tolerance=0.001, auto_norm=True, ): self._init = True self._auto_norm = auto_norm self._c = np.float64(cost) self._b = np.float64(b) self._tol = np.float64(tolerance) if tolerance > 0.0001 else np.float64(0.001) self.tags = train[:, 0] self.samples = self._norm(train[:, 1:]) if self._auto_norm else train[:, 1:] self.alphas = alpha_list if alpha_list is not None else np.zeros(train.shape[0]) self.Kernel = kernel_func self._eps = 0.001 self._all_samples = list(range(self.length)) self._K_matrix = self._calculate_k_matrix() self._error = np.zeros(self.length) self._unbound = [] self.choose_alpha = self._choose_alphas() # Calculate alphas using SMO algorithm def fit(self): k = self._k state = None while True: # 1: Find alpha1, alpha2 try: i1, i2 = self.choose_alpha.send(state) state = None except StopIteration: print("Optimization done!\nEvery sample satisfy the KKT condition!") break # 2: calculate new alpha2 and new alpha1 y1, y2 = self.tags[i1], self.tags[i2] a1, a2 = self.alphas[i1].copy(), self.alphas[i2].copy() e1, e2 = self._e(i1), self._e(i2) args = (i1, i2, a1, a2, e1, e2, y1, y2) a1_new, a2_new = self._get_new_alpha(*args) if not a1_new and not a2_new: state = False continue self.alphas[i1], self.alphas[i2] = a1_new, a2_new # 3: update threshold(b) b1_new = np.float64( -e1 - y1 * k(i1, i1) * (a1_new - a1) - y2 * k(i2, i1) * (a2_new - a2) + self._b ) b2_new = np.float64( -e2 - y2 * k(i2, i2) * (a2_new - a2) - y1 * k(i1, i2) * (a1_new - a1) + self._b ) if 0.0 < a1_new < self._c: b = b1_new if 0.0 < a2_new < self._c: b = b2_new if not (np.float64(0) < a2_new < self._c) and not ( np.float64(0) < a1_new < self._c ): b = (b1_new + b2_new) / 2.0 b_old = self._b self._b = b # 4: update error value,here we only calculate those non-bound samples' # error self._unbound = [i for i in self._all_samples if self._is_unbound(i)] for s in self.unbound: if s in (i1, i2): continue self._error[s] += ( y1 * (a1_new - a1) * k(i1, s) + y2 * (a2_new - a2) * k(i2, s) + (self._b - b_old) ) # if i1 or i2 is non-bound,update there error value to zero if self._is_unbound(i1): self._error[i1] = 0 if 
self._is_unbound(i2): self._error[i2] = 0 # Predict test samples def predict(self, test_samples, classify=True): if test_samples.shape[1] > self.samples.shape[1]: raise ValueError( "Test samples' feature length does not equal to that of train samples" ) if self._auto_norm: test_samples = self._norm(test_samples) results = [] for test_sample in test_samples: result = self._predict(test_sample) if classify: results.append(1 if result > 0 else -1) else: results.append(result) return np.array(results) # Check if alpha violate KKT condition def _check_obey_kkt(self, index): alphas = self.alphas tol = self._tol r = self._e(index) * self.tags[index] c = self._c return (r < -tol and alphas[index] < c) or (r > tol and alphas[index] > 0.0) # Get value calculated from kernel function def _k(self, i1, i2): # for test samples,use Kernel function if isinstance(i2, np.ndarray): return self.Kernel(self.samples[i1], i2) # for train samples,Kernel values have been saved in matrix else: return self._K_matrix[i1, i2] # Get sample's error def _e(self, index): """ Two cases: 1:Sample[index] is non-bound,Fetch error from list: _error 2:sample[index] is bound,Use predicted value deduct true value: g(xi) - yi """ # get from error data if self._is_unbound(index): return self._error[index] # get by g(xi) - yi else: gx = np.dot(self.alphas * self.tags, self._K_matrix[:, index]) + self._b yi = self.tags[index] return gx - yi # Calculate Kernel matrix of all possible i1,i2 ,saving time def _calculate_k_matrix(self): k_matrix = np.zeros([self.length, self.length]) for i in self._all_samples: for j in self._all_samples: k_matrix[i, j] = np.float64( self.Kernel(self.samples[i, :], self.samples[j, :]) ) return k_matrix # Predict test sample's tag def _predict(self, sample): k = self._k predicted_value = ( np.sum( [ self.alphas[i1] * self.tags[i1] * k(i1, sample) for i1 in self._all_samples ] ) + self._b ) return predicted_value # Choose alpha1 and alpha2 def _choose_alphas(self): locis = yield from self._choose_a1() if not locis: return None return locis def _choose_a1(self): """ Choose first alpha ;steps: 1:First loop over all sample 2:Second loop over all non-bound samples till all non-bound samples does not voilate kkt condition. 3:Repeat this two process endlessly,till all samples does not voilate kkt condition samples after first loop. """ while True: all_not_obey = True # all sample print("scanning all sample!") for i1 in [i for i in self._all_samples if self._check_obey_kkt(i)]: all_not_obey = False yield from self._choose_a2(i1) # non-bound sample print("scanning non-bound sample!") while True: not_obey = True for i1 in [ i for i in self._all_samples if self._check_obey_kkt(i) and self._is_unbound(i) ]: not_obey = False yield from self._choose_a2(i1) if not_obey: print("all non-bound samples fit the KKT condition!") break if all_not_obey: print("all samples fit the KKT condition! Optimization done!") break return False def _choose_a2(self, i1): """ Choose the second alpha by using heuristic algorithm ;steps: 1: Choose alpha2 which gets the maximum step size (|E1 - E2|). 2: Start in a random point,loop over all non-bound samples till alpha1 and alpha2 are optimized. 3: Start in a random point,loop over all samples till alpha1 and alpha2 are optimized. 
""" self._unbound = [i for i in self._all_samples if self._is_unbound(i)] if len(self.unbound) > 0: tmp_error = self._error.copy().tolist() tmp_error_dict = { index: value for index, value in enumerate(tmp_error) if self._is_unbound(index) } if self._e(i1) >= 0: i2 = min(tmp_error_dict, key=lambda index: tmp_error_dict[index]) else: i2 = max(tmp_error_dict, key=lambda index: tmp_error_dict[index]) cmd = yield i1, i2 if cmd is None: return for i2 in np.roll(self.unbound, np.random.choice(self.length)): cmd = yield i1, i2 if cmd is None: return for i2 in np.roll(self._all_samples, np.random.choice(self.length)): cmd = yield i1, i2 if cmd is None: return # Get the new alpha2 and new alpha1 def _get_new_alpha(self, i1, i2, a1, a2, e1, e2, y1, y2): k = self._k if i1 == i2: return None, None # calculate L and H which bound the new alpha2 s = y1 * y2 if s == -1: l, h = max(0.0, a2 - a1), min(self._c, self._c + a2 - a1) else: l, h = max(0.0, a2 + a1 - self._c), min(self._c, a2 + a1) if l == h: return None, None # calculate eta k11 = k(i1, i1) k22 = k(i2, i2) k12 = k(i1, i2) # select the new alpha2 which could get the minimal objectives if (eta := k11 + k22 - 2.0 * k12) > 0.0: a2_new_unc = a2 + (y2 * (e1 - e2)) / eta # a2_new has a boundary if a2_new_unc >= h: a2_new = h elif a2_new_unc <= l: a2_new = l else: a2_new = a2_new_unc else: b = self._b l1 = a1 + s * (a2 - l) h1 = a1 + s * (a2 - h) # way 1 f1 = y1 * (e1 + b) - a1 * k(i1, i1) - s * a2 * k(i1, i2) f2 = y2 * (e2 + b) - a2 * k(i2, i2) - s * a1 * k(i1, i2) ol = ( l1 * f1 + l * f2 + 1 / 2 * l1**2 * k(i1, i1) + 1 / 2 * l**2 * k(i2, i2) + s * l * l1 * k(i1, i2) ) oh = ( h1 * f1 + h * f2 + 1 / 2 * h1**2 * k(i1, i1) + 1 / 2 * h**2 * k(i2, i2) + s * h * h1 * k(i1, i2) ) """ # way 2 Use objective function check which alpha2 new could get the minimal objectives """ if ol < (oh - self._eps): a2_new = l elif ol > oh + self._eps: a2_new = h else: a2_new = a2 # a1_new has a boundary too a1_new = a1 + s * (a2 - a2_new) if a1_new < 0: a2_new += s * a1_new a1_new = 0 if a1_new > self._c: a2_new += s * (a1_new - self._c) a1_new = self._c return a1_new, a2_new # Normalise data using min_max way def _norm(self, data): if self._init: self._min = np.min(data, axis=0) self._max = np.max(data, axis=0) self._init = False return (data - self._min) / (self._max - self._min) else: return (data - self._min) / (self._max - self._min) def _is_unbound(self, index): return bool(0.0 < self.alphas[index] < self._c) def _is_support(self, index): return bool(self.alphas[index] > 0) @property def unbound(self): return self._unbound @property def support(self): return [i for i in range(self.length) if self._is_support(i)] @property def length(self): return self.samples.shape[0] class Kernel: def __init__(self, kernel, degree=1.0, coef0=0.0, gamma=1.0): self.degree = np.float64(degree) self.coef0 = np.float64(coef0) self.gamma = np.float64(gamma) self._kernel_name = kernel self._kernel = self._get_kernel(kernel_name=kernel) self._check() def _polynomial(self, v1, v2): return (self.gamma * np.inner(v1, v2) + self.coef0) ** self.degree def _linear(self, v1, v2): return np.inner(v1, v2) + self.coef0 def _rbf(self, v1, v2): return np.exp(-1 * (self.gamma * np.linalg.norm(v1 - v2) ** 2)) def _check(self): if self._kernel == self._rbf and self.gamma < 0: raise ValueError("gamma value must greater than 0") def _get_kernel(self, kernel_name): maps = {"linear": self._linear, "poly": self._polynomial, "rbf": self._rbf} return maps[kernel_name] def __call__(self, v1, v2): return 
self._kernel(v1, v2) def __repr__(self): return self._kernel_name def count_time(func): def call_func(*args, **kwargs): import time start_time = time.time() func(*args, **kwargs) end_time = time.time() print(f"smo algorithm cost {end_time - start_time} seconds") return call_func @count_time def test_cancel_data(): print("Hello!\nStart test svm by smo algorithm!") # 0: download dataset and load into pandas' dataframe if not os.path.exists(r"cancel_data.csv"): request = urllib.request.Request( CANCER_DATASET_URL, headers={"User-Agent": "Mozilla/4.0 (compatible; MSIE 5.5; Windows NT)"}, ) response = urllib.request.urlopen(request) # noqa: S310 content = response.read().decode("utf-8") with open(r"cancel_data.csv", "w") as f: f.write(content) data = pd.read_csv(r"cancel_data.csv", header=None) # 1: pre-processing data del data[data.columns.tolist()[0]] data = data.dropna(axis=0) data = data.replace({"M": np.float64(1), "B": np.float64(-1)}) samples = np.array(data)[:, :] # 2: dividing data into train_data data and test_data data train_data, test_data = samples[:328, :], samples[328:, :] test_tags, test_samples = test_data[:, 0], test_data[:, 1:] # 3: choose kernel function,and set initial alphas to zero(optional) mykernel = Kernel(kernel="rbf", degree=5, coef0=1, gamma=0.5) al = np.zeros(train_data.shape[0]) # 4: calculating best alphas using SMO algorithm and predict test_data samples mysvm = SmoSVM( train=train_data, kernel_func=mykernel, alpha_list=al, cost=0.4, b=0.0, tolerance=0.001, ) mysvm.fit() predict = mysvm.predict(test_samples) # 5: check accuracy score = 0 test_num = test_tags.shape[0] for i in range(test_tags.shape[0]): if test_tags[i] == predict[i]: score += 1 print(f"\nall: {test_num}\nright: {score}\nfalse: {test_num - score}") print(f"Rough Accuracy: {score / test_tags.shape[0]}") def test_demonstration(): # change stdout print("\nStart plot,please wait!!!") sys.stdout = open(os.devnull, "w") ax1 = plt.subplot2grid((2, 2), (0, 0)) ax2 = plt.subplot2grid((2, 2), (0, 1)) ax3 = plt.subplot2grid((2, 2), (1, 0)) ax4 = plt.subplot2grid((2, 2), (1, 1)) ax1.set_title("linear svm,cost:0.1") test_linear_kernel(ax1, cost=0.1) ax2.set_title("linear svm,cost:500") test_linear_kernel(ax2, cost=500) ax3.set_title("rbf kernel svm,cost:0.1") test_rbf_kernel(ax3, cost=0.1) ax4.set_title("rbf kernel svm,cost:500") test_rbf_kernel(ax4, cost=500) sys.stdout = sys.__stdout__ print("Plot done!!!") def test_linear_kernel(ax, cost): train_x, train_y = make_blobs( n_samples=500, centers=2, n_features=2, random_state=1 ) train_y[train_y == 0] = -1 scaler = StandardScaler() train_x_scaled = scaler.fit_transform(train_x, train_y) train_data = np.hstack((train_y.reshape(500, 1), train_x_scaled)) mykernel = Kernel(kernel="linear", degree=5, coef0=1, gamma=0.5) mysvm = SmoSVM( train=train_data, kernel_func=mykernel, cost=cost, tolerance=0.001, auto_norm=False, ) mysvm.fit() plot_partition_boundary(mysvm, train_data, ax=ax) def test_rbf_kernel(ax, cost): train_x, train_y = make_circles( n_samples=500, noise=0.1, factor=0.1, random_state=1 ) train_y[train_y == 0] = -1 scaler = StandardScaler() train_x_scaled = scaler.fit_transform(train_x, train_y) train_data = np.hstack((train_y.reshape(500, 1), train_x_scaled)) mykernel = Kernel(kernel="rbf", degree=5, coef0=1, gamma=0.5) mysvm = SmoSVM( train=train_data, kernel_func=mykernel, cost=cost, tolerance=0.001, auto_norm=False, ) mysvm.fit() plot_partition_boundary(mysvm, train_data, ax=ax) def plot_partition_boundary( model, train_data, ax, resolution=100, 
colors=("b", "k", "r") ): """ We can not get the optimum w of our kernel svm model which is different from linear svm. For this reason, we generate randomly distributed points with high desity and prediced values of these points are calculated by using our trained model. Then we could use this prediced values to draw contour map. And this contour map can represent svm's partition boundary. """ train_data_x = train_data[:, 1] train_data_y = train_data[:, 2] train_data_tags = train_data[:, 0] xrange = np.linspace(train_data_x.min(), train_data_x.max(), resolution) yrange = np.linspace(train_data_y.min(), train_data_y.max(), resolution) test_samples = np.array([(x, y) for x in xrange for y in yrange]).reshape( resolution * resolution, 2 ) test_tags = model.predict(test_samples, classify=False) grid = test_tags.reshape((len(xrange), len(yrange))) # Plot contour map which represents the partition boundary ax.contour( xrange, yrange, np.mat(grid).T, levels=(-1, 0, 1), linestyles=("--", "-", "--"), linewidths=(1, 1, 1), colors=colors, ) # Plot all train samples ax.scatter( train_data_x, train_data_y, c=train_data_tags, cmap=plt.cm.Dark2, lw=0, alpha=0.5, ) # Plot support vectors support = model.support ax.scatter( train_data_x[support], train_data_y[support], c=train_data_tags[support], cmap=plt.cm.Dark2, ) if __name__ == "__main__": test_cancel_data() test_demonstration() plt.show()
1
TheAlgorithms/Python
8,551
New gitter link added or replaced
### Describe your change: Fixes #8546 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
KaixLina
"2023-03-26T04:26:40Z"
"2023-03-26T15:19:19Z"
3f9150c1b2dd15808a4962e03a1455f8d825512c
7cdb011ba440a07768179bfaea190bddefc890d8
New gitter link added or replaced. ### Describe your change: Fixes #8546 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
""" - - - - - -- - - - - - - - - - - - - - - - - - - - - - - Name - - CNN - Convolution Neural Network For Photo Recognizing Goal - - Recognize Handing Writing Word Photo Detail:Total 5 layers neural network * Convolution layer * Pooling layer * Input layer layer of BP * Hidden layer of BP * Output layer of BP Author: Stephen Lee Github: [email protected] Date: 2017.9.20 - - - - - -- - - - - - - - - - - - - - - - - - - - - - - """ import pickle import numpy as np from matplotlib import pyplot as plt class CNN: def __init__( self, conv1_get, size_p1, bp_num1, bp_num2, bp_num3, rate_w=0.2, rate_t=0.2 ): """ :param conv1_get: [a,c,d],size, number, step of convolution kernel :param size_p1: pooling size :param bp_num1: units number of flatten layer :param bp_num2: units number of hidden layer :param bp_num3: units number of output layer :param rate_w: rate of weight learning :param rate_t: rate of threshold learning """ self.num_bp1 = bp_num1 self.num_bp2 = bp_num2 self.num_bp3 = bp_num3 self.conv1 = conv1_get[:2] self.step_conv1 = conv1_get[2] self.size_pooling1 = size_p1 self.rate_weight = rate_w self.rate_thre = rate_t self.w_conv1 = [ np.mat(-1 * np.random.rand(self.conv1[0], self.conv1[0]) + 0.5) for i in range(self.conv1[1]) ] self.wkj = np.mat(-1 * np.random.rand(self.num_bp3, self.num_bp2) + 0.5) self.vji = np.mat(-1 * np.random.rand(self.num_bp2, self.num_bp1) + 0.5) self.thre_conv1 = -2 * np.random.rand(self.conv1[1]) + 1 self.thre_bp2 = -2 * np.random.rand(self.num_bp2) + 1 self.thre_bp3 = -2 * np.random.rand(self.num_bp3) + 1 def save_model(self, save_path): # save model dict with pickle model_dic = { "num_bp1": self.num_bp1, "num_bp2": self.num_bp2, "num_bp3": self.num_bp3, "conv1": self.conv1, "step_conv1": self.step_conv1, "size_pooling1": self.size_pooling1, "rate_weight": self.rate_weight, "rate_thre": self.rate_thre, "w_conv1": self.w_conv1, "wkj": self.wkj, "vji": self.vji, "thre_conv1": self.thre_conv1, "thre_bp2": self.thre_bp2, "thre_bp3": self.thre_bp3, } with open(save_path, "wb") as f: pickle.dump(model_dic, f) print(f"Model saved: {save_path}") @classmethod def read_model(cls, model_path): # read saved model with open(model_path, "rb") as f: model_dic = pickle.load(f) conv_get = model_dic.get("conv1") conv_get.append(model_dic.get("step_conv1")) size_p1 = model_dic.get("size_pooling1") bp1 = model_dic.get("num_bp1") bp2 = model_dic.get("num_bp2") bp3 = model_dic.get("num_bp3") r_w = model_dic.get("rate_weight") r_t = model_dic.get("rate_thre") # create model instance conv_ins = CNN(conv_get, size_p1, bp1, bp2, bp3, r_w, r_t) # modify model parameter conv_ins.w_conv1 = model_dic.get("w_conv1") conv_ins.wkj = model_dic.get("wkj") conv_ins.vji = model_dic.get("vji") conv_ins.thre_conv1 = model_dic.get("thre_conv1") conv_ins.thre_bp2 = model_dic.get("thre_bp2") conv_ins.thre_bp3 = model_dic.get("thre_bp3") return conv_ins def sig(self, x): return 1 / (1 + np.exp(-1 * x)) def do_round(self, x): return round(x, 3) def convolute(self, data, convs, w_convs, thre_convs, conv_step): # convolution process size_conv = convs[0] num_conv = convs[1] size_data = np.shape(data)[0] # get the data slice of original image data, data_focus data_focus = [] for i_focus in range(0, size_data - size_conv + 1, conv_step): for j_focus in range(0, size_data - size_conv + 1, conv_step): focus = data[ i_focus : i_focus + size_conv, j_focus : j_focus + size_conv ] data_focus.append(focus) # calculate the feature map of every single kernel, and saved as list of matrix data_featuremap = [] 
size_feature_map = int((size_data - size_conv) / conv_step + 1) for i_map in range(num_conv): featuremap = [] for i_focus in range(len(data_focus)): net_focus = ( np.sum(np.multiply(data_focus[i_focus], w_convs[i_map])) - thre_convs[i_map] ) featuremap.append(self.sig(net_focus)) featuremap = np.asmatrix(featuremap).reshape( size_feature_map, size_feature_map ) data_featuremap.append(featuremap) # expanding the data slice to One dimenssion focus1_list = [] for each_focus in data_focus: focus1_list.extend(self.Expand_Mat(each_focus)) focus_list = np.asarray(focus1_list) return focus_list, data_featuremap def pooling(self, featuremaps, size_pooling, pooling_type="average_pool"): # pooling process size_map = len(featuremaps[0]) size_pooled = int(size_map / size_pooling) featuremap_pooled = [] for i_map in range(len(featuremaps)): feature_map = featuremaps[i_map] map_pooled = [] for i_focus in range(0, size_map, size_pooling): for j_focus in range(0, size_map, size_pooling): focus = feature_map[ i_focus : i_focus + size_pooling, j_focus : j_focus + size_pooling, ] if pooling_type == "average_pool": # average pooling map_pooled.append(np.average(focus)) elif pooling_type == "max_pooling": # max pooling map_pooled.append(np.max(focus)) map_pooled = np.asmatrix(map_pooled).reshape(size_pooled, size_pooled) featuremap_pooled.append(map_pooled) return featuremap_pooled def _expand(self, data): # expanding three dimension data to one dimension list data_expanded = [] for i in range(len(data)): shapes = np.shape(data[i]) data_listed = data[i].reshape(1, shapes[0] * shapes[1]) data_listed = data_listed.getA().tolist()[0] data_expanded.extend(data_listed) data_expanded = np.asarray(data_expanded) return data_expanded def _expand_mat(self, data_mat): # expanding matrix to one dimension list data_mat = np.asarray(data_mat) shapes = np.shape(data_mat) data_expanded = data_mat.reshape(1, shapes[0] * shapes[1]) return data_expanded def _calculate_gradient_from_pool( self, out_map, pd_pool, num_map, size_map, size_pooling ): """ calculate the gradient from the data slice of pool layer pd_pool: list of matrix out_map: the shape of data slice(size_map*size_map) return: pd_all: list of matrix, [num, size_map, size_map] """ pd_all = [] i_pool = 0 for i_map in range(num_map): pd_conv1 = np.ones((size_map, size_map)) for i in range(0, size_map, size_pooling): for j in range(0, size_map, size_pooling): pd_conv1[i : i + size_pooling, j : j + size_pooling] = pd_pool[ i_pool ] i_pool = i_pool + 1 pd_conv2 = np.multiply( pd_conv1, np.multiply(out_map[i_map], (1 - out_map[i_map])) ) pd_all.append(pd_conv2) return pd_all def train( self, patterns, datas_train, datas_teach, n_repeat, error_accuracy, draw_e=bool ): # model traning print("----------------------Start Training-------------------------") print((" - - Shape: Train_Data ", np.shape(datas_train))) print((" - - Shape: Teach_Data ", np.shape(datas_teach))) rp = 0 all_mse = [] mse = 10000 while rp < n_repeat and mse >= error_accuracy: error_count = 0 print(f"-------------Learning Time {rp}--------------") for p in range(len(datas_train)): # print('------------Learning Image: %d--------------'%p) data_train = np.asmatrix(datas_train[p]) data_teach = np.asarray(datas_teach[p]) data_focus1, data_conved1 = self.convolute( data_train, self.conv1, self.w_conv1, self.thre_conv1, conv_step=self.step_conv1, ) data_pooled1 = self.pooling(data_conved1, self.size_pooling1) shape_featuremap1 = np.shape(data_conved1) """ print(' -----original shape ', np.shape(data_train)) 
print(' ---- after convolution ',np.shape(data_conv1)) print(' -----after pooling ',np.shape(data_pooled1)) """ data_bp_input = self._expand(data_pooled1) bp_out1 = data_bp_input bp_net_j = np.dot(bp_out1, self.vji.T) - self.thre_bp2 bp_out2 = self.sig(bp_net_j) bp_net_k = np.dot(bp_out2, self.wkj.T) - self.thre_bp3 bp_out3 = self.sig(bp_net_k) # --------------Model Leaning ------------------------ # calculate error and gradient--------------- pd_k_all = np.multiply( (data_teach - bp_out3), np.multiply(bp_out3, (1 - bp_out3)) ) pd_j_all = np.multiply( np.dot(pd_k_all, self.wkj), np.multiply(bp_out2, (1 - bp_out2)) ) pd_i_all = np.dot(pd_j_all, self.vji) pd_conv1_pooled = pd_i_all / (self.size_pooling1 * self.size_pooling1) pd_conv1_pooled = pd_conv1_pooled.T.getA().tolist() pd_conv1_all = self._calculate_gradient_from_pool( data_conved1, pd_conv1_pooled, shape_featuremap1[0], shape_featuremap1[1], self.size_pooling1, ) # weight and threshold learning process--------- # convolution layer for k_conv in range(self.conv1[1]): pd_conv_list = self._expand_mat(pd_conv1_all[k_conv]) delta_w = self.rate_weight * np.dot(pd_conv_list, data_focus1) self.w_conv1[k_conv] = self.w_conv1[k_conv] + delta_w.reshape( (self.conv1[0], self.conv1[0]) ) self.thre_conv1[k_conv] = ( self.thre_conv1[k_conv] - np.sum(pd_conv1_all[k_conv]) * self.rate_thre ) # all connected layer self.wkj = self.wkj + pd_k_all.T * bp_out2 * self.rate_weight self.vji = self.vji + pd_j_all.T * bp_out1 * self.rate_weight self.thre_bp3 = self.thre_bp3 - pd_k_all * self.rate_thre self.thre_bp2 = self.thre_bp2 - pd_j_all * self.rate_thre # calculate the sum error of all single image errors = np.sum(abs(data_teach - bp_out3)) error_count += errors # print(' ----Teach ',data_teach) # print(' ----BP_output ',bp_out3) rp = rp + 1 mse = error_count / patterns all_mse.append(mse) def draw_error(): yplot = [error_accuracy for i in range(int(n_repeat * 1.2))] plt.plot(all_mse, "+-") plt.plot(yplot, "r--") plt.xlabel("Learning Times") plt.ylabel("All_mse") plt.grid(True, alpha=0.5) plt.show() print("------------------Training Complished---------------------") print((" - - Training epoch: ", rp, f" - - Mse: {mse:.6f}")) if draw_e: draw_error() return mse def predict(self, datas_test): # model predict produce_out = [] print("-------------------Start Testing-------------------------") print((" - - Shape: Test_Data ", np.shape(datas_test))) for p in range(len(datas_test)): data_test = np.asmatrix(datas_test[p]) data_focus1, data_conved1 = self.convolute( data_test, self.conv1, self.w_conv1, self.thre_conv1, conv_step=self.step_conv1, ) data_pooled1 = self.pooling(data_conved1, self.size_pooling1) data_bp_input = self._expand(data_pooled1) bp_out1 = data_bp_input bp_net_j = bp_out1 * self.vji.T - self.thre_bp2 bp_out2 = self.sig(bp_net_j) bp_net_k = bp_out2 * self.wkj.T - self.thre_bp3 bp_out3 = self.sig(bp_net_k) produce_out.extend(bp_out3.getA().tolist()) res = [list(map(self.do_round, each)) for each in produce_out] return np.asarray(res) def convolution(self, data): # return the data of image after convoluting process so we can check it out data_test = np.asmatrix(data) data_focus1, data_conved1 = self.convolute( data_test, self.conv1, self.w_conv1, self.thre_conv1, conv_step=self.step_conv1, ) data_pooled1 = self.pooling(data_conved1, self.size_pooling1) return data_conved1, data_pooled1 if __name__ == "__main__": """ I will put the example on other file """
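The file ends by deferring its example to another file, so the snippet below is only a guess at the intended usage: it builds a small network and shows the pickle save/load round-trip. The module name and all layer sizes are made-up illustration values, and note that, as the listing above is written, convolute() refers to self.Expand_Mat while the class only defines _expand_mat, so a full train()/predict() run would need that call renamed first.

```python
# Hedged sketch of how the CNN class above appears intended to be used;
# the module name and layer sizes are made-up illustration values.
import numpy as np

from convolution_neural_network import CNN  # assumed file name

# 3x3 kernels, 2 of them, stride 1, 2x2 average pooling.  For a 20x20
# input the conv maps are 18x18, pooled to 9x9, so 2 * 9 * 9 = 162
# flattened units feed the BP layers.
cnn = CNN(conv1_get=[3, 2, 1], size_p1=2, bp_num1=162, bp_num2=20, bp_num3=10)

cnn.save_model("cnn_model.pkl")
restored = CNN.read_model("cnn_model.pkl")
print(np.allclose(restored.wkj, cnn.wkj))  # True: weights survive the round-trip
```

A training call would then presumably look like cnn.train(patterns=len(images), datas_train=images, datas_teach=targets, n_repeat=50, error_accuracy=0.1, draw_e=True), once the helper-name mismatch noted above is resolved.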
""" - - - - - -- - - - - - - - - - - - - - - - - - - - - - - Name - - CNN - Convolution Neural Network For Photo Recognizing Goal - - Recognize Handing Writing Word Photo Detail:Total 5 layers neural network * Convolution layer * Pooling layer * Input layer layer of BP * Hidden layer of BP * Output layer of BP Author: Stephen Lee Github: [email protected] Date: 2017.9.20 - - - - - -- - - - - - - - - - - - - - - - - - - - - - - """ import pickle import numpy as np from matplotlib import pyplot as plt class CNN: def __init__( self, conv1_get, size_p1, bp_num1, bp_num2, bp_num3, rate_w=0.2, rate_t=0.2 ): """ :param conv1_get: [a,c,d],size, number, step of convolution kernel :param size_p1: pooling size :param bp_num1: units number of flatten layer :param bp_num2: units number of hidden layer :param bp_num3: units number of output layer :param rate_w: rate of weight learning :param rate_t: rate of threshold learning """ self.num_bp1 = bp_num1 self.num_bp2 = bp_num2 self.num_bp3 = bp_num3 self.conv1 = conv1_get[:2] self.step_conv1 = conv1_get[2] self.size_pooling1 = size_p1 self.rate_weight = rate_w self.rate_thre = rate_t self.w_conv1 = [ np.mat(-1 * np.random.rand(self.conv1[0], self.conv1[0]) + 0.5) for i in range(self.conv1[1]) ] self.wkj = np.mat(-1 * np.random.rand(self.num_bp3, self.num_bp2) + 0.5) self.vji = np.mat(-1 * np.random.rand(self.num_bp2, self.num_bp1) + 0.5) self.thre_conv1 = -2 * np.random.rand(self.conv1[1]) + 1 self.thre_bp2 = -2 * np.random.rand(self.num_bp2) + 1 self.thre_bp3 = -2 * np.random.rand(self.num_bp3) + 1 def save_model(self, save_path): # save model dict with pickle model_dic = { "num_bp1": self.num_bp1, "num_bp2": self.num_bp2, "num_bp3": self.num_bp3, "conv1": self.conv1, "step_conv1": self.step_conv1, "size_pooling1": self.size_pooling1, "rate_weight": self.rate_weight, "rate_thre": self.rate_thre, "w_conv1": self.w_conv1, "wkj": self.wkj, "vji": self.vji, "thre_conv1": self.thre_conv1, "thre_bp2": self.thre_bp2, "thre_bp3": self.thre_bp3, } with open(save_path, "wb") as f: pickle.dump(model_dic, f) print(f"Model saved: {save_path}") @classmethod def read_model(cls, model_path): # read saved model with open(model_path, "rb") as f: model_dic = pickle.load(f) # noqa: S301 conv_get = model_dic.get("conv1") conv_get.append(model_dic.get("step_conv1")) size_p1 = model_dic.get("size_pooling1") bp1 = model_dic.get("num_bp1") bp2 = model_dic.get("num_bp2") bp3 = model_dic.get("num_bp3") r_w = model_dic.get("rate_weight") r_t = model_dic.get("rate_thre") # create model instance conv_ins = CNN(conv_get, size_p1, bp1, bp2, bp3, r_w, r_t) # modify model parameter conv_ins.w_conv1 = model_dic.get("w_conv1") conv_ins.wkj = model_dic.get("wkj") conv_ins.vji = model_dic.get("vji") conv_ins.thre_conv1 = model_dic.get("thre_conv1") conv_ins.thre_bp2 = model_dic.get("thre_bp2") conv_ins.thre_bp3 = model_dic.get("thre_bp3") return conv_ins def sig(self, x): return 1 / (1 + np.exp(-1 * x)) def do_round(self, x): return round(x, 3) def convolute(self, data, convs, w_convs, thre_convs, conv_step): # convolution process size_conv = convs[0] num_conv = convs[1] size_data = np.shape(data)[0] # get the data slice of original image data, data_focus data_focus = [] for i_focus in range(0, size_data - size_conv + 1, conv_step): for j_focus in range(0, size_data - size_conv + 1, conv_step): focus = data[ i_focus : i_focus + size_conv, j_focus : j_focus + size_conv ] data_focus.append(focus) # calculate the feature map of every single kernel, and saved as list of matrix data_featuremap = 
[] size_feature_map = int((size_data - size_conv) / conv_step + 1) for i_map in range(num_conv): featuremap = [] for i_focus in range(len(data_focus)): net_focus = ( np.sum(np.multiply(data_focus[i_focus], w_convs[i_map])) - thre_convs[i_map] ) featuremap.append(self.sig(net_focus)) featuremap = np.asmatrix(featuremap).reshape( size_feature_map, size_feature_map ) data_featuremap.append(featuremap) # expanding the data slice to One dimenssion focus1_list = [] for each_focus in data_focus: focus1_list.extend(self.Expand_Mat(each_focus)) focus_list = np.asarray(focus1_list) return focus_list, data_featuremap def pooling(self, featuremaps, size_pooling, pooling_type="average_pool"): # pooling process size_map = len(featuremaps[0]) size_pooled = int(size_map / size_pooling) featuremap_pooled = [] for i_map in range(len(featuremaps)): feature_map = featuremaps[i_map] map_pooled = [] for i_focus in range(0, size_map, size_pooling): for j_focus in range(0, size_map, size_pooling): focus = feature_map[ i_focus : i_focus + size_pooling, j_focus : j_focus + size_pooling, ] if pooling_type == "average_pool": # average pooling map_pooled.append(np.average(focus)) elif pooling_type == "max_pooling": # max pooling map_pooled.append(np.max(focus)) map_pooled = np.asmatrix(map_pooled).reshape(size_pooled, size_pooled) featuremap_pooled.append(map_pooled) return featuremap_pooled def _expand(self, data): # expanding three dimension data to one dimension list data_expanded = [] for i in range(len(data)): shapes = np.shape(data[i]) data_listed = data[i].reshape(1, shapes[0] * shapes[1]) data_listed = data_listed.getA().tolist()[0] data_expanded.extend(data_listed) data_expanded = np.asarray(data_expanded) return data_expanded def _expand_mat(self, data_mat): # expanding matrix to one dimension list data_mat = np.asarray(data_mat) shapes = np.shape(data_mat) data_expanded = data_mat.reshape(1, shapes[0] * shapes[1]) return data_expanded def _calculate_gradient_from_pool( self, out_map, pd_pool, num_map, size_map, size_pooling ): """ calculate the gradient from the data slice of pool layer pd_pool: list of matrix out_map: the shape of data slice(size_map*size_map) return: pd_all: list of matrix, [num, size_map, size_map] """ pd_all = [] i_pool = 0 for i_map in range(num_map): pd_conv1 = np.ones((size_map, size_map)) for i in range(0, size_map, size_pooling): for j in range(0, size_map, size_pooling): pd_conv1[i : i + size_pooling, j : j + size_pooling] = pd_pool[ i_pool ] i_pool = i_pool + 1 pd_conv2 = np.multiply( pd_conv1, np.multiply(out_map[i_map], (1 - out_map[i_map])) ) pd_all.append(pd_conv2) return pd_all def train( self, patterns, datas_train, datas_teach, n_repeat, error_accuracy, draw_e=bool ): # model traning print("----------------------Start Training-------------------------") print((" - - Shape: Train_Data ", np.shape(datas_train))) print((" - - Shape: Teach_Data ", np.shape(datas_teach))) rp = 0 all_mse = [] mse = 10000 while rp < n_repeat and mse >= error_accuracy: error_count = 0 print(f"-------------Learning Time {rp}--------------") for p in range(len(datas_train)): # print('------------Learning Image: %d--------------'%p) data_train = np.asmatrix(datas_train[p]) data_teach = np.asarray(datas_teach[p]) data_focus1, data_conved1 = self.convolute( data_train, self.conv1, self.w_conv1, self.thre_conv1, conv_step=self.step_conv1, ) data_pooled1 = self.pooling(data_conved1, self.size_pooling1) shape_featuremap1 = np.shape(data_conved1) """ print(' -----original shape ', np.shape(data_train)) 
print(' ---- after convolution ',np.shape(data_conv1)) print(' -----after pooling ',np.shape(data_pooled1)) """ data_bp_input = self._expand(data_pooled1) bp_out1 = data_bp_input bp_net_j = np.dot(bp_out1, self.vji.T) - self.thre_bp2 bp_out2 = self.sig(bp_net_j) bp_net_k = np.dot(bp_out2, self.wkj.T) - self.thre_bp3 bp_out3 = self.sig(bp_net_k) # --------------Model Leaning ------------------------ # calculate error and gradient--------------- pd_k_all = np.multiply( (data_teach - bp_out3), np.multiply(bp_out3, (1 - bp_out3)) ) pd_j_all = np.multiply( np.dot(pd_k_all, self.wkj), np.multiply(bp_out2, (1 - bp_out2)) ) pd_i_all = np.dot(pd_j_all, self.vji) pd_conv1_pooled = pd_i_all / (self.size_pooling1 * self.size_pooling1) pd_conv1_pooled = pd_conv1_pooled.T.getA().tolist() pd_conv1_all = self._calculate_gradient_from_pool( data_conved1, pd_conv1_pooled, shape_featuremap1[0], shape_featuremap1[1], self.size_pooling1, ) # weight and threshold learning process--------- # convolution layer for k_conv in range(self.conv1[1]): pd_conv_list = self._expand_mat(pd_conv1_all[k_conv]) delta_w = self.rate_weight * np.dot(pd_conv_list, data_focus1) self.w_conv1[k_conv] = self.w_conv1[k_conv] + delta_w.reshape( (self.conv1[0], self.conv1[0]) ) self.thre_conv1[k_conv] = ( self.thre_conv1[k_conv] - np.sum(pd_conv1_all[k_conv]) * self.rate_thre ) # all connected layer self.wkj = self.wkj + pd_k_all.T * bp_out2 * self.rate_weight self.vji = self.vji + pd_j_all.T * bp_out1 * self.rate_weight self.thre_bp3 = self.thre_bp3 - pd_k_all * self.rate_thre self.thre_bp2 = self.thre_bp2 - pd_j_all * self.rate_thre # calculate the sum error of all single image errors = np.sum(abs(data_teach - bp_out3)) error_count += errors # print(' ----Teach ',data_teach) # print(' ----BP_output ',bp_out3) rp = rp + 1 mse = error_count / patterns all_mse.append(mse) def draw_error(): yplot = [error_accuracy for i in range(int(n_repeat * 1.2))] plt.plot(all_mse, "+-") plt.plot(yplot, "r--") plt.xlabel("Learning Times") plt.ylabel("All_mse") plt.grid(True, alpha=0.5) plt.show() print("------------------Training Complished---------------------") print((" - - Training epoch: ", rp, f" - - Mse: {mse:.6f}")) if draw_e: draw_error() return mse def predict(self, datas_test): # model predict produce_out = [] print("-------------------Start Testing-------------------------") print((" - - Shape: Test_Data ", np.shape(datas_test))) for p in range(len(datas_test)): data_test = np.asmatrix(datas_test[p]) data_focus1, data_conved1 = self.convolute( data_test, self.conv1, self.w_conv1, self.thre_conv1, conv_step=self.step_conv1, ) data_pooled1 = self.pooling(data_conved1, self.size_pooling1) data_bp_input = self._expand(data_pooled1) bp_out1 = data_bp_input bp_net_j = bp_out1 * self.vji.T - self.thre_bp2 bp_out2 = self.sig(bp_net_j) bp_net_k = bp_out2 * self.wkj.T - self.thre_bp3 bp_out3 = self.sig(bp_net_k) produce_out.extend(bp_out3.getA().tolist()) res = [list(map(self.do_round, each)) for each in produce_out] return np.asarray(res) def convolution(self, data): # return the data of image after convoluting process so we can check it out data_test = np.asmatrix(data) data_focus1, data_conved1 = self.convolute( data_test, self.conv1, self.w_conv1, self.thre_conv1, conv_step=self.step_conv1, ) data_pooled1 = self.pooling(data_conved1, self.size_pooling1) return data_conved1, data_pooled1 if __name__ == "__main__": """ I will put the example on other file """
1
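The `cnn.py` snapshot above builds its forward pass from an explicit convolution (`convolute`) followed by average pooling (`pooling`), both written against plain numpy matrices. Below is a minimal, self-contained sketch of that same convolve-then-pool step; it is an editor illustration, not code from the repository, and the 6x6 image, 3x3 kernel, threshold and seed are made-up values.

```python
import numpy as np


def sigmoid(x: float) -> float:
    return 1 / (1 + np.exp(-x))


def convolve(image: np.ndarray, kernel: np.ndarray, threshold: float, step: int = 1) -> np.ndarray:
    # slide the kernel over the image, subtract the threshold and squash,
    # mirroring what convolute() does for a single kernel
    size_img, size_k = image.shape[0], kernel.shape[0]
    out_size = (size_img - size_k) // step + 1
    out = np.zeros((out_size, out_size))
    for i in range(out_size):
        for j in range(out_size):
            patch = image[i * step : i * step + size_k, j * step : j * step + size_k]
            out[i, j] = sigmoid(np.sum(patch * kernel) - threshold)
    return out


def average_pool(feature_map: np.ndarray, size: int) -> np.ndarray:
    # non-overlapping average pooling, mirroring pooling(..., "average_pool")
    out_size = feature_map.shape[0] // size
    out = np.zeros((out_size, out_size))
    for i in range(out_size):
        for j in range(out_size):
            out[i, j] = feature_map[i * size : (i + 1) * size, j * size : (j + 1) * size].mean()
    return out


rng = np.random.default_rng(0)
image = rng.random((6, 6))          # made-up 6x6 "photo"
kernel = rng.random((3, 3)) - 0.5   # made-up 3x3 convolution kernel
feature_map = convolve(image, kernel, threshold=0.7)
pooled = average_pool(feature_map, 2)
print(feature_map.shape, pooled.shape)  # (4, 4) (2, 2)
```

Running the sketch prints `(4, 4) (2, 2)`, the same shapes the class's `size_feature_map` and `size_pooling1` arithmetic would produce for a 6x6 input, 3x3 kernel, step 1 and pooling size 2.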
TheAlgorithms/Python
8,551
New gitter link added or replaced
### Describe your change: Fixes #8546 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
KaixLina
"2023-03-26T04:26:40Z"
"2023-03-26T15:19:19Z"
3f9150c1b2dd15808a4962e03a1455f8d825512c
7cdb011ba440a07768179bfaea190bddefc890d8
New gitter link added or replaced. ### Describe your change: Fixes #8546 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
# Project Euler Problems are taken from https://projecteuler.net/, the Project Euler. [Problems are licensed under CC BY-NC-SA 4.0](https://projecteuler.net/copyright). Project Euler is a series of challenging mathematical/computer programming problems that require more than just mathematical insights to solve. Project Euler is ideal for mathematicians who are learning to code. The solutions will be checked by our [automated testing on GitHub Actions](https://github.com/TheAlgorithms/Python/actions) with the help of [this script](https://github.com/TheAlgorithms/Python/blob/master/scripts/validate_solutions.py). The efficiency of your code is also checked. You can view the top 10 slowest solutions on GitHub Actions logs (under `slowest 10 durations`) and open a pull request to improve those solutions. ## Solution Guidelines Welcome to [TheAlgorithms/Python](https://github.com/TheAlgorithms/Python)! Before reading the solution guidelines, make sure you read the whole [Contributing Guidelines](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md) as it won't be repeated in here. If you have any doubt on the guidelines, please feel free to [state it clearly in an issue](https://github.com/TheAlgorithms/Python/issues/new) or ask the community in [Gitter](https://gitter.im/TheAlgorithms). You can use the [template](https://github.com/TheAlgorithms/Python/blob/master/project_euler/README.md#solution-template) we have provided below as your starting point but be sure to read the [Coding Style](https://github.com/TheAlgorithms/Python/blob/master/project_euler/README.md#coding-style) part first. ### Coding Style * Please maintain consistency in project directory and solution file names. Keep the following points in mind: * Create a new directory only for the problems which do not exist yet. * If you create a new directory, please create an empty `__init__.py` file inside it as well. * Please name the project **directory** as `problem_<problem_number>` where `problem_number` should be filled with 0s so as to occupy 3 digits. Example: `problem_001`, `problem_002`, `problem_067`, `problem_145`, and so on. * Please provide a link to the problem and other references, if used, in the **module-level docstring**. * All imports should come ***after*** the module-level docstring. * You can have as many helper functions as you want but there should be one main function called `solution` which should satisfy the conditions as stated below: * It should contain positional argument(s) whose default value is the question input. Example: Please take a look at [Problem 1](https://projecteuler.net/problem=1) where the question is to *Find the sum of all the multiples of 3 or 5 below 1000.* In this case the main solution function will be `solution(limit: int = 1000)`. * When the `solution` function is called without any arguments like so: `solution()`, it should return the answer to the problem. * Every function, which includes all the helper functions, if any, and the main solution function, should have `doctest` in the function docstring along with a brief statement mentioning what the function is about. * There should not be a `doctest` for testing the answer as that is done by our GitHub Actions build using this [script](https://github.com/TheAlgorithms/Python/blob/master/scripts/validate_solutions.py). Keeping in mind the above example of [Problem 1](https://projecteuler.net/problem=1): ```python def solution(limit: int = 1000): """ A brief statement mentioning what the function is about. 
You can have a detailed explanation about the solution method in the module-level docstring. >>> solution(1) ... >>> solution(16) ... >>> solution(100) ... """ ``` ### Solution Template You can use the below template as your starting point but please read the [Coding Style](https://github.com/TheAlgorithms/Python/blob/master/project_euler/README.md#coding-style) first to understand how the template works. Please change the name of the helper functions accordingly, change the parameter names with a descriptive one, replace the content within `[square brackets]` (including the brackets) with the appropriate content. ```python """ Project Euler Problem [problem number]: [link to the original problem] ... [Entire problem statement] ... ... [Solution explanation - Optional] ... References [Optional]: - [Wikipedia link to the topic] - [Stackoverflow link] ... """ import module1 import module2 ... def helper1(arg1: [type hint], arg2: [type hint], ...) -> [Return type hint]: """ A brief statement explaining what the function is about. ... A more elaborate description ... [Optional] ... [Doctest] ... """ ... # calculations ... return # You can have multiple helper functions but the solution function should be # after all the helper functions ... def solution(arg1: [type hint], arg2: [type hint], ...) -> [Return type hint]: """ A brief statement mentioning what the function is about. You can have a detailed explanation about the solution in the module-level docstring. ... [Doctest as mentioned above] ... """ ... # calculations ... return answer if __name__ == "__main__": print(f"{solution() = }") ```
# Project Euler Problems are taken from https://projecteuler.net/, the Project Euler. [Problems are licensed under CC BY-NC-SA 4.0](https://projecteuler.net/copyright). Project Euler is a series of challenging mathematical/computer programming problems that require more than just mathematical insights to solve. Project Euler is ideal for mathematicians who are learning to code. The solutions will be checked by our [automated testing on GitHub Actions](https://github.com/TheAlgorithms/Python/actions) with the help of [this script](https://github.com/TheAlgorithms/Python/blob/master/scripts/validate_solutions.py). The efficiency of your code is also checked. You can view the top 10 slowest solutions on GitHub Actions logs (under `slowest 10 durations`) and open a pull request to improve those solutions. ## Solution Guidelines Welcome to [TheAlgorithms/Python](https://github.com/TheAlgorithms/Python)! Before reading the solution guidelines, make sure you read the whole [Contributing Guidelines](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md) as it won't be repeated in here. If you have any doubt on the guidelines, please feel free to [state it clearly in an issue](https://github.com/TheAlgorithms/Python/issues/new) or ask the community in [Gitter](https://app.gitter.im/#/room/#TheAlgorithms_community:gitter.im). You can use the [template](https://github.com/TheAlgorithms/Python/blob/master/project_euler/README.md#solution-template) we have provided below as your starting point but be sure to read the [Coding Style](https://github.com/TheAlgorithms/Python/blob/master/project_euler/README.md#coding-style) part first. ### Coding Style * Please maintain consistency in project directory and solution file names. Keep the following points in mind: * Create a new directory only for the problems which do not exist yet. * If you create a new directory, please create an empty `__init__.py` file inside it as well. * Please name the project **directory** as `problem_<problem_number>` where `problem_number` should be filled with 0s so as to occupy 3 digits. Example: `problem_001`, `problem_002`, `problem_067`, `problem_145`, and so on. * Please provide a link to the problem and other references, if used, in the **module-level docstring**. * All imports should come ***after*** the module-level docstring. * You can have as many helper functions as you want but there should be one main function called `solution` which should satisfy the conditions as stated below: * It should contain positional argument(s) whose default value is the question input. Example: Please take a look at [Problem 1](https://projecteuler.net/problem=1) where the question is to *Find the sum of all the multiples of 3 or 5 below 1000.* In this case the main solution function will be `solution(limit: int = 1000)`. * When the `solution` function is called without any arguments like so: `solution()`, it should return the answer to the problem. * Every function, which includes all the helper functions, if any, and the main solution function, should have `doctest` in the function docstring along with a brief statement mentioning what the function is about. * There should not be a `doctest` for testing the answer as that is done by our GitHub Actions build using this [script](https://github.com/TheAlgorithms/Python/blob/master/scripts/validate_solutions.py). 
Keeping in mind the above example of [Problem 1](https://projecteuler.net/problem=1): ```python def solution(limit: int = 1000): """ A brief statement mentioning what the function is about. You can have a detailed explanation about the solution method in the module-level docstring. >>> solution(1) ... >>> solution(16) ... >>> solution(100) ... """ ``` ### Solution Template You can use the below template as your starting point but please read the [Coding Style](https://github.com/TheAlgorithms/Python/blob/master/project_euler/README.md#coding-style) first to understand how the template works. Please change the name of the helper functions accordingly, change the parameter names with a descriptive one, replace the content within `[square brackets]` (including the brackets) with the appropriate content. ```python """ Project Euler Problem [problem number]: [link to the original problem] ... [Entire problem statement] ... ... [Solution explanation - Optional] ... References [Optional]: - [Wikipedia link to the topic] - [Stackoverflow link] ... """ import module1 import module2 ... def helper1(arg1: [type hint], arg2: [type hint], ...) -> [Return type hint]: """ A brief statement explaining what the function is about. ... A more elaborate description ... [Optional] ... [Doctest] ... """ ... # calculations ... return # You can have multiple helper functions but the solution function should be # after all the helper functions ... def solution(arg1: [type hint], arg2: [type hint], ...) -> [Return type hint]: """ A brief statement mentioning what the function is about. You can have a detailed explanation about the solution in the module-level docstring. ... [Doctest as mentioned above] ... """ ... # calculations ... return answer if __name__ == "__main__": print(f"{solution() = }") ```
1
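The solution guidelines in the README above describe a `solution(limit: int = 1000)` entry point with doctests for Problem 1. As a concrete illustration of that template, here is a sketch of what a conforming solution file could look like; it is not taken from the repository's `project_euler/problem_001` directory.

```python
"""
Project Euler Problem 1: https://projecteuler.net/problem=1

Find the sum of all the multiples of 3 or 5 below 1000.
"""


def solution(limit: int = 1000) -> int:
    """
    Return the sum of all multiples of 3 or 5 below limit.

    >>> solution(10)
    23
    >>> solution(16)
    60
    """
    return sum(n for n in range(limit) if n % 3 == 0 or n % 5 == 0)


if __name__ == "__main__":
    print(f"{solution() = }")
```

The doctests cover small limits only, leaving the actual answer to the automated solution validator, as the guidelines request.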
TheAlgorithms/Python
8,551
New gitter link added or replaced
### Describe your change: Fixes #8546 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
KaixLina
"2023-03-26T04:26:40Z"
"2023-03-26T15:19:19Z"
3f9150c1b2dd15808a4962e03a1455f8d825512c
7cdb011ba440a07768179bfaea190bddefc890d8
New gitter link added or replaced. ### Describe your change: Fixes #8546 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
[tool.pytest.ini_options] markers = [ "mat_ops: mark a test as utilizing matrix operations.", ] addopts = [ "--durations=10", "--doctest-modules", "--showlocals", ] [tool.coverage.report] omit = [".env/*"] sort = "Cover" [tool.codespell] ignore-words-list = "3rt,ans,crate,damon,fo,followings,hist,iff,kwanza,mater,secant,som,sur,tim,zar" skip = "./.*,*.json,ciphers/prehistoric_men.txt,project_euler/problem_022/p022_names.txt,pyproject.toml,strings/dictionary.txt,strings/words.txt" [tool.ruff] ignore = [ # `ruff rule S101` for a description of that rule "B904", # B904: Within an `except` clause, raise exceptions with `raise ... from err` "B905", # B905: `zip()` without an explicit `strict=` parameter "E741", # E741: Ambiguous variable name 'l' "G004", # G004 Logging statement uses f-string "N999", # N999: Invalid module name "PLC1901", # PLC1901: `{}` can be simplified to `{}` as an empty string is falsey "PLR2004", # PLR2004: Magic value used in comparison "PLR5501", # PLR5501: Consider using `elif` instead of `else` "PLW0120", # PLW0120: `else` clause on loop without a `break` statement "PLW060", # PLW060: Using global for `{name}` but no assignment is done -- DO NOT FIX "PLW2901", # PLW2901: Redefined loop variable "RUF00", # RUF00: Ambiguous unicode character -- DO NOT FIX "RUF100", # RUF100: Unused `noqa` directive "S101", # S101: Use of `assert` detected -- DO NOT FIX "S105", # S105: Possible hardcoded password: 'password' "S113", # S113: Probable use of requests call without timeout "UP038", # UP038: Use `X | Y` in `{}` call instead of `(X, Y)` -- DO NOT FIX ] select = [ # https://beta.ruff.rs/docs/rules "A", # A: builtins "B", # B: bugbear "C40", # C40: comprehensions "C90", # C90: mccabe code complexity "E", # E: pycodestyle errors "F", # F: pyflakes "G", # G: logging format "I", # I: isort "N", # N: pep8 naming "PL", # PL: pylint "PIE", # PIE: pie "PYI", # PYI: type hinting stub files "RUF", # RUF: ruff "S", # S: bandit "TID", # TID: tidy imports "UP", # UP: pyupgrade "W", # W: pycodestyle warnings "YTT", # YTT: year 2020 ] show-source = true target-version = "py311" [tool.ruff.mccabe] # DO NOT INCREASE THIS VALUE max-complexity = 20 # default: 10 [tool.ruff.pylint] # DO NOT INCREASE THESE VALUES max-args = 10 # default: 5 max-branches = 20 # default: 12 max-returns = 8 # default: 6 max-statements = 88 # default: 50
[tool.pytest.ini_options] markers = [ "mat_ops: mark a test as utilizing matrix operations.", ] addopts = [ "--durations=10", "--doctest-modules", "--showlocals", ] [tool.coverage.report] omit = [".env/*"] sort = "Cover" [tool.codespell] ignore-words-list = "3rt,ans,crate,damon,fo,followings,hist,iff,kwanza,mater,secant,som,sur,tim,zar" skip = "./.*,*.json,ciphers/prehistoric_men.txt,project_euler/problem_022/p022_names.txt,pyproject.toml,strings/dictionary.txt,strings/words.txt" [tool.ruff] ignore = [ # `ruff rule S101` for a description of that rule "B904", # B904: Within an `except` clause, raise exceptions with `raise ... from err` "B905", # B905: `zip()` without an explicit `strict=` parameter "E741", # E741: Ambiguous variable name 'l' "G004", # G004 Logging statement uses f-string "N999", # N999: Invalid module name "PLC1901", # PLC1901: `{}` can be simplified to `{}` as an empty string is falsey "PLR2004", # PLR2004: Magic value used in comparison "PLR5501", # PLR5501: Consider using `elif` instead of `else` "PLW0120", # PLW0120: `else` clause on loop without a `break` statement "PLW060", # PLW060: Using global for `{name}` but no assignment is done -- DO NOT FIX "PLW2901", # PLW2901: Redefined loop variable "RUF00", # RUF00: Ambiguous unicode character -- DO NOT FIX "RUF100", # RUF100: Unused `noqa` directive "S101", # S101: Use of `assert` detected -- DO NOT FIX "S105", # S105: Possible hardcoded password: 'password' "S113", # S113: Probable use of requests call without timeout "S311", # S311: Standard pseudo-random generators are not suitable for cryptographic purposes "UP038", # UP038: Use `X | Y` in `{}` call instead of `(X, Y)` -- DO NOT FIX ] select = [ # https://beta.ruff.rs/docs/rules "A", # A: builtins "B", # B: bugbear "C40", # C40: comprehensions "C90", # C90: mccabe code complexity "E", # E: pycodestyle errors "F", # F: pyflakes "G", # G: logging format "I", # I: isort "N", # N: pep8 naming "PL", # PL: pylint "PIE", # PIE: pie "PYI", # PYI: type hinting stub files "RUF", # RUF: ruff "S", # S: bandit "TID", # TID: tidy imports "UP", # UP: pyupgrade "W", # W: pycodestyle warnings "YTT", # YTT: year 2020 ] show-source = true target-version = "py311" [tool.ruff.mccabe] # DO NOT INCREASE THIS VALUE max-complexity = 20 # default: 10 [tool.ruff.pylint] # DO NOT INCREASE THESE VALUES max-args = 10 # default: 5 max-branches = 20 # default: 12 max-returns = 8 # default: 6 max-statements = 88 # default: 50
1
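The only change in this `pyproject.toml` diff is adding `S311` ("Standard pseudo-random generators are not suitable for cryptographic purposes") to the ruff ignore list. The short snippet below is an editor illustration of the distinction that rule draws, presumably why a repository of algorithm demos chooses to ignore it project-wide; it is not part of the snapshot.

```python
import random
import secrets

# random is fine for simulations and algorithm demos, which is the kind of
# usage S311 would otherwise flag throughout this repository
die_roll = random.randint(1, 6)

# anything security-sensitive should use the secrets module instead,
# which is what S311 is steering code towards
session_token = secrets.token_hex(16)

print(die_roll, session_token)
```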
TheAlgorithms/Python
8,551
New gitter link added or replaced
### Describe your change: Fixes #8546 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
KaixLina
"2023-03-26T04:26:40Z"
"2023-03-26T15:19:19Z"
3f9150c1b2dd15808a4962e03a1455f8d825512c
7cdb011ba440a07768179bfaea190bddefc890d8
New gitter link added or replaced. ### Describe your change: Fixes #8546 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
import json import os import re import sys import urllib.request import requests from bs4 import BeautifulSoup headers = { "User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36" " (KHTML, like Gecko) Chrome/70.0.3538.102 Safari/537.36 Edge/18.19582" } def download_images_from_google_query(query: str = "dhaka", max_images: int = 5) -> int: """ Searches google using the provided query term and downloads the images in a folder. Args: query : The image search term to be provided by the user. Defaults to "dhaka". image_numbers : [description]. Defaults to 5. Returns: The number of images successfully downloaded. # Comment out slow (4.20s call) doctests # >>> download_images_from_google_query() 5 # >>> download_images_from_google_query("potato") 5 """ max_images = min(max_images, 50) # Prevent abuse! params = { "q": query, "tbm": "isch", "hl": "en", "ijn": "0", } html = requests.get("https://www.google.com/search", params=params, headers=headers) soup = BeautifulSoup(html.text, "html.parser") matched_images_data = "".join( re.findall(r"AF_initDataCallback\(([^<]+)\);", str(soup.select("script"))) ) matched_images_data_fix = json.dumps(matched_images_data) matched_images_data_json = json.loads(matched_images_data_fix) matched_google_image_data = re.findall( r"\[\"GRID_STATE0\",null,\[\[1,\[0,\".*?\",(.*),\"All\",", matched_images_data_json, ) if not matched_google_image_data: return 0 removed_matched_google_images_thumbnails = re.sub( r"\[\"(https\:\/\/encrypted-tbn0\.gstatic\.com\/images\?.*?)\",\d+,\d+\]", "", str(matched_google_image_data), ) matched_google_full_resolution_images = re.findall( r"(?:'|,),\[\"(https:|http.*?)\",\d+,\d+\]", removed_matched_google_images_thumbnails, ) for index, fixed_full_res_image in enumerate(matched_google_full_resolution_images): if index >= max_images: return index original_size_img_not_fixed = bytes(fixed_full_res_image, "ascii").decode( "unicode-escape" ) original_size_img = bytes(original_size_img_not_fixed, "ascii").decode( "unicode-escape" ) opener = urllib.request.build_opener() opener.addheaders = [ ( "User-Agent", "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36" " (KHTML, like Gecko) Chrome/70.0.3538.102 Safari/537.36 Edge/18.19582", ) ] urllib.request.install_opener(opener) path_name = f"query_{query.replace(' ', '_')}" if not os.path.exists(path_name): os.makedirs(path_name) urllib.request.urlretrieve( original_size_img, f"{path_name}/original_size_img_{index}.jpg" ) return index if __name__ == "__main__": try: image_count = download_images_from_google_query(sys.argv[1]) print(f"{image_count} images were downloaded to disk.") except IndexError: print("Please provide a search term.") raise
import json import os import re import sys import urllib.request import requests from bs4 import BeautifulSoup headers = { "User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36" " (KHTML, like Gecko) Chrome/70.0.3538.102 Safari/537.36 Edge/18.19582" } def download_images_from_google_query(query: str = "dhaka", max_images: int = 5) -> int: """ Searches google using the provided query term and downloads the images in a folder. Args: query : The image search term to be provided by the user. Defaults to "dhaka". image_numbers : [description]. Defaults to 5. Returns: The number of images successfully downloaded. # Comment out slow (4.20s call) doctests # >>> download_images_from_google_query() 5 # >>> download_images_from_google_query("potato") 5 """ max_images = min(max_images, 50) # Prevent abuse! params = { "q": query, "tbm": "isch", "hl": "en", "ijn": "0", } html = requests.get("https://www.google.com/search", params=params, headers=headers) soup = BeautifulSoup(html.text, "html.parser") matched_images_data = "".join( re.findall(r"AF_initDataCallback\(([^<]+)\);", str(soup.select("script"))) ) matched_images_data_fix = json.dumps(matched_images_data) matched_images_data_json = json.loads(matched_images_data_fix) matched_google_image_data = re.findall( r"\[\"GRID_STATE0\",null,\[\[1,\[0,\".*?\",(.*),\"All\",", matched_images_data_json, ) if not matched_google_image_data: return 0 removed_matched_google_images_thumbnails = re.sub( r"\[\"(https\:\/\/encrypted-tbn0\.gstatic\.com\/images\?.*?)\",\d+,\d+\]", "", str(matched_google_image_data), ) matched_google_full_resolution_images = re.findall( r"(?:'|,),\[\"(https:|http.*?)\",\d+,\d+\]", removed_matched_google_images_thumbnails, ) for index, fixed_full_res_image in enumerate(matched_google_full_resolution_images): if index >= max_images: return index original_size_img_not_fixed = bytes(fixed_full_res_image, "ascii").decode( "unicode-escape" ) original_size_img = bytes(original_size_img_not_fixed, "ascii").decode( "unicode-escape" ) opener = urllib.request.build_opener() opener.addheaders = [ ( "User-Agent", "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36" " (KHTML, like Gecko) Chrome/70.0.3538.102 Safari/537.36 Edge/18.19582", ) ] urllib.request.install_opener(opener) path_name = f"query_{query.replace(' ', '_')}" if not os.path.exists(path_name): os.makedirs(path_name) urllib.request.urlretrieve( # noqa: S310 original_size_img, f"{path_name}/original_size_img_{index}.jpg" ) return index if __name__ == "__main__": try: image_count = download_images_from_google_query(sys.argv[1]) print(f"{image_count} images were downloaded to disk.") except IndexError: print("Please provide a search term.") raise
1
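The diff above silences ruff's S310 warning on `urllib.request.urlretrieve` with a `# noqa: S310` comment. S310 flags URL-opening calls that could be handed unexpected schemes such as `file://`. The sketch below shows the kind of scheme check that addresses the underlying concern; the `fetch` helper and its names are hypothetical, not part of the snapshot.

```python
from urllib.parse import urlparse
from urllib.request import urlopen

ALLOWED_SCHEMES = frozenset({"http", "https"})


def fetch(url: str) -> bytes:
    # reject file://, ftp:// and similar schemes before handing the URL to
    # urllib; this is the risk S310 is warning about
    if urlparse(url).scheme not in ALLOWED_SCHEMES:
        raise ValueError(f"unsupported URL scheme in {url!r}")
    with urlopen(url) as response:  # ruff may still want a `# noqa: S310` here
        return response.read()
```

Even with such a check in place, the linter cannot see the validation, so an explicit suppression like the one added in the diff is a common companion to it.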
TheAlgorithms/Python
8,551
New gitter link added or replaced
### Describe your change: Fixes #8546 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
KaixLina
"2023-03-26T04:26:40Z"
"2023-03-26T15:19:19Z"
3f9150c1b2dd15808a4962e03a1455f8d825512c
7cdb011ba440a07768179bfaea190bddefc890d8
New gitter link added or replaced. ### Describe your change: Fixes #8546 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
""" Python program for Bitonic Sort. Note that this program works only when size of input is a power of 2. """ from __future__ import annotations def comp_and_swap(array: list[int], index1: int, index2: int, direction: int) -> None: """Compare the value at given index1 and index2 of the array and swap them as per the given direction. The parameter direction indicates the sorting direction, ASCENDING(1) or DESCENDING(0); if (a[i] > a[j]) agrees with the direction, then a[i] and a[j] are interchanged. >>> arr = [12, 42, -21, 1] >>> comp_and_swap(arr, 1, 2, 1) >>> arr [12, -21, 42, 1] >>> comp_and_swap(arr, 1, 2, 0) >>> arr [12, 42, -21, 1] >>> comp_and_swap(arr, 0, 3, 1) >>> arr [1, 42, -21, 12] >>> comp_and_swap(arr, 0, 3, 0) >>> arr [12, 42, -21, 1] """ if (direction == 1 and array[index1] > array[index2]) or ( direction == 0 and array[index1] < array[index2] ): array[index1], array[index2] = array[index2], array[index1] def bitonic_merge(array: list[int], low: int, length: int, direction: int) -> None: """ It recursively sorts a bitonic sequence in ascending order, if direction = 1, and in descending if direction = 0. The sequence to be sorted starts at index position low, the parameter length is the number of elements to be sorted. >>> arr = [12, 42, -21, 1] >>> bitonic_merge(arr, 0, 4, 1) >>> arr [-21, 1, 12, 42] >>> bitonic_merge(arr, 0, 4, 0) >>> arr [42, 12, 1, -21] """ if length > 1: middle = int(length / 2) for i in range(low, low + middle): comp_and_swap(array, i, i + middle, direction) bitonic_merge(array, low, middle, direction) bitonic_merge(array, low + middle, middle, direction) def bitonic_sort(array: list[int], low: int, length: int, direction: int) -> None: """ This function first produces a bitonic sequence by recursively sorting its two halves in opposite sorting orders, and then calls bitonic_merge to make them in the same order. >>> arr = [12, 34, 92, -23, 0, -121, -167, 145] >>> bitonic_sort(arr, 0, 8, 1) >>> arr [-167, -121, -23, 0, 12, 34, 92, 145] >>> bitonic_sort(arr, 0, 8, 0) >>> arr [145, 92, 34, 12, 0, -23, -121, -167] """ if length > 1: middle = int(length / 2) bitonic_sort(array, low, middle, 1) bitonic_sort(array, low + middle, middle, 0) bitonic_merge(array, low, length, direction) if __name__ == "__main__": user_input = input("Enter numbers separated by a comma:\n").strip() unsorted = [int(item.strip()) for item in user_input.split(",")] bitonic_sort(unsorted, 0, len(unsorted), 1) print("\nSorted array in ascending order is: ", end="") print(*unsorted, sep=", ") bitonic_merge(unsorted, 0, len(unsorted), 0) print("Sorted array in descending order is: ", end="") print(*unsorted, sep=", ")
""" Python program for Bitonic Sort. Note that this program works only when size of input is a power of 2. """ from __future__ import annotations def comp_and_swap(array: list[int], index1: int, index2: int, direction: int) -> None: """Compare the value at given index1 and index2 of the array and swap them as per the given direction. The parameter direction indicates the sorting direction, ASCENDING(1) or DESCENDING(0); if (a[i] > a[j]) agrees with the direction, then a[i] and a[j] are interchanged. >>> arr = [12, 42, -21, 1] >>> comp_and_swap(arr, 1, 2, 1) >>> arr [12, -21, 42, 1] >>> comp_and_swap(arr, 1, 2, 0) >>> arr [12, 42, -21, 1] >>> comp_and_swap(arr, 0, 3, 1) >>> arr [1, 42, -21, 12] >>> comp_and_swap(arr, 0, 3, 0) >>> arr [12, 42, -21, 1] """ if (direction == 1 and array[index1] > array[index2]) or ( direction == 0 and array[index1] < array[index2] ): array[index1], array[index2] = array[index2], array[index1] def bitonic_merge(array: list[int], low: int, length: int, direction: int) -> None: """ It recursively sorts a bitonic sequence in ascending order, if direction = 1, and in descending if direction = 0. The sequence to be sorted starts at index position low, the parameter length is the number of elements to be sorted. >>> arr = [12, 42, -21, 1] >>> bitonic_merge(arr, 0, 4, 1) >>> arr [-21, 1, 12, 42] >>> bitonic_merge(arr, 0, 4, 0) >>> arr [42, 12, 1, -21] """ if length > 1: middle = int(length / 2) for i in range(low, low + middle): comp_and_swap(array, i, i + middle, direction) bitonic_merge(array, low, middle, direction) bitonic_merge(array, low + middle, middle, direction) def bitonic_sort(array: list[int], low: int, length: int, direction: int) -> None: """ This function first produces a bitonic sequence by recursively sorting its two halves in opposite sorting orders, and then calls bitonic_merge to make them in the same order. >>> arr = [12, 34, 92, -23, 0, -121, -167, 145] >>> bitonic_sort(arr, 0, 8, 1) >>> arr [-167, -121, -23, 0, 12, 34, 92, 145] >>> bitonic_sort(arr, 0, 8, 0) >>> arr [145, 92, 34, 12, 0, -23, -121, -167] """ if length > 1: middle = int(length / 2) bitonic_sort(array, low, middle, 1) bitonic_sort(array, low + middle, middle, 0) bitonic_merge(array, low, length, direction) if __name__ == "__main__": user_input = input("Enter numbers separated by a comma:\n").strip() unsorted = [int(item.strip()) for item in user_input.split(",")] bitonic_sort(unsorted, 0, len(unsorted), 1) print("\nSorted array in ascending order is: ", end="") print(*unsorted, sep=", ") bitonic_merge(unsorted, 0, len(unsorted), 0) print("Sorted array in descending order is: ", end="") print(*unsorted, sep=", ")
-1
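Both snapshots of the bitonic sort are identical, and the module docstring notes that the algorithm only works when the input size is a power of two. The helper below is an editor sketch, not repository code, showing one way callers could check and pad their input before invoking `bitonic_sort`; the max-value padding assumes an ascending sort so the sentinels land at the end.

```python
import math


def is_power_of_two(n: int) -> bool:
    """True for 1, 2, 4, 8, ...; bitonic networks need such a length."""
    return n > 0 and n & (n - 1) == 0


def pad_to_power_of_two(values: list[int]) -> list[int]:
    """Pad with copies of the maximum so an ascending sort pushes them last."""
    if not values or is_power_of_two(len(values)):
        return list(values)
    target = 2 ** math.ceil(math.log2(len(values)))
    return values + [max(values)] * (target - len(values))


print(is_power_of_two(8), is_power_of_two(6))  # True False
print(pad_to_power_of_two([7, 3, 9, 1, 4]))    # [7, 3, 9, 1, 4, 9, 9, 9]
```

After an ascending bitonic sort of the padded list, the last three entries are the sentinels and can be sliced off to recover the sorted originals.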
TheAlgorithms/Python
8,551
New gitter link added or replaced
### Describe your change: Fixes #8546 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
KaixLina
"2023-03-26T04:26:40Z"
"2023-03-26T15:19:19Z"
3f9150c1b2dd15808a4962e03a1455f8d825512c
7cdb011ba440a07768179bfaea190bddefc890d8
New gitter link added or replaced. ### Describe your change: Fixes #8546 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
""" FP-GraphMiner - A Fast Frequent Pattern Mining Algorithm for Network Graphs A novel Frequent Pattern Graph Mining algorithm, FP-GraphMiner, that compactly represents a set of network graphs as a Frequent Pattern Graph (or FP-Graph). This graph can be used to efficiently mine frequent subgraphs including maximal frequent subgraphs and maximum common subgraphs. URL: https://www.researchgate.net/publication/235255851 """ # fmt: off edge_array = [ ['ab-e1', 'ac-e3', 'ad-e5', 'bc-e4', 'bd-e2', 'be-e6', 'bh-e12', 'cd-e2', 'ce-e4', 'de-e1', 'df-e8', 'dg-e5', 'dh-e10', 'ef-e3', 'eg-e2', 'fg-e6', 'gh-e6', 'hi-e3'], ['ab-e1', 'ac-e3', 'ad-e5', 'bc-e4', 'bd-e2', 'be-e6', 'cd-e2', 'de-e1', 'df-e8', 'ef-e3', 'eg-e2', 'fg-e6'], ['ab-e1', 'ac-e3', 'bc-e4', 'bd-e2', 'de-e1', 'df-e8', 'dg-e5', 'ef-e3', 'eg-e2', 'eh-e12', 'fg-e6', 'fh-e10', 'gh-e6'], ['ab-e1', 'ac-e3', 'bc-e4', 'bd-e2', 'bh-e12', 'cd-e2', 'df-e8', 'dh-e10'], ['ab-e1', 'ac-e3', 'ad-e5', 'bc-e4', 'bd-e2', 'cd-e2', 'ce-e4', 'de-e1', 'df-e8', 'dg-e5', 'ef-e3', 'eg-e2', 'fg-e6'] ] # fmt: on def get_distinct_edge(edge_array): """ Return Distinct edges from edge array of multiple graphs >>> sorted(get_distinct_edge(edge_array)) ['a', 'b', 'c', 'd', 'e', 'f', 'g', 'h'] """ distinct_edge = set() for row in edge_array: for item in row: distinct_edge.add(item[0]) return list(distinct_edge) def get_bitcode(edge_array, distinct_edge): """ Return bitcode of distinct_edge """ bitcode = ["0"] * len(edge_array) for i, row in enumerate(edge_array): for item in row: if distinct_edge in item[0]: bitcode[i] = "1" break return "".join(bitcode) def get_frequency_table(edge_array): """ Returns Frequency Table """ distinct_edge = get_distinct_edge(edge_array) frequency_table = {} for item in distinct_edge: bit = get_bitcode(edge_array, item) # print('bit',bit) # bt=''.join(bit) s = bit.count("1") frequency_table[item] = [s, bit] # Store [Distinct edge, WT(Bitcode), Bitcode] in descending order sorted_frequency_table = [ [k, v[0], v[1]] for k, v in sorted(frequency_table.items(), key=lambda v: v[1][0], reverse=True) ] return sorted_frequency_table def get_nodes(frequency_table): """ Returns nodes format nodes={bitcode:edges that represent the bitcode} >>> get_nodes([['ab', 5, '11111'], ['ac', 5, '11111'], ['df', 5, '11111'], ... ['bd', 5, '11111'], ['bc', 5, '11111']]) {'11111': ['ab', 'ac', 'df', 'bd', 'bc']} """ nodes = {} for _, item in enumerate(frequency_table): nodes.setdefault(item[2], []).append(item[0]) return nodes def get_cluster(nodes): """ Returns cluster format cluster:{WT(bitcode):nodes with same WT} """ cluster = {} for key, value in nodes.items(): cluster.setdefault(key.count("1"), {})[key] = value return cluster def get_support(cluster): """ Returns support >>> get_support({5: {'11111': ['ab', 'ac', 'df', 'bd', 'bc']}, ... 4: {'11101': ['ef', 'eg', 'de', 'fg'], '11011': ['cd']}, ... 3: {'11001': ['ad'], '10101': ['dg']}, ... 2: {'10010': ['dh', 'bh'], '11000': ['be'], '10100': ['gh'], ... '10001': ['ce']}, ... 
1: {'00100': ['fh', 'eh'], '10000': ['hi']}}) [100.0, 80.0, 60.0, 40.0, 20.0] """ return [i * 100 / len(cluster) for i in cluster] def print_all() -> None: print("\nNodes\n") for key, value in nodes.items(): print(key, value) print("\nSupport\n") print(support) print("\n Cluster \n") for key, value in sorted(cluster.items(), reverse=True): print(key, value) print("\n Graph\n") for key, value in graph.items(): print(key, value) print("\n Edge List of Frequent subgraphs \n") for edge_list in freq_subgraph_edge_list: print(edge_list) def create_edge(nodes, graph, cluster, c1): """ create edge between the nodes """ for i in cluster[c1]: count = 0 c2 = c1 + 1 while c2 < max(cluster.keys()): for j in cluster[c2]: """ creates edge only if the condition satisfies """ if int(i, 2) & int(j, 2) == int(i, 2): if tuple(nodes[i]) in graph: graph[tuple(nodes[i])].append(nodes[j]) else: graph[tuple(nodes[i])] = [nodes[j]] count += 1 if count == 0: c2 = c2 + 1 else: break def construct_graph(cluster, nodes): x = cluster[max(cluster.keys())] cluster[max(cluster.keys()) + 1] = "Header" graph = {} for i in x: if (["Header"],) in graph: graph[(["Header"],)].append(x[i]) else: graph[(["Header"],)] = [x[i]] for i in x: graph[(x[i],)] = [["Header"]] i = 1 while i < max(cluster) - 1: create_edge(nodes, graph, cluster, i) i = i + 1 return graph def my_dfs(graph, start, end, path=None): """ find different DFS walk from given node to Header node """ path = (path or []) + [start] if start == end: paths.append(path) for node in graph[start]: if tuple(node) not in path: my_dfs(graph, tuple(node), end, path) def find_freq_subgraph_given_support(s, cluster, graph): """ find edges of multiple frequent subgraphs """ k = int(s / 100 * (len(cluster) - 1)) for i in cluster[k]: my_dfs(graph, tuple(cluster[k][i]), (["Header"],)) def freq_subgraphs_edge_list(paths): """ returns Edge list for frequent subgraphs """ freq_sub_el = [] for edges in paths: el = [] for j in range(len(edges) - 1): temp = list(edges[j]) for e in temp: edge = (e[0], e[1]) el.append(edge) freq_sub_el.append(el) return freq_sub_el def preprocess(edge_array): """ Preprocess the edge array >>> preprocess([['ab-e1', 'ac-e3', 'ad-e5', 'bc-e4', 'bd-e2', 'be-e6', 'bh-e12', ... 'cd-e2', 'ce-e4', 'de-e1', 'df-e8', 'dg-e5', 'dh-e10', 'ef-e3', ... 'eg-e2', 'fg-e6', 'gh-e6', 'hi-e3']]) """ for i in range(len(edge_array)): for j in range(len(edge_array[i])): t = edge_array[i][j].split("-") edge_array[i][j] = t if __name__ == "__main__": preprocess(edge_array) frequency_table = get_frequency_table(edge_array) nodes = get_nodes(frequency_table) cluster = get_cluster(nodes) support = get_support(cluster) graph = construct_graph(cluster, nodes) find_freq_subgraph_given_support(60, cluster, graph) paths: list = [] freq_subgraph_edge_list = freq_subgraphs_edge_list(paths) print_all()
""" FP-GraphMiner - A Fast Frequent Pattern Mining Algorithm for Network Graphs A novel Frequent Pattern Graph Mining algorithm, FP-GraphMiner, that compactly represents a set of network graphs as a Frequent Pattern Graph (or FP-Graph). This graph can be used to efficiently mine frequent subgraphs including maximal frequent subgraphs and maximum common subgraphs. URL: https://www.researchgate.net/publication/235255851 """ # fmt: off edge_array = [ ['ab-e1', 'ac-e3', 'ad-e5', 'bc-e4', 'bd-e2', 'be-e6', 'bh-e12', 'cd-e2', 'ce-e4', 'de-e1', 'df-e8', 'dg-e5', 'dh-e10', 'ef-e3', 'eg-e2', 'fg-e6', 'gh-e6', 'hi-e3'], ['ab-e1', 'ac-e3', 'ad-e5', 'bc-e4', 'bd-e2', 'be-e6', 'cd-e2', 'de-e1', 'df-e8', 'ef-e3', 'eg-e2', 'fg-e6'], ['ab-e1', 'ac-e3', 'bc-e4', 'bd-e2', 'de-e1', 'df-e8', 'dg-e5', 'ef-e3', 'eg-e2', 'eh-e12', 'fg-e6', 'fh-e10', 'gh-e6'], ['ab-e1', 'ac-e3', 'bc-e4', 'bd-e2', 'bh-e12', 'cd-e2', 'df-e8', 'dh-e10'], ['ab-e1', 'ac-e3', 'ad-e5', 'bc-e4', 'bd-e2', 'cd-e2', 'ce-e4', 'de-e1', 'df-e8', 'dg-e5', 'ef-e3', 'eg-e2', 'fg-e6'] ] # fmt: on def get_distinct_edge(edge_array): """ Return Distinct edges from edge array of multiple graphs >>> sorted(get_distinct_edge(edge_array)) ['a', 'b', 'c', 'd', 'e', 'f', 'g', 'h'] """ distinct_edge = set() for row in edge_array: for item in row: distinct_edge.add(item[0]) return list(distinct_edge) def get_bitcode(edge_array, distinct_edge): """ Return bitcode of distinct_edge """ bitcode = ["0"] * len(edge_array) for i, row in enumerate(edge_array): for item in row: if distinct_edge in item[0]: bitcode[i] = "1" break return "".join(bitcode) def get_frequency_table(edge_array): """ Returns Frequency Table """ distinct_edge = get_distinct_edge(edge_array) frequency_table = {} for item in distinct_edge: bit = get_bitcode(edge_array, item) # print('bit',bit) # bt=''.join(bit) s = bit.count("1") frequency_table[item] = [s, bit] # Store [Distinct edge, WT(Bitcode), Bitcode] in descending order sorted_frequency_table = [ [k, v[0], v[1]] for k, v in sorted(frequency_table.items(), key=lambda v: v[1][0], reverse=True) ] return sorted_frequency_table def get_nodes(frequency_table): """ Returns nodes format nodes={bitcode:edges that represent the bitcode} >>> get_nodes([['ab', 5, '11111'], ['ac', 5, '11111'], ['df', 5, '11111'], ... ['bd', 5, '11111'], ['bc', 5, '11111']]) {'11111': ['ab', 'ac', 'df', 'bd', 'bc']} """ nodes = {} for _, item in enumerate(frequency_table): nodes.setdefault(item[2], []).append(item[0]) return nodes def get_cluster(nodes): """ Returns cluster format cluster:{WT(bitcode):nodes with same WT} """ cluster = {} for key, value in nodes.items(): cluster.setdefault(key.count("1"), {})[key] = value return cluster def get_support(cluster): """ Returns support >>> get_support({5: {'11111': ['ab', 'ac', 'df', 'bd', 'bc']}, ... 4: {'11101': ['ef', 'eg', 'de', 'fg'], '11011': ['cd']}, ... 3: {'11001': ['ad'], '10101': ['dg']}, ... 2: {'10010': ['dh', 'bh'], '11000': ['be'], '10100': ['gh'], ... '10001': ['ce']}, ... 
1: {'00100': ['fh', 'eh'], '10000': ['hi']}}) [100.0, 80.0, 60.0, 40.0, 20.0] """ return [i * 100 / len(cluster) for i in cluster] def print_all() -> None: print("\nNodes\n") for key, value in nodes.items(): print(key, value) print("\nSupport\n") print(support) print("\n Cluster \n") for key, value in sorted(cluster.items(), reverse=True): print(key, value) print("\n Graph\n") for key, value in graph.items(): print(key, value) print("\n Edge List of Frequent subgraphs \n") for edge_list in freq_subgraph_edge_list: print(edge_list) def create_edge(nodes, graph, cluster, c1): """ create edge between the nodes """ for i in cluster[c1]: count = 0 c2 = c1 + 1 while c2 < max(cluster.keys()): for j in cluster[c2]: """ creates edge only if the condition satisfies """ if int(i, 2) & int(j, 2) == int(i, 2): if tuple(nodes[i]) in graph: graph[tuple(nodes[i])].append(nodes[j]) else: graph[tuple(nodes[i])] = [nodes[j]] count += 1 if count == 0: c2 = c2 + 1 else: break def construct_graph(cluster, nodes): x = cluster[max(cluster.keys())] cluster[max(cluster.keys()) + 1] = "Header" graph = {} for i in x: if (["Header"],) in graph: graph[(["Header"],)].append(x[i]) else: graph[(["Header"],)] = [x[i]] for i in x: graph[(x[i],)] = [["Header"]] i = 1 while i < max(cluster) - 1: create_edge(nodes, graph, cluster, i) i = i + 1 return graph def my_dfs(graph, start, end, path=None): """ find different DFS walk from given node to Header node """ path = (path or []) + [start] if start == end: paths.append(path) for node in graph[start]: if tuple(node) not in path: my_dfs(graph, tuple(node), end, path) def find_freq_subgraph_given_support(s, cluster, graph): """ find edges of multiple frequent subgraphs """ k = int(s / 100 * (len(cluster) - 1)) for i in cluster[k]: my_dfs(graph, tuple(cluster[k][i]), (["Header"],)) def freq_subgraphs_edge_list(paths): """ returns Edge list for frequent subgraphs """ freq_sub_el = [] for edges in paths: el = [] for j in range(len(edges) - 1): temp = list(edges[j]) for e in temp: edge = (e[0], e[1]) el.append(edge) freq_sub_el.append(el) return freq_sub_el def preprocess(edge_array): """ Preprocess the edge array >>> preprocess([['ab-e1', 'ac-e3', 'ad-e5', 'bc-e4', 'bd-e2', 'be-e6', 'bh-e12', ... 'cd-e2', 'ce-e4', 'de-e1', 'df-e8', 'dg-e5', 'dh-e10', 'ef-e3', ... 'eg-e2', 'fg-e6', 'gh-e6', 'hi-e3']]) """ for i in range(len(edge_array)): for j in range(len(edge_array[i])): t = edge_array[i][j].split("-") edge_array[i][j] = t if __name__ == "__main__": preprocess(edge_array) frequency_table = get_frequency_table(edge_array) nodes = get_nodes(frequency_table) cluster = get_cluster(nodes) support = get_support(cluster) graph = construct_graph(cluster, nodes) find_freq_subgraph_given_support(60, cluster, graph) paths: list = [] freq_subgraph_edge_list = freq_subgraphs_edge_list(paths) print_all()
-1
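The frequency table in the FP-GraphMiner file above is built from per-graph presence bitcodes. Below is a minimal, self-contained sketch of just that bitcode step on a made-up toy edge array; it mirrors the file's convention of keying on the first character of each edge string, and the names and data are illustrative only.

from __future__ import annotations


def bitcodes(edge_array: list[list[str]]) -> dict[str, str]:
    """Map each first-character edge label to its presence bitcode over the graphs."""
    labels = {item[0] for row in edge_array for item in row}
    table = {}
    for label in sorted(labels):
        bits = "".join(
            "1" if any(item[0] == label for item in row) else "0"
            for row in edge_array
        )
        table[label] = bits
    return table


if __name__ == "__main__":
    # toy data: three small graphs, edges written in the same "uv-eN" style
    tiny_edge_array = [["ab-e1", "bc-e2"], ["ab-e1"], ["bc-e2", "cd-e3"]]
    for label, bits in bitcodes(tiny_edge_array).items():
        print(label, bits, "weight:", bits.count("1"))
    # a 110 weight: 2
    # b 101 weight: 2
    # c 001 weight: 1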
TheAlgorithms/Python
8,551
New gitter link added or replaced
### Describe your change: Fixes #8546 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
KaixLina
"2023-03-26T04:26:40Z"
"2023-03-26T15:19:19Z"
3f9150c1b2dd15808a4962e03a1455f8d825512c
7cdb011ba440a07768179bfaea190bddefc890d8
New gitter link added or replaced. ### Describe your change: Fixes #8546 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
""" Problem 15: https://projecteuler.net/problem=15 Starting in the top left corner of a 2×2 grid, and only being able to move to the right and down, there are exactly 6 routes to the bottom right corner. How many such routes are there through a 20×20 grid? """ from math import factorial def solution(n: int = 20) -> int: """ Returns the number of paths possible in a n x n grid starting at top left corner going to bottom right corner and being able to move right and down only. >>> solution(25) 126410606437752 >>> solution(23) 8233430727600 >>> solution(20) 137846528820 >>> solution(15) 155117520 >>> solution(1) 2 """ n = 2 * n # middle entry of odd rows starting at row 3 is the solution for n = 1, # 2, 3,... k = n // 2 return int(factorial(n) / (factorial(k) * factorial(n - k))) if __name__ == "__main__": import sys if len(sys.argv) == 1: print(solution(20)) else: try: n = int(sys.argv[1]) print(solution(n)) except ValueError: print("Invalid entry - please enter a number.")
""" Problem 15: https://projecteuler.net/problem=15 Starting in the top left corner of a 2×2 grid, and only being able to move to the right and down, there are exactly 6 routes to the bottom right corner. How many such routes are there through a 20×20 grid? """ from math import factorial def solution(n: int = 20) -> int: """ Returns the number of paths possible in a n x n grid starting at top left corner going to bottom right corner and being able to move right and down only. >>> solution(25) 126410606437752 >>> solution(23) 8233430727600 >>> solution(20) 137846528820 >>> solution(15) 155117520 >>> solution(1) 2 """ n = 2 * n # middle entry of odd rows starting at row 3 is the solution for n = 1, # 2, 3,... k = n // 2 return int(factorial(n) / (factorial(k) * factorial(n - k))) if __name__ == "__main__": import sys if len(sys.argv) == 1: print(solution(20)) else: try: n = int(sys.argv[1]) print(solution(n)) except ValueError: print("Invalid entry - please enter a number.")
-1
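The lattice-path count computed in the Project Euler 15 solution above is the central binomial coefficient C(2n, n). A small sketch, assuming nothing beyond the standard library, computes it with math.comb and cross-checks against a dynamic-programming grid fill.

from math import comb


def lattice_paths(n: int) -> int:
    """Number of monotone lattice paths across an n x n grid: C(2n, n)."""
    return comb(2 * n, n)


def lattice_paths_dp(n: int) -> int:
    """Same count via dynamic programming over the (n+1) x (n+1) grid of corners."""
    grid = [[1] * (n + 1) for _ in range(n + 1)]
    for r in range(1, n + 1):
        for c in range(1, n + 1):
            grid[r][c] = grid[r - 1][c] + grid[r][c - 1]
    return grid[n][n]


if __name__ == "__main__":
    assert lattice_paths(2) == lattice_paths_dp(2) == 6
    assert lattice_paths(20) == lattice_paths_dp(20) == 137846528820
    print(lattice_paths(20))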
TheAlgorithms/Python
8,551
New gitter link added or replaced
### Describe your change: Fixes #8546 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
KaixLina
"2023-03-26T04:26:40Z"
"2023-03-26T15:19:19Z"
3f9150c1b2dd15808a4962e03a1455f8d825512c
7cdb011ba440a07768179bfaea190bddefc890d8
New gitter link added or replaced. ### Describe your change: Fixes #8546 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
from __future__ import annotations def find_primitive(n: int) -> int | None: for r in range(1, n): li = [] for x in range(n - 1): val = pow(r, x, n) if val in li: break li.append(val) else: return r return None if __name__ == "__main__": q = int(input("Enter a prime number q: ")) a = find_primitive(q) if a is None: print(f"Cannot find the primitive for the value: {a!r}") else: a_private = int(input("Enter private key of A: ")) a_public = pow(a, a_private, q) b_private = int(input("Enter private key of B: ")) b_public = pow(a, b_private, q) a_secret = pow(b_public, a_private, q) b_secret = pow(a_public, b_private, q) print("The key value generated by A is: ", a_secret) print("The key value generated by B is: ", b_secret)
from __future__ import annotations def find_primitive(n: int) -> int | None: for r in range(1, n): li = [] for x in range(n - 1): val = pow(r, x, n) if val in li: break li.append(val) else: return r return None if __name__ == "__main__": q = int(input("Enter a prime number q: ")) a = find_primitive(q) if a is None: print(f"Cannot find the primitive for the value: {a!r}") else: a_private = int(input("Enter private key of A: ")) a_public = pow(a, a_private, q) b_private = int(input("Enter private key of B: ")) b_public = pow(a, b_private, q) a_secret = pow(b_public, a_private, q) b_secret = pow(a_public, b_private, q) print("The key value generated by A is: ", a_secret) print("The key value generated by B is: ", b_secret)
-1
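The primitive-root file above drives an interactive Diffie-Hellman exchange via input(). A non-interactive sketch of the same exchange, using the textbook prime 23 with its primitive root 5 and made-up private keys, shows that both parties derive the same secret.

from __future__ import annotations


def diffie_hellman_shared(q: int, g: int, a_private: int, b_private: int) -> tuple[int, int]:
    """Return the secrets computed independently by A and B; they must agree."""
    a_public = pow(g, a_private, q)         # A publishes g^a mod q
    b_public = pow(g, b_private, q)         # B publishes g^b mod q
    a_secret = pow(b_public, a_private, q)  # A computes (g^b)^a mod q
    b_secret = pow(a_public, b_private, q)  # B computes (g^a)^b mod q
    return a_secret, b_secret


if __name__ == "__main__":
    q, g = 23, 5  # 5 is a primitive root modulo the prime 23
    a_secret, b_secret = diffie_hellman_shared(q, g, a_private=6, b_private=15)
    print(a_secret, b_secret)  # 2 2
    assert a_secret == b_secret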
TheAlgorithms/Python
8,551
New gitter link added or replaced
### Describe your change: Fixes #8546 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
KaixLina
"2023-03-26T04:26:40Z"
"2023-03-26T15:19:19Z"
3f9150c1b2dd15808a4962e03a1455f8d825512c
7cdb011ba440a07768179bfaea190bddefc890d8
New gitter link added or replaced. ### Describe your change: Fixes #8546 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
""" https://en.wikipedia.org/wiki/Autokey_cipher An autokey cipher (also known as the autoclave cipher) is a cipher that incorporates the message (the plaintext) into the key. The key is generated from the message in some automated fashion, sometimes by selecting certain letters from the text or, more commonly, by adding a short primer key to the front of the message. """ def encrypt(plaintext: str, key: str) -> str: """ Encrypt a given plaintext (string) and key (string), returning the encrypted ciphertext. >>> encrypt("hello world", "coffee") 'jsqqs avvwo' >>> encrypt("coffee is good as python", "TheAlgorithms") 'vvjfpk wj ohvp su ddylsv' >>> encrypt("coffee is good as python", 2) Traceback (most recent call last): ... TypeError: key must be a string >>> encrypt("", "TheAlgorithms") Traceback (most recent call last): ... ValueError: plaintext is empty """ if not isinstance(plaintext, str): raise TypeError("plaintext must be a string") if not isinstance(key, str): raise TypeError("key must be a string") if not plaintext: raise ValueError("plaintext is empty") if not key: raise ValueError("key is empty") key += plaintext plaintext = plaintext.lower() key = key.lower() plaintext_iterator = 0 key_iterator = 0 ciphertext = "" while plaintext_iterator < len(plaintext): if ( ord(plaintext[plaintext_iterator]) < 97 or ord(plaintext[plaintext_iterator]) > 122 ): ciphertext += plaintext[plaintext_iterator] plaintext_iterator += 1 elif ord(key[key_iterator]) < 97 or ord(key[key_iterator]) > 122: key_iterator += 1 else: ciphertext += chr( ( (ord(plaintext[plaintext_iterator]) - 97 + ord(key[key_iterator])) - 97 ) % 26 + 97 ) key_iterator += 1 plaintext_iterator += 1 return ciphertext def decrypt(ciphertext: str, key: str) -> str: """ Decrypt a given ciphertext (string) and key (string), returning the decrypted ciphertext. >>> decrypt("jsqqs avvwo", "coffee") 'hello world' >>> decrypt("vvjfpk wj ohvp su ddylsv", "TheAlgorithms") 'coffee is good as python' >>> decrypt("vvjfpk wj ohvp su ddylsv", "") Traceback (most recent call last): ... ValueError: key is empty >>> decrypt(527.26, "TheAlgorithms") Traceback (most recent call last): ... TypeError: ciphertext must be a string """ if not isinstance(ciphertext, str): raise TypeError("ciphertext must be a string") if not isinstance(key, str): raise TypeError("key must be a string") if not ciphertext: raise ValueError("ciphertext is empty") if not key: raise ValueError("key is empty") key = key.lower() ciphertext_iterator = 0 key_iterator = 0 plaintext = "" while ciphertext_iterator < len(ciphertext): if ( ord(ciphertext[ciphertext_iterator]) < 97 or ord(ciphertext[ciphertext_iterator]) > 122 ): plaintext += ciphertext[ciphertext_iterator] else: plaintext += chr( (ord(ciphertext[ciphertext_iterator]) - ord(key[key_iterator])) % 26 + 97 ) key += chr( (ord(ciphertext[ciphertext_iterator]) - ord(key[key_iterator])) % 26 + 97 ) key_iterator += 1 ciphertext_iterator += 1 return plaintext if __name__ == "__main__": import doctest doctest.testmod() operation = int(input("Type 1 to encrypt or 2 to decrypt:")) if operation == 1: plaintext = input("Typeplaintext to be encrypted:\n") key = input("Type the key:\n") print(encrypt(plaintext, key)) elif operation == 2: ciphertext = input("Type the ciphertext to be decrypted:\n") key = input("Type the key:\n") print(decrypt(ciphertext, key)) decrypt("jsqqs avvwo", "coffee")
""" https://en.wikipedia.org/wiki/Autokey_cipher An autokey cipher (also known as the autoclave cipher) is a cipher that incorporates the message (the plaintext) into the key. The key is generated from the message in some automated fashion, sometimes by selecting certain letters from the text or, more commonly, by adding a short primer key to the front of the message. """ def encrypt(plaintext: str, key: str) -> str: """ Encrypt a given plaintext (string) and key (string), returning the encrypted ciphertext. >>> encrypt("hello world", "coffee") 'jsqqs avvwo' >>> encrypt("coffee is good as python", "TheAlgorithms") 'vvjfpk wj ohvp su ddylsv' >>> encrypt("coffee is good as python", 2) Traceback (most recent call last): ... TypeError: key must be a string >>> encrypt("", "TheAlgorithms") Traceback (most recent call last): ... ValueError: plaintext is empty """ if not isinstance(plaintext, str): raise TypeError("plaintext must be a string") if not isinstance(key, str): raise TypeError("key must be a string") if not plaintext: raise ValueError("plaintext is empty") if not key: raise ValueError("key is empty") key += plaintext plaintext = plaintext.lower() key = key.lower() plaintext_iterator = 0 key_iterator = 0 ciphertext = "" while plaintext_iterator < len(plaintext): if ( ord(plaintext[plaintext_iterator]) < 97 or ord(plaintext[plaintext_iterator]) > 122 ): ciphertext += plaintext[plaintext_iterator] plaintext_iterator += 1 elif ord(key[key_iterator]) < 97 or ord(key[key_iterator]) > 122: key_iterator += 1 else: ciphertext += chr( ( (ord(plaintext[plaintext_iterator]) - 97 + ord(key[key_iterator])) - 97 ) % 26 + 97 ) key_iterator += 1 plaintext_iterator += 1 return ciphertext def decrypt(ciphertext: str, key: str) -> str: """ Decrypt a given ciphertext (string) and key (string), returning the decrypted ciphertext. >>> decrypt("jsqqs avvwo", "coffee") 'hello world' >>> decrypt("vvjfpk wj ohvp su ddylsv", "TheAlgorithms") 'coffee is good as python' >>> decrypt("vvjfpk wj ohvp su ddylsv", "") Traceback (most recent call last): ... ValueError: key is empty >>> decrypt(527.26, "TheAlgorithms") Traceback (most recent call last): ... TypeError: ciphertext must be a string """ if not isinstance(ciphertext, str): raise TypeError("ciphertext must be a string") if not isinstance(key, str): raise TypeError("key must be a string") if not ciphertext: raise ValueError("ciphertext is empty") if not key: raise ValueError("key is empty") key = key.lower() ciphertext_iterator = 0 key_iterator = 0 plaintext = "" while ciphertext_iterator < len(ciphertext): if ( ord(ciphertext[ciphertext_iterator]) < 97 or ord(ciphertext[ciphertext_iterator]) > 122 ): plaintext += ciphertext[ciphertext_iterator] else: plaintext += chr( (ord(ciphertext[ciphertext_iterator]) - ord(key[key_iterator])) % 26 + 97 ) key += chr( (ord(ciphertext[ciphertext_iterator]) - ord(key[key_iterator])) % 26 + 97 ) key_iterator += 1 ciphertext_iterator += 1 return plaintext if __name__ == "__main__": import doctest doctest.testmod() operation = int(input("Type 1 to encrypt or 2 to decrypt:")) if operation == 1: plaintext = input("Typeplaintext to be encrypted:\n") key = input("Type the key:\n") print(encrypt(plaintext, key)) elif operation == 2: ciphertext = input("Type the ciphertext to be decrypted:\n") key = input("Type the key:\n") print(decrypt(ciphertext, key)) decrypt("jsqqs avvwo", "coffee")
-1
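A compact, standalone sketch of the autokey idea for lowercase letters only (it skips the non-letter handling of the full implementation above); the primer and message are arbitrary, and the round trip is asserted rather than an exact ciphertext.

def autokey_encrypt(plaintext: str, primer: str) -> str:
    key = (primer + plaintext).lower()
    return "".join(
        chr((ord(p) - 97 + ord(k) - 97) % 26 + 97)
        for p, k in zip(plaintext.lower(), key)
    )


def autokey_decrypt(ciphertext: str, primer: str) -> str:
    key = list(primer.lower())
    plain = []
    for i, c in enumerate(ciphertext.lower()):
        p = chr((ord(c) - ord(key[i])) % 26 + 97)
        plain.append(p)
        key.append(p)  # the recovered plaintext extends the key (the "auto" part)
    return "".join(plain)


if __name__ == "__main__":
    secret = autokey_encrypt("attackatdawn", "queen")
    print(secret)
    assert autokey_decrypt(secret, "queen") == "attackatdawn"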
TheAlgorithms/Python
8,551
New gitter link added or replaced
### Describe your change: Fixes #8546 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
KaixLina
"2023-03-26T04:26:40Z"
"2023-03-26T15:19:19Z"
3f9150c1b2dd15808a4962e03a1455f8d825512c
7cdb011ba440a07768179bfaea190bddefc890d8
New gitter link added or replaced. ### Describe your change: Fixes #8546 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
-1
TheAlgorithms/Python
8,551
New gitter link added or replaced
### Describe your change: Fixes #8546 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
KaixLina
"2023-03-26T04:26:40Z"
"2023-03-26T15:19:19Z"
3f9150c1b2dd15808a4962e03a1455f8d825512c
7cdb011ba440a07768179bfaea190bddefc890d8
New gitter link added or replaced. ### Describe your change: Fixes #8546 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
""" Problem 33: https://projecteuler.net/problem=33 The fraction 49/98 is a curious fraction, as an inexperienced mathematician in attempting to simplify it may incorrectly believe that 49/98 = 4/8, which is correct, is obtained by cancelling the 9s. We shall consider fractions like, 30/50 = 3/5, to be trivial examples. There are exactly four non-trivial examples of this type of fraction, less than one in value, and containing two digits in the numerator and denominator. If the product of these four fractions is given in its lowest common terms, find the value of the denominator. """ from __future__ import annotations from fractions import Fraction def is_digit_cancelling(num: int, den: int) -> bool: return ( num != den and num % 10 == den // 10 and (num // 10) / (den % 10) == num / den ) def fraction_list(digit_len: int) -> list[str]: """ >>> fraction_list(2) ['16/64', '19/95', '26/65', '49/98'] >>> fraction_list(3) ['16/64', '19/95', '26/65', '49/98'] >>> fraction_list(4) ['16/64', '19/95', '26/65', '49/98'] >>> fraction_list(0) [] >>> fraction_list(5) ['16/64', '19/95', '26/65', '49/98'] """ solutions = [] den = 11 last_digit = int("1" + "0" * digit_len) for num in range(den, last_digit): while den <= 99: if (num != den) and (num % 10 == den // 10) and (den % 10 != 0): if is_digit_cancelling(num, den): solutions.append(f"{num}/{den}") den += 1 num += 1 den = 10 return solutions def solution(n: int = 2) -> int: """ Return the solution to the problem """ result = 1.0 for fraction in fraction_list(n): frac = Fraction(fraction) result *= frac.denominator / frac.numerator return int(result) if __name__ == "__main__": print(solution())
""" Problem 33: https://projecteuler.net/problem=33 The fraction 49/98 is a curious fraction, as an inexperienced mathematician in attempting to simplify it may incorrectly believe that 49/98 = 4/8, which is correct, is obtained by cancelling the 9s. We shall consider fractions like, 30/50 = 3/5, to be trivial examples. There are exactly four non-trivial examples of this type of fraction, less than one in value, and containing two digits in the numerator and denominator. If the product of these four fractions is given in its lowest common terms, find the value of the denominator. """ from __future__ import annotations from fractions import Fraction def is_digit_cancelling(num: int, den: int) -> bool: return ( num != den and num % 10 == den // 10 and (num // 10) / (den % 10) == num / den ) def fraction_list(digit_len: int) -> list[str]: """ >>> fraction_list(2) ['16/64', '19/95', '26/65', '49/98'] >>> fraction_list(3) ['16/64', '19/95', '26/65', '49/98'] >>> fraction_list(4) ['16/64', '19/95', '26/65', '49/98'] >>> fraction_list(0) [] >>> fraction_list(5) ['16/64', '19/95', '26/65', '49/98'] """ solutions = [] den = 11 last_digit = int("1" + "0" * digit_len) for num in range(den, last_digit): while den <= 99: if (num != den) and (num % 10 == den // 10) and (den % 10 != 0): if is_digit_cancelling(num, den): solutions.append(f"{num}/{den}") den += 1 num += 1 den = 10 return solutions def solution(n: int = 2) -> int: """ Return the solution to the problem """ result = 1.0 for fraction in fraction_list(n): frac = Fraction(fraction) result *= frac.denominator / frac.numerator return int(result) if __name__ == "__main__": print(solution())
-1
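An independent brute-force check of the digit-cancelling result, using fractions.Fraction directly rather than the file's own search loop; it recovers the four known fractions and the final denominator.

from fractions import Fraction


def curious_fractions() -> list[tuple[int, int]]:
    """Two-digit fractions < 1 where cancelling the shared digit happens to be valid."""
    found = []
    for num in range(10, 100):
        for den in range(num + 1, 100):
            # shared digit: numerator's units digit equals denominator's tens digit
            if num % 10 == den // 10 and den % 10 != 0:
                if Fraction(num, den) == Fraction(num // 10, den % 10):
                    found.append((num, den))
    return found


if __name__ == "__main__":
    pairs = curious_fractions()
    print(pairs)  # [(16, 64), (19, 95), (26, 65), (49, 98)]
    product = Fraction(1, 1)
    for num, den in pairs:
        product *= Fraction(num, den)
    print(product.denominator)  # 100, the answer in lowest terms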
TheAlgorithms/Python
8,551
New gitter link added or replaced
### Describe your change: Fixes #8546 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
KaixLina
"2023-03-26T04:26:40Z"
"2023-03-26T15:19:19Z"
3f9150c1b2dd15808a4962e03a1455f8d825512c
7cdb011ba440a07768179bfaea190bddefc890d8
New gitter link added or replaced. ### Describe your change: Fixes #8546 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
""" Description The Koch snowflake is a fractal curve and one of the earliest fractals to have been described. The Koch snowflake can be built up iteratively, in a sequence of stages. The first stage is an equilateral triangle, and each successive stage is formed by adding outward bends to each side of the previous stage, making smaller equilateral triangles. This can be achieved through the following steps for each line: 1. divide the line segment into three segments of equal length. 2. draw an equilateral triangle that has the middle segment from step 1 as its base and points outward. 3. remove the line segment that is the base of the triangle from step 2. (description adapted from https://en.wikipedia.org/wiki/Koch_snowflake ) (for a more detailed explanation and an implementation in the Processing language, see https://natureofcode.com/book/chapter-8-fractals/ #84-the-koch-curve-and-the-arraylist-technique ) Requirements (pip): - matplotlib - numpy """ from __future__ import annotations import matplotlib.pyplot as plt # type: ignore import numpy # initial triangle of Koch snowflake VECTOR_1 = numpy.array([0, 0]) VECTOR_2 = numpy.array([0.5, 0.8660254]) VECTOR_3 = numpy.array([1, 0]) INITIAL_VECTORS = [VECTOR_1, VECTOR_2, VECTOR_3, VECTOR_1] # uncomment for simple Koch curve instead of Koch snowflake # INITIAL_VECTORS = [VECTOR_1, VECTOR_3] def iterate(initial_vectors: list[numpy.ndarray], steps: int) -> list[numpy.ndarray]: """ Go through the number of iterations determined by the argument "steps". Be careful with high values (above 5) since the time to calculate increases exponentially. >>> iterate([numpy.array([0, 0]), numpy.array([1, 0])], 1) [array([0, 0]), array([0.33333333, 0. ]), array([0.5 , \ 0.28867513]), array([0.66666667, 0. ]), array([1, 0])] """ vectors = initial_vectors for _ in range(steps): vectors = iteration_step(vectors) return vectors def iteration_step(vectors: list[numpy.ndarray]) -> list[numpy.ndarray]: """ Loops through each pair of adjacent vectors. Each line between two adjacent vectors is divided into 4 segments by adding 3 additional vectors in-between the original two vectors. The vector in the middle is constructed through a 60 degree rotation so it is bent outwards. >>> iteration_step([numpy.array([0, 0]), numpy.array([1, 0])]) [array([0, 0]), array([0.33333333, 0. ]), array([0.5 , \ 0.28867513]), array([0.66666667, 0. 
]), array([1, 0])] """ new_vectors = [] for i, start_vector in enumerate(vectors[:-1]): end_vector = vectors[i + 1] new_vectors.append(start_vector) difference_vector = end_vector - start_vector new_vectors.append(start_vector + difference_vector / 3) new_vectors.append( start_vector + difference_vector / 3 + rotate(difference_vector / 3, 60) ) new_vectors.append(start_vector + difference_vector * 2 / 3) new_vectors.append(vectors[-1]) return new_vectors def rotate(vector: numpy.ndarray, angle_in_degrees: float) -> numpy.ndarray: """ Standard rotation of a 2D vector with a rotation matrix (see https://en.wikipedia.org/wiki/Rotation_matrix ) >>> rotate(numpy.array([1, 0]), 60) array([0.5 , 0.8660254]) >>> rotate(numpy.array([1, 0]), 90) array([6.123234e-17, 1.000000e+00]) """ theta = numpy.radians(angle_in_degrees) c, s = numpy.cos(theta), numpy.sin(theta) rotation_matrix = numpy.array(((c, -s), (s, c))) return numpy.dot(rotation_matrix, vector) def plot(vectors: list[numpy.ndarray]) -> None: """ Utility function to plot the vectors using matplotlib.pyplot No doctest was implemented since this function does not have a return value """ # avoid stretched display of graph axes = plt.gca() axes.set_aspect("equal") # matplotlib.pyplot.plot takes a list of all x-coordinates and a list of all # y-coordinates as inputs, which are constructed from the vector-list using # zip() x_coordinates, y_coordinates = zip(*vectors) plt.plot(x_coordinates, y_coordinates) plt.show() if __name__ == "__main__": import doctest doctest.testmod() processed_vectors = iterate(INITIAL_VECTORS, 5) plot(processed_vectors)
""" Description The Koch snowflake is a fractal curve and one of the earliest fractals to have been described. The Koch snowflake can be built up iteratively, in a sequence of stages. The first stage is an equilateral triangle, and each successive stage is formed by adding outward bends to each side of the previous stage, making smaller equilateral triangles. This can be achieved through the following steps for each line: 1. divide the line segment into three segments of equal length. 2. draw an equilateral triangle that has the middle segment from step 1 as its base and points outward. 3. remove the line segment that is the base of the triangle from step 2. (description adapted from https://en.wikipedia.org/wiki/Koch_snowflake ) (for a more detailed explanation and an implementation in the Processing language, see https://natureofcode.com/book/chapter-8-fractals/ #84-the-koch-curve-and-the-arraylist-technique ) Requirements (pip): - matplotlib - numpy """ from __future__ import annotations import matplotlib.pyplot as plt # type: ignore import numpy # initial triangle of Koch snowflake VECTOR_1 = numpy.array([0, 0]) VECTOR_2 = numpy.array([0.5, 0.8660254]) VECTOR_3 = numpy.array([1, 0]) INITIAL_VECTORS = [VECTOR_1, VECTOR_2, VECTOR_3, VECTOR_1] # uncomment for simple Koch curve instead of Koch snowflake # INITIAL_VECTORS = [VECTOR_1, VECTOR_3] def iterate(initial_vectors: list[numpy.ndarray], steps: int) -> list[numpy.ndarray]: """ Go through the number of iterations determined by the argument "steps". Be careful with high values (above 5) since the time to calculate increases exponentially. >>> iterate([numpy.array([0, 0]), numpy.array([1, 0])], 1) [array([0, 0]), array([0.33333333, 0. ]), array([0.5 , \ 0.28867513]), array([0.66666667, 0. ]), array([1, 0])] """ vectors = initial_vectors for _ in range(steps): vectors = iteration_step(vectors) return vectors def iteration_step(vectors: list[numpy.ndarray]) -> list[numpy.ndarray]: """ Loops through each pair of adjacent vectors. Each line between two adjacent vectors is divided into 4 segments by adding 3 additional vectors in-between the original two vectors. The vector in the middle is constructed through a 60 degree rotation so it is bent outwards. >>> iteration_step([numpy.array([0, 0]), numpy.array([1, 0])]) [array([0, 0]), array([0.33333333, 0. ]), array([0.5 , \ 0.28867513]), array([0.66666667, 0. 
]), array([1, 0])] """ new_vectors = [] for i, start_vector in enumerate(vectors[:-1]): end_vector = vectors[i + 1] new_vectors.append(start_vector) difference_vector = end_vector - start_vector new_vectors.append(start_vector + difference_vector / 3) new_vectors.append( start_vector + difference_vector / 3 + rotate(difference_vector / 3, 60) ) new_vectors.append(start_vector + difference_vector * 2 / 3) new_vectors.append(vectors[-1]) return new_vectors def rotate(vector: numpy.ndarray, angle_in_degrees: float) -> numpy.ndarray: """ Standard rotation of a 2D vector with a rotation matrix (see https://en.wikipedia.org/wiki/Rotation_matrix ) >>> rotate(numpy.array([1, 0]), 60) array([0.5 , 0.8660254]) >>> rotate(numpy.array([1, 0]), 90) array([6.123234e-17, 1.000000e+00]) """ theta = numpy.radians(angle_in_degrees) c, s = numpy.cos(theta), numpy.sin(theta) rotation_matrix = numpy.array(((c, -s), (s, c))) return numpy.dot(rotation_matrix, vector) def plot(vectors: list[numpy.ndarray]) -> None: """ Utility function to plot the vectors using matplotlib.pyplot No doctest was implemented since this function does not have a return value """ # avoid stretched display of graph axes = plt.gca() axes.set_aspect("equal") # matplotlib.pyplot.plot takes a list of all x-coordinates and a list of all # y-coordinates as inputs, which are constructed from the vector-list using # zip() x_coordinates, y_coordinates = zip(*vectors) plt.plot(x_coordinates, y_coordinates) plt.show() if __name__ == "__main__": import doctest doctest.testmod() processed_vectors = iterate(INITIAL_VECTORS, 5) plot(processed_vectors)
-1
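Each Koch iteration replaces every segment with four, so an open curve that starts as one segment has 4**k segments and 4**k + 1 points after k steps. A matplotlib/numpy-free sketch of the same iteration step with plain tuples verifies that count; it mirrors iterate()/iteration_step() above but is independent of them.

from __future__ import annotations

from math import cos, radians, sin


def rotate(vector: tuple[float, float], degrees: float) -> tuple[float, float]:
    theta = radians(degrees)
    x, y = vector
    return (x * cos(theta) - y * sin(theta), x * sin(theta) + y * cos(theta))


def koch_step(points: list[tuple[float, float]]) -> list[tuple[float, float]]:
    out = []
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        dx, dy = (x1 - x0) / 3, (y1 - y0) / 3
        bx, by = rotate((dx, dy), 60)  # outward bend of the middle third
        out.extend(
            [
                (x0, y0),
                (x0 + dx, y0 + dy),
                (x0 + dx + bx, y0 + dy + by),
                (x0 + 2 * dx, y0 + 2 * dy),
            ]
        )
    out.append(points[-1])
    return out


if __name__ == "__main__":
    curve = [(0.0, 0.0), (1.0, 0.0)]  # a single segment
    for k in range(1, 5):
        curve = koch_step(curve)
        assert len(curve) == 4**k + 1  # segments quadruple each iteration
    print(len(curve))  # 257 points after 4 iterations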
TheAlgorithms/Python
8,551
New gitter link added or replaced
### Describe your change: Fixes #8546 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
KaixLina
"2023-03-26T04:26:40Z"
"2023-03-26T15:19:19Z"
3f9150c1b2dd15808a4962e03a1455f8d825512c
7cdb011ba440a07768179bfaea190bddefc890d8
New gitter link added or replaced. ### Describe your change: Fixes #8546 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
# Conversion

Conversion programs convert a type of data, a number from a numerical base, or a unit into another type, base, or unit, e.g. binary to decimal, integer to string, or feet to meters.

* <https://en.wikipedia.org/wiki/Data_conversion>
* <https://en.wikipedia.org/wiki/Transcoding>
# Conversion

Conversion programs convert a type of data, a number from a numerical base, or a unit into another type, base, or unit, e.g. binary to decimal, integer to string, or feet to meters.

* <https://en.wikipedia.org/wiki/Data_conversion>
* <https://en.wikipedia.org/wiki/Transcoding>
-1
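As a tiny illustration of the kinds of conversions the directory README above describes, a few built-in one-liners; these are purely illustrative and not tied to any particular file in the directory.

print(int("1011", 2))      # binary string -> decimal int: 11
print(bin(11))             # decimal int -> binary string: '0b1011'
print(str(42), int("42"))  # integer <-> string
print(3 * 0.3048)          # 3 feet in metres (1 ft = 0.3048 m)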
TheAlgorithms/Python
8,551
New gitter link added or replaced
### Describe your change: Fixes #8546 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
KaixLina
"2023-03-26T04:26:40Z"
"2023-03-26T15:19:19Z"
3f9150c1b2dd15808a4962e03a1455f8d825512c
7cdb011ba440a07768179bfaea190bddefc890d8
New gitter link added or replaced. ### Describe your change: Fixes #8546 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
""" Convert International System of Units (SI) and Binary prefixes """ from __future__ import annotations from enum import Enum class SIUnit(Enum): yotta = 24 zetta = 21 exa = 18 peta = 15 tera = 12 giga = 9 mega = 6 kilo = 3 hecto = 2 deca = 1 deci = -1 centi = -2 milli = -3 micro = -6 nano = -9 pico = -12 femto = -15 atto = -18 zepto = -21 yocto = -24 class BinaryUnit(Enum): yotta = 8 zetta = 7 exa = 6 peta = 5 tera = 4 giga = 3 mega = 2 kilo = 1 def convert_si_prefix( known_amount: float, known_prefix: str | SIUnit, unknown_prefix: str | SIUnit, ) -> float: """ Wikipedia reference: https://en.wikipedia.org/wiki/Binary_prefix Wikipedia reference: https://en.wikipedia.org/wiki/International_System_of_Units >>> convert_si_prefix(1, SIUnit.giga, SIUnit.mega) 1000 >>> convert_si_prefix(1, SIUnit.mega, SIUnit.giga) 0.001 >>> convert_si_prefix(1, SIUnit.kilo, SIUnit.kilo) 1 >>> convert_si_prefix(1, 'giga', 'mega') 1000 >>> convert_si_prefix(1, 'gIGa', 'mEGa') 1000 """ if isinstance(known_prefix, str): known_prefix = SIUnit[known_prefix.lower()] if isinstance(unknown_prefix, str): unknown_prefix = SIUnit[unknown_prefix.lower()] unknown_amount: float = known_amount * ( 10 ** (known_prefix.value - unknown_prefix.value) ) return unknown_amount def convert_binary_prefix( known_amount: float, known_prefix: str | BinaryUnit, unknown_prefix: str | BinaryUnit, ) -> float: """ Wikipedia reference: https://en.wikipedia.org/wiki/Metric_prefix >>> convert_binary_prefix(1, BinaryUnit.giga, BinaryUnit.mega) 1024 >>> convert_binary_prefix(1, BinaryUnit.mega, BinaryUnit.giga) 0.0009765625 >>> convert_binary_prefix(1, BinaryUnit.kilo, BinaryUnit.kilo) 1 >>> convert_binary_prefix(1, 'giga', 'mega') 1024 >>> convert_binary_prefix(1, 'gIGa', 'mEGa') 1024 """ if isinstance(known_prefix, str): known_prefix = BinaryUnit[known_prefix.lower()] if isinstance(unknown_prefix, str): unknown_prefix = BinaryUnit[unknown_prefix.lower()] unknown_amount: float = known_amount * ( 2 ** ((known_prefix.value - unknown_prefix.value) * 10) ) return unknown_amount if __name__ == "__main__": import doctest doctest.testmod()
""" Convert International System of Units (SI) and Binary prefixes """ from __future__ import annotations from enum import Enum class SIUnit(Enum): yotta = 24 zetta = 21 exa = 18 peta = 15 tera = 12 giga = 9 mega = 6 kilo = 3 hecto = 2 deca = 1 deci = -1 centi = -2 milli = -3 micro = -6 nano = -9 pico = -12 femto = -15 atto = -18 zepto = -21 yocto = -24 class BinaryUnit(Enum): yotta = 8 zetta = 7 exa = 6 peta = 5 tera = 4 giga = 3 mega = 2 kilo = 1 def convert_si_prefix( known_amount: float, known_prefix: str | SIUnit, unknown_prefix: str | SIUnit, ) -> float: """ Wikipedia reference: https://en.wikipedia.org/wiki/Binary_prefix Wikipedia reference: https://en.wikipedia.org/wiki/International_System_of_Units >>> convert_si_prefix(1, SIUnit.giga, SIUnit.mega) 1000 >>> convert_si_prefix(1, SIUnit.mega, SIUnit.giga) 0.001 >>> convert_si_prefix(1, SIUnit.kilo, SIUnit.kilo) 1 >>> convert_si_prefix(1, 'giga', 'mega') 1000 >>> convert_si_prefix(1, 'gIGa', 'mEGa') 1000 """ if isinstance(known_prefix, str): known_prefix = SIUnit[known_prefix.lower()] if isinstance(unknown_prefix, str): unknown_prefix = SIUnit[unknown_prefix.lower()] unknown_amount: float = known_amount * ( 10 ** (known_prefix.value - unknown_prefix.value) ) return unknown_amount def convert_binary_prefix( known_amount: float, known_prefix: str | BinaryUnit, unknown_prefix: str | BinaryUnit, ) -> float: """ Wikipedia reference: https://en.wikipedia.org/wiki/Metric_prefix >>> convert_binary_prefix(1, BinaryUnit.giga, BinaryUnit.mega) 1024 >>> convert_binary_prefix(1, BinaryUnit.mega, BinaryUnit.giga) 0.0009765625 >>> convert_binary_prefix(1, BinaryUnit.kilo, BinaryUnit.kilo) 1 >>> convert_binary_prefix(1, 'giga', 'mega') 1024 >>> convert_binary_prefix(1, 'gIGa', 'mEGa') 1024 """ if isinstance(known_prefix, str): known_prefix = BinaryUnit[known_prefix.lower()] if isinstance(unknown_prefix, str): unknown_prefix = BinaryUnit[unknown_prefix.lower()] unknown_amount: float = known_amount * ( 2 ** ((known_prefix.value - unknown_prefix.value) * 10) ) return unknown_amount if __name__ == "__main__": import doctest doctest.testmod()
-1
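The prefix conversions above reduce to scaling by 10**(known - unknown) for SI prefixes and 2**(10 * (known - unknown)) for binary prefixes. A standalone sketch of that exponent arithmetic over a hand-written subset of prefixes:

SI_EXPONENT = {"kilo": 3, "mega": 6, "giga": 9}
BINARY_EXPONENT = {"kilo": 1, "mega": 2, "giga": 3}


def si_convert(amount: float, known: str, unknown: str) -> float:
    return amount * 10 ** (SI_EXPONENT[known] - SI_EXPONENT[unknown])


def binary_convert(amount: float, known: str, unknown: str) -> float:
    return amount * 2 ** (10 * (BINARY_EXPONENT[known] - BINARY_EXPONENT[unknown]))


if __name__ == "__main__":
    print(si_convert(1, "giga", "mega"))      # 1000 (1 giga = 1000 mega in SI)
    print(binary_convert(1, "giga", "mega"))  # 1024 (1 GiB = 1024 MiB)
    print(binary_convert(1, "mega", "giga"))  # 0.0009765625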
TheAlgorithms/Python
8,551
New gitter link added or replaced
### Describe your change: Fixes #8546 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
KaixLina
"2023-03-26T04:26:40Z"
"2023-03-26T15:19:19Z"
3f9150c1b2dd15808a4962e03a1455f8d825512c
7cdb011ba440a07768179bfaea190bddefc890d8
New gitter link added or replaced. ### Describe your change: Fixes #8546 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
-1
TheAlgorithms/Python
8,551
New gitter link added or replaced
### Describe your change: Fixes #8546 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
KaixLina
"2023-03-26T04:26:40Z"
"2023-03-26T15:19:19Z"
3f9150c1b2dd15808a4962e03a1455f8d825512c
7cdb011ba440a07768179bfaea190bddefc890d8
New gitter link added or replaced. ### Describe your change: Fixes #8546 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
from collections import deque from .hash_table import HashTable class HashTableWithLinkedList(HashTable): def __init__(self, *args, **kwargs): super().__init__(*args, **kwargs) def _set_value(self, key, data): self.values[key] = deque([]) if self.values[key] is None else self.values[key] self.values[key].appendleft(data) self._keys[key] = self.values[key] def balanced_factor(self): return ( sum(self.charge_factor - len(slot) for slot in self.values) / self.size_table * self.charge_factor ) def _collision_resolution(self, key, data=None): if not ( len(self.values[key]) == self.charge_factor and self.values.count(None) == 0 ): return key return super()._collision_resolution(key, data)
from collections import deque from .hash_table import HashTable class HashTableWithLinkedList(HashTable): def __init__(self, *args, **kwargs): super().__init__(*args, **kwargs) def _set_value(self, key, data): self.values[key] = deque([]) if self.values[key] is None else self.values[key] self.values[key].appendleft(data) self._keys[key] = self.values[key] def balanced_factor(self): return ( sum(self.charge_factor - len(slot) for slot in self.values) / self.size_table * self.charge_factor ) def _collision_resolution(self, key, data=None): if not ( len(self.values[key]) == self.charge_factor and self.values.count(None) == 0 ): return key return super()._collision_resolution(key, data)
-1
TheAlgorithms/Python
8,551
New gitter link added or replaced
### Describe your change: Fixes #8546 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
KaixLina
"2023-03-26T04:26:40Z"
"2023-03-26T15:19:19Z"
3f9150c1b2dd15808a4962e03a1455f8d825512c
7cdb011ba440a07768179bfaea190bddefc890d8
New gitter link added or replaced. ### Describe your change: Fixes #8546 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
"""Convert a positive Decimal Number to Any Other Representation""" from string import ascii_uppercase ALPHABET_VALUES = {str(ord(c) - 55): c for c in ascii_uppercase} def decimal_to_any(num: int, base: int) -> str: """ Convert a positive integer to another base as str. >>> decimal_to_any(0, 2) '0' >>> decimal_to_any(5, 4) '11' >>> decimal_to_any(20, 3) '202' >>> decimal_to_any(58, 16) '3A' >>> decimal_to_any(243, 17) 'E5' >>> decimal_to_any(34923, 36) 'QY3' >>> decimal_to_any(10, 11) 'A' >>> decimal_to_any(16, 16) '10' >>> decimal_to_any(36, 36) '10' >>> # negatives will error >>> decimal_to_any(-45, 8) # doctest: +ELLIPSIS Traceback (most recent call last): ... ValueError: parameter must be positive int >>> # floats will error >>> decimal_to_any(34.4, 6) # doctest: +ELLIPSIS Traceback (most recent call last): ... TypeError: int() can't convert non-string with explicit base >>> # a float base will error >>> decimal_to_any(5, 2.5) # doctest: +ELLIPSIS Traceback (most recent call last): ... TypeError: 'float' object cannot be interpreted as an integer >>> # a str base will error >>> decimal_to_any(10, '16') # doctest: +ELLIPSIS Traceback (most recent call last): ... TypeError: 'str' object cannot be interpreted as an integer >>> # a base less than 2 will error >>> decimal_to_any(7, 0) # doctest: +ELLIPSIS Traceback (most recent call last): ... ValueError: base must be >= 2 >>> # a base greater than 36 will error >>> decimal_to_any(34, 37) # doctest: +ELLIPSIS Traceback (most recent call last): ... ValueError: base must be <= 36 """ if isinstance(num, float): raise TypeError("int() can't convert non-string with explicit base") if num < 0: raise ValueError("parameter must be positive int") if isinstance(base, str): raise TypeError("'str' object cannot be interpreted as an integer") if isinstance(base, float): raise TypeError("'float' object cannot be interpreted as an integer") if base in (0, 1): raise ValueError("base must be >= 2") if base > 36: raise ValueError("base must be <= 36") new_value = "" mod = 0 div = 0 while div != 1: div, mod = divmod(num, base) if base >= 11 and 9 < mod < 36: actual_value = ALPHABET_VALUES[str(mod)] else: actual_value = str(mod) new_value += actual_value div = num // base num = div if div == 0: return str(new_value[::-1]) elif div == 1: new_value += str(div) return str(new_value[::-1]) return new_value[::-1] if __name__ == "__main__": import doctest doctest.testmod() for base in range(2, 37): for num in range(1000): assert int(decimal_to_any(num, base), base) == num, ( num, base, decimal_to_any(num, base), int(decimal_to_any(num, base), base), )
"""Convert a positive Decimal Number to Any Other Representation""" from string import ascii_uppercase ALPHABET_VALUES = {str(ord(c) - 55): c for c in ascii_uppercase} def decimal_to_any(num: int, base: int) -> str: """ Convert a positive integer to another base as str. >>> decimal_to_any(0, 2) '0' >>> decimal_to_any(5, 4) '11' >>> decimal_to_any(20, 3) '202' >>> decimal_to_any(58, 16) '3A' >>> decimal_to_any(243, 17) 'E5' >>> decimal_to_any(34923, 36) 'QY3' >>> decimal_to_any(10, 11) 'A' >>> decimal_to_any(16, 16) '10' >>> decimal_to_any(36, 36) '10' >>> # negatives will error >>> decimal_to_any(-45, 8) # doctest: +ELLIPSIS Traceback (most recent call last): ... ValueError: parameter must be positive int >>> # floats will error >>> decimal_to_any(34.4, 6) # doctest: +ELLIPSIS Traceback (most recent call last): ... TypeError: int() can't convert non-string with explicit base >>> # a float base will error >>> decimal_to_any(5, 2.5) # doctest: +ELLIPSIS Traceback (most recent call last): ... TypeError: 'float' object cannot be interpreted as an integer >>> # a str base will error >>> decimal_to_any(10, '16') # doctest: +ELLIPSIS Traceback (most recent call last): ... TypeError: 'str' object cannot be interpreted as an integer >>> # a base less than 2 will error >>> decimal_to_any(7, 0) # doctest: +ELLIPSIS Traceback (most recent call last): ... ValueError: base must be >= 2 >>> # a base greater than 36 will error >>> decimal_to_any(34, 37) # doctest: +ELLIPSIS Traceback (most recent call last): ... ValueError: base must be <= 36 """ if isinstance(num, float): raise TypeError("int() can't convert non-string with explicit base") if num < 0: raise ValueError("parameter must be positive int") if isinstance(base, str): raise TypeError("'str' object cannot be interpreted as an integer") if isinstance(base, float): raise TypeError("'float' object cannot be interpreted as an integer") if base in (0, 1): raise ValueError("base must be >= 2") if base > 36: raise ValueError("base must be <= 36") new_value = "" mod = 0 div = 0 while div != 1: div, mod = divmod(num, base) if base >= 11 and 9 < mod < 36: actual_value = ALPHABET_VALUES[str(mod)] else: actual_value = str(mod) new_value += actual_value div = num // base num = div if div == 0: return str(new_value[::-1]) elif div == 1: new_value += str(div) return str(new_value[::-1]) return new_value[::-1] if __name__ == "__main__": import doctest doctest.testmod() for base in range(2, 37): for num in range(1000): assert int(decimal_to_any(num, base), base) == num, ( num, base, decimal_to_any(num, base), int(decimal_to_any(num, base), base), )
-1
TheAlgorithms/Python
8,551
New gitter link added or replaced
### Describe your change: Fixes #8546 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
KaixLina
"2023-03-26T04:26:40Z"
"2023-03-26T15:19:19Z"
3f9150c1b2dd15808a4962e03a1455f8d825512c
7cdb011ba440a07768179bfaea190bddefc890d8
New gitter link added or replaced. ### Describe your change: Fixes #8546 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
def bin_to_decimal(bin_string: str) -> int: """ Convert a binary value to its decimal equivalent >>> bin_to_decimal("101") 5 >>> bin_to_decimal(" 1010 ") 10 >>> bin_to_decimal("-11101") -29 >>> bin_to_decimal("0") 0 >>> bin_to_decimal("a") Traceback (most recent call last): ... ValueError: Non-binary value was passed to the function >>> bin_to_decimal("") Traceback (most recent call last): ... ValueError: Empty string was passed to the function >>> bin_to_decimal("39") Traceback (most recent call last): ... ValueError: Non-binary value was passed to the function """ bin_string = str(bin_string).strip() if not bin_string: raise ValueError("Empty string was passed to the function") is_negative = bin_string[0] == "-" if is_negative: bin_string = bin_string[1:] if not all(char in "01" for char in bin_string): raise ValueError("Non-binary value was passed to the function") decimal_number = 0 for char in bin_string: decimal_number = 2 * decimal_number + int(char) return -decimal_number if is_negative else decimal_number if __name__ == "__main__": from doctest import testmod testmod()
def bin_to_decimal(bin_string: str) -> int: """ Convert a binary value to its decimal equivalent >>> bin_to_decimal("101") 5 >>> bin_to_decimal(" 1010 ") 10 >>> bin_to_decimal("-11101") -29 >>> bin_to_decimal("0") 0 >>> bin_to_decimal("a") Traceback (most recent call last): ... ValueError: Non-binary value was passed to the function >>> bin_to_decimal("") Traceback (most recent call last): ... ValueError: Empty string was passed to the function >>> bin_to_decimal("39") Traceback (most recent call last): ... ValueError: Non-binary value was passed to the function """ bin_string = str(bin_string).strip() if not bin_string: raise ValueError("Empty string was passed to the function") is_negative = bin_string[0] == "-" if is_negative: bin_string = bin_string[1:] if not all(char in "01" for char in bin_string): raise ValueError("Non-binary value was passed to the function") decimal_number = 0 for char in bin_string: decimal_number = 2 * decimal_number + int(char) return -decimal_number if is_negative else decimal_number if __name__ == "__main__": from doctest import testmod testmod()
-1
TheAlgorithms/Python
8,551
New gitter link added or replaced
### Describe your change: Fixes #8546 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
KaixLina
"2023-03-26T04:26:40Z"
"2023-03-26T15:19:19Z"
3f9150c1b2dd15808a4962e03a1455f8d825512c
7cdb011ba440a07768179bfaea190bddefc890d8
New gitter link added or replaced. ### Describe your change: Fixes #8546 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
""" Highly divisible triangular numbers Problem 12 The sequence of triangle numbers is generated by adding the natural numbers. So the 7th triangle number would be 1 + 2 + 3 + 4 + 5 + 6 + 7 = 28. The first ten terms would be: 1, 3, 6, 10, 15, 21, 28, 36, 45, 55, ... Let us list the factors of the first seven triangle numbers: 1: 1 3: 1,3 6: 1,2,3,6 10: 1,2,5,10 15: 1,3,5,15 21: 1,3,7,21 28: 1,2,4,7,14,28 We can see that 28 is the first triangle number to have over five divisors. What is the value of the first triangle number to have over five hundred divisors? """ def count_divisors(n): n_divisors = 1 i = 2 while i * i <= n: multiplicity = 0 while n % i == 0: n //= i multiplicity += 1 n_divisors *= multiplicity + 1 i += 1 if n > 1: n_divisors *= 2 return n_divisors def solution(): """Returns the value of the first triangle number to have over five hundred divisors. >>> solution() 76576500 """ t_num = 1 i = 1 while True: i += 1 t_num += i if count_divisors(t_num) > 500: break return t_num if __name__ == "__main__": print(solution())
""" Highly divisible triangular numbers Problem 12 The sequence of triangle numbers is generated by adding the natural numbers. So the 7th triangle number would be 1 + 2 + 3 + 4 + 5 + 6 + 7 = 28. The first ten terms would be: 1, 3, 6, 10, 15, 21, 28, 36, 45, 55, ... Let us list the factors of the first seven triangle numbers: 1: 1 3: 1,3 6: 1,2,3,6 10: 1,2,5,10 15: 1,3,5,15 21: 1,3,7,21 28: 1,2,4,7,14,28 We can see that 28 is the first triangle number to have over five divisors. What is the value of the first triangle number to have over five hundred divisors? """ def count_divisors(n): n_divisors = 1 i = 2 while i * i <= n: multiplicity = 0 while n % i == 0: n //= i multiplicity += 1 n_divisors *= multiplicity + 1 i += 1 if n > 1: n_divisors *= 2 return n_divisors def solution(): """Returns the value of the first triangle number to have over five hundred divisors. >>> solution() 76576500 """ t_num = 1 i = 1 while True: i += 1 t_num += i if count_divisors(t_num) > 500: break return t_num if __name__ == "__main__": print(solution())
-1
TheAlgorithms/Python
8,551
New gitter link added or replaced
### Describe your change: Fixes #8546 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
KaixLina
"2023-03-26T04:26:40Z"
"2023-03-26T15:19:19Z"
3f9150c1b2dd15808a4962e03a1455f8d825512c
7cdb011ba440a07768179bfaea190bddefc890d8
New gitter link added or replaced. ### Describe your change: Fixes #8546 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
""" https://en.wikipedia.org/wiki/Computus#Gauss'_Easter_algorithm """ import math from datetime import datetime, timedelta def gauss_easter(year: int) -> datetime: """ Calculation Gregorian easter date for given year >>> gauss_easter(2007) datetime.datetime(2007, 4, 8, 0, 0) >>> gauss_easter(2008) datetime.datetime(2008, 3, 23, 0, 0) >>> gauss_easter(2020) datetime.datetime(2020, 4, 12, 0, 0) >>> gauss_easter(2021) datetime.datetime(2021, 4, 4, 0, 0) """ metonic_cycle = year % 19 julian_leap_year = year % 4 non_leap_year = year % 7 leap_day_inhibits = math.floor(year / 100) lunar_orbit_correction = math.floor((13 + 8 * leap_day_inhibits) / 25) leap_day_reinstall_number = leap_day_inhibits / 4 secular_moon_shift = ( 15 - lunar_orbit_correction + leap_day_inhibits - leap_day_reinstall_number ) % 30 century_starting_point = (4 + leap_day_inhibits - leap_day_reinstall_number) % 7 # days to be added to March 21 days_to_add = (19 * metonic_cycle + secular_moon_shift) % 30 # PHM -> Paschal Full Moon days_from_phm_to_sunday = ( 2 * julian_leap_year + 4 * non_leap_year + 6 * days_to_add + century_starting_point ) % 7 if days_to_add == 29 and days_from_phm_to_sunday == 6: return datetime(year, 4, 19) elif days_to_add == 28 and days_from_phm_to_sunday == 6: return datetime(year, 4, 18) else: return datetime(year, 3, 22) + timedelta( days=int(days_to_add + days_from_phm_to_sunday) ) if __name__ == "__main__": for year in (1994, 2000, 2010, 2021, 2023): tense = "will be" if year > datetime.now().year else "was" print(f"Easter in {year} {tense} {gauss_easter(year)}")
""" https://en.wikipedia.org/wiki/Computus#Gauss'_Easter_algorithm """ import math from datetime import datetime, timedelta def gauss_easter(year: int) -> datetime: """ Calculation Gregorian easter date for given year >>> gauss_easter(2007) datetime.datetime(2007, 4, 8, 0, 0) >>> gauss_easter(2008) datetime.datetime(2008, 3, 23, 0, 0) >>> gauss_easter(2020) datetime.datetime(2020, 4, 12, 0, 0) >>> gauss_easter(2021) datetime.datetime(2021, 4, 4, 0, 0) """ metonic_cycle = year % 19 julian_leap_year = year % 4 non_leap_year = year % 7 leap_day_inhibits = math.floor(year / 100) lunar_orbit_correction = math.floor((13 + 8 * leap_day_inhibits) / 25) leap_day_reinstall_number = leap_day_inhibits / 4 secular_moon_shift = ( 15 - lunar_orbit_correction + leap_day_inhibits - leap_day_reinstall_number ) % 30 century_starting_point = (4 + leap_day_inhibits - leap_day_reinstall_number) % 7 # days to be added to March 21 days_to_add = (19 * metonic_cycle + secular_moon_shift) % 30 # PHM -> Paschal Full Moon days_from_phm_to_sunday = ( 2 * julian_leap_year + 4 * non_leap_year + 6 * days_to_add + century_starting_point ) % 7 if days_to_add == 29 and days_from_phm_to_sunday == 6: return datetime(year, 4, 19) elif days_to_add == 28 and days_from_phm_to_sunday == 6: return datetime(year, 4, 18) else: return datetime(year, 3, 22) + timedelta( days=int(days_to_add + days_from_phm_to_sunday) ) if __name__ == "__main__": for year in (1994, 2000, 2010, 2021, 2023): tense = "will be" if year > datetime.now().year else "was" print(f"Easter in {year} {tense} {gauss_easter(year)}")
-1
TheAlgorithms/Python
8,551
New gitter link added or replaced
### Describe your change: Fixes #8546 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
KaixLina
"2023-03-26T04:26:40Z"
"2023-03-26T15:19:19Z"
3f9150c1b2dd15808a4962e03a1455f8d825512c
7cdb011ba440a07768179bfaea190bddefc890d8
New gitter link added or replaced. ### Describe your change: Fixes #8546 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
""" Project Euler Problem 6: https://projecteuler.net/problem=6 Sum square difference The sum of the squares of the first ten natural numbers is, 1^2 + 2^2 + ... + 10^2 = 385 The square of the sum of the first ten natural numbers is, (1 + 2 + ... + 10)^2 = 55^2 = 3025 Hence the difference between the sum of the squares of the first ten natural numbers and the square of the sum is 3025 - 385 = 2640. Find the difference between the sum of the squares of the first one hundred natural numbers and the square of the sum. """ def solution(n: int = 100) -> int: """ Returns the difference between the sum of the squares of the first n natural numbers and the square of the sum. >>> solution(10) 2640 >>> solution(15) 13160 >>> solution(20) 41230 >>> solution(50) 1582700 """ sum_of_squares = 0 sum_of_ints = 0 for i in range(1, n + 1): sum_of_squares += i**2 sum_of_ints += i return sum_of_ints**2 - sum_of_squares if __name__ == "__main__": print(f"{solution() = }")
""" Project Euler Problem 6: https://projecteuler.net/problem=6 Sum square difference The sum of the squares of the first ten natural numbers is, 1^2 + 2^2 + ... + 10^2 = 385 The square of the sum of the first ten natural numbers is, (1 + 2 + ... + 10)^2 = 55^2 = 3025 Hence the difference between the sum of the squares of the first ten natural numbers and the square of the sum is 3025 - 385 = 2640. Find the difference between the sum of the squares of the first one hundred natural numbers and the square of the sum. """ def solution(n: int = 100) -> int: """ Returns the difference between the sum of the squares of the first n natural numbers and the square of the sum. >>> solution(10) 2640 >>> solution(15) 13160 >>> solution(20) 41230 >>> solution(50) 1582700 """ sum_of_squares = 0 sum_of_ints = 0 for i in range(1, n + 1): sum_of_squares += i**2 sum_of_ints += i return sum_of_ints**2 - sum_of_squares if __name__ == "__main__": print(f"{solution() = }")
-1
TheAlgorithms/Python
8,551
New gitter link added or replaced
### Describe your change: Fixes #8546 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
KaixLina
"2023-03-26T04:26:40Z"
"2023-03-26T15:19:19Z"
3f9150c1b2dd15808a4962e03a1455f8d825512c
7cdb011ba440a07768179bfaea190bddefc890d8
New gitter link added or replaced. ### Describe your change: Fixes #8546 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
"""Absolute Value.""" def abs_val(num: float) -> float: """ Find the absolute value of a number. >>> abs_val(-5.1) 5.1 >>> abs_val(-5) == abs_val(5) True >>> abs_val(0) 0 """ return -num if num < 0 else num def abs_min(x: list[int]) -> int: """ >>> abs_min([0,5,1,11]) 0 >>> abs_min([3,-10,-2]) -2 >>> abs_min([]) Traceback (most recent call last): ... ValueError: abs_min() arg is an empty sequence """ if len(x) == 0: raise ValueError("abs_min() arg is an empty sequence") j = x[0] for i in x: if abs_val(i) < abs_val(j): j = i return j def abs_max(x: list[int]) -> int: """ >>> abs_max([0,5,1,11]) 11 >>> abs_max([3,-10,-2]) -10 >>> abs_max([]) Traceback (most recent call last): ... ValueError: abs_max() arg is an empty sequence """ if len(x) == 0: raise ValueError("abs_max() arg is an empty sequence") j = x[0] for i in x: if abs(i) > abs(j): j = i return j def abs_max_sort(x: list[int]) -> int: """ >>> abs_max_sort([0,5,1,11]) 11 >>> abs_max_sort([3,-10,-2]) -10 >>> abs_max_sort([]) Traceback (most recent call last): ... ValueError: abs_max_sort() arg is an empty sequence """ if len(x) == 0: raise ValueError("abs_max_sort() arg is an empty sequence") return sorted(x, key=abs)[-1] def test_abs_val(): """ >>> test_abs_val() """ assert abs_val(0) == 0 assert abs_val(34) == 34 assert abs_val(-100000000000) == 100000000000 a = [-3, -1, 2, -11] assert abs_max(a) == -11 assert abs_max_sort(a) == -11 assert abs_min(a) == -1 if __name__ == "__main__": import doctest doctest.testmod() test_abs_val() print(abs_val(-34)) # --> 34
"""Absolute Value.""" def abs_val(num: float) -> float: """ Find the absolute value of a number. >>> abs_val(-5.1) 5.1 >>> abs_val(-5) == abs_val(5) True >>> abs_val(0) 0 """ return -num if num < 0 else num def abs_min(x: list[int]) -> int: """ >>> abs_min([0,5,1,11]) 0 >>> abs_min([3,-10,-2]) -2 >>> abs_min([]) Traceback (most recent call last): ... ValueError: abs_min() arg is an empty sequence """ if len(x) == 0: raise ValueError("abs_min() arg is an empty sequence") j = x[0] for i in x: if abs_val(i) < abs_val(j): j = i return j def abs_max(x: list[int]) -> int: """ >>> abs_max([0,5,1,11]) 11 >>> abs_max([3,-10,-2]) -10 >>> abs_max([]) Traceback (most recent call last): ... ValueError: abs_max() arg is an empty sequence """ if len(x) == 0: raise ValueError("abs_max() arg is an empty sequence") j = x[0] for i in x: if abs(i) > abs(j): j = i return j def abs_max_sort(x: list[int]) -> int: """ >>> abs_max_sort([0,5,1,11]) 11 >>> abs_max_sort([3,-10,-2]) -10 >>> abs_max_sort([]) Traceback (most recent call last): ... ValueError: abs_max_sort() arg is an empty sequence """ if len(x) == 0: raise ValueError("abs_max_sort() arg is an empty sequence") return sorted(x, key=abs)[-1] def test_abs_val(): """ >>> test_abs_val() """ assert abs_val(0) == 0 assert abs_val(34) == 34 assert abs_val(-100000000000) == 100000000000 a = [-3, -1, 2, -11] assert abs_max(a) == -11 assert abs_max_sort(a) == -11 assert abs_min(a) == -1 if __name__ == "__main__": import doctest doctest.testmod() test_abs_val() print(abs_val(-34)) # --> 34
-1
TheAlgorithms/Python
8,551
New gitter link added or replaced
### Describe your change: Fixes #8546 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
KaixLina
"2023-03-26T04:26:40Z"
"2023-03-26T15:19:19Z"
3f9150c1b2dd15808a4962e03a1455f8d825512c
7cdb011ba440a07768179bfaea190bddefc890d8
New gitter link added or replaced. ### Describe your change: Fixes #8546 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
def permute(nums: list[int]) -> list[list[int]]: """ Return all permutations. >>> from itertools import permutations >>> numbers= [1,2,3] >>> all(list(nums) in permute(numbers) for nums in permutations(numbers)) True """ result = [] if len(nums) == 1: return [nums.copy()] for _ in range(len(nums)): n = nums.pop(0) permutations = permute(nums) for perm in permutations: perm.append(n) result.extend(permutations) nums.append(n) return result if __name__ == "__main__": import doctest doctest.testmod()
def permute(nums: list[int]) -> list[list[int]]: """ Return all permutations. >>> from itertools import permutations >>> numbers= [1,2,3] >>> all(list(nums) in permute(numbers) for nums in permutations(numbers)) True """ result = [] if len(nums) == 1: return [nums.copy()] for _ in range(len(nums)): n = nums.pop(0) permutations = permute(nums) for perm in permutations: perm.append(n) result.extend(permutations) nums.append(n) return result if __name__ == "__main__": import doctest doctest.testmod()
-1
TheAlgorithms/Python
8,551
New gitter link added or replaced
### Describe your change: Fixes #8546 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
KaixLina
"2023-03-26T04:26:40Z"
"2023-03-26T15:19:19Z"
3f9150c1b2dd15808a4962e03a1455f8d825512c
7cdb011ba440a07768179bfaea190bddefc890d8
New gitter link added or replaced. ### Describe your change: Fixes #8546 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
""" Consider all integer combinations of ab for 2 <= a <= 5 and 2 <= b <= 5: 2^2=4, 2^3=8, 2^4=16, 2^5=32 3^2=9, 3^3=27, 3^4=81, 3^5=243 4^2=16, 4^3=64, 4^4=256, 4^5=1024 5^2=25, 5^3=125, 5^4=625, 5^5=3125 If they are then placed in numerical order, with any repeats removed, we get the following sequence of 15 distinct terms: 4, 8, 9, 16, 25, 27, 32, 64, 81, 125, 243, 256, 625, 1024, 3125 How many distinct terms are in the sequence generated by ab for 2 <= a <= 100 and 2 <= b <= 100? """ def solution(n: int = 100) -> int: """Returns the number of distinct terms in the sequence generated by a^b for 2 <= a <= 100 and 2 <= b <= 100. >>> solution(100) 9183 >>> solution(50) 2184 >>> solution(20) 324 >>> solution(5) 15 >>> solution(2) 1 >>> solution(1) 0 """ collect_powers = set() current_pow = 0 n = n + 1 # maximum limit for a in range(2, n): for b in range(2, n): current_pow = a**b # calculates the current power collect_powers.add(current_pow) # adds the result to the set return len(collect_powers) if __name__ == "__main__": print("Number of terms ", solution(int(str(input()).strip())))
""" Consider all integer combinations of ab for 2 <= a <= 5 and 2 <= b <= 5: 2^2=4, 2^3=8, 2^4=16, 2^5=32 3^2=9, 3^3=27, 3^4=81, 3^5=243 4^2=16, 4^3=64, 4^4=256, 4^5=1024 5^2=25, 5^3=125, 5^4=625, 5^5=3125 If they are then placed in numerical order, with any repeats removed, we get the following sequence of 15 distinct terms: 4, 8, 9, 16, 25, 27, 32, 64, 81, 125, 243, 256, 625, 1024, 3125 How many distinct terms are in the sequence generated by ab for 2 <= a <= 100 and 2 <= b <= 100? """ def solution(n: int = 100) -> int: """Returns the number of distinct terms in the sequence generated by a^b for 2 <= a <= 100 and 2 <= b <= 100. >>> solution(100) 9183 >>> solution(50) 2184 >>> solution(20) 324 >>> solution(5) 15 >>> solution(2) 1 >>> solution(1) 0 """ collect_powers = set() current_pow = 0 n = n + 1 # maximum limit for a in range(2, n): for b in range(2, n): current_pow = a**b # calculates the current power collect_powers.add(current_pow) # adds the result to the set return len(collect_powers) if __name__ == "__main__": print("Number of terms ", solution(int(str(input()).strip())))
-1
TheAlgorithms/Python
8,551
New gitter link added or replaced
### Describe your change: Fixes #8546 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
KaixLina
"2023-03-26T04:26:40Z"
"2023-03-26T15:19:19Z"
3f9150c1b2dd15808a4962e03a1455f8d825512c
7cdb011ba440a07768179bfaea190bddefc890d8
New gitter link added or replaced. ### Describe your change: Fixes #8546 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
-1
TheAlgorithms/Python
8,551
New gitter link added or replaced
### Describe your change: Fixes #8546 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
KaixLina
"2023-03-26T04:26:40Z"
"2023-03-26T15:19:19Z"
3f9150c1b2dd15808a4962e03a1455f8d825512c
7cdb011ba440a07768179bfaea190bddefc890d8
New gitter link added or replaced. ### Describe your change: Fixes #8546 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
""" In this problem, we want to determine all possible subsequences of the given sequence. We use backtracking to solve this problem. Time complexity: O(2^n), where n denotes the length of the given sequence. """ from __future__ import annotations from typing import Any def generate_all_subsequences(sequence: list[Any]) -> None: create_state_space_tree(sequence, [], 0) def create_state_space_tree( sequence: list[Any], current_subsequence: list[Any], index: int ) -> None: """ Creates a state space tree to iterate through each branch using DFS. We know that each state has exactly two children. It terminates when it reaches the end of the given sequence. """ if index == len(sequence): print(current_subsequence) return create_state_space_tree(sequence, current_subsequence, index + 1) current_subsequence.append(sequence[index]) create_state_space_tree(sequence, current_subsequence, index + 1) current_subsequence.pop() if __name__ == "__main__": seq: list[Any] = [3, 1, 2, 4] generate_all_subsequences(seq) seq.clear() seq.extend(["A", "B", "C"]) generate_all_subsequences(seq)
""" In this problem, we want to determine all possible subsequences of the given sequence. We use backtracking to solve this problem. Time complexity: O(2^n), where n denotes the length of the given sequence. """ from __future__ import annotations from typing import Any def generate_all_subsequences(sequence: list[Any]) -> None: create_state_space_tree(sequence, [], 0) def create_state_space_tree( sequence: list[Any], current_subsequence: list[Any], index: int ) -> None: """ Creates a state space tree to iterate through each branch using DFS. We know that each state has exactly two children. It terminates when it reaches the end of the given sequence. """ if index == len(sequence): print(current_subsequence) return create_state_space_tree(sequence, current_subsequence, index + 1) current_subsequence.append(sequence[index]) create_state_space_tree(sequence, current_subsequence, index + 1) current_subsequence.pop() if __name__ == "__main__": seq: list[Any] = [3, 1, 2, 4] generate_all_subsequences(seq) seq.clear() seq.extend(["A", "B", "C"]) generate_all_subsequences(seq)
-1
TheAlgorithms/Python
8,551
New gitter link added or replaced
### Describe your change: Fixes #8546 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
KaixLina
"2023-03-26T04:26:40Z"
"2023-03-26T15:19:19Z"
3f9150c1b2dd15808a4962e03a1455f8d825512c
7cdb011ba440a07768179bfaea190bddefc890d8
New gitter link added or replaced. ### Describe your change: Fixes #8546 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
def get_1s_count(number: int) -> int: """ Count the number of set bits in a 32 bit integer using Brian Kernighan's way. Ref - https://graphics.stanford.edu/~seander/bithacks.html#CountBitsSetKernighan >>> get_1s_count(25) 3 >>> get_1s_count(37) 3 >>> get_1s_count(21) 3 >>> get_1s_count(58) 4 >>> get_1s_count(0) 0 >>> get_1s_count(256) 1 >>> get_1s_count(-1) Traceback (most recent call last): ... ValueError: Input must be a non-negative integer >>> get_1s_count(0.8) Traceback (most recent call last): ... ValueError: Input must be a non-negative integer >>> get_1s_count("25") Traceback (most recent call last): ... ValueError: Input must be a non-negative integer """ if not isinstance(number, int) or number < 0: raise ValueError("Input must be a non-negative integer") count = 0 while number: # This way we arrive at next set bit (next 1) instead of looping # through each bit and checking for 1s hence the # loop won't run 32 times it will only run the number of `1` times number &= number - 1 count += 1 return count if __name__ == "__main__": import doctest doctest.testmod()
def get_1s_count(number: int) -> int: """ Count the number of set bits in a 32 bit integer using Brian Kernighan's way. Ref - https://graphics.stanford.edu/~seander/bithacks.html#CountBitsSetKernighan >>> get_1s_count(25) 3 >>> get_1s_count(37) 3 >>> get_1s_count(21) 3 >>> get_1s_count(58) 4 >>> get_1s_count(0) 0 >>> get_1s_count(256) 1 >>> get_1s_count(-1) Traceback (most recent call last): ... ValueError: Input must be a non-negative integer >>> get_1s_count(0.8) Traceback (most recent call last): ... ValueError: Input must be a non-negative integer >>> get_1s_count("25") Traceback (most recent call last): ... ValueError: Input must be a non-negative integer """ if not isinstance(number, int) or number < 0: raise ValueError("Input must be a non-negative integer") count = 0 while number: # This way we arrive at next set bit (next 1) instead of looping # through each bit and checking for 1s hence the # loop won't run 32 times it will only run the number of `1` times number &= number - 1 count += 1 return count if __name__ == "__main__": import doctest doctest.testmod()
-1
TheAlgorithms/Python
8,551
New gitter link added or replaced
### Describe your change: Fixes #8546 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
KaixLina
"2023-03-26T04:26:40Z"
"2023-03-26T15:19:19Z"
3f9150c1b2dd15808a4962e03a1455f8d825512c
7cdb011ba440a07768179bfaea190bddefc890d8
New gitter link added or replaced. ### Describe your change: Fixes #8546 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
def reverse_words(input_str: str) -> str: """ Reverses words in a given string >>> reverse_words("I love Python") 'Python love I' >>> reverse_words("I Love Python") 'Python Love I' """ return " ".join(input_str.split()[::-1]) if __name__ == "__main__": import doctest doctest.testmod()
def reverse_words(input_str: str) -> str: """ Reverses words in a given string >>> reverse_words("I love Python") 'Python love I' >>> reverse_words("I Love Python") 'Python Love I' """ return " ".join(input_str.split()[::-1]) if __name__ == "__main__": import doctest doctest.testmod()
-1
TheAlgorithms/Python
8,551
New gitter link added or replaced
### Describe your change: Fixes #8546 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
KaixLina
"2023-03-26T04:26:40Z"
"2023-03-26T15:19:19Z"
3f9150c1b2dd15808a4962e03a1455f8d825512c
7cdb011ba440a07768179bfaea190bddefc890d8
New gitter link added or replaced. ### Describe your change: Fixes #8546 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
""" Project Euler Problem 91: https://projecteuler.net/problem=91 The points P (x1, y1) and Q (x2, y2) are plotted at integer coordinates and are joined to the origin, O(0,0), to form ΔOPQ.  There are exactly fourteen triangles containing a right angle that can be formed when each coordinate lies between 0 and 2 inclusive; that is, 0 ≤ x1, y1, x2, y2 ≤ 2.  Given that 0 ≤ x1, y1, x2, y2 ≤ 50, how many right triangles can be formed? """ from itertools import combinations, product def is_right(x1: int, y1: int, x2: int, y2: int) -> bool: """ Check if the triangle described by P(x1,y1), Q(x2,y2) and O(0,0) is right-angled. Note: this doesn't check if P and Q are equal, but that's handled by the use of itertools.combinations in the solution function. >>> is_right(0, 1, 2, 0) True >>> is_right(1, 0, 2, 2) False """ if x1 == y1 == 0 or x2 == y2 == 0: return False a_square = x1 * x1 + y1 * y1 b_square = x2 * x2 + y2 * y2 c_square = (x1 - x2) * (x1 - x2) + (y1 - y2) * (y1 - y2) return ( a_square + b_square == c_square or a_square + c_square == b_square or b_square + c_square == a_square ) def solution(limit: int = 50) -> int: """ Return the number of right triangles OPQ that can be formed by two points P, Q which have both x- and y- coordinates between 0 and limit inclusive. >>> solution(2) 14 >>> solution(10) 448 """ return sum( 1 for pt1, pt2 in combinations(product(range(limit + 1), repeat=2), 2) if is_right(*pt1, *pt2) ) if __name__ == "__main__": print(f"{solution() = }")
""" Project Euler Problem 91: https://projecteuler.net/problem=91 The points P (x1, y1) and Q (x2, y2) are plotted at integer coordinates and are joined to the origin, O(0,0), to form ΔOPQ.  There are exactly fourteen triangles containing a right angle that can be formed when each coordinate lies between 0 and 2 inclusive; that is, 0 ≤ x1, y1, x2, y2 ≤ 2.  Given that 0 ≤ x1, y1, x2, y2 ≤ 50, how many right triangles can be formed? """ from itertools import combinations, product def is_right(x1: int, y1: int, x2: int, y2: int) -> bool: """ Check if the triangle described by P(x1,y1), Q(x2,y2) and O(0,0) is right-angled. Note: this doesn't check if P and Q are equal, but that's handled by the use of itertools.combinations in the solution function. >>> is_right(0, 1, 2, 0) True >>> is_right(1, 0, 2, 2) False """ if x1 == y1 == 0 or x2 == y2 == 0: return False a_square = x1 * x1 + y1 * y1 b_square = x2 * x2 + y2 * y2 c_square = (x1 - x2) * (x1 - x2) + (y1 - y2) * (y1 - y2) return ( a_square + b_square == c_square or a_square + c_square == b_square or b_square + c_square == a_square ) def solution(limit: int = 50) -> int: """ Return the number of right triangles OPQ that can be formed by two points P, Q which have both x- and y- coordinates between 0 and limit inclusive. >>> solution(2) 14 >>> solution(10) 448 """ return sum( 1 for pt1, pt2 in combinations(product(range(limit + 1), repeat=2), 2) if is_right(*pt1, *pt2) ) if __name__ == "__main__": print(f"{solution() = }")
-1
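A side note on the right-angle test used in the Project Euler 91 file above: the same condition can be phrased with dot products, one per vertex of triangle OPQ. The independent sketch below (not part of the PR data) reproduces the documented count of 14 for coordinates up to 2.

from itertools import combinations, product


def is_right_by_dot(x1: int, y1: int, x2: int, y2: int) -> bool:
    # A right angle at O, P or Q means the two edges meeting at that vertex
    # are perpendicular, i.e. the corresponding dot product is zero.
    at_o = x1 * x2 + y1 * y2 == 0
    at_p = x1 * (x2 - x1) + y1 * (y2 - y1) == 0
    at_q = x2 * (x1 - x2) + y2 * (y1 - y2) == 0
    return at_o or at_p or at_q


def count_right_triangles(limit: int) -> int:
    points = [p for p in product(range(limit + 1), repeat=2) if p != (0, 0)]
    return sum(1 for p, q in combinations(points, 2) if is_right_by_dot(*p, *q))


if __name__ == "__main__":
    assert count_right_triangles(2) == 14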
TheAlgorithms/Python
8,551
New gitter link added or replaced
### Describe your change: Fixes #8546 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
KaixLina
"2023-03-26T04:26:40Z"
"2023-03-26T15:19:19Z"
3f9150c1b2dd15808a4962e03a1455f8d825512c
7cdb011ba440a07768179bfaea190bddefc890d8
New gitter link added or replaced. ### Describe your change: Fixes #8546 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
-1
TheAlgorithms/Python
8,551
New gitter link added or replaced
### Describe your change: Fixes #8546 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
KaixLina
"2023-03-26T04:26:40Z"
"2023-03-26T15:19:19Z"
3f9150c1b2dd15808a4962e03a1455f8d825512c
7cdb011ba440a07768179bfaea190bddefc890d8
New gitter link added or replaced. ### Describe your change: Fixes #8546 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
import random class Onepad: @staticmethod def encrypt(text: str) -> tuple[list[int], list[int]]: """Function to encrypt text using pseudo-random numbers""" plain = [ord(i) for i in text] key = [] cipher = [] for i in plain: k = random.randint(1, 300) c = (i + k) * k cipher.append(c) key.append(k) return cipher, key @staticmethod def decrypt(cipher: list[int], key: list[int]) -> str: """Function to decrypt text using pseudo-random numbers.""" plain = [] for i in range(len(key)): p = int((cipher[i] - (key[i]) ** 2) / key[i]) plain.append(chr(p)) return "".join(plain) if __name__ == "__main__": c, k = Onepad().encrypt("Hello") print(c, k) print(Onepad().decrypt(c, k))
import random class Onepad: @staticmethod def encrypt(text: str) -> tuple[list[int], list[int]]: """Function to encrypt text using pseudo-random numbers""" plain = [ord(i) for i in text] key = [] cipher = [] for i in plain: k = random.randint(1, 300) c = (i + k) * k cipher.append(c) key.append(k) return cipher, key @staticmethod def decrypt(cipher: list[int], key: list[int]) -> str: """Function to decrypt text using pseudo-random numbers.""" plain = [] for i in range(len(key)): p = int((cipher[i] - (key[i]) ** 2) / key[i]) plain.append(chr(p)) return "".join(plain) if __name__ == "__main__": c, k = Onepad().encrypt("Hello") print(c, k) print(Onepad().decrypt(c, k))
-1
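A small, self-contained sketch (independent of the Onepad class above; helper names are illustrative) showing why its arithmetic round-trips: encryption maps a code point p to c = (p + k) * k, so decryption recovers p as (c - k*k) / k, exactly the expression used in decrypt().

import random


def encrypt_char(code_point: int, key: int) -> int:
    # c = (p + k) * k, as in Onepad.encrypt
    return (code_point + key) * key


def decrypt_char(cipher: int, key: int) -> int:
    # p = (c - k^2) / k, as in Onepad.decrypt; the division is exact
    return (cipher - key * key) // key


if __name__ == "__main__":
    rng = random.Random(0)
    message = "Hello"
    keys = [rng.randint(1, 300) for _ in message]
    cipher = [encrypt_char(ord(ch), k) for ch, k in zip(message, keys)]
    assert "".join(chr(decrypt_char(c, k)) for c, k in zip(cipher, keys)) == message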
TheAlgorithms/Python
8,551
New gitter link added or replaced
### Describe your change: Fixes #8546 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
KaixLina
"2023-03-26T04:26:40Z"
"2023-03-26T15:19:19Z"
3f9150c1b2dd15808a4962e03a1455f8d825512c
7cdb011ba440a07768179bfaea190bddefc890d8
New gitter link added or replaced. ### Describe your change: Fixes #8546 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
import numpy as np def power_iteration( input_matrix: np.ndarray, vector: np.ndarray, error_tol: float = 1e-12, max_iterations: int = 100, ) -> tuple[float, np.ndarray]: """ Power Iteration. Find the largest eigenvalue and corresponding eigenvector of matrix input_matrix given a random vector in the same space. Will work so long as vector has component of largest eigenvector. input_matrix must be either real or Hermitian. Input input_matrix: input matrix whose largest eigenvalue we will find. Numpy array. np.shape(input_matrix) == (N,N). vector: random initial vector in same space as matrix. Numpy array. np.shape(vector) == (N,) or (N,1) Output largest_eigenvalue: largest eigenvalue of the matrix input_matrix. Float. Scalar. largest_eigenvector: eigenvector corresponding to largest_eigenvalue. Numpy array. np.shape(largest_eigenvector) == (N,) or (N,1). >>> import numpy as np >>> input_matrix = np.array([ ... [41, 4, 20], ... [ 4, 26, 30], ... [20, 30, 50] ... ]) >>> vector = np.array([41,4,20]) >>> power_iteration(input_matrix,vector) (79.66086378788381, array([0.44472726, 0.46209842, 0.76725662])) """ # Ensure matrix is square. assert np.shape(input_matrix)[0] == np.shape(input_matrix)[1] # Ensure proper dimensionality. assert np.shape(input_matrix)[0] == np.shape(vector)[0] # Ensure inputs are either both complex or both real assert np.iscomplexobj(input_matrix) == np.iscomplexobj(vector) is_complex = np.iscomplexobj(input_matrix) if is_complex: # Ensure complex input_matrix is Hermitian assert np.array_equal(input_matrix, input_matrix.conj().T) # Set convergence to False. Will define convergence when we exceed max_iterations # or when we have small changes from one iteration to next. convergence = False lambda_previous = 0 iterations = 0 error = 1e12 while not convergence: # Multiple matrix by the vector. w = np.dot(input_matrix, vector) # Normalize the resulting output vector. vector = w / np.linalg.norm(w) # Find rayleigh quotient # (faster than usual b/c we know vector is normalized already) vector_h = vector.conj().T if is_complex else vector.T lambda_ = np.dot(vector_h, np.dot(input_matrix, vector)) # Check convergence. error = np.abs(lambda_ - lambda_previous) / lambda_ iterations += 1 if error <= error_tol or iterations >= max_iterations: convergence = True lambda_previous = lambda_ if is_complex: lambda_ = np.real(lambda_) return lambda_, vector def test_power_iteration() -> None: """ >>> test_power_iteration() # self running tests """ real_input_matrix = np.array([[41, 4, 20], [4, 26, 30], [20, 30, 50]]) real_vector = np.array([41, 4, 20]) complex_input_matrix = real_input_matrix.astype(np.complex128) imag_matrix = np.triu(1j * complex_input_matrix, 1) complex_input_matrix += imag_matrix complex_input_matrix += -1 * imag_matrix.T complex_vector = np.array([41, 4, 20]).astype(np.complex128) for problem_type in ["real", "complex"]: if problem_type == "real": input_matrix = real_input_matrix vector = real_vector elif problem_type == "complex": input_matrix = complex_input_matrix vector = complex_vector # Our implementation. eigen_value, eigen_vector = power_iteration(input_matrix, vector) # Numpy implementation. # Get eigenvalues and eigenvectors using built-in numpy # eigh (eigh used for symmetric or hermetian matrices). eigen_values, eigen_vectors = np.linalg.eigh(input_matrix) # Last eigenvalue is the maximum one. eigen_value_max = eigen_values[-1] # Last column in this matrix is eigenvector corresponding to largest eigenvalue. 
eigen_vector_max = eigen_vectors[:, -1] # Check our implementation and numpy gives close answers. assert np.abs(eigen_value - eigen_value_max) <= 1e-6 # Take absolute values element wise of each eigenvector. # as they are only unique to a minus sign. assert np.linalg.norm(np.abs(eigen_vector) - np.abs(eigen_vector_max)) <= 1e-6 if __name__ == "__main__": import doctest doctest.testmod() test_power_iteration()
import numpy as np def power_iteration( input_matrix: np.ndarray, vector: np.ndarray, error_tol: float = 1e-12, max_iterations: int = 100, ) -> tuple[float, np.ndarray]: """ Power Iteration. Find the largest eigenvalue and corresponding eigenvector of matrix input_matrix given a random vector in the same space. Will work so long as vector has component of largest eigenvector. input_matrix must be either real or Hermitian. Input input_matrix: input matrix whose largest eigenvalue we will find. Numpy array. np.shape(input_matrix) == (N,N). vector: random initial vector in same space as matrix. Numpy array. np.shape(vector) == (N,) or (N,1) Output largest_eigenvalue: largest eigenvalue of the matrix input_matrix. Float. Scalar. largest_eigenvector: eigenvector corresponding to largest_eigenvalue. Numpy array. np.shape(largest_eigenvector) == (N,) or (N,1). >>> import numpy as np >>> input_matrix = np.array([ ... [41, 4, 20], ... [ 4, 26, 30], ... [20, 30, 50] ... ]) >>> vector = np.array([41,4,20]) >>> power_iteration(input_matrix,vector) (79.66086378788381, array([0.44472726, 0.46209842, 0.76725662])) """ # Ensure matrix is square. assert np.shape(input_matrix)[0] == np.shape(input_matrix)[1] # Ensure proper dimensionality. assert np.shape(input_matrix)[0] == np.shape(vector)[0] # Ensure inputs are either both complex or both real assert np.iscomplexobj(input_matrix) == np.iscomplexobj(vector) is_complex = np.iscomplexobj(input_matrix) if is_complex: # Ensure complex input_matrix is Hermitian assert np.array_equal(input_matrix, input_matrix.conj().T) # Set convergence to False. Will define convergence when we exceed max_iterations # or when we have small changes from one iteration to next. convergence = False lambda_previous = 0 iterations = 0 error = 1e12 while not convergence: # Multiple matrix by the vector. w = np.dot(input_matrix, vector) # Normalize the resulting output vector. vector = w / np.linalg.norm(w) # Find rayleigh quotient # (faster than usual b/c we know vector is normalized already) vector_h = vector.conj().T if is_complex else vector.T lambda_ = np.dot(vector_h, np.dot(input_matrix, vector)) # Check convergence. error = np.abs(lambda_ - lambda_previous) / lambda_ iterations += 1 if error <= error_tol or iterations >= max_iterations: convergence = True lambda_previous = lambda_ if is_complex: lambda_ = np.real(lambda_) return lambda_, vector def test_power_iteration() -> None: """ >>> test_power_iteration() # self running tests """ real_input_matrix = np.array([[41, 4, 20], [4, 26, 30], [20, 30, 50]]) real_vector = np.array([41, 4, 20]) complex_input_matrix = real_input_matrix.astype(np.complex128) imag_matrix = np.triu(1j * complex_input_matrix, 1) complex_input_matrix += imag_matrix complex_input_matrix += -1 * imag_matrix.T complex_vector = np.array([41, 4, 20]).astype(np.complex128) for problem_type in ["real", "complex"]: if problem_type == "real": input_matrix = real_input_matrix vector = real_vector elif problem_type == "complex": input_matrix = complex_input_matrix vector = complex_vector # Our implementation. eigen_value, eigen_vector = power_iteration(input_matrix, vector) # Numpy implementation. # Get eigenvalues and eigenvectors using built-in numpy # eigh (eigh used for symmetric or hermetian matrices). eigen_values, eigen_vectors = np.linalg.eigh(input_matrix) # Last eigenvalue is the maximum one. eigen_value_max = eigen_values[-1] # Last column in this matrix is eigenvector corresponding to largest eigenvalue. 
eigen_vector_max = eigen_vectors[:, -1] # Check our implementation and numpy gives close answers. assert np.abs(eigen_value - eigen_value_max) <= 1e-6 # Take absolute values element wise of each eigenvector. # as they are only unique to a minus sign. assert np.linalg.norm(np.abs(eigen_vector) - np.abs(eigen_vector_max)) <= 1e-6 if __name__ == "__main__": import doctest doctest.testmod() test_power_iteration()
-1
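A compact usage sketch for the idea in the power-iteration file above (independent code, assuming numpy is available): a few hand-rolled multiply-and-normalize steps on the same symmetric test matrix, with the resulting Rayleigh quotient checked against numpy's eigh.

import numpy as np

# Same symmetric test matrix as in the doctest above.
a = np.array([[41.0, 4.0, 20.0], [4.0, 26.0, 30.0], [20.0, 30.0, 50.0]])
v = np.array([41.0, 4.0, 20.0])

# Repeated multiply-and-normalize drives v toward the dominant eigenvector.
for _ in range(100):
    w = a @ v
    v = w / np.linalg.norm(w)

rayleigh = v @ a @ v  # Rayleigh quotient of a unit vector
assert abs(rayleigh - np.linalg.eigh(a)[0][-1]) < 1e-6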
TheAlgorithms/Python
8,551
New gitter link added or replaced
### Describe your change: Fixes #8546 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
KaixLina
"2023-03-26T04:26:40Z"
"2023-03-26T15:19:19Z"
3f9150c1b2dd15808a4962e03a1455f8d825512c
7cdb011ba440a07768179bfaea190bddefc890d8
New gitter link added or replaced. ### Describe your change: Fixes #8546 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
""" If we are presented with the first k terms of a sequence it is impossible to say with certainty the value of the next term, as there are infinitely many polynomial functions that can model the sequence. As an example, let us consider the sequence of cube numbers. This is defined by the generating function, u(n) = n3: 1, 8, 27, 64, 125, 216, ... Suppose we were only given the first two terms of this sequence. Working on the principle that "simple is best" we should assume a linear relationship and predict the next term to be 15 (common difference 7). Even if we were presented with the first three terms, by the same principle of simplicity, a quadratic relationship should be assumed. We shall define OP(k, n) to be the nth term of the optimum polynomial generating function for the first k terms of a sequence. It should be clear that OP(k, n) will accurately generate the terms of the sequence for n ≤ k, and potentially the first incorrect term (FIT) will be OP(k, k+1); in which case we shall call it a bad OP (BOP). As a basis, if we were only given the first term of sequence, it would be most sensible to assume constancy; that is, for n ≥ 2, OP(1, n) = u(1). Hence we obtain the following OPs for the cubic sequence: OP(1, n) = 1 1, 1, 1, 1, ... OP(2, n) = 7n-6 1, 8, 15, ... OP(3, n) = 6n^2-11n+6 1, 8, 27, 58, ... OP(4, n) = n^3 1, 8, 27, 64, 125, ... Clearly no BOPs exist for k ≥ 4. By considering the sum of FITs generated by the BOPs (indicated in red above), we obtain 1 + 15 + 58 = 74. Consider the following tenth degree polynomial generating function: 1 - n + n^2 - n^3 + n^4 - n^5 + n^6 - n^7 + n^8 - n^9 + n^10 Find the sum of FITs for the BOPs. """ from __future__ import annotations from collections.abc import Callable Matrix = list[list[float | int]] def solve(matrix: Matrix, vector: Matrix) -> Matrix: """ Solve the linear system of equations Ax = b (A = "matrix", b = "vector") for x using Gaussian elimination and back substitution. We assume that A is an invertible square matrix and that b is a column vector of the same height. >>> solve([[1, 0], [0, 1]], [[1],[2]]) [[1.0], [2.0]] >>> solve([[2, 1, -1],[-3, -1, 2],[-2, 1, 2]],[[8], [-11],[-3]]) [[2.0], [3.0], [-1.0]] """ size: int = len(matrix) augmented: Matrix = [[0 for _ in range(size + 1)] for _ in range(size)] row: int row2: int col: int col2: int pivot_row: int ratio: float for row in range(size): for col in range(size): augmented[row][col] = matrix[row][col] augmented[row][size] = vector[row][0] row = 0 col = 0 while row < size and col < size: # pivoting pivot_row = max((abs(augmented[row2][col]), row2) for row2 in range(col, size))[ 1 ] if augmented[pivot_row][col] == 0: col += 1 continue else: augmented[row], augmented[pivot_row] = augmented[pivot_row], augmented[row] for row2 in range(row + 1, size): ratio = augmented[row2][col] / augmented[row][col] augmented[row2][col] = 0 for col2 in range(col + 1, size + 1): augmented[row2][col2] -= augmented[row][col2] * ratio row += 1 col += 1 # back substitution for col in range(1, size): for row in range(col): ratio = augmented[row][col] / augmented[col][col] for col2 in range(col, size + 1): augmented[row][col2] -= augmented[col][col2] * ratio # round to get rid of numbers like 2.000000000000004 return [ [round(augmented[row][size] / augmented[row][row], 10)] for row in range(size) ] def interpolate(y_list: list[int]) -> Callable[[int], int]: """ Given a list of data points (1,y0),(2,y1), ..., return a function that interpolates the data points. 
We find the coefficients of the interpolating polynomial by solving a system of linear equations corresponding to x = 1, 2, 3... >>> interpolate([1])(3) 1 >>> interpolate([1, 8])(3) 15 >>> interpolate([1, 8, 27])(4) 58 >>> interpolate([1, 8, 27, 64])(6) 216 """ size: int = len(y_list) matrix: Matrix = [[0 for _ in range(size)] for _ in range(size)] vector: Matrix = [[0] for _ in range(size)] coeffs: Matrix x_val: int y_val: int col: int for x_val, y_val in enumerate(y_list): for col in range(size): matrix[x_val][col] = (x_val + 1) ** (size - col - 1) vector[x_val][0] = y_val coeffs = solve(matrix, vector) def interpolated_func(var: int) -> int: """ >>> interpolate([1])(3) 1 >>> interpolate([1, 8])(3) 15 >>> interpolate([1, 8, 27])(4) 58 >>> interpolate([1, 8, 27, 64])(6) 216 """ return sum( round(coeffs[x_val][0]) * (var ** (size - x_val - 1)) for x_val in range(size) ) return interpolated_func def question_function(variable: int) -> int: """ The generating function u as specified in the question. >>> question_function(0) 1 >>> question_function(1) 1 >>> question_function(5) 8138021 >>> question_function(10) 9090909091 """ return ( 1 - variable + variable**2 - variable**3 + variable**4 - variable**5 + variable**6 - variable**7 + variable**8 - variable**9 + variable**10 ) def solution(func: Callable[[int], int] = question_function, order: int = 10) -> int: """ Find the sum of the FITs of the BOPS. For each interpolating polynomial of order 1, 2, ... , 10, find the first x such that the value of the polynomial at x does not equal u(x). >>> solution(lambda n: n ** 3, 3) 74 """ data_points: list[int] = [func(x_val) for x_val in range(1, order + 1)] polynomials: list[Callable[[int], int]] = [ interpolate(data_points[:max_coeff]) for max_coeff in range(1, order + 1) ] ret: int = 0 poly: Callable[[int], int] x_val: int for poly in polynomials: x_val = 1 while func(x_val) == poly(x_val): x_val += 1 ret += poly(x_val) return ret if __name__ == "__main__": print(f"{solution() = }")
""" If we are presented with the first k terms of a sequence it is impossible to say with certainty the value of the next term, as there are infinitely many polynomial functions that can model the sequence. As an example, let us consider the sequence of cube numbers. This is defined by the generating function, u(n) = n3: 1, 8, 27, 64, 125, 216, ... Suppose we were only given the first two terms of this sequence. Working on the principle that "simple is best" we should assume a linear relationship and predict the next term to be 15 (common difference 7). Even if we were presented with the first three terms, by the same principle of simplicity, a quadratic relationship should be assumed. We shall define OP(k, n) to be the nth term of the optimum polynomial generating function for the first k terms of a sequence. It should be clear that OP(k, n) will accurately generate the terms of the sequence for n ≤ k, and potentially the first incorrect term (FIT) will be OP(k, k+1); in which case we shall call it a bad OP (BOP). As a basis, if we were only given the first term of sequence, it would be most sensible to assume constancy; that is, for n ≥ 2, OP(1, n) = u(1). Hence we obtain the following OPs for the cubic sequence: OP(1, n) = 1 1, 1, 1, 1, ... OP(2, n) = 7n-6 1, 8, 15, ... OP(3, n) = 6n^2-11n+6 1, 8, 27, 58, ... OP(4, n) = n^3 1, 8, 27, 64, 125, ... Clearly no BOPs exist for k ≥ 4. By considering the sum of FITs generated by the BOPs (indicated in red above), we obtain 1 + 15 + 58 = 74. Consider the following tenth degree polynomial generating function: 1 - n + n^2 - n^3 + n^4 - n^5 + n^6 - n^7 + n^8 - n^9 + n^10 Find the sum of FITs for the BOPs. """ from __future__ import annotations from collections.abc import Callable Matrix = list[list[float | int]] def solve(matrix: Matrix, vector: Matrix) -> Matrix: """ Solve the linear system of equations Ax = b (A = "matrix", b = "vector") for x using Gaussian elimination and back substitution. We assume that A is an invertible square matrix and that b is a column vector of the same height. >>> solve([[1, 0], [0, 1]], [[1],[2]]) [[1.0], [2.0]] >>> solve([[2, 1, -1],[-3, -1, 2],[-2, 1, 2]],[[8], [-11],[-3]]) [[2.0], [3.0], [-1.0]] """ size: int = len(matrix) augmented: Matrix = [[0 for _ in range(size + 1)] for _ in range(size)] row: int row2: int col: int col2: int pivot_row: int ratio: float for row in range(size): for col in range(size): augmented[row][col] = matrix[row][col] augmented[row][size] = vector[row][0] row = 0 col = 0 while row < size and col < size: # pivoting pivot_row = max((abs(augmented[row2][col]), row2) for row2 in range(col, size))[ 1 ] if augmented[pivot_row][col] == 0: col += 1 continue else: augmented[row], augmented[pivot_row] = augmented[pivot_row], augmented[row] for row2 in range(row + 1, size): ratio = augmented[row2][col] / augmented[row][col] augmented[row2][col] = 0 for col2 in range(col + 1, size + 1): augmented[row2][col2] -= augmented[row][col2] * ratio row += 1 col += 1 # back substitution for col in range(1, size): for row in range(col): ratio = augmented[row][col] / augmented[col][col] for col2 in range(col, size + 1): augmented[row][col2] -= augmented[col][col2] * ratio # round to get rid of numbers like 2.000000000000004 return [ [round(augmented[row][size] / augmented[row][row], 10)] for row in range(size) ] def interpolate(y_list: list[int]) -> Callable[[int], int]: """ Given a list of data points (1,y0),(2,y1), ..., return a function that interpolates the data points. 
We find the coefficients of the interpolating polynomial by solving a system of linear equations corresponding to x = 1, 2, 3... >>> interpolate([1])(3) 1 >>> interpolate([1, 8])(3) 15 >>> interpolate([1, 8, 27])(4) 58 >>> interpolate([1, 8, 27, 64])(6) 216 """ size: int = len(y_list) matrix: Matrix = [[0 for _ in range(size)] for _ in range(size)] vector: Matrix = [[0] for _ in range(size)] coeffs: Matrix x_val: int y_val: int col: int for x_val, y_val in enumerate(y_list): for col in range(size): matrix[x_val][col] = (x_val + 1) ** (size - col - 1) vector[x_val][0] = y_val coeffs = solve(matrix, vector) def interpolated_func(var: int) -> int: """ >>> interpolate([1])(3) 1 >>> interpolate([1, 8])(3) 15 >>> interpolate([1, 8, 27])(4) 58 >>> interpolate([1, 8, 27, 64])(6) 216 """ return sum( round(coeffs[x_val][0]) * (var ** (size - x_val - 1)) for x_val in range(size) ) return interpolated_func def question_function(variable: int) -> int: """ The generating function u as specified in the question. >>> question_function(0) 1 >>> question_function(1) 1 >>> question_function(5) 8138021 >>> question_function(10) 9090909091 """ return ( 1 - variable + variable**2 - variable**3 + variable**4 - variable**5 + variable**6 - variable**7 + variable**8 - variable**9 + variable**10 ) def solution(func: Callable[[int], int] = question_function, order: int = 10) -> int: """ Find the sum of the FITs of the BOPS. For each interpolating polynomial of order 1, 2, ... , 10, find the first x such that the value of the polynomial at x does not equal u(x). >>> solution(lambda n: n ** 3, 3) 74 """ data_points: list[int] = [func(x_val) for x_val in range(1, order + 1)] polynomials: list[Callable[[int], int]] = [ interpolate(data_points[:max_coeff]) for max_coeff in range(1, order + 1) ] ret: int = 0 poly: Callable[[int], int] x_val: int for poly in polynomials: x_val = 1 while func(x_val) == poly(x_val): x_val += 1 ret += poly(x_val) return ret if __name__ == "__main__": print(f"{solution() = }")
-1
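For the worked example in the problem statement above (1 + 15 + 58 = 74 for the cube sequence), here is an independent sketch, assuming numpy, that reproduces the FITs with numpy.polyfit instead of the hand-rolled Gaussian elimination; names are ours.

import numpy as np


def first_incorrect_term(func, k: int) -> int:
    # OP(1, n) is the constant u(1); otherwise fit the degree-(k-1)
    # interpolant through the first k terms and evaluate it at n = k + 1,
    # the first place where it can disagree with func.
    if k == 1:
        return func(1)
    xs = np.arange(1, k + 1)
    ys = np.array([func(x) for x in xs], dtype=float)
    coeffs = np.polyfit(xs, ys, k - 1)
    return round(float(np.polyval(coeffs, k + 1)))


def cube(n: int) -> int:
    return n**3


if __name__ == "__main__":
    assert sum(first_incorrect_term(cube, k) for k in range(1, 4)) == 74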
TheAlgorithms/Python
8,551
New gitter link added or replaced
### Describe your change: Fixes #8546 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
KaixLina
"2023-03-26T04:26:40Z"
"2023-03-26T15:19:19Z"
3f9150c1b2dd15808a4962e03a1455f8d825512c
7cdb011ba440a07768179bfaea190bddefc890d8
New gitter link added or replaced. ### Describe your change: Fixes #8546 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
""" The nqueens problem is of placing N queens on a N * N chess board such that no queen can attack any other queens placed on that chess board. This means that one queen cannot have any other queen on its horizontal, vertical and diagonal lines. """ from __future__ import annotations solution = [] def is_safe(board: list[list[int]], row: int, column: int) -> bool: """ This function returns a boolean value True if it is safe to place a queen there considering the current state of the board. Parameters : board(2D matrix) : board row ,column : coordinates of the cell on a board Returns : Boolean Value """ for i in range(len(board)): if board[row][i] == 1: return False for i in range(len(board)): if board[i][column] == 1: return False for i, j in zip(range(row, -1, -1), range(column, -1, -1)): if board[i][j] == 1: return False for i, j in zip(range(row, -1, -1), range(column, len(board))): if board[i][j] == 1: return False return True def solve(board: list[list[int]], row: int) -> bool: """ It creates a state space tree and calls the safe function until it receives a False Boolean and terminates that branch and backtracks to the next possible solution branch. """ if row >= len(board): """ If the row number exceeds N we have board with a successful combination and that combination is appended to the solution list and the board is printed. """ solution.append(board) printboard(board) print() return True for i in range(len(board)): """ For every row it iterates through each column to check if it is feasible to place a queen there. If all the combinations for that particular branch are successful the board is reinitialized for the next possible combination. """ if is_safe(board, row, i): board[row][i] = 1 solve(board, row + 1) board[row][i] = 0 return False def printboard(board: list[list[int]]) -> None: """ Prints the boards that have a successful combination. """ for i in range(len(board)): for j in range(len(board)): if board[i][j] == 1: print("Q", end=" ") else: print(".", end=" ") print() # n=int(input("The no. of queens")) n = 8 board = [[0 for i in range(n)] for j in range(n)] solve(board, 0) print("The total no. of solutions are :", len(solution))
""" The nqueens problem is of placing N queens on a N * N chess board such that no queen can attack any other queens placed on that chess board. This means that one queen cannot have any other queen on its horizontal, vertical and diagonal lines. """ from __future__ import annotations solution = [] def is_safe(board: list[list[int]], row: int, column: int) -> bool: """ This function returns a boolean value True if it is safe to place a queen there considering the current state of the board. Parameters : board(2D matrix) : board row ,column : coordinates of the cell on a board Returns : Boolean Value """ for i in range(len(board)): if board[row][i] == 1: return False for i in range(len(board)): if board[i][column] == 1: return False for i, j in zip(range(row, -1, -1), range(column, -1, -1)): if board[i][j] == 1: return False for i, j in zip(range(row, -1, -1), range(column, len(board))): if board[i][j] == 1: return False return True def solve(board: list[list[int]], row: int) -> bool: """ It creates a state space tree and calls the safe function until it receives a False Boolean and terminates that branch and backtracks to the next possible solution branch. """ if row >= len(board): """ If the row number exceeds N we have board with a successful combination and that combination is appended to the solution list and the board is printed. """ solution.append(board) printboard(board) print() return True for i in range(len(board)): """ For every row it iterates through each column to check if it is feasible to place a queen there. If all the combinations for that particular branch are successful the board is reinitialized for the next possible combination. """ if is_safe(board, row, i): board[row][i] = 1 solve(board, row + 1) board[row][i] = 0 return False def printboard(board: list[list[int]]) -> None: """ Prints the boards that have a successful combination. """ for i in range(len(board)): for j in range(len(board)): if board[i][j] == 1: print("Q", end=" ") else: print(".", end=" ") print() # n=int(input("The no. of queens")) n = 8 board = [[0 for i in range(n)] for j in range(n)] solve(board, 0) print("The total no. of solutions are :", len(solution))
-1
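An independent brute-force cross-check (illustrative sketch, standard library only) for the counts produced by the backtracking script above: placing one queen per row/column as a permutation and rejecting shared diagonals gives the classic totals, e.g. 92 solutions for n = 8.

from itertools import permutations


def count_queens(n: int) -> int:
    # cols[r] is the column of the queen in row r; a permutation already
    # rules out shared rows and columns, so only diagonals need checking.
    total = 0
    for cols in permutations(range(n)):
        if all(
            abs(cols[i] - cols[j]) != j - i
            for i in range(n)
            for j in range(i + 1, n)
        ):
            total += 1
    return total


if __name__ == "__main__":
    assert [count_queens(n) for n in (4, 5, 6, 8)] == [2, 10, 4, 92]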
TheAlgorithms/Python
8,551
New gitter link added or replaced
### Describe your change: Fixes #8546 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
KaixLina
"2023-03-26T04:26:40Z"
"2023-03-26T15:19:19Z"
3f9150c1b2dd15808a4962e03a1455f8d825512c
7cdb011ba440a07768179bfaea190bddefc890d8
New gitter link added or replaced. ### Describe your change: Fixes #8546 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
BITS_TO_HEX = { "0000": "0", "0001": "1", "0010": "2", "0011": "3", "0100": "4", "0101": "5", "0110": "6", "0111": "7", "1000": "8", "1001": "9", "1010": "a", "1011": "b", "1100": "c", "1101": "d", "1110": "e", "1111": "f", } def bin_to_hexadecimal(binary_str: str) -> str: """ Converting a binary string into hexadecimal using Grouping Method >>> bin_to_hexadecimal('101011111') '0x15f' >>> bin_to_hexadecimal(' 1010 ') '0x0a' >>> bin_to_hexadecimal('-11101') '-0x1d' >>> bin_to_hexadecimal('a') Traceback (most recent call last): ... ValueError: Non-binary value was passed to the function >>> bin_to_hexadecimal('') Traceback (most recent call last): ... ValueError: Empty string was passed to the function """ # Sanitising parameter binary_str = str(binary_str).strip() # Exceptions if not binary_str: raise ValueError("Empty string was passed to the function") is_negative = binary_str[0] == "-" binary_str = binary_str[1:] if is_negative else binary_str if not all(char in "01" for char in binary_str): raise ValueError("Non-binary value was passed to the function") binary_str = ( "0" * (4 * (divmod(len(binary_str), 4)[0] + 1) - len(binary_str)) + binary_str ) hexadecimal = [] for x in range(0, len(binary_str), 4): hexadecimal.append(BITS_TO_HEX[binary_str[x : x + 4]]) hexadecimal_str = "0x" + "".join(hexadecimal) return "-" + hexadecimal_str if is_negative else hexadecimal_str if __name__ == "__main__": from doctest import testmod testmod()
BITS_TO_HEX = { "0000": "0", "0001": "1", "0010": "2", "0011": "3", "0100": "4", "0101": "5", "0110": "6", "0111": "7", "1000": "8", "1001": "9", "1010": "a", "1011": "b", "1100": "c", "1101": "d", "1110": "e", "1111": "f", } def bin_to_hexadecimal(binary_str: str) -> str: """ Converting a binary string into hexadecimal using Grouping Method >>> bin_to_hexadecimal('101011111') '0x15f' >>> bin_to_hexadecimal(' 1010 ') '0x0a' >>> bin_to_hexadecimal('-11101') '-0x1d' >>> bin_to_hexadecimal('a') Traceback (most recent call last): ... ValueError: Non-binary value was passed to the function >>> bin_to_hexadecimal('') Traceback (most recent call last): ... ValueError: Empty string was passed to the function """ # Sanitising parameter binary_str = str(binary_str).strip() # Exceptions if not binary_str: raise ValueError("Empty string was passed to the function") is_negative = binary_str[0] == "-" binary_str = binary_str[1:] if is_negative else binary_str if not all(char in "01" for char in binary_str): raise ValueError("Non-binary value was passed to the function") binary_str = ( "0" * (4 * (divmod(len(binary_str), 4)[0] + 1) - len(binary_str)) + binary_str ) hexadecimal = [] for x in range(0, len(binary_str), 4): hexadecimal.append(BITS_TO_HEX[binary_str[x : x + 4]]) hexadecimal_str = "0x" + "".join(hexadecimal) return "-" + hexadecimal_str if is_negative else hexadecimal_str if __name__ == "__main__": from doctest import testmod testmod()
-1
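The grouping method above can be cross-checked against Python's built-in integer parsing; a minimal sketch follows (independent of the file, helper name is illustrative).

def bin_to_hex_builtin(binary_str: str) -> str:
    # Parse the bits as an integer, then render the magnitude in hex.
    binary_str = binary_str.strip()
    sign, digits = ("-", binary_str[1:]) if binary_str.startswith("-") else ("", binary_str)
    return f"{sign}0x{int(digits, 2):x}"


if __name__ == "__main__":
    # Note: unlike the grouping method, this does not left-pad to whole
    # nibbles, so '1010' gives '0xa' here rather than '0x0a'.
    assert bin_to_hex_builtin("101011111") == "0x15f"
    assert bin_to_hex_builtin("-11101") == "-0x1d"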
TheAlgorithms/Python
8,551
New gitter link added or replaced
### Describe your change: Fixes #8546 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
KaixLina
"2023-03-26T04:26:40Z"
"2023-03-26T15:19:19Z"
3f9150c1b2dd15808a4962e03a1455f8d825512c
7cdb011ba440a07768179bfaea190bddefc890d8
New gitter link added or replaced. ### Describe your change: Fixes #8546 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
""" Build the quantum fourier transform (qft) for a desire number of quantum bits using Qiskit framework. This experiment run in IBM Q simulator with 10000 shots. This circuit can be use as a building block to design the Shor's algorithm in quantum computing. As well as, quantum phase estimation among others. . References: https://en.wikipedia.org/wiki/Quantum_Fourier_transform https://qiskit.org/textbook/ch-algorithms/quantum-fourier-transform.html """ import math import numpy as np import qiskit from qiskit import Aer, ClassicalRegister, QuantumCircuit, QuantumRegister, execute def quantum_fourier_transform(number_of_qubits: int = 3) -> qiskit.result.counts.Counts: """ # >>> quantum_fourier_transform(2) # {'00': 2500, '01': 2500, '11': 2500, '10': 2500} # quantum circuit for number_of_qubits = 3: ┌───┐ qr_0: ──────■──────────────────────■───────┤ H ├─X─ │ ┌───┐ │P(π/2) └───┘ │ qr_1: ──────┼────────■───────┤ H ├─■─────────────┼─ ┌───┐ │P(π/4) │P(π/2) └───┘ │ qr_2: ┤ H ├─■────────■───────────────────────────X─ └───┘ cr: 3/═════════════════════════════════════════════ Args: n : number of qubits Returns: qiskit.result.counts.Counts: distribute counts. >>> quantum_fourier_transform(2) {'00': 2500, '01': 2500, '10': 2500, '11': 2500} >>> quantum_fourier_transform(-1) Traceback (most recent call last): ... ValueError: number of qubits must be > 0. >>> quantum_fourier_transform('a') Traceback (most recent call last): ... TypeError: number of qubits must be a integer. >>> quantum_fourier_transform(100) Traceback (most recent call last): ... ValueError: number of qubits too large to simulate(>10). >>> quantum_fourier_transform(0.5) Traceback (most recent call last): ... ValueError: number of qubits must be exact integer. """ if isinstance(number_of_qubits, str): raise TypeError("number of qubits must be a integer.") if number_of_qubits <= 0: raise ValueError("number of qubits must be > 0.") if math.floor(number_of_qubits) != number_of_qubits: raise ValueError("number of qubits must be exact integer.") if number_of_qubits > 10: raise ValueError("number of qubits too large to simulate(>10).") qr = QuantumRegister(number_of_qubits, "qr") cr = ClassicalRegister(number_of_qubits, "cr") quantum_circuit = QuantumCircuit(qr, cr) counter = number_of_qubits for i in range(counter): quantum_circuit.h(number_of_qubits - i - 1) counter -= 1 for j in range(counter): quantum_circuit.cp(np.pi / 2 ** (counter - j), j, counter) for k in range(number_of_qubits // 2): quantum_circuit.swap(k, number_of_qubits - k - 1) # measure all the qubits quantum_circuit.measure(qr, cr) # simulate with 10000 shots backend = Aer.get_backend("qasm_simulator") job = execute(quantum_circuit, backend, shots=10000) return job.result().get_counts(quantum_circuit) if __name__ == "__main__": print( f"Total count for quantum fourier transform state is: \ {quantum_fourier_transform(3)}" )
""" Build the quantum fourier transform (qft) for a desire number of quantum bits using Qiskit framework. This experiment run in IBM Q simulator with 10000 shots. This circuit can be use as a building block to design the Shor's algorithm in quantum computing. As well as, quantum phase estimation among others. . References: https://en.wikipedia.org/wiki/Quantum_Fourier_transform https://qiskit.org/textbook/ch-algorithms/quantum-fourier-transform.html """ import math import numpy as np import qiskit from qiskit import Aer, ClassicalRegister, QuantumCircuit, QuantumRegister, execute def quantum_fourier_transform(number_of_qubits: int = 3) -> qiskit.result.counts.Counts: """ # >>> quantum_fourier_transform(2) # {'00': 2500, '01': 2500, '11': 2500, '10': 2500} # quantum circuit for number_of_qubits = 3: ┌───┐ qr_0: ──────■──────────────────────■───────┤ H ├─X─ │ ┌───┐ │P(π/2) └───┘ │ qr_1: ──────┼────────■───────┤ H ├─■─────────────┼─ ┌───┐ │P(π/4) │P(π/2) └───┘ │ qr_2: ┤ H ├─■────────■───────────────────────────X─ └───┘ cr: 3/═════════════════════════════════════════════ Args: n : number of qubits Returns: qiskit.result.counts.Counts: distribute counts. >>> quantum_fourier_transform(2) {'00': 2500, '01': 2500, '10': 2500, '11': 2500} >>> quantum_fourier_transform(-1) Traceback (most recent call last): ... ValueError: number of qubits must be > 0. >>> quantum_fourier_transform('a') Traceback (most recent call last): ... TypeError: number of qubits must be a integer. >>> quantum_fourier_transform(100) Traceback (most recent call last): ... ValueError: number of qubits too large to simulate(>10). >>> quantum_fourier_transform(0.5) Traceback (most recent call last): ... ValueError: number of qubits must be exact integer. """ if isinstance(number_of_qubits, str): raise TypeError("number of qubits must be a integer.") if number_of_qubits <= 0: raise ValueError("number of qubits must be > 0.") if math.floor(number_of_qubits) != number_of_qubits: raise ValueError("number of qubits must be exact integer.") if number_of_qubits > 10: raise ValueError("number of qubits too large to simulate(>10).") qr = QuantumRegister(number_of_qubits, "qr") cr = ClassicalRegister(number_of_qubits, "cr") quantum_circuit = QuantumCircuit(qr, cr) counter = number_of_qubits for i in range(counter): quantum_circuit.h(number_of_qubits - i - 1) counter -= 1 for j in range(counter): quantum_circuit.cp(np.pi / 2 ** (counter - j), j, counter) for k in range(number_of_qubits // 2): quantum_circuit.swap(k, number_of_qubits - k - 1) # measure all the qubits quantum_circuit.measure(qr, cr) # simulate with 10000 shots backend = Aer.get_backend("qasm_simulator") job = execute(quantum_circuit, backend, shots=10000) return job.result().get_counts(quantum_circuit) if __name__ == "__main__": print( f"Total count for quantum fourier transform state is: \ {quantum_fourier_transform(3)}" )
-1
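As a side note on the quantum_fourier_transform record above: the even spread of measurement counts can be sanity-checked without a simulator by building the QFT matrix directly in NumPy. The sketch below is ours (the helper name qft_matrix is hypothetical and not part of the file); it only assumes the textbook definition of the n-qubit QFT as the normalized DFT matrix.

# Minimal sketch: the n-qubit QFT acts as the unitary with entries
# omega**(j*k) / sqrt(N), where N = 2**n and omega = exp(2*pi*i / N).
import numpy as np


def qft_matrix(number_of_qubits: int) -> np.ndarray:
    n = 2**number_of_qubits
    omega = np.exp(2j * np.pi / n)
    j, k = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
    return omega ** (j * k) / np.sqrt(n)


if __name__ == "__main__":
    f = qft_matrix(3)
    assert np.allclose(f @ f.conj().T, np.eye(8))  # unitarity check
    state = np.zeros(8)
    state[0] = 1.0  # |000>
    # The QFT of |000> is the uniform superposition, which is why the simulated
    # counts above come out (approximately) evenly distributed over all outcomes.
    print(np.round(f @ state, 3))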
TheAlgorithms/Python
8,551
New gitter link added or replaced
### Describe your change: Fixes #8546 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
KaixLina
"2023-03-26T04:26:40Z"
"2023-03-26T15:19:19Z"
3f9150c1b2dd15808a4962e03a1455f8d825512c
7cdb011ba440a07768179bfaea190bddefc890d8
New gitter link added or replaced. ### Describe your change: Fixes #8546 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
""" The sum-of-subsetsproblem states that a set of non-negative integers, and a value M, determine all possible subsets of the given set whose summation sum equal to given M. Summation of the chosen numbers must be equal to given number M and one number can be used only once. """ from __future__ import annotations def generate_sum_of_subsets_soln(nums: list[int], max_sum: int) -> list[list[int]]: result: list[list[int]] = [] path: list[int] = [] num_index = 0 remaining_nums_sum = sum(nums) create_state_space_tree(nums, max_sum, num_index, path, result, remaining_nums_sum) return result def create_state_space_tree( nums: list[int], max_sum: int, num_index: int, path: list[int], result: list[list[int]], remaining_nums_sum: int, ) -> None: """ Creates a state space tree to iterate through each branch using DFS. It terminates the branching of a node when any of the two conditions given below satisfy. This algorithm follows depth-fist-search and backtracks when the node is not branchable. """ if sum(path) > max_sum or (remaining_nums_sum + sum(path)) < max_sum: return if sum(path) == max_sum: result.append(path) return for index in range(num_index, len(nums)): create_state_space_tree( nums, max_sum, index + 1, [*path, nums[index]], result, remaining_nums_sum - nums[index], ) """ remove the comment to take an input from the user print("Enter the elements") nums = list(map(int, input().split())) print("Enter max_sum sum") max_sum = int(input()) """ nums = [3, 34, 4, 12, 5, 2] max_sum = 9 result = generate_sum_of_subsets_soln(nums, max_sum) print(*result)
""" The sum-of-subsetsproblem states that a set of non-negative integers, and a value M, determine all possible subsets of the given set whose summation sum equal to given M. Summation of the chosen numbers must be equal to given number M and one number can be used only once. """ from __future__ import annotations def generate_sum_of_subsets_soln(nums: list[int], max_sum: int) -> list[list[int]]: result: list[list[int]] = [] path: list[int] = [] num_index = 0 remaining_nums_sum = sum(nums) create_state_space_tree(nums, max_sum, num_index, path, result, remaining_nums_sum) return result def create_state_space_tree( nums: list[int], max_sum: int, num_index: int, path: list[int], result: list[list[int]], remaining_nums_sum: int, ) -> None: """ Creates a state space tree to iterate through each branch using DFS. It terminates the branching of a node when any of the two conditions given below satisfy. This algorithm follows depth-fist-search and backtracks when the node is not branchable. """ if sum(path) > max_sum or (remaining_nums_sum + sum(path)) < max_sum: return if sum(path) == max_sum: result.append(path) return for index in range(num_index, len(nums)): create_state_space_tree( nums, max_sum, index + 1, [*path, nums[index]], result, remaining_nums_sum - nums[index], ) """ remove the comment to take an input from the user print("Enter the elements") nums = list(map(int, input().split())) print("Enter max_sum sum") max_sum = int(input()) """ nums = [3, 34, 4, 12, 5, 2] max_sum = 9 result = generate_sum_of_subsets_soln(nums, max_sum) print(*result)
-1
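For the sum-of-subsets record above, the backtracking output on the demo input can be cross-checked with a brute-force enumeration; the snippet below is an illustrative sketch of ours (the name subset_sums_bruteforce is hypothetical, not in the original file).

# Brute-force reference: enumerate every combination and keep the ones whose
# sum hits the target. Exponential, but fine for the 6-element demo input.
from itertools import combinations


def subset_sums_bruteforce(nums: list[int], max_sum: int) -> list[list[int]]:
    hits = []
    for r in range(1, len(nums) + 1):
        for combo in combinations(nums, r):
            if sum(combo) == max_sum:
                hits.append(list(combo))
    return hits


if __name__ == "__main__":
    print(subset_sums_bruteforce([3, 34, 4, 12, 5, 2], 9))  # [[4, 5], [3, 4, 2]]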
TheAlgorithms/Python
8,551
New gitter link added or replaced
### Describe your change: Fixes #8546 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
KaixLina
"2023-03-26T04:26:40Z"
"2023-03-26T15:19:19Z"
3f9150c1b2dd15808a4962e03a1455f8d825512c
7cdb011ba440a07768179bfaea190bddefc890d8
New gitter link added or replaced. ### Describe your change: Fixes #8546 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
# https://www.tutorialspoint.com/python3/bitwise_operators_example.htm def binary_or(a: int, b: int) -> str: """ Take in 2 integers, convert them to binary, and return a binary number that is the result of a binary or operation on the integers provided. >>> binary_or(25, 32) '0b111001' >>> binary_or(37, 50) '0b110111' >>> binary_or(21, 30) '0b11111' >>> binary_or(58, 73) '0b1111011' >>> binary_or(0, 255) '0b11111111' >>> binary_or(0, 256) '0b100000000' >>> binary_or(0, -1) Traceback (most recent call last): ... ValueError: the value of both inputs must be positive >>> binary_or(0, 1.1) Traceback (most recent call last): ... TypeError: 'float' object cannot be interpreted as an integer >>> binary_or("0", "1") Traceback (most recent call last): ... TypeError: '<' not supported between instances of 'str' and 'int' """ if a < 0 or b < 0: raise ValueError("the value of both inputs must be positive") a_binary = str(bin(a))[2:] # remove the leading "0b" b_binary = str(bin(b))[2:] max_len = max(len(a_binary), len(b_binary)) return "0b" + "".join( str(int("1" in (char_a, char_b))) for char_a, char_b in zip(a_binary.zfill(max_len), b_binary.zfill(max_len)) ) if __name__ == "__main__": import doctest doctest.testmod()
# https://www.tutorialspoint.com/python3/bitwise_operators_example.htm def binary_or(a: int, b: int) -> str: """ Take in 2 integers, convert them to binary, and return a binary number that is the result of a binary or operation on the integers provided. >>> binary_or(25, 32) '0b111001' >>> binary_or(37, 50) '0b110111' >>> binary_or(21, 30) '0b11111' >>> binary_or(58, 73) '0b1111011' >>> binary_or(0, 255) '0b11111111' >>> binary_or(0, 256) '0b100000000' >>> binary_or(0, -1) Traceback (most recent call last): ... ValueError: the value of both inputs must be positive >>> binary_or(0, 1.1) Traceback (most recent call last): ... TypeError: 'float' object cannot be interpreted as an integer >>> binary_or("0", "1") Traceback (most recent call last): ... TypeError: '<' not supported between instances of 'str' and 'int' """ if a < 0 or b < 0: raise ValueError("the value of both inputs must be positive") a_binary = str(bin(a))[2:] # remove the leading "0b" b_binary = str(bin(b))[2:] max_len = max(len(a_binary), len(b_binary)) return "0b" + "".join( str(int("1" in (char_a, char_b))) for char_a, char_b in zip(a_binary.zfill(max_len), b_binary.zfill(max_len)) ) if __name__ == "__main__": import doctest doctest.testmod()
-1
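The binary_or record above builds the result character by character; for non-negative integers the same answer falls out of Python's built-in `|` operator, which makes a handy cross-check. The comparison below is ours, not part of the original doctests.

# Cross-check: OR-ing the zero-padded bit strings is exactly bitwise OR,
# so bin(a | b) should reproduce the doctest outputs above.
def binary_or_builtin(a: int, b: int) -> str:
    if a < 0 or b < 0:
        raise ValueError("the value of both inputs must be positive")
    return bin(a | b)


if __name__ == "__main__":
    print(binary_or_builtin(25, 32))  # 0b111001
    print(binary_or_builtin(37, 50))  # 0b110111
    print(binary_or_builtin(0, 255))  # 0b11111111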
TheAlgorithms/Python
8,551
New gitter link added or replaced
### Describe your change: Fixes #8546 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
KaixLina
"2023-03-26T04:26:40Z"
"2023-03-26T15:19:19Z"
3f9150c1b2dd15808a4962e03a1455f8d825512c
7cdb011ba440a07768179bfaea190bddefc890d8
New gitter link added or replaced. ### Describe your change: Fixes #8546 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
""" This is a pure Python implementation of the bogosort algorithm, also known as permutation sort, stupid sort, slowsort, shotgun sort, or monkey sort. Bogosort generates random permutations until it guesses the correct one. More info on: https://en.wikipedia.org/wiki/Bogosort For doctests run following command: python -m doctest -v bogo_sort.py or python3 -m doctest -v bogo_sort.py For manual testing run: python bogo_sort.py """ import random def bogo_sort(collection): """Pure implementation of the bogosort algorithm in Python :param collection: some mutable ordered collection with heterogeneous comparable items inside :return: the same collection ordered by ascending Examples: >>> bogo_sort([0, 5, 3, 2, 2]) [0, 2, 2, 3, 5] >>> bogo_sort([]) [] >>> bogo_sort([-2, -5, -45]) [-45, -5, -2] """ def is_sorted(collection): for i in range(len(collection) - 1): if collection[i] > collection[i + 1]: return False return True while not is_sorted(collection): random.shuffle(collection) return collection if __name__ == "__main__": user_input = input("Enter numbers separated by a comma:\n").strip() unsorted = [int(item) for item in user_input.split(",")] print(bogo_sort(unsorted))
""" This is a pure Python implementation of the bogosort algorithm, also known as permutation sort, stupid sort, slowsort, shotgun sort, or monkey sort. Bogosort generates random permutations until it guesses the correct one. More info on: https://en.wikipedia.org/wiki/Bogosort For doctests run following command: python -m doctest -v bogo_sort.py or python3 -m doctest -v bogo_sort.py For manual testing run: python bogo_sort.py """ import random def bogo_sort(collection): """Pure implementation of the bogosort algorithm in Python :param collection: some mutable ordered collection with heterogeneous comparable items inside :return: the same collection ordered by ascending Examples: >>> bogo_sort([0, 5, 3, 2, 2]) [0, 2, 2, 3, 5] >>> bogo_sort([]) [] >>> bogo_sort([-2, -5, -45]) [-45, -5, -2] """ def is_sorted(collection): for i in range(len(collection) - 1): if collection[i] > collection[i + 1]: return False return True while not is_sorted(collection): random.shuffle(collection) return collection if __name__ == "__main__": user_input = input("Enter numbers separated by a comma:\n").strip() unsorted = [int(item) for item in user_input.split(",")] print(bogo_sort(unsorted))
-1
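Because bogosort above keeps shuffling until it stumbles on a sorted order, its expected cost grows with the number of permutations. A deterministic cousin that walks itertools.permutations makes that cost visible; the sketch and the name permutation_sort are ours, not part of the file.

# Deterministic cousin of bogosort: try permutations in lexicographic order
# until a sorted one appears. Still O(n * n!) in the worst case.
from itertools import permutations


def permutation_sort(collection: list[int]) -> list[int]:
    for candidate in permutations(collection):
        if all(candidate[i] <= candidate[i + 1] for i in range(len(candidate) - 1)):
            return list(candidate)
    return list(collection)  # never reached: a sorted permutation always exists


if __name__ == "__main__":
    print(permutation_sort([0, 5, 3, 2, 2]))  # [0, 2, 2, 3, 5]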
TheAlgorithms/Python
8,551
New gitter link added or replaced
### Describe your change: Fixes #8546 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
KaixLina
"2023-03-26T04:26:40Z"
"2023-03-26T15:19:19Z"
3f9150c1b2dd15808a4962e03a1455f8d825512c
7cdb011ba440a07768179bfaea190bddefc890d8
New gitter link added or replaced. ### Describe your change: Fixes #8546 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
from scipy.constants import g """ Finding the gravitational potential energy of an object with reference to the earth,by taking its mass and height above the ground as input Description : Gravitational energy or gravitational potential energy is the potential energy a massive object has in relation to another massive object due to gravity. It is the potential energy associated with the gravitational field, which is released (converted into kinetic energy) when the objects fall towards each other. Gravitational potential energy increases when two objects are brought further apart. For two pairwise interacting point particles, the gravitational potential energy U is given by U=-GMm/R where M and m are the masses of the two particles, R is the distance between them, and G is the gravitational constant. Close to the Earth's surface, the gravitational field is approximately constant, and the gravitational potential energy of an object reduces to U=mgh where m is the object's mass, g=GM/R² is the gravity of Earth, and h is the height of the object's center of mass above a chosen reference level. Reference : "https://en.m.wikipedia.org/wiki/Gravitational_energy" """ def potential_energy(mass: float, height: float) -> float: # function will accept mass and height as parameters and return potential energy """ >>> potential_energy(10,10) 980.665 >>> potential_energy(0,5) 0.0 >>> potential_energy(8,0) 0.0 >>> potential_energy(10,5) 490.3325 >>> potential_energy(0,0) 0.0 >>> potential_energy(2,8) 156.9064 >>> potential_energy(20,100) 19613.3 """ if mass < 0: # handling of negative values of mass raise ValueError("The mass of a body cannot be negative") if height < 0: # handling of negative values of height raise ValueError("The height above the ground cannot be negative") return mass * g * height if __name__ == "__main__": from doctest import testmod testmod(name="potential_energy")
from scipy.constants import g """ Finding the gravitational potential energy of an object with reference to the earth,by taking its mass and height above the ground as input Description : Gravitational energy or gravitational potential energy is the potential energy a massive object has in relation to another massive object due to gravity. It is the potential energy associated with the gravitational field, which is released (converted into kinetic energy) when the objects fall towards each other. Gravitational potential energy increases when two objects are brought further apart. For two pairwise interacting point particles, the gravitational potential energy U is given by U=-GMm/R where M and m are the masses of the two particles, R is the distance between them, and G is the gravitational constant. Close to the Earth's surface, the gravitational field is approximately constant, and the gravitational potential energy of an object reduces to U=mgh where m is the object's mass, g=GM/R² is the gravity of Earth, and h is the height of the object's center of mass above a chosen reference level. Reference : "https://en.m.wikipedia.org/wiki/Gravitational_energy" """ def potential_energy(mass: float, height: float) -> float: # function will accept mass and height as parameters and return potential energy """ >>> potential_energy(10,10) 980.665 >>> potential_energy(0,5) 0.0 >>> potential_energy(8,0) 0.0 >>> potential_energy(10,5) 490.3325 >>> potential_energy(0,0) 0.0 >>> potential_energy(2,8) 156.9064 >>> potential_energy(20,100) 19613.3 """ if mass < 0: # handling of negative values of mass raise ValueError("The mass of a body cannot be negative") if height < 0: # handling of negative values of height raise ValueError("The height above the ground cannot be negative") return mass * g * height if __name__ == "__main__": from doctest import testmod testmod(name="potential_energy")
-1
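The potential_energy record above only needs the standard-gravity constant from SciPy; the same numbers can be reproduced with g hard-coded to 9.80665 m/s², which is the value scipy.constants.g resolves to. The SciPy-free variant below and its name are ours.

# U = m * g * h with the standard gravity value written out explicitly;
# e.g. 10 * 9.80665 * 10 = 980.665, matching the doctest in the file above.
G_STANDARD = 9.80665  # m/s^2


def potential_energy_no_scipy(mass: float, height: float) -> float:
    if mass < 0:
        raise ValueError("The mass of a body cannot be negative")
    if height < 0:
        raise ValueError("The height above the ground cannot be negative")
    return mass * G_STANDARD * height


if __name__ == "__main__":
    print(potential_energy_no_scipy(10, 10))  # 980.665
    print(potential_energy_no_scipy(2, 8))  # 156.9064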
TheAlgorithms/Python
8,551
New gitter link added or replaced
### Describe your change: Fixes #8546 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
KaixLina
"2023-03-26T04:26:40Z"
"2023-03-26T15:19:19Z"
3f9150c1b2dd15808a4962e03a1455f8d825512c
7cdb011ba440a07768179bfaea190bddefc890d8
New gitter link added or replaced. ### Describe your change: Fixes #8546 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
""" * Author: Manuel Di Lullo (https://github.com/manueldilullo) * Description: Convert a number to use the correct SI or Binary unit prefix. Inspired by prefix_conversion.py file in this repository by lance-pyles URL: https://en.wikipedia.org/wiki/Metric_prefix#List_of_SI_prefixes URL: https://en.wikipedia.org/wiki/Binary_prefix """ from __future__ import annotations from enum import Enum, unique from typing import TypeVar # Create a generic variable that can be 'Enum', or any subclass. T = TypeVar("T", bound="Enum") @unique class BinaryUnit(Enum): yotta = 80 zetta = 70 exa = 60 peta = 50 tera = 40 giga = 30 mega = 20 kilo = 10 @unique class SIUnit(Enum): yotta = 24 zetta = 21 exa = 18 peta = 15 tera = 12 giga = 9 mega = 6 kilo = 3 hecto = 2 deca = 1 deci = -1 centi = -2 milli = -3 micro = -6 nano = -9 pico = -12 femto = -15 atto = -18 zepto = -21 yocto = -24 @classmethod def get_positive(cls: type[T]) -> dict: """ Returns a dictionary with only the elements of this enum that has a positive value >>> from itertools import islice >>> positive = SIUnit.get_positive() >>> inc = iter(positive.items()) >>> dict(islice(inc, len(positive) // 2)) {'yotta': 24, 'zetta': 21, 'exa': 18, 'peta': 15, 'tera': 12} >>> dict(inc) {'giga': 9, 'mega': 6, 'kilo': 3, 'hecto': 2, 'deca': 1} """ return {unit.name: unit.value for unit in cls if unit.value > 0} @classmethod def get_negative(cls: type[T]) -> dict: """ Returns a dictionary with only the elements of this enum that has a negative value @example >>> from itertools import islice >>> negative = SIUnit.get_negative() >>> inc = iter(negative.items()) >>> dict(islice(inc, len(negative) // 2)) {'deci': -1, 'centi': -2, 'milli': -3, 'micro': -6, 'nano': -9} >>> dict(inc) {'pico': -12, 'femto': -15, 'atto': -18, 'zepto': -21, 'yocto': -24} """ return {unit.name: unit.value for unit in cls if unit.value < 0} def add_si_prefix(value: float) -> str: """ Function that converts a number to his version with SI prefix @input value (an integer) @example: >>> add_si_prefix(10000) '10.0 kilo' """ prefixes = SIUnit.get_positive() if value > 0 else SIUnit.get_negative() for name_prefix, value_prefix in prefixes.items(): numerical_part = value / (10**value_prefix) if numerical_part > 1: return f"{str(numerical_part)} {name_prefix}" return str(value) def add_binary_prefix(value: float) -> str: """ Function that converts a number to his version with Binary prefix @input value (an integer) @example: >>> add_binary_prefix(65536) '64.0 kilo' """ for prefix in BinaryUnit: numerical_part = value / (2**prefix.value) if numerical_part > 1: return f"{str(numerical_part)} {prefix.name}" return str(value) if __name__ == "__main__": import doctest doctest.testmod()
""" * Author: Manuel Di Lullo (https://github.com/manueldilullo) * Description: Convert a number to use the correct SI or Binary unit prefix. Inspired by prefix_conversion.py file in this repository by lance-pyles URL: https://en.wikipedia.org/wiki/Metric_prefix#List_of_SI_prefixes URL: https://en.wikipedia.org/wiki/Binary_prefix """ from __future__ import annotations from enum import Enum, unique from typing import TypeVar # Create a generic variable that can be 'Enum', or any subclass. T = TypeVar("T", bound="Enum") @unique class BinaryUnit(Enum): yotta = 80 zetta = 70 exa = 60 peta = 50 tera = 40 giga = 30 mega = 20 kilo = 10 @unique class SIUnit(Enum): yotta = 24 zetta = 21 exa = 18 peta = 15 tera = 12 giga = 9 mega = 6 kilo = 3 hecto = 2 deca = 1 deci = -1 centi = -2 milli = -3 micro = -6 nano = -9 pico = -12 femto = -15 atto = -18 zepto = -21 yocto = -24 @classmethod def get_positive(cls: type[T]) -> dict: """ Returns a dictionary with only the elements of this enum that has a positive value >>> from itertools import islice >>> positive = SIUnit.get_positive() >>> inc = iter(positive.items()) >>> dict(islice(inc, len(positive) // 2)) {'yotta': 24, 'zetta': 21, 'exa': 18, 'peta': 15, 'tera': 12} >>> dict(inc) {'giga': 9, 'mega': 6, 'kilo': 3, 'hecto': 2, 'deca': 1} """ return {unit.name: unit.value for unit in cls if unit.value > 0} @classmethod def get_negative(cls: type[T]) -> dict: """ Returns a dictionary with only the elements of this enum that has a negative value @example >>> from itertools import islice >>> negative = SIUnit.get_negative() >>> inc = iter(negative.items()) >>> dict(islice(inc, len(negative) // 2)) {'deci': -1, 'centi': -2, 'milli': -3, 'micro': -6, 'nano': -9} >>> dict(inc) {'pico': -12, 'femto': -15, 'atto': -18, 'zepto': -21, 'yocto': -24} """ return {unit.name: unit.value for unit in cls if unit.value < 0} def add_si_prefix(value: float) -> str: """ Function that converts a number to his version with SI prefix @input value (an integer) @example: >>> add_si_prefix(10000) '10.0 kilo' """ prefixes = SIUnit.get_positive() if value > 0 else SIUnit.get_negative() for name_prefix, value_prefix in prefixes.items(): numerical_part = value / (10**value_prefix) if numerical_part > 1: return f"{str(numerical_part)} {name_prefix}" return str(value) def add_binary_prefix(value: float) -> str: """ Function that converts a number to his version with Binary prefix @input value (an integer) @example: >>> add_binary_prefix(65536) '64.0 kilo' """ for prefix in BinaryUnit: numerical_part = value / (2**prefix.value) if numerical_part > 1: return f"{str(numerical_part)} {prefix.name}" return str(value) if __name__ == "__main__": import doctest doctest.testmod()
-1
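The prefix-conversion record above walks the enum from the largest prefix down and stops at the first mantissa greater than 1; the stand-alone sketch below shows the same idea with a plain dict (the names SI_POSITIVE and with_si_prefix are ours, not part of the file).

# Same strategy as add_si_prefix above, reduced to a dict ordered from the
# largest exponent down to the smallest positive one.
SI_POSITIVE = {
    "yotta": 24, "zetta": 21, "exa": 18, "peta": 15, "tera": 12,
    "giga": 9, "mega": 6, "kilo": 3, "hecto": 2, "deca": 1,
}


def with_si_prefix(value: float) -> str:
    for name, exponent in SI_POSITIVE.items():
        mantissa = value / (10**exponent)
        if mantissa > 1:
            return f"{mantissa} {name}"
    return str(value)


if __name__ == "__main__":
    print(with_si_prefix(10_000))  # 10.0 kilo
    print(with_si_prefix(3_500_000))  # 3.5 mega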
TheAlgorithms/Python
8,551
New gitter link added or replaced
### Describe your change: Fixes #8546 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
KaixLina
"2023-03-26T04:26:40Z"
"2023-03-26T15:19:19Z"
3f9150c1b2dd15808a4962e03a1455f8d825512c
7cdb011ba440a07768179bfaea190bddefc890d8
New gitter link added or replaced. ### Describe your change: Fixes #8546 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
""" A XOR Gate is a logic gate in boolean algebra which results to 1 (True) if only one of the two inputs is 1, and 0 (False) if an even number of inputs are 1. Following is the truth table of a XOR Gate: ------------------------------ | Input 1 | Input 2 | Output | ------------------------------ | 0 | 0 | 0 | | 0 | 1 | 1 | | 1 | 0 | 1 | | 1 | 1 | 0 | ------------------------------ Refer - https://www.geeksforgeeks.org/logic-gates-in-python/ """ def xor_gate(input_1: int, input_2: int) -> int: """ calculate xor of the input values >>> xor_gate(0, 0) 0 >>> xor_gate(0, 1) 1 >>> xor_gate(1, 0) 1 >>> xor_gate(1, 1) 0 """ return (input_1, input_2).count(0) % 2 def test_xor_gate() -> None: """ Tests the xor_gate function """ assert xor_gate(0, 0) == 0 assert xor_gate(0, 1) == 1 assert xor_gate(1, 0) == 1 assert xor_gate(1, 1) == 0 if __name__ == "__main__": print(xor_gate(0, 0)) print(xor_gate(0, 1))
""" A XOR Gate is a logic gate in boolean algebra which results to 1 (True) if only one of the two inputs is 1, and 0 (False) if an even number of inputs are 1. Following is the truth table of a XOR Gate: ------------------------------ | Input 1 | Input 2 | Output | ------------------------------ | 0 | 0 | 0 | | 0 | 1 | 1 | | 1 | 0 | 1 | | 1 | 1 | 0 | ------------------------------ Refer - https://www.geeksforgeeks.org/logic-gates-in-python/ """ def xor_gate(input_1: int, input_2: int) -> int: """ calculate xor of the input values >>> xor_gate(0, 0) 0 >>> xor_gate(0, 1) 1 >>> xor_gate(1, 0) 1 >>> xor_gate(1, 1) 0 """ return (input_1, input_2).count(0) % 2 def test_xor_gate() -> None: """ Tests the xor_gate function """ assert xor_gate(0, 0) == 0 assert xor_gate(0, 1) == 1 assert xor_gate(1, 0) == 1 assert xor_gate(1, 1) == 0 if __name__ == "__main__": print(xor_gate(0, 0)) print(xor_gate(0, 1))
-1
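The xor_gate record above computes XOR by counting zeros modulo 2; for 0/1 inputs this coincides with Python's bitwise ^ operator, which gives a one-line cross-check. The snippet and the name xor_gate_bitwise are ours, not part of the file.

# Cross-check: for inputs restricted to 0 and 1, (a, b).count(0) % 2 == a ^ b.
def xor_gate_bitwise(input_1: int, input_2: int) -> int:
    return input_1 ^ input_2


if __name__ == "__main__":
    for a in (0, 1):
        for b in (0, 1):
            assert xor_gate_bitwise(a, b) == (a, b).count(0) % 2
    print([(a, b, a ^ b) for a in (0, 1) for b in (0, 1)])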
TheAlgorithms/Python
8,551
New gitter link added or replaced
### Describe your change: Fixes #8546 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
KaixLina
"2023-03-26T04:26:40Z"
"2023-03-26T15:19:19Z"
3f9150c1b2dd15808a4962e03a1455f8d825512c
7cdb011ba440a07768179bfaea190bddefc890d8
New gitter link added or replaced. ### Describe your change: Fixes #8546 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
""" https://en.wikipedia.org/wiki/Floor_and_ceiling_functions """ def ceil(x: float) -> int: """ Return the ceiling of x as an Integral. :param x: the number :return: the smallest integer >= x. >>> import math >>> all(ceil(n) == math.ceil(n) for n ... in (1, -1, 0, -0, 1.1, -1.1, 1.0, -1.0, 1_000_000_000)) True """ return int(x) if x - int(x) <= 0 else int(x) + 1 if __name__ == "__main__": import doctest doctest.testmod()
""" https://en.wikipedia.org/wiki/Floor_and_ceiling_functions """ def ceil(x: float) -> int: """ Return the ceiling of x as an Integral. :param x: the number :return: the smallest integer >= x. >>> import math >>> all(ceil(n) == math.ceil(n) for n ... in (1, -1, 0, -0, 1.1, -1.1, 1.0, -1.0, 1_000_000_000)) True """ return int(x) if x - int(x) <= 0 else int(x) + 1 if __name__ == "__main__": import doctest doctest.testmod()
-1
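The ceil record above branches on the fractional part; the same result can be obtained branch-free via floor division and checked against math.ceil on the doctest's sample values. The sketch and the name ceil_via_floordiv are ours, not part of the file.

# Branch-free alternative: ceil(x) == -((-x) // 1) for real numbers.
import math


def ceil_via_floordiv(x: float) -> int:
    return int(-(-x // 1))


if __name__ == "__main__":
    samples = (1, -1, 0, -0, 1.1, -1.1, 1.0, -1.0, 1_000_000_000)
    assert all(ceil_via_floordiv(n) == math.ceil(n) for n in samples)
    print([ceil_via_floordiv(n) for n in samples])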