| Column | Type | Stats |
| --- | --- | --- |
| repo_name | stringclasses | 1 value |
| pr_number | int64 | 4.12k to 11.2k |
| pr_title | stringlengths | 9 to 107 |
| pr_description | stringlengths | 107 to 5.48k |
| author | stringlengths | 4 to 18 |
| date_created | unknown | |
| date_merged | unknown | |
| previous_commit | stringlengths | 40 to 40 |
| pr_commit | stringlengths | 40 to 40 |
| query | stringlengths | 118 to 5.52k |
| before_content | stringlengths | 0 to 7.93M |
| after_content | stringlengths | 0 to 7.93M |
| label | int64 | -1 to 1 |
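The rows that follow repeat this layout once per candidate file for the same PR. As a quick orientation, here is a minimal sketch of how records with this schema could be loaded and inspected; the dataset path is a hypothetical placeholder, and reading `label == 1` as "file changed by the PR" and `-1` as a negative example is an assumption inferred from the schema, not something stated in this dump.

```python
# Minimal sketch, assuming the records above are published as a Hugging Face
# dataset. "your-org/pr-file-relevance" is a hypothetical placeholder path,
# and the label semantics (1 = positive, -1 = negative) are an assumption.
from datasets import load_dataset

ds = load_dataset("your-org/pr-file-relevance", split="train")  # placeholder path

row = ds[0]
print(row["repo_name"], row["pr_number"], row["pr_title"])
print(len(row["before_content"]), len(row["after_content"]), row["label"])

# Keep only the rows assumed to be positives (label == 1).
positives = ds.filter(lambda r: r["label"] == 1)
print(f"{len(positives)} positive rows out of {len(ds)}")
```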
TheAlgorithms/Python
7,932
refactor: Move pascals triange to maths/
### Describe your change: Moves the pascal triangle from `other/` to `maths/` because it is very much math related and should 100% be there, next to `perfect_cube`, `perfect_number`, etc * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
CaedenPH
"2022-11-01T07:28:48Z"
"2022-11-01T19:25:39Z"
4e6c1c049dffdc984232fe1fce1e4791fc527d11
f512b4d105b6f3188deced19761b6ed288378f0d
refactor: Move pascals triange to maths/. ### Describe your change: Moves the pascal triangle from `other/` to `maths/` because it is very much math related and should 100% be there, next to `perfect_cube`, `perfect_number`, etc * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
if __name__ == "__main__":
    import socket  # Import socket module

    sock = socket.socket()  # Create a socket object
    host = socket.gethostname()  # Get local machine name
    port = 12312

    sock.connect((host, port))
    sock.send(b"Hello server!")

    with open("Received_file", "wb") as out_file:
        print("File opened")
        print("Receiving data...")
        while True:
            data = sock.recv(1024)
            print(f"{data = }")
            if not data:
                break
            out_file.write(data)  # Write data to a file

    print("Successfully got the file")
    sock.close()
    print("Connection closed")
if __name__ == "__main__":
    import socket  # Import socket module

    sock = socket.socket()  # Create a socket object
    host = socket.gethostname()  # Get local machine name
    port = 12312

    sock.connect((host, port))
    sock.send(b"Hello server!")

    with open("Received_file", "wb") as out_file:
        print("File opened")
        print("Receiving data...")
        while True:
            data = sock.recv(1024)
            print(f"{data = }")
            if not data:
                break
            out_file.write(data)  # Write data to a file

    print("Successfully got the file")
    sock.close()
    print("Connection closed")
-1
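The first record above pairs the PR query with a file-transfer script whose before and after contents are identical, consistent with its -1 label. Below is a small illustrative sketch of turning such a record into a scored (query, document) pair; the field names come from the schema, while the helper name and the 0/1 relevance mapping are assumptions made for illustration only.

```python
# Illustrative only: convert one record (shaped like the row above) into a
# retrieval-style training pair. Field names follow the schema; the helper
# name and the 0/1 mapping of the -1/1 labels are assumptions.
def to_training_pair(record: dict) -> dict:
    return {
        "query": record["query"],                      # PR title + description
        "document": record["before_content"],          # candidate file before the PR
        "relevant": 1 if record["label"] == 1 else 0,  # assumed label semantics
    }


example = {
    "query": "refactor: Move pascals triange to maths/. ...",
    "before_content": 'if __name__ == "__main__": ...',
    "label": -1,
}
print(to_training_pair(example)["relevant"])  # -> 0
```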
TheAlgorithms/Python
7,932
refactor: Move pascals triange to maths/
### Describe your change: Moves the pascal triangle from `other/` to `maths/` because it is very much math related and should 100% be there, next to `perfect_cube`, `perfect_number`, etc * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
CaedenPH
"2022-11-01T07:28:48Z"
"2022-11-01T19:25:39Z"
4e6c1c049dffdc984232fe1fce1e4791fc527d11
f512b4d105b6f3188deced19761b6ed288378f0d
refactor: Move pascals triange to maths/. ### Describe your change: Moves the pascal triangle from `other/` to `maths/` because it is very much math related and should 100% be there, next to `perfect_cube`, `perfect_number`, etc * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
""" https://en.wikipedia.org/wiki/Bidirectional_search """ from __future__ import annotations import time Path = list[tuple[int, int]] grid = [ [0, 0, 0, 0, 0, 0, 0], [0, 1, 0, 0, 0, 0, 0], # 0 are free path whereas 1's are obstacles [0, 0, 0, 0, 0, 0, 0], [0, 0, 1, 0, 0, 0, 0], [1, 0, 1, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 1, 0, 0], ] delta = [[-1, 0], [0, -1], [1, 0], [0, 1]] # up, left, down, right class Node: def __init__( self, pos_x: int, pos_y: int, goal_x: int, goal_y: int, parent: Node | None ): self.pos_x = pos_x self.pos_y = pos_y self.pos = (pos_y, pos_x) self.goal_x = goal_x self.goal_y = goal_y self.parent = parent class BreadthFirstSearch: """ # Comment out slow pytests... # 9.15s call graphs/bidirectional_breadth_first_search.py:: \ # graphs.bidirectional_breadth_first_search.BreadthFirstSearch # >>> bfs = BreadthFirstSearch((0, 0), (len(grid) - 1, len(grid[0]) - 1)) # >>> (bfs.start.pos_y + delta[3][0], bfs.start.pos_x + delta[3][1]) (0, 1) # >>> [x.pos for x in bfs.get_successors(bfs.start)] [(1, 0), (0, 1)] # >>> (bfs.start.pos_y + delta[2][0], bfs.start.pos_x + delta[2][1]) (1, 0) # >>> bfs.retrace_path(bfs.start) [(0, 0)] # >>> bfs.search() # doctest: +NORMALIZE_WHITESPACE [(0, 0), (1, 0), (2, 0), (3, 0), (3, 1), (4, 1), (5, 1), (5, 2), (5, 3), (5, 4), (5, 5), (6, 5), (6, 6)] """ def __init__(self, start: tuple[int, int], goal: tuple[int, int]): self.start = Node(start[1], start[0], goal[1], goal[0], None) self.target = Node(goal[1], goal[0], goal[1], goal[0], None) self.node_queue = [self.start] self.reached = False def search(self) -> Path | None: while self.node_queue: current_node = self.node_queue.pop(0) if current_node.pos == self.target.pos: self.reached = True return self.retrace_path(current_node) successors = self.get_successors(current_node) for node in successors: self.node_queue.append(node) if not self.reached: return [self.start.pos] return None def get_successors(self, parent: Node) -> list[Node]: """ Returns a list of successors (both in the grid and free spaces) """ successors = [] for action in delta: pos_x = parent.pos_x + action[1] pos_y = parent.pos_y + action[0] if not (0 <= pos_x <= len(grid[0]) - 1 and 0 <= pos_y <= len(grid) - 1): continue if grid[pos_y][pos_x] != 0: continue successors.append( Node(pos_x, pos_y, self.target.pos_y, self.target.pos_x, parent) ) return successors def retrace_path(self, node: Node | None) -> Path: """ Retrace the path from parents to parents until start node """ current_node = node path = [] while current_node is not None: path.append((current_node.pos_y, current_node.pos_x)) current_node = current_node.parent path.reverse() return path class BidirectionalBreadthFirstSearch: """ >>> bd_bfs = BidirectionalBreadthFirstSearch((0, 0), (len(grid) - 1, ... len(grid[0]) - 1)) >>> bd_bfs.fwd_bfs.start.pos == bd_bfs.bwd_bfs.target.pos True >>> bd_bfs.retrace_bidirectional_path(bd_bfs.fwd_bfs.start, ... 
bd_bfs.bwd_bfs.start) [(0, 0)] >>> bd_bfs.search() # doctest: +NORMALIZE_WHITESPACE [(0, 0), (0, 1), (0, 2), (1, 2), (2, 2), (2, 3), (2, 4), (3, 4), (3, 5), (3, 6), (4, 6), (5, 6), (6, 6)] """ def __init__(self, start, goal): self.fwd_bfs = BreadthFirstSearch(start, goal) self.bwd_bfs = BreadthFirstSearch(goal, start) self.reached = False def search(self) -> Path | None: while self.fwd_bfs.node_queue or self.bwd_bfs.node_queue: current_fwd_node = self.fwd_bfs.node_queue.pop(0) current_bwd_node = self.bwd_bfs.node_queue.pop(0) if current_bwd_node.pos == current_fwd_node.pos: self.reached = True return self.retrace_bidirectional_path( current_fwd_node, current_bwd_node ) self.fwd_bfs.target = current_bwd_node self.bwd_bfs.target = current_fwd_node successors = { self.fwd_bfs: self.fwd_bfs.get_successors(current_fwd_node), self.bwd_bfs: self.bwd_bfs.get_successors(current_bwd_node), } for bfs in [self.fwd_bfs, self.bwd_bfs]: for node in successors[bfs]: bfs.node_queue.append(node) if not self.reached: return [self.fwd_bfs.start.pos] return None def retrace_bidirectional_path(self, fwd_node: Node, bwd_node: Node) -> Path: fwd_path = self.fwd_bfs.retrace_path(fwd_node) bwd_path = self.bwd_bfs.retrace_path(bwd_node) bwd_path.pop() bwd_path.reverse() path = fwd_path + bwd_path return path if __name__ == "__main__": # all coordinates are given in format [y,x] import doctest doctest.testmod() init = (0, 0) goal = (len(grid) - 1, len(grid[0]) - 1) for elem in grid: print(elem) start_bfs_time = time.time() bfs = BreadthFirstSearch(init, goal) path = bfs.search() bfs_time = time.time() - start_bfs_time print("Unidirectional BFS computation time : ", bfs_time) start_bd_bfs_time = time.time() bd_bfs = BidirectionalBreadthFirstSearch(init, goal) bd_path = bd_bfs.search() bd_bfs_time = time.time() - start_bd_bfs_time print("Bidirectional BFS computation time : ", bd_bfs_time)
""" https://en.wikipedia.org/wiki/Bidirectional_search """ from __future__ import annotations import time Path = list[tuple[int, int]] grid = [ [0, 0, 0, 0, 0, 0, 0], [0, 1, 0, 0, 0, 0, 0], # 0 are free path whereas 1's are obstacles [0, 0, 0, 0, 0, 0, 0], [0, 0, 1, 0, 0, 0, 0], [1, 0, 1, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 1, 0, 0], ] delta = [[-1, 0], [0, -1], [1, 0], [0, 1]] # up, left, down, right class Node: def __init__( self, pos_x: int, pos_y: int, goal_x: int, goal_y: int, parent: Node | None ): self.pos_x = pos_x self.pos_y = pos_y self.pos = (pos_y, pos_x) self.goal_x = goal_x self.goal_y = goal_y self.parent = parent class BreadthFirstSearch: """ # Comment out slow pytests... # 9.15s call graphs/bidirectional_breadth_first_search.py:: \ # graphs.bidirectional_breadth_first_search.BreadthFirstSearch # >>> bfs = BreadthFirstSearch((0, 0), (len(grid) - 1, len(grid[0]) - 1)) # >>> (bfs.start.pos_y + delta[3][0], bfs.start.pos_x + delta[3][1]) (0, 1) # >>> [x.pos for x in bfs.get_successors(bfs.start)] [(1, 0), (0, 1)] # >>> (bfs.start.pos_y + delta[2][0], bfs.start.pos_x + delta[2][1]) (1, 0) # >>> bfs.retrace_path(bfs.start) [(0, 0)] # >>> bfs.search() # doctest: +NORMALIZE_WHITESPACE [(0, 0), (1, 0), (2, 0), (3, 0), (3, 1), (4, 1), (5, 1), (5, 2), (5, 3), (5, 4), (5, 5), (6, 5), (6, 6)] """ def __init__(self, start: tuple[int, int], goal: tuple[int, int]): self.start = Node(start[1], start[0], goal[1], goal[0], None) self.target = Node(goal[1], goal[0], goal[1], goal[0], None) self.node_queue = [self.start] self.reached = False def search(self) -> Path | None: while self.node_queue: current_node = self.node_queue.pop(0) if current_node.pos == self.target.pos: self.reached = True return self.retrace_path(current_node) successors = self.get_successors(current_node) for node in successors: self.node_queue.append(node) if not self.reached: return [self.start.pos] return None def get_successors(self, parent: Node) -> list[Node]: """ Returns a list of successors (both in the grid and free spaces) """ successors = [] for action in delta: pos_x = parent.pos_x + action[1] pos_y = parent.pos_y + action[0] if not (0 <= pos_x <= len(grid[0]) - 1 and 0 <= pos_y <= len(grid) - 1): continue if grid[pos_y][pos_x] != 0: continue successors.append( Node(pos_x, pos_y, self.target.pos_y, self.target.pos_x, parent) ) return successors def retrace_path(self, node: Node | None) -> Path: """ Retrace the path from parents to parents until start node """ current_node = node path = [] while current_node is not None: path.append((current_node.pos_y, current_node.pos_x)) current_node = current_node.parent path.reverse() return path class BidirectionalBreadthFirstSearch: """ >>> bd_bfs = BidirectionalBreadthFirstSearch((0, 0), (len(grid) - 1, ... len(grid[0]) - 1)) >>> bd_bfs.fwd_bfs.start.pos == bd_bfs.bwd_bfs.target.pos True >>> bd_bfs.retrace_bidirectional_path(bd_bfs.fwd_bfs.start, ... 
bd_bfs.bwd_bfs.start) [(0, 0)] >>> bd_bfs.search() # doctest: +NORMALIZE_WHITESPACE [(0, 0), (0, 1), (0, 2), (1, 2), (2, 2), (2, 3), (2, 4), (3, 4), (3, 5), (3, 6), (4, 6), (5, 6), (6, 6)] """ def __init__(self, start, goal): self.fwd_bfs = BreadthFirstSearch(start, goal) self.bwd_bfs = BreadthFirstSearch(goal, start) self.reached = False def search(self) -> Path | None: while self.fwd_bfs.node_queue or self.bwd_bfs.node_queue: current_fwd_node = self.fwd_bfs.node_queue.pop(0) current_bwd_node = self.bwd_bfs.node_queue.pop(0) if current_bwd_node.pos == current_fwd_node.pos: self.reached = True return self.retrace_bidirectional_path( current_fwd_node, current_bwd_node ) self.fwd_bfs.target = current_bwd_node self.bwd_bfs.target = current_fwd_node successors = { self.fwd_bfs: self.fwd_bfs.get_successors(current_fwd_node), self.bwd_bfs: self.bwd_bfs.get_successors(current_bwd_node), } for bfs in [self.fwd_bfs, self.bwd_bfs]: for node in successors[bfs]: bfs.node_queue.append(node) if not self.reached: return [self.fwd_bfs.start.pos] return None def retrace_bidirectional_path(self, fwd_node: Node, bwd_node: Node) -> Path: fwd_path = self.fwd_bfs.retrace_path(fwd_node) bwd_path = self.bwd_bfs.retrace_path(bwd_node) bwd_path.pop() bwd_path.reverse() path = fwd_path + bwd_path return path if __name__ == "__main__": # all coordinates are given in format [y,x] import doctest doctest.testmod() init = (0, 0) goal = (len(grid) - 1, len(grid[0]) - 1) for elem in grid: print(elem) start_bfs_time = time.time() bfs = BreadthFirstSearch(init, goal) path = bfs.search() bfs_time = time.time() - start_bfs_time print("Unidirectional BFS computation time : ", bfs_time) start_bd_bfs_time = time.time() bd_bfs = BidirectionalBreadthFirstSearch(init, goal) bd_path = bd_bfs.search() bd_bfs_time = time.time() - start_bd_bfs_time print("Bidirectional BFS computation time : ", bd_bfs_time)
-1
TheAlgorithms/Python
7,932
refactor: Move pascals triange to maths/
### Describe your change: Moves the pascal triangle from `other/` to `maths/` because it is very much math related and should 100% be there, next to `perfect_cube`, `perfect_number`, etc * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
CaedenPH
"2022-11-01T07:28:48Z"
"2022-11-01T19:25:39Z"
4e6c1c049dffdc984232fe1fce1e4791fc527d11
f512b4d105b6f3188deced19761b6ed288378f0d
refactor: Move pascals triange to maths/. ### Describe your change: Moves the pascal triangle from `other/` to `maths/` because it is very much math related and should 100% be there, next to `perfect_cube`, `perfect_number`, etc * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
""" Given weights and values of n items, put these items in a knapsack of capacity W to get the maximum total value in the knapsack. Note that only the integer weights 0-1 knapsack problem is solvable using dynamic programming. """ def mf_knapsack(i, wt, val, j): """ This code involves the concept of memory functions. Here we solve the subproblems which are needed unlike the below example F is a 2D array with -1s filled up """ global f # a global dp table for knapsack if f[i][j] < 0: if j < wt[i - 1]: val = mf_knapsack(i - 1, wt, val, j) else: val = max( mf_knapsack(i - 1, wt, val, j), mf_knapsack(i - 1, wt, val, j - wt[i - 1]) + val[i - 1], ) f[i][j] = val return f[i][j] def knapsack(w, wt, val, n): dp = [[0] * (w + 1) for _ in range(n + 1)] for i in range(1, n + 1): for w_ in range(1, w + 1): if wt[i - 1] <= w_: dp[i][w_] = max(val[i - 1] + dp[i - 1][w_ - wt[i - 1]], dp[i - 1][w_]) else: dp[i][w_] = dp[i - 1][w_] return dp[n][w_], dp def knapsack_with_example_solution(w: int, wt: list, val: list): """ Solves the integer weights knapsack problem returns one of the several possible optimal subsets. Parameters --------- W: int, the total maximum weight for the given knapsack problem. wt: list, the vector of weights for all items where wt[i] is the weight of the i-th item. val: list, the vector of values for all items where val[i] is the value of the i-th item Returns ------- optimal_val: float, the optimal value for the given knapsack problem example_optional_set: set, the indices of one of the optimal subsets which gave rise to the optimal value. Examples ------- >>> knapsack_with_example_solution(10, [1, 3, 5, 2], [10, 20, 100, 22]) (142, {2, 3, 4}) >>> knapsack_with_example_solution(6, [4, 3, 2, 3], [3, 2, 4, 4]) (8, {3, 4}) >>> knapsack_with_example_solution(6, [4, 3, 2, 3], [3, 2, 4]) Traceback (most recent call last): ... ValueError: The number of weights must be the same as the number of values. But got 4 weights and 3 values """ if not (isinstance(wt, (list, tuple)) and isinstance(val, (list, tuple))): raise ValueError( "Both the weights and values vectors must be either lists or tuples" ) num_items = len(wt) if num_items != len(val): raise ValueError( "The number of weights must be the " "same as the number of values.\nBut " f"got {num_items} weights and {len(val)} values" ) for i in range(num_items): if not isinstance(wt[i], int): raise TypeError( "All weights must be integers but " f"got weight of type {type(wt[i])} at index {i}" ) optimal_val, dp_table = knapsack(w, wt, val, num_items) example_optional_set: set = set() _construct_solution(dp_table, wt, num_items, w, example_optional_set) return optimal_val, example_optional_set def _construct_solution(dp: list, wt: list, i: int, j: int, optimal_set: set): """ Recursively reconstructs one of the optimal subsets given a filled DP table and the vector of weights Parameters --------- dp: list of list, the table of a solved integer weight dynamic programming problem wt: list or tuple, the vector of weights of the items i: int, the index of the item under consideration j: int, the current possible maximum weight optimal_set: set, the optimal subset so far. This gets modified by the function. Returns ------- None """ # for the current item i at a maximum weight j to be part of an optimal subset, # the optimal value at (i, j) must be greater than the optimal value at (i-1, j). 
# where i - 1 means considering only the previous items at the given maximum weight if i > 0 and j > 0: if dp[i - 1][j] == dp[i][j]: _construct_solution(dp, wt, i - 1, j, optimal_set) else: optimal_set.add(i) _construct_solution(dp, wt, i - 1, j - wt[i - 1], optimal_set) if __name__ == "__main__": """ Adding test case for knapsack """ val = [3, 2, 4, 4] wt = [4, 3, 2, 3] n = 4 w = 6 f = [[0] * (w + 1)] + [[0] + [-1] * (w + 1) for _ in range(n + 1)] optimal_solution, _ = knapsack(w, wt, val, n) print(optimal_solution) print(mf_knapsack(n, wt, val, w)) # switched the n and w # testing the dynamic programming problem with example # the optimal subset for the above example are items 3 and 4 optimal_solution, optimal_subset = knapsack_with_example_solution(w, wt, val) assert optimal_solution == 8 assert optimal_subset == {3, 4} print("optimal_value = ", optimal_solution) print("An optimal subset corresponding to the optimal value", optimal_subset)
""" Given weights and values of n items, put these items in a knapsack of capacity W to get the maximum total value in the knapsack. Note that only the integer weights 0-1 knapsack problem is solvable using dynamic programming. """ def mf_knapsack(i, wt, val, j): """ This code involves the concept of memory functions. Here we solve the subproblems which are needed unlike the below example F is a 2D array with -1s filled up """ global f # a global dp table for knapsack if f[i][j] < 0: if j < wt[i - 1]: val = mf_knapsack(i - 1, wt, val, j) else: val = max( mf_knapsack(i - 1, wt, val, j), mf_knapsack(i - 1, wt, val, j - wt[i - 1]) + val[i - 1], ) f[i][j] = val return f[i][j] def knapsack(w, wt, val, n): dp = [[0] * (w + 1) for _ in range(n + 1)] for i in range(1, n + 1): for w_ in range(1, w + 1): if wt[i - 1] <= w_: dp[i][w_] = max(val[i - 1] + dp[i - 1][w_ - wt[i - 1]], dp[i - 1][w_]) else: dp[i][w_] = dp[i - 1][w_] return dp[n][w_], dp def knapsack_with_example_solution(w: int, wt: list, val: list): """ Solves the integer weights knapsack problem returns one of the several possible optimal subsets. Parameters --------- W: int, the total maximum weight for the given knapsack problem. wt: list, the vector of weights for all items where wt[i] is the weight of the i-th item. val: list, the vector of values for all items where val[i] is the value of the i-th item Returns ------- optimal_val: float, the optimal value for the given knapsack problem example_optional_set: set, the indices of one of the optimal subsets which gave rise to the optimal value. Examples ------- >>> knapsack_with_example_solution(10, [1, 3, 5, 2], [10, 20, 100, 22]) (142, {2, 3, 4}) >>> knapsack_with_example_solution(6, [4, 3, 2, 3], [3, 2, 4, 4]) (8, {3, 4}) >>> knapsack_with_example_solution(6, [4, 3, 2, 3], [3, 2, 4]) Traceback (most recent call last): ... ValueError: The number of weights must be the same as the number of values. But got 4 weights and 3 values """ if not (isinstance(wt, (list, tuple)) and isinstance(val, (list, tuple))): raise ValueError( "Both the weights and values vectors must be either lists or tuples" ) num_items = len(wt) if num_items != len(val): raise ValueError( "The number of weights must be the " "same as the number of values.\nBut " f"got {num_items} weights and {len(val)} values" ) for i in range(num_items): if not isinstance(wt[i], int): raise TypeError( "All weights must be integers but " f"got weight of type {type(wt[i])} at index {i}" ) optimal_val, dp_table = knapsack(w, wt, val, num_items) example_optional_set: set = set() _construct_solution(dp_table, wt, num_items, w, example_optional_set) return optimal_val, example_optional_set def _construct_solution(dp: list, wt: list, i: int, j: int, optimal_set: set): """ Recursively reconstructs one of the optimal subsets given a filled DP table and the vector of weights Parameters --------- dp: list of list, the table of a solved integer weight dynamic programming problem wt: list or tuple, the vector of weights of the items i: int, the index of the item under consideration j: int, the current possible maximum weight optimal_set: set, the optimal subset so far. This gets modified by the function. Returns ------- None """ # for the current item i at a maximum weight j to be part of an optimal subset, # the optimal value at (i, j) must be greater than the optimal value at (i-1, j). 
# where i - 1 means considering only the previous items at the given maximum weight if i > 0 and j > 0: if dp[i - 1][j] == dp[i][j]: _construct_solution(dp, wt, i - 1, j, optimal_set) else: optimal_set.add(i) _construct_solution(dp, wt, i - 1, j - wt[i - 1], optimal_set) if __name__ == "__main__": """ Adding test case for knapsack """ val = [3, 2, 4, 4] wt = [4, 3, 2, 3] n = 4 w = 6 f = [[0] * (w + 1)] + [[0] + [-1] * (w + 1) for _ in range(n + 1)] optimal_solution, _ = knapsack(w, wt, val, n) print(optimal_solution) print(mf_knapsack(n, wt, val, w)) # switched the n and w # testing the dynamic programming problem with example # the optimal subset for the above example are items 3 and 4 optimal_solution, optimal_subset = knapsack_with_example_solution(w, wt, val) assert optimal_solution == 8 assert optimal_subset == {3, 4} print("optimal_value = ", optimal_solution) print("An optimal subset corresponding to the optimal value", optimal_subset)
-1
TheAlgorithms/Python
7,932
refactor: Move pascals triange to maths/
### Describe your change: Moves the pascal triangle from `other/` to `maths/` because it is very much math related and should 100% be there, next to `perfect_cube`, `perfect_number`, etc * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
CaedenPH
"2022-11-01T07:28:48Z"
"2022-11-01T19:25:39Z"
4e6c1c049dffdc984232fe1fce1e4791fc527d11
f512b4d105b6f3188deced19761b6ed288378f0d
refactor: Move pascals triange to maths/. ### Describe your change: Moves the pascal triangle from `other/` to `maths/` because it is very much math related and should 100% be there, next to `perfect_cube`, `perfect_number`, etc * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
""" Project Euler Problem 1: https://projecteuler.net/problem=1 Multiples of 3 and 5 If we list all the natural numbers below 10 that are multiples of 3 or 5, we get 3, 5, 6 and 9. The sum of these multiples is 23. Find the sum of all the multiples of 3 or 5 below 1000. """ def solution(n: int = 1000) -> int: """ Returns the sum of all the multiples of 3 or 5 below n. >>> solution(3) 0 >>> solution(4) 3 >>> solution(10) 23 >>> solution(600) 83700 >>> solution(-7) 0 """ return sum(e for e in range(3, n) if e % 3 == 0 or e % 5 == 0) if __name__ == "__main__": print(f"{solution() = }")
""" Project Euler Problem 1: https://projecteuler.net/problem=1 Multiples of 3 and 5 If we list all the natural numbers below 10 that are multiples of 3 or 5, we get 3, 5, 6 and 9. The sum of these multiples is 23. Find the sum of all the multiples of 3 or 5 below 1000. """ def solution(n: int = 1000) -> int: """ Returns the sum of all the multiples of 3 or 5 below n. >>> solution(3) 0 >>> solution(4) 3 >>> solution(10) 23 >>> solution(600) 83700 >>> solution(-7) 0 """ return sum(e for e in range(3, n) if e % 3 == 0 or e % 5 == 0) if __name__ == "__main__": print(f"{solution() = }")
-1
TheAlgorithms/Python
7,932
refactor: Move pascals triange to maths/
### Describe your change: Moves the pascal triangle from `other/` to `maths/` because it is very much math related and should 100% be there, next to `perfect_cube`, `perfect_number`, etc * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
CaedenPH
"2022-11-01T07:28:48Z"
"2022-11-01T19:25:39Z"
4e6c1c049dffdc984232fe1fce1e4791fc527d11
f512b4d105b6f3188deced19761b6ed288378f0d
refactor: Move pascals triange to maths/. ### Describe your change: Moves the pascal triangle from `other/` to `maths/` because it is very much math related and should 100% be there, next to `perfect_cube`, `perfect_number`, etc * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
from math import log

from scipy.constants import Boltzmann, physical_constants

T = 300  # TEMPERATURE (unit = K)


def builtin_voltage(
    donor_conc: float,  # donor concentration
    acceptor_conc: float,  # acceptor concentration
    intrinsic_conc: float,  # intrinsic concentration
) -> float:
    """
    This function can calculate the Builtin Voltage of a pn junction diode.
    This is calculated from the given three values.
    Examples -
    >>> builtin_voltage(donor_conc=1e17, acceptor_conc=1e17, intrinsic_conc=1e10)
    0.833370010652644
    >>> builtin_voltage(donor_conc=0, acceptor_conc=1600, intrinsic_conc=200)
    Traceback (most recent call last):
      ...
    ValueError: Donor concentration should be positive
    >>> builtin_voltage(donor_conc=1000, acceptor_conc=0, intrinsic_conc=1200)
    Traceback (most recent call last):
      ...
    ValueError: Acceptor concentration should be positive
    >>> builtin_voltage(donor_conc=1000, acceptor_conc=1000, intrinsic_conc=0)
    Traceback (most recent call last):
      ...
    ValueError: Intrinsic concentration should be positive
    >>> builtin_voltage(donor_conc=1000, acceptor_conc=3000, intrinsic_conc=2000)
    Traceback (most recent call last):
      ...
    ValueError: Donor concentration should be greater than intrinsic concentration
    >>> builtin_voltage(donor_conc=3000, acceptor_conc=1000, intrinsic_conc=2000)
    Traceback (most recent call last):
      ...
    ValueError: Acceptor concentration should be greater than intrinsic concentration
    """
    if donor_conc <= 0:
        raise ValueError("Donor concentration should be positive")
    elif acceptor_conc <= 0:
        raise ValueError("Acceptor concentration should be positive")
    elif intrinsic_conc <= 0:
        raise ValueError("Intrinsic concentration should be positive")
    elif donor_conc <= intrinsic_conc:
        raise ValueError(
            "Donor concentration should be greater than intrinsic concentration"
        )
    elif acceptor_conc <= intrinsic_conc:
        raise ValueError(
            "Acceptor concentration should be greater than intrinsic concentration"
        )
    else:
        return (
            Boltzmann
            * T
            * log((donor_conc * acceptor_conc) / intrinsic_conc**2)
            / physical_constants["electron volt"][0]
        )


if __name__ == "__main__":
    import doctest

    doctest.testmod()
from math import log

from scipy.constants import Boltzmann, physical_constants

T = 300  # TEMPERATURE (unit = K)


def builtin_voltage(
    donor_conc: float,  # donor concentration
    acceptor_conc: float,  # acceptor concentration
    intrinsic_conc: float,  # intrinsic concentration
) -> float:
    """
    This function can calculate the Builtin Voltage of a pn junction diode.
    This is calculated from the given three values.
    Examples -
    >>> builtin_voltage(donor_conc=1e17, acceptor_conc=1e17, intrinsic_conc=1e10)
    0.833370010652644
    >>> builtin_voltage(donor_conc=0, acceptor_conc=1600, intrinsic_conc=200)
    Traceback (most recent call last):
      ...
    ValueError: Donor concentration should be positive
    >>> builtin_voltage(donor_conc=1000, acceptor_conc=0, intrinsic_conc=1200)
    Traceback (most recent call last):
      ...
    ValueError: Acceptor concentration should be positive
    >>> builtin_voltage(donor_conc=1000, acceptor_conc=1000, intrinsic_conc=0)
    Traceback (most recent call last):
      ...
    ValueError: Intrinsic concentration should be positive
    >>> builtin_voltage(donor_conc=1000, acceptor_conc=3000, intrinsic_conc=2000)
    Traceback (most recent call last):
      ...
    ValueError: Donor concentration should be greater than intrinsic concentration
    >>> builtin_voltage(donor_conc=3000, acceptor_conc=1000, intrinsic_conc=2000)
    Traceback (most recent call last):
      ...
    ValueError: Acceptor concentration should be greater than intrinsic concentration
    """
    if donor_conc <= 0:
        raise ValueError("Donor concentration should be positive")
    elif acceptor_conc <= 0:
        raise ValueError("Acceptor concentration should be positive")
    elif intrinsic_conc <= 0:
        raise ValueError("Intrinsic concentration should be positive")
    elif donor_conc <= intrinsic_conc:
        raise ValueError(
            "Donor concentration should be greater than intrinsic concentration"
        )
    elif acceptor_conc <= intrinsic_conc:
        raise ValueError(
            "Acceptor concentration should be greater than intrinsic concentration"
        )
    else:
        return (
            Boltzmann
            * T
            * log((donor_conc * acceptor_conc) / intrinsic_conc**2)
            / physical_constants["electron volt"][0]
        )


if __name__ == "__main__":
    import doctest

    doctest.testmod()
-1
TheAlgorithms/Python
7,932
refactor: Move pascals triange to maths/
### Describe your change: Moves the pascal triangle from `other/` to `maths/` because it is very much math related and should 100% be there, next to `perfect_cube`, `perfect_number`, etc * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
CaedenPH
"2022-11-01T07:28:48Z"
"2022-11-01T19:25:39Z"
4e6c1c049dffdc984232fe1fce1e4791fc527d11
f512b4d105b6f3188deced19761b6ed288378f0d
refactor: Move pascals triange to maths/. ### Describe your change: Moves the pascal triangle from `other/` to `maths/` because it is very much math related and should 100% be there, next to `perfect_cube`, `perfect_number`, etc * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
-1
TheAlgorithms/Python
7,932
refactor: Move pascals triange to maths/
### Describe your change: Moves the pascal triangle from `other/` to `maths/` because it is very much math related and should 100% be there, next to `perfect_cube`, `perfect_number`, etc * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
CaedenPH
"2022-11-01T07:28:48Z"
"2022-11-01T19:25:39Z"
4e6c1c049dffdc984232fe1fce1e4791fc527d11
f512b4d105b6f3188deced19761b6ed288378f0d
refactor: Move pascals triange to maths/. ### Describe your change: Moves the pascal triangle from `other/` to `maths/` because it is very much math related and should 100% be there, next to `perfect_cube`, `perfect_number`, etc * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
-1
TheAlgorithms/Python
7,932
refactor: Move pascals triange to maths/
### Describe your change: Moves the pascal triangle from `other/` to `maths/` because it is very much math related and should 100% be there, next to `perfect_cube`, `perfect_number`, etc * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
CaedenPH
"2022-11-01T07:28:48Z"
"2022-11-01T19:25:39Z"
4e6c1c049dffdc984232fe1fce1e4791fc527d11
f512b4d105b6f3188deced19761b6ed288378f0d
refactor: Move pascals triange to maths/. ### Describe your change: Moves the pascal triangle from `other/` to `maths/` because it is very much math related and should 100% be there, next to `perfect_cube`, `perfect_number`, etc * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
""" Conway's Game of Life implemented in Python. https://en.wikipedia.org/wiki/Conway%27s_Game_of_Life """ from __future__ import annotations from PIL import Image # Define glider example GLIDER = [ [0, 1, 0, 0, 0, 0, 0, 0], [0, 0, 1, 0, 0, 0, 0, 0], [1, 1, 1, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0], ] # Define blinker example BLINKER = [[0, 1, 0], [0, 1, 0], [0, 1, 0]] def new_generation(cells: list[list[int]]) -> list[list[int]]: """ Generates the next generation for a given state of Conway's Game of Life. >>> new_generation(BLINKER) [[0, 0, 0], [1, 1, 1], [0, 0, 0]] """ next_generation = [] for i in range(len(cells)): next_generation_row = [] for j in range(len(cells[i])): # Get the number of live neighbours neighbour_count = 0 if i > 0 and j > 0: neighbour_count += cells[i - 1][j - 1] if i > 0: neighbour_count += cells[i - 1][j] if i > 0 and j < len(cells[i]) - 1: neighbour_count += cells[i - 1][j + 1] if j > 0: neighbour_count += cells[i][j - 1] if j < len(cells[i]) - 1: neighbour_count += cells[i][j + 1] if i < len(cells) - 1 and j > 0: neighbour_count += cells[i + 1][j - 1] if i < len(cells) - 1: neighbour_count += cells[i + 1][j] if i < len(cells) - 1 and j < len(cells[i]) - 1: neighbour_count += cells[i + 1][j + 1] # Rules of the game of life (excerpt from Wikipedia): # 1. Any live cell with two or three live neighbours survives. # 2. Any dead cell with three live neighbours becomes a live cell. # 3. All other live cells die in the next generation. # Similarly, all other dead cells stay dead. alive = cells[i][j] == 1 if ( (alive and 2 <= neighbour_count <= 3) or not alive and neighbour_count == 3 ): next_generation_row.append(1) else: next_generation_row.append(0) next_generation.append(next_generation_row) return next_generation def generate_images(cells: list[list[int]], frames: int) -> list[Image.Image]: """ Generates a list of images of subsequent Game of Life states. """ images = [] for _ in range(frames): # Create output image img = Image.new("RGB", (len(cells[0]), len(cells))) pixels = img.load() # Save cells to image for x in range(len(cells)): for y in range(len(cells[0])): colour = 255 - cells[y][x] * 255 pixels[x, y] = (colour, colour, colour) # Save image images.append(img) cells = new_generation(cells) return images if __name__ == "__main__": images = generate_images(GLIDER, 16) images[0].save("out.gif", save_all=True, append_images=images[1:])
""" Conway's Game of Life implemented in Python. https://en.wikipedia.org/wiki/Conway%27s_Game_of_Life """ from __future__ import annotations from PIL import Image # Define glider example GLIDER = [ [0, 1, 0, 0, 0, 0, 0, 0], [0, 0, 1, 0, 0, 0, 0, 0], [1, 1, 1, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0], ] # Define blinker example BLINKER = [[0, 1, 0], [0, 1, 0], [0, 1, 0]] def new_generation(cells: list[list[int]]) -> list[list[int]]: """ Generates the next generation for a given state of Conway's Game of Life. >>> new_generation(BLINKER) [[0, 0, 0], [1, 1, 1], [0, 0, 0]] """ next_generation = [] for i in range(len(cells)): next_generation_row = [] for j in range(len(cells[i])): # Get the number of live neighbours neighbour_count = 0 if i > 0 and j > 0: neighbour_count += cells[i - 1][j - 1] if i > 0: neighbour_count += cells[i - 1][j] if i > 0 and j < len(cells[i]) - 1: neighbour_count += cells[i - 1][j + 1] if j > 0: neighbour_count += cells[i][j - 1] if j < len(cells[i]) - 1: neighbour_count += cells[i][j + 1] if i < len(cells) - 1 and j > 0: neighbour_count += cells[i + 1][j - 1] if i < len(cells) - 1: neighbour_count += cells[i + 1][j] if i < len(cells) - 1 and j < len(cells[i]) - 1: neighbour_count += cells[i + 1][j + 1] # Rules of the game of life (excerpt from Wikipedia): # 1. Any live cell with two or three live neighbours survives. # 2. Any dead cell with three live neighbours becomes a live cell. # 3. All other live cells die in the next generation. # Similarly, all other dead cells stay dead. alive = cells[i][j] == 1 if ( (alive and 2 <= neighbour_count <= 3) or not alive and neighbour_count == 3 ): next_generation_row.append(1) else: next_generation_row.append(0) next_generation.append(next_generation_row) return next_generation def generate_images(cells: list[list[int]], frames: int) -> list[Image.Image]: """ Generates a list of images of subsequent Game of Life states. """ images = [] for _ in range(frames): # Create output image img = Image.new("RGB", (len(cells[0]), len(cells))) pixels = img.load() # Save cells to image for x in range(len(cells)): for y in range(len(cells[0])): colour = 255 - cells[y][x] * 255 pixels[x, y] = (colour, colour, colour) # Save image images.append(img) cells = new_generation(cells) return images if __name__ == "__main__": images = generate_images(GLIDER, 16) images[0].save("out.gif", save_all=True, append_images=images[1:])
-1
TheAlgorithms/Python
7,932
refactor: Move pascals triange to maths/
### Describe your change: Moves the pascal triangle from `other/` to `maths/` because it is very much math related and should 100% be there, next to `perfect_cube`, `perfect_number`, etc * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
CaedenPH
"2022-11-01T07:28:48Z"
"2022-11-01T19:25:39Z"
4e6c1c049dffdc984232fe1fce1e4791fc527d11
f512b4d105b6f3188deced19761b6ed288378f0d
refactor: Move pascals triange to maths/. ### Describe your change: Moves the pascal triangle from `other/` to `maths/` because it is very much math related and should 100% be there, next to `perfect_cube`, `perfect_number`, etc * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
""" Calculates the SumSet of two sets of numbers (A and B) Source: https://en.wikipedia.org/wiki/Sumset """ def sumset(set_a: set, set_b: set) -> set: """ :param first set: a set of numbers :param second set: a set of numbers :return: the nth number in Sylvester's sequence >>> sumset({1, 2, 3}, {4, 5, 6}) {5, 6, 7, 8, 9} >>> sumset({1, 2, 3}, {4, 5, 6, 7}) {5, 6, 7, 8, 9, 10} >>> sumset({1, 2, 3, 4}, 3) Traceback (most recent call last): ... AssertionError: The input value of [set_b=3] is not a set """ assert isinstance(set_a, set), f"The input value of [set_a={set_a}] is not a set" assert isinstance(set_b, set), f"The input value of [set_b={set_b}] is not a set" return {a + b for a in set_a for b in set_b} if __name__ == "__main__": from doctest import testmod testmod()
""" Calculates the SumSet of two sets of numbers (A and B) Source: https://en.wikipedia.org/wiki/Sumset """ def sumset(set_a: set, set_b: set) -> set: """ :param first set: a set of numbers :param second set: a set of numbers :return: the nth number in Sylvester's sequence >>> sumset({1, 2, 3}, {4, 5, 6}) {5, 6, 7, 8, 9} >>> sumset({1, 2, 3}, {4, 5, 6, 7}) {5, 6, 7, 8, 9, 10} >>> sumset({1, 2, 3, 4}, 3) Traceback (most recent call last): ... AssertionError: The input value of [set_b=3] is not a set """ assert isinstance(set_a, set), f"The input value of [set_a={set_a}] is not a set" assert isinstance(set_b, set), f"The input value of [set_b={set_b}] is not a set" return {a + b for a in set_a for b in set_b} if __name__ == "__main__": from doctest import testmod testmod()
-1
TheAlgorithms/Python
7,932
refactor: Move pascals triange to maths/
### Describe your change: Moves the pascal triangle from `other/` to `maths/` because it is very much math related and should 100% be there, next to `perfect_cube`, `perfect_number`, etc * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
CaedenPH
"2022-11-01T07:28:48Z"
"2022-11-01T19:25:39Z"
4e6c1c049dffdc984232fe1fce1e4791fc527d11
f512b4d105b6f3188deced19761b6ed288378f0d
refactor: Move pascals triange to maths/. ### Describe your change: Moves the pascal triangle from `other/` to `maths/` because it is very much math related and should 100% be there, next to `perfect_cube`, `perfect_number`, etc * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
MIT License Copyright (c) 2016-2022 TheAlgorithms and contributors Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
MIT License Copyright (c) 2016-2022 TheAlgorithms and contributors Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
-1
TheAlgorithms/Python
7,932
refactor: Move pascals triange to maths/
### Describe your change: Moves the pascal triangle from `other/` to `maths/` because it is very much math related and should 100% be there, next to `perfect_cube`, `perfect_number`, etc * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
CaedenPH
"2022-11-01T07:28:48Z"
"2022-11-01T19:25:39Z"
4e6c1c049dffdc984232fe1fce1e4791fc527d11
f512b4d105b6f3188deced19761b6ed288378f0d
refactor: Move pascals triange to maths/. ### Describe your change: Moves the pascal triangle from `other/` to `maths/` because it is very much math related and should 100% be there, next to `perfect_cube`, `perfect_number`, etc * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
""" This is pure Python implementation of interpolation search algorithm """ def interpolation_search(sorted_collection, item): """Pure implementation of interpolation search algorithm in Python Be careful collection must be ascending sorted, otherwise result will be unpredictable :param sorted_collection: some ascending sorted collection with comparable items :param item: item value to search :return: index of found item or None if item is not found """ left = 0 right = len(sorted_collection) - 1 while left <= right: # avoid divided by 0 during interpolation if sorted_collection[left] == sorted_collection[right]: if sorted_collection[left] == item: return left else: return None point = left + ((item - sorted_collection[left]) * (right - left)) // ( sorted_collection[right] - sorted_collection[left] ) # out of range check if point < 0 or point >= len(sorted_collection): return None current_item = sorted_collection[point] if current_item == item: return point else: if point < left: right = left left = point elif point > right: left = right right = point else: if item < current_item: right = point - 1 else: left = point + 1 return None def interpolation_search_by_recursion(sorted_collection, item, left, right): """Pure implementation of interpolation search algorithm in Python by recursion Be careful collection must be ascending sorted, otherwise result will be unpredictable First recursion should be started with left=0 and right=(len(sorted_collection)-1) :param sorted_collection: some ascending sorted collection with comparable items :param item: item value to search :return: index of found item or None if item is not found """ # avoid divided by 0 during interpolation if sorted_collection[left] == sorted_collection[right]: if sorted_collection[left] == item: return left else: return None point = left + ((item - sorted_collection[left]) * (right - left)) // ( sorted_collection[right] - sorted_collection[left] ) # out of range check if point < 0 or point >= len(sorted_collection): return None if sorted_collection[point] == item: return point elif point < left: return interpolation_search_by_recursion(sorted_collection, item, point, left) elif point > right: return interpolation_search_by_recursion(sorted_collection, item, right, left) else: if sorted_collection[point] > item: return interpolation_search_by_recursion( sorted_collection, item, left, point - 1 ) else: return interpolation_search_by_recursion( sorted_collection, item, point + 1, right ) def __assert_sorted(collection): """Check if collection is ascending sorted, if not - raises :py:class:`ValueError` :param collection: collection :return: True if collection is ascending sorted :raise: :py:class:`ValueError` if collection is not ascending sorted Examples: >>> __assert_sorted([0, 1, 2, 4]) True >>> __assert_sorted([10, -1, 5]) Traceback (most recent call last): ... 
ValueError: Collection must be ascending sorted """ if collection != sorted(collection): raise ValueError("Collection must be ascending sorted") return True if __name__ == "__main__": import sys """ user_input = input('Enter numbers separated by comma:\n').strip() collection = [int(item) for item in user_input.split(',')] try: __assert_sorted(collection) except ValueError: sys.exit('Sequence must be ascending sorted to apply interpolation search') target_input = input('Enter a single number to be found in the list:\n') target = int(target_input) """ debug = 0 if debug == 1: collection = [10, 30, 40, 45, 50, 66, 77, 93] try: __assert_sorted(collection) except ValueError: sys.exit("Sequence must be ascending sorted to apply interpolation search") target = 67 result = interpolation_search(collection, target) if result is not None: print(f"{target} found at positions: {result}") else: print("Not found")
""" This is pure Python implementation of interpolation search algorithm """ def interpolation_search(sorted_collection, item): """Pure implementation of interpolation search algorithm in Python Be careful collection must be ascending sorted, otherwise result will be unpredictable :param sorted_collection: some ascending sorted collection with comparable items :param item: item value to search :return: index of found item or None if item is not found """ left = 0 right = len(sorted_collection) - 1 while left <= right: # avoid divided by 0 during interpolation if sorted_collection[left] == sorted_collection[right]: if sorted_collection[left] == item: return left else: return None point = left + ((item - sorted_collection[left]) * (right - left)) // ( sorted_collection[right] - sorted_collection[left] ) # out of range check if point < 0 or point >= len(sorted_collection): return None current_item = sorted_collection[point] if current_item == item: return point else: if point < left: right = left left = point elif point > right: left = right right = point else: if item < current_item: right = point - 1 else: left = point + 1 return None def interpolation_search_by_recursion(sorted_collection, item, left, right): """Pure implementation of interpolation search algorithm in Python by recursion Be careful collection must be ascending sorted, otherwise result will be unpredictable First recursion should be started with left=0 and right=(len(sorted_collection)-1) :param sorted_collection: some ascending sorted collection with comparable items :param item: item value to search :return: index of found item or None if item is not found """ # avoid divided by 0 during interpolation if sorted_collection[left] == sorted_collection[right]: if sorted_collection[left] == item: return left else: return None point = left + ((item - sorted_collection[left]) * (right - left)) // ( sorted_collection[right] - sorted_collection[left] ) # out of range check if point < 0 or point >= len(sorted_collection): return None if sorted_collection[point] == item: return point elif point < left: return interpolation_search_by_recursion(sorted_collection, item, point, left) elif point > right: return interpolation_search_by_recursion(sorted_collection, item, right, left) else: if sorted_collection[point] > item: return interpolation_search_by_recursion( sorted_collection, item, left, point - 1 ) else: return interpolation_search_by_recursion( sorted_collection, item, point + 1, right ) def __assert_sorted(collection): """Check if collection is ascending sorted, if not - raises :py:class:`ValueError` :param collection: collection :return: True if collection is ascending sorted :raise: :py:class:`ValueError` if collection is not ascending sorted Examples: >>> __assert_sorted([0, 1, 2, 4]) True >>> __assert_sorted([10, -1, 5]) Traceback (most recent call last): ... 
ValueError: Collection must be ascending sorted """ if collection != sorted(collection): raise ValueError("Collection must be ascending sorted") return True if __name__ == "__main__": import sys """ user_input = input('Enter numbers separated by comma:\n').strip() collection = [int(item) for item in user_input.split(',')] try: __assert_sorted(collection) except ValueError: sys.exit('Sequence must be ascending sorted to apply interpolation search') target_input = input('Enter a single number to be found in the list:\n') target = int(target_input) """ debug = 0 if debug == 1: collection = [10, 30, 40, 45, 50, 66, 77, 93] try: __assert_sorted(collection) except ValueError: sys.exit("Sequence must be ascending sorted to apply interpolation search") target = 67 result = interpolation_search(collection, target) if result is not None: print(f"{target} found at positions: {result}") else: print("Not found")
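Because the `__main__` block above only exercises the search when `debug == 1`, a small usage sketch may help; it assumes the `interpolation_search` function defined above and an ascending sorted list:

```python
sorted_numbers = [10, 30, 40, 45, 50, 66, 77, 93]

# Successful search: the probe position is interpolated from the value range.
print(interpolation_search(sorted_numbers, 66))  # 5
# Unsuccessful search returns None rather than raising.
print(interpolation_search(sorted_numbers, 67))  # None
```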
-1
TheAlgorithms/Python
7,932
refactor: Move pascals triange to maths/
### Describe your change: Moves the pascal triangle from `other/` to `maths/` because it is very much math related and should 100% be there, next to `perfect_cube`, `perfect_number`, etc * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
CaedenPH
"2022-11-01T07:28:48Z"
"2022-11-01T19:25:39Z"
4e6c1c049dffdc984232fe1fce1e4791fc527d11
f512b4d105b6f3188deced19761b6ed288378f0d
refactor: Move pascals triange to maths/. ### Describe your change: Moves the pascal triangle from `other/` to `maths/` because it is very much math related and should 100% be there, next to `perfect_cube`, `perfect_number`, etc * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
""" Longest Common Substring Problem Statement: Given two sequences, find the longest common substring present in both of them. A substring is necessarily continuous. Example: "abcdef" and "xabded" have two longest common substrings, "ab" or "de". Therefore, algorithm should return any one of them. """ def longest_common_substring(text1: str, text2: str) -> str: """ Finds the longest common substring between two strings. >>> longest_common_substring("", "") '' >>> longest_common_substring("a","") '' >>> longest_common_substring("", "a") '' >>> longest_common_substring("a", "a") 'a' >>> longest_common_substring("abcdef", "bcd") 'bcd' >>> longest_common_substring("abcdef", "xabded") 'ab' >>> longest_common_substring("GeeksforGeeks", "GeeksQuiz") 'Geeks' >>> longest_common_substring("abcdxyz", "xyzabcd") 'abcd' >>> longest_common_substring("zxabcdezy", "yzabcdezx") 'abcdez' >>> longest_common_substring("OldSite:GeeksforGeeks.org", "NewSite:GeeksQuiz.com") 'Site:Geeks' >>> longest_common_substring(1, 1) Traceback (most recent call last): ... ValueError: longest_common_substring() takes two strings for inputs """ if not (isinstance(text1, str) and isinstance(text2, str)): raise ValueError("longest_common_substring() takes two strings for inputs") text1_length = len(text1) text2_length = len(text2) dp = [[0] * (text2_length + 1) for _ in range(text1_length + 1)] ans_index = 0 ans_length = 0 for i in range(1, text1_length + 1): for j in range(1, text2_length + 1): if text1[i - 1] == text2[j - 1]: dp[i][j] = 1 + dp[i - 1][j - 1] if dp[i][j] > ans_length: ans_index = i ans_length = dp[i][j] return text1[ans_index - ans_length : ans_index] if __name__ == "__main__": import doctest doctest.testmod()
""" Longest Common Substring Problem Statement: Given two sequences, find the longest common substring present in both of them. A substring is necessarily continuous. Example: "abcdef" and "xabded" have two longest common substrings, "ab" or "de". Therefore, algorithm should return any one of them. """ def longest_common_substring(text1: str, text2: str) -> str: """ Finds the longest common substring between two strings. >>> longest_common_substring("", "") '' >>> longest_common_substring("a","") '' >>> longest_common_substring("", "a") '' >>> longest_common_substring("a", "a") 'a' >>> longest_common_substring("abcdef", "bcd") 'bcd' >>> longest_common_substring("abcdef", "xabded") 'ab' >>> longest_common_substring("GeeksforGeeks", "GeeksQuiz") 'Geeks' >>> longest_common_substring("abcdxyz", "xyzabcd") 'abcd' >>> longest_common_substring("zxabcdezy", "yzabcdezx") 'abcdez' >>> longest_common_substring("OldSite:GeeksforGeeks.org", "NewSite:GeeksQuiz.com") 'Site:Geeks' >>> longest_common_substring(1, 1) Traceback (most recent call last): ... ValueError: longest_common_substring() takes two strings for inputs """ if not (isinstance(text1, str) and isinstance(text2, str)): raise ValueError("longest_common_substring() takes two strings for inputs") text1_length = len(text1) text2_length = len(text2) dp = [[0] * (text2_length + 1) for _ in range(text1_length + 1)] ans_index = 0 ans_length = 0 for i in range(1, text1_length + 1): for j in range(1, text2_length + 1): if text1[i - 1] == text2[j - 1]: dp[i][j] = 1 + dp[i - 1][j - 1] if dp[i][j] > ans_length: ans_index = i ans_length = dp[i][j] return text1[ans_index - ans_length : ans_index] if __name__ == "__main__": import doctest doctest.testmod()
-1
TheAlgorithms/Python
7,932
refactor: Move pascals triange to maths/
### Describe your change: Moves the pascal triangle from `other/` to `maths/` because it is very much math related and should 100% be there, next to `perfect_cube`, `perfect_number`, etc * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
CaedenPH
"2022-11-01T07:28:48Z"
"2022-11-01T19:25:39Z"
4e6c1c049dffdc984232fe1fce1e4791fc527d11
f512b4d105b6f3188deced19761b6ed288378f0d
refactor: Move pascals triange to maths/. ### Describe your change: Moves the pascal triangle from `other/` to `maths/` because it is very much math related and should 100% be there, next to `perfect_cube`, `perfect_number`, etc * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
name: Bug report description: Create a bug report to help us address errors in the repository labels: [bug] body: - type: markdown attributes: value: > Before requesting please search [existing issues](https://github.com/TheAlgorithms/Python/labels/bug). Usage questions such as "How do I...?" belong on the [Discord](https://discord.gg/c7MnfGFGa6) and will be closed. - type: input attributes: label: "Repository commit" description: > The commit hash for `TheAlgorithms/Python` repository. You can get this by running the command `git rev-parse HEAD` locally. placeholder: "a0b0f414ae134aa1772d33bb930e5a960f9979e8" validations: required: true - type: input attributes: label: "Python version (python --version)" placeholder: "Python 3.10.7" validations: required: true - type: textarea attributes: label: "Dependencies version (pip freeze)" description: > This is the output of the command `pip freeze --all`. Note that the actual output might be different as compared to the placeholder text. placeholder: | appnope==0.1.3 asttokens==2.0.8 backcall==0.2.0 ... validations: required: true - type: textarea attributes: label: "Expected behavior" description: "Describe the behavior you expect. May include images or videos." validations: required: true - type: textarea attributes: label: "Actual behavior" validations: required: true
name: Bug report description: Create a bug report to help us address errors in the repository labels: [bug] body: - type: markdown attributes: value: > Before requesting please search [existing issues](https://github.com/TheAlgorithms/Python/labels/bug). Usage questions such as "How do I...?" belong on the [Discord](https://discord.gg/c7MnfGFGa6) and will be closed. - type: input attributes: label: "Repository commit" description: > The commit hash for `TheAlgorithms/Python` repository. You can get this by running the command `git rev-parse HEAD` locally. placeholder: "a0b0f414ae134aa1772d33bb930e5a960f9979e8" validations: required: true - type: input attributes: label: "Python version (python --version)" placeholder: "Python 3.10.7" validations: required: true - type: textarea attributes: label: "Dependencies version (pip freeze)" description: > This is the output of the command `pip freeze --all`. Note that the actual output might be different as compared to the placeholder text. placeholder: | appnope==0.1.3 asttokens==2.0.8 backcall==0.2.0 ... validations: required: true - type: textarea attributes: label: "Expected behavior" description: "Describe the behavior you expect. May include images or videos." validations: required: true - type: textarea attributes: label: "Actual behavior" validations: required: true
-1
TheAlgorithms/Python
7,932
refactor: Move pascals triange to maths/
### Describe your change: Moves the pascal triangle from `other/` to `maths/` because it is very much math related and should 100% be there, next to `perfect_cube`, `perfect_number`, etc * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
CaedenPH
"2022-11-01T07:28:48Z"
"2022-11-01T19:25:39Z"
4e6c1c049dffdc984232fe1fce1e4791fc527d11
f512b4d105b6f3188deced19761b6ed288378f0d
refactor: Move pascals triange to maths/. ### Describe your change: Moves the pascal triangle from `other/` to `maths/` because it is very much math related and should 100% be there, next to `perfect_cube`, `perfect_number`, etc * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
import requests APPID = "" # <-- Put your OpenWeatherMap appid here! URL_BASE = "https://api.openweathermap.org/data/2.5/" def current_weather(q: str = "Chicago", appid: str = APPID) -> dict: """https://openweathermap.org/api""" return requests.get(URL_BASE + "weather", params=locals()).json() def weather_forecast(q: str = "Kolkata, India", appid: str = APPID) -> dict: """https://openweathermap.org/forecast5""" return requests.get(URL_BASE + "forecast", params=locals()).json() def weather_onecall(lat: float = 55.68, lon: float = 12.57, appid: str = APPID) -> dict: """https://openweathermap.org/api/one-call-api""" return requests.get(URL_BASE + "onecall", params=locals()).json() if __name__ == "__main__": from pprint import pprint while True: location = input("Enter a location:").strip() if location: pprint(current_weather(location)) else: break
import requests APPID = "" # <-- Put your OpenWeatherMap appid here! URL_BASE = "https://api.openweathermap.org/data/2.5/" def current_weather(q: str = "Chicago", appid: str = APPID) -> dict: """https://openweathermap.org/api""" return requests.get(URL_BASE + "weather", params=locals()).json() def weather_forecast(q: str = "Kolkata, India", appid: str = APPID) -> dict: """https://openweathermap.org/forecast5""" return requests.get(URL_BASE + "forecast", params=locals()).json() def weather_onecall(lat: float = 55.68, lon: float = 12.57, appid: str = APPID) -> dict: """https://openweathermap.org/api/one-call-api""" return requests.get(URL_BASE + "onecall", params=locals()).json() if __name__ == "__main__": from pprint import pprint while True: location = input("Enter a location:").strip() if location: pprint(current_weather(location)) else: break
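The helpers above only wrap HTTP calls, so a guard for a missing API key keeps example runs from silently returning error payloads. A small hedged sketch of such a check (assuming the functions defined above; `OWM_API_KEY` is an illustrative environment-variable name, not something the file defines):

```python
import os

appid = os.environ.get("OWM_API_KEY", APPID)
if not appid:
    raise SystemExit("Set OWM_API_KEY (or fill in APPID) before querying OpenWeatherMap.")

# Pass the key explicitly so the default empty APPID is never sent.
print(current_weather("Amsterdam", appid=appid))
```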
-1
TheAlgorithms/Python
7,932
refactor: Move pascals triange to maths/
### Describe your change: Moves the pascal triangle from `other/` to `maths/` because it is very much math related and should 100% be there, next to `perfect_cube`, `perfect_number`, etc * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
CaedenPH
"2022-11-01T07:28:48Z"
"2022-11-01T19:25:39Z"
4e6c1c049dffdc984232fe1fce1e4791fc527d11
f512b4d105b6f3188deced19761b6ed288378f0d
refactor: Move pascals triange to maths/. ### Describe your change: Moves the pascal triangle from `other/` to `maths/` because it is very much math related and should 100% be there, next to `perfect_cube`, `perfect_number`, etc * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [x] This pull request is all my own work -- I have not plagiarized. * [x] I know that pull requests will not be merged if they fail the automated tests. * [x] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [x] All new Python files are placed inside an existing directory. * [x] All filenames are in all lowercase characters with no spaces or dashes. * [x] All functions and variable names follow Python naming conventions. * [x] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [x] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [x] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [x] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
# Contributing guidelines ## Before contributing Welcome to [TheAlgorithms/Python](https://github.com/TheAlgorithms/Python)! Before sending your pull requests, make sure that you __read the whole guidelines__. If you have any doubt on the contributing guide, please feel free to [state it clearly in an issue](https://github.com/TheAlgorithms/Python/issues/new) or ask the community in [Gitter](https://gitter.im/TheAlgorithms). ## Contributing ### Contributor We are very happy that you are considering implementing algorithms and data structures for others! This repository is referenced and used by learners from all over the globe. Being one of our contributors, you agree and confirm that: - You did your work - no plagiarism allowed - Any plagiarized work will not be merged. - Your work will be distributed under [MIT License](LICENSE.md) once your pull request is merged - Your submitted work fulfils or mostly fulfils our styles and standards __New implementation__ is welcome! For example, new solutions for a problem, different representations for a graph data structure or algorithm designs with different complexity but __identical implementation__ of an existing implementation is not allowed. Please check whether the solution is already implemented or not before submitting your pull request. __Improving comments__ and __writing proper tests__ are also highly welcome. ### Contribution We appreciate any contribution, from fixing a grammar mistake in a comment to implementing complex algorithms. Please read this section if you are contributing your work. Your contribution will be tested by our [automated testing on GitHub Actions](https://github.com/TheAlgorithms/Python/actions) to save time and mental energy. After you have submitted your pull request, you should see the GitHub Actions tests start to run at the bottom of your submission page. If those tests fail, then click on the ___details___ button try to read through the GitHub Actions output to understand the failure. If you do not understand, please leave a comment on your submission page and a community member will try to help. Please help us keep our issue list small by adding fixes: #{$ISSUE_NO} to the commit message of pull requests that resolve open issues. GitHub will use this tag to auto-close the issue when the PR is merged. #### What is an Algorithm? An Algorithm is one or more functions (or classes) that: * take one or more inputs, * perform some internal calculations or data manipulations, * return one or more outputs, * have minimal side effects (Ex. `print()`, `plot()`, `read()`, `write()`). Algorithms should be packaged in a way that would make it easy for readers to put them into larger programs. Algorithms should: * have intuitive class and function names that make their purpose clear to readers * use Python naming conventions and intuitive variable names to ease comprehension * be flexible to take different input values * have Python type hints for their input parameters and return values * raise Python exceptions (`ValueError`, etc.) on erroneous input values * have docstrings with clear explanations and/or URLs to source materials * contain doctests that test both valid and erroneous input values * return all calculation results instead of printing or plotting them Algorithms in this repo should not be how-to examples for existing Python packages. Instead, they should perform internal calculations or manipulations to convert input values into different output values. 
Those calculations or manipulations can use data types, classes, or functions of existing Python packages but each algorithm in this repo should add unique value. #### Pre-commit plugin Use [pre-commit](https://pre-commit.com/#installation) to automatically format your code to match our coding style: ```bash python3 -m pip install pre-commit # only required the first time pre-commit install ``` That's it! The plugin will run every time you commit any changes. If there are any errors found during the run, fix them and commit those changes. You can even run the plugin manually on all files: ```bash pre-commit run --all-files --show-diff-on-failure ``` #### Coding Style We want your work to be readable by others; therefore, we encourage you to note the following: - Please write in Python 3.11+. For instance: `print()` is a function in Python 3 so `print "Hello"` will *not* work but `print("Hello")` will. - Please focus hard on the naming of functions, classes, and variables. Help your reader by using __descriptive names__ that can help you to remove redundant comments. - Single letter variable names are *old school* so please avoid them unless their life only spans a few lines. - Expand acronyms because `gcd()` is hard to understand but `greatest_common_divisor()` is not. - Please follow the [Python Naming Conventions](https://pep8.org/#prescriptive-naming-conventions) so variable_names and function_names should be lower_case, CONSTANTS in UPPERCASE, ClassNames should be CamelCase, etc. - We encourage the use of Python [f-strings](https://realpython.com/python-f-strings/#f-strings-a-new-and-improved-way-to-format-strings-in-python) where they make the code easier to read. - Please consider running [__psf/black__](https://github.com/python/black) on your Python file(s) before submitting your pull request. This is not yet a requirement but it does make your code more readable and automatically aligns it with much of [PEP 8](https://www.python.org/dev/peps/pep-0008/). There are other code formatters (autopep8, yapf) but the __black__ formatter is now hosted by the Python Software Foundation. To use it, ```bash python3 -m pip install black # only required the first time black . ``` - All submissions will need to pass the test `flake8 . --ignore=E203,W503 --max-line-length=88` before they will be accepted so if possible, try this test locally on your Python file(s) before submitting your pull request. ```bash python3 -m pip install flake8 # only required the first time flake8 . --ignore=E203,W503 --max-line-length=88 --show-source ``` - Original code submission require docstrings or comments to describe your work. - More on docstrings and comments: If you used a Wikipedia article or some other source material to create your algorithm, please add the URL in a docstring or comment to help your reader. The following are considered to be bad and may be requested to be improved: ```python x = x + 2 # increased by 2 ``` This is too trivial. Comments are expected to be explanatory. For comments, you can write them above, on or below a line of code, as long as you are consistent within the same piece of code. We encourage you to put docstrings inside your functions but please pay attention to the indentation of docstrings. The following is a good example: ```python def sum_ab(a, b): """ Return the sum of two integers a and b. """ return a + b ``` - Write tests (especially [__doctests__](https://docs.python.org/3/library/doctest.html)) to illustrate and verify your work. 
We highly encourage the use of _doctests on all functions_. ```python def sum_ab(a, b): """ Return the sum of two integers a and b >>> sum_ab(2, 2) 4 >>> sum_ab(-2, 3) 1 >>> sum_ab(4.9, 5.1) 10.0 """ return a + b ``` These doctests will be run by pytest as part of our automated testing so please try to run your doctests locally and make sure that they are found and pass: ```bash python3 -m doctest -v my_submission.py ``` The use of the Python builtin `input()` function is __not__ encouraged: ```python input('Enter your input:') # Or even worse... input = eval(input("Enter your input: ")) ``` However, if your code uses `input()` then we encourage you to gracefully deal with leading and trailing whitespace in user input by adding `.strip()` as in: ```python starting_value = int(input("Please enter a starting value: ").strip()) ``` The use of [Python type hints](https://docs.python.org/3/library/typing.html) is encouraged for function parameters and return values. Our automated testing will run [mypy](http://mypy-lang.org) so run that locally before making your submission. ```python def sum_ab(a: int, b: int) -> int: return a + b ``` Instructions on how to install mypy can be found [here](https://github.com/python/mypy). Please use the command `mypy --ignore-missing-imports .` to test all files or `mypy --ignore-missing-imports path/to/file.py` to test a specific file. - [__List comprehensions and generators__](https://docs.python.org/3/tutorial/datastructures.html#list-comprehensions) are preferred over the use of `lambda`, `map`, `filter`, `reduce` but the important thing is to demonstrate the power of Python in code that is easy to read and maintain. - Avoid importing external libraries for basic algorithms. Only use those libraries for complicated algorithms. - If you need a third-party module that is not in the file __requirements.txt__, please add it to that file as part of your submission. #### Other Requirements for Submissions - If you are submitting code in the `project_euler/` directory, please also read [the dedicated Guideline](https://github.com/TheAlgorithms/Python/blob/master/project_euler/README.md) before contributing to our Project Euler library. - The file extension for code files should be `.py`. Jupyter Notebooks should be submitted to [TheAlgorithms/Jupyter](https://github.com/TheAlgorithms/Jupyter). - Strictly use snake_case (underscore_separated) in your file_name, as it will be easy to parse in future using scripts. - Please avoid creating new directories if at all possible. Try to fit your work into the existing directory structure. - If possible, follow the standard *within* the folder you are submitting to. - If you have modified/added code work, make sure the code compiles before submitting. - If you have modified/added documentation work, ensure your language is concise and contains no grammar errors. - Do not update the README.md or DIRECTORY.md file which will be periodically autogenerated by our GitHub Actions processes. - Add a corresponding explanation to [Algorithms-Explanation](https://github.com/TheAlgorithms/Algorithms-Explanation) (Optional but recommended). - All submissions will be tested with [__mypy__](http://www.mypy-lang.org) so we encourage you to add [__Python type hints__](https://docs.python.org/3/library/typing.html) where it makes sense to do so. 
- Most importantly, - __Be consistent in the use of these guidelines when submitting.__ - __Join__ us on [Discord](https://discord.com/invite/c7MnfGFGa6) and [Gitter](https://gitter.im/TheAlgorithms) __now!__ - Happy coding! Writer [@poyea](https://github.com/poyea), Jun 2019.
# Contributing guidelines ## Before contributing Welcome to [TheAlgorithms/Python](https://github.com/TheAlgorithms/Python)! Before sending your pull requests, make sure that you __read the whole guidelines__. If you have any doubt on the contributing guide, please feel free to [state it clearly in an issue](https://github.com/TheAlgorithms/Python/issues/new) or ask the community in [Gitter](https://gitter.im/TheAlgorithms). ## Contributing ### Contributor We are very happy that you are considering implementing algorithms and data structures for others! This repository is referenced and used by learners from all over the globe. Being one of our contributors, you agree and confirm that: - You did your work - no plagiarism allowed - Any plagiarized work will not be merged. - Your work will be distributed under [MIT License](LICENSE.md) once your pull request is merged - Your submitted work fulfils or mostly fulfils our styles and standards __New implementation__ is welcome! For example, new solutions for a problem, different representations for a graph data structure or algorithm designs with different complexity but __identical implementation__ of an existing implementation is not allowed. Please check whether the solution is already implemented or not before submitting your pull request. __Improving comments__ and __writing proper tests__ are also highly welcome. ### Contribution We appreciate any contribution, from fixing a grammar mistake in a comment to implementing complex algorithms. Please read this section if you are contributing your work. Your contribution will be tested by our [automated testing on GitHub Actions](https://github.com/TheAlgorithms/Python/actions) to save time and mental energy. After you have submitted your pull request, you should see the GitHub Actions tests start to run at the bottom of your submission page. If those tests fail, then click on the ___details___ button try to read through the GitHub Actions output to understand the failure. If you do not understand, please leave a comment on your submission page and a community member will try to help. Please help us keep our issue list small by adding fixes: #{$ISSUE_NO} to the commit message of pull requests that resolve open issues. GitHub will use this tag to auto-close the issue when the PR is merged. #### What is an Algorithm? An Algorithm is one or more functions (or classes) that: * take one or more inputs, * perform some internal calculations or data manipulations, * return one or more outputs, * have minimal side effects (Ex. `print()`, `plot()`, `read()`, `write()`). Algorithms should be packaged in a way that would make it easy for readers to put them into larger programs. Algorithms should: * have intuitive class and function names that make their purpose clear to readers * use Python naming conventions and intuitive variable names to ease comprehension * be flexible to take different input values * have Python type hints for their input parameters and return values * raise Python exceptions (`ValueError`, etc.) on erroneous input values * have docstrings with clear explanations and/or URLs to source materials * contain doctests that test both valid and erroneous input values * return all calculation results instead of printing or plotting them Algorithms in this repo should not be how-to examples for existing Python packages. Instead, they should perform internal calculations or manipulations to convert input values into different output values. 
Those calculations or manipulations can use data types, classes, or functions of existing Python packages but each algorithm in this repo should add unique value. #### Pre-commit plugin Use [pre-commit](https://pre-commit.com/#installation) to automatically format your code to match our coding style: ```bash python3 -m pip install pre-commit # only required the first time pre-commit install ``` That's it! The plugin will run every time you commit any changes. If there are any errors found during the run, fix them and commit those changes. You can even run the plugin manually on all files: ```bash pre-commit run --all-files --show-diff-on-failure ``` #### Coding Style We want your work to be readable by others; therefore, we encourage you to note the following: - Please write in Python 3.11+. For instance: `print()` is a function in Python 3 so `print "Hello"` will *not* work but `print("Hello")` will. - Please focus hard on the naming of functions, classes, and variables. Help your reader by using __descriptive names__ that can help you to remove redundant comments. - Single letter variable names are *old school* so please avoid them unless their life only spans a few lines. - Expand acronyms because `gcd()` is hard to understand but `greatest_common_divisor()` is not. - Please follow the [Python Naming Conventions](https://pep8.org/#prescriptive-naming-conventions) so variable_names and function_names should be lower_case, CONSTANTS in UPPERCASE, ClassNames should be CamelCase, etc. - We encourage the use of Python [f-strings](https://realpython.com/python-f-strings/#f-strings-a-new-and-improved-way-to-format-strings-in-python) where they make the code easier to read. - Please consider running [__psf/black__](https://github.com/python/black) on your Python file(s) before submitting your pull request. This is not yet a requirement but it does make your code more readable and automatically aligns it with much of [PEP 8](https://www.python.org/dev/peps/pep-0008/). There are other code formatters (autopep8, yapf) but the __black__ formatter is now hosted by the Python Software Foundation. To use it, ```bash python3 -m pip install black # only required the first time black . ``` - All submissions will need to pass the test `flake8 . --ignore=E203,W503 --max-line-length=88` before they will be accepted so if possible, try this test locally on your Python file(s) before submitting your pull request. ```bash python3 -m pip install flake8 # only required the first time flake8 . --ignore=E203,W503 --max-line-length=88 --show-source ``` - Original code submission require docstrings or comments to describe your work. - More on docstrings and comments: If you used a Wikipedia article or some other source material to create your algorithm, please add the URL in a docstring or comment to help your reader. The following are considered to be bad and may be requested to be improved: ```python x = x + 2 # increased by 2 ``` This is too trivial. Comments are expected to be explanatory. For comments, you can write them above, on or below a line of code, as long as you are consistent within the same piece of code. We encourage you to put docstrings inside your functions but please pay attention to the indentation of docstrings. The following is a good example: ```python def sum_ab(a, b): """ Return the sum of two integers a and b. """ return a + b ``` - Write tests (especially [__doctests__](https://docs.python.org/3/library/doctest.html)) to illustrate and verify your work. 
We highly encourage the use of _doctests on all functions_. ```python def sum_ab(a, b): """ Return the sum of two integers a and b >>> sum_ab(2, 2) 4 >>> sum_ab(-2, 3) 1 >>> sum_ab(4.9, 5.1) 10.0 """ return a + b ``` These doctests will be run by pytest as part of our automated testing so please try to run your doctests locally and make sure that they are found and pass: ```bash python3 -m doctest -v my_submission.py ``` The use of the Python builtin `input()` function is __not__ encouraged: ```python input('Enter your input:') # Or even worse... input = eval(input("Enter your input: ")) ``` However, if your code uses `input()` then we encourage you to gracefully deal with leading and trailing whitespace in user input by adding `.strip()` as in: ```python starting_value = int(input("Please enter a starting value: ").strip()) ``` The use of [Python type hints](https://docs.python.org/3/library/typing.html) is encouraged for function parameters and return values. Our automated testing will run [mypy](http://mypy-lang.org) so run that locally before making your submission. ```python def sum_ab(a: int, b: int) -> int: return a + b ``` Instructions on how to install mypy can be found [here](https://github.com/python/mypy). Please use the command `mypy --ignore-missing-imports .` to test all files or `mypy --ignore-missing-imports path/to/file.py` to test a specific file. - [__List comprehensions and generators__](https://docs.python.org/3/tutorial/datastructures.html#list-comprehensions) are preferred over the use of `lambda`, `map`, `filter`, `reduce` but the important thing is to demonstrate the power of Python in code that is easy to read and maintain. - Avoid importing external libraries for basic algorithms. Only use those libraries for complicated algorithms. - If you need a third-party module that is not in the file __requirements.txt__, please add it to that file as part of your submission. #### Other Requirements for Submissions - If you are submitting code in the `project_euler/` directory, please also read [the dedicated Guideline](https://github.com/TheAlgorithms/Python/blob/master/project_euler/README.md) before contributing to our Project Euler library. - The file extension for code files should be `.py`. Jupyter Notebooks should be submitted to [TheAlgorithms/Jupyter](https://github.com/TheAlgorithms/Jupyter). - Strictly use snake_case (underscore_separated) in your file_name, as it will be easy to parse in future using scripts. - Please avoid creating new directories if at all possible. Try to fit your work into the existing directory structure. - If possible, follow the standard *within* the folder you are submitting to. - If you have modified/added code work, make sure the code compiles before submitting. - If you have modified/added documentation work, ensure your language is concise and contains no grammar errors. - Do not update the README.md or DIRECTORY.md file which will be periodically autogenerated by our GitHub Actions processes. - Add a corresponding explanation to [Algorithms-Explanation](https://github.com/TheAlgorithms/Algorithms-Explanation) (Optional but recommended). - All submissions will be tested with [__mypy__](http://www.mypy-lang.org) so we encourage you to add [__Python type hints__](https://docs.python.org/3/library/typing.html) where it makes sense to do so. 
- Most importantly,
  - __Be consistent in the use of these guidelines when submitting.__
  - __Join__ us on [Discord](https://discord.com/invite/c7MnfGFGa6) and [Gitter](https://gitter.im/TheAlgorithms) __now!__
  - Happy coding!

Writer [@poyea](https://github.com/poyea), Jun 2019.
""" Implementation of an auto-balanced binary tree! For doctests run following command: python3 -m doctest -v avl_tree.py For testing run: python avl_tree.py """ from __future__ import annotations import math import random from typing import Any class MyQueue: def __init__(self) -> None: self.data: list[Any] = [] self.head: int = 0 self.tail: int = 0 def is_empty(self) -> bool: return self.head == self.tail def push(self, data: Any) -> None: self.data.append(data) self.tail = self.tail + 1 def pop(self) -> Any: ret = self.data[self.head] self.head = self.head + 1 return ret def count(self) -> int: return self.tail - self.head def print_queue(self) -> None: print(self.data) print("**************") print(self.data[self.head : self.tail]) class MyNode: def __init__(self, data: Any) -> None: self.data = data self.left: MyNode | None = None self.right: MyNode | None = None self.height: int = 1 def get_data(self) -> Any: return self.data def get_left(self) -> MyNode | None: return self.left def get_right(self) -> MyNode | None: return self.right def get_height(self) -> int: return self.height def set_data(self, data: Any) -> None: self.data = data return def set_left(self, node: MyNode | None) -> None: self.left = node return def set_right(self, node: MyNode | None) -> None: self.right = node return def set_height(self, height: int) -> None: self.height = height return def get_height(node: MyNode | None) -> int: if node is None: return 0 return node.get_height() def my_max(a: int, b: int) -> int: if a > b: return a return b def right_rotation(node: MyNode) -> MyNode: r""" A B / \ / \ B C Bl A / \ --> / / \ Bl Br UB Br C / UB UB = unbalanced node """ print("left rotation node:", node.get_data()) ret = node.get_left() assert ret is not None node.set_left(ret.get_right()) ret.set_right(node) h1 = my_max(get_height(node.get_right()), get_height(node.get_left())) + 1 node.set_height(h1) h2 = my_max(get_height(ret.get_right()), get_height(ret.get_left())) + 1 ret.set_height(h2) return ret def left_rotation(node: MyNode) -> MyNode: """ a mirror symmetry rotation of the left_rotation """ print("right rotation node:", node.get_data()) ret = node.get_right() assert ret is not None node.set_right(ret.get_left()) ret.set_left(node) h1 = my_max(get_height(node.get_right()), get_height(node.get_left())) + 1 node.set_height(h1) h2 = my_max(get_height(ret.get_right()), get_height(ret.get_left())) + 1 ret.set_height(h2) return ret def lr_rotation(node: MyNode) -> MyNode: r""" A A Br / \ / \ / \ B C LR Br C RR B A / \ --> / \ --> / / \ Bl Br B UB Bl UB C \ / UB Bl RR = right_rotation LR = left_rotation """ left_child = node.get_left() assert left_child is not None node.set_left(left_rotation(left_child)) return right_rotation(node) def rl_rotation(node: MyNode) -> MyNode: right_child = node.get_right() assert right_child is not None node.set_right(right_rotation(right_child)) return left_rotation(node) def insert_node(node: MyNode | None, data: Any) -> MyNode | None: if node is None: return MyNode(data) if data < node.get_data(): node.set_left(insert_node(node.get_left(), data)) if ( get_height(node.get_left()) - get_height(node.get_right()) == 2 ): # an unbalance detected left_child = node.get_left() assert left_child is not None if ( data < left_child.get_data() ): # new node is the left child of the left child node = right_rotation(node) else: node = lr_rotation(node) else: node.set_right(insert_node(node.get_right(), data)) if get_height(node.get_right()) - get_height(node.get_left()) == 2: right_child = 
node.get_right() assert right_child is not None if data < right_child.get_data(): node = rl_rotation(node) else: node = left_rotation(node) h1 = my_max(get_height(node.get_right()), get_height(node.get_left())) + 1 node.set_height(h1) return node def get_right_most(root: MyNode) -> Any: while True: right_child = root.get_right() if right_child is None: break root = right_child return root.get_data() def get_left_most(root: MyNode) -> Any: while True: left_child = root.get_left() if left_child is None: break root = left_child return root.get_data() def del_node(root: MyNode, data: Any) -> MyNode | None: left_child = root.get_left() right_child = root.get_right() if root.get_data() == data: if left_child is not None and right_child is not None: temp_data = get_left_most(right_child) root.set_data(temp_data) root.set_right(del_node(right_child, temp_data)) elif left_child is not None: root = left_child elif right_child is not None: root = right_child else: return None elif root.get_data() > data: if left_child is None: print("No such data") return root else: root.set_left(del_node(left_child, data)) else: # root.get_data() < data if right_child is None: return root else: root.set_right(del_node(right_child, data)) if get_height(right_child) - get_height(left_child) == 2: assert right_child is not None if get_height(right_child.get_right()) > get_height(right_child.get_left()): root = left_rotation(root) else: root = rl_rotation(root) elif get_height(right_child) - get_height(left_child) == -2: assert left_child is not None if get_height(left_child.get_left()) > get_height(left_child.get_right()): root = right_rotation(root) else: root = lr_rotation(root) height = my_max(get_height(root.get_right()), get_height(root.get_left())) + 1 root.set_height(height) return root class AVLtree: """ An AVL tree doctest Examples: >>> t = AVLtree() >>> t.insert(4) insert:4 >>> print(str(t).replace(" \\n","\\n")) 4 ************************************* >>> t.insert(2) insert:2 >>> print(str(t).replace(" \\n","\\n").replace(" \\n","\\n")) 4 2 * ************************************* >>> t.insert(3) insert:3 right rotation node: 2 left rotation node: 4 >>> print(str(t).replace(" \\n","\\n").replace(" \\n","\\n")) 3 2 4 ************************************* >>> t.get_height() 2 >>> t.del_node(3) delete:3 >>> print(str(t).replace(" \\n","\\n").replace(" \\n","\\n")) 4 2 * ************************************* """ def __init__(self) -> None: self.root: MyNode | None = None def get_height(self) -> int: return get_height(self.root) def insert(self, data: Any) -> None: print("insert:" + str(data)) self.root = insert_node(self.root, data) def del_node(self, data: Any) -> None: print("delete:" + str(data)) if self.root is None: print("Tree is empty!") return self.root = del_node(self.root, data) def __str__( self, ) -> str: # a level traversale, gives a more intuitive look on the tree output = "" q = MyQueue() q.push(self.root) layer = self.get_height() if layer == 0: return output cnt = 0 while not q.is_empty(): node = q.pop() space = " " * int(math.pow(2, layer - 1)) output += space if node is None: output += "*" q.push(None) q.push(None) else: output += str(node.get_data()) q.push(node.get_left()) q.push(node.get_right()) output += space cnt = cnt + 1 for i in range(100): if cnt == math.pow(2, i) - 1: layer = layer - 1 if layer == 0: output += "\n*************************************" return output output += "\n" break output += "\n*************************************" return output def _test() -> None: import 
doctest doctest.testmod() if __name__ == "__main__": _test() t = AVLtree() lst = list(range(10)) random.shuffle(lst) for i in lst: t.insert(i) print(str(t)) random.shuffle(lst) for i in lst: t.del_node(i) print(str(t))
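A minimal usage sketch for the `AVLtree` class defined above (assuming the file is saved as `avl_tree.py`, as its docstring suggests; the insertion order here is just an example):

```python
from avl_tree import AVLtree

tree = AVLtree()
for value in (4, 2, 3):  # inserting 3 triggers an LR rotation around nodes 2 and 4
    tree.insert(value)   # prints "insert:<value>" plus any rotations performed
print(tree)              # level-order ASCII rendering of the rebalanced tree
print(tree.get_height()) # height after rebalancing -> 2
tree.del_node(3)         # prints "delete:3" and rebalances again if necessary
```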
""" Problem 16: https://projecteuler.net/problem=16 2^15 = 32768 and the sum of its digits is 3 + 2 + 7 + 6 + 8 = 26. What is the sum of the digits of the number 2^1000? """ def solution(power: int = 1000) -> int: """Returns the sum of the digits of the number 2^power. >>> solution(1000) 1366 >>> solution(50) 76 >>> solution(20) 31 >>> solution(15) 26 """ num = 2**power string_num = str(num) list_num = list(string_num) sum_of_num = 0 for i in list_num: sum_of_num += int(i) return sum_of_num if __name__ == "__main__": power = int(input("Enter the power of 2: ").strip()) print("2 ^ ", power, " = ", 2**power) result = solution(power) print("Sum of the digits is: ", result)
""" Problem 16: https://projecteuler.net/problem=16 2^15 = 32768 and the sum of its digits is 3 + 2 + 7 + 6 + 8 = 26. What is the sum of the digits of the number 2^1000? """ def solution(power: int = 1000) -> int: """Returns the sum of the digits of the number 2^power. >>> solution(1000) 1366 >>> solution(50) 76 >>> solution(20) 31 >>> solution(15) 26 """ num = 2**power string_num = str(num) list_num = list(string_num) sum_of_num = 0 for i in list_num: sum_of_num += int(i) return sum_of_num if __name__ == "__main__": power = int(input("Enter the power of 2: ").strip()) print("2 ^ ", power, " = ", 2**power) result = solution(power) print("Sum of the digits is: ", result)
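For comparison, the digit sum above can also be computed with a single generator expression; `solution_compact` is only an illustrative alternative, not the version stored in the repository:

```python
def solution_compact(power: int = 1000) -> int:
    """Returns the sum of the digits of the number 2^power.

    >>> solution_compact(15)
    26
    >>> solution_compact(1000)
    1366
    """
    return sum(int(digit) for digit in str(2**power))
```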
t.d[x&ڮȾdqye$ӱa9en[Oʸ7AjwzEȯ$@,Ȅ m@Oº]Gu4b(~o6VlX5mI4A(]Fqۊ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( KI-dvL5ŲIv\bڄ{.Ybx#{Vs:]j:r46'8aJTD$0ҶSӥoS\UIJWAEUQ@5 u:3@]o _z[ jvp]#Aqq2>G0C ? JoMqp 9<nK'k|Uq Dw|CiK<ojomז0m'2@2#84hxgn'o/W/Oɣ]* Jfm՘c%=~|ßD2mE=7F˻>ԯ1,LIo> o__VW*Եeb!k vwcQzԐxGӯ-/.QW{{dUYYPkߑ@?8>—O л:Qfگ4,_gi;tnHnʜ]usxW7)(:{Me j[I Y+~?_ƒײqw57,To3RHאW)rRC(I5k[/{a܇O"YQJ6䞛x 蚗[A^."˷^6txW7*ŗ4Hm[hWoo+US@|=2ek7TCi8+Eyf8OנH_V|4A4RHbpsXIp2>t4]xo1_]?4N3@v){]ú|dQW%C1ˎ(jUvf'bNwE!T=WqVpFF*(PҦ;ּ9hz.^5_21Hr@$tJ͟@K[<#칔c޺O 4 ۠UX؟W" vMf⮝÷qib?t>nQ{Ya"Aw - K#R6srzVt2M#Z%(ۛ;*( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( Pv pQ@%m:0,77++}_㍻`ge'X:(:xf~;X.O\)oڬ!VXf7+ 4R)n~ra;9Uh7o?6O9o9+~>X:Q |Fv# esگxS3 B {f?q//|l0T0MGm$Ȩ *z(@e1v_p{Mý^0o pBݎX{֖[lv6*V $}9o6:4 f[.!QyX\msm,iᅫtt{|'52MwMty$mLĊ7!L!X*IQ!l5x;vy<m*O\J3Нh&AVw> 'P~O5@fcٵ$v/!ZF=Y֍((?DžR4lں. n q_\ .clpp3]Gw "I#B?QXZ$4]$\JmrVj-NZՒƁL8c^kҴ"JeI<YO5R> bAcf;rI20vUnfxS(9(((((((((((((((((((((((((((((((((((((((((((((sž$\*=~R/F W9o\iW xIELokRq6 O_0ѫgX [0X.w+)Q4g^:XkZmo"WVО]؍LjFÝI Ϩ[a=HLow w pLp_Us }BaVp,1誣%OqW~#Cy&A=&_=3rvA/i~ / &69ߜ"9-;]>L;x#HPEs7TKwRn-HcZFXq]xPҖ(((~\xyZEvWsg0I\%LѱV(FG^p+u]X{6F-Xb03Yx/JдY \\a95tѫ&r/ #\G=nz(ڠW*Š($((((((((((((((((((((((((((((((((((((((((((((((O~񥶻2Z]<'!0G t"o<|YRk s;9CK4`BWm?jyMV>HZ k ;(?0@OO񍦅_<+!Gue9Hu* ԃ:-uFRB YY\y'$LV(((((((((((((((((((((((((((((((((((((((((((((((((dZxė$Nk@e^B܀O€6(=kͼO\\x}VOuJG#pvV7}=J$!"ZQHX<EvV0VQGy xvh'8V_i~(`:QH (?>em3gMsr|KԷiB'猏Q]E*R1ӳEb&m_Plf;e܄+rARЊ+A EPEPEPEPEPEPEPEPEPEPEPEPEPEPEPEPEPEPEPEPEPEPEPEPEPEPEPEPEPEPEPEPEPEPEPEPEPEPEPEPEPEPEPEPEPEPnšw7),;Ƹ%K%uo-sGC˭@&>"*ƫ0 7uحӌ~=092{WWP6~voM.WsGuMMHVr*!`?J467J&(-7c,Oַ8Դ0Fom%FvB\t^>|#l&umGd(ҭèJO<*1^~ENJ:$MO5̶/_1aW$%POT4]*DӭZ‘.y''=jzM+3' NjzXI6Ӭx˂6BFH9R;rjDŽt@xf O`H4c!O\psW4sxGZs_%iМȅ];8㓪|Dѭu(3 ~gҵ51 ;!XUԑ k45G[\A ^MadF (0G5ѕ<:HW]\׌huxPD쪌 Jv80J3wii&fmLa(vH|_A]>^jCT+'Ku}?QmB<[`Fb '_\<䘮F~eEU#\ 7B|̹)#+ zf591 rK^ r5 ֆ)!':V&Kx/E-g?kU\%~2rע+0 ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( ( (9GM(iz#]A= } KCYkZ[%R_B-pnchj|Iw5֨iPtO칿8ta%֨6vr:W.8ÿ -7} 3;5 &MOZ[[HFz$ oD<_ZOlI%ʬNc+H4$ɚ|ѱT1]qnC#2 OYfy$x$TpJB;P+⋅Ś ۛvC/Ѐ>b5c,qeA?4nEejH~#LkE̍ R^&RylI57UNnx74A=HR ǀ?sZi^֚q Mg,[Ъ;{P6bcE>yTrۘ$]U#56_<9 y侷`n'T<䎛 ׳G"̊e="MMGSu c]~6VZX%ݤwm~89 X_JkKx,᷁6E G_}K-zSaOX^(XQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQER)ku$I8h]bQ6`90 zA˘H+}]kgd(Nq!wfKxK_5vLԭ+fG;VI$hqykw.g;@# Hn4L9-%XC2u_Di#/|Uٴȴ!c$t][8njkjӄ6n!xԵ0\F֓>춥GZ <Ig`An_izmk [oXI%j% ͦKcb~b L< ;[#Vs]]vJA %jIZ|1I=,2̸FV7't\bIj&rK8`֓f7F9|A>hbI wS(8ywdWhSdnV9$m)T7 ,æVkH0Ǔ<1&EQpOq+c6Uv v+__+<r1'$UdzIvR]~^xwUy4ֲDUNҳzsY=;[mByfg Å!D8<28v_ T^^:yQ|Vn$@ Fz.a{^ SuK[+m"\ͧIWI8A КxZ\L̑ Oc>h~bi9|F:_R; B2|yQ"s m=.9-M}$]D8bJ t)VU$O`s_%m_FU9Wg]%2)mX=xOQGXCYgtkor+g g5Y}s@׶lmn돘{2q¥>ww%K=ļ- ,xJ>zhY9R|ˆ# ZZgҴotؠHԄuS '*d\&CxV?zC޴u h̷:}fkx3av Btړ1-4³BΕmk,nL̾swu? 
_꺦 Kr+x$߼[aÐ8‹ h̷6{Q gO$9V("嶸PB!I' 8V48Elz SaMr;̖<`oEPH9VuQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEsKEfúO0m"*p?:~]y ݫf94=X՘0^vI)$:M6cLػ8ZI2=kו|;k7&ݮpnGkWCyj$Ub^O>5X 圆Cr:c*o&=4щ+PQTc ]u/j0dagc ˑAY`\d"JG6-ZTiQMуHX"d = n5&o o򷕸 \sF:VQo5E"y'pGZ(+ .40xz] 䙞 9,8*4 Xt`3:|.Ӽ_]ڝMz qjZpw*F dp󎵏OKE;zBAj#%ʕ9$u|3ɡok-Is4xy'&2O@< qc:J7W%X*.y@GpSsF;|1a̖wPW*fr\C|];ͼ1捯~C(N=ր3wp5lV 'BO*DEjHčO%bmn|ѓ+ҼZ|S O CM:7Ii"0o3h>\$>hc2x2m^&ۋH;w (-%ՋW!Ef/l?:fQ^kOk[XxiJZ@YԼ2͕ן uZFdna#zV<QܓDv/oo&n@<+S?ﯭzbF66waq(U%9I#m2ء%==}1~>E(((((((( xMqj(#i<S|;WSu$j֩ťi/,3̋M9f %w?Rx385.m[C6Vi(2ˎAƀ:" |j#ᫍkNvkS:1ZC{k{$)9 }+(((((((((((((((((((((((((((((((((((Aid#$+oX!qѻ05Eyp|2\xz$$+#|6&6LU-kNSn"GKߘq*9ù2}Ɲ0Z$Qidhmʷ b:M:*\\X;J|GpIEQg*(PH.ƨj+Vx)PIUԎ=A2%lm96װ4Ieq@hh_ 4+TTkDPyX׾2ŏvRWCuoM ȥ]d0#>ة_ &{ &HJ7@$V Y!Tq.ҦI2GLvƱFrN RG8khth{[hV"AeQGn'ܧ$k c[^,`J37*D+sWQOum` %("F5U(((((((((4=?Ě<^Y\ ē[ǡ(AǶqMm/t,cu)#e 2nO H=ZTP; *NyVi3O+L 'fQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQEQE
-1
TheAlgorithms/Python
7,867
Update docs
### Describe your change: - Update docs to Python 3.10+ to match Py upgrade settings in pre-commit (`--py310-plus`) - Remove Exclusion from pre-commit-setting, this file have been fixed in #7844 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
Cjkjvfnby
"2022-10-30T08:30:31Z"
"2022-10-30T09:01:59Z"
17d93cab783095dd1def3c382866cd94296db455
57ccabbaeb0f32165271e3a218bc9c6dcfc21823
Update docs. ### Describe your change: - Update docs to Python 3.10+ to match Py upgrade settings in pre-commit (`--py310-plus`) - Remove Exclusion from pre-commit-setting, this file have been fixed in #7844 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
repos: - repo: https://github.com/pre-commit/pre-commit-hooks rev: v4.3.0 hooks: - id: check-executables-have-shebangs - id: check-yaml - id: end-of-file-fixer types: [python] - id: trailing-whitespace exclude: | (?x)^( data_structures/heap/binomial_heap.py )$ - id: requirements-txt-fixer - repo: https://github.com/MarcoGorelli/auto-walrus rev: v0.2.1 hooks: - id: auto-walrus - repo: https://github.com/psf/black rev: 22.10.0 hooks: - id: black - repo: https://github.com/PyCQA/isort rev: 5.10.1 hooks: - id: isort args: - --profile=black - repo: https://github.com/asottile/pyupgrade rev: v3.1.0 hooks: - id: pyupgrade args: - --py310-plus - repo: https://github.com/PyCQA/flake8 rev: 5.0.4 hooks: - id: flake8 # See .flake8 for args additional_dependencies: &flake8-plugins - flake8-bugbear - flake8-builtins - flake8-broken-line - flake8-comprehensions - pep8-naming - repo: https://github.com/asottile/yesqa rev: v1.4.0 hooks: - id: yesqa additional_dependencies: *flake8-plugins - repo: https://github.com/pre-commit/mirrors-mypy rev: v0.982 hooks: - id: mypy args: - --ignore-missing-imports - --install-types # See mirrors-mypy README.md - --non-interactive additional_dependencies: [types-requests] - repo: https://github.com/codespell-project/codespell rev: v2.2.2 hooks: - id: codespell args: - --ignore-words-list=ans,crate,damon,fo,followings,hist,iff,mater,secant,som,sur,tim,zar exclude: | (?x)^( ciphers/prehistoric_men.txt | strings/dictionary.txt | strings/words.txt | project_euler/problem_022/p022_names.txt )$ - repo: local hooks: - id: validate-filenames name: Validate filenames entry: ./scripts/validate_filenames.py language: script pass_filenames: false
repos: - repo: https://github.com/pre-commit/pre-commit-hooks rev: v4.3.0 hooks: - id: check-executables-have-shebangs - id: check-yaml - id: end-of-file-fixer types: [python] - id: trailing-whitespace - id: requirements-txt-fixer - repo: https://github.com/MarcoGorelli/auto-walrus rev: v0.2.1 hooks: - id: auto-walrus - repo: https://github.com/psf/black rev: 22.10.0 hooks: - id: black - repo: https://github.com/PyCQA/isort rev: 5.10.1 hooks: - id: isort args: - --profile=black - repo: https://github.com/asottile/pyupgrade rev: v3.1.0 hooks: - id: pyupgrade args: - --py310-plus - repo: https://github.com/PyCQA/flake8 rev: 5.0.4 hooks: - id: flake8 # See .flake8 for args additional_dependencies: &flake8-plugins - flake8-bugbear - flake8-builtins - flake8-broken-line - flake8-comprehensions - pep8-naming - repo: https://github.com/asottile/yesqa rev: v1.4.0 hooks: - id: yesqa additional_dependencies: *flake8-plugins - repo: https://github.com/pre-commit/mirrors-mypy rev: v0.982 hooks: - id: mypy args: - --ignore-missing-imports - --install-types # See mirrors-mypy README.md - --non-interactive additional_dependencies: [types-requests] - repo: https://github.com/codespell-project/codespell rev: v2.2.2 hooks: - id: codespell args: - --ignore-words-list=ans,crate,damon,fo,followings,hist,iff,mater,secant,som,sur,tim,zar exclude: | (?x)^( ciphers/prehistoric_men.txt | strings/dictionary.txt | strings/words.txt | project_euler/problem_022/p022_names.txt )$ - repo: local hooks: - id: validate-filenames name: Validate filenames entry: ./scripts/validate_filenames.py language: script pass_filenames: false
1
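The pre-commit configuration in the record above pins pyupgrade to `--py310-plus`, which is what motivates the docs change in this PR (Python 3.9+ becomes 3.10+). As a minimal sketch of the kind of rewrite that flag targets (PEP 604 union syntax), the snippet below contrasts the pre-3.10 and 3.10+ spellings of the same annotation; `parse_port` is a hypothetical helper invented for illustration, not a function from the repository.

```python
from typing import Optional  # only needed for the pre-3.10 spelling


def parse_port_legacy(value: str) -> Optional[int]:
    """Pre-3.10 spelling using typing.Optional."""
    return int(value) if value.isdigit() else None


def parse_port(value: str) -> int | None:
    """PEP 604 spelling (Python 3.10+) of the same annotation."""
    return int(value) if value.isdigit() else None


if __name__ == "__main__":
    assert parse_port("8080") == parse_port_legacy("8080") == 8080
    assert parse_port("not-a-port") is None
```

Both spellings type-check the same way; the 3.10+ form simply drops the `typing` import, which is why bumping the documented minimum version keeps the docs consistent with what pyupgrade produces.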
TheAlgorithms/Python
7,867
Update docs
### Describe your change: - Update docs to Python 3.10+ to match Py upgrade settings in pre-commit (`--py310-plus`) - Remove Exclusion from pre-commit-setting, this file have been fixed in #7844 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
Cjkjvfnby
"2022-10-30T08:30:31Z"
"2022-10-30T09:01:59Z"
17d93cab783095dd1def3c382866cd94296db455
57ccabbaeb0f32165271e3a218bc9c6dcfc21823
Update docs. ### Describe your change: - Update docs to Python 3.10+ to match Py upgrade settings in pre-commit (`--py310-plus`) - Remove Exclusion from pre-commit-setting, this file have been fixed in #7844 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
# Contributing guidelines ## Before contributing Welcome to [TheAlgorithms/Python](https://github.com/TheAlgorithms/Python)! Before sending your pull requests, make sure that you __read the whole guidelines__. If you have any doubt on the contributing guide, please feel free to [state it clearly in an issue](https://github.com/TheAlgorithms/Python/issues/new) or ask the community in [Gitter](https://gitter.im/TheAlgorithms). ## Contributing ### Contributor We are very happy that you are considering implementing algorithms and data structures for others! This repository is referenced and used by learners from all over the globe. Being one of our contributors, you agree and confirm that: - You did your work - no plagiarism allowed - Any plagiarized work will not be merged. - Your work will be distributed under [MIT License](LICENSE.md) once your pull request is merged - Your submitted work fulfils or mostly fulfils our styles and standards __New implementation__ is welcome! For example, new solutions for a problem, different representations for a graph data structure or algorithm designs with different complexity but __identical implementation__ of an existing implementation is not allowed. Please check whether the solution is already implemented or not before submitting your pull request. __Improving comments__ and __writing proper tests__ are also highly welcome. ### Contribution We appreciate any contribution, from fixing a grammar mistake in a comment to implementing complex algorithms. Please read this section if you are contributing your work. Your contribution will be tested by our [automated testing on GitHub Actions](https://github.com/TheAlgorithms/Python/actions) to save time and mental energy. After you have submitted your pull request, you should see the GitHub Actions tests start to run at the bottom of your submission page. If those tests fail, then click on the ___details___ button try to read through the GitHub Actions output to understand the failure. If you do not understand, please leave a comment on your submission page and a community member will try to help. Please help us keep our issue list small by adding fixes: #{$ISSUE_NO} to the commit message of pull requests that resolve open issues. GitHub will use this tag to auto-close the issue when the PR is merged. #### What is an Algorithm? An Algorithm is one or more functions (or classes) that: * take one or more inputs, * perform some internal calculations or data manipulations, * return one or more outputs, * have minimal side effects (Ex. `print()`, `plot()`, `read()`, `write()`). Algorithms should be packaged in a way that would make it easy for readers to put them into larger programs. Algorithms should: * have intuitive class and function names that make their purpose clear to readers * use Python naming conventions and intuitive variable names to ease comprehension * be flexible to take different input values * have Python type hints for their input parameters and return values * raise Python exceptions (`ValueError`, etc.) on erroneous input values * have docstrings with clear explanations and/or URLs to source materials * contain doctests that test both valid and erroneous input values * return all calculation results instead of printing or plotting them Algorithms in this repo should not be how-to examples for existing Python packages. Instead, they should perform internal calculations or manipulations to convert input values into different output values. 
Those calculations or manipulations can use data types, classes, or functions of existing Python packages but each algorithm in this repo should add unique value. #### Pre-commit plugin Use [pre-commit](https://pre-commit.com/#installation) to automatically format your code to match our coding style: ```bash python3 -m pip install pre-commit # only required the first time pre-commit install ``` That's it! The plugin will run every time you commit any changes. If there are any errors found during the run, fix them and commit those changes. You can even run the plugin manually on all files: ```bash pre-commit run --all-files --show-diff-on-failure ``` #### Coding Style We want your work to be readable by others; therefore, we encourage you to note the following: - Please write in Python 3.9+. For instance: `print()` is a function in Python 3 so `print "Hello"` will *not* work but `print("Hello")` will. - Please focus hard on the naming of functions, classes, and variables. Help your reader by using __descriptive names__ that can help you to remove redundant comments. - Single letter variable names are *old school* so please avoid them unless their life only spans a few lines. - Expand acronyms because `gcd()` is hard to understand but `greatest_common_divisor()` is not. - Please follow the [Python Naming Conventions](https://pep8.org/#prescriptive-naming-conventions) so variable_names and function_names should be lower_case, CONSTANTS in UPPERCASE, ClassNames should be CamelCase, etc. - We encourage the use of Python [f-strings](https://realpython.com/python-f-strings/#f-strings-a-new-and-improved-way-to-format-strings-in-python) where they make the code easier to read. - Please consider running [__psf/black__](https://github.com/python/black) on your Python file(s) before submitting your pull request. This is not yet a requirement but it does make your code more readable and automatically aligns it with much of [PEP 8](https://www.python.org/dev/peps/pep-0008/). There are other code formatters (autopep8, yapf) but the __black__ formatter is now hosted by the Python Software Foundation. To use it, ```bash python3 -m pip install black # only required the first time black . ``` - All submissions will need to pass the test `flake8 . --ignore=E203,W503 --max-line-length=88` before they will be accepted so if possible, try this test locally on your Python file(s) before submitting your pull request. ```bash python3 -m pip install flake8 # only required the first time flake8 . --ignore=E203,W503 --max-line-length=88 --show-source ``` - Original code submission require docstrings or comments to describe your work. - More on docstrings and comments: If you used a Wikipedia article or some other source material to create your algorithm, please add the URL in a docstring or comment to help your reader. The following are considered to be bad and may be requested to be improved: ```python x = x + 2 # increased by 2 ``` This is too trivial. Comments are expected to be explanatory. For comments, you can write them above, on or below a line of code, as long as you are consistent within the same piece of code. We encourage you to put docstrings inside your functions but please pay attention to the indentation of docstrings. The following is a good example: ```python def sum_ab(a, b): """ Return the sum of two integers a and b. """ return a + b ``` - Write tests (especially [__doctests__](https://docs.python.org/3/library/doctest.html)) to illustrate and verify your work. 
We highly encourage the use of _doctests on all functions_. ```python def sum_ab(a, b): """ Return the sum of two integers a and b >>> sum_ab(2, 2) 4 >>> sum_ab(-2, 3) 1 >>> sum_ab(4.9, 5.1) 10.0 """ return a + b ``` These doctests will be run by pytest as part of our automated testing so please try to run your doctests locally and make sure that they are found and pass: ```bash python3 -m doctest -v my_submission.py ``` The use of the Python builtin `input()` function is __not__ encouraged: ```python input('Enter your input:') # Or even worse... input = eval(input("Enter your input: ")) ``` However, if your code uses `input()` then we encourage you to gracefully deal with leading and trailing whitespace in user input by adding `.strip()` as in: ```python starting_value = int(input("Please enter a starting value: ").strip()) ``` The use of [Python type hints](https://docs.python.org/3/library/typing.html) is encouraged for function parameters and return values. Our automated testing will run [mypy](http://mypy-lang.org) so run that locally before making your submission. ```python def sum_ab(a: int, b: int) -> int: return a + b ``` Instructions on how to install mypy can be found [here](https://github.com/python/mypy). Please use the command `mypy --ignore-missing-imports .` to test all files or `mypy --ignore-missing-imports path/to/file.py` to test a specific file. - [__List comprehensions and generators__](https://docs.python.org/3/tutorial/datastructures.html#list-comprehensions) are preferred over the use of `lambda`, `map`, `filter`, `reduce` but the important thing is to demonstrate the power of Python in code that is easy to read and maintain. - Avoid importing external libraries for basic algorithms. Only use those libraries for complicated algorithms. - If you need a third-party module that is not in the file __requirements.txt__, please add it to that file as part of your submission. #### Other Requirements for Submissions - If you are submitting code in the `project_euler/` directory, please also read [the dedicated Guideline](https://github.com/TheAlgorithms/Python/blob/master/project_euler/README.md) before contributing to our Project Euler library. - The file extension for code files should be `.py`. Jupyter Notebooks should be submitted to [TheAlgorithms/Jupyter](https://github.com/TheAlgorithms/Jupyter). - Strictly use snake_case (underscore_separated) in your file_name, as it will be easy to parse in future using scripts. - Please avoid creating new directories if at all possible. Try to fit your work into the existing directory structure. - If possible, follow the standard *within* the folder you are submitting to. - If you have modified/added code work, make sure the code compiles before submitting. - If you have modified/added documentation work, ensure your language is concise and contains no grammar errors. - Do not update the README.md or DIRECTORY.md file which will be periodically autogenerated by our GitHub Actions processes. - Add a corresponding explanation to [Algorithms-Explanation](https://github.com/TheAlgorithms/Algorithms-Explanation) (Optional but recommended). - All submissions will be tested with [__mypy__](http://www.mypy-lang.org) so we encourage you to add [__Python type hints__](https://docs.python.org/3/library/typing.html) where it makes sense to do so. 
- Most importantly, - __Be consistent in the use of these guidelines when submitting.__ - __Join__ us on [Discord](https://discord.com/invite/c7MnfGFGa6) and [Gitter](https://gitter.im/TheAlgorithms) __now!__ - Happy coding! Writer [@poyea](https://github.com/poyea), Jun 2019.
# Contributing guidelines ## Before contributing Welcome to [TheAlgorithms/Python](https://github.com/TheAlgorithms/Python)! Before sending your pull requests, make sure that you __read the whole guidelines__. If you have any doubt on the contributing guide, please feel free to [state it clearly in an issue](https://github.com/TheAlgorithms/Python/issues/new) or ask the community in [Gitter](https://gitter.im/TheAlgorithms). ## Contributing ### Contributor We are very happy that you are considering implementing algorithms and data structures for others! This repository is referenced and used by learners from all over the globe. Being one of our contributors, you agree and confirm that: - You did your work - no plagiarism allowed - Any plagiarized work will not be merged. - Your work will be distributed under [MIT License](LICENSE.md) once your pull request is merged - Your submitted work fulfils or mostly fulfils our styles and standards __New implementation__ is welcome! For example, new solutions for a problem, different representations for a graph data structure or algorithm designs with different complexity but __identical implementation__ of an existing implementation is not allowed. Please check whether the solution is already implemented or not before submitting your pull request. __Improving comments__ and __writing proper tests__ are also highly welcome. ### Contribution We appreciate any contribution, from fixing a grammar mistake in a comment to implementing complex algorithms. Please read this section if you are contributing your work. Your contribution will be tested by our [automated testing on GitHub Actions](https://github.com/TheAlgorithms/Python/actions) to save time and mental energy. After you have submitted your pull request, you should see the GitHub Actions tests start to run at the bottom of your submission page. If those tests fail, then click on the ___details___ button try to read through the GitHub Actions output to understand the failure. If you do not understand, please leave a comment on your submission page and a community member will try to help. Please help us keep our issue list small by adding fixes: #{$ISSUE_NO} to the commit message of pull requests that resolve open issues. GitHub will use this tag to auto-close the issue when the PR is merged. #### What is an Algorithm? An Algorithm is one or more functions (or classes) that: * take one or more inputs, * perform some internal calculations or data manipulations, * return one or more outputs, * have minimal side effects (Ex. `print()`, `plot()`, `read()`, `write()`). Algorithms should be packaged in a way that would make it easy for readers to put them into larger programs. Algorithms should: * have intuitive class and function names that make their purpose clear to readers * use Python naming conventions and intuitive variable names to ease comprehension * be flexible to take different input values * have Python type hints for their input parameters and return values * raise Python exceptions (`ValueError`, etc.) on erroneous input values * have docstrings with clear explanations and/or URLs to source materials * contain doctests that test both valid and erroneous input values * return all calculation results instead of printing or plotting them Algorithms in this repo should not be how-to examples for existing Python packages. Instead, they should perform internal calculations or manipulations to convert input values into different output values. 
Those calculations or manipulations can use data types, classes, or functions of existing Python packages but each algorithm in this repo should add unique value. #### Pre-commit plugin Use [pre-commit](https://pre-commit.com/#installation) to automatically format your code to match our coding style: ```bash python3 -m pip install pre-commit # only required the first time pre-commit install ``` That's it! The plugin will run every time you commit any changes. If there are any errors found during the run, fix them and commit those changes. You can even run the plugin manually on all files: ```bash pre-commit run --all-files --show-diff-on-failure ``` #### Coding Style We want your work to be readable by others; therefore, we encourage you to note the following: - Please write in Python 3.10+. For instance: `print()` is a function in Python 3 so `print "Hello"` will *not* work but `print("Hello")` will. - Please focus hard on the naming of functions, classes, and variables. Help your reader by using __descriptive names__ that can help you to remove redundant comments. - Single letter variable names are *old school* so please avoid them unless their life only spans a few lines. - Expand acronyms because `gcd()` is hard to understand but `greatest_common_divisor()` is not. - Please follow the [Python Naming Conventions](https://pep8.org/#prescriptive-naming-conventions) so variable_names and function_names should be lower_case, CONSTANTS in UPPERCASE, ClassNames should be CamelCase, etc. - We encourage the use of Python [f-strings](https://realpython.com/python-f-strings/#f-strings-a-new-and-improved-way-to-format-strings-in-python) where they make the code easier to read. - Please consider running [__psf/black__](https://github.com/python/black) on your Python file(s) before submitting your pull request. This is not yet a requirement but it does make your code more readable and automatically aligns it with much of [PEP 8](https://www.python.org/dev/peps/pep-0008/). There are other code formatters (autopep8, yapf) but the __black__ formatter is now hosted by the Python Software Foundation. To use it, ```bash python3 -m pip install black # only required the first time black . ``` - All submissions will need to pass the test `flake8 . --ignore=E203,W503 --max-line-length=88` before they will be accepted so if possible, try this test locally on your Python file(s) before submitting your pull request. ```bash python3 -m pip install flake8 # only required the first time flake8 . --ignore=E203,W503 --max-line-length=88 --show-source ``` - Original code submission require docstrings or comments to describe your work. - More on docstrings and comments: If you used a Wikipedia article or some other source material to create your algorithm, please add the URL in a docstring or comment to help your reader. The following are considered to be bad and may be requested to be improved: ```python x = x + 2 # increased by 2 ``` This is too trivial. Comments are expected to be explanatory. For comments, you can write them above, on or below a line of code, as long as you are consistent within the same piece of code. We encourage you to put docstrings inside your functions but please pay attention to the indentation of docstrings. The following is a good example: ```python def sum_ab(a, b): """ Return the sum of two integers a and b. """ return a + b ``` - Write tests (especially [__doctests__](https://docs.python.org/3/library/doctest.html)) to illustrate and verify your work. 
We highly encourage the use of _doctests on all functions_. ```python def sum_ab(a, b): """ Return the sum of two integers a and b >>> sum_ab(2, 2) 4 >>> sum_ab(-2, 3) 1 >>> sum_ab(4.9, 5.1) 10.0 """ return a + b ``` These doctests will be run by pytest as part of our automated testing so please try to run your doctests locally and make sure that they are found and pass: ```bash python3 -m doctest -v my_submission.py ``` The use of the Python builtin `input()` function is __not__ encouraged: ```python input('Enter your input:') # Or even worse... input = eval(input("Enter your input: ")) ``` However, if your code uses `input()` then we encourage you to gracefully deal with leading and trailing whitespace in user input by adding `.strip()` as in: ```python starting_value = int(input("Please enter a starting value: ").strip()) ``` The use of [Python type hints](https://docs.python.org/3/library/typing.html) is encouraged for function parameters and return values. Our automated testing will run [mypy](http://mypy-lang.org) so run that locally before making your submission. ```python def sum_ab(a: int, b: int) -> int: return a + b ``` Instructions on how to install mypy can be found [here](https://github.com/python/mypy). Please use the command `mypy --ignore-missing-imports .` to test all files or `mypy --ignore-missing-imports path/to/file.py` to test a specific file. - [__List comprehensions and generators__](https://docs.python.org/3/tutorial/datastructures.html#list-comprehensions) are preferred over the use of `lambda`, `map`, `filter`, `reduce` but the important thing is to demonstrate the power of Python in code that is easy to read and maintain. - Avoid importing external libraries for basic algorithms. Only use those libraries for complicated algorithms. - If you need a third-party module that is not in the file __requirements.txt__, please add it to that file as part of your submission. #### Other Requirements for Submissions - If you are submitting code in the `project_euler/` directory, please also read [the dedicated Guideline](https://github.com/TheAlgorithms/Python/blob/master/project_euler/README.md) before contributing to our Project Euler library. - The file extension for code files should be `.py`. Jupyter Notebooks should be submitted to [TheAlgorithms/Jupyter](https://github.com/TheAlgorithms/Jupyter). - Strictly use snake_case (underscore_separated) in your file_name, as it will be easy to parse in future using scripts. - Please avoid creating new directories if at all possible. Try to fit your work into the existing directory structure. - If possible, follow the standard *within* the folder you are submitting to. - If you have modified/added code work, make sure the code compiles before submitting. - If you have modified/added documentation work, ensure your language is concise and contains no grammar errors. - Do not update the README.md or DIRECTORY.md file which will be periodically autogenerated by our GitHub Actions processes. - Add a corresponding explanation to [Algorithms-Explanation](https://github.com/TheAlgorithms/Algorithms-Explanation) (Optional but recommended). - All submissions will be tested with [__mypy__](http://www.mypy-lang.org) so we encourage you to add [__Python type hints__](https://docs.python.org/3/library/typing.html) where it makes sense to do so. 
- Most importantly, - __Be consistent in the use of these guidelines when submitting.__ - __Join__ us on [Discord](https://discord.com/invite/c7MnfGFGa6) and [Gitter](https://gitter.im/TheAlgorithms) __now!__ - Happy coding! Writer [@poyea](https://github.com/poyea), Jun 2019.
1
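The contributing guidelines quoted above ask for descriptive function names, type hints, docstrings with doctests covering both valid and erroneous input, and exceptions raised on bad values. A minimal sketch pulling those pieces together is shown below; `greatest_common_divisor` is used only because the guidelines cite it as the preferred expansion of `gcd()`, and the snippet is illustrative rather than an actual file from the repository.

```python
def greatest_common_divisor(a: int, b: int) -> int:
    """
    Return the greatest common divisor of two non-negative integers
    using the Euclidean algorithm.

    >>> greatest_common_divisor(12, 18)
    6
    >>> greatest_common_divisor(7, 0)
    7
    >>> greatest_common_divisor(-4, 6)
    Traceback (most recent call last):
        ...
    ValueError: both arguments must be non-negative
    """
    if a < 0 or b < 0:
        raise ValueError("both arguments must be non-negative")
    while b:
        a, b = b, a % b
    return a


if __name__ == "__main__":
    import doctest

    doctest.testmod()
```

Running `python3 -m doctest -v` on such a file, as the guidelines suggest, exercises all three doctests, including the error path.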
TheAlgorithms/Python
7,867
Update docs
### Describe your change: - Update docs to Python 3.10+ to match Py upgrade settings in pre-commit (`--py310-plus`) - Remove Exclusion from pre-commit-setting, this file have been fixed in #7844 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
Cjkjvfnby
"2022-10-30T08:30:31Z"
"2022-10-30T09:01:59Z"
17d93cab783095dd1def3c382866cd94296db455
57ccabbaeb0f32165271e3a218bc9c6dcfc21823
Update docs. ### Describe your change: - Update docs to Python 3.10+ to match Py upgrade settings in pre-commit (`--py310-plus`) - Remove Exclusion from pre-commit-setting, this file have been fixed in #7844 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
## Arithmetic Analysis * [Bisection](arithmetic_analysis/bisection.py) * [Gaussian Elimination](arithmetic_analysis/gaussian_elimination.py) * [In Static Equilibrium](arithmetic_analysis/in_static_equilibrium.py) * [Intersection](arithmetic_analysis/intersection.py) * [Jacobi Iteration Method](arithmetic_analysis/jacobi_iteration_method.py) * [Lu Decomposition](arithmetic_analysis/lu_decomposition.py) * [Newton Forward Interpolation](arithmetic_analysis/newton_forward_interpolation.py) * [Newton Method](arithmetic_analysis/newton_method.py) * [Newton Raphson](arithmetic_analysis/newton_raphson.py) * [Newton Raphson New](arithmetic_analysis/newton_raphson_new.py) * [Secant Method](arithmetic_analysis/secant_method.py) ## Audio Filters * [Butterworth Filter](audio_filters/butterworth_filter.py) * [Equal Loudness Filter](audio_filters/equal_loudness_filter.py) * [Iir Filter](audio_filters/iir_filter.py) * [Show Response](audio_filters/show_response.py) ## Backtracking * [All Combinations](backtracking/all_combinations.py) * [All Permutations](backtracking/all_permutations.py) * [All Subsequences](backtracking/all_subsequences.py) * [Coloring](backtracking/coloring.py) * [Combination Sum](backtracking/combination_sum.py) * [Hamiltonian Cycle](backtracking/hamiltonian_cycle.py) * [Knight Tour](backtracking/knight_tour.py) * [Minimax](backtracking/minimax.py) * [Minmax](backtracking/minmax.py) * [N Queens](backtracking/n_queens.py) * [N Queens Math](backtracking/n_queens_math.py) * [Rat In Maze](backtracking/rat_in_maze.py) * [Sudoku](backtracking/sudoku.py) * [Sum Of Subsets](backtracking/sum_of_subsets.py) ## Bit Manipulation * [Binary And Operator](bit_manipulation/binary_and_operator.py) * [Binary Count Setbits](bit_manipulation/binary_count_setbits.py) * [Binary Count Trailing Zeros](bit_manipulation/binary_count_trailing_zeros.py) * [Binary Or Operator](bit_manipulation/binary_or_operator.py) * [Binary Shifts](bit_manipulation/binary_shifts.py) * [Binary Twos Complement](bit_manipulation/binary_twos_complement.py) * [Binary Xor Operator](bit_manipulation/binary_xor_operator.py) * [Count 1S Brian Kernighan Method](bit_manipulation/count_1s_brian_kernighan_method.py) * [Count Number Of One Bits](bit_manipulation/count_number_of_one_bits.py) * [Gray Code Sequence](bit_manipulation/gray_code_sequence.py) * [Highest Set Bit](bit_manipulation/highest_set_bit.py) * [Is Even](bit_manipulation/is_even.py) * [Reverse Bits](bit_manipulation/reverse_bits.py) * [Single Bit Manipulation Operations](bit_manipulation/single_bit_manipulation_operations.py) ## Blockchain * [Chinese Remainder Theorem](blockchain/chinese_remainder_theorem.py) * [Diophantine Equation](blockchain/diophantine_equation.py) * [Modular Division](blockchain/modular_division.py) ## Boolean Algebra * [And Gate](boolean_algebra/and_gate.py) * [Nand Gate](boolean_algebra/nand_gate.py) * [Norgate](boolean_algebra/norgate.py) * [Not Gate](boolean_algebra/not_gate.py) * [Or Gate](boolean_algebra/or_gate.py) * [Quine Mc Cluskey](boolean_algebra/quine_mc_cluskey.py) * [Xnor Gate](boolean_algebra/xnor_gate.py) * [Xor Gate](boolean_algebra/xor_gate.py) ## Cellular Automata * [Conways Game Of Life](cellular_automata/conways_game_of_life.py) * [Game Of Life](cellular_automata/game_of_life.py) * [Nagel Schrekenberg](cellular_automata/nagel_schrekenberg.py) * [One Dimensional](cellular_automata/one_dimensional.py) ## Ciphers * [A1Z26](ciphers/a1z26.py) * [Affine Cipher](ciphers/affine_cipher.py) * [Atbash](ciphers/atbash.py) * [Baconian 
Cipher](ciphers/baconian_cipher.py) * [Base16](ciphers/base16.py) * [Base32](ciphers/base32.py) * [Base64](ciphers/base64.py) * [Base85](ciphers/base85.py) * [Beaufort Cipher](ciphers/beaufort_cipher.py) * [Bifid](ciphers/bifid.py) * [Brute Force Caesar Cipher](ciphers/brute_force_caesar_cipher.py) * [Caesar Cipher](ciphers/caesar_cipher.py) * [Cryptomath Module](ciphers/cryptomath_module.py) * [Decrypt Caesar With Chi Squared](ciphers/decrypt_caesar_with_chi_squared.py) * [Deterministic Miller Rabin](ciphers/deterministic_miller_rabin.py) * [Diffie](ciphers/diffie.py) * [Diffie Hellman](ciphers/diffie_hellman.py) * [Elgamal Key Generator](ciphers/elgamal_key_generator.py) * [Enigma Machine2](ciphers/enigma_machine2.py) * [Hill Cipher](ciphers/hill_cipher.py) * [Mixed Keyword Cypher](ciphers/mixed_keyword_cypher.py) * [Mono Alphabetic Ciphers](ciphers/mono_alphabetic_ciphers.py) * [Morse Code](ciphers/morse_code.py) * [Onepad Cipher](ciphers/onepad_cipher.py) * [Playfair Cipher](ciphers/playfair_cipher.py) * [Polybius](ciphers/polybius.py) * [Porta Cipher](ciphers/porta_cipher.py) * [Rabin Miller](ciphers/rabin_miller.py) * [Rail Fence Cipher](ciphers/rail_fence_cipher.py) * [Rot13](ciphers/rot13.py) * [Rsa Cipher](ciphers/rsa_cipher.py) * [Rsa Factorization](ciphers/rsa_factorization.py) * [Rsa Key Generator](ciphers/rsa_key_generator.py) * [Shuffled Shift Cipher](ciphers/shuffled_shift_cipher.py) * [Simple Keyword Cypher](ciphers/simple_keyword_cypher.py) * [Simple Substitution Cipher](ciphers/simple_substitution_cipher.py) * [Trafid Cipher](ciphers/trafid_cipher.py) * [Transposition Cipher](ciphers/transposition_cipher.py) * [Transposition Cipher Encrypt Decrypt File](ciphers/transposition_cipher_encrypt_decrypt_file.py) * [Vigenere Cipher](ciphers/vigenere_cipher.py) * [Xor Cipher](ciphers/xor_cipher.py) ## Compression * [Burrows Wheeler](compression/burrows_wheeler.py) * [Huffman](compression/huffman.py) * [Lempel Ziv](compression/lempel_ziv.py) * [Lempel Ziv Decompress](compression/lempel_ziv_decompress.py) * [Peak Signal To Noise Ratio](compression/peak_signal_to_noise_ratio.py) * [Run Length Encoding](compression/run_length_encoding.py) ## Computer Vision * [Cnn Classification](computer_vision/cnn_classification.py) * [Flip Augmentation](computer_vision/flip_augmentation.py) * [Harris Corner](computer_vision/harris_corner.py) * [Horn Schunck](computer_vision/horn_schunck.py) * [Mean Threshold](computer_vision/mean_threshold.py) * [Mosaic Augmentation](computer_vision/mosaic_augmentation.py) * [Pooling Functions](computer_vision/pooling_functions.py) ## Conversions * [Astronomical Length Scale Conversion](conversions/astronomical_length_scale_conversion.py) * [Binary To Decimal](conversions/binary_to_decimal.py) * [Binary To Hexadecimal](conversions/binary_to_hexadecimal.py) * [Binary To Octal](conversions/binary_to_octal.py) * [Decimal To Any](conversions/decimal_to_any.py) * [Decimal To Binary](conversions/decimal_to_binary.py) * [Decimal To Binary Recursion](conversions/decimal_to_binary_recursion.py) * [Decimal To Hexadecimal](conversions/decimal_to_hexadecimal.py) * [Decimal To Octal](conversions/decimal_to_octal.py) * [Excel Title To Column](conversions/excel_title_to_column.py) * [Hex To Bin](conversions/hex_to_bin.py) * [Hexadecimal To Decimal](conversions/hexadecimal_to_decimal.py) * [Length Conversion](conversions/length_conversion.py) * [Molecular Chemistry](conversions/molecular_chemistry.py) * [Octal To Decimal](conversions/octal_to_decimal.py) * [Prefix 
Conversions](conversions/prefix_conversions.py) * [Prefix Conversions String](conversions/prefix_conversions_string.py) * [Pressure Conversions](conversions/pressure_conversions.py) * [Rgb Hsv Conversion](conversions/rgb_hsv_conversion.py) * [Roman Numerals](conversions/roman_numerals.py) * [Speed Conversions](conversions/speed_conversions.py) * [Temperature Conversions](conversions/temperature_conversions.py) * [Volume Conversions](conversions/volume_conversions.py) * [Weight Conversion](conversions/weight_conversion.py) ## Data Structures * Binary Tree * [Avl Tree](data_structures/binary_tree/avl_tree.py) * [Basic Binary Tree](data_structures/binary_tree/basic_binary_tree.py) * [Binary Search Tree](data_structures/binary_tree/binary_search_tree.py) * [Binary Search Tree Recursive](data_structures/binary_tree/binary_search_tree_recursive.py) * [Binary Tree Mirror](data_structures/binary_tree/binary_tree_mirror.py) * [Binary Tree Node Sum](data_structures/binary_tree/binary_tree_node_sum.py) * [Binary Tree Path Sum](data_structures/binary_tree/binary_tree_path_sum.py) * [Binary Tree Traversals](data_structures/binary_tree/binary_tree_traversals.py) * [Diff Views Of Binary Tree](data_structures/binary_tree/diff_views_of_binary_tree.py) * [Fenwick Tree](data_structures/binary_tree/fenwick_tree.py) * [Inorder Tree Traversal 2022](data_structures/binary_tree/inorder_tree_traversal_2022.py) * [Lazy Segment Tree](data_structures/binary_tree/lazy_segment_tree.py) * [Lowest Common Ancestor](data_structures/binary_tree/lowest_common_ancestor.py) * [Maximum Fenwick Tree](data_structures/binary_tree/maximum_fenwick_tree.py) * [Merge Two Binary Trees](data_structures/binary_tree/merge_two_binary_trees.py) * [Non Recursive Segment Tree](data_structures/binary_tree/non_recursive_segment_tree.py) * [Number Of Possible Binary Trees](data_structures/binary_tree/number_of_possible_binary_trees.py) * [Red Black Tree](data_structures/binary_tree/red_black_tree.py) * [Segment Tree](data_structures/binary_tree/segment_tree.py) * [Segment Tree Other](data_structures/binary_tree/segment_tree_other.py) * [Treap](data_structures/binary_tree/treap.py) * [Wavelet Tree](data_structures/binary_tree/wavelet_tree.py) * Disjoint Set * [Alternate Disjoint Set](data_structures/disjoint_set/alternate_disjoint_set.py) * [Disjoint Set](data_structures/disjoint_set/disjoint_set.py) * Hashing * [Double Hash](data_structures/hashing/double_hash.py) * [Hash Table](data_structures/hashing/hash_table.py) * [Hash Table With Linked List](data_structures/hashing/hash_table_with_linked_list.py) * Number Theory * [Prime Numbers](data_structures/hashing/number_theory/prime_numbers.py) * [Quadratic Probing](data_structures/hashing/quadratic_probing.py) * Heap * [Binomial Heap](data_structures/heap/binomial_heap.py) * [Heap](data_structures/heap/heap.py) * [Heap Generic](data_structures/heap/heap_generic.py) * [Max Heap](data_structures/heap/max_heap.py) * [Min Heap](data_structures/heap/min_heap.py) * [Randomized Heap](data_structures/heap/randomized_heap.py) * [Skew Heap](data_structures/heap/skew_heap.py) * Linked List * [Circular Linked List](data_structures/linked_list/circular_linked_list.py) * [Deque Doubly](data_structures/linked_list/deque_doubly.py) * [Doubly Linked List](data_structures/linked_list/doubly_linked_list.py) * [Doubly Linked List Two](data_structures/linked_list/doubly_linked_list_two.py) * [From Sequence](data_structures/linked_list/from_sequence.py) * [Has Loop](data_structures/linked_list/has_loop.py) * [Is 
Palindrome](data_structures/linked_list/is_palindrome.py) * [Merge Two Lists](data_structures/linked_list/merge_two_lists.py) * [Middle Element Of Linked List](data_structures/linked_list/middle_element_of_linked_list.py) * [Print Reverse](data_structures/linked_list/print_reverse.py) * [Singly Linked List](data_structures/linked_list/singly_linked_list.py) * [Skip List](data_structures/linked_list/skip_list.py) * [Swap Nodes](data_structures/linked_list/swap_nodes.py) * Queue * [Circular Queue](data_structures/queue/circular_queue.py) * [Circular Queue Linked List](data_structures/queue/circular_queue_linked_list.py) * [Double Ended Queue](data_structures/queue/double_ended_queue.py) * [Linked Queue](data_structures/queue/linked_queue.py) * [Priority Queue Using List](data_structures/queue/priority_queue_using_list.py) * [Queue On List](data_structures/queue/queue_on_list.py) * [Queue On Pseudo Stack](data_structures/queue/queue_on_pseudo_stack.py) * Stacks * [Balanced Parentheses](data_structures/stacks/balanced_parentheses.py) * [Dijkstras Two Stack Algorithm](data_structures/stacks/dijkstras_two_stack_algorithm.py) * [Evaluate Postfix Notations](data_structures/stacks/evaluate_postfix_notations.py) * [Infix To Postfix Conversion](data_structures/stacks/infix_to_postfix_conversion.py) * [Infix To Prefix Conversion](data_structures/stacks/infix_to_prefix_conversion.py) * [Next Greater Element](data_structures/stacks/next_greater_element.py) * [Postfix Evaluation](data_structures/stacks/postfix_evaluation.py) * [Prefix Evaluation](data_structures/stacks/prefix_evaluation.py) * [Stack](data_structures/stacks/stack.py) * [Stack With Doubly Linked List](data_structures/stacks/stack_with_doubly_linked_list.py) * [Stack With Singly Linked List](data_structures/stacks/stack_with_singly_linked_list.py) * [Stock Span Problem](data_structures/stacks/stock_span_problem.py) * Trie * [Trie](data_structures/trie/trie.py) ## Digital Image Processing * [Change Brightness](digital_image_processing/change_brightness.py) * [Change Contrast](digital_image_processing/change_contrast.py) * [Convert To Negative](digital_image_processing/convert_to_negative.py) * Dithering * [Burkes](digital_image_processing/dithering/burkes.py) * Edge Detection * [Canny](digital_image_processing/edge_detection/canny.py) * Filters * [Bilateral Filter](digital_image_processing/filters/bilateral_filter.py) * [Convolve](digital_image_processing/filters/convolve.py) * [Gabor Filter](digital_image_processing/filters/gabor_filter.py) * [Gaussian Filter](digital_image_processing/filters/gaussian_filter.py) * [Local Binary Pattern](digital_image_processing/filters/local_binary_pattern.py) * [Median Filter](digital_image_processing/filters/median_filter.py) * [Sobel Filter](digital_image_processing/filters/sobel_filter.py) * Histogram Equalization * [Histogram Stretch](digital_image_processing/histogram_equalization/histogram_stretch.py) * [Index Calculation](digital_image_processing/index_calculation.py) * Morphological Operations * [Dilation Operation](digital_image_processing/morphological_operations/dilation_operation.py) * [Erosion Operation](digital_image_processing/morphological_operations/erosion_operation.py) * Resize * [Resize](digital_image_processing/resize/resize.py) * Rotation * [Rotation](digital_image_processing/rotation/rotation.py) * [Sepia](digital_image_processing/sepia.py) * [Test Digital Image Processing](digital_image_processing/test_digital_image_processing.py) ## Divide And Conquer * [Closest Pair Of 
Points](divide_and_conquer/closest_pair_of_points.py) * [Convex Hull](divide_and_conquer/convex_hull.py) * [Heaps Algorithm](divide_and_conquer/heaps_algorithm.py) * [Heaps Algorithm Iterative](divide_and_conquer/heaps_algorithm_iterative.py) * [Inversions](divide_and_conquer/inversions.py) * [Kth Order Statistic](divide_and_conquer/kth_order_statistic.py) * [Max Difference Pair](divide_and_conquer/max_difference_pair.py) * [Max Subarray Sum](divide_and_conquer/max_subarray_sum.py) * [Mergesort](divide_and_conquer/mergesort.py) * [Peak](divide_and_conquer/peak.py) * [Power](divide_and_conquer/power.py) * [Strassen Matrix Multiplication](divide_and_conquer/strassen_matrix_multiplication.py) ## Dynamic Programming * [Abbreviation](dynamic_programming/abbreviation.py) * [All Construct](dynamic_programming/all_construct.py) * [Bitmask](dynamic_programming/bitmask.py) * [Catalan Numbers](dynamic_programming/catalan_numbers.py) * [Climbing Stairs](dynamic_programming/climbing_stairs.py) * [Combination Sum Iv](dynamic_programming/combination_sum_iv.py) * [Edit Distance](dynamic_programming/edit_distance.py) * [Factorial](dynamic_programming/factorial.py) * [Fast Fibonacci](dynamic_programming/fast_fibonacci.py) * [Fibonacci](dynamic_programming/fibonacci.py) * [Floyd Warshall](dynamic_programming/floyd_warshall.py) * [Integer Partition](dynamic_programming/integer_partition.py) * [Iterating Through Submasks](dynamic_programming/iterating_through_submasks.py) * [Knapsack](dynamic_programming/knapsack.py) * [Longest Common Subsequence](dynamic_programming/longest_common_subsequence.py) * [Longest Common Substring](dynamic_programming/longest_common_substring.py) * [Longest Increasing Subsequence](dynamic_programming/longest_increasing_subsequence.py) * [Longest Increasing Subsequence O(Nlogn)](dynamic_programming/longest_increasing_subsequence_o(nlogn).py) * [Longest Sub Array](dynamic_programming/longest_sub_array.py) * [Matrix Chain Order](dynamic_programming/matrix_chain_order.py) * [Max Non Adjacent Sum](dynamic_programming/max_non_adjacent_sum.py) * [Max Sub Array](dynamic_programming/max_sub_array.py) * [Max Sum Contiguous Subsequence](dynamic_programming/max_sum_contiguous_subsequence.py) * [Minimum Coin Change](dynamic_programming/minimum_coin_change.py) * [Minimum Cost Path](dynamic_programming/minimum_cost_path.py) * [Minimum Partition](dynamic_programming/minimum_partition.py) * [Minimum Squares To Represent A Number](dynamic_programming/minimum_squares_to_represent_a_number.py) * [Minimum Steps To One](dynamic_programming/minimum_steps_to_one.py) * [Optimal Binary Search Tree](dynamic_programming/optimal_binary_search_tree.py) * [Rod Cutting](dynamic_programming/rod_cutting.py) * [Subset Generation](dynamic_programming/subset_generation.py) * [Sum Of Subset](dynamic_programming/sum_of_subset.py) ## Electronics * [Carrier Concentration](electronics/carrier_concentration.py) * [Coulombs Law](electronics/coulombs_law.py) * [Electric Power](electronics/electric_power.py) * [Ohms Law](electronics/ohms_law.py) ## File Transfer * [Receive File](file_transfer/receive_file.py) * [Send File](file_transfer/send_file.py) * Tests * [Test Send File](file_transfer/tests/test_send_file.py) ## Financial * [Equated Monthly Installments](financial/equated_monthly_installments.py) * [Interest](financial/interest.py) * [Price Plus Tax](financial/price_plus_tax.py) ## Fractals * [Julia Sets](fractals/julia_sets.py) * [Koch Snowflake](fractals/koch_snowflake.py) * [Mandelbrot](fractals/mandelbrot.py) * 
[Sierpinski Triangle](fractals/sierpinski_triangle.py) ## Fuzzy Logic * [Fuzzy Operations](fuzzy_logic/fuzzy_operations.py) ## Genetic Algorithm * [Basic String](genetic_algorithm/basic_string.py) ## Geodesy * [Haversine Distance](geodesy/haversine_distance.py) * [Lamberts Ellipsoidal Distance](geodesy/lamberts_ellipsoidal_distance.py) ## Graphics * [Bezier Curve](graphics/bezier_curve.py) * [Vector3 For 2D Rendering](graphics/vector3_for_2d_rendering.py) ## Graphs * [A Star](graphs/a_star.py) * [Articulation Points](graphs/articulation_points.py) * [Basic Graphs](graphs/basic_graphs.py) * [Bellman Ford](graphs/bellman_ford.py) * [Bidirectional A Star](graphs/bidirectional_a_star.py) * [Bidirectional Breadth First Search](graphs/bidirectional_breadth_first_search.py) * [Boruvka](graphs/boruvka.py) * [Breadth First Search](graphs/breadth_first_search.py) * [Breadth First Search 2](graphs/breadth_first_search_2.py) * [Breadth First Search Shortest Path](graphs/breadth_first_search_shortest_path.py) * [Breadth First Search Shortest Path 2](graphs/breadth_first_search_shortest_path_2.py) * [Breadth First Search Zero One Shortest Path](graphs/breadth_first_search_zero_one_shortest_path.py) * [Check Bipartite Graph Bfs](graphs/check_bipartite_graph_bfs.py) * [Check Bipartite Graph Dfs](graphs/check_bipartite_graph_dfs.py) * [Check Cycle](graphs/check_cycle.py) * [Connected Components](graphs/connected_components.py) * [Depth First Search](graphs/depth_first_search.py) * [Depth First Search 2](graphs/depth_first_search_2.py) * [Dijkstra](graphs/dijkstra.py) * [Dijkstra 2](graphs/dijkstra_2.py) * [Dijkstra Algorithm](graphs/dijkstra_algorithm.py) * [Dijkstra Alternate](graphs/dijkstra_alternate.py) * [Dinic](graphs/dinic.py) * [Directed And Undirected (Weighted) Graph](graphs/directed_and_undirected_(weighted)_graph.py) * [Edmonds Karp Multiple Source And Sink](graphs/edmonds_karp_multiple_source_and_sink.py) * [Eulerian Path And Circuit For Undirected Graph](graphs/eulerian_path_and_circuit_for_undirected_graph.py) * [Even Tree](graphs/even_tree.py) * [Finding Bridges](graphs/finding_bridges.py) * [Frequent Pattern Graph Miner](graphs/frequent_pattern_graph_miner.py) * [G Topological Sort](graphs/g_topological_sort.py) * [Gale Shapley Bigraph](graphs/gale_shapley_bigraph.py) * [Graph List](graphs/graph_list.py) * [Graph Matrix](graphs/graph_matrix.py) * [Graphs Floyd Warshall](graphs/graphs_floyd_warshall.py) * [Greedy Best First](graphs/greedy_best_first.py) * [Greedy Min Vertex Cover](graphs/greedy_min_vertex_cover.py) * [Kahns Algorithm Long](graphs/kahns_algorithm_long.py) * [Kahns Algorithm Topo](graphs/kahns_algorithm_topo.py) * [Karger](graphs/karger.py) * [Markov Chain](graphs/markov_chain.py) * [Matching Min Vertex Cover](graphs/matching_min_vertex_cover.py) * [Minimum Path Sum](graphs/minimum_path_sum.py) * [Minimum Spanning Tree Boruvka](graphs/minimum_spanning_tree_boruvka.py) * [Minimum Spanning Tree Kruskal](graphs/minimum_spanning_tree_kruskal.py) * [Minimum Spanning Tree Kruskal2](graphs/minimum_spanning_tree_kruskal2.py) * [Minimum Spanning Tree Prims](graphs/minimum_spanning_tree_prims.py) * [Minimum Spanning Tree Prims2](graphs/minimum_spanning_tree_prims2.py) * [Multi Heuristic Astar](graphs/multi_heuristic_astar.py) * [Page Rank](graphs/page_rank.py) * [Prim](graphs/prim.py) * [Random Graph Generator](graphs/random_graph_generator.py) * [Scc Kosaraju](graphs/scc_kosaraju.py) * [Strongly Connected Components](graphs/strongly_connected_components.py) * [Tarjans 
Scc](graphs/tarjans_scc.py) * Tests * [Test Min Spanning Tree Kruskal](graphs/tests/test_min_spanning_tree_kruskal.py) * [Test Min Spanning Tree Prim](graphs/tests/test_min_spanning_tree_prim.py) ## Greedy Methods * [Fractional Knapsack](greedy_methods/fractional_knapsack.py) * [Fractional Knapsack 2](greedy_methods/fractional_knapsack_2.py) * [Optimal Merge Pattern](greedy_methods/optimal_merge_pattern.py) ## Hashes * [Adler32](hashes/adler32.py) * [Chaos Machine](hashes/chaos_machine.py) * [Djb2](hashes/djb2.py) * [Enigma Machine](hashes/enigma_machine.py) * [Hamming Code](hashes/hamming_code.py) * [Luhn](hashes/luhn.py) * [Md5](hashes/md5.py) * [Sdbm](hashes/sdbm.py) * [Sha1](hashes/sha1.py) * [Sha256](hashes/sha256.py) ## Knapsack * [Greedy Knapsack](knapsack/greedy_knapsack.py) * [Knapsack](knapsack/knapsack.py) * Tests * [Test Greedy Knapsack](knapsack/tests/test_greedy_knapsack.py) * [Test Knapsack](knapsack/tests/test_knapsack.py) ## Linear Algebra * Src * [Conjugate Gradient](linear_algebra/src/conjugate_gradient.py) * [Lib](linear_algebra/src/lib.py) * [Polynom For Points](linear_algebra/src/polynom_for_points.py) * [Power Iteration](linear_algebra/src/power_iteration.py) * [Rayleigh Quotient](linear_algebra/src/rayleigh_quotient.py) * [Schur Complement](linear_algebra/src/schur_complement.py) * [Test Linear Algebra](linear_algebra/src/test_linear_algebra.py) * [Transformations 2D](linear_algebra/src/transformations_2d.py) ## Machine Learning * [Astar](machine_learning/astar.py) * [Data Transformations](machine_learning/data_transformations.py) * [Decision Tree](machine_learning/decision_tree.py) * Forecasting * [Run](machine_learning/forecasting/run.py) * [Gaussian Naive Bayes](machine_learning/gaussian_naive_bayes.py) * [Gradient Boosting Regressor](machine_learning/gradient_boosting_regressor.py) * [Gradient Descent](machine_learning/gradient_descent.py) * [K Means Clust](machine_learning/k_means_clust.py) * [K Nearest Neighbours](machine_learning/k_nearest_neighbours.py) * [Knn Sklearn](machine_learning/knn_sklearn.py) * [Linear Discriminant Analysis](machine_learning/linear_discriminant_analysis.py) * [Linear Regression](machine_learning/linear_regression.py) * Local Weighted Learning * [Local Weighted Learning](machine_learning/local_weighted_learning/local_weighted_learning.py) * [Logistic Regression](machine_learning/logistic_regression.py) * Lstm * [Lstm Prediction](machine_learning/lstm/lstm_prediction.py) * [Multilayer Perceptron Classifier](machine_learning/multilayer_perceptron_classifier.py) * [Polymonial Regression](machine_learning/polymonial_regression.py) * [Random Forest Classifier](machine_learning/random_forest_classifier.py) * [Random Forest Regressor](machine_learning/random_forest_regressor.py) * [Scoring Functions](machine_learning/scoring_functions.py) * [Self Organizing Map](machine_learning/self_organizing_map.py) * [Sequential Minimum Optimization](machine_learning/sequential_minimum_optimization.py) * [Similarity Search](machine_learning/similarity_search.py) * [Support Vector Machines](machine_learning/support_vector_machines.py) * [Word Frequency Functions](machine_learning/word_frequency_functions.py) * [Xgboost Classifier](machine_learning/xgboost_classifier.py) * [Xgboost Regressor](machine_learning/xgboost_regressor.py) ## Maths * [3N Plus 1](maths/3n_plus_1.py) * [Abs](maths/abs.py) * [Abs Max](maths/abs_max.py) * [Abs Min](maths/abs_min.py) * [Add](maths/add.py) * [Aliquot Sum](maths/aliquot_sum.py) * [Allocation 
Number](maths/allocation_number.py) * [Arc Length](maths/arc_length.py) * [Area](maths/area.py) * [Area Under Curve](maths/area_under_curve.py) * [Armstrong Numbers](maths/armstrong_numbers.py) * [Average Absolute Deviation](maths/average_absolute_deviation.py) * [Average Mean](maths/average_mean.py) * [Average Median](maths/average_median.py) * [Average Mode](maths/average_mode.py) * [Bailey Borwein Plouffe](maths/bailey_borwein_plouffe.py) * [Basic Maths](maths/basic_maths.py) * [Binary Exp Mod](maths/binary_exp_mod.py) * [Binary Exponentiation](maths/binary_exponentiation.py) * [Binary Exponentiation 2](maths/binary_exponentiation_2.py) * [Binary Exponentiation 3](maths/binary_exponentiation_3.py) * [Binomial Coefficient](maths/binomial_coefficient.py) * [Binomial Distribution](maths/binomial_distribution.py) * [Bisection](maths/bisection.py) * [Carmichael Number](maths/carmichael_number.py) * [Catalan Number](maths/catalan_number.py) * [Ceil](maths/ceil.py) * [Check Polygon](maths/check_polygon.py) * [Chudnovsky Algorithm](maths/chudnovsky_algorithm.py) * [Collatz Sequence](maths/collatz_sequence.py) * [Combinations](maths/combinations.py) * [Decimal Isolate](maths/decimal_isolate.py) * [Double Factorial Iterative](maths/double_factorial_iterative.py) * [Double Factorial Recursive](maths/double_factorial_recursive.py) * [Entropy](maths/entropy.py) * [Euclidean Distance](maths/euclidean_distance.py) * [Euclidean Gcd](maths/euclidean_gcd.py) * [Euler Method](maths/euler_method.py) * [Euler Modified](maths/euler_modified.py) * [Eulers Totient](maths/eulers_totient.py) * [Extended Euclidean Algorithm](maths/extended_euclidean_algorithm.py) * [Factorial Iterative](maths/factorial_iterative.py) * [Factorial Recursive](maths/factorial_recursive.py) * [Factors](maths/factors.py) * [Fermat Little Theorem](maths/fermat_little_theorem.py) * [Fibonacci](maths/fibonacci.py) * [Find Max](maths/find_max.py) * [Find Max Recursion](maths/find_max_recursion.py) * [Find Min](maths/find_min.py) * [Find Min Recursion](maths/find_min_recursion.py) * [Floor](maths/floor.py) * [Gamma](maths/gamma.py) * [Gamma Recursive](maths/gamma_recursive.py) * [Gaussian](maths/gaussian.py) * [Gaussian Error Linear Unit](maths/gaussian_error_linear_unit.py) * [Greatest Common Divisor](maths/greatest_common_divisor.py) * [Greedy Coin Change](maths/greedy_coin_change.py) * [Hamming Numbers](maths/hamming_numbers.py) * [Hardy Ramanujanalgo](maths/hardy_ramanujanalgo.py) * [Integration By Simpson Approx](maths/integration_by_simpson_approx.py) * [Is Ip V4 Address Valid](maths/is_ip_v4_address_valid.py) * [Is Square Free](maths/is_square_free.py) * [Jaccard Similarity](maths/jaccard_similarity.py) * [Kadanes](maths/kadanes.py) * [Karatsuba](maths/karatsuba.py) * [Krishnamurthy Number](maths/krishnamurthy_number.py) * [Kth Lexicographic Permutation](maths/kth_lexicographic_permutation.py) * [Largest Of Very Large Numbers](maths/largest_of_very_large_numbers.py) * [Largest Subarray Sum](maths/largest_subarray_sum.py) * [Least Common Multiple](maths/least_common_multiple.py) * [Line Length](maths/line_length.py) * [Lucas Lehmer Primality Test](maths/lucas_lehmer_primality_test.py) * [Lucas Series](maths/lucas_series.py) * [Maclaurin Series](maths/maclaurin_series.py) * [Matrix Exponentiation](maths/matrix_exponentiation.py) * [Max Sum Sliding Window](maths/max_sum_sliding_window.py) * [Median Of Two Arrays](maths/median_of_two_arrays.py) * [Miller Rabin](maths/miller_rabin.py) * [Mobius Function](maths/mobius_function.py) * 
[Modular Exponential](maths/modular_exponential.py) * [Monte Carlo](maths/monte_carlo.py) * [Monte Carlo Dice](maths/monte_carlo_dice.py) * [Nevilles Method](maths/nevilles_method.py) * [Newton Raphson](maths/newton_raphson.py) * [Number Of Digits](maths/number_of_digits.py) * [Numerical Integration](maths/numerical_integration.py) * [Perfect Cube](maths/perfect_cube.py) * [Perfect Number](maths/perfect_number.py) * [Perfect Square](maths/perfect_square.py) * [Persistence](maths/persistence.py) * [Pi Monte Carlo Estimation](maths/pi_monte_carlo_estimation.py) * [Points Are Collinear 3D](maths/points_are_collinear_3d.py) * [Pollard Rho](maths/pollard_rho.py) * [Polynomial Evaluation](maths/polynomial_evaluation.py) * [Power Using Recursion](maths/power_using_recursion.py) * [Prime Check](maths/prime_check.py) * [Prime Factors](maths/prime_factors.py) * [Prime Numbers](maths/prime_numbers.py) * [Prime Sieve Eratosthenes](maths/prime_sieve_eratosthenes.py) * [Primelib](maths/primelib.py) * [Proth Number](maths/proth_number.py) * [Pythagoras](maths/pythagoras.py) * [Qr Decomposition](maths/qr_decomposition.py) * [Quadratic Equations Complex Numbers](maths/quadratic_equations_complex_numbers.py) * [Radians](maths/radians.py) * [Radix2 Fft](maths/radix2_fft.py) * [Relu](maths/relu.py) * [Runge Kutta](maths/runge_kutta.py) * [Segmented Sieve](maths/segmented_sieve.py) * Series * [Arithmetic](maths/series/arithmetic.py) * [Geometric](maths/series/geometric.py) * [Geometric Series](maths/series/geometric_series.py) * [Harmonic](maths/series/harmonic.py) * [Harmonic Series](maths/series/harmonic_series.py) * [Hexagonal Numbers](maths/series/hexagonal_numbers.py) * [P Series](maths/series/p_series.py) * [Sieve Of Eratosthenes](maths/sieve_of_eratosthenes.py) * [Sigmoid](maths/sigmoid.py) * [Sigmoid Linear Unit](maths/sigmoid_linear_unit.py) * [Signum](maths/signum.py) * [Simpson Rule](maths/simpson_rule.py) * [Sin](maths/sin.py) * [Sock Merchant](maths/sock_merchant.py) * [Softmax](maths/softmax.py) * [Square Root](maths/square_root.py) * [Sum Of Arithmetic Series](maths/sum_of_arithmetic_series.py) * [Sum Of Digits](maths/sum_of_digits.py) * [Sum Of Geometric Progression](maths/sum_of_geometric_progression.py) * [Sum Of Harmonic Series](maths/sum_of_harmonic_series.py) * [Sylvester Sequence](maths/sylvester_sequence.py) * [Test Prime Check](maths/test_prime_check.py) * [Trapezoidal Rule](maths/trapezoidal_rule.py) * [Triplet Sum](maths/triplet_sum.py) * [Two Pointer](maths/two_pointer.py) * [Two Sum](maths/two_sum.py) * [Ugly Numbers](maths/ugly_numbers.py) * [Volume](maths/volume.py) * [Weird Number](maths/weird_number.py) * [Zellers Congruence](maths/zellers_congruence.py) ## Matrix * [Binary Search Matrix](matrix/binary_search_matrix.py) * [Count Islands In Matrix](matrix/count_islands_in_matrix.py) * [Cramers Rule 2X2](matrix/cramers_rule_2x2.py) * [Inverse Of Matrix](matrix/inverse_of_matrix.py) * [Largest Square Area In Matrix](matrix/largest_square_area_in_matrix.py) * [Matrix Class](matrix/matrix_class.py) * [Matrix Operation](matrix/matrix_operation.py) * [Max Area Of Island](matrix/max_area_of_island.py) * [Nth Fibonacci Using Matrix Exponentiation](matrix/nth_fibonacci_using_matrix_exponentiation.py) * [Rotate Matrix](matrix/rotate_matrix.py) * [Searching In Sorted Matrix](matrix/searching_in_sorted_matrix.py) * [Sherman Morrison](matrix/sherman_morrison.py) * [Spiral Print](matrix/spiral_print.py) * Tests * [Test Matrix Operation](matrix/tests/test_matrix_operation.py) ## Networking Flow 
* [Ford Fulkerson](networking_flow/ford_fulkerson.py) * [Minimum Cut](networking_flow/minimum_cut.py) ## Neural Network * [2 Hidden Layers Neural Network](neural_network/2_hidden_layers_neural_network.py) * [Back Propagation Neural Network](neural_network/back_propagation_neural_network.py) * [Convolution Neural Network](neural_network/convolution_neural_network.py) * [Perceptron](neural_network/perceptron.py) ## Other * [Activity Selection](other/activity_selection.py) * [Alternative List Arrange](other/alternative_list_arrange.py) * [Check Strong Password](other/check_strong_password.py) * [Davisb Putnamb Logemannb Loveland](other/davisb_putnamb_logemannb_loveland.py) * [Dijkstra Bankers Algorithm](other/dijkstra_bankers_algorithm.py) * [Doomsday](other/doomsday.py) * [Fischer Yates Shuffle](other/fischer_yates_shuffle.py) * [Gauss Easter](other/gauss_easter.py) * [Graham Scan](other/graham_scan.py) * [Greedy](other/greedy.py) * [Least Recently Used](other/least_recently_used.py) * [Lfu Cache](other/lfu_cache.py) * [Linear Congruential Generator](other/linear_congruential_generator.py) * [Lru Cache](other/lru_cache.py) * [Magicdiamondpattern](other/magicdiamondpattern.py) * [Maximum Subarray](other/maximum_subarray.py) * [Nested Brackets](other/nested_brackets.py) * [Password Generator](other/password_generator.py) * [Scoring Algorithm](other/scoring_algorithm.py) * [Sdes](other/sdes.py) * [Tower Of Hanoi](other/tower_of_hanoi.py) ## Physics * [Casimir Effect](physics/casimir_effect.py) * [Centripetal Force](physics/centripetal_force.py) * [Horizontal Projectile Motion](physics/horizontal_projectile_motion.py) * [Kinetic Energy](physics/kinetic_energy.py) * [Lorentz Transformation Four Vector](physics/lorentz_transformation_four_vector.py) * [Malus Law](physics/malus_law.py) * [N Body Simulation](physics/n_body_simulation.py) * [Newtons Law Of Gravitation](physics/newtons_law_of_gravitation.py) * [Newtons Second Law Of Motion](physics/newtons_second_law_of_motion.py) * [Potential Energy](physics/potential_energy.py) ## Project Euler * Problem 001 * [Sol1](project_euler/problem_001/sol1.py) * [Sol2](project_euler/problem_001/sol2.py) * [Sol3](project_euler/problem_001/sol3.py) * [Sol4](project_euler/problem_001/sol4.py) * [Sol5](project_euler/problem_001/sol5.py) * [Sol6](project_euler/problem_001/sol6.py) * [Sol7](project_euler/problem_001/sol7.py) * Problem 002 * [Sol1](project_euler/problem_002/sol1.py) * [Sol2](project_euler/problem_002/sol2.py) * [Sol3](project_euler/problem_002/sol3.py) * [Sol4](project_euler/problem_002/sol4.py) * [Sol5](project_euler/problem_002/sol5.py) * Problem 003 * [Sol1](project_euler/problem_003/sol1.py) * [Sol2](project_euler/problem_003/sol2.py) * [Sol3](project_euler/problem_003/sol3.py) * Problem 004 * [Sol1](project_euler/problem_004/sol1.py) * [Sol2](project_euler/problem_004/sol2.py) * Problem 005 * [Sol1](project_euler/problem_005/sol1.py) * [Sol2](project_euler/problem_005/sol2.py) * Problem 006 * [Sol1](project_euler/problem_006/sol1.py) * [Sol2](project_euler/problem_006/sol2.py) * [Sol3](project_euler/problem_006/sol3.py) * [Sol4](project_euler/problem_006/sol4.py) * Problem 007 * [Sol1](project_euler/problem_007/sol1.py) * [Sol2](project_euler/problem_007/sol2.py) * [Sol3](project_euler/problem_007/sol3.py) * Problem 008 * [Sol1](project_euler/problem_008/sol1.py) * [Sol2](project_euler/problem_008/sol2.py) * [Sol3](project_euler/problem_008/sol3.py) * Problem 009 * [Sol1](project_euler/problem_009/sol1.py) * 
[Sol2](project_euler/problem_009/sol2.py) * [Sol3](project_euler/problem_009/sol3.py) * Problem 010 * [Sol1](project_euler/problem_010/sol1.py) * [Sol2](project_euler/problem_010/sol2.py) * [Sol3](project_euler/problem_010/sol3.py) * Problem 011 * [Sol1](project_euler/problem_011/sol1.py) * [Sol2](project_euler/problem_011/sol2.py) * Problem 012 * [Sol1](project_euler/problem_012/sol1.py) * [Sol2](project_euler/problem_012/sol2.py) * Problem 013 * [Sol1](project_euler/problem_013/sol1.py) * Problem 014 * [Sol1](project_euler/problem_014/sol1.py) * [Sol2](project_euler/problem_014/sol2.py) * Problem 015 * [Sol1](project_euler/problem_015/sol1.py) * Problem 016 * [Sol1](project_euler/problem_016/sol1.py) * [Sol2](project_euler/problem_016/sol2.py) * Problem 017 * [Sol1](project_euler/problem_017/sol1.py) * Problem 018 * [Solution](project_euler/problem_018/solution.py) * Problem 019 * [Sol1](project_euler/problem_019/sol1.py) * Problem 020 * [Sol1](project_euler/problem_020/sol1.py) * [Sol2](project_euler/problem_020/sol2.py) * [Sol3](project_euler/problem_020/sol3.py) * [Sol4](project_euler/problem_020/sol4.py) * Problem 021 * [Sol1](project_euler/problem_021/sol1.py) * Problem 022 * [Sol1](project_euler/problem_022/sol1.py) * [Sol2](project_euler/problem_022/sol2.py) * Problem 023 * [Sol1](project_euler/problem_023/sol1.py) * Problem 024 * [Sol1](project_euler/problem_024/sol1.py) * Problem 025 * [Sol1](project_euler/problem_025/sol1.py) * [Sol2](project_euler/problem_025/sol2.py) * [Sol3](project_euler/problem_025/sol3.py) * Problem 026 * [Sol1](project_euler/problem_026/sol1.py) * Problem 027 * [Sol1](project_euler/problem_027/sol1.py) * Problem 028 * [Sol1](project_euler/problem_028/sol1.py) * Problem 029 * [Sol1](project_euler/problem_029/sol1.py) * Problem 030 * [Sol1](project_euler/problem_030/sol1.py) * Problem 031 * [Sol1](project_euler/problem_031/sol1.py) * [Sol2](project_euler/problem_031/sol2.py) * Problem 032 * [Sol32](project_euler/problem_032/sol32.py) * Problem 033 * [Sol1](project_euler/problem_033/sol1.py) * Problem 034 * [Sol1](project_euler/problem_034/sol1.py) * Problem 035 * [Sol1](project_euler/problem_035/sol1.py) * Problem 036 * [Sol1](project_euler/problem_036/sol1.py) * Problem 037 * [Sol1](project_euler/problem_037/sol1.py) * Problem 038 * [Sol1](project_euler/problem_038/sol1.py) * Problem 039 * [Sol1](project_euler/problem_039/sol1.py) * Problem 040 * [Sol1](project_euler/problem_040/sol1.py) * Problem 041 * [Sol1](project_euler/problem_041/sol1.py) * Problem 042 * [Solution42](project_euler/problem_042/solution42.py) * Problem 043 * [Sol1](project_euler/problem_043/sol1.py) * Problem 044 * [Sol1](project_euler/problem_044/sol1.py) * Problem 045 * [Sol1](project_euler/problem_045/sol1.py) * Problem 046 * [Sol1](project_euler/problem_046/sol1.py) * Problem 047 * [Sol1](project_euler/problem_047/sol1.py) * Problem 048 * [Sol1](project_euler/problem_048/sol1.py) * Problem 049 * [Sol1](project_euler/problem_049/sol1.py) * Problem 050 * [Sol1](project_euler/problem_050/sol1.py) * Problem 051 * [Sol1](project_euler/problem_051/sol1.py) * Problem 052 * [Sol1](project_euler/problem_052/sol1.py) * Problem 053 * [Sol1](project_euler/problem_053/sol1.py) * Problem 054 * [Sol1](project_euler/problem_054/sol1.py) * [Test Poker Hand](project_euler/problem_054/test_poker_hand.py) * Problem 055 * [Sol1](project_euler/problem_055/sol1.py) * Problem 056 * [Sol1](project_euler/problem_056/sol1.py) * Problem 057 * [Sol1](project_euler/problem_057/sol1.py) * Problem 058 * 
[Sol1](project_euler/problem_058/sol1.py) * Problem 059 * [Sol1](project_euler/problem_059/sol1.py) * Problem 062 * [Sol1](project_euler/problem_062/sol1.py) * Problem 063 * [Sol1](project_euler/problem_063/sol1.py) * Problem 064 * [Sol1](project_euler/problem_064/sol1.py) * Problem 065 * [Sol1](project_euler/problem_065/sol1.py) * Problem 067 * [Sol1](project_euler/problem_067/sol1.py) * [Sol2](project_euler/problem_067/sol2.py) * Problem 068 * [Sol1](project_euler/problem_068/sol1.py) * Problem 069 * [Sol1](project_euler/problem_069/sol1.py) * Problem 070 * [Sol1](project_euler/problem_070/sol1.py) * Problem 071 * [Sol1](project_euler/problem_071/sol1.py) * Problem 072 * [Sol1](project_euler/problem_072/sol1.py) * [Sol2](project_euler/problem_072/sol2.py) * Problem 073 * [Sol1](project_euler/problem_073/sol1.py) * Problem 074 * [Sol1](project_euler/problem_074/sol1.py) * [Sol2](project_euler/problem_074/sol2.py) * Problem 075 * [Sol1](project_euler/problem_075/sol1.py) * Problem 076 * [Sol1](project_euler/problem_076/sol1.py) * Problem 077 * [Sol1](project_euler/problem_077/sol1.py) * Problem 078 * [Sol1](project_euler/problem_078/sol1.py) * Problem 080 * [Sol1](project_euler/problem_080/sol1.py) * Problem 081 * [Sol1](project_euler/problem_081/sol1.py) * Problem 085 * [Sol1](project_euler/problem_085/sol1.py) * Problem 086 * [Sol1](project_euler/problem_086/sol1.py) * Problem 087 * [Sol1](project_euler/problem_087/sol1.py) * Problem 089 * [Sol1](project_euler/problem_089/sol1.py) * Problem 091 * [Sol1](project_euler/problem_091/sol1.py) * Problem 092 * [Sol1](project_euler/problem_092/sol1.py) * Problem 097 * [Sol1](project_euler/problem_097/sol1.py) * Problem 099 * [Sol1](project_euler/problem_099/sol1.py) * Problem 101 * [Sol1](project_euler/problem_101/sol1.py) * Problem 102 * [Sol1](project_euler/problem_102/sol1.py) * Problem 104 * [Sol1](project_euler/problem_104/sol1.py) * Problem 107 * [Sol1](project_euler/problem_107/sol1.py) * Problem 109 * [Sol1](project_euler/problem_109/sol1.py) * Problem 112 * [Sol1](project_euler/problem_112/sol1.py) * Problem 113 * [Sol1](project_euler/problem_113/sol1.py) * Problem 114 * [Sol1](project_euler/problem_114/sol1.py) * Problem 115 * [Sol1](project_euler/problem_115/sol1.py) * Problem 116 * [Sol1](project_euler/problem_116/sol1.py) * Problem 119 * [Sol1](project_euler/problem_119/sol1.py) * Problem 120 * [Sol1](project_euler/problem_120/sol1.py) * Problem 121 * [Sol1](project_euler/problem_121/sol1.py) * Problem 123 * [Sol1](project_euler/problem_123/sol1.py) * Problem 125 * [Sol1](project_euler/problem_125/sol1.py) * Problem 129 * [Sol1](project_euler/problem_129/sol1.py) * Problem 135 * [Sol1](project_euler/problem_135/sol1.py) * Problem 144 * [Sol1](project_euler/problem_144/sol1.py) * Problem 145 * [Sol1](project_euler/problem_145/sol1.py) * Problem 173 * [Sol1](project_euler/problem_173/sol1.py) * Problem 174 * [Sol1](project_euler/problem_174/sol1.py) * Problem 180 * [Sol1](project_euler/problem_180/sol1.py) * Problem 188 * [Sol1](project_euler/problem_188/sol1.py) * Problem 191 * [Sol1](project_euler/problem_191/sol1.py) * Problem 203 * [Sol1](project_euler/problem_203/sol1.py) * Problem 205 * [Sol1](project_euler/problem_205/sol1.py) * Problem 206 * [Sol1](project_euler/problem_206/sol1.py) * Problem 207 * [Sol1](project_euler/problem_207/sol1.py) * Problem 234 * [Sol1](project_euler/problem_234/sol1.py) * Problem 301 * [Sol1](project_euler/problem_301/sol1.py) * Problem 493 * [Sol1](project_euler/problem_493/sol1.py) * Problem 551 * 
[Sol1](project_euler/problem_551/sol1.py) * Problem 587 * [Sol1](project_euler/problem_587/sol1.py) * Problem 686 * [Sol1](project_euler/problem_686/sol1.py) ## Quantum * [Deutsch Jozsa](quantum/deutsch_jozsa.py) * [Half Adder](quantum/half_adder.py) * [Not Gate](quantum/not_gate.py) * [Q Full Adder](quantum/q_full_adder.py) * [Quantum Entanglement](quantum/quantum_entanglement.py) * [Ripple Adder Classic](quantum/ripple_adder_classic.py) * [Single Qubit Measure](quantum/single_qubit_measure.py) * [Superdense Coding](quantum/superdense_coding.py) ## Scheduling * [First Come First Served](scheduling/first_come_first_served.py) * [Highest Response Ratio Next](scheduling/highest_response_ratio_next.py) * [Job Sequencing With Deadline](scheduling/job_sequencing_with_deadline.py) * [Multi Level Feedback Queue](scheduling/multi_level_feedback_queue.py) * [Non Preemptive Shortest Job First](scheduling/non_preemptive_shortest_job_first.py) * [Round Robin](scheduling/round_robin.py) * [Shortest Job First](scheduling/shortest_job_first.py) ## Searches * [Binary Search](searches/binary_search.py) * [Binary Tree Traversal](searches/binary_tree_traversal.py) * [Double Linear Search](searches/double_linear_search.py) * [Double Linear Search Recursion](searches/double_linear_search_recursion.py) * [Fibonacci Search](searches/fibonacci_search.py) * [Hill Climbing](searches/hill_climbing.py) * [Interpolation Search](searches/interpolation_search.py) * [Jump Search](searches/jump_search.py) * [Linear Search](searches/linear_search.py) * [Quick Select](searches/quick_select.py) * [Sentinel Linear Search](searches/sentinel_linear_search.py) * [Simple Binary Search](searches/simple_binary_search.py) * [Simulated Annealing](searches/simulated_annealing.py) * [Tabu Search](searches/tabu_search.py) * [Ternary Search](searches/ternary_search.py) ## Sorts * [Bead Sort](sorts/bead_sort.py) * [Bitonic Sort](sorts/bitonic_sort.py) * [Bogo Sort](sorts/bogo_sort.py) * [Bubble Sort](sorts/bubble_sort.py) * [Bucket Sort](sorts/bucket_sort.py) * [Circle Sort](sorts/circle_sort.py) * [Cocktail Shaker Sort](sorts/cocktail_shaker_sort.py) * [Comb Sort](sorts/comb_sort.py) * [Counting Sort](sorts/counting_sort.py) * [Cycle Sort](sorts/cycle_sort.py) * [Double Sort](sorts/double_sort.py) * [Dutch National Flag Sort](sorts/dutch_national_flag_sort.py) * [Exchange Sort](sorts/exchange_sort.py) * [External Sort](sorts/external_sort.py) * [Gnome Sort](sorts/gnome_sort.py) * [Heap Sort](sorts/heap_sort.py) * [Insertion Sort](sorts/insertion_sort.py) * [Intro Sort](sorts/intro_sort.py) * [Iterative Merge Sort](sorts/iterative_merge_sort.py) * [Merge Insertion Sort](sorts/merge_insertion_sort.py) * [Merge Sort](sorts/merge_sort.py) * [Msd Radix Sort](sorts/msd_radix_sort.py) * [Natural Sort](sorts/natural_sort.py) * [Odd Even Sort](sorts/odd_even_sort.py) * [Odd Even Transposition Parallel](sorts/odd_even_transposition_parallel.py) * [Odd Even Transposition Single Threaded](sorts/odd_even_transposition_single_threaded.py) * [Pancake Sort](sorts/pancake_sort.py) * [Patience Sort](sorts/patience_sort.py) * [Pigeon Sort](sorts/pigeon_sort.py) * [Pigeonhole Sort](sorts/pigeonhole_sort.py) * [Quick Sort](sorts/quick_sort.py) * [Quick Sort 3 Partition](sorts/quick_sort_3_partition.py) * [Radix Sort](sorts/radix_sort.py) * [Random Normal Distribution Quicksort](sorts/random_normal_distribution_quicksort.py) * [Random Pivot Quick Sort](sorts/random_pivot_quick_sort.py) * [Recursive Bubble Sort](sorts/recursive_bubble_sort.py) * [Recursive 
Insertion Sort](sorts/recursive_insertion_sort.py) * [Recursive Mergesort Array](sorts/recursive_mergesort_array.py) * [Recursive Quick Sort](sorts/recursive_quick_sort.py) * [Selection Sort](sorts/selection_sort.py) * [Shell Sort](sorts/shell_sort.py) * [Shrink Shell Sort](sorts/shrink_shell_sort.py) * [Slowsort](sorts/slowsort.py) * [Stooge Sort](sorts/stooge_sort.py) * [Strand Sort](sorts/strand_sort.py) * [Tim Sort](sorts/tim_sort.py) * [Topological Sort](sorts/topological_sort.py) * [Tree Sort](sorts/tree_sort.py) * [Unknown Sort](sorts/unknown_sort.py) * [Wiggle Sort](sorts/wiggle_sort.py) ## Strings * [Aho Corasick](strings/aho_corasick.py) * [Alternative String Arrange](strings/alternative_string_arrange.py) * [Anagrams](strings/anagrams.py) * [Autocomplete Using Trie](strings/autocomplete_using_trie.py) * [Barcode Validator](strings/barcode_validator.py) * [Boyer Moore Search](strings/boyer_moore_search.py) * [Can String Be Rearranged As Palindrome](strings/can_string_be_rearranged_as_palindrome.py) * [Capitalize](strings/capitalize.py) * [Check Anagrams](strings/check_anagrams.py) * [Credit Card Validator](strings/credit_card_validator.py) * [Detecting English Programmatically](strings/detecting_english_programmatically.py) * [Dna](strings/dna.py) * [Frequency Finder](strings/frequency_finder.py) * [Hamming Distance](strings/hamming_distance.py) * [Indian Phone Validator](strings/indian_phone_validator.py) * [Is Contains Unique Chars](strings/is_contains_unique_chars.py) * [Is Isogram](strings/is_isogram.py) * [Is Palindrome](strings/is_palindrome.py) * [Is Pangram](strings/is_pangram.py) * [Is Spain National Id](strings/is_spain_national_id.py) * [Is Srilankan Phone Number](strings/is_srilankan_phone_number.py) * [Jaro Winkler](strings/jaro_winkler.py) * [Join](strings/join.py) * [Knuth Morris Pratt](strings/knuth_morris_pratt.py) * [Levenshtein Distance](strings/levenshtein_distance.py) * [Lower](strings/lower.py) * [Manacher](strings/manacher.py) * [Min Cost String Conversion](strings/min_cost_string_conversion.py) * [Naive String Search](strings/naive_string_search.py) * [Ngram](strings/ngram.py) * [Palindrome](strings/palindrome.py) * [Prefix Function](strings/prefix_function.py) * [Rabin Karp](strings/rabin_karp.py) * [Remove Duplicate](strings/remove_duplicate.py) * [Reverse Letters](strings/reverse_letters.py) * [Reverse Long Words](strings/reverse_long_words.py) * [Reverse Words](strings/reverse_words.py) * [Snake Case To Camel Pascal Case](strings/snake_case_to_camel_pascal_case.py) * [Split](strings/split.py) * [Upper](strings/upper.py) * [Wave](strings/wave.py) * [Wildcard Pattern Matching](strings/wildcard_pattern_matching.py) * [Word Occurrence](strings/word_occurrence.py) * [Word Patterns](strings/word_patterns.py) * [Z Function](strings/z_function.py) ## Web Programming * [Co2 Emission](web_programming/co2_emission.py) * [Covid Stats Via Xpath](web_programming/covid_stats_via_xpath.py) * [Crawl Google Results](web_programming/crawl_google_results.py) * [Crawl Google Scholar Citation](web_programming/crawl_google_scholar_citation.py) * [Currency Converter](web_programming/currency_converter.py) * [Current Stock Price](web_programming/current_stock_price.py) * [Current Weather](web_programming/current_weather.py) * [Daily Horoscope](web_programming/daily_horoscope.py) * [Download Images From Google Query](web_programming/download_images_from_google_query.py) * [Emails From Url](web_programming/emails_from_url.py) * [Fetch Anime And 
Play](web_programming/fetch_anime_and_play.py) * [Fetch Bbc News](web_programming/fetch_bbc_news.py) * [Fetch Github Info](web_programming/fetch_github_info.py) * [Fetch Jobs](web_programming/fetch_jobs.py) * [Fetch Quotes](web_programming/fetch_quotes.py) * [Fetch Well Rx Price](web_programming/fetch_well_rx_price.py) * [Get Amazon Product Data](web_programming/get_amazon_product_data.py) * [Get Imdb Top 250 Movies Csv](web_programming/get_imdb_top_250_movies_csv.py) * [Get Imdbtop](web_programming/get_imdbtop.py) * [Get Top Billioners](web_programming/get_top_billioners.py) * [Get Top Hn Posts](web_programming/get_top_hn_posts.py) * [Get User Tweets](web_programming/get_user_tweets.py) * [Giphy](web_programming/giphy.py) * [Instagram Crawler](web_programming/instagram_crawler.py) * [Instagram Pic](web_programming/instagram_pic.py) * [Instagram Video](web_programming/instagram_video.py) * [Nasa Data](web_programming/nasa_data.py) * [Open Google Results](web_programming/open_google_results.py) * [Random Anime Character](web_programming/random_anime_character.py) * [Recaptcha Verification](web_programming/recaptcha_verification.py) * [Reddit](web_programming/reddit.py) * [Search Books By Isbn](web_programming/search_books_by_isbn.py) * [Slack Message](web_programming/slack_message.py) * [Test Fetch Github Info](web_programming/test_fetch_github_info.py) * [World Covid19 Stats](web_programming/world_covid19_stats.py)
## Arithmetic Analysis * [Bisection](arithmetic_analysis/bisection.py) * [Gaussian Elimination](arithmetic_analysis/gaussian_elimination.py) * [In Static Equilibrium](arithmetic_analysis/in_static_equilibrium.py) * [Intersection](arithmetic_analysis/intersection.py) * [Jacobi Iteration Method](arithmetic_analysis/jacobi_iteration_method.py) * [Lu Decomposition](arithmetic_analysis/lu_decomposition.py) * [Newton Forward Interpolation](arithmetic_analysis/newton_forward_interpolation.py) * [Newton Method](arithmetic_analysis/newton_method.py) * [Newton Raphson](arithmetic_analysis/newton_raphson.py) * [Newton Raphson New](arithmetic_analysis/newton_raphson_new.py) * [Secant Method](arithmetic_analysis/secant_method.py) ## Audio Filters * [Butterworth Filter](audio_filters/butterworth_filter.py) * [Equal Loudness Filter](audio_filters/equal_loudness_filter.py) * [Iir Filter](audio_filters/iir_filter.py) * [Show Response](audio_filters/show_response.py) ## Backtracking * [All Combinations](backtracking/all_combinations.py) * [All Permutations](backtracking/all_permutations.py) * [All Subsequences](backtracking/all_subsequences.py) * [Coloring](backtracking/coloring.py) * [Combination Sum](backtracking/combination_sum.py) * [Hamiltonian Cycle](backtracking/hamiltonian_cycle.py) * [Knight Tour](backtracking/knight_tour.py) * [Minimax](backtracking/minimax.py) * [Minmax](backtracking/minmax.py) * [N Queens](backtracking/n_queens.py) * [N Queens Math](backtracking/n_queens_math.py) * [Rat In Maze](backtracking/rat_in_maze.py) * [Sudoku](backtracking/sudoku.py) * [Sum Of Subsets](backtracking/sum_of_subsets.py) ## Bit Manipulation * [Binary And Operator](bit_manipulation/binary_and_operator.py) * [Binary Count Setbits](bit_manipulation/binary_count_setbits.py) * [Binary Count Trailing Zeros](bit_manipulation/binary_count_trailing_zeros.py) * [Binary Or Operator](bit_manipulation/binary_or_operator.py) * [Binary Shifts](bit_manipulation/binary_shifts.py) * [Binary Twos Complement](bit_manipulation/binary_twos_complement.py) * [Binary Xor Operator](bit_manipulation/binary_xor_operator.py) * [Count 1S Brian Kernighan Method](bit_manipulation/count_1s_brian_kernighan_method.py) * [Count Number Of One Bits](bit_manipulation/count_number_of_one_bits.py) * [Gray Code Sequence](bit_manipulation/gray_code_sequence.py) * [Highest Set Bit](bit_manipulation/highest_set_bit.py) * [Is Even](bit_manipulation/is_even.py) * [Reverse Bits](bit_manipulation/reverse_bits.py) * [Single Bit Manipulation Operations](bit_manipulation/single_bit_manipulation_operations.py) ## Blockchain * [Chinese Remainder Theorem](blockchain/chinese_remainder_theorem.py) * [Diophantine Equation](blockchain/diophantine_equation.py) * [Modular Division](blockchain/modular_division.py) ## Boolean Algebra * [And Gate](boolean_algebra/and_gate.py) * [Nand Gate](boolean_algebra/nand_gate.py) * [Norgate](boolean_algebra/norgate.py) * [Not Gate](boolean_algebra/not_gate.py) * [Or Gate](boolean_algebra/or_gate.py) * [Quine Mc Cluskey](boolean_algebra/quine_mc_cluskey.py) * [Xnor Gate](boolean_algebra/xnor_gate.py) * [Xor Gate](boolean_algebra/xor_gate.py) ## Cellular Automata * [Conways Game Of Life](cellular_automata/conways_game_of_life.py) * [Game Of Life](cellular_automata/game_of_life.py) * [Nagel Schrekenberg](cellular_automata/nagel_schrekenberg.py) * [One Dimensional](cellular_automata/one_dimensional.py) ## Ciphers * [A1Z26](ciphers/a1z26.py) * [Affine Cipher](ciphers/affine_cipher.py) * [Atbash](ciphers/atbash.py) * [Baconian 
Cipher](ciphers/baconian_cipher.py) * [Base16](ciphers/base16.py) * [Base32](ciphers/base32.py) * [Base64](ciphers/base64.py) * [Base85](ciphers/base85.py) * [Beaufort Cipher](ciphers/beaufort_cipher.py) * [Bifid](ciphers/bifid.py) * [Brute Force Caesar Cipher](ciphers/brute_force_caesar_cipher.py) * [Caesar Cipher](ciphers/caesar_cipher.py) * [Cryptomath Module](ciphers/cryptomath_module.py) * [Decrypt Caesar With Chi Squared](ciphers/decrypt_caesar_with_chi_squared.py) * [Deterministic Miller Rabin](ciphers/deterministic_miller_rabin.py) * [Diffie](ciphers/diffie.py) * [Diffie Hellman](ciphers/diffie_hellman.py) * [Elgamal Key Generator](ciphers/elgamal_key_generator.py) * [Enigma Machine2](ciphers/enigma_machine2.py) * [Hill Cipher](ciphers/hill_cipher.py) * [Mixed Keyword Cypher](ciphers/mixed_keyword_cypher.py) * [Mono Alphabetic Ciphers](ciphers/mono_alphabetic_ciphers.py) * [Morse Code](ciphers/morse_code.py) * [Onepad Cipher](ciphers/onepad_cipher.py) * [Playfair Cipher](ciphers/playfair_cipher.py) * [Polybius](ciphers/polybius.py) * [Porta Cipher](ciphers/porta_cipher.py) * [Rabin Miller](ciphers/rabin_miller.py) * [Rail Fence Cipher](ciphers/rail_fence_cipher.py) * [Rot13](ciphers/rot13.py) * [Rsa Cipher](ciphers/rsa_cipher.py) * [Rsa Factorization](ciphers/rsa_factorization.py) * [Rsa Key Generator](ciphers/rsa_key_generator.py) * [Shuffled Shift Cipher](ciphers/shuffled_shift_cipher.py) * [Simple Keyword Cypher](ciphers/simple_keyword_cypher.py) * [Simple Substitution Cipher](ciphers/simple_substitution_cipher.py) * [Trafid Cipher](ciphers/trafid_cipher.py) * [Transposition Cipher](ciphers/transposition_cipher.py) * [Transposition Cipher Encrypt Decrypt File](ciphers/transposition_cipher_encrypt_decrypt_file.py) * [Vigenere Cipher](ciphers/vigenere_cipher.py) * [Xor Cipher](ciphers/xor_cipher.py) ## Compression * [Burrows Wheeler](compression/burrows_wheeler.py) * [Huffman](compression/huffman.py) * [Lempel Ziv](compression/lempel_ziv.py) * [Lempel Ziv Decompress](compression/lempel_ziv_decompress.py) * [Peak Signal To Noise Ratio](compression/peak_signal_to_noise_ratio.py) * [Run Length Encoding](compression/run_length_encoding.py) ## Computer Vision * [Cnn Classification](computer_vision/cnn_classification.py) * [Flip Augmentation](computer_vision/flip_augmentation.py) * [Harris Corner](computer_vision/harris_corner.py) * [Horn Schunck](computer_vision/horn_schunck.py) * [Mean Threshold](computer_vision/mean_threshold.py) * [Mosaic Augmentation](computer_vision/mosaic_augmentation.py) * [Pooling Functions](computer_vision/pooling_functions.py) ## Conversions * [Astronomical Length Scale Conversion](conversions/astronomical_length_scale_conversion.py) * [Binary To Decimal](conversions/binary_to_decimal.py) * [Binary To Hexadecimal](conversions/binary_to_hexadecimal.py) * [Binary To Octal](conversions/binary_to_octal.py) * [Decimal To Any](conversions/decimal_to_any.py) * [Decimal To Binary](conversions/decimal_to_binary.py) * [Decimal To Binary Recursion](conversions/decimal_to_binary_recursion.py) * [Decimal To Hexadecimal](conversions/decimal_to_hexadecimal.py) * [Decimal To Octal](conversions/decimal_to_octal.py) * [Excel Title To Column](conversions/excel_title_to_column.py) * [Hex To Bin](conversions/hex_to_bin.py) * [Hexadecimal To Decimal](conversions/hexadecimal_to_decimal.py) * [Length Conversion](conversions/length_conversion.py) * [Molecular Chemistry](conversions/molecular_chemistry.py) * [Octal To Decimal](conversions/octal_to_decimal.py) * [Prefix 
Conversions](conversions/prefix_conversions.py) * [Prefix Conversions String](conversions/prefix_conversions_string.py) * [Pressure Conversions](conversions/pressure_conversions.py) * [Rgb Hsv Conversion](conversions/rgb_hsv_conversion.py) * [Roman Numerals](conversions/roman_numerals.py) * [Speed Conversions](conversions/speed_conversions.py) * [Temperature Conversions](conversions/temperature_conversions.py) * [Volume Conversions](conversions/volume_conversions.py) * [Weight Conversion](conversions/weight_conversion.py) ## Data Structures * Arrays * [Permutations](data_structures/arrays/permutations.py) * Binary Tree * [Avl Tree](data_structures/binary_tree/avl_tree.py) * [Basic Binary Tree](data_structures/binary_tree/basic_binary_tree.py) * [Binary Search Tree](data_structures/binary_tree/binary_search_tree.py) * [Binary Search Tree Recursive](data_structures/binary_tree/binary_search_tree_recursive.py) * [Binary Tree Mirror](data_structures/binary_tree/binary_tree_mirror.py) * [Binary Tree Node Sum](data_structures/binary_tree/binary_tree_node_sum.py) * [Binary Tree Path Sum](data_structures/binary_tree/binary_tree_path_sum.py) * [Binary Tree Traversals](data_structures/binary_tree/binary_tree_traversals.py) * [Diff Views Of Binary Tree](data_structures/binary_tree/diff_views_of_binary_tree.py) * [Fenwick Tree](data_structures/binary_tree/fenwick_tree.py) * [Inorder Tree Traversal 2022](data_structures/binary_tree/inorder_tree_traversal_2022.py) * [Lazy Segment Tree](data_structures/binary_tree/lazy_segment_tree.py) * [Lowest Common Ancestor](data_structures/binary_tree/lowest_common_ancestor.py) * [Maximum Fenwick Tree](data_structures/binary_tree/maximum_fenwick_tree.py) * [Merge Two Binary Trees](data_structures/binary_tree/merge_two_binary_trees.py) * [Non Recursive Segment Tree](data_structures/binary_tree/non_recursive_segment_tree.py) * [Number Of Possible Binary Trees](data_structures/binary_tree/number_of_possible_binary_trees.py) * [Red Black Tree](data_structures/binary_tree/red_black_tree.py) * [Segment Tree](data_structures/binary_tree/segment_tree.py) * [Segment Tree Other](data_structures/binary_tree/segment_tree_other.py) * [Treap](data_structures/binary_tree/treap.py) * [Wavelet Tree](data_structures/binary_tree/wavelet_tree.py) * Disjoint Set * [Alternate Disjoint Set](data_structures/disjoint_set/alternate_disjoint_set.py) * [Disjoint Set](data_structures/disjoint_set/disjoint_set.py) * Hashing * [Double Hash](data_structures/hashing/double_hash.py) * [Hash Table](data_structures/hashing/hash_table.py) * [Hash Table With Linked List](data_structures/hashing/hash_table_with_linked_list.py) * Number Theory * [Prime Numbers](data_structures/hashing/number_theory/prime_numbers.py) * [Quadratic Probing](data_structures/hashing/quadratic_probing.py) * Heap * [Binomial Heap](data_structures/heap/binomial_heap.py) * [Heap](data_structures/heap/heap.py) * [Heap Generic](data_structures/heap/heap_generic.py) * [Max Heap](data_structures/heap/max_heap.py) * [Min Heap](data_structures/heap/min_heap.py) * [Randomized Heap](data_structures/heap/randomized_heap.py) * [Skew Heap](data_structures/heap/skew_heap.py) * Linked List * [Circular Linked List](data_structures/linked_list/circular_linked_list.py) * [Deque Doubly](data_structures/linked_list/deque_doubly.py) * [Doubly Linked List](data_structures/linked_list/doubly_linked_list.py) * [Doubly Linked List Two](data_structures/linked_list/doubly_linked_list_two.py) * [From Sequence](data_structures/linked_list/from_sequence.py) 
* [Has Loop](data_structures/linked_list/has_loop.py) * [Is Palindrome](data_structures/linked_list/is_palindrome.py) * [Merge Two Lists](data_structures/linked_list/merge_two_lists.py) * [Middle Element Of Linked List](data_structures/linked_list/middle_element_of_linked_list.py) * [Print Reverse](data_structures/linked_list/print_reverse.py) * [Singly Linked List](data_structures/linked_list/singly_linked_list.py) * [Skip List](data_structures/linked_list/skip_list.py) * [Swap Nodes](data_structures/linked_list/swap_nodes.py) * Queue * [Circular Queue](data_structures/queue/circular_queue.py) * [Circular Queue Linked List](data_structures/queue/circular_queue_linked_list.py) * [Double Ended Queue](data_structures/queue/double_ended_queue.py) * [Linked Queue](data_structures/queue/linked_queue.py) * [Priority Queue Using List](data_structures/queue/priority_queue_using_list.py) * [Queue On List](data_structures/queue/queue_on_list.py) * [Queue On Pseudo Stack](data_structures/queue/queue_on_pseudo_stack.py) * Stacks * [Balanced Parentheses](data_structures/stacks/balanced_parentheses.py) * [Dijkstras Two Stack Algorithm](data_structures/stacks/dijkstras_two_stack_algorithm.py) * [Evaluate Postfix Notations](data_structures/stacks/evaluate_postfix_notations.py) * [Infix To Postfix Conversion](data_structures/stacks/infix_to_postfix_conversion.py) * [Infix To Prefix Conversion](data_structures/stacks/infix_to_prefix_conversion.py) * [Next Greater Element](data_structures/stacks/next_greater_element.py) * [Postfix Evaluation](data_structures/stacks/postfix_evaluation.py) * [Prefix Evaluation](data_structures/stacks/prefix_evaluation.py) * [Stack](data_structures/stacks/stack.py) * [Stack With Doubly Linked List](data_structures/stacks/stack_with_doubly_linked_list.py) * [Stack With Singly Linked List](data_structures/stacks/stack_with_singly_linked_list.py) * [Stock Span Problem](data_structures/stacks/stock_span_problem.py) * Trie * [Trie](data_structures/trie/trie.py) ## Digital Image Processing * [Change Brightness](digital_image_processing/change_brightness.py) * [Change Contrast](digital_image_processing/change_contrast.py) * [Convert To Negative](digital_image_processing/convert_to_negative.py) * Dithering * [Burkes](digital_image_processing/dithering/burkes.py) * Edge Detection * [Canny](digital_image_processing/edge_detection/canny.py) * Filters * [Bilateral Filter](digital_image_processing/filters/bilateral_filter.py) * [Convolve](digital_image_processing/filters/convolve.py) * [Gabor Filter](digital_image_processing/filters/gabor_filter.py) * [Gaussian Filter](digital_image_processing/filters/gaussian_filter.py) * [Local Binary Pattern](digital_image_processing/filters/local_binary_pattern.py) * [Median Filter](digital_image_processing/filters/median_filter.py) * [Sobel Filter](digital_image_processing/filters/sobel_filter.py) * Histogram Equalization * [Histogram Stretch](digital_image_processing/histogram_equalization/histogram_stretch.py) * [Index Calculation](digital_image_processing/index_calculation.py) * Morphological Operations * [Dilation Operation](digital_image_processing/morphological_operations/dilation_operation.py) * [Erosion Operation](digital_image_processing/morphological_operations/erosion_operation.py) * Resize * [Resize](digital_image_processing/resize/resize.py) * Rotation * [Rotation](digital_image_processing/rotation/rotation.py) * [Sepia](digital_image_processing/sepia.py) * [Test Digital Image 
Processing](digital_image_processing/test_digital_image_processing.py) ## Divide And Conquer * [Closest Pair Of Points](divide_and_conquer/closest_pair_of_points.py) * [Convex Hull](divide_and_conquer/convex_hull.py) * [Heaps Algorithm](divide_and_conquer/heaps_algorithm.py) * [Heaps Algorithm Iterative](divide_and_conquer/heaps_algorithm_iterative.py) * [Inversions](divide_and_conquer/inversions.py) * [Kth Order Statistic](divide_and_conquer/kth_order_statistic.py) * [Max Difference Pair](divide_and_conquer/max_difference_pair.py) * [Max Subarray Sum](divide_and_conquer/max_subarray_sum.py) * [Mergesort](divide_and_conquer/mergesort.py) * [Peak](divide_and_conquer/peak.py) * [Power](divide_and_conquer/power.py) * [Strassen Matrix Multiplication](divide_and_conquer/strassen_matrix_multiplication.py) ## Dynamic Programming * [Abbreviation](dynamic_programming/abbreviation.py) * [All Construct](dynamic_programming/all_construct.py) * [Bitmask](dynamic_programming/bitmask.py) * [Catalan Numbers](dynamic_programming/catalan_numbers.py) * [Climbing Stairs](dynamic_programming/climbing_stairs.py) * [Combination Sum Iv](dynamic_programming/combination_sum_iv.py) * [Edit Distance](dynamic_programming/edit_distance.py) * [Factorial](dynamic_programming/factorial.py) * [Fast Fibonacci](dynamic_programming/fast_fibonacci.py) * [Fibonacci](dynamic_programming/fibonacci.py) * [Fizz Buzz](dynamic_programming/fizz_buzz.py) * [Floyd Warshall](dynamic_programming/floyd_warshall.py) * [Integer Partition](dynamic_programming/integer_partition.py) * [Iterating Through Submasks](dynamic_programming/iterating_through_submasks.py) * [Knapsack](dynamic_programming/knapsack.py) * [Longest Common Subsequence](dynamic_programming/longest_common_subsequence.py) * [Longest Common Substring](dynamic_programming/longest_common_substring.py) * [Longest Increasing Subsequence](dynamic_programming/longest_increasing_subsequence.py) * [Longest Increasing Subsequence O(Nlogn)](dynamic_programming/longest_increasing_subsequence_o(nlogn).py) * [Longest Sub Array](dynamic_programming/longest_sub_array.py) * [Matrix Chain Order](dynamic_programming/matrix_chain_order.py) * [Max Non Adjacent Sum](dynamic_programming/max_non_adjacent_sum.py) * [Max Sub Array](dynamic_programming/max_sub_array.py) * [Max Sum Contiguous Subsequence](dynamic_programming/max_sum_contiguous_subsequence.py) * [Minimum Coin Change](dynamic_programming/minimum_coin_change.py) * [Minimum Cost Path](dynamic_programming/minimum_cost_path.py) * [Minimum Partition](dynamic_programming/minimum_partition.py) * [Minimum Squares To Represent A Number](dynamic_programming/minimum_squares_to_represent_a_number.py) * [Minimum Steps To One](dynamic_programming/minimum_steps_to_one.py) * [Optimal Binary Search Tree](dynamic_programming/optimal_binary_search_tree.py) * [Rod Cutting](dynamic_programming/rod_cutting.py) * [Subset Generation](dynamic_programming/subset_generation.py) * [Sum Of Subset](dynamic_programming/sum_of_subset.py) * [Viterbi](dynamic_programming/viterbi.py) ## Electronics * [Carrier Concentration](electronics/carrier_concentration.py) * [Coulombs Law](electronics/coulombs_law.py) * [Electric Conductivity](electronics/electric_conductivity.py) * [Electric Power](electronics/electric_power.py) * [Electrical Impedance](electronics/electrical_impedance.py) * [Ohms Law](electronics/ohms_law.py) * [Resonant Frequency](electronics/resonant_frequency.py) ## File Transfer * [Receive File](file_transfer/receive_file.py) * [Send 
File](file_transfer/send_file.py) * Tests * [Test Send File](file_transfer/tests/test_send_file.py) ## Financial * [Equated Monthly Installments](financial/equated_monthly_installments.py) * [Interest](financial/interest.py) * [Price Plus Tax](financial/price_plus_tax.py) ## Fractals * [Julia Sets](fractals/julia_sets.py) * [Koch Snowflake](fractals/koch_snowflake.py) * [Mandelbrot](fractals/mandelbrot.py) * [Sierpinski Triangle](fractals/sierpinski_triangle.py) ## Fuzzy Logic * [Fuzzy Operations](fuzzy_logic/fuzzy_operations.py) ## Genetic Algorithm * [Basic String](genetic_algorithm/basic_string.py) ## Geodesy * [Haversine Distance](geodesy/haversine_distance.py) * [Lamberts Ellipsoidal Distance](geodesy/lamberts_ellipsoidal_distance.py) ## Graphics * [Bezier Curve](graphics/bezier_curve.py) * [Vector3 For 2D Rendering](graphics/vector3_for_2d_rendering.py) ## Graphs * [A Star](graphs/a_star.py) * [Articulation Points](graphs/articulation_points.py) * [Basic Graphs](graphs/basic_graphs.py) * [Bellman Ford](graphs/bellman_ford.py) * [Bidirectional A Star](graphs/bidirectional_a_star.py) * [Bidirectional Breadth First Search](graphs/bidirectional_breadth_first_search.py) * [Boruvka](graphs/boruvka.py) * [Breadth First Search](graphs/breadth_first_search.py) * [Breadth First Search 2](graphs/breadth_first_search_2.py) * [Breadth First Search Shortest Path](graphs/breadth_first_search_shortest_path.py) * [Breadth First Search Shortest Path 2](graphs/breadth_first_search_shortest_path_2.py) * [Breadth First Search Zero One Shortest Path](graphs/breadth_first_search_zero_one_shortest_path.py) * [Check Bipartite Graph Bfs](graphs/check_bipartite_graph_bfs.py) * [Check Bipartite Graph Dfs](graphs/check_bipartite_graph_dfs.py) * [Check Cycle](graphs/check_cycle.py) * [Connected Components](graphs/connected_components.py) * [Depth First Search](graphs/depth_first_search.py) * [Depth First Search 2](graphs/depth_first_search_2.py) * [Dijkstra](graphs/dijkstra.py) * [Dijkstra 2](graphs/dijkstra_2.py) * [Dijkstra Algorithm](graphs/dijkstra_algorithm.py) * [Dijkstra Alternate](graphs/dijkstra_alternate.py) * [Dinic](graphs/dinic.py) * [Directed And Undirected (Weighted) Graph](graphs/directed_and_undirected_(weighted)_graph.py) * [Edmonds Karp Multiple Source And Sink](graphs/edmonds_karp_multiple_source_and_sink.py) * [Eulerian Path And Circuit For Undirected Graph](graphs/eulerian_path_and_circuit_for_undirected_graph.py) * [Even Tree](graphs/even_tree.py) * [Finding Bridges](graphs/finding_bridges.py) * [Frequent Pattern Graph Miner](graphs/frequent_pattern_graph_miner.py) * [G Topological Sort](graphs/g_topological_sort.py) * [Gale Shapley Bigraph](graphs/gale_shapley_bigraph.py) * [Graph List](graphs/graph_list.py) * [Graph Matrix](graphs/graph_matrix.py) * [Graphs Floyd Warshall](graphs/graphs_floyd_warshall.py) * [Greedy Best First](graphs/greedy_best_first.py) * [Greedy Min Vertex Cover](graphs/greedy_min_vertex_cover.py) * [Kahns Algorithm Long](graphs/kahns_algorithm_long.py) * [Kahns Algorithm Topo](graphs/kahns_algorithm_topo.py) * [Karger](graphs/karger.py) * [Markov Chain](graphs/markov_chain.py) * [Matching Min Vertex Cover](graphs/matching_min_vertex_cover.py) * [Minimum Path Sum](graphs/minimum_path_sum.py) * [Minimum Spanning Tree Boruvka](graphs/minimum_spanning_tree_boruvka.py) * [Minimum Spanning Tree Kruskal](graphs/minimum_spanning_tree_kruskal.py) * [Minimum Spanning Tree Kruskal2](graphs/minimum_spanning_tree_kruskal2.py) * [Minimum Spanning Tree 
Prims](graphs/minimum_spanning_tree_prims.py) * [Minimum Spanning Tree Prims2](graphs/minimum_spanning_tree_prims2.py) * [Multi Heuristic Astar](graphs/multi_heuristic_astar.py) * [Page Rank](graphs/page_rank.py) * [Prim](graphs/prim.py) * [Random Graph Generator](graphs/random_graph_generator.py) * [Scc Kosaraju](graphs/scc_kosaraju.py) * [Strongly Connected Components](graphs/strongly_connected_components.py) * [Tarjans Scc](graphs/tarjans_scc.py) * Tests * [Test Min Spanning Tree Kruskal](graphs/tests/test_min_spanning_tree_kruskal.py) * [Test Min Spanning Tree Prim](graphs/tests/test_min_spanning_tree_prim.py) ## Greedy Methods * [Fractional Knapsack](greedy_methods/fractional_knapsack.py) * [Fractional Knapsack 2](greedy_methods/fractional_knapsack_2.py) * [Optimal Merge Pattern](greedy_methods/optimal_merge_pattern.py) ## Hashes * [Adler32](hashes/adler32.py) * [Chaos Machine](hashes/chaos_machine.py) * [Djb2](hashes/djb2.py) * [Enigma Machine](hashes/enigma_machine.py) * [Hamming Code](hashes/hamming_code.py) * [Luhn](hashes/luhn.py) * [Md5](hashes/md5.py) * [Sdbm](hashes/sdbm.py) * [Sha1](hashes/sha1.py) * [Sha256](hashes/sha256.py) ## Knapsack * [Greedy Knapsack](knapsack/greedy_knapsack.py) * [Knapsack](knapsack/knapsack.py) * [Recursive Approach Knapsack](knapsack/recursive_approach_knapsack.py) * Tests * [Test Greedy Knapsack](knapsack/tests/test_greedy_knapsack.py) * [Test Knapsack](knapsack/tests/test_knapsack.py) ## Linear Algebra * Src * [Conjugate Gradient](linear_algebra/src/conjugate_gradient.py) * [Lib](linear_algebra/src/lib.py) * [Polynom For Points](linear_algebra/src/polynom_for_points.py) * [Power Iteration](linear_algebra/src/power_iteration.py) * [Rayleigh Quotient](linear_algebra/src/rayleigh_quotient.py) * [Schur Complement](linear_algebra/src/schur_complement.py) * [Test Linear Algebra](linear_algebra/src/test_linear_algebra.py) * [Transformations 2D](linear_algebra/src/transformations_2d.py) ## Machine Learning * [Astar](machine_learning/astar.py) * [Data Transformations](machine_learning/data_transformations.py) * [Decision Tree](machine_learning/decision_tree.py) * Forecasting * [Run](machine_learning/forecasting/run.py) * [Gaussian Naive Bayes](machine_learning/gaussian_naive_bayes.py) * [Gradient Boosting Regressor](machine_learning/gradient_boosting_regressor.py) * [Gradient Descent](machine_learning/gradient_descent.py) * [K Means Clust](machine_learning/k_means_clust.py) * [K Nearest Neighbours](machine_learning/k_nearest_neighbours.py) * [Knn Sklearn](machine_learning/knn_sklearn.py) * [Linear Discriminant Analysis](machine_learning/linear_discriminant_analysis.py) * [Linear Regression](machine_learning/linear_regression.py) * Local Weighted Learning * [Local Weighted Learning](machine_learning/local_weighted_learning/local_weighted_learning.py) * [Logistic Regression](machine_learning/logistic_regression.py) * Lstm * [Lstm Prediction](machine_learning/lstm/lstm_prediction.py) * [Multilayer Perceptron Classifier](machine_learning/multilayer_perceptron_classifier.py) * [Polymonial Regression](machine_learning/polymonial_regression.py) * [Random Forest Classifier](machine_learning/random_forest_classifier.py) * [Random Forest Regressor](machine_learning/random_forest_regressor.py) * [Scoring Functions](machine_learning/scoring_functions.py) * [Self Organizing Map](machine_learning/self_organizing_map.py) * [Sequential Minimum Optimization](machine_learning/sequential_minimum_optimization.py) * [Similarity Search](machine_learning/similarity_search.py) * 
[Support Vector Machines](machine_learning/support_vector_machines.py) * [Word Frequency Functions](machine_learning/word_frequency_functions.py) * [Xgboost Classifier](machine_learning/xgboost_classifier.py) * [Xgboost Regressor](machine_learning/xgboost_regressor.py) ## Maths * [3N Plus 1](maths/3n_plus_1.py) * [Abs](maths/abs.py) * [Abs Max](maths/abs_max.py) * [Abs Min](maths/abs_min.py) * [Add](maths/add.py) * [Aliquot Sum](maths/aliquot_sum.py) * [Allocation Number](maths/allocation_number.py) * [Arc Length](maths/arc_length.py) * [Area](maths/area.py) * [Area Under Curve](maths/area_under_curve.py) * [Armstrong Numbers](maths/armstrong_numbers.py) * [Average Absolute Deviation](maths/average_absolute_deviation.py) * [Average Mean](maths/average_mean.py) * [Average Median](maths/average_median.py) * [Average Mode](maths/average_mode.py) * [Bailey Borwein Plouffe](maths/bailey_borwein_plouffe.py) * [Basic Maths](maths/basic_maths.py) * [Binary Exp Mod](maths/binary_exp_mod.py) * [Binary Exponentiation](maths/binary_exponentiation.py) * [Binary Exponentiation 2](maths/binary_exponentiation_2.py) * [Binary Exponentiation 3](maths/binary_exponentiation_3.py) * [Binomial Coefficient](maths/binomial_coefficient.py) * [Binomial Distribution](maths/binomial_distribution.py) * [Bisection](maths/bisection.py) * [Carmichael Number](maths/carmichael_number.py) * [Catalan Number](maths/catalan_number.py) * [Ceil](maths/ceil.py) * [Check Polygon](maths/check_polygon.py) * [Chudnovsky Algorithm](maths/chudnovsky_algorithm.py) * [Collatz Sequence](maths/collatz_sequence.py) * [Combinations](maths/combinations.py) * [Decimal Isolate](maths/decimal_isolate.py) * [Double Factorial Iterative](maths/double_factorial_iterative.py) * [Double Factorial Recursive](maths/double_factorial_recursive.py) * [Entropy](maths/entropy.py) * [Euclidean Distance](maths/euclidean_distance.py) * [Euclidean Gcd](maths/euclidean_gcd.py) * [Euler Method](maths/euler_method.py) * [Euler Modified](maths/euler_modified.py) * [Eulers Totient](maths/eulers_totient.py) * [Extended Euclidean Algorithm](maths/extended_euclidean_algorithm.py) * [Factorial Iterative](maths/factorial_iterative.py) * [Factorial Recursive](maths/factorial_recursive.py) * [Factors](maths/factors.py) * [Fermat Little Theorem](maths/fermat_little_theorem.py) * [Fibonacci](maths/fibonacci.py) * [Find Max](maths/find_max.py) * [Find Max Recursion](maths/find_max_recursion.py) * [Find Min](maths/find_min.py) * [Find Min Recursion](maths/find_min_recursion.py) * [Floor](maths/floor.py) * [Gamma](maths/gamma.py) * [Gamma Recursive](maths/gamma_recursive.py) * [Gaussian](maths/gaussian.py) * [Gaussian Error Linear Unit](maths/gaussian_error_linear_unit.py) * [Greatest Common Divisor](maths/greatest_common_divisor.py) * [Greedy Coin Change](maths/greedy_coin_change.py) * [Hamming Numbers](maths/hamming_numbers.py) * [Hardy Ramanujanalgo](maths/hardy_ramanujanalgo.py) * [Integration By Simpson Approx](maths/integration_by_simpson_approx.py) * [Is Ip V4 Address Valid](maths/is_ip_v4_address_valid.py) * [Is Square Free](maths/is_square_free.py) * [Jaccard Similarity](maths/jaccard_similarity.py) * [Kadanes](maths/kadanes.py) * [Karatsuba](maths/karatsuba.py) * [Krishnamurthy Number](maths/krishnamurthy_number.py) * [Kth Lexicographic Permutation](maths/kth_lexicographic_permutation.py) * [Largest Of Very Large Numbers](maths/largest_of_very_large_numbers.py) * [Largest Subarray Sum](maths/largest_subarray_sum.py) * [Least Common 
Multiple](maths/least_common_multiple.py) * [Line Length](maths/line_length.py) * [Lucas Lehmer Primality Test](maths/lucas_lehmer_primality_test.py) * [Lucas Series](maths/lucas_series.py) * [Maclaurin Series](maths/maclaurin_series.py) * [Matrix Exponentiation](maths/matrix_exponentiation.py) * [Max Sum Sliding Window](maths/max_sum_sliding_window.py) * [Median Of Two Arrays](maths/median_of_two_arrays.py) * [Miller Rabin](maths/miller_rabin.py) * [Mobius Function](maths/mobius_function.py) * [Modular Exponential](maths/modular_exponential.py) * [Monte Carlo](maths/monte_carlo.py) * [Monte Carlo Dice](maths/monte_carlo_dice.py) * [Nevilles Method](maths/nevilles_method.py) * [Newton Raphson](maths/newton_raphson.py) * [Number Of Digits](maths/number_of_digits.py) * [Numerical Integration](maths/numerical_integration.py) * [Perfect Cube](maths/perfect_cube.py) * [Perfect Number](maths/perfect_number.py) * [Perfect Square](maths/perfect_square.py) * [Persistence](maths/persistence.py) * [Pi Monte Carlo Estimation](maths/pi_monte_carlo_estimation.py) * [Points Are Collinear 3D](maths/points_are_collinear_3d.py) * [Pollard Rho](maths/pollard_rho.py) * [Polynomial Evaluation](maths/polynomial_evaluation.py) * [Power Using Recursion](maths/power_using_recursion.py) * [Prime Check](maths/prime_check.py) * [Prime Factors](maths/prime_factors.py) * [Prime Numbers](maths/prime_numbers.py) * [Prime Sieve Eratosthenes](maths/prime_sieve_eratosthenes.py) * [Primelib](maths/primelib.py) * [Proth Number](maths/proth_number.py) * [Pythagoras](maths/pythagoras.py) * [Qr Decomposition](maths/qr_decomposition.py) * [Quadratic Equations Complex Numbers](maths/quadratic_equations_complex_numbers.py) * [Radians](maths/radians.py) * [Radix2 Fft](maths/radix2_fft.py) * [Relu](maths/relu.py) * [Runge Kutta](maths/runge_kutta.py) * [Segmented Sieve](maths/segmented_sieve.py) * Series * [Arithmetic](maths/series/arithmetic.py) * [Geometric](maths/series/geometric.py) * [Geometric Series](maths/series/geometric_series.py) * [Harmonic](maths/series/harmonic.py) * [Harmonic Series](maths/series/harmonic_series.py) * [Hexagonal Numbers](maths/series/hexagonal_numbers.py) * [P Series](maths/series/p_series.py) * [Sieve Of Eratosthenes](maths/sieve_of_eratosthenes.py) * [Sigmoid](maths/sigmoid.py) * [Sigmoid Linear Unit](maths/sigmoid_linear_unit.py) * [Signum](maths/signum.py) * [Simpson Rule](maths/simpson_rule.py) * [Sin](maths/sin.py) * [Sock Merchant](maths/sock_merchant.py) * [Softmax](maths/softmax.py) * [Square Root](maths/square_root.py) * [Sum Of Arithmetic Series](maths/sum_of_arithmetic_series.py) * [Sum Of Digits](maths/sum_of_digits.py) * [Sum Of Geometric Progression](maths/sum_of_geometric_progression.py) * [Sum Of Harmonic Series](maths/sum_of_harmonic_series.py) * [Sylvester Sequence](maths/sylvester_sequence.py) * [Test Prime Check](maths/test_prime_check.py) * [Trapezoidal Rule](maths/trapezoidal_rule.py) * [Triplet Sum](maths/triplet_sum.py) * [Two Pointer](maths/two_pointer.py) * [Two Sum](maths/two_sum.py) * [Ugly Numbers](maths/ugly_numbers.py) * [Volume](maths/volume.py) * [Weird Number](maths/weird_number.py) * [Zellers Congruence](maths/zellers_congruence.py) ## Matrix * [Binary Search Matrix](matrix/binary_search_matrix.py) * [Count Islands In Matrix](matrix/count_islands_in_matrix.py) * [Count Paths](matrix/count_paths.py) * [Cramers Rule 2X2](matrix/cramers_rule_2x2.py) * [Inverse Of Matrix](matrix/inverse_of_matrix.py) * [Largest Square Area In 
Matrix](matrix/largest_square_area_in_matrix.py) * [Matrix Class](matrix/matrix_class.py) * [Matrix Operation](matrix/matrix_operation.py) * [Max Area Of Island](matrix/max_area_of_island.py) * [Nth Fibonacci Using Matrix Exponentiation](matrix/nth_fibonacci_using_matrix_exponentiation.py) * [Rotate Matrix](matrix/rotate_matrix.py) * [Searching In Sorted Matrix](matrix/searching_in_sorted_matrix.py) * [Sherman Morrison](matrix/sherman_morrison.py) * [Spiral Print](matrix/spiral_print.py) * Tests * [Test Matrix Operation](matrix/tests/test_matrix_operation.py) ## Networking Flow * [Ford Fulkerson](networking_flow/ford_fulkerson.py) * [Minimum Cut](networking_flow/minimum_cut.py) ## Neural Network * [2 Hidden Layers Neural Network](neural_network/2_hidden_layers_neural_network.py) * [Back Propagation Neural Network](neural_network/back_propagation_neural_network.py) * [Convolution Neural Network](neural_network/convolution_neural_network.py) * [Perceptron](neural_network/perceptron.py) * [Simple Neural Network](neural_network/simple_neural_network.py) ## Other * [Activity Selection](other/activity_selection.py) * [Alternative List Arrange](other/alternative_list_arrange.py) * [Check Strong Password](other/check_strong_password.py) * [Davisb Putnamb Logemannb Loveland](other/davisb_putnamb_logemannb_loveland.py) * [Dijkstra Bankers Algorithm](other/dijkstra_bankers_algorithm.py) * [Doomsday](other/doomsday.py) * [Fischer Yates Shuffle](other/fischer_yates_shuffle.py) * [Gauss Easter](other/gauss_easter.py) * [Graham Scan](other/graham_scan.py) * [Greedy](other/greedy.py) * [Least Recently Used](other/least_recently_used.py) * [Lfu Cache](other/lfu_cache.py) * [Linear Congruential Generator](other/linear_congruential_generator.py) * [Lru Cache](other/lru_cache.py) * [Magicdiamondpattern](other/magicdiamondpattern.py) * [Maximum Subarray](other/maximum_subarray.py) * [Nested Brackets](other/nested_brackets.py) * [Password Generator](other/password_generator.py) * [Scoring Algorithm](other/scoring_algorithm.py) * [Sdes](other/sdes.py) * [Tower Of Hanoi](other/tower_of_hanoi.py) ## Physics * [Casimir Effect](physics/casimir_effect.py) * [Centripetal Force](physics/centripetal_force.py) * [Horizontal Projectile Motion](physics/horizontal_projectile_motion.py) * [Kinetic Energy](physics/kinetic_energy.py) * [Lorentz Transformation Four Vector](physics/lorentz_transformation_four_vector.py) * [Malus Law](physics/malus_law.py) * [N Body Simulation](physics/n_body_simulation.py) * [Newtons Law Of Gravitation](physics/newtons_law_of_gravitation.py) * [Newtons Second Law Of Motion](physics/newtons_second_law_of_motion.py) * [Potential Energy](physics/potential_energy.py) * [Sheer Stress](physics/sheer_stress.py) ## Project Euler * Problem 001 * [Sol1](project_euler/problem_001/sol1.py) * [Sol2](project_euler/problem_001/sol2.py) * [Sol3](project_euler/problem_001/sol3.py) * [Sol4](project_euler/problem_001/sol4.py) * [Sol5](project_euler/problem_001/sol5.py) * [Sol6](project_euler/problem_001/sol6.py) * [Sol7](project_euler/problem_001/sol7.py) * Problem 002 * [Sol1](project_euler/problem_002/sol1.py) * [Sol2](project_euler/problem_002/sol2.py) * [Sol3](project_euler/problem_002/sol3.py) * [Sol4](project_euler/problem_002/sol4.py) * [Sol5](project_euler/problem_002/sol5.py) * Problem 003 * [Sol1](project_euler/problem_003/sol1.py) * [Sol2](project_euler/problem_003/sol2.py) * [Sol3](project_euler/problem_003/sol3.py) * Problem 004 * [Sol1](project_euler/problem_004/sol1.py) * 
[Sol2](project_euler/problem_004/sol2.py) * Problem 005 * [Sol1](project_euler/problem_005/sol1.py) * [Sol2](project_euler/problem_005/sol2.py) * Problem 006 * [Sol1](project_euler/problem_006/sol1.py) * [Sol2](project_euler/problem_006/sol2.py) * [Sol3](project_euler/problem_006/sol3.py) * [Sol4](project_euler/problem_006/sol4.py) * Problem 007 * [Sol1](project_euler/problem_007/sol1.py) * [Sol2](project_euler/problem_007/sol2.py) * [Sol3](project_euler/problem_007/sol3.py) * Problem 008 * [Sol1](project_euler/problem_008/sol1.py) * [Sol2](project_euler/problem_008/sol2.py) * [Sol3](project_euler/problem_008/sol3.py) * Problem 009 * [Sol1](project_euler/problem_009/sol1.py) * [Sol2](project_euler/problem_009/sol2.py) * [Sol3](project_euler/problem_009/sol3.py) * Problem 010 * [Sol1](project_euler/problem_010/sol1.py) * [Sol2](project_euler/problem_010/sol2.py) * [Sol3](project_euler/problem_010/sol3.py) * Problem 011 * [Sol1](project_euler/problem_011/sol1.py) * [Sol2](project_euler/problem_011/sol2.py) * Problem 012 * [Sol1](project_euler/problem_012/sol1.py) * [Sol2](project_euler/problem_012/sol2.py) * Problem 013 * [Sol1](project_euler/problem_013/sol1.py) * Problem 014 * [Sol1](project_euler/problem_014/sol1.py) * [Sol2](project_euler/problem_014/sol2.py) * Problem 015 * [Sol1](project_euler/problem_015/sol1.py) * Problem 016 * [Sol1](project_euler/problem_016/sol1.py) * [Sol2](project_euler/problem_016/sol2.py) * Problem 017 * [Sol1](project_euler/problem_017/sol1.py) * Problem 018 * [Solution](project_euler/problem_018/solution.py) * Problem 019 * [Sol1](project_euler/problem_019/sol1.py) * Problem 020 * [Sol1](project_euler/problem_020/sol1.py) * [Sol2](project_euler/problem_020/sol2.py) * [Sol3](project_euler/problem_020/sol3.py) * [Sol4](project_euler/problem_020/sol4.py) * Problem 021 * [Sol1](project_euler/problem_021/sol1.py) * Problem 022 * [Sol1](project_euler/problem_022/sol1.py) * [Sol2](project_euler/problem_022/sol2.py) * Problem 023 * [Sol1](project_euler/problem_023/sol1.py) * Problem 024 * [Sol1](project_euler/problem_024/sol1.py) * Problem 025 * [Sol1](project_euler/problem_025/sol1.py) * [Sol2](project_euler/problem_025/sol2.py) * [Sol3](project_euler/problem_025/sol3.py) * Problem 026 * [Sol1](project_euler/problem_026/sol1.py) * Problem 027 * [Sol1](project_euler/problem_027/sol1.py) * Problem 028 * [Sol1](project_euler/problem_028/sol1.py) * Problem 029 * [Sol1](project_euler/problem_029/sol1.py) * Problem 030 * [Sol1](project_euler/problem_030/sol1.py) * Problem 031 * [Sol1](project_euler/problem_031/sol1.py) * [Sol2](project_euler/problem_031/sol2.py) * Problem 032 * [Sol32](project_euler/problem_032/sol32.py) * Problem 033 * [Sol1](project_euler/problem_033/sol1.py) * Problem 034 * [Sol1](project_euler/problem_034/sol1.py) * Problem 035 * [Sol1](project_euler/problem_035/sol1.py) * Problem 036 * [Sol1](project_euler/problem_036/sol1.py) * Problem 037 * [Sol1](project_euler/problem_037/sol1.py) * Problem 038 * [Sol1](project_euler/problem_038/sol1.py) * Problem 039 * [Sol1](project_euler/problem_039/sol1.py) * Problem 040 * [Sol1](project_euler/problem_040/sol1.py) * Problem 041 * [Sol1](project_euler/problem_041/sol1.py) * Problem 042 * [Solution42](project_euler/problem_042/solution42.py) * Problem 043 * [Sol1](project_euler/problem_043/sol1.py) * Problem 044 * [Sol1](project_euler/problem_044/sol1.py) * Problem 045 * [Sol1](project_euler/problem_045/sol1.py) * Problem 046 * [Sol1](project_euler/problem_046/sol1.py) * Problem 047 * 
[Sol1](project_euler/problem_047/sol1.py) * Problem 048 * [Sol1](project_euler/problem_048/sol1.py) * Problem 049 * [Sol1](project_euler/problem_049/sol1.py) * Problem 050 * [Sol1](project_euler/problem_050/sol1.py) * Problem 051 * [Sol1](project_euler/problem_051/sol1.py) * Problem 052 * [Sol1](project_euler/problem_052/sol1.py) * Problem 053 * [Sol1](project_euler/problem_053/sol1.py) * Problem 054 * [Sol1](project_euler/problem_054/sol1.py) * [Test Poker Hand](project_euler/problem_054/test_poker_hand.py) * Problem 055 * [Sol1](project_euler/problem_055/sol1.py) * Problem 056 * [Sol1](project_euler/problem_056/sol1.py) * Problem 057 * [Sol1](project_euler/problem_057/sol1.py) * Problem 058 * [Sol1](project_euler/problem_058/sol1.py) * Problem 059 * [Sol1](project_euler/problem_059/sol1.py) * Problem 062 * [Sol1](project_euler/problem_062/sol1.py) * Problem 063 * [Sol1](project_euler/problem_063/sol1.py) * Problem 064 * [Sol1](project_euler/problem_064/sol1.py) * Problem 065 * [Sol1](project_euler/problem_065/sol1.py) * Problem 067 * [Sol1](project_euler/problem_067/sol1.py) * [Sol2](project_euler/problem_067/sol2.py) * Problem 068 * [Sol1](project_euler/problem_068/sol1.py) * Problem 069 * [Sol1](project_euler/problem_069/sol1.py) * Problem 070 * [Sol1](project_euler/problem_070/sol1.py) * Problem 071 * [Sol1](project_euler/problem_071/sol1.py) * Problem 072 * [Sol1](project_euler/problem_072/sol1.py) * [Sol2](project_euler/problem_072/sol2.py) * Problem 073 * [Sol1](project_euler/problem_073/sol1.py) * Problem 074 * [Sol1](project_euler/problem_074/sol1.py) * [Sol2](project_euler/problem_074/sol2.py) * Problem 075 * [Sol1](project_euler/problem_075/sol1.py) * Problem 076 * [Sol1](project_euler/problem_076/sol1.py) * Problem 077 * [Sol1](project_euler/problem_077/sol1.py) * Problem 078 * [Sol1](project_euler/problem_078/sol1.py) * Problem 080 * [Sol1](project_euler/problem_080/sol1.py) * Problem 081 * [Sol1](project_euler/problem_081/sol1.py) * Problem 085 * [Sol1](project_euler/problem_085/sol1.py) * Problem 086 * [Sol1](project_euler/problem_086/sol1.py) * Problem 087 * [Sol1](project_euler/problem_087/sol1.py) * Problem 089 * [Sol1](project_euler/problem_089/sol1.py) * Problem 091 * [Sol1](project_euler/problem_091/sol1.py) * Problem 092 * [Sol1](project_euler/problem_092/sol1.py) * Problem 097 * [Sol1](project_euler/problem_097/sol1.py) * Problem 099 * [Sol1](project_euler/problem_099/sol1.py) * Problem 101 * [Sol1](project_euler/problem_101/sol1.py) * Problem 102 * [Sol1](project_euler/problem_102/sol1.py) * Problem 104 * [Sol1](project_euler/problem_104/sol1.py) * Problem 107 * [Sol1](project_euler/problem_107/sol1.py) * Problem 109 * [Sol1](project_euler/problem_109/sol1.py) * Problem 112 * [Sol1](project_euler/problem_112/sol1.py) * Problem 113 * [Sol1](project_euler/problem_113/sol1.py) * Problem 114 * [Sol1](project_euler/problem_114/sol1.py) * Problem 115 * [Sol1](project_euler/problem_115/sol1.py) * Problem 116 * [Sol1](project_euler/problem_116/sol1.py) * Problem 119 * [Sol1](project_euler/problem_119/sol1.py) * Problem 120 * [Sol1](project_euler/problem_120/sol1.py) * Problem 121 * [Sol1](project_euler/problem_121/sol1.py) * Problem 123 * [Sol1](project_euler/problem_123/sol1.py) * Problem 125 * [Sol1](project_euler/problem_125/sol1.py) * Problem 129 * [Sol1](project_euler/problem_129/sol1.py) * Problem 135 * [Sol1](project_euler/problem_135/sol1.py) * Problem 144 * [Sol1](project_euler/problem_144/sol1.py) * Problem 145 * [Sol1](project_euler/problem_145/sol1.py) * 
Problem 173 * [Sol1](project_euler/problem_173/sol1.py) * Problem 174 * [Sol1](project_euler/problem_174/sol1.py) * Problem 180 * [Sol1](project_euler/problem_180/sol1.py) * Problem 188 * [Sol1](project_euler/problem_188/sol1.py) * Problem 191 * [Sol1](project_euler/problem_191/sol1.py) * Problem 203 * [Sol1](project_euler/problem_203/sol1.py) * Problem 205 * [Sol1](project_euler/problem_205/sol1.py) * Problem 206 * [Sol1](project_euler/problem_206/sol1.py) * Problem 207 * [Sol1](project_euler/problem_207/sol1.py) * Problem 234 * [Sol1](project_euler/problem_234/sol1.py) * Problem 301 * [Sol1](project_euler/problem_301/sol1.py) * Problem 493 * [Sol1](project_euler/problem_493/sol1.py) * Problem 551 * [Sol1](project_euler/problem_551/sol1.py) * Problem 587 * [Sol1](project_euler/problem_587/sol1.py) * Problem 686 * [Sol1](project_euler/problem_686/sol1.py) ## Quantum * [Deutsch Jozsa](quantum/deutsch_jozsa.py) * [Half Adder](quantum/half_adder.py) * [Not Gate](quantum/not_gate.py) * [Q Full Adder](quantum/q_full_adder.py) * [Quantum Entanglement](quantum/quantum_entanglement.py) * [Ripple Adder Classic](quantum/ripple_adder_classic.py) * [Single Qubit Measure](quantum/single_qubit_measure.py) * [Superdense Coding](quantum/superdense_coding.py) ## Scheduling * [First Come First Served](scheduling/first_come_first_served.py) * [Highest Response Ratio Next](scheduling/highest_response_ratio_next.py) * [Job Sequencing With Deadline](scheduling/job_sequencing_with_deadline.py) * [Multi Level Feedback Queue](scheduling/multi_level_feedback_queue.py) * [Non Preemptive Shortest Job First](scheduling/non_preemptive_shortest_job_first.py) * [Round Robin](scheduling/round_robin.py) * [Shortest Job First](scheduling/shortest_job_first.py) ## Searches * [Binary Search](searches/binary_search.py) * [Binary Tree Traversal](searches/binary_tree_traversal.py) * [Double Linear Search](searches/double_linear_search.py) * [Double Linear Search Recursion](searches/double_linear_search_recursion.py) * [Fibonacci Search](searches/fibonacci_search.py) * [Hill Climbing](searches/hill_climbing.py) * [Interpolation Search](searches/interpolation_search.py) * [Jump Search](searches/jump_search.py) * [Linear Search](searches/linear_search.py) * [Quick Select](searches/quick_select.py) * [Sentinel Linear Search](searches/sentinel_linear_search.py) * [Simple Binary Search](searches/simple_binary_search.py) * [Simulated Annealing](searches/simulated_annealing.py) * [Tabu Search](searches/tabu_search.py) * [Ternary Search](searches/ternary_search.py) ## Sorts * [Bead Sort](sorts/bead_sort.py) * [Bitonic Sort](sorts/bitonic_sort.py) * [Bogo Sort](sorts/bogo_sort.py) * [Bubble Sort](sorts/bubble_sort.py) * [Bucket Sort](sorts/bucket_sort.py) * [Circle Sort](sorts/circle_sort.py) * [Cocktail Shaker Sort](sorts/cocktail_shaker_sort.py) * [Comb Sort](sorts/comb_sort.py) * [Counting Sort](sorts/counting_sort.py) * [Cycle Sort](sorts/cycle_sort.py) * [Double Sort](sorts/double_sort.py) * [Dutch National Flag Sort](sorts/dutch_national_flag_sort.py) * [Exchange Sort](sorts/exchange_sort.py) * [External Sort](sorts/external_sort.py) * [Gnome Sort](sorts/gnome_sort.py) * [Heap Sort](sorts/heap_sort.py) * [Insertion Sort](sorts/insertion_sort.py) * [Intro Sort](sorts/intro_sort.py) * [Iterative Merge Sort](sorts/iterative_merge_sort.py) * [Merge Insertion Sort](sorts/merge_insertion_sort.py) * [Merge Sort](sorts/merge_sort.py) * [Msd Radix Sort](sorts/msd_radix_sort.py) * [Natural Sort](sorts/natural_sort.py) * [Odd Even 
Sort](sorts/odd_even_sort.py) * [Odd Even Transposition Parallel](sorts/odd_even_transposition_parallel.py) * [Odd Even Transposition Single Threaded](sorts/odd_even_transposition_single_threaded.py) * [Pancake Sort](sorts/pancake_sort.py) * [Patience Sort](sorts/patience_sort.py) * [Pigeon Sort](sorts/pigeon_sort.py) * [Pigeonhole Sort](sorts/pigeonhole_sort.py) * [Quick Sort](sorts/quick_sort.py) * [Quick Sort 3 Partition](sorts/quick_sort_3_partition.py) * [Radix Sort](sorts/radix_sort.py) * [Random Normal Distribution Quicksort](sorts/random_normal_distribution_quicksort.py) * [Random Pivot Quick Sort](sorts/random_pivot_quick_sort.py) * [Recursive Bubble Sort](sorts/recursive_bubble_sort.py) * [Recursive Insertion Sort](sorts/recursive_insertion_sort.py) * [Recursive Mergesort Array](sorts/recursive_mergesort_array.py) * [Recursive Quick Sort](sorts/recursive_quick_sort.py) * [Selection Sort](sorts/selection_sort.py) * [Shell Sort](sorts/shell_sort.py) * [Shrink Shell Sort](sorts/shrink_shell_sort.py) * [Slowsort](sorts/slowsort.py) * [Stooge Sort](sorts/stooge_sort.py) * [Strand Sort](sorts/strand_sort.py) * [Tim Sort](sorts/tim_sort.py) * [Topological Sort](sorts/topological_sort.py) * [Tree Sort](sorts/tree_sort.py) * [Unknown Sort](sorts/unknown_sort.py) * [Wiggle Sort](sorts/wiggle_sort.py) ## Strings * [Aho Corasick](strings/aho_corasick.py) * [Alternative String Arrange](strings/alternative_string_arrange.py) * [Anagrams](strings/anagrams.py) * [Autocomplete Using Trie](strings/autocomplete_using_trie.py) * [Barcode Validator](strings/barcode_validator.py) * [Boyer Moore Search](strings/boyer_moore_search.py) * [Can String Be Rearranged As Palindrome](strings/can_string_be_rearranged_as_palindrome.py) * [Capitalize](strings/capitalize.py) * [Check Anagrams](strings/check_anagrams.py) * [Credit Card Validator](strings/credit_card_validator.py) * [Detecting English Programmatically](strings/detecting_english_programmatically.py) * [Dna](strings/dna.py) * [Frequency Finder](strings/frequency_finder.py) * [Hamming Distance](strings/hamming_distance.py) * [Indian Phone Validator](strings/indian_phone_validator.py) * [Is Contains Unique Chars](strings/is_contains_unique_chars.py) * [Is Isogram](strings/is_isogram.py) * [Is Palindrome](strings/is_palindrome.py) * [Is Pangram](strings/is_pangram.py) * [Is Spain National Id](strings/is_spain_national_id.py) * [Is Srilankan Phone Number](strings/is_srilankan_phone_number.py) * [Jaro Winkler](strings/jaro_winkler.py) * [Join](strings/join.py) * [Knuth Morris Pratt](strings/knuth_morris_pratt.py) * [Levenshtein Distance](strings/levenshtein_distance.py) * [Lower](strings/lower.py) * [Manacher](strings/manacher.py) * [Min Cost String Conversion](strings/min_cost_string_conversion.py) * [Naive String Search](strings/naive_string_search.py) * [Ngram](strings/ngram.py) * [Palindrome](strings/palindrome.py) * [Prefix Function](strings/prefix_function.py) * [Rabin Karp](strings/rabin_karp.py) * [Remove Duplicate](strings/remove_duplicate.py) * [Reverse Letters](strings/reverse_letters.py) * [Reverse Long Words](strings/reverse_long_words.py) * [Reverse Words](strings/reverse_words.py) * [Snake Case To Camel Pascal Case](strings/snake_case_to_camel_pascal_case.py) * [Split](strings/split.py) * [Text Justification](strings/text_justification.py) * [Upper](strings/upper.py) * [Wave](strings/wave.py) * [Wildcard Pattern Matching](strings/wildcard_pattern_matching.py) * [Word Occurrence](strings/word_occurrence.py) * [Word 
Patterns](strings/word_patterns.py) * [Z Function](strings/z_function.py) ## Web Programming * [Co2 Emission](web_programming/co2_emission.py) * [Covid Stats Via Xpath](web_programming/covid_stats_via_xpath.py) * [Crawl Google Results](web_programming/crawl_google_results.py) * [Crawl Google Scholar Citation](web_programming/crawl_google_scholar_citation.py) * [Currency Converter](web_programming/currency_converter.py) * [Current Stock Price](web_programming/current_stock_price.py) * [Current Weather](web_programming/current_weather.py) * [Daily Horoscope](web_programming/daily_horoscope.py) * [Download Images From Google Query](web_programming/download_images_from_google_query.py) * [Emails From Url](web_programming/emails_from_url.py) * [Fetch Anime And Play](web_programming/fetch_anime_and_play.py) * [Fetch Bbc News](web_programming/fetch_bbc_news.py) * [Fetch Github Info](web_programming/fetch_github_info.py) * [Fetch Jobs](web_programming/fetch_jobs.py) * [Fetch Quotes](web_programming/fetch_quotes.py) * [Fetch Well Rx Price](web_programming/fetch_well_rx_price.py) * [Get Amazon Product Data](web_programming/get_amazon_product_data.py) * [Get Imdb Top 250 Movies Csv](web_programming/get_imdb_top_250_movies_csv.py) * [Get Imdbtop](web_programming/get_imdbtop.py) * [Get Top Billioners](web_programming/get_top_billioners.py) * [Get Top Hn Posts](web_programming/get_top_hn_posts.py) * [Get User Tweets](web_programming/get_user_tweets.py) * [Giphy](web_programming/giphy.py) * [Instagram Crawler](web_programming/instagram_crawler.py) * [Instagram Pic](web_programming/instagram_pic.py) * [Instagram Video](web_programming/instagram_video.py) * [Nasa Data](web_programming/nasa_data.py) * [Open Google Results](web_programming/open_google_results.py) * [Random Anime Character](web_programming/random_anime_character.py) * [Recaptcha Verification](web_programming/recaptcha_verification.py) * [Reddit](web_programming/reddit.py) * [Search Books By Isbn](web_programming/search_books_by_isbn.py) * [Slack Message](web_programming/slack_message.py) * [Test Fetch Github Info](web_programming/test_fetch_github_info.py) * [World Covid19 Stats](web_programming/world_covid19_stats.py)
1
TheAlgorithms/Python
7,867
Update docs
### Describe your change: - Update docs to Python 3.10+ to match Py upgrade settings in pre-commit (`--py310-plus`) - Remove Exclusion from pre-commit-setting, this file have been fixed in #7844 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
Cjkjvfnby
"2022-10-30T08:30:31Z"
"2022-10-30T09:01:59Z"
17d93cab783095dd1def3c382866cd94296db455
57ccabbaeb0f32165271e3a218bc9c6dcfc21823
Update docs. ### Describe your change: - Update docs to Python 3.10+ to match Py upgrade settings in pre-commit (`--py310-plus`) - Remove Exclusion from pre-commit-setting, this file have been fixed in #7844 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
### Interest * Compound Interest: "Compound interest is calculated by multiplying the initial principal amount by one plus the annual interest rate raised to the number of compound periods minus one." [Compound Interest](https://www.investopedia.com/) * Simple Interest: "Simple interest paid or received over a certain period is a fixed percentage of the principal amount that was borrowed or lent. " [Simple Interest](https://www.investopedia.com/)
### Interest * Compound Interest: "Compound interest is calculated by multiplying the initial principal amount by one plus the annual interest rate raised to the number of compound periods minus one." [Compound Interest](https://www.investopedia.com/) * Simple Interest: "Simple interest paid or received over a certain period is a fixed percentage of the principal amount that was borrowed or lent. " [Simple Interest](https://www.investopedia.com/)
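Below is a minimal Python sketch of the two calculations quoted above, assuming a principal amount, a per-period rate expressed as a decimal, and a whole number of periods; the function names are illustrative and not taken from financial/interest.py.

```python
def simple_interest(principal: float, rate_per_period: float, periods: int) -> float:
    """Interest is a fixed percentage of the principal for every period."""
    return principal * rate_per_period * periods


def compound_interest(principal: float, rate_per_period: float, periods: int) -> float:
    """Principal times (1 + rate) raised to the number of periods, minus the principal."""
    return principal * ((1 + rate_per_period) ** periods - 1)


if __name__ == "__main__":
    # 1000 units at 5% per period for 3 periods
    print(simple_interest(1000, 0.05, 3))    # 150.0
    print(compound_interest(1000, 0.05, 3))  # ~157.63
```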
-1
TheAlgorithms/Python
7,867
Update docs
### Describe your change: - Update docs to Python 3.10+ to match Py upgrade settings in pre-commit (`--py310-plus`) - Remove Exclusion from pre-commit-setting, this file have been fixed in #7844 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
Cjkjvfnby
"2022-10-30T08:30:31Z"
"2022-10-30T09:01:59Z"
17d93cab783095dd1def3c382866cd94296db455
57ccabbaeb0f32165271e3a218bc9c6dcfc21823
Update docs. ### Describe your change: - Update docs to Python 3.10+ to match Py upgrade settings in pre-commit (`--py310-plus`) - Remove Exclusion from pre-commit-setting, this file have been fixed in #7844 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
# Audio Filter Audio filters work on the frequencies of an audio signal to attenuate unwanted frequencies and amplify wanted ones. They are used in anything related to sound, whether it is radio communication or a hi-fi system. * <https://www.masteringbox.com/filter-types/> * <http://ethanwiner.com/filters.html> * <https://en.wikipedia.org/wiki/Audio_filter> * <https://en.wikipedia.org/wiki/Electronic_filter>
# Audio Filter Audio filters work on the frequencies of an audio signal to attenuate unwanted frequencies and amplify wanted ones. They are used in anything related to sound, whether it is radio communication or a hi-fi system. * <https://www.masteringbox.com/filter-types/> * <http://ethanwiner.com/filters.html> * <https://en.wikipedia.org/wiki/Audio_filter> * <https://en.wikipedia.org/wiki/Electronic_filter>
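As a rough illustration of the idea (not an implementation from this repository), the sketch below applies a first-order low-pass IIR filter that lets low frequencies through and attenuates higher ones; the function name, cutoff, and test signal are made up for this example.

```python
import math


def low_pass(samples: list[float], cutoff_hz: float, sample_rate_hz: float) -> list[float]:
    """First-order low-pass IIR filter: y[n] = y[n-1] + alpha * (x[n] - y[n-1])."""
    rc = 1 / (2 * math.pi * cutoff_hz)  # RC time constant of the analog prototype
    dt = 1 / sample_rate_hz             # time step between samples
    alpha = dt / (rc + dt)              # smoothing factor in (0, 1)
    filtered, previous = [], 0.0
    for sample in samples:
        previous = previous + alpha * (sample - previous)
        filtered.append(previous)
    return filtered


if __name__ == "__main__":
    sample_rate = 44_100
    # A 50 Hz tone mixed with a quieter 5 kHz tone; the filter keeps mostly the 50 Hz part.
    signal = [
        math.sin(2 * math.pi * 50 * n / sample_rate)
        + 0.5 * math.sin(2 * math.pi * 5_000 * n / sample_rate)
        for n in range(2_000)
    ]
    smoothed = low_pass(signal, cutoff_hz=200, sample_rate_hz=sample_rate)
    print(smoothed[:3])
```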
-1
TheAlgorithms/Python
7,867
Update docs
### Describe your change: - Update docs to Python 3.10+ to match Py upgrade settings in pre-commit (`--py310-plus`) - Remove Exclusion from pre-commit-setting, this file have been fixed in #7844 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
Cjkjvfnby
"2022-10-30T08:30:31Z"
"2022-10-30T09:01:59Z"
17d93cab783095dd1def3c382866cd94296db455
57ccabbaeb0f32165271e3a218bc9c6dcfc21823
Update docs. ### Describe your change: - Update docs to Python 3.10+ to match Py upgrade settings in pre-commit (`--py310-plus`) - Remove Exclusion from pre-commit-setting, this file have been fixed in #7844 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
# Normal Distribution QuickSort QuickSort Algorithm where the pivot element is chosen randomly between the first and last elements of the array, and the array elements are taken from a Standard Normal Distribution. ## Array elements The array elements are taken from a Standard Normal Distribution, having mean = 0 and standard deviation = 1. ### The code ```python >>> import numpy as np >>> from tempfile import TemporaryFile >>> outfile = TemporaryFile() >>> p = 100 # 100 elements are to be sorted >>> mu, sigma = 0, 1 # mean and standard deviation >>> X = np.random.normal(mu, sigma, p) >>> np.save(outfile, X) >>> 'The array is' >>> X ``` ------ #### The distribution of the array elements ```python >>> import matplotlib.pyplot as plt >>> mu, sigma = 0, 1 # mean and standard deviation >>> s = np.random.normal(mu, sigma, p) >>> count, bins, ignored = plt.hist(s, 30, density=True) >>> plt.plot(bins, 1/(sigma * np.sqrt(2 * np.pi)) * np.exp(-(bins - mu)**2 / (2 * sigma**2)), linewidth=2, color='r') >>> plt.show() ``` ------ ![normal distribution large](https://upload.wikimedia.org/wikipedia/commons/thumb/2/25/The_Normal_Distribution.svg/1280px-The_Normal_Distribution.svg.png) ------ ## Comparing the numbers of comparisons We can plot the number of comparisons made by Normal Distribution QuickSort and Ordinary QuickSort: ```python >>> import matplotlib.pyplot as plt # Normal Distribution QuickSort is red >>> plt.plot([1,2,4,16,32,64,128,256,512,1024,2048],[1,1,6,15,43,136,340,800,2156,6821,16325],linewidth=2, color='r') # Ordinary QuickSort is green >>> plt.plot([1,2,4,16,32,64,128,256,512,1024,2048],[1,1,4,16,67,122,362,949,2131,5086,12866],linewidth=2, color='g') >>> plt.show() ```
# Normal Distribution QuickSort QuickSort Algorithm where the pivot element is chosen randomly between the first and last elements of the array, and the array elements are taken from a Standard Normal Distribution. ## Array elements The array elements are taken from a Standard Normal Distribution, having mean = 0 and standard deviation = 1. ### The code ```python >>> import numpy as np >>> from tempfile import TemporaryFile >>> outfile = TemporaryFile() >>> p = 100 # 100 elements are to be sorted >>> mu, sigma = 0, 1 # mean and standard deviation >>> X = np.random.normal(mu, sigma, p) >>> np.save(outfile, X) >>> 'The array is' >>> X ``` ------ #### The distribution of the array elements ```python >>> import matplotlib.pyplot as plt >>> mu, sigma = 0, 1 # mean and standard deviation >>> s = np.random.normal(mu, sigma, p) >>> count, bins, ignored = plt.hist(s, 30, density=True) >>> plt.plot(bins, 1/(sigma * np.sqrt(2 * np.pi)) * np.exp(-(bins - mu)**2 / (2 * sigma**2)), linewidth=2, color='r') >>> plt.show() ``` ------ ![normal distribution large](https://upload.wikimedia.org/wikipedia/commons/thumb/2/25/The_Normal_Distribution.svg/1280px-The_Normal_Distribution.svg.png) ------ ## Comparing the numbers of comparisons We can plot the number of comparisons made by Normal Distribution QuickSort and Ordinary QuickSort: ```python >>> import matplotlib.pyplot as plt # Normal Distribution QuickSort is red >>> plt.plot([1,2,4,16,32,64,128,256,512,1024,2048],[1,1,6,15,43,136,340,800,2156,6821,16325],linewidth=2, color='r') # Ordinary QuickSort is green >>> plt.plot([1,2,4,16,32,64,128,256,512,1024,2048],[1,1,4,16,67,122,362,949,2131,5086,12866],linewidth=2, color='g') >>> plt.show() ```
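For readers who want to reproduce counts like the ones plotted above, here is a hedged sketch (not the repository's sorts/random_normal_distribution_quicksort.py) of quicksort with a uniformly random pivot, run on standard-normal input, using the usual simplified accounting of one comparison per non-pivot element in each partition step.

```python
import random

import numpy as np


def quicksort(arr: list[float]) -> tuple[list[float], int]:
    """Sort with a randomly chosen pivot and return (sorted list, comparison count)."""
    if len(arr) <= 1:
        return list(arr), 0
    pivot = arr[random.randint(0, len(arr) - 1)]
    comparisons = len(arr) - 1  # each other element is compared against the pivot once
    smaller = [x for x in arr if x < pivot]
    equal = [x for x in arr if x == pivot]
    larger = [x for x in arr if x > pivot]
    sorted_smaller, left = quicksort(smaller)
    sorted_larger, right = quicksort(larger)
    return sorted_smaller + equal + sorted_larger, comparisons + left + right


if __name__ == "__main__":
    mu, sigma, p = 0, 1, 100
    data = np.random.normal(mu, sigma, p).tolist()  # standard-normal elements, as above
    ordered, count = quicksort(data)
    print(f"{count} comparisons to sort {p} elements")
```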
-1
TheAlgorithms/Python
7,867
Update docs
### Describe your change: - Update docs to Python 3.10+ to match Py upgrade settings in pre-commit (`--py310-plus`) - Remove Exclusion from pre-commit-setting, this file have been fixed in #7844 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
Cjkjvfnby
"2022-10-30T08:30:31Z"
"2022-10-30T09:01:59Z"
17d93cab783095dd1def3c382866cd94296db455
57ccabbaeb0f32165271e3a218bc9c6dcfc21823
Update docs. ### Describe your change: - Update docs to Python 3.10+ to match Py upgrade settings in pre-commit (`--py310-plus`) - Remove Exclusion from pre-commit-setting, this file have been fixed in #7844 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
# Locally Weighted Linear Regression It is a non-parametric ML algorithm that does not learn on a fixed set of parameters as **linear regression** does. \ So, here comes the question: what is *linear regression*? \ **Linear regression** is a supervised learning algorithm used for computing linear relationships between input (X) and output (Y). \ ### Terminology Involved number_of_features(i) = Number of features involved. \ number_of_training_examples(m) = Number of training examples. \ output_sequence(y) = Output Sequence. \ $\theta$ $^T$ x = predicted point. \ J($\theta$) = Cost function of point. The steps involved in ordinary linear regression are: Training phase: Compute $\theta$ to minimize the cost. \ J($\theta$) = $\sum_{i=1}^m$ (($\theta$)$^T$ $x^i$ - $y^i$)$^2$ Predict output: for given query point x, \ return: ($\theta$)$^T$ x <img src="https://miro.medium.com/max/700/1*FZsLp8yTULf77qrp0Qd91g.png" alt="Linear Regression"> This training phase is possible when data points are linear, but there again comes a question: can we predict a non-linear relationship between x and y, as shown below? <img src="https://miro.medium.com/max/700/1*DHYvJg55uN-Kj8jHaxDKvQ.png" alt="Non-linear Data"> <br /> <br /> So, here comes the role of the non-parametric algorithm, which doesn't compute predictions based on a fixed set of parameters. Rather, parameters $\theta$ are computed individually for each query point/data point x. <br /> <br /> While computing $\theta$, a higher preference is given to points in the vicinity of x than points farther from x. Cost Function J($\theta$) = $\sum_{i=1}^m$ $w^i$ (($\theta$)$^T$ $x^i$ - $y^i$)$^2$ $w^i$ is the non-negative weight associated with training point $x^i$. \ $w^i$ is large for $x^i$'s lying closer to query point $x_i$. \ $w^i$ is small for $x^i$'s lying farther from query point $x_i$. A typical weight can be computed using \ $w^i$ = $\exp$(-$\frac{(x^i-x)(x^i-x)^T}{2\tau^2}$) where $\tau$ is the bandwidth parameter that controls how quickly $w^i$ decays with the distance of $x^i$ from x. Let's look at an example: Suppose we had a query point x=5.0 and training points $x^1$=4.9 and $x^2$=3.0, then we can calculate the weights as: $w^i$ = $\exp$(-$\frac{(x^i-x)(x^i-x)^T}{2\tau^2}$) with $\tau$=0.5 $w^1$ = $\exp$(-$\frac{(4.9-5)^2}{2(0.5)^2}$) = 0.9802 $w^2$ = $\exp$(-$\frac{(3-5)^2}{2(0.5)^2}$) = 0.000335 So, J($\theta$) = 0.9802*($\theta$ $^T$ $x^1$ - $y^1$) + 0.000335*($\theta$ $^T$ $x^2$ - $y^2$) So, hereby we can conclude that the weight falls exponentially as the distance between x & $x^i$ increases, and so does the contribution of the error in prediction for $x^i$ to the cost. Steps involved in LWL are: \ Compute $\theta$ to minimize the cost. J($\theta$) = $\sum_{i=1}^m$ $w^i$ (($\theta$)$^T$ $x^i$ - $y^i$)$^2$ \ Predict output: for given query point x, \ return: $\theta$ $^T$ x <img src="https://miro.medium.com/max/700/1*H3QS05Q1GJtY-tiBL00iug.png" alt="LWL">
# Locally Weighted Linear Regression It is a non-parametric ML algorithm that does not learn on a fixed set of parameters as **linear regression** does. \ So, here comes the question: what is *linear regression*? \ **Linear regression** is a supervised learning algorithm used for computing linear relationships between input (X) and output (Y). \ ### Terminology Involved number_of_features(i) = Number of features involved. \ number_of_training_examples(m) = Number of training examples. \ output_sequence(y) = Output Sequence. \ $\theta$ $^T$ x = predicted point. \ J($\theta$) = Cost function of point. The steps involved in ordinary linear regression are: Training phase: Compute $\theta$ to minimize the cost. \ J($\theta$) = $\sum_{i=1}^m$ (($\theta$)$^T$ $x^i$ - $y^i$)$^2$ Predict output: for given query point x, \ return: ($\theta$)$^T$ x <img src="https://miro.medium.com/max/700/1*FZsLp8yTULf77qrp0Qd91g.png" alt="Linear Regression"> This training phase is possible when data points are linear, but there again comes a question: can we predict a non-linear relationship between x and y, as shown below? <img src="https://miro.medium.com/max/700/1*DHYvJg55uN-Kj8jHaxDKvQ.png" alt="Non-linear Data"> <br /> <br /> So, here comes the role of the non-parametric algorithm, which doesn't compute predictions based on a fixed set of parameters. Rather, parameters $\theta$ are computed individually for each query point/data point x. <br /> <br /> While computing $\theta$, a higher preference is given to points in the vicinity of x than points farther from x. Cost Function J($\theta$) = $\sum_{i=1}^m$ $w^i$ (($\theta$)$^T$ $x^i$ - $y^i$)$^2$ $w^i$ is the non-negative weight associated with training point $x^i$. \ $w^i$ is large for $x^i$'s lying closer to query point $x_i$. \ $w^i$ is small for $x^i$'s lying farther from query point $x_i$. A typical weight can be computed using \ $w^i$ = $\exp$(-$\frac{(x^i-x)(x^i-x)^T}{2\tau^2}$) where $\tau$ is the bandwidth parameter that controls how quickly $w^i$ decays with the distance of $x^i$ from x. Let's look at an example: Suppose we had a query point x=5.0 and training points $x^1$=4.9 and $x^2$=3.0, then we can calculate the weights as: $w^i$ = $\exp$(-$\frac{(x^i-x)(x^i-x)^T}{2\tau^2}$) with $\tau$=0.5 $w^1$ = $\exp$(-$\frac{(4.9-5)^2}{2(0.5)^2}$) = 0.9802 $w^2$ = $\exp$(-$\frac{(3-5)^2}{2(0.5)^2}$) = 0.000335 So, J($\theta$) = 0.9802*($\theta$ $^T$ $x^1$ - $y^1$) + 0.000335*($\theta$ $^T$ $x^2$ - $y^2$) So, hereby we can conclude that the weight falls exponentially as the distance between x & $x^i$ increases, and so does the contribution of the error in prediction for $x^i$ to the cost. Steps involved in LWL are: \ Compute $\theta$ to minimize the cost. J($\theta$) = $\sum_{i=1}^m$ $w^i$ (($\theta$)$^T$ $x^i$ - $y^i$)$^2$ \ Predict output: for given query point x, \ return: $\theta$ $^T$ x <img src="https://miro.medium.com/max/700/1*H3QS05Q1GJtY-tiBL00iug.png" alt="LWL">
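A minimal NumPy sketch of the procedure described above (Gaussian-kernel weights, then $\theta$ from the weighted normal equations); the function name, the use of a pseudo-inverse, and the toy data are illustrative choices and may differ from machine_learning/local_weighted_learning/local_weighted_learning.py.

```python
import numpy as np


def local_weighted_prediction(
    x_query: np.ndarray, x_train: np.ndarray, y_train: np.ndarray, tau: float
) -> float:
    """Fit theta for a single query point and return theta^T x_query.

    Weights: w_i = exp(-||x_i - x_query||^2 / (2 * tau^2))
    Theta:   (X^T W X)^-1 X^T W y   (weighted normal equations)
    """
    weights = np.exp(-np.sum((x_train - x_query) ** 2, axis=1) / (2 * tau**2))
    w = np.diag(weights)
    theta = np.linalg.pinv(x_train.T @ w @ x_train) @ x_train.T @ w @ y_train
    return float(x_query @ theta)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = np.linspace(0, 10, 50)
    x_train = np.c_[np.ones_like(x), x]           # prepend a bias column
    y_train = np.sin(x) + rng.normal(0, 0.1, 50)  # noisy non-linear target
    query = np.array([1.0, 5.0])                  # bias term plus x = 5.0
    print(local_weighted_prediction(query, x_train, y_train, tau=0.5))
```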
-1
TheAlgorithms/Python
7,867
Update docs
### Describe your change: - Update docs to Python 3.10+ to match Py upgrade settings in pre-commit (`--py310-plus`) - Remove Exclusion from pre-commit-setting, this file have been fixed in #7844 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
Cjkjvfnby
"2022-10-30T08:30:31Z"
"2022-10-30T09:01:59Z"
17d93cab783095dd1def3c382866cd94296db455
57ccabbaeb0f32165271e3a218bc9c6dcfc21823
Update docs. ### Describe your change: - Update docs to Python 3.10+ to match Py upgrade settings in pre-commit (`--py310-plus`) - Remove Exclusion from pre-commit-setting, this file have been fixed in #7844 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
<div align="center"> <!-- Title: --> <a href="https://github.com/TheAlgorithms/"> <img src="https://raw.githubusercontent.com/TheAlgorithms/website/1cd824df116b27029f17c2d1b42d81731f28a920/public/logo.svg" height="100"> </a> <h1><a href="https://github.com/TheAlgorithms/">The Algorithms</a> - Python</h1> <!-- Labels: --> <!-- First row: --> <a href="https://gitpod.io/#https://github.com/TheAlgorithms/Python"> <img src="https://img.shields.io/badge/Gitpod-Ready--to--Code-blue?logo=gitpod&style=flat-square" height="20" alt="Gitpod Ready-to-Code"> </a> <a href="https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md"> <img src="https://img.shields.io/static/v1.svg?label=Contributions&message=Welcome&color=0059b3&style=flat-square" height="20" alt="Contributions Welcome"> </a> <img src="https://img.shields.io/github/repo-size/TheAlgorithms/Python.svg?label=Repo%20size&style=flat-square" height="20"> <a href="https://discord.gg/c7MnfGFGa6"> <img src="https://img.shields.io/discord/808045925556682782.svg?logo=discord&colorB=7289DA&style=flat-square" height="20" alt="Discord chat"> </a> <a href="https://gitter.im/TheAlgorithms"> <img src="https://img.shields.io/badge/Chat-Gitter-ff69b4.svg?label=Chat&logo=gitter&style=flat-square" height="20" alt="Gitter chat"> </a> <!-- Second row: --> <br> <a href="https://github.com/TheAlgorithms/Python/actions"> <img src="https://img.shields.io/github/workflow/status/TheAlgorithms/Python/build?label=CI&logo=github&style=flat-square" height="20" alt="GitHub Workflow Status"> </a> <a href="https://github.com/pre-commit/pre-commit"> <img src="https://img.shields.io/badge/pre--commit-enabled-brightgreen?logo=pre-commit&logoColor=white&style=flat-square" height="20" alt="pre-commit"> </a> <a href="https://github.com/psf/black"> <img src="https://img.shields.io/static/v1?label=code%20style&message=black&color=black&style=flat-square" height="20" alt="code style: black"> </a> <!-- Short description: --> <h3>All algorithms implemented in Python - for education</h3> </div> Implementations are for learning purposes only. They may be less efficient than the implementations in the Python standard library. Use them at your discretion. ## Getting Started Read through our [Contribution Guidelines](CONTRIBUTING.md) before you contribute. ## Community Channels We are on [Discord](https://discord.gg/c7MnfGFGa6) and [Gitter](https://gitter.im/TheAlgorithms)! Community channels are a great way for you to ask questions and get help. Please join us! ## List of Algorithms See our [directory](DIRECTORY.md) for easier navigation and a better overview of the project.
<div align="center"> <!-- Title: --> <a href="https://github.com/TheAlgorithms/"> <img src="https://raw.githubusercontent.com/TheAlgorithms/website/1cd824df116b27029f17c2d1b42d81731f28a920/public/logo.svg" height="100"> </a> <h1><a href="https://github.com/TheAlgorithms/">The Algorithms</a> - Python</h1> <!-- Labels: --> <!-- First row: --> <a href="https://gitpod.io/#https://github.com/TheAlgorithms/Python"> <img src="https://img.shields.io/badge/Gitpod-Ready--to--Code-blue?logo=gitpod&style=flat-square" height="20" alt="Gitpod Ready-to-Code"> </a> <a href="https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md"> <img src="https://img.shields.io/static/v1.svg?label=Contributions&message=Welcome&color=0059b3&style=flat-square" height="20" alt="Contributions Welcome"> </a> <img src="https://img.shields.io/github/repo-size/TheAlgorithms/Python.svg?label=Repo%20size&style=flat-square" height="20"> <a href="https://discord.gg/c7MnfGFGa6"> <img src="https://img.shields.io/discord/808045925556682782.svg?logo=discord&colorB=7289DA&style=flat-square" height="20" alt="Discord chat"> </a> <a href="https://gitter.im/TheAlgorithms"> <img src="https://img.shields.io/badge/Chat-Gitter-ff69b4.svg?label=Chat&logo=gitter&style=flat-square" height="20" alt="Gitter chat"> </a> <!-- Second row: --> <br> <a href="https://github.com/TheAlgorithms/Python/actions"> <img src="https://img.shields.io/github/workflow/status/TheAlgorithms/Python/build?label=CI&logo=github&style=flat-square" height="20" alt="GitHub Workflow Status"> </a> <a href="https://github.com/pre-commit/pre-commit"> <img src="https://img.shields.io/badge/pre--commit-enabled-brightgreen?logo=pre-commit&logoColor=white&style=flat-square" height="20" alt="pre-commit"> </a> <a href="https://github.com/psf/black"> <img src="https://img.shields.io/static/v1?label=code%20style&message=black&color=black&style=flat-square" height="20" alt="code style: black"> </a> <!-- Short description: --> <h3>All algorithms implemented in Python - for education</h3> </div> Implementations are for learning purposes only. They may be less efficient than the implementations in the Python standard library. Use them at your discretion. ## Getting Started Read through our [Contribution Guidelines](CONTRIBUTING.md) before you contribute. ## Community Channels We are on [Discord](https://discord.gg/c7MnfGFGa6) and [Gitter](https://gitter.im/TheAlgorithms)! Community channels are a great way for you to ask questions and get help. Please join us! ## List of Algorithms See our [directory](DIRECTORY.md) for easier navigation and a better overview of the project.
-1
TheAlgorithms/Python
7,867
Update docs
### Describe your change: - Update docs to Python 3.10+ to match Py upgrade settings in pre-commit (`--py310-plus`) - Remove Exclusion from pre-commit-setting, this file have been fixed in #7844 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
Cjkjvfnby
"2022-10-30T08:30:31Z"
"2022-10-30T09:01:59Z"
17d93cab783095dd1def3c382866cd94296db455
57ccabbaeb0f32165271e3a218bc9c6dcfc21823
Update docs. ### Describe your change: - Update docs to Python 3.10+ to match Py upgrade settings in pre-commit (`--py310-plus`) - Remove Exclusion from pre-commit-setting, this file have been fixed in #7844 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
# A naive recursive implementation of the 0-1 Knapsack Problem This overview is taken from: https://en.wikipedia.org/wiki/Knapsack_problem --- ## Overview The knapsack problem is a problem in combinatorial optimization: Given a set of items, each with a weight and a value, determine the number of each item to include in a collection so that the total weight is less than or equal to a given limit and the total value is as large as possible. It derives its name from the problem faced by someone who is constrained by a fixed-size knapsack and must fill it with the most valuable items. The problem often arises in resource allocation where the decision makers have to choose from a set of non-divisible projects or tasks under a fixed budget or time constraint, respectively. The knapsack problem has been studied for more than a century, with early works dating as far back as 1897. The name "knapsack problem" dates back to the early works of mathematician Tobias Dantzig (1884–1956), and refers to the commonplace problem of packing the most valuable or useful items without overloading the luggage. --- ## Documentation This module uses docstrings to enable the use of Python's in-built `help(...)` function. For instance, try `help(knapsack)` or `help(FUNCTIONNAME)` for any function in the module. --- ## Usage Import the module `knapsack.py` from the **.** directory into your project. --- ## Tests `.` contains Python unit tests which can be run with `python3 -m unittest -v`.
# A naive recursive implementation of the 0-1 Knapsack Problem This overview is taken from: https://en.wikipedia.org/wiki/Knapsack_problem --- ## Overview The knapsack problem is a problem in combinatorial optimization: Given a set of items, each with a weight and a value, determine the number of each item to include in a collection so that the total weight is less than or equal to a given limit and the total value is as large as possible. It derives its name from the problem faced by someone who is constrained by a fixed-size knapsack and must fill it with the most valuable items. The problem often arises in resource allocation where the decision makers have to choose from a set of non-divisible projects or tasks under a fixed budget or time constraint, respectively. The knapsack problem has been studied for more than a century, with early works dating as far back as 1897. The name "knapsack problem" dates back to the early works of mathematician Tobias Dantzig (1884–1956), and refers to the commonplace problem of packing the most valuable or useful items without overloading the luggage. --- ## Documentation This module uses docstrings to enable the use of Python's in-built `help(...)` function. For instance, try `help(knapsack)` or `help(FUNCTIONNAME)` for any function in the module. --- ## Usage Import the module `knapsack.py` from the **.** directory into your project. --- ## Tests `.` contains Python unit tests which can be run with `python3 -m unittest -v`.
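To make the naive recursive approach described above concrete, here is a minimal, self-contained sketch of the 0-1 knapsack recursion. The function name and signature are illustrative only and are not taken from `knapsack.py`; exponential running time is expected, since this is the naive (non-memoized) formulation.

```python
def knapsack_naive(capacity: int, weights: list[int], values: list[int], n: int) -> int:
    """Return the maximum value achievable with the first n items and the given capacity."""
    # Base case: no items left or no remaining capacity.
    if n == 0 or capacity == 0:
        return 0
    # If the nth item is too heavy, it cannot be included.
    if weights[n - 1] > capacity:
        return knapsack_naive(capacity, weights, values, n - 1)
    # Otherwise take the better of including or excluding the nth item.
    return max(
        values[n - 1] + knapsack_naive(capacity - weights[n - 1], weights, values, n - 1),
        knapsack_naive(capacity, weights, values, n - 1),
    )


if __name__ == "__main__":
    values = [60, 100, 120]
    weights = [10, 20, 30]
    print(knapsack_naive(50, weights, values, len(values)))  # 220
```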
-1
TheAlgorithms/Python
7,867
Update docs
### Describe your change: - Update docs to Python 3.10+ to match Py upgrade settings in pre-commit (`--py310-plus`) - Remove Exclusion from pre-commit-setting, this file have been fixed in #7844 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
Cjkjvfnby
"2022-10-30T08:30:31Z"
"2022-10-30T09:01:59Z"
17d93cab783095dd1def3c382866cd94296db455
57ccabbaeb0f32165271e3a218bc9c6dcfc21823
Update docs. ### Describe your change: - Update docs to Python 3.10+ to match Py upgrade settings in pre-commit (`--py310-plus`) - Remove Exclusion from pre-commit-setting, this file have been fixed in #7844 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
### Describe your change: * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [ ] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
### Describe your change: * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [ ] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
-1
TheAlgorithms/Python
7,867
Update docs
### Describe your change: - Update docs to Python 3.10+ to match Py upgrade settings in pre-commit (`--py310-plus`) - Remove Exclusion from pre-commit-setting, this file have been fixed in #7844 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
Cjkjvfnby
"2022-10-30T08:30:31Z"
"2022-10-30T09:01:59Z"
17d93cab783095dd1def3c382866cd94296db455
57ccabbaeb0f32165271e3a218bc9c6dcfc21823
Update docs. ### Describe your change: - Update docs to Python 3.10+ to match Py upgrade settings in pre-commit (`--py310-plus`) - Remove Exclusion from pre-commit-setting, this file have been fixed in #7844 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
# Linear algebra library for Python This module contains classes and functions for doing linear algebra. --- ## Overview ### class Vector - - This class represents a vector of arbitrary size and related operations. **Overview of the methods:** - constructor(components) : init the vector - set(components) : changes the vector components. - \_\_str\_\_() : toString method - component(i): gets the i-th component (0-indexed) - \_\_len\_\_() : gets the size / length of the vector (number of components) - euclidean_length() : returns the euclidean length of the vector - operator + : vector addition - operator - : vector subtraction - operator * : scalar multiplication and dot product - copy() : copies this vector and returns it - change_component(pos,value) : changes the specified component - function zero_vector(dimension) - returns a zero vector of 'dimension' - function unit_basis_vector(dimension, pos) - returns a unit basis vector with a one at index 'pos' (0-indexed) - function axpy(scalar, vector1, vector2) - computes the axpy operation - function random_vector(N, a, b) - returns a random vector of size N, with random integer components between 'a' and 'b' inclusive ### class Matrix - - This class represents a matrix of arbitrary size and operations on it. **Overview of the methods:** - \_\_str\_\_() : returns a string representation - operator * : implements the matrix-vector multiplication and the matrix-scalar multiplication - change_component(x, y, value) : changes the specified component. - component(x, y) : returns the specified component. - width() : returns the width of the matrix - height() : returns the height of the matrix - determinant() : returns the determinant of the matrix if it is square - operator + : implements the matrix-addition. - operator - : implements the matrix-subtraction - function square_zero_matrix(N) - returns a square zero-matrix of dimension NxN - function random_matrix(W, H, a, b) - returns a random matrix WxH with integer components between 'a' and 'b' inclusive --- ## Documentation This module uses docstrings to enable the use of Python's in-built `help(...)` function. For instance, try `help(Vector)`, `help(unit_basis_vector)`, and `help(CLASSNAME.METHODNAME)`. --- ## Usage Import the module `lib.py` from the **src** directory into your project. Alternatively, you can directly use the Python bytecode file `lib.pyc`. --- ## Tests `src/tests.py` contains Python unit tests which can be run with `python3 -m unittest -v`.
# Linear algebra library for Python This module contains classes and functions for doing linear algebra. --- ## Overview ### class Vector - - This class represents a vector of arbitrary size and related operations. **Overview of the methods:** - constructor(components) : init the vector - set(components) : changes the vector components. - \_\_str\_\_() : toString method - component(i): gets the i-th component (0-indexed) - \_\_len\_\_() : gets the size / length of the vector (number of components) - euclidean_length() : returns the euclidean length of the vector - operator + : vector addition - operator - : vector subtraction - operator * : scalar multiplication and dot product - copy() : copies this vector and returns it - change_component(pos,value) : changes the specified component - function zero_vector(dimension) - returns a zero vector of 'dimension' - function unit_basis_vector(dimension, pos) - returns a unit basis vector with a one at index 'pos' (0-indexed) - function axpy(scalar, vector1, vector2) - computes the axpy operation - function random_vector(N, a, b) - returns a random vector of size N, with random integer components between 'a' and 'b' inclusive ### class Matrix - - This class represents a matrix of arbitrary size and operations on it. **Overview of the methods:** - \_\_str\_\_() : returns a string representation - operator * : implements the matrix-vector multiplication and the matrix-scalar multiplication - change_component(x, y, value) : changes the specified component. - component(x, y) : returns the specified component. - width() : returns the width of the matrix - height() : returns the height of the matrix - determinant() : returns the determinant of the matrix if it is square - operator + : implements the matrix-addition. - operator - : implements the matrix-subtraction - function square_zero_matrix(N) - returns a square zero-matrix of dimension NxN - function random_matrix(W, H, a, b) - returns a random matrix WxH with integer components between 'a' and 'b' inclusive --- ## Documentation This module uses docstrings to enable the use of Python's in-built `help(...)` function. For instance, try `help(Vector)`, `help(unit_basis_vector)`, and `help(CLASSNAME.METHODNAME)`. --- ## Usage Import the module `lib.py` from the **src** directory into your project. Alternatively, you can directly use the Python bytecode file `lib.pyc`. --- ## Tests `src/tests.py` contains Python unit tests which can be run with `python3 -m unittest -v`.
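The method list above is easier to follow with a small, self-contained sketch. The class below is an illustrative re-implementation of a few of the described `Vector` operations (constructor, `component`, `__len__`, `euclidean_length`, `+`, `*`); it is not the actual code from `src/lib.py`, and the example values are made up.

```python
import math


class Vector:
    """A tiny illustration of the vector interface described above."""

    def __init__(self, components: list[float]) -> None:
        self.components = list(components)

    def component(self, i: int) -> float:
        return self.components[i]  # 0-indexed access

    def __len__(self) -> int:
        return len(self.components)

    def euclidean_length(self) -> float:
        return math.sqrt(sum(c * c for c in self.components))

    def __add__(self, other: "Vector") -> "Vector":
        return Vector([a + b for a, b in zip(self.components, other.components)])

    def __mul__(self, other):
        if isinstance(other, Vector):  # dot product
            return sum(a * b for a, b in zip(self.components, other.components))
        return Vector([other * c for c in self.components])  # scalar multiplication


v = Vector([1.0, 2.0, 2.0])
w = Vector([3.0, 0.0, 4.0])
print(v.euclidean_length())  # 3.0
print((v + w).components)    # [4.0, 2.0, 6.0]
print(v * w)                 # 11.0
```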
-1
TheAlgorithms/Python
7,867
Update docs
### Describe your change: - Update docs to Python 3.10+ to match Py upgrade settings in pre-commit (`--py310-plus`) - Remove Exclusion from pre-commit-setting, this file have been fixed in #7844 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
Cjkjvfnby
"2022-10-30T08:30:31Z"
"2022-10-30T09:01:59Z"
17d93cab783095dd1def3c382866cd94296db455
57ccabbaeb0f32165271e3a218bc9c6dcfc21823
Update docs. ### Describe your change: - Update docs to Python 3.10+ to match Py upgrade settings in pre-commit (`--py310-plus`) - Remove Exclusion from pre-commit-setting, this file have been fixed in #7844 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
# Compression Data compression is everywhere; you need it to store data without taking up too much space. Compression either loses some data (then we talk about lossy compression, such as .jpg) or it does not (and then it is lossless compression, such as .png). Lossless compression is mainly used for archival purposes, as it allows storing data without losing any information about the archived file. On the other hand, lossy compression is used for transferring files where quality isn't necessarily what is required (e.g. images on Twitter). * <https://www.sciencedirect.com/topics/computer-science/compression-algorithm> * <https://en.wikipedia.org/wiki/Data_compression> * <https://en.wikipedia.org/wiki/Pigeonhole_principle>
# Compression Data compression is everywhere; you need it to store data without taking up too much space. Compression either loses some data (then we talk about lossy compression, such as .jpg) or it does not (and then it is lossless compression, such as .png). Lossless compression is mainly used for archival purposes, as it allows storing data without losing any information about the archived file. On the other hand, lossy compression is used for transferring files where quality isn't necessarily what is required (e.g. images on Twitter). * <https://www.sciencedirect.com/topics/computer-science/compression-algorithm> * <https://en.wikipedia.org/wiki/Data_compression> * <https://en.wikipedia.org/wiki/Pigeonhole_principle>
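As a concrete (if toy) example of the lossless case, the sketch below implements simple run-length encoding: the original data can always be reconstructed exactly, which is the defining property of a lossless scheme. It is an illustration of the idea, not one of the algorithms in this directory.

```python
def run_length_encode(text: str) -> list[tuple[str, int]]:
    """Collapse runs of repeated characters into (character, count) pairs."""
    encoded: list[tuple[str, int]] = []
    for char in text:
        if encoded and encoded[-1][0] == char:
            encoded[-1] = (char, encoded[-1][1] + 1)
        else:
            encoded.append((char, 1))
    return encoded


def run_length_decode(encoded: list[tuple[str, int]]) -> str:
    """Rebuild the original string exactly -- no information is lost."""
    return "".join(char * count for char, count in encoded)


data = "aaabbbbcc"
packed = run_length_encode(data)
print(packed)  # [('a', 3), ('b', 4), ('c', 2)]
assert run_length_decode(packed) == data
```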
-1
TheAlgorithms/Python
7,867
Update docs
### Describe your change: - Update docs to Python 3.10+ to match Py upgrade settings in pre-commit (`--py310-plus`) - Remove Exclusion from pre-commit-setting, this file have been fixed in #7844 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
Cjkjvfnby
"2022-10-30T08:30:31Z"
"2022-10-30T09:01:59Z"
17d93cab783095dd1def3c382866cd94296db455
57ccabbaeb0f32165271e3a218bc9c6dcfc21823
Update docs. ### Describe your change: - Update docs to Python 3.10+ to match Py upgrade settings in pre-commit (`--py310-plus`) - Remove Exclusion from pre-commit-setting, this file have been fixed in #7844 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
MIT License Copyright (c) 2016-2022 TheAlgorithms and contributors Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
MIT License Copyright (c) 2016-2022 TheAlgorithms and contributors Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
-1
TheAlgorithms/Python
7,867
Update docs
### Describe your change: - Update docs to Python 3.10+ to match Py upgrade settings in pre-commit (`--py310-plus`) - Remove Exclusion from pre-commit-setting, this file have been fixed in #7844 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
Cjkjvfnby
"2022-10-30T08:30:31Z"
"2022-10-30T09:01:59Z"
17d93cab783095dd1def3c382866cd94296db455
57ccabbaeb0f32165271e3a218bc9c6dcfc21823
Update docs. ### Describe your change: - Update docs to Python 3.10+ to match Py upgrade settings in pre-commit (`--py310-plus`) - Remove Exclusion from pre-commit-setting, this file have been fixed in #7844 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
# Cellular Automata Cellular automata are a way to simulate the behavior of "life", whether it is a robot or a cell. They usually follow simple rules but can lead to the creation of complex forms. The most popular cellular automaton is Conway's [Game of Life](https://en.wikipedia.org/wiki/Conway%27s_Game_of_Life). * <https://en.wikipedia.org/wiki/Cellular_automaton> * <https://mathworld.wolfram.com/ElementaryCellularAutomaton.html>
# Cellular Automata Cellular automata are a way to simulate the behavior of "life", whether it is a robot or a cell. They usually follow simple rules but can lead to the creation of complex forms. The most popular cellular automaton is Conway's [Game of Life](https://en.wikipedia.org/wiki/Conway%27s_Game_of_Life). * <https://en.wikipedia.org/wiki/Cellular_automaton> * <https://mathworld.wolfram.com/ElementaryCellularAutomaton.html>
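As a small illustration of "simple rules, complex behaviour", the sketch below steps a one-dimensional elementary cellular automaton (Rule 30 is used in the demo). Each cell's next state depends only on itself and its two neighbours; the helper name and the wrap-around boundary are choices made for this example, not code from this directory.

```python
def step(cells: list[int], rule: int) -> list[int]:
    """Apply an elementary cellular automaton rule to one row of 0/1 cells."""
    n = len(cells)
    next_cells = []
    for i in range(n):
        left, center, right = cells[(i - 1) % n], cells[i], cells[(i + 1) % n]
        neighbourhood = (left << 2) | (center << 1) | right  # value 0..7
        next_cells.append((rule >> neighbourhood) & 1)       # look up that bit in the rule number
    return next_cells


row = [0] * 15
row[7] = 1  # single live cell in the middle
for _ in range(8):
    print("".join("#" if cell else "." for cell in row))
    row = step(row, rule=30)
```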
-1
TheAlgorithms/Python
7,867
Update docs
### Describe your change: - Update docs to Python 3.10+ to match Py upgrade settings in pre-commit (`--py310-plus`) - Remove Exclusion from pre-commit-setting, this file have been fixed in #7844 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
Cjkjvfnby
"2022-10-30T08:30:31Z"
"2022-10-30T09:01:59Z"
17d93cab783095dd1def3c382866cd94296db455
57ccabbaeb0f32165271e3a218bc9c6dcfc21823
Update docs. ### Describe your change: - Update docs to Python 3.10+ to match Py upgrade settings in pre-commit (`--py310-plus`) - Remove Exclusion from pre-commit-setting, this file have been fixed in #7844 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
# Ciphers Ciphers are used to protect data from people who are not allowed to have it. They are used everywhere on the internet to protect your connections. * <https://en.wikipedia.org/wiki/Cipher> * <http://practicalcryptography.com/ciphers/> * <https://practicalcryptography.com/ciphers/classical-era/>
# Ciphers Ciphers are used to protect data from people who are not allowed to have it. They are used everywhere on the internet to protect your connections. * <https://en.wikipedia.org/wiki/Cipher> * <http://practicalcryptography.com/ciphers/> * <https://practicalcryptography.com/ciphers/classical-era/>
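For a feel of what a classical-era cipher looks like, here is a minimal Caesar-shift sketch. It is purely illustrative (and, of course, not secure by modern standards); the function names are made up for this example.

```python
def caesar_encrypt(message: str, shift: int) -> str:
    """Shift each letter by `shift` positions, leaving other characters untouched."""
    result = []
    for char in message:
        if char.isalpha():
            base = ord("A") if char.isupper() else ord("a")
            result.append(chr((ord(char) - base + shift) % 26 + base))
        else:
            result.append(char)
    return "".join(result)


def caesar_decrypt(message: str, shift: int) -> str:
    return caesar_encrypt(message, -shift)


secret = caesar_encrypt("Attack at dawn!", 3)
print(secret)                     # Dwwdfn dw gdzq!
print(caesar_decrypt(secret, 3))  # Attack at dawn!
```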
-1
TheAlgorithms/Python
7,867
Update docs
### Describe your change: - Update docs to Python 3.10+ to match Py upgrade settings in pre-commit (`--py310-plus`) - Remove Exclusion from pre-commit-setting, this file have been fixed in #7844 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
Cjkjvfnby
"2022-10-30T08:30:31Z"
"2022-10-30T09:01:59Z"
17d93cab783095dd1def3c382866cd94296db455
57ccabbaeb0f32165271e3a218bc9c6dcfc21823
Update docs. ### Describe your change: - Update docs to Python 3.10+ to match Py upgrade settings in pre-commit (`--py310-plus`) - Remove Exclusion from pre-commit-setting, this file have been fixed in #7844 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
# Boolean Algebra Boolean algebra is used to do arithmetic with bits, which take the values True (1) or False (0). There are three basic operations: 'and', 'or' and 'not'. * <https://en.wikipedia.org/wiki/Boolean_algebra> * <https://plato.stanford.edu/entries/boolalg-math/>
# Boolean Algebra Boolean algebra is used to do arithmetic with bits, which take the values True (1) or False (0). There are three basic operations: 'and', 'or' and 'not'. * <https://en.wikipedia.org/wiki/Boolean_algebra> * <https://plato.stanford.edu/entries/boolalg-math/>
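A quick way to see the three basic operations is to print their truth tables; the short snippet below does this with Python's built-in `and`, `or` and `not` on the bit values 0 and 1, as a simple illustration.

```python
from itertools import product

# Truth tables for the two binary operations.
print("a b | and or")
for a, b in product([0, 1], repeat=2):
    print(f"{a} {b} |  {int(a and b)}   {int(a or b)}")

# Truth table for negation.
print("\na | not")
for a in [0, 1]:
    print(f"{a} |  {int(not a)}")
```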
-1
TheAlgorithms/Python
7,867
Update docs
### Describe your change: - Update docs to Python 3.10+ to match Py upgrade settings in pre-commit (`--py310-plus`) - Remove Exclusion from pre-commit-setting, this file have been fixed in #7844 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
Cjkjvfnby
"2022-10-30T08:30:31Z"
"2022-10-30T09:01:59Z"
17d93cab783095dd1def3c382866cd94296db455
57ccabbaeb0f32165271e3a218bc9c6dcfc21823
Update docs. ### Describe your change: - Update docs to Python 3.10+ to match Py upgrade settings in pre-commit (`--py310-plus`) - Remove Exclusion from pre-commit-setting, this file have been fixed in #7844 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
# Computer Vision Computer vision is a field of computer science that works on enabling computers to see, identify and process images in the same way that humans do, and provide appropriate output. It is like imparting human intelligence and instincts to a computer. Image processing and computer vision are a little different from each other. Image processing means applying some algorithms to transform an image from one form to another, like smoothing, contrasting, stretching, etc. Computer vision, which grew out of modelling image processing with the techniques of machine learning, applies machine learning to recognize patterns for the interpretation of images (much like the process of visual reasoning in human vision). * <https://en.wikipedia.org/wiki/Computer_vision> * <https://www.algorithmia.com/blog/introduction-to-computer-vision>
# Computer Vision Computer vision is a field of computer science that works on enabling computers to see, identify and process images in the same way that humans do, and provide appropriate output. It is like imparting human intelligence and instincts to a computer. Image processing and computer vision are a little different from each other. Image processing means applying some algorithms to transform an image from one form to another, like smoothing, contrasting, stretching, etc. Computer vision, which grew out of modelling image processing with the techniques of machine learning, applies machine learning to recognize patterns for the interpretation of images (much like the process of visual reasoning in human vision). * <https://en.wikipedia.org/wiki/Computer_vision> * <https://www.algorithmia.com/blog/introduction-to-computer-vision>
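As a tiny, dependency-free illustration of the "smoothing" kind of image-processing transform mentioned above, the sketch below applies a 3x3 mean filter to a small grayscale image represented as a list of lists. This is a hypothetical example written for this overview, not code from the repository.

```python
def mean_filter(image: list[list[float]]) -> list[list[float]]:
    """Smooth a grayscale image by averaging each pixel with its 3x3 neighbourhood."""
    height, width = len(image), len(image[0])
    smoothed = [[0.0] * width for _ in range(height)]
    for y in range(height):
        for x in range(width):
            neighbours = [
                image[ny][nx]
                for ny in range(max(0, y - 1), min(height, y + 2))
                for nx in range(max(0, x - 1), min(width, x + 2))
            ]
            smoothed[y][x] = sum(neighbours) / len(neighbours)
    return smoothed


noisy = [
    [0.0, 0.0, 0.0, 0.0],
    [0.0, 255.0, 0.0, 0.0],
    [0.0, 0.0, 0.0, 0.0],
]
for row in mean_filter(noisy):
    print([round(value, 1) for value in row])
```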
-1
TheAlgorithms/Python
7,867
Update docs
### Describe your change: - Update docs to Python 3.10+ to match Py upgrade settings in pre-commit (`--py310-plus`) - Remove Exclusion from pre-commit-setting, this file have been fixed in #7844 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
Cjkjvfnby
"2022-10-30T08:30:31Z"
"2022-10-30T09:01:59Z"
17d93cab783095dd1def3c382866cd94296db455
57ccabbaeb0f32165271e3a218bc9c6dcfc21823
Update docs. ### Describe your change: - Update docs to Python 3.10+ to match Py upgrade settings in pre-commit (`--py310-plus`) - Remove Exclusion from pre-commit-setting, this file have been fixed in #7844 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
# Binary Tree Traversal ## Overview The combination of binary trees being data structures and traversal being an algorithm relates to classic problems, either directly or indirectly. > If you can grasp the traversal of binary trees, the traversal of other complicated trees will be easy for you. The following are some common ways to traverse trees. - Depth First Traversals (DFS): In-order, Pre-order, Post-order - Level Order Traversal or Breadth First or Traversal (BFS) There are applications for both DFS and BFS. Stack can be used to simplify the process of DFS traversal. Besides, since tree is a recursive data structure, recursion and stack are two key points for DFS. Graph for DFS: ![binary-tree-traversal-dfs](https://tva1.sinaimg.cn/large/007S8ZIlly1ghluhzhynsg30dw0dw3yl.gif) The key point of BFS is how to determine whether the traversal of each level has been completed. The answer is to use a variable as a flag to represent the end of the traversal of current level. ## Pre-order Traversal The traversal order of pre-order traversal is `root-left-right`. Algorithm Pre-order 1. Visit the root node and push it into a stack. 2. Pop a node from the stack, and push its right and left child node into the stack respectively. 3. Repeat step 2. Conclusion: This problem involves the classic recursive data structure (i.e. a binary tree), and the algorithm above demonstrates how a simplified solution can be reached by using a stack. If you look at the bigger picture, you'll find that the process of traversal is as followed. `Visit the left subtrees respectively from top to bottom, and visit the right subtrees respectively from bottom to top`. If we are to implement it from this perspective, things will be somewhat different. For the `top to bottom` part we can simply use recursion, and for the `bottom to top` part we can turn to stack. ## In-order Traversal The traversal order of in-order traversal is `left-root-right`. So the root node is not printed first. Things are getting a bit complicated here. Algorithm In-order 1. Visit the root and push it into a stack. 2. If there is a left child node, push it into the stack. Repeat this process until a leaf node reached. > At this point the root node and all the left nodes are in the stack. 3. Start popping nodes from the stack. If a node has a right child node, push the child node into the stack. Repeat step 2. It's worth pointing out that the in-order traversal of a binary search tree (BST) is a sorted array, which is helpful for coming up simplified solutions for some problems. ## Post-order Traversal The traversal order of post-order traversal is `left-right-root`. This one is a bit of a challenge. It deserves the `hard` tag of LeetCode. In this case, the root node is printed not as the first but the last one. A cunning way to do it is to: Record whether the current node has been visited. If 1) it's a leaf node or 2) both its left and right subtrees have been traversed, then it can be popped from the stack. As for `1) it's a leaf node`, you can easily tell whether a node is a leaf if both its left and right are `null`. As for `2) both its left and right subtrees have been traversed`, we only need a variable to record whether a node has been visited or not. In the worst case, we need to record the status for every single node and the space complexity is `O(n)`. 
But if you come to think about it, as we are using a stack and start printing the result from the leaf nodes, it makes sense that we only record the status for the current node popping from the stack, reducing the space complexity to `O(1)`. ## Level Order Traversal The key point of level order traversal is how do we know whether the traversal of each level is done. The answer is that we use a variable as a flag representing the end of the traversal of the current level. ![binary-tree-traversal-bfs](https://tva1.sinaimg.cn/large/007S8ZIlly1ghlui1tpoug30dw0dw3yl.gif) Algorithm Level-order 1. Visit the root node, put it in a FIFO queue, put in the queue a special flag (we are using `null` here). 2. Dequeue a node. 3. If the node equals `null`, it means that all nodes of the current level have been visited. If the queue is empty, we do nothing. Or else we put in another `null`. 4. If the node is not `null`, meaning the traversal of current level has not finished yet, we enqueue its left subtree and right subtree respectively. ## Bi-color marking We know that there is a tri-color marking in garbage collection algorithm, which works as described below. - The white color represents "not visited". - The gray color represents "not all child nodes visited". - The black color represents "all child nodes visited". Enlightened by tri-color marking, a bi-color marking method can be invented to solve all three traversal problems with one solution. The core idea is as follow. - Use a color to mark whether a node has been visited or not. Nodes yet to be visited are marked as white and visited nodes are marked as gray. - If we are visiting a white node, turn it into gray, and push its right child node, itself, and it's left child node into the stack respectively. - If we are visiting a gray node, print it. Implementation of pre-order and post-order traversal algorithms can be easily done by changing the order of pushing the child nodes into the stack. Reference: [LeetCode](https://github.com/azl397985856/leetcode/blob/master/thinkings/binary-tree-traversal.en.md)
# Binary Tree Traversal ## Overview The combination of binary trees being data structures and traversal being an algorithm relates to classic problems, either directly or indirectly. > If you can grasp the traversal of binary trees, the traversal of other complicated trees will be easy for you. The following are some common ways to traverse trees. - Depth First Traversals (DFS): In-order, Pre-order, Post-order - Level Order Traversal or Breadth First or Traversal (BFS) There are applications for both DFS and BFS. Stack can be used to simplify the process of DFS traversal. Besides, since tree is a recursive data structure, recursion and stack are two key points for DFS. Graph for DFS: ![binary-tree-traversal-dfs](https://tva1.sinaimg.cn/large/007S8ZIlly1ghluhzhynsg30dw0dw3yl.gif) The key point of BFS is how to determine whether the traversal of each level has been completed. The answer is to use a variable as a flag to represent the end of the traversal of current level. ## Pre-order Traversal The traversal order of pre-order traversal is `root-left-right`. Algorithm Pre-order 1. Visit the root node and push it into a stack. 2. Pop a node from the stack, and push its right and left child node into the stack respectively. 3. Repeat step 2. Conclusion: This problem involves the classic recursive data structure (i.e. a binary tree), and the algorithm above demonstrates how a simplified solution can be reached by using a stack. If you look at the bigger picture, you'll find that the process of traversal is as followed. `Visit the left subtrees respectively from top to bottom, and visit the right subtrees respectively from bottom to top`. If we are to implement it from this perspective, things will be somewhat different. For the `top to bottom` part we can simply use recursion, and for the `bottom to top` part we can turn to stack. ## In-order Traversal The traversal order of in-order traversal is `left-root-right`. So the root node is not printed first. Things are getting a bit complicated here. Algorithm In-order 1. Visit the root and push it into a stack. 2. If there is a left child node, push it into the stack. Repeat this process until a leaf node reached. > At this point the root node and all the left nodes are in the stack. 3. Start popping nodes from the stack. If a node has a right child node, push the child node into the stack. Repeat step 2. It's worth pointing out that the in-order traversal of a binary search tree (BST) is a sorted array, which is helpful for coming up simplified solutions for some problems. ## Post-order Traversal The traversal order of post-order traversal is `left-right-root`. This one is a bit of a challenge. It deserves the `hard` tag of LeetCode. In this case, the root node is printed not as the first but the last one. A cunning way to do it is to: Record whether the current node has been visited. If 1) it's a leaf node or 2) both its left and right subtrees have been traversed, then it can be popped from the stack. As for `1) it's a leaf node`, you can easily tell whether a node is a leaf if both its left and right are `null`. As for `2) both its left and right subtrees have been traversed`, we only need a variable to record whether a node has been visited or not. In the worst case, we need to record the status for every single node and the space complexity is `O(n)`. 
But if you come to think about it, since we are using a stack and start printing the result from the leaf nodes, it makes sense to record the status only for the node currently being popped from the stack, reducing the space complexity to `O(1)`.

## Level Order Traversal

The key point of level order traversal is how we know that the traversal of each level is done. The answer is that we use a variable as a flag representing the end of the traversal of the current level.

![binary-tree-traversal-bfs](https://tva1.sinaimg.cn/large/007S8ZIlly1ghlui1tpoug30dw0dw3yl.gif)

Algorithm Level-order

1. Visit the root node, put it in a FIFO queue, and put a special flag in the queue (we are using `null` here).

2. Dequeue a node.

3. If the node equals `null`, it means that all nodes of the current level have been visited. If the queue is empty, we do nothing; otherwise we put in another `null`.

4. If the node is not `null`, meaning the traversal of the current level has not finished yet, we enqueue its left subtree and right subtree respectively.

## Bi-color marking

We know that there is a tri-color marking scheme in garbage collection algorithms, which works as described below.

- The white color represents "not visited".
- The gray color represents "not all child nodes visited".
- The black color represents "all child nodes visited".

Inspired by tri-color marking, a bi-color marking method can be invented to solve all three traversal problems with one solution.

The core idea is as follows.

- Use a color to mark whether a node has been visited or not. Nodes yet to be visited are marked as white and visited nodes are marked as gray.
- If we are visiting a white node, turn it gray, and push its right child node, itself, and its left child node onto the stack respectively.
- If we are visiting a gray node, print it.

Implementations of pre-order and post-order traversal can easily be obtained by changing the order in which the child nodes are pushed onto the stack.

Reference: [LeetCode](https://github.com/azl397985856/leetcode/blob/master/thinkings/binary-tree-traversal.en.md)
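As a rough illustration of the bi-color marking idea described above, here is a minimal in-order sketch. The name `inorder_bicolor` and the `Node` class with `data`, `left` and `right` attributes are assumptions; swapping the push order yields pre-order and post-order instead.

```python
WHITE, GRAY = 0, 1


def inorder_bicolor(root):
    """In-order traversal using the bi-color (white/gray) marking technique."""
    result = []
    stack = [(WHITE, root)]
    while stack:
        color, node = stack.pop()
        if node is None:
            continue
        if color == WHITE:  # first visit: re-push the node as gray between its children
            stack.append((WHITE, node.right))
            stack.append((GRAY, node))
            stack.append((WHITE, node.left))
        else:  # gray node: both subtrees already scheduled, so emit its value
            result.append(node.data)
    return result
```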
-1
TheAlgorithms/Python
7,867
Update docs
### Describe your change: - Update docs to Python 3.10+ to match Py upgrade settings in pre-commit (`--py310-plus`) - Remove Exclusion from pre-commit-setting, this file have been fixed in #7844 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
Cjkjvfnby
"2022-10-30T08:30:31Z"
"2022-10-30T09:01:59Z"
17d93cab783095dd1def3c382866cd94296db455
57ccabbaeb0f32165271e3a218bc9c6dcfc21823
Update docs. ### Describe your change: - Update docs to Python 3.10+ to match Py upgrade settings in pre-commit (`--py310-plus`) - Remove Exclusion from pre-commit-setting, this file have been fixed in #7844 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
# Conversion

Conversion programs convert data from one type, numerical base, or unit into another, e.g. binary to decimal, integer to string, or feet to meters.

* <https://en.wikipedia.org/wiki/Data_conversion>
* <https://en.wikipedia.org/wiki/Transcoding>
# Conversion

Conversion programs convert data from one type, numerical base, or unit into another, e.g. binary to decimal, integer to string, or feet to meters.

* <https://en.wikipedia.org/wiki/Data_conversion>
* <https://en.wikipedia.org/wiki/Transcoding>
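As a tiny illustration of the kind of conversion meant here, binary to decimal can be done as below. This is an illustrative sketch only, not one of the programs in this directory, and the name `bin_to_decimal` is made up for the example.

```python
def bin_to_decimal(bin_string: str) -> int:
    """Convert a binary string such as '1011' to its decimal value."""
    decimal = 0
    for digit in bin_string.strip():
        if digit not in "01":
            raise ValueError("Non-binary value was passed to the function")
        decimal = decimal * 2 + int(digit)
    return decimal


print(bin_to_decimal("1011"))  # 11
```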
-1
TheAlgorithms/Python
7,867
Update docs
### Describe your change: - Update docs to Python 3.10+ to match Py upgrade settings in pre-commit (`--py310-plus`) - Remove Exclusion from pre-commit-setting, this file have been fixed in #7844 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
Cjkjvfnby
"2022-10-30T08:30:31Z"
"2022-10-30T09:01:59Z"
17d93cab783095dd1def3c382866cd94296db455
57ccabbaeb0f32165271e3a218bc9c6dcfc21823
Update docs. ### Describe your change: - Update docs to Python 3.10+ to match Py upgrade settings in pre-commit (`--py310-plus`) - Remove Exclusion from pre-commit-setting, this file have been fixed in #7844 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
# Project Euler Problems are taken from https://projecteuler.net/, the Project Euler. [Problems are licensed under CC BY-NC-SA 4.0](https://projecteuler.net/copyright). Project Euler is a series of challenging mathematical/computer programming problems that require more than just mathematical insights to solve. Project Euler is ideal for mathematicians who are learning to code. The solutions will be checked by our [automated testing on GitHub Actions](https://github.com/TheAlgorithms/Python/actions) with the help of [this script](https://github.com/TheAlgorithms/Python/blob/master/scripts/validate_solutions.py). The efficiency of your code is also checked. You can view the top 10 slowest solutions on GitHub Actions logs (under `slowest 10 durations`) and open a pull request to improve those solutions. ## Solution Guidelines Welcome to [TheAlgorithms/Python](https://github.com/TheAlgorithms/Python)! Before reading the solution guidelines, make sure you read the whole [Contributing Guidelines](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md) as it won't be repeated in here. If you have any doubt on the guidelines, please feel free to [state it clearly in an issue](https://github.com/TheAlgorithms/Python/issues/new) or ask the community in [Gitter](https://gitter.im/TheAlgorithms). You can use the [template](https://github.com/TheAlgorithms/Python/blob/master/project_euler/README.md#solution-template) we have provided below as your starting point but be sure to read the [Coding Style](https://github.com/TheAlgorithms/Python/blob/master/project_euler/README.md#coding-style) part first. ### Coding Style * Please maintain consistency in project directory and solution file names. Keep the following points in mind: * Create a new directory only for the problems which do not exist yet. * If you create a new directory, please create an empty `__init__.py` file inside it as well. * Please name the project **directory** as `problem_<problem_number>` where `problem_number` should be filled with 0s so as to occupy 3 digits. Example: `problem_001`, `problem_002`, `problem_067`, `problem_145`, and so on. * Please provide a link to the problem and other references, if used, in the **module-level docstring**. * All imports should come ***after*** the module-level docstring. * You can have as many helper functions as you want but there should be one main function called `solution` which should satisfy the conditions as stated below: * It should contain positional argument(s) whose default value is the question input. Example: Please take a look at [Problem 1](https://projecteuler.net/problem=1) where the question is to *Find the sum of all the multiples of 3 or 5 below 1000.* In this case the main solution function will be `solution(limit: int = 1000)`. * When the `solution` function is called without any arguments like so: `solution()`, it should return the answer to the problem. * Every function, which includes all the helper functions, if any, and the main solution function, should have `doctest` in the function docstring along with a brief statement mentioning what the function is about. * There should not be a `doctest` for testing the answer as that is done by our GitHub Actions build using this [script](https://github.com/TheAlgorithms/Python/blob/master/scripts/validate_solutions.py). Keeping in mind the above example of [Problem 1](https://projecteuler.net/problem=1): ```python def solution(limit: int = 1000): """ A brief statement mentioning what the function is about. 
You can have a detailed explanation about the solution method in the module-level docstring. >>> solution(1) ... >>> solution(16) ... >>> solution(100) ... """ ``` ### Solution Template You can use the below template as your starting point but please read the [Coding Style](https://github.com/TheAlgorithms/Python/blob/master/project_euler/README.md#coding-style) first to understand how the template works. Please change the name of the helper functions accordingly, change the parameter names with a descriptive one, replace the content within `[square brackets]` (including the brackets) with the appropriate content. ```python """ Project Euler Problem [problem number]: [link to the original problem] ... [Entire problem statement] ... ... [Solution explanation - Optional] ... References [Optional]: - [Wikipedia link to the topic] - [Stackoverflow link] ... """ import module1 import module2 ... def helper1(arg1: [type hint], arg2: [type hint], ...) -> [Return type hint]: """ A brief statement explaining what the function is about. ... A more elaborate description ... [Optional] ... [Doctest] ... """ ... # calculations ... return # You can have multiple helper functions but the solution function should be # after all the helper functions ... def solution(arg1: [type hint], arg2: [type hint], ...) -> [Return type hint]: """ A brief statement mentioning what the function is about. You can have a detailed explanation about the solution in the module-level docstring. ... [Doctest as mentioned above] ... """ ... # calculations ... return answer if __name__ == "__main__": print(f"{solution() = }") ```
# Project Euler Problems are taken from https://projecteuler.net/, the Project Euler. [Problems are licensed under CC BY-NC-SA 4.0](https://projecteuler.net/copyright). Project Euler is a series of challenging mathematical/computer programming problems that require more than just mathematical insights to solve. Project Euler is ideal for mathematicians who are learning to code. The solutions will be checked by our [automated testing on GitHub Actions](https://github.com/TheAlgorithms/Python/actions) with the help of [this script](https://github.com/TheAlgorithms/Python/blob/master/scripts/validate_solutions.py). The efficiency of your code is also checked. You can view the top 10 slowest solutions on GitHub Actions logs (under `slowest 10 durations`) and open a pull request to improve those solutions. ## Solution Guidelines Welcome to [TheAlgorithms/Python](https://github.com/TheAlgorithms/Python)! Before reading the solution guidelines, make sure you read the whole [Contributing Guidelines](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md) as it won't be repeated in here. If you have any doubt on the guidelines, please feel free to [state it clearly in an issue](https://github.com/TheAlgorithms/Python/issues/new) or ask the community in [Gitter](https://gitter.im/TheAlgorithms). You can use the [template](https://github.com/TheAlgorithms/Python/blob/master/project_euler/README.md#solution-template) we have provided below as your starting point but be sure to read the [Coding Style](https://github.com/TheAlgorithms/Python/blob/master/project_euler/README.md#coding-style) part first. ### Coding Style * Please maintain consistency in project directory and solution file names. Keep the following points in mind: * Create a new directory only for the problems which do not exist yet. * If you create a new directory, please create an empty `__init__.py` file inside it as well. * Please name the project **directory** as `problem_<problem_number>` where `problem_number` should be filled with 0s so as to occupy 3 digits. Example: `problem_001`, `problem_002`, `problem_067`, `problem_145`, and so on. * Please provide a link to the problem and other references, if used, in the **module-level docstring**. * All imports should come ***after*** the module-level docstring. * You can have as many helper functions as you want but there should be one main function called `solution` which should satisfy the conditions as stated below: * It should contain positional argument(s) whose default value is the question input. Example: Please take a look at [Problem 1](https://projecteuler.net/problem=1) where the question is to *Find the sum of all the multiples of 3 or 5 below 1000.* In this case the main solution function will be `solution(limit: int = 1000)`. * When the `solution` function is called without any arguments like so: `solution()`, it should return the answer to the problem. * Every function, which includes all the helper functions, if any, and the main solution function, should have `doctest` in the function docstring along with a brief statement mentioning what the function is about. * There should not be a `doctest` for testing the answer as that is done by our GitHub Actions build using this [script](https://github.com/TheAlgorithms/Python/blob/master/scripts/validate_solutions.py). Keeping in mind the above example of [Problem 1](https://projecteuler.net/problem=1): ```python def solution(limit: int = 1000): """ A brief statement mentioning what the function is about. 
You can have a detailed explanation about the solution method in the module-level docstring. >>> solution(1) ... >>> solution(16) ... >>> solution(100) ... """ ``` ### Solution Template You can use the below template as your starting point but please read the [Coding Style](https://github.com/TheAlgorithms/Python/blob/master/project_euler/README.md#coding-style) first to understand how the template works. Please change the name of the helper functions accordingly, change the parameter names with a descriptive one, replace the content within `[square brackets]` (including the brackets) with the appropriate content. ```python """ Project Euler Problem [problem number]: [link to the original problem] ... [Entire problem statement] ... ... [Solution explanation - Optional] ... References [Optional]: - [Wikipedia link to the topic] - [Stackoverflow link] ... """ import module1 import module2 ... def helper1(arg1: [type hint], arg2: [type hint], ...) -> [Return type hint]: """ A brief statement explaining what the function is about. ... A more elaborate description ... [Optional] ... [Doctest] ... """ ... # calculations ... return # You can have multiple helper functions but the solution function should be # after all the helper functions ... def solution(arg1: [type hint], arg2: [type hint], ...) -> [Return type hint]: """ A brief statement mentioning what the function is about. You can have a detailed explanation about the solution in the module-level docstring. ... [Doctest as mentioned above] ... """ ... # calculations ... return answer if __name__ == "__main__": print(f"{solution() = }") ```
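For instance, a Problem 1 solution following the conventions above might look like the sketch below. This is an illustration only, not the actual file in the `problem_001` directory.

```python
"""
Project Euler Problem 1: https://projecteuler.net/problem=1

Find the sum of all the multiples of 3 or 5 below 1000.
"""


def solution(limit: int = 1000) -> int:
    """
    Return the sum of all multiples of 3 or 5 below limit.

    >>> solution(10)
    23
    >>> solution(100)
    2318
    """
    return sum(n for n in range(limit) if n % 3 == 0 or n % 5 == 0)


if __name__ == "__main__":
    print(f"{solution() = }")
```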
-1
TheAlgorithms/Python
7,867
Update docs
### Describe your change: - Update docs to Python 3.10+ to match Py upgrade settings in pre-commit (`--py310-plus`) - Remove Exclusion from pre-commit-setting, this file have been fixed in #7844 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
Cjkjvfnby
"2022-10-30T08:30:31Z"
"2022-10-30T09:01:59Z"
17d93cab783095dd1def3c382866cd94296db455
57ccabbaeb0f32165271e3a218bc9c6dcfc21823
Update docs. ### Describe your change: - Update docs to Python 3.10+ to match Py upgrade settings in pre-commit (`--py310-plus`) - Remove Exclusion from pre-commit-setting, this file have been fixed in #7844 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
# Arithmetic analysis

Arithmetic analysis is a branch of mathematics that deals with solving systems of linear equations and finding the roots of equations numerically.

* <https://en.wikipedia.org/wiki/System_of_linear_equations>
* <https://en.wikipedia.org/wiki/Gaussian_elimination>
* <https://en.wikipedia.org/wiki/Root-finding_algorithms>
# Arithmetic analysis

Arithmetic analysis is a branch of mathematics that deals with solving systems of linear equations and finding the roots of equations numerically.

* <https://en.wikipedia.org/wiki/System_of_linear_equations>
* <https://en.wikipedia.org/wiki/Gaussian_elimination>
* <https://en.wikipedia.org/wiki/Root-finding_algorithms>
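For a flavour of the root-finding algorithms referenced above, here is a minimal bisection sketch. It is an illustration only; the function name `bisection` and its signature are assumptions, and the directory's own implementations may differ.

```python
from collections.abc import Callable


def bisection(f: Callable[[float], float], a: float, b: float, eps: float = 1e-7) -> float:
    """Find a root of f in [a, b], assuming f(a) and f(b) have opposite signs."""
    if f(a) * f(b) > 0:
        raise ValueError("f(a) and f(b) must have opposite signs")
    while b - a > eps:
        mid = (a + b) / 2
        if f(a) * f(mid) <= 0:  # the sign change, and hence the root, is in [a, mid]
            b = mid
        else:
            a = mid
    return (a + b) / 2


print(bisection(lambda x: x**2 - 2, 0, 2))  # approximately 1.4142135 (sqrt(2))
```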
-1
TheAlgorithms/Python
7,867
Update docs
### Describe your change: - Update docs to Python 3.10+ to match Py upgrade settings in pre-commit (`--py310-plus`) - Remove Exclusion from pre-commit-setting, this file have been fixed in #7844 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
Cjkjvfnby
"2022-10-30T08:30:31Z"
"2022-10-30T09:01:59Z"
17d93cab783095dd1def3c382866cd94296db455
57ccabbaeb0f32165271e3a218bc9c6dcfc21823
Update docs. ### Describe your change: - Update docs to Python 3.10+ to match Py upgrade settings in pre-commit (`--py310-plus`) - Remove Exclusion from pre-commit-setting, this file have been fixed in #7844 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
""" Shortest job remaining first Please note arrival time and burst Please use spaces to separate times entered. """ from __future__ import annotations import pandas as pd def calculate_waitingtime( arrival_time: list[int], burst_time: list[int], no_of_processes: int ) -> list[int]: """ Calculate the waiting time of each processes Return: List of waiting times. >>> calculate_waitingtime([1,2,3,4],[3,3,5,1],4) [0, 3, 5, 0] >>> calculate_waitingtime([1,2,3],[2,5,1],3) [0, 2, 0] >>> calculate_waitingtime([2,3],[5,1],2) [1, 0] """ remaining_time = [0] * no_of_processes waiting_time = [0] * no_of_processes # Copy the burst time into remaining_time[] for i in range(no_of_processes): remaining_time[i] = burst_time[i] complete = 0 increment_time = 0 minm = 999999999 short = 0 check = False # Process until all processes are completed while complete != no_of_processes: for j in range(no_of_processes): if arrival_time[j] <= increment_time: if remaining_time[j] > 0: if remaining_time[j] < minm: minm = remaining_time[j] short = j check = True if not check: increment_time += 1 continue remaining_time[short] -= 1 minm = remaining_time[short] if minm == 0: minm = 999999999 if remaining_time[short] == 0: complete += 1 check = False # Find finish time of current process finish_time = increment_time + 1 # Calculate waiting time finar = finish_time - arrival_time[short] waiting_time[short] = finar - burst_time[short] if waiting_time[short] < 0: waiting_time[short] = 0 # Increment time increment_time += 1 return waiting_time def calculate_turnaroundtime( burst_time: list[int], no_of_processes: int, waiting_time: list[int] ) -> list[int]: """ Calculate the turn around time of each Processes Return: list of turn around times. >>> calculate_turnaroundtime([3,3,5,1], 4, [0,3,5,0]) [3, 6, 10, 1] >>> calculate_turnaroundtime([3,3], 2, [0,3]) [3, 6] >>> calculate_turnaroundtime([8,10,1], 3, [1,0,3]) [9, 10, 4] """ turn_around_time = [0] * no_of_processes for i in range(no_of_processes): turn_around_time[i] = burst_time[i] + waiting_time[i] return turn_around_time def calculate_average_times( waiting_time: list[int], turn_around_time: list[int], no_of_processes: int ) -> None: """ This function calculates the average of the waiting & turnaround times Prints: Average Waiting time & Average Turn Around Time >>> calculate_average_times([0,3,5,0],[3,6,10,1],4) Average waiting time = 2.00000 Average turn around time = 5.0 >>> calculate_average_times([2,3],[3,6],2) Average waiting time = 2.50000 Average turn around time = 4.5 >>> calculate_average_times([10,4,3],[2,7,6],3) Average waiting time = 5.66667 Average turn around time = 5.0 """ total_waiting_time = 0 total_turn_around_time = 0 for i in range(no_of_processes): total_waiting_time = total_waiting_time + waiting_time[i] total_turn_around_time = total_turn_around_time + turn_around_time[i] print(f"Average waiting time = {total_waiting_time / no_of_processes:.5f}") print("Average turn around time =", total_turn_around_time / no_of_processes) if __name__ == "__main__": print("Enter how many process you want to analyze") no_of_processes = int(input()) burst_time = [0] * no_of_processes arrival_time = [0] * no_of_processes processes = list(range(1, no_of_processes + 1)) for i in range(no_of_processes): print("Enter the arrival time and burst time for process:--" + str(i + 1)) arrival_time[i], burst_time[i] = map(int, input().split()) waiting_time = calculate_waitingtime(arrival_time, burst_time, no_of_processes) bt = burst_time n = no_of_processes wt = waiting_time 
turn_around_time = calculate_turnaroundtime(bt, n, wt) calculate_average_times(waiting_time, turn_around_time, no_of_processes) fcfs = pd.DataFrame( list(zip(processes, burst_time, arrival_time, waiting_time, turn_around_time)), columns=[ "Process", "BurstTime", "ArrivalTime", "WaitingTime", "TurnAroundTime", ], ) # Printing the dataFrame pd.set_option("display.max_rows", fcfs.shape[0] + 1) print(fcfs)
""" Shortest job remaining first Please note arrival time and burst Please use spaces to separate times entered. """ from __future__ import annotations import pandas as pd def calculate_waitingtime( arrival_time: list[int], burst_time: list[int], no_of_processes: int ) -> list[int]: """ Calculate the waiting time of each processes Return: List of waiting times. >>> calculate_waitingtime([1,2,3,4],[3,3,5,1],4) [0, 3, 5, 0] >>> calculate_waitingtime([1,2,3],[2,5,1],3) [0, 2, 0] >>> calculate_waitingtime([2,3],[5,1],2) [1, 0] """ remaining_time = [0] * no_of_processes waiting_time = [0] * no_of_processes # Copy the burst time into remaining_time[] for i in range(no_of_processes): remaining_time[i] = burst_time[i] complete = 0 increment_time = 0 minm = 999999999 short = 0 check = False # Process until all processes are completed while complete != no_of_processes: for j in range(no_of_processes): if arrival_time[j] <= increment_time: if remaining_time[j] > 0: if remaining_time[j] < minm: minm = remaining_time[j] short = j check = True if not check: increment_time += 1 continue remaining_time[short] -= 1 minm = remaining_time[short] if minm == 0: minm = 999999999 if remaining_time[short] == 0: complete += 1 check = False # Find finish time of current process finish_time = increment_time + 1 # Calculate waiting time finar = finish_time - arrival_time[short] waiting_time[short] = finar - burst_time[short] if waiting_time[short] < 0: waiting_time[short] = 0 # Increment time increment_time += 1 return waiting_time def calculate_turnaroundtime( burst_time: list[int], no_of_processes: int, waiting_time: list[int] ) -> list[int]: """ Calculate the turn around time of each Processes Return: list of turn around times. >>> calculate_turnaroundtime([3,3,5,1], 4, [0,3,5,0]) [3, 6, 10, 1] >>> calculate_turnaroundtime([3,3], 2, [0,3]) [3, 6] >>> calculate_turnaroundtime([8,10,1], 3, [1,0,3]) [9, 10, 4] """ turn_around_time = [0] * no_of_processes for i in range(no_of_processes): turn_around_time[i] = burst_time[i] + waiting_time[i] return turn_around_time def calculate_average_times( waiting_time: list[int], turn_around_time: list[int], no_of_processes: int ) -> None: """ This function calculates the average of the waiting & turnaround times Prints: Average Waiting time & Average Turn Around Time >>> calculate_average_times([0,3,5,0],[3,6,10,1],4) Average waiting time = 2.00000 Average turn around time = 5.0 >>> calculate_average_times([2,3],[3,6],2) Average waiting time = 2.50000 Average turn around time = 4.5 >>> calculate_average_times([10,4,3],[2,7,6],3) Average waiting time = 5.66667 Average turn around time = 5.0 """ total_waiting_time = 0 total_turn_around_time = 0 for i in range(no_of_processes): total_waiting_time = total_waiting_time + waiting_time[i] total_turn_around_time = total_turn_around_time + turn_around_time[i] print(f"Average waiting time = {total_waiting_time / no_of_processes:.5f}") print("Average turn around time =", total_turn_around_time / no_of_processes) if __name__ == "__main__": print("Enter how many process you want to analyze") no_of_processes = int(input()) burst_time = [0] * no_of_processes arrival_time = [0] * no_of_processes processes = list(range(1, no_of_processes + 1)) for i in range(no_of_processes): print("Enter the arrival time and burst time for process:--" + str(i + 1)) arrival_time[i], burst_time[i] = map(int, input().split()) waiting_time = calculate_waitingtime(arrival_time, burst_time, no_of_processes) bt = burst_time n = no_of_processes wt = waiting_time 
turn_around_time = calculate_turnaroundtime(bt, n, wt) calculate_average_times(waiting_time, turn_around_time, no_of_processes) fcfs = pd.DataFrame( list(zip(processes, burst_time, arrival_time, waiting_time, turn_around_time)), columns=[ "Process", "BurstTime", "ArrivalTime", "WaitingTime", "TurnAroundTime", ], ) # Printing the dataFrame pd.set_option("display.max_rows", fcfs.shape[0] + 1) print(fcfs)
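For reference, the scheduling helpers above can also be exercised without the interactive prompt. The snippet below is a usage sketch that assumes `calculate_waitingtime`, `calculate_turnaroundtime` and `calculate_average_times` from the module above are in scope; the expected values are taken from the doctests.

```python
arrival = [1, 2, 3, 4]
burst = [3, 3, 5, 1]

waiting = calculate_waitingtime(arrival, burst, len(arrival))
turnaround = calculate_turnaroundtime(burst, len(arrival), waiting)

print(waiting)     # [0, 3, 5, 0]
print(turnaround)  # [3, 6, 10, 1]

calculate_average_times(waiting, turnaround, len(arrival))
# Average waiting time = 2.00000
# Average turn around time = 5.0
```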
-1
TheAlgorithms/Python
7,867
Update docs
### Describe your change: - Update docs to Python 3.10+ to match Py upgrade settings in pre-commit (`--py310-plus`) - Remove Exclusion from pre-commit-setting, this file have been fixed in #7844 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
Cjkjvfnby
"2022-10-30T08:30:31Z"
"2022-10-30T09:01:59Z"
17d93cab783095dd1def3c382866cd94296db455
57ccabbaeb0f32165271e3a218bc9c6dcfc21823
Update docs. ### Describe your change: - Update docs to Python 3.10+ to match Py upgrade settings in pre-commit (`--py310-plus`) - Remove Exclusion from pre-commit-setting, this file have been fixed in #7844 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
""" Project Euler Problem 113: https://projecteuler.net/problem=113 Working from left-to-right if no digit is exceeded by the digit to its left it is called an increasing number; for example, 134468. Similarly if no digit is exceeded by the digit to its right it is called a decreasing number; for example, 66420. We shall call a positive integer that is neither increasing nor decreasing a "bouncy" number; for example, 155349. As n increases, the proportion of bouncy numbers below n increases such that there are only 12951 numbers below one-million that are not bouncy and only 277032 non-bouncy numbers below 10^10. How many numbers below a googol (10^100) are not bouncy? """ def choose(n: int, r: int) -> int: """ Calculate the binomial coefficient c(n,r) using the multiplicative formula. >>> choose(4,2) 6 >>> choose(5,3) 10 >>> choose(20,6) 38760 """ ret = 1.0 for i in range(1, r + 1): ret *= (n + 1 - i) / i return round(ret) def non_bouncy_exact(n: int) -> int: """ Calculate the number of non-bouncy numbers with at most n digits. >>> non_bouncy_exact(1) 9 >>> non_bouncy_exact(6) 7998 >>> non_bouncy_exact(10) 136126 """ return choose(8 + n, n) + choose(9 + n, n) - 10 def non_bouncy_upto(n: int) -> int: """ Calculate the number of non-bouncy numbers with at most n digits. >>> non_bouncy_upto(1) 9 >>> non_bouncy_upto(6) 12951 >>> non_bouncy_upto(10) 277032 """ return sum(non_bouncy_exact(i) for i in range(1, n + 1)) def solution(num_digits: int = 100) -> int: """ Calculate the number of non-bouncy numbers less than a googol. >>> solution(6) 12951 >>> solution(10) 277032 """ return non_bouncy_upto(num_digits) if __name__ == "__main__": print(f"{solution() = }")
""" Project Euler Problem 113: https://projecteuler.net/problem=113 Working from left-to-right if no digit is exceeded by the digit to its left it is called an increasing number; for example, 134468. Similarly if no digit is exceeded by the digit to its right it is called a decreasing number; for example, 66420. We shall call a positive integer that is neither increasing nor decreasing a "bouncy" number; for example, 155349. As n increases, the proportion of bouncy numbers below n increases such that there are only 12951 numbers below one-million that are not bouncy and only 277032 non-bouncy numbers below 10^10. How many numbers below a googol (10^100) are not bouncy? """ def choose(n: int, r: int) -> int: """ Calculate the binomial coefficient c(n,r) using the multiplicative formula. >>> choose(4,2) 6 >>> choose(5,3) 10 >>> choose(20,6) 38760 """ ret = 1.0 for i in range(1, r + 1): ret *= (n + 1 - i) / i return round(ret) def non_bouncy_exact(n: int) -> int: """ Calculate the number of non-bouncy numbers with at most n digits. >>> non_bouncy_exact(1) 9 >>> non_bouncy_exact(6) 7998 >>> non_bouncy_exact(10) 136126 """ return choose(8 + n, n) + choose(9 + n, n) - 10 def non_bouncy_upto(n: int) -> int: """ Calculate the number of non-bouncy numbers with at most n digits. >>> non_bouncy_upto(1) 9 >>> non_bouncy_upto(6) 12951 >>> non_bouncy_upto(10) 277032 """ return sum(non_bouncy_exact(i) for i in range(1, n + 1)) def solution(num_digits: int = 100) -> int: """ Calculate the number of non-bouncy numbers less than a googol. >>> solution(6) 12951 >>> solution(10) 277032 """ return non_bouncy_upto(num_digits) if __name__ == "__main__": print(f"{solution() = }")
-1
TheAlgorithms/Python
7,867
Update docs
### Describe your change: - Update docs to Python 3.10+ to match Py upgrade settings in pre-commit (`--py310-plus`) - Remove Exclusion from pre-commit-setting, this file have been fixed in #7844 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
Cjkjvfnby
"2022-10-30T08:30:31Z"
"2022-10-30T09:01:59Z"
17d93cab783095dd1def3c382866cd94296db455
57ccabbaeb0f32165271e3a218bc9c6dcfc21823
Update docs. ### Describe your change: - Update docs to Python 3.10+ to match Py upgrade settings in pre-commit (`--py310-plus`) - Remove Exclusion from pre-commit-setting, this file have been fixed in #7844 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
def send_file(filename: str = "mytext.txt", testing: bool = False) -> None: import socket port = 12312 # Reserve a port for your service. sock = socket.socket() # Create a socket object host = socket.gethostname() # Get local machine name sock.bind((host, port)) # Bind to the port sock.listen(5) # Now wait for client connection. print("Server listening....") while True: conn, addr = sock.accept() # Establish connection with client. print(f"Got connection from {addr}") data = conn.recv(1024) print(f"Server received: {data = }") with open(filename, "rb") as in_file: data = in_file.read(1024) while data: conn.send(data) print(f"Sent {data!r}") data = in_file.read(1024) print("Done sending") conn.close() if testing: # Allow the test to complete break sock.shutdown(1) sock.close() if __name__ == "__main__": send_file()
def send_file(filename: str = "mytext.txt", testing: bool = False) -> None: import socket port = 12312 # Reserve a port for your service. sock = socket.socket() # Create a socket object host = socket.gethostname() # Get local machine name sock.bind((host, port)) # Bind to the port sock.listen(5) # Now wait for client connection. print("Server listening....") while True: conn, addr = sock.accept() # Establish connection with client. print(f"Got connection from {addr}") data = conn.recv(1024) print(f"Server received: {data = }") with open(filename, "rb") as in_file: data = in_file.read(1024) while data: conn.send(data) print(f"Sent {data!r}") data = in_file.read(1024) print("Done sending") conn.close() if testing: # Allow the test to complete break sock.shutdown(1) sock.close() if __name__ == "__main__": send_file()
-1
TheAlgorithms/Python
7,867
Update docs
### Describe your change: - Update docs to Python 3.10+ to match Py upgrade settings in pre-commit (`--py310-plus`) - Remove Exclusion from pre-commit-setting, this file have been fixed in #7844 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
Cjkjvfnby
"2022-10-30T08:30:31Z"
"2022-10-30T09:01:59Z"
17d93cab783095dd1def3c382866cd94296db455
57ccabbaeb0f32165271e3a218bc9c6dcfc21823
Update docs. ### Describe your change: - Update docs to Python 3.10+ to match Py upgrade settings in pre-commit (`--py310-plus`) - Remove Exclusion from pre-commit-setting, this file have been fixed in #7844 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
# https://en.wikipedia.org/wiki/Tree_traversal from __future__ import annotations from collections import deque from collections.abc import Sequence from dataclasses import dataclass from typing import Any @dataclass class Node: data: int left: Node | None = None right: Node | None = None def make_tree() -> Node | None: r""" The below tree 1 / \ 2 3 / \ 4 5 """ tree = Node(1) tree.left = Node(2) tree.right = Node(3) tree.left.left = Node(4) tree.left.right = Node(5) return tree def preorder(root: Node | None) -> list[int]: """ Pre-order traversal visits root node, left subtree, right subtree. >>> preorder(make_tree()) [1, 2, 4, 5, 3] """ return [root.data] + preorder(root.left) + preorder(root.right) if root else [] def postorder(root: Node | None) -> list[int]: """ Post-order traversal visits left subtree, right subtree, root node. >>> postorder(make_tree()) [4, 5, 2, 3, 1] """ return postorder(root.left) + postorder(root.right) + [root.data] if root else [] def inorder(root: Node | None) -> list[int]: """ In-order traversal visits left subtree, root node, right subtree. >>> inorder(make_tree()) [4, 2, 5, 1, 3] """ return inorder(root.left) + [root.data] + inorder(root.right) if root else [] def height(root: Node | None) -> int: """ Recursive function for calculating the height of the binary tree. >>> height(None) 0 >>> height(make_tree()) 3 """ return (max(height(root.left), height(root.right)) + 1) if root else 0 def level_order(root: Node | None) -> Sequence[Node | None]: """ Returns a list of nodes value from a whole binary tree in Level Order Traverse. Level Order traverse: Visit nodes of the tree level-by-level. """ output: list[Any] = [] if root is None: return output process_queue = deque([root]) while process_queue: node = process_queue.popleft() output.append(node.data) if node.left: process_queue.append(node.left) if node.right: process_queue.append(node.right) return output def get_nodes_from_left_to_right( root: Node | None, level: int ) -> Sequence[Node | None]: """ Returns a list of nodes value from a particular level: Left to right direction of the binary tree. """ output: list[Any] = [] def populate_output(root: Node | None, level: int) -> None: if not root: return if level == 1: output.append(root.data) elif level > 1: populate_output(root.left, level - 1) populate_output(root.right, level - 1) populate_output(root, level) return output def get_nodes_from_right_to_left( root: Node | None, level: int ) -> Sequence[Node | None]: """ Returns a list of nodes value from a particular level: Right to left direction of the binary tree. """ output: list[Any] = [] def populate_output(root: Node | None, level: int) -> None: if root is None: return if level == 1: output.append(root.data) elif level > 1: populate_output(root.right, level - 1) populate_output(root.left, level - 1) populate_output(root, level) return output def zigzag(root: Node | None) -> Sequence[Node | None] | list[Any]: """ ZigZag traverse: Returns a list of nodes value from left to right and right to left, alternatively. """ if root is None: return [] output: list[Sequence[Node | None]] = [] flag = 0 height_tree = height(root) for h in range(1, height_tree + 1): if not flag: output.append(get_nodes_from_left_to_right(root, h)) flag = 1 else: output.append(get_nodes_from_right_to_left(root, h)) flag = 0 return output def main() -> None: # Main function for testing. """ Create binary tree. 
""" root = make_tree() """ All Traversals of the binary are as follows: """ print(f"In-order Traversal: {inorder(root)}") print(f"Pre-order Traversal: {preorder(root)}") print(f"Post-order Traversal: {postorder(root)}", "\n") print(f"Height of Tree: {height(root)}", "\n") print("Complete Level Order Traversal: ") print(level_order(root), "\n") print("Level-wise order Traversal: ") for level in range(1, height(root) + 1): print(f"Level {level}:", get_nodes_from_left_to_right(root, level=level)) print("\nZigZag order Traversal: ") print(zigzag(root)) if __name__ == "__main__": import doctest doctest.testmod() main()
# https://en.wikipedia.org/wiki/Tree_traversal from __future__ import annotations from collections import deque from collections.abc import Sequence from dataclasses import dataclass from typing import Any @dataclass class Node: data: int left: Node | None = None right: Node | None = None def make_tree() -> Node | None: r""" The below tree 1 / \ 2 3 / \ 4 5 """ tree = Node(1) tree.left = Node(2) tree.right = Node(3) tree.left.left = Node(4) tree.left.right = Node(5) return tree def preorder(root: Node | None) -> list[int]: """ Pre-order traversal visits root node, left subtree, right subtree. >>> preorder(make_tree()) [1, 2, 4, 5, 3] """ return [root.data] + preorder(root.left) + preorder(root.right) if root else [] def postorder(root: Node | None) -> list[int]: """ Post-order traversal visits left subtree, right subtree, root node. >>> postorder(make_tree()) [4, 5, 2, 3, 1] """ return postorder(root.left) + postorder(root.right) + [root.data] if root else [] def inorder(root: Node | None) -> list[int]: """ In-order traversal visits left subtree, root node, right subtree. >>> inorder(make_tree()) [4, 2, 5, 1, 3] """ return inorder(root.left) + [root.data] + inorder(root.right) if root else [] def height(root: Node | None) -> int: """ Recursive function for calculating the height of the binary tree. >>> height(None) 0 >>> height(make_tree()) 3 """ return (max(height(root.left), height(root.right)) + 1) if root else 0 def level_order(root: Node | None) -> Sequence[Node | None]: """ Returns a list of nodes value from a whole binary tree in Level Order Traverse. Level Order traverse: Visit nodes of the tree level-by-level. """ output: list[Any] = [] if root is None: return output process_queue = deque([root]) while process_queue: node = process_queue.popleft() output.append(node.data) if node.left: process_queue.append(node.left) if node.right: process_queue.append(node.right) return output def get_nodes_from_left_to_right( root: Node | None, level: int ) -> Sequence[Node | None]: """ Returns a list of nodes value from a particular level: Left to right direction of the binary tree. """ output: list[Any] = [] def populate_output(root: Node | None, level: int) -> None: if not root: return if level == 1: output.append(root.data) elif level > 1: populate_output(root.left, level - 1) populate_output(root.right, level - 1) populate_output(root, level) return output def get_nodes_from_right_to_left( root: Node | None, level: int ) -> Sequence[Node | None]: """ Returns a list of nodes value from a particular level: Right to left direction of the binary tree. """ output: list[Any] = [] def populate_output(root: Node | None, level: int) -> None: if root is None: return if level == 1: output.append(root.data) elif level > 1: populate_output(root.right, level - 1) populate_output(root.left, level - 1) populate_output(root, level) return output def zigzag(root: Node | None) -> Sequence[Node | None] | list[Any]: """ ZigZag traverse: Returns a list of nodes value from left to right and right to left, alternatively. """ if root is None: return [] output: list[Sequence[Node | None]] = [] flag = 0 height_tree = height(root) for h in range(1, height_tree + 1): if not flag: output.append(get_nodes_from_left_to_right(root, h)) flag = 1 else: output.append(get_nodes_from_right_to_left(root, h)) flag = 0 return output def main() -> None: # Main function for testing. """ Create binary tree. 
""" root = make_tree() """ All Traversals of the binary are as follows: """ print(f"In-order Traversal: {inorder(root)}") print(f"Pre-order Traversal: {preorder(root)}") print(f"Post-order Traversal: {postorder(root)}", "\n") print(f"Height of Tree: {height(root)}", "\n") print("Complete Level Order Traversal: ") print(level_order(root), "\n") print("Level-wise order Traversal: ") for level in range(1, height(root) + 1): print(f"Level {level}:", get_nodes_from_left_to_right(root, level=level)) print("\nZigZag order Traversal: ") print(zigzag(root)) if __name__ == "__main__": import doctest doctest.testmod() main()
-1
TheAlgorithms/Python
7,867
Update docs
### Describe your change: - Update docs to Python 3.10+ to match Py upgrade settings in pre-commit (`--py310-plus`) - Remove Exclusion from pre-commit-setting, this file have been fixed in #7844 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
Cjkjvfnby
"2022-10-30T08:30:31Z"
"2022-10-30T09:01:59Z"
17d93cab783095dd1def3c382866cd94296db455
57ccabbaeb0f32165271e3a218bc9c6dcfc21823
Update docs. ### Describe your change: - Update docs to Python 3.10+ to match Py upgrade settings in pre-commit (`--py310-plus`) - Remove Exclusion from pre-commit-setting, this file have been fixed in #7844 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
# An island in a matrix is a group of linked areas, all having the same value. # This code counts the number of islands in a given matrix, including diagonal # connections. class Matrix: # Public class to implement a graph def __init__(self, row: int, col: int, graph: list[list[bool]]) -> None: self.ROW = row self.COL = col self.graph = graph def is_safe(self, i: int, j: int, visited: list[list[bool]]) -> bool: return ( 0 <= i < self.ROW and 0 <= j < self.COL and not visited[i][j] and self.graph[i][j] ) def diffs(self, i: int, j: int, visited: list[list[bool]]) -> None: # Checking all 8 elements surrounding nth element row_nbr = [-1, -1, -1, 0, 0, 1, 1, 1] # Coordinate order col_nbr = [-1, 0, 1, -1, 1, -1, 0, 1] visited[i][j] = True # Make those cells visited for k in range(8): if self.is_safe(i + row_nbr[k], j + col_nbr[k], visited): self.diffs(i + row_nbr[k], j + col_nbr[k], visited) def count_islands(self) -> int: # And finally, count all islands. visited = [[False for j in range(self.COL)] for i in range(self.ROW)] count = 0 for i in range(self.ROW): for j in range(self.COL): if visited[i][j] is False and self.graph[i][j] == 1: self.diffs(i, j, visited) count += 1 return count
# An island in a matrix is a group of linked areas, all having the same value. # This code counts the number of islands in a given matrix, including diagonal # connections. class Matrix: # Public class to implement a graph def __init__(self, row: int, col: int, graph: list[list[bool]]) -> None: self.ROW = row self.COL = col self.graph = graph def is_safe(self, i: int, j: int, visited: list[list[bool]]) -> bool: return ( 0 <= i < self.ROW and 0 <= j < self.COL and not visited[i][j] and self.graph[i][j] ) def diffs(self, i: int, j: int, visited: list[list[bool]]) -> None: # Checking all 8 elements surrounding nth element row_nbr = [-1, -1, -1, 0, 0, 1, 1, 1] # Coordinate order col_nbr = [-1, 0, 1, -1, 1, -1, 0, 1] visited[i][j] = True # Make those cells visited for k in range(8): if self.is_safe(i + row_nbr[k], j + col_nbr[k], visited): self.diffs(i + row_nbr[k], j + col_nbr[k], visited) def count_islands(self) -> int: # And finally, count all islands. visited = [[False for j in range(self.COL)] for i in range(self.ROW)] count = 0 for i in range(self.ROW): for j in range(self.COL): if visited[i][j] is False and self.graph[i][j] == 1: self.diffs(i, j, visited) count += 1 return count
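A minimal usage sketch for the Matrix class above, assuming the class is defined in the same module; the 5x5 grid and the expected island count are illustrative values chosen here, not taken from the repository file:

if __name__ == "__main__":
    # Cells belong to the same island if they touch horizontally, vertically
    # or diagonally, so the top-left cluster and the cell at (2, 0) merge
    # into a single island.
    grid = [
        [1, 1, 0, 0, 0],
        [0, 1, 0, 0, 1],
        [1, 0, 0, 1, 1],
        [0, 0, 0, 0, 0],
        [1, 0, 1, 0, 1],
    ]
    matrix = Matrix(row=5, col=5, graph=grid)
    print(matrix.count_islands())  # expected output: 5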
-1
TheAlgorithms/Python
7,867
Update docs
### Describe your change: - Update docs to Python 3.10+ to match Py upgrade settings in pre-commit (`--py310-plus`) - Remove Exclusion from pre-commit-setting, this file have been fixed in #7844 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
Cjkjvfnby
"2022-10-30T08:30:31Z"
"2022-10-30T09:01:59Z"
17d93cab783095dd1def3c382866cd94296db455
57ccabbaeb0f32165271e3a218bc9c6dcfc21823
Update docs. ### Describe your change: - Update docs to Python 3.10+ to match Py upgrade settings in pre-commit (`--py310-plus`) - Remove Exclusion from pre-commit-setting, this file have been fixed in #7844 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
""" Find the area of various geometric shapes Wikipedia reference: https://en.wikipedia.org/wiki/Area """ from math import pi, sqrt, tan def surface_area_cube(side_length: float) -> float: """ Calculate the Surface Area of a Cube. >>> surface_area_cube(1) 6 >>> surface_area_cube(1.6) 15.360000000000003 >>> surface_area_cube(0) 0 >>> surface_area_cube(3) 54 >>> surface_area_cube(-1) Traceback (most recent call last): ... ValueError: surface_area_cube() only accepts non-negative values """ if side_length < 0: raise ValueError("surface_area_cube() only accepts non-negative values") return 6 * side_length**2 def surface_area_cuboid(length: float, breadth: float, height: float) -> float: """ Calculate the Surface Area of a Cuboid. >>> surface_area_cuboid(1, 2, 3) 22 >>> surface_area_cuboid(0, 0, 0) 0 >>> surface_area_cuboid(1.6, 2.6, 3.6) 38.56 >>> surface_area_cuboid(-1, 2, 3) Traceback (most recent call last): ... ValueError: surface_area_cuboid() only accepts non-negative values >>> surface_area_cuboid(1, -2, 3) Traceback (most recent call last): ... ValueError: surface_area_cuboid() only accepts non-negative values >>> surface_area_cuboid(1, 2, -3) Traceback (most recent call last): ... ValueError: surface_area_cuboid() only accepts non-negative values """ if length < 0 or breadth < 0 or height < 0: raise ValueError("surface_area_cuboid() only accepts non-negative values") return 2 * ((length * breadth) + (breadth * height) + (length * height)) def surface_area_sphere(radius: float) -> float: """ Calculate the Surface Area of a Sphere. Wikipedia reference: https://en.wikipedia.org/wiki/Sphere Formula: 4 * pi * r^2 >>> surface_area_sphere(5) 314.1592653589793 >>> surface_area_sphere(1) 12.566370614359172 >>> surface_area_sphere(1.6) 32.169908772759484 >>> surface_area_sphere(0) 0.0 >>> surface_area_sphere(-1) Traceback (most recent call last): ... ValueError: surface_area_sphere() only accepts non-negative values """ if radius < 0: raise ValueError("surface_area_sphere() only accepts non-negative values") return 4 * pi * radius**2 def surface_area_hemisphere(radius: float) -> float: """ Calculate the Surface Area of a Hemisphere. Formula: 3 * pi * r^2 >>> surface_area_hemisphere(5) 235.61944901923448 >>> surface_area_hemisphere(1) 9.42477796076938 >>> surface_area_hemisphere(0) 0.0 >>> surface_area_hemisphere(1.1) 11.40398133253095 >>> surface_area_hemisphere(-1) Traceback (most recent call last): ... ValueError: surface_area_hemisphere() only accepts non-negative values """ if radius < 0: raise ValueError("surface_area_hemisphere() only accepts non-negative values") return 3 * pi * radius**2 def surface_area_cone(radius: float, height: float) -> float: """ Calculate the Surface Area of a Cone. Wikipedia reference: https://en.wikipedia.org/wiki/Cone Formula: pi * r * (r + (h ** 2 + r ** 2) ** 0.5) >>> surface_area_cone(10, 24) 1130.9733552923256 >>> surface_area_cone(6, 8) 301.59289474462014 >>> surface_area_cone(1.6, 2.6) 23.387862992395807 >>> surface_area_cone(0, 0) 0.0 >>> surface_area_cone(-1, -2) Traceback (most recent call last): ... ValueError: surface_area_cone() only accepts non-negative values >>> surface_area_cone(1, -2) Traceback (most recent call last): ... ValueError: surface_area_cone() only accepts non-negative values >>> surface_area_cone(-1, 2) Traceback (most recent call last): ... 
ValueError: surface_area_cone() only accepts non-negative values """ if radius < 0 or height < 0: raise ValueError("surface_area_cone() only accepts non-negative values") return pi * radius * (radius + (height**2 + radius**2) ** 0.5) def surface_area_conical_frustum( radius_1: float, radius_2: float, height: float ) -> float: """ Calculate the Surface Area of a Conical Frustum. >>> surface_area_conical_frustum(1, 2, 3) 45.511728065337266 >>> surface_area_conical_frustum(4, 5, 6) 300.7913575056268 >>> surface_area_conical_frustum(0, 0, 0) 0.0 >>> surface_area_conical_frustum(1.6, 2.6, 3.6) 78.57907060751548 >>> surface_area_conical_frustum(-1, 2, 3) Traceback (most recent call last): ... ValueError: surface_area_conical_frustum() only accepts non-negative values >>> surface_area_conical_frustum(1, -2, 3) Traceback (most recent call last): ... ValueError: surface_area_conical_frustum() only accepts non-negative values >>> surface_area_conical_frustum(1, 2, -3) Traceback (most recent call last): ... ValueError: surface_area_conical_frustum() only accepts non-negative values """ if radius_1 < 0 or radius_2 < 0 or height < 0: raise ValueError( "surface_area_conical_frustum() only accepts non-negative values" ) slant_height = (height**2 + (radius_1 - radius_2) ** 2) ** 0.5 return pi * ((slant_height * (radius_1 + radius_2)) + radius_1**2 + radius_2**2) def surface_area_cylinder(radius: float, height: float) -> float: """ Calculate the Surface Area of a Cylinder. Wikipedia reference: https://en.wikipedia.org/wiki/Cylinder Formula: 2 * pi * r * (h + r) >>> surface_area_cylinder(7, 10) 747.6990515543707 >>> surface_area_cylinder(1.6, 2.6) 42.22300526424682 >>> surface_area_cylinder(0, 0) 0.0 >>> surface_area_cylinder(6, 8) 527.7875658030853 >>> surface_area_cylinder(-1, -2) Traceback (most recent call last): ... ValueError: surface_area_cylinder() only accepts non-negative values >>> surface_area_cylinder(1, -2) Traceback (most recent call last): ... ValueError: surface_area_cylinder() only accepts non-negative values >>> surface_area_cylinder(-1, 2) Traceback (most recent call last): ... ValueError: surface_area_cylinder() only accepts non-negative values """ if radius < 0 or height < 0: raise ValueError("surface_area_cylinder() only accepts non-negative values") return 2 * pi * radius * (height + radius) def area_rectangle(length: float, width: float) -> float: """ Calculate the area of a rectangle. >>> area_rectangle(10, 20) 200 >>> area_rectangle(1.6, 2.6) 4.16 >>> area_rectangle(0, 0) 0 >>> area_rectangle(-1, -2) Traceback (most recent call last): ... ValueError: area_rectangle() only accepts non-negative values >>> area_rectangle(1, -2) Traceback (most recent call last): ... ValueError: area_rectangle() only accepts non-negative values >>> area_rectangle(-1, 2) Traceback (most recent call last): ... ValueError: area_rectangle() only accepts non-negative values """ if length < 0 or width < 0: raise ValueError("area_rectangle() only accepts non-negative values") return length * width def area_square(side_length: float) -> float: """ Calculate the area of a square. >>> area_square(10) 100 >>> area_square(0) 0 >>> area_square(1.6) 2.5600000000000005 >>> area_square(-1) Traceback (most recent call last): ... ValueError: area_square() only accepts non-negative values """ if side_length < 0: raise ValueError("area_square() only accepts non-negative values") return side_length**2 def area_triangle(base: float, height: float) -> float: """ Calculate the area of a triangle given the base and height. 
>>> area_triangle(10, 10) 50.0 >>> area_triangle(1.6, 2.6) 2.08 >>> area_triangle(0, 0) 0.0 >>> area_triangle(-1, -2) Traceback (most recent call last): ... ValueError: area_triangle() only accepts non-negative values >>> area_triangle(1, -2) Traceback (most recent call last): ... ValueError: area_triangle() only accepts non-negative values >>> area_triangle(-1, 2) Traceback (most recent call last): ... ValueError: area_triangle() only accepts non-negative values """ if base < 0 or height < 0: raise ValueError("area_triangle() only accepts non-negative values") return (base * height) / 2 def area_triangle_three_sides(side1: float, side2: float, side3: float) -> float: """ Calculate area of triangle when the length of 3 sides are known. This function uses Heron's formula: https://en.wikipedia.org/wiki/Heron%27s_formula >>> area_triangle_three_sides(5, 12, 13) 30.0 >>> area_triangle_three_sides(10, 11, 12) 51.521233486786784 >>> area_triangle_three_sides(0, 0, 0) 0.0 >>> area_triangle_three_sides(1.6, 2.6, 3.6) 1.8703742940919619 >>> area_triangle_three_sides(-1, -2, -1) Traceback (most recent call last): ... ValueError: area_triangle_three_sides() only accepts non-negative values >>> area_triangle_three_sides(1, -2, 1) Traceback (most recent call last): ... ValueError: area_triangle_three_sides() only accepts non-negative values >>> area_triangle_three_sides(2, 4, 7) Traceback (most recent call last): ... ValueError: Given three sides do not form a triangle >>> area_triangle_three_sides(2, 7, 4) Traceback (most recent call last): ... ValueError: Given three sides do not form a triangle >>> area_triangle_three_sides(7, 2, 4) Traceback (most recent call last): ... ValueError: Given three sides do not form a triangle """ if side1 < 0 or side2 < 0 or side3 < 0: raise ValueError("area_triangle_three_sides() only accepts non-negative values") elif side1 + side2 < side3 or side1 + side3 < side2 or side2 + side3 < side1: raise ValueError("Given three sides do not form a triangle") semi_perimeter = (side1 + side2 + side3) / 2 area = sqrt( semi_perimeter * (semi_perimeter - side1) * (semi_perimeter - side2) * (semi_perimeter - side3) ) return area def area_parallelogram(base: float, height: float) -> float: """ Calculate the area of a parallelogram. >>> area_parallelogram(10, 20) 200 >>> area_parallelogram(1.6, 2.6) 4.16 >>> area_parallelogram(0, 0) 0 >>> area_parallelogram(-1, -2) Traceback (most recent call last): ... ValueError: area_parallelogram() only accepts non-negative values >>> area_parallelogram(1, -2) Traceback (most recent call last): ... ValueError: area_parallelogram() only accepts non-negative values >>> area_parallelogram(-1, 2) Traceback (most recent call last): ... ValueError: area_parallelogram() only accepts non-negative values """ if base < 0 or height < 0: raise ValueError("area_parallelogram() only accepts non-negative values") return base * height def area_trapezium(base1: float, base2: float, height: float) -> float: """ Calculate the area of a trapezium. >>> area_trapezium(10, 20, 30) 450.0 >>> area_trapezium(1.6, 2.6, 3.6) 7.5600000000000005 >>> area_trapezium(0, 0, 0) 0.0 >>> area_trapezium(-1, -2, -3) Traceback (most recent call last): ... ValueError: area_trapezium() only accepts non-negative values >>> area_trapezium(-1, 2, 3) Traceback (most recent call last): ... ValueError: area_trapezium() only accepts non-negative values >>> area_trapezium(1, -2, 3) Traceback (most recent call last): ... 
ValueError: area_trapezium() only accepts non-negative values >>> area_trapezium(1, 2, -3) Traceback (most recent call last): ... ValueError: area_trapezium() only accepts non-negative values >>> area_trapezium(-1, -2, 3) Traceback (most recent call last): ... ValueError: area_trapezium() only accepts non-negative values >>> area_trapezium(1, -2, -3) Traceback (most recent call last): ... ValueError: area_trapezium() only accepts non-negative values >>> area_trapezium(-1, 2, -3) Traceback (most recent call last): ... ValueError: area_trapezium() only accepts non-negative values """ if base1 < 0 or base2 < 0 or height < 0: raise ValueError("area_trapezium() only accepts non-negative values") return 1 / 2 * (base1 + base2) * height def area_circle(radius: float) -> float: """ Calculate the area of a circle. >>> area_circle(20) 1256.6370614359173 >>> area_circle(1.6) 8.042477193189871 >>> area_circle(0) 0.0 >>> area_circle(-1) Traceback (most recent call last): ... ValueError: area_circle() only accepts non-negative values """ if radius < 0: raise ValueError("area_circle() only accepts non-negative values") return pi * radius**2 def area_ellipse(radius_x: float, radius_y: float) -> float: """ Calculate the area of a ellipse. >>> area_ellipse(10, 10) 314.1592653589793 >>> area_ellipse(10, 20) 628.3185307179587 >>> area_ellipse(0, 0) 0.0 >>> area_ellipse(1.6, 2.6) 13.06902543893354 >>> area_ellipse(-10, 20) Traceback (most recent call last): ... ValueError: area_ellipse() only accepts non-negative values >>> area_ellipse(10, -20) Traceback (most recent call last): ... ValueError: area_ellipse() only accepts non-negative values >>> area_ellipse(-10, -20) Traceback (most recent call last): ... ValueError: area_ellipse() only accepts non-negative values """ if radius_x < 0 or radius_y < 0: raise ValueError("area_ellipse() only accepts non-negative values") return pi * radius_x * radius_y def area_rhombus(diagonal_1: float, diagonal_2: float) -> float: """ Calculate the area of a rhombus. >>> area_rhombus(10, 20) 100.0 >>> area_rhombus(1.6, 2.6) 2.08 >>> area_rhombus(0, 0) 0.0 >>> area_rhombus(-1, -2) Traceback (most recent call last): ... ValueError: area_rhombus() only accepts non-negative values >>> area_rhombus(1, -2) Traceback (most recent call last): ... ValueError: area_rhombus() only accepts non-negative values >>> area_rhombus(-1, 2) Traceback (most recent call last): ... ValueError: area_rhombus() only accepts non-negative values """ if diagonal_1 < 0 or diagonal_2 < 0: raise ValueError("area_rhombus() only accepts non-negative values") return 1 / 2 * diagonal_1 * diagonal_2 def area_reg_polygon(sides: int, length: float) -> float: """ Calculate the area of a regular polygon. Wikipedia reference: https://en.wikipedia.org/wiki/Polygon#Regular_polygons Formula: (n*s^2*cot(pi/n))/4 >>> area_reg_polygon(3, 10) 43.301270189221945 >>> area_reg_polygon(4, 10) 100.00000000000001 >>> area_reg_polygon(0, 0) Traceback (most recent call last): ... ValueError: area_reg_polygon() only accepts integers greater than or equal to \ three as number of sides >>> area_reg_polygon(-1, -2) Traceback (most recent call last): ... ValueError: area_reg_polygon() only accepts integers greater than or equal to \ three as number of sides >>> area_reg_polygon(5, -2) Traceback (most recent call last): ... ValueError: area_reg_polygon() only accepts non-negative values as \ length of a side >>> area_reg_polygon(-1, 2) Traceback (most recent call last): ... 
ValueError: area_reg_polygon() only accepts integers greater than or equal to \ three as number of sides """ if not isinstance(sides, int) or sides < 3: raise ValueError( "area_reg_polygon() only accepts integers greater than or \ equal to three as number of sides" ) elif length < 0: raise ValueError( "area_reg_polygon() only accepts non-negative values as \ length of a side" ) return (sides * length**2) / (4 * tan(pi / sides)) if __name__ == "__main__": import doctest doctest.testmod(verbose=True) # verbose so we can see methods missing tests print("[DEMO] Areas of various geometric shapes: \n") print(f"Rectangle: {area_rectangle(10, 20) = }") print(f"Square: {area_square(10) = }") print(f"Triangle: {area_triangle(10, 10) = }") print(f"Triangle: {area_triangle_three_sides(5, 12, 13) = }") print(f"Parallelogram: {area_parallelogram(10, 20) = }") print(f"Rhombus: {area_rhombus(10, 20) = }") print(f"Trapezium: {area_trapezium(10, 20, 30) = }") print(f"Circle: {area_circle(20) = }") print(f"Ellipse: {area_ellipse(10, 20) = }") print("\nSurface Areas of various geometric shapes: \n") print(f"Cube: {surface_area_cube(20) = }") print(f"Cuboid: {surface_area_cuboid(10, 20, 30) = }") print(f"Sphere: {surface_area_sphere(20) = }") print(f"Hemisphere: {surface_area_hemisphere(20) = }") print(f"Cone: {surface_area_cone(10, 20) = }") print(f"Conical Frustum: {surface_area_conical_frustum(10, 20, 30) = }") print(f"Cylinder: {surface_area_cylinder(10, 20) = }") print(f"Equilateral Triangle: {area_reg_polygon(3, 10) = }") print(f"Square: {area_reg_polygon(4, 10) = }") print(f"Regular Pentagon: {area_reg_polygon(5, 10) = }")
""" Find the area of various geometric shapes Wikipedia reference: https://en.wikipedia.org/wiki/Area """ from math import pi, sqrt, tan def surface_area_cube(side_length: float) -> float: """ Calculate the Surface Area of a Cube. >>> surface_area_cube(1) 6 >>> surface_area_cube(1.6) 15.360000000000003 >>> surface_area_cube(0) 0 >>> surface_area_cube(3) 54 >>> surface_area_cube(-1) Traceback (most recent call last): ... ValueError: surface_area_cube() only accepts non-negative values """ if side_length < 0: raise ValueError("surface_area_cube() only accepts non-negative values") return 6 * side_length**2 def surface_area_cuboid(length: float, breadth: float, height: float) -> float: """ Calculate the Surface Area of a Cuboid. >>> surface_area_cuboid(1, 2, 3) 22 >>> surface_area_cuboid(0, 0, 0) 0 >>> surface_area_cuboid(1.6, 2.6, 3.6) 38.56 >>> surface_area_cuboid(-1, 2, 3) Traceback (most recent call last): ... ValueError: surface_area_cuboid() only accepts non-negative values >>> surface_area_cuboid(1, -2, 3) Traceback (most recent call last): ... ValueError: surface_area_cuboid() only accepts non-negative values >>> surface_area_cuboid(1, 2, -3) Traceback (most recent call last): ... ValueError: surface_area_cuboid() only accepts non-negative values """ if length < 0 or breadth < 0 or height < 0: raise ValueError("surface_area_cuboid() only accepts non-negative values") return 2 * ((length * breadth) + (breadth * height) + (length * height)) def surface_area_sphere(radius: float) -> float: """ Calculate the Surface Area of a Sphere. Wikipedia reference: https://en.wikipedia.org/wiki/Sphere Formula: 4 * pi * r^2 >>> surface_area_sphere(5) 314.1592653589793 >>> surface_area_sphere(1) 12.566370614359172 >>> surface_area_sphere(1.6) 32.169908772759484 >>> surface_area_sphere(0) 0.0 >>> surface_area_sphere(-1) Traceback (most recent call last): ... ValueError: surface_area_sphere() only accepts non-negative values """ if radius < 0: raise ValueError("surface_area_sphere() only accepts non-negative values") return 4 * pi * radius**2 def surface_area_hemisphere(radius: float) -> float: """ Calculate the Surface Area of a Hemisphere. Formula: 3 * pi * r^2 >>> surface_area_hemisphere(5) 235.61944901923448 >>> surface_area_hemisphere(1) 9.42477796076938 >>> surface_area_hemisphere(0) 0.0 >>> surface_area_hemisphere(1.1) 11.40398133253095 >>> surface_area_hemisphere(-1) Traceback (most recent call last): ... ValueError: surface_area_hemisphere() only accepts non-negative values """ if radius < 0: raise ValueError("surface_area_hemisphere() only accepts non-negative values") return 3 * pi * radius**2 def surface_area_cone(radius: float, height: float) -> float: """ Calculate the Surface Area of a Cone. Wikipedia reference: https://en.wikipedia.org/wiki/Cone Formula: pi * r * (r + (h ** 2 + r ** 2) ** 0.5) >>> surface_area_cone(10, 24) 1130.9733552923256 >>> surface_area_cone(6, 8) 301.59289474462014 >>> surface_area_cone(1.6, 2.6) 23.387862992395807 >>> surface_area_cone(0, 0) 0.0 >>> surface_area_cone(-1, -2) Traceback (most recent call last): ... ValueError: surface_area_cone() only accepts non-negative values >>> surface_area_cone(1, -2) Traceback (most recent call last): ... ValueError: surface_area_cone() only accepts non-negative values >>> surface_area_cone(-1, 2) Traceback (most recent call last): ... 
ValueError: surface_area_cone() only accepts non-negative values """ if radius < 0 or height < 0: raise ValueError("surface_area_cone() only accepts non-negative values") return pi * radius * (radius + (height**2 + radius**2) ** 0.5) def surface_area_conical_frustum( radius_1: float, radius_2: float, height: float ) -> float: """ Calculate the Surface Area of a Conical Frustum. >>> surface_area_conical_frustum(1, 2, 3) 45.511728065337266 >>> surface_area_conical_frustum(4, 5, 6) 300.7913575056268 >>> surface_area_conical_frustum(0, 0, 0) 0.0 >>> surface_area_conical_frustum(1.6, 2.6, 3.6) 78.57907060751548 >>> surface_area_conical_frustum(-1, 2, 3) Traceback (most recent call last): ... ValueError: surface_area_conical_frustum() only accepts non-negative values >>> surface_area_conical_frustum(1, -2, 3) Traceback (most recent call last): ... ValueError: surface_area_conical_frustum() only accepts non-negative values >>> surface_area_conical_frustum(1, 2, -3) Traceback (most recent call last): ... ValueError: surface_area_conical_frustum() only accepts non-negative values """ if radius_1 < 0 or radius_2 < 0 or height < 0: raise ValueError( "surface_area_conical_frustum() only accepts non-negative values" ) slant_height = (height**2 + (radius_1 - radius_2) ** 2) ** 0.5 return pi * ((slant_height * (radius_1 + radius_2)) + radius_1**2 + radius_2**2) def surface_area_cylinder(radius: float, height: float) -> float: """ Calculate the Surface Area of a Cylinder. Wikipedia reference: https://en.wikipedia.org/wiki/Cylinder Formula: 2 * pi * r * (h + r) >>> surface_area_cylinder(7, 10) 747.6990515543707 >>> surface_area_cylinder(1.6, 2.6) 42.22300526424682 >>> surface_area_cylinder(0, 0) 0.0 >>> surface_area_cylinder(6, 8) 527.7875658030853 >>> surface_area_cylinder(-1, -2) Traceback (most recent call last): ... ValueError: surface_area_cylinder() only accepts non-negative values >>> surface_area_cylinder(1, -2) Traceback (most recent call last): ... ValueError: surface_area_cylinder() only accepts non-negative values >>> surface_area_cylinder(-1, 2) Traceback (most recent call last): ... ValueError: surface_area_cylinder() only accepts non-negative values """ if radius < 0 or height < 0: raise ValueError("surface_area_cylinder() only accepts non-negative values") return 2 * pi * radius * (height + radius) def area_rectangle(length: float, width: float) -> float: """ Calculate the area of a rectangle. >>> area_rectangle(10, 20) 200 >>> area_rectangle(1.6, 2.6) 4.16 >>> area_rectangle(0, 0) 0 >>> area_rectangle(-1, -2) Traceback (most recent call last): ... ValueError: area_rectangle() only accepts non-negative values >>> area_rectangle(1, -2) Traceback (most recent call last): ... ValueError: area_rectangle() only accepts non-negative values >>> area_rectangle(-1, 2) Traceback (most recent call last): ... ValueError: area_rectangle() only accepts non-negative values """ if length < 0 or width < 0: raise ValueError("area_rectangle() only accepts non-negative values") return length * width def area_square(side_length: float) -> float: """ Calculate the area of a square. >>> area_square(10) 100 >>> area_square(0) 0 >>> area_square(1.6) 2.5600000000000005 >>> area_square(-1) Traceback (most recent call last): ... ValueError: area_square() only accepts non-negative values """ if side_length < 0: raise ValueError("area_square() only accepts non-negative values") return side_length**2 def area_triangle(base: float, height: float) -> float: """ Calculate the area of a triangle given the base and height. 
>>> area_triangle(10, 10) 50.0 >>> area_triangle(1.6, 2.6) 2.08 >>> area_triangle(0, 0) 0.0 >>> area_triangle(-1, -2) Traceback (most recent call last): ... ValueError: area_triangle() only accepts non-negative values >>> area_triangle(1, -2) Traceback (most recent call last): ... ValueError: area_triangle() only accepts non-negative values >>> area_triangle(-1, 2) Traceback (most recent call last): ... ValueError: area_triangle() only accepts non-negative values """ if base < 0 or height < 0: raise ValueError("area_triangle() only accepts non-negative values") return (base * height) / 2 def area_triangle_three_sides(side1: float, side2: float, side3: float) -> float: """ Calculate area of triangle when the length of 3 sides are known. This function uses Heron's formula: https://en.wikipedia.org/wiki/Heron%27s_formula >>> area_triangle_three_sides(5, 12, 13) 30.0 >>> area_triangle_three_sides(10, 11, 12) 51.521233486786784 >>> area_triangle_three_sides(0, 0, 0) 0.0 >>> area_triangle_three_sides(1.6, 2.6, 3.6) 1.8703742940919619 >>> area_triangle_three_sides(-1, -2, -1) Traceback (most recent call last): ... ValueError: area_triangle_three_sides() only accepts non-negative values >>> area_triangle_three_sides(1, -2, 1) Traceback (most recent call last): ... ValueError: area_triangle_three_sides() only accepts non-negative values >>> area_triangle_three_sides(2, 4, 7) Traceback (most recent call last): ... ValueError: Given three sides do not form a triangle >>> area_triangle_three_sides(2, 7, 4) Traceback (most recent call last): ... ValueError: Given three sides do not form a triangle >>> area_triangle_three_sides(7, 2, 4) Traceback (most recent call last): ... ValueError: Given three sides do not form a triangle """ if side1 < 0 or side2 < 0 or side3 < 0: raise ValueError("area_triangle_three_sides() only accepts non-negative values") elif side1 + side2 < side3 or side1 + side3 < side2 or side2 + side3 < side1: raise ValueError("Given three sides do not form a triangle") semi_perimeter = (side1 + side2 + side3) / 2 area = sqrt( semi_perimeter * (semi_perimeter - side1) * (semi_perimeter - side2) * (semi_perimeter - side3) ) return area def area_parallelogram(base: float, height: float) -> float: """ Calculate the area of a parallelogram. >>> area_parallelogram(10, 20) 200 >>> area_parallelogram(1.6, 2.6) 4.16 >>> area_parallelogram(0, 0) 0 >>> area_parallelogram(-1, -2) Traceback (most recent call last): ... ValueError: area_parallelogram() only accepts non-negative values >>> area_parallelogram(1, -2) Traceback (most recent call last): ... ValueError: area_parallelogram() only accepts non-negative values >>> area_parallelogram(-1, 2) Traceback (most recent call last): ... ValueError: area_parallelogram() only accepts non-negative values """ if base < 0 or height < 0: raise ValueError("area_parallelogram() only accepts non-negative values") return base * height def area_trapezium(base1: float, base2: float, height: float) -> float: """ Calculate the area of a trapezium. >>> area_trapezium(10, 20, 30) 450.0 >>> area_trapezium(1.6, 2.6, 3.6) 7.5600000000000005 >>> area_trapezium(0, 0, 0) 0.0 >>> area_trapezium(-1, -2, -3) Traceback (most recent call last): ... ValueError: area_trapezium() only accepts non-negative values >>> area_trapezium(-1, 2, 3) Traceback (most recent call last): ... ValueError: area_trapezium() only accepts non-negative values >>> area_trapezium(1, -2, 3) Traceback (most recent call last): ... 
ValueError: area_trapezium() only accepts non-negative values >>> area_trapezium(1, 2, -3) Traceback (most recent call last): ... ValueError: area_trapezium() only accepts non-negative values >>> area_trapezium(-1, -2, 3) Traceback (most recent call last): ... ValueError: area_trapezium() only accepts non-negative values >>> area_trapezium(1, -2, -3) Traceback (most recent call last): ... ValueError: area_trapezium() only accepts non-negative values >>> area_trapezium(-1, 2, -3) Traceback (most recent call last): ... ValueError: area_trapezium() only accepts non-negative values """ if base1 < 0 or base2 < 0 or height < 0: raise ValueError("area_trapezium() only accepts non-negative values") return 1 / 2 * (base1 + base2) * height def area_circle(radius: float) -> float: """ Calculate the area of a circle. >>> area_circle(20) 1256.6370614359173 >>> area_circle(1.6) 8.042477193189871 >>> area_circle(0) 0.0 >>> area_circle(-1) Traceback (most recent call last): ... ValueError: area_circle() only accepts non-negative values """ if radius < 0: raise ValueError("area_circle() only accepts non-negative values") return pi * radius**2 def area_ellipse(radius_x: float, radius_y: float) -> float: """ Calculate the area of a ellipse. >>> area_ellipse(10, 10) 314.1592653589793 >>> area_ellipse(10, 20) 628.3185307179587 >>> area_ellipse(0, 0) 0.0 >>> area_ellipse(1.6, 2.6) 13.06902543893354 >>> area_ellipse(-10, 20) Traceback (most recent call last): ... ValueError: area_ellipse() only accepts non-negative values >>> area_ellipse(10, -20) Traceback (most recent call last): ... ValueError: area_ellipse() only accepts non-negative values >>> area_ellipse(-10, -20) Traceback (most recent call last): ... ValueError: area_ellipse() only accepts non-negative values """ if radius_x < 0 or radius_y < 0: raise ValueError("area_ellipse() only accepts non-negative values") return pi * radius_x * radius_y def area_rhombus(diagonal_1: float, diagonal_2: float) -> float: """ Calculate the area of a rhombus. >>> area_rhombus(10, 20) 100.0 >>> area_rhombus(1.6, 2.6) 2.08 >>> area_rhombus(0, 0) 0.0 >>> area_rhombus(-1, -2) Traceback (most recent call last): ... ValueError: area_rhombus() only accepts non-negative values >>> area_rhombus(1, -2) Traceback (most recent call last): ... ValueError: area_rhombus() only accepts non-negative values >>> area_rhombus(-1, 2) Traceback (most recent call last): ... ValueError: area_rhombus() only accepts non-negative values """ if diagonal_1 < 0 or diagonal_2 < 0: raise ValueError("area_rhombus() only accepts non-negative values") return 1 / 2 * diagonal_1 * diagonal_2 def area_reg_polygon(sides: int, length: float) -> float: """ Calculate the area of a regular polygon. Wikipedia reference: https://en.wikipedia.org/wiki/Polygon#Regular_polygons Formula: (n*s^2*cot(pi/n))/4 >>> area_reg_polygon(3, 10) 43.301270189221945 >>> area_reg_polygon(4, 10) 100.00000000000001 >>> area_reg_polygon(0, 0) Traceback (most recent call last): ... ValueError: area_reg_polygon() only accepts integers greater than or equal to \ three as number of sides >>> area_reg_polygon(-1, -2) Traceback (most recent call last): ... ValueError: area_reg_polygon() only accepts integers greater than or equal to \ three as number of sides >>> area_reg_polygon(5, -2) Traceback (most recent call last): ... ValueError: area_reg_polygon() only accepts non-negative values as \ length of a side >>> area_reg_polygon(-1, 2) Traceback (most recent call last): ... 
ValueError: area_reg_polygon() only accepts integers greater than or equal to \ three as number of sides """ if not isinstance(sides, int) or sides < 3: raise ValueError( "area_reg_polygon() only accepts integers greater than or \ equal to three as number of sides" ) elif length < 0: raise ValueError( "area_reg_polygon() only accepts non-negative values as \ length of a side" ) return (sides * length**2) / (4 * tan(pi / sides)) if __name__ == "__main__": import doctest doctest.testmod(verbose=True) # verbose so we can see methods missing tests print("[DEMO] Areas of various geometric shapes: \n") print(f"Rectangle: {area_rectangle(10, 20) = }") print(f"Square: {area_square(10) = }") print(f"Triangle: {area_triangle(10, 10) = }") print(f"Triangle: {area_triangle_three_sides(5, 12, 13) = }") print(f"Parallelogram: {area_parallelogram(10, 20) = }") print(f"Rhombus: {area_rhombus(10, 20) = }") print(f"Trapezium: {area_trapezium(10, 20, 30) = }") print(f"Circle: {area_circle(20) = }") print(f"Ellipse: {area_ellipse(10, 20) = }") print("\nSurface Areas of various geometric shapes: \n") print(f"Cube: {surface_area_cube(20) = }") print(f"Cuboid: {surface_area_cuboid(10, 20, 30) = }") print(f"Sphere: {surface_area_sphere(20) = }") print(f"Hemisphere: {surface_area_hemisphere(20) = }") print(f"Cone: {surface_area_cone(10, 20) = }") print(f"Conical Frustum: {surface_area_conical_frustum(10, 20, 30) = }") print(f"Cylinder: {surface_area_cylinder(10, 20) = }") print(f"Equilateral Triangle: {area_reg_polygon(3, 10) = }") print(f"Square: {area_reg_polygon(4, 10) = }") print(f"Regular Pentagon: {area_reg_polygon(5, 10) = }")
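A small cross-check of the area helpers above, assuming the functions defined in this module are in scope; the 3-4-5 triangle and unit sphere are illustrative values, and the sphere result matches the surface_area_sphere(1) doctest:

# Heron's formula and the base-height formula should agree for a 3-4-5
# right triangle: both give an area of exactly 6.0.
assert area_triangle_three_sides(3, 4, 5) == area_triangle(3, 4) == 6.0
# A unit sphere has surface area 4*pi.
print(surface_area_sphere(1))  # 12.566370614359172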
-1
TheAlgorithms/Python
7,867
Update docs
### Describe your change: - Update docs to Python 3.10+ to match Py upgrade settings in pre-commit (`--py310-plus`) - Remove Exclusion from pre-commit-setting, this file have been fixed in #7844 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
Cjkjvfnby
"2022-10-30T08:30:31Z"
"2022-10-30T09:01:59Z"
17d93cab783095dd1def3c382866cd94296db455
57ccabbaeb0f32165271e3a218bc9c6dcfc21823
Update docs. ### Describe your change: - Update docs to Python 3.10+ to match Py upgrade settings in pre-commit (`--py310-plus`) - Remove Exclusion from pre-commit-setting, this file have been fixed in #7844 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
-1
TheAlgorithms/Python
7,867
Update docs
### Describe your change: - Update docs to Python 3.10+ to match Py upgrade settings in pre-commit (`--py310-plus`) - Remove Exclusion from pre-commit-setting, this file have been fixed in #7844 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
Cjkjvfnby
"2022-10-30T08:30:31Z"
"2022-10-30T09:01:59Z"
17d93cab783095dd1def3c382866cd94296db455
57ccabbaeb0f32165271e3a218bc9c6dcfc21823
Update docs. ### Describe your change: - Update docs to Python 3.10+ to match Py upgrade settings in pre-commit (`--py310-plus`) - Remove Exclusion from pre-commit-setting, this file have been fixed in #7844 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
#!/usr/bin/env python3 """ The Bifid Cipher uses a Polybius Square to encipher a message in a way that makes it fairly difficult to decipher without knowing the secret. https://www.braingle.com/brainteasers/codes/bifid.php """ import numpy as np SQUARE = [ ["a", "b", "c", "d", "e"], ["f", "g", "h", "i", "k"], ["l", "m", "n", "o", "p"], ["q", "r", "s", "t", "u"], ["v", "w", "x", "y", "z"], ] class BifidCipher: def __init__(self) -> None: self.SQUARE = np.array(SQUARE) def letter_to_numbers(self, letter: str) -> np.ndarray: """ Return the pair of numbers that represents the given letter in the polybius square >>> np.array_equal(BifidCipher().letter_to_numbers('a'), [1,1]) True >>> np.array_equal(BifidCipher().letter_to_numbers('u'), [4,5]) True """ index1, index2 = np.where(self.SQUARE == letter) indexes = np.concatenate([index1 + 1, index2 + 1]) return indexes def numbers_to_letter(self, index1: int, index2: int) -> str: """ Return the letter corresponding to the position [index1, index2] in the polybius square >>> BifidCipher().numbers_to_letter(4, 5) == "u" True >>> BifidCipher().numbers_to_letter(1, 1) == "a" True """ letter = self.SQUARE[index1 - 1, index2 - 1] return letter def encode(self, message: str) -> str: """ Return the encoded version of message according to the polybius cipher >>> BifidCipher().encode('testmessage') == 'qtltbdxrxlk' True >>> BifidCipher().encode('Test Message') == 'qtltbdxrxlk' True >>> BifidCipher().encode('test j') == BifidCipher().encode('test i') True """ message = message.lower() message = message.replace(" ", "") message = message.replace("j", "i") first_step = np.empty((2, len(message))) for letter_index in range(len(message)): numbers = self.letter_to_numbers(message[letter_index]) first_step[0, letter_index] = numbers[0] first_step[1, letter_index] = numbers[1] second_step = first_step.reshape(2 * len(message)) encoded_message = "" for numbers_index in range(len(message)): index1 = int(second_step[numbers_index * 2]) index2 = int(second_step[(numbers_index * 2) + 1]) letter = self.numbers_to_letter(index1, index2) encoded_message = encoded_message + letter return encoded_message def decode(self, message: str) -> str: """ Return the decoded version of message according to the polybius cipher >>> BifidCipher().decode('qtltbdxrxlk') == 'testmessage' True """ message = message.lower() message.replace(" ", "") first_step = np.empty(2 * len(message)) for letter_index in range(len(message)): numbers = self.letter_to_numbers(message[letter_index]) first_step[letter_index * 2] = numbers[0] first_step[letter_index * 2 + 1] = numbers[1] second_step = first_step.reshape((2, len(message))) decoded_message = "" for numbers_index in range(len(message)): index1 = int(second_step[0, numbers_index]) index2 = int(second_step[1, numbers_index]) letter = self.numbers_to_letter(index1, index2) decoded_message = decoded_message + letter return decoded_message
#!/usr/bin/env python3 """ The Bifid Cipher uses a Polybius Square to encipher a message in a way that makes it fairly difficult to decipher without knowing the secret. https://www.braingle.com/brainteasers/codes/bifid.php """ import numpy as np SQUARE = [ ["a", "b", "c", "d", "e"], ["f", "g", "h", "i", "k"], ["l", "m", "n", "o", "p"], ["q", "r", "s", "t", "u"], ["v", "w", "x", "y", "z"], ] class BifidCipher: def __init__(self) -> None: self.SQUARE = np.array(SQUARE) def letter_to_numbers(self, letter: str) -> np.ndarray: """ Return the pair of numbers that represents the given letter in the polybius square >>> np.array_equal(BifidCipher().letter_to_numbers('a'), [1,1]) True >>> np.array_equal(BifidCipher().letter_to_numbers('u'), [4,5]) True """ index1, index2 = np.where(self.SQUARE == letter) indexes = np.concatenate([index1 + 1, index2 + 1]) return indexes def numbers_to_letter(self, index1: int, index2: int) -> str: """ Return the letter corresponding to the position [index1, index2] in the polybius square >>> BifidCipher().numbers_to_letter(4, 5) == "u" True >>> BifidCipher().numbers_to_letter(1, 1) == "a" True """ letter = self.SQUARE[index1 - 1, index2 - 1] return letter def encode(self, message: str) -> str: """ Return the encoded version of message according to the polybius cipher >>> BifidCipher().encode('testmessage') == 'qtltbdxrxlk' True >>> BifidCipher().encode('Test Message') == 'qtltbdxrxlk' True >>> BifidCipher().encode('test j') == BifidCipher().encode('test i') True """ message = message.lower() message = message.replace(" ", "") message = message.replace("j", "i") first_step = np.empty((2, len(message))) for letter_index in range(len(message)): numbers = self.letter_to_numbers(message[letter_index]) first_step[0, letter_index] = numbers[0] first_step[1, letter_index] = numbers[1] second_step = first_step.reshape(2 * len(message)) encoded_message = "" for numbers_index in range(len(message)): index1 = int(second_step[numbers_index * 2]) index2 = int(second_step[(numbers_index * 2) + 1]) letter = self.numbers_to_letter(index1, index2) encoded_message = encoded_message + letter return encoded_message def decode(self, message: str) -> str: """ Return the decoded version of message according to the polybius cipher >>> BifidCipher().decode('qtltbdxrxlk') == 'testmessage' True """ message = message.lower() message.replace(" ", "") first_step = np.empty(2 * len(message)) for letter_index in range(len(message)): numbers = self.letter_to_numbers(message[letter_index]) first_step[letter_index * 2] = numbers[0] first_step[letter_index * 2 + 1] = numbers[1] second_step = first_step.reshape((2, len(message))) decoded_message = "" for numbers_index in range(len(message)): index1 = int(second_step[0, numbers_index]) index2 = int(second_step[1, numbers_index]) letter = self.numbers_to_letter(index1, index2) decoded_message = decoded_message + letter return decoded_message
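A brief usage sketch for the BifidCipher class above; the plaintext/ciphertext pair is taken directly from the doctests, while the variable names are illustrative:

cipher = BifidCipher()
ciphertext = cipher.encode("Test Message")  # case is lowered and spaces removed
print(ciphertext)                           # 'qtltbdxrxlk'
print(cipher.decode("qtltbdxrxlk"))         # 'testmessage'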
-1
TheAlgorithms/Python
7,867
Update docs
### Describe your change: - Update docs to Python 3.10+ to match Py upgrade settings in pre-commit (`--py310-plus`) - Remove Exclusion from pre-commit-setting, this file have been fixed in #7844 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
Cjkjvfnby
"2022-10-30T08:30:31Z"
"2022-10-30T09:01:59Z"
17d93cab783095dd1def3c382866cd94296db455
57ccabbaeb0f32165271e3a218bc9c6dcfc21823
Update docs. ### Describe your change: - Update docs to Python 3.10+ to match Py upgrade settings in pre-commit (`--py310-plus`) - Remove Exclusion from pre-commit-setting, this file have been fixed in #7844 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
# https://en.wikipedia.org/wiki/Fizz_buzz#Programming def fizz_buzz(number: int, iterations: int) -> str: """ Plays FizzBuzz. Prints Fizz if number is a multiple of 3. Prints Buzz if it's a multiple of 5. Prints FizzBuzz if it's a multiple of both 3 and 5 or 15. Else Prints The Number Itself. >>> fizz_buzz(1,7) '1 2 Fizz 4 Buzz Fizz 7 ' >>> fizz_buzz(1,0) Traceback (most recent call last): ... ValueError: Iterations must be done more than 0 times to play FizzBuzz >>> fizz_buzz(-5,5) Traceback (most recent call last): ... ValueError: starting number must be an integer and be more than 0 >>> fizz_buzz(10,-5) Traceback (most recent call last): ... ValueError: Iterations must be done more than 0 times to play FizzBuzz >>> fizz_buzz(1.5,5) Traceback (most recent call last): ... ValueError: starting number must be an integer and be more than 0 >>> fizz_buzz(1,5.5) Traceback (most recent call last): ... ValueError: iterations must be defined as integers """ if not isinstance(iterations, int): raise ValueError("iterations must be defined as integers") if not isinstance(number, int) or number < 1: raise ValueError( """starting number must be an integer and be more than 0""" ) if iterations < 1: raise ValueError("Iterations must be done more than 0 times to play FizzBuzz") out = "" while number <= iterations: if number % 3 == 0: out += "Fizz" if number % 5 == 0: out += "Buzz" if number % 3 != 0 and number % 5 != 0: out += str(number) # print(out) number += 1 out += " " return out if __name__ == "__main__": import doctest doctest.testmod()
# https://en.wikipedia.org/wiki/Fizz_buzz#Programming def fizz_buzz(number: int, iterations: int) -> str: """ Plays FizzBuzz. Prints Fizz if number is a multiple of 3. Prints Buzz if it's a multiple of 5. Prints FizzBuzz if it's a multiple of both 3 and 5 or 15. Else Prints The Number Itself. >>> fizz_buzz(1,7) '1 2 Fizz 4 Buzz Fizz 7 ' >>> fizz_buzz(1,0) Traceback (most recent call last): ... ValueError: Iterations must be done more than 0 times to play FizzBuzz >>> fizz_buzz(-5,5) Traceback (most recent call last): ... ValueError: starting number must be an integer and be more than 0 >>> fizz_buzz(10,-5) Traceback (most recent call last): ... ValueError: Iterations must be done more than 0 times to play FizzBuzz >>> fizz_buzz(1.5,5) Traceback (most recent call last): ... ValueError: starting number must be an integer and be more than 0 >>> fizz_buzz(1,5.5) Traceback (most recent call last): ... ValueError: iterations must be defined as integers """ if not isinstance(iterations, int): raise ValueError("iterations must be defined as integers") if not isinstance(number, int) or number < 1: raise ValueError( """starting number must be an integer and be more than 0""" ) if iterations < 1: raise ValueError("Iterations must be done more than 0 times to play FizzBuzz") out = "" while number <= iterations: if number % 3 == 0: out += "Fizz" if number % 5 == 0: out += "Buzz" if number % 3 != 0 and number % 5 != 0: out += str(number) # print(out) number += 1 out += " " return out if __name__ == "__main__": import doctest doctest.testmod()
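A quick usage sketch of fizz_buzz as defined above; the expected string follows the same format as the fizz_buzz(1,7) doctest, with a trailing space after every token:

print(fizz_buzz(1, 15))
# '1 2 Fizz 4 Buzz Fizz 7 8 Fizz Buzz 11 Fizz 13 14 FizzBuzz '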
-1
TheAlgorithms/Python
7,867
Update docs
### Describe your change: - Update docs to Python 3.10+ to match Py upgrade settings in pre-commit (`--py310-plus`) - Remove Exclusion from pre-commit-setting, this file have been fixed in #7844 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
Cjkjvfnby
"2022-10-30T08:30:31Z"
"2022-10-30T09:01:59Z"
17d93cab783095dd1def3c382866cd94296db455
57ccabbaeb0f32165271e3a218bc9c6dcfc21823
Update docs. ### Describe your change: - Update docs to Python 3.10+ to match Py upgrade settings in pre-commit (`--py310-plus`) - Remove Exclusion from pre-commit-setting, this file have been fixed in #7844 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
""" Ugly numbers are numbers whose only prime factors are 2, 3 or 5. The sequence 1, 2, 3, 4, 5, 6, 8, 9, 10, 12, 15, … shows the first 11 ugly numbers. By convention, 1 is included. Given an integer n, we have to find the nth ugly number. For more details, refer this article https://www.geeksforgeeks.org/ugly-numbers/ """ def ugly_numbers(n: int) -> int: """ Returns the nth ugly number. >>> ugly_numbers(100) 1536 >>> ugly_numbers(0) 1 >>> ugly_numbers(20) 36 >>> ugly_numbers(-5) 1 >>> ugly_numbers(-5.5) Traceback (most recent call last): ... TypeError: 'float' object cannot be interpreted as an integer """ ugly_nums = [1] i2, i3, i5 = 0, 0, 0 next_2 = ugly_nums[i2] * 2 next_3 = ugly_nums[i3] * 3 next_5 = ugly_nums[i5] * 5 for _ in range(1, n): next_num = min(next_2, next_3, next_5) ugly_nums.append(next_num) if next_num == next_2: i2 += 1 next_2 = ugly_nums[i2] * 2 if next_num == next_3: i3 += 1 next_3 = ugly_nums[i3] * 3 if next_num == next_5: i5 += 1 next_5 = ugly_nums[i5] * 5 return ugly_nums[-1] if __name__ == "__main__": from doctest import testmod testmod(verbose=True) print(f"{ugly_numbers(200) = }")
""" Ugly numbers are numbers whose only prime factors are 2, 3 or 5. The sequence 1, 2, 3, 4, 5, 6, 8, 9, 10, 12, 15, … shows the first 11 ugly numbers. By convention, 1 is included. Given an integer n, we have to find the nth ugly number. For more details, refer this article https://www.geeksforgeeks.org/ugly-numbers/ """ def ugly_numbers(n: int) -> int: """ Returns the nth ugly number. >>> ugly_numbers(100) 1536 >>> ugly_numbers(0) 1 >>> ugly_numbers(20) 36 >>> ugly_numbers(-5) 1 >>> ugly_numbers(-5.5) Traceback (most recent call last): ... TypeError: 'float' object cannot be interpreted as an integer """ ugly_nums = [1] i2, i3, i5 = 0, 0, 0 next_2 = ugly_nums[i2] * 2 next_3 = ugly_nums[i3] * 3 next_5 = ugly_nums[i5] * 5 for _ in range(1, n): next_num = min(next_2, next_3, next_5) ugly_nums.append(next_num) if next_num == next_2: i2 += 1 next_2 = ugly_nums[i2] * 2 if next_num == next_3: i3 += 1 next_3 = ugly_nums[i3] * 3 if next_num == next_5: i5 += 1 next_5 = ugly_nums[i5] * 5 return ugly_nums[-1] if __name__ == "__main__": from doctest import testmod testmod(verbose=True) print(f"{ugly_numbers(200) = }")
-1
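As a hedged cross-check of the definition quoted in the module docstring above (an illustrative sketch only; is_ugly is a hypothetical helper, not part of the PR), a brute-force predicate that strips factors of 2, 3 and 5 reproduces the listed sequence.

def is_ugly(num: int) -> bool:
    # An ugly number has no prime factors other than 2, 3 and 5.
    if num <= 0:
        return False
    for factor in (2, 3, 5):
        while num % factor == 0:
            num //= factor
    return num == 1


if __name__ == "__main__":
    # Reproduces the first 11 ugly numbers listed in the module docstring.
    print([n for n in range(1, 16) if is_ugly(n)])
    # [1, 2, 3, 4, 5, 6, 8, 9, 10, 12, 15]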
TheAlgorithms/Python
7,867
Update docs
### Describe your change: - Update docs to Python 3.10+ to match Py upgrade settings in pre-commit (`--py310-plus`) - Remove Exclusion from pre-commit-setting, this file have been fixed in #7844 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
Cjkjvfnby
"2022-10-30T08:30:31Z"
"2022-10-30T09:01:59Z"
17d93cab783095dd1def3c382866cd94296db455
57ccabbaeb0f32165271e3a218bc9c6dcfc21823
Update docs. ### Describe your change: - Update docs to Python 3.10+ to match Py upgrade settings in pre-commit (`--py310-plus`) - Remove Exclusion from pre-commit-setting, this file have been fixed in #7844 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
-1
TheAlgorithms/Python
7,867
Update docs
### Describe your change: - Update docs to Python 3.10+ to match Py upgrade settings in pre-commit (`--py310-plus`) - Remove Exclusion from pre-commit-setting, this file have been fixed in #7844 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
Cjkjvfnby
"2022-10-30T08:30:31Z"
"2022-10-30T09:01:59Z"
17d93cab783095dd1def3c382866cd94296db455
57ccabbaeb0f32165271e3a218bc9c6dcfc21823
Update docs. ### Describe your change: - Update docs to Python 3.10+ to match Py upgrade settings in pre-commit (`--py310-plus`) - Remove Exclusion from pre-commit-setting, this file have been fixed in #7844 * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
# Created by susmith98

from collections import Counter
from timeit import timeit

# Problem Description:
# Check if characters of the given string can be rearranged to form a palindrome.
# Counter is faster for long strings and non-Counter is faster for short strings.


def can_string_be_rearranged_as_palindrome_counter(
    input_str: str = "",
) -> bool:
    """
    A Palindrome is a String that reads the same forward as it does backwards.
    Examples of Palindromes: mom, dad, malayalam
    >>> can_string_be_rearranged_as_palindrome_counter("Momo")
    True
    >>> can_string_be_rearranged_as_palindrome_counter("Mother")
    False
    >>> can_string_be_rearranged_as_palindrome_counter("Father")
    False
    >>> can_string_be_rearranged_as_palindrome_counter("A man a plan a canal Panama")
    True
    """
    return sum(c % 2 for c in Counter(input_str.replace(" ", "").lower()).values()) < 2


def can_string_be_rearranged_as_palindrome(input_str: str = "") -> bool:
    """
    A Palindrome is a String that reads the same forward as it does backwards.
    Examples of Palindromes: mom, dad, malayalam
    >>> can_string_be_rearranged_as_palindrome("Momo")
    True
    >>> can_string_be_rearranged_as_palindrome("Mother")
    False
    >>> can_string_be_rearranged_as_palindrome("Father")
    False
    >>> can_string_be_rearranged_as_palindrome("A man a plan a canal Panama")
    True
    """
    if len(input_str) == 0:
        return True
    lower_case_input_str = input_str.replace(" ", "").lower()
    # character_freq_dict: Stores the frequency of every character in the input string
    character_freq_dict: dict[str, int] = {}

    for character in lower_case_input_str:
        character_freq_dict[character] = character_freq_dict.get(character, 0) + 1
    """
    Above line of code is equivalent to:
    1) Getting the frequency of current character till previous index
    >>> character_freq = character_freq_dict.get(character, 0)
    2) Incrementing the frequency of current character by 1
    >>> character_freq = character_freq + 1
    3) Updating the frequency of current character
    >>> character_freq_dict[character] = character_freq
    """
    """
    OBSERVATIONS:
    Even length palindrome -> Every character appears even no. of times.
    Odd length palindrome -> Every character appears even no. of times
    except for one character.
    LOGIC:
    Step 1: We'll count the number of characters that appear odd number of times,
    i.e. odd_char
    Step 2: If we find more than 1 character that appears odd number of times,
    it is not possible to rearrange as a palindrome
    """
    odd_char = 0

    for character_count in character_freq_dict.values():
        if character_count % 2:
            odd_char += 1
    if odd_char > 1:
        return False
    return True


def benchmark(input_str: str = "") -> None:
    """
    Benchmark code for comparing above 2 functions
    """
    print("\nFor string = ", input_str, ":")
    print(
        "> can_string_be_rearranged_as_palindrome_counter()",
        "\tans =",
        can_string_be_rearranged_as_palindrome_counter(input_str),
        "\ttime =",
        timeit(
            "z.can_string_be_rearranged_as_palindrome_counter(z.check_str)",
            setup="import __main__ as z",
        ),
        "seconds",
    )
    print(
        "> can_string_be_rearranged_as_palindrome()",
        "\tans =",
        can_string_be_rearranged_as_palindrome(input_str),
        "\ttime =",
        timeit(
            "z.can_string_be_rearranged_as_palindrome(z.check_str)",
            setup="import __main__ as z",
        ),
        "seconds",
    )


if __name__ == "__main__":
    check_str = input(
        "Enter string to determine if it can be rearranged as a palindrome or not: "
    ).strip()
    benchmark(check_str)
    status = can_string_be_rearranged_as_palindrome_counter(check_str)
    print(f"{check_str} can {'' if status else 'not '}be rearranged as a palindrome")
# Created by susmith98

from collections import Counter
from timeit import timeit

# Problem Description:
# Check if characters of the given string can be rearranged to form a palindrome.
# Counter is faster for long strings and non-Counter is faster for short strings.


def can_string_be_rearranged_as_palindrome_counter(
    input_str: str = "",
) -> bool:
    """
    A Palindrome is a String that reads the same forward as it does backwards.
    Examples of Palindromes: mom, dad, malayalam
    >>> can_string_be_rearranged_as_palindrome_counter("Momo")
    True
    >>> can_string_be_rearranged_as_palindrome_counter("Mother")
    False
    >>> can_string_be_rearranged_as_palindrome_counter("Father")
    False
    >>> can_string_be_rearranged_as_palindrome_counter("A man a plan a canal Panama")
    True
    """
    return sum(c % 2 for c in Counter(input_str.replace(" ", "").lower()).values()) < 2


def can_string_be_rearranged_as_palindrome(input_str: str = "") -> bool:
    """
    A Palindrome is a String that reads the same forward as it does backwards.
    Examples of Palindromes: mom, dad, malayalam
    >>> can_string_be_rearranged_as_palindrome("Momo")
    True
    >>> can_string_be_rearranged_as_palindrome("Mother")
    False
    >>> can_string_be_rearranged_as_palindrome("Father")
    False
    >>> can_string_be_rearranged_as_palindrome("A man a plan a canal Panama")
    True
    """
    if len(input_str) == 0:
        return True
    lower_case_input_str = input_str.replace(" ", "").lower()
    # character_freq_dict: Stores the frequency of every character in the input string
    character_freq_dict: dict[str, int] = {}

    for character in lower_case_input_str:
        character_freq_dict[character] = character_freq_dict.get(character, 0) + 1
    """
    Above line of code is equivalent to:
    1) Getting the frequency of current character till previous index
    >>> character_freq = character_freq_dict.get(character, 0)
    2) Incrementing the frequency of current character by 1
    >>> character_freq = character_freq + 1
    3) Updating the frequency of current character
    >>> character_freq_dict[character] = character_freq
    """
    """
    OBSERVATIONS:
    Even length palindrome -> Every character appears even no. of times.
    Odd length palindrome -> Every character appears even no. of times
    except for one character.
    LOGIC:
    Step 1: We'll count the number of characters that appear odd number of times,
    i.e. odd_char
    Step 2: If we find more than 1 character that appears odd number of times,
    it is not possible to rearrange as a palindrome
    """
    odd_char = 0

    for character_count in character_freq_dict.values():
        if character_count % 2:
            odd_char += 1
    if odd_char > 1:
        return False
    return True


def benchmark(input_str: str = "") -> None:
    """
    Benchmark code for comparing above 2 functions
    """
    print("\nFor string = ", input_str, ":")
    print(
        "> can_string_be_rearranged_as_palindrome_counter()",
        "\tans =",
        can_string_be_rearranged_as_palindrome_counter(input_str),
        "\ttime =",
        timeit(
            "z.can_string_be_rearranged_as_palindrome_counter(z.check_str)",
            setup="import __main__ as z",
        ),
        "seconds",
    )
    print(
        "> can_string_be_rearranged_as_palindrome()",
        "\tans =",
        can_string_be_rearranged_as_palindrome(input_str),
        "\ttime =",
        timeit(
            "z.can_string_be_rearranged_as_palindrome(z.check_str)",
            setup="import __main__ as z",
        ),
        "seconds",
    )


if __name__ == "__main__":
    check_str = input(
        "Enter string to determine if it can be rearranged as a palindrome or not: "
    ).strip()
    benchmark(check_str)
    status = can_string_be_rearranged_as_palindrome_counter(check_str)
    print(f"{check_str} can {'' if status else 'not '}be rearranged as a palindrome")
-1
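A minimal standalone sketch of the odd-count argument used by both functions above (has_palindrome_permutation is a hypothetical name, not taken from the record): a string can be rearranged into a palindrome exactly when at most one character occurs an odd number of times, ignoring spaces and case.

from collections import Counter


def has_palindrome_permutation(text: str) -> bool:
    # Ignore spaces and case, then count how many characters have odd frequency.
    counts = Counter(text.replace(" ", "").lower())
    return sum(count % 2 for count in counts.values()) <= 1


if __name__ == "__main__":
    print(has_palindrome_permutation("Momo"))    # True
    print(has_palindrome_permutation("Mother"))  # False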
TheAlgorithms/Python
7,844
Remove file-level flake8 suppression
### Describe your change: Some files were completely excluded from flake8 checks. - Remove file-level suppressions - Fix simple warnings - Suppress long line indocstring * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
Cjkjvfnby
"2022-10-29T13:08:44Z"
"2022-10-29T20:45:21Z"
47ddba1d914bf5955a244056e794e718dee9ead1
1550731cb7457ddae216da2ffe0bc1587f5234f3
Remove file-level flake8 suppression. ### Describe your change: Some files were completely excluded from flake8 checks. - Remove file-level suppressions - Fix simple warnings - Suppress long line indocstring * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
## Arithmetic Analysis * [Bisection](arithmetic_analysis/bisection.py) * [Gaussian Elimination](arithmetic_analysis/gaussian_elimination.py) * [In Static Equilibrium](arithmetic_analysis/in_static_equilibrium.py) * [Intersection](arithmetic_analysis/intersection.py) * [Jacobi Iteration Method](arithmetic_analysis/jacobi_iteration_method.py) * [Lu Decomposition](arithmetic_analysis/lu_decomposition.py) * [Newton Forward Interpolation](arithmetic_analysis/newton_forward_interpolation.py) * [Newton Method](arithmetic_analysis/newton_method.py) * [Newton Raphson](arithmetic_analysis/newton_raphson.py) * [Newton Raphson New](arithmetic_analysis/newton_raphson_new.py) * [Secant Method](arithmetic_analysis/secant_method.py) ## Audio Filters * [Butterworth Filter](audio_filters/butterworth_filter.py) * [Equal Loudness Filter](audio_filters/equal_loudness_filter.py) * [Iir Filter](audio_filters/iir_filter.py) * [Show Response](audio_filters/show_response.py) ## Backtracking * [All Combinations](backtracking/all_combinations.py) * [All Permutations](backtracking/all_permutations.py) * [All Subsequences](backtracking/all_subsequences.py) * [Coloring](backtracking/coloring.py) * [Combination Sum](backtracking/combination_sum.py) * [Hamiltonian Cycle](backtracking/hamiltonian_cycle.py) * [Knight Tour](backtracking/knight_tour.py) * [Minimax](backtracking/minimax.py) * [Minmax](backtracking/minmax.py) * [N Queens](backtracking/n_queens.py) * [N Queens Math](backtracking/n_queens_math.py) * [Rat In Maze](backtracking/rat_in_maze.py) * [Sudoku](backtracking/sudoku.py) * [Sum Of Subsets](backtracking/sum_of_subsets.py) ## Bit Manipulation * [Binary And Operator](bit_manipulation/binary_and_operator.py) * [Binary Count Setbits](bit_manipulation/binary_count_setbits.py) * [Binary Count Trailing Zeros](bit_manipulation/binary_count_trailing_zeros.py) * [Binary Or Operator](bit_manipulation/binary_or_operator.py) * [Binary Shifts](bit_manipulation/binary_shifts.py) * [Binary Twos Complement](bit_manipulation/binary_twos_complement.py) * [Binary Xor Operator](bit_manipulation/binary_xor_operator.py) * [Count 1S Brian Kernighan Method](bit_manipulation/count_1s_brian_kernighan_method.py) * [Count Number Of One Bits](bit_manipulation/count_number_of_one_bits.py) * [Gray Code Sequence](bit_manipulation/gray_code_sequence.py) * [Highest Set Bit](bit_manipulation/highest_set_bit.py) * [Is Even](bit_manipulation/is_even.py) * [Reverse Bits](bit_manipulation/reverse_bits.py) * [Single Bit Manipulation Operations](bit_manipulation/single_bit_manipulation_operations.py) ## Blockchain * [Chinese Remainder Theorem](blockchain/chinese_remainder_theorem.py) * [Diophantine Equation](blockchain/diophantine_equation.py) * [Modular Division](blockchain/modular_division.py) ## Boolean Algebra * [And Gate](boolean_algebra/and_gate.py) * [Nand Gate](boolean_algebra/nand_gate.py) * [Norgate](boolean_algebra/norgate.py) * [Not Gate](boolean_algebra/not_gate.py) * [Or Gate](boolean_algebra/or_gate.py) * [Quine Mc Cluskey](boolean_algebra/quine_mc_cluskey.py) * [Xnor Gate](boolean_algebra/xnor_gate.py) * [Xor Gate](boolean_algebra/xor_gate.py) ## Cellular Automata * [Conways Game Of Life](cellular_automata/conways_game_of_life.py) * [Game Of Life](cellular_automata/game_of_life.py) * [Nagel Schrekenberg](cellular_automata/nagel_schrekenberg.py) * [One Dimensional](cellular_automata/one_dimensional.py) ## Ciphers * [A1Z26](ciphers/a1z26.py) * [Affine Cipher](ciphers/affine_cipher.py) * [Atbash](ciphers/atbash.py) * [Baconian 
Cipher](ciphers/baconian_cipher.py) * [Base16](ciphers/base16.py) * [Base32](ciphers/base32.py) * [Base64](ciphers/base64.py) * [Base85](ciphers/base85.py) * [Beaufort Cipher](ciphers/beaufort_cipher.py) * [Bifid](ciphers/bifid.py) * [Brute Force Caesar Cipher](ciphers/brute_force_caesar_cipher.py) * [Caesar Cipher](ciphers/caesar_cipher.py) * [Cryptomath Module](ciphers/cryptomath_module.py) * [Decrypt Caesar With Chi Squared](ciphers/decrypt_caesar_with_chi_squared.py) * [Deterministic Miller Rabin](ciphers/deterministic_miller_rabin.py) * [Diffie](ciphers/diffie.py) * [Diffie Hellman](ciphers/diffie_hellman.py) * [Elgamal Key Generator](ciphers/elgamal_key_generator.py) * [Enigma Machine2](ciphers/enigma_machine2.py) * [Hill Cipher](ciphers/hill_cipher.py) * [Mixed Keyword Cypher](ciphers/mixed_keyword_cypher.py) * [Mono Alphabetic Ciphers](ciphers/mono_alphabetic_ciphers.py) * [Morse Code](ciphers/morse_code.py) * [Onepad Cipher](ciphers/onepad_cipher.py) * [Playfair Cipher](ciphers/playfair_cipher.py) * [Polybius](ciphers/polybius.py) * [Porta Cipher](ciphers/porta_cipher.py) * [Rabin Miller](ciphers/rabin_miller.py) * [Rail Fence Cipher](ciphers/rail_fence_cipher.py) * [Rot13](ciphers/rot13.py) * [Rsa Cipher](ciphers/rsa_cipher.py) * [Rsa Factorization](ciphers/rsa_factorization.py) * [Rsa Key Generator](ciphers/rsa_key_generator.py) * [Shuffled Shift Cipher](ciphers/shuffled_shift_cipher.py) * [Simple Keyword Cypher](ciphers/simple_keyword_cypher.py) * [Simple Substitution Cipher](ciphers/simple_substitution_cipher.py) * [Trafid Cipher](ciphers/trafid_cipher.py) * [Transposition Cipher](ciphers/transposition_cipher.py) * [Transposition Cipher Encrypt Decrypt File](ciphers/transposition_cipher_encrypt_decrypt_file.py) * [Vigenere Cipher](ciphers/vigenere_cipher.py) * [Xor Cipher](ciphers/xor_cipher.py) ## Compression * [Burrows Wheeler](compression/burrows_wheeler.py) * [Huffman](compression/huffman.py) * [Lempel Ziv](compression/lempel_ziv.py) * [Lempel Ziv Decompress](compression/lempel_ziv_decompress.py) * [Peak Signal To Noise Ratio](compression/peak_signal_to_noise_ratio.py) * [Run Length Encoding](compression/run_length_encoding.py) ## Computer Vision * [Cnn Classification](computer_vision/cnn_classification.py) * [Flip Augmentation](computer_vision/flip_augmentation.py) * [Harris Corner](computer_vision/harris_corner.py) * [Horn Schunck](computer_vision/horn_schunck.py) * [Mean Threshold](computer_vision/mean_threshold.py) * [Mosaic Augmentation](computer_vision/mosaic_augmentation.py) * [Pooling Functions](computer_vision/pooling_functions.py) ## Conversions * [Astronomical Length Scale Conversion](conversions/astronomical_length_scale_conversion.py) * [Binary To Decimal](conversions/binary_to_decimal.py) * [Binary To Hexadecimal](conversions/binary_to_hexadecimal.py) * [Binary To Octal](conversions/binary_to_octal.py) * [Decimal To Any](conversions/decimal_to_any.py) * [Decimal To Binary](conversions/decimal_to_binary.py) * [Decimal To Binary Recursion](conversions/decimal_to_binary_recursion.py) * [Decimal To Hexadecimal](conversions/decimal_to_hexadecimal.py) * [Decimal To Octal](conversions/decimal_to_octal.py) * [Excel Title To Column](conversions/excel_title_to_column.py) * [Hex To Bin](conversions/hex_to_bin.py) * [Hexadecimal To Decimal](conversions/hexadecimal_to_decimal.py) * [Length Conversion](conversions/length_conversion.py) * [Molecular Chemistry](conversions/molecular_chemistry.py) * [Octal To Decimal](conversions/octal_to_decimal.py) * [Prefix 
Conversions](conversions/prefix_conversions.py) * [Prefix Conversions String](conversions/prefix_conversions_string.py) * [Pressure Conversions](conversions/pressure_conversions.py) * [Rgb Hsv Conversion](conversions/rgb_hsv_conversion.py) * [Roman Numerals](conversions/roman_numerals.py) * [Speed Conversions](conversions/speed_conversions.py) * [Temperature Conversions](conversions/temperature_conversions.py) * [Volume Conversions](conversions/volume_conversions.py) * [Weight Conversion](conversions/weight_conversion.py) ## Data Structures * Binary Tree * [Avl Tree](data_structures/binary_tree/avl_tree.py) * [Basic Binary Tree](data_structures/binary_tree/basic_binary_tree.py) * [Binary Search Tree](data_structures/binary_tree/binary_search_tree.py) * [Binary Search Tree Recursive](data_structures/binary_tree/binary_search_tree_recursive.py) * [Binary Tree Mirror](data_structures/binary_tree/binary_tree_mirror.py) * [Binary Tree Node Sum](data_structures/binary_tree/binary_tree_node_sum.py) * [Binary Tree Path Sum](data_structures/binary_tree/binary_tree_path_sum.py) * [Binary Tree Traversals](data_structures/binary_tree/binary_tree_traversals.py) * [Diff Views Of Binary Tree](data_structures/binary_tree/diff_views_of_binary_tree.py) * [Fenwick Tree](data_structures/binary_tree/fenwick_tree.py) * [Inorder Tree Traversal 2022](data_structures/binary_tree/inorder_tree_traversal_2022.py) * [Lazy Segment Tree](data_structures/binary_tree/lazy_segment_tree.py) * [Lowest Common Ancestor](data_structures/binary_tree/lowest_common_ancestor.py) * [Maximum Fenwick Tree](data_structures/binary_tree/maximum_fenwick_tree.py) * [Merge Two Binary Trees](data_structures/binary_tree/merge_two_binary_trees.py) * [Non Recursive Segment Tree](data_structures/binary_tree/non_recursive_segment_tree.py) * [Number Of Possible Binary Trees](data_structures/binary_tree/number_of_possible_binary_trees.py) * [Red Black Tree](data_structures/binary_tree/red_black_tree.py) * [Segment Tree](data_structures/binary_tree/segment_tree.py) * [Segment Tree Other](data_structures/binary_tree/segment_tree_other.py) * [Treap](data_structures/binary_tree/treap.py) * [Wavelet Tree](data_structures/binary_tree/wavelet_tree.py) * Disjoint Set * [Alternate Disjoint Set](data_structures/disjoint_set/alternate_disjoint_set.py) * [Disjoint Set](data_structures/disjoint_set/disjoint_set.py) * Hashing * [Double Hash](data_structures/hashing/double_hash.py) * [Hash Table](data_structures/hashing/hash_table.py) * [Hash Table With Linked List](data_structures/hashing/hash_table_with_linked_list.py) * Number Theory * [Prime Numbers](data_structures/hashing/number_theory/prime_numbers.py) * [Quadratic Probing](data_structures/hashing/quadratic_probing.py) * Heap * [Binomial Heap](data_structures/heap/binomial_heap.py) * [Heap](data_structures/heap/heap.py) * [Heap Generic](data_structures/heap/heap_generic.py) * [Max Heap](data_structures/heap/max_heap.py) * [Min Heap](data_structures/heap/min_heap.py) * [Randomized Heap](data_structures/heap/randomized_heap.py) * [Skew Heap](data_structures/heap/skew_heap.py) * Linked List * [Circular Linked List](data_structures/linked_list/circular_linked_list.py) * [Deque Doubly](data_structures/linked_list/deque_doubly.py) * [Doubly Linked List](data_structures/linked_list/doubly_linked_list.py) * [Doubly Linked List Two](data_structures/linked_list/doubly_linked_list_two.py) * [From Sequence](data_structures/linked_list/from_sequence.py) * [Has Loop](data_structures/linked_list/has_loop.py) * [Is 
Palindrome](data_structures/linked_list/is_palindrome.py) * [Merge Two Lists](data_structures/linked_list/merge_two_lists.py) * [Middle Element Of Linked List](data_structures/linked_list/middle_element_of_linked_list.py) * [Print Reverse](data_structures/linked_list/print_reverse.py) * [Singly Linked List](data_structures/linked_list/singly_linked_list.py) * [Skip List](data_structures/linked_list/skip_list.py) * [Swap Nodes](data_structures/linked_list/swap_nodes.py) * Queue * [Circular Queue](data_structures/queue/circular_queue.py) * [Circular Queue Linked List](data_structures/queue/circular_queue_linked_list.py) * [Double Ended Queue](data_structures/queue/double_ended_queue.py) * [Linked Queue](data_structures/queue/linked_queue.py) * [Priority Queue Using List](data_structures/queue/priority_queue_using_list.py) * [Queue On List](data_structures/queue/queue_on_list.py) * [Queue On Pseudo Stack](data_structures/queue/queue_on_pseudo_stack.py) * Stacks * [Balanced Parentheses](data_structures/stacks/balanced_parentheses.py) * [Dijkstras Two Stack Algorithm](data_structures/stacks/dijkstras_two_stack_algorithm.py) * [Evaluate Postfix Notations](data_structures/stacks/evaluate_postfix_notations.py) * [Infix To Postfix Conversion](data_structures/stacks/infix_to_postfix_conversion.py) * [Infix To Prefix Conversion](data_structures/stacks/infix_to_prefix_conversion.py) * [Next Greater Element](data_structures/stacks/next_greater_element.py) * [Postfix Evaluation](data_structures/stacks/postfix_evaluation.py) * [Prefix Evaluation](data_structures/stacks/prefix_evaluation.py) * [Stack](data_structures/stacks/stack.py) * [Stack With Doubly Linked List](data_structures/stacks/stack_with_doubly_linked_list.py) * [Stack With Singly Linked List](data_structures/stacks/stack_with_singly_linked_list.py) * [Stock Span Problem](data_structures/stacks/stock_span_problem.py) * Trie * [Trie](data_structures/trie/trie.py) ## Digital Image Processing * [Change Brightness](digital_image_processing/change_brightness.py) * [Change Contrast](digital_image_processing/change_contrast.py) * [Convert To Negative](digital_image_processing/convert_to_negative.py) * Dithering * [Burkes](digital_image_processing/dithering/burkes.py) * Edge Detection * [Canny](digital_image_processing/edge_detection/canny.py) * Filters * [Bilateral Filter](digital_image_processing/filters/bilateral_filter.py) * [Convolve](digital_image_processing/filters/convolve.py) * [Gabor Filter](digital_image_processing/filters/gabor_filter.py) * [Gaussian Filter](digital_image_processing/filters/gaussian_filter.py) * [Local Binary Pattern](digital_image_processing/filters/local_binary_pattern.py) * [Median Filter](digital_image_processing/filters/median_filter.py) * [Sobel Filter](digital_image_processing/filters/sobel_filter.py) * Histogram Equalization * [Histogram Stretch](digital_image_processing/histogram_equalization/histogram_stretch.py) * [Index Calculation](digital_image_processing/index_calculation.py) * Morphological Operations * [Dilation Operation](digital_image_processing/morphological_operations/dilation_operation.py) * [Erosion Operation](digital_image_processing/morphological_operations/erosion_operation.py) * Resize * [Resize](digital_image_processing/resize/resize.py) * Rotation * [Rotation](digital_image_processing/rotation/rotation.py) * [Sepia](digital_image_processing/sepia.py) * [Test Digital Image Processing](digital_image_processing/test_digital_image_processing.py) ## Divide And Conquer * [Closest Pair Of 
Points](divide_and_conquer/closest_pair_of_points.py) * [Convex Hull](divide_and_conquer/convex_hull.py) * [Heaps Algorithm](divide_and_conquer/heaps_algorithm.py) * [Heaps Algorithm Iterative](divide_and_conquer/heaps_algorithm_iterative.py) * [Inversions](divide_and_conquer/inversions.py) * [Kth Order Statistic](divide_and_conquer/kth_order_statistic.py) * [Max Difference Pair](divide_and_conquer/max_difference_pair.py) * [Max Subarray Sum](divide_and_conquer/max_subarray_sum.py) * [Mergesort](divide_and_conquer/mergesort.py) * [Peak](divide_and_conquer/peak.py) * [Power](divide_and_conquer/power.py) * [Strassen Matrix Multiplication](divide_and_conquer/strassen_matrix_multiplication.py) ## Dynamic Programming * [Abbreviation](dynamic_programming/abbreviation.py) * [All Construct](dynamic_programming/all_construct.py) * [Bitmask](dynamic_programming/bitmask.py) * [Catalan Numbers](dynamic_programming/catalan_numbers.py) * [Climbing Stairs](dynamic_programming/climbing_stairs.py) * [Combination Sum Iv](dynamic_programming/combination_sum_iv.py) * [Edit Distance](dynamic_programming/edit_distance.py) * [Factorial](dynamic_programming/factorial.py) * [Fast Fibonacci](dynamic_programming/fast_fibonacci.py) * [Fibonacci](dynamic_programming/fibonacci.py) * [Floyd Warshall](dynamic_programming/floyd_warshall.py) * [Integer Partition](dynamic_programming/integer_partition.py) * [Iterating Through Submasks](dynamic_programming/iterating_through_submasks.py) * [Knapsack](dynamic_programming/knapsack.py) * [Longest Common Subsequence](dynamic_programming/longest_common_subsequence.py) * [Longest Common Substring](dynamic_programming/longest_common_substring.py) * [Longest Increasing Subsequence](dynamic_programming/longest_increasing_subsequence.py) * [Longest Increasing Subsequence O(Nlogn)](dynamic_programming/longest_increasing_subsequence_o(nlogn).py) * [Longest Sub Array](dynamic_programming/longest_sub_array.py) * [Matrix Chain Order](dynamic_programming/matrix_chain_order.py) * [Max Non Adjacent Sum](dynamic_programming/max_non_adjacent_sum.py) * [Max Sub Array](dynamic_programming/max_sub_array.py) * [Max Sum Contiguous Subsequence](dynamic_programming/max_sum_contiguous_subsequence.py) * [Minimum Coin Change](dynamic_programming/minimum_coin_change.py) * [Minimum Cost Path](dynamic_programming/minimum_cost_path.py) * [Minimum Partition](dynamic_programming/minimum_partition.py) * [Minimum Squares To Represent A Number](dynamic_programming/minimum_squares_to_represent_a_number.py) * [Minimum Steps To One](dynamic_programming/minimum_steps_to_one.py) * [Optimal Binary Search Tree](dynamic_programming/optimal_binary_search_tree.py) * [Rod Cutting](dynamic_programming/rod_cutting.py) * [Subset Generation](dynamic_programming/subset_generation.py) * [Sum Of Subset](dynamic_programming/sum_of_subset.py) ## Electronics * [Carrier Concentration](electronics/carrier_concentration.py) * [Coulombs Law](electronics/coulombs_law.py) * [Electric Power](electronics/electric_power.py) * [Ohms Law](electronics/ohms_law.py) ## File Transfer * [Receive File](file_transfer/receive_file.py) * [Send File](file_transfer/send_file.py) * Tests * [Test Send File](file_transfer/tests/test_send_file.py) ## Financial * [Equated Monthly Installments](financial/equated_monthly_installments.py) * [Interest](financial/interest.py) * [Price Plus Tax](financial/price_plus_tax.py) ## Fractals * [Julia Sets](fractals/julia_sets.py) * [Koch Snowflake](fractals/koch_snowflake.py) * [Mandelbrot](fractals/mandelbrot.py) * 
[Sierpinski Triangle](fractals/sierpinski_triangle.py) ## Fuzzy Logic * [Fuzzy Operations](fuzzy_logic/fuzzy_operations.py) ## Genetic Algorithm * [Basic String](genetic_algorithm/basic_string.py) ## Geodesy * [Haversine Distance](geodesy/haversine_distance.py) * [Lamberts Ellipsoidal Distance](geodesy/lamberts_ellipsoidal_distance.py) ## Graphics * [Bezier Curve](graphics/bezier_curve.py) * [Vector3 For 2D Rendering](graphics/vector3_for_2d_rendering.py) ## Graphs * [A Star](graphs/a_star.py) * [Articulation Points](graphs/articulation_points.py) * [Basic Graphs](graphs/basic_graphs.py) * [Bellman Ford](graphs/bellman_ford.py) * [Bidirectional A Star](graphs/bidirectional_a_star.py) * [Bidirectional Breadth First Search](graphs/bidirectional_breadth_first_search.py) * [Boruvka](graphs/boruvka.py) * [Breadth First Search](graphs/breadth_first_search.py) * [Breadth First Search 2](graphs/breadth_first_search_2.py) * [Breadth First Search Shortest Path](graphs/breadth_first_search_shortest_path.py) * [Breadth First Search Shortest Path 2](graphs/breadth_first_search_shortest_path_2.py) * [Breadth First Search Zero One Shortest Path](graphs/breadth_first_search_zero_one_shortest_path.py) * [Check Bipartite Graph Bfs](graphs/check_bipartite_graph_bfs.py) * [Check Bipartite Graph Dfs](graphs/check_bipartite_graph_dfs.py) * [Check Cycle](graphs/check_cycle.py) * [Connected Components](graphs/connected_components.py) * [Depth First Search](graphs/depth_first_search.py) * [Depth First Search 2](graphs/depth_first_search_2.py) * [Dijkstra](graphs/dijkstra.py) * [Dijkstra 2](graphs/dijkstra_2.py) * [Dijkstra Algorithm](graphs/dijkstra_algorithm.py) * [Dijkstra Alternate](graphs/dijkstra_alternate.py) * [Dinic](graphs/dinic.py) * [Directed And Undirected (Weighted) Graph](graphs/directed_and_undirected_(weighted)_graph.py) * [Edmonds Karp Multiple Source And Sink](graphs/edmonds_karp_multiple_source_and_sink.py) * [Eulerian Path And Circuit For Undirected Graph](graphs/eulerian_path_and_circuit_for_undirected_graph.py) * [Even Tree](graphs/even_tree.py) * [Finding Bridges](graphs/finding_bridges.py) * [Frequent Pattern Graph Miner](graphs/frequent_pattern_graph_miner.py) * [G Topological Sort](graphs/g_topological_sort.py) * [Gale Shapley Bigraph](graphs/gale_shapley_bigraph.py) * [Graph List](graphs/graph_list.py) * [Graph Matrix](graphs/graph_matrix.py) * [Graphs Floyd Warshall](graphs/graphs_floyd_warshall.py) * [Greedy Best First](graphs/greedy_best_first.py) * [Greedy Min Vertex Cover](graphs/greedy_min_vertex_cover.py) * [Kahns Algorithm Long](graphs/kahns_algorithm_long.py) * [Kahns Algorithm Topo](graphs/kahns_algorithm_topo.py) * [Karger](graphs/karger.py) * [Markov Chain](graphs/markov_chain.py) * [Matching Min Vertex Cover](graphs/matching_min_vertex_cover.py) * [Minimum Path Sum](graphs/minimum_path_sum.py) * [Minimum Spanning Tree Boruvka](graphs/minimum_spanning_tree_boruvka.py) * [Minimum Spanning Tree Kruskal](graphs/minimum_spanning_tree_kruskal.py) * [Minimum Spanning Tree Kruskal2](graphs/minimum_spanning_tree_kruskal2.py) * [Minimum Spanning Tree Prims](graphs/minimum_spanning_tree_prims.py) * [Minimum Spanning Tree Prims2](graphs/minimum_spanning_tree_prims2.py) * [Multi Heuristic Astar](graphs/multi_heuristic_astar.py) * [Page Rank](graphs/page_rank.py) * [Prim](graphs/prim.py) * [Random Graph Generator](graphs/random_graph_generator.py) * [Scc Kosaraju](graphs/scc_kosaraju.py) * [Strongly Connected Components](graphs/strongly_connected_components.py) * [Tarjans 
Scc](graphs/tarjans_scc.py) * Tests * [Test Min Spanning Tree Kruskal](graphs/tests/test_min_spanning_tree_kruskal.py) * [Test Min Spanning Tree Prim](graphs/tests/test_min_spanning_tree_prim.py) ## Greedy Methods * [Fractional Knapsack](greedy_methods/fractional_knapsack.py) * [Fractional Knapsack 2](greedy_methods/fractional_knapsack_2.py) * [Optimal Merge Pattern](greedy_methods/optimal_merge_pattern.py) ## Hashes * [Adler32](hashes/adler32.py) * [Chaos Machine](hashes/chaos_machine.py) * [Djb2](hashes/djb2.py) * [Enigma Machine](hashes/enigma_machine.py) * [Hamming Code](hashes/hamming_code.py) * [Luhn](hashes/luhn.py) * [Md5](hashes/md5.py) * [Sdbm](hashes/sdbm.py) * [Sha1](hashes/sha1.py) * [Sha256](hashes/sha256.py) ## Knapsack * [Greedy Knapsack](knapsack/greedy_knapsack.py) * [Knapsack](knapsack/knapsack.py) * Tests * [Test Greedy Knapsack](knapsack/tests/test_greedy_knapsack.py) * [Test Knapsack](knapsack/tests/test_knapsack.py) ## Linear Algebra * Src * [Conjugate Gradient](linear_algebra/src/conjugate_gradient.py) * [Lib](linear_algebra/src/lib.py) * [Polynom For Points](linear_algebra/src/polynom_for_points.py) * [Power Iteration](linear_algebra/src/power_iteration.py) * [Rayleigh Quotient](linear_algebra/src/rayleigh_quotient.py) * [Schur Complement](linear_algebra/src/schur_complement.py) * [Test Linear Algebra](linear_algebra/src/test_linear_algebra.py) * [Transformations 2D](linear_algebra/src/transformations_2d.py) ## Machine Learning * [Astar](machine_learning/astar.py) * [Data Transformations](machine_learning/data_transformations.py) * [Decision Tree](machine_learning/decision_tree.py) * Forecasting * [Run](machine_learning/forecasting/run.py) * [Gaussian Naive Bayes](machine_learning/gaussian_naive_bayes.py) * [Gradient Boosting Regressor](machine_learning/gradient_boosting_regressor.py) * [Gradient Descent](machine_learning/gradient_descent.py) * [K Means Clust](machine_learning/k_means_clust.py) * [K Nearest Neighbours](machine_learning/k_nearest_neighbours.py) * [Knn Sklearn](machine_learning/knn_sklearn.py) * [Linear Discriminant Analysis](machine_learning/linear_discriminant_analysis.py) * [Linear Regression](machine_learning/linear_regression.py) * Local Weighted Learning * [Local Weighted Learning](machine_learning/local_weighted_learning/local_weighted_learning.py) * [Logistic Regression](machine_learning/logistic_regression.py) * Lstm * [Lstm Prediction](machine_learning/lstm/lstm_prediction.py) * [Multilayer Perceptron Classifier](machine_learning/multilayer_perceptron_classifier.py) * [Polymonial Regression](machine_learning/polymonial_regression.py) * [Random Forest Classifier](machine_learning/random_forest_classifier.py) * [Random Forest Regressor](machine_learning/random_forest_regressor.py) * [Scoring Functions](machine_learning/scoring_functions.py) * [Self Organizing Map](machine_learning/self_organizing_map.py) * [Sequential Minimum Optimization](machine_learning/sequential_minimum_optimization.py) * [Similarity Search](machine_learning/similarity_search.py) * [Support Vector Machines](machine_learning/support_vector_machines.py) * [Word Frequency Functions](machine_learning/word_frequency_functions.py) * [Xgboost Classifier](machine_learning/xgboost_classifier.py) * [Xgboost Regressor](machine_learning/xgboost_regressor.py) ## Maths * [3N Plus 1](maths/3n_plus_1.py) * [Abs](maths/abs.py) * [Abs Max](maths/abs_max.py) * [Abs Min](maths/abs_min.py) * [Add](maths/add.py) * [Aliquot Sum](maths/aliquot_sum.py) * [Allocation 
Number](maths/allocation_number.py) * [Arc Length](maths/arc_length.py) * [Area](maths/area.py) * [Area Under Curve](maths/area_under_curve.py) * [Armstrong Numbers](maths/armstrong_numbers.py) * [Average Absolute Deviation](maths/average_absolute_deviation.py) * [Average Mean](maths/average_mean.py) * [Average Median](maths/average_median.py) * [Average Mode](maths/average_mode.py) * [Bailey Borwein Plouffe](maths/bailey_borwein_plouffe.py) * [Basic Maths](maths/basic_maths.py) * [Binary Exp Mod](maths/binary_exp_mod.py) * [Binary Exponentiation](maths/binary_exponentiation.py) * [Binary Exponentiation 2](maths/binary_exponentiation_2.py) * [Binary Exponentiation 3](maths/binary_exponentiation_3.py) * [Binomial Coefficient](maths/binomial_coefficient.py) * [Binomial Distribution](maths/binomial_distribution.py) * [Bisection](maths/bisection.py) * [Carmichael Number](maths/carmichael_number.py) * [Catalan Number](maths/catalan_number.py) * [Ceil](maths/ceil.py) * [Check Polygon](maths/check_polygon.py) * [Chudnovsky Algorithm](maths/chudnovsky_algorithm.py) * [Collatz Sequence](maths/collatz_sequence.py) * [Combinations](maths/combinations.py) * [Decimal Isolate](maths/decimal_isolate.py) * [Double Factorial Iterative](maths/double_factorial_iterative.py) * [Double Factorial Recursive](maths/double_factorial_recursive.py) * [Entropy](maths/entropy.py) * [Euclidean Distance](maths/euclidean_distance.py) * [Euclidean Gcd](maths/euclidean_gcd.py) * [Euler Method](maths/euler_method.py) * [Euler Modified](maths/euler_modified.py) * [Eulers Totient](maths/eulers_totient.py) * [Extended Euclidean Algorithm](maths/extended_euclidean_algorithm.py) * [Factorial Iterative](maths/factorial_iterative.py) * [Factorial Recursive](maths/factorial_recursive.py) * [Factors](maths/factors.py) * [Fermat Little Theorem](maths/fermat_little_theorem.py) * [Fibonacci](maths/fibonacci.py) * [Find Max](maths/find_max.py) * [Find Max Recursion](maths/find_max_recursion.py) * [Find Min](maths/find_min.py) * [Find Min Recursion](maths/find_min_recursion.py) * [Floor](maths/floor.py) * [Gamma](maths/gamma.py) * [Gamma Recursive](maths/gamma_recursive.py) * [Gaussian](maths/gaussian.py) * [Gaussian Error Linear Unit](maths/gaussian_error_linear_unit.py) * [Greatest Common Divisor](maths/greatest_common_divisor.py) * [Greedy Coin Change](maths/greedy_coin_change.py) * [Hamming Numbers](maths/hamming_numbers.py) * [Hardy Ramanujanalgo](maths/hardy_ramanujanalgo.py) * [Integration By Simpson Approx](maths/integration_by_simpson_approx.py) * [Is Ip V4 Address Valid](maths/is_ip_v4_address_valid.py) * [Is Square Free](maths/is_square_free.py) * [Jaccard Similarity](maths/jaccard_similarity.py) * [Kadanes](maths/kadanes.py) * [Karatsuba](maths/karatsuba.py) * [Krishnamurthy Number](maths/krishnamurthy_number.py) * [Kth Lexicographic Permutation](maths/kth_lexicographic_permutation.py) * [Largest Of Very Large Numbers](maths/largest_of_very_large_numbers.py) * [Largest Subarray Sum](maths/largest_subarray_sum.py) * [Least Common Multiple](maths/least_common_multiple.py) * [Line Length](maths/line_length.py) * [Lucas Lehmer Primality Test](maths/lucas_lehmer_primality_test.py) * [Lucas Series](maths/lucas_series.py) * [Maclaurin Series](maths/maclaurin_series.py) * [Matrix Exponentiation](maths/matrix_exponentiation.py) * [Max Sum Sliding Window](maths/max_sum_sliding_window.py) * [Median Of Two Arrays](maths/median_of_two_arrays.py) * [Miller Rabin](maths/miller_rabin.py) * [Mobius Function](maths/mobius_function.py) * 
[Modular Exponential](maths/modular_exponential.py) * [Monte Carlo](maths/monte_carlo.py) * [Monte Carlo Dice](maths/monte_carlo_dice.py) * [Nevilles Method](maths/nevilles_method.py) * [Newton Raphson](maths/newton_raphson.py) * [Number Of Digits](maths/number_of_digits.py) * [Numerical Integration](maths/numerical_integration.py) * [Perfect Cube](maths/perfect_cube.py) * [Perfect Number](maths/perfect_number.py) * [Perfect Square](maths/perfect_square.py) * [Persistence](maths/persistence.py) * [Pi Monte Carlo Estimation](maths/pi_monte_carlo_estimation.py) * [Points Are Collinear 3D](maths/points_are_collinear_3d.py) * [Pollard Rho](maths/pollard_rho.py) * [Polynomial Evaluation](maths/polynomial_evaluation.py) * [Power Using Recursion](maths/power_using_recursion.py) * [Prime Check](maths/prime_check.py) * [Prime Factors](maths/prime_factors.py) * [Prime Numbers](maths/prime_numbers.py) * [Prime Sieve Eratosthenes](maths/prime_sieve_eratosthenes.py) * [Primelib](maths/primelib.py) * [Proth Number](maths/proth_number.py) * [Pythagoras](maths/pythagoras.py) * [Qr Decomposition](maths/qr_decomposition.py) * [Quadratic Equations Complex Numbers](maths/quadratic_equations_complex_numbers.py) * [Radians](maths/radians.py) * [Radix2 Fft](maths/radix2_fft.py) * [Relu](maths/relu.py) * [Runge Kutta](maths/runge_kutta.py) * [Segmented Sieve](maths/segmented_sieve.py) * Series * [Arithmetic](maths/series/arithmetic.py) * [Geometric](maths/series/geometric.py) * [Geometric Series](maths/series/geometric_series.py) * [Harmonic](maths/series/harmonic.py) * [Harmonic Series](maths/series/harmonic_series.py) * [Hexagonal Numbers](maths/series/hexagonal_numbers.py) * [P Series](maths/series/p_series.py) * [Sieve Of Eratosthenes](maths/sieve_of_eratosthenes.py) * [Sigmoid](maths/sigmoid.py) * [Sigmoid Linear Unit](maths/sigmoid_linear_unit.py) * [Signum](maths/signum.py) * [Simpson Rule](maths/simpson_rule.py) * [Sin](maths/sin.py) * [Sock Merchant](maths/sock_merchant.py) * [Softmax](maths/softmax.py) * [Square Root](maths/square_root.py) * [Sum Of Arithmetic Series](maths/sum_of_arithmetic_series.py) * [Sum Of Digits](maths/sum_of_digits.py) * [Sum Of Geometric Progression](maths/sum_of_geometric_progression.py) * [Sum Of Harmonic Series](maths/sum_of_harmonic_series.py) * [Sylvester Sequence](maths/sylvester_sequence.py) * [Test Prime Check](maths/test_prime_check.py) * [Trapezoidal Rule](maths/trapezoidal_rule.py) * [Triplet Sum](maths/triplet_sum.py) * [Two Pointer](maths/two_pointer.py) * [Two Sum](maths/two_sum.py) * [Ugly Numbers](maths/ugly_numbers.py) * [Volume](maths/volume.py) * [Weird Number](maths/weird_number.py) * [Zellers Congruence](maths/zellers_congruence.py) ## Matrix * [Binary Search Matrix](matrix/binary_search_matrix.py) * [Count Islands In Matrix](matrix/count_islands_in_matrix.py) * [Cramers Rule 2X2](matrix/cramers_rule_2x2.py) * [Inverse Of Matrix](matrix/inverse_of_matrix.py) * [Largest Square Area In Matrix](matrix/largest_square_area_in_matrix.py) * [Matrix Class](matrix/matrix_class.py) * [Matrix Operation](matrix/matrix_operation.py) * [Max Area Of Island](matrix/max_area_of_island.py) * [Nth Fibonacci Using Matrix Exponentiation](matrix/nth_fibonacci_using_matrix_exponentiation.py) * [Rotate Matrix](matrix/rotate_matrix.py) * [Searching In Sorted Matrix](matrix/searching_in_sorted_matrix.py) * [Sherman Morrison](matrix/sherman_morrison.py) * [Spiral Print](matrix/spiral_print.py) * Tests * [Test Matrix Operation](matrix/tests/test_matrix_operation.py) ## Networking Flow 
* [Ford Fulkerson](networking_flow/ford_fulkerson.py) * [Minimum Cut](networking_flow/minimum_cut.py) ## Neural Network * [2 Hidden Layers Neural Network](neural_network/2_hidden_layers_neural_network.py) * [Back Propagation Neural Network](neural_network/back_propagation_neural_network.py) * [Convolution Neural Network](neural_network/convolution_neural_network.py) * [Perceptron](neural_network/perceptron.py) ## Other * [Activity Selection](other/activity_selection.py) * [Alternative List Arrange](other/alternative_list_arrange.py) * [Check Strong Password](other/check_strong_password.py) * [Davisb Putnamb Logemannb Loveland](other/davisb_putnamb_logemannb_loveland.py) * [Dijkstra Bankers Algorithm](other/dijkstra_bankers_algorithm.py) * [Doomsday](other/doomsday.py) * [Fischer Yates Shuffle](other/fischer_yates_shuffle.py) * [Gauss Easter](other/gauss_easter.py) * [Graham Scan](other/graham_scan.py) * [Greedy](other/greedy.py) * [Least Recently Used](other/least_recently_used.py) * [Lfu Cache](other/lfu_cache.py) * [Linear Congruential Generator](other/linear_congruential_generator.py) * [Lru Cache](other/lru_cache.py) * [Magicdiamondpattern](other/magicdiamondpattern.py) * [Maximum Subarray](other/maximum_subarray.py) * [Nested Brackets](other/nested_brackets.py) * [Password Generator](other/password_generator.py) * [Scoring Algorithm](other/scoring_algorithm.py) * [Sdes](other/sdes.py) * [Tower Of Hanoi](other/tower_of_hanoi.py) ## Physics * [Casimir Effect](physics/casimir_effect.py) * [Horizontal Projectile Motion](physics/horizontal_projectile_motion.py) * [Kinetic Energy](physics/kinetic_energy.py) * [Lorentz Transformation Four Vector](physics/lorentz_transformation_four_vector.py) * [Malus Law](physics/malus_law.py) * [N Body Simulation](physics/n_body_simulation.py) * [Newtons Law Of Gravitation](physics/newtons_law_of_gravitation.py) * [Newtons Second Law Of Motion](physics/newtons_second_law_of_motion.py) * [Potential Energy](physics/potential_energy.py) ## Project Euler * Problem 001 * [Sol1](project_euler/problem_001/sol1.py) * [Sol2](project_euler/problem_001/sol2.py) * [Sol3](project_euler/problem_001/sol3.py) * [Sol4](project_euler/problem_001/sol4.py) * [Sol5](project_euler/problem_001/sol5.py) * [Sol6](project_euler/problem_001/sol6.py) * [Sol7](project_euler/problem_001/sol7.py) * Problem 002 * [Sol1](project_euler/problem_002/sol1.py) * [Sol2](project_euler/problem_002/sol2.py) * [Sol3](project_euler/problem_002/sol3.py) * [Sol4](project_euler/problem_002/sol4.py) * [Sol5](project_euler/problem_002/sol5.py) * Problem 003 * [Sol1](project_euler/problem_003/sol1.py) * [Sol2](project_euler/problem_003/sol2.py) * [Sol3](project_euler/problem_003/sol3.py) * Problem 004 * [Sol1](project_euler/problem_004/sol1.py) * [Sol2](project_euler/problem_004/sol2.py) * Problem 005 * [Sol1](project_euler/problem_005/sol1.py) * [Sol2](project_euler/problem_005/sol2.py) * Problem 006 * [Sol1](project_euler/problem_006/sol1.py) * [Sol2](project_euler/problem_006/sol2.py) * [Sol3](project_euler/problem_006/sol3.py) * [Sol4](project_euler/problem_006/sol4.py) * Problem 007 * [Sol1](project_euler/problem_007/sol1.py) * [Sol2](project_euler/problem_007/sol2.py) * [Sol3](project_euler/problem_007/sol3.py) * Problem 008 * [Sol1](project_euler/problem_008/sol1.py) * [Sol2](project_euler/problem_008/sol2.py) * [Sol3](project_euler/problem_008/sol3.py) * Problem 009 * [Sol1](project_euler/problem_009/sol1.py) * [Sol2](project_euler/problem_009/sol2.py) * [Sol3](project_euler/problem_009/sol3.py) * 
Problem 010 * [Sol1](project_euler/problem_010/sol1.py) * [Sol2](project_euler/problem_010/sol2.py) * [Sol3](project_euler/problem_010/sol3.py) * Problem 011 * [Sol1](project_euler/problem_011/sol1.py) * [Sol2](project_euler/problem_011/sol2.py) * Problem 012 * [Sol1](project_euler/problem_012/sol1.py) * [Sol2](project_euler/problem_012/sol2.py) * Problem 013 * [Sol1](project_euler/problem_013/sol1.py) * Problem 014 * [Sol1](project_euler/problem_014/sol1.py) * [Sol2](project_euler/problem_014/sol2.py) * Problem 015 * [Sol1](project_euler/problem_015/sol1.py) * Problem 016 * [Sol1](project_euler/problem_016/sol1.py) * [Sol2](project_euler/problem_016/sol2.py) * Problem 017 * [Sol1](project_euler/problem_017/sol1.py) * Problem 018 * [Solution](project_euler/problem_018/solution.py) * Problem 019 * [Sol1](project_euler/problem_019/sol1.py) * Problem 020 * [Sol1](project_euler/problem_020/sol1.py) * [Sol2](project_euler/problem_020/sol2.py) * [Sol3](project_euler/problem_020/sol3.py) * [Sol4](project_euler/problem_020/sol4.py) * Problem 021 * [Sol1](project_euler/problem_021/sol1.py) * Problem 022 * [Sol1](project_euler/problem_022/sol1.py) * [Sol2](project_euler/problem_022/sol2.py) * Problem 023 * [Sol1](project_euler/problem_023/sol1.py) * Problem 024 * [Sol1](project_euler/problem_024/sol1.py) * Problem 025 * [Sol1](project_euler/problem_025/sol1.py) * [Sol2](project_euler/problem_025/sol2.py) * [Sol3](project_euler/problem_025/sol3.py) * Problem 026 * [Sol1](project_euler/problem_026/sol1.py) * Problem 027 * [Sol1](project_euler/problem_027/sol1.py) * Problem 028 * [Sol1](project_euler/problem_028/sol1.py) * Problem 029 * [Sol1](project_euler/problem_029/sol1.py) * Problem 030 * [Sol1](project_euler/problem_030/sol1.py) * Problem 031 * [Sol1](project_euler/problem_031/sol1.py) * [Sol2](project_euler/problem_031/sol2.py) * Problem 032 * [Sol32](project_euler/problem_032/sol32.py) * Problem 033 * [Sol1](project_euler/problem_033/sol1.py) * Problem 034 * [Sol1](project_euler/problem_034/sol1.py) * Problem 035 * [Sol1](project_euler/problem_035/sol1.py) * Problem 036 * [Sol1](project_euler/problem_036/sol1.py) * Problem 037 * [Sol1](project_euler/problem_037/sol1.py) * Problem 038 * [Sol1](project_euler/problem_038/sol1.py) * Problem 039 * [Sol1](project_euler/problem_039/sol1.py) * Problem 040 * [Sol1](project_euler/problem_040/sol1.py) * Problem 041 * [Sol1](project_euler/problem_041/sol1.py) * Problem 042 * [Solution42](project_euler/problem_042/solution42.py) * Problem 043 * [Sol1](project_euler/problem_043/sol1.py) * Problem 044 * [Sol1](project_euler/problem_044/sol1.py) * Problem 045 * [Sol1](project_euler/problem_045/sol1.py) * Problem 046 * [Sol1](project_euler/problem_046/sol1.py) * Problem 047 * [Sol1](project_euler/problem_047/sol1.py) * Problem 048 * [Sol1](project_euler/problem_048/sol1.py) * Problem 049 * [Sol1](project_euler/problem_049/sol1.py) * Problem 050 * [Sol1](project_euler/problem_050/sol1.py) * Problem 051 * [Sol1](project_euler/problem_051/sol1.py) * Problem 052 * [Sol1](project_euler/problem_052/sol1.py) * Problem 053 * [Sol1](project_euler/problem_053/sol1.py) * Problem 054 * [Sol1](project_euler/problem_054/sol1.py) * [Test Poker Hand](project_euler/problem_054/test_poker_hand.py) * Problem 055 * [Sol1](project_euler/problem_055/sol1.py) * Problem 056 * [Sol1](project_euler/problem_056/sol1.py) * Problem 057 * [Sol1](project_euler/problem_057/sol1.py) * Problem 058 * [Sol1](project_euler/problem_058/sol1.py) * Problem 059 * 
[Sol1](project_euler/problem_059/sol1.py) * Problem 062 * [Sol1](project_euler/problem_062/sol1.py) * Problem 063 * [Sol1](project_euler/problem_063/sol1.py) * Problem 064 * [Sol1](project_euler/problem_064/sol1.py) * Problem 065 * [Sol1](project_euler/problem_065/sol1.py) * Problem 067 * [Sol1](project_euler/problem_067/sol1.py) * [Sol2](project_euler/problem_067/sol2.py) * Problem 068 * [Sol1](project_euler/problem_068/sol1.py) * Problem 069 * [Sol1](project_euler/problem_069/sol1.py) * Problem 070 * [Sol1](project_euler/problem_070/sol1.py) * Problem 071 * [Sol1](project_euler/problem_071/sol1.py) * Problem 072 * [Sol1](project_euler/problem_072/sol1.py) * [Sol2](project_euler/problem_072/sol2.py) * Problem 073 * [Sol1](project_euler/problem_073/sol1.py) * Problem 074 * [Sol1](project_euler/problem_074/sol1.py) * [Sol2](project_euler/problem_074/sol2.py) * Problem 075 * [Sol1](project_euler/problem_075/sol1.py) * Problem 076 * [Sol1](project_euler/problem_076/sol1.py) * Problem 077 * [Sol1](project_euler/problem_077/sol1.py) * Problem 078 * [Sol1](project_euler/problem_078/sol1.py) * Problem 080 * [Sol1](project_euler/problem_080/sol1.py) * Problem 081 * [Sol1](project_euler/problem_081/sol1.py) * Problem 085 * [Sol1](project_euler/problem_085/sol1.py) * Problem 086 * [Sol1](project_euler/problem_086/sol1.py) * Problem 087 * [Sol1](project_euler/problem_087/sol1.py) * Problem 089 * [Sol1](project_euler/problem_089/sol1.py) * Problem 091 * [Sol1](project_euler/problem_091/sol1.py) * Problem 092 * [Sol1](project_euler/problem_092/sol1.py) * Problem 097 * [Sol1](project_euler/problem_097/sol1.py) * Problem 099 * [Sol1](project_euler/problem_099/sol1.py) * Problem 101 * [Sol1](project_euler/problem_101/sol1.py) * Problem 102 * [Sol1](project_euler/problem_102/sol1.py) * Problem 104 * [Sol1](project_euler/problem_104/sol1.py) * Problem 107 * [Sol1](project_euler/problem_107/sol1.py) * Problem 109 * [Sol1](project_euler/problem_109/sol1.py) * Problem 112 * [Sol1](project_euler/problem_112/sol1.py) * Problem 113 * [Sol1](project_euler/problem_113/sol1.py) * Problem 114 * [Sol1](project_euler/problem_114/sol1.py) * Problem 115 * [Sol1](project_euler/problem_115/sol1.py) * Problem 116 * [Sol1](project_euler/problem_116/sol1.py) * Problem 119 * [Sol1](project_euler/problem_119/sol1.py) * Problem 120 * [Sol1](project_euler/problem_120/sol1.py) * Problem 121 * [Sol1](project_euler/problem_121/sol1.py) * Problem 123 * [Sol1](project_euler/problem_123/sol1.py) * Problem 125 * [Sol1](project_euler/problem_125/sol1.py) * Problem 129 * [Sol1](project_euler/problem_129/sol1.py) * Problem 135 * [Sol1](project_euler/problem_135/sol1.py) * Problem 144 * [Sol1](project_euler/problem_144/sol1.py) * Problem 145 * [Sol1](project_euler/problem_145/sol1.py) * Problem 173 * [Sol1](project_euler/problem_173/sol1.py) * Problem 174 * [Sol1](project_euler/problem_174/sol1.py) * Problem 180 * [Sol1](project_euler/problem_180/sol1.py) * Problem 188 * [Sol1](project_euler/problem_188/sol1.py) * Problem 191 * [Sol1](project_euler/problem_191/sol1.py) * Problem 203 * [Sol1](project_euler/problem_203/sol1.py) * Problem 205 * [Sol1](project_euler/problem_205/sol1.py) * Problem 206 * [Sol1](project_euler/problem_206/sol1.py) * Problem 207 * [Sol1](project_euler/problem_207/sol1.py) * Problem 234 * [Sol1](project_euler/problem_234/sol1.py) * Problem 301 * [Sol1](project_euler/problem_301/sol1.py) * Problem 493 * [Sol1](project_euler/problem_493/sol1.py) * Problem 551 * [Sol1](project_euler/problem_551/sol1.py) * Problem 587 * 
[Sol1](project_euler/problem_587/sol1.py) * Problem 686 * [Sol1](project_euler/problem_686/sol1.py) ## Quantum * [Deutsch Jozsa](quantum/deutsch_jozsa.py) * [Half Adder](quantum/half_adder.py) * [Not Gate](quantum/not_gate.py) * [Q Full Adder](quantum/q_full_adder.py) * [Quantum Entanglement](quantum/quantum_entanglement.py) * [Ripple Adder Classic](quantum/ripple_adder_classic.py) * [Single Qubit Measure](quantum/single_qubit_measure.py) * [Superdense Coding](quantum/superdense_coding.py) ## Scheduling * [First Come First Served](scheduling/first_come_first_served.py) * [Highest Response Ratio Next](scheduling/highest_response_ratio_next.py) * [Job Sequencing With Deadline](scheduling/job_sequencing_with_deadline.py) * [Multi Level Feedback Queue](scheduling/multi_level_feedback_queue.py) * [Non Preemptive Shortest Job First](scheduling/non_preemptive_shortest_job_first.py) * [Round Robin](scheduling/round_robin.py) * [Shortest Job First](scheduling/shortest_job_first.py) ## Searches * [Binary Search](searches/binary_search.py) * [Binary Tree Traversal](searches/binary_tree_traversal.py) * [Double Linear Search](searches/double_linear_search.py) * [Double Linear Search Recursion](searches/double_linear_search_recursion.py) * [Fibonacci Search](searches/fibonacci_search.py) * [Hill Climbing](searches/hill_climbing.py) * [Interpolation Search](searches/interpolation_search.py) * [Jump Search](searches/jump_search.py) * [Linear Search](searches/linear_search.py) * [Quick Select](searches/quick_select.py) * [Sentinel Linear Search](searches/sentinel_linear_search.py) * [Simple Binary Search](searches/simple_binary_search.py) * [Simulated Annealing](searches/simulated_annealing.py) * [Tabu Search](searches/tabu_search.py) * [Ternary Search](searches/ternary_search.py) ## Sorts * [Bead Sort](sorts/bead_sort.py) * [Bitonic Sort](sorts/bitonic_sort.py) * [Bogo Sort](sorts/bogo_sort.py) * [Bubble Sort](sorts/bubble_sort.py) * [Bucket Sort](sorts/bucket_sort.py) * [Circle Sort](sorts/circle_sort.py) * [Cocktail Shaker Sort](sorts/cocktail_shaker_sort.py) * [Comb Sort](sorts/comb_sort.py) * [Counting Sort](sorts/counting_sort.py) * [Cycle Sort](sorts/cycle_sort.py) * [Double Sort](sorts/double_sort.py) * [Dutch National Flag Sort](sorts/dutch_national_flag_sort.py) * [Exchange Sort](sorts/exchange_sort.py) * [External Sort](sorts/external_sort.py) * [Gnome Sort](sorts/gnome_sort.py) * [Heap Sort](sorts/heap_sort.py) * [Insertion Sort](sorts/insertion_sort.py) * [Intro Sort](sorts/intro_sort.py) * [Iterative Merge Sort](sorts/iterative_merge_sort.py) * [Merge Insertion Sort](sorts/merge_insertion_sort.py) * [Merge Sort](sorts/merge_sort.py) * [Msd Radix Sort](sorts/msd_radix_sort.py) * [Natural Sort](sorts/natural_sort.py) * [Odd Even Sort](sorts/odd_even_sort.py) * [Odd Even Transposition Parallel](sorts/odd_even_transposition_parallel.py) * [Odd Even Transposition Single Threaded](sorts/odd_even_transposition_single_threaded.py) * [Pancake Sort](sorts/pancake_sort.py) * [Patience Sort](sorts/patience_sort.py) * [Pigeon Sort](sorts/pigeon_sort.py) * [Pigeonhole Sort](sorts/pigeonhole_sort.py) * [Quick Sort](sorts/quick_sort.py) * [Quick Sort 3 Partition](sorts/quick_sort_3_partition.py) * [Radix Sort](sorts/radix_sort.py) * [Random Normal Distribution Quicksort](sorts/random_normal_distribution_quicksort.py) * [Random Pivot Quick Sort](sorts/random_pivot_quick_sort.py) * [Recursive Bubble Sort](sorts/recursive_bubble_sort.py) * [Recursive Insertion Sort](sorts/recursive_insertion_sort.py) * 
[Recursive Mergesort Array](sorts/recursive_mergesort_array.py) * [Recursive Quick Sort](sorts/recursive_quick_sort.py) * [Selection Sort](sorts/selection_sort.py) * [Shell Sort](sorts/shell_sort.py) * [Shrink Shell Sort](sorts/shrink_shell_sort.py) * [Slowsort](sorts/slowsort.py) * [Stooge Sort](sorts/stooge_sort.py) * [Strand Sort](sorts/strand_sort.py) * [Tim Sort](sorts/tim_sort.py) * [Topological Sort](sorts/topological_sort.py) * [Tree Sort](sorts/tree_sort.py) * [Unknown Sort](sorts/unknown_sort.py) * [Wiggle Sort](sorts/wiggle_sort.py) ## Strings * [Aho Corasick](strings/aho_corasick.py) * [Alternative String Arrange](strings/alternative_string_arrange.py) * [Anagrams](strings/anagrams.py) * [Autocomplete Using Trie](strings/autocomplete_using_trie.py) * [Barcode Validator](strings/barcode_validator.py) * [Boyer Moore Search](strings/boyer_moore_search.py) * [Can String Be Rearranged As Palindrome](strings/can_string_be_rearranged_as_palindrome.py) * [Capitalize](strings/capitalize.py) * [Check Anagrams](strings/check_anagrams.py) * [Credit Card Validator](strings/credit_card_validator.py) * [Detecting English Programmatically](strings/detecting_english_programmatically.py) * [Dna](strings/dna.py) * [Frequency Finder](strings/frequency_finder.py) * [Hamming Distance](strings/hamming_distance.py) * [Indian Phone Validator](strings/indian_phone_validator.py) * [Is Contains Unique Chars](strings/is_contains_unique_chars.py) * [Is Isogram](strings/is_isogram.py) * [Is Palindrome](strings/is_palindrome.py) * [Is Pangram](strings/is_pangram.py) * [Is Spain National Id](strings/is_spain_national_id.py) * [Jaro Winkler](strings/jaro_winkler.py) * [Join](strings/join.py) * [Knuth Morris Pratt](strings/knuth_morris_pratt.py) * [Levenshtein Distance](strings/levenshtein_distance.py) * [Lower](strings/lower.py) * [Manacher](strings/manacher.py) * [Min Cost String Conversion](strings/min_cost_string_conversion.py) * [Naive String Search](strings/naive_string_search.py) * [Ngram](strings/ngram.py) * [Palindrome](strings/palindrome.py) * [Prefix Function](strings/prefix_function.py) * [Rabin Karp](strings/rabin_karp.py) * [Remove Duplicate](strings/remove_duplicate.py) * [Reverse Letters](strings/reverse_letters.py) * [Reverse Long Words](strings/reverse_long_words.py) * [Reverse Words](strings/reverse_words.py) * [Snake Case To Camel Pascal Case](strings/snake_case_to_camel_pascal_case.py) * [Split](strings/split.py) * [Upper](strings/upper.py) * [Wave](strings/wave.py) * [Wildcard Pattern Matching](strings/wildcard_pattern_matching.py) * [Word Occurrence](strings/word_occurrence.py) * [Word Patterns](strings/word_patterns.py) * [Z Function](strings/z_function.py) ## Web Programming * [Co2 Emission](web_programming/co2_emission.py) * [Covid Stats Via Xpath](web_programming/covid_stats_via_xpath.py) * [Crawl Google Results](web_programming/crawl_google_results.py) * [Crawl Google Scholar Citation](web_programming/crawl_google_scholar_citation.py) * [Currency Converter](web_programming/currency_converter.py) * [Current Stock Price](web_programming/current_stock_price.py) * [Current Weather](web_programming/current_weather.py) * [Daily Horoscope](web_programming/daily_horoscope.py) * [Download Images From Google Query](web_programming/download_images_from_google_query.py) * [Emails From Url](web_programming/emails_from_url.py) * [Fetch Anime And Play](web_programming/fetch_anime_and_play.py) * [Fetch Bbc News](web_programming/fetch_bbc_news.py) * [Fetch Github 
Info](web_programming/fetch_github_info.py) * [Fetch Jobs](web_programming/fetch_jobs.py) * [Fetch Quotes](web_programming/fetch_quotes.py) * [Fetch Well Rx Price](web_programming/fetch_well_rx_price.py) * [Get Amazon Product Data](web_programming/get_amazon_product_data.py) * [Get Imdb Top 250 Movies Csv](web_programming/get_imdb_top_250_movies_csv.py) * [Get Imdbtop](web_programming/get_imdbtop.py) * [Get Top Billioners](web_programming/get_top_billioners.py) * [Get Top Hn Posts](web_programming/get_top_hn_posts.py) * [Get User Tweets](web_programming/get_user_tweets.py) * [Giphy](web_programming/giphy.py) * [Instagram Crawler](web_programming/instagram_crawler.py) * [Instagram Pic](web_programming/instagram_pic.py) * [Instagram Video](web_programming/instagram_video.py) * [Nasa Data](web_programming/nasa_data.py) * [Open Google Results](web_programming/open_google_results.py) * [Random Anime Character](web_programming/random_anime_character.py) * [Recaptcha Verification](web_programming/recaptcha_verification.py) * [Reddit](web_programming/reddit.py) * [Search Books By Isbn](web_programming/search_books_by_isbn.py) * [Slack Message](web_programming/slack_message.py) * [Test Fetch Github Info](web_programming/test_fetch_github_info.py) * [World Covid19 Stats](web_programming/world_covid19_stats.py)
## Arithmetic Analysis * [Bisection](arithmetic_analysis/bisection.py) * [Gaussian Elimination](arithmetic_analysis/gaussian_elimination.py) * [In Static Equilibrium](arithmetic_analysis/in_static_equilibrium.py) * [Intersection](arithmetic_analysis/intersection.py) * [Jacobi Iteration Method](arithmetic_analysis/jacobi_iteration_method.py) * [Lu Decomposition](arithmetic_analysis/lu_decomposition.py) * [Newton Forward Interpolation](arithmetic_analysis/newton_forward_interpolation.py) * [Newton Method](arithmetic_analysis/newton_method.py) * [Newton Raphson](arithmetic_analysis/newton_raphson.py) * [Newton Raphson New](arithmetic_analysis/newton_raphson_new.py) * [Secant Method](arithmetic_analysis/secant_method.py) ## Audio Filters * [Butterworth Filter](audio_filters/butterworth_filter.py) * [Equal Loudness Filter](audio_filters/equal_loudness_filter.py) * [Iir Filter](audio_filters/iir_filter.py) * [Show Response](audio_filters/show_response.py) ## Backtracking * [All Combinations](backtracking/all_combinations.py) * [All Permutations](backtracking/all_permutations.py) * [All Subsequences](backtracking/all_subsequences.py) * [Coloring](backtracking/coloring.py) * [Combination Sum](backtracking/combination_sum.py) * [Hamiltonian Cycle](backtracking/hamiltonian_cycle.py) * [Knight Tour](backtracking/knight_tour.py) * [Minimax](backtracking/minimax.py) * [Minmax](backtracking/minmax.py) * [N Queens](backtracking/n_queens.py) * [N Queens Math](backtracking/n_queens_math.py) * [Rat In Maze](backtracking/rat_in_maze.py) * [Sudoku](backtracking/sudoku.py) * [Sum Of Subsets](backtracking/sum_of_subsets.py) ## Bit Manipulation * [Binary And Operator](bit_manipulation/binary_and_operator.py) * [Binary Count Setbits](bit_manipulation/binary_count_setbits.py) * [Binary Count Trailing Zeros](bit_manipulation/binary_count_trailing_zeros.py) * [Binary Or Operator](bit_manipulation/binary_or_operator.py) * [Binary Shifts](bit_manipulation/binary_shifts.py) * [Binary Twos Complement](bit_manipulation/binary_twos_complement.py) * [Binary Xor Operator](bit_manipulation/binary_xor_operator.py) * [Count 1S Brian Kernighan Method](bit_manipulation/count_1s_brian_kernighan_method.py) * [Count Number Of One Bits](bit_manipulation/count_number_of_one_bits.py) * [Gray Code Sequence](bit_manipulation/gray_code_sequence.py) * [Highest Set Bit](bit_manipulation/highest_set_bit.py) * [Is Even](bit_manipulation/is_even.py) * [Reverse Bits](bit_manipulation/reverse_bits.py) * [Single Bit Manipulation Operations](bit_manipulation/single_bit_manipulation_operations.py) ## Blockchain * [Chinese Remainder Theorem](blockchain/chinese_remainder_theorem.py) * [Diophantine Equation](blockchain/diophantine_equation.py) * [Modular Division](blockchain/modular_division.py) ## Boolean Algebra * [And Gate](boolean_algebra/and_gate.py) * [Nand Gate](boolean_algebra/nand_gate.py) * [Norgate](boolean_algebra/norgate.py) * [Not Gate](boolean_algebra/not_gate.py) * [Or Gate](boolean_algebra/or_gate.py) * [Quine Mc Cluskey](boolean_algebra/quine_mc_cluskey.py) * [Xnor Gate](boolean_algebra/xnor_gate.py) * [Xor Gate](boolean_algebra/xor_gate.py) ## Cellular Automata * [Conways Game Of Life](cellular_automata/conways_game_of_life.py) * [Game Of Life](cellular_automata/game_of_life.py) * [Nagel Schrekenberg](cellular_automata/nagel_schrekenberg.py) * [One Dimensional](cellular_automata/one_dimensional.py) ## Ciphers * [A1Z26](ciphers/a1z26.py) * [Affine Cipher](ciphers/affine_cipher.py) * [Atbash](ciphers/atbash.py) * [Baconian 
Cipher](ciphers/baconian_cipher.py) * [Base16](ciphers/base16.py) * [Base32](ciphers/base32.py) * [Base64](ciphers/base64.py) * [Base85](ciphers/base85.py) * [Beaufort Cipher](ciphers/beaufort_cipher.py) * [Bifid](ciphers/bifid.py) * [Brute Force Caesar Cipher](ciphers/brute_force_caesar_cipher.py) * [Caesar Cipher](ciphers/caesar_cipher.py) * [Cryptomath Module](ciphers/cryptomath_module.py) * [Decrypt Caesar With Chi Squared](ciphers/decrypt_caesar_with_chi_squared.py) * [Deterministic Miller Rabin](ciphers/deterministic_miller_rabin.py) * [Diffie](ciphers/diffie.py) * [Diffie Hellman](ciphers/diffie_hellman.py) * [Elgamal Key Generator](ciphers/elgamal_key_generator.py) * [Enigma Machine2](ciphers/enigma_machine2.py) * [Hill Cipher](ciphers/hill_cipher.py) * [Mixed Keyword Cypher](ciphers/mixed_keyword_cypher.py) * [Mono Alphabetic Ciphers](ciphers/mono_alphabetic_ciphers.py) * [Morse Code](ciphers/morse_code.py) * [Onepad Cipher](ciphers/onepad_cipher.py) * [Playfair Cipher](ciphers/playfair_cipher.py) * [Polybius](ciphers/polybius.py) * [Porta Cipher](ciphers/porta_cipher.py) * [Rabin Miller](ciphers/rabin_miller.py) * [Rail Fence Cipher](ciphers/rail_fence_cipher.py) * [Rot13](ciphers/rot13.py) * [Rsa Cipher](ciphers/rsa_cipher.py) * [Rsa Factorization](ciphers/rsa_factorization.py) * [Rsa Key Generator](ciphers/rsa_key_generator.py) * [Shuffled Shift Cipher](ciphers/shuffled_shift_cipher.py) * [Simple Keyword Cypher](ciphers/simple_keyword_cypher.py) * [Simple Substitution Cipher](ciphers/simple_substitution_cipher.py) * [Trafid Cipher](ciphers/trafid_cipher.py) * [Transposition Cipher](ciphers/transposition_cipher.py) * [Transposition Cipher Encrypt Decrypt File](ciphers/transposition_cipher_encrypt_decrypt_file.py) * [Vigenere Cipher](ciphers/vigenere_cipher.py) * [Xor Cipher](ciphers/xor_cipher.py) ## Compression * [Burrows Wheeler](compression/burrows_wheeler.py) * [Huffman](compression/huffman.py) * [Lempel Ziv](compression/lempel_ziv.py) * [Lempel Ziv Decompress](compression/lempel_ziv_decompress.py) * [Peak Signal To Noise Ratio](compression/peak_signal_to_noise_ratio.py) * [Run Length Encoding](compression/run_length_encoding.py) ## Computer Vision * [Cnn Classification](computer_vision/cnn_classification.py) * [Flip Augmentation](computer_vision/flip_augmentation.py) * [Harris Corner](computer_vision/harris_corner.py) * [Horn Schunck](computer_vision/horn_schunck.py) * [Mean Threshold](computer_vision/mean_threshold.py) * [Mosaic Augmentation](computer_vision/mosaic_augmentation.py) * [Pooling Functions](computer_vision/pooling_functions.py) ## Conversions * [Astronomical Length Scale Conversion](conversions/astronomical_length_scale_conversion.py) * [Binary To Decimal](conversions/binary_to_decimal.py) * [Binary To Hexadecimal](conversions/binary_to_hexadecimal.py) * [Binary To Octal](conversions/binary_to_octal.py) * [Decimal To Any](conversions/decimal_to_any.py) * [Decimal To Binary](conversions/decimal_to_binary.py) * [Decimal To Binary Recursion](conversions/decimal_to_binary_recursion.py) * [Decimal To Hexadecimal](conversions/decimal_to_hexadecimal.py) * [Decimal To Octal](conversions/decimal_to_octal.py) * [Excel Title To Column](conversions/excel_title_to_column.py) * [Hex To Bin](conversions/hex_to_bin.py) * [Hexadecimal To Decimal](conversions/hexadecimal_to_decimal.py) * [Length Conversion](conversions/length_conversion.py) * [Molecular Chemistry](conversions/molecular_chemistry.py) * [Octal To Decimal](conversions/octal_to_decimal.py) * [Prefix 
Conversions](conversions/prefix_conversions.py) * [Prefix Conversions String](conversions/prefix_conversions_string.py) * [Pressure Conversions](conversions/pressure_conversions.py) * [Rgb Hsv Conversion](conversions/rgb_hsv_conversion.py) * [Roman Numerals](conversions/roman_numerals.py) * [Speed Conversions](conversions/speed_conversions.py) * [Temperature Conversions](conversions/temperature_conversions.py) * [Volume Conversions](conversions/volume_conversions.py) * [Weight Conversion](conversions/weight_conversion.py) ## Data Structures * Binary Tree * [Avl Tree](data_structures/binary_tree/avl_tree.py) * [Basic Binary Tree](data_structures/binary_tree/basic_binary_tree.py) * [Binary Search Tree](data_structures/binary_tree/binary_search_tree.py) * [Binary Search Tree Recursive](data_structures/binary_tree/binary_search_tree_recursive.py) * [Binary Tree Mirror](data_structures/binary_tree/binary_tree_mirror.py) * [Binary Tree Node Sum](data_structures/binary_tree/binary_tree_node_sum.py) * [Binary Tree Path Sum](data_structures/binary_tree/binary_tree_path_sum.py) * [Binary Tree Traversals](data_structures/binary_tree/binary_tree_traversals.py) * [Diff Views Of Binary Tree](data_structures/binary_tree/diff_views_of_binary_tree.py) * [Fenwick Tree](data_structures/binary_tree/fenwick_tree.py) * [Inorder Tree Traversal 2022](data_structures/binary_tree/inorder_tree_traversal_2022.py) * [Lazy Segment Tree](data_structures/binary_tree/lazy_segment_tree.py) * [Lowest Common Ancestor](data_structures/binary_tree/lowest_common_ancestor.py) * [Maximum Fenwick Tree](data_structures/binary_tree/maximum_fenwick_tree.py) * [Merge Two Binary Trees](data_structures/binary_tree/merge_two_binary_trees.py) * [Non Recursive Segment Tree](data_structures/binary_tree/non_recursive_segment_tree.py) * [Number Of Possible Binary Trees](data_structures/binary_tree/number_of_possible_binary_trees.py) * [Red Black Tree](data_structures/binary_tree/red_black_tree.py) * [Segment Tree](data_structures/binary_tree/segment_tree.py) * [Segment Tree Other](data_structures/binary_tree/segment_tree_other.py) * [Treap](data_structures/binary_tree/treap.py) * [Wavelet Tree](data_structures/binary_tree/wavelet_tree.py) * Disjoint Set * [Alternate Disjoint Set](data_structures/disjoint_set/alternate_disjoint_set.py) * [Disjoint Set](data_structures/disjoint_set/disjoint_set.py) * Hashing * [Double Hash](data_structures/hashing/double_hash.py) * [Hash Table](data_structures/hashing/hash_table.py) * [Hash Table With Linked List](data_structures/hashing/hash_table_with_linked_list.py) * Number Theory * [Prime Numbers](data_structures/hashing/number_theory/prime_numbers.py) * [Quadratic Probing](data_structures/hashing/quadratic_probing.py) * Heap * [Binomial Heap](data_structures/heap/binomial_heap.py) * [Heap](data_structures/heap/heap.py) * [Heap Generic](data_structures/heap/heap_generic.py) * [Max Heap](data_structures/heap/max_heap.py) * [Min Heap](data_structures/heap/min_heap.py) * [Randomized Heap](data_structures/heap/randomized_heap.py) * [Skew Heap](data_structures/heap/skew_heap.py) * Linked List * [Circular Linked List](data_structures/linked_list/circular_linked_list.py) * [Deque Doubly](data_structures/linked_list/deque_doubly.py) * [Doubly Linked List](data_structures/linked_list/doubly_linked_list.py) * [Doubly Linked List Two](data_structures/linked_list/doubly_linked_list_two.py) * [From Sequence](data_structures/linked_list/from_sequence.py) * [Has Loop](data_structures/linked_list/has_loop.py) * [Is 
Palindrome](data_structures/linked_list/is_palindrome.py) * [Merge Two Lists](data_structures/linked_list/merge_two_lists.py) * [Middle Element Of Linked List](data_structures/linked_list/middle_element_of_linked_list.py) * [Print Reverse](data_structures/linked_list/print_reverse.py) * [Singly Linked List](data_structures/linked_list/singly_linked_list.py) * [Skip List](data_structures/linked_list/skip_list.py) * [Swap Nodes](data_structures/linked_list/swap_nodes.py) * Queue * [Circular Queue](data_structures/queue/circular_queue.py) * [Circular Queue Linked List](data_structures/queue/circular_queue_linked_list.py) * [Double Ended Queue](data_structures/queue/double_ended_queue.py) * [Linked Queue](data_structures/queue/linked_queue.py) * [Priority Queue Using List](data_structures/queue/priority_queue_using_list.py) * [Queue On List](data_structures/queue/queue_on_list.py) * [Queue On Pseudo Stack](data_structures/queue/queue_on_pseudo_stack.py) * Stacks * [Balanced Parentheses](data_structures/stacks/balanced_parentheses.py) * [Dijkstras Two Stack Algorithm](data_structures/stacks/dijkstras_two_stack_algorithm.py) * [Evaluate Postfix Notations](data_structures/stacks/evaluate_postfix_notations.py) * [Infix To Postfix Conversion](data_structures/stacks/infix_to_postfix_conversion.py) * [Infix To Prefix Conversion](data_structures/stacks/infix_to_prefix_conversion.py) * [Next Greater Element](data_structures/stacks/next_greater_element.py) * [Postfix Evaluation](data_structures/stacks/postfix_evaluation.py) * [Prefix Evaluation](data_structures/stacks/prefix_evaluation.py) * [Stack](data_structures/stacks/stack.py) * [Stack With Doubly Linked List](data_structures/stacks/stack_with_doubly_linked_list.py) * [Stack With Singly Linked List](data_structures/stacks/stack_with_singly_linked_list.py) * [Stock Span Problem](data_structures/stacks/stock_span_problem.py) * Trie * [Trie](data_structures/trie/trie.py) ## Digital Image Processing * [Change Brightness](digital_image_processing/change_brightness.py) * [Change Contrast](digital_image_processing/change_contrast.py) * [Convert To Negative](digital_image_processing/convert_to_negative.py) * Dithering * [Burkes](digital_image_processing/dithering/burkes.py) * Edge Detection * [Canny](digital_image_processing/edge_detection/canny.py) * Filters * [Bilateral Filter](digital_image_processing/filters/bilateral_filter.py) * [Convolve](digital_image_processing/filters/convolve.py) * [Gabor Filter](digital_image_processing/filters/gabor_filter.py) * [Gaussian Filter](digital_image_processing/filters/gaussian_filter.py) * [Local Binary Pattern](digital_image_processing/filters/local_binary_pattern.py) * [Median Filter](digital_image_processing/filters/median_filter.py) * [Sobel Filter](digital_image_processing/filters/sobel_filter.py) * Histogram Equalization * [Histogram Stretch](digital_image_processing/histogram_equalization/histogram_stretch.py) * [Index Calculation](digital_image_processing/index_calculation.py) * Morphological Operations * [Dilation Operation](digital_image_processing/morphological_operations/dilation_operation.py) * [Erosion Operation](digital_image_processing/morphological_operations/erosion_operation.py) * Resize * [Resize](digital_image_processing/resize/resize.py) * Rotation * [Rotation](digital_image_processing/rotation/rotation.py) * [Sepia](digital_image_processing/sepia.py) * [Test Digital Image Processing](digital_image_processing/test_digital_image_processing.py) ## Divide And Conquer * [Closest Pair Of 
Points](divide_and_conquer/closest_pair_of_points.py) * [Convex Hull](divide_and_conquer/convex_hull.py) * [Heaps Algorithm](divide_and_conquer/heaps_algorithm.py) * [Heaps Algorithm Iterative](divide_and_conquer/heaps_algorithm_iterative.py) * [Inversions](divide_and_conquer/inversions.py) * [Kth Order Statistic](divide_and_conquer/kth_order_statistic.py) * [Max Difference Pair](divide_and_conquer/max_difference_pair.py) * [Max Subarray Sum](divide_and_conquer/max_subarray_sum.py) * [Mergesort](divide_and_conquer/mergesort.py) * [Peak](divide_and_conquer/peak.py) * [Power](divide_and_conquer/power.py) * [Strassen Matrix Multiplication](divide_and_conquer/strassen_matrix_multiplication.py) ## Dynamic Programming * [Abbreviation](dynamic_programming/abbreviation.py) * [All Construct](dynamic_programming/all_construct.py) * [Bitmask](dynamic_programming/bitmask.py) * [Catalan Numbers](dynamic_programming/catalan_numbers.py) * [Climbing Stairs](dynamic_programming/climbing_stairs.py) * [Combination Sum Iv](dynamic_programming/combination_sum_iv.py) * [Edit Distance](dynamic_programming/edit_distance.py) * [Factorial](dynamic_programming/factorial.py) * [Fast Fibonacci](dynamic_programming/fast_fibonacci.py) * [Fibonacci](dynamic_programming/fibonacci.py) * [Floyd Warshall](dynamic_programming/floyd_warshall.py) * [Integer Partition](dynamic_programming/integer_partition.py) * [Iterating Through Submasks](dynamic_programming/iterating_through_submasks.py) * [Knapsack](dynamic_programming/knapsack.py) * [Longest Common Subsequence](dynamic_programming/longest_common_subsequence.py) * [Longest Common Substring](dynamic_programming/longest_common_substring.py) * [Longest Increasing Subsequence](dynamic_programming/longest_increasing_subsequence.py) * [Longest Increasing Subsequence O(Nlogn)](dynamic_programming/longest_increasing_subsequence_o(nlogn).py) * [Longest Sub Array](dynamic_programming/longest_sub_array.py) * [Matrix Chain Order](dynamic_programming/matrix_chain_order.py) * [Max Non Adjacent Sum](dynamic_programming/max_non_adjacent_sum.py) * [Max Sub Array](dynamic_programming/max_sub_array.py) * [Max Sum Contiguous Subsequence](dynamic_programming/max_sum_contiguous_subsequence.py) * [Minimum Coin Change](dynamic_programming/minimum_coin_change.py) * [Minimum Cost Path](dynamic_programming/minimum_cost_path.py) * [Minimum Partition](dynamic_programming/minimum_partition.py) * [Minimum Squares To Represent A Number](dynamic_programming/minimum_squares_to_represent_a_number.py) * [Minimum Steps To One](dynamic_programming/minimum_steps_to_one.py) * [Optimal Binary Search Tree](dynamic_programming/optimal_binary_search_tree.py) * [Rod Cutting](dynamic_programming/rod_cutting.py) * [Subset Generation](dynamic_programming/subset_generation.py) * [Sum Of Subset](dynamic_programming/sum_of_subset.py) ## Electronics * [Carrier Concentration](electronics/carrier_concentration.py) * [Coulombs Law](electronics/coulombs_law.py) * [Electric Power](electronics/electric_power.py) * [Ohms Law](electronics/ohms_law.py) ## File Transfer * [Receive File](file_transfer/receive_file.py) * [Send File](file_transfer/send_file.py) * Tests * [Test Send File](file_transfer/tests/test_send_file.py) ## Financial * [Equated Monthly Installments](financial/equated_monthly_installments.py) * [Interest](financial/interest.py) * [Price Plus Tax](financial/price_plus_tax.py) ## Fractals * [Julia Sets](fractals/julia_sets.py) * [Koch Snowflake](fractals/koch_snowflake.py) * [Mandelbrot](fractals/mandelbrot.py) * 
[Sierpinski Triangle](fractals/sierpinski_triangle.py) ## Fuzzy Logic * [Fuzzy Operations](fuzzy_logic/fuzzy_operations.py) ## Genetic Algorithm * [Basic String](genetic_algorithm/basic_string.py) ## Geodesy * [Haversine Distance](geodesy/haversine_distance.py) * [Lamberts Ellipsoidal Distance](geodesy/lamberts_ellipsoidal_distance.py) ## Graphics * [Bezier Curve](graphics/bezier_curve.py) * [Vector3 For 2D Rendering](graphics/vector3_for_2d_rendering.py) ## Graphs * [A Star](graphs/a_star.py) * [Articulation Points](graphs/articulation_points.py) * [Basic Graphs](graphs/basic_graphs.py) * [Bellman Ford](graphs/bellman_ford.py) * [Bidirectional A Star](graphs/bidirectional_a_star.py) * [Bidirectional Breadth First Search](graphs/bidirectional_breadth_first_search.py) * [Boruvka](graphs/boruvka.py) * [Breadth First Search](graphs/breadth_first_search.py) * [Breadth First Search 2](graphs/breadth_first_search_2.py) * [Breadth First Search Shortest Path](graphs/breadth_first_search_shortest_path.py) * [Breadth First Search Shortest Path 2](graphs/breadth_first_search_shortest_path_2.py) * [Breadth First Search Zero One Shortest Path](graphs/breadth_first_search_zero_one_shortest_path.py) * [Check Bipartite Graph Bfs](graphs/check_bipartite_graph_bfs.py) * [Check Bipartite Graph Dfs](graphs/check_bipartite_graph_dfs.py) * [Check Cycle](graphs/check_cycle.py) * [Connected Components](graphs/connected_components.py) * [Depth First Search](graphs/depth_first_search.py) * [Depth First Search 2](graphs/depth_first_search_2.py) * [Dijkstra](graphs/dijkstra.py) * [Dijkstra 2](graphs/dijkstra_2.py) * [Dijkstra Algorithm](graphs/dijkstra_algorithm.py) * [Dijkstra Alternate](graphs/dijkstra_alternate.py) * [Dinic](graphs/dinic.py) * [Directed And Undirected (Weighted) Graph](graphs/directed_and_undirected_(weighted)_graph.py) * [Edmonds Karp Multiple Source And Sink](graphs/edmonds_karp_multiple_source_and_sink.py) * [Eulerian Path And Circuit For Undirected Graph](graphs/eulerian_path_and_circuit_for_undirected_graph.py) * [Even Tree](graphs/even_tree.py) * [Finding Bridges](graphs/finding_bridges.py) * [Frequent Pattern Graph Miner](graphs/frequent_pattern_graph_miner.py) * [G Topological Sort](graphs/g_topological_sort.py) * [Gale Shapley Bigraph](graphs/gale_shapley_bigraph.py) * [Graph List](graphs/graph_list.py) * [Graph Matrix](graphs/graph_matrix.py) * [Graphs Floyd Warshall](graphs/graphs_floyd_warshall.py) * [Greedy Best First](graphs/greedy_best_first.py) * [Greedy Min Vertex Cover](graphs/greedy_min_vertex_cover.py) * [Kahns Algorithm Long](graphs/kahns_algorithm_long.py) * [Kahns Algorithm Topo](graphs/kahns_algorithm_topo.py) * [Karger](graphs/karger.py) * [Markov Chain](graphs/markov_chain.py) * [Matching Min Vertex Cover](graphs/matching_min_vertex_cover.py) * [Minimum Path Sum](graphs/minimum_path_sum.py) * [Minimum Spanning Tree Boruvka](graphs/minimum_spanning_tree_boruvka.py) * [Minimum Spanning Tree Kruskal](graphs/minimum_spanning_tree_kruskal.py) * [Minimum Spanning Tree Kruskal2](graphs/minimum_spanning_tree_kruskal2.py) * [Minimum Spanning Tree Prims](graphs/minimum_spanning_tree_prims.py) * [Minimum Spanning Tree Prims2](graphs/minimum_spanning_tree_prims2.py) * [Multi Heuristic Astar](graphs/multi_heuristic_astar.py) * [Page Rank](graphs/page_rank.py) * [Prim](graphs/prim.py) * [Random Graph Generator](graphs/random_graph_generator.py) * [Scc Kosaraju](graphs/scc_kosaraju.py) * [Strongly Connected Components](graphs/strongly_connected_components.py) * [Tarjans 
Scc](graphs/tarjans_scc.py) * Tests * [Test Min Spanning Tree Kruskal](graphs/tests/test_min_spanning_tree_kruskal.py) * [Test Min Spanning Tree Prim](graphs/tests/test_min_spanning_tree_prim.py) ## Greedy Methods * [Fractional Knapsack](greedy_methods/fractional_knapsack.py) * [Fractional Knapsack 2](greedy_methods/fractional_knapsack_2.py) * [Optimal Merge Pattern](greedy_methods/optimal_merge_pattern.py) ## Hashes * [Adler32](hashes/adler32.py) * [Chaos Machine](hashes/chaos_machine.py) * [Djb2](hashes/djb2.py) * [Enigma Machine](hashes/enigma_machine.py) * [Hamming Code](hashes/hamming_code.py) * [Luhn](hashes/luhn.py) * [Md5](hashes/md5.py) * [Sdbm](hashes/sdbm.py) * [Sha1](hashes/sha1.py) * [Sha256](hashes/sha256.py) ## Knapsack * [Greedy Knapsack](knapsack/greedy_knapsack.py) * [Knapsack](knapsack/knapsack.py) * Tests * [Test Greedy Knapsack](knapsack/tests/test_greedy_knapsack.py) * [Test Knapsack](knapsack/tests/test_knapsack.py) ## Linear Algebra * Src * [Conjugate Gradient](linear_algebra/src/conjugate_gradient.py) * [Lib](linear_algebra/src/lib.py) * [Polynom For Points](linear_algebra/src/polynom_for_points.py) * [Power Iteration](linear_algebra/src/power_iteration.py) * [Rayleigh Quotient](linear_algebra/src/rayleigh_quotient.py) * [Schur Complement](linear_algebra/src/schur_complement.py) * [Test Linear Algebra](linear_algebra/src/test_linear_algebra.py) * [Transformations 2D](linear_algebra/src/transformations_2d.py) ## Machine Learning * [Astar](machine_learning/astar.py) * [Data Transformations](machine_learning/data_transformations.py) * [Decision Tree](machine_learning/decision_tree.py) * Forecasting * [Run](machine_learning/forecasting/run.py) * [Gaussian Naive Bayes](machine_learning/gaussian_naive_bayes.py) * [Gradient Boosting Regressor](machine_learning/gradient_boosting_regressor.py) * [Gradient Descent](machine_learning/gradient_descent.py) * [K Means Clust](machine_learning/k_means_clust.py) * [K Nearest Neighbours](machine_learning/k_nearest_neighbours.py) * [Knn Sklearn](machine_learning/knn_sklearn.py) * [Linear Discriminant Analysis](machine_learning/linear_discriminant_analysis.py) * [Linear Regression](machine_learning/linear_regression.py) * Local Weighted Learning * [Local Weighted Learning](machine_learning/local_weighted_learning/local_weighted_learning.py) * [Logistic Regression](machine_learning/logistic_regression.py) * Lstm * [Lstm Prediction](machine_learning/lstm/lstm_prediction.py) * [Multilayer Perceptron Classifier](machine_learning/multilayer_perceptron_classifier.py) * [Polymonial Regression](machine_learning/polymonial_regression.py) * [Random Forest Classifier](machine_learning/random_forest_classifier.py) * [Random Forest Regressor](machine_learning/random_forest_regressor.py) * [Scoring Functions](machine_learning/scoring_functions.py) * [Self Organizing Map](machine_learning/self_organizing_map.py) * [Sequential Minimum Optimization](machine_learning/sequential_minimum_optimization.py) * [Similarity Search](machine_learning/similarity_search.py) * [Support Vector Machines](machine_learning/support_vector_machines.py) * [Word Frequency Functions](machine_learning/word_frequency_functions.py) * [Xgboost Classifier](machine_learning/xgboost_classifier.py) * [Xgboost Regressor](machine_learning/xgboost_regressor.py) ## Maths * [3N Plus 1](maths/3n_plus_1.py) * [Abs](maths/abs.py) * [Abs Max](maths/abs_max.py) * [Abs Min](maths/abs_min.py) * [Add](maths/add.py) * [Aliquot Sum](maths/aliquot_sum.py) * [Allocation 
Number](maths/allocation_number.py) * [Arc Length](maths/arc_length.py) * [Area](maths/area.py) * [Area Under Curve](maths/area_under_curve.py) * [Armstrong Numbers](maths/armstrong_numbers.py) * [Average Absolute Deviation](maths/average_absolute_deviation.py) * [Average Mean](maths/average_mean.py) * [Average Median](maths/average_median.py) * [Average Mode](maths/average_mode.py) * [Bailey Borwein Plouffe](maths/bailey_borwein_plouffe.py) * [Basic Maths](maths/basic_maths.py) * [Binary Exp Mod](maths/binary_exp_mod.py) * [Binary Exponentiation](maths/binary_exponentiation.py) * [Binary Exponentiation 2](maths/binary_exponentiation_2.py) * [Binary Exponentiation 3](maths/binary_exponentiation_3.py) * [Binomial Coefficient](maths/binomial_coefficient.py) * [Binomial Distribution](maths/binomial_distribution.py) * [Bisection](maths/bisection.py) * [Carmichael Number](maths/carmichael_number.py) * [Catalan Number](maths/catalan_number.py) * [Ceil](maths/ceil.py) * [Check Polygon](maths/check_polygon.py) * [Chudnovsky Algorithm](maths/chudnovsky_algorithm.py) * [Collatz Sequence](maths/collatz_sequence.py) * [Combinations](maths/combinations.py) * [Decimal Isolate](maths/decimal_isolate.py) * [Double Factorial Iterative](maths/double_factorial_iterative.py) * [Double Factorial Recursive](maths/double_factorial_recursive.py) * [Entropy](maths/entropy.py) * [Euclidean Distance](maths/euclidean_distance.py) * [Euclidean Gcd](maths/euclidean_gcd.py) * [Euler Method](maths/euler_method.py) * [Euler Modified](maths/euler_modified.py) * [Eulers Totient](maths/eulers_totient.py) * [Extended Euclidean Algorithm](maths/extended_euclidean_algorithm.py) * [Factorial Iterative](maths/factorial_iterative.py) * [Factorial Recursive](maths/factorial_recursive.py) * [Factors](maths/factors.py) * [Fermat Little Theorem](maths/fermat_little_theorem.py) * [Fibonacci](maths/fibonacci.py) * [Find Max](maths/find_max.py) * [Find Max Recursion](maths/find_max_recursion.py) * [Find Min](maths/find_min.py) * [Find Min Recursion](maths/find_min_recursion.py) * [Floor](maths/floor.py) * [Gamma](maths/gamma.py) * [Gamma Recursive](maths/gamma_recursive.py) * [Gaussian](maths/gaussian.py) * [Gaussian Error Linear Unit](maths/gaussian_error_linear_unit.py) * [Greatest Common Divisor](maths/greatest_common_divisor.py) * [Greedy Coin Change](maths/greedy_coin_change.py) * [Hamming Numbers](maths/hamming_numbers.py) * [Hardy Ramanujanalgo](maths/hardy_ramanujanalgo.py) * [Integration By Simpson Approx](maths/integration_by_simpson_approx.py) * [Is Ip V4 Address Valid](maths/is_ip_v4_address_valid.py) * [Is Square Free](maths/is_square_free.py) * [Jaccard Similarity](maths/jaccard_similarity.py) * [Kadanes](maths/kadanes.py) * [Karatsuba](maths/karatsuba.py) * [Krishnamurthy Number](maths/krishnamurthy_number.py) * [Kth Lexicographic Permutation](maths/kth_lexicographic_permutation.py) * [Largest Of Very Large Numbers](maths/largest_of_very_large_numbers.py) * [Largest Subarray Sum](maths/largest_subarray_sum.py) * [Least Common Multiple](maths/least_common_multiple.py) * [Line Length](maths/line_length.py) * [Lucas Lehmer Primality Test](maths/lucas_lehmer_primality_test.py) * [Lucas Series](maths/lucas_series.py) * [Maclaurin Series](maths/maclaurin_series.py) * [Matrix Exponentiation](maths/matrix_exponentiation.py) * [Max Sum Sliding Window](maths/max_sum_sliding_window.py) * [Median Of Two Arrays](maths/median_of_two_arrays.py) * [Miller Rabin](maths/miller_rabin.py) * [Mobius Function](maths/mobius_function.py) * 
[Modular Exponential](maths/modular_exponential.py) * [Monte Carlo](maths/monte_carlo.py) * [Monte Carlo Dice](maths/monte_carlo_dice.py) * [Nevilles Method](maths/nevilles_method.py) * [Newton Raphson](maths/newton_raphson.py) * [Number Of Digits](maths/number_of_digits.py) * [Numerical Integration](maths/numerical_integration.py) * [Perfect Cube](maths/perfect_cube.py) * [Perfect Number](maths/perfect_number.py) * [Perfect Square](maths/perfect_square.py) * [Persistence](maths/persistence.py) * [Pi Monte Carlo Estimation](maths/pi_monte_carlo_estimation.py) * [Points Are Collinear 3D](maths/points_are_collinear_3d.py) * [Pollard Rho](maths/pollard_rho.py) * [Polynomial Evaluation](maths/polynomial_evaluation.py) * [Power Using Recursion](maths/power_using_recursion.py) * [Prime Check](maths/prime_check.py) * [Prime Factors](maths/prime_factors.py) * [Prime Numbers](maths/prime_numbers.py) * [Prime Sieve Eratosthenes](maths/prime_sieve_eratosthenes.py) * [Primelib](maths/primelib.py) * [Proth Number](maths/proth_number.py) * [Pythagoras](maths/pythagoras.py) * [Qr Decomposition](maths/qr_decomposition.py) * [Quadratic Equations Complex Numbers](maths/quadratic_equations_complex_numbers.py) * [Radians](maths/radians.py) * [Radix2 Fft](maths/radix2_fft.py) * [Relu](maths/relu.py) * [Runge Kutta](maths/runge_kutta.py) * [Segmented Sieve](maths/segmented_sieve.py) * Series * [Arithmetic](maths/series/arithmetic.py) * [Geometric](maths/series/geometric.py) * [Geometric Series](maths/series/geometric_series.py) * [Harmonic](maths/series/harmonic.py) * [Harmonic Series](maths/series/harmonic_series.py) * [Hexagonal Numbers](maths/series/hexagonal_numbers.py) * [P Series](maths/series/p_series.py) * [Sieve Of Eratosthenes](maths/sieve_of_eratosthenes.py) * [Sigmoid](maths/sigmoid.py) * [Sigmoid Linear Unit](maths/sigmoid_linear_unit.py) * [Signum](maths/signum.py) * [Simpson Rule](maths/simpson_rule.py) * [Sin](maths/sin.py) * [Sock Merchant](maths/sock_merchant.py) * [Softmax](maths/softmax.py) * [Square Root](maths/square_root.py) * [Sum Of Arithmetic Series](maths/sum_of_arithmetic_series.py) * [Sum Of Digits](maths/sum_of_digits.py) * [Sum Of Geometric Progression](maths/sum_of_geometric_progression.py) * [Sum Of Harmonic Series](maths/sum_of_harmonic_series.py) * [Sylvester Sequence](maths/sylvester_sequence.py) * [Test Prime Check](maths/test_prime_check.py) * [Trapezoidal Rule](maths/trapezoidal_rule.py) * [Triplet Sum](maths/triplet_sum.py) * [Two Pointer](maths/two_pointer.py) * [Two Sum](maths/two_sum.py) * [Ugly Numbers](maths/ugly_numbers.py) * [Volume](maths/volume.py) * [Weird Number](maths/weird_number.py) * [Zellers Congruence](maths/zellers_congruence.py) ## Matrix * [Binary Search Matrix](matrix/binary_search_matrix.py) * [Count Islands In Matrix](matrix/count_islands_in_matrix.py) * [Cramers Rule 2X2](matrix/cramers_rule_2x2.py) * [Inverse Of Matrix](matrix/inverse_of_matrix.py) * [Largest Square Area In Matrix](matrix/largest_square_area_in_matrix.py) * [Matrix Class](matrix/matrix_class.py) * [Matrix Operation](matrix/matrix_operation.py) * [Max Area Of Island](matrix/max_area_of_island.py) * [Nth Fibonacci Using Matrix Exponentiation](matrix/nth_fibonacci_using_matrix_exponentiation.py) * [Rotate Matrix](matrix/rotate_matrix.py) * [Searching In Sorted Matrix](matrix/searching_in_sorted_matrix.py) * [Sherman Morrison](matrix/sherman_morrison.py) * [Spiral Print](matrix/spiral_print.py) * Tests * [Test Matrix Operation](matrix/tests/test_matrix_operation.py) ## Networking Flow 
* [Ford Fulkerson](networking_flow/ford_fulkerson.py) * [Minimum Cut](networking_flow/minimum_cut.py) ## Neural Network * [2 Hidden Layers Neural Network](neural_network/2_hidden_layers_neural_network.py) * [Back Propagation Neural Network](neural_network/back_propagation_neural_network.py) * [Convolution Neural Network](neural_network/convolution_neural_network.py) * [Perceptron](neural_network/perceptron.py) ## Other * [Activity Selection](other/activity_selection.py) * [Alternative List Arrange](other/alternative_list_arrange.py) * [Check Strong Password](other/check_strong_password.py) * [Davisb Putnamb Logemannb Loveland](other/davisb_putnamb_logemannb_loveland.py) * [Dijkstra Bankers Algorithm](other/dijkstra_bankers_algorithm.py) * [Doomsday](other/doomsday.py) * [Fischer Yates Shuffle](other/fischer_yates_shuffle.py) * [Gauss Easter](other/gauss_easter.py) * [Graham Scan](other/graham_scan.py) * [Greedy](other/greedy.py) * [Least Recently Used](other/least_recently_used.py) * [Lfu Cache](other/lfu_cache.py) * [Linear Congruential Generator](other/linear_congruential_generator.py) * [Lru Cache](other/lru_cache.py) * [Magicdiamondpattern](other/magicdiamondpattern.py) * [Maximum Subarray](other/maximum_subarray.py) * [Nested Brackets](other/nested_brackets.py) * [Password Generator](other/password_generator.py) * [Scoring Algorithm](other/scoring_algorithm.py) * [Sdes](other/sdes.py) * [Tower Of Hanoi](other/tower_of_hanoi.py) ## Physics * [Casimir Effect](physics/casimir_effect.py) * [Centripetal Force](physics/centripetal_force.py) * [Horizontal Projectile Motion](physics/horizontal_projectile_motion.py) * [Kinetic Energy](physics/kinetic_energy.py) * [Lorentz Transformation Four Vector](physics/lorentz_transformation_four_vector.py) * [Malus Law](physics/malus_law.py) * [N Body Simulation](physics/n_body_simulation.py) * [Newtons Law Of Gravitation](physics/newtons_law_of_gravitation.py) * [Newtons Second Law Of Motion](physics/newtons_second_law_of_motion.py) * [Potential Energy](physics/potential_energy.py) ## Project Euler * Problem 001 * [Sol1](project_euler/problem_001/sol1.py) * [Sol2](project_euler/problem_001/sol2.py) * [Sol3](project_euler/problem_001/sol3.py) * [Sol4](project_euler/problem_001/sol4.py) * [Sol5](project_euler/problem_001/sol5.py) * [Sol6](project_euler/problem_001/sol6.py) * [Sol7](project_euler/problem_001/sol7.py) * Problem 002 * [Sol1](project_euler/problem_002/sol1.py) * [Sol2](project_euler/problem_002/sol2.py) * [Sol3](project_euler/problem_002/sol3.py) * [Sol4](project_euler/problem_002/sol4.py) * [Sol5](project_euler/problem_002/sol5.py) * Problem 003 * [Sol1](project_euler/problem_003/sol1.py) * [Sol2](project_euler/problem_003/sol2.py) * [Sol3](project_euler/problem_003/sol3.py) * Problem 004 * [Sol1](project_euler/problem_004/sol1.py) * [Sol2](project_euler/problem_004/sol2.py) * Problem 005 * [Sol1](project_euler/problem_005/sol1.py) * [Sol2](project_euler/problem_005/sol2.py) * Problem 006 * [Sol1](project_euler/problem_006/sol1.py) * [Sol2](project_euler/problem_006/sol2.py) * [Sol3](project_euler/problem_006/sol3.py) * [Sol4](project_euler/problem_006/sol4.py) * Problem 007 * [Sol1](project_euler/problem_007/sol1.py) * [Sol2](project_euler/problem_007/sol2.py) * [Sol3](project_euler/problem_007/sol3.py) * Problem 008 * [Sol1](project_euler/problem_008/sol1.py) * [Sol2](project_euler/problem_008/sol2.py) * [Sol3](project_euler/problem_008/sol3.py) * Problem 009 * [Sol1](project_euler/problem_009/sol1.py) * 
[Sol2](project_euler/problem_009/sol2.py) * [Sol3](project_euler/problem_009/sol3.py) * Problem 010 * [Sol1](project_euler/problem_010/sol1.py) * [Sol2](project_euler/problem_010/sol2.py) * [Sol3](project_euler/problem_010/sol3.py) * Problem 011 * [Sol1](project_euler/problem_011/sol1.py) * [Sol2](project_euler/problem_011/sol2.py) * Problem 012 * [Sol1](project_euler/problem_012/sol1.py) * [Sol2](project_euler/problem_012/sol2.py) * Problem 013 * [Sol1](project_euler/problem_013/sol1.py) * Problem 014 * [Sol1](project_euler/problem_014/sol1.py) * [Sol2](project_euler/problem_014/sol2.py) * Problem 015 * [Sol1](project_euler/problem_015/sol1.py) * Problem 016 * [Sol1](project_euler/problem_016/sol1.py) * [Sol2](project_euler/problem_016/sol2.py) * Problem 017 * [Sol1](project_euler/problem_017/sol1.py) * Problem 018 * [Solution](project_euler/problem_018/solution.py) * Problem 019 * [Sol1](project_euler/problem_019/sol1.py) * Problem 020 * [Sol1](project_euler/problem_020/sol1.py) * [Sol2](project_euler/problem_020/sol2.py) * [Sol3](project_euler/problem_020/sol3.py) * [Sol4](project_euler/problem_020/sol4.py) * Problem 021 * [Sol1](project_euler/problem_021/sol1.py) * Problem 022 * [Sol1](project_euler/problem_022/sol1.py) * [Sol2](project_euler/problem_022/sol2.py) * Problem 023 * [Sol1](project_euler/problem_023/sol1.py) * Problem 024 * [Sol1](project_euler/problem_024/sol1.py) * Problem 025 * [Sol1](project_euler/problem_025/sol1.py) * [Sol2](project_euler/problem_025/sol2.py) * [Sol3](project_euler/problem_025/sol3.py) * Problem 026 * [Sol1](project_euler/problem_026/sol1.py) * Problem 027 * [Sol1](project_euler/problem_027/sol1.py) * Problem 028 * [Sol1](project_euler/problem_028/sol1.py) * Problem 029 * [Sol1](project_euler/problem_029/sol1.py) * Problem 030 * [Sol1](project_euler/problem_030/sol1.py) * Problem 031 * [Sol1](project_euler/problem_031/sol1.py) * [Sol2](project_euler/problem_031/sol2.py) * Problem 032 * [Sol32](project_euler/problem_032/sol32.py) * Problem 033 * [Sol1](project_euler/problem_033/sol1.py) * Problem 034 * [Sol1](project_euler/problem_034/sol1.py) * Problem 035 * [Sol1](project_euler/problem_035/sol1.py) * Problem 036 * [Sol1](project_euler/problem_036/sol1.py) * Problem 037 * [Sol1](project_euler/problem_037/sol1.py) * Problem 038 * [Sol1](project_euler/problem_038/sol1.py) * Problem 039 * [Sol1](project_euler/problem_039/sol1.py) * Problem 040 * [Sol1](project_euler/problem_040/sol1.py) * Problem 041 * [Sol1](project_euler/problem_041/sol1.py) * Problem 042 * [Solution42](project_euler/problem_042/solution42.py) * Problem 043 * [Sol1](project_euler/problem_043/sol1.py) * Problem 044 * [Sol1](project_euler/problem_044/sol1.py) * Problem 045 * [Sol1](project_euler/problem_045/sol1.py) * Problem 046 * [Sol1](project_euler/problem_046/sol1.py) * Problem 047 * [Sol1](project_euler/problem_047/sol1.py) * Problem 048 * [Sol1](project_euler/problem_048/sol1.py) * Problem 049 * [Sol1](project_euler/problem_049/sol1.py) * Problem 050 * [Sol1](project_euler/problem_050/sol1.py) * Problem 051 * [Sol1](project_euler/problem_051/sol1.py) * Problem 052 * [Sol1](project_euler/problem_052/sol1.py) * Problem 053 * [Sol1](project_euler/problem_053/sol1.py) * Problem 054 * [Sol1](project_euler/problem_054/sol1.py) * [Test Poker Hand](project_euler/problem_054/test_poker_hand.py) * Problem 055 * [Sol1](project_euler/problem_055/sol1.py) * Problem 056 * [Sol1](project_euler/problem_056/sol1.py) * Problem 057 * [Sol1](project_euler/problem_057/sol1.py) * Problem 058 * 
[Sol1](project_euler/problem_058/sol1.py) * Problem 059 * [Sol1](project_euler/problem_059/sol1.py) * Problem 062 * [Sol1](project_euler/problem_062/sol1.py) * Problem 063 * [Sol1](project_euler/problem_063/sol1.py) * Problem 064 * [Sol1](project_euler/problem_064/sol1.py) * Problem 065 * [Sol1](project_euler/problem_065/sol1.py) * Problem 067 * [Sol1](project_euler/problem_067/sol1.py) * [Sol2](project_euler/problem_067/sol2.py) * Problem 068 * [Sol1](project_euler/problem_068/sol1.py) * Problem 069 * [Sol1](project_euler/problem_069/sol1.py) * Problem 070 * [Sol1](project_euler/problem_070/sol1.py) * Problem 071 * [Sol1](project_euler/problem_071/sol1.py) * Problem 072 * [Sol1](project_euler/problem_072/sol1.py) * [Sol2](project_euler/problem_072/sol2.py) * Problem 073 * [Sol1](project_euler/problem_073/sol1.py) * Problem 074 * [Sol1](project_euler/problem_074/sol1.py) * [Sol2](project_euler/problem_074/sol2.py) * Problem 075 * [Sol1](project_euler/problem_075/sol1.py) * Problem 076 * [Sol1](project_euler/problem_076/sol1.py) * Problem 077 * [Sol1](project_euler/problem_077/sol1.py) * Problem 078 * [Sol1](project_euler/problem_078/sol1.py) * Problem 080 * [Sol1](project_euler/problem_080/sol1.py) * Problem 081 * [Sol1](project_euler/problem_081/sol1.py) * Problem 085 * [Sol1](project_euler/problem_085/sol1.py) * Problem 086 * [Sol1](project_euler/problem_086/sol1.py) * Problem 087 * [Sol1](project_euler/problem_087/sol1.py) * Problem 089 * [Sol1](project_euler/problem_089/sol1.py) * Problem 091 * [Sol1](project_euler/problem_091/sol1.py) * Problem 092 * [Sol1](project_euler/problem_092/sol1.py) * Problem 097 * [Sol1](project_euler/problem_097/sol1.py) * Problem 099 * [Sol1](project_euler/problem_099/sol1.py) * Problem 101 * [Sol1](project_euler/problem_101/sol1.py) * Problem 102 * [Sol1](project_euler/problem_102/sol1.py) * Problem 104 * [Sol1](project_euler/problem_104/sol1.py) * Problem 107 * [Sol1](project_euler/problem_107/sol1.py) * Problem 109 * [Sol1](project_euler/problem_109/sol1.py) * Problem 112 * [Sol1](project_euler/problem_112/sol1.py) * Problem 113 * [Sol1](project_euler/problem_113/sol1.py) * Problem 114 * [Sol1](project_euler/problem_114/sol1.py) * Problem 115 * [Sol1](project_euler/problem_115/sol1.py) * Problem 116 * [Sol1](project_euler/problem_116/sol1.py) * Problem 119 * [Sol1](project_euler/problem_119/sol1.py) * Problem 120 * [Sol1](project_euler/problem_120/sol1.py) * Problem 121 * [Sol1](project_euler/problem_121/sol1.py) * Problem 123 * [Sol1](project_euler/problem_123/sol1.py) * Problem 125 * [Sol1](project_euler/problem_125/sol1.py) * Problem 129 * [Sol1](project_euler/problem_129/sol1.py) * Problem 135 * [Sol1](project_euler/problem_135/sol1.py) * Problem 144 * [Sol1](project_euler/problem_144/sol1.py) * Problem 145 * [Sol1](project_euler/problem_145/sol1.py) * Problem 173 * [Sol1](project_euler/problem_173/sol1.py) * Problem 174 * [Sol1](project_euler/problem_174/sol1.py) * Problem 180 * [Sol1](project_euler/problem_180/sol1.py) * Problem 188 * [Sol1](project_euler/problem_188/sol1.py) * Problem 191 * [Sol1](project_euler/problem_191/sol1.py) * Problem 203 * [Sol1](project_euler/problem_203/sol1.py) * Problem 205 * [Sol1](project_euler/problem_205/sol1.py) * Problem 206 * [Sol1](project_euler/problem_206/sol1.py) * Problem 207 * [Sol1](project_euler/problem_207/sol1.py) * Problem 234 * [Sol1](project_euler/problem_234/sol1.py) * Problem 301 * [Sol1](project_euler/problem_301/sol1.py) * Problem 493 * [Sol1](project_euler/problem_493/sol1.py) * Problem 551 * 
[Sol1](project_euler/problem_551/sol1.py) * Problem 587 * [Sol1](project_euler/problem_587/sol1.py) * Problem 686 * [Sol1](project_euler/problem_686/sol1.py) ## Quantum * [Deutsch Jozsa](quantum/deutsch_jozsa.py) * [Half Adder](quantum/half_adder.py) * [Not Gate](quantum/not_gate.py) * [Q Full Adder](quantum/q_full_adder.py) * [Quantum Entanglement](quantum/quantum_entanglement.py) * [Ripple Adder Classic](quantum/ripple_adder_classic.py) * [Single Qubit Measure](quantum/single_qubit_measure.py) * [Superdense Coding](quantum/superdense_coding.py) ## Scheduling * [First Come First Served](scheduling/first_come_first_served.py) * [Highest Response Ratio Next](scheduling/highest_response_ratio_next.py) * [Job Sequencing With Deadline](scheduling/job_sequencing_with_deadline.py) * [Multi Level Feedback Queue](scheduling/multi_level_feedback_queue.py) * [Non Preemptive Shortest Job First](scheduling/non_preemptive_shortest_job_first.py) * [Round Robin](scheduling/round_robin.py) * [Shortest Job First](scheduling/shortest_job_first.py) ## Searches * [Binary Search](searches/binary_search.py) * [Binary Tree Traversal](searches/binary_tree_traversal.py) * [Double Linear Search](searches/double_linear_search.py) * [Double Linear Search Recursion](searches/double_linear_search_recursion.py) * [Fibonacci Search](searches/fibonacci_search.py) * [Hill Climbing](searches/hill_climbing.py) * [Interpolation Search](searches/interpolation_search.py) * [Jump Search](searches/jump_search.py) * [Linear Search](searches/linear_search.py) * [Quick Select](searches/quick_select.py) * [Sentinel Linear Search](searches/sentinel_linear_search.py) * [Simple Binary Search](searches/simple_binary_search.py) * [Simulated Annealing](searches/simulated_annealing.py) * [Tabu Search](searches/tabu_search.py) * [Ternary Search](searches/ternary_search.py) ## Sorts * [Bead Sort](sorts/bead_sort.py) * [Bitonic Sort](sorts/bitonic_sort.py) * [Bogo Sort](sorts/bogo_sort.py) * [Bubble Sort](sorts/bubble_sort.py) * [Bucket Sort](sorts/bucket_sort.py) * [Circle Sort](sorts/circle_sort.py) * [Cocktail Shaker Sort](sorts/cocktail_shaker_sort.py) * [Comb Sort](sorts/comb_sort.py) * [Counting Sort](sorts/counting_sort.py) * [Cycle Sort](sorts/cycle_sort.py) * [Double Sort](sorts/double_sort.py) * [Dutch National Flag Sort](sorts/dutch_national_flag_sort.py) * [Exchange Sort](sorts/exchange_sort.py) * [External Sort](sorts/external_sort.py) * [Gnome Sort](sorts/gnome_sort.py) * [Heap Sort](sorts/heap_sort.py) * [Insertion Sort](sorts/insertion_sort.py) * [Intro Sort](sorts/intro_sort.py) * [Iterative Merge Sort](sorts/iterative_merge_sort.py) * [Merge Insertion Sort](sorts/merge_insertion_sort.py) * [Merge Sort](sorts/merge_sort.py) * [Msd Radix Sort](sorts/msd_radix_sort.py) * [Natural Sort](sorts/natural_sort.py) * [Odd Even Sort](sorts/odd_even_sort.py) * [Odd Even Transposition Parallel](sorts/odd_even_transposition_parallel.py) * [Odd Even Transposition Single Threaded](sorts/odd_even_transposition_single_threaded.py) * [Pancake Sort](sorts/pancake_sort.py) * [Patience Sort](sorts/patience_sort.py) * [Pigeon Sort](sorts/pigeon_sort.py) * [Pigeonhole Sort](sorts/pigeonhole_sort.py) * [Quick Sort](sorts/quick_sort.py) * [Quick Sort 3 Partition](sorts/quick_sort_3_partition.py) * [Radix Sort](sorts/radix_sort.py) * [Random Normal Distribution Quicksort](sorts/random_normal_distribution_quicksort.py) * [Random Pivot Quick Sort](sorts/random_pivot_quick_sort.py) * [Recursive Bubble Sort](sorts/recursive_bubble_sort.py) * [Recursive 
Insertion Sort](sorts/recursive_insertion_sort.py) * [Recursive Mergesort Array](sorts/recursive_mergesort_array.py) * [Recursive Quick Sort](sorts/recursive_quick_sort.py) * [Selection Sort](sorts/selection_sort.py) * [Shell Sort](sorts/shell_sort.py) * [Shrink Shell Sort](sorts/shrink_shell_sort.py) * [Slowsort](sorts/slowsort.py) * [Stooge Sort](sorts/stooge_sort.py) * [Strand Sort](sorts/strand_sort.py) * [Tim Sort](sorts/tim_sort.py) * [Topological Sort](sorts/topological_sort.py) * [Tree Sort](sorts/tree_sort.py) * [Unknown Sort](sorts/unknown_sort.py) * [Wiggle Sort](sorts/wiggle_sort.py) ## Strings * [Aho Corasick](strings/aho_corasick.py) * [Alternative String Arrange](strings/alternative_string_arrange.py) * [Anagrams](strings/anagrams.py) * [Autocomplete Using Trie](strings/autocomplete_using_trie.py) * [Barcode Validator](strings/barcode_validator.py) * [Boyer Moore Search](strings/boyer_moore_search.py) * [Can String Be Rearranged As Palindrome](strings/can_string_be_rearranged_as_palindrome.py) * [Capitalize](strings/capitalize.py) * [Check Anagrams](strings/check_anagrams.py) * [Credit Card Validator](strings/credit_card_validator.py) * [Detecting English Programmatically](strings/detecting_english_programmatically.py) * [Dna](strings/dna.py) * [Frequency Finder](strings/frequency_finder.py) * [Hamming Distance](strings/hamming_distance.py) * [Indian Phone Validator](strings/indian_phone_validator.py) * [Is Contains Unique Chars](strings/is_contains_unique_chars.py) * [Is Isogram](strings/is_isogram.py) * [Is Palindrome](strings/is_palindrome.py) * [Is Pangram](strings/is_pangram.py) * [Is Spain National Id](strings/is_spain_national_id.py) * [Is Srilankan Phone Number](strings/is_srilankan_phone_number.py) * [Jaro Winkler](strings/jaro_winkler.py) * [Join](strings/join.py) * [Knuth Morris Pratt](strings/knuth_morris_pratt.py) * [Levenshtein Distance](strings/levenshtein_distance.py) * [Lower](strings/lower.py) * [Manacher](strings/manacher.py) * [Min Cost String Conversion](strings/min_cost_string_conversion.py) * [Naive String Search](strings/naive_string_search.py) * [Ngram](strings/ngram.py) * [Palindrome](strings/palindrome.py) * [Prefix Function](strings/prefix_function.py) * [Rabin Karp](strings/rabin_karp.py) * [Remove Duplicate](strings/remove_duplicate.py) * [Reverse Letters](strings/reverse_letters.py) * [Reverse Long Words](strings/reverse_long_words.py) * [Reverse Words](strings/reverse_words.py) * [Snake Case To Camel Pascal Case](strings/snake_case_to_camel_pascal_case.py) * [Split](strings/split.py) * [Upper](strings/upper.py) * [Wave](strings/wave.py) * [Wildcard Pattern Matching](strings/wildcard_pattern_matching.py) * [Word Occurrence](strings/word_occurrence.py) * [Word Patterns](strings/word_patterns.py) * [Z Function](strings/z_function.py) ## Web Programming * [Co2 Emission](web_programming/co2_emission.py) * [Covid Stats Via Xpath](web_programming/covid_stats_via_xpath.py) * [Crawl Google Results](web_programming/crawl_google_results.py) * [Crawl Google Scholar Citation](web_programming/crawl_google_scholar_citation.py) * [Currency Converter](web_programming/currency_converter.py) * [Current Stock Price](web_programming/current_stock_price.py) * [Current Weather](web_programming/current_weather.py) * [Daily Horoscope](web_programming/daily_horoscope.py) * [Download Images From Google Query](web_programming/download_images_from_google_query.py) * [Emails From Url](web_programming/emails_from_url.py) * [Fetch Anime And 
Play](web_programming/fetch_anime_and_play.py) * [Fetch Bbc News](web_programming/fetch_bbc_news.py) * [Fetch Github Info](web_programming/fetch_github_info.py) * [Fetch Jobs](web_programming/fetch_jobs.py) * [Fetch Quotes](web_programming/fetch_quotes.py) * [Fetch Well Rx Price](web_programming/fetch_well_rx_price.py) * [Get Amazon Product Data](web_programming/get_amazon_product_data.py) * [Get Imdb Top 250 Movies Csv](web_programming/get_imdb_top_250_movies_csv.py) * [Get Imdbtop](web_programming/get_imdbtop.py) * [Get Top Billioners](web_programming/get_top_billioners.py) * [Get Top Hn Posts](web_programming/get_top_hn_posts.py) * [Get User Tweets](web_programming/get_user_tweets.py) * [Giphy](web_programming/giphy.py) * [Instagram Crawler](web_programming/instagram_crawler.py) * [Instagram Pic](web_programming/instagram_pic.py) * [Instagram Video](web_programming/instagram_video.py) * [Nasa Data](web_programming/nasa_data.py) * [Open Google Results](web_programming/open_google_results.py) * [Random Anime Character](web_programming/random_anime_character.py) * [Recaptcha Verification](web_programming/recaptcha_verification.py) * [Reddit](web_programming/reddit.py) * [Search Books By Isbn](web_programming/search_books_by_isbn.py) * [Slack Message](web_programming/slack_message.py) * [Test Fetch Github Info](web_programming/test_fetch_github_info.py) * [World Covid19 Stats](web_programming/world_covid19_stats.py)
1
TheAlgorithms/Python
7,844
Remove file-level flake8 suppression
### Describe your change: Some files were completely excluded from flake8 checks. - Remove file-level suppressions - Fix simple warnings - Suppress long line in docstring * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
Cjkjvfnby
"2022-10-29T13:08:44Z"
"2022-10-29T20:45:21Z"
47ddba1d914bf5955a244056e794e718dee9ead1
1550731cb7457ddae216da2ffe0bc1587f5234f3
Remove file-level flake8 suppression. ### Describe your change: Some files were completely excluded from flake8 checks. - Remove file-level suppressions - Fix simple warnings - Suppress long line in docstring * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
# flake8: noqa """ Binomial Heap Reference: Advanced Data Structures, Peter Brass """ class Node: """ Node in a doubly-linked binomial tree, containing: - value - size of left subtree - link to left, right and parent nodes """ def __init__(self, val): self.val = val # Number of nodes in left subtree self.left_tree_size = 0 self.left = None self.right = None self.parent = None def mergeTrees(self, other): """ In-place merge of two binomial trees of equal size. Returns the root of the resulting tree """ assert self.left_tree_size == other.left_tree_size, "Unequal Sizes of Blocks" if self.val < other.val: other.left = self.right other.parent = None if self.right: self.right.parent = other self.right = other self.left_tree_size = self.left_tree_size * 2 + 1 return self else: self.left = other.right self.parent = None if other.right: other.right.parent = self other.right = self other.left_tree_size = other.left_tree_size * 2 + 1 return other class BinomialHeap: r""" Min-oriented priority queue implemented with the Binomial Heap data structure implemented with the BinomialHeap class. It supports: - Insert element in a heap with n elements: Guaranteed logn, amoratized 1 - Merge (meld) heaps of size m and n: O(logn + logm) - Delete Min: O(logn) - Peek (return min without deleting it): O(1) Example: Create a random permutation of 30 integers to be inserted and 19 of them deleted >>> import numpy as np >>> permutation = np.random.permutation(list(range(30))) Create a Heap and insert the 30 integers __init__() test >>> first_heap = BinomialHeap() 30 inserts - insert() test >>> for number in permutation: ... first_heap.insert(number) Size test >>> first_heap.size 30 Deleting - delete() test >>> for i in range(25): ... print(first_heap.deleteMin(), end=" ") 0 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 Create a new Heap >>> second_heap = BinomialHeap() >>> vals = [17, 20, 31, 34] >>> for value in vals: ... second_heap.insert(value) The heap should have the following structure: 17 / \ # 31 / \ 20 34 / \ / \ # # # # preOrder() test >>> second_heap.preOrder() [(17, 0), ('#', 1), (31, 1), (20, 2), ('#', 3), ('#', 3), (34, 2), ('#', 3), ('#', 3)] printing Heap - __str__() test >>> print(second_heap) 17 -# -31 --20 ---# ---# --34 ---# ---# mergeHeaps() test >>> merged = second_heap.mergeHeaps(first_heap) >>> merged.peek() 17 values in merged heap; (merge is inplace) >>> while not first_heap.isEmpty(): ... print(first_heap.deleteMin(), end=" ") 17 20 25 26 27 28 29 31 34 """ def __init__(self, bottom_root=None, min_node=None, heap_size=0): self.size = heap_size self.bottom_root = bottom_root self.min_node = min_node def mergeHeaps(self, other): """ In-place merge of two binomial heaps. 
Both of them become the resulting merged heap """ # Empty heaps corner cases if other.size == 0: return if self.size == 0: self.size = other.size self.bottom_root = other.bottom_root self.min_node = other.min_node return # Update size self.size = self.size + other.size # Update min.node if self.min_node.val > other.min_node.val: self.min_node = other.min_node # Merge # Order roots by left_subtree_size combined_roots_list = [] i, j = self.bottom_root, other.bottom_root while i or j: if i and ((not j) or i.left_tree_size < j.left_tree_size): combined_roots_list.append((i, True)) i = i.parent else: combined_roots_list.append((j, False)) j = j.parent # Insert links between them for i in range(len(combined_roots_list) - 1): if combined_roots_list[i][1] != combined_roots_list[i + 1][1]: combined_roots_list[i][0].parent = combined_roots_list[i + 1][0] combined_roots_list[i + 1][0].left = combined_roots_list[i][0] # Consecutively merge roots with same left_tree_size i = combined_roots_list[0][0] while i.parent: if ( (i.left_tree_size == i.parent.left_tree_size) and (not i.parent.parent) ) or ( i.left_tree_size == i.parent.left_tree_size and i.left_tree_size != i.parent.parent.left_tree_size ): # Neighbouring Nodes previous_node = i.left next_node = i.parent.parent # Merging trees i = i.mergeTrees(i.parent) # Updating links i.left = previous_node i.parent = next_node if previous_node: previous_node.parent = i if next_node: next_node.left = i else: i = i.parent # Updating self.bottom_root while i.left: i = i.left self.bottom_root = i # Update other other.size = self.size other.bottom_root = self.bottom_root other.min_node = self.min_node # Return the merged heap return self def insert(self, val): """ insert a value in the heap """ if self.size == 0: self.bottom_root = Node(val) self.size = 1 self.min_node = self.bottom_root else: # Create new node new_node = Node(val) # Update size self.size += 1 # update min_node if val < self.min_node.val: self.min_node = new_node # Put new_node as a bottom_root in heap self.bottom_root.left = new_node new_node.parent = self.bottom_root self.bottom_root = new_node # Consecutively merge roots with same left_tree_size while ( self.bottom_root.parent and self.bottom_root.left_tree_size == self.bottom_root.parent.left_tree_size ): # Next node next_node = self.bottom_root.parent.parent # Merge self.bottom_root = self.bottom_root.mergeTrees(self.bottom_root.parent) # Update Links self.bottom_root.parent = next_node self.bottom_root.left = None if next_node: next_node.left = self.bottom_root def peek(self): """ return min element without deleting it """ return self.min_node.val def isEmpty(self): return self.size == 0 def deleteMin(self): """ delete min element and return it """ # assert not self.isEmpty(), "Empty Heap" # Save minimal value min_value = self.min_node.val # Last element in heap corner case if self.size == 1: # Update size self.size = 0 # Update bottom root self.bottom_root = None # Update min_node self.min_node = None return min_value # No right subtree corner case # The structure of the tree implies that this should be the bottom root # and there is at least one other root if self.min_node.right is None: # Update size self.size -= 1 # Update bottom root self.bottom_root = self.bottom_root.parent self.bottom_root.left = None # Update min_node self.min_node = self.bottom_root i = self.bottom_root.parent while i: if i.val < self.min_node.val: self.min_node = i i = i.parent return min_value # General case # Find the BinomialHeap of the right subtree of 
min_node bottom_of_new = self.min_node.right bottom_of_new.parent = None min_of_new = bottom_of_new size_of_new = 1 # Size, min_node and bottom_root while bottom_of_new.left: size_of_new = size_of_new * 2 + 1 bottom_of_new = bottom_of_new.left if bottom_of_new.val < min_of_new.val: min_of_new = bottom_of_new # Corner case of single root on top left path if (not self.min_node.left) and (not self.min_node.parent): self.size = size_of_new self.bottom_root = bottom_of_new self.min_node = min_of_new # print("Single root, multiple nodes case") return min_value # Remaining cases # Construct heap of right subtree newHeap = BinomialHeap( bottom_root=bottom_of_new, min_node=min_of_new, heap_size=size_of_new ) # Update size self.size = self.size - 1 - size_of_new # Neighbour nodes previous_node = self.min_node.left next_node = self.min_node.parent # Initialize new bottom_root and min_node self.min_node = previous_node or next_node self.bottom_root = next_node # Update links of previous_node and search below for new min_node and # bottom_root if previous_node: previous_node.parent = next_node # Update bottom_root and search for min_node below self.bottom_root = previous_node self.min_node = previous_node while self.bottom_root.left: self.bottom_root = self.bottom_root.left if self.bottom_root.val < self.min_node.val: self.min_node = self.bottom_root if next_node: next_node.left = previous_node # Search for new min_node above min_node i = next_node while i: if i.val < self.min_node.val: self.min_node = i i = i.parent # Merge heaps self.mergeHeaps(newHeap) return min_value def preOrder(self): """ Returns the Pre-order representation of the heap including values of nodes plus their level distance from the root; Empty nodes appear as # """ # Find top root top_root = self.bottom_root while top_root.parent: top_root = top_root.parent # preorder heap_preOrder = [] self.__traversal(top_root, heap_preOrder) return heap_preOrder def __traversal(self, curr_node, preorder, level=0): """ Pre-order traversal of nodes """ if curr_node: preorder.append((curr_node.val, level)) self.__traversal(curr_node.left, preorder, level + 1) self.__traversal(curr_node.right, preorder, level + 1) else: preorder.append(("#", level)) def __str__(self): """ Overwriting str for a pre-order print of nodes in heap; Performance is poor, so use only for small examples """ if self.isEmpty(): return "" preorder_heap = self.preOrder() return "\n".join(("-" * level + str(value)) for value, level in preorder_heap) # Unit Tests if __name__ == "__main__": import doctest doctest.testmod()
""" Binomial Heap Reference: Advanced Data Structures, Peter Brass """ class Node: """ Node in a doubly-linked binomial tree, containing: - value - size of left subtree - link to left, right and parent nodes """ def __init__(self, val): self.val = val # Number of nodes in left subtree self.left_tree_size = 0 self.left = None self.right = None self.parent = None def merge_trees(self, other): """ In-place merge of two binomial trees of equal size. Returns the root of the resulting tree """ assert self.left_tree_size == other.left_tree_size, "Unequal Sizes of Blocks" if self.val < other.val: other.left = self.right other.parent = None if self.right: self.right.parent = other self.right = other self.left_tree_size = self.left_tree_size * 2 + 1 return self else: self.left = other.right self.parent = None if other.right: other.right.parent = self other.right = self other.left_tree_size = other.left_tree_size * 2 + 1 return other class BinomialHeap: r""" Min-oriented priority queue implemented with the Binomial Heap data structure implemented with the BinomialHeap class. It supports: - Insert element in a heap with n elements: Guaranteed logn, amoratized 1 - Merge (meld) heaps of size m and n: O(logn + logm) - Delete Min: O(logn) - Peek (return min without deleting it): O(1) Example: Create a random permutation of 30 integers to be inserted and 19 of them deleted >>> import numpy as np >>> permutation = np.random.permutation(list(range(30))) Create a Heap and insert the 30 integers __init__() test >>> first_heap = BinomialHeap() 30 inserts - insert() test >>> for number in permutation: ... first_heap.insert(number) Size test >>> first_heap.size 30 Deleting - delete() test >>> [first_heap.delete_min() for _ in range(20)] [0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19] Create a new Heap >>> second_heap = BinomialHeap() >>> vals = [17, 20, 31, 34] >>> for value in vals: ... second_heap.insert(value) The heap should have the following structure: 17 / \ # 31 / \ 20 34 / \ / \ # # # # preOrder() test >>> " ".join(str(x) for x in second_heap.pre_order()) "(17, 0) ('#', 1) (31, 1) (20, 2) ('#', 3) ('#', 3) (34, 2) ('#', 3) ('#', 3)" printing Heap - __str__() test >>> print(second_heap) 17 -# -31 --20 ---# ---# --34 ---# ---# mergeHeaps() test >>> >>> merged = second_heap.merge_heaps(first_heap) >>> merged.peek() 17 values in merged heap; (merge is inplace) >>> results = [] >>> while not first_heap.is_empty(): ... results.append(first_heap.delete_min()) >>> results [17, 20, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 31, 34] """ def __init__(self, bottom_root=None, min_node=None, heap_size=0): self.size = heap_size self.bottom_root = bottom_root self.min_node = min_node def merge_heaps(self, other): """ In-place merge of two binomial heaps. 
Both of them become the resulting merged heap """ # Empty heaps corner cases if other.size == 0: return if self.size == 0: self.size = other.size self.bottom_root = other.bottom_root self.min_node = other.min_node return # Update size self.size = self.size + other.size # Update min.node if self.min_node.val > other.min_node.val: self.min_node = other.min_node # Merge # Order roots by left_subtree_size combined_roots_list = [] i, j = self.bottom_root, other.bottom_root while i or j: if i and ((not j) or i.left_tree_size < j.left_tree_size): combined_roots_list.append((i, True)) i = i.parent else: combined_roots_list.append((j, False)) j = j.parent # Insert links between them for i in range(len(combined_roots_list) - 1): if combined_roots_list[i][1] != combined_roots_list[i + 1][1]: combined_roots_list[i][0].parent = combined_roots_list[i + 1][0] combined_roots_list[i + 1][0].left = combined_roots_list[i][0] # Consecutively merge roots with same left_tree_size i = combined_roots_list[0][0] while i.parent: if ( (i.left_tree_size == i.parent.left_tree_size) and (not i.parent.parent) ) or ( i.left_tree_size == i.parent.left_tree_size and i.left_tree_size != i.parent.parent.left_tree_size ): # Neighbouring Nodes previous_node = i.left next_node = i.parent.parent # Merging trees i = i.merge_trees(i.parent) # Updating links i.left = previous_node i.parent = next_node if previous_node: previous_node.parent = i if next_node: next_node.left = i else: i = i.parent # Updating self.bottom_root while i.left: i = i.left self.bottom_root = i # Update other other.size = self.size other.bottom_root = self.bottom_root other.min_node = self.min_node # Return the merged heap return self def insert(self, val): """ insert a value in the heap """ if self.size == 0: self.bottom_root = Node(val) self.size = 1 self.min_node = self.bottom_root else: # Create new node new_node = Node(val) # Update size self.size += 1 # update min_node if val < self.min_node.val: self.min_node = new_node # Put new_node as a bottom_root in heap self.bottom_root.left = new_node new_node.parent = self.bottom_root self.bottom_root = new_node # Consecutively merge roots with same left_tree_size while ( self.bottom_root.parent and self.bottom_root.left_tree_size == self.bottom_root.parent.left_tree_size ): # Next node next_node = self.bottom_root.parent.parent # Merge self.bottom_root = self.bottom_root.merge_trees(self.bottom_root.parent) # Update Links self.bottom_root.parent = next_node self.bottom_root.left = None if next_node: next_node.left = self.bottom_root def peek(self): """ return min element without deleting it """ return self.min_node.val def is_empty(self): return self.size == 0 def delete_min(self): """ delete min element and return it """ # assert not self.isEmpty(), "Empty Heap" # Save minimal value min_value = self.min_node.val # Last element in heap corner case if self.size == 1: # Update size self.size = 0 # Update bottom root self.bottom_root = None # Update min_node self.min_node = None return min_value # No right subtree corner case # The structure of the tree implies that this should be the bottom root # and there is at least one other root if self.min_node.right is None: # Update size self.size -= 1 # Update bottom root self.bottom_root = self.bottom_root.parent self.bottom_root.left = None # Update min_node self.min_node = self.bottom_root i = self.bottom_root.parent while i: if i.val < self.min_node.val: self.min_node = i i = i.parent return min_value # General case # Find the BinomialHeap of the right subtree of 
min_node bottom_of_new = self.min_node.right bottom_of_new.parent = None min_of_new = bottom_of_new size_of_new = 1 # Size, min_node and bottom_root while bottom_of_new.left: size_of_new = size_of_new * 2 + 1 bottom_of_new = bottom_of_new.left if bottom_of_new.val < min_of_new.val: min_of_new = bottom_of_new # Corner case of single root on top left path if (not self.min_node.left) and (not self.min_node.parent): self.size = size_of_new self.bottom_root = bottom_of_new self.min_node = min_of_new # print("Single root, multiple nodes case") return min_value # Remaining cases # Construct heap of right subtree new_heap = BinomialHeap( bottom_root=bottom_of_new, min_node=min_of_new, heap_size=size_of_new ) # Update size self.size = self.size - 1 - size_of_new # Neighbour nodes previous_node = self.min_node.left next_node = self.min_node.parent # Initialize new bottom_root and min_node self.min_node = previous_node or next_node self.bottom_root = next_node # Update links of previous_node and search below for new min_node and # bottom_root if previous_node: previous_node.parent = next_node # Update bottom_root and search for min_node below self.bottom_root = previous_node self.min_node = previous_node while self.bottom_root.left: self.bottom_root = self.bottom_root.left if self.bottom_root.val < self.min_node.val: self.min_node = self.bottom_root if next_node: next_node.left = previous_node # Search for new min_node above min_node i = next_node while i: if i.val < self.min_node.val: self.min_node = i i = i.parent # Merge heaps self.merge_heaps(new_heap) return min_value def pre_order(self): """ Returns the Pre-order representation of the heap including values of nodes plus their level distance from the root; Empty nodes appear as # """ # Find top root top_root = self.bottom_root while top_root.parent: top_root = top_root.parent # preorder heap_pre_order = [] self.__traversal(top_root, heap_pre_order) return heap_pre_order def __traversal(self, curr_node, preorder, level=0): """ Pre-order traversal of nodes """ if curr_node: preorder.append((curr_node.val, level)) self.__traversal(curr_node.left, preorder, level + 1) self.__traversal(curr_node.right, preorder, level + 1) else: preorder.append(("#", level)) def __str__(self): """ Overwriting str for a pre-order print of nodes in heap; Performance is poor, so use only for small examples """ if self.is_empty(): return "" preorder_heap = self.pre_order() return "\n".join(("-" * level + str(value)) for value, level in preorder_heap) # Unit Tests if __name__ == "__main__": import doctest doctest.testmod()
1
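A note on the change recorded in the row above: the PR removes file-level flake8 suppressions, i.e. the `# flake8: noqa` marker that opens the before_content field of this row (and of several rows that follow), in favour of targeted fixes plus a per-line exemption for one long docstring line. A minimal, purely illustrative sketch of the two styles (the module contents here are hypothetical and not taken from the repository):

    # Before: one file-level marker silences every flake8 warning in the whole module.
    # flake8: noqa
    """Module docstring containing a line that is far longer than the configured maximum ..."""

    # After: the file-level marker is removed, and only the deliberately long line is
    # exempted from the line-length check (E501); all other warnings are reported again.
    """Module docstring containing a line that is far longer than the configured maximum ..."""  # noqa: E501

Flake8 applies a `# noqa: E501` comment to that physical line only, which is what the "Suppress long line in docstring" item in the description refers to.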
TheAlgorithms/Python
7,844
Remove file-level flake8 suppression
### Describe your change: Some files were completely excluded from flake8 checks. - Remove file-level suppressions - Fix simple warnings - Suppress long line in docstring * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
Cjkjvfnby
"2022-10-29T13:08:44Z"
"2022-10-29T20:45:21Z"
47ddba1d914bf5955a244056e794e718dee9ead1
1550731cb7457ddae216da2ffe0bc1587f5234f3
Remove file-level flake8 suppression. ### Describe your change: Some files were completely excluded from flake8 checks. - Remove file-level suppressions - Fix simple warnings - Suppress long line in docstring * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
# flake8: noqa """The following implementation assumes that the activities are already sorted according to their finish time""" """Prints a maximum set of activities that can be done by a single person, one at a time""" # n --> Total number of activities # start[]--> An array that contains start time of all activities # finish[] --> An array that contains finish time of all activities def printMaxActivities(start: list[int], finish: list[int]) -> None: """ >>> start = [1, 3, 0, 5, 8, 5] >>> finish = [2, 4, 6, 7, 9, 9] >>> printMaxActivities(start, finish) The following activities are selected: 0,1,3,4, """ n = len(finish) print("The following activities are selected:") # The first activity is always selected i = 0 print(i, end=",") # Consider rest of the activities for j in range(n): # If this activity has start time greater than # or equal to the finish time of previously # selected activity, then select it if start[j] >= finish[i]: print(j, end=",") i = j if __name__ == "__main__": import doctest doctest.testmod() start = [1, 3, 0, 5, 8, 5] finish = [2, 4, 6, 7, 9, 9] printMaxActivities(start, finish)
"""The following implementation assumes that the activities are already sorted according to their finish time""" """Prints a maximum set of activities that can be done by a single person, one at a time""" # n --> Total number of activities # start[]--> An array that contains start time of all activities # finish[] --> An array that contains finish time of all activities def print_max_activities(start: list[int], finish: list[int]) -> None: """ >>> start = [1, 3, 0, 5, 8, 5] >>> finish = [2, 4, 6, 7, 9, 9] >>> print_max_activities(start, finish) The following activities are selected: 0,1,3,4, """ n = len(finish) print("The following activities are selected:") # The first activity is always selected i = 0 print(i, end=",") # Consider rest of the activities for j in range(n): # If this activity has start time greater than # or equal to the finish time of previously # selected activity, then select it if start[j] >= finish[i]: print(j, end=",") i = j if __name__ == "__main__": import doctest doctest.testmod() start = [1, 3, 0, 5, 8, 5] finish = [2, 4, 6, 7, 9, 9] print_max_activities(start, finish)
1
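The activity-selection row above only renames printMaxActivities to print_max_activities; the greedy rule itself is unchanged: with activities already sorted by finish time, activity j is taken whenever start[j] is at least the finish time of the last activity taken. As a small sketch under the same assumption (the function name max_activities and the list-returning behaviour are mine, not part of the repository file), the selection can also be returned instead of printed:

    def max_activities(start: list[int], finish: list[int]) -> list[int]:
        # Assumes activities are already sorted by finish time, as in the row above.
        selected = [0]  # the first activity is always selected
        last = 0
        for j in range(1, len(finish)):
            if start[j] >= finish[last]:  # starts no earlier than the last pick ends
                selected.append(j)
                last = j
        return selected

    # max_activities([1, 3, 0, 5, 8, 5], [2, 4, 6, 7, 9, 9]) returns [0, 1, 3, 4],
    # matching the "0,1,3,4," printed by the doctest in the row above.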
TheAlgorithms/Python
7,844
Remove file-level flake8 suppression
### Describe your change: Some files were completely excluded from flake8 checks. - Remove file-level suppressions - Fix simple warnings - Suppress long line in docstring * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
Cjkjvfnby
"2022-10-29T13:08:44Z"
"2022-10-29T20:45:21Z"
47ddba1d914bf5955a244056e794e718dee9ead1
1550731cb7457ddae216da2ffe0bc1587f5234f3
Remove file-level flake8 suppression. ### Describe your change: Some files were completely excluded from flake8 checks. - Remove file-level suppressions - Fix simple warnings - Suppress long line in docstring * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
# flake8: noqa """ This is pure Python implementation of tree traversal algorithms """ from __future__ import annotations import queue class TreeNode: def __init__(self, data): self.data = data self.right = None self.left = None def build_tree(): print("\n********Press N to stop entering at any point of time********\n") check = input("Enter the value of the root node: ").strip().lower() or "n" if check == "n": return None q: queue.Queue = queue.Queue() tree_node = TreeNode(int(check)) q.put(tree_node) while not q.empty(): node_found = q.get() msg = f"Enter the left node of {node_found.data}: " check = input(msg).strip().lower() or "n" if check == "n": return tree_node left_node = TreeNode(int(check)) node_found.left = left_node q.put(left_node) msg = f"Enter the right node of {node_found.data}: " check = input(msg).strip().lower() or "n" if check == "n": return tree_node right_node = TreeNode(int(check)) node_found.right = right_node q.put(right_node) def pre_order(node: TreeNode) -> None: """ >>> root = TreeNode(1) >>> tree_node2 = TreeNode(2) >>> tree_node3 = TreeNode(3) >>> tree_node4 = TreeNode(4) >>> tree_node5 = TreeNode(5) >>> tree_node6 = TreeNode(6) >>> tree_node7 = TreeNode(7) >>> root.left, root.right = tree_node2, tree_node3 >>> tree_node2.left, tree_node2.right = tree_node4 , tree_node5 >>> tree_node3.left, tree_node3.right = tree_node6 , tree_node7 >>> pre_order(root) 1,2,4,5,3,6,7, """ if not isinstance(node, TreeNode) or not node: return print(node.data, end=",") pre_order(node.left) pre_order(node.right) def in_order(node: TreeNode) -> None: """ >>> root = TreeNode(1) >>> tree_node2 = TreeNode(2) >>> tree_node3 = TreeNode(3) >>> tree_node4 = TreeNode(4) >>> tree_node5 = TreeNode(5) >>> tree_node6 = TreeNode(6) >>> tree_node7 = TreeNode(7) >>> root.left, root.right = tree_node2, tree_node3 >>> tree_node2.left, tree_node2.right = tree_node4 , tree_node5 >>> tree_node3.left, tree_node3.right = tree_node6 , tree_node7 >>> in_order(root) 4,2,5,1,6,3,7, """ if not isinstance(node, TreeNode) or not node: return in_order(node.left) print(node.data, end=",") in_order(node.right) def post_order(node: TreeNode) -> None: """ >>> root = TreeNode(1) >>> tree_node2 = TreeNode(2) >>> tree_node3 = TreeNode(3) >>> tree_node4 = TreeNode(4) >>> tree_node5 = TreeNode(5) >>> tree_node6 = TreeNode(6) >>> tree_node7 = TreeNode(7) >>> root.left, root.right = tree_node2, tree_node3 >>> tree_node2.left, tree_node2.right = tree_node4 , tree_node5 >>> tree_node3.left, tree_node3.right = tree_node6 , tree_node7 >>> post_order(root) 4,5,2,6,7,3,1, """ if not isinstance(node, TreeNode) or not node: return post_order(node.left) post_order(node.right) print(node.data, end=",") def level_order(node: TreeNode) -> None: """ >>> root = TreeNode(1) >>> tree_node2 = TreeNode(2) >>> tree_node3 = TreeNode(3) >>> tree_node4 = TreeNode(4) >>> tree_node5 = TreeNode(5) >>> tree_node6 = TreeNode(6) >>> tree_node7 = TreeNode(7) >>> root.left, root.right = tree_node2, tree_node3 >>> tree_node2.left, tree_node2.right = tree_node4 , tree_node5 >>> tree_node3.left, tree_node3.right = tree_node6 , tree_node7 >>> level_order(root) 1,2,3,4,5,6,7, """ if not isinstance(node, TreeNode) or not node: return q: queue.Queue = queue.Queue() q.put(node) while not q.empty(): node_dequeued = q.get() print(node_dequeued.data, end=",") if node_dequeued.left: q.put(node_dequeued.left) if node_dequeued.right: q.put(node_dequeued.right) def level_order_actual(node: TreeNode) -> None: """ >>> root = TreeNode(1) >>> tree_node2 = TreeNode(2) 
>>> tree_node3 = TreeNode(3) >>> tree_node4 = TreeNode(4) >>> tree_node5 = TreeNode(5) >>> tree_node6 = TreeNode(6) >>> tree_node7 = TreeNode(7) >>> root.left, root.right = tree_node2, tree_node3 >>> tree_node2.left, tree_node2.right = tree_node4 , tree_node5 >>> tree_node3.left, tree_node3.right = tree_node6 , tree_node7 >>> level_order_actual(root) 1, 2,3, 4,5,6,7, """ if not isinstance(node, TreeNode) or not node: return q: queue.Queue = queue.Queue() q.put(node) while not q.empty(): list = [] while not q.empty(): node_dequeued = q.get() print(node_dequeued.data, end=",") if node_dequeued.left: list.append(node_dequeued.left) if node_dequeued.right: list.append(node_dequeued.right) print() for node in list: q.put(node) # iteration version def pre_order_iter(node: TreeNode) -> None: """ >>> root = TreeNode(1) >>> tree_node2 = TreeNode(2) >>> tree_node3 = TreeNode(3) >>> tree_node4 = TreeNode(4) >>> tree_node5 = TreeNode(5) >>> tree_node6 = TreeNode(6) >>> tree_node7 = TreeNode(7) >>> root.left, root.right = tree_node2, tree_node3 >>> tree_node2.left, tree_node2.right = tree_node4 , tree_node5 >>> tree_node3.left, tree_node3.right = tree_node6 , tree_node7 >>> pre_order_iter(root) 1,2,4,5,3,6,7, """ if not isinstance(node, TreeNode) or not node: return stack: list[TreeNode] = [] n = node while n or stack: while n: # start from root node, find its left child print(n.data, end=",") stack.append(n) n = n.left # end of while means current node doesn't have left child n = stack.pop() # start to traverse its right child n = n.right def in_order_iter(node: TreeNode) -> None: """ >>> root = TreeNode(1) >>> tree_node2 = TreeNode(2) >>> tree_node3 = TreeNode(3) >>> tree_node4 = TreeNode(4) >>> tree_node5 = TreeNode(5) >>> tree_node6 = TreeNode(6) >>> tree_node7 = TreeNode(7) >>> root.left, root.right = tree_node2, tree_node3 >>> tree_node2.left, tree_node2.right = tree_node4 , tree_node5 >>> tree_node3.left, tree_node3.right = tree_node6 , tree_node7 >>> in_order_iter(root) 4,2,5,1,6,3,7, """ if not isinstance(node, TreeNode) or not node: return stack: list[TreeNode] = [] n = node while n or stack: while n: stack.append(n) n = n.left n = stack.pop() print(n.data, end=",") n = n.right def post_order_iter(node: TreeNode) -> None: """ >>> root = TreeNode(1) >>> tree_node2 = TreeNode(2) >>> tree_node3 = TreeNode(3) >>> tree_node4 = TreeNode(4) >>> tree_node5 = TreeNode(5) >>> tree_node6 = TreeNode(6) >>> tree_node7 = TreeNode(7) >>> root.left, root.right = tree_node2, tree_node3 >>> tree_node2.left, tree_node2.right = tree_node4 , tree_node5 >>> tree_node3.left, tree_node3.right = tree_node6 , tree_node7 >>> post_order_iter(root) 4,5,2,6,7,3,1, """ if not isinstance(node, TreeNode) or not node: return stack1, stack2 = [], [] n = node stack1.append(n) while stack1: # to find the reversed order of post order, store it in stack2 n = stack1.pop() if n.left: stack1.append(n.left) if n.right: stack1.append(n.right) stack2.append(n) while stack2: # pop up from stack2 will be the post order print(stack2.pop().data, end=",") def prompt(s: str = "", width=50, char="*") -> str: if not s: return "\n" + width * char left, extra = divmod(width - len(s) - 2, 2) return f"{left * char} {s} {(left + extra) * char}" if __name__ == "__main__": import doctest doctest.testmod() print(prompt("Binary Tree Traversals")) node = build_tree() print(prompt("Pre Order Traversal")) pre_order(node) print(prompt() + "\n") print(prompt("In Order Traversal")) in_order(node) print(prompt() + "\n") print(prompt("Post Order Traversal")) 
post_order(node) print(prompt() + "\n") print(prompt("Level Order Traversal")) level_order(node) print(prompt() + "\n") print(prompt("Actual Level Order Traversal")) level_order_actual(node) print("*" * 50 + "\n") print(prompt("Pre Order Traversal - Iteration Version")) pre_order_iter(node) print(prompt() + "\n") print(prompt("In Order Traversal - Iteration Version")) in_order_iter(node) print(prompt() + "\n") print(prompt("Post Order Traversal - Iteration Version")) post_order_iter(node) print(prompt())
""" This is pure Python implementation of tree traversal algorithms """ from __future__ import annotations import queue class TreeNode: def __init__(self, data): self.data = data self.right = None self.left = None def build_tree(): print("\n********Press N to stop entering at any point of time********\n") check = input("Enter the value of the root node: ").strip().lower() or "n" if check == "n": return None q: queue.Queue = queue.Queue() tree_node = TreeNode(int(check)) q.put(tree_node) while not q.empty(): node_found = q.get() msg = f"Enter the left node of {node_found.data}: " check = input(msg).strip().lower() or "n" if check == "n": return tree_node left_node = TreeNode(int(check)) node_found.left = left_node q.put(left_node) msg = f"Enter the right node of {node_found.data}: " check = input(msg).strip().lower() or "n" if check == "n": return tree_node right_node = TreeNode(int(check)) node_found.right = right_node q.put(right_node) def pre_order(node: TreeNode) -> None: """ >>> root = TreeNode(1) >>> tree_node2 = TreeNode(2) >>> tree_node3 = TreeNode(3) >>> tree_node4 = TreeNode(4) >>> tree_node5 = TreeNode(5) >>> tree_node6 = TreeNode(6) >>> tree_node7 = TreeNode(7) >>> root.left, root.right = tree_node2, tree_node3 >>> tree_node2.left, tree_node2.right = tree_node4 , tree_node5 >>> tree_node3.left, tree_node3.right = tree_node6 , tree_node7 >>> pre_order(root) 1,2,4,5,3,6,7, """ if not isinstance(node, TreeNode) or not node: return print(node.data, end=",") pre_order(node.left) pre_order(node.right) def in_order(node: TreeNode) -> None: """ >>> root = TreeNode(1) >>> tree_node2 = TreeNode(2) >>> tree_node3 = TreeNode(3) >>> tree_node4 = TreeNode(4) >>> tree_node5 = TreeNode(5) >>> tree_node6 = TreeNode(6) >>> tree_node7 = TreeNode(7) >>> root.left, root.right = tree_node2, tree_node3 >>> tree_node2.left, tree_node2.right = tree_node4 , tree_node5 >>> tree_node3.left, tree_node3.right = tree_node6 , tree_node7 >>> in_order(root) 4,2,5,1,6,3,7, """ if not isinstance(node, TreeNode) or not node: return in_order(node.left) print(node.data, end=",") in_order(node.right) def post_order(node: TreeNode) -> None: """ >>> root = TreeNode(1) >>> tree_node2 = TreeNode(2) >>> tree_node3 = TreeNode(3) >>> tree_node4 = TreeNode(4) >>> tree_node5 = TreeNode(5) >>> tree_node6 = TreeNode(6) >>> tree_node7 = TreeNode(7) >>> root.left, root.right = tree_node2, tree_node3 >>> tree_node2.left, tree_node2.right = tree_node4 , tree_node5 >>> tree_node3.left, tree_node3.right = tree_node6 , tree_node7 >>> post_order(root) 4,5,2,6,7,3,1, """ if not isinstance(node, TreeNode) or not node: return post_order(node.left) post_order(node.right) print(node.data, end=",") def level_order(node: TreeNode) -> None: """ >>> root = TreeNode(1) >>> tree_node2 = TreeNode(2) >>> tree_node3 = TreeNode(3) >>> tree_node4 = TreeNode(4) >>> tree_node5 = TreeNode(5) >>> tree_node6 = TreeNode(6) >>> tree_node7 = TreeNode(7) >>> root.left, root.right = tree_node2, tree_node3 >>> tree_node2.left, tree_node2.right = tree_node4 , tree_node5 >>> tree_node3.left, tree_node3.right = tree_node6 , tree_node7 >>> level_order(root) 1,2,3,4,5,6,7, """ if not isinstance(node, TreeNode) or not node: return q: queue.Queue = queue.Queue() q.put(node) while not q.empty(): node_dequeued = q.get() print(node_dequeued.data, end=",") if node_dequeued.left: q.put(node_dequeued.left) if node_dequeued.right: q.put(node_dequeued.right) def level_order_actual(node: TreeNode) -> None: """ >>> root = TreeNode(1) >>> tree_node2 = TreeNode(2) >>> tree_node3 = 
TreeNode(3) >>> tree_node4 = TreeNode(4) >>> tree_node5 = TreeNode(5) >>> tree_node6 = TreeNode(6) >>> tree_node7 = TreeNode(7) >>> root.left, root.right = tree_node2, tree_node3 >>> tree_node2.left, tree_node2.right = tree_node4 , tree_node5 >>> tree_node3.left, tree_node3.right = tree_node6 , tree_node7 >>> level_order_actual(root) 1, 2,3, 4,5,6,7, """ if not isinstance(node, TreeNode) or not node: return q: queue.Queue = queue.Queue() q.put(node) while not q.empty(): list_ = [] while not q.empty(): node_dequeued = q.get() print(node_dequeued.data, end=",") if node_dequeued.left: list_.append(node_dequeued.left) if node_dequeued.right: list_.append(node_dequeued.right) print() for node in list_: q.put(node) # iteration version def pre_order_iter(node: TreeNode) -> None: """ >>> root = TreeNode(1) >>> tree_node2 = TreeNode(2) >>> tree_node3 = TreeNode(3) >>> tree_node4 = TreeNode(4) >>> tree_node5 = TreeNode(5) >>> tree_node6 = TreeNode(6) >>> tree_node7 = TreeNode(7) >>> root.left, root.right = tree_node2, tree_node3 >>> tree_node2.left, tree_node2.right = tree_node4 , tree_node5 >>> tree_node3.left, tree_node3.right = tree_node6 , tree_node7 >>> pre_order_iter(root) 1,2,4,5,3,6,7, """ if not isinstance(node, TreeNode) or not node: return stack: list[TreeNode] = [] n = node while n or stack: while n: # start from root node, find its left child print(n.data, end=",") stack.append(n) n = n.left # end of while means current node doesn't have left child n = stack.pop() # start to traverse its right child n = n.right def in_order_iter(node: TreeNode) -> None: """ >>> root = TreeNode(1) >>> tree_node2 = TreeNode(2) >>> tree_node3 = TreeNode(3) >>> tree_node4 = TreeNode(4) >>> tree_node5 = TreeNode(5) >>> tree_node6 = TreeNode(6) >>> tree_node7 = TreeNode(7) >>> root.left, root.right = tree_node2, tree_node3 >>> tree_node2.left, tree_node2.right = tree_node4 , tree_node5 >>> tree_node3.left, tree_node3.right = tree_node6 , tree_node7 >>> in_order_iter(root) 4,2,5,1,6,3,7, """ if not isinstance(node, TreeNode) or not node: return stack: list[TreeNode] = [] n = node while n or stack: while n: stack.append(n) n = n.left n = stack.pop() print(n.data, end=",") n = n.right def post_order_iter(node: TreeNode) -> None: """ >>> root = TreeNode(1) >>> tree_node2 = TreeNode(2) >>> tree_node3 = TreeNode(3) >>> tree_node4 = TreeNode(4) >>> tree_node5 = TreeNode(5) >>> tree_node6 = TreeNode(6) >>> tree_node7 = TreeNode(7) >>> root.left, root.right = tree_node2, tree_node3 >>> tree_node2.left, tree_node2.right = tree_node4 , tree_node5 >>> tree_node3.left, tree_node3.right = tree_node6 , tree_node7 >>> post_order_iter(root) 4,5,2,6,7,3,1, """ if not isinstance(node, TreeNode) or not node: return stack1, stack2 = [], [] n = node stack1.append(n) while stack1: # to find the reversed order of post order, store it in stack2 n = stack1.pop() if n.left: stack1.append(n.left) if n.right: stack1.append(n.right) stack2.append(n) while stack2: # pop up from stack2 will be the post order print(stack2.pop().data, end=",") def prompt(s: str = "", width=50, char="*") -> str: if not s: return "\n" + width * char left, extra = divmod(width - len(s) - 2, 2) return f"{left * char} {s} {(left + extra) * char}" if __name__ == "__main__": import doctest doctest.testmod() print(prompt("Binary Tree Traversals")) node = build_tree() print(prompt("Pre Order Traversal")) pre_order(node) print(prompt() + "\n") print(prompt("In Order Traversal")) in_order(node) print(prompt() + "\n") print(prompt("Post Order Traversal")) 
post_order(node) print(prompt() + "\n") print(prompt("Level Order Traversal")) level_order(node) print(prompt() + "\n") print(prompt("Actual Level Order Traversal")) level_order_actual(node) print("*" * 50 + "\n") print(prompt("Pre Order Traversal - Iteration Version")) pre_order_iter(node) print(prompt() + "\n") print(prompt("In Order Traversal - Iteration Version")) in_order_iter(node) print(prompt() + "\n") print(prompt("Post Order Traversal - Iteration Version")) post_order_iter(node) print(prompt())
1
TheAlgorithms/Python
7,844
Remove file-level flake8 suppression
### Describe your change: Some files were completely excluded from flake8 checks. - Remove file-level suppressions - Fix simple warnings - Suppress long line in docstring * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
Cjkjvfnby
"2022-10-29T13:08:44Z"
"2022-10-29T20:45:21Z"
47ddba1d914bf5955a244056e794e718dee9ead1
1550731cb7457ddae216da2ffe0bc1587f5234f3
Remove file-level flake8 suppression. ### Describe your change: Some files were completely excluded from flake8 checks. - Remove file-level suppressions - Fix simple warnings - Suppress long line in docstring * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
""" Program to encode and decode Baconian or Bacon's Cipher Wikipedia reference : https://en.wikipedia.org/wiki/Bacon%27s_cipher """ encode_dict = { "a": "AAAAA", "b": "AAAAB", "c": "AAABA", "d": "AAABB", "e": "AABAA", "f": "AABAB", "g": "AABBA", "h": "AABBB", "i": "ABAAA", "j": "BBBAA", "k": "ABAAB", "l": "ABABA", "m": "ABABB", "n": "ABBAA", "o": "ABBAB", "p": "ABBBA", "q": "ABBBB", "r": "BAAAA", "s": "BAAAB", "t": "BAABA", "u": "BAABB", "v": "BBBAB", "w": "BABAA", "x": "BABAB", "y": "BABBA", "z": "BABBB", " ": " ", } decode_dict = {value: key for key, value in encode_dict.items()} def encode(word: str) -> str: """ Encodes to Baconian cipher >>> encode("hello") 'AABBBAABAAABABAABABAABBAB' >>> encode("hello world") 'AABBBAABAAABABAABABAABBAB BABAAABBABBAAAAABABAAAABB' >>> encode("hello world!") Traceback (most recent call last): ... Exception: encode() accepts only letters of the alphabet and spaces """ encoded = "" for letter in word.lower(): if letter.isalpha() or letter == " ": encoded += encode_dict[letter] else: raise Exception("encode() accepts only letters of the alphabet and spaces") return encoded def decode(coded: str) -> str: """ Decodes from Baconian cipher >>> decode("AABBBAABAAABABAABABAABBAB BABAAABBABBAAAAABABAAAABB") 'hello world' >>> decode("AABBBAABAAABABAABABAABBAB") 'hello' >>> decode("AABBBAABAAABABAABABAABBAB BABAAABBABBAAAAABABAAAABB!") Traceback (most recent call last): ... Exception: decode() accepts only 'A', 'B' and spaces """ if set(coded) - {"A", "B", " "} != set(): raise Exception("decode() accepts only 'A', 'B' and spaces") decoded = "" for word in coded.split(): while len(word) != 0: decoded += decode_dict[word[:5]] word = word[5:] decoded += " " return decoded.strip() if __name__ == "__main__": from doctest import testmod testmod()
""" Program to encode and decode Baconian or Bacon's Cipher Wikipedia reference : https://en.wikipedia.org/wiki/Bacon%27s_cipher """ encode_dict = { "a": "AAAAA", "b": "AAAAB", "c": "AAABA", "d": "AAABB", "e": "AABAA", "f": "AABAB", "g": "AABBA", "h": "AABBB", "i": "ABAAA", "j": "BBBAA", "k": "ABAAB", "l": "ABABA", "m": "ABABB", "n": "ABBAA", "o": "ABBAB", "p": "ABBBA", "q": "ABBBB", "r": "BAAAA", "s": "BAAAB", "t": "BAABA", "u": "BAABB", "v": "BBBAB", "w": "BABAA", "x": "BABAB", "y": "BABBA", "z": "BABBB", " ": " ", } decode_dict = {value: key for key, value in encode_dict.items()} def encode(word: str) -> str: """ Encodes to Baconian cipher >>> encode("hello") 'AABBBAABAAABABAABABAABBAB' >>> encode("hello world") 'AABBBAABAAABABAABABAABBAB BABAAABBABBAAAAABABAAAABB' >>> encode("hello world!") Traceback (most recent call last): ... Exception: encode() accepts only letters of the alphabet and spaces """ encoded = "" for letter in word.lower(): if letter.isalpha() or letter == " ": encoded += encode_dict[letter] else: raise Exception("encode() accepts only letters of the alphabet and spaces") return encoded def decode(coded: str) -> str: """ Decodes from Baconian cipher >>> decode("AABBBAABAAABABAABABAABBAB BABAAABBABBAAAAABABAAAABB") 'hello world' >>> decode("AABBBAABAAABABAABABAABBAB") 'hello' >>> decode("AABBBAABAAABABAABABAABBAB BABAAABBABBAAAAABABAAAABB!") Traceback (most recent call last): ... Exception: decode() accepts only 'A', 'B' and spaces """ if set(coded) - {"A", "B", " "} != set(): raise Exception("decode() accepts only 'A', 'B' and spaces") decoded = "" for word in coded.split(): while len(word) != 0: decoded += decode_dict[word[:5]] word = word[5:] decoded += " " return decoded.strip() if __name__ == "__main__": from doctest import testmod testmod()
-1
TheAlgorithms/Python
7,844
Remove file-level flake8 suppression
### Describe your change: Some files were completely excluded from flake8 checks. - Remove file-level suppressions - Fix simple warnings - Suppress long line in docstring * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
Cjkjvfnby
"2022-10-29T13:08:44Z"
"2022-10-29T20:45:21Z"
47ddba1d914bf5955a244056e794e718dee9ead1
1550731cb7457ddae216da2ffe0bc1587f5234f3
Remove file-level flake8 suppression. ### Describe your change: Some files were completely excluded from flake8 checks. - Remove file-level suppressions - Fix simple warnings - Suppress long line indocstring * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
#!/usr/bin/python """ Author: OMKAR PATHAK """ class Graph: def __init__(self): self.vertex = {} # for printing the Graph vertices def print_graph(self) -> None: print(self.vertex) for i in self.vertex: print(i, " -> ", " -> ".join([str(j) for j in self.vertex[i]])) # for adding the edge between two vertices def add_edge(self, from_vertex: int, to_vertex: int) -> None: # check if vertex is already present, if from_vertex in self.vertex: self.vertex[from_vertex].append(to_vertex) else: # else make a new vertex self.vertex[from_vertex] = [to_vertex] def dfs(self) -> None: # visited array for storing already visited nodes visited = [False] * len(self.vertex) # call the recursive helper function for i in range(len(self.vertex)): if not visited[i]: self.dfs_recursive(i, visited) def dfs_recursive(self, start_vertex: int, visited: list) -> None: # mark start vertex as visited visited[start_vertex] = True print(start_vertex, end=" ") # Recur for all the vertices that are adjacent to this node for i in self.vertex: if not visited[i]: self.dfs_recursive(i, visited) if __name__ == "__main__": g = Graph() g.add_edge(0, 1) g.add_edge(0, 2) g.add_edge(1, 2) g.add_edge(2, 0) g.add_edge(2, 3) g.add_edge(3, 3) g.print_graph() print("DFS:") g.dfs() # OUTPUT: # 0 -> 1 -> 2 # 1 -> 2 # 2 -> 0 -> 3 # 3 -> 3 # DFS: # 0 1 2 3
#!/usr/bin/python """ Author: OMKAR PATHAK """ class Graph: def __init__(self): self.vertex = {} # for printing the Graph vertices def print_graph(self) -> None: print(self.vertex) for i in self.vertex: print(i, " -> ", " -> ".join([str(j) for j in self.vertex[i]])) # for adding the edge between two vertices def add_edge(self, from_vertex: int, to_vertex: int) -> None: # check if vertex is already present, if from_vertex in self.vertex: self.vertex[from_vertex].append(to_vertex) else: # else make a new vertex self.vertex[from_vertex] = [to_vertex] def dfs(self) -> None: # visited array for storing already visited nodes visited = [False] * len(self.vertex) # call the recursive helper function for i in range(len(self.vertex)): if not visited[i]: self.dfs_recursive(i, visited) def dfs_recursive(self, start_vertex: int, visited: list) -> None: # mark start vertex as visited visited[start_vertex] = True print(start_vertex, end=" ") # Recur for all the vertices that are adjacent to this node for i in self.vertex: if not visited[i]: self.dfs_recursive(i, visited) if __name__ == "__main__": g = Graph() g.add_edge(0, 1) g.add_edge(0, 2) g.add_edge(1, 2) g.add_edge(2, 0) g.add_edge(2, 3) g.add_edge(3, 3) g.print_graph() print("DFS:") g.dfs() # OUTPUT: # 0 -> 1 -> 2 # 1 -> 2 # 2 -> 0 -> 3 # 3 -> 3 # DFS: # 0 1 2 3
-1
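The Graph class above walks the vertices with a recursive depth-first traversal. As a hedged side sketch (not part of the PR), a standard alternative is an iterative depth-first traversal with an explicit stack over the same {vertex: [neighbours]} adjacency layout:

# Iterative depth-first traversal over an adjacency dict; illustrative only.
def dfs_iterative(graph: dict[int, list[int]], start: int) -> list[int]:
    visited: list[int] = []
    stack = [start]
    while stack:
        vertex = stack.pop()  # most recently discovered vertex first
        if vertex not in visited:
            visited.append(vertex)
            # push neighbours in reverse so lower-numbered ones are visited first
            stack.extend(reversed(graph.get(vertex, [])))
    return visited

print(dfs_iterative({0: [1, 2], 1: [2], 2: [0, 3], 3: [3]}, 0))  # [0, 1, 2, 3]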
TheAlgorithms/Python
7,844
Remove file-level flake8 suppression
### Describe your change: Some files were completely excluded from flake8 checks. - Remove file-level suppressions - Fix simple warnings - Suppress long line indocstring * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
Cjkjvfnby
"2022-10-29T13:08:44Z"
"2022-10-29T20:45:21Z"
47ddba1d914bf5955a244056e794e718dee9ead1
1550731cb7457ddae216da2ffe0bc1587f5234f3
Remove file-level flake8 suppression. ### Describe your change: Some files were completely excluded from flake8 checks. - Remove file-level suppressions - Fix simple warnings - Suppress long line indocstring * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
"""Source: https://github.com/jason9075/opencv-mosaic-data-aug""" import glob import os import random from string import ascii_lowercase, digits import cv2 import numpy as np # Parrameters OUTPUT_SIZE = (720, 1280) # Height, Width SCALE_RANGE = (0.4, 0.6) # if height or width lower than this scale, drop it. FILTER_TINY_SCALE = 1 / 100 LABEL_DIR = "" IMG_DIR = "" OUTPUT_DIR = "" NUMBER_IMAGES = 250 def main() -> None: """ Get images list and annotations list from input dir. Update new images and annotations. Save images and annotations in output dir. """ img_paths, annos = get_dataset(LABEL_DIR, IMG_DIR) for index in range(NUMBER_IMAGES): idxs = random.sample(range(len(annos)), 4) new_image, new_annos, path = update_image_and_anno( img_paths, annos, idxs, OUTPUT_SIZE, SCALE_RANGE, filter_scale=FILTER_TINY_SCALE, ) # Get random string code: '7b7ad245cdff75241935e4dd860f3bad' letter_code = random_chars(32) file_name = path.split(os.sep)[-1].rsplit(".", 1)[0] file_root = f"{OUTPUT_DIR}/{file_name}_MOSAIC_{letter_code}" cv2.imwrite(f"{file_root}.jpg", new_image, [cv2.IMWRITE_JPEG_QUALITY, 85]) print(f"Succeeded {index+1}/{NUMBER_IMAGES} with {file_name}") annos_list = [] for anno in new_annos: width = anno[3] - anno[1] height = anno[4] - anno[2] x_center = anno[1] + width / 2 y_center = anno[2] + height / 2 obj = f"{anno[0]} {x_center} {y_center} {width} {height}" annos_list.append(obj) with open(f"{file_root}.txt", "w") as outfile: outfile.write("\n".join(line for line in annos_list)) def get_dataset(label_dir: str, img_dir: str) -> tuple[list, list]: """ - label_dir <type: str>: Path to label include annotation of images - img_dir <type: str>: Path to folder contain images Return <type: list>: List of images path and labels """ img_paths = [] labels = [] for label_file in glob.glob(os.path.join(label_dir, "*.txt")): label_name = label_file.split(os.sep)[-1].rsplit(".", 1)[0] with open(label_file) as in_file: obj_lists = in_file.readlines() img_path = os.path.join(img_dir, f"{label_name}.jpg") boxes = [] for obj_list in obj_lists: obj = obj_list.rstrip("\n").split(" ") xmin = float(obj[1]) - float(obj[3]) / 2 ymin = float(obj[2]) - float(obj[4]) / 2 xmax = float(obj[1]) + float(obj[3]) / 2 ymax = float(obj[2]) + float(obj[4]) / 2 boxes.append([int(obj[0]), xmin, ymin, xmax, ymax]) if not boxes: continue img_paths.append(img_path) labels.append(boxes) return img_paths, labels def update_image_and_anno( all_img_list: list, all_annos: list, idxs: list[int], output_size: tuple[int, int], scale_range: tuple[float, float], filter_scale: float = 0.0, ) -> tuple[list, list, str]: """ - all_img_list <type: list>: list of all images - all_annos <type: list>: list of all annotations of specific image - idxs <type: list>: index of image in list - output_size <type: tuple>: size of output image (Height, Width) - scale_range <type: tuple>: range of scale image - filter_scale <type: float>: the condition of downscale image and bounding box Return: - output_img <type: narray>: image after resize - new_anno <type: list>: list of new annotation after scale - path[0] <type: string>: get the name of image file """ output_img = np.zeros([output_size[0], output_size[1], 3], dtype=np.uint8) scale_x = scale_range[0] + random.random() * (scale_range[1] - scale_range[0]) scale_y = scale_range[0] + random.random() * (scale_range[1] - scale_range[0]) divid_point_x = int(scale_x * output_size[1]) divid_point_y = int(scale_y * output_size[0]) new_anno = [] path_list = [] for i, index in enumerate(idxs): path = 
all_img_list[index] path_list.append(path) img_annos = all_annos[index] img = cv2.imread(path) if i == 0: # top-left img = cv2.resize(img, (divid_point_x, divid_point_y)) output_img[:divid_point_y, :divid_point_x, :] = img for bbox in img_annos: xmin = bbox[1] * scale_x ymin = bbox[2] * scale_y xmax = bbox[3] * scale_x ymax = bbox[4] * scale_y new_anno.append([bbox[0], xmin, ymin, xmax, ymax]) elif i == 1: # top-right img = cv2.resize(img, (output_size[1] - divid_point_x, divid_point_y)) output_img[:divid_point_y, divid_point_x : output_size[1], :] = img for bbox in img_annos: xmin = scale_x + bbox[1] * (1 - scale_x) ymin = bbox[2] * scale_y xmax = scale_x + bbox[3] * (1 - scale_x) ymax = bbox[4] * scale_y new_anno.append([bbox[0], xmin, ymin, xmax, ymax]) elif i == 2: # bottom-left img = cv2.resize(img, (divid_point_x, output_size[0] - divid_point_y)) output_img[divid_point_y : output_size[0], :divid_point_x, :] = img for bbox in img_annos: xmin = bbox[1] * scale_x ymin = scale_y + bbox[2] * (1 - scale_y) xmax = bbox[3] * scale_x ymax = scale_y + bbox[4] * (1 - scale_y) new_anno.append([bbox[0], xmin, ymin, xmax, ymax]) else: # bottom-right img = cv2.resize( img, (output_size[1] - divid_point_x, output_size[0] - divid_point_y) ) output_img[ divid_point_y : output_size[0], divid_point_x : output_size[1], : ] = img for bbox in img_annos: xmin = scale_x + bbox[1] * (1 - scale_x) ymin = scale_y + bbox[2] * (1 - scale_y) xmax = scale_x + bbox[3] * (1 - scale_x) ymax = scale_y + bbox[4] * (1 - scale_y) new_anno.append([bbox[0], xmin, ymin, xmax, ymax]) # Remove bounding box small than scale of filter if 0 < filter_scale: new_anno = [ anno for anno in new_anno if filter_scale < (anno[3] - anno[1]) and filter_scale < (anno[4] - anno[2]) ] return output_img, new_anno, path_list[0] def random_chars(number_char: int) -> str: """ Automatic generate random 32 characters. Get random string code: '7b7ad245cdff75241935e4dd860f3bad' >>> len(random_chars(32)) 32 """ assert number_char > 1, "The number of character should greater than 1" letter_code = ascii_lowercase + digits return "".join(random.choice(letter_code) for _ in range(number_char)) if __name__ == "__main__": main() print("DONE ✅")
"""Source: https://github.com/jason9075/opencv-mosaic-data-aug""" import glob import os import random from string import ascii_lowercase, digits import cv2 import numpy as np # Parrameters OUTPUT_SIZE = (720, 1280) # Height, Width SCALE_RANGE = (0.4, 0.6) # if height or width lower than this scale, drop it. FILTER_TINY_SCALE = 1 / 100 LABEL_DIR = "" IMG_DIR = "" OUTPUT_DIR = "" NUMBER_IMAGES = 250 def main() -> None: """ Get images list and annotations list from input dir. Update new images and annotations. Save images and annotations in output dir. """ img_paths, annos = get_dataset(LABEL_DIR, IMG_DIR) for index in range(NUMBER_IMAGES): idxs = random.sample(range(len(annos)), 4) new_image, new_annos, path = update_image_and_anno( img_paths, annos, idxs, OUTPUT_SIZE, SCALE_RANGE, filter_scale=FILTER_TINY_SCALE, ) # Get random string code: '7b7ad245cdff75241935e4dd860f3bad' letter_code = random_chars(32) file_name = path.split(os.sep)[-1].rsplit(".", 1)[0] file_root = f"{OUTPUT_DIR}/{file_name}_MOSAIC_{letter_code}" cv2.imwrite(f"{file_root}.jpg", new_image, [cv2.IMWRITE_JPEG_QUALITY, 85]) print(f"Succeeded {index+1}/{NUMBER_IMAGES} with {file_name}") annos_list = [] for anno in new_annos: width = anno[3] - anno[1] height = anno[4] - anno[2] x_center = anno[1] + width / 2 y_center = anno[2] + height / 2 obj = f"{anno[0]} {x_center} {y_center} {width} {height}" annos_list.append(obj) with open(f"{file_root}.txt", "w") as outfile: outfile.write("\n".join(line for line in annos_list)) def get_dataset(label_dir: str, img_dir: str) -> tuple[list, list]: """ - label_dir <type: str>: Path to label include annotation of images - img_dir <type: str>: Path to folder contain images Return <type: list>: List of images path and labels """ img_paths = [] labels = [] for label_file in glob.glob(os.path.join(label_dir, "*.txt")): label_name = label_file.split(os.sep)[-1].rsplit(".", 1)[0] with open(label_file) as in_file: obj_lists = in_file.readlines() img_path = os.path.join(img_dir, f"{label_name}.jpg") boxes = [] for obj_list in obj_lists: obj = obj_list.rstrip("\n").split(" ") xmin = float(obj[1]) - float(obj[3]) / 2 ymin = float(obj[2]) - float(obj[4]) / 2 xmax = float(obj[1]) + float(obj[3]) / 2 ymax = float(obj[2]) + float(obj[4]) / 2 boxes.append([int(obj[0]), xmin, ymin, xmax, ymax]) if not boxes: continue img_paths.append(img_path) labels.append(boxes) return img_paths, labels def update_image_and_anno( all_img_list: list, all_annos: list, idxs: list[int], output_size: tuple[int, int], scale_range: tuple[float, float], filter_scale: float = 0.0, ) -> tuple[list, list, str]: """ - all_img_list <type: list>: list of all images - all_annos <type: list>: list of all annotations of specific image - idxs <type: list>: index of image in list - output_size <type: tuple>: size of output image (Height, Width) - scale_range <type: tuple>: range of scale image - filter_scale <type: float>: the condition of downscale image and bounding box Return: - output_img <type: narray>: image after resize - new_anno <type: list>: list of new annotation after scale - path[0] <type: string>: get the name of image file """ output_img = np.zeros([output_size[0], output_size[1], 3], dtype=np.uint8) scale_x = scale_range[0] + random.random() * (scale_range[1] - scale_range[0]) scale_y = scale_range[0] + random.random() * (scale_range[1] - scale_range[0]) divid_point_x = int(scale_x * output_size[1]) divid_point_y = int(scale_y * output_size[0]) new_anno = [] path_list = [] for i, index in enumerate(idxs): path = 
all_img_list[index] path_list.append(path) img_annos = all_annos[index] img = cv2.imread(path) if i == 0: # top-left img = cv2.resize(img, (divid_point_x, divid_point_y)) output_img[:divid_point_y, :divid_point_x, :] = img for bbox in img_annos: xmin = bbox[1] * scale_x ymin = bbox[2] * scale_y xmax = bbox[3] * scale_x ymax = bbox[4] * scale_y new_anno.append([bbox[0], xmin, ymin, xmax, ymax]) elif i == 1: # top-right img = cv2.resize(img, (output_size[1] - divid_point_x, divid_point_y)) output_img[:divid_point_y, divid_point_x : output_size[1], :] = img for bbox in img_annos: xmin = scale_x + bbox[1] * (1 - scale_x) ymin = bbox[2] * scale_y xmax = scale_x + bbox[3] * (1 - scale_x) ymax = bbox[4] * scale_y new_anno.append([bbox[0], xmin, ymin, xmax, ymax]) elif i == 2: # bottom-left img = cv2.resize(img, (divid_point_x, output_size[0] - divid_point_y)) output_img[divid_point_y : output_size[0], :divid_point_x, :] = img for bbox in img_annos: xmin = bbox[1] * scale_x ymin = scale_y + bbox[2] * (1 - scale_y) xmax = bbox[3] * scale_x ymax = scale_y + bbox[4] * (1 - scale_y) new_anno.append([bbox[0], xmin, ymin, xmax, ymax]) else: # bottom-right img = cv2.resize( img, (output_size[1] - divid_point_x, output_size[0] - divid_point_y) ) output_img[ divid_point_y : output_size[0], divid_point_x : output_size[1], : ] = img for bbox in img_annos: xmin = scale_x + bbox[1] * (1 - scale_x) ymin = scale_y + bbox[2] * (1 - scale_y) xmax = scale_x + bbox[3] * (1 - scale_x) ymax = scale_y + bbox[4] * (1 - scale_y) new_anno.append([bbox[0], xmin, ymin, xmax, ymax]) # Remove bounding box small than scale of filter if 0 < filter_scale: new_anno = [ anno for anno in new_anno if filter_scale < (anno[3] - anno[1]) and filter_scale < (anno[4] - anno[2]) ] return output_img, new_anno, path_list[0] def random_chars(number_char: int) -> str: """ Automatic generate random 32 characters. Get random string code: '7b7ad245cdff75241935e4dd860f3bad' >>> len(random_chars(32)) 32 """ assert number_char > 1, "The number of character should greater than 1" letter_code = ascii_lowercase + digits return "".join(random.choice(letter_code) for _ in range(number_char)) if __name__ == "__main__": main() print("DONE ✅")
-1
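The mosaic script above ends by re-encoding every combined bounding box as a YOLO-style "class x_center y_center width height" line. A small stand-alone sketch of that conversion for one made-up normalised box (values chosen only for illustration):

# Corner box [class_id, xmin, ymin, xmax, ymax] -> YOLO centre format.
anno = [0, 0.25, 0.25, 0.75, 0.75]  # hypothetical normalised box
width = anno[3] - anno[1]        # 0.5
height = anno[4] - anno[2]       # 0.5
x_center = anno[1] + width / 2   # 0.5
y_center = anno[2] + height / 2  # 0.5
print(f"{anno[0]} {x_center} {y_center} {width} {height}")  # 0 0.5 0.5 0.5 0.5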
TheAlgorithms/Python
7,844
Remove file-level flake8 suppression
### Describe your change: Some files were completely excluded from flake8 checks. - Remove file-level suppressions - Fix simple warnings - Suppress long line indocstring * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
Cjkjvfnby
"2022-10-29T13:08:44Z"
"2022-10-29T20:45:21Z"
47ddba1d914bf5955a244056e794e718dee9ead1
1550731cb7457ddae216da2ffe0bc1587f5234f3
Remove file-level flake8 suppression. ### Describe your change: Some files were completely excluded from flake8 checks. - Remove file-level suppressions - Fix simple warnings - Suppress long line indocstring * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
from math import pi def arc_length(angle: int, radius: int) -> float: """ >>> arc_length(45, 5) 3.9269908169872414 >>> arc_length(120, 15) 31.415926535897928 """ return 2 * pi * radius * (angle / 360) if __name__ == "__main__": print(arc_length(90, 10))
from math import pi def arc_length(angle: int, radius: int) -> float: """ >>> arc_length(45, 5) 3.9269908169872414 >>> arc_length(120, 15) 31.415926535897928 """ return 2 * pi * radius * (angle / 360) if __name__ == "__main__": print(arc_length(90, 10))
-1
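The arc_length() helper above implements L = 2 * pi * r * (angle / 360). A quick stand-alone check of that formula for a quarter circle of radius 10, which should give 5 * pi:

from math import pi

angle, radius = 90, 10
arc = 2 * pi * radius * (angle / 360)  # a quarter of the full circumference
assert abs(arc - 5 * pi) < 1e-12
print(arc)  # 15.707963267948966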
TheAlgorithms/Python
7,844
Remove file-level flake8 suppression
### Describe your change: Some files were completely excluded from flake8 checks. - Remove file-level suppressions - Fix simple warnings - Suppress long line indocstring * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
Cjkjvfnby
"2022-10-29T13:08:44Z"
"2022-10-29T20:45:21Z"
47ddba1d914bf5955a244056e794e718dee9ead1
1550731cb7457ddae216da2ffe0bc1587f5234f3
Remove file-level flake8 suppression. ### Describe your change: Some files were completely excluded from flake8 checks. - Remove file-level suppressions - Fix simple warnings - Suppress long line indocstring * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
"""Segmented Sieve.""" import math def sieve(n: int) -> list[int]: """Segmented Sieve.""" in_prime = [] start = 2 end = int(math.sqrt(n)) # Size of every segment temp = [True] * (end + 1) prime = [] while start <= end: if temp[start] is True: in_prime.append(start) for i in range(start * start, end + 1, start): temp[i] = False start += 1 prime += in_prime low = end + 1 high = min(2 * end, n) while low <= n: temp = [True] * (high - low + 1) for each in in_prime: t = math.floor(low / each) * each if t < low: t += each for j in range(t, high + 1, each): temp[j - low] = False for j in range(len(temp)): if temp[j] is True: prime.append(j + low) low = high + 1 high = min(high + end, n) return prime print(sieve(10**6))
"""Segmented Sieve.""" import math def sieve(n: int) -> list[int]: """Segmented Sieve.""" in_prime = [] start = 2 end = int(math.sqrt(n)) # Size of every segment temp = [True] * (end + 1) prime = [] while start <= end: if temp[start] is True: in_prime.append(start) for i in range(start * start, end + 1, start): temp[i] = False start += 1 prime += in_prime low = end + 1 high = min(2 * end, n) while low <= n: temp = [True] * (high - low + 1) for each in in_prime: t = math.floor(low / each) * each if t < low: t += each for j in range(t, high + 1, each): temp[j - low] = False for j in range(len(temp)): if temp[j] is True: prime.append(j + low) low = high + 1 high = min(high + end, n) return prime print(sieve(10**6))
-1
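A cross-check sketch for the segmented sieve above against naive trial division on a small bound. It assumes sieve() from that file is in scope (importing the file also triggers its module-level print for 10**6), so treat it as illustrative only:

def is_prime(n: int) -> bool:
    # naive trial division up to sqrt(n)
    if n < 2:
        return False
    return all(n % d for d in range(2, int(n**0.5) + 1))

limit = 50
assert sieve(limit) == [p for p in range(2, limit + 1) if is_prime(p)]
print(sieve(limit))  # [2, 3, 5, 7, 11, ..., 47]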
TheAlgorithms/Python
7,844
Remove file-level flake8 suppression
### Describe your change: Some files were completely excluded from flake8 checks. - Remove file-level suppressions - Fix simple warnings - Suppress long line indocstring * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
Cjkjvfnby
"2022-10-29T13:08:44Z"
"2022-10-29T20:45:21Z"
47ddba1d914bf5955a244056e794e718dee9ead1
1550731cb7457ddae216da2ffe0bc1587f5234f3
Remove file-level flake8 suppression. ### Describe your change: Some files were completely excluded from flake8 checks. - Remove file-level suppressions - Fix simple warnings - Suppress long line indocstring * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
-1
TheAlgorithms/Python
7,844
Remove file-level flake8 suppression
### Describe your change: Some files were completely excluded from flake8 checks. - Remove file-level suppressions - Fix simple warnings - Suppress long line indocstring * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
Cjkjvfnby
"2022-10-29T13:08:44Z"
"2022-10-29T20:45:21Z"
47ddba1d914bf5955a244056e794e718dee9ead1
1550731cb7457ddae216da2ffe0bc1587f5234f3
Remove file-level flake8 suppression. ### Describe your change: Some files were completely excluded from flake8 checks. - Remove file-level suppressions - Fix simple warnings - Suppress long line indocstring * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
""" Problem 33: https://projecteuler.net/problem=33 The fraction 49/98 is a curious fraction, as an inexperienced mathematician in attempting to simplify it may incorrectly believe that 49/98 = 4/8, which is correct, is obtained by cancelling the 9s. We shall consider fractions like, 30/50 = 3/5, to be trivial examples. There are exactly four non-trivial examples of this type of fraction, less than one in value, and containing two digits in the numerator and denominator. If the product of these four fractions is given in its lowest common terms, find the value of the denominator. """ from __future__ import annotations from fractions import Fraction def is_digit_cancelling(num: int, den: int) -> bool: if num != den: if num % 10 == den // 10: if (num // 10) / (den % 10) == num / den: return True return False def fraction_list(digit_len: int) -> list[str]: """ >>> fraction_list(2) ['16/64', '19/95', '26/65', '49/98'] >>> fraction_list(3) ['16/64', '19/95', '26/65', '49/98'] >>> fraction_list(4) ['16/64', '19/95', '26/65', '49/98'] >>> fraction_list(0) [] >>> fraction_list(5) ['16/64', '19/95', '26/65', '49/98'] """ solutions = [] den = 11 last_digit = int("1" + "0" * digit_len) for num in range(den, last_digit): while den <= 99: if (num != den) and (num % 10 == den // 10) and (den % 10 != 0): if is_digit_cancelling(num, den): solutions.append(f"{num}/{den}") den += 1 num += 1 den = 10 return solutions def solution(n: int = 2) -> int: """ Return the solution to the problem """ result = 1.0 for fraction in fraction_list(n): frac = Fraction(fraction) result *= frac.denominator / frac.numerator return int(result) if __name__ == "__main__": print(solution())
""" Problem 33: https://projecteuler.net/problem=33 The fraction 49/98 is a curious fraction, as an inexperienced mathematician in attempting to simplify it may incorrectly believe that 49/98 = 4/8, which is correct, is obtained by cancelling the 9s. We shall consider fractions like, 30/50 = 3/5, to be trivial examples. There are exactly four non-trivial examples of this type of fraction, less than one in value, and containing two digits in the numerator and denominator. If the product of these four fractions is given in its lowest common terms, find the value of the denominator. """ from __future__ import annotations from fractions import Fraction def is_digit_cancelling(num: int, den: int) -> bool: if num != den: if num % 10 == den // 10: if (num // 10) / (den % 10) == num / den: return True return False def fraction_list(digit_len: int) -> list[str]: """ >>> fraction_list(2) ['16/64', '19/95', '26/65', '49/98'] >>> fraction_list(3) ['16/64', '19/95', '26/65', '49/98'] >>> fraction_list(4) ['16/64', '19/95', '26/65', '49/98'] >>> fraction_list(0) [] >>> fraction_list(5) ['16/64', '19/95', '26/65', '49/98'] """ solutions = [] den = 11 last_digit = int("1" + "0" * digit_len) for num in range(den, last_digit): while den <= 99: if (num != den) and (num % 10 == den // 10) and (den % 10 != 0): if is_digit_cancelling(num, den): solutions.append(f"{num}/{den}") den += 1 num += 1 den = 10 return solutions def solution(n: int = 2) -> int: """ Return the solution to the problem """ result = 1.0 for fraction in fraction_list(n): frac = Fraction(fraction) result *= frac.denominator / frac.numerator return int(result) if __name__ == "__main__": print(solution())
-1
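A worked check of the digit-cancelling condition used by is_digit_cancelling() above, applied to 49/98 from the problem statement: both numbers share the digit 9, and naively cancelling it leaves 4/8, which happens to equal the true value.

num, den = 49, 98
shares_digit = num % 10 == den // 10  # trailing 9 matches leading 9
naive_cancel_equal = (num // 10) / (den % 10) == num / den  # 4/8 == 49/98 == 0.5
print(shares_digit, naive_cancel_equal)  # True True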
TheAlgorithms/Python
7,844
Remove file-level flake8 suppression
### Describe your change: Some files were completely excluded from flake8 checks. - Remove file-level suppressions - Fix simple warnings - Suppress long line indocstring * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
Cjkjvfnby
"2022-10-29T13:08:44Z"
"2022-10-29T20:45:21Z"
47ddba1d914bf5955a244056e794e718dee9ead1
1550731cb7457ddae216da2ffe0bc1587f5234f3
Remove file-level flake8 suppression. ### Describe your change: Some files were completely excluded from flake8 checks. - Remove file-level suppressions - Fix simple warnings - Suppress long line indocstring * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
from math import cos, sin, sqrt, tau from audio_filters.iir_filter import IIRFilter """ Create 2nd-order IIR filters with Butterworth design. Code based on https://webaudio.github.io/Audio-EQ-Cookbook/audio-eq-cookbook.html Alternatively you can use scipy.signal.butter, which should yield the same results. """ def make_lowpass( frequency: int, samplerate: int, q_factor: float = 1 / sqrt(2) # noqa: B008 ) -> IIRFilter: """ Creates a low-pass filter >>> filter = make_lowpass(1000, 48000) >>> filter.a_coeffs + filter.b_coeffs # doctest: +NORMALIZE_WHITESPACE [1.0922959556412573, -1.9828897227476208, 0.9077040443587427, 0.004277569313094809, 0.008555138626189618, 0.004277569313094809] """ w0 = tau * frequency / samplerate _sin = sin(w0) _cos = cos(w0) alpha = _sin / (2 * q_factor) b0 = (1 - _cos) / 2 b1 = 1 - _cos a0 = 1 + alpha a1 = -2 * _cos a2 = 1 - alpha filt = IIRFilter(2) filt.set_coefficients([a0, a1, a2], [b0, b1, b0]) return filt def make_highpass( frequency: int, samplerate: int, q_factor: float = 1 / sqrt(2) # noqa: B008 ) -> IIRFilter: """ Creates a high-pass filter >>> filter = make_highpass(1000, 48000) >>> filter.a_coeffs + filter.b_coeffs # doctest: +NORMALIZE_WHITESPACE [1.0922959556412573, -1.9828897227476208, 0.9077040443587427, 0.9957224306869052, -1.9914448613738105, 0.9957224306869052] """ w0 = tau * frequency / samplerate _sin = sin(w0) _cos = cos(w0) alpha = _sin / (2 * q_factor) b0 = (1 + _cos) / 2 b1 = -1 - _cos a0 = 1 + alpha a1 = -2 * _cos a2 = 1 - alpha filt = IIRFilter(2) filt.set_coefficients([a0, a1, a2], [b0, b1, b0]) return filt def make_bandpass( frequency: int, samplerate: int, q_factor: float = 1 / sqrt(2) # noqa: B008 ) -> IIRFilter: """ Creates a band-pass filter >>> filter = make_bandpass(1000, 48000) >>> filter.a_coeffs + filter.b_coeffs # doctest: +NORMALIZE_WHITESPACE [1.0922959556412573, -1.9828897227476208, 0.9077040443587427, 0.06526309611002579, 0, -0.06526309611002579] """ w0 = tau * frequency / samplerate _sin = sin(w0) _cos = cos(w0) alpha = _sin / (2 * q_factor) b0 = _sin / 2 b1 = 0 b2 = -b0 a0 = 1 + alpha a1 = -2 * _cos a2 = 1 - alpha filt = IIRFilter(2) filt.set_coefficients([a0, a1, a2], [b0, b1, b2]) return filt def make_allpass( frequency: int, samplerate: int, q_factor: float = 1 / sqrt(2) # noqa: B008 ) -> IIRFilter: """ Creates an all-pass filter >>> filter = make_allpass(1000, 48000) >>> filter.a_coeffs + filter.b_coeffs # doctest: +NORMALIZE_WHITESPACE [1.0922959556412573, -1.9828897227476208, 0.9077040443587427, 0.9077040443587427, -1.9828897227476208, 1.0922959556412573] """ w0 = tau * frequency / samplerate _sin = sin(w0) _cos = cos(w0) alpha = _sin / (2 * q_factor) b0 = 1 - alpha b1 = -2 * _cos b2 = 1 + alpha filt = IIRFilter(2) filt.set_coefficients([b2, b1, b0], [b0, b1, b2]) return filt def make_peak( frequency: int, samplerate: int, gain_db: float, q_factor: float = 1 / sqrt(2), # noqa: B008 ) -> IIRFilter: """ Creates a peak filter >>> filter = make_peak(1000, 48000, 6) >>> filter.a_coeffs + filter.b_coeffs # doctest: +NORMALIZE_WHITESPACE [1.0653405327119334, -1.9828897227476208, 0.9346594672880666, 1.1303715025601122, -1.9828897227476208, 0.8696284974398878] """ w0 = tau * frequency / samplerate _sin = sin(w0) _cos = cos(w0) alpha = _sin / (2 * q_factor) big_a = 10 ** (gain_db / 40) b0 = 1 + alpha * big_a b1 = -2 * _cos b2 = 1 - alpha * big_a a0 = 1 + alpha / big_a a1 = -2 * _cos a2 = 1 - alpha / big_a filt = IIRFilter(2) filt.set_coefficients([a0, a1, a2], [b0, b1, b2]) return filt def make_lowshelf( frequency: int, 
samplerate: int, gain_db: float, q_factor: float = 1 / sqrt(2), # noqa: B008 ) -> IIRFilter: """ Creates a low-shelf filter >>> filter = make_lowshelf(1000, 48000, 6) >>> filter.a_coeffs + filter.b_coeffs # doctest: +NORMALIZE_WHITESPACE [3.0409336710888786, -5.608870992220748, 2.602157875636628, 3.139954022810743, -5.591841778072785, 2.5201667380627257] """ w0 = tau * frequency / samplerate _sin = sin(w0) _cos = cos(w0) alpha = _sin / (2 * q_factor) big_a = 10 ** (gain_db / 40) pmc = (big_a + 1) - (big_a - 1) * _cos ppmc = (big_a + 1) + (big_a - 1) * _cos mpc = (big_a - 1) - (big_a + 1) * _cos pmpc = (big_a - 1) + (big_a + 1) * _cos aa2 = 2 * sqrt(big_a) * alpha b0 = big_a * (pmc + aa2) b1 = 2 * big_a * mpc b2 = big_a * (pmc - aa2) a0 = ppmc + aa2 a1 = -2 * pmpc a2 = ppmc - aa2 filt = IIRFilter(2) filt.set_coefficients([a0, a1, a2], [b0, b1, b2]) return filt def make_highshelf( frequency: int, samplerate: int, gain_db: float, q_factor: float = 1 / sqrt(2), # noqa: B008 ) -> IIRFilter: """ Creates a high-shelf filter >>> filter = make_highshelf(1000, 48000, 6) >>> filter.a_coeffs + filter.b_coeffs # doctest: +NORMALIZE_WHITESPACE [2.2229172136088806, -3.9587208137297303, 1.7841414181566304, 4.295432981120543, -7.922740859457287, 3.6756456963725253] """ w0 = tau * frequency / samplerate _sin = sin(w0) _cos = cos(w0) alpha = _sin / (2 * q_factor) big_a = 10 ** (gain_db / 40) pmc = (big_a + 1) - (big_a - 1) * _cos ppmc = (big_a + 1) + (big_a - 1) * _cos mpc = (big_a - 1) - (big_a + 1) * _cos pmpc = (big_a - 1) + (big_a + 1) * _cos aa2 = 2 * sqrt(big_a) * alpha b0 = big_a * (ppmc + aa2) b1 = -2 * big_a * pmpc b2 = big_a * (ppmc - aa2) a0 = pmc + aa2 a1 = 2 * mpc a2 = pmc - aa2 filt = IIRFilter(2) filt.set_coefficients([a0, a1, a2], [b0, b1, b2]) return filt
from math import cos, sin, sqrt, tau from audio_filters.iir_filter import IIRFilter """ Create 2nd-order IIR filters with Butterworth design. Code based on https://webaudio.github.io/Audio-EQ-Cookbook/audio-eq-cookbook.html Alternatively you can use scipy.signal.butter, which should yield the same results. """ def make_lowpass( frequency: int, samplerate: int, q_factor: float = 1 / sqrt(2) # noqa: B008 ) -> IIRFilter: """ Creates a low-pass filter >>> filter = make_lowpass(1000, 48000) >>> filter.a_coeffs + filter.b_coeffs # doctest: +NORMALIZE_WHITESPACE [1.0922959556412573, -1.9828897227476208, 0.9077040443587427, 0.004277569313094809, 0.008555138626189618, 0.004277569313094809] """ w0 = tau * frequency / samplerate _sin = sin(w0) _cos = cos(w0) alpha = _sin / (2 * q_factor) b0 = (1 - _cos) / 2 b1 = 1 - _cos a0 = 1 + alpha a1 = -2 * _cos a2 = 1 - alpha filt = IIRFilter(2) filt.set_coefficients([a0, a1, a2], [b0, b1, b0]) return filt def make_highpass( frequency: int, samplerate: int, q_factor: float = 1 / sqrt(2) # noqa: B008 ) -> IIRFilter: """ Creates a high-pass filter >>> filter = make_highpass(1000, 48000) >>> filter.a_coeffs + filter.b_coeffs # doctest: +NORMALIZE_WHITESPACE [1.0922959556412573, -1.9828897227476208, 0.9077040443587427, 0.9957224306869052, -1.9914448613738105, 0.9957224306869052] """ w0 = tau * frequency / samplerate _sin = sin(w0) _cos = cos(w0) alpha = _sin / (2 * q_factor) b0 = (1 + _cos) / 2 b1 = -1 - _cos a0 = 1 + alpha a1 = -2 * _cos a2 = 1 - alpha filt = IIRFilter(2) filt.set_coefficients([a0, a1, a2], [b0, b1, b0]) return filt def make_bandpass( frequency: int, samplerate: int, q_factor: float = 1 / sqrt(2) # noqa: B008 ) -> IIRFilter: """ Creates a band-pass filter >>> filter = make_bandpass(1000, 48000) >>> filter.a_coeffs + filter.b_coeffs # doctest: +NORMALIZE_WHITESPACE [1.0922959556412573, -1.9828897227476208, 0.9077040443587427, 0.06526309611002579, 0, -0.06526309611002579] """ w0 = tau * frequency / samplerate _sin = sin(w0) _cos = cos(w0) alpha = _sin / (2 * q_factor) b0 = _sin / 2 b1 = 0 b2 = -b0 a0 = 1 + alpha a1 = -2 * _cos a2 = 1 - alpha filt = IIRFilter(2) filt.set_coefficients([a0, a1, a2], [b0, b1, b2]) return filt def make_allpass( frequency: int, samplerate: int, q_factor: float = 1 / sqrt(2) # noqa: B008 ) -> IIRFilter: """ Creates an all-pass filter >>> filter = make_allpass(1000, 48000) >>> filter.a_coeffs + filter.b_coeffs # doctest: +NORMALIZE_WHITESPACE [1.0922959556412573, -1.9828897227476208, 0.9077040443587427, 0.9077040443587427, -1.9828897227476208, 1.0922959556412573] """ w0 = tau * frequency / samplerate _sin = sin(w0) _cos = cos(w0) alpha = _sin / (2 * q_factor) b0 = 1 - alpha b1 = -2 * _cos b2 = 1 + alpha filt = IIRFilter(2) filt.set_coefficients([b2, b1, b0], [b0, b1, b2]) return filt def make_peak( frequency: int, samplerate: int, gain_db: float, q_factor: float = 1 / sqrt(2), # noqa: B008 ) -> IIRFilter: """ Creates a peak filter >>> filter = make_peak(1000, 48000, 6) >>> filter.a_coeffs + filter.b_coeffs # doctest: +NORMALIZE_WHITESPACE [1.0653405327119334, -1.9828897227476208, 0.9346594672880666, 1.1303715025601122, -1.9828897227476208, 0.8696284974398878] """ w0 = tau * frequency / samplerate _sin = sin(w0) _cos = cos(w0) alpha = _sin / (2 * q_factor) big_a = 10 ** (gain_db / 40) b0 = 1 + alpha * big_a b1 = -2 * _cos b2 = 1 - alpha * big_a a0 = 1 + alpha / big_a a1 = -2 * _cos a2 = 1 - alpha / big_a filt = IIRFilter(2) filt.set_coefficients([a0, a1, a2], [b0, b1, b2]) return filt def make_lowshelf( frequency: int, 
samplerate: int, gain_db: float, q_factor: float = 1 / sqrt(2), # noqa: B008 ) -> IIRFilter: """ Creates a low-shelf filter >>> filter = make_lowshelf(1000, 48000, 6) >>> filter.a_coeffs + filter.b_coeffs # doctest: +NORMALIZE_WHITESPACE [3.0409336710888786, -5.608870992220748, 2.602157875636628, 3.139954022810743, -5.591841778072785, 2.5201667380627257] """ w0 = tau * frequency / samplerate _sin = sin(w0) _cos = cos(w0) alpha = _sin / (2 * q_factor) big_a = 10 ** (gain_db / 40) pmc = (big_a + 1) - (big_a - 1) * _cos ppmc = (big_a + 1) + (big_a - 1) * _cos mpc = (big_a - 1) - (big_a + 1) * _cos pmpc = (big_a - 1) + (big_a + 1) * _cos aa2 = 2 * sqrt(big_a) * alpha b0 = big_a * (pmc + aa2) b1 = 2 * big_a * mpc b2 = big_a * (pmc - aa2) a0 = ppmc + aa2 a1 = -2 * pmpc a2 = ppmc - aa2 filt = IIRFilter(2) filt.set_coefficients([a0, a1, a2], [b0, b1, b2]) return filt def make_highshelf( frequency: int, samplerate: int, gain_db: float, q_factor: float = 1 / sqrt(2), # noqa: B008 ) -> IIRFilter: """ Creates a high-shelf filter >>> filter = make_highshelf(1000, 48000, 6) >>> filter.a_coeffs + filter.b_coeffs # doctest: +NORMALIZE_WHITESPACE [2.2229172136088806, -3.9587208137297303, 1.7841414181566304, 4.295432981120543, -7.922740859457287, 3.6756456963725253] """ w0 = tau * frequency / samplerate _sin = sin(w0) _cos = cos(w0) alpha = _sin / (2 * q_factor) big_a = 10 ** (gain_db / 40) pmc = (big_a + 1) - (big_a - 1) * _cos ppmc = (big_a + 1) + (big_a - 1) * _cos mpc = (big_a - 1) - (big_a + 1) * _cos pmpc = (big_a - 1) + (big_a + 1) * _cos aa2 = 2 * sqrt(big_a) * alpha b0 = big_a * (ppmc + aa2) b1 = -2 * big_a * pmpc b2 = big_a * (ppmc - aa2) a0 = pmc + aa2 a1 = 2 * mpc a2 = pmc - aa2 filt = IIRFilter(2) filt.set_coefficients([a0, a1, a2], [b0, b1, b2]) return filt
-1
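The biquad coefficients in the doctests above come straight from the Audio EQ Cookbook formulas. A stand-alone recomputation of the make_lowpass(1000, 48000) coefficients without the IIRFilter dependency (variable names mirror the function above):

from math import cos, sin, sqrt, tau

frequency, samplerate, q_factor = 1000, 48000, 1 / sqrt(2)
w0 = tau * frequency / samplerate
alpha = sin(w0) / (2 * q_factor)
b0 = (1 - cos(w0)) / 2
b1 = 1 - cos(w0)
a0, a1, a2 = 1 + alpha, -2 * cos(w0), 1 - alpha
# Same ordering as filter.a_coeffs + filter.b_coeffs in the doctest:
print([a0, a1, a2, b0, b1, b0])
# [1.0922959556412573, -1.9828897227476208, 0.9077040443587427,
#  0.004277569313094809, 0.008555138626189618, 0.004277569313094809]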
TheAlgorithms/Python
7,844
Remove file-level flake8 suppression
### Describe your change: Some files were completely excluded from flake8 checks. - Remove file-level suppressions - Fix simple warnings - Suppress long line indocstring * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
Cjkjvfnby
"2022-10-29T13:08:44Z"
"2022-10-29T20:45:21Z"
47ddba1d914bf5955a244056e794e718dee9ead1
1550731cb7457ddae216da2ffe0bc1587f5234f3
Remove file-level flake8 suppression. ### Describe your change: Some files were completely excluded from flake8 checks. - Remove file-level suppressions - Fix simple warnings - Suppress long line indocstring * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
""" A Trie/Prefix Tree is a kind of search tree used to provide quick lookup of words/patterns in a set of words. A basic Trie however has O(n^2) space complexity making it impractical in practice. It however provides O(max(search_string, length of longest word)) lookup time making it an optimal approach when space is not an issue. """ class TrieNode: def __init__(self) -> None: self.nodes: dict[str, TrieNode] = {} # Mapping from char to TrieNode self.is_leaf = False def insert_many(self, words: list[str]) -> None: """ Inserts a list of words into the Trie :param words: list of string words :return: None """ for word in words: self.insert(word) def insert(self, word: str) -> None: """ Inserts a word into the Trie :param word: word to be inserted :return: None """ curr = self for char in word: if char not in curr.nodes: curr.nodes[char] = TrieNode() curr = curr.nodes[char] curr.is_leaf = True def find(self, word: str) -> bool: """ Tries to find word in a Trie :param word: word to look for :return: Returns True if word is found, False otherwise """ curr = self for char in word: if char not in curr.nodes: return False curr = curr.nodes[char] return curr.is_leaf def delete(self, word: str) -> None: """ Deletes a word in a Trie :param word: word to delete :return: None """ def _delete(curr: TrieNode, word: str, index: int) -> bool: if index == len(word): # If word does not exist if not curr.is_leaf: return False curr.is_leaf = False return len(curr.nodes) == 0 char = word[index] char_node = curr.nodes.get(char) # If char not in current trie node if not char_node: return False # Flag to check if node can be deleted delete_curr = _delete(char_node, word, index + 1) if delete_curr: del curr.nodes[char] return len(curr.nodes) == 0 return delete_curr _delete(self, word, 0) def print_words(node: TrieNode, word: str) -> None: """ Prints all the words in a Trie :param node: root node of Trie :param word: Word variable should be empty at start :return: None """ if node.is_leaf: print(word, end=" ") for key, value in node.nodes.items(): print_words(value, word + key) def test_trie() -> bool: words = "banana bananas bandana band apple all beast".split() root = TrieNode() root.insert_many(words) # print_words(root, "") assert all(root.find(word) for word in words) assert root.find("banana") assert not root.find("bandanas") assert not root.find("apps") assert root.find("apple") assert root.find("all") root.delete("all") assert not root.find("all") root.delete("banana") assert not root.find("banana") assert root.find("bananas") return True def print_results(msg: str, passes: bool) -> None: print(str(msg), "works!" if passes else "doesn't work :(") def pytests() -> None: assert test_trie() def main() -> None: """ >>> pytests() """ print_results("Testing trie functionality", test_trie()) if __name__ == "__main__": main()
""" A Trie/Prefix Tree is a kind of search tree used to provide quick lookup of words/patterns in a set of words. A basic Trie however has O(n^2) space complexity making it impractical in practice. It however provides O(max(search_string, length of longest word)) lookup time making it an optimal approach when space is not an issue. """ class TrieNode: def __init__(self) -> None: self.nodes: dict[str, TrieNode] = {} # Mapping from char to TrieNode self.is_leaf = False def insert_many(self, words: list[str]) -> None: """ Inserts a list of words into the Trie :param words: list of string words :return: None """ for word in words: self.insert(word) def insert(self, word: str) -> None: """ Inserts a word into the Trie :param word: word to be inserted :return: None """ curr = self for char in word: if char not in curr.nodes: curr.nodes[char] = TrieNode() curr = curr.nodes[char] curr.is_leaf = True def find(self, word: str) -> bool: """ Tries to find word in a Trie :param word: word to look for :return: Returns True if word is found, False otherwise """ curr = self for char in word: if char not in curr.nodes: return False curr = curr.nodes[char] return curr.is_leaf def delete(self, word: str) -> None: """ Deletes a word in a Trie :param word: word to delete :return: None """ def _delete(curr: TrieNode, word: str, index: int) -> bool: if index == len(word): # If word does not exist if not curr.is_leaf: return False curr.is_leaf = False return len(curr.nodes) == 0 char = word[index] char_node = curr.nodes.get(char) # If char not in current trie node if not char_node: return False # Flag to check if node can be deleted delete_curr = _delete(char_node, word, index + 1) if delete_curr: del curr.nodes[char] return len(curr.nodes) == 0 return delete_curr _delete(self, word, 0) def print_words(node: TrieNode, word: str) -> None: """ Prints all the words in a Trie :param node: root node of Trie :param word: Word variable should be empty at start :return: None """ if node.is_leaf: print(word, end=" ") for key, value in node.nodes.items(): print_words(value, word + key) def test_trie() -> bool: words = "banana bananas bandana band apple all beast".split() root = TrieNode() root.insert_many(words) # print_words(root, "") assert all(root.find(word) for word in words) assert root.find("banana") assert not root.find("bandanas") assert not root.find("apps") assert root.find("apple") assert root.find("all") root.delete("all") assert not root.find("all") root.delete("banana") assert not root.find("banana") assert root.find("bananas") return True def print_results(msg: str, passes: bool) -> None: print(str(msg), "works!" if passes else "doesn't work :(") def pytests() -> None: assert test_trie() def main() -> None: """ >>> pytests() """ print_results("Testing trie functionality", test_trie()) if __name__ == "__main__": main()
-1
TheAlgorithms/Python
7,844
Remove file-level flake8 suppression
### Describe your change: Some files were completely excluded from flake8 checks. - Remove file-level suppressions - Fix simple warnings - Suppress long line indocstring * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
Cjkjvfnby
"2022-10-29T13:08:44Z"
"2022-10-29T20:45:21Z"
47ddba1d914bf5955a244056e794e718dee9ead1
1550731cb7457ddae216da2ffe0bc1587f5234f3
Remove file-level flake8 suppression. ### Describe your change: Some files were completely excluded from flake8 checks. - Remove file-level suppressions - Fix simple warnings - Suppress long line indocstring * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
from math import atan, cos, radians, sin, tan from .haversine_distance import haversine_distance AXIS_A = 6378137.0 AXIS_B = 6356752.314245 EQUATORIAL_RADIUS = 6378137 def lamberts_ellipsoidal_distance( lat1: float, lon1: float, lat2: float, lon2: float ) -> float: """ Calculate the shortest distance along the surface of an ellipsoid between two points on the surface of earth given longitudes and latitudes https://en.wikipedia.org/wiki/Geographical_distance#Lambert's_formula_for_long_lines NOTE: This algorithm uses geodesy/haversine_distance.py to compute central angle, sigma Representing the earth as an ellipsoid allows us to approximate distances between points on the surface much better than a sphere. Ellipsoidal formulas treat the Earth as an oblate ellipsoid which means accounting for the flattening that happens at the North and South poles. Lambert's formulae provide accuracy on the order of 10 metres over thousands of kilometres. Other methods can provide millimeter-level accuracy but this is a simpler method to calculate long range distances without increasing computational intensity. Args: lat1, lon1: latitude and longitude of coordinate 1 lat2, lon2: latitude and longitude of coordinate 2 Returns: geographical distance between two points in metres >>> from collections import namedtuple >>> point_2d = namedtuple("point_2d", "lat lon") >>> SAN_FRANCISCO = point_2d(37.774856, -122.424227) >>> YOSEMITE = point_2d(37.864742, -119.537521) >>> NEW_YORK = point_2d(40.713019, -74.012647) >>> VENICE = point_2d(45.443012, 12.313071) >>> f"{lamberts_ellipsoidal_distance(*SAN_FRANCISCO, *YOSEMITE):0,.0f} meters" '254,351 meters' >>> f"{lamberts_ellipsoidal_distance(*SAN_FRANCISCO, *NEW_YORK):0,.0f} meters" '4,138,992 meters' >>> f"{lamberts_ellipsoidal_distance(*SAN_FRANCISCO, *VENICE):0,.0f} meters" '9,737,326 meters' """ # CONSTANTS per WGS84 https://en.wikipedia.org/wiki/World_Geodetic_System # Distance in metres(m) # Equation Parameters # https://en.wikipedia.org/wiki/Geographical_distance#Lambert's_formula_for_long_lines flattening = (AXIS_A - AXIS_B) / AXIS_A # Parametric latitudes # https://en.wikipedia.org/wiki/Latitude#Parametric_(or_reduced)_latitude b_lat1 = atan((1 - flattening) * tan(radians(lat1))) b_lat2 = atan((1 - flattening) * tan(radians(lat2))) # Compute central angle between two points # using haversine theta. sigma = haversine_distance / equatorial radius sigma = haversine_distance(lat1, lon1, lat2, lon2) / EQUATORIAL_RADIUS # Intermediate P and Q values p_value = (b_lat1 + b_lat2) / 2 q_value = (b_lat2 - b_lat1) / 2 # Intermediate X value # X = (sigma - sin(sigma)) * sin^2Pcos^2Q / cos^2(sigma/2) x_numerator = (sin(p_value) ** 2) * (cos(q_value) ** 2) x_denominator = cos(sigma / 2) ** 2 x_value = (sigma - sin(sigma)) * (x_numerator / x_denominator) # Intermediate Y value # Y = (sigma + sin(sigma)) * cos^2Psin^2Q / sin^2(sigma/2) y_numerator = (cos(p_value) ** 2) * (sin(q_value) ** 2) y_denominator = sin(sigma / 2) ** 2 y_value = (sigma + sin(sigma)) * (y_numerator / y_denominator) return EQUATORIAL_RADIUS * (sigma - ((flattening / 2) * (x_value + y_value))) if __name__ == "__main__": import doctest doctest.testmod()
from math import atan, cos, radians, sin, tan from .haversine_distance import haversine_distance AXIS_A = 6378137.0 AXIS_B = 6356752.314245 EQUATORIAL_RADIUS = 6378137 def lamberts_ellipsoidal_distance( lat1: float, lon1: float, lat2: float, lon2: float ) -> float: """ Calculate the shortest distance along the surface of an ellipsoid between two points on the surface of earth given longitudes and latitudes https://en.wikipedia.org/wiki/Geographical_distance#Lambert's_formula_for_long_lines NOTE: This algorithm uses geodesy/haversine_distance.py to compute central angle, sigma Representing the earth as an ellipsoid allows us to approximate distances between points on the surface much better than a sphere. Ellipsoidal formulas treat the Earth as an oblate ellipsoid which means accounting for the flattening that happens at the North and South poles. Lambert's formulae provide accuracy on the order of 10 metres over thousands of kilometres. Other methods can provide millimeter-level accuracy but this is a simpler method to calculate long range distances without increasing computational intensity. Args: lat1, lon1: latitude and longitude of coordinate 1 lat2, lon2: latitude and longitude of coordinate 2 Returns: geographical distance between two points in metres >>> from collections import namedtuple >>> point_2d = namedtuple("point_2d", "lat lon") >>> SAN_FRANCISCO = point_2d(37.774856, -122.424227) >>> YOSEMITE = point_2d(37.864742, -119.537521) >>> NEW_YORK = point_2d(40.713019, -74.012647) >>> VENICE = point_2d(45.443012, 12.313071) >>> f"{lamberts_ellipsoidal_distance(*SAN_FRANCISCO, *YOSEMITE):0,.0f} meters" '254,351 meters' >>> f"{lamberts_ellipsoidal_distance(*SAN_FRANCISCO, *NEW_YORK):0,.0f} meters" '4,138,992 meters' >>> f"{lamberts_ellipsoidal_distance(*SAN_FRANCISCO, *VENICE):0,.0f} meters" '9,737,326 meters' """ # CONSTANTS per WGS84 https://en.wikipedia.org/wiki/World_Geodetic_System # Distance in metres(m) # Equation Parameters # https://en.wikipedia.org/wiki/Geographical_distance#Lambert's_formula_for_long_lines flattening = (AXIS_A - AXIS_B) / AXIS_A # Parametric latitudes # https://en.wikipedia.org/wiki/Latitude#Parametric_(or_reduced)_latitude b_lat1 = atan((1 - flattening) * tan(radians(lat1))) b_lat2 = atan((1 - flattening) * tan(radians(lat2))) # Compute central angle between two points # using haversine theta. sigma = haversine_distance / equatorial radius sigma = haversine_distance(lat1, lon1, lat2, lon2) / EQUATORIAL_RADIUS # Intermediate P and Q values p_value = (b_lat1 + b_lat2) / 2 q_value = (b_lat2 - b_lat1) / 2 # Intermediate X value # X = (sigma - sin(sigma)) * sin^2Pcos^2Q / cos^2(sigma/2) x_numerator = (sin(p_value) ** 2) * (cos(q_value) ** 2) x_denominator = cos(sigma / 2) ** 2 x_value = (sigma - sin(sigma)) * (x_numerator / x_denominator) # Intermediate Y value # Y = (sigma + sin(sigma)) * cos^2Psin^2Q / sin^2(sigma/2) y_numerator = (cos(p_value) ** 2) * (sin(q_value) ** 2) y_denominator = sin(sigma / 2) ** 2 y_value = (sigma + sin(sigma)) * (y_numerator / y_denominator) return EQUATORIAL_RADIUS * (sigma - ((flattening / 2) * (x_value + y_value))) if __name__ == "__main__": import doctest doctest.testmod()
-1
TheAlgorithms/Python
7,844
Remove file-level flake8 suppression
### Describe your change: Some files were completely excluded from flake8 checks. - Remove file-level suppressions - Fix simple warnings - Suppress long line indocstring * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
Cjkjvfnby
"2022-10-29T13:08:44Z"
"2022-10-29T20:45:21Z"
47ddba1d914bf5955a244056e794e718dee9ead1
1550731cb7457ddae216da2ffe0bc1587f5234f3
Remove file-level flake8 suppression. ### Describe your change: Some files were completely excluded from flake8 checks. - Remove file-level suppressions - Fix simple warnings - Suppress long line indocstring * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
"""Implementation of Basic Math in Python.""" import math def prime_factors(n: int) -> list: """Find Prime Factors. >>> prime_factors(100) [2, 2, 5, 5] >>> prime_factors(0) Traceback (most recent call last): ... ValueError: Only positive integers have prime factors >>> prime_factors(-10) Traceback (most recent call last): ... ValueError: Only positive integers have prime factors """ if n <= 0: raise ValueError("Only positive integers have prime factors") pf = [] while n % 2 == 0: pf.append(2) n = int(n / 2) for i in range(3, int(math.sqrt(n)) + 1, 2): while n % i == 0: pf.append(i) n = int(n / i) if n > 2: pf.append(n) return pf def number_of_divisors(n: int) -> int: """Calculate Number of Divisors of an Integer. >>> number_of_divisors(100) 9 >>> number_of_divisors(0) Traceback (most recent call last): ... ValueError: Only positive numbers are accepted >>> number_of_divisors(-10) Traceback (most recent call last): ... ValueError: Only positive numbers are accepted """ if n <= 0: raise ValueError("Only positive numbers are accepted") div = 1 temp = 1 while n % 2 == 0: temp += 1 n = int(n / 2) div *= temp for i in range(3, int(math.sqrt(n)) + 1, 2): temp = 1 while n % i == 0: temp += 1 n = int(n / i) div *= temp if n > 1: div *= 2 return div def sum_of_divisors(n: int) -> int: """Calculate Sum of Divisors. >>> sum_of_divisors(100) 217 >>> sum_of_divisors(0) Traceback (most recent call last): ... ValueError: Only positive numbers are accepted >>> sum_of_divisors(-10) Traceback (most recent call last): ... ValueError: Only positive numbers are accepted """ if n <= 0: raise ValueError("Only positive numbers are accepted") s = 1 temp = 1 while n % 2 == 0: temp += 1 n = int(n / 2) if temp > 1: s *= (2**temp - 1) / (2 - 1) for i in range(3, int(math.sqrt(n)) + 1, 2): temp = 1 while n % i == 0: temp += 1 n = int(n / i) if temp > 1: s *= (i**temp - 1) / (i - 1) return int(s) def euler_phi(n: int) -> int: """Calculate Euler's Phi Function. >>> euler_phi(100) 40 """ s = n for x in set(prime_factors(n)): s *= (x - 1) / x return int(s) if __name__ == "__main__": import doctest doctest.testmod()
"""Implementation of Basic Math in Python.""" import math def prime_factors(n: int) -> list: """Find Prime Factors. >>> prime_factors(100) [2, 2, 5, 5] >>> prime_factors(0) Traceback (most recent call last): ... ValueError: Only positive integers have prime factors >>> prime_factors(-10) Traceback (most recent call last): ... ValueError: Only positive integers have prime factors """ if n <= 0: raise ValueError("Only positive integers have prime factors") pf = [] while n % 2 == 0: pf.append(2) n = int(n / 2) for i in range(3, int(math.sqrt(n)) + 1, 2): while n % i == 0: pf.append(i) n = int(n / i) if n > 2: pf.append(n) return pf def number_of_divisors(n: int) -> int: """Calculate Number of Divisors of an Integer. >>> number_of_divisors(100) 9 >>> number_of_divisors(0) Traceback (most recent call last): ... ValueError: Only positive numbers are accepted >>> number_of_divisors(-10) Traceback (most recent call last): ... ValueError: Only positive numbers are accepted """ if n <= 0: raise ValueError("Only positive numbers are accepted") div = 1 temp = 1 while n % 2 == 0: temp += 1 n = int(n / 2) div *= temp for i in range(3, int(math.sqrt(n)) + 1, 2): temp = 1 while n % i == 0: temp += 1 n = int(n / i) div *= temp if n > 1: div *= 2 return div def sum_of_divisors(n: int) -> int: """Calculate Sum of Divisors. >>> sum_of_divisors(100) 217 >>> sum_of_divisors(0) Traceback (most recent call last): ... ValueError: Only positive numbers are accepted >>> sum_of_divisors(-10) Traceback (most recent call last): ... ValueError: Only positive numbers are accepted """ if n <= 0: raise ValueError("Only positive numbers are accepted") s = 1 temp = 1 while n % 2 == 0: temp += 1 n = int(n / 2) if temp > 1: s *= (2**temp - 1) / (2 - 1) for i in range(3, int(math.sqrt(n)) + 1, 2): temp = 1 while n % i == 0: temp += 1 n = int(n / i) if temp > 1: s *= (i**temp - 1) / (i - 1) return int(s) def euler_phi(n: int) -> int: """Calculate Euler's Phi Function. >>> euler_phi(100) 40 """ s = n for x in set(prime_factors(n)): s *= (x - 1) / x return int(s) if __name__ == "__main__": import doctest doctest.testmod()
-1
TheAlgorithms/Python
7,844
Remove file-level flake8 suppression
### Describe your change: Some files were completely excluded from flake8 checks. - Remove file-level suppressions - Fix simple warnings - Suppress long line indocstring * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
Cjkjvfnby
"2022-10-29T13:08:44Z"
"2022-10-29T20:45:21Z"
47ddba1d914bf5955a244056e794e718dee9ead1
1550731cb7457ddae216da2ffe0bc1587f5234f3
Remove file-level flake8 suppression. ### Describe your change: Some files were completely excluded from flake8 checks. - Remove file-level suppressions - Fix simple warnings - Suppress long line indocstring * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
# Algorithms to determine if a string is palindrome test_data = { "MALAYALAM": True, "String": False, "rotor": True, "level": True, "A": True, "BB": True, "ABC": False, "amanaplanacanalpanama": True, # "a man a plan a canal panama" } # Ensure our test data is valid assert all((key == key[::-1]) is value for key, value in test_data.items()) def is_palindrome(s: str) -> bool: """ Return True if s is a palindrome otherwise return False. >>> all(is_palindrome(key) is value for key, value in test_data.items()) True """ start_i = 0 end_i = len(s) - 1 while start_i < end_i: if s[start_i] == s[end_i]: start_i += 1 end_i -= 1 else: return False return True def is_palindrome_recursive(s: str) -> bool: """ Return True if s is a palindrome otherwise return False. >>> all(is_palindrome_recursive(key) is value for key, value in test_data.items()) True """ if len(s) <= 1: return True if s[0] == s[len(s) - 1]: return is_palindrome_recursive(s[1:-1]) else: return False def is_palindrome_slice(s: str) -> bool: """ Return True if s is a palindrome otherwise return False. >>> all(is_palindrome_slice(key) is value for key, value in test_data.items()) True """ return s == s[::-1] if __name__ == "__main__": for key, value in test_data.items(): assert is_palindrome(key) is is_palindrome_recursive(key) assert is_palindrome(key) is is_palindrome_slice(key) print(f"{key:21} {value}") print("a man a plan a canal panama")
# Algorithms to determine if a string is palindrome test_data = { "MALAYALAM": True, "String": False, "rotor": True, "level": True, "A": True, "BB": True, "ABC": False, "amanaplanacanalpanama": True, # "a man a plan a canal panama" } # Ensure our test data is valid assert all((key == key[::-1]) is value for key, value in test_data.items()) def is_palindrome(s: str) -> bool: """ Return True if s is a palindrome otherwise return False. >>> all(is_palindrome(key) is value for key, value in test_data.items()) True """ start_i = 0 end_i = len(s) - 1 while start_i < end_i: if s[start_i] == s[end_i]: start_i += 1 end_i -= 1 else: return False return True def is_palindrome_recursive(s: str) -> bool: """ Return True if s is a palindrome otherwise return False. >>> all(is_palindrome_recursive(key) is value for key, value in test_data.items()) True """ if len(s) <= 1: return True if s[0] == s[len(s) - 1]: return is_palindrome_recursive(s[1:-1]) else: return False def is_palindrome_slice(s: str) -> bool: """ Return True if s is a palindrome otherwise return False. >>> all(is_palindrome_slice(key) is value for key, value in test_data.items()) True """ return s == s[::-1] if __name__ == "__main__": for key, value in test_data.items(): assert is_palindrome(key) is is_palindrome_recursive(key) assert is_palindrome(key) is is_palindrome_slice(key) print(f"{key:21} {value}") print("a man a plan a canal panama")
-1
TheAlgorithms/Python
7,844
Remove file-level flake8 suppression
### Describe your change: Some files were completely excluded from flake8 checks. - Remove file-level suppressions - Fix simple warnings - Suppress long line indocstring * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
Cjkjvfnby
"2022-10-29T13:08:44Z"
"2022-10-29T20:45:21Z"
47ddba1d914bf5955a244056e794e718dee9ead1
1550731cb7457ddae216da2ffe0bc1587f5234f3
Remove file-level flake8 suppression. ### Describe your change: Some files were completely excluded from flake8 checks. - Remove file-level suppressions - Fix simple warnings - Suppress long line indocstring * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
""" Implementation of gaussian filter algorithm """ from itertools import product from cv2 import COLOR_BGR2GRAY, cvtColor, imread, imshow, waitKey from numpy import dot, exp, mgrid, pi, ravel, square, uint8, zeros def gen_gaussian_kernel(k_size, sigma): center = k_size // 2 x, y = mgrid[0 - center : k_size - center, 0 - center : k_size - center] g = 1 / (2 * pi * sigma) * exp(-(square(x) + square(y)) / (2 * square(sigma))) return g def gaussian_filter(image, k_size, sigma): height, width = image.shape[0], image.shape[1] # dst image height and width dst_height = height - k_size + 1 dst_width = width - k_size + 1 # im2col, turn the k_size*k_size pixels into a row and np.vstack all rows image_array = zeros((dst_height * dst_width, k_size * k_size)) row = 0 for i, j in product(range(dst_height), range(dst_width)): window = ravel(image[i : i + k_size, j : j + k_size]) image_array[row, :] = window row += 1 # turn the kernel into shape(k*k, 1) gaussian_kernel = gen_gaussian_kernel(k_size, sigma) filter_array = ravel(gaussian_kernel) # reshape and get the dst image dst = dot(image_array, filter_array).reshape(dst_height, dst_width).astype(uint8) return dst if __name__ == "__main__": # read original image img = imread(r"../image_data/lena.jpg") # turn image in gray scale value gray = cvtColor(img, COLOR_BGR2GRAY) # get values with two different mask size gaussian3x3 = gaussian_filter(gray, 3, sigma=1) gaussian5x5 = gaussian_filter(gray, 5, sigma=0.8) # show result images imshow("gaussian filter with 3x3 mask", gaussian3x3) imshow("gaussian filter with 5x5 mask", gaussian5x5) waitKey()
""" Implementation of gaussian filter algorithm """ from itertools import product from cv2 import COLOR_BGR2GRAY, cvtColor, imread, imshow, waitKey from numpy import dot, exp, mgrid, pi, ravel, square, uint8, zeros def gen_gaussian_kernel(k_size, sigma): center = k_size // 2 x, y = mgrid[0 - center : k_size - center, 0 - center : k_size - center] g = 1 / (2 * pi * sigma) * exp(-(square(x) + square(y)) / (2 * square(sigma))) return g def gaussian_filter(image, k_size, sigma): height, width = image.shape[0], image.shape[1] # dst image height and width dst_height = height - k_size + 1 dst_width = width - k_size + 1 # im2col, turn the k_size*k_size pixels into a row and np.vstack all rows image_array = zeros((dst_height * dst_width, k_size * k_size)) row = 0 for i, j in product(range(dst_height), range(dst_width)): window = ravel(image[i : i + k_size, j : j + k_size]) image_array[row, :] = window row += 1 # turn the kernel into shape(k*k, 1) gaussian_kernel = gen_gaussian_kernel(k_size, sigma) filter_array = ravel(gaussian_kernel) # reshape and get the dst image dst = dot(image_array, filter_array).reshape(dst_height, dst_width).astype(uint8) return dst if __name__ == "__main__": # read original image img = imread(r"../image_data/lena.jpg") # turn image in gray scale value gray = cvtColor(img, COLOR_BGR2GRAY) # get values with two different mask size gaussian3x3 = gaussian_filter(gray, 3, sigma=1) gaussian5x5 = gaussian_filter(gray, 5, sigma=0.8) # show result images imshow("gaussian filter with 3x3 mask", gaussian3x3) imshow("gaussian filter with 5x5 mask", gaussian5x5) waitKey()
-1
TheAlgorithms/Python
7,844
Remove file-level flake8 suppression
### Describe your change: Some files were completely excluded from flake8 checks. - Remove file-level suppressions - Fix simple warnings - Suppress long line indocstring * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
Cjkjvfnby
"2022-10-29T13:08:44Z"
"2022-10-29T20:45:21Z"
47ddba1d914bf5955a244056e794e718dee9ead1
1550731cb7457ddae216da2ffe0bc1587f5234f3
Remove file-level flake8 suppression. ### Describe your change: Some files were completely excluded from flake8 checks. - Remove file-level suppressions - Fix simple warnings - Suppress long line indocstring * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
-1
TheAlgorithms/Python
7,844
Remove file-level flake8 suppression
### Describe your change: Some files were completely excluded from flake8 checks. - Remove file-level suppressions - Fix simple warnings - Suppress long line indocstring * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
Cjkjvfnby
"2022-10-29T13:08:44Z"
"2022-10-29T20:45:21Z"
47ddba1d914bf5955a244056e794e718dee9ead1
1550731cb7457ddae216da2ffe0bc1587f5234f3
Remove file-level flake8 suppression. ### Describe your change: Some files were completely excluded from flake8 checks. - Remove file-level suppressions - Fix simple warnings - Suppress long line indocstring * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
""" Algorithm for calculating the most cost-efficient sequence for converting one string into another. The only allowed operations are --- Cost to copy a character is copy_cost --- Cost to replace a character is replace_cost --- Cost to delete a character is delete_cost --- Cost to insert a character is insert_cost """ def compute_transform_tables( source_string: str, destination_string: str, copy_cost: int, replace_cost: int, delete_cost: int, insert_cost: int, ) -> tuple[list[list[int]], list[list[str]]]: source_seq = list(source_string) destination_seq = list(destination_string) len_source_seq = len(source_seq) len_destination_seq = len(destination_seq) costs = [ [0 for _ in range(len_destination_seq + 1)] for _ in range(len_source_seq + 1) ] ops = [ ["0" for _ in range(len_destination_seq + 1)] for _ in range(len_source_seq + 1) ] for i in range(1, len_source_seq + 1): costs[i][0] = i * delete_cost ops[i][0] = f"D{source_seq[i - 1]:c}" for i in range(1, len_destination_seq + 1): costs[0][i] = i * insert_cost ops[0][i] = f"I{destination_seq[i - 1]:c}" for i in range(1, len_source_seq + 1): for j in range(1, len_destination_seq + 1): if source_seq[i - 1] == destination_seq[j - 1]: costs[i][j] = costs[i - 1][j - 1] + copy_cost ops[i][j] = f"C{source_seq[i - 1]:c}" else: costs[i][j] = costs[i - 1][j - 1] + replace_cost ops[i][j] = f"R{source_seq[i - 1]:c}" + str(destination_seq[j - 1]) if costs[i - 1][j] + delete_cost < costs[i][j]: costs[i][j] = costs[i - 1][j] + delete_cost ops[i][j] = f"D{source_seq[i - 1]:c}" if costs[i][j - 1] + insert_cost < costs[i][j]: costs[i][j] = costs[i][j - 1] + insert_cost ops[i][j] = f"I{destination_seq[j - 1]:c}" return costs, ops def assemble_transformation(ops: list[list[str]], i: int, j: int) -> list[str]: if i == 0 and j == 0: return [] else: if ops[i][j][0] == "C" or ops[i][j][0] == "R": seq = assemble_transformation(ops, i - 1, j - 1) seq.append(ops[i][j]) return seq elif ops[i][j][0] == "D": seq = assemble_transformation(ops, i - 1, j) seq.append(ops[i][j]) return seq else: seq = assemble_transformation(ops, i, j - 1) seq.append(ops[i][j]) return seq if __name__ == "__main__": _, operations = compute_transform_tables("Python", "Algorithms", -1, 1, 2, 2) m = len(operations) n = len(operations[0]) sequence = assemble_transformation(operations, m - 1, n - 1) string = list("Python") i = 0 cost = 0 with open("min_cost.txt", "w") as file: for op in sequence: print("".join(string)) if op[0] == "C": file.write("%-16s" % "Copy %c" % op[1]) file.write("\t\t\t" + "".join(string)) file.write("\r\n") cost -= 1 elif op[0] == "R": string[i] = op[2] file.write("%-16s" % ("Replace %c" % op[1] + " with " + str(op[2]))) file.write("\t\t" + "".join(string)) file.write("\r\n") cost += 1 elif op[0] == "D": string.pop(i) file.write("%-16s" % "Delete %c" % op[1]) file.write("\t\t\t" + "".join(string)) file.write("\r\n") cost += 2 else: string.insert(i, op[1]) file.write("%-16s" % "Insert %c" % op[1]) file.write("\t\t\t" + "".join(string)) file.write("\r\n") cost += 2 i += 1 print("".join(string)) print("Cost: ", cost) file.write("\r\nMinimum cost: " + str(cost))
""" Algorithm for calculating the most cost-efficient sequence for converting one string into another. The only allowed operations are --- Cost to copy a character is copy_cost --- Cost to replace a character is replace_cost --- Cost to delete a character is delete_cost --- Cost to insert a character is insert_cost """ def compute_transform_tables( source_string: str, destination_string: str, copy_cost: int, replace_cost: int, delete_cost: int, insert_cost: int, ) -> tuple[list[list[int]], list[list[str]]]: source_seq = list(source_string) destination_seq = list(destination_string) len_source_seq = len(source_seq) len_destination_seq = len(destination_seq) costs = [ [0 for _ in range(len_destination_seq + 1)] for _ in range(len_source_seq + 1) ] ops = [ ["0" for _ in range(len_destination_seq + 1)] for _ in range(len_source_seq + 1) ] for i in range(1, len_source_seq + 1): costs[i][0] = i * delete_cost ops[i][0] = f"D{source_seq[i - 1]:c}" for i in range(1, len_destination_seq + 1): costs[0][i] = i * insert_cost ops[0][i] = f"I{destination_seq[i - 1]:c}" for i in range(1, len_source_seq + 1): for j in range(1, len_destination_seq + 1): if source_seq[i - 1] == destination_seq[j - 1]: costs[i][j] = costs[i - 1][j - 1] + copy_cost ops[i][j] = f"C{source_seq[i - 1]:c}" else: costs[i][j] = costs[i - 1][j - 1] + replace_cost ops[i][j] = f"R{source_seq[i - 1]:c}" + str(destination_seq[j - 1]) if costs[i - 1][j] + delete_cost < costs[i][j]: costs[i][j] = costs[i - 1][j] + delete_cost ops[i][j] = f"D{source_seq[i - 1]:c}" if costs[i][j - 1] + insert_cost < costs[i][j]: costs[i][j] = costs[i][j - 1] + insert_cost ops[i][j] = f"I{destination_seq[j - 1]:c}" return costs, ops def assemble_transformation(ops: list[list[str]], i: int, j: int) -> list[str]: if i == 0 and j == 0: return [] else: if ops[i][j][0] == "C" or ops[i][j][0] == "R": seq = assemble_transformation(ops, i - 1, j - 1) seq.append(ops[i][j]) return seq elif ops[i][j][0] == "D": seq = assemble_transformation(ops, i - 1, j) seq.append(ops[i][j]) return seq else: seq = assemble_transformation(ops, i, j - 1) seq.append(ops[i][j]) return seq if __name__ == "__main__": _, operations = compute_transform_tables("Python", "Algorithms", -1, 1, 2, 2) m = len(operations) n = len(operations[0]) sequence = assemble_transformation(operations, m - 1, n - 1) string = list("Python") i = 0 cost = 0 with open("min_cost.txt", "w") as file: for op in sequence: print("".join(string)) if op[0] == "C": file.write("%-16s" % "Copy %c" % op[1]) file.write("\t\t\t" + "".join(string)) file.write("\r\n") cost -= 1 elif op[0] == "R": string[i] = op[2] file.write("%-16s" % ("Replace %c" % op[1] + " with " + str(op[2]))) file.write("\t\t" + "".join(string)) file.write("\r\n") cost += 1 elif op[0] == "D": string.pop(i) file.write("%-16s" % "Delete %c" % op[1]) file.write("\t\t\t" + "".join(string)) file.write("\r\n") cost += 2 else: string.insert(i, op[1]) file.write("%-16s" % "Insert %c" % op[1]) file.write("\t\t\t" + "".join(string)) file.write("\r\n") cost += 2 i += 1 print("".join(string)) print("Cost: ", cost) file.write("\r\nMinimum cost: " + str(cost))
-1
TheAlgorithms/Python
7,844
Remove file-level flake8 suppression
### Describe your change: Some files were completely excluded from flake8 checks. - Remove file-level suppressions - Fix simple warnings - Suppress long line indocstring * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
Cjkjvfnby
"2022-10-29T13:08:44Z"
"2022-10-29T20:45:21Z"
47ddba1d914bf5955a244056e794e718dee9ead1
1550731cb7457ddae216da2ffe0bc1587f5234f3
Remove file-level flake8 suppression. ### Describe your change: Some files were completely excluded from flake8 checks. - Remove file-level suppressions - Fix simple warnings - Suppress long line indocstring * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
import math from numpy import inf from scipy.integrate import quad def gamma(num: float) -> float: """ https://en.wikipedia.org/wiki/Gamma_function In mathematics, the gamma function is one commonly used extension of the factorial function to complex numbers. The gamma function is defined for all complex numbers except the non-positive integers >>> gamma(-1) Traceback (most recent call last): ... ValueError: math domain error >>> gamma(0) Traceback (most recent call last): ... ValueError: math domain error >>> gamma(9) 40320.0 >>> from math import gamma as math_gamma >>> all(.99999999 < gamma(i) / math_gamma(i) <= 1.000000001 ... for i in range(1, 50)) True >>> from math import gamma as math_gamma >>> gamma(-1)/math_gamma(-1) <= 1.000000001 Traceback (most recent call last): ... ValueError: math domain error >>> from math import gamma as math_gamma >>> gamma(3.3) - math_gamma(3.3) <= 0.00000001 True """ if num <= 0: raise ValueError("math domain error") return quad(integrand, 0, inf, args=(num))[0] def integrand(x: float, z: float) -> float: return math.pow(x, z - 1) * math.exp(-x) if __name__ == "__main__": from doctest import testmod testmod()
import math from numpy import inf from scipy.integrate import quad def gamma(num: float) -> float: """ https://en.wikipedia.org/wiki/Gamma_function In mathematics, the gamma function is one commonly used extension of the factorial function to complex numbers. The gamma function is defined for all complex numbers except the non-positive integers >>> gamma(-1) Traceback (most recent call last): ... ValueError: math domain error >>> gamma(0) Traceback (most recent call last): ... ValueError: math domain error >>> gamma(9) 40320.0 >>> from math import gamma as math_gamma >>> all(.99999999 < gamma(i) / math_gamma(i) <= 1.000000001 ... for i in range(1, 50)) True >>> from math import gamma as math_gamma >>> gamma(-1)/math_gamma(-1) <= 1.000000001 Traceback (most recent call last): ... ValueError: math domain error >>> from math import gamma as math_gamma >>> gamma(3.3) - math_gamma(3.3) <= 0.00000001 True """ if num <= 0: raise ValueError("math domain error") return quad(integrand, 0, inf, args=(num))[0] def integrand(x: float, z: float) -> float: return math.pow(x, z - 1) * math.exp(-x) if __name__ == "__main__": from doctest import testmod testmod()
-1
TheAlgorithms/Python
7,844
Remove file-level flake8 suppression
### Describe your change: Some files were completely excluded from flake8 checks. - Remove file-level suppressions - Fix simple warnings - Suppress long line indocstring * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
Cjkjvfnby
"2022-10-29T13:08:44Z"
"2022-10-29T20:45:21Z"
47ddba1d914bf5955a244056e794e718dee9ead1
1550731cb7457ddae216da2ffe0bc1587f5234f3
Remove file-level flake8 suppression. ### Describe your change: Some files were completely excluded from flake8 checks. - Remove file-level suppressions - Fix simple warnings - Suppress long line indocstring * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
"""Lower-Upper (LU) Decomposition. Reference: - https://en.wikipedia.org/wiki/LU_decomposition """ from __future__ import annotations import numpy as np from numpy import float64 from numpy.typing import ArrayLike def lower_upper_decomposition( table: ArrayLike[float64], ) -> tuple[ArrayLike[float64], ArrayLike[float64]]: """Lower-Upper (LU) Decomposition Example: >>> matrix = np.array([[2, -2, 1], [0, 1, 2], [5, 3, 1]]) >>> outcome = lower_upper_decomposition(matrix) >>> outcome[0] array([[1. , 0. , 0. ], [0. , 1. , 0. ], [2.5, 8. , 1. ]]) >>> outcome[1] array([[ 2. , -2. , 1. ], [ 0. , 1. , 2. ], [ 0. , 0. , -17.5]]) >>> matrix = np.array([[2, -2, 1], [0, 1, 2]]) >>> lower_upper_decomposition(matrix) Traceback (most recent call last): ... ValueError: 'table' has to be of square shaped array but got a 2x3 array: [[ 2 -2 1] [ 0 1 2]] """ # Table that contains our data # Table has to be a square array so we need to check first rows, columns = np.shape(table) if rows != columns: raise ValueError( f"'table' has to be of square shaped array but got a {rows}x{columns} " + f"array:\n{table}" ) lower = np.zeros((rows, columns)) upper = np.zeros((rows, columns)) for i in range(columns): for j in range(i): total = 0 for k in range(j): total += lower[i][k] * upper[k][j] lower[i][j] = (table[i][j] - total) / upper[j][j] lower[i][i] = 1 for j in range(i, columns): total = 0 for k in range(i): total += lower[i][k] * upper[k][j] upper[i][j] = table[i][j] - total return lower, upper if __name__ == "__main__": import doctest doctest.testmod()
"""Lower-Upper (LU) Decomposition. Reference: - https://en.wikipedia.org/wiki/LU_decomposition """ from __future__ import annotations import numpy as np from numpy import float64 from numpy.typing import ArrayLike def lower_upper_decomposition( table: ArrayLike[float64], ) -> tuple[ArrayLike[float64], ArrayLike[float64]]: """Lower-Upper (LU) Decomposition Example: >>> matrix = np.array([[2, -2, 1], [0, 1, 2], [5, 3, 1]]) >>> outcome = lower_upper_decomposition(matrix) >>> outcome[0] array([[1. , 0. , 0. ], [0. , 1. , 0. ], [2.5, 8. , 1. ]]) >>> outcome[1] array([[ 2. , -2. , 1. ], [ 0. , 1. , 2. ], [ 0. , 0. , -17.5]]) >>> matrix = np.array([[2, -2, 1], [0, 1, 2]]) >>> lower_upper_decomposition(matrix) Traceback (most recent call last): ... ValueError: 'table' has to be of square shaped array but got a 2x3 array: [[ 2 -2 1] [ 0 1 2]] """ # Table that contains our data # Table has to be a square array so we need to check first rows, columns = np.shape(table) if rows != columns: raise ValueError( f"'table' has to be of square shaped array but got a {rows}x{columns} " + f"array:\n{table}" ) lower = np.zeros((rows, columns)) upper = np.zeros((rows, columns)) for i in range(columns): for j in range(i): total = 0 for k in range(j): total += lower[i][k] * upper[k][j] lower[i][j] = (table[i][j] - total) / upper[j][j] lower[i][i] = 1 for j in range(i, columns): total = 0 for k in range(i): total += lower[i][k] * upper[k][j] upper[i][j] = table[i][j] - total return lower, upper if __name__ == "__main__": import doctest doctest.testmod()
-1
TheAlgorithms/Python
7,844
Remove file-level flake8 suppression
### Describe your change: Some files were completely excluded from flake8 checks. - Remove file-level suppressions - Fix simple warnings - Suppress long line indocstring * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
Cjkjvfnby
"2022-10-29T13:08:44Z"
"2022-10-29T20:45:21Z"
47ddba1d914bf5955a244056e794e718dee9ead1
1550731cb7457ddae216da2ffe0bc1587f5234f3
Remove file-level flake8 suppression. ### Describe your change: Some files were completely excluded from flake8 checks. - Remove file-level suppressions - Fix simple warnings - Suppress long line indocstring * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
""" References: wikipedia:square free number python/black : True flake8 : True """ from __future__ import annotations def is_square_free(factors: list[int]) -> bool: """ # doctest: +NORMALIZE_WHITESPACE This functions takes a list of prime factors as input. returns True if the factors are square free. >>> is_square_free([1, 1, 2, 3, 4]) False These are wrong but should return some value it simply checks for repetition in the numbers. >>> is_square_free([1, 3, 4, 'sd', 0.0]) True >>> is_square_free([1, 0.5, 2, 0.0]) True >>> is_square_free([1, 2, 2, 5]) False >>> is_square_free('asd') True >>> is_square_free(24) Traceback (most recent call last): ... TypeError: 'int' object is not iterable """ return len(set(factors)) == len(factors) if __name__ == "__main__": import doctest doctest.testmod()
""" References: wikipedia:square free number python/black : True flake8 : True """ from __future__ import annotations def is_square_free(factors: list[int]) -> bool: """ # doctest: +NORMALIZE_WHITESPACE This functions takes a list of prime factors as input. returns True if the factors are square free. >>> is_square_free([1, 1, 2, 3, 4]) False These are wrong but should return some value it simply checks for repetition in the numbers. >>> is_square_free([1, 3, 4, 'sd', 0.0]) True >>> is_square_free([1, 0.5, 2, 0.0]) True >>> is_square_free([1, 2, 2, 5]) False >>> is_square_free('asd') True >>> is_square_free(24) Traceback (most recent call last): ... TypeError: 'int' object is not iterable """ return len(set(factors)) == len(factors) if __name__ == "__main__": import doctest doctest.testmod()
-1
TheAlgorithms/Python
7,844
Remove file-level flake8 suppression
### Describe your change: Some files were completely excluded from flake8 checks. - Remove file-level suppressions - Fix simple warnings - Suppress long line indocstring * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
Cjkjvfnby
"2022-10-29T13:08:44Z"
"2022-10-29T20:45:21Z"
47ddba1d914bf5955a244056e794e718dee9ead1
1550731cb7457ddae216da2ffe0bc1587f5234f3
Remove file-level flake8 suppression. ### Describe your change: Some files were completely excluded from flake8 checks. - Remove file-level suppressions - Fix simple warnings - Suppress long line indocstring * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
from collections import deque def tarjan(g): """ Tarjan's algo for finding strongly connected components in a directed graph Uses two main attributes of each node to track reachability, the index of that node within a component(index), and the lowest index reachable from that node(lowlink). We then perform a dfs of the each component making sure to update these parameters for each node and saving the nodes we visit on the way. If ever we find that the lowest reachable node from a current node is equal to the index of the current node then it must be the root of a strongly connected component and so we save it and it's equireachable vertices as a strongly connected component. Complexity: strong_connect() is called at most once for each node and has a complexity of O(|E|) as it is DFS. Therefore this has complexity O(|V| + |E|) for a graph G = (V, E) """ n = len(g) stack = deque() on_stack = [False for _ in range(n)] index_of = [-1 for _ in range(n)] lowlink_of = index_of[:] def strong_connect(v, index, components): index_of[v] = index # the number when this node is seen lowlink_of[v] = index # lowest rank node reachable from here index += 1 stack.append(v) on_stack[v] = True for w in g[v]: if index_of[w] == -1: index = strong_connect(w, index, components) lowlink_of[v] = ( lowlink_of[w] if lowlink_of[w] < lowlink_of[v] else lowlink_of[v] ) elif on_stack[w]: lowlink_of[v] = ( lowlink_of[w] if lowlink_of[w] < lowlink_of[v] else lowlink_of[v] ) if lowlink_of[v] == index_of[v]: component = [] w = stack.pop() on_stack[w] = False component.append(w) while w != v: w = stack.pop() on_stack[w] = False component.append(w) components.append(component) return index components = [] for v in range(n): if index_of[v] == -1: strong_connect(v, 0, components) return components def create_graph(n, edges): g = [[] for _ in range(n)] for u, v in edges: g[u].append(v) return g if __name__ == "__main__": # Test n_vertices = 7 source = [0, 0, 1, 2, 3, 3, 4, 4, 6] target = [1, 3, 2, 0, 1, 4, 5, 6, 5] edges = [(u, v) for u, v in zip(source, target)] g = create_graph(n_vertices, edges) assert [[5], [6], [4], [3, 2, 1, 0]] == tarjan(g)
from collections import deque def tarjan(g): """ Tarjan's algo for finding strongly connected components in a directed graph Uses two main attributes of each node to track reachability, the index of that node within a component(index), and the lowest index reachable from that node(lowlink). We then perform a dfs of the each component making sure to update these parameters for each node and saving the nodes we visit on the way. If ever we find that the lowest reachable node from a current node is equal to the index of the current node then it must be the root of a strongly connected component and so we save it and it's equireachable vertices as a strongly connected component. Complexity: strong_connect() is called at most once for each node and has a complexity of O(|E|) as it is DFS. Therefore this has complexity O(|V| + |E|) for a graph G = (V, E) """ n = len(g) stack = deque() on_stack = [False for _ in range(n)] index_of = [-1 for _ in range(n)] lowlink_of = index_of[:] def strong_connect(v, index, components): index_of[v] = index # the number when this node is seen lowlink_of[v] = index # lowest rank node reachable from here index += 1 stack.append(v) on_stack[v] = True for w in g[v]: if index_of[w] == -1: index = strong_connect(w, index, components) lowlink_of[v] = ( lowlink_of[w] if lowlink_of[w] < lowlink_of[v] else lowlink_of[v] ) elif on_stack[w]: lowlink_of[v] = ( lowlink_of[w] if lowlink_of[w] < lowlink_of[v] else lowlink_of[v] ) if lowlink_of[v] == index_of[v]: component = [] w = stack.pop() on_stack[w] = False component.append(w) while w != v: w = stack.pop() on_stack[w] = False component.append(w) components.append(component) return index components = [] for v in range(n): if index_of[v] == -1: strong_connect(v, 0, components) return components def create_graph(n, edges): g = [[] for _ in range(n)] for u, v in edges: g[u].append(v) return g if __name__ == "__main__": # Test n_vertices = 7 source = [0, 0, 1, 2, 3, 3, 4, 4, 6] target = [1, 3, 2, 0, 1, 4, 5, 6, 5] edges = [(u, v) for u, v in zip(source, target)] g = create_graph(n_vertices, edges) assert [[5], [6], [4], [3, 2, 1, 0]] == tarjan(g)
-1
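The Tarjan record above finds strongly connected components with a single DFS plus a lowlink stack. As an independent cross-check (this is not the PR's code), the following minimal sketch computes the same components with Kosaraju's two-pass algorithm; the function name kosaraju_scc and the toy graph are my own assumptions:

def kosaraju_scc(graph: list[list[int]]) -> list[list[int]]:
    """Return the strongly connected components of a directed graph
    given as an adjacency list."""
    n = len(graph)
    order: list[int] = []
    visited = [False] * n

    def dfs_order(u: int) -> None:
        visited[u] = True
        for v in graph[u]:
            if not visited[v]:
                dfs_order(v)
        order.append(u)  # post-order: a vertex is pushed once fully explored

    for u in range(n):
        if not visited[u]:
            dfs_order(u)

    # Build the transpose graph (every edge reversed).
    transpose: list[list[int]] = [[] for _ in range(n)]
    for u in range(n):
        for v in graph[u]:
            transpose[v].append(u)

    components: list[list[int]] = []
    assigned = [False] * n
    for u in reversed(order):  # process vertices in decreasing finish time
        if not assigned[u]:
            stack, component = [u], []
            assigned[u] = True
            while stack:
                node = stack.pop()
                component.append(node)
                for v in transpose[node]:
                    if not assigned[v]:
                        assigned[v] = True
                        stack.append(v)
            components.append(component)
    return components


if __name__ == "__main__":
    g = [[1], [2], [0, 3], [4], []]  # 0->1->2->0 is one SCC; 3 and 4 are singletons
    print(kosaraju_scc(g))  # [[0, 2, 1], [3], [4]]

Running both implementations on the same adjacency list and comparing the component sets (order-insensitively) is a cheap way to validate changes to either one.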
TheAlgorithms/Python
7,844
Remove file-level flake8 suppression
### Describe your change: Some files were completely excluded from flake8 checks. - Remove file-level suppressions - Fix simple warnings - Suppress long line indocstring * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
Cjkjvfnby
"2022-10-29T13:08:44Z"
"2022-10-29T20:45:21Z"
47ddba1d914bf5955a244056e794e718dee9ead1
1550731cb7457ddae216da2ffe0bc1587f5234f3
Remove file-level flake8 suppression. ### Describe your change: Some files were completely excluded from flake8 checks. - Remove file-level suppressions - Fix simple warnings - Suppress long line indocstring * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
import sys """ Dynamic Programming Implementation of Matrix Chain Multiplication Time Complexity: O(n^3) Space Complexity: O(n^2) """ def matrix_chain_order(array): n = len(array) matrix = [[0 for x in range(n)] for x in range(n)] sol = [[0 for x in range(n)] for x in range(n)] for chain_length in range(2, n): for a in range(1, n - chain_length + 1): b = a + chain_length - 1 matrix[a][b] = sys.maxsize for c in range(a, b): cost = ( matrix[a][c] + matrix[c + 1][b] + array[a - 1] * array[c] * array[b] ) if cost < matrix[a][b]: matrix[a][b] = cost sol[a][b] = c return matrix, sol # Print order of matrix with Ai as Matrix def print_optiomal_solution(optimal_solution, i, j): if i == j: print("A" + str(i), end=" ") else: print("(", end=" ") print_optiomal_solution(optimal_solution, i, optimal_solution[i][j]) print_optiomal_solution(optimal_solution, optimal_solution[i][j] + 1, j) print(")", end=" ") def main(): array = [30, 35, 15, 5, 10, 20, 25] n = len(array) # Size of matrix created from above array will be # 30*35 35*15 15*5 5*10 10*20 20*25 matrix, optimal_solution = matrix_chain_order(array) print("No. of Operation required: " + str(matrix[1][n - 1])) print_optiomal_solution(optimal_solution, 1, n - 1) if __name__ == "__main__": main()
import sys """ Dynamic Programming Implementation of Matrix Chain Multiplication Time Complexity: O(n^3) Space Complexity: O(n^2) """ def matrix_chain_order(array): n = len(array) matrix = [[0 for x in range(n)] for x in range(n)] sol = [[0 for x in range(n)] for x in range(n)] for chain_length in range(2, n): for a in range(1, n - chain_length + 1): b = a + chain_length - 1 matrix[a][b] = sys.maxsize for c in range(a, b): cost = ( matrix[a][c] + matrix[c + 1][b] + array[a - 1] * array[c] * array[b] ) if cost < matrix[a][b]: matrix[a][b] = cost sol[a][b] = c return matrix, sol # Print order of matrix with Ai as Matrix def print_optiomal_solution(optimal_solution, i, j): if i == j: print("A" + str(i), end=" ") else: print("(", end=" ") print_optiomal_solution(optimal_solution, i, optimal_solution[i][j]) print_optiomal_solution(optimal_solution, optimal_solution[i][j] + 1, j) print(")", end=" ") def main(): array = [30, 35, 15, 5, 10, 20, 25] n = len(array) # Size of matrix created from above array will be # 30*35 35*15 15*5 5*10 10*20 20*25 matrix, optimal_solution = matrix_chain_order(array) print("No. of Operation required: " + str(matrix[1][n - 1])) print_optiomal_solution(optimal_solution, 1, n - 1) if __name__ == "__main__": main()
-1
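The matrix-chain record above fills the DP table bottom-up. A top-down memoised sketch of the same recurrence is shown below; it is my own illustration (the name matrix_chain_min_cost is hypothetical) and, for the sample dimensions [30, 35, 15, 5, 10, 20, 25], it should agree with the matrix[1][n - 1] value the PR's main() prints:

from functools import lru_cache


def matrix_chain_min_cost(dims: list[int]) -> int:
    """Minimum number of scalar multiplications needed to multiply a chain
    of matrices where matrix i has shape dims[i-1] x dims[i]."""

    @lru_cache(maxsize=None)
    def best(i: int, j: int) -> int:
        # Cheapest way to multiply the sub-chain A_i .. A_j (1-indexed).
        if i == j:
            return 0
        return min(
            best(i, k) + best(k + 1, j) + dims[i - 1] * dims[k] * dims[j]
            for k in range(i, j)
        )

    return best(1, len(dims) - 1)


if __name__ == "__main__":
    print(matrix_chain_min_cost([30, 35, 15, 5, 10, 20, 25]))  # 15125 for this classic example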
TheAlgorithms/Python
7,844
Remove file-level flake8 suppression
### Describe your change: Some files were completely excluded from flake8 checks. - Remove file-level suppressions - Fix simple warnings - Suppress long line indocstring * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
Cjkjvfnby
"2022-10-29T13:08:44Z"
"2022-10-29T20:45:21Z"
47ddba1d914bf5955a244056e794e718dee9ead1
1550731cb7457ddae216da2ffe0bc1587f5234f3
Remove file-level flake8 suppression. ### Describe your change: Some files were completely excluded from flake8 checks. - Remove file-level suppressions - Fix simple warnings - Suppress long line indocstring * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
# Required imports to run this file import matplotlib.pyplot as plt import numpy as np # weighted matrix def weighted_matrix(point: np.mat, training_data_x: np.mat, bandwidth: float) -> np.mat: """ Calculate the weight for every point in the data set. It takes training_point , query_point, and tau Here Tau is not a fixed value it can be varied depends on output. tau --> bandwidth xmat -->Training data point --> the x where we want to make predictions >>> weighted_matrix(np.array([1., 1.]),np.mat([[16.99, 10.34], [21.01,23.68], ... [24.59,25.69]]), 0.6) matrix([[1.43807972e-207, 0.00000000e+000, 0.00000000e+000], [0.00000000e+000, 0.00000000e+000, 0.00000000e+000], [0.00000000e+000, 0.00000000e+000, 0.00000000e+000]]) """ # m is the number of training samples m, n = np.shape(training_data_x) # Initializing weights as identity matrix weights = np.mat(np.eye(m)) # calculating weights for all training examples [x(i)'s] for j in range(m): diff = point - training_data_x[j] weights[j, j] = np.exp(diff * diff.T / (-2.0 * bandwidth**2)) return weights def local_weight( point: np.mat, training_data_x: np.mat, training_data_y: np.mat, bandwidth: float ) -> np.mat: """ Calculate the local weights using the weight_matrix function on training data. Return the weighted matrix. >>> local_weight(np.array([1., 1.]),np.mat([[16.99, 10.34], [21.01,23.68], ... [24.59,25.69]]),np.mat([[1.01, 1.66, 3.5]]), 0.6) matrix([[0.00873174], [0.08272556]]) """ weight = weighted_matrix(point, training_data_x, bandwidth) w = (training_data_x.T * (weight * training_data_x)).I * ( training_data_x.T * weight * training_data_y.T ) return w def local_weight_regression( training_data_x: np.mat, training_data_y: np.mat, bandwidth: float ) -> np.mat: """ Calculate predictions for each data point on axis. >>> local_weight_regression(np.mat([[16.99, 10.34], [21.01,23.68], ... [24.59,25.69]]),np.mat([[1.01, 1.66, 3.5]]), 0.6) array([1.07173261, 1.65970737, 3.50160179]) """ m, n = np.shape(training_data_x) ypred = np.zeros(m) for i, item in enumerate(training_data_x): ypred[i] = item * local_weight( item, training_data_x, training_data_y, bandwidth ) return ypred def load_data(dataset_name: str, cola_name: str, colb_name: str) -> np.mat: """ Function used for loading data from the seaborn splitting into x and y points """ import seaborn as sns data = sns.load_dataset(dataset_name) col_a = np.array(data[cola_name]) # total_bill col_b = np.array(data[colb_name]) # tip mcol_a = np.mat(col_a) mcol_b = np.mat(col_b) m = np.shape(mcol_b)[1] one = np.ones((1, m), dtype=int) # horizontal stacking training_data_x = np.hstack((one.T, mcol_a.T)) return training_data_x, mcol_b, col_a, col_b def get_preds(training_data_x: np.mat, mcol_b: np.mat, tau: float) -> np.ndarray: """ Get predictions with minimum error for each training data >>> get_preds(np.mat([[16.99, 10.34], [21.01,23.68], ... 
[24.59,25.69]]),np.mat([[1.01, 1.66, 3.5]]), 0.6) array([1.07173261, 1.65970737, 3.50160179]) """ ypred = local_weight_regression(training_data_x, mcol_b, tau) return ypred def plot_preds( training_data_x: np.mat, predictions: np.ndarray, col_x: np.ndarray, col_y: np.ndarray, cola_name: str, colb_name: str, ) -> plt.plot: """ This function used to plot predictions and display the graph """ xsort = training_data_x.copy() xsort.sort(axis=0) plt.scatter(col_x, col_y, color="blue") plt.plot( xsort[:, 1], predictions[training_data_x[:, 1].argsort(0)], color="yellow", linewidth=5, ) plt.title("Local Weighted Regression") plt.xlabel(cola_name) plt.ylabel(colb_name) plt.show() if __name__ == "__main__": training_data_x, mcol_b, col_a, col_b = load_data("tips", "total_bill", "tip") predictions = get_preds(training_data_x, mcol_b, 0.5) plot_preds(training_data_x, predictions, col_a, col_b, "total_bill", "tip")
# Required imports to run this file import matplotlib.pyplot as plt import numpy as np # weighted matrix def weighted_matrix(point: np.mat, training_data_x: np.mat, bandwidth: float) -> np.mat: """ Calculate the weight for every point in the data set. It takes training_point , query_point, and tau Here Tau is not a fixed value it can be varied depends on output. tau --> bandwidth xmat -->Training data point --> the x where we want to make predictions >>> weighted_matrix(np.array([1., 1.]),np.mat([[16.99, 10.34], [21.01,23.68], ... [24.59,25.69]]), 0.6) matrix([[1.43807972e-207, 0.00000000e+000, 0.00000000e+000], [0.00000000e+000, 0.00000000e+000, 0.00000000e+000], [0.00000000e+000, 0.00000000e+000, 0.00000000e+000]]) """ # m is the number of training samples m, n = np.shape(training_data_x) # Initializing weights as identity matrix weights = np.mat(np.eye(m)) # calculating weights for all training examples [x(i)'s] for j in range(m): diff = point - training_data_x[j] weights[j, j] = np.exp(diff * diff.T / (-2.0 * bandwidth**2)) return weights def local_weight( point: np.mat, training_data_x: np.mat, training_data_y: np.mat, bandwidth: float ) -> np.mat: """ Calculate the local weights using the weight_matrix function on training data. Return the weighted matrix. >>> local_weight(np.array([1., 1.]),np.mat([[16.99, 10.34], [21.01,23.68], ... [24.59,25.69]]),np.mat([[1.01, 1.66, 3.5]]), 0.6) matrix([[0.00873174], [0.08272556]]) """ weight = weighted_matrix(point, training_data_x, bandwidth) w = (training_data_x.T * (weight * training_data_x)).I * ( training_data_x.T * weight * training_data_y.T ) return w def local_weight_regression( training_data_x: np.mat, training_data_y: np.mat, bandwidth: float ) -> np.mat: """ Calculate predictions for each data point on axis. >>> local_weight_regression(np.mat([[16.99, 10.34], [21.01,23.68], ... [24.59,25.69]]),np.mat([[1.01, 1.66, 3.5]]), 0.6) array([1.07173261, 1.65970737, 3.50160179]) """ m, n = np.shape(training_data_x) ypred = np.zeros(m) for i, item in enumerate(training_data_x): ypred[i] = item * local_weight( item, training_data_x, training_data_y, bandwidth ) return ypred def load_data(dataset_name: str, cola_name: str, colb_name: str) -> np.mat: """ Function used for loading data from the seaborn splitting into x and y points """ import seaborn as sns data = sns.load_dataset(dataset_name) col_a = np.array(data[cola_name]) # total_bill col_b = np.array(data[colb_name]) # tip mcol_a = np.mat(col_a) mcol_b = np.mat(col_b) m = np.shape(mcol_b)[1] one = np.ones((1, m), dtype=int) # horizontal stacking training_data_x = np.hstack((one.T, mcol_a.T)) return training_data_x, mcol_b, col_a, col_b def get_preds(training_data_x: np.mat, mcol_b: np.mat, tau: float) -> np.ndarray: """ Get predictions with minimum error for each training data >>> get_preds(np.mat([[16.99, 10.34], [21.01,23.68], ... 
[24.59,25.69]]),np.mat([[1.01, 1.66, 3.5]]), 0.6) array([1.07173261, 1.65970737, 3.50160179]) """ ypred = local_weight_regression(training_data_x, mcol_b, tau) return ypred def plot_preds( training_data_x: np.mat, predictions: np.ndarray, col_x: np.ndarray, col_y: np.ndarray, cola_name: str, colb_name: str, ) -> plt.plot: """ This function used to plot predictions and display the graph """ xsort = training_data_x.copy() xsort.sort(axis=0) plt.scatter(col_x, col_y, color="blue") plt.plot( xsort[:, 1], predictions[training_data_x[:, 1].argsort(0)], color="yellow", linewidth=5, ) plt.title("Local Weighted Regression") plt.xlabel(cola_name) plt.ylabel(colb_name) plt.show() if __name__ == "__main__": training_data_x, mcol_b, col_a, col_b = load_data("tips", "total_bill", "tip") predictions = get_preds(training_data_x, mcol_b, 0.5) plot_preds(training_data_x, predictions, col_a, col_b, "total_bill", "tip")
-1
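The locally weighted regression record above builds a diagonal Gaussian weight matrix and solves the weighted normal equations with np.mat and an explicit inverse. The sketch below restates the same idea for a single query point using plain ndarrays and np.linalg.solve; the function name lwr_predict and the toy data are assumptions of mine, not the PR's API:

import numpy as np


def lwr_predict(
    x_query: np.ndarray, x_train: np.ndarray, y_train: np.ndarray, tau: float
) -> float:
    """Predict y at x_query with locally weighted linear regression.

    Each training point gets a Gaussian kernel weight of bandwidth tau,
    then a weighted least-squares fit is solved for this one query point.
    """
    # Gaussian kernel weights, one per training sample.
    distances = np.sum((x_train - x_query) ** 2, axis=1)
    weights = np.exp(-distances / (2 * tau**2))
    w = np.diag(weights)
    # Weighted normal equations: (X^T W X) theta = X^T W y
    theta = np.linalg.solve(x_train.T @ w @ x_train, x_train.T @ w @ y_train)
    return float(x_query @ theta)


if __name__ == "__main__":
    # Toy data: y is roughly 2 * x, with a bias column of ones prepended.
    x = np.array([[1.0, 1.0], [1.0, 2.0], [1.0, 3.0], [1.0, 4.0]])
    y = np.array([2.1, 3.9, 6.2, 7.9])
    print(lwr_predict(np.array([1.0, 2.5]), x, y, tau=1.0))  # close to 5

Solving the normal equations instead of forming an explicit inverse is numerically safer and avoids the np.mat interface, which NumPy itself discourages.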
TheAlgorithms/Python
7,844
Remove file-level flake8 suppression
### Describe your change: Some files were completely excluded from flake8 checks. - Remove file-level suppressions - Fix simple warnings - Suppress long line indocstring * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
Cjkjvfnby
"2022-10-29T13:08:44Z"
"2022-10-29T20:45:21Z"
47ddba1d914bf5955a244056e794e718dee9ead1
1550731cb7457ddae216da2ffe0bc1587f5234f3
Remove file-level flake8 suppression. ### Describe your change: Some files were completely excluded from flake8 checks. - Remove file-level suppressions - Fix simple warnings - Suppress long line indocstring * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
from __future__ import annotations from random import random class Node: """ Treap's node Treap is a binary tree by value and heap by priority """ def __init__(self, value: int | None = None): self.value = value self.prior = random() self.left: Node | None = None self.right: Node | None = None def __repr__(self) -> str: from pprint import pformat if self.left is None and self.right is None: return f"'{self.value}: {self.prior:.5}'" else: return pformat( {f"{self.value}: {self.prior:.5}": (self.left, self.right)}, indent=1 ) def __str__(self) -> str: value = str(self.value) + " " left = str(self.left or "") right = str(self.right or "") return value + left + right def split(root: Node | None, value: int) -> tuple[Node | None, Node | None]: """ We split current tree into 2 trees with value: Left tree contains all values less than split value. Right tree contains all values greater or equal, than split value """ if root is None: # None tree is split into 2 Nones return None, None elif root.value is None: return None, None else: if value < root.value: """ Right tree's root will be current node. Now we split(with the same value) current node's left son Left tree: left part of that split Right tree's left son: right part of that split """ left, root.left = split(root.left, value) return left, root else: """ Just symmetric to previous case """ root.right, right = split(root.right, value) return root, right def merge(left: Node | None, right: Node | None) -> Node | None: """ We merge 2 trees into one. Note: all left tree's values must be less than all right tree's """ if (not left) or (not right): # If one node is None, return the other return left or right elif left.prior < right.prior: """ Left will be root because it has more priority Now we need to merge left's right son and right tree """ left.right = merge(left.right, right) return left else: """ Symmetric as well """ right.left = merge(left, right.left) return right def insert(root: Node | None, value: int) -> Node | None: """ Insert element Split current tree with a value into left, right, Insert new node into the middle Merge left, node, right into root """ node = Node(value) left, right = split(root, value) return merge(merge(left, node), right) def erase(root: Node | None, value: int) -> Node | None: """ Erase element Split all nodes with values less into left, Split all nodes with values greater into right. 
Merge left, right """ left, right = split(root, value - 1) _, right = split(right, value) return merge(left, right) def inorder(root: Node | None) -> None: """ Just recursive print of a tree """ if not root: # None return else: inorder(root.left) print(root.value, end=",") inorder(root.right) def interact_treap(root: Node | None, args: str) -> Node | None: """ Commands: + value to add value into treap - value to erase all nodes with value >>> root = interact_treap(None, "+1") >>> inorder(root) 1, >>> root = interact_treap(root, "+3 +5 +17 +19 +2 +16 +4 +0") >>> inorder(root) 0,1,2,3,4,5,16,17,19, >>> root = interact_treap(root, "+4 +4 +4") >>> inorder(root) 0,1,2,3,4,4,4,4,5,16,17,19, >>> root = interact_treap(root, "-0") >>> inorder(root) 1,2,3,4,4,4,4,5,16,17,19, >>> root = interact_treap(root, "-4") >>> inorder(root) 1,2,3,5,16,17,19, >>> root = interact_treap(root, "=0") Unknown command """ for arg in args.split(): if arg[0] == "+": root = insert(root, int(arg[1:])) elif arg[0] == "-": root = erase(root, int(arg[1:])) else: print("Unknown command") return root def main() -> None: """After each command, program prints treap""" root = None print( "enter numbers to create a tree, + value to add value into treap, " "- value to erase all nodes with value. 'q' to quit. " ) args = input() while args != "q": root = interact_treap(root, args) print(root) args = input() print("good by!") if __name__ == "__main__": import doctest doctest.testmod() main()
from __future__ import annotations from random import random class Node: """ Treap's node Treap is a binary tree by value and heap by priority """ def __init__(self, value: int | None = None): self.value = value self.prior = random() self.left: Node | None = None self.right: Node | None = None def __repr__(self) -> str: from pprint import pformat if self.left is None and self.right is None: return f"'{self.value}: {self.prior:.5}'" else: return pformat( {f"{self.value}: {self.prior:.5}": (self.left, self.right)}, indent=1 ) def __str__(self) -> str: value = str(self.value) + " " left = str(self.left or "") right = str(self.right or "") return value + left + right def split(root: Node | None, value: int) -> tuple[Node | None, Node | None]: """ We split current tree into 2 trees with value: Left tree contains all values less than split value. Right tree contains all values greater or equal, than split value """ if root is None: # None tree is split into 2 Nones return None, None elif root.value is None: return None, None else: if value < root.value: """ Right tree's root will be current node. Now we split(with the same value) current node's left son Left tree: left part of that split Right tree's left son: right part of that split """ left, root.left = split(root.left, value) return left, root else: """ Just symmetric to previous case """ root.right, right = split(root.right, value) return root, right def merge(left: Node | None, right: Node | None) -> Node | None: """ We merge 2 trees into one. Note: all left tree's values must be less than all right tree's """ if (not left) or (not right): # If one node is None, return the other return left or right elif left.prior < right.prior: """ Left will be root because it has more priority Now we need to merge left's right son and right tree """ left.right = merge(left.right, right) return left else: """ Symmetric as well """ right.left = merge(left, right.left) return right def insert(root: Node | None, value: int) -> Node | None: """ Insert element Split current tree with a value into left, right, Insert new node into the middle Merge left, node, right into root """ node = Node(value) left, right = split(root, value) return merge(merge(left, node), right) def erase(root: Node | None, value: int) -> Node | None: """ Erase element Split all nodes with values less into left, Split all nodes with values greater into right. 
Merge left, right """ left, right = split(root, value - 1) _, right = split(right, value) return merge(left, right) def inorder(root: Node | None) -> None: """ Just recursive print of a tree """ if not root: # None return else: inorder(root.left) print(root.value, end=",") inorder(root.right) def interact_treap(root: Node | None, args: str) -> Node | None: """ Commands: + value to add value into treap - value to erase all nodes with value >>> root = interact_treap(None, "+1") >>> inorder(root) 1, >>> root = interact_treap(root, "+3 +5 +17 +19 +2 +16 +4 +0") >>> inorder(root) 0,1,2,3,4,5,16,17,19, >>> root = interact_treap(root, "+4 +4 +4") >>> inorder(root) 0,1,2,3,4,4,4,4,5,16,17,19, >>> root = interact_treap(root, "-0") >>> inorder(root) 1,2,3,4,4,4,4,5,16,17,19, >>> root = interact_treap(root, "-4") >>> inorder(root) 1,2,3,5,16,17,19, >>> root = interact_treap(root, "=0") Unknown command """ for arg in args.split(): if arg[0] == "+": root = insert(root, int(arg[1:])) elif arg[0] == "-": root = erase(root, int(arg[1:])) else: print("Unknown command") return root def main() -> None: """After each command, program prints treap""" root = None print( "enter numbers to create a tree, + value to add value into treap, " "- value to erase all nodes with value. 'q' to quit. " ) args = input() while args != "q": root = interact_treap(root, args) print(root) args = input() print("good by!") if __name__ == "__main__": import doctest doctest.testmod() main()
-1
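The treap record above expresses insert and erase entirely through split and merge. To make that contract easier to check, here is a list-based model of the same operations (the helper names are mine, not the PR's); note that, for the integer keys used in the doctests, the code's split sends values less than or equal to the pivot to the left tree, which is what this model reproduces:

def split_list(values: list[int], pivot: int) -> tuple[list[int], list[int]]:
    """Model of the treap split: values <= pivot go left, values > pivot go right."""
    return [v for v in values if v <= pivot], [v for v in values if v > pivot]


def insert_value(values: list[int], value: int) -> list[int]:
    # Mirrors insert(): split at the value, put the new key in the middle, merge back.
    left, right = split_list(values, value)
    return left + [value] + right


def erase_value(values: list[int], value: int) -> list[int]:
    # Mirrors erase(): keep everything < value, then drop everything == value.
    left, right = split_list(values, value - 1)
    _, right = split_list(right, value)
    return left + right


if __name__ == "__main__":
    model: list[int] = []
    for v in (1, 3, 5, 17, 19, 2, 16, 4, 0):
        model = insert_value(model, v)
    print(model)                  # sorted keys, mirroring what inorder() prints
    print(erase_value(model, 4))  # all 4s removed, as with the "-4" command

Feeding the same "+"/"-" command sequences to both the treap and this model and comparing the inorder output is a simple oracle-style test.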
TheAlgorithms/Python
7,844
Remove file-level flake8 suppression
### Describe your change: Some files were completely excluded from flake8 checks. - Remove file-level suppressions - Fix simple warnings - Suppress long line indocstring * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
Cjkjvfnby
"2022-10-29T13:08:44Z"
"2022-10-29T20:45:21Z"
47ddba1d914bf5955a244056e794e718dee9ead1
1550731cb7457ddae216da2ffe0bc1587f5234f3
Remove file-level flake8 suppression. ### Describe your change: Some files were completely excluded from flake8 checks. - Remove file-level suppressions - Fix simple warnings - Suppress long line indocstring * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
-1
TheAlgorithms/Python
7,844
Remove file-level flake8 suppression
### Describe your change: Some files were completely excluded from flake8 checks. - Remove file-level suppressions - Fix simple warnings - Suppress long line indocstring * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
Cjkjvfnby
"2022-10-29T13:08:44Z"
"2022-10-29T20:45:21Z"
47ddba1d914bf5955a244056e794e718dee9ead1
1550731cb7457ddae216da2ffe0bc1587f5234f3
Remove file-level flake8 suppression. ### Describe your change: Some files were completely excluded from flake8 checks. - Remove file-level suppressions - Fix simple warnings - Suppress long line indocstring * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
# Author: João Gustavo A. Amorim # Author email: [email protected] # Coding date: jan 2019 # python/black: True # Imports import numpy as np # Class implemented to calculus the index class IndexCalculation: """ # Class Summary This algorithm consists in calculating vegetation indices, these indices can be used for precision agriculture for example (or remote sensing). There are functions to define the data and to calculate the implemented indices. # Vegetation index https://en.wikipedia.org/wiki/Vegetation_Index A Vegetation Index (VI) is a spectral transformation of two or more bands designed to enhance the contribution of vegetation properties and allow reliable spatial and temporal inter-comparisons of terrestrial photosynthetic activity and canopy structural variations # Information about channels (Wavelength range for each) * nir - near-infrared https://www.malvernpanalytical.com/br/products/technology/near-infrared-spectroscopy Wavelength Range 700 nm to 2500 nm * Red Edge https://en.wikipedia.org/wiki/Red_edge Wavelength Range 680 nm to 730 nm * red https://en.wikipedia.org/wiki/Color Wavelength Range 635 nm to 700 nm * blue https://en.wikipedia.org/wiki/Color Wavelength Range 450 nm to 490 nm * green https://en.wikipedia.org/wiki/Color Wavelength Range 520 nm to 560 nm # Implemented index list #"abbreviationOfIndexName" -- list of channels used #"ARVI2" -- red, nir #"CCCI" -- red, redEdge, nir #"CVI" -- red, green, nir #"GLI" -- red, green, blue #"NDVI" -- red, nir #"BNDVI" -- blue, nir #"redEdgeNDVI" -- red, redEdge #"GNDVI" -- green, nir #"GBNDVI" -- green, blue, nir #"GRNDVI" -- red, green, nir #"RBNDVI" -- red, blue, nir #"PNDVI" -- red, green, blue, nir #"ATSAVI" -- red, nir #"BWDRVI" -- blue, nir #"CIgreen" -- green, nir #"CIrededge" -- redEdge, nir #"CI" -- red, blue #"CTVI" -- red, nir #"GDVI" -- green, nir #"EVI" -- red, blue, nir #"GEMI" -- red, nir #"GOSAVI" -- green, nir #"GSAVI" -- green, nir #"Hue" -- red, green, blue #"IVI" -- red, nir #"IPVI" -- red, nir #"I" -- red, green, blue #"RVI" -- red, nir #"MRVI" -- red, nir #"MSAVI" -- red, nir #"NormG" -- red, green, nir #"NormNIR" -- red, green, nir #"NormR" -- red, green, nir #"NGRDI" -- red, green #"RI" -- red, green #"S" -- red, green, blue #"IF" -- red, green, blue #"DVI" -- red, nir #"TVI" -- red, nir #"NDRE" -- redEdge, nir #list of all index implemented #allIndex = ["ARVI2", "CCCI", "CVI", "GLI", "NDVI", "BNDVI", "redEdgeNDVI", "GNDVI", "GBNDVI", "GRNDVI", "RBNDVI", "PNDVI", "ATSAVI", "BWDRVI", "CIgreen", "CIrededge", "CI", "CTVI", "GDVI", "EVI", "GEMI", "GOSAVI", "GSAVI", "Hue", "IVI", "IPVI", "I", "RVI", "MRVI", "MSAVI", "NormG", "NormNIR", "NormR", "NGRDI", "RI", "S", "IF", "DVI", "TVI", "NDRE"] #list of index with not blue channel #notBlueIndex = ["ARVI2", "CCCI", "CVI", "NDVI", "redEdgeNDVI", "GNDVI", "GRNDVI", "ATSAVI", "CIgreen", "CIrededge", "CTVI", "GDVI", "GEMI", "GOSAVI", "GSAVI", "IVI", "IPVI", "RVI", "MRVI", "MSAVI", "NormG", "NormNIR", "NormR", "NGRDI", "RI", "DVI", "TVI", "NDRE"] #list of index just with RGB channels #RGBIndex = ["GLI", "CI", "Hue", "I", "NGRDI", "RI", "S", "IF"] """ def __init__(self, red=None, green=None, blue=None, red_edge=None, nir=None): self.set_matricies(red=red, green=green, blue=blue, red_edge=red_edge, nir=nir) def set_matricies(self, red=None, green=None, blue=None, red_edge=None, nir=None): if red is not None: self.red = red if green is not None: self.green = green if blue is not None: self.blue = blue if red_edge is not None: self.redEdge = red_edge if nir is not None: 
self.nir = nir return True def calculation( self, index="", red=None, green=None, blue=None, red_edge=None, nir=None ): """ performs the calculation of the index with the values instantiated in the class :str index: abbreviation of index name to perform """ self.set_matricies(red=red, green=green, blue=blue, red_edge=red_edge, nir=nir) funcs = { "ARVI2": self.arv12, "CCCI": self.ccci, "CVI": self.cvi, "GLI": self.gli, "NDVI": self.ndvi, "BNDVI": self.bndvi, "redEdgeNDVI": self.red_edge_ndvi, "GNDVI": self.gndvi, "GBNDVI": self.gbndvi, "GRNDVI": self.grndvi, "RBNDVI": self.rbndvi, "PNDVI": self.pndvi, "ATSAVI": self.atsavi, "BWDRVI": self.bwdrvi, "CIgreen": self.ci_green, "CIrededge": self.ci_rededge, "CI": self.ci, "CTVI": self.ctvi, "GDVI": self.gdvi, "EVI": self.evi, "GEMI": self.gemi, "GOSAVI": self.gosavi, "GSAVI": self.gsavi, "Hue": self.hue, "IVI": self.ivi, "IPVI": self.ipvi, "I": self.i, "RVI": self.rvi, "MRVI": self.mrvi, "MSAVI": self.m_savi, "NormG": self.norm_g, "NormNIR": self.norm_nir, "NormR": self.norm_r, "NGRDI": self.ngrdi, "RI": self.ri, "S": self.s, "IF": self._if, "DVI": self.dvi, "TVI": self.tvi, "NDRE": self.ndre, } try: return funcs[index]() except KeyError: print("Index not in the list!") return False def arv12(self): """ Atmospherically Resistant Vegetation Index 2 https://www.indexdatabase.de/db/i-single.php?id=396 :return: index −0.18+1.17*(self.nir−self.red)/(self.nir+self.red) """ return -0.18 + (1.17 * ((self.nir - self.red) / (self.nir + self.red))) def ccci(self): """ Canopy Chlorophyll Content Index https://www.indexdatabase.de/db/i-single.php?id=224 :return: index """ return ((self.nir - self.redEdge) / (self.nir + self.redEdge)) / ( (self.nir - self.red) / (self.nir + self.red) ) def cvi(self): """ Chlorophyll vegetation index https://www.indexdatabase.de/db/i-single.php?id=391 :return: index """ return self.nir * (self.red / (self.green**2)) def gli(self): """ self.green leaf index https://www.indexdatabase.de/db/i-single.php?id=375 :return: index """ return (2 * self.green - self.red - self.blue) / ( 2 * self.green + self.red + self.blue ) def ndvi(self): """ Normalized Difference self.nir/self.red Normalized Difference Vegetation Index, Calibrated NDVI - CDVI https://www.indexdatabase.de/db/i-single.php?id=58 :return: index """ return (self.nir - self.red) / (self.nir + self.red) def bndvi(self): """ Normalized Difference self.nir/self.blue self.blue-normalized difference vegetation index https://www.indexdatabase.de/db/i-single.php?id=135 :return: index """ return (self.nir - self.blue) / (self.nir + self.blue) def red_edge_ndvi(self): """ Normalized Difference self.rededge/self.red https://www.indexdatabase.de/db/i-single.php?id=235 :return: index """ return (self.redEdge - self.red) / (self.redEdge + self.red) def gndvi(self): """ Normalized Difference self.nir/self.green self.green NDVI https://www.indexdatabase.de/db/i-single.php?id=401 :return: index """ return (self.nir - self.green) / (self.nir + self.green) def gbndvi(self): """ self.green-self.blue NDVI https://www.indexdatabase.de/db/i-single.php?id=186 :return: index """ return (self.nir - (self.green + self.blue)) / ( self.nir + (self.green + self.blue) ) def grndvi(self): """ self.green-self.red NDVI https://www.indexdatabase.de/db/i-single.php?id=185 :return: index """ return (self.nir - (self.green + self.red)) / ( self.nir + (self.green + self.red) ) def rbndvi(self): """ self.red-self.blue NDVI https://www.indexdatabase.de/db/i-single.php?id=187 :return: index """ return (self.nir - 
(self.blue + self.red)) / (self.nir + (self.blue + self.red)) def pndvi(self): """ Pan NDVI https://www.indexdatabase.de/db/i-single.php?id=188 :return: index """ return (self.nir - (self.green + self.red + self.blue)) / ( self.nir + (self.green + self.red + self.blue) ) def atsavi(self, x=0.08, a=1.22, b=0.03): """ Adjusted transformed soil-adjusted VI https://www.indexdatabase.de/db/i-single.php?id=209 :return: index """ return a * ( (self.nir - a * self.red - b) / (a * self.nir + self.red - a * b + x * (1 + a**2)) ) def bwdrvi(self): """ self.blue-wide dynamic range vegetation index https://www.indexdatabase.de/db/i-single.php?id=136 :return: index """ return (0.1 * self.nir - self.blue) / (0.1 * self.nir + self.blue) def ci_green(self): """ Chlorophyll Index self.green https://www.indexdatabase.de/db/i-single.php?id=128 :return: index """ return (self.nir / self.green) - 1 def ci_rededge(self): """ Chlorophyll Index self.redEdge https://www.indexdatabase.de/db/i-single.php?id=131 :return: index """ return (self.nir / self.redEdge) - 1 def ci(self): """ Coloration Index https://www.indexdatabase.de/db/i-single.php?id=11 :return: index """ return (self.red - self.blue) / self.red def ctvi(self): """ Corrected Transformed Vegetation Index https://www.indexdatabase.de/db/i-single.php?id=244 :return: index """ ndvi = self.ndvi() return ((ndvi + 0.5) / (abs(ndvi + 0.5))) * (abs(ndvi + 0.5) ** (1 / 2)) def gdvi(self): """ Difference self.nir/self.green self.green Difference Vegetation Index https://www.indexdatabase.de/db/i-single.php?id=27 :return: index """ return self.nir - self.green def evi(self): """ Enhanced Vegetation Index https://www.indexdatabase.de/db/i-single.php?id=16 :return: index """ return 2.5 * ( (self.nir - self.red) / (self.nir + 6 * self.red - 7.5 * self.blue + 1) ) def gemi(self): """ Global Environment Monitoring Index https://www.indexdatabase.de/db/i-single.php?id=25 :return: index """ n = (2 * (self.nir**2 - self.red**2) + 1.5 * self.nir + 0.5 * self.red) / ( self.nir + self.red + 0.5 ) return n * (1 - 0.25 * n) - (self.red - 0.125) / (1 - self.red) def gosavi(self, y=0.16): """ self.green Optimized Soil Adjusted Vegetation Index https://www.indexdatabase.de/db/i-single.php?id=29 mit Y = 0,16 :return: index """ return (self.nir - self.green) / (self.nir + self.green + y) def gsavi(self, n=0.5): """ self.green Soil Adjusted Vegetation Index https://www.indexdatabase.de/db/i-single.php?id=31 mit N = 0,5 :return: index """ return ((self.nir - self.green) / (self.nir + self.green + n)) * (1 + n) def hue(self): """ Hue https://www.indexdatabase.de/db/i-single.php?id=34 :return: index """ return np.arctan( ((2 * self.red - self.green - self.blue) / 30.5) * (self.green - self.blue) ) def ivi(self, a=None, b=None): """ Ideal vegetation index https://www.indexdatabase.de/db/i-single.php?id=276 b=intercept of vegetation line a=soil line slope :return: index """ return (self.nir - b) / (a * self.red) def ipvi(self): """ Infraself.red percentage vegetation index https://www.indexdatabase.de/db/i-single.php?id=35 :return: index """ return (self.nir / ((self.nir + self.red) / 2)) * (self.ndvi() + 1) def i(self): """ Intensity https://www.indexdatabase.de/db/i-single.php?id=36 :return: index """ return (self.red + self.green + self.blue) / 30.5 def rvi(self): """ Ratio-Vegetation-Index http://www.seos-project.eu/modules/remotesensing/remotesensing-c03-s01-p01.html :return: index """ return self.nir / self.red def mrvi(self): """ Modified Normalized Difference Vegetation Index RVI 
https://www.indexdatabase.de/db/i-single.php?id=275 :return: index """ return (self.rvi() - 1) / (self.rvi() + 1) def m_savi(self): """ Modified Soil Adjusted Vegetation Index https://www.indexdatabase.de/db/i-single.php?id=44 :return: index """ return ( (2 * self.nir + 1) - ((2 * self.nir + 1) ** 2 - 8 * (self.nir - self.red)) ** (1 / 2) ) / 2 def norm_g(self): """ Norm G https://www.indexdatabase.de/db/i-single.php?id=50 :return: index """ return self.green / (self.nir + self.red + self.green) def norm_nir(self): """ Norm self.nir https://www.indexdatabase.de/db/i-single.php?id=51 :return: index """ return self.nir / (self.nir + self.red + self.green) def norm_r(self): """ Norm R https://www.indexdatabase.de/db/i-single.php?id=52 :return: index """ return self.red / (self.nir + self.red + self.green) def ngrdi(self): """ Normalized Difference self.green/self.red Normalized self.green self.red difference index, Visible Atmospherically Resistant Indices self.green (VIself.green) https://www.indexdatabase.de/db/i-single.php?id=390 :return: index """ return (self.green - self.red) / (self.green + self.red) def ri(self): """ Normalized Difference self.red/self.green self.redness Index https://www.indexdatabase.de/db/i-single.php?id=74 :return: index """ return (self.red - self.green) / (self.red + self.green) def s(self): """ Saturation https://www.indexdatabase.de/db/i-single.php?id=77 :return: index """ max_value = np.max([np.max(self.red), np.max(self.green), np.max(self.blue)]) min_value = np.min([np.min(self.red), np.min(self.green), np.min(self.blue)]) return (max_value - min_value) / max_value def _if(self): """ Shape Index https://www.indexdatabase.de/db/i-single.php?id=79 :return: index """ return (2 * self.red - self.green - self.blue) / (self.green - self.blue) def dvi(self): """ Simple Ratio self.nir/self.red Difference Vegetation Index, Vegetation Index Number (VIN) https://www.indexdatabase.de/db/i-single.php?id=12 :return: index """ return self.nir / self.red def tvi(self): """ Transformed Vegetation Index https://www.indexdatabase.de/db/i-single.php?id=98 :return: index """ return (self.ndvi() + 0.5) ** (1 / 2) def ndre(self): return (self.nir - self.redEdge) / (self.nir + self.redEdge) """ # genering a random matrices to test this class red = np.ones((1000,1000, 1),dtype="float64") * 46787 green = np.ones((1000,1000, 1),dtype="float64") * 23487 blue = np.ones((1000,1000, 1),dtype="float64") * 14578 redEdge = np.ones((1000,1000, 1),dtype="float64") * 51045 nir = np.ones((1000,1000, 1),dtype="float64") * 52200 # Examples of how to use the class # instantiating the class cl = IndexCalculation() # instantiating the class with the values #cl = indexCalculation(red=red, green=green, blue=blue, redEdge=redEdge, nir=nir) # how set the values after instantiate the class cl, (for update the data or when don't # instantiating the class with the values) cl.setMatrices(red=red, green=green, blue=blue, redEdge=redEdge, nir=nir) # calculating the indices for the instantiated values in the class # Note: the CCCI index can be changed to any index implemented in the class. 
indexValue_form1 = cl.calculation("CCCI", red=red, green=green, blue=blue, redEdge=redEdge, nir=nir).astype(np.float64) indexValue_form2 = cl.CCCI() # calculating the index with the values directly -- you can set just the values # preferred note: the *calculation* function performs the function *setMatrices* indexValue_form3 = cl.calculation("CCCI", red=red, green=green, blue=blue, redEdge=redEdge, nir=nir).astype(np.float64) print("Form 1: "+np.array2string(indexValue_form1, precision=20, separator=', ', floatmode='maxprec_equal')) print("Form 2: "+np.array2string(indexValue_form2, precision=20, separator=', ', floatmode='maxprec_equal')) print("Form 3: "+np.array2string(indexValue_form3, precision=20, separator=', ', floatmode='maxprec_equal')) # A list of examples results for different type of data at NDVI # float16 -> 0.31567383 #NDVI (red = 50, nir = 100) # float32 -> 0.31578946 #NDVI (red = 50, nir = 100) # float64 -> 0.3157894736842105 #NDVI (red = 50, nir = 100) # longdouble -> 0.3157894736842105 #NDVI (red = 50, nir = 100) """
# Author: João Gustavo A. Amorim # Author email: [email protected] # Coding date: jan 2019 # python/black: True # Imports import numpy as np # Class implemented to calculus the index class IndexCalculation: """ # Class Summary This algorithm consists in calculating vegetation indices, these indices can be used for precision agriculture for example (or remote sensing). There are functions to define the data and to calculate the implemented indices. # Vegetation index https://en.wikipedia.org/wiki/Vegetation_Index A Vegetation Index (VI) is a spectral transformation of two or more bands designed to enhance the contribution of vegetation properties and allow reliable spatial and temporal inter-comparisons of terrestrial photosynthetic activity and canopy structural variations # Information about channels (Wavelength range for each) * nir - near-infrared https://www.malvernpanalytical.com/br/products/technology/near-infrared-spectroscopy Wavelength Range 700 nm to 2500 nm * Red Edge https://en.wikipedia.org/wiki/Red_edge Wavelength Range 680 nm to 730 nm * red https://en.wikipedia.org/wiki/Color Wavelength Range 635 nm to 700 nm * blue https://en.wikipedia.org/wiki/Color Wavelength Range 450 nm to 490 nm * green https://en.wikipedia.org/wiki/Color Wavelength Range 520 nm to 560 nm # Implemented index list #"abbreviationOfIndexName" -- list of channels used #"ARVI2" -- red, nir #"CCCI" -- red, redEdge, nir #"CVI" -- red, green, nir #"GLI" -- red, green, blue #"NDVI" -- red, nir #"BNDVI" -- blue, nir #"redEdgeNDVI" -- red, redEdge #"GNDVI" -- green, nir #"GBNDVI" -- green, blue, nir #"GRNDVI" -- red, green, nir #"RBNDVI" -- red, blue, nir #"PNDVI" -- red, green, blue, nir #"ATSAVI" -- red, nir #"BWDRVI" -- blue, nir #"CIgreen" -- green, nir #"CIrededge" -- redEdge, nir #"CI" -- red, blue #"CTVI" -- red, nir #"GDVI" -- green, nir #"EVI" -- red, blue, nir #"GEMI" -- red, nir #"GOSAVI" -- green, nir #"GSAVI" -- green, nir #"Hue" -- red, green, blue #"IVI" -- red, nir #"IPVI" -- red, nir #"I" -- red, green, blue #"RVI" -- red, nir #"MRVI" -- red, nir #"MSAVI" -- red, nir #"NormG" -- red, green, nir #"NormNIR" -- red, green, nir #"NormR" -- red, green, nir #"NGRDI" -- red, green #"RI" -- red, green #"S" -- red, green, blue #"IF" -- red, green, blue #"DVI" -- red, nir #"TVI" -- red, nir #"NDRE" -- redEdge, nir #list of all index implemented #allIndex = ["ARVI2", "CCCI", "CVI", "GLI", "NDVI", "BNDVI", "redEdgeNDVI", "GNDVI", "GBNDVI", "GRNDVI", "RBNDVI", "PNDVI", "ATSAVI", "BWDRVI", "CIgreen", "CIrededge", "CI", "CTVI", "GDVI", "EVI", "GEMI", "GOSAVI", "GSAVI", "Hue", "IVI", "IPVI", "I", "RVI", "MRVI", "MSAVI", "NormG", "NormNIR", "NormR", "NGRDI", "RI", "S", "IF", "DVI", "TVI", "NDRE"] #list of index with not blue channel #notBlueIndex = ["ARVI2", "CCCI", "CVI", "NDVI", "redEdgeNDVI", "GNDVI", "GRNDVI", "ATSAVI", "CIgreen", "CIrededge", "CTVI", "GDVI", "GEMI", "GOSAVI", "GSAVI", "IVI", "IPVI", "RVI", "MRVI", "MSAVI", "NormG", "NormNIR", "NormR", "NGRDI", "RI", "DVI", "TVI", "NDRE"] #list of index just with RGB channels #RGBIndex = ["GLI", "CI", "Hue", "I", "NGRDI", "RI", "S", "IF"] """ def __init__(self, red=None, green=None, blue=None, red_edge=None, nir=None): self.set_matricies(red=red, green=green, blue=blue, red_edge=red_edge, nir=nir) def set_matricies(self, red=None, green=None, blue=None, red_edge=None, nir=None): if red is not None: self.red = red if green is not None: self.green = green if blue is not None: self.blue = blue if red_edge is not None: self.redEdge = red_edge if nir is not None: 
self.nir = nir return True def calculation( self, index="", red=None, green=None, blue=None, red_edge=None, nir=None ): """ performs the calculation of the index with the values instantiated in the class :str index: abbreviation of index name to perform """ self.set_matricies(red=red, green=green, blue=blue, red_edge=red_edge, nir=nir) funcs = { "ARVI2": self.arv12, "CCCI": self.ccci, "CVI": self.cvi, "GLI": self.gli, "NDVI": self.ndvi, "BNDVI": self.bndvi, "redEdgeNDVI": self.red_edge_ndvi, "GNDVI": self.gndvi, "GBNDVI": self.gbndvi, "GRNDVI": self.grndvi, "RBNDVI": self.rbndvi, "PNDVI": self.pndvi, "ATSAVI": self.atsavi, "BWDRVI": self.bwdrvi, "CIgreen": self.ci_green, "CIrededge": self.ci_rededge, "CI": self.ci, "CTVI": self.ctvi, "GDVI": self.gdvi, "EVI": self.evi, "GEMI": self.gemi, "GOSAVI": self.gosavi, "GSAVI": self.gsavi, "Hue": self.hue, "IVI": self.ivi, "IPVI": self.ipvi, "I": self.i, "RVI": self.rvi, "MRVI": self.mrvi, "MSAVI": self.m_savi, "NormG": self.norm_g, "NormNIR": self.norm_nir, "NormR": self.norm_r, "NGRDI": self.ngrdi, "RI": self.ri, "S": self.s, "IF": self._if, "DVI": self.dvi, "TVI": self.tvi, "NDRE": self.ndre, } try: return funcs[index]() except KeyError: print("Index not in the list!") return False def arv12(self): """ Atmospherically Resistant Vegetation Index 2 https://www.indexdatabase.de/db/i-single.php?id=396 :return: index −0.18+1.17*(self.nir−self.red)/(self.nir+self.red) """ return -0.18 + (1.17 * ((self.nir - self.red) / (self.nir + self.red))) def ccci(self): """ Canopy Chlorophyll Content Index https://www.indexdatabase.de/db/i-single.php?id=224 :return: index """ return ((self.nir - self.redEdge) / (self.nir + self.redEdge)) / ( (self.nir - self.red) / (self.nir + self.red) ) def cvi(self): """ Chlorophyll vegetation index https://www.indexdatabase.de/db/i-single.php?id=391 :return: index """ return self.nir * (self.red / (self.green**2)) def gli(self): """ self.green leaf index https://www.indexdatabase.de/db/i-single.php?id=375 :return: index """ return (2 * self.green - self.red - self.blue) / ( 2 * self.green + self.red + self.blue ) def ndvi(self): """ Normalized Difference self.nir/self.red Normalized Difference Vegetation Index, Calibrated NDVI - CDVI https://www.indexdatabase.de/db/i-single.php?id=58 :return: index """ return (self.nir - self.red) / (self.nir + self.red) def bndvi(self): """ Normalized Difference self.nir/self.blue self.blue-normalized difference vegetation index https://www.indexdatabase.de/db/i-single.php?id=135 :return: index """ return (self.nir - self.blue) / (self.nir + self.blue) def red_edge_ndvi(self): """ Normalized Difference self.rededge/self.red https://www.indexdatabase.de/db/i-single.php?id=235 :return: index """ return (self.redEdge - self.red) / (self.redEdge + self.red) def gndvi(self): """ Normalized Difference self.nir/self.green self.green NDVI https://www.indexdatabase.de/db/i-single.php?id=401 :return: index """ return (self.nir - self.green) / (self.nir + self.green) def gbndvi(self): """ self.green-self.blue NDVI https://www.indexdatabase.de/db/i-single.php?id=186 :return: index """ return (self.nir - (self.green + self.blue)) / ( self.nir + (self.green + self.blue) ) def grndvi(self): """ self.green-self.red NDVI https://www.indexdatabase.de/db/i-single.php?id=185 :return: index """ return (self.nir - (self.green + self.red)) / ( self.nir + (self.green + self.red) ) def rbndvi(self): """ self.red-self.blue NDVI https://www.indexdatabase.de/db/i-single.php?id=187 :return: index """ return (self.nir - 
(self.blue + self.red)) / (self.nir + (self.blue + self.red)) def pndvi(self): """ Pan NDVI https://www.indexdatabase.de/db/i-single.php?id=188 :return: index """ return (self.nir - (self.green + self.red + self.blue)) / ( self.nir + (self.green + self.red + self.blue) ) def atsavi(self, x=0.08, a=1.22, b=0.03): """ Adjusted transformed soil-adjusted VI https://www.indexdatabase.de/db/i-single.php?id=209 :return: index """ return a * ( (self.nir - a * self.red - b) / (a * self.nir + self.red - a * b + x * (1 + a**2)) ) def bwdrvi(self): """ self.blue-wide dynamic range vegetation index https://www.indexdatabase.de/db/i-single.php?id=136 :return: index """ return (0.1 * self.nir - self.blue) / (0.1 * self.nir + self.blue) def ci_green(self): """ Chlorophyll Index self.green https://www.indexdatabase.de/db/i-single.php?id=128 :return: index """ return (self.nir / self.green) - 1 def ci_rededge(self): """ Chlorophyll Index self.redEdge https://www.indexdatabase.de/db/i-single.php?id=131 :return: index """ return (self.nir / self.redEdge) - 1 def ci(self): """ Coloration Index https://www.indexdatabase.de/db/i-single.php?id=11 :return: index """ return (self.red - self.blue) / self.red def ctvi(self): """ Corrected Transformed Vegetation Index https://www.indexdatabase.de/db/i-single.php?id=244 :return: index """ ndvi = self.ndvi() return ((ndvi + 0.5) / (abs(ndvi + 0.5))) * (abs(ndvi + 0.5) ** (1 / 2)) def gdvi(self): """ Difference self.nir/self.green self.green Difference Vegetation Index https://www.indexdatabase.de/db/i-single.php?id=27 :return: index """ return self.nir - self.green def evi(self): """ Enhanced Vegetation Index https://www.indexdatabase.de/db/i-single.php?id=16 :return: index """ return 2.5 * ( (self.nir - self.red) / (self.nir + 6 * self.red - 7.5 * self.blue + 1) ) def gemi(self): """ Global Environment Monitoring Index https://www.indexdatabase.de/db/i-single.php?id=25 :return: index """ n = (2 * (self.nir**2 - self.red**2) + 1.5 * self.nir + 0.5 * self.red) / ( self.nir + self.red + 0.5 ) return n * (1 - 0.25 * n) - (self.red - 0.125) / (1 - self.red) def gosavi(self, y=0.16): """ self.green Optimized Soil Adjusted Vegetation Index https://www.indexdatabase.de/db/i-single.php?id=29 mit Y = 0,16 :return: index """ return (self.nir - self.green) / (self.nir + self.green + y) def gsavi(self, n=0.5): """ self.green Soil Adjusted Vegetation Index https://www.indexdatabase.de/db/i-single.php?id=31 mit N = 0,5 :return: index """ return ((self.nir - self.green) / (self.nir + self.green + n)) * (1 + n) def hue(self): """ Hue https://www.indexdatabase.de/db/i-single.php?id=34 :return: index """ return np.arctan( ((2 * self.red - self.green - self.blue) / 30.5) * (self.green - self.blue) ) def ivi(self, a=None, b=None): """ Ideal vegetation index https://www.indexdatabase.de/db/i-single.php?id=276 b=intercept of vegetation line a=soil line slope :return: index """ return (self.nir - b) / (a * self.red) def ipvi(self): """ Infraself.red percentage vegetation index https://www.indexdatabase.de/db/i-single.php?id=35 :return: index """ return (self.nir / ((self.nir + self.red) / 2)) * (self.ndvi() + 1) def i(self): """ Intensity https://www.indexdatabase.de/db/i-single.php?id=36 :return: index """ return (self.red + self.green + self.blue) / 30.5 def rvi(self): """ Ratio-Vegetation-Index http://www.seos-project.eu/modules/remotesensing/remotesensing-c03-s01-p01.html :return: index """ return self.nir / self.red def mrvi(self): """ Modified Normalized Difference Vegetation Index RVI 
https://www.indexdatabase.de/db/i-single.php?id=275 :return: index """ return (self.rvi() - 1) / (self.rvi() + 1) def m_savi(self): """ Modified Soil Adjusted Vegetation Index https://www.indexdatabase.de/db/i-single.php?id=44 :return: index """ return ( (2 * self.nir + 1) - ((2 * self.nir + 1) ** 2 - 8 * (self.nir - self.red)) ** (1 / 2) ) / 2 def norm_g(self): """ Norm G https://www.indexdatabase.de/db/i-single.php?id=50 :return: index """ return self.green / (self.nir + self.red + self.green) def norm_nir(self): """ Norm self.nir https://www.indexdatabase.de/db/i-single.php?id=51 :return: index """ return self.nir / (self.nir + self.red + self.green) def norm_r(self): """ Norm R https://www.indexdatabase.de/db/i-single.php?id=52 :return: index """ return self.red / (self.nir + self.red + self.green) def ngrdi(self): """ Normalized Difference self.green/self.red Normalized self.green self.red difference index, Visible Atmospherically Resistant Indices self.green (VIself.green) https://www.indexdatabase.de/db/i-single.php?id=390 :return: index """ return (self.green - self.red) / (self.green + self.red) def ri(self): """ Normalized Difference self.red/self.green self.redness Index https://www.indexdatabase.de/db/i-single.php?id=74 :return: index """ return (self.red - self.green) / (self.red + self.green) def s(self): """ Saturation https://www.indexdatabase.de/db/i-single.php?id=77 :return: index """ max_value = np.max([np.max(self.red), np.max(self.green), np.max(self.blue)]) min_value = np.min([np.min(self.red), np.min(self.green), np.min(self.blue)]) return (max_value - min_value) / max_value def _if(self): """ Shape Index https://www.indexdatabase.de/db/i-single.php?id=79 :return: index """ return (2 * self.red - self.green - self.blue) / (self.green - self.blue) def dvi(self): """ Simple Ratio self.nir/self.red Difference Vegetation Index, Vegetation Index Number (VIN) https://www.indexdatabase.de/db/i-single.php?id=12 :return: index """ return self.nir / self.red def tvi(self): """ Transformed Vegetation Index https://www.indexdatabase.de/db/i-single.php?id=98 :return: index """ return (self.ndvi() + 0.5) ** (1 / 2) def ndre(self): return (self.nir - self.redEdge) / (self.nir + self.redEdge) """ # genering a random matrices to test this class red = np.ones((1000,1000, 1),dtype="float64") * 46787 green = np.ones((1000,1000, 1),dtype="float64") * 23487 blue = np.ones((1000,1000, 1),dtype="float64") * 14578 redEdge = np.ones((1000,1000, 1),dtype="float64") * 51045 nir = np.ones((1000,1000, 1),dtype="float64") * 52200 # Examples of how to use the class # instantiating the class cl = IndexCalculation() # instantiating the class with the values #cl = indexCalculation(red=red, green=green, blue=blue, redEdge=redEdge, nir=nir) # how set the values after instantiate the class cl, (for update the data or when don't # instantiating the class with the values) cl.setMatrices(red=red, green=green, blue=blue, redEdge=redEdge, nir=nir) # calculating the indices for the instantiated values in the class # Note: the CCCI index can be changed to any index implemented in the class. 
indexValue_form1 = cl.calculation("CCCI", red=red, green=green, blue=blue, redEdge=redEdge, nir=nir).astype(np.float64) indexValue_form2 = cl.CCCI() # calculating the index with the values directly -- you can set just the values # preferred note: the *calculation* function performs the function *setMatrices* indexValue_form3 = cl.calculation("CCCI", red=red, green=green, blue=blue, redEdge=redEdge, nir=nir).astype(np.float64) print("Form 1: "+np.array2string(indexValue_form1, precision=20, separator=', ', floatmode='maxprec_equal')) print("Form 2: "+np.array2string(indexValue_form2, precision=20, separator=', ', floatmode='maxprec_equal')) print("Form 3: "+np.array2string(indexValue_form3, precision=20, separator=', ', floatmode='maxprec_equal')) # A list of examples results for different type of data at NDVI # float16 -> 0.31567383 #NDVI (red = 50, nir = 100) # float32 -> 0.31578946 #NDVI (red = 50, nir = 100) # float64 -> 0.3157894736842105 #NDVI (red = 50, nir = 100) # longdouble -> 0.3157894736842105 #NDVI (red = 50, nir = 100) """
-1
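The row above ends with the vegetation-index calculator. As a point of reference, here is a minimal, self-contained sketch of the kind of band arithmetic those methods perform (NDVI and the simple NIR/red ratio); the array shapes and band values are illustrative assumptions, not data taken from the row.

```python
# Minimal sketch (not the class above): NDVI-style band math on NumPy arrays.
# Band values and shapes are illustrative assumptions.
import numpy as np

red = np.full((4, 4), 30.0)   # hypothetical red band
nir = np.full((4, 4), 90.0)   # hypothetical near-infrared band

ndvi = (nir - red) / (nir + red)  # Normalized Difference Vegetation Index
dvi = nir / red                   # simple ratio NIR/red

print(ndvi[0, 0])  # 0.5 for red=30, nir=90
print(dvi[0, 0])   # 3.0
```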
TheAlgorithms/Python
7,844
Remove file-level flake8 suppression
### Describe your change: Some files were completely excluded from flake8 checks. - Remove file-level suppressions - Fix simple warnings - Suppress long line indocstring * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
Cjkjvfnby
"2022-10-29T13:08:44Z"
"2022-10-29T20:45:21Z"
47ddba1d914bf5955a244056e794e718dee9ead1
1550731cb7457ddae216da2ffe0bc1587f5234f3
Remove file-level flake8 suppression. ### Describe your change: Some files were completely excluded from flake8 checks. - Remove file-level suppressions - Fix simple warnings - Suppress long line indocstring * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
-1
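The PR rows above and below describe replacing file-level flake8 suppressions with targeted ones. A small illustrative snippet of the two suppression styles (standard flake8 `noqa` syntax; the variable and warning code are examples, not taken from the PR diff):

```python
# Illustration of the two suppression styles mentioned in the PR description.

# File-level: a bare "flake8: noqa" comment disables every check in the file.
# flake8: noqa

# Line-level: "# noqa: <code>" silences only that warning on that line,
# e.g. E501 (line too long) for an overlong URL in a docstring.
VERY_LONG_URL = "https://example.com/an/extremely/long/path/that/exceeds/the/configured/line/length/limit"  # noqa: E501
```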
TheAlgorithms/Python
7,844
Remove file-level flake8 suppression
### Describe your change: Some files were completely excluded from flake8 checks. - Remove file-level suppressions - Fix simple warnings - Suppress long line indocstring * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
Cjkjvfnby
"2022-10-29T13:08:44Z"
"2022-10-29T20:45:21Z"
47ddba1d914bf5955a244056e794e718dee9ead1
1550731cb7457ddae216da2ffe0bc1587f5234f3
Remove file-level flake8 suppression. ### Describe your change: Some files were completely excluded from flake8 checks. - Remove file-level suppressions - Fix simple warnings - Suppress long line indocstring * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
def alternative_list_arrange(first_input_list: list, second_input_list: list) -> list: """ The method arranges two lists as one list in alternative forms of the list elements. :param first_input_list: :param second_input_list: :return: List >>> alternative_list_arrange([1, 2, 3, 4, 5], ["A", "B", "C"]) [1, 'A', 2, 'B', 3, 'C', 4, 5] >>> alternative_list_arrange(["A", "B", "C"], [1, 2, 3, 4, 5]) ['A', 1, 'B', 2, 'C', 3, 4, 5] >>> alternative_list_arrange(["X", "Y", "Z"], [9, 8, 7, 6]) ['X', 9, 'Y', 8, 'Z', 7, 6] >>> alternative_list_arrange([1, 2, 3, 4, 5], []) [1, 2, 3, 4, 5] """ first_input_list_length: int = len(first_input_list) second_input_list_length: int = len(second_input_list) abs_length: int = ( first_input_list_length if first_input_list_length > second_input_list_length else second_input_list_length ) output_result_list: list = [] for char_count in range(abs_length): if char_count < first_input_list_length: output_result_list.append(first_input_list[char_count]) if char_count < second_input_list_length: output_result_list.append(second_input_list[char_count]) return output_result_list if __name__ == "__main__": print(alternative_list_arrange(["A", "B", "C"], [1, 2, 3, 4, 5]), end=" ")
def alternative_list_arrange(first_input_list: list, second_input_list: list) -> list: """ The method arranges two lists as one list in alternative forms of the list elements. :param first_input_list: :param second_input_list: :return: List >>> alternative_list_arrange([1, 2, 3, 4, 5], ["A", "B", "C"]) [1, 'A', 2, 'B', 3, 'C', 4, 5] >>> alternative_list_arrange(["A", "B", "C"], [1, 2, 3, 4, 5]) ['A', 1, 'B', 2, 'C', 3, 4, 5] >>> alternative_list_arrange(["X", "Y", "Z"], [9, 8, 7, 6]) ['X', 9, 'Y', 8, 'Z', 7, 6] >>> alternative_list_arrange([1, 2, 3, 4, 5], []) [1, 2, 3, 4, 5] """ first_input_list_length: int = len(first_input_list) second_input_list_length: int = len(second_input_list) abs_length: int = ( first_input_list_length if first_input_list_length > second_input_list_length else second_input_list_length ) output_result_list: list = [] for char_count in range(abs_length): if char_count < first_input_list_length: output_result_list.append(first_input_list[char_count]) if char_count < second_input_list_length: output_result_list.append(second_input_list[char_count]) return output_result_list if __name__ == "__main__": print(alternative_list_arrange(["A", "B", "C"], [1, 2, 3, 4, 5]), end=" ")
-1
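The row above contains the list-interleaving helper. For comparison, a compact alternative sketch (not the repository's implementation) that produces the same interleaving for unequal-length inputs via `itertools.zip_longest`:

```python
# Alternative sketch of the same interleaving behaviour, using zip_longest
# with a sentinel to drop the padding added for the shorter list.
from itertools import zip_longest

_SENTINEL = object()

def interleave(first: list, second: list) -> list:
    return [
        item
        for pair in zip_longest(first, second, fillvalue=_SENTINEL)
        for item in pair
        if item is not _SENTINEL
    ]

print(interleave([1, 2, 3, 4, 5], ["A", "B", "C"]))  # [1, 'A', 2, 'B', 3, 'C', 4, 5]
print(interleave(["X", "Y", "Z"], [9, 8, 7, 6]))     # ['X', 9, 'Y', 8, 'Z', 7, 6]
```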
TheAlgorithms/Python
7,844
Remove file-level flake8 suppression
### Describe your change: Some files were completely excluded from flake8 checks. - Remove file-level suppressions - Fix simple warnings - Suppress long line indocstring * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
Cjkjvfnby
"2022-10-29T13:08:44Z"
"2022-10-29T20:45:21Z"
47ddba1d914bf5955a244056e794e718dee9ead1
1550731cb7457ddae216da2ffe0bc1587f5234f3
Remove file-level flake8 suppression. ### Describe your change: Some files were completely excluded from flake8 checks. - Remove file-level suppressions - Fix simple warnings - Suppress long line indocstring * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
import os from itertools import chain from random import randrange, shuffle import pytest from .sol1 import PokerHand SORTED_HANDS = ( "4S 3H 2C 7S 5H", "9D 8H 2C 6S 7H", "2D 6D 9D TH 7D", "TC 8C 2S JH 6C", "JH 8S TH AH QH", "TS KS 5S 9S AC", "KD 6S 9D TH AD", "KS 8D 4D 9S 4S", # pair "8C 4S KH JS 4D", # pair "QH 8H KD JH 8S", # pair "KC 4H KS 2H 8D", # pair "KD 4S KC 3H 8S", # pair "AH 8S AS KC JH", # pair "3H 4C 4H 3S 2H", # 2 pairs "5S 5D 2C KH KH", # 2 pairs "3C KH 5D 5S KH", # 2 pairs "AS 3C KH AD KH", # 2 pairs "7C 7S 3S 7H 5S", # 3 of a kind "7C 7S KH 2H 7H", # 3 of a kind "AC KH QH AH AS", # 3 of a kind "2H 4D 3C AS 5S", # straight (low ace) "3C 5C 4C 2C 6H", # straight "6S 8S 7S 5H 9H", # straight "JS QS 9H TS KH", # straight "QC KH TS JS AH", # straight (high ace) "8C 9C 5C 3C TC", # flush "3S 8S 9S 5S KS", # flush "4C 5C 9C 8C KC", # flush "JH 8H AH KH QH", # flush "3D 2H 3H 2C 2D", # full house "2H 2C 3S 3H 3D", # full house "KH KC 3S 3H 3D", # full house "JC 6H JS JD JH", # 4 of a kind "JC 7H JS JD JH", # 4 of a kind "JC KH JS JD JH", # 4 of a kind "2S AS 4S 5S 3S", # straight flush (low ace) "2D 6D 3D 4D 5D", # straight flush "5C 6C 3C 7C 4C", # straight flush "JH 9H TH KH QH", # straight flush "JH AH TH KH QH", # royal flush (high ace straight flush) ) TEST_COMPARE = ( ("2H 3H 4H 5H 6H", "KS AS TS QS JS", "Loss"), ("2H 3H 4H 5H 6H", "AS AD AC AH JD", "Win"), ("AS AH 2H AD AC", "JS JD JC JH 3D", "Win"), ("2S AH 2H AS AC", "JS JD JC JH AD", "Loss"), ("2S AH 2H AS AC", "2H 3H 5H 6H 7H", "Win"), ("AS 3S 4S 8S 2S", "2H 3H 5H 6H 7H", "Win"), ("2H 3H 5H 6H 7H", "2S 3H 4H 5S 6C", "Win"), ("2S 3H 4H 5S 6C", "3D 4C 5H 6H 2S", "Tie"), ("2S 3H 4H 5S 6C", "AH AC 5H 6H AS", "Win"), ("2S 2H 4H 5S 4C", "AH AC 5H 6H AS", "Loss"), ("2S 2H 4H 5S 4C", "AH AC 5H 6H 7S", "Win"), ("6S AD 7H 4S AS", "AH AC 5H 6H 7S", "Loss"), ("2S AH 4H 5S KC", "AH AC 5H 6H 7S", "Loss"), ("2S 3H 6H 7S 9C", "7H 3C TH 6H 9S", "Loss"), ("4S 5H 6H TS AC", "3S 5H 6H TS AC", "Win"), ("2S AH 4H 5S 6C", "AD 4C 5H 6H 2C", "Tie"), ("AS AH 3H AD AC", "AS AH 2H AD AC", "Win"), ("AH AC 5H 5C QS", "AH AC 5H 5C KS", "Loss"), ("AH AC 5H 5C QS", "KH KC 5H 5C QS", "Win"), ("7C 7S KH 2H 7H", "3C 3S AH 2H 3H", "Win"), ("3C 3S AH 2H 3H", "7C 7S KH 2H 7H", "Loss"), ("6H 5H 4H 3H 2H", "5H 4H 3H 2H AH", "Win"), ("5H 4H 3H 2H AH", "5H 4H 3H 2H AH", "Tie"), ("5H 4H 3H 2H AH", "6H 5H 4H 3H 2H", "Loss"), ("AH AD KS KC AC", "AH KD KH AC KC", "Win"), ("2H 4D 3C AS 5S", "2H 4D 3C 6S 5S", "Loss"), ("2H 3S 3C 3H 2S", "3S 3C 2S 2H 2D", "Win"), ("4D 6D 5D 2D JH", "3S 8S 3H TC KH", "Loss"), ("4S 6C 8S 3S 7S", "AD KS 2D 7D 7C", "Loss"), ("6S 4C 7H 8C 3H", "5H JC AH 9D 9C", "Loss"), ("9D 9H JH TC QH", "3C 2S JS 5C 7H", "Win"), ("2H TC 8S AD 9S", "4H TS 7H 2C 5C", "Win"), ("9D 3S 2C 7S 7C", "JC TD 3C TC 9H", "Loss"), ) TEST_FLUSH = ( ("2H 3H 4H 5H 6H", True), ("AS AH 2H AD AC", False), ("2H 3H 5H 6H 7H", True), ("KS AS TS QS JS", True), ("8H 9H QS JS TH", False), ("AS 3S 4S 8S 2S", True), ) TEST_STRAIGHT = ( ("2H 3H 4H 5H 6H", True), ("AS AH 2H AD AC", False), ("2H 3H 5H 6H 7H", False), ("KS AS TS QS JS", True), ("8H 9H QS JS TH", True), ) TEST_FIVE_HIGH_STRAIGHT = ( ("2H 4D 3C AS 5S", True, [5, 4, 3, 2, 14]), ("2H 5D 3C AS 5S", False, [14, 5, 5, 3, 2]), ("JH QD KC AS TS", False, [14, 13, 12, 11, 10]), ("9D 3S 2C 7S 7C", False, [9, 7, 7, 3, 2]), ) TEST_KIND = ( ("JH AH TH KH QH", 0), ("JH 9H TH KH QH", 0), ("JC KH JS JD JH", 7), ("KH KC 3S 3H 3D", 6), ("8C 9C 5C 3C TC", 0), ("JS QS 9H TS KH", 0), ("7C 7S KH 2H 7H", 3), ("3C KH 5D 5S KH", 2), ("QH 8H KD 
JH 8S", 1), ("2D 6D 9D TH 7D", 0), ) TEST_TYPES = ( ("JH AH TH KH QH", 23), ("JH 9H TH KH QH", 22), ("JC KH JS JD JH", 21), ("KH KC 3S 3H 3D", 20), ("8C 9C 5C 3C TC", 19), ("JS QS 9H TS KH", 18), ("7C 7S KH 2H 7H", 17), ("3C KH 5D 5S KH", 16), ("QH 8H KD JH 8S", 15), ("2D 6D 9D TH 7D", 14), ) def generate_random_hand(): play, oppo = randrange(len(SORTED_HANDS)), randrange(len(SORTED_HANDS)) expected = ["Loss", "Tie", "Win"][(play >= oppo) + (play > oppo)] hand, other = SORTED_HANDS[play], SORTED_HANDS[oppo] return hand, other, expected def generate_random_hands(number_of_hands: int = 100): return (generate_random_hand() for _ in range(number_of_hands)) @pytest.mark.parametrize("hand, expected", TEST_FLUSH) def test_hand_is_flush(hand, expected): assert PokerHand(hand)._is_flush() == expected @pytest.mark.parametrize("hand, expected", TEST_STRAIGHT) def test_hand_is_straight(hand, expected): assert PokerHand(hand)._is_straight() == expected @pytest.mark.parametrize("hand, expected, card_values", TEST_FIVE_HIGH_STRAIGHT) def test_hand_is_five_high_straight(hand, expected, card_values): player = PokerHand(hand) assert player._is_five_high_straight() == expected assert player._card_values == card_values @pytest.mark.parametrize("hand, expected", TEST_KIND) def test_hand_is_same_kind(hand, expected): assert PokerHand(hand)._is_same_kind() == expected @pytest.mark.parametrize("hand, expected", TEST_TYPES) def test_hand_values(hand, expected): assert PokerHand(hand)._hand_type == expected @pytest.mark.parametrize("hand, other, expected", TEST_COMPARE) def test_compare_simple(hand, other, expected): assert PokerHand(hand).compare_with(PokerHand(other)) == expected @pytest.mark.parametrize("hand, other, expected", generate_random_hands()) def test_compare_random(hand, other, expected): assert PokerHand(hand).compare_with(PokerHand(other)) == expected def test_hand_sorted(): poker_hands = [PokerHand(hand) for hand in SORTED_HANDS] list_copy = poker_hands.copy() shuffle(list_copy) user_sorted = chain(sorted(list_copy)) for index, hand in enumerate(user_sorted): assert hand == poker_hands[index] def test_custom_sort_five_high_straight(): # Test that five high straights are compared correctly. pokerhands = [PokerHand("2D AC 3H 4H 5S"), PokerHand("2S 3H 4H 5S 6C")] pokerhands.sort(reverse=True) assert pokerhands[0].__str__() == "2S 3H 4H 5S 6C" def test_multiple_calls_five_high_straight(): # Multiple calls to five_high_straight function should still return True # and shouldn't mutate the list in every call other than the first. pokerhand = PokerHand("2C 4S AS 3D 5C") expected = True expected_card_values = [5, 4, 3, 2, 14] for _ in range(10): assert pokerhand._is_five_high_straight() == expected assert pokerhand._card_values == expected_card_values def test_euler_project(): # Problem number 54 from Project Euler # Testing from poker_hands.txt file answer = 0 script_dir = os.path.abspath(os.path.dirname(__file__)) poker_hands = os.path.join(script_dir, "poker_hands.txt") with open(poker_hands) as file_hand: for line in file_hand: player_hand = line[:14].strip() opponent_hand = line[15:].strip() player, opponent = PokerHand(player_hand), PokerHand(opponent_hand) output = player.compare_with(opponent) if output == "Win": answer += 1 assert answer == 376
import os from itertools import chain from random import randrange, shuffle import pytest from .sol1 import PokerHand SORTED_HANDS = ( "4S 3H 2C 7S 5H", "9D 8H 2C 6S 7H", "2D 6D 9D TH 7D", "TC 8C 2S JH 6C", "JH 8S TH AH QH", "TS KS 5S 9S AC", "KD 6S 9D TH AD", "KS 8D 4D 9S 4S", # pair "8C 4S KH JS 4D", # pair "QH 8H KD JH 8S", # pair "KC 4H KS 2H 8D", # pair "KD 4S KC 3H 8S", # pair "AH 8S AS KC JH", # pair "3H 4C 4H 3S 2H", # 2 pairs "5S 5D 2C KH KH", # 2 pairs "3C KH 5D 5S KH", # 2 pairs "AS 3C KH AD KH", # 2 pairs "7C 7S 3S 7H 5S", # 3 of a kind "7C 7S KH 2H 7H", # 3 of a kind "AC KH QH AH AS", # 3 of a kind "2H 4D 3C AS 5S", # straight (low ace) "3C 5C 4C 2C 6H", # straight "6S 8S 7S 5H 9H", # straight "JS QS 9H TS KH", # straight "QC KH TS JS AH", # straight (high ace) "8C 9C 5C 3C TC", # flush "3S 8S 9S 5S KS", # flush "4C 5C 9C 8C KC", # flush "JH 8H AH KH QH", # flush "3D 2H 3H 2C 2D", # full house "2H 2C 3S 3H 3D", # full house "KH KC 3S 3H 3D", # full house "JC 6H JS JD JH", # 4 of a kind "JC 7H JS JD JH", # 4 of a kind "JC KH JS JD JH", # 4 of a kind "2S AS 4S 5S 3S", # straight flush (low ace) "2D 6D 3D 4D 5D", # straight flush "5C 6C 3C 7C 4C", # straight flush "JH 9H TH KH QH", # straight flush "JH AH TH KH QH", # royal flush (high ace straight flush) ) TEST_COMPARE = ( ("2H 3H 4H 5H 6H", "KS AS TS QS JS", "Loss"), ("2H 3H 4H 5H 6H", "AS AD AC AH JD", "Win"), ("AS AH 2H AD AC", "JS JD JC JH 3D", "Win"), ("2S AH 2H AS AC", "JS JD JC JH AD", "Loss"), ("2S AH 2H AS AC", "2H 3H 5H 6H 7H", "Win"), ("AS 3S 4S 8S 2S", "2H 3H 5H 6H 7H", "Win"), ("2H 3H 5H 6H 7H", "2S 3H 4H 5S 6C", "Win"), ("2S 3H 4H 5S 6C", "3D 4C 5H 6H 2S", "Tie"), ("2S 3H 4H 5S 6C", "AH AC 5H 6H AS", "Win"), ("2S 2H 4H 5S 4C", "AH AC 5H 6H AS", "Loss"), ("2S 2H 4H 5S 4C", "AH AC 5H 6H 7S", "Win"), ("6S AD 7H 4S AS", "AH AC 5H 6H 7S", "Loss"), ("2S AH 4H 5S KC", "AH AC 5H 6H 7S", "Loss"), ("2S 3H 6H 7S 9C", "7H 3C TH 6H 9S", "Loss"), ("4S 5H 6H TS AC", "3S 5H 6H TS AC", "Win"), ("2S AH 4H 5S 6C", "AD 4C 5H 6H 2C", "Tie"), ("AS AH 3H AD AC", "AS AH 2H AD AC", "Win"), ("AH AC 5H 5C QS", "AH AC 5H 5C KS", "Loss"), ("AH AC 5H 5C QS", "KH KC 5H 5C QS", "Win"), ("7C 7S KH 2H 7H", "3C 3S AH 2H 3H", "Win"), ("3C 3S AH 2H 3H", "7C 7S KH 2H 7H", "Loss"), ("6H 5H 4H 3H 2H", "5H 4H 3H 2H AH", "Win"), ("5H 4H 3H 2H AH", "5H 4H 3H 2H AH", "Tie"), ("5H 4H 3H 2H AH", "6H 5H 4H 3H 2H", "Loss"), ("AH AD KS KC AC", "AH KD KH AC KC", "Win"), ("2H 4D 3C AS 5S", "2H 4D 3C 6S 5S", "Loss"), ("2H 3S 3C 3H 2S", "3S 3C 2S 2H 2D", "Win"), ("4D 6D 5D 2D JH", "3S 8S 3H TC KH", "Loss"), ("4S 6C 8S 3S 7S", "AD KS 2D 7D 7C", "Loss"), ("6S 4C 7H 8C 3H", "5H JC AH 9D 9C", "Loss"), ("9D 9H JH TC QH", "3C 2S JS 5C 7H", "Win"), ("2H TC 8S AD 9S", "4H TS 7H 2C 5C", "Win"), ("9D 3S 2C 7S 7C", "JC TD 3C TC 9H", "Loss"), ) TEST_FLUSH = ( ("2H 3H 4H 5H 6H", True), ("AS AH 2H AD AC", False), ("2H 3H 5H 6H 7H", True), ("KS AS TS QS JS", True), ("8H 9H QS JS TH", False), ("AS 3S 4S 8S 2S", True), ) TEST_STRAIGHT = ( ("2H 3H 4H 5H 6H", True), ("AS AH 2H AD AC", False), ("2H 3H 5H 6H 7H", False), ("KS AS TS QS JS", True), ("8H 9H QS JS TH", True), ) TEST_FIVE_HIGH_STRAIGHT = ( ("2H 4D 3C AS 5S", True, [5, 4, 3, 2, 14]), ("2H 5D 3C AS 5S", False, [14, 5, 5, 3, 2]), ("JH QD KC AS TS", False, [14, 13, 12, 11, 10]), ("9D 3S 2C 7S 7C", False, [9, 7, 7, 3, 2]), ) TEST_KIND = ( ("JH AH TH KH QH", 0), ("JH 9H TH KH QH", 0), ("JC KH JS JD JH", 7), ("KH KC 3S 3H 3D", 6), ("8C 9C 5C 3C TC", 0), ("JS QS 9H TS KH", 0), ("7C 7S KH 2H 7H", 3), ("3C KH 5D 5S KH", 2), ("QH 8H KD 
JH 8S", 1), ("2D 6D 9D TH 7D", 0), ) TEST_TYPES = ( ("JH AH TH KH QH", 23), ("JH 9H TH KH QH", 22), ("JC KH JS JD JH", 21), ("KH KC 3S 3H 3D", 20), ("8C 9C 5C 3C TC", 19), ("JS QS 9H TS KH", 18), ("7C 7S KH 2H 7H", 17), ("3C KH 5D 5S KH", 16), ("QH 8H KD JH 8S", 15), ("2D 6D 9D TH 7D", 14), ) def generate_random_hand(): play, oppo = randrange(len(SORTED_HANDS)), randrange(len(SORTED_HANDS)) expected = ["Loss", "Tie", "Win"][(play >= oppo) + (play > oppo)] hand, other = SORTED_HANDS[play], SORTED_HANDS[oppo] return hand, other, expected def generate_random_hands(number_of_hands: int = 100): return (generate_random_hand() for _ in range(number_of_hands)) @pytest.mark.parametrize("hand, expected", TEST_FLUSH) def test_hand_is_flush(hand, expected): assert PokerHand(hand)._is_flush() == expected @pytest.mark.parametrize("hand, expected", TEST_STRAIGHT) def test_hand_is_straight(hand, expected): assert PokerHand(hand)._is_straight() == expected @pytest.mark.parametrize("hand, expected, card_values", TEST_FIVE_HIGH_STRAIGHT) def test_hand_is_five_high_straight(hand, expected, card_values): player = PokerHand(hand) assert player._is_five_high_straight() == expected assert player._card_values == card_values @pytest.mark.parametrize("hand, expected", TEST_KIND) def test_hand_is_same_kind(hand, expected): assert PokerHand(hand)._is_same_kind() == expected @pytest.mark.parametrize("hand, expected", TEST_TYPES) def test_hand_values(hand, expected): assert PokerHand(hand)._hand_type == expected @pytest.mark.parametrize("hand, other, expected", TEST_COMPARE) def test_compare_simple(hand, other, expected): assert PokerHand(hand).compare_with(PokerHand(other)) == expected @pytest.mark.parametrize("hand, other, expected", generate_random_hands()) def test_compare_random(hand, other, expected): assert PokerHand(hand).compare_with(PokerHand(other)) == expected def test_hand_sorted(): poker_hands = [PokerHand(hand) for hand in SORTED_HANDS] list_copy = poker_hands.copy() shuffle(list_copy) user_sorted = chain(sorted(list_copy)) for index, hand in enumerate(user_sorted): assert hand == poker_hands[index] def test_custom_sort_five_high_straight(): # Test that five high straights are compared correctly. pokerhands = [PokerHand("2D AC 3H 4H 5S"), PokerHand("2S 3H 4H 5S 6C")] pokerhands.sort(reverse=True) assert pokerhands[0].__str__() == "2S 3H 4H 5S 6C" def test_multiple_calls_five_high_straight(): # Multiple calls to five_high_straight function should still return True # and shouldn't mutate the list in every call other than the first. pokerhand = PokerHand("2C 4S AS 3D 5C") expected = True expected_card_values = [5, 4, 3, 2, 14] for _ in range(10): assert pokerhand._is_five_high_straight() == expected assert pokerhand._card_values == expected_card_values def test_euler_project(): # Problem number 54 from Project Euler # Testing from poker_hands.txt file answer = 0 script_dir = os.path.abspath(os.path.dirname(__file__)) poker_hands = os.path.join(script_dir, "poker_hands.txt") with open(poker_hands) as file_hand: for line in file_hand: player_hand = line[:14].strip() opponent_hand = line[15:].strip() player, opponent = PokerHand(player_hand), PokerHand(opponent_hand) output = player.compare_with(opponent) if output == "Win": answer += 1 assert answer == 376
-1
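The test row above derives the expected result of a random matchup from each hand's position in `SORTED_HANDS` (ordered weakest to strongest). A small stand-alone sketch of that index-comparison trick, with illustrative indices:

```python
# Sketch of the expected-outcome trick used in generate_random_hand():
# (a >= b) + (a > b) evaluates to 0 (a < b), 1 (a == b) or 2 (a > b),
# which indexes into ["Loss", "Tie", "Win"].
def expected_outcome(player_rank: int, opponent_rank: int) -> str:
    return ["Loss", "Tie", "Win"][(player_rank >= opponent_rank) + (player_rank > opponent_rank)]

print(expected_outcome(3, 7))  # Loss
print(expected_outcome(5, 5))  # Tie
print(expected_outcome(9, 2))  # Win
```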
TheAlgorithms/Python
7,844
Remove file-level flake8 suppression
### Describe your change: Some files were completely excluded from flake8 checks. - Remove file-level suppressions - Fix simple warnings - Suppress long line indocstring * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
Cjkjvfnby
"2022-10-29T13:08:44Z"
"2022-10-29T20:45:21Z"
47ddba1d914bf5955a244056e794e718dee9ead1
1550731cb7457ddae216da2ffe0bc1587f5234f3
Remove file-level flake8 suppression. ### Describe your change: Some files were completely excluded from flake8 checks. - Remove file-level suppressions - Fix simple warnings - Suppress long line indocstring * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
-1
TheAlgorithms/Python
7,844
Remove file-level flake8 suppression
### Describe your change: Some files were completely excluded from flake8 checks. - Remove file-level suppressions - Fix simple warnings - Suppress long line indocstring * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
Cjkjvfnby
"2022-10-29T13:08:44Z"
"2022-10-29T20:45:21Z"
47ddba1d914bf5955a244056e794e718dee9ead1
1550731cb7457ddae216da2ffe0bc1587f5234f3
Remove file-level flake8 suppression. ### Describe your change: Some files were completely excluded from flake8 checks. - Remove file-level suppressions - Fix simple warnings - Suppress long line indocstring * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
""" Created on Thu Oct 5 16:44:23 2017 @author: Christian Bender This Python library contains some useful functions to deal with prime numbers and whole numbers. Overview: is_prime(number) sieve_er(N) get_prime_numbers(N) prime_factorization(number) greatest_prime_factor(number) smallest_prime_factor(number) get_prime(n) get_primes_between(pNumber1, pNumber2) ---- is_even(number) is_odd(number) gcd(number1, number2) // greatest common divisor kg_v(number1, number2) // least common multiple get_divisors(number) // all divisors of 'number' inclusive 1, number is_perfect_number(number) NEW-FUNCTIONS simplify_fraction(numerator, denominator) factorial (n) // n! fib (n) // calculate the n-th fibonacci term. ----- goldbach(number) // Goldbach's assumption """ from math import sqrt def is_prime(number: int) -> bool: """ input: positive integer 'number' returns true if 'number' is prime otherwise false. """ # precondition assert isinstance(number, int) and ( number >= 0 ), "'number' must been an int and positive" status = True # 0 and 1 are none primes. if number <= 1: status = False for divisor in range(2, int(round(sqrt(number))) + 1): # if 'number' divisible by 'divisor' then sets 'status' # of false and break up the loop. if number % divisor == 0: status = False break # precondition assert isinstance(status, bool), "'status' must been from type bool" return status # ------------------------------------------ def sieve_er(n): """ input: positive integer 'N' > 2 returns a list of prime numbers from 2 up to N. This function implements the algorithm called sieve of erathostenes. """ # precondition assert isinstance(n, int) and (n > 2), "'N' must been an int and > 2" # beginList: contains all natural numbers from 2 up to N begin_list = list(range(2, n + 1)) ans = [] # this list will be returns. # actual sieve of erathostenes for i in range(len(begin_list)): for j in range(i + 1, len(begin_list)): if (begin_list[i] != 0) and (begin_list[j] % begin_list[i] == 0): begin_list[j] = 0 # filters actual prime numbers. ans = [x for x in begin_list if x != 0] # precondition assert isinstance(ans, list), "'ans' must been from type list" return ans # -------------------------------- def get_prime_numbers(n): """ input: positive integer 'N' > 2 returns a list of prime numbers from 2 up to N (inclusive) This function is more efficient as function 'sieveEr(...)' """ # precondition assert isinstance(n, int) and (n > 2), "'N' must been an int and > 2" ans = [] # iterates over all numbers between 2 up to N+1 # if a number is prime then appends to list 'ans' for number in range(2, n + 1): if is_prime(number): ans.append(number) # precondition assert isinstance(ans, list), "'ans' must been from type list" return ans # ----------------------------------------- def prime_factorization(number): """ input: positive integer 'number' returns a list of the prime number factors of 'number' """ # precondition assert isinstance(number, int) and number >= 0, "'number' must been an int and >= 0" ans = [] # this list will be returns of the function. # potential prime number factors. 
factor = 2 quotient = number if number == 0 or number == 1: ans.append(number) # if 'number' not prime then builds the prime factorization of 'number' elif not is_prime(number): while quotient != 1: if is_prime(factor) and (quotient % factor == 0): ans.append(factor) quotient /= factor else: factor += 1 else: ans.append(number) # precondition assert isinstance(ans, list), "'ans' must been from type list" return ans # ----------------------------------------- def greatest_prime_factor(number): """ input: positive integer 'number' >= 0 returns the greatest prime number factor of 'number' """ # precondition assert isinstance(number, int) and ( number >= 0 ), "'number' bust been an int and >= 0" ans = 0 # prime factorization of 'number' prime_factors = prime_factorization(number) ans = max(prime_factors) # precondition assert isinstance(ans, int), "'ans' must been from type int" return ans # ---------------------------------------------- def smallest_prime_factor(number): """ input: integer 'number' >= 0 returns the smallest prime number factor of 'number' """ # precondition assert isinstance(number, int) and ( number >= 0 ), "'number' bust been an int and >= 0" ans = 0 # prime factorization of 'number' prime_factors = prime_factorization(number) ans = min(prime_factors) # precondition assert isinstance(ans, int), "'ans' must been from type int" return ans # ---------------------- def is_even(number): """ input: integer 'number' returns true if 'number' is even, otherwise false. """ # precondition assert isinstance(number, int), "'number' must been an int" assert isinstance(number % 2 == 0, bool), "compare bust been from type bool" return number % 2 == 0 # ------------------------ def is_odd(number): """ input: integer 'number' returns true if 'number' is odd, otherwise false. """ # precondition assert isinstance(number, int), "'number' must been an int" assert isinstance(number % 2 != 0, bool), "compare bust been from type bool" return number % 2 != 0 # ------------------------ def goldbach(number): """ Goldbach's assumption input: a even positive integer 'number' > 2 returns a list of two prime numbers whose sum is equal to 'number' """ # precondition assert ( isinstance(number, int) and (number > 2) and is_even(number) ), "'number' must been an int, even and > 2" ans = [] # this list will returned # creates a list of prime numbers between 2 up to 'number' prime_numbers = get_prime_numbers(number) len_pn = len(prime_numbers) # run variable for while-loops. i = 0 j = None # exit variable. for break up the loops loop = True while i < len_pn and loop: j = i + 1 while j < len_pn and loop: if prime_numbers[i] + prime_numbers[j] == number: loop = False ans.append(prime_numbers[i]) ans.append(prime_numbers[j]) j += 1 i += 1 # precondition assert ( isinstance(ans, list) and (len(ans) == 2) and (ans[0] + ans[1] == number) and is_prime(ans[0]) and is_prime(ans[1]) ), "'ans' must contains two primes. And sum of elements must been eq 'number'" return ans # ---------------------------------------------- def gcd(number1, number2): """ Greatest common divisor input: two positive integer 'number1' and 'number2' returns the greatest common divisor of 'number1' and 'number2' """ # precondition assert ( isinstance(number1, int) and isinstance(number2, int) and (number1 >= 0) and (number2 >= 0) ), "'number1' and 'number2' must been positive integer." 
rest = 0 while number2 != 0: rest = number1 % number2 number1 = number2 number2 = rest # precondition assert isinstance(number1, int) and ( number1 >= 0 ), "'number' must been from type int and positive" return number1 # ---------------------------------------------------- def kg_v(number1, number2): """ Least common multiple input: two positive integer 'number1' and 'number2' returns the least common multiple of 'number1' and 'number2' """ # precondition assert ( isinstance(number1, int) and isinstance(number2, int) and (number1 >= 1) and (number2 >= 1) ), "'number1' and 'number2' must been positive integer." ans = 1 # actual answer that will be return. # for kgV (x,1) if number1 > 1 and number2 > 1: # builds the prime factorization of 'number1' and 'number2' prime_fac_1 = prime_factorization(number1) prime_fac_2 = prime_factorization(number2) elif number1 == 1 or number2 == 1: prime_fac_1 = [] prime_fac_2 = [] ans = max(number1, number2) count1 = 0 count2 = 0 done = [] # captured numbers int both 'primeFac1' and 'primeFac2' # iterates through primeFac1 for n in prime_fac_1: if n not in done: if n in prime_fac_2: count1 = prime_fac_1.count(n) count2 = prime_fac_2.count(n) for _ in range(max(count1, count2)): ans *= n else: count1 = prime_fac_1.count(n) for _ in range(count1): ans *= n done.append(n) # iterates through primeFac2 for n in prime_fac_2: if n not in done: count2 = prime_fac_2.count(n) for _ in range(count2): ans *= n done.append(n) # precondition assert isinstance(ans, int) and ( ans >= 0 ), "'ans' must been from type int and positive" return ans # ---------------------------------- def get_prime(n): """ Gets the n-th prime number. input: positive integer 'n' >= 0 returns the n-th prime number, beginning at index 0 """ # precondition assert isinstance(n, int) and (n >= 0), "'number' must been a positive int" index = 0 ans = 2 # this variable holds the answer while index < n: index += 1 ans += 1 # counts to the next number # if ans not prime then # runs to the next prime number. while not is_prime(ans): ans += 1 # precondition assert isinstance(ans, int) and is_prime( ans ), "'ans' must been a prime number and from type int" return ans # --------------------------------------------------- def get_primes_between(p_number_1, p_number_2): """ input: prime numbers 'pNumber1' and 'pNumber2' pNumber1 < pNumber2 returns a list of all prime numbers between 'pNumber1' (exclusive) and 'pNumber2' (exclusive) """ # precondition assert ( is_prime(p_number_1) and is_prime(p_number_2) and (p_number_1 < p_number_2) ), "The arguments must been prime numbers and 'pNumber1' < 'pNumber2'" number = p_number_1 + 1 # jump to the next number ans = [] # this list will be returns. # if number is not prime then # fetch the next prime number. while not is_prime(number): number += 1 while number < p_number_2: ans.append(number) number += 1 # fetch the next prime number. while not is_prime(number): number += 1 # precondition assert ( isinstance(ans, list) and ans[0] != p_number_1 and ans[len(ans) - 1] != p_number_2 ), "'ans' must been a list without the arguments" # 'ans' contains not 'pNumber1' and 'pNumber2' ! return ans # ---------------------------------------------------- def get_divisors(n): """ input: positive integer 'n' >= 1 returns all divisors of n (inclusive 1 and 'n') """ # precondition assert isinstance(n, int) and (n >= 1), "'n' must been int and >= 1" ans = [] # will be returned. 
for divisor in range(1, n + 1): if n % divisor == 0: ans.append(divisor) # precondition assert ans[0] == 1 and ans[len(ans) - 1] == n, "Error in function getDivisiors(...)" return ans # ---------------------------------------------------- def is_perfect_number(number): """ input: positive integer 'number' > 1 returns true if 'number' is a perfect number otherwise false. """ # precondition assert isinstance(number, int) and ( number > 1 ), "'number' must been an int and >= 1" divisors = get_divisors(number) # precondition assert ( isinstance(divisors, list) and (divisors[0] == 1) and (divisors[len(divisors) - 1] == number) ), "Error in help-function getDivisiors(...)" # summed all divisors up to 'number' (exclusive), hence [:-1] return sum(divisors[:-1]) == number # ------------------------------------------------------------ def simplify_fraction(numerator, denominator): """ input: two integer 'numerator' and 'denominator' assumes: 'denominator' != 0 returns: a tuple with simplify numerator and denominator. """ # precondition assert ( isinstance(numerator, int) and isinstance(denominator, int) and (denominator != 0) ), "The arguments must been from type int and 'denominator' != 0" # build the greatest common divisor of numerator and denominator. gcd_of_fraction = gcd(abs(numerator), abs(denominator)) # precondition assert ( isinstance(gcd_of_fraction, int) and (numerator % gcd_of_fraction == 0) and (denominator % gcd_of_fraction == 0) ), "Error in function gcd(...,...)" return (numerator // gcd_of_fraction, denominator // gcd_of_fraction) # ----------------------------------------------------------------- def factorial(n): """ input: positive integer 'n' returns the factorial of 'n' (n!) """ # precondition assert isinstance(n, int) and (n >= 0), "'n' must been a int and >= 0" ans = 1 # this will be return. for factor in range(1, n + 1): ans *= factor return ans # ------------------------------------------------------------------- def fib(n): """ input: positive integer 'n' returns the n-th fibonacci term , indexing by 0 """ # precondition assert isinstance(n, int) and (n >= 0), "'n' must been an int and >= 0" tmp = 0 fib1 = 1 ans = 1 # this will be return for _ in range(n - 1): tmp = ans ans += fib1 fib1 = tmp return ans
""" Created on Thu Oct 5 16:44:23 2017 @author: Christian Bender This Python library contains some useful functions to deal with prime numbers and whole numbers. Overview: is_prime(number) sieve_er(N) get_prime_numbers(N) prime_factorization(number) greatest_prime_factor(number) smallest_prime_factor(number) get_prime(n) get_primes_between(pNumber1, pNumber2) ---- is_even(number) is_odd(number) gcd(number1, number2) // greatest common divisor kg_v(number1, number2) // least common multiple get_divisors(number) // all divisors of 'number' inclusive 1, number is_perfect_number(number) NEW-FUNCTIONS simplify_fraction(numerator, denominator) factorial (n) // n! fib (n) // calculate the n-th fibonacci term. ----- goldbach(number) // Goldbach's assumption """ from math import sqrt def is_prime(number: int) -> bool: """ input: positive integer 'number' returns true if 'number' is prime otherwise false. """ # precondition assert isinstance(number, int) and ( number >= 0 ), "'number' must been an int and positive" status = True # 0 and 1 are none primes. if number <= 1: status = False for divisor in range(2, int(round(sqrt(number))) + 1): # if 'number' divisible by 'divisor' then sets 'status' # of false and break up the loop. if number % divisor == 0: status = False break # precondition assert isinstance(status, bool), "'status' must been from type bool" return status # ------------------------------------------ def sieve_er(n): """ input: positive integer 'N' > 2 returns a list of prime numbers from 2 up to N. This function implements the algorithm called sieve of erathostenes. """ # precondition assert isinstance(n, int) and (n > 2), "'N' must been an int and > 2" # beginList: contains all natural numbers from 2 up to N begin_list = list(range(2, n + 1)) ans = [] # this list will be returns. # actual sieve of erathostenes for i in range(len(begin_list)): for j in range(i + 1, len(begin_list)): if (begin_list[i] != 0) and (begin_list[j] % begin_list[i] == 0): begin_list[j] = 0 # filters actual prime numbers. ans = [x for x in begin_list if x != 0] # precondition assert isinstance(ans, list), "'ans' must been from type list" return ans # -------------------------------- def get_prime_numbers(n): """ input: positive integer 'N' > 2 returns a list of prime numbers from 2 up to N (inclusive) This function is more efficient as function 'sieveEr(...)' """ # precondition assert isinstance(n, int) and (n > 2), "'N' must been an int and > 2" ans = [] # iterates over all numbers between 2 up to N+1 # if a number is prime then appends to list 'ans' for number in range(2, n + 1): if is_prime(number): ans.append(number) # precondition assert isinstance(ans, list), "'ans' must been from type list" return ans # ----------------------------------------- def prime_factorization(number): """ input: positive integer 'number' returns a list of the prime number factors of 'number' """ # precondition assert isinstance(number, int) and number >= 0, "'number' must been an int and >= 0" ans = [] # this list will be returns of the function. # potential prime number factors. 
factor = 2 quotient = number if number == 0 or number == 1: ans.append(number) # if 'number' not prime then builds the prime factorization of 'number' elif not is_prime(number): while quotient != 1: if is_prime(factor) and (quotient % factor == 0): ans.append(factor) quotient /= factor else: factor += 1 else: ans.append(number) # precondition assert isinstance(ans, list), "'ans' must been from type list" return ans # ----------------------------------------- def greatest_prime_factor(number): """ input: positive integer 'number' >= 0 returns the greatest prime number factor of 'number' """ # precondition assert isinstance(number, int) and ( number >= 0 ), "'number' bust been an int and >= 0" ans = 0 # prime factorization of 'number' prime_factors = prime_factorization(number) ans = max(prime_factors) # precondition assert isinstance(ans, int), "'ans' must been from type int" return ans # ---------------------------------------------- def smallest_prime_factor(number): """ input: integer 'number' >= 0 returns the smallest prime number factor of 'number' """ # precondition assert isinstance(number, int) and ( number >= 0 ), "'number' bust been an int and >= 0" ans = 0 # prime factorization of 'number' prime_factors = prime_factorization(number) ans = min(prime_factors) # precondition assert isinstance(ans, int), "'ans' must been from type int" return ans # ---------------------- def is_even(number): """ input: integer 'number' returns true if 'number' is even, otherwise false. """ # precondition assert isinstance(number, int), "'number' must been an int" assert isinstance(number % 2 == 0, bool), "compare bust been from type bool" return number % 2 == 0 # ------------------------ def is_odd(number): """ input: integer 'number' returns true if 'number' is odd, otherwise false. """ # precondition assert isinstance(number, int), "'number' must been an int" assert isinstance(number % 2 != 0, bool), "compare bust been from type bool" return number % 2 != 0 # ------------------------ def goldbach(number): """ Goldbach's assumption input: a even positive integer 'number' > 2 returns a list of two prime numbers whose sum is equal to 'number' """ # precondition assert ( isinstance(number, int) and (number > 2) and is_even(number) ), "'number' must been an int, even and > 2" ans = [] # this list will returned # creates a list of prime numbers between 2 up to 'number' prime_numbers = get_prime_numbers(number) len_pn = len(prime_numbers) # run variable for while-loops. i = 0 j = None # exit variable. for break up the loops loop = True while i < len_pn and loop: j = i + 1 while j < len_pn and loop: if prime_numbers[i] + prime_numbers[j] == number: loop = False ans.append(prime_numbers[i]) ans.append(prime_numbers[j]) j += 1 i += 1 # precondition assert ( isinstance(ans, list) and (len(ans) == 2) and (ans[0] + ans[1] == number) and is_prime(ans[0]) and is_prime(ans[1]) ), "'ans' must contains two primes. And sum of elements must been eq 'number'" return ans # ---------------------------------------------- def gcd(number1, number2): """ Greatest common divisor input: two positive integer 'number1' and 'number2' returns the greatest common divisor of 'number1' and 'number2' """ # precondition assert ( isinstance(number1, int) and isinstance(number2, int) and (number1 >= 0) and (number2 >= 0) ), "'number1' and 'number2' must been positive integer." 
rest = 0 while number2 != 0: rest = number1 % number2 number1 = number2 number2 = rest # precondition assert isinstance(number1, int) and ( number1 >= 0 ), "'number' must been from type int and positive" return number1 # ---------------------------------------------------- def kg_v(number1, number2): """ Least common multiple input: two positive integer 'number1' and 'number2' returns the least common multiple of 'number1' and 'number2' """ # precondition assert ( isinstance(number1, int) and isinstance(number2, int) and (number1 >= 1) and (number2 >= 1) ), "'number1' and 'number2' must been positive integer." ans = 1 # actual answer that will be return. # for kgV (x,1) if number1 > 1 and number2 > 1: # builds the prime factorization of 'number1' and 'number2' prime_fac_1 = prime_factorization(number1) prime_fac_2 = prime_factorization(number2) elif number1 == 1 or number2 == 1: prime_fac_1 = [] prime_fac_2 = [] ans = max(number1, number2) count1 = 0 count2 = 0 done = [] # captured numbers int both 'primeFac1' and 'primeFac2' # iterates through primeFac1 for n in prime_fac_1: if n not in done: if n in prime_fac_2: count1 = prime_fac_1.count(n) count2 = prime_fac_2.count(n) for _ in range(max(count1, count2)): ans *= n else: count1 = prime_fac_1.count(n) for _ in range(count1): ans *= n done.append(n) # iterates through primeFac2 for n in prime_fac_2: if n not in done: count2 = prime_fac_2.count(n) for _ in range(count2): ans *= n done.append(n) # precondition assert isinstance(ans, int) and ( ans >= 0 ), "'ans' must been from type int and positive" return ans # ---------------------------------- def get_prime(n): """ Gets the n-th prime number. input: positive integer 'n' >= 0 returns the n-th prime number, beginning at index 0 """ # precondition assert isinstance(n, int) and (n >= 0), "'number' must been a positive int" index = 0 ans = 2 # this variable holds the answer while index < n: index += 1 ans += 1 # counts to the next number # if ans not prime then # runs to the next prime number. while not is_prime(ans): ans += 1 # precondition assert isinstance(ans, int) and is_prime( ans ), "'ans' must been a prime number and from type int" return ans # --------------------------------------------------- def get_primes_between(p_number_1, p_number_2): """ input: prime numbers 'pNumber1' and 'pNumber2' pNumber1 < pNumber2 returns a list of all prime numbers between 'pNumber1' (exclusive) and 'pNumber2' (exclusive) """ # precondition assert ( is_prime(p_number_1) and is_prime(p_number_2) and (p_number_1 < p_number_2) ), "The arguments must been prime numbers and 'pNumber1' < 'pNumber2'" number = p_number_1 + 1 # jump to the next number ans = [] # this list will be returns. # if number is not prime then # fetch the next prime number. while not is_prime(number): number += 1 while number < p_number_2: ans.append(number) number += 1 # fetch the next prime number. while not is_prime(number): number += 1 # precondition assert ( isinstance(ans, list) and ans[0] != p_number_1 and ans[len(ans) - 1] != p_number_2 ), "'ans' must been a list without the arguments" # 'ans' contains not 'pNumber1' and 'pNumber2' ! return ans # ---------------------------------------------------- def get_divisors(n): """ input: positive integer 'n' >= 1 returns all divisors of n (inclusive 1 and 'n') """ # precondition assert isinstance(n, int) and (n >= 1), "'n' must been int and >= 1" ans = [] # will be returned. 
for divisor in range(1, n + 1): if n % divisor == 0: ans.append(divisor) # precondition assert ans[0] == 1 and ans[len(ans) - 1] == n, "Error in function getDivisiors(...)" return ans # ---------------------------------------------------- def is_perfect_number(number): """ input: positive integer 'number' > 1 returns true if 'number' is a perfect number otherwise false. """ # precondition assert isinstance(number, int) and ( number > 1 ), "'number' must been an int and >= 1" divisors = get_divisors(number) # precondition assert ( isinstance(divisors, list) and (divisors[0] == 1) and (divisors[len(divisors) - 1] == number) ), "Error in help-function getDivisiors(...)" # summed all divisors up to 'number' (exclusive), hence [:-1] return sum(divisors[:-1]) == number # ------------------------------------------------------------ def simplify_fraction(numerator, denominator): """ input: two integer 'numerator' and 'denominator' assumes: 'denominator' != 0 returns: a tuple with simplify numerator and denominator. """ # precondition assert ( isinstance(numerator, int) and isinstance(denominator, int) and (denominator != 0) ), "The arguments must been from type int and 'denominator' != 0" # build the greatest common divisor of numerator and denominator. gcd_of_fraction = gcd(abs(numerator), abs(denominator)) # precondition assert ( isinstance(gcd_of_fraction, int) and (numerator % gcd_of_fraction == 0) and (denominator % gcd_of_fraction == 0) ), "Error in function gcd(...,...)" return (numerator // gcd_of_fraction, denominator // gcd_of_fraction) # ----------------------------------------------------------------- def factorial(n): """ input: positive integer 'n' returns the factorial of 'n' (n!) """ # precondition assert isinstance(n, int) and (n >= 0), "'n' must been a int and >= 0" ans = 1 # this will be return. for factor in range(1, n + 1): ans *= factor return ans # ------------------------------------------------------------------- def fib(n): """ input: positive integer 'n' returns the n-th fibonacci term , indexing by 0 """ # precondition assert isinstance(n, int) and (n >= 0), "'n' must been an int and >= 0" tmp = 0 fib1 = 1 ans = 1 # this will be return for _ in range(n - 1): tmp = ans ans += fib1 fib1 = tmp return ans
-1
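For orientation, a minimal usage sketch of the number-theory helpers above; it assumes the file is importable under the hypothetical name primelib.py (that module name is not part of the original code).

from primelib import fib, goldbach, kg_v, prime_factorization, sieve_er  # hypothetical module name

print(sieve_er(30))              # [2, 3, 5, 7, 11, 13, 17, 19, 23, 29]
print(prime_factorization(300))  # [2, 2, 3, 5, 5]
print(kg_v(24, 36))              # 72, the least common multiple
print(goldbach(28))              # [5, 23], two primes summing to 28
print(fib(5))                    # 8, with the 0-indexed convention used above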
TheAlgorithms/Python
7,844
Remove file-level flake8 suppression
### Describe your change: Some files were completely excluded from flake8 checks. - Remove file-level suppressions - Fix simple warnings - Suppress long line indocstring * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
Cjkjvfnby
"2022-10-29T13:08:44Z"
"2022-10-29T20:45:21Z"
47ddba1d914bf5955a244056e794e718dee9ead1
1550731cb7457ddae216da2ffe0bc1587f5234f3
Remove file-level flake8 suppression. ### Describe your change: Some files were completely excluded from flake8 checks. - Remove file-level suppressions - Fix simple warnings - Suppress long line indocstring * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
""" Scrape the price and pharmacy name for a prescription drug from rx site after providing the drug name and zipcode. """ from urllib.error import HTTPError from bs4 import BeautifulSoup from requests import exceptions, get BASE_URL = "https://www.wellrx.com/prescriptions/{0}/{1}/?freshSearch=true" def fetch_pharmacy_and_price_list(drug_name: str, zip_code: str) -> list | None: """[summary] This function will take input of drug name and zipcode, then request to the BASE_URL site. Get the page data and scrape it to the generate the list of lowest prices for the prescription drug. Args: drug_name (str): [Drug name] zip_code(str): [Zip code] Returns: list: [List of pharmacy name and price] >>> fetch_pharmacy_and_price_list(None, None) >>> fetch_pharmacy_and_price_list(None, 30303) >>> fetch_pharmacy_and_price_list("eliquis", None) """ try: # Has user provided both inputs? if not drug_name or not zip_code: return None request_url = BASE_URL.format(drug_name, zip_code) response = get(request_url) # Is the response ok? response.raise_for_status() # Scrape the data using bs4 soup = BeautifulSoup(response.text, "html.parser") # This list will store the name and price. pharmacy_price_list = [] # Fetch all the grids that contains the items. grid_list = soup.find_all("div", {"class": "grid-x pharmCard"}) if grid_list and len(grid_list) > 0: for grid in grid_list: # Get the pharmacy price. pharmacy_name = grid.find("p", {"class": "list-title"}).text # Get price of the drug. price = grid.find("span", {"p", "price price-large"}).text pharmacy_price_list.append( { "pharmacy_name": pharmacy_name, "price": price, } ) return pharmacy_price_list except (HTTPError, exceptions.RequestException, ValueError): return None if __name__ == "__main__": # Enter a drug name and a zip code drug_name = input("Enter drug name: ").strip() zip_code = input("Enter zip code: ").strip() pharmacy_price_list: list | None = fetch_pharmacy_and_price_list( drug_name, zip_code ) if pharmacy_price_list: print(f"\nSearch results for {drug_name} at location {zip_code}:") for pharmacy_price in pharmacy_price_list: name = pharmacy_price["pharmacy_name"] price = pharmacy_price["price"] print(f"Pharmacy: {name} Price: {price}") else: print(f"No results found for {drug_name}")
""" Scrape the price and pharmacy name for a prescription drug from rx site after providing the drug name and zipcode. """ from urllib.error import HTTPError from bs4 import BeautifulSoup from requests import exceptions, get BASE_URL = "https://www.wellrx.com/prescriptions/{0}/{1}/?freshSearch=true" def fetch_pharmacy_and_price_list(drug_name: str, zip_code: str) -> list | None: """[summary] This function will take input of drug name and zipcode, then request to the BASE_URL site. Get the page data and scrape it to the generate the list of lowest prices for the prescription drug. Args: drug_name (str): [Drug name] zip_code(str): [Zip code] Returns: list: [List of pharmacy name and price] >>> fetch_pharmacy_and_price_list(None, None) >>> fetch_pharmacy_and_price_list(None, 30303) >>> fetch_pharmacy_and_price_list("eliquis", None) """ try: # Has user provided both inputs? if not drug_name or not zip_code: return None request_url = BASE_URL.format(drug_name, zip_code) response = get(request_url) # Is the response ok? response.raise_for_status() # Scrape the data using bs4 soup = BeautifulSoup(response.text, "html.parser") # This list will store the name and price. pharmacy_price_list = [] # Fetch all the grids that contains the items. grid_list = soup.find_all("div", {"class": "grid-x pharmCard"}) if grid_list and len(grid_list) > 0: for grid in grid_list: # Get the pharmacy price. pharmacy_name = grid.find("p", {"class": "list-title"}).text # Get price of the drug. price = grid.find("span", {"p", "price price-large"}).text pharmacy_price_list.append( { "pharmacy_name": pharmacy_name, "price": price, } ) return pharmacy_price_list except (HTTPError, exceptions.RequestException, ValueError): return None if __name__ == "__main__": # Enter a drug name and a zip code drug_name = input("Enter drug name: ").strip() zip_code = input("Enter zip code: ").strip() pharmacy_price_list: list | None = fetch_pharmacy_and_price_list( drug_name, zip_code ) if pharmacy_price_list: print(f"\nSearch results for {drug_name} at location {zip_code}:") for pharmacy_price in pharmacy_price_list: name = pharmacy_price["pharmacy_name"] price = pharmacy_price["price"] print(f"Pharmacy: {name} Price: {price}") else: print(f"No results found for {drug_name}")
-1
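As a side note, a small offline sketch of the BeautifulSoup lookups the scraper relies on; the sample markup is invented for illustration and the snippet only assumes bs4 is installed.

from bs4 import BeautifulSoup

# Invented markup mirroring the grid/class structure the scraper expects.
sample_html = """
<div class="grid-x pharmCard">
  <p class="list-title">Example Pharmacy</p>
  <span class="price price-large">$12.34</span>
</div>
"""
soup = BeautifulSoup(sample_html, "html.parser")
for grid in soup.find_all("div", {"class": "grid-x pharmCard"}):
    name = grid.find("p", {"class": "list-title"}).text
    price = grid.find("span", {"class": "price price-large"}).text
    print({"pharmacy_name": name, "price": price})
    # {'pharmacy_name': 'Example Pharmacy', 'price': '$12.34'}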
TheAlgorithms/Python
7,844
Remove file-level flake8 suppression
### Describe your change: Some files were completely excluded from flake8 checks. - Remove file-level suppressions - Fix simple warnings - Suppress long line indocstring * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
Cjkjvfnby
"2022-10-29T13:08:44Z"
"2022-10-29T20:45:21Z"
47ddba1d914bf5955a244056e794e718dee9ead1
1550731cb7457ddae216da2ffe0bc1587f5234f3
Remove file-level flake8 suppression. ### Describe your change: Some files were completely excluded from flake8 checks. - Remove file-level suppressions - Fix simple warnings - Suppress long line indocstring * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
""" Convert between different units of temperature """ def celsius_to_fahrenheit(celsius: float, ndigits: int = 2) -> float: """ Convert a given value from Celsius to Fahrenheit and round it to 2 decimal places. Wikipedia reference: https://en.wikipedia.org/wiki/Celsius Wikipedia reference: https://en.wikipedia.org/wiki/Fahrenheit >>> celsius_to_fahrenheit(273.354, 3) 524.037 >>> celsius_to_fahrenheit(273.354, 0) 524.0 >>> celsius_to_fahrenheit(-40.0) -40.0 >>> celsius_to_fahrenheit(-20.0) -4.0 >>> celsius_to_fahrenheit(0) 32.0 >>> celsius_to_fahrenheit(20) 68.0 >>> celsius_to_fahrenheit("40") 104.0 >>> celsius_to_fahrenheit("celsius") Traceback (most recent call last): ... ValueError: could not convert string to float: 'celsius' """ return round((float(celsius) * 9 / 5) + 32, ndigits) def celsius_to_kelvin(celsius: float, ndigits: int = 2) -> float: """ Convert a given value from Celsius to Kelvin and round it to 2 decimal places. Wikipedia reference: https://en.wikipedia.org/wiki/Celsius Wikipedia reference: https://en.wikipedia.org/wiki/Kelvin >>> celsius_to_kelvin(273.354, 3) 546.504 >>> celsius_to_kelvin(273.354, 0) 547.0 >>> celsius_to_kelvin(0) 273.15 >>> celsius_to_kelvin(20.0) 293.15 >>> celsius_to_kelvin("40") 313.15 >>> celsius_to_kelvin("celsius") Traceback (most recent call last): ... ValueError: could not convert string to float: 'celsius' """ return round(float(celsius) + 273.15, ndigits) def celsius_to_rankine(celsius: float, ndigits: int = 2) -> float: """ Convert a given value from Celsius to Rankine and round it to 2 decimal places. Wikipedia reference: https://en.wikipedia.org/wiki/Celsius Wikipedia reference: https://en.wikipedia.org/wiki/Rankine_scale >>> celsius_to_rankine(273.354, 3) 983.707 >>> celsius_to_rankine(273.354, 0) 984.0 >>> celsius_to_rankine(0) 491.67 >>> celsius_to_rankine(20.0) 527.67 >>> celsius_to_rankine("40") 563.67 >>> celsius_to_rankine("celsius") Traceback (most recent call last): ... ValueError: could not convert string to float: 'celsius' """ return round((float(celsius) * 9 / 5) + 491.67, ndigits) def fahrenheit_to_celsius(fahrenheit: float, ndigits: int = 2) -> float: """ Convert a given value from Fahrenheit to Celsius and round it to 2 decimal places. Wikipedia reference: https://en.wikipedia.org/wiki/Fahrenheit Wikipedia reference: https://en.wikipedia.org/wiki/Celsius >>> fahrenheit_to_celsius(273.354, 3) 134.086 >>> fahrenheit_to_celsius(273.354, 0) 134.0 >>> fahrenheit_to_celsius(0) -17.78 >>> fahrenheit_to_celsius(20.0) -6.67 >>> fahrenheit_to_celsius(40.0) 4.44 >>> fahrenheit_to_celsius(60) 15.56 >>> fahrenheit_to_celsius(80) 26.67 >>> fahrenheit_to_celsius("100") 37.78 >>> fahrenheit_to_celsius("fahrenheit") Traceback (most recent call last): ... ValueError: could not convert string to float: 'fahrenheit' """ return round((float(fahrenheit) - 32) * 5 / 9, ndigits) def fahrenheit_to_kelvin(fahrenheit: float, ndigits: int = 2) -> float: """ Convert a given value from Fahrenheit to Kelvin and round it to 2 decimal places. Wikipedia reference: https://en.wikipedia.org/wiki/Fahrenheit Wikipedia reference: https://en.wikipedia.org/wiki/Kelvin >>> fahrenheit_to_kelvin(273.354, 3) 407.236 >>> fahrenheit_to_kelvin(273.354, 0) 407.0 >>> fahrenheit_to_kelvin(0) 255.37 >>> fahrenheit_to_kelvin(20.0) 266.48 >>> fahrenheit_to_kelvin(40.0) 277.59 >>> fahrenheit_to_kelvin(60) 288.71 >>> fahrenheit_to_kelvin(80) 299.82 >>> fahrenheit_to_kelvin("100") 310.93 >>> fahrenheit_to_kelvin("fahrenheit") Traceback (most recent call last): ... 
ValueError: could not convert string to float: 'fahrenheit' """ return round(((float(fahrenheit) - 32) * 5 / 9) + 273.15, ndigits) def fahrenheit_to_rankine(fahrenheit: float, ndigits: int = 2) -> float: """ Convert a given value from Fahrenheit to Rankine and round it to 2 decimal places. Wikipedia reference: https://en.wikipedia.org/wiki/Fahrenheit Wikipedia reference: https://en.wikipedia.org/wiki/Rankine_scale >>> fahrenheit_to_rankine(273.354, 3) 733.024 >>> fahrenheit_to_rankine(273.354, 0) 733.0 >>> fahrenheit_to_rankine(0) 459.67 >>> fahrenheit_to_rankine(20.0) 479.67 >>> fahrenheit_to_rankine(40.0) 499.67 >>> fahrenheit_to_rankine(60) 519.67 >>> fahrenheit_to_rankine(80) 539.67 >>> fahrenheit_to_rankine("100") 559.67 >>> fahrenheit_to_rankine("fahrenheit") Traceback (most recent call last): ... ValueError: could not convert string to float: 'fahrenheit' """ return round(float(fahrenheit) + 459.67, ndigits) def kelvin_to_celsius(kelvin: float, ndigits: int = 2) -> float: """ Convert a given value from Kelvin to Celsius and round it to 2 decimal places. Wikipedia reference: https://en.wikipedia.org/wiki/Kelvin Wikipedia reference: https://en.wikipedia.org/wiki/Celsius >>> kelvin_to_celsius(273.354, 3) 0.204 >>> kelvin_to_celsius(273.354, 0) 0.0 >>> kelvin_to_celsius(273.15) 0.0 >>> kelvin_to_celsius(300) 26.85 >>> kelvin_to_celsius("315.5") 42.35 >>> kelvin_to_celsius("kelvin") Traceback (most recent call last): ... ValueError: could not convert string to float: 'kelvin' """ return round(float(kelvin) - 273.15, ndigits) def kelvin_to_fahrenheit(kelvin: float, ndigits: int = 2) -> float: """ Convert a given value from Kelvin to Fahrenheit and round it to 2 decimal places. Wikipedia reference: https://en.wikipedia.org/wiki/Kelvin Wikipedia reference: https://en.wikipedia.org/wiki/Fahrenheit >>> kelvin_to_fahrenheit(273.354, 3) 32.367 >>> kelvin_to_fahrenheit(273.354, 0) 32.0 >>> kelvin_to_fahrenheit(273.15) 32.0 >>> kelvin_to_fahrenheit(300) 80.33 >>> kelvin_to_fahrenheit("315.5") 108.23 >>> kelvin_to_fahrenheit("kelvin") Traceback (most recent call last): ... ValueError: could not convert string to float: 'kelvin' """ return round(((float(kelvin) - 273.15) * 9 / 5) + 32, ndigits) def kelvin_to_rankine(kelvin: float, ndigits: int = 2) -> float: """ Convert a given value from Kelvin to Rankine and round it to 2 decimal places. Wikipedia reference: https://en.wikipedia.org/wiki/Kelvin Wikipedia reference: https://en.wikipedia.org/wiki/Rankine_scale >>> kelvin_to_rankine(273.354, 3) 492.037 >>> kelvin_to_rankine(273.354, 0) 492.0 >>> kelvin_to_rankine(0) 0.0 >>> kelvin_to_rankine(20.0) 36.0 >>> kelvin_to_rankine("40") 72.0 >>> kelvin_to_rankine("kelvin") Traceback (most recent call last): ... ValueError: could not convert string to float: 'kelvin' """ return round((float(kelvin) * 9 / 5), ndigits) def rankine_to_celsius(rankine: float, ndigits: int = 2) -> float: """ Convert a given value from Rankine to Celsius and round it to 2 decimal places. Wikipedia reference: https://en.wikipedia.org/wiki/Rankine_scale Wikipedia reference: https://en.wikipedia.org/wiki/Celsius >>> rankine_to_celsius(273.354, 3) -121.287 >>> rankine_to_celsius(273.354, 0) -121.0 >>> rankine_to_celsius(273.15) -121.4 >>> rankine_to_celsius(300) -106.48 >>> rankine_to_celsius("315.5") -97.87 >>> rankine_to_celsius("rankine") Traceback (most recent call last): ... 
ValueError: could not convert string to float: 'rankine' """ return round((float(rankine) - 491.67) * 5 / 9, ndigits) def rankine_to_fahrenheit(rankine: float, ndigits: int = 2) -> float: """ Convert a given value from Rankine to Fahrenheit and round it to 2 decimal places. Wikipedia reference: https://en.wikipedia.org/wiki/Rankine_scale Wikipedia reference: https://en.wikipedia.org/wiki/Fahrenheit >>> rankine_to_fahrenheit(273.15) -186.52 >>> rankine_to_fahrenheit(300) -159.67 >>> rankine_to_fahrenheit("315.5") -144.17 >>> rankine_to_fahrenheit("rankine") Traceback (most recent call last): ... ValueError: could not convert string to float: 'rankine' """ return round(float(rankine) - 459.67, ndigits) def rankine_to_kelvin(rankine: float, ndigits: int = 2) -> float: """ Convert a given value from Rankine to Kelvin and round it to 2 decimal places. Wikipedia reference: https://en.wikipedia.org/wiki/Rankine_scale Wikipedia reference: https://en.wikipedia.org/wiki/Kelvin >>> rankine_to_kelvin(0) 0.0 >>> rankine_to_kelvin(20.0) 11.11 >>> rankine_to_kelvin("40") 22.22 >>> rankine_to_kelvin("rankine") Traceback (most recent call last): ... ValueError: could not convert string to float: 'rankine' """ return round((float(rankine) * 5 / 9), ndigits) def reaumur_to_kelvin(reaumur: float, ndigits: int = 2) -> float: """ Convert a given value from reaumur to Kelvin and round it to 2 decimal places. Reference:- http://www.csgnetwork.com/temp2conv.html >>> reaumur_to_kelvin(0) 273.15 >>> reaumur_to_kelvin(20.0) 298.15 >>> reaumur_to_kelvin(40) 323.15 >>> reaumur_to_kelvin("reaumur") Traceback (most recent call last): ... ValueError: could not convert string to float: 'reaumur' """ return round((float(reaumur) * 1.25 + 273.15), ndigits) def reaumur_to_fahrenheit(reaumur: float, ndigits: int = 2) -> float: """ Convert a given value from reaumur to fahrenheit and round it to 2 decimal places. Reference:- http://www.csgnetwork.com/temp2conv.html >>> reaumur_to_fahrenheit(0) 32.0 >>> reaumur_to_fahrenheit(20.0) 77.0 >>> reaumur_to_fahrenheit(40) 122.0 >>> reaumur_to_fahrenheit("reaumur") Traceback (most recent call last): ... ValueError: could not convert string to float: 'reaumur' """ return round((float(reaumur) * 2.25 + 32), ndigits) def reaumur_to_celsius(reaumur: float, ndigits: int = 2) -> float: """ Convert a given value from reaumur to celsius and round it to 2 decimal places. Reference:- http://www.csgnetwork.com/temp2conv.html >>> reaumur_to_celsius(0) 0.0 >>> reaumur_to_celsius(20.0) 25.0 >>> reaumur_to_celsius(40) 50.0 >>> reaumur_to_celsius("reaumur") Traceback (most recent call last): ... ValueError: could not convert string to float: 'reaumur' """ return round((float(reaumur) * 1.25), ndigits) def reaumur_to_rankine(reaumur: float, ndigits: int = 2) -> float: """ Convert a given value from reaumur to rankine and round it to 2 decimal places. Reference:- http://www.csgnetwork.com/temp2conv.html >>> reaumur_to_rankine(0) 491.67 >>> reaumur_to_rankine(20.0) 536.67 >>> reaumur_to_rankine(40) 581.67 >>> reaumur_to_rankine("reaumur") Traceback (most recent call last): ... ValueError: could not convert string to float: 'reaumur' """ return round((float(reaumur) * 2.25 + 32 + 459.67), ndigits) if __name__ == "__main__": import doctest doctest.testmod()
""" Convert between different units of temperature """ def celsius_to_fahrenheit(celsius: float, ndigits: int = 2) -> float: """ Convert a given value from Celsius to Fahrenheit and round it to 2 decimal places. Wikipedia reference: https://en.wikipedia.org/wiki/Celsius Wikipedia reference: https://en.wikipedia.org/wiki/Fahrenheit >>> celsius_to_fahrenheit(273.354, 3) 524.037 >>> celsius_to_fahrenheit(273.354, 0) 524.0 >>> celsius_to_fahrenheit(-40.0) -40.0 >>> celsius_to_fahrenheit(-20.0) -4.0 >>> celsius_to_fahrenheit(0) 32.0 >>> celsius_to_fahrenheit(20) 68.0 >>> celsius_to_fahrenheit("40") 104.0 >>> celsius_to_fahrenheit("celsius") Traceback (most recent call last): ... ValueError: could not convert string to float: 'celsius' """ return round((float(celsius) * 9 / 5) + 32, ndigits) def celsius_to_kelvin(celsius: float, ndigits: int = 2) -> float: """ Convert a given value from Celsius to Kelvin and round it to 2 decimal places. Wikipedia reference: https://en.wikipedia.org/wiki/Celsius Wikipedia reference: https://en.wikipedia.org/wiki/Kelvin >>> celsius_to_kelvin(273.354, 3) 546.504 >>> celsius_to_kelvin(273.354, 0) 547.0 >>> celsius_to_kelvin(0) 273.15 >>> celsius_to_kelvin(20.0) 293.15 >>> celsius_to_kelvin("40") 313.15 >>> celsius_to_kelvin("celsius") Traceback (most recent call last): ... ValueError: could not convert string to float: 'celsius' """ return round(float(celsius) + 273.15, ndigits) def celsius_to_rankine(celsius: float, ndigits: int = 2) -> float: """ Convert a given value from Celsius to Rankine and round it to 2 decimal places. Wikipedia reference: https://en.wikipedia.org/wiki/Celsius Wikipedia reference: https://en.wikipedia.org/wiki/Rankine_scale >>> celsius_to_rankine(273.354, 3) 983.707 >>> celsius_to_rankine(273.354, 0) 984.0 >>> celsius_to_rankine(0) 491.67 >>> celsius_to_rankine(20.0) 527.67 >>> celsius_to_rankine("40") 563.67 >>> celsius_to_rankine("celsius") Traceback (most recent call last): ... ValueError: could not convert string to float: 'celsius' """ return round((float(celsius) * 9 / 5) + 491.67, ndigits) def fahrenheit_to_celsius(fahrenheit: float, ndigits: int = 2) -> float: """ Convert a given value from Fahrenheit to Celsius and round it to 2 decimal places. Wikipedia reference: https://en.wikipedia.org/wiki/Fahrenheit Wikipedia reference: https://en.wikipedia.org/wiki/Celsius >>> fahrenheit_to_celsius(273.354, 3) 134.086 >>> fahrenheit_to_celsius(273.354, 0) 134.0 >>> fahrenheit_to_celsius(0) -17.78 >>> fahrenheit_to_celsius(20.0) -6.67 >>> fahrenheit_to_celsius(40.0) 4.44 >>> fahrenheit_to_celsius(60) 15.56 >>> fahrenheit_to_celsius(80) 26.67 >>> fahrenheit_to_celsius("100") 37.78 >>> fahrenheit_to_celsius("fahrenheit") Traceback (most recent call last): ... ValueError: could not convert string to float: 'fahrenheit' """ return round((float(fahrenheit) - 32) * 5 / 9, ndigits) def fahrenheit_to_kelvin(fahrenheit: float, ndigits: int = 2) -> float: """ Convert a given value from Fahrenheit to Kelvin and round it to 2 decimal places. Wikipedia reference: https://en.wikipedia.org/wiki/Fahrenheit Wikipedia reference: https://en.wikipedia.org/wiki/Kelvin >>> fahrenheit_to_kelvin(273.354, 3) 407.236 >>> fahrenheit_to_kelvin(273.354, 0) 407.0 >>> fahrenheit_to_kelvin(0) 255.37 >>> fahrenheit_to_kelvin(20.0) 266.48 >>> fahrenheit_to_kelvin(40.0) 277.59 >>> fahrenheit_to_kelvin(60) 288.71 >>> fahrenheit_to_kelvin(80) 299.82 >>> fahrenheit_to_kelvin("100") 310.93 >>> fahrenheit_to_kelvin("fahrenheit") Traceback (most recent call last): ... 
ValueError: could not convert string to float: 'fahrenheit' """ return round(((float(fahrenheit) - 32) * 5 / 9) + 273.15, ndigits) def fahrenheit_to_rankine(fahrenheit: float, ndigits: int = 2) -> float: """ Convert a given value from Fahrenheit to Rankine and round it to 2 decimal places. Wikipedia reference: https://en.wikipedia.org/wiki/Fahrenheit Wikipedia reference: https://en.wikipedia.org/wiki/Rankine_scale >>> fahrenheit_to_rankine(273.354, 3) 733.024 >>> fahrenheit_to_rankine(273.354, 0) 733.0 >>> fahrenheit_to_rankine(0) 459.67 >>> fahrenheit_to_rankine(20.0) 479.67 >>> fahrenheit_to_rankine(40.0) 499.67 >>> fahrenheit_to_rankine(60) 519.67 >>> fahrenheit_to_rankine(80) 539.67 >>> fahrenheit_to_rankine("100") 559.67 >>> fahrenheit_to_rankine("fahrenheit") Traceback (most recent call last): ... ValueError: could not convert string to float: 'fahrenheit' """ return round(float(fahrenheit) + 459.67, ndigits) def kelvin_to_celsius(kelvin: float, ndigits: int = 2) -> float: """ Convert a given value from Kelvin to Celsius and round it to 2 decimal places. Wikipedia reference: https://en.wikipedia.org/wiki/Kelvin Wikipedia reference: https://en.wikipedia.org/wiki/Celsius >>> kelvin_to_celsius(273.354, 3) 0.204 >>> kelvin_to_celsius(273.354, 0) 0.0 >>> kelvin_to_celsius(273.15) 0.0 >>> kelvin_to_celsius(300) 26.85 >>> kelvin_to_celsius("315.5") 42.35 >>> kelvin_to_celsius("kelvin") Traceback (most recent call last): ... ValueError: could not convert string to float: 'kelvin' """ return round(float(kelvin) - 273.15, ndigits) def kelvin_to_fahrenheit(kelvin: float, ndigits: int = 2) -> float: """ Convert a given value from Kelvin to Fahrenheit and round it to 2 decimal places. Wikipedia reference: https://en.wikipedia.org/wiki/Kelvin Wikipedia reference: https://en.wikipedia.org/wiki/Fahrenheit >>> kelvin_to_fahrenheit(273.354, 3) 32.367 >>> kelvin_to_fahrenheit(273.354, 0) 32.0 >>> kelvin_to_fahrenheit(273.15) 32.0 >>> kelvin_to_fahrenheit(300) 80.33 >>> kelvin_to_fahrenheit("315.5") 108.23 >>> kelvin_to_fahrenheit("kelvin") Traceback (most recent call last): ... ValueError: could not convert string to float: 'kelvin' """ return round(((float(kelvin) - 273.15) * 9 / 5) + 32, ndigits) def kelvin_to_rankine(kelvin: float, ndigits: int = 2) -> float: """ Convert a given value from Kelvin to Rankine and round it to 2 decimal places. Wikipedia reference: https://en.wikipedia.org/wiki/Kelvin Wikipedia reference: https://en.wikipedia.org/wiki/Rankine_scale >>> kelvin_to_rankine(273.354, 3) 492.037 >>> kelvin_to_rankine(273.354, 0) 492.0 >>> kelvin_to_rankine(0) 0.0 >>> kelvin_to_rankine(20.0) 36.0 >>> kelvin_to_rankine("40") 72.0 >>> kelvin_to_rankine("kelvin") Traceback (most recent call last): ... ValueError: could not convert string to float: 'kelvin' """ return round((float(kelvin) * 9 / 5), ndigits) def rankine_to_celsius(rankine: float, ndigits: int = 2) -> float: """ Convert a given value from Rankine to Celsius and round it to 2 decimal places. Wikipedia reference: https://en.wikipedia.org/wiki/Rankine_scale Wikipedia reference: https://en.wikipedia.org/wiki/Celsius >>> rankine_to_celsius(273.354, 3) -121.287 >>> rankine_to_celsius(273.354, 0) -121.0 >>> rankine_to_celsius(273.15) -121.4 >>> rankine_to_celsius(300) -106.48 >>> rankine_to_celsius("315.5") -97.87 >>> rankine_to_celsius("rankine") Traceback (most recent call last): ... 
ValueError: could not convert string to float: 'rankine' """ return round((float(rankine) - 491.67) * 5 / 9, ndigits) def rankine_to_fahrenheit(rankine: float, ndigits: int = 2) -> float: """ Convert a given value from Rankine to Fahrenheit and round it to 2 decimal places. Wikipedia reference: https://en.wikipedia.org/wiki/Rankine_scale Wikipedia reference: https://en.wikipedia.org/wiki/Fahrenheit >>> rankine_to_fahrenheit(273.15) -186.52 >>> rankine_to_fahrenheit(300) -159.67 >>> rankine_to_fahrenheit("315.5") -144.17 >>> rankine_to_fahrenheit("rankine") Traceback (most recent call last): ... ValueError: could not convert string to float: 'rankine' """ return round(float(rankine) - 459.67, ndigits) def rankine_to_kelvin(rankine: float, ndigits: int = 2) -> float: """ Convert a given value from Rankine to Kelvin and round it to 2 decimal places. Wikipedia reference: https://en.wikipedia.org/wiki/Rankine_scale Wikipedia reference: https://en.wikipedia.org/wiki/Kelvin >>> rankine_to_kelvin(0) 0.0 >>> rankine_to_kelvin(20.0) 11.11 >>> rankine_to_kelvin("40") 22.22 >>> rankine_to_kelvin("rankine") Traceback (most recent call last): ... ValueError: could not convert string to float: 'rankine' """ return round((float(rankine) * 5 / 9), ndigits) def reaumur_to_kelvin(reaumur: float, ndigits: int = 2) -> float: """ Convert a given value from reaumur to Kelvin and round it to 2 decimal places. Reference:- http://www.csgnetwork.com/temp2conv.html >>> reaumur_to_kelvin(0) 273.15 >>> reaumur_to_kelvin(20.0) 298.15 >>> reaumur_to_kelvin(40) 323.15 >>> reaumur_to_kelvin("reaumur") Traceback (most recent call last): ... ValueError: could not convert string to float: 'reaumur' """ return round((float(reaumur) * 1.25 + 273.15), ndigits) def reaumur_to_fahrenheit(reaumur: float, ndigits: int = 2) -> float: """ Convert a given value from reaumur to fahrenheit and round it to 2 decimal places. Reference:- http://www.csgnetwork.com/temp2conv.html >>> reaumur_to_fahrenheit(0) 32.0 >>> reaumur_to_fahrenheit(20.0) 77.0 >>> reaumur_to_fahrenheit(40) 122.0 >>> reaumur_to_fahrenheit("reaumur") Traceback (most recent call last): ... ValueError: could not convert string to float: 'reaumur' """ return round((float(reaumur) * 2.25 + 32), ndigits) def reaumur_to_celsius(reaumur: float, ndigits: int = 2) -> float: """ Convert a given value from reaumur to celsius and round it to 2 decimal places. Reference:- http://www.csgnetwork.com/temp2conv.html >>> reaumur_to_celsius(0) 0.0 >>> reaumur_to_celsius(20.0) 25.0 >>> reaumur_to_celsius(40) 50.0 >>> reaumur_to_celsius("reaumur") Traceback (most recent call last): ... ValueError: could not convert string to float: 'reaumur' """ return round((float(reaumur) * 1.25), ndigits) def reaumur_to_rankine(reaumur: float, ndigits: int = 2) -> float: """ Convert a given value from reaumur to rankine and round it to 2 decimal places. Reference:- http://www.csgnetwork.com/temp2conv.html >>> reaumur_to_rankine(0) 491.67 >>> reaumur_to_rankine(20.0) 536.67 >>> reaumur_to_rankine(40) 581.67 >>> reaumur_to_rankine("reaumur") Traceback (most recent call last): ... ValueError: could not convert string to float: 'reaumur' """ return round((float(reaumur) * 2.25 + 32 + 459.67), ndigits) if __name__ == "__main__": import doctest doctest.testmod()
-1
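A few spot checks of the converters above, assuming the functions are in scope (for example via an import of the file); the expected values follow directly from the documented formulas.

print(celsius_to_fahrenheit(37))  # 98.6
print(kelvin_to_celsius(0))       # -273.15
print(rankine_to_kelvin(491.67))  # 273.15
print(reaumur_to_celsius(80))     # 100.0
# Round-trip sanity check: Celsius -> Fahrenheit -> Celsius.
assert fahrenheit_to_celsius(celsius_to_fahrenheit(25.0)) == 25.0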
TheAlgorithms/Python
7,844
Remove file-level flake8 suppression
### Describe your change: Some files were completely excluded from flake8 checks. - Remove file-level suppressions - Fix simple warnings - Suppress long line indocstring * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
Cjkjvfnby
"2022-10-29T13:08:44Z"
"2022-10-29T20:45:21Z"
47ddba1d914bf5955a244056e794e718dee9ead1
1550731cb7457ddae216da2ffe0bc1587f5234f3
Remove file-level flake8 suppression. ### Describe your change: Some files were completely excluded from flake8 checks. - Remove file-level suppressions - Fix simple warnings - Suppress long line indocstring * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
-1
TheAlgorithms/Python
7,844
Remove file-level flake8 suppression
### Describe your change: Some files were completely excluded from flake8 checks. - Remove file-level suppressions - Fix simple warnings - Suppress long line indocstring * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
Cjkjvfnby
"2022-10-29T13:08:44Z"
"2022-10-29T20:45:21Z"
47ddba1d914bf5955a244056e794e718dee9ead1
1550731cb7457ddae216da2ffe0bc1587f5234f3
Remove file-level flake8 suppression. ### Describe your change: Some files were completely excluded from flake8 checks. - Remove file-level suppressions - Fix simple warnings - Suppress long line indocstring * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
from __future__ import annotations

import requests


def get_hackernews_story(story_id: str) -> dict:
    url = f"https://hacker-news.firebaseio.com/v0/item/{story_id}.json?print=pretty"
    return requests.get(url).json()


def hackernews_top_stories(max_stories: int = 10) -> list[dict]:
    """
    Get the top max_stories posts from HackerNews - https://news.ycombinator.com/
    """
    url = "https://hacker-news.firebaseio.com/v0/topstories.json?print=pretty"
    story_ids = requests.get(url).json()[:max_stories]
    return [get_hackernews_story(story_id) for story_id in story_ids]


def hackernews_top_stories_as_markdown(max_stories: int = 10) -> str:
    stories = hackernews_top_stories(max_stories)
    return "\n".join("* [{title}]({url})".format(**story) for story in stories)


if __name__ == "__main__":
    print(hackernews_top_stories_as_markdown())
from __future__ import annotations

import requests


def get_hackernews_story(story_id: str) -> dict:
    url = f"https://hacker-news.firebaseio.com/v0/item/{story_id}.json?print=pretty"
    return requests.get(url).json()


def hackernews_top_stories(max_stories: int = 10) -> list[dict]:
    """
    Get the top max_stories posts from HackerNews - https://news.ycombinator.com/
    """
    url = "https://hacker-news.firebaseio.com/v0/topstories.json?print=pretty"
    story_ids = requests.get(url).json()[:max_stories]
    return [get_hackernews_story(story_id) for story_id in story_ids]


def hackernews_top_stories_as_markdown(max_stories: int = 10) -> str:
    stories = hackernews_top_stories(max_stories)
    return "\n".join("* [{title}]({url})".format(**story) for story in stories)


if __name__ == "__main__":
    print(hackernews_top_stories_as_markdown())
-1
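The markdown line above is produced with str.format(**story); a tiny offline sketch using an invented story dict, so no network call is needed.

story = {"title": "Example story", "url": "https://example.com", "score": 42}
print("* [{title}]({url})".format(**story))  # * [Example story](https://example.com)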
TheAlgorithms/Python
7,844
Remove file-level flake8 suppression
### Describe your change: Some files were completely excluded from flake8 checks. - Remove file-level suppressions - Fix simple warnings - Suppress long line indocstring * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
Cjkjvfnby
"2022-10-29T13:08:44Z"
"2022-10-29T20:45:21Z"
47ddba1d914bf5955a244056e794e718dee9ead1
1550731cb7457ddae216da2ffe0bc1587f5234f3
Remove file-level flake8 suppression. ### Describe your change: Some files were completely excluded from flake8 checks. - Remove file-level suppressions - Fix simple warnings - Suppress long line indocstring * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
""" Author : Yvonne This is a pure Python implementation of Dynamic Programming solution to the longest_sub_array problem. The problem is : Given an array, to find the longest and continuous sub array and get the max sum of the sub array in the given array. """ class SubArray: def __init__(self, arr): # we need a list not a string, so do something to change the type self.array = arr.split(",") def solve_sub_array(self): rear = [int(self.array[0])] * len(self.array) sum_value = [int(self.array[0])] * len(self.array) for i in range(1, len(self.array)): sum_value[i] = max( int(self.array[i]) + sum_value[i - 1], int(self.array[i]) ) rear[i] = max(sum_value[i], rear[i - 1]) return rear[len(self.array) - 1] if __name__ == "__main__": whole_array = input("please input some numbers:") array = SubArray(whole_array) re = array.solve_sub_array() print(("the results is:", re))
""" Author : Yvonne This is a pure Python implementation of Dynamic Programming solution to the longest_sub_array problem. The problem is : Given an array, to find the longest and continuous sub array and get the max sum of the sub array in the given array. """ class SubArray: def __init__(self, arr): # we need a list not a string, so do something to change the type self.array = arr.split(",") def solve_sub_array(self): rear = [int(self.array[0])] * len(self.array) sum_value = [int(self.array[0])] * len(self.array) for i in range(1, len(self.array)): sum_value[i] = max( int(self.array[i]) + sum_value[i - 1], int(self.array[i]) ) rear[i] = max(sum_value[i], rear[i - 1]) return rear[len(self.array) - 1] if __name__ == "__main__": whole_array = input("please input some numbers:") array = SubArray(whole_array) re = array.solve_sub_array() print(("the results is:", re))
-1
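A quick usage sketch, assuming the SubArray class above is in scope.

sub_array = SubArray("1,-2,3,4,-1")
print(sub_array.solve_sub_array())  # 7, the best contiguous slice being 3 + 4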
TheAlgorithms/Python
7,844
Remove file-level flake8 suppression
### Describe your change: Some files were completely excluded from flake8 checks. - Remove file-level suppressions - Fix simple warnings - Suppress long line indocstring * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
Cjkjvfnby
"2022-10-29T13:08:44Z"
"2022-10-29T20:45:21Z"
47ddba1d914bf5955a244056e794e718dee9ead1
1550731cb7457ddae216da2ffe0bc1587f5234f3
Remove file-level flake8 suppression. ### Describe your change: Some files were completely excluded from flake8 checks. - Remove file-level suppressions - Fix simple warnings - Suppress long line indocstring * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
"""Matrix Exponentiation""" import timeit """ Matrix Exponentiation is a technique to solve linear recurrences in logarithmic time. You read more about it here: https://zobayer.blogspot.com/2010/11/matrix-exponentiation.html https://www.hackerearth.com/practice/notes/matrix-exponentiation-1/ """ class Matrix: def __init__(self, arg): if isinstance(arg, list): # Initializes a matrix identical to the one provided. self.t = arg self.n = len(arg) else: # Initializes a square matrix of the given size and set values to zero. self.n = arg self.t = [[0 for _ in range(self.n)] for _ in range(self.n)] def __mul__(self, b): matrix = Matrix(self.n) for i in range(self.n): for j in range(self.n): for k in range(self.n): matrix.t[i][j] += self.t[i][k] * b.t[k][j] return matrix def modular_exponentiation(a, b): matrix = Matrix([[1, 0], [0, 1]]) while b > 0: if b & 1: matrix *= a a *= a b >>= 1 return matrix def fibonacci_with_matrix_exponentiation(n, f1, f2): # Trivial Cases if n == 1: return f1 elif n == 2: return f2 matrix = Matrix([[1, 1], [1, 0]]) matrix = modular_exponentiation(matrix, n - 2) return f2 * matrix.t[0][0] + f1 * matrix.t[0][1] def simple_fibonacci(n, f1, f2): # Trivial Cases if n == 1: return f1 elif n == 2: return f2 fn_1 = f1 fn_2 = f2 n -= 2 while n > 0: fn_1, fn_2 = fn_1 + fn_2, fn_1 n -= 1 return fn_1 def matrix_exponentiation_time(): setup = """ from random import randint from __main__ import fibonacci_with_matrix_exponentiation """ code = "fibonacci_with_matrix_exponentiation(randint(1,70000), 1, 1)" exec_time = timeit.timeit(setup=setup, stmt=code, number=100) print("With matrix exponentiation the average execution time is ", exec_time / 100) return exec_time def simple_fibonacci_time(): setup = """ from random import randint from __main__ import simple_fibonacci """ code = "simple_fibonacci(randint(1,70000), 1, 1)" exec_time = timeit.timeit(setup=setup, stmt=code, number=100) print( "Without matrix exponentiation the average execution time is ", exec_time / 100 ) return exec_time def main(): matrix_exponentiation_time() simple_fibonacci_time() if __name__ == "__main__": main()
"""Matrix Exponentiation""" import timeit """ Matrix Exponentiation is a technique to solve linear recurrences in logarithmic time. You read more about it here: https://zobayer.blogspot.com/2010/11/matrix-exponentiation.html https://www.hackerearth.com/practice/notes/matrix-exponentiation-1/ """ class Matrix: def __init__(self, arg): if isinstance(arg, list): # Initializes a matrix identical to the one provided. self.t = arg self.n = len(arg) else: # Initializes a square matrix of the given size and set values to zero. self.n = arg self.t = [[0 for _ in range(self.n)] for _ in range(self.n)] def __mul__(self, b): matrix = Matrix(self.n) for i in range(self.n): for j in range(self.n): for k in range(self.n): matrix.t[i][j] += self.t[i][k] * b.t[k][j] return matrix def modular_exponentiation(a, b): matrix = Matrix([[1, 0], [0, 1]]) while b > 0: if b & 1: matrix *= a a *= a b >>= 1 return matrix def fibonacci_with_matrix_exponentiation(n, f1, f2): # Trivial Cases if n == 1: return f1 elif n == 2: return f2 matrix = Matrix([[1, 1], [1, 0]]) matrix = modular_exponentiation(matrix, n - 2) return f2 * matrix.t[0][0] + f1 * matrix.t[0][1] def simple_fibonacci(n, f1, f2): # Trivial Cases if n == 1: return f1 elif n == 2: return f2 fn_1 = f1 fn_2 = f2 n -= 2 while n > 0: fn_1, fn_2 = fn_1 + fn_2, fn_1 n -= 1 return fn_1 def matrix_exponentiation_time(): setup = """ from random import randint from __main__ import fibonacci_with_matrix_exponentiation """ code = "fibonacci_with_matrix_exponentiation(randint(1,70000), 1, 1)" exec_time = timeit.timeit(setup=setup, stmt=code, number=100) print("With matrix exponentiation the average execution time is ", exec_time / 100) return exec_time def simple_fibonacci_time(): setup = """ from random import randint from __main__ import simple_fibonacci """ code = "simple_fibonacci(randint(1,70000), 1, 1)" exec_time = timeit.timeit(setup=setup, stmt=code, number=100) print( "Without matrix exponentiation the average execution time is ", exec_time / 100 ) return exec_time def main(): matrix_exponentiation_time() simple_fibonacci_time() if __name__ == "__main__": main()
-1
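A small consistency check, assuming the Matrix class and both Fibonacci functions above are in scope; the two code paths should agree.

m = modular_exponentiation(Matrix([[1, 1], [1, 0]]), 8)
print(m.t)  # [[34, 21], [21, 13]]
assert fibonacci_with_matrix_exponentiation(10, 1, 1) == simple_fibonacci(10, 1, 1) == 55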
TheAlgorithms/Python
7,844
Remove file-level flake8 suppression
### Describe your change: Some files were completely excluded from flake8 checks. - Remove file-level suppressions - Fix simple warnings - Suppress long line indocstring * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
Cjkjvfnby
"2022-10-29T13:08:44Z"
"2022-10-29T20:45:21Z"
47ddba1d914bf5955a244056e794e718dee9ead1
1550731cb7457ddae216da2ffe0bc1587f5234f3
Remove file-level flake8 suppression. ### Describe your change: Some files were completely excluded from flake8 checks. - Remove file-level suppressions - Fix simple warnings - Suppress long line indocstring * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
""" The convex hull problem is problem of finding all the vertices of convex polygon, P of a set of points in a plane such that all the points are either on the vertices of P or inside P. TH convex hull problem has several applications in geometrical problems, computer graphics and game development. Two algorithms have been implemented for the convex hull problem here. 1. A brute-force algorithm which runs in O(n^3) 2. A divide-and-conquer algorithm which runs in O(n log(n)) There are other several other algorithms for the convex hull problem which have not been implemented here, yet. """ from __future__ import annotations from collections.abc import Iterable class Point: """ Defines a 2-d point for use by all convex-hull algorithms. Parameters ---------- x: an int or a float, the x-coordinate of the 2-d point y: an int or a float, the y-coordinate of the 2-d point Examples -------- >>> Point(1, 2) (1.0, 2.0) >>> Point("1", "2") (1.0, 2.0) >>> Point(1, 2) > Point(0, 1) True >>> Point(1, 1) == Point(1, 1) True >>> Point(-0.5, 1) == Point(0.5, 1) False >>> Point("pi", "e") Traceback (most recent call last): ... ValueError: could not convert string to float: 'pi' """ def __init__(self, x, y): self.x, self.y = float(x), float(y) def __eq__(self, other): return self.x == other.x and self.y == other.y def __ne__(self, other): return not self == other def __gt__(self, other): if self.x > other.x: return True elif self.x == other.x: return self.y > other.y return False def __lt__(self, other): return not self > other def __ge__(self, other): if self.x > other.x: return True elif self.x == other.x: return self.y >= other.y return False def __le__(self, other): if self.x < other.x: return True elif self.x == other.x: return self.y <= other.y return False def __repr__(self): return f"({self.x}, {self.y})" def __hash__(self): return hash(self.x) def _construct_points( list_of_tuples: list[Point] | list[list[float]] | Iterable[list[float]], ) -> list[Point]: """ constructs a list of points from an array-like object of numbers Arguments --------- list_of_tuples: array-like object of type numbers. Acceptable types so far are lists, tuples and sets. Returns -------- points: a list where each item is of type Point. This contains only objects which can be converted into a Point. Examples ------- >>> _construct_points([[1, 1], [2, -1], [0.3, 4]]) [(1.0, 1.0), (2.0, -1.0), (0.3, 4.0)] >>> _construct_points([1, 2]) Ignoring deformed point 1. All points must have at least 2 coordinates. Ignoring deformed point 2. All points must have at least 2 coordinates. [] >>> _construct_points([]) [] >>> _construct_points(None) [] """ points: list[Point] = [] if list_of_tuples: for p in list_of_tuples: if isinstance(p, Point): points.append(p) else: try: points.append(Point(p[0], p[1])) except (IndexError, TypeError): print( f"Ignoring deformed point {p}. All points" " must have at least 2 coordinates." ) return points def _validate_input(points: list[Point] | list[list[float]]) -> list[Point]: """ validates an input instance before a convex-hull algorithms uses it Parameters --------- points: array-like, the 2d points to validate before using with a convex-hull algorithm. The elements of points must be either lists, tuples or Points. Returns ------- points: array_like, an iterable of all well-defined Points constructed passed in. Exception --------- ValueError: if points is empty or None, or if a wrong data structure like a scalar is passed TypeError: if an iterable but non-indexable object (eg. dictionary) is passed. 
The exception to this a set which we'll convert to a list before using Examples ------- >>> _validate_input([[1, 2]]) [(1.0, 2.0)] >>> _validate_input([(1, 2)]) [(1.0, 2.0)] >>> _validate_input([Point(2, 1), Point(-1, 2)]) [(2.0, 1.0), (-1.0, 2.0)] >>> _validate_input([]) Traceback (most recent call last): ... ValueError: Expecting a list of points but got [] >>> _validate_input(1) Traceback (most recent call last): ... ValueError: Expecting an iterable object but got an non-iterable type 1 """ if not hasattr(points, "__iter__"): raise ValueError( f"Expecting an iterable object but got an non-iterable type {points}" ) if not points: raise ValueError(f"Expecting a list of points but got {points}") return _construct_points(points) def _det(a: Point, b: Point, c: Point) -> float: """ Computes the sign perpendicular distance of a 2d point c from a line segment ab. The sign indicates the direction of c relative to ab. A Positive value means c is above ab (to the left), while a negative value means c is below ab (to the right). 0 means all three points are on a straight line. As a side note, 0.5 * abs|det| is the area of triangle abc Parameters ---------- a: point, the point on the left end of line segment ab b: point, the point on the right end of line segment ab c: point, the point for which the direction and location is desired. Returns -------- det: float, abs(det) is the distance of c from ab. The sign indicates which side of line segment ab c is. det is computed as (a_xb_y + c_xa_y + b_xc_y) - (a_yb_x + c_ya_x + b_yc_x) Examples ---------- >>> _det(Point(1, 1), Point(1, 2), Point(1, 5)) 0.0 >>> _det(Point(0, 0), Point(10, 0), Point(0, 10)) 100.0 >>> _det(Point(0, 0), Point(10, 0), Point(0, -10)) -100.0 """ det = (a.x * b.y + b.x * c.y + c.x * a.y) - (a.y * b.x + b.y * c.x + c.y * a.x) return det def convex_hull_bf(points: list[Point]) -> list[Point]: """ Constructs the convex hull of a set of 2D points using a brute force algorithm. The algorithm basically considers all combinations of points (i, j) and uses the definition of convexity to determine whether (i, j) is part of the convex hull or not. (i, j) is part of the convex hull if and only iff there are no points on both sides of the line segment connecting the ij, and there is no point k such that k is on either end of the ij. Runtime: O(n^3) - definitely horrible Parameters --------- points: array-like of object of Points, lists or tuples. The set of 2d points for which the convex-hull is needed Returns ------ convex_set: list, the convex-hull of points sorted in non-decreasing order. See Also -------- convex_hull_recursive, Examples --------- >>> convex_hull_bf([[0, 0], [1, 0], [10, 1]]) [(0.0, 0.0), (1.0, 0.0), (10.0, 1.0)] >>> convex_hull_bf([[0, 0], [1, 0], [10, 0]]) [(0.0, 0.0), (10.0, 0.0)] >>> convex_hull_bf([[-1, 1],[-1, -1], [0, 0], [0.5, 0.5], [1, -1], [1, 1], ... [-0.75, 1]]) [(-1.0, -1.0), (-1.0, 1.0), (1.0, -1.0), (1.0, 1.0)] >>> convex_hull_bf([(0, 3), (2, 2), (1, 1), (2, 1), (3, 0), (0, 0), (3, 3), ... 
(2, -1), (2, -4), (1, -3)]) [(0.0, 0.0), (0.0, 3.0), (1.0, -3.0), (2.0, -4.0), (3.0, 0.0), (3.0, 3.0)] """ points = sorted(_validate_input(points)) n = len(points) convex_set = set() for i in range(n - 1): for j in range(i + 1, n): points_left_of_ij = points_right_of_ij = False ij_part_of_convex_hull = True for k in range(n): if k != i and k != j: det_k = _det(points[i], points[j], points[k]) if det_k > 0: points_left_of_ij = True elif det_k < 0: points_right_of_ij = True else: # point[i], point[j], point[k] all lie on a straight line # if point[k] is to the left of point[i] or it's to the # right of point[j], then point[i], point[j] cannot be # part of the convex hull of A if points[k] < points[i] or points[k] > points[j]: ij_part_of_convex_hull = False break if points_left_of_ij and points_right_of_ij: ij_part_of_convex_hull = False break if ij_part_of_convex_hull: convex_set.update([points[i], points[j]]) return sorted(convex_set) def convex_hull_recursive(points: list[Point]) -> list[Point]: """ Constructs the convex hull of a set of 2D points using a divide-and-conquer strategy The algorithm exploits the geometric properties of the problem by repeatedly partitioning the set of points into smaller hulls, and finding the convex hull of these smaller hulls. The union of the convex hull from smaller hulls is the solution to the convex hull of the larger problem. Parameter --------- points: array-like of object of Points, lists or tuples. The set of 2d points for which the convex-hull is needed Runtime: O(n log n) Returns ------- convex_set: list, the convex-hull of points sorted in non-decreasing order. Examples --------- >>> convex_hull_recursive([[0, 0], [1, 0], [10, 1]]) [(0.0, 0.0), (1.0, 0.0), (10.0, 1.0)] >>> convex_hull_recursive([[0, 0], [1, 0], [10, 0]]) [(0.0, 0.0), (10.0, 0.0)] >>> convex_hull_recursive([[-1, 1],[-1, -1], [0, 0], [0.5, 0.5], [1, -1], [1, 1], ... [-0.75, 1]]) [(-1.0, -1.0), (-1.0, 1.0), (1.0, -1.0), (1.0, 1.0)] >>> convex_hull_recursive([(0, 3), (2, 2), (1, 1), (2, 1), (3, 0), (0, 0), (3, 3), ... (2, -1), (2, -4), (1, -3)]) [(0.0, 0.0), (0.0, 3.0), (1.0, -3.0), (2.0, -4.0), (3.0, 0.0), (3.0, 3.0)] """ points = sorted(_validate_input(points)) n = len(points) # divide all the points into an upper hull and a lower hull # the left most point and the right most point are definitely # members of the convex hull by definition. # use these two anchors to divide all the points into two hulls, # an upper hull and a lower hull. 
# all points to the left (above) the line joining the extreme points belong to the # upper hull # all points to the right (below) the line joining the extreme points below to the # lower hull # ignore all points on the line joining the extreme points since they cannot be # part of the convex hull left_most_point = points[0] right_most_point = points[n - 1] convex_set = {left_most_point, right_most_point} upper_hull = [] lower_hull = [] for i in range(1, n - 1): det = _det(left_most_point, right_most_point, points[i]) if det > 0: upper_hull.append(points[i]) elif det < 0: lower_hull.append(points[i]) _construct_hull(upper_hull, left_most_point, right_most_point, convex_set) _construct_hull(lower_hull, right_most_point, left_most_point, convex_set) return sorted(convex_set) def _construct_hull( points: list[Point], left: Point, right: Point, convex_set: set[Point] ) -> None: """ Parameters --------- points: list or None, the hull of points from which to choose the next convex-hull point left: Point, the point to the left of line segment joining left and right right: The point to the right of the line segment joining left and right convex_set: set, the current convex-hull. The state of convex-set gets updated by this function Note ---- For the line segment 'ab', 'a' is on the left and 'b' on the right. but the reverse is true for the line segment 'ba'. Returns ------- Nothing, only updates the state of convex-set """ if points: extreme_point = None extreme_point_distance = float("-inf") candidate_points = [] for p in points: det = _det(left, right, p) if det > 0: candidate_points.append(p) if det > extreme_point_distance: extreme_point_distance = det extreme_point = p if extreme_point: _construct_hull(candidate_points, left, extreme_point, convex_set) convex_set.add(extreme_point) _construct_hull(candidate_points, extreme_point, right, convex_set) def convex_hull_melkman(points: list[Point]) -> list[Point]: """ Constructs the convex hull of a set of 2D points using the melkman algorithm. The algorithm works by iteratively inserting points of a simple polygonal chain (meaning that no line segments between two consecutive points cross each other). Sorting the points yields such a polygonal chain. For a detailed description, see http://cgm.cs.mcgill.ca/~athens/cs601/Melkman.html Runtime: O(n log n) - O(n) if points are already sorted in the input Parameters --------- points: array-like of object of Points, lists or tuples. The set of 2d points for which the convex-hull is needed Returns ------ convex_set: list, the convex-hull of points sorted in non-decreasing order. See Also -------- Examples --------- >>> convex_hull_melkman([[0, 0], [1, 0], [10, 1]]) [(0.0, 0.0), (1.0, 0.0), (10.0, 1.0)] >>> convex_hull_melkman([[0, 0], [1, 0], [10, 0]]) [(0.0, 0.0), (10.0, 0.0)] >>> convex_hull_melkman([[-1, 1],[-1, -1], [0, 0], [0.5, 0.5], [1, -1], [1, 1], ... [-0.75, 1]]) [(-1.0, -1.0), (-1.0, 1.0), (1.0, -1.0), (1.0, 1.0)] >>> convex_hull_melkman([(0, 3), (2, 2), (1, 1), (2, 1), (3, 0), (0, 0), (3, 3), ... 
(2, -1), (2, -4), (1, -3)]) [(0.0, 0.0), (0.0, 3.0), (1.0, -3.0), (2.0, -4.0), (3.0, 0.0), (3.0, 3.0)] """ points = sorted(_validate_input(points)) n = len(points) convex_hull = points[:2] for i in range(2, n): det = _det(convex_hull[1], convex_hull[0], points[i]) if det > 0: convex_hull.insert(0, points[i]) break elif det < 0: convex_hull.append(points[i]) break else: convex_hull[1] = points[i] i += 1 for j in range(i, n): if ( _det(convex_hull[0], convex_hull[-1], points[j]) > 0 and _det(convex_hull[-1], convex_hull[0], points[1]) < 0 ): # The point lies within the convex hull continue convex_hull.insert(0, points[j]) convex_hull.append(points[j]) while _det(convex_hull[0], convex_hull[1], convex_hull[2]) >= 0: del convex_hull[1] while _det(convex_hull[-1], convex_hull[-2], convex_hull[-3]) <= 0: del convex_hull[-2] # `convex_hull` is contains the convex hull in circular order return sorted(convex_hull[1:] if len(convex_hull) > 3 else convex_hull) def main(): points = [ (0, 3), (2, 2), (1, 1), (2, 1), (3, 0), (0, 0), (3, 3), (2, -1), (2, -4), (1, -3), ] # the convex set of points is # [(0, 0), (0, 3), (1, -3), (2, -4), (3, 0), (3, 3)] results_bf = convex_hull_bf(points) results_recursive = convex_hull_recursive(points) assert results_bf == results_recursive results_melkman = convex_hull_melkman(points) assert results_bf == results_melkman print(results_bf) if __name__ == "__main__": main()
""" The convex hull problem is problem of finding all the vertices of convex polygon, P of a set of points in a plane such that all the points are either on the vertices of P or inside P. TH convex hull problem has several applications in geometrical problems, computer graphics and game development. Two algorithms have been implemented for the convex hull problem here. 1. A brute-force algorithm which runs in O(n^3) 2. A divide-and-conquer algorithm which runs in O(n log(n)) There are other several other algorithms for the convex hull problem which have not been implemented here, yet. """ from __future__ import annotations from collections.abc import Iterable class Point: """ Defines a 2-d point for use by all convex-hull algorithms. Parameters ---------- x: an int or a float, the x-coordinate of the 2-d point y: an int or a float, the y-coordinate of the 2-d point Examples -------- >>> Point(1, 2) (1.0, 2.0) >>> Point("1", "2") (1.0, 2.0) >>> Point(1, 2) > Point(0, 1) True >>> Point(1, 1) == Point(1, 1) True >>> Point(-0.5, 1) == Point(0.5, 1) False >>> Point("pi", "e") Traceback (most recent call last): ... ValueError: could not convert string to float: 'pi' """ def __init__(self, x, y): self.x, self.y = float(x), float(y) def __eq__(self, other): return self.x == other.x and self.y == other.y def __ne__(self, other): return not self == other def __gt__(self, other): if self.x > other.x: return True elif self.x == other.x: return self.y > other.y return False def __lt__(self, other): return not self > other def __ge__(self, other): if self.x > other.x: return True elif self.x == other.x: return self.y >= other.y return False def __le__(self, other): if self.x < other.x: return True elif self.x == other.x: return self.y <= other.y return False def __repr__(self): return f"({self.x}, {self.y})" def __hash__(self): return hash(self.x) def _construct_points( list_of_tuples: list[Point] | list[list[float]] | Iterable[list[float]], ) -> list[Point]: """ constructs a list of points from an array-like object of numbers Arguments --------- list_of_tuples: array-like object of type numbers. Acceptable types so far are lists, tuples and sets. Returns -------- points: a list where each item is of type Point. This contains only objects which can be converted into a Point. Examples ------- >>> _construct_points([[1, 1], [2, -1], [0.3, 4]]) [(1.0, 1.0), (2.0, -1.0), (0.3, 4.0)] >>> _construct_points([1, 2]) Ignoring deformed point 1. All points must have at least 2 coordinates. Ignoring deformed point 2. All points must have at least 2 coordinates. [] >>> _construct_points([]) [] >>> _construct_points(None) [] """ points: list[Point] = [] if list_of_tuples: for p in list_of_tuples: if isinstance(p, Point): points.append(p) else: try: points.append(Point(p[0], p[1])) except (IndexError, TypeError): print( f"Ignoring deformed point {p}. All points" " must have at least 2 coordinates." ) return points def _validate_input(points: list[Point] | list[list[float]]) -> list[Point]: """ validates an input instance before a convex-hull algorithms uses it Parameters --------- points: array-like, the 2d points to validate before using with a convex-hull algorithm. The elements of points must be either lists, tuples or Points. Returns ------- points: array_like, an iterable of all well-defined Points constructed passed in. Exception --------- ValueError: if points is empty or None, or if a wrong data structure like a scalar is passed TypeError: if an iterable but non-indexable object (eg. dictionary) is passed. 
The exception to this a set which we'll convert to a list before using Examples ------- >>> _validate_input([[1, 2]]) [(1.0, 2.0)] >>> _validate_input([(1, 2)]) [(1.0, 2.0)] >>> _validate_input([Point(2, 1), Point(-1, 2)]) [(2.0, 1.0), (-1.0, 2.0)] >>> _validate_input([]) Traceback (most recent call last): ... ValueError: Expecting a list of points but got [] >>> _validate_input(1) Traceback (most recent call last): ... ValueError: Expecting an iterable object but got an non-iterable type 1 """ if not hasattr(points, "__iter__"): raise ValueError( f"Expecting an iterable object but got an non-iterable type {points}" ) if not points: raise ValueError(f"Expecting a list of points but got {points}") return _construct_points(points) def _det(a: Point, b: Point, c: Point) -> float: """ Computes the sign perpendicular distance of a 2d point c from a line segment ab. The sign indicates the direction of c relative to ab. A Positive value means c is above ab (to the left), while a negative value means c is below ab (to the right). 0 means all three points are on a straight line. As a side note, 0.5 * abs|det| is the area of triangle abc Parameters ---------- a: point, the point on the left end of line segment ab b: point, the point on the right end of line segment ab c: point, the point for which the direction and location is desired. Returns -------- det: float, abs(det) is the distance of c from ab. The sign indicates which side of line segment ab c is. det is computed as (a_xb_y + c_xa_y + b_xc_y) - (a_yb_x + c_ya_x + b_yc_x) Examples ---------- >>> _det(Point(1, 1), Point(1, 2), Point(1, 5)) 0.0 >>> _det(Point(0, 0), Point(10, 0), Point(0, 10)) 100.0 >>> _det(Point(0, 0), Point(10, 0), Point(0, -10)) -100.0 """ det = (a.x * b.y + b.x * c.y + c.x * a.y) - (a.y * b.x + b.y * c.x + c.y * a.x) return det def convex_hull_bf(points: list[Point]) -> list[Point]: """ Constructs the convex hull of a set of 2D points using a brute force algorithm. The algorithm basically considers all combinations of points (i, j) and uses the definition of convexity to determine whether (i, j) is part of the convex hull or not. (i, j) is part of the convex hull if and only iff there are no points on both sides of the line segment connecting the ij, and there is no point k such that k is on either end of the ij. Runtime: O(n^3) - definitely horrible Parameters --------- points: array-like of object of Points, lists or tuples. The set of 2d points for which the convex-hull is needed Returns ------ convex_set: list, the convex-hull of points sorted in non-decreasing order. See Also -------- convex_hull_recursive, Examples --------- >>> convex_hull_bf([[0, 0], [1, 0], [10, 1]]) [(0.0, 0.0), (1.0, 0.0), (10.0, 1.0)] >>> convex_hull_bf([[0, 0], [1, 0], [10, 0]]) [(0.0, 0.0), (10.0, 0.0)] >>> convex_hull_bf([[-1, 1],[-1, -1], [0, 0], [0.5, 0.5], [1, -1], [1, 1], ... [-0.75, 1]]) [(-1.0, -1.0), (-1.0, 1.0), (1.0, -1.0), (1.0, 1.0)] >>> convex_hull_bf([(0, 3), (2, 2), (1, 1), (2, 1), (3, 0), (0, 0), (3, 3), ... 
(2, -1), (2, -4), (1, -3)]) [(0.0, 0.0), (0.0, 3.0), (1.0, -3.0), (2.0, -4.0), (3.0, 0.0), (3.0, 3.0)] """ points = sorted(_validate_input(points)) n = len(points) convex_set = set() for i in range(n - 1): for j in range(i + 1, n): points_left_of_ij = points_right_of_ij = False ij_part_of_convex_hull = True for k in range(n): if k != i and k != j: det_k = _det(points[i], points[j], points[k]) if det_k > 0: points_left_of_ij = True elif det_k < 0: points_right_of_ij = True else: # point[i], point[j], point[k] all lie on a straight line # if point[k] is to the left of point[i] or it's to the # right of point[j], then point[i], point[j] cannot be # part of the convex hull of A if points[k] < points[i] or points[k] > points[j]: ij_part_of_convex_hull = False break if points_left_of_ij and points_right_of_ij: ij_part_of_convex_hull = False break if ij_part_of_convex_hull: convex_set.update([points[i], points[j]]) return sorted(convex_set) def convex_hull_recursive(points: list[Point]) -> list[Point]: """ Constructs the convex hull of a set of 2D points using a divide-and-conquer strategy The algorithm exploits the geometric properties of the problem by repeatedly partitioning the set of points into smaller hulls, and finding the convex hull of these smaller hulls. The union of the convex hull from smaller hulls is the solution to the convex hull of the larger problem. Parameter --------- points: array-like of object of Points, lists or tuples. The set of 2d points for which the convex-hull is needed Runtime: O(n log n) Returns ------- convex_set: list, the convex-hull of points sorted in non-decreasing order. Examples --------- >>> convex_hull_recursive([[0, 0], [1, 0], [10, 1]]) [(0.0, 0.0), (1.0, 0.0), (10.0, 1.0)] >>> convex_hull_recursive([[0, 0], [1, 0], [10, 0]]) [(0.0, 0.0), (10.0, 0.0)] >>> convex_hull_recursive([[-1, 1],[-1, -1], [0, 0], [0.5, 0.5], [1, -1], [1, 1], ... [-0.75, 1]]) [(-1.0, -1.0), (-1.0, 1.0), (1.0, -1.0), (1.0, 1.0)] >>> convex_hull_recursive([(0, 3), (2, 2), (1, 1), (2, 1), (3, 0), (0, 0), (3, 3), ... (2, -1), (2, -4), (1, -3)]) [(0.0, 0.0), (0.0, 3.0), (1.0, -3.0), (2.0, -4.0), (3.0, 0.0), (3.0, 3.0)] """ points = sorted(_validate_input(points)) n = len(points) # divide all the points into an upper hull and a lower hull # the left most point and the right most point are definitely # members of the convex hull by definition. # use these two anchors to divide all the points into two hulls, # an upper hull and a lower hull. 
# all points to the left (above) the line joining the extreme points belong to the # upper hull # all points to the right (below) the line joining the extreme points below to the # lower hull # ignore all points on the line joining the extreme points since they cannot be # part of the convex hull left_most_point = points[0] right_most_point = points[n - 1] convex_set = {left_most_point, right_most_point} upper_hull = [] lower_hull = [] for i in range(1, n - 1): det = _det(left_most_point, right_most_point, points[i]) if det > 0: upper_hull.append(points[i]) elif det < 0: lower_hull.append(points[i]) _construct_hull(upper_hull, left_most_point, right_most_point, convex_set) _construct_hull(lower_hull, right_most_point, left_most_point, convex_set) return sorted(convex_set) def _construct_hull( points: list[Point], left: Point, right: Point, convex_set: set[Point] ) -> None: """ Parameters --------- points: list or None, the hull of points from which to choose the next convex-hull point left: Point, the point to the left of line segment joining left and right right: The point to the right of the line segment joining left and right convex_set: set, the current convex-hull. The state of convex-set gets updated by this function Note ---- For the line segment 'ab', 'a' is on the left and 'b' on the right. but the reverse is true for the line segment 'ba'. Returns ------- Nothing, only updates the state of convex-set """ if points: extreme_point = None extreme_point_distance = float("-inf") candidate_points = [] for p in points: det = _det(left, right, p) if det > 0: candidate_points.append(p) if det > extreme_point_distance: extreme_point_distance = det extreme_point = p if extreme_point: _construct_hull(candidate_points, left, extreme_point, convex_set) convex_set.add(extreme_point) _construct_hull(candidate_points, extreme_point, right, convex_set) def convex_hull_melkman(points: list[Point]) -> list[Point]: """ Constructs the convex hull of a set of 2D points using the melkman algorithm. The algorithm works by iteratively inserting points of a simple polygonal chain (meaning that no line segments between two consecutive points cross each other). Sorting the points yields such a polygonal chain. For a detailed description, see http://cgm.cs.mcgill.ca/~athens/cs601/Melkman.html Runtime: O(n log n) - O(n) if points are already sorted in the input Parameters --------- points: array-like of object of Points, lists or tuples. The set of 2d points for which the convex-hull is needed Returns ------ convex_set: list, the convex-hull of points sorted in non-decreasing order. See Also -------- Examples --------- >>> convex_hull_melkman([[0, 0], [1, 0], [10, 1]]) [(0.0, 0.0), (1.0, 0.0), (10.0, 1.0)] >>> convex_hull_melkman([[0, 0], [1, 0], [10, 0]]) [(0.0, 0.0), (10.0, 0.0)] >>> convex_hull_melkman([[-1, 1],[-1, -1], [0, 0], [0.5, 0.5], [1, -1], [1, 1], ... [-0.75, 1]]) [(-1.0, -1.0), (-1.0, 1.0), (1.0, -1.0), (1.0, 1.0)] >>> convex_hull_melkman([(0, 3), (2, 2), (1, 1), (2, 1), (3, 0), (0, 0), (3, 3), ... 
(2, -1), (2, -4), (1, -3)]) [(0.0, 0.0), (0.0, 3.0), (1.0, -3.0), (2.0, -4.0), (3.0, 0.0), (3.0, 3.0)] """ points = sorted(_validate_input(points)) n = len(points) convex_hull = points[:2] for i in range(2, n): det = _det(convex_hull[1], convex_hull[0], points[i]) if det > 0: convex_hull.insert(0, points[i]) break elif det < 0: convex_hull.append(points[i]) break else: convex_hull[1] = points[i] i += 1 for j in range(i, n): if ( _det(convex_hull[0], convex_hull[-1], points[j]) > 0 and _det(convex_hull[-1], convex_hull[0], points[1]) < 0 ): # The point lies within the convex hull continue convex_hull.insert(0, points[j]) convex_hull.append(points[j]) while _det(convex_hull[0], convex_hull[1], convex_hull[2]) >= 0: del convex_hull[1] while _det(convex_hull[-1], convex_hull[-2], convex_hull[-3]) <= 0: del convex_hull[-2] # `convex_hull` is contains the convex hull in circular order return sorted(convex_hull[1:] if len(convex_hull) > 3 else convex_hull) def main(): points = [ (0, 3), (2, 2), (1, 1), (2, 1), (3, 0), (0, 0), (3, 3), (2, -1), (2, -4), (1, -3), ] # the convex set of points is # [(0, 0), (0, 3), (1, -3), (2, -4), (3, 0), (3, 3)] results_bf = convex_hull_bf(points) results_recursive = convex_hull_recursive(points) assert results_bf == results_recursive results_melkman = convex_hull_melkman(points) assert results_bf == results_melkman print(results_bf) if __name__ == "__main__": main()
-1
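A short usage sketch for the three hull constructors in the record above (brute force, divide and conquer, Melkman). The point set and the expected hull are taken directly from the record's doctests and its main(); the only assumption is that the functions are in scope (for example, the file was pasted or imported).

# Hedged sketch: convex_hull_bf / convex_hull_recursive / convex_hull_melkman
# are the functions defined above; the expected output comes from their doctests.
points = [(0, 3), (2, 2), (1, 1), (2, 1), (3, 0),
          (0, 0), (3, 3), (2, -1), (2, -4), (1, -3)]

hull_bf = convex_hull_bf(points)          # O(n^3) brute force
hull_rec = convex_hull_recursive(points)  # divide and conquer, O(n log n)
hull_mel = convex_hull_melkman(points)    # Melkman, O(n log n) (O(n) if pre-sorted)

assert hull_bf == hull_rec == hull_mel
print(hull_bf)
# [(0.0, 0.0), (0.0, 3.0), (1.0, -3.0), (2.0, -4.0), (3.0, 0.0), (3.0, 3.0)]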
TheAlgorithms/Python
7,844
Remove file-level flake8 suppression
### Describe your change: Some files were completely excluded from flake8 checks. - Remove file-level suppressions - Fix simple warnings - Suppress long line indocstring * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
Cjkjvfnby
"2022-10-29T13:08:44Z"
"2022-10-29T20:45:21Z"
47ddba1d914bf5955a244056e794e718dee9ead1
1550731cb7457ddae216da2ffe0bc1587f5234f3
Remove file-level flake8 suppression. ### Describe your change: Some files were completely excluded from flake8 checks. - Remove file-level suppressions - Fix simple warnings - Suppress long line indocstring * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
-1
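For readers unfamiliar with the flake8 terms in the PR description above: a file-level `# flake8: noqa` comment silences every check in the file, whereas a per-line `# noqa: <code>` keeps the rest of the file checked and suppresses only one finding. A tiny illustrative sketch follows; the function and its docstring are hypothetical, not taken from the repository.

# File-level suppression (what the PR removes): a single comment like this,
# alone on a line, makes flake8 skip the entire file.
#   # flake8: noqa

# Targeted suppression (what the PR keeps): only the known-long docstring line
# is exempted from E501 ("line too long"); everything else stays checked.
def hypothetical_example() -> None:  # hypothetical name, for illustration only
    """A deliberately long one-line docstring that would otherwise trip the line-length check."""  # noqa: E501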
TheAlgorithms/Python
7,844
Remove file-level flake8 suppression
### Describe your change: Some files were completely excluded from flake8 checks. - Remove file-level suppressions - Fix simple warnings - Suppress long line indocstring * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
Cjkjvfnby
"2022-10-29T13:08:44Z"
"2022-10-29T20:45:21Z"
47ddba1d914bf5955a244056e794e718dee9ead1
1550731cb7457ddae216da2ffe0bc1587f5234f3
Remove file-level flake8 suppression. ### Describe your change: Some files were completely excluded from flake8 checks. - Remove file-level suppressions - Fix simple warnings - Suppress long line indocstring * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
""" Project Euler Problem 207: https://projecteuler.net/problem=207 Problem Statement: For some positive integers k, there exists an integer partition of the form 4**t = 2**t + k, where 4**t, 2**t, and k are all positive integers and t is a real number. The first two such partitions are 4**1 = 2**1 + 2 and 4**1.5849625... = 2**1.5849625... + 6. Partitions where t is also an integer are called perfect. For any m ≥ 1 let P(m) be the proportion of such partitions that are perfect with k ≤ m. Thus P(6) = 1/2. In the following table are listed some values of P(m) P(5) = 1/1 P(10) = 1/2 P(15) = 2/3 P(20) = 1/2 P(25) = 1/2 P(30) = 2/5 ... P(180) = 1/4 P(185) = 3/13 Find the smallest m for which P(m) < 1/12345 Solution: Equation 4**t = 2**t + k solved for t gives: t = log2(sqrt(4*k+1)/2 + 1/2) For t to be real valued, sqrt(4*k+1) must be an integer which is implemented in function check_t_real(k). For a perfect partition t must be an integer. To speed up significantly the search for partitions, instead of incrementing k by one per iteration, the next valid k is found by k = (i**2 - 1) / 4 with an integer i and k has to be a positive integer. If this is the case a partition is found. The partition is perfect if t os an integer. The integer i is increased with increment 1 until the proportion perfect partitions / total partitions drops under the given value. """ import math def check_partition_perfect(positive_integer: int) -> bool: """ Check if t = f(positive_integer) = log2(sqrt(4*positive_integer+1)/2 + 1/2) is a real number. >>> check_partition_perfect(2) True >>> check_partition_perfect(6) False """ exponent = math.log2(math.sqrt(4 * positive_integer + 1) / 2 + 1 / 2) return exponent == int(exponent) def solution(max_proportion: float = 1 / 12345) -> int: """ Find m for which the proportion of perfect partitions to total partitions is lower than max_proportion >>> solution(1) > 5 True >>> solution(1/2) > 10 True >>> solution(3 / 13) > 185 True """ total_partitions = 0 perfect_partitions = 0 integer = 3 while True: partition_candidate = (integer**2 - 1) / 4 # if candidate is an integer, then there is a partition for k if partition_candidate == int(partition_candidate): partition_candidate = int(partition_candidate) total_partitions += 1 if check_partition_perfect(partition_candidate): perfect_partitions += 1 if perfect_partitions > 0: if perfect_partitions / total_partitions < max_proportion: return int(partition_candidate) integer += 1 if __name__ == "__main__": print(f"{solution() = }")
""" Project Euler Problem 207: https://projecteuler.net/problem=207 Problem Statement: For some positive integers k, there exists an integer partition of the form 4**t = 2**t + k, where 4**t, 2**t, and k are all positive integers and t is a real number. The first two such partitions are 4**1 = 2**1 + 2 and 4**1.5849625... = 2**1.5849625... + 6. Partitions where t is also an integer are called perfect. For any m ≥ 1 let P(m) be the proportion of such partitions that are perfect with k ≤ m. Thus P(6) = 1/2. In the following table are listed some values of P(m) P(5) = 1/1 P(10) = 1/2 P(15) = 2/3 P(20) = 1/2 P(25) = 1/2 P(30) = 2/5 ... P(180) = 1/4 P(185) = 3/13 Find the smallest m for which P(m) < 1/12345 Solution: Equation 4**t = 2**t + k solved for t gives: t = log2(sqrt(4*k+1)/2 + 1/2) For t to be real valued, sqrt(4*k+1) must be an integer which is implemented in function check_t_real(k). For a perfect partition t must be an integer. To speed up significantly the search for partitions, instead of incrementing k by one per iteration, the next valid k is found by k = (i**2 - 1) / 4 with an integer i and k has to be a positive integer. If this is the case a partition is found. The partition is perfect if t os an integer. The integer i is increased with increment 1 until the proportion perfect partitions / total partitions drops under the given value. """ import math def check_partition_perfect(positive_integer: int) -> bool: """ Check if t = f(positive_integer) = log2(sqrt(4*positive_integer+1)/2 + 1/2) is a real number. >>> check_partition_perfect(2) True >>> check_partition_perfect(6) False """ exponent = math.log2(math.sqrt(4 * positive_integer + 1) / 2 + 1 / 2) return exponent == int(exponent) def solution(max_proportion: float = 1 / 12345) -> int: """ Find m for which the proportion of perfect partitions to total partitions is lower than max_proportion >>> solution(1) > 5 True >>> solution(1/2) > 10 True >>> solution(3 / 13) > 185 True """ total_partitions = 0 perfect_partitions = 0 integer = 3 while True: partition_candidate = (integer**2 - 1) / 4 # if candidate is an integer, then there is a partition for k if partition_candidate == int(partition_candidate): partition_candidate = int(partition_candidate) total_partitions += 1 if check_partition_perfect(partition_candidate): perfect_partitions += 1 if perfect_partitions > 0: if perfect_partitions / total_partitions < max_proportion: return int(partition_candidate) integer += 1 if __name__ == "__main__": print(f"{solution() = }")
-1
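A small verification sketch for the counting logic in the Project Euler 207 record above: it re-derives P(30) = 2/5 from the docstring's table by enumerating k = (i**2 - 1) / 4 and classifying each partition with check_partition_perfect. The only assumption is that check_partition_perfect from the solution is in scope.

# Hedged sketch: reproduces P(30) = 2/5 using the helper defined above.
total_partitions = 0
perfect_partitions = 0
i = 3
while True:
    k = (i**2 - 1) / 4        # next candidate k giving a real-valued t
    if k > 30:
        break
    if k == int(k):           # k must be a positive integer for a valid partition
        total_partitions += 1
        if check_partition_perfect(int(k)):
            perfect_partitions += 1
    i += 1

print(perfect_partitions, total_partitions)  # 2 5, i.e. P(30) = 2/5 as in the docstring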
TheAlgorithms/Python
7,844
Remove file-level flake8 suppression
### Describe your change: Some files were completely excluded from flake8 checks. - Remove file-level suppressions - Fix simple warnings - Suppress long line indocstring * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
Cjkjvfnby
"2022-10-29T13:08:44Z"
"2022-10-29T20:45:21Z"
47ddba1d914bf5955a244056e794e718dee9ead1
1550731cb7457ddae216da2ffe0bc1587f5234f3
Remove file-level flake8 suppression. ### Describe your change: Some files were completely excluded from flake8 checks. - Remove file-level suppressions - Fix simple warnings - Suppress long line indocstring * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
""" Python program for Bitonic Sort. Note that this program works only when size of input is a power of 2. """ from __future__ import annotations def comp_and_swap(array: list[int], index1: int, index2: int, direction: int) -> None: """Compare the value at given index1 and index2 of the array and swap them as per the given direction. The parameter direction indicates the sorting direction, ASCENDING(1) or DESCENDING(0); if (a[i] > a[j]) agrees with the direction, then a[i] and a[j] are interchanged. >>> arr = [12, 42, -21, 1] >>> comp_and_swap(arr, 1, 2, 1) >>> arr [12, -21, 42, 1] >>> comp_and_swap(arr, 1, 2, 0) >>> arr [12, 42, -21, 1] >>> comp_and_swap(arr, 0, 3, 1) >>> arr [1, 42, -21, 12] >>> comp_and_swap(arr, 0, 3, 0) >>> arr [12, 42, -21, 1] """ if (direction == 1 and array[index1] > array[index2]) or ( direction == 0 and array[index1] < array[index2] ): array[index1], array[index2] = array[index2], array[index1] def bitonic_merge(array: list[int], low: int, length: int, direction: int) -> None: """ It recursively sorts a bitonic sequence in ascending order, if direction = 1, and in descending if direction = 0. The sequence to be sorted starts at index position low, the parameter length is the number of elements to be sorted. >>> arr = [12, 42, -21, 1] >>> bitonic_merge(arr, 0, 4, 1) >>> arr [-21, 1, 12, 42] >>> bitonic_merge(arr, 0, 4, 0) >>> arr [42, 12, 1, -21] """ if length > 1: middle = int(length / 2) for i in range(low, low + middle): comp_and_swap(array, i, i + middle, direction) bitonic_merge(array, low, middle, direction) bitonic_merge(array, low + middle, middle, direction) def bitonic_sort(array: list[int], low: int, length: int, direction: int) -> None: """ This function first produces a bitonic sequence by recursively sorting its two halves in opposite sorting orders, and then calls bitonic_merge to make them in the same order. >>> arr = [12, 34, 92, -23, 0, -121, -167, 145] >>> bitonic_sort(arr, 0, 8, 1) >>> arr [-167, -121, -23, 0, 12, 34, 92, 145] >>> bitonic_sort(arr, 0, 8, 0) >>> arr [145, 92, 34, 12, 0, -23, -121, -167] """ if length > 1: middle = int(length / 2) bitonic_sort(array, low, middle, 1) bitonic_sort(array, low + middle, middle, 0) bitonic_merge(array, low, length, direction) if __name__ == "__main__": user_input = input("Enter numbers separated by a comma:\n").strip() unsorted = [int(item.strip()) for item in user_input.split(",")] bitonic_sort(unsorted, 0, len(unsorted), 1) print("\nSorted array in ascending order is: ", end="") print(*unsorted, sep=", ") bitonic_merge(unsorted, 0, len(unsorted), 0) print("Sorted array in descending order is: ", end="") print(*unsorted, sep=", ")
""" Python program for Bitonic Sort. Note that this program works only when size of input is a power of 2. """ from __future__ import annotations def comp_and_swap(array: list[int], index1: int, index2: int, direction: int) -> None: """Compare the value at given index1 and index2 of the array and swap them as per the given direction. The parameter direction indicates the sorting direction, ASCENDING(1) or DESCENDING(0); if (a[i] > a[j]) agrees with the direction, then a[i] and a[j] are interchanged. >>> arr = [12, 42, -21, 1] >>> comp_and_swap(arr, 1, 2, 1) >>> arr [12, -21, 42, 1] >>> comp_and_swap(arr, 1, 2, 0) >>> arr [12, 42, -21, 1] >>> comp_and_swap(arr, 0, 3, 1) >>> arr [1, 42, -21, 12] >>> comp_and_swap(arr, 0, 3, 0) >>> arr [12, 42, -21, 1] """ if (direction == 1 and array[index1] > array[index2]) or ( direction == 0 and array[index1] < array[index2] ): array[index1], array[index2] = array[index2], array[index1] def bitonic_merge(array: list[int], low: int, length: int, direction: int) -> None: """ It recursively sorts a bitonic sequence in ascending order, if direction = 1, and in descending if direction = 0. The sequence to be sorted starts at index position low, the parameter length is the number of elements to be sorted. >>> arr = [12, 42, -21, 1] >>> bitonic_merge(arr, 0, 4, 1) >>> arr [-21, 1, 12, 42] >>> bitonic_merge(arr, 0, 4, 0) >>> arr [42, 12, 1, -21] """ if length > 1: middle = int(length / 2) for i in range(low, low + middle): comp_and_swap(array, i, i + middle, direction) bitonic_merge(array, low, middle, direction) bitonic_merge(array, low + middle, middle, direction) def bitonic_sort(array: list[int], low: int, length: int, direction: int) -> None: """ This function first produces a bitonic sequence by recursively sorting its two halves in opposite sorting orders, and then calls bitonic_merge to make them in the same order. >>> arr = [12, 34, 92, -23, 0, -121, -167, 145] >>> bitonic_sort(arr, 0, 8, 1) >>> arr [-167, -121, -23, 0, 12, 34, 92, 145] >>> bitonic_sort(arr, 0, 8, 0) >>> arr [145, 92, 34, 12, 0, -23, -121, -167] """ if length > 1: middle = int(length / 2) bitonic_sort(array, low, middle, 1) bitonic_sort(array, low + middle, middle, 0) bitonic_merge(array, low, length, direction) if __name__ == "__main__": user_input = input("Enter numbers separated by a comma:\n").strip() unsorted = [int(item.strip()) for item in user_input.split(",")] bitonic_sort(unsorted, 0, len(unsorted), 1) print("\nSorted array in ascending order is: ", end="") print(*unsorted, sep=", ") bitonic_merge(unsorted, 0, len(unsorted), 0) print("Sorted array in descending order is: ", end="") print(*unsorted, sep=", ")
-1
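A brief usage sketch for the bitonic sort record above; the length-8 input (the module requires a power-of-two size) and both expected orderings come straight from the doctests and the __main__ block, and the two functions are assumed to be in scope.

# Hedged sketch: bitonic_sort / bitonic_merge are the functions defined above.
arr = [12, 34, 92, -23, 0, -121, -167, 145]   # length 8, a power of 2

bitonic_sort(arr, 0, len(arr), 1)    # direction 1 = ascending
print(arr)                           # [-167, -121, -23, 0, 12, 34, 92, 145]

bitonic_merge(arr, 0, len(arr), 0)   # re-merge the now-sorted array descending
print(arr)                           # [145, 92, 34, 12, 0, -23, -121, -167]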
TheAlgorithms/Python
7,844
Remove file-level flake8 suppression
### Describe your change: Some files were completely excluded from flake8 checks. - Remove file-level suppressions - Fix simple warnings - Suppress long line indocstring * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
Cjkjvfnby
"2022-10-29T13:08:44Z"
"2022-10-29T20:45:21Z"
47ddba1d914bf5955a244056e794e718dee9ead1
1550731cb7457ddae216da2ffe0bc1587f5234f3
Remove file-level flake8 suppression. ### Describe your change: Some files were completely excluded from flake8 checks. - Remove file-level suppressions - Fix simple warnings - Suppress long line indocstring * [ ] Add an algorithm? * [x] Fix a bug or typo in an existing algorithm? * [ ] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
""" Implementation of double ended queue. """ from __future__ import annotations from collections.abc import Iterable from dataclasses import dataclass from typing import Any class Deque: """ Deque data structure. Operations ---------- append(val: Any) -> None appendleft(val: Any) -> None extend(iterable: Iterable) -> None extendleft(iterable: Iterable) -> None pop() -> Any popleft() -> Any Observers --------- is_empty() -> bool Attributes ---------- _front: _Node front of the deque a.k.a. the first element _back: _Node back of the element a.k.a. the last element _len: int the number of nodes """ __slots__ = ["_front", "_back", "_len"] @dataclass class _Node: """ Representation of a node. Contains a value and a pointer to the next node as well as to the previous one. """ val: Any = None next: Deque._Node | None = None prev: Deque._Node | None = None class _Iterator: """ Helper class for iteration. Will be used to implement iteration. Attributes ---------- _cur: _Node the current node of the iteration. """ __slots__ = ["_cur"] def __init__(self, cur: Deque._Node | None) -> None: self._cur = cur def __iter__(self) -> Deque._Iterator: """ >>> our_deque = Deque([1, 2, 3]) >>> iterator = iter(our_deque) """ return self def __next__(self) -> Any: """ >>> our_deque = Deque([1, 2, 3]) >>> iterator = iter(our_deque) >>> next(iterator) 1 >>> next(iterator) 2 >>> next(iterator) 3 """ if self._cur is None: # finished iterating raise StopIteration val = self._cur.val self._cur = self._cur.next return val def __init__(self, iterable: Iterable[Any] | None = None) -> None: self._front: Any = None self._back: Any = None self._len: int = 0 if iterable is not None: # append every value to the deque for val in iterable: self.append(val) def append(self, val: Any) -> None: """ Adds val to the end of the deque. Time complexity: O(1) >>> our_deque_1 = Deque([1, 2, 3]) >>> our_deque_1.append(4) >>> our_deque_1 [1, 2, 3, 4] >>> our_deque_2 = Deque('ab') >>> our_deque_2.append('c') >>> our_deque_2 ['a', 'b', 'c'] >>> from collections import deque >>> deque_collections_1 = deque([1, 2, 3]) >>> deque_collections_1.append(4) >>> deque_collections_1 deque([1, 2, 3, 4]) >>> deque_collections_2 = deque('ab') >>> deque_collections_2.append('c') >>> deque_collections_2 deque(['a', 'b', 'c']) >>> list(our_deque_1) == list(deque_collections_1) True >>> list(our_deque_2) == list(deque_collections_2) True """ node = self._Node(val, None, None) if self.is_empty(): # front = back self._front = self._back = node self._len = 1 else: # connect nodes self._back.next = node node.prev = self._back self._back = node # assign new back to the new node self._len += 1 # make sure there were no errors assert not self.is_empty(), "Error on appending value." def appendleft(self, val: Any) -> None: """ Adds val to the beginning of the deque. 
Time complexity: O(1) >>> our_deque_1 = Deque([2, 3]) >>> our_deque_1.appendleft(1) >>> our_deque_1 [1, 2, 3] >>> our_deque_2 = Deque('bc') >>> our_deque_2.appendleft('a') >>> our_deque_2 ['a', 'b', 'c'] >>> from collections import deque >>> deque_collections_1 = deque([2, 3]) >>> deque_collections_1.appendleft(1) >>> deque_collections_1 deque([1, 2, 3]) >>> deque_collections_2 = deque('bc') >>> deque_collections_2.appendleft('a') >>> deque_collections_2 deque(['a', 'b', 'c']) >>> list(our_deque_1) == list(deque_collections_1) True >>> list(our_deque_2) == list(deque_collections_2) True """ node = self._Node(val, None, None) if self.is_empty(): # front = back self._front = self._back = node self._len = 1 else: # connect nodes node.next = self._front self._front.prev = node self._front = node # assign new front to the new node self._len += 1 # make sure there were no errors assert not self.is_empty(), "Error on appending value." def extend(self, iterable: Iterable[Any]) -> None: """ Appends every value of iterable to the end of the deque. Time complexity: O(n) >>> our_deque_1 = Deque([1, 2, 3]) >>> our_deque_1.extend([4, 5]) >>> our_deque_1 [1, 2, 3, 4, 5] >>> our_deque_2 = Deque('ab') >>> our_deque_2.extend('cd') >>> our_deque_2 ['a', 'b', 'c', 'd'] >>> from collections import deque >>> deque_collections_1 = deque([1, 2, 3]) >>> deque_collections_1.extend([4, 5]) >>> deque_collections_1 deque([1, 2, 3, 4, 5]) >>> deque_collections_2 = deque('ab') >>> deque_collections_2.extend('cd') >>> deque_collections_2 deque(['a', 'b', 'c', 'd']) >>> list(our_deque_1) == list(deque_collections_1) True >>> list(our_deque_2) == list(deque_collections_2) True """ for val in iterable: self.append(val) def extendleft(self, iterable: Iterable[Any]) -> None: """ Appends every value of iterable to the beginning of the deque. Time complexity: O(n) >>> our_deque_1 = Deque([1, 2, 3]) >>> our_deque_1.extendleft([0, -1]) >>> our_deque_1 [-1, 0, 1, 2, 3] >>> our_deque_2 = Deque('cd') >>> our_deque_2.extendleft('ba') >>> our_deque_2 ['a', 'b', 'c', 'd'] >>> from collections import deque >>> deque_collections_1 = deque([1, 2, 3]) >>> deque_collections_1.extendleft([0, -1]) >>> deque_collections_1 deque([-1, 0, 1, 2, 3]) >>> deque_collections_2 = deque('cd') >>> deque_collections_2.extendleft('ba') >>> deque_collections_2 deque(['a', 'b', 'c', 'd']) >>> list(our_deque_1) == list(deque_collections_1) True >>> list(our_deque_2) == list(deque_collections_2) True """ for val in iterable: self.appendleft(val) def pop(self) -> Any: """ Removes the last element of the deque and returns it. Time complexity: O(1) @returns topop.val: the value of the node to pop. >>> our_deque = Deque([1, 2, 3, 15182]) >>> our_popped = our_deque.pop() >>> our_popped 15182 >>> our_deque [1, 2, 3] >>> from collections import deque >>> deque_collections = deque([1, 2, 3, 15182]) >>> collections_popped = deque_collections.pop() >>> collections_popped 15182 >>> deque_collections deque([1, 2, 3]) >>> list(our_deque) == list(deque_collections) True >>> our_popped == collections_popped True """ # make sure the deque has elements to pop assert not self.is_empty(), "Deque is empty." topop = self._back self._back = self._back.prev # set new back self._back.next = ( None # drop the last node - python will deallocate memory automatically ) self._len -= 1 return topop.val def popleft(self) -> Any: """ Removes the first element of the deque and returns it. Time complexity: O(1) @returns topop.val: the value of the node to pop. 
>>> our_deque = Deque([15182, 1, 2, 3]) >>> our_popped = our_deque.popleft() >>> our_popped 15182 >>> our_deque [1, 2, 3] >>> from collections import deque >>> deque_collections = deque([15182, 1, 2, 3]) >>> collections_popped = deque_collections.popleft() >>> collections_popped 15182 >>> deque_collections deque([1, 2, 3]) >>> list(our_deque) == list(deque_collections) True >>> our_popped == collections_popped True """ # make sure the deque has elements to pop assert not self.is_empty(), "Deque is empty." topop = self._front self._front = self._front.next # set new front and drop the first node self._front.prev = None self._len -= 1 return topop.val def is_empty(self) -> bool: """ Checks if the deque is empty. Time complexity: O(1) >>> our_deque = Deque([1, 2, 3]) >>> our_deque.is_empty() False >>> our_empty_deque = Deque() >>> our_empty_deque.is_empty() True >>> from collections import deque >>> empty_deque_collections = deque() >>> list(our_empty_deque) == list(empty_deque_collections) True """ return self._front is None def __len__(self) -> int: """ Implements len() function. Returns the length of the deque. Time complexity: O(1) >>> our_deque = Deque([1, 2, 3]) >>> len(our_deque) 3 >>> our_empty_deque = Deque() >>> len(our_empty_deque) 0 >>> from collections import deque >>> deque_collections = deque([1, 2, 3]) >>> len(deque_collections) 3 >>> empty_deque_collections = deque() >>> len(empty_deque_collections) 0 >>> len(our_empty_deque) == len(empty_deque_collections) True """ return self._len def __eq__(self, other: object) -> bool: """ Implements "==" operator. Returns if *self* is equal to *other*. Time complexity: O(n) >>> our_deque_1 = Deque([1, 2, 3]) >>> our_deque_2 = Deque([1, 2, 3]) >>> our_deque_1 == our_deque_2 True >>> our_deque_3 = Deque([1, 2]) >>> our_deque_1 == our_deque_3 False >>> from collections import deque >>> deque_collections_1 = deque([1, 2, 3]) >>> deque_collections_2 = deque([1, 2, 3]) >>> deque_collections_1 == deque_collections_2 True >>> deque_collections_3 = deque([1, 2]) >>> deque_collections_1 == deque_collections_3 False >>> (our_deque_1 == our_deque_2) == (deque_collections_1 == deque_collections_2) True >>> (our_deque_1 == our_deque_3) == (deque_collections_1 == deque_collections_3) True """ if not isinstance(other, Deque): return NotImplemented me = self._front oth = other._front # if the length of the dequeues are not the same, they are not equal if len(self) != len(other): return False while me is not None and oth is not None: # compare every value if me.val != oth.val: return False me = me.next oth = oth.next return True def __iter__(self) -> Deque._Iterator: """ Implements iteration. Time complexity: O(1) >>> our_deque = Deque([1, 2, 3]) >>> for v in our_deque: ... print(v) 1 2 3 >>> from collections import deque >>> deque_collections = deque([1, 2, 3]) >>> for v in deque_collections: ... print(v) 1 2 3 """ return Deque._Iterator(self._front) def __repr__(self) -> str: """ Implements representation of the deque. Represents it as a list, with its values between '[' and ']'. Time complexity: O(n) >>> our_deque = Deque([1, 2, 3]) >>> our_deque [1, 2, 3] """ values_list = [] aux = self._front while aux is not None: # append the values in a list to display values_list.append(aux.val) aux = aux.next return "[" + ", ".join(repr(val) for val in values_list) + "]" if __name__ == "__main__": import doctest doctest.testmod()
""" Implementation of double ended queue. """ from __future__ import annotations from collections.abc import Iterable from dataclasses import dataclass from typing import Any class Deque: """ Deque data structure. Operations ---------- append(val: Any) -> None appendleft(val: Any) -> None extend(iterable: Iterable) -> None extendleft(iterable: Iterable) -> None pop() -> Any popleft() -> Any Observers --------- is_empty() -> bool Attributes ---------- _front: _Node front of the deque a.k.a. the first element _back: _Node back of the element a.k.a. the last element _len: int the number of nodes """ __slots__ = ["_front", "_back", "_len"] @dataclass class _Node: """ Representation of a node. Contains a value and a pointer to the next node as well as to the previous one. """ val: Any = None next: Deque._Node | None = None prev: Deque._Node | None = None class _Iterator: """ Helper class for iteration. Will be used to implement iteration. Attributes ---------- _cur: _Node the current node of the iteration. """ __slots__ = ["_cur"] def __init__(self, cur: Deque._Node | None) -> None: self._cur = cur def __iter__(self) -> Deque._Iterator: """ >>> our_deque = Deque([1, 2, 3]) >>> iterator = iter(our_deque) """ return self def __next__(self) -> Any: """ >>> our_deque = Deque([1, 2, 3]) >>> iterator = iter(our_deque) >>> next(iterator) 1 >>> next(iterator) 2 >>> next(iterator) 3 """ if self._cur is None: # finished iterating raise StopIteration val = self._cur.val self._cur = self._cur.next return val def __init__(self, iterable: Iterable[Any] | None = None) -> None: self._front: Any = None self._back: Any = None self._len: int = 0 if iterable is not None: # append every value to the deque for val in iterable: self.append(val) def append(self, val: Any) -> None: """ Adds val to the end of the deque. Time complexity: O(1) >>> our_deque_1 = Deque([1, 2, 3]) >>> our_deque_1.append(4) >>> our_deque_1 [1, 2, 3, 4] >>> our_deque_2 = Deque('ab') >>> our_deque_2.append('c') >>> our_deque_2 ['a', 'b', 'c'] >>> from collections import deque >>> deque_collections_1 = deque([1, 2, 3]) >>> deque_collections_1.append(4) >>> deque_collections_1 deque([1, 2, 3, 4]) >>> deque_collections_2 = deque('ab') >>> deque_collections_2.append('c') >>> deque_collections_2 deque(['a', 'b', 'c']) >>> list(our_deque_1) == list(deque_collections_1) True >>> list(our_deque_2) == list(deque_collections_2) True """ node = self._Node(val, None, None) if self.is_empty(): # front = back self._front = self._back = node self._len = 1 else: # connect nodes self._back.next = node node.prev = self._back self._back = node # assign new back to the new node self._len += 1 # make sure there were no errors assert not self.is_empty(), "Error on appending value." def appendleft(self, val: Any) -> None: """ Adds val to the beginning of the deque. 
Time complexity: O(1) >>> our_deque_1 = Deque([2, 3]) >>> our_deque_1.appendleft(1) >>> our_deque_1 [1, 2, 3] >>> our_deque_2 = Deque('bc') >>> our_deque_2.appendleft('a') >>> our_deque_2 ['a', 'b', 'c'] >>> from collections import deque >>> deque_collections_1 = deque([2, 3]) >>> deque_collections_1.appendleft(1) >>> deque_collections_1 deque([1, 2, 3]) >>> deque_collections_2 = deque('bc') >>> deque_collections_2.appendleft('a') >>> deque_collections_2 deque(['a', 'b', 'c']) >>> list(our_deque_1) == list(deque_collections_1) True >>> list(our_deque_2) == list(deque_collections_2) True """ node = self._Node(val, None, None) if self.is_empty(): # front = back self._front = self._back = node self._len = 1 else: # connect nodes node.next = self._front self._front.prev = node self._front = node # assign new front to the new node self._len += 1 # make sure there were no errors assert not self.is_empty(), "Error on appending value." def extend(self, iterable: Iterable[Any]) -> None: """ Appends every value of iterable to the end of the deque. Time complexity: O(n) >>> our_deque_1 = Deque([1, 2, 3]) >>> our_deque_1.extend([4, 5]) >>> our_deque_1 [1, 2, 3, 4, 5] >>> our_deque_2 = Deque('ab') >>> our_deque_2.extend('cd') >>> our_deque_2 ['a', 'b', 'c', 'd'] >>> from collections import deque >>> deque_collections_1 = deque([1, 2, 3]) >>> deque_collections_1.extend([4, 5]) >>> deque_collections_1 deque([1, 2, 3, 4, 5]) >>> deque_collections_2 = deque('ab') >>> deque_collections_2.extend('cd') >>> deque_collections_2 deque(['a', 'b', 'c', 'd']) >>> list(our_deque_1) == list(deque_collections_1) True >>> list(our_deque_2) == list(deque_collections_2) True """ for val in iterable: self.append(val) def extendleft(self, iterable: Iterable[Any]) -> None: """ Appends every value of iterable to the beginning of the deque. Time complexity: O(n) >>> our_deque_1 = Deque([1, 2, 3]) >>> our_deque_1.extendleft([0, -1]) >>> our_deque_1 [-1, 0, 1, 2, 3] >>> our_deque_2 = Deque('cd') >>> our_deque_2.extendleft('ba') >>> our_deque_2 ['a', 'b', 'c', 'd'] >>> from collections import deque >>> deque_collections_1 = deque([1, 2, 3]) >>> deque_collections_1.extendleft([0, -1]) >>> deque_collections_1 deque([-1, 0, 1, 2, 3]) >>> deque_collections_2 = deque('cd') >>> deque_collections_2.extendleft('ba') >>> deque_collections_2 deque(['a', 'b', 'c', 'd']) >>> list(our_deque_1) == list(deque_collections_1) True >>> list(our_deque_2) == list(deque_collections_2) True """ for val in iterable: self.appendleft(val) def pop(self) -> Any: """ Removes the last element of the deque and returns it. Time complexity: O(1) @returns topop.val: the value of the node to pop. >>> our_deque = Deque([1, 2, 3, 15182]) >>> our_popped = our_deque.pop() >>> our_popped 15182 >>> our_deque [1, 2, 3] >>> from collections import deque >>> deque_collections = deque([1, 2, 3, 15182]) >>> collections_popped = deque_collections.pop() >>> collections_popped 15182 >>> deque_collections deque([1, 2, 3]) >>> list(our_deque) == list(deque_collections) True >>> our_popped == collections_popped True """ # make sure the deque has elements to pop assert not self.is_empty(), "Deque is empty." topop = self._back self._back = self._back.prev # set new back self._back.next = ( None # drop the last node - python will deallocate memory automatically ) self._len -= 1 return topop.val def popleft(self) -> Any: """ Removes the first element of the deque and returns it. Time complexity: O(1) @returns topop.val: the value of the node to pop. 
>>> our_deque = Deque([15182, 1, 2, 3]) >>> our_popped = our_deque.popleft() >>> our_popped 15182 >>> our_deque [1, 2, 3] >>> from collections import deque >>> deque_collections = deque([15182, 1, 2, 3]) >>> collections_popped = deque_collections.popleft() >>> collections_popped 15182 >>> deque_collections deque([1, 2, 3]) >>> list(our_deque) == list(deque_collections) True >>> our_popped == collections_popped True """ # make sure the deque has elements to pop assert not self.is_empty(), "Deque is empty." topop = self._front self._front = self._front.next # set new front and drop the first node self._front.prev = None self._len -= 1 return topop.val def is_empty(self) -> bool: """ Checks if the deque is empty. Time complexity: O(1) >>> our_deque = Deque([1, 2, 3]) >>> our_deque.is_empty() False >>> our_empty_deque = Deque() >>> our_empty_deque.is_empty() True >>> from collections import deque >>> empty_deque_collections = deque() >>> list(our_empty_deque) == list(empty_deque_collections) True """ return self._front is None def __len__(self) -> int: """ Implements len() function. Returns the length of the deque. Time complexity: O(1) >>> our_deque = Deque([1, 2, 3]) >>> len(our_deque) 3 >>> our_empty_deque = Deque() >>> len(our_empty_deque) 0 >>> from collections import deque >>> deque_collections = deque([1, 2, 3]) >>> len(deque_collections) 3 >>> empty_deque_collections = deque() >>> len(empty_deque_collections) 0 >>> len(our_empty_deque) == len(empty_deque_collections) True """ return self._len def __eq__(self, other: object) -> bool: """ Implements "==" operator. Returns if *self* is equal to *other*. Time complexity: O(n) >>> our_deque_1 = Deque([1, 2, 3]) >>> our_deque_2 = Deque([1, 2, 3]) >>> our_deque_1 == our_deque_2 True >>> our_deque_3 = Deque([1, 2]) >>> our_deque_1 == our_deque_3 False >>> from collections import deque >>> deque_collections_1 = deque([1, 2, 3]) >>> deque_collections_2 = deque([1, 2, 3]) >>> deque_collections_1 == deque_collections_2 True >>> deque_collections_3 = deque([1, 2]) >>> deque_collections_1 == deque_collections_3 False >>> (our_deque_1 == our_deque_2) == (deque_collections_1 == deque_collections_2) True >>> (our_deque_1 == our_deque_3) == (deque_collections_1 == deque_collections_3) True """ if not isinstance(other, Deque): return NotImplemented me = self._front oth = other._front # if the length of the dequeues are not the same, they are not equal if len(self) != len(other): return False while me is not None and oth is not None: # compare every value if me.val != oth.val: return False me = me.next oth = oth.next return True def __iter__(self) -> Deque._Iterator: """ Implements iteration. Time complexity: O(1) >>> our_deque = Deque([1, 2, 3]) >>> for v in our_deque: ... print(v) 1 2 3 >>> from collections import deque >>> deque_collections = deque([1, 2, 3]) >>> for v in deque_collections: ... print(v) 1 2 3 """ return Deque._Iterator(self._front) def __repr__(self) -> str: """ Implements representation of the deque. Represents it as a list, with its values between '[' and ']'. Time complexity: O(n) >>> our_deque = Deque([1, 2, 3]) >>> our_deque [1, 2, 3] """ values_list = [] aux = self._front while aux is not None: # append the values in a list to display values_list.append(aux.val) aux = aux.next return "[" + ", ".join(repr(val) for val in values_list) + "]" if __name__ == "__main__": import doctest doctest.testmod()
-1
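A note on the double_ended_queue.py content recorded above: both pop() and popleft() reassign the back/front pointer and then immediately dereference it, which appears to raise AttributeError when the deque holds exactly one node (the doctests never exercise that case, since they always pop from a deque with several elements). The following is a minimal, self-contained sketch of a guarded tail removal under that observation; the Node/pop_back names are illustrative only and are not the repository's implementation.

```python
# Minimal sketch (illustrative names, not the repository's code) of removing
# the tail of a doubly linked deque while guarding the single-node case.
from __future__ import annotations

from dataclasses import dataclass
from typing import Any


@dataclass
class Node:
    val: Any
    prev: Node | None = None
    next: Node | None = None


def pop_back(
    front: Node | None, back: Node | None
) -> tuple[Any, Node | None, Node | None]:
    """Remove the last node and return (popped value, new front, new back)."""
    assert back is not None, "Deque is empty."
    value = back.val
    new_back = back.prev
    if new_back is None:
        # The deque held exactly one node: it is now empty.
        return value, None, None
    new_back.next = None  # detach the old tail
    return value, front, new_back


if __name__ == "__main__":
    only = Node(42)
    print(pop_back(only, only))  # (42, None, None) rather than an AttributeError
```

With that guard, popping the final element leaves both ends as None, which is exactly the condition the recorded class's is_empty() checks for.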
TheAlgorithms/Python
7,843
Fix yesqa hook
### Describe your change: I have noticed that yesqa is not working as the flake8 plugin. It should be installed as a separate hook. To have exactly the same flake8 setting plugins should be installed on both hooks. I added a YAML anchor to keep plugins declaration in a single place. Fix warnings that were raised by this hook across the repo. * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
Cjkjvfnby
"2022-10-29T12:58:59Z"
"2022-10-29T13:07:02Z"
18ffc4dec85a85837f71cd6c9b1e630b9d185001
584e743422565decd35b1b6f94cef3ced840698b
Fix yesqa hook. ### Describe your change: I have noticed that yesqa is not working as the flake8 plugin. It should be installed as a separate hook. To have exactly the same flake8 setting plugins should be installed on both hooks. I added a YAML anchor to keep plugins declaration in a single place. Fix warnings that were raised by this hook across the repo. * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
repos: - repo: https://github.com/pre-commit/pre-commit-hooks rev: v4.3.0 hooks: - id: check-executables-have-shebangs - id: check-yaml - id: end-of-file-fixer types: [python] - id: trailing-whitespace exclude: | (?x)^( data_structures/heap/binomial_heap.py )$ - id: requirements-txt-fixer - repo: https://github.com/MarcoGorelli/auto-walrus rev: v0.2.1 hooks: - id: auto-walrus - repo: https://github.com/psf/black rev: 22.10.0 hooks: - id: black - repo: https://github.com/PyCQA/isort rev: 5.10.1 hooks: - id: isort args: - --profile=black - repo: https://github.com/asottile/pyupgrade rev: v3.1.0 hooks: - id: pyupgrade args: - --py310-plus - repo: https://github.com/PyCQA/flake8 rev: 5.0.4 hooks: - id: flake8 # See .flake8 for args additional_dependencies: - flake8-bugbear - flake8-builtins - flake8-broken-line - flake8-comprehensions - pep8-naming - yesqa - repo: https://github.com/pre-commit/mirrors-mypy rev: v0.982 hooks: - id: mypy args: - --ignore-missing-imports - --install-types # See mirrors-mypy README.md - --non-interactive additional_dependencies: [types-requests] - repo: https://github.com/codespell-project/codespell rev: v2.2.2 hooks: - id: codespell args: - --ignore-words-list=ans,crate,damon,fo,followings,hist,iff,mater,secant,som,sur,tim,zar exclude: | (?x)^( ciphers/prehistoric_men.txt | strings/dictionary.txt | strings/words.txt | project_euler/problem_022/p022_names.txt )$ - repo: local hooks: - id: validate-filenames name: Validate filenames entry: ./scripts/validate_filenames.py language: script pass_filenames: false
repos: - repo: https://github.com/pre-commit/pre-commit-hooks rev: v4.3.0 hooks: - id: check-executables-have-shebangs - id: check-yaml - id: end-of-file-fixer types: [python] - id: trailing-whitespace exclude: | (?x)^( data_structures/heap/binomial_heap.py )$ - id: requirements-txt-fixer - repo: https://github.com/MarcoGorelli/auto-walrus rev: v0.2.1 hooks: - id: auto-walrus - repo: https://github.com/psf/black rev: 22.10.0 hooks: - id: black - repo: https://github.com/PyCQA/isort rev: 5.10.1 hooks: - id: isort args: - --profile=black - repo: https://github.com/asottile/pyupgrade rev: v3.1.0 hooks: - id: pyupgrade args: - --py310-plus - repo: https://github.com/PyCQA/flake8 rev: 5.0.4 hooks: - id: flake8 # See .flake8 for args additional_dependencies: &flake8-plugins - flake8-bugbear - flake8-builtins - flake8-broken-line - flake8-comprehensions - pep8-naming - repo: https://github.com/asottile/yesqa rev: v1.4.0 hooks: - id: yesqa additional_dependencies: *flake8-plugins - repo: https://github.com/pre-commit/mirrors-mypy rev: v0.982 hooks: - id: mypy args: - --ignore-missing-imports - --install-types # See mirrors-mypy README.md - --non-interactive additional_dependencies: [types-requests] - repo: https://github.com/codespell-project/codespell rev: v2.2.2 hooks: - id: codespell args: - --ignore-words-list=ans,crate,damon,fo,followings,hist,iff,mater,secant,som,sur,tim,zar exclude: | (?x)^( ciphers/prehistoric_men.txt | strings/dictionary.txt | strings/words.txt | project_euler/problem_022/p022_names.txt )$ - repo: local hooks: - id: validate-filenames name: Validate filenames entry: ./scripts/validate_filenames.py language: script pass_filenames: false
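The change above hinges on a YAML anchor: `&flake8-plugins` names the plugin list once under the flake8 hook and `*flake8-plugins` reuses it for the yesqa hook, so both hooks always see the same plugins. A small sketch of how that aliasing resolves, using PyYAML (an assumption for the demonstration only; pre-commit itself parses the config):

```python
# Sketch: how a YAML anchor (&name) and alias (*name) resolve to the same
# value, mirroring the &flake8-plugins / *flake8-plugins pair in the config
# above. Assumes PyYAML is installed; the snippet is illustrative only.
import yaml

snippet = """
flake8:
  additional_dependencies: &flake8-plugins
    - flake8-bugbear
    - flake8-builtins
    - pep8-naming
yesqa:
  additional_dependencies: *flake8-plugins
"""

data = yaml.safe_load(snippet)
assert data["flake8"]["additional_dependencies"] == data["yesqa"]["additional_dependencies"]
print(data["yesqa"]["additional_dependencies"])
# ['flake8-bugbear', 'flake8-builtins', 'pep8-naming']
```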
1
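For context on why the dedicated yesqa hook needs the same flake8 plugins: yesqa removes `# noqa` comments that flake8 no longer requires, and it can only judge plugin-specific codes if those plugins are importable in its environment. A hypothetical before/after illustration (the variable name and error code are made up for the example):

```python
# Illustrative only: the kind of cleanup a yesqa run performs.
# flake8 reports nothing for this short line, so the trailing directive is
# unnecessary and yesqa would typically strip it.
total = sum(range(10))  # noqa: E501

# After yesqa the directive is gone:
total = sum(range(10))
```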
TheAlgorithms/Python
7,843
Fix yesqa hook
### Describe your change: I have noticed that yesqa is not working as the flake8 plugin. It should be installed as a separate hook. To have exactly the same flake8 setting plugins should be installed on both hooks. I added a YAML anchor to keep plugins declaration in a single place. Fix warnings that were raised by this hook across the repo. * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
Cjkjvfnby
"2022-10-29T12:58:59Z"
"2022-10-29T13:07:02Z"
18ffc4dec85a85837f71cd6c9b1e630b9d185001
584e743422565decd35b1b6f94cef3ced840698b
Fix yesqa hook. ### Describe your change: I have noticed that yesqa is not working as the flake8 plugin. It should be installed as a separate hook. To have exactly the same flake8 setting plugins should be installed on both hooks. I added a YAML anchor to keep plugins declaration in a single place. Fix warnings that were raised by this hook across the repo. * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
## Arithmetic Analysis * [Bisection](arithmetic_analysis/bisection.py) * [Gaussian Elimination](arithmetic_analysis/gaussian_elimination.py) * [In Static Equilibrium](arithmetic_analysis/in_static_equilibrium.py) * [Intersection](arithmetic_analysis/intersection.py) * [Jacobi Iteration Method](arithmetic_analysis/jacobi_iteration_method.py) * [Lu Decomposition](arithmetic_analysis/lu_decomposition.py) * [Newton Forward Interpolation](arithmetic_analysis/newton_forward_interpolation.py) * [Newton Method](arithmetic_analysis/newton_method.py) * [Newton Raphson](arithmetic_analysis/newton_raphson.py) * [Newton Raphson New](arithmetic_analysis/newton_raphson_new.py) * [Secant Method](arithmetic_analysis/secant_method.py) ## Audio Filters * [Butterworth Filter](audio_filters/butterworth_filter.py) * [Equal Loudness Filter](audio_filters/equal_loudness_filter.py) * [Iir Filter](audio_filters/iir_filter.py) * [Show Response](audio_filters/show_response.py) ## Backtracking * [All Combinations](backtracking/all_combinations.py) * [All Permutations](backtracking/all_permutations.py) * [All Subsequences](backtracking/all_subsequences.py) * [Coloring](backtracking/coloring.py) * [Combination Sum](backtracking/combination_sum.py) * [Hamiltonian Cycle](backtracking/hamiltonian_cycle.py) * [Knight Tour](backtracking/knight_tour.py) * [Minimax](backtracking/minimax.py) * [Minmax](backtracking/minmax.py) * [N Queens](backtracking/n_queens.py) * [N Queens Math](backtracking/n_queens_math.py) * [Rat In Maze](backtracking/rat_in_maze.py) * [Sudoku](backtracking/sudoku.py) * [Sum Of Subsets](backtracking/sum_of_subsets.py) ## Bit Manipulation * [Binary And Operator](bit_manipulation/binary_and_operator.py) * [Binary Count Setbits](bit_manipulation/binary_count_setbits.py) * [Binary Count Trailing Zeros](bit_manipulation/binary_count_trailing_zeros.py) * [Binary Or Operator](bit_manipulation/binary_or_operator.py) * [Binary Shifts](bit_manipulation/binary_shifts.py) * [Binary Twos Complement](bit_manipulation/binary_twos_complement.py) * [Binary Xor Operator](bit_manipulation/binary_xor_operator.py) * [Count 1S Brian Kernighan Method](bit_manipulation/count_1s_brian_kernighan_method.py) * [Count Number Of One Bits](bit_manipulation/count_number_of_one_bits.py) * [Gray Code Sequence](bit_manipulation/gray_code_sequence.py) * [Highest Set Bit](bit_manipulation/highest_set_bit.py) * [Is Even](bit_manipulation/is_even.py) * [Reverse Bits](bit_manipulation/reverse_bits.py) * [Single Bit Manipulation Operations](bit_manipulation/single_bit_manipulation_operations.py) ## Blockchain * [Chinese Remainder Theorem](blockchain/chinese_remainder_theorem.py) * [Diophantine Equation](blockchain/diophantine_equation.py) * [Modular Division](blockchain/modular_division.py) ## Boolean Algebra * [And Gate](boolean_algebra/and_gate.py) * [Nand Gate](boolean_algebra/nand_gate.py) * [Norgate](boolean_algebra/norgate.py) * [Not Gate](boolean_algebra/not_gate.py) * [Or Gate](boolean_algebra/or_gate.py) * [Quine Mc Cluskey](boolean_algebra/quine_mc_cluskey.py) * [Xnor Gate](boolean_algebra/xnor_gate.py) * [Xor Gate](boolean_algebra/xor_gate.py) ## Cellular Automata * [Conways Game Of Life](cellular_automata/conways_game_of_life.py) * [Game Of Life](cellular_automata/game_of_life.py) * [Nagel Schrekenberg](cellular_automata/nagel_schrekenberg.py) * [One Dimensional](cellular_automata/one_dimensional.py) ## Ciphers * [A1Z26](ciphers/a1z26.py) * [Affine Cipher](ciphers/affine_cipher.py) * [Atbash](ciphers/atbash.py) * [Baconian 
Cipher](ciphers/baconian_cipher.py) * [Base16](ciphers/base16.py) * [Base32](ciphers/base32.py) * [Base64](ciphers/base64.py) * [Base85](ciphers/base85.py) * [Beaufort Cipher](ciphers/beaufort_cipher.py) * [Bifid](ciphers/bifid.py) * [Brute Force Caesar Cipher](ciphers/brute_force_caesar_cipher.py) * [Caesar Cipher](ciphers/caesar_cipher.py) * [Cryptomath Module](ciphers/cryptomath_module.py) * [Decrypt Caesar With Chi Squared](ciphers/decrypt_caesar_with_chi_squared.py) * [Deterministic Miller Rabin](ciphers/deterministic_miller_rabin.py) * [Diffie](ciphers/diffie.py) * [Diffie Hellman](ciphers/diffie_hellman.py) * [Elgamal Key Generator](ciphers/elgamal_key_generator.py) * [Enigma Machine2](ciphers/enigma_machine2.py) * [Hill Cipher](ciphers/hill_cipher.py) * [Mixed Keyword Cypher](ciphers/mixed_keyword_cypher.py) * [Mono Alphabetic Ciphers](ciphers/mono_alphabetic_ciphers.py) * [Morse Code](ciphers/morse_code.py) * [Onepad Cipher](ciphers/onepad_cipher.py) * [Playfair Cipher](ciphers/playfair_cipher.py) * [Polybius](ciphers/polybius.py) * [Porta Cipher](ciphers/porta_cipher.py) * [Rabin Miller](ciphers/rabin_miller.py) * [Rail Fence Cipher](ciphers/rail_fence_cipher.py) * [Rot13](ciphers/rot13.py) * [Rsa Cipher](ciphers/rsa_cipher.py) * [Rsa Factorization](ciphers/rsa_factorization.py) * [Rsa Key Generator](ciphers/rsa_key_generator.py) * [Shuffled Shift Cipher](ciphers/shuffled_shift_cipher.py) * [Simple Keyword Cypher](ciphers/simple_keyword_cypher.py) * [Simple Substitution Cipher](ciphers/simple_substitution_cipher.py) * [Trafid Cipher](ciphers/trafid_cipher.py) * [Transposition Cipher](ciphers/transposition_cipher.py) * [Transposition Cipher Encrypt Decrypt File](ciphers/transposition_cipher_encrypt_decrypt_file.py) * [Vigenere Cipher](ciphers/vigenere_cipher.py) * [Xor Cipher](ciphers/xor_cipher.py) ## Compression * [Burrows Wheeler](compression/burrows_wheeler.py) * [Huffman](compression/huffman.py) * [Lempel Ziv](compression/lempel_ziv.py) * [Lempel Ziv Decompress](compression/lempel_ziv_decompress.py) * [Peak Signal To Noise Ratio](compression/peak_signal_to_noise_ratio.py) * [Run Length Encoding](compression/run_length_encoding.py) ## Computer Vision * [Cnn Classification](computer_vision/cnn_classification.py) * [Flip Augmentation](computer_vision/flip_augmentation.py) * [Harris Corner](computer_vision/harris_corner.py) * [Horn Schunck](computer_vision/horn_schunck.py) * [Mean Threshold](computer_vision/mean_threshold.py) * [Mosaic Augmentation](computer_vision/mosaic_augmentation.py) * [Pooling Functions](computer_vision/pooling_functions.py) ## Conversions * [Astronomical Length Scale Conversion](conversions/astronomical_length_scale_conversion.py) * [Binary To Decimal](conversions/binary_to_decimal.py) * [Binary To Hexadecimal](conversions/binary_to_hexadecimal.py) * [Binary To Octal](conversions/binary_to_octal.py) * [Decimal To Any](conversions/decimal_to_any.py) * [Decimal To Binary](conversions/decimal_to_binary.py) * [Decimal To Binary Recursion](conversions/decimal_to_binary_recursion.py) * [Decimal To Hexadecimal](conversions/decimal_to_hexadecimal.py) * [Decimal To Octal](conversions/decimal_to_octal.py) * [Excel Title To Column](conversions/excel_title_to_column.py) * [Hex To Bin](conversions/hex_to_bin.py) * [Hexadecimal To Decimal](conversions/hexadecimal_to_decimal.py) * [Length Conversion](conversions/length_conversion.py) * [Molecular Chemistry](conversions/molecular_chemistry.py) * [Octal To Decimal](conversions/octal_to_decimal.py) * [Prefix 
Conversions](conversions/prefix_conversions.py) * [Prefix Conversions String](conversions/prefix_conversions_string.py) * [Pressure Conversions](conversions/pressure_conversions.py) * [Rgb Hsv Conversion](conversions/rgb_hsv_conversion.py) * [Roman Numerals](conversions/roman_numerals.py) * [Speed Conversions](conversions/speed_conversions.py) * [Temperature Conversions](conversions/temperature_conversions.py) * [Volume Conversions](conversions/volume_conversions.py) * [Weight Conversion](conversions/weight_conversion.py) ## Data Structures * Binary Tree * [Avl Tree](data_structures/binary_tree/avl_tree.py) * [Basic Binary Tree](data_structures/binary_tree/basic_binary_tree.py) * [Binary Search Tree](data_structures/binary_tree/binary_search_tree.py) * [Binary Search Tree Recursive](data_structures/binary_tree/binary_search_tree_recursive.py) * [Binary Tree Mirror](data_structures/binary_tree/binary_tree_mirror.py) * [Binary Tree Node Sum](data_structures/binary_tree/binary_tree_node_sum.py) * [Binary Tree Path Sum](data_structures/binary_tree/binary_tree_path_sum.py) * [Binary Tree Traversals](data_structures/binary_tree/binary_tree_traversals.py) * [Diff Views Of Binary Tree](data_structures/binary_tree/diff_views_of_binary_tree.py) * [Fenwick Tree](data_structures/binary_tree/fenwick_tree.py) * [Inorder Tree Traversal 2022](data_structures/binary_tree/inorder_tree_traversal_2022.py) * [Lazy Segment Tree](data_structures/binary_tree/lazy_segment_tree.py) * [Lowest Common Ancestor](data_structures/binary_tree/lowest_common_ancestor.py) * [Maximum Fenwick Tree](data_structures/binary_tree/maximum_fenwick_tree.py) * [Merge Two Binary Trees](data_structures/binary_tree/merge_two_binary_trees.py) * [Non Recursive Segment Tree](data_structures/binary_tree/non_recursive_segment_tree.py) * [Number Of Possible Binary Trees](data_structures/binary_tree/number_of_possible_binary_trees.py) * [Red Black Tree](data_structures/binary_tree/red_black_tree.py) * [Segment Tree](data_structures/binary_tree/segment_tree.py) * [Segment Tree Other](data_structures/binary_tree/segment_tree_other.py) * [Treap](data_structures/binary_tree/treap.py) * [Wavelet Tree](data_structures/binary_tree/wavelet_tree.py) * Disjoint Set * [Alternate Disjoint Set](data_structures/disjoint_set/alternate_disjoint_set.py) * [Disjoint Set](data_structures/disjoint_set/disjoint_set.py) * Hashing * [Double Hash](data_structures/hashing/double_hash.py) * [Hash Table](data_structures/hashing/hash_table.py) * [Hash Table With Linked List](data_structures/hashing/hash_table_with_linked_list.py) * Number Theory * [Prime Numbers](data_structures/hashing/number_theory/prime_numbers.py) * [Quadratic Probing](data_structures/hashing/quadratic_probing.py) * Heap * [Binomial Heap](data_structures/heap/binomial_heap.py) * [Heap](data_structures/heap/heap.py) * [Heap Generic](data_structures/heap/heap_generic.py) * [Max Heap](data_structures/heap/max_heap.py) * [Min Heap](data_structures/heap/min_heap.py) * [Randomized Heap](data_structures/heap/randomized_heap.py) * [Skew Heap](data_structures/heap/skew_heap.py) * Linked List * [Circular Linked List](data_structures/linked_list/circular_linked_list.py) * [Deque Doubly](data_structures/linked_list/deque_doubly.py) * [Doubly Linked List](data_structures/linked_list/doubly_linked_list.py) * [Doubly Linked List Two](data_structures/linked_list/doubly_linked_list_two.py) * [From Sequence](data_structures/linked_list/from_sequence.py) * [Has Loop](data_structures/linked_list/has_loop.py) * [Is 
Palindrome](data_structures/linked_list/is_palindrome.py) * [Merge Two Lists](data_structures/linked_list/merge_two_lists.py) * [Middle Element Of Linked List](data_structures/linked_list/middle_element_of_linked_list.py) * [Print Reverse](data_structures/linked_list/print_reverse.py) * [Singly Linked List](data_structures/linked_list/singly_linked_list.py) * [Skip List](data_structures/linked_list/skip_list.py) * [Swap Nodes](data_structures/linked_list/swap_nodes.py) * Queue * [Circular Queue](data_structures/queue/circular_queue.py) * [Circular Queue Linked List](data_structures/queue/circular_queue_linked_list.py) * [Double Ended Queue](data_structures/queue/double_ended_queue.py) * [Linked Queue](data_structures/queue/linked_queue.py) * [Priority Queue Using List](data_structures/queue/priority_queue_using_list.py) * [Queue On List](data_structures/queue/queue_on_list.py) * [Queue On Pseudo Stack](data_structures/queue/queue_on_pseudo_stack.py) * Stacks * [Balanced Parentheses](data_structures/stacks/balanced_parentheses.py) * [Dijkstras Two Stack Algorithm](data_structures/stacks/dijkstras_two_stack_algorithm.py) * [Evaluate Postfix Notations](data_structures/stacks/evaluate_postfix_notations.py) * [Infix To Postfix Conversion](data_structures/stacks/infix_to_postfix_conversion.py) * [Infix To Prefix Conversion](data_structures/stacks/infix_to_prefix_conversion.py) * [Next Greater Element](data_structures/stacks/next_greater_element.py) * [Postfix Evaluation](data_structures/stacks/postfix_evaluation.py) * [Prefix Evaluation](data_structures/stacks/prefix_evaluation.py) * [Stack](data_structures/stacks/stack.py) * [Stack With Doubly Linked List](data_structures/stacks/stack_with_doubly_linked_list.py) * [Stack With Singly Linked List](data_structures/stacks/stack_with_singly_linked_list.py) * [Stock Span Problem](data_structures/stacks/stock_span_problem.py) * Trie * [Trie](data_structures/trie/trie.py) ## Digital Image Processing * [Change Brightness](digital_image_processing/change_brightness.py) * [Change Contrast](digital_image_processing/change_contrast.py) * [Convert To Negative](digital_image_processing/convert_to_negative.py) * Dithering * [Burkes](digital_image_processing/dithering/burkes.py) * Edge Detection * [Canny](digital_image_processing/edge_detection/canny.py) * Filters * [Bilateral Filter](digital_image_processing/filters/bilateral_filter.py) * [Convolve](digital_image_processing/filters/convolve.py) * [Gabor Filter](digital_image_processing/filters/gabor_filter.py) * [Gaussian Filter](digital_image_processing/filters/gaussian_filter.py) * [Local Binary Pattern](digital_image_processing/filters/local_binary_pattern.py) * [Median Filter](digital_image_processing/filters/median_filter.py) * [Sobel Filter](digital_image_processing/filters/sobel_filter.py) * Histogram Equalization * [Histogram Stretch](digital_image_processing/histogram_equalization/histogram_stretch.py) * [Index Calculation](digital_image_processing/index_calculation.py) * Morphological Operations * [Dilation Operation](digital_image_processing/morphological_operations/dilation_operation.py) * [Erosion Operation](digital_image_processing/morphological_operations/erosion_operation.py) * Resize * [Resize](digital_image_processing/resize/resize.py) * Rotation * [Rotation](digital_image_processing/rotation/rotation.py) * [Sepia](digital_image_processing/sepia.py) * [Test Digital Image Processing](digital_image_processing/test_digital_image_processing.py) ## Divide And Conquer * [Closest Pair Of 
Points](divide_and_conquer/closest_pair_of_points.py) * [Convex Hull](divide_and_conquer/convex_hull.py) * [Heaps Algorithm](divide_and_conquer/heaps_algorithm.py) * [Heaps Algorithm Iterative](divide_and_conquer/heaps_algorithm_iterative.py) * [Inversions](divide_and_conquer/inversions.py) * [Kth Order Statistic](divide_and_conquer/kth_order_statistic.py) * [Max Difference Pair](divide_and_conquer/max_difference_pair.py) * [Max Subarray Sum](divide_and_conquer/max_subarray_sum.py) * [Mergesort](divide_and_conquer/mergesort.py) * [Peak](divide_and_conquer/peak.py) * [Power](divide_and_conquer/power.py) * [Strassen Matrix Multiplication](divide_and_conquer/strassen_matrix_multiplication.py) ## Dynamic Programming * [Abbreviation](dynamic_programming/abbreviation.py) * [All Construct](dynamic_programming/all_construct.py) * [Bitmask](dynamic_programming/bitmask.py) * [Catalan Numbers](dynamic_programming/catalan_numbers.py) * [Climbing Stairs](dynamic_programming/climbing_stairs.py) * [Combination Sum Iv](dynamic_programming/combination_sum_iv.py) * [Edit Distance](dynamic_programming/edit_distance.py) * [Factorial](dynamic_programming/factorial.py) * [Fast Fibonacci](dynamic_programming/fast_fibonacci.py) * [Fibonacci](dynamic_programming/fibonacci.py) * [Floyd Warshall](dynamic_programming/floyd_warshall.py) * [Integer Partition](dynamic_programming/integer_partition.py) * [Iterating Through Submasks](dynamic_programming/iterating_through_submasks.py) * [Knapsack](dynamic_programming/knapsack.py) * [Longest Common Subsequence](dynamic_programming/longest_common_subsequence.py) * [Longest Common Substring](dynamic_programming/longest_common_substring.py) * [Longest Increasing Subsequence](dynamic_programming/longest_increasing_subsequence.py) * [Longest Increasing Subsequence O(Nlogn)](dynamic_programming/longest_increasing_subsequence_o(nlogn).py) * [Longest Sub Array](dynamic_programming/longest_sub_array.py) * [Matrix Chain Order](dynamic_programming/matrix_chain_order.py) * [Max Non Adjacent Sum](dynamic_programming/max_non_adjacent_sum.py) * [Max Sub Array](dynamic_programming/max_sub_array.py) * [Max Sum Contiguous Subsequence](dynamic_programming/max_sum_contiguous_subsequence.py) * [Minimum Coin Change](dynamic_programming/minimum_coin_change.py) * [Minimum Cost Path](dynamic_programming/minimum_cost_path.py) * [Minimum Partition](dynamic_programming/minimum_partition.py) * [Minimum Squares To Represent A Number](dynamic_programming/minimum_squares_to_represent_a_number.py) * [Minimum Steps To One](dynamic_programming/minimum_steps_to_one.py) * [Optimal Binary Search Tree](dynamic_programming/optimal_binary_search_tree.py) * [Rod Cutting](dynamic_programming/rod_cutting.py) * [Subset Generation](dynamic_programming/subset_generation.py) * [Sum Of Subset](dynamic_programming/sum_of_subset.py) ## Electronics * [Carrier Concentration](electronics/carrier_concentration.py) * [Coulombs Law](electronics/coulombs_law.py) * [Electric Power](electronics/electric_power.py) * [Ohms Law](electronics/ohms_law.py) ## File Transfer * [Receive File](file_transfer/receive_file.py) * [Send File](file_transfer/send_file.py) * Tests * [Test Send File](file_transfer/tests/test_send_file.py) ## Financial * [Equated Monthly Installments](financial/equated_monthly_installments.py) * [Interest](financial/interest.py) * [Price Plus Tax](financial/price_plus_tax.py) ## Fractals * [Julia Sets](fractals/julia_sets.py) * [Koch Snowflake](fractals/koch_snowflake.py) * [Mandelbrot](fractals/mandelbrot.py) * 
[Sierpinski Triangle](fractals/sierpinski_triangle.py) ## Fuzzy Logic * [Fuzzy Operations](fuzzy_logic/fuzzy_operations.py) ## Genetic Algorithm * [Basic String](genetic_algorithm/basic_string.py) ## Geodesy * [Haversine Distance](geodesy/haversine_distance.py) * [Lamberts Ellipsoidal Distance](geodesy/lamberts_ellipsoidal_distance.py) ## Graphics * [Bezier Curve](graphics/bezier_curve.py) * [Vector3 For 2D Rendering](graphics/vector3_for_2d_rendering.py) ## Graphs * [A Star](graphs/a_star.py) * [Articulation Points](graphs/articulation_points.py) * [Basic Graphs](graphs/basic_graphs.py) * [Bellman Ford](graphs/bellman_ford.py) * [Bfs Shortest Path](graphs/bfs_shortest_path.py) * [Bfs Zero One Shortest Path](graphs/bfs_zero_one_shortest_path.py) * [Bidirectional A Star](graphs/bidirectional_a_star.py) * [Bidirectional Breadth First Search](graphs/bidirectional_breadth_first_search.py) * [Boruvka](graphs/boruvka.py) * [Breadth First Search](graphs/breadth_first_search.py) * [Breadth First Search 2](graphs/breadth_first_search_2.py) * [Breadth First Search Shortest Path](graphs/breadth_first_search_shortest_path.py) * [Check Bipartite Graph Bfs](graphs/check_bipartite_graph_bfs.py) * [Check Bipartite Graph Dfs](graphs/check_bipartite_graph_dfs.py) * [Check Cycle](graphs/check_cycle.py) * [Connected Components](graphs/connected_components.py) * [Depth First Search](graphs/depth_first_search.py) * [Depth First Search 2](graphs/depth_first_search_2.py) * [Dijkstra](graphs/dijkstra.py) * [Dijkstra 2](graphs/dijkstra_2.py) * [Dijkstra Algorithm](graphs/dijkstra_algorithm.py) * [Dijkstra Alternate](graphs/dijkstra_alternate.py) * [Dinic](graphs/dinic.py) * [Directed And Undirected (Weighted) Graph](graphs/directed_and_undirected_(weighted)_graph.py) * [Edmonds Karp Multiple Source And Sink](graphs/edmonds_karp_multiple_source_and_sink.py) * [Eulerian Path And Circuit For Undirected Graph](graphs/eulerian_path_and_circuit_for_undirected_graph.py) * [Even Tree](graphs/even_tree.py) * [Finding Bridges](graphs/finding_bridges.py) * [Frequent Pattern Graph Miner](graphs/frequent_pattern_graph_miner.py) * [G Topological Sort](graphs/g_topological_sort.py) * [Gale Shapley Bigraph](graphs/gale_shapley_bigraph.py) * [Graph List](graphs/graph_list.py) * [Graph Matrix](graphs/graph_matrix.py) * [Graphs Floyd Warshall](graphs/graphs_floyd_warshall.py) * [Greedy Best First](graphs/greedy_best_first.py) * [Greedy Min Vertex Cover](graphs/greedy_min_vertex_cover.py) * [Kahns Algorithm Long](graphs/kahns_algorithm_long.py) * [Kahns Algorithm Topo](graphs/kahns_algorithm_topo.py) * [Karger](graphs/karger.py) * [Markov Chain](graphs/markov_chain.py) * [Matching Min Vertex Cover](graphs/matching_min_vertex_cover.py) * [Minimum Path Sum](graphs/minimum_path_sum.py) * [Minimum Spanning Tree Boruvka](graphs/minimum_spanning_tree_boruvka.py) * [Minimum Spanning Tree Kruskal](graphs/minimum_spanning_tree_kruskal.py) * [Minimum Spanning Tree Kruskal2](graphs/minimum_spanning_tree_kruskal2.py) * [Minimum Spanning Tree Prims](graphs/minimum_spanning_tree_prims.py) * [Minimum Spanning Tree Prims2](graphs/minimum_spanning_tree_prims2.py) * [Multi Heuristic Astar](graphs/multi_heuristic_astar.py) * [Page Rank](graphs/page_rank.py) * [Prim](graphs/prim.py) * [Random Graph Generator](graphs/random_graph_generator.py) * [Scc Kosaraju](graphs/scc_kosaraju.py) * [Strongly Connected Components](graphs/strongly_connected_components.py) * [Tarjans Scc](graphs/tarjans_scc.py) * Tests * [Test Min Spanning Tree 
Kruskal](graphs/tests/test_min_spanning_tree_kruskal.py) * [Test Min Spanning Tree Prim](graphs/tests/test_min_spanning_tree_prim.py) ## Greedy Methods * [Fractional Knapsack](greedy_methods/fractional_knapsack.py) * [Fractional Knapsack 2](greedy_methods/fractional_knapsack_2.py) * [Optimal Merge Pattern](greedy_methods/optimal_merge_pattern.py) ## Hashes * [Adler32](hashes/adler32.py) * [Chaos Machine](hashes/chaos_machine.py) * [Djb2](hashes/djb2.py) * [Enigma Machine](hashes/enigma_machine.py) * [Hamming Code](hashes/hamming_code.py) * [Luhn](hashes/luhn.py) * [Md5](hashes/md5.py) * [Sdbm](hashes/sdbm.py) * [Sha1](hashes/sha1.py) * [Sha256](hashes/sha256.py) ## Knapsack * [Greedy Knapsack](knapsack/greedy_knapsack.py) * [Knapsack](knapsack/knapsack.py) * Tests * [Test Greedy Knapsack](knapsack/tests/test_greedy_knapsack.py) * [Test Knapsack](knapsack/tests/test_knapsack.py) ## Linear Algebra * Src * [Conjugate Gradient](linear_algebra/src/conjugate_gradient.py) * [Lib](linear_algebra/src/lib.py) * [Polynom For Points](linear_algebra/src/polynom_for_points.py) * [Power Iteration](linear_algebra/src/power_iteration.py) * [Rayleigh Quotient](linear_algebra/src/rayleigh_quotient.py) * [Schur Complement](linear_algebra/src/schur_complement.py) * [Test Linear Algebra](linear_algebra/src/test_linear_algebra.py) * [Transformations 2D](linear_algebra/src/transformations_2d.py) ## Machine Learning * [Astar](machine_learning/astar.py) * [Data Transformations](machine_learning/data_transformations.py) * [Decision Tree](machine_learning/decision_tree.py) * Forecasting * [Run](machine_learning/forecasting/run.py) * [Gaussian Naive Bayes](machine_learning/gaussian_naive_bayes.py) * [Gradient Boosting Regressor](machine_learning/gradient_boosting_regressor.py) * [Gradient Descent](machine_learning/gradient_descent.py) * [K Means Clust](machine_learning/k_means_clust.py) * [K Nearest Neighbours](machine_learning/k_nearest_neighbours.py) * [Knn Sklearn](machine_learning/knn_sklearn.py) * [Linear Discriminant Analysis](machine_learning/linear_discriminant_analysis.py) * [Linear Regression](machine_learning/linear_regression.py) * Local Weighted Learning * [Local Weighted Learning](machine_learning/local_weighted_learning/local_weighted_learning.py) * [Logistic Regression](machine_learning/logistic_regression.py) * Lstm * [Lstm Prediction](machine_learning/lstm/lstm_prediction.py) * [Multilayer Perceptron Classifier](machine_learning/multilayer_perceptron_classifier.py) * [Polymonial Regression](machine_learning/polymonial_regression.py) * [Random Forest Classifier](machine_learning/random_forest_classifier.py) * [Random Forest Regressor](machine_learning/random_forest_regressor.py) * [Scoring Functions](machine_learning/scoring_functions.py) * [Self Organizing Map](machine_learning/self_organizing_map.py) * [Sequential Minimum Optimization](machine_learning/sequential_minimum_optimization.py) * [Similarity Search](machine_learning/similarity_search.py) * [Support Vector Machines](machine_learning/support_vector_machines.py) * [Word Frequency Functions](machine_learning/word_frequency_functions.py) * [Xgboost Classifier](machine_learning/xgboost_classifier.py) * [Xgboost Regressor](machine_learning/xgboost_regressor.py) ## Maths * [3N Plus 1](maths/3n_plus_1.py) * [Abs](maths/abs.py) * [Abs Max](maths/abs_max.py) * [Abs Min](maths/abs_min.py) * [Add](maths/add.py) * [Aliquot Sum](maths/aliquot_sum.py) * [Allocation Number](maths/allocation_number.py) * [Arc Length](maths/arc_length.py) * 
[Area](maths/area.py) * [Area Under Curve](maths/area_under_curve.py) * [Armstrong Numbers](maths/armstrong_numbers.py) * [Average Absolute Deviation](maths/average_absolute_deviation.py) * [Average Mean](maths/average_mean.py) * [Average Median](maths/average_median.py) * [Average Mode](maths/average_mode.py) * [Bailey Borwein Plouffe](maths/bailey_borwein_plouffe.py) * [Basic Maths](maths/basic_maths.py) * [Binary Exp Mod](maths/binary_exp_mod.py) * [Binary Exponentiation](maths/binary_exponentiation.py) * [Binary Exponentiation 2](maths/binary_exponentiation_2.py) * [Binary Exponentiation 3](maths/binary_exponentiation_3.py) * [Binomial Coefficient](maths/binomial_coefficient.py) * [Binomial Distribution](maths/binomial_distribution.py) * [Bisection](maths/bisection.py) * [Carmichael Number](maths/carmichael_number.py) * [Catalan Number](maths/catalan_number.py) * [Ceil](maths/ceil.py) * [Check Polygon](maths/check_polygon.py) * [Chudnovsky Algorithm](maths/chudnovsky_algorithm.py) * [Collatz Sequence](maths/collatz_sequence.py) * [Combinations](maths/combinations.py) * [Decimal Isolate](maths/decimal_isolate.py) * [Double Factorial Iterative](maths/double_factorial_iterative.py) * [Double Factorial Recursive](maths/double_factorial_recursive.py) * [Entropy](maths/entropy.py) * [Euclidean Distance](maths/euclidean_distance.py) * [Euclidean Gcd](maths/euclidean_gcd.py) * [Euler Method](maths/euler_method.py) * [Euler Modified](maths/euler_modified.py) * [Eulers Totient](maths/eulers_totient.py) * [Extended Euclidean Algorithm](maths/extended_euclidean_algorithm.py) * [Factorial Iterative](maths/factorial_iterative.py) * [Factorial Recursive](maths/factorial_recursive.py) * [Factors](maths/factors.py) * [Fermat Little Theorem](maths/fermat_little_theorem.py) * [Fibonacci](maths/fibonacci.py) * [Find Max](maths/find_max.py) * [Find Max Recursion](maths/find_max_recursion.py) * [Find Min](maths/find_min.py) * [Find Min Recursion](maths/find_min_recursion.py) * [Floor](maths/floor.py) * [Gamma](maths/gamma.py) * [Gamma Recursive](maths/gamma_recursive.py) * [Gaussian](maths/gaussian.py) * [Gaussian Error Linear Unit](maths/gaussian_error_linear_unit.py) * [Greatest Common Divisor](maths/greatest_common_divisor.py) * [Greedy Coin Change](maths/greedy_coin_change.py) * [Hamming Numbers](maths/hamming_numbers.py) * [Hardy Ramanujanalgo](maths/hardy_ramanujanalgo.py) * [Integration By Simpson Approx](maths/integration_by_simpson_approx.py) * [Is Ip V4 Address Valid](maths/is_ip_v4_address_valid.py) * [Is Square Free](maths/is_square_free.py) * [Jaccard Similarity](maths/jaccard_similarity.py) * [Kadanes](maths/kadanes.py) * [Karatsuba](maths/karatsuba.py) * [Krishnamurthy Number](maths/krishnamurthy_number.py) * [Kth Lexicographic Permutation](maths/kth_lexicographic_permutation.py) * [Largest Of Very Large Numbers](maths/largest_of_very_large_numbers.py) * [Largest Subarray Sum](maths/largest_subarray_sum.py) * [Least Common Multiple](maths/least_common_multiple.py) * [Line Length](maths/line_length.py) * [Lucas Lehmer Primality Test](maths/lucas_lehmer_primality_test.py) * [Lucas Series](maths/lucas_series.py) * [Maclaurin Series](maths/maclaurin_series.py) * [Matrix Exponentiation](maths/matrix_exponentiation.py) * [Max Sum Sliding Window](maths/max_sum_sliding_window.py) * [Median Of Two Arrays](maths/median_of_two_arrays.py) * [Miller Rabin](maths/miller_rabin.py) * [Mobius Function](maths/mobius_function.py) * [Modular Exponential](maths/modular_exponential.py) * [Monte 
Carlo](maths/monte_carlo.py) * [Monte Carlo Dice](maths/monte_carlo_dice.py) * [Nevilles Method](maths/nevilles_method.py) * [Newton Raphson](maths/newton_raphson.py) * [Number Of Digits](maths/number_of_digits.py) * [Numerical Integration](maths/numerical_integration.py) * [Perfect Cube](maths/perfect_cube.py) * [Perfect Number](maths/perfect_number.py) * [Perfect Square](maths/perfect_square.py) * [Persistence](maths/persistence.py) * [Pi Monte Carlo Estimation](maths/pi_monte_carlo_estimation.py) * [Points Are Collinear 3D](maths/points_are_collinear_3d.py) * [Pollard Rho](maths/pollard_rho.py) * [Polynomial Evaluation](maths/polynomial_evaluation.py) * [Power Using Recursion](maths/power_using_recursion.py) * [Prime Check](maths/prime_check.py) * [Prime Factors](maths/prime_factors.py) * [Prime Numbers](maths/prime_numbers.py) * [Prime Sieve Eratosthenes](maths/prime_sieve_eratosthenes.py) * [Primelib](maths/primelib.py) * [Proth Number](maths/proth_number.py) * [Pythagoras](maths/pythagoras.py) * [Qr Decomposition](maths/qr_decomposition.py) * [Quadratic Equations Complex Numbers](maths/quadratic_equations_complex_numbers.py) * [Radians](maths/radians.py) * [Radix2 Fft](maths/radix2_fft.py) * [Relu](maths/relu.py) * [Runge Kutta](maths/runge_kutta.py) * [Segmented Sieve](maths/segmented_sieve.py) * Series * [Arithmetic](maths/series/arithmetic.py) * [Geometric](maths/series/geometric.py) * [Geometric Series](maths/series/geometric_series.py) * [Harmonic](maths/series/harmonic.py) * [Harmonic Series](maths/series/harmonic_series.py) * [Hexagonal Numbers](maths/series/hexagonal_numbers.py) * [P Series](maths/series/p_series.py) * [Sieve Of Eratosthenes](maths/sieve_of_eratosthenes.py) * [Sigmoid](maths/sigmoid.py) * [Sigmoid Linear Unit](maths/sigmoid_linear_unit.py) * [Signum](maths/signum.py) * [Simpson Rule](maths/simpson_rule.py) * [Sin](maths/sin.py) * [Sock Merchant](maths/sock_merchant.py) * [Softmax](maths/softmax.py) * [Square Root](maths/square_root.py) * [Sum Of Arithmetic Series](maths/sum_of_arithmetic_series.py) * [Sum Of Digits](maths/sum_of_digits.py) * [Sum Of Geometric Progression](maths/sum_of_geometric_progression.py) * [Sum Of Harmonic Series](maths/sum_of_harmonic_series.py) * [Sylvester Sequence](maths/sylvester_sequence.py) * [Test Prime Check](maths/test_prime_check.py) * [Trapezoidal Rule](maths/trapezoidal_rule.py) * [Triplet Sum](maths/triplet_sum.py) * [Two Pointer](maths/two_pointer.py) * [Two Sum](maths/two_sum.py) * [Ugly Numbers](maths/ugly_numbers.py) * [Volume](maths/volume.py) * [Weird Number](maths/weird_number.py) * [Zellers Congruence](maths/zellers_congruence.py) ## Matrix * [Binary Search Matrix](matrix/binary_search_matrix.py) * [Count Islands In Matrix](matrix/count_islands_in_matrix.py) * [Cramers Rule 2X2](matrix/cramers_rule_2x2.py) * [Inverse Of Matrix](matrix/inverse_of_matrix.py) * [Largest Square Area In Matrix](matrix/largest_square_area_in_matrix.py) * [Matrix Class](matrix/matrix_class.py) * [Matrix Operation](matrix/matrix_operation.py) * [Max Area Of Island](matrix/max_area_of_island.py) * [Nth Fibonacci Using Matrix Exponentiation](matrix/nth_fibonacci_using_matrix_exponentiation.py) * [Rotate Matrix](matrix/rotate_matrix.py) * [Searching In Sorted Matrix](matrix/searching_in_sorted_matrix.py) * [Sherman Morrison](matrix/sherman_morrison.py) * [Spiral Print](matrix/spiral_print.py) * Tests * [Test Matrix Operation](matrix/tests/test_matrix_operation.py) ## Networking Flow * [Ford Fulkerson](networking_flow/ford_fulkerson.py) * 
[Minimum Cut](networking_flow/minimum_cut.py) ## Neural Network * [2 Hidden Layers Neural Network](neural_network/2_hidden_layers_neural_network.py) * [Back Propagation Neural Network](neural_network/back_propagation_neural_network.py) * [Convolution Neural Network](neural_network/convolution_neural_network.py) * [Perceptron](neural_network/perceptron.py) ## Other * [Activity Selection](other/activity_selection.py) * [Alternative List Arrange](other/alternative_list_arrange.py) * [Check Strong Password](other/check_strong_password.py) * [Davisb Putnamb Logemannb Loveland](other/davisb_putnamb_logemannb_loveland.py) * [Dijkstra Bankers Algorithm](other/dijkstra_bankers_algorithm.py) * [Doomsday](other/doomsday.py) * [Fischer Yates Shuffle](other/fischer_yates_shuffle.py) * [Gauss Easter](other/gauss_easter.py) * [Graham Scan](other/graham_scan.py) * [Greedy](other/greedy.py) * [Least Recently Used](other/least_recently_used.py) * [Lfu Cache](other/lfu_cache.py) * [Linear Congruential Generator](other/linear_congruential_generator.py) * [Lru Cache](other/lru_cache.py) * [Magicdiamondpattern](other/magicdiamondpattern.py) * [Maximum Subarray](other/maximum_subarray.py) * [Nested Brackets](other/nested_brackets.py) * [Password Generator](other/password_generator.py) * [Scoring Algorithm](other/scoring_algorithm.py) * [Sdes](other/sdes.py) * [Tower Of Hanoi](other/tower_of_hanoi.py) ## Physics * [Casimir Effect](physics/casimir_effect.py) * [Horizontal Projectile Motion](physics/horizontal_projectile_motion.py) * [Kinetic Energy](physics/kinetic_energy.py) * [Lorentz Transformation Four Vector](physics/lorentz_transformation_four_vector.py) * [Malus Law](physics/malus_law.py) * [N Body Simulation](physics/n_body_simulation.py) * [Newtons Law Of Gravitation](physics/newtons_law_of_gravitation.py) * [Newtons Second Law Of Motion](physics/newtons_second_law_of_motion.py) ## Project Euler * Problem 001 * [Sol1](project_euler/problem_001/sol1.py) * [Sol2](project_euler/problem_001/sol2.py) * [Sol3](project_euler/problem_001/sol3.py) * [Sol4](project_euler/problem_001/sol4.py) * [Sol5](project_euler/problem_001/sol5.py) * [Sol6](project_euler/problem_001/sol6.py) * [Sol7](project_euler/problem_001/sol7.py) * Problem 002 * [Sol1](project_euler/problem_002/sol1.py) * [Sol2](project_euler/problem_002/sol2.py) * [Sol3](project_euler/problem_002/sol3.py) * [Sol4](project_euler/problem_002/sol4.py) * [Sol5](project_euler/problem_002/sol5.py) * Problem 003 * [Sol1](project_euler/problem_003/sol1.py) * [Sol2](project_euler/problem_003/sol2.py) * [Sol3](project_euler/problem_003/sol3.py) * Problem 004 * [Sol1](project_euler/problem_004/sol1.py) * [Sol2](project_euler/problem_004/sol2.py) * Problem 005 * [Sol1](project_euler/problem_005/sol1.py) * [Sol2](project_euler/problem_005/sol2.py) * Problem 006 * [Sol1](project_euler/problem_006/sol1.py) * [Sol2](project_euler/problem_006/sol2.py) * [Sol3](project_euler/problem_006/sol3.py) * [Sol4](project_euler/problem_006/sol4.py) * Problem 007 * [Sol1](project_euler/problem_007/sol1.py) * [Sol2](project_euler/problem_007/sol2.py) * [Sol3](project_euler/problem_007/sol3.py) * Problem 008 * [Sol1](project_euler/problem_008/sol1.py) * [Sol2](project_euler/problem_008/sol2.py) * [Sol3](project_euler/problem_008/sol3.py) * Problem 009 * [Sol1](project_euler/problem_009/sol1.py) * [Sol2](project_euler/problem_009/sol2.py) * [Sol3](project_euler/problem_009/sol3.py) * Problem 010 * [Sol1](project_euler/problem_010/sol1.py) * [Sol2](project_euler/problem_010/sol2.py) * 
[Sol3](project_euler/problem_010/sol3.py) * Problem 011 * [Sol1](project_euler/problem_011/sol1.py) * [Sol2](project_euler/problem_011/sol2.py) * Problem 012 * [Sol1](project_euler/problem_012/sol1.py) * [Sol2](project_euler/problem_012/sol2.py) * Problem 013 * [Sol1](project_euler/problem_013/sol1.py) * Problem 014 * [Sol1](project_euler/problem_014/sol1.py) * [Sol2](project_euler/problem_014/sol2.py) * Problem 015 * [Sol1](project_euler/problem_015/sol1.py) * Problem 016 * [Sol1](project_euler/problem_016/sol1.py) * [Sol2](project_euler/problem_016/sol2.py) * Problem 017 * [Sol1](project_euler/problem_017/sol1.py) * Problem 018 * [Solution](project_euler/problem_018/solution.py) * Problem 019 * [Sol1](project_euler/problem_019/sol1.py) * Problem 020 * [Sol1](project_euler/problem_020/sol1.py) * [Sol2](project_euler/problem_020/sol2.py) * [Sol3](project_euler/problem_020/sol3.py) * [Sol4](project_euler/problem_020/sol4.py) * Problem 021 * [Sol1](project_euler/problem_021/sol1.py) * Problem 022 * [Sol1](project_euler/problem_022/sol1.py) * [Sol2](project_euler/problem_022/sol2.py) * Problem 023 * [Sol1](project_euler/problem_023/sol1.py) * Problem 024 * [Sol1](project_euler/problem_024/sol1.py) * Problem 025 * [Sol1](project_euler/problem_025/sol1.py) * [Sol2](project_euler/problem_025/sol2.py) * [Sol3](project_euler/problem_025/sol3.py) * Problem 026 * [Sol1](project_euler/problem_026/sol1.py) * Problem 027 * [Sol1](project_euler/problem_027/sol1.py) * Problem 028 * [Sol1](project_euler/problem_028/sol1.py) * Problem 029 * [Sol1](project_euler/problem_029/sol1.py) * Problem 030 * [Sol1](project_euler/problem_030/sol1.py) * Problem 031 * [Sol1](project_euler/problem_031/sol1.py) * [Sol2](project_euler/problem_031/sol2.py) * Problem 032 * [Sol32](project_euler/problem_032/sol32.py) * Problem 033 * [Sol1](project_euler/problem_033/sol1.py) * Problem 034 * [Sol1](project_euler/problem_034/sol1.py) * Problem 035 * [Sol1](project_euler/problem_035/sol1.py) * Problem 036 * [Sol1](project_euler/problem_036/sol1.py) * Problem 037 * [Sol1](project_euler/problem_037/sol1.py) * Problem 038 * [Sol1](project_euler/problem_038/sol1.py) * Problem 039 * [Sol1](project_euler/problem_039/sol1.py) * Problem 040 * [Sol1](project_euler/problem_040/sol1.py) * Problem 041 * [Sol1](project_euler/problem_041/sol1.py) * Problem 042 * [Solution42](project_euler/problem_042/solution42.py) * Problem 043 * [Sol1](project_euler/problem_043/sol1.py) * Problem 044 * [Sol1](project_euler/problem_044/sol1.py) * Problem 045 * [Sol1](project_euler/problem_045/sol1.py) * Problem 046 * [Sol1](project_euler/problem_046/sol1.py) * Problem 047 * [Sol1](project_euler/problem_047/sol1.py) * Problem 048 * [Sol1](project_euler/problem_048/sol1.py) * Problem 049 * [Sol1](project_euler/problem_049/sol1.py) * Problem 050 * [Sol1](project_euler/problem_050/sol1.py) * Problem 051 * [Sol1](project_euler/problem_051/sol1.py) * Problem 052 * [Sol1](project_euler/problem_052/sol1.py) * Problem 053 * [Sol1](project_euler/problem_053/sol1.py) * Problem 054 * [Sol1](project_euler/problem_054/sol1.py) * [Test Poker Hand](project_euler/problem_054/test_poker_hand.py) * Problem 055 * [Sol1](project_euler/problem_055/sol1.py) * Problem 056 * [Sol1](project_euler/problem_056/sol1.py) * Problem 057 * [Sol1](project_euler/problem_057/sol1.py) * Problem 058 * [Sol1](project_euler/problem_058/sol1.py) * Problem 059 * [Sol1](project_euler/problem_059/sol1.py) * Problem 062 * [Sol1](project_euler/problem_062/sol1.py) * Problem 063 * 
[Sol1](project_euler/problem_063/sol1.py) * Problem 064 * [Sol1](project_euler/problem_064/sol1.py) * Problem 065 * [Sol1](project_euler/problem_065/sol1.py) * Problem 067 * [Sol1](project_euler/problem_067/sol1.py) * [Sol2](project_euler/problem_067/sol2.py) * Problem 068 * [Sol1](project_euler/problem_068/sol1.py) * Problem 069 * [Sol1](project_euler/problem_069/sol1.py) * Problem 070 * [Sol1](project_euler/problem_070/sol1.py) * Problem 071 * [Sol1](project_euler/problem_071/sol1.py) * Problem 072 * [Sol1](project_euler/problem_072/sol1.py) * [Sol2](project_euler/problem_072/sol2.py) * Problem 073 * [Sol1](project_euler/problem_073/sol1.py) * Problem 074 * [Sol1](project_euler/problem_074/sol1.py) * [Sol2](project_euler/problem_074/sol2.py) * Problem 075 * [Sol1](project_euler/problem_075/sol1.py) * Problem 076 * [Sol1](project_euler/problem_076/sol1.py) * Problem 077 * [Sol1](project_euler/problem_077/sol1.py) * Problem 078 * [Sol1](project_euler/problem_078/sol1.py) * Problem 080 * [Sol1](project_euler/problem_080/sol1.py) * Problem 081 * [Sol1](project_euler/problem_081/sol1.py) * Problem 085 * [Sol1](project_euler/problem_085/sol1.py) * Problem 086 * [Sol1](project_euler/problem_086/sol1.py) * Problem 087 * [Sol1](project_euler/problem_087/sol1.py) * Problem 089 * [Sol1](project_euler/problem_089/sol1.py) * Problem 091 * [Sol1](project_euler/problem_091/sol1.py) * Problem 092 * [Sol1](project_euler/problem_092/sol1.py) * Problem 097 * [Sol1](project_euler/problem_097/sol1.py) * Problem 099 * [Sol1](project_euler/problem_099/sol1.py) * Problem 101 * [Sol1](project_euler/problem_101/sol1.py) * Problem 102 * [Sol1](project_euler/problem_102/sol1.py) * Problem 104 * [Sol1](project_euler/problem_104/sol1.py) * Problem 107 * [Sol1](project_euler/problem_107/sol1.py) * Problem 109 * [Sol1](project_euler/problem_109/sol1.py) * Problem 112 * [Sol1](project_euler/problem_112/sol1.py) * Problem 113 * [Sol1](project_euler/problem_113/sol1.py) * Problem 114 * [Sol1](project_euler/problem_114/sol1.py) * Problem 115 * [Sol1](project_euler/problem_115/sol1.py) * Problem 116 * [Sol1](project_euler/problem_116/sol1.py) * Problem 119 * [Sol1](project_euler/problem_119/sol1.py) * Problem 120 * [Sol1](project_euler/problem_120/sol1.py) * Problem 121 * [Sol1](project_euler/problem_121/sol1.py) * Problem 123 * [Sol1](project_euler/problem_123/sol1.py) * Problem 125 * [Sol1](project_euler/problem_125/sol1.py) * Problem 129 * [Sol1](project_euler/problem_129/sol1.py) * Problem 135 * [Sol1](project_euler/problem_135/sol1.py) * Problem 144 * [Sol1](project_euler/problem_144/sol1.py) * Problem 145 * [Sol1](project_euler/problem_145/sol1.py) * Problem 173 * [Sol1](project_euler/problem_173/sol1.py) * Problem 174 * [Sol1](project_euler/problem_174/sol1.py) * Problem 180 * [Sol1](project_euler/problem_180/sol1.py) * Problem 188 * [Sol1](project_euler/problem_188/sol1.py) * Problem 191 * [Sol1](project_euler/problem_191/sol1.py) * Problem 203 * [Sol1](project_euler/problem_203/sol1.py) * Problem 205 * [Sol1](project_euler/problem_205/sol1.py) * Problem 206 * [Sol1](project_euler/problem_206/sol1.py) * Problem 207 * [Sol1](project_euler/problem_207/sol1.py) * Problem 234 * [Sol1](project_euler/problem_234/sol1.py) * Problem 301 * [Sol1](project_euler/problem_301/sol1.py) * Problem 493 * [Sol1](project_euler/problem_493/sol1.py) * Problem 551 * [Sol1](project_euler/problem_551/sol1.py) * Problem 587 * [Sol1](project_euler/problem_587/sol1.py) * Problem 686 * [Sol1](project_euler/problem_686/sol1.py) ## Quantum * 
[Deutsch Jozsa](quantum/deutsch_jozsa.py) * [Half Adder](quantum/half_adder.py) * [Not Gate](quantum/not_gate.py) * [Q Full Adder](quantum/q_full_adder.py) * [Quantum Entanglement](quantum/quantum_entanglement.py) * [Ripple Adder Classic](quantum/ripple_adder_classic.py) * [Single Qubit Measure](quantum/single_qubit_measure.py) * [Superdense Coding](quantum/superdense_coding.py) ## Scheduling * [First Come First Served](scheduling/first_come_first_served.py) * [Highest Response Ratio Next](scheduling/highest_response_ratio_next.py) * [Job Sequencing With Deadline](scheduling/job_sequencing_with_deadline.py) * [Multi Level Feedback Queue](scheduling/multi_level_feedback_queue.py) * [Non Preemptive Shortest Job First](scheduling/non_preemptive_shortest_job_first.py) * [Round Robin](scheduling/round_robin.py) * [Shortest Job First](scheduling/shortest_job_first.py) ## Searches * [Binary Search](searches/binary_search.py) * [Binary Tree Traversal](searches/binary_tree_traversal.py) * [Double Linear Search](searches/double_linear_search.py) * [Double Linear Search Recursion](searches/double_linear_search_recursion.py) * [Fibonacci Search](searches/fibonacci_search.py) * [Hill Climbing](searches/hill_climbing.py) * [Interpolation Search](searches/interpolation_search.py) * [Jump Search](searches/jump_search.py) * [Linear Search](searches/linear_search.py) * [Quick Select](searches/quick_select.py) * [Sentinel Linear Search](searches/sentinel_linear_search.py) * [Simple Binary Search](searches/simple_binary_search.py) * [Simulated Annealing](searches/simulated_annealing.py) * [Tabu Search](searches/tabu_search.py) * [Ternary Search](searches/ternary_search.py) ## Sorts * [Bead Sort](sorts/bead_sort.py) * [Bitonic Sort](sorts/bitonic_sort.py) * [Bogo Sort](sorts/bogo_sort.py) * [Bubble Sort](sorts/bubble_sort.py) * [Bucket Sort](sorts/bucket_sort.py) * [Circle Sort](sorts/circle_sort.py) * [Cocktail Shaker Sort](sorts/cocktail_shaker_sort.py) * [Comb Sort](sorts/comb_sort.py) * [Counting Sort](sorts/counting_sort.py) * [Cycle Sort](sorts/cycle_sort.py) * [Double Sort](sorts/double_sort.py) * [Dutch National Flag Sort](sorts/dutch_national_flag_sort.py) * [Exchange Sort](sorts/exchange_sort.py) * [External Sort](sorts/external_sort.py) * [Gnome Sort](sorts/gnome_sort.py) * [Heap Sort](sorts/heap_sort.py) * [Insertion Sort](sorts/insertion_sort.py) * [Intro Sort](sorts/intro_sort.py) * [Iterative Merge Sort](sorts/iterative_merge_sort.py) * [Merge Insertion Sort](sorts/merge_insertion_sort.py) * [Merge Sort](sorts/merge_sort.py) * [Msd Radix Sort](sorts/msd_radix_sort.py) * [Natural Sort](sorts/natural_sort.py) * [Odd Even Sort](sorts/odd_even_sort.py) * [Odd Even Transposition Parallel](sorts/odd_even_transposition_parallel.py) * [Odd Even Transposition Single Threaded](sorts/odd_even_transposition_single_threaded.py) * [Pancake Sort](sorts/pancake_sort.py) * [Patience Sort](sorts/patience_sort.py) * [Pigeon Sort](sorts/pigeon_sort.py) * [Pigeonhole Sort](sorts/pigeonhole_sort.py) * [Quick Sort](sorts/quick_sort.py) * [Quick Sort 3 Partition](sorts/quick_sort_3_partition.py) * [Radix Sort](sorts/radix_sort.py) * [Random Normal Distribution Quicksort](sorts/random_normal_distribution_quicksort.py) * [Random Pivot Quick Sort](sorts/random_pivot_quick_sort.py) * [Recursive Bubble Sort](sorts/recursive_bubble_sort.py) * [Recursive Insertion Sort](sorts/recursive_insertion_sort.py) * [Recursive Mergesort Array](sorts/recursive_mergesort_array.py) * [Recursive Quick Sort](sorts/recursive_quick_sort.py) * 
[Selection Sort](sorts/selection_sort.py) * [Shell Sort](sorts/shell_sort.py) * [Shrink Shell Sort](sorts/shrink_shell_sort.py) * [Slowsort](sorts/slowsort.py) * [Stooge Sort](sorts/stooge_sort.py) * [Strand Sort](sorts/strand_sort.py) * [Tim Sort](sorts/tim_sort.py) * [Topological Sort](sorts/topological_sort.py) * [Tree Sort](sorts/tree_sort.py) * [Unknown Sort](sorts/unknown_sort.py) * [Wiggle Sort](sorts/wiggle_sort.py) ## Strings * [Aho Corasick](strings/aho_corasick.py) * [Alternative String Arrange](strings/alternative_string_arrange.py) * [Anagrams](strings/anagrams.py) * [Autocomplete Using Trie](strings/autocomplete_using_trie.py) * [Barcode Validator](strings/barcode_validator.py) * [Boyer Moore Search](strings/boyer_moore_search.py) * [Can String Be Rearranged As Palindrome](strings/can_string_be_rearranged_as_palindrome.py) * [Capitalize](strings/capitalize.py) * [Check Anagrams](strings/check_anagrams.py) * [Credit Card Validator](strings/credit_card_validator.py) * [Detecting English Programmatically](strings/detecting_english_programmatically.py) * [Dna](strings/dna.py) * [Frequency Finder](strings/frequency_finder.py) * [Hamming Distance](strings/hamming_distance.py) * [Indian Phone Validator](strings/indian_phone_validator.py) * [Is Contains Unique Chars](strings/is_contains_unique_chars.py) * [Is Isogram](strings/is_isogram.py) * [Is Palindrome](strings/is_palindrome.py) * [Is Pangram](strings/is_pangram.py) * [Is Spain National Id](strings/is_spain_national_id.py) * [Jaro Winkler](strings/jaro_winkler.py) * [Join](strings/join.py) * [Knuth Morris Pratt](strings/knuth_morris_pratt.py) * [Levenshtein Distance](strings/levenshtein_distance.py) * [Lower](strings/lower.py) * [Manacher](strings/manacher.py) * [Min Cost String Conversion](strings/min_cost_string_conversion.py) * [Naive String Search](strings/naive_string_search.py) * [Ngram](strings/ngram.py) * [Palindrome](strings/palindrome.py) * [Prefix Function](strings/prefix_function.py) * [Rabin Karp](strings/rabin_karp.py) * [Remove Duplicate](strings/remove_duplicate.py) * [Reverse Letters](strings/reverse_letters.py) * [Reverse Long Words](strings/reverse_long_words.py) * [Reverse Words](strings/reverse_words.py) * [Snake Case To Camel Pascal Case](strings/snake_case_to_camel_pascal_case.py) * [Split](strings/split.py) * [Upper](strings/upper.py) * [Wave](strings/wave.py) * [Wildcard Pattern Matching](strings/wildcard_pattern_matching.py) * [Word Occurrence](strings/word_occurrence.py) * [Word Patterns](strings/word_patterns.py) * [Z Function](strings/z_function.py) ## Web Programming * [Co2 Emission](web_programming/co2_emission.py) * [Covid Stats Via Xpath](web_programming/covid_stats_via_xpath.py) * [Crawl Google Results](web_programming/crawl_google_results.py) * [Crawl Google Scholar Citation](web_programming/crawl_google_scholar_citation.py) * [Currency Converter](web_programming/currency_converter.py) * [Current Stock Price](web_programming/current_stock_price.py) * [Current Weather](web_programming/current_weather.py) * [Daily Horoscope](web_programming/daily_horoscope.py) * [Download Images From Google Query](web_programming/download_images_from_google_query.py) * [Emails From Url](web_programming/emails_from_url.py) * [Fetch Anime And Play](web_programming/fetch_anime_and_play.py) * [Fetch Bbc News](web_programming/fetch_bbc_news.py) * [Fetch Github Info](web_programming/fetch_github_info.py) * [Fetch Jobs](web_programming/fetch_jobs.py) * [Fetch Quotes](web_programming/fetch_quotes.py) * [Fetch Well Rx 
Price](web_programming/fetch_well_rx_price.py) * [Get Amazon Product Data](web_programming/get_amazon_product_data.py) * [Get Imdb Top 250 Movies Csv](web_programming/get_imdb_top_250_movies_csv.py) * [Get Imdbtop](web_programming/get_imdbtop.py) * [Get Top Billioners](web_programming/get_top_billioners.py) * [Get Top Hn Posts](web_programming/get_top_hn_posts.py) * [Get User Tweets](web_programming/get_user_tweets.py) * [Giphy](web_programming/giphy.py) * [Instagram Crawler](web_programming/instagram_crawler.py) * [Instagram Pic](web_programming/instagram_pic.py) * [Instagram Video](web_programming/instagram_video.py) * [Nasa Data](web_programming/nasa_data.py) * [Open Google Results](web_programming/open_google_results.py) * [Random Anime Character](web_programming/random_anime_character.py) * [Recaptcha Verification](web_programming/recaptcha_verification.py) * [Reddit](web_programming/reddit.py) * [Search Books By Isbn](web_programming/search_books_by_isbn.py) * [Slack Message](web_programming/slack_message.py) * [Test Fetch Github Info](web_programming/test_fetch_github_info.py) * [World Covid19 Stats](web_programming/world_covid19_stats.py)
## Arithmetic Analysis * [Bisection](arithmetic_analysis/bisection.py) * [Gaussian Elimination](arithmetic_analysis/gaussian_elimination.py) * [In Static Equilibrium](arithmetic_analysis/in_static_equilibrium.py) * [Intersection](arithmetic_analysis/intersection.py) * [Jacobi Iteration Method](arithmetic_analysis/jacobi_iteration_method.py) * [Lu Decomposition](arithmetic_analysis/lu_decomposition.py) * [Newton Forward Interpolation](arithmetic_analysis/newton_forward_interpolation.py) * [Newton Method](arithmetic_analysis/newton_method.py) * [Newton Raphson](arithmetic_analysis/newton_raphson.py) * [Newton Raphson New](arithmetic_analysis/newton_raphson_new.py) * [Secant Method](arithmetic_analysis/secant_method.py) ## Audio Filters * [Butterworth Filter](audio_filters/butterworth_filter.py) * [Equal Loudness Filter](audio_filters/equal_loudness_filter.py) * [Iir Filter](audio_filters/iir_filter.py) * [Show Response](audio_filters/show_response.py) ## Backtracking * [All Combinations](backtracking/all_combinations.py) * [All Permutations](backtracking/all_permutations.py) * [All Subsequences](backtracking/all_subsequences.py) * [Coloring](backtracking/coloring.py) * [Combination Sum](backtracking/combination_sum.py) * [Hamiltonian Cycle](backtracking/hamiltonian_cycle.py) * [Knight Tour](backtracking/knight_tour.py) * [Minimax](backtracking/minimax.py) * [Minmax](backtracking/minmax.py) * [N Queens](backtracking/n_queens.py) * [N Queens Math](backtracking/n_queens_math.py) * [Rat In Maze](backtracking/rat_in_maze.py) * [Sudoku](backtracking/sudoku.py) * [Sum Of Subsets](backtracking/sum_of_subsets.py) ## Bit Manipulation * [Binary And Operator](bit_manipulation/binary_and_operator.py) * [Binary Count Setbits](bit_manipulation/binary_count_setbits.py) * [Binary Count Trailing Zeros](bit_manipulation/binary_count_trailing_zeros.py) * [Binary Or Operator](bit_manipulation/binary_or_operator.py) * [Binary Shifts](bit_manipulation/binary_shifts.py) * [Binary Twos Complement](bit_manipulation/binary_twos_complement.py) * [Binary Xor Operator](bit_manipulation/binary_xor_operator.py) * [Count 1S Brian Kernighan Method](bit_manipulation/count_1s_brian_kernighan_method.py) * [Count Number Of One Bits](bit_manipulation/count_number_of_one_bits.py) * [Gray Code Sequence](bit_manipulation/gray_code_sequence.py) * [Highest Set Bit](bit_manipulation/highest_set_bit.py) * [Is Even](bit_manipulation/is_even.py) * [Reverse Bits](bit_manipulation/reverse_bits.py) * [Single Bit Manipulation Operations](bit_manipulation/single_bit_manipulation_operations.py) ## Blockchain * [Chinese Remainder Theorem](blockchain/chinese_remainder_theorem.py) * [Diophantine Equation](blockchain/diophantine_equation.py) * [Modular Division](blockchain/modular_division.py) ## Boolean Algebra * [And Gate](boolean_algebra/and_gate.py) * [Nand Gate](boolean_algebra/nand_gate.py) * [Norgate](boolean_algebra/norgate.py) * [Not Gate](boolean_algebra/not_gate.py) * [Or Gate](boolean_algebra/or_gate.py) * [Quine Mc Cluskey](boolean_algebra/quine_mc_cluskey.py) * [Xnor Gate](boolean_algebra/xnor_gate.py) * [Xor Gate](boolean_algebra/xor_gate.py) ## Cellular Automata * [Conways Game Of Life](cellular_automata/conways_game_of_life.py) * [Game Of Life](cellular_automata/game_of_life.py) * [Nagel Schrekenberg](cellular_automata/nagel_schrekenberg.py) * [One Dimensional](cellular_automata/one_dimensional.py) ## Ciphers * [A1Z26](ciphers/a1z26.py) * [Affine Cipher](ciphers/affine_cipher.py) * [Atbash](ciphers/atbash.py) * [Baconian 
Cipher](ciphers/baconian_cipher.py) * [Base16](ciphers/base16.py) * [Base32](ciphers/base32.py) * [Base64](ciphers/base64.py) * [Base85](ciphers/base85.py) * [Beaufort Cipher](ciphers/beaufort_cipher.py) * [Bifid](ciphers/bifid.py) * [Brute Force Caesar Cipher](ciphers/brute_force_caesar_cipher.py) * [Caesar Cipher](ciphers/caesar_cipher.py) * [Cryptomath Module](ciphers/cryptomath_module.py) * [Decrypt Caesar With Chi Squared](ciphers/decrypt_caesar_with_chi_squared.py) * [Deterministic Miller Rabin](ciphers/deterministic_miller_rabin.py) * [Diffie](ciphers/diffie.py) * [Diffie Hellman](ciphers/diffie_hellman.py) * [Elgamal Key Generator](ciphers/elgamal_key_generator.py) * [Enigma Machine2](ciphers/enigma_machine2.py) * [Hill Cipher](ciphers/hill_cipher.py) * [Mixed Keyword Cypher](ciphers/mixed_keyword_cypher.py) * [Mono Alphabetic Ciphers](ciphers/mono_alphabetic_ciphers.py) * [Morse Code](ciphers/morse_code.py) * [Onepad Cipher](ciphers/onepad_cipher.py) * [Playfair Cipher](ciphers/playfair_cipher.py) * [Polybius](ciphers/polybius.py) * [Porta Cipher](ciphers/porta_cipher.py) * [Rabin Miller](ciphers/rabin_miller.py) * [Rail Fence Cipher](ciphers/rail_fence_cipher.py) * [Rot13](ciphers/rot13.py) * [Rsa Cipher](ciphers/rsa_cipher.py) * [Rsa Factorization](ciphers/rsa_factorization.py) * [Rsa Key Generator](ciphers/rsa_key_generator.py) * [Shuffled Shift Cipher](ciphers/shuffled_shift_cipher.py) * [Simple Keyword Cypher](ciphers/simple_keyword_cypher.py) * [Simple Substitution Cipher](ciphers/simple_substitution_cipher.py) * [Trafid Cipher](ciphers/trafid_cipher.py) * [Transposition Cipher](ciphers/transposition_cipher.py) * [Transposition Cipher Encrypt Decrypt File](ciphers/transposition_cipher_encrypt_decrypt_file.py) * [Vigenere Cipher](ciphers/vigenere_cipher.py) * [Xor Cipher](ciphers/xor_cipher.py) ## Compression * [Burrows Wheeler](compression/burrows_wheeler.py) * [Huffman](compression/huffman.py) * [Lempel Ziv](compression/lempel_ziv.py) * [Lempel Ziv Decompress](compression/lempel_ziv_decompress.py) * [Peak Signal To Noise Ratio](compression/peak_signal_to_noise_ratio.py) * [Run Length Encoding](compression/run_length_encoding.py) ## Computer Vision * [Cnn Classification](computer_vision/cnn_classification.py) * [Flip Augmentation](computer_vision/flip_augmentation.py) * [Harris Corner](computer_vision/harris_corner.py) * [Horn Schunck](computer_vision/horn_schunck.py) * [Mean Threshold](computer_vision/mean_threshold.py) * [Mosaic Augmentation](computer_vision/mosaic_augmentation.py) * [Pooling Functions](computer_vision/pooling_functions.py) ## Conversions * [Astronomical Length Scale Conversion](conversions/astronomical_length_scale_conversion.py) * [Binary To Decimal](conversions/binary_to_decimal.py) * [Binary To Hexadecimal](conversions/binary_to_hexadecimal.py) * [Binary To Octal](conversions/binary_to_octal.py) * [Decimal To Any](conversions/decimal_to_any.py) * [Decimal To Binary](conversions/decimal_to_binary.py) * [Decimal To Binary Recursion](conversions/decimal_to_binary_recursion.py) * [Decimal To Hexadecimal](conversions/decimal_to_hexadecimal.py) * [Decimal To Octal](conversions/decimal_to_octal.py) * [Excel Title To Column](conversions/excel_title_to_column.py) * [Hex To Bin](conversions/hex_to_bin.py) * [Hexadecimal To Decimal](conversions/hexadecimal_to_decimal.py) * [Length Conversion](conversions/length_conversion.py) * [Molecular Chemistry](conversions/molecular_chemistry.py) * [Octal To Decimal](conversions/octal_to_decimal.py) * [Prefix 
Conversions](conversions/prefix_conversions.py) * [Prefix Conversions String](conversions/prefix_conversions_string.py) * [Pressure Conversions](conversions/pressure_conversions.py) * [Rgb Hsv Conversion](conversions/rgb_hsv_conversion.py) * [Roman Numerals](conversions/roman_numerals.py) * [Speed Conversions](conversions/speed_conversions.py) * [Temperature Conversions](conversions/temperature_conversions.py) * [Volume Conversions](conversions/volume_conversions.py) * [Weight Conversion](conversions/weight_conversion.py) ## Data Structures * Binary Tree * [Avl Tree](data_structures/binary_tree/avl_tree.py) * [Basic Binary Tree](data_structures/binary_tree/basic_binary_tree.py) * [Binary Search Tree](data_structures/binary_tree/binary_search_tree.py) * [Binary Search Tree Recursive](data_structures/binary_tree/binary_search_tree_recursive.py) * [Binary Tree Mirror](data_structures/binary_tree/binary_tree_mirror.py) * [Binary Tree Node Sum](data_structures/binary_tree/binary_tree_node_sum.py) * [Binary Tree Path Sum](data_structures/binary_tree/binary_tree_path_sum.py) * [Binary Tree Traversals](data_structures/binary_tree/binary_tree_traversals.py) * [Diff Views Of Binary Tree](data_structures/binary_tree/diff_views_of_binary_tree.py) * [Fenwick Tree](data_structures/binary_tree/fenwick_tree.py) * [Inorder Tree Traversal 2022](data_structures/binary_tree/inorder_tree_traversal_2022.py) * [Lazy Segment Tree](data_structures/binary_tree/lazy_segment_tree.py) * [Lowest Common Ancestor](data_structures/binary_tree/lowest_common_ancestor.py) * [Maximum Fenwick Tree](data_structures/binary_tree/maximum_fenwick_tree.py) * [Merge Two Binary Trees](data_structures/binary_tree/merge_two_binary_trees.py) * [Non Recursive Segment Tree](data_structures/binary_tree/non_recursive_segment_tree.py) * [Number Of Possible Binary Trees](data_structures/binary_tree/number_of_possible_binary_trees.py) * [Red Black Tree](data_structures/binary_tree/red_black_tree.py) * [Segment Tree](data_structures/binary_tree/segment_tree.py) * [Segment Tree Other](data_structures/binary_tree/segment_tree_other.py) * [Treap](data_structures/binary_tree/treap.py) * [Wavelet Tree](data_structures/binary_tree/wavelet_tree.py) * Disjoint Set * [Alternate Disjoint Set](data_structures/disjoint_set/alternate_disjoint_set.py) * [Disjoint Set](data_structures/disjoint_set/disjoint_set.py) * Hashing * [Double Hash](data_structures/hashing/double_hash.py) * [Hash Table](data_structures/hashing/hash_table.py) * [Hash Table With Linked List](data_structures/hashing/hash_table_with_linked_list.py) * Number Theory * [Prime Numbers](data_structures/hashing/number_theory/prime_numbers.py) * [Quadratic Probing](data_structures/hashing/quadratic_probing.py) * Heap * [Binomial Heap](data_structures/heap/binomial_heap.py) * [Heap](data_structures/heap/heap.py) * [Heap Generic](data_structures/heap/heap_generic.py) * [Max Heap](data_structures/heap/max_heap.py) * [Min Heap](data_structures/heap/min_heap.py) * [Randomized Heap](data_structures/heap/randomized_heap.py) * [Skew Heap](data_structures/heap/skew_heap.py) * Linked List * [Circular Linked List](data_structures/linked_list/circular_linked_list.py) * [Deque Doubly](data_structures/linked_list/deque_doubly.py) * [Doubly Linked List](data_structures/linked_list/doubly_linked_list.py) * [Doubly Linked List Two](data_structures/linked_list/doubly_linked_list_two.py) * [From Sequence](data_structures/linked_list/from_sequence.py) * [Has Loop](data_structures/linked_list/has_loop.py) * [Is 
Palindrome](data_structures/linked_list/is_palindrome.py) * [Merge Two Lists](data_structures/linked_list/merge_two_lists.py) * [Middle Element Of Linked List](data_structures/linked_list/middle_element_of_linked_list.py) * [Print Reverse](data_structures/linked_list/print_reverse.py) * [Singly Linked List](data_structures/linked_list/singly_linked_list.py) * [Skip List](data_structures/linked_list/skip_list.py) * [Swap Nodes](data_structures/linked_list/swap_nodes.py) * Queue * [Circular Queue](data_structures/queue/circular_queue.py) * [Circular Queue Linked List](data_structures/queue/circular_queue_linked_list.py) * [Double Ended Queue](data_structures/queue/double_ended_queue.py) * [Linked Queue](data_structures/queue/linked_queue.py) * [Priority Queue Using List](data_structures/queue/priority_queue_using_list.py) * [Queue On List](data_structures/queue/queue_on_list.py) * [Queue On Pseudo Stack](data_structures/queue/queue_on_pseudo_stack.py) * Stacks * [Balanced Parentheses](data_structures/stacks/balanced_parentheses.py) * [Dijkstras Two Stack Algorithm](data_structures/stacks/dijkstras_two_stack_algorithm.py) * [Evaluate Postfix Notations](data_structures/stacks/evaluate_postfix_notations.py) * [Infix To Postfix Conversion](data_structures/stacks/infix_to_postfix_conversion.py) * [Infix To Prefix Conversion](data_structures/stacks/infix_to_prefix_conversion.py) * [Next Greater Element](data_structures/stacks/next_greater_element.py) * [Postfix Evaluation](data_structures/stacks/postfix_evaluation.py) * [Prefix Evaluation](data_structures/stacks/prefix_evaluation.py) * [Stack](data_structures/stacks/stack.py) * [Stack With Doubly Linked List](data_structures/stacks/stack_with_doubly_linked_list.py) * [Stack With Singly Linked List](data_structures/stacks/stack_with_singly_linked_list.py) * [Stock Span Problem](data_structures/stacks/stock_span_problem.py) * Trie * [Trie](data_structures/trie/trie.py) ## Digital Image Processing * [Change Brightness](digital_image_processing/change_brightness.py) * [Change Contrast](digital_image_processing/change_contrast.py) * [Convert To Negative](digital_image_processing/convert_to_negative.py) * Dithering * [Burkes](digital_image_processing/dithering/burkes.py) * Edge Detection * [Canny](digital_image_processing/edge_detection/canny.py) * Filters * [Bilateral Filter](digital_image_processing/filters/bilateral_filter.py) * [Convolve](digital_image_processing/filters/convolve.py) * [Gabor Filter](digital_image_processing/filters/gabor_filter.py) * [Gaussian Filter](digital_image_processing/filters/gaussian_filter.py) * [Local Binary Pattern](digital_image_processing/filters/local_binary_pattern.py) * [Median Filter](digital_image_processing/filters/median_filter.py) * [Sobel Filter](digital_image_processing/filters/sobel_filter.py) * Histogram Equalization * [Histogram Stretch](digital_image_processing/histogram_equalization/histogram_stretch.py) * [Index Calculation](digital_image_processing/index_calculation.py) * Morphological Operations * [Dilation Operation](digital_image_processing/morphological_operations/dilation_operation.py) * [Erosion Operation](digital_image_processing/morphological_operations/erosion_operation.py) * Resize * [Resize](digital_image_processing/resize/resize.py) * Rotation * [Rotation](digital_image_processing/rotation/rotation.py) * [Sepia](digital_image_processing/sepia.py) * [Test Digital Image Processing](digital_image_processing/test_digital_image_processing.py) ## Divide And Conquer * [Closest Pair Of 
Points](divide_and_conquer/closest_pair_of_points.py) * [Convex Hull](divide_and_conquer/convex_hull.py) * [Heaps Algorithm](divide_and_conquer/heaps_algorithm.py) * [Heaps Algorithm Iterative](divide_and_conquer/heaps_algorithm_iterative.py) * [Inversions](divide_and_conquer/inversions.py) * [Kth Order Statistic](divide_and_conquer/kth_order_statistic.py) * [Max Difference Pair](divide_and_conquer/max_difference_pair.py) * [Max Subarray Sum](divide_and_conquer/max_subarray_sum.py) * [Mergesort](divide_and_conquer/mergesort.py) * [Peak](divide_and_conquer/peak.py) * [Power](divide_and_conquer/power.py) * [Strassen Matrix Multiplication](divide_and_conquer/strassen_matrix_multiplication.py) ## Dynamic Programming * [Abbreviation](dynamic_programming/abbreviation.py) * [All Construct](dynamic_programming/all_construct.py) * [Bitmask](dynamic_programming/bitmask.py) * [Catalan Numbers](dynamic_programming/catalan_numbers.py) * [Climbing Stairs](dynamic_programming/climbing_stairs.py) * [Combination Sum Iv](dynamic_programming/combination_sum_iv.py) * [Edit Distance](dynamic_programming/edit_distance.py) * [Factorial](dynamic_programming/factorial.py) * [Fast Fibonacci](dynamic_programming/fast_fibonacci.py) * [Fibonacci](dynamic_programming/fibonacci.py) * [Floyd Warshall](dynamic_programming/floyd_warshall.py) * [Integer Partition](dynamic_programming/integer_partition.py) * [Iterating Through Submasks](dynamic_programming/iterating_through_submasks.py) * [Knapsack](dynamic_programming/knapsack.py) * [Longest Common Subsequence](dynamic_programming/longest_common_subsequence.py) * [Longest Common Substring](dynamic_programming/longest_common_substring.py) * [Longest Increasing Subsequence](dynamic_programming/longest_increasing_subsequence.py) * [Longest Increasing Subsequence O(Nlogn)](dynamic_programming/longest_increasing_subsequence_o(nlogn).py) * [Longest Sub Array](dynamic_programming/longest_sub_array.py) * [Matrix Chain Order](dynamic_programming/matrix_chain_order.py) * [Max Non Adjacent Sum](dynamic_programming/max_non_adjacent_sum.py) * [Max Sub Array](dynamic_programming/max_sub_array.py) * [Max Sum Contiguous Subsequence](dynamic_programming/max_sum_contiguous_subsequence.py) * [Minimum Coin Change](dynamic_programming/minimum_coin_change.py) * [Minimum Cost Path](dynamic_programming/minimum_cost_path.py) * [Minimum Partition](dynamic_programming/minimum_partition.py) * [Minimum Squares To Represent A Number](dynamic_programming/minimum_squares_to_represent_a_number.py) * [Minimum Steps To One](dynamic_programming/minimum_steps_to_one.py) * [Optimal Binary Search Tree](dynamic_programming/optimal_binary_search_tree.py) * [Rod Cutting](dynamic_programming/rod_cutting.py) * [Subset Generation](dynamic_programming/subset_generation.py) * [Sum Of Subset](dynamic_programming/sum_of_subset.py) ## Electronics * [Carrier Concentration](electronics/carrier_concentration.py) * [Coulombs Law](electronics/coulombs_law.py) * [Electric Power](electronics/electric_power.py) * [Ohms Law](electronics/ohms_law.py) ## File Transfer * [Receive File](file_transfer/receive_file.py) * [Send File](file_transfer/send_file.py) * Tests * [Test Send File](file_transfer/tests/test_send_file.py) ## Financial * [Equated Monthly Installments](financial/equated_monthly_installments.py) * [Interest](financial/interest.py) * [Price Plus Tax](financial/price_plus_tax.py) ## Fractals * [Julia Sets](fractals/julia_sets.py) * [Koch Snowflake](fractals/koch_snowflake.py) * [Mandelbrot](fractals/mandelbrot.py) * 
[Sierpinski Triangle](fractals/sierpinski_triangle.py) ## Fuzzy Logic * [Fuzzy Operations](fuzzy_logic/fuzzy_operations.py) ## Genetic Algorithm * [Basic String](genetic_algorithm/basic_string.py) ## Geodesy * [Haversine Distance](geodesy/haversine_distance.py) * [Lamberts Ellipsoidal Distance](geodesy/lamberts_ellipsoidal_distance.py) ## Graphics * [Bezier Curve](graphics/bezier_curve.py) * [Vector3 For 2D Rendering](graphics/vector3_for_2d_rendering.py) ## Graphs * [A Star](graphs/a_star.py) * [Articulation Points](graphs/articulation_points.py) * [Basic Graphs](graphs/basic_graphs.py) * [Bellman Ford](graphs/bellman_ford.py) * [Bidirectional A Star](graphs/bidirectional_a_star.py) * [Bidirectional Breadth First Search](graphs/bidirectional_breadth_first_search.py) * [Boruvka](graphs/boruvka.py) * [Breadth First Search](graphs/breadth_first_search.py) * [Breadth First Search 2](graphs/breadth_first_search_2.py) * [Breadth First Search Shortest Path](graphs/breadth_first_search_shortest_path.py) * [Breadth First Search Shortest Path 2](graphs/breadth_first_search_shortest_path_2.py) * [Breadth First Search Zero One Shortest Path](graphs/breadth_first_search_zero_one_shortest_path.py) * [Check Bipartite Graph Bfs](graphs/check_bipartite_graph_bfs.py) * [Check Bipartite Graph Dfs](graphs/check_bipartite_graph_dfs.py) * [Check Cycle](graphs/check_cycle.py) * [Connected Components](graphs/connected_components.py) * [Depth First Search](graphs/depth_first_search.py) * [Depth First Search 2](graphs/depth_first_search_2.py) * [Dijkstra](graphs/dijkstra.py) * [Dijkstra 2](graphs/dijkstra_2.py) * [Dijkstra Algorithm](graphs/dijkstra_algorithm.py) * [Dijkstra Alternate](graphs/dijkstra_alternate.py) * [Dinic](graphs/dinic.py) * [Directed And Undirected (Weighted) Graph](graphs/directed_and_undirected_(weighted)_graph.py) * [Edmonds Karp Multiple Source And Sink](graphs/edmonds_karp_multiple_source_and_sink.py) * [Eulerian Path And Circuit For Undirected Graph](graphs/eulerian_path_and_circuit_for_undirected_graph.py) * [Even Tree](graphs/even_tree.py) * [Finding Bridges](graphs/finding_bridges.py) * [Frequent Pattern Graph Miner](graphs/frequent_pattern_graph_miner.py) * [G Topological Sort](graphs/g_topological_sort.py) * [Gale Shapley Bigraph](graphs/gale_shapley_bigraph.py) * [Graph List](graphs/graph_list.py) * [Graph Matrix](graphs/graph_matrix.py) * [Graphs Floyd Warshall](graphs/graphs_floyd_warshall.py) * [Greedy Best First](graphs/greedy_best_first.py) * [Greedy Min Vertex Cover](graphs/greedy_min_vertex_cover.py) * [Kahns Algorithm Long](graphs/kahns_algorithm_long.py) * [Kahns Algorithm Topo](graphs/kahns_algorithm_topo.py) * [Karger](graphs/karger.py) * [Markov Chain](graphs/markov_chain.py) * [Matching Min Vertex Cover](graphs/matching_min_vertex_cover.py) * [Minimum Path Sum](graphs/minimum_path_sum.py) * [Minimum Spanning Tree Boruvka](graphs/minimum_spanning_tree_boruvka.py) * [Minimum Spanning Tree Kruskal](graphs/minimum_spanning_tree_kruskal.py) * [Minimum Spanning Tree Kruskal2](graphs/minimum_spanning_tree_kruskal2.py) * [Minimum Spanning Tree Prims](graphs/minimum_spanning_tree_prims.py) * [Minimum Spanning Tree Prims2](graphs/minimum_spanning_tree_prims2.py) * [Multi Heuristic Astar](graphs/multi_heuristic_astar.py) * [Page Rank](graphs/page_rank.py) * [Prim](graphs/prim.py) * [Random Graph Generator](graphs/random_graph_generator.py) * [Scc Kosaraju](graphs/scc_kosaraju.py) * [Strongly Connected Components](graphs/strongly_connected_components.py) * [Tarjans 
Scc](graphs/tarjans_scc.py) * Tests * [Test Min Spanning Tree Kruskal](graphs/tests/test_min_spanning_tree_kruskal.py) * [Test Min Spanning Tree Prim](graphs/tests/test_min_spanning_tree_prim.py) ## Greedy Methods * [Fractional Knapsack](greedy_methods/fractional_knapsack.py) * [Fractional Knapsack 2](greedy_methods/fractional_knapsack_2.py) * [Optimal Merge Pattern](greedy_methods/optimal_merge_pattern.py) ## Hashes * [Adler32](hashes/adler32.py) * [Chaos Machine](hashes/chaos_machine.py) * [Djb2](hashes/djb2.py) * [Enigma Machine](hashes/enigma_machine.py) * [Hamming Code](hashes/hamming_code.py) * [Luhn](hashes/luhn.py) * [Md5](hashes/md5.py) * [Sdbm](hashes/sdbm.py) * [Sha1](hashes/sha1.py) * [Sha256](hashes/sha256.py) ## Knapsack * [Greedy Knapsack](knapsack/greedy_knapsack.py) * [Knapsack](knapsack/knapsack.py) * Tests * [Test Greedy Knapsack](knapsack/tests/test_greedy_knapsack.py) * [Test Knapsack](knapsack/tests/test_knapsack.py) ## Linear Algebra * Src * [Conjugate Gradient](linear_algebra/src/conjugate_gradient.py) * [Lib](linear_algebra/src/lib.py) * [Polynom For Points](linear_algebra/src/polynom_for_points.py) * [Power Iteration](linear_algebra/src/power_iteration.py) * [Rayleigh Quotient](linear_algebra/src/rayleigh_quotient.py) * [Schur Complement](linear_algebra/src/schur_complement.py) * [Test Linear Algebra](linear_algebra/src/test_linear_algebra.py) * [Transformations 2D](linear_algebra/src/transformations_2d.py) ## Machine Learning * [Astar](machine_learning/astar.py) * [Data Transformations](machine_learning/data_transformations.py) * [Decision Tree](machine_learning/decision_tree.py) * Forecasting * [Run](machine_learning/forecasting/run.py) * [Gaussian Naive Bayes](machine_learning/gaussian_naive_bayes.py) * [Gradient Boosting Regressor](machine_learning/gradient_boosting_regressor.py) * [Gradient Descent](machine_learning/gradient_descent.py) * [K Means Clust](machine_learning/k_means_clust.py) * [K Nearest Neighbours](machine_learning/k_nearest_neighbours.py) * [Knn Sklearn](machine_learning/knn_sklearn.py) * [Linear Discriminant Analysis](machine_learning/linear_discriminant_analysis.py) * [Linear Regression](machine_learning/linear_regression.py) * Local Weighted Learning * [Local Weighted Learning](machine_learning/local_weighted_learning/local_weighted_learning.py) * [Logistic Regression](machine_learning/logistic_regression.py) * Lstm * [Lstm Prediction](machine_learning/lstm/lstm_prediction.py) * [Multilayer Perceptron Classifier](machine_learning/multilayer_perceptron_classifier.py) * [Polymonial Regression](machine_learning/polymonial_regression.py) * [Random Forest Classifier](machine_learning/random_forest_classifier.py) * [Random Forest Regressor](machine_learning/random_forest_regressor.py) * [Scoring Functions](machine_learning/scoring_functions.py) * [Self Organizing Map](machine_learning/self_organizing_map.py) * [Sequential Minimum Optimization](machine_learning/sequential_minimum_optimization.py) * [Similarity Search](machine_learning/similarity_search.py) * [Support Vector Machines](machine_learning/support_vector_machines.py) * [Word Frequency Functions](machine_learning/word_frequency_functions.py) * [Xgboost Classifier](machine_learning/xgboost_classifier.py) * [Xgboost Regressor](machine_learning/xgboost_regressor.py) ## Maths * [3N Plus 1](maths/3n_plus_1.py) * [Abs](maths/abs.py) * [Abs Max](maths/abs_max.py) * [Abs Min](maths/abs_min.py) * [Add](maths/add.py) * [Aliquot Sum](maths/aliquot_sum.py) * [Allocation 
Number](maths/allocation_number.py) * [Arc Length](maths/arc_length.py) * [Area](maths/area.py) * [Area Under Curve](maths/area_under_curve.py) * [Armstrong Numbers](maths/armstrong_numbers.py) * [Average Absolute Deviation](maths/average_absolute_deviation.py) * [Average Mean](maths/average_mean.py) * [Average Median](maths/average_median.py) * [Average Mode](maths/average_mode.py) * [Bailey Borwein Plouffe](maths/bailey_borwein_plouffe.py) * [Basic Maths](maths/basic_maths.py) * [Binary Exp Mod](maths/binary_exp_mod.py) * [Binary Exponentiation](maths/binary_exponentiation.py) * [Binary Exponentiation 2](maths/binary_exponentiation_2.py) * [Binary Exponentiation 3](maths/binary_exponentiation_3.py) * [Binomial Coefficient](maths/binomial_coefficient.py) * [Binomial Distribution](maths/binomial_distribution.py) * [Bisection](maths/bisection.py) * [Carmichael Number](maths/carmichael_number.py) * [Catalan Number](maths/catalan_number.py) * [Ceil](maths/ceil.py) * [Check Polygon](maths/check_polygon.py) * [Chudnovsky Algorithm](maths/chudnovsky_algorithm.py) * [Collatz Sequence](maths/collatz_sequence.py) * [Combinations](maths/combinations.py) * [Decimal Isolate](maths/decimal_isolate.py) * [Double Factorial Iterative](maths/double_factorial_iterative.py) * [Double Factorial Recursive](maths/double_factorial_recursive.py) * [Entropy](maths/entropy.py) * [Euclidean Distance](maths/euclidean_distance.py) * [Euclidean Gcd](maths/euclidean_gcd.py) * [Euler Method](maths/euler_method.py) * [Euler Modified](maths/euler_modified.py) * [Eulers Totient](maths/eulers_totient.py) * [Extended Euclidean Algorithm](maths/extended_euclidean_algorithm.py) * [Factorial Iterative](maths/factorial_iterative.py) * [Factorial Recursive](maths/factorial_recursive.py) * [Factors](maths/factors.py) * [Fermat Little Theorem](maths/fermat_little_theorem.py) * [Fibonacci](maths/fibonacci.py) * [Find Max](maths/find_max.py) * [Find Max Recursion](maths/find_max_recursion.py) * [Find Min](maths/find_min.py) * [Find Min Recursion](maths/find_min_recursion.py) * [Floor](maths/floor.py) * [Gamma](maths/gamma.py) * [Gamma Recursive](maths/gamma_recursive.py) * [Gaussian](maths/gaussian.py) * [Gaussian Error Linear Unit](maths/gaussian_error_linear_unit.py) * [Greatest Common Divisor](maths/greatest_common_divisor.py) * [Greedy Coin Change](maths/greedy_coin_change.py) * [Hamming Numbers](maths/hamming_numbers.py) * [Hardy Ramanujanalgo](maths/hardy_ramanujanalgo.py) * [Integration By Simpson Approx](maths/integration_by_simpson_approx.py) * [Is Ip V4 Address Valid](maths/is_ip_v4_address_valid.py) * [Is Square Free](maths/is_square_free.py) * [Jaccard Similarity](maths/jaccard_similarity.py) * [Kadanes](maths/kadanes.py) * [Karatsuba](maths/karatsuba.py) * [Krishnamurthy Number](maths/krishnamurthy_number.py) * [Kth Lexicographic Permutation](maths/kth_lexicographic_permutation.py) * [Largest Of Very Large Numbers](maths/largest_of_very_large_numbers.py) * [Largest Subarray Sum](maths/largest_subarray_sum.py) * [Least Common Multiple](maths/least_common_multiple.py) * [Line Length](maths/line_length.py) * [Lucas Lehmer Primality Test](maths/lucas_lehmer_primality_test.py) * [Lucas Series](maths/lucas_series.py) * [Maclaurin Series](maths/maclaurin_series.py) * [Matrix Exponentiation](maths/matrix_exponentiation.py) * [Max Sum Sliding Window](maths/max_sum_sliding_window.py) * [Median Of Two Arrays](maths/median_of_two_arrays.py) * [Miller Rabin](maths/miller_rabin.py) * [Mobius Function](maths/mobius_function.py) * 
[Modular Exponential](maths/modular_exponential.py) * [Monte Carlo](maths/monte_carlo.py) * [Monte Carlo Dice](maths/monte_carlo_dice.py) * [Nevilles Method](maths/nevilles_method.py) * [Newton Raphson](maths/newton_raphson.py) * [Number Of Digits](maths/number_of_digits.py) * [Numerical Integration](maths/numerical_integration.py) * [Perfect Cube](maths/perfect_cube.py) * [Perfect Number](maths/perfect_number.py) * [Perfect Square](maths/perfect_square.py) * [Persistence](maths/persistence.py) * [Pi Monte Carlo Estimation](maths/pi_monte_carlo_estimation.py) * [Points Are Collinear 3D](maths/points_are_collinear_3d.py) * [Pollard Rho](maths/pollard_rho.py) * [Polynomial Evaluation](maths/polynomial_evaluation.py) * [Power Using Recursion](maths/power_using_recursion.py) * [Prime Check](maths/prime_check.py) * [Prime Factors](maths/prime_factors.py) * [Prime Numbers](maths/prime_numbers.py) * [Prime Sieve Eratosthenes](maths/prime_sieve_eratosthenes.py) * [Primelib](maths/primelib.py) * [Proth Number](maths/proth_number.py) * [Pythagoras](maths/pythagoras.py) * [Qr Decomposition](maths/qr_decomposition.py) * [Quadratic Equations Complex Numbers](maths/quadratic_equations_complex_numbers.py) * [Radians](maths/radians.py) * [Radix2 Fft](maths/radix2_fft.py) * [Relu](maths/relu.py) * [Runge Kutta](maths/runge_kutta.py) * [Segmented Sieve](maths/segmented_sieve.py) * Series * [Arithmetic](maths/series/arithmetic.py) * [Geometric](maths/series/geometric.py) * [Geometric Series](maths/series/geometric_series.py) * [Harmonic](maths/series/harmonic.py) * [Harmonic Series](maths/series/harmonic_series.py) * [Hexagonal Numbers](maths/series/hexagonal_numbers.py) * [P Series](maths/series/p_series.py) * [Sieve Of Eratosthenes](maths/sieve_of_eratosthenes.py) * [Sigmoid](maths/sigmoid.py) * [Sigmoid Linear Unit](maths/sigmoid_linear_unit.py) * [Signum](maths/signum.py) * [Simpson Rule](maths/simpson_rule.py) * [Sin](maths/sin.py) * [Sock Merchant](maths/sock_merchant.py) * [Softmax](maths/softmax.py) * [Square Root](maths/square_root.py) * [Sum Of Arithmetic Series](maths/sum_of_arithmetic_series.py) * [Sum Of Digits](maths/sum_of_digits.py) * [Sum Of Geometric Progression](maths/sum_of_geometric_progression.py) * [Sum Of Harmonic Series](maths/sum_of_harmonic_series.py) * [Sylvester Sequence](maths/sylvester_sequence.py) * [Test Prime Check](maths/test_prime_check.py) * [Trapezoidal Rule](maths/trapezoidal_rule.py) * [Triplet Sum](maths/triplet_sum.py) * [Two Pointer](maths/two_pointer.py) * [Two Sum](maths/two_sum.py) * [Ugly Numbers](maths/ugly_numbers.py) * [Volume](maths/volume.py) * [Weird Number](maths/weird_number.py) * [Zellers Congruence](maths/zellers_congruence.py) ## Matrix * [Binary Search Matrix](matrix/binary_search_matrix.py) * [Count Islands In Matrix](matrix/count_islands_in_matrix.py) * [Cramers Rule 2X2](matrix/cramers_rule_2x2.py) * [Inverse Of Matrix](matrix/inverse_of_matrix.py) * [Largest Square Area In Matrix](matrix/largest_square_area_in_matrix.py) * [Matrix Class](matrix/matrix_class.py) * [Matrix Operation](matrix/matrix_operation.py) * [Max Area Of Island](matrix/max_area_of_island.py) * [Nth Fibonacci Using Matrix Exponentiation](matrix/nth_fibonacci_using_matrix_exponentiation.py) * [Rotate Matrix](matrix/rotate_matrix.py) * [Searching In Sorted Matrix](matrix/searching_in_sorted_matrix.py) * [Sherman Morrison](matrix/sherman_morrison.py) * [Spiral Print](matrix/spiral_print.py) * Tests * [Test Matrix Operation](matrix/tests/test_matrix_operation.py) ## Networking Flow 
* [Ford Fulkerson](networking_flow/ford_fulkerson.py) * [Minimum Cut](networking_flow/minimum_cut.py) ## Neural Network * [2 Hidden Layers Neural Network](neural_network/2_hidden_layers_neural_network.py) * [Back Propagation Neural Network](neural_network/back_propagation_neural_network.py) * [Convolution Neural Network](neural_network/convolution_neural_network.py) * [Perceptron](neural_network/perceptron.py) ## Other * [Activity Selection](other/activity_selection.py) * [Alternative List Arrange](other/alternative_list_arrange.py) * [Check Strong Password](other/check_strong_password.py) * [Davisb Putnamb Logemannb Loveland](other/davisb_putnamb_logemannb_loveland.py) * [Dijkstra Bankers Algorithm](other/dijkstra_bankers_algorithm.py) * [Doomsday](other/doomsday.py) * [Fischer Yates Shuffle](other/fischer_yates_shuffle.py) * [Gauss Easter](other/gauss_easter.py) * [Graham Scan](other/graham_scan.py) * [Greedy](other/greedy.py) * [Least Recently Used](other/least_recently_used.py) * [Lfu Cache](other/lfu_cache.py) * [Linear Congruential Generator](other/linear_congruential_generator.py) * [Lru Cache](other/lru_cache.py) * [Magicdiamondpattern](other/magicdiamondpattern.py) * [Maximum Subarray](other/maximum_subarray.py) * [Nested Brackets](other/nested_brackets.py) * [Password Generator](other/password_generator.py) * [Scoring Algorithm](other/scoring_algorithm.py) * [Sdes](other/sdes.py) * [Tower Of Hanoi](other/tower_of_hanoi.py) ## Physics * [Casimir Effect](physics/casimir_effect.py) * [Horizontal Projectile Motion](physics/horizontal_projectile_motion.py) * [Kinetic Energy](physics/kinetic_energy.py) * [Lorentz Transformation Four Vector](physics/lorentz_transformation_four_vector.py) * [Malus Law](physics/malus_law.py) * [N Body Simulation](physics/n_body_simulation.py) * [Newtons Law Of Gravitation](physics/newtons_law_of_gravitation.py) * [Newtons Second Law Of Motion](physics/newtons_second_law_of_motion.py) * [Potential Energy](physics/potential_energy.py) ## Project Euler * Problem 001 * [Sol1](project_euler/problem_001/sol1.py) * [Sol2](project_euler/problem_001/sol2.py) * [Sol3](project_euler/problem_001/sol3.py) * [Sol4](project_euler/problem_001/sol4.py) * [Sol5](project_euler/problem_001/sol5.py) * [Sol6](project_euler/problem_001/sol6.py) * [Sol7](project_euler/problem_001/sol7.py) * Problem 002 * [Sol1](project_euler/problem_002/sol1.py) * [Sol2](project_euler/problem_002/sol2.py) * [Sol3](project_euler/problem_002/sol3.py) * [Sol4](project_euler/problem_002/sol4.py) * [Sol5](project_euler/problem_002/sol5.py) * Problem 003 * [Sol1](project_euler/problem_003/sol1.py) * [Sol2](project_euler/problem_003/sol2.py) * [Sol3](project_euler/problem_003/sol3.py) * Problem 004 * [Sol1](project_euler/problem_004/sol1.py) * [Sol2](project_euler/problem_004/sol2.py) * Problem 005 * [Sol1](project_euler/problem_005/sol1.py) * [Sol2](project_euler/problem_005/sol2.py) * Problem 006 * [Sol1](project_euler/problem_006/sol1.py) * [Sol2](project_euler/problem_006/sol2.py) * [Sol3](project_euler/problem_006/sol3.py) * [Sol4](project_euler/problem_006/sol4.py) * Problem 007 * [Sol1](project_euler/problem_007/sol1.py) * [Sol2](project_euler/problem_007/sol2.py) * [Sol3](project_euler/problem_007/sol3.py) * Problem 008 * [Sol1](project_euler/problem_008/sol1.py) * [Sol2](project_euler/problem_008/sol2.py) * [Sol3](project_euler/problem_008/sol3.py) * Problem 009 * [Sol1](project_euler/problem_009/sol1.py) * [Sol2](project_euler/problem_009/sol2.py) * [Sol3](project_euler/problem_009/sol3.py) * 
Problem 010 * [Sol1](project_euler/problem_010/sol1.py) * [Sol2](project_euler/problem_010/sol2.py) * [Sol3](project_euler/problem_010/sol3.py) * Problem 011 * [Sol1](project_euler/problem_011/sol1.py) * [Sol2](project_euler/problem_011/sol2.py) * Problem 012 * [Sol1](project_euler/problem_012/sol1.py) * [Sol2](project_euler/problem_012/sol2.py) * Problem 013 * [Sol1](project_euler/problem_013/sol1.py) * Problem 014 * [Sol1](project_euler/problem_014/sol1.py) * [Sol2](project_euler/problem_014/sol2.py) * Problem 015 * [Sol1](project_euler/problem_015/sol1.py) * Problem 016 * [Sol1](project_euler/problem_016/sol1.py) * [Sol2](project_euler/problem_016/sol2.py) * Problem 017 * [Sol1](project_euler/problem_017/sol1.py) * Problem 018 * [Solution](project_euler/problem_018/solution.py) * Problem 019 * [Sol1](project_euler/problem_019/sol1.py) * Problem 020 * [Sol1](project_euler/problem_020/sol1.py) * [Sol2](project_euler/problem_020/sol2.py) * [Sol3](project_euler/problem_020/sol3.py) * [Sol4](project_euler/problem_020/sol4.py) * Problem 021 * [Sol1](project_euler/problem_021/sol1.py) * Problem 022 * [Sol1](project_euler/problem_022/sol1.py) * [Sol2](project_euler/problem_022/sol2.py) * Problem 023 * [Sol1](project_euler/problem_023/sol1.py) * Problem 024 * [Sol1](project_euler/problem_024/sol1.py) * Problem 025 * [Sol1](project_euler/problem_025/sol1.py) * [Sol2](project_euler/problem_025/sol2.py) * [Sol3](project_euler/problem_025/sol3.py) * Problem 026 * [Sol1](project_euler/problem_026/sol1.py) * Problem 027 * [Sol1](project_euler/problem_027/sol1.py) * Problem 028 * [Sol1](project_euler/problem_028/sol1.py) * Problem 029 * [Sol1](project_euler/problem_029/sol1.py) * Problem 030 * [Sol1](project_euler/problem_030/sol1.py) * Problem 031 * [Sol1](project_euler/problem_031/sol1.py) * [Sol2](project_euler/problem_031/sol2.py) * Problem 032 * [Sol32](project_euler/problem_032/sol32.py) * Problem 033 * [Sol1](project_euler/problem_033/sol1.py) * Problem 034 * [Sol1](project_euler/problem_034/sol1.py) * Problem 035 * [Sol1](project_euler/problem_035/sol1.py) * Problem 036 * [Sol1](project_euler/problem_036/sol1.py) * Problem 037 * [Sol1](project_euler/problem_037/sol1.py) * Problem 038 * [Sol1](project_euler/problem_038/sol1.py) * Problem 039 * [Sol1](project_euler/problem_039/sol1.py) * Problem 040 * [Sol1](project_euler/problem_040/sol1.py) * Problem 041 * [Sol1](project_euler/problem_041/sol1.py) * Problem 042 * [Solution42](project_euler/problem_042/solution42.py) * Problem 043 * [Sol1](project_euler/problem_043/sol1.py) * Problem 044 * [Sol1](project_euler/problem_044/sol1.py) * Problem 045 * [Sol1](project_euler/problem_045/sol1.py) * Problem 046 * [Sol1](project_euler/problem_046/sol1.py) * Problem 047 * [Sol1](project_euler/problem_047/sol1.py) * Problem 048 * [Sol1](project_euler/problem_048/sol1.py) * Problem 049 * [Sol1](project_euler/problem_049/sol1.py) * Problem 050 * [Sol1](project_euler/problem_050/sol1.py) * Problem 051 * [Sol1](project_euler/problem_051/sol1.py) * Problem 052 * [Sol1](project_euler/problem_052/sol1.py) * Problem 053 * [Sol1](project_euler/problem_053/sol1.py) * Problem 054 * [Sol1](project_euler/problem_054/sol1.py) * [Test Poker Hand](project_euler/problem_054/test_poker_hand.py) * Problem 055 * [Sol1](project_euler/problem_055/sol1.py) * Problem 056 * [Sol1](project_euler/problem_056/sol1.py) * Problem 057 * [Sol1](project_euler/problem_057/sol1.py) * Problem 058 * [Sol1](project_euler/problem_058/sol1.py) * Problem 059 * 
[Sol1](project_euler/problem_059/sol1.py) * Problem 062 * [Sol1](project_euler/problem_062/sol1.py) * Problem 063 * [Sol1](project_euler/problem_063/sol1.py) * Problem 064 * [Sol1](project_euler/problem_064/sol1.py) * Problem 065 * [Sol1](project_euler/problem_065/sol1.py) * Problem 067 * [Sol1](project_euler/problem_067/sol1.py) * [Sol2](project_euler/problem_067/sol2.py) * Problem 068 * [Sol1](project_euler/problem_068/sol1.py) * Problem 069 * [Sol1](project_euler/problem_069/sol1.py) * Problem 070 * [Sol1](project_euler/problem_070/sol1.py) * Problem 071 * [Sol1](project_euler/problem_071/sol1.py) * Problem 072 * [Sol1](project_euler/problem_072/sol1.py) * [Sol2](project_euler/problem_072/sol2.py) * Problem 073 * [Sol1](project_euler/problem_073/sol1.py) * Problem 074 * [Sol1](project_euler/problem_074/sol1.py) * [Sol2](project_euler/problem_074/sol2.py) * Problem 075 * [Sol1](project_euler/problem_075/sol1.py) * Problem 076 * [Sol1](project_euler/problem_076/sol1.py) * Problem 077 * [Sol1](project_euler/problem_077/sol1.py) * Problem 078 * [Sol1](project_euler/problem_078/sol1.py) * Problem 080 * [Sol1](project_euler/problem_080/sol1.py) * Problem 081 * [Sol1](project_euler/problem_081/sol1.py) * Problem 085 * [Sol1](project_euler/problem_085/sol1.py) * Problem 086 * [Sol1](project_euler/problem_086/sol1.py) * Problem 087 * [Sol1](project_euler/problem_087/sol1.py) * Problem 089 * [Sol1](project_euler/problem_089/sol1.py) * Problem 091 * [Sol1](project_euler/problem_091/sol1.py) * Problem 092 * [Sol1](project_euler/problem_092/sol1.py) * Problem 097 * [Sol1](project_euler/problem_097/sol1.py) * Problem 099 * [Sol1](project_euler/problem_099/sol1.py) * Problem 101 * [Sol1](project_euler/problem_101/sol1.py) * Problem 102 * [Sol1](project_euler/problem_102/sol1.py) * Problem 104 * [Sol1](project_euler/problem_104/sol1.py) * Problem 107 * [Sol1](project_euler/problem_107/sol1.py) * Problem 109 * [Sol1](project_euler/problem_109/sol1.py) * Problem 112 * [Sol1](project_euler/problem_112/sol1.py) * Problem 113 * [Sol1](project_euler/problem_113/sol1.py) * Problem 114 * [Sol1](project_euler/problem_114/sol1.py) * Problem 115 * [Sol1](project_euler/problem_115/sol1.py) * Problem 116 * [Sol1](project_euler/problem_116/sol1.py) * Problem 119 * [Sol1](project_euler/problem_119/sol1.py) * Problem 120 * [Sol1](project_euler/problem_120/sol1.py) * Problem 121 * [Sol1](project_euler/problem_121/sol1.py) * Problem 123 * [Sol1](project_euler/problem_123/sol1.py) * Problem 125 * [Sol1](project_euler/problem_125/sol1.py) * Problem 129 * [Sol1](project_euler/problem_129/sol1.py) * Problem 135 * [Sol1](project_euler/problem_135/sol1.py) * Problem 144 * [Sol1](project_euler/problem_144/sol1.py) * Problem 145 * [Sol1](project_euler/problem_145/sol1.py) * Problem 173 * [Sol1](project_euler/problem_173/sol1.py) * Problem 174 * [Sol1](project_euler/problem_174/sol1.py) * Problem 180 * [Sol1](project_euler/problem_180/sol1.py) * Problem 188 * [Sol1](project_euler/problem_188/sol1.py) * Problem 191 * [Sol1](project_euler/problem_191/sol1.py) * Problem 203 * [Sol1](project_euler/problem_203/sol1.py) * Problem 205 * [Sol1](project_euler/problem_205/sol1.py) * Problem 206 * [Sol1](project_euler/problem_206/sol1.py) * Problem 207 * [Sol1](project_euler/problem_207/sol1.py) * Problem 234 * [Sol1](project_euler/problem_234/sol1.py) * Problem 301 * [Sol1](project_euler/problem_301/sol1.py) * Problem 493 * [Sol1](project_euler/problem_493/sol1.py) * Problem 551 * [Sol1](project_euler/problem_551/sol1.py) * Problem 587 * 
[Sol1](project_euler/problem_587/sol1.py) * Problem 686 * [Sol1](project_euler/problem_686/sol1.py) ## Quantum * [Deutsch Jozsa](quantum/deutsch_jozsa.py) * [Half Adder](quantum/half_adder.py) * [Not Gate](quantum/not_gate.py) * [Q Full Adder](quantum/q_full_adder.py) * [Quantum Entanglement](quantum/quantum_entanglement.py) * [Ripple Adder Classic](quantum/ripple_adder_classic.py) * [Single Qubit Measure](quantum/single_qubit_measure.py) * [Superdense Coding](quantum/superdense_coding.py) ## Scheduling * [First Come First Served](scheduling/first_come_first_served.py) * [Highest Response Ratio Next](scheduling/highest_response_ratio_next.py) * [Job Sequencing With Deadline](scheduling/job_sequencing_with_deadline.py) * [Multi Level Feedback Queue](scheduling/multi_level_feedback_queue.py) * [Non Preemptive Shortest Job First](scheduling/non_preemptive_shortest_job_first.py) * [Round Robin](scheduling/round_robin.py) * [Shortest Job First](scheduling/shortest_job_first.py) ## Searches * [Binary Search](searches/binary_search.py) * [Binary Tree Traversal](searches/binary_tree_traversal.py) * [Double Linear Search](searches/double_linear_search.py) * [Double Linear Search Recursion](searches/double_linear_search_recursion.py) * [Fibonacci Search](searches/fibonacci_search.py) * [Hill Climbing](searches/hill_climbing.py) * [Interpolation Search](searches/interpolation_search.py) * [Jump Search](searches/jump_search.py) * [Linear Search](searches/linear_search.py) * [Quick Select](searches/quick_select.py) * [Sentinel Linear Search](searches/sentinel_linear_search.py) * [Simple Binary Search](searches/simple_binary_search.py) * [Simulated Annealing](searches/simulated_annealing.py) * [Tabu Search](searches/tabu_search.py) * [Ternary Search](searches/ternary_search.py) ## Sorts * [Bead Sort](sorts/bead_sort.py) * [Bitonic Sort](sorts/bitonic_sort.py) * [Bogo Sort](sorts/bogo_sort.py) * [Bubble Sort](sorts/bubble_sort.py) * [Bucket Sort](sorts/bucket_sort.py) * [Circle Sort](sorts/circle_sort.py) * [Cocktail Shaker Sort](sorts/cocktail_shaker_sort.py) * [Comb Sort](sorts/comb_sort.py) * [Counting Sort](sorts/counting_sort.py) * [Cycle Sort](sorts/cycle_sort.py) * [Double Sort](sorts/double_sort.py) * [Dutch National Flag Sort](sorts/dutch_national_flag_sort.py) * [Exchange Sort](sorts/exchange_sort.py) * [External Sort](sorts/external_sort.py) * [Gnome Sort](sorts/gnome_sort.py) * [Heap Sort](sorts/heap_sort.py) * [Insertion Sort](sorts/insertion_sort.py) * [Intro Sort](sorts/intro_sort.py) * [Iterative Merge Sort](sorts/iterative_merge_sort.py) * [Merge Insertion Sort](sorts/merge_insertion_sort.py) * [Merge Sort](sorts/merge_sort.py) * [Msd Radix Sort](sorts/msd_radix_sort.py) * [Natural Sort](sorts/natural_sort.py) * [Odd Even Sort](sorts/odd_even_sort.py) * [Odd Even Transposition Parallel](sorts/odd_even_transposition_parallel.py) * [Odd Even Transposition Single Threaded](sorts/odd_even_transposition_single_threaded.py) * [Pancake Sort](sorts/pancake_sort.py) * [Patience Sort](sorts/patience_sort.py) * [Pigeon Sort](sorts/pigeon_sort.py) * [Pigeonhole Sort](sorts/pigeonhole_sort.py) * [Quick Sort](sorts/quick_sort.py) * [Quick Sort 3 Partition](sorts/quick_sort_3_partition.py) * [Radix Sort](sorts/radix_sort.py) * [Random Normal Distribution Quicksort](sorts/random_normal_distribution_quicksort.py) * [Random Pivot Quick Sort](sorts/random_pivot_quick_sort.py) * [Recursive Bubble Sort](sorts/recursive_bubble_sort.py) * [Recursive Insertion Sort](sorts/recursive_insertion_sort.py) * 
[Recursive Mergesort Array](sorts/recursive_mergesort_array.py) * [Recursive Quick Sort](sorts/recursive_quick_sort.py) * [Selection Sort](sorts/selection_sort.py) * [Shell Sort](sorts/shell_sort.py) * [Shrink Shell Sort](sorts/shrink_shell_sort.py) * [Slowsort](sorts/slowsort.py) * [Stooge Sort](sorts/stooge_sort.py) * [Strand Sort](sorts/strand_sort.py) * [Tim Sort](sorts/tim_sort.py) * [Topological Sort](sorts/topological_sort.py) * [Tree Sort](sorts/tree_sort.py) * [Unknown Sort](sorts/unknown_sort.py) * [Wiggle Sort](sorts/wiggle_sort.py) ## Strings * [Aho Corasick](strings/aho_corasick.py) * [Alternative String Arrange](strings/alternative_string_arrange.py) * [Anagrams](strings/anagrams.py) * [Autocomplete Using Trie](strings/autocomplete_using_trie.py) * [Barcode Validator](strings/barcode_validator.py) * [Boyer Moore Search](strings/boyer_moore_search.py) * [Can String Be Rearranged As Palindrome](strings/can_string_be_rearranged_as_palindrome.py) * [Capitalize](strings/capitalize.py) * [Check Anagrams](strings/check_anagrams.py) * [Credit Card Validator](strings/credit_card_validator.py) * [Detecting English Programmatically](strings/detecting_english_programmatically.py) * [Dna](strings/dna.py) * [Frequency Finder](strings/frequency_finder.py) * [Hamming Distance](strings/hamming_distance.py) * [Indian Phone Validator](strings/indian_phone_validator.py) * [Is Contains Unique Chars](strings/is_contains_unique_chars.py) * [Is Isogram](strings/is_isogram.py) * [Is Palindrome](strings/is_palindrome.py) * [Is Pangram](strings/is_pangram.py) * [Is Spain National Id](strings/is_spain_national_id.py) * [Jaro Winkler](strings/jaro_winkler.py) * [Join](strings/join.py) * [Knuth Morris Pratt](strings/knuth_morris_pratt.py) * [Levenshtein Distance](strings/levenshtein_distance.py) * [Lower](strings/lower.py) * [Manacher](strings/manacher.py) * [Min Cost String Conversion](strings/min_cost_string_conversion.py) * [Naive String Search](strings/naive_string_search.py) * [Ngram](strings/ngram.py) * [Palindrome](strings/palindrome.py) * [Prefix Function](strings/prefix_function.py) * [Rabin Karp](strings/rabin_karp.py) * [Remove Duplicate](strings/remove_duplicate.py) * [Reverse Letters](strings/reverse_letters.py) * [Reverse Long Words](strings/reverse_long_words.py) * [Reverse Words](strings/reverse_words.py) * [Snake Case To Camel Pascal Case](strings/snake_case_to_camel_pascal_case.py) * [Split](strings/split.py) * [Upper](strings/upper.py) * [Wave](strings/wave.py) * [Wildcard Pattern Matching](strings/wildcard_pattern_matching.py) * [Word Occurrence](strings/word_occurrence.py) * [Word Patterns](strings/word_patterns.py) * [Z Function](strings/z_function.py) ## Web Programming * [Co2 Emission](web_programming/co2_emission.py) * [Covid Stats Via Xpath](web_programming/covid_stats_via_xpath.py) * [Crawl Google Results](web_programming/crawl_google_results.py) * [Crawl Google Scholar Citation](web_programming/crawl_google_scholar_citation.py) * [Currency Converter](web_programming/currency_converter.py) * [Current Stock Price](web_programming/current_stock_price.py) * [Current Weather](web_programming/current_weather.py) * [Daily Horoscope](web_programming/daily_horoscope.py) * [Download Images From Google Query](web_programming/download_images_from_google_query.py) * [Emails From Url](web_programming/emails_from_url.py) * [Fetch Anime And Play](web_programming/fetch_anime_and_play.py) * [Fetch Bbc News](web_programming/fetch_bbc_news.py) * [Fetch Github 
Info](web_programming/fetch_github_info.py) * [Fetch Jobs](web_programming/fetch_jobs.py) * [Fetch Quotes](web_programming/fetch_quotes.py) * [Fetch Well Rx Price](web_programming/fetch_well_rx_price.py) * [Get Amazon Product Data](web_programming/get_amazon_product_data.py) * [Get Imdb Top 250 Movies Csv](web_programming/get_imdb_top_250_movies_csv.py) * [Get Imdbtop](web_programming/get_imdbtop.py) * [Get Top Billioners](web_programming/get_top_billioners.py) * [Get Top Hn Posts](web_programming/get_top_hn_posts.py) * [Get User Tweets](web_programming/get_user_tweets.py) * [Giphy](web_programming/giphy.py) * [Instagram Crawler](web_programming/instagram_crawler.py) * [Instagram Pic](web_programming/instagram_pic.py) * [Instagram Video](web_programming/instagram_video.py) * [Nasa Data](web_programming/nasa_data.py) * [Open Google Results](web_programming/open_google_results.py) * [Random Anime Character](web_programming/random_anime_character.py) * [Recaptcha Verification](web_programming/recaptcha_verification.py) * [Reddit](web_programming/reddit.py) * [Search Books By Isbn](web_programming/search_books_by_isbn.py) * [Slack Message](web_programming/slack_message.py) * [Test Fetch Github Info](web_programming/test_fetch_github_info.py) * [World Covid19 Stats](web_programming/world_covid19_stats.py)
1
TheAlgorithms/Python
7,843
Fix yesqa hook
### Describe your change: I have noticed that yesqa is not working as the flake8 plugin. It should be installed as a separate hook. To have exactly the same flake8 setting plugins should be installed on both hooks. I added a YAML anchor to keep plugins declaration in a single place. Fix warnings that were raised by this hook across the repo. * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
Cjkjvfnby
"2022-10-29T12:58:59Z"
"2022-10-29T13:07:02Z"
18ffc4dec85a85837f71cd6c9b1e630b9d185001
584e743422565decd35b1b6f94cef3ced840698b
Fix yesqa hook. ### Describe your change: I have noticed that yesqa is not working as the flake8 plugin. It should be installed as a separate hook. To have exactly the same flake8 setting plugins should be installed on both hooks. I added a YAML anchor to keep plugins declaration in a single place. Fix warnings that were raised by this hook across the repo. * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
""" A non-recursive Segment Tree implementation with range query and single element update, works virtually with any list of the same type of elements with a "commutative" combiner. Explanation: https://www.geeksforgeeks.org/iterative-segment-tree-range-minimum-query/ https://www.geeksforgeeks.org/segment-tree-efficient-implementation/ >>> SegmentTree([1, 2, 3], lambda a, b: a + b).query(0, 2) 6 >>> SegmentTree([3, 1, 2], min).query(0, 2) 1 >>> SegmentTree([2, 3, 1], max).query(0, 2) 3 >>> st = SegmentTree([1, 5, 7, -1, 6], lambda a, b: a + b) >>> st.update(1, -1) >>> st.update(2, 3) >>> st.query(1, 2) 2 >>> st.query(1, 1) -1 >>> st.update(4, 1) >>> st.query(3, 4) 0 >>> st = SegmentTree([[1, 2, 3], [3, 2, 1], [1, 1, 1]], lambda a, b: [a[i] + b[i] for i ... in range(len(a))]) >>> st.query(0, 1) [4, 4, 4] >>> st.query(1, 2) [4, 3, 2] >>> st.update(1, [-1, -1, -1]) >>> st.query(1, 2) [0, 0, 0] >>> st.query(0, 2) [1, 2, 3] """ from __future__ import annotations from collections.abc import Callable from typing import Any, Generic, TypeVar T = TypeVar("T") class SegmentTree(Generic[T]): def __init__(self, arr: list[T], fnc: Callable[[T, T], T]) -> None: """ Segment Tree constructor, it works just with commutative combiner. :param arr: list of elements for the segment tree :param fnc: commutative function for combine two elements >>> SegmentTree(['a', 'b', 'c'], lambda a, b: f'{a}{b}').query(0, 2) 'abc' >>> SegmentTree([(1, 2), (2, 3), (3, 4)], ... lambda a, b: (a[0] + b[0], a[1] + b[1])).query(0, 2) (6, 9) """ any_type: Any | T = None self.N: int = len(arr) self.st: list[T] = [any_type for _ in range(self.N)] + arr self.fn = fnc self.build() def build(self) -> None: for p in range(self.N - 1, 0, -1): self.st[p] = self.fn(self.st[p * 2], self.st[p * 2 + 1]) def update(self, p: int, v: T) -> None: """ Update an element in log(N) time :param p: position to be update :param v: new value >>> st = SegmentTree([3, 1, 2, 4], min) >>> st.query(0, 3) 1 >>> st.update(2, -1) >>> st.query(0, 3) -1 """ p += self.N self.st[p] = v while p > 1: p = p // 2 self.st[p] = self.fn(self.st[p * 2], self.st[p * 2 + 1]) def query(self, l: int, r: int) -> T | None: # noqa: E741 """ Get range query value in log(N) time :param l: left element index :param r: right element index :return: element combined in the range [l, r] >>> st = SegmentTree([1, 2, 3, 4], lambda a, b: a + b) >>> st.query(0, 2) 6 >>> st.query(1, 2) 5 >>> st.query(0, 3) 10 >>> st.query(2, 3) 7 """ l, r = l + self.N, r + self.N # noqa: E741 res: T | None = None while l <= r: # noqa: E741 if l % 2 == 1: res = self.st[l] if res is None else self.fn(res, self.st[l]) if r % 2 == 0: res = self.st[r] if res is None else self.fn(res, self.st[r]) l, r = (l + 1) // 2, (r - 1) // 2 return res if __name__ == "__main__": from functools import reduce test_array = [1, 10, -2, 9, -3, 8, 4, -7, 5, 6, 11, -12] test_updates = { 0: 7, 1: 2, 2: 6, 3: -14, 4: 5, 5: 4, 6: 7, 7: -10, 8: 9, 9: 10, 10: 12, 11: 1, } min_segment_tree = SegmentTree(test_array, min) max_segment_tree = SegmentTree(test_array, max) sum_segment_tree = SegmentTree(test_array, lambda a, b: a + b) def test_all_segments() -> None: """ Test all possible segments """ for i in range(len(test_array)): for j in range(i, len(test_array)): min_range = reduce(min, test_array[i : j + 1]) max_range = reduce(max, test_array[i : j + 1]) sum_range = reduce(lambda a, b: a + b, test_array[i : j + 1]) assert min_range == min_segment_tree.query(i, j) assert max_range == max_segment_tree.query(i, j) assert sum_range == 
sum_segment_tree.query(i, j) test_all_segments() for index, value in test_updates.items(): test_array[index] = value min_segment_tree.update(index, value) max_segment_tree.update(index, value) sum_segment_tree.update(index, value) test_all_segments()
""" A non-recursive Segment Tree implementation with range query and single element update, works virtually with any list of the same type of elements with a "commutative" combiner. Explanation: https://www.geeksforgeeks.org/iterative-segment-tree-range-minimum-query/ https://www.geeksforgeeks.org/segment-tree-efficient-implementation/ >>> SegmentTree([1, 2, 3], lambda a, b: a + b).query(0, 2) 6 >>> SegmentTree([3, 1, 2], min).query(0, 2) 1 >>> SegmentTree([2, 3, 1], max).query(0, 2) 3 >>> st = SegmentTree([1, 5, 7, -1, 6], lambda a, b: a + b) >>> st.update(1, -1) >>> st.update(2, 3) >>> st.query(1, 2) 2 >>> st.query(1, 1) -1 >>> st.update(4, 1) >>> st.query(3, 4) 0 >>> st = SegmentTree([[1, 2, 3], [3, 2, 1], [1, 1, 1]], lambda a, b: [a[i] + b[i] for i ... in range(len(a))]) >>> st.query(0, 1) [4, 4, 4] >>> st.query(1, 2) [4, 3, 2] >>> st.update(1, [-1, -1, -1]) >>> st.query(1, 2) [0, 0, 0] >>> st.query(0, 2) [1, 2, 3] """ from __future__ import annotations from collections.abc import Callable from typing import Any, Generic, TypeVar T = TypeVar("T") class SegmentTree(Generic[T]): def __init__(self, arr: list[T], fnc: Callable[[T, T], T]) -> None: """ Segment Tree constructor, it works just with commutative combiner. :param arr: list of elements for the segment tree :param fnc: commutative function for combine two elements >>> SegmentTree(['a', 'b', 'c'], lambda a, b: f'{a}{b}').query(0, 2) 'abc' >>> SegmentTree([(1, 2), (2, 3), (3, 4)], ... lambda a, b: (a[0] + b[0], a[1] + b[1])).query(0, 2) (6, 9) """ any_type: Any | T = None self.N: int = len(arr) self.st: list[T] = [any_type for _ in range(self.N)] + arr self.fn = fnc self.build() def build(self) -> None: for p in range(self.N - 1, 0, -1): self.st[p] = self.fn(self.st[p * 2], self.st[p * 2 + 1]) def update(self, p: int, v: T) -> None: """ Update an element in log(N) time :param p: position to be update :param v: new value >>> st = SegmentTree([3, 1, 2, 4], min) >>> st.query(0, 3) 1 >>> st.update(2, -1) >>> st.query(0, 3) -1 """ p += self.N self.st[p] = v while p > 1: p = p // 2 self.st[p] = self.fn(self.st[p * 2], self.st[p * 2 + 1]) def query(self, l: int, r: int) -> T | None: # noqa: E741 """ Get range query value in log(N) time :param l: left element index :param r: right element index :return: element combined in the range [l, r] >>> st = SegmentTree([1, 2, 3, 4], lambda a, b: a + b) >>> st.query(0, 2) 6 >>> st.query(1, 2) 5 >>> st.query(0, 3) 10 >>> st.query(2, 3) 7 """ l, r = l + self.N, r + self.N res: T | None = None while l <= r: # noqa: E741 if l % 2 == 1: res = self.st[l] if res is None else self.fn(res, self.st[l]) if r % 2 == 0: res = self.st[r] if res is None else self.fn(res, self.st[r]) l, r = (l + 1) // 2, (r - 1) // 2 return res if __name__ == "__main__": from functools import reduce test_array = [1, 10, -2, 9, -3, 8, 4, -7, 5, 6, 11, -12] test_updates = { 0: 7, 1: 2, 2: 6, 3: -14, 4: 5, 5: 4, 6: 7, 7: -10, 8: 9, 9: 10, 10: 12, 11: 1, } min_segment_tree = SegmentTree(test_array, min) max_segment_tree = SegmentTree(test_array, max) sum_segment_tree = SegmentTree(test_array, lambda a, b: a + b) def test_all_segments() -> None: """ Test all possible segments """ for i in range(len(test_array)): for j in range(i, len(test_array)): min_range = reduce(min, test_array[i : j + 1]) max_range = reduce(max, test_array[i : j + 1]) sum_range = reduce(lambda a, b: a + b, test_array[i : j + 1]) assert min_range == min_segment_tree.query(i, j) assert max_range == max_segment_tree.query(i, j) assert sum_range == 
sum_segment_tree.query(i, j) test_all_segments() for index, value in test_updates.items(): test_array[index] = value min_segment_tree.update(index, value) max_segment_tree.update(index, value) sum_segment_tree.update(index, value) test_all_segments()
1
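For quick orientation alongside the segment-tree record above, here is a minimal sketch (illustrative only, not part of the PR content) of the same iterative idea: leaves stored at positions n..2n-1, parents combined bottom-up, and a range query that walks both borders of the interval toward the root.

```python
# Minimal iterative segment tree sketch for range sums (illustrative only).
def build(arr: list[int]) -> tuple[list[int], int]:
    n = len(arr)
    st = [0] * n + list(arr)            # leaves live at st[n:2n]
    for p in range(n - 1, 0, -1):       # parents combined bottom-up
        st[p] = st[2 * p] + st[2 * p + 1]
    return st, n


def range_sum(st: list[int], n: int, left: int, right: int) -> int:
    left, right = left + n, right + n   # inclusive indices, as in the class above
    total = 0
    while left <= right:
        if left % 2 == 1:               # left is a right child: take it, step past it
            total += st[left]
            left += 1
        if right % 2 == 0:              # right is a left child: take it, step past it
            total += st[right]
            right -= 1
        left, right = left // 2, right // 2
    return total


st, n = build([1, 2, 3, 4])
assert range_sum(st, n, 0, 2) == 6
assert range_sum(st, n, 1, 2) == 5
```

The combiner here is hard-coded to addition purely for brevity; the class in the record above generalizes the same loop to any commutative combiner.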
TheAlgorithms/Python
7,843
Fix yesqa hook
### Describe your change: I have noticed that yesqa is not working as the flake8 plugin. It should be installed as a separate hook. To have exactly the same flake8 setting plugins should be installed on both hooks. I added a YAML anchor to keep plugins declaration in a single place. Fix warnings that were raised by this hook across the repo. * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
Cjkjvfnby
"2022-10-29T12:58:59Z"
"2022-10-29T13:07:02Z"
18ffc4dec85a85837f71cd6c9b1e630b9d185001
584e743422565decd35b1b6f94cef3ced840698b
Fix yesqa hook. ### Describe your change: I have noticed that yesqa is not working as the flake8 plugin. It should be installed as a separate hook. To have exactly the same flake8 setting plugins should be installed on both hooks. I added a YAML anchor to keep plugins declaration in a single place. Fix warnings that were raised by this hook across the repo. * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
# Author: João Gustavo A. Amorim # Author email: [email protected] # Coding date: jan 2019 # python/black: True # Imports import numpy as np # Class implemented to calculus the index class IndexCalculation: """ # Class Summary This algorithm consists in calculating vegetation indices, these indices can be used for precision agriculture for example (or remote sensing). There are functions to define the data and to calculate the implemented indices. # Vegetation index https://en.wikipedia.org/wiki/Vegetation_Index A Vegetation Index (VI) is a spectral transformation of two or more bands designed to enhance the contribution of vegetation properties and allow reliable spatial and temporal inter-comparisons of terrestrial photosynthetic activity and canopy structural variations # Information about channels (Wavelength range for each) * nir - near-infrared https://www.malvernpanalytical.com/br/products/technology/near-infrared-spectroscopy Wavelength Range 700 nm to 2500 nm * Red Edge https://en.wikipedia.org/wiki/Red_edge Wavelength Range 680 nm to 730 nm * red https://en.wikipedia.org/wiki/Color Wavelength Range 635 nm to 700 nm * blue https://en.wikipedia.org/wiki/Color Wavelength Range 450 nm to 490 nm * green https://en.wikipedia.org/wiki/Color Wavelength Range 520 nm to 560 nm # Implemented index list #"abbreviationOfIndexName" -- list of channels used #"ARVI2" -- red, nir #"CCCI" -- red, redEdge, nir #"CVI" -- red, green, nir #"GLI" -- red, green, blue #"NDVI" -- red, nir #"BNDVI" -- blue, nir #"redEdgeNDVI" -- red, redEdge #"GNDVI" -- green, nir #"GBNDVI" -- green, blue, nir #"GRNDVI" -- red, green, nir #"RBNDVI" -- red, blue, nir #"PNDVI" -- red, green, blue, nir #"ATSAVI" -- red, nir #"BWDRVI" -- blue, nir #"CIgreen" -- green, nir #"CIrededge" -- redEdge, nir #"CI" -- red, blue #"CTVI" -- red, nir #"GDVI" -- green, nir #"EVI" -- red, blue, nir #"GEMI" -- red, nir #"GOSAVI" -- green, nir #"GSAVI" -- green, nir #"Hue" -- red, green, blue #"IVI" -- red, nir #"IPVI" -- red, nir #"I" -- red, green, blue #"RVI" -- red, nir #"MRVI" -- red, nir #"MSAVI" -- red, nir #"NormG" -- red, green, nir #"NormNIR" -- red, green, nir #"NormR" -- red, green, nir #"NGRDI" -- red, green #"RI" -- red, green #"S" -- red, green, blue #"IF" -- red, green, blue #"DVI" -- red, nir #"TVI" -- red, nir #"NDRE" -- redEdge, nir #list of all index implemented #allIndex = ["ARVI2", "CCCI", "CVI", "GLI", "NDVI", "BNDVI", "redEdgeNDVI", "GNDVI", "GBNDVI", "GRNDVI", "RBNDVI", "PNDVI", "ATSAVI", "BWDRVI", "CIgreen", "CIrededge", "CI", "CTVI", "GDVI", "EVI", "GEMI", "GOSAVI", "GSAVI", "Hue", "IVI", "IPVI", "I", "RVI", "MRVI", "MSAVI", "NormG", "NormNIR", "NormR", "NGRDI", "RI", "S", "IF", "DVI", "TVI", "NDRE"] #list of index with not blue channel #notBlueIndex = ["ARVI2", "CCCI", "CVI", "NDVI", "redEdgeNDVI", "GNDVI", "GRNDVI", "ATSAVI", "CIgreen", "CIrededge", "CTVI", "GDVI", "GEMI", "GOSAVI", "GSAVI", "IVI", "IPVI", "RVI", "MRVI", "MSAVI", "NormG", "NormNIR", "NormR", "NGRDI", "RI", "DVI", "TVI", "NDRE"] #list of index just with RGB channels #RGBIndex = ["GLI", "CI", "Hue", "I", "NGRDI", "RI", "S", "IF"] """ def __init__(self, red=None, green=None, blue=None, red_edge=None, nir=None): self.set_matricies(red=red, green=green, blue=blue, red_edge=red_edge, nir=nir) def set_matricies(self, red=None, green=None, blue=None, red_edge=None, nir=None): if red is not None: self.red = red if green is not None: self.green = green if blue is not None: self.blue = blue if red_edge is not None: self.redEdge = red_edge if nir is not None: 
self.nir = nir return True def calculation( self, index="", red=None, green=None, blue=None, red_edge=None, nir=None ): """ performs the calculation of the index with the values instantiated in the class :str index: abbreviation of index name to perform """ self.set_matricies(red=red, green=green, blue=blue, red_edge=red_edge, nir=nir) funcs = { "ARVI2": self.arv12, "CCCI": self.ccci, "CVI": self.cvi, "GLI": self.gli, "NDVI": self.ndvi, "BNDVI": self.bndvi, "redEdgeNDVI": self.red_edge_ndvi, "GNDVI": self.gndvi, "GBNDVI": self.gbndvi, "GRNDVI": self.grndvi, "RBNDVI": self.rbndvi, "PNDVI": self.pndvi, "ATSAVI": self.atsavi, "BWDRVI": self.bwdrvi, "CIgreen": self.ci_green, "CIrededge": self.ci_rededge, "CI": self.ci, "CTVI": self.ctvi, "GDVI": self.gdvi, "EVI": self.evi, "GEMI": self.gemi, "GOSAVI": self.gosavi, "GSAVI": self.gsavi, "Hue": self.hue, "IVI": self.ivi, "IPVI": self.ipvi, "I": self.i, "RVI": self.rvi, "MRVI": self.mrvi, "MSAVI": self.m_savi, "NormG": self.norm_g, "NormNIR": self.norm_nir, "NormR": self.norm_r, "NGRDI": self.ngrdi, "RI": self.ri, "S": self.s, "IF": self._if, "DVI": self.dvi, "TVI": self.tvi, "NDRE": self.ndre, } try: return funcs[index]() except KeyError: print("Index not in the list!") return False def arv12(self): """ Atmospherically Resistant Vegetation Index 2 https://www.indexdatabase.de/db/i-single.php?id=396 :return: index −0.18+1.17*(self.nir−self.red)/(self.nir+self.red) """ return -0.18 + (1.17 * ((self.nir - self.red) / (self.nir + self.red))) def ccci(self): """ Canopy Chlorophyll Content Index https://www.indexdatabase.de/db/i-single.php?id=224 :return: index """ return ((self.nir - self.redEdge) / (self.nir + self.redEdge)) / ( (self.nir - self.red) / (self.nir + self.red) ) def cvi(self): """ Chlorophyll vegetation index https://www.indexdatabase.de/db/i-single.php?id=391 :return: index """ return self.nir * (self.red / (self.green**2)) def gli(self): """ self.green leaf index https://www.indexdatabase.de/db/i-single.php?id=375 :return: index """ return (2 * self.green - self.red - self.blue) / ( 2 * self.green + self.red + self.blue ) def ndvi(self): """ Normalized Difference self.nir/self.red Normalized Difference Vegetation Index, Calibrated NDVI - CDVI https://www.indexdatabase.de/db/i-single.php?id=58 :return: index """ return (self.nir - self.red) / (self.nir + self.red) def bndvi(self): """ Normalized Difference self.nir/self.blue self.blue-normalized difference vegetation index https://www.indexdatabase.de/db/i-single.php?id=135 :return: index """ return (self.nir - self.blue) / (self.nir + self.blue) def red_edge_ndvi(self): """ Normalized Difference self.rededge/self.red https://www.indexdatabase.de/db/i-single.php?id=235 :return: index """ return (self.redEdge - self.red) / (self.redEdge + self.red) def gndvi(self): """ Normalized Difference self.nir/self.green self.green NDVI https://www.indexdatabase.de/db/i-single.php?id=401 :return: index """ return (self.nir - self.green) / (self.nir + self.green) def gbndvi(self): """ self.green-self.blue NDVI https://www.indexdatabase.de/db/i-single.php?id=186 :return: index """ return (self.nir - (self.green + self.blue)) / ( self.nir + (self.green + self.blue) ) def grndvi(self): """ self.green-self.red NDVI https://www.indexdatabase.de/db/i-single.php?id=185 :return: index """ return (self.nir - (self.green + self.red)) / ( self.nir + (self.green + self.red) ) def rbndvi(self): """ self.red-self.blue NDVI https://www.indexdatabase.de/db/i-single.php?id=187 :return: index """ return (self.nir - 
(self.blue + self.red)) / (self.nir + (self.blue + self.red)) def pndvi(self): """ Pan NDVI https://www.indexdatabase.de/db/i-single.php?id=188 :return: index """ return (self.nir - (self.green + self.red + self.blue)) / ( self.nir + (self.green + self.red + self.blue) ) def atsavi(self, x=0.08, a=1.22, b=0.03): """ Adjusted transformed soil-adjusted VI https://www.indexdatabase.de/db/i-single.php?id=209 :return: index """ return a * ( (self.nir - a * self.red - b) / (a * self.nir + self.red - a * b + x * (1 + a**2)) ) def bwdrvi(self): """ self.blue-wide dynamic range vegetation index https://www.indexdatabase.de/db/i-single.php?id=136 :return: index """ return (0.1 * self.nir - self.blue) / (0.1 * self.nir + self.blue) def ci_green(self): """ Chlorophyll Index self.green https://www.indexdatabase.de/db/i-single.php?id=128 :return: index """ return (self.nir / self.green) - 1 def ci_rededge(self): """ Chlorophyll Index self.redEdge https://www.indexdatabase.de/db/i-single.php?id=131 :return: index """ return (self.nir / self.redEdge) - 1 def ci(self): """ Coloration Index https://www.indexdatabase.de/db/i-single.php?id=11 :return: index """ return (self.red - self.blue) / self.red def ctvi(self): """ Corrected Transformed Vegetation Index https://www.indexdatabase.de/db/i-single.php?id=244 :return: index """ ndvi = self.ndvi() return ((ndvi + 0.5) / (abs(ndvi + 0.5))) * (abs(ndvi + 0.5) ** (1 / 2)) def gdvi(self): """ Difference self.nir/self.green self.green Difference Vegetation Index https://www.indexdatabase.de/db/i-single.php?id=27 :return: index """ return self.nir - self.green def evi(self): """ Enhanced Vegetation Index https://www.indexdatabase.de/db/i-single.php?id=16 :return: index """ return 2.5 * ( (self.nir - self.red) / (self.nir + 6 * self.red - 7.5 * self.blue + 1) ) def gemi(self): """ Global Environment Monitoring Index https://www.indexdatabase.de/db/i-single.php?id=25 :return: index """ n = (2 * (self.nir**2 - self.red**2) + 1.5 * self.nir + 0.5 * self.red) / ( self.nir + self.red + 0.5 ) return n * (1 - 0.25 * n) - (self.red - 0.125) / (1 - self.red) def gosavi(self, y=0.16): """ self.green Optimized Soil Adjusted Vegetation Index https://www.indexdatabase.de/db/i-single.php?id=29 mit Y = 0,16 :return: index """ return (self.nir - self.green) / (self.nir + self.green + y) def gsavi(self, n=0.5): """ self.green Soil Adjusted Vegetation Index https://www.indexdatabase.de/db/i-single.php?id=31 mit N = 0,5 :return: index """ return ((self.nir - self.green) / (self.nir + self.green + n)) * (1 + n) def hue(self): """ Hue https://www.indexdatabase.de/db/i-single.php?id=34 :return: index """ return np.arctan( ((2 * self.red - self.green - self.blue) / 30.5) * (self.green - self.blue) ) def ivi(self, a=None, b=None): """ Ideal vegetation index https://www.indexdatabase.de/db/i-single.php?id=276 b=intercept of vegetation line a=soil line slope :return: index """ return (self.nir - b) / (a * self.red) def ipvi(self): """ Infraself.red percentage vegetation index https://www.indexdatabase.de/db/i-single.php?id=35 :return: index """ return (self.nir / ((self.nir + self.red) / 2)) * (self.ndvi() + 1) def i(self): # noqa: E741,E743 """ Intensity https://www.indexdatabase.de/db/i-single.php?id=36 :return: index """ return (self.red + self.green + self.blue) / 30.5 def rvi(self): """ Ratio-Vegetation-Index http://www.seos-project.eu/modules/remotesensing/remotesensing-c03-s01-p01.html :return: index """ return self.nir / self.red def mrvi(self): """ Modified Normalized Difference 
Vegetation Index RVI https://www.indexdatabase.de/db/i-single.php?id=275 :return: index """ return (self.rvi() - 1) / (self.rvi() + 1) def m_savi(self): """ Modified Soil Adjusted Vegetation Index https://www.indexdatabase.de/db/i-single.php?id=44 :return: index """ return ( (2 * self.nir + 1) - ((2 * self.nir + 1) ** 2 - 8 * (self.nir - self.red)) ** (1 / 2) ) / 2 def norm_g(self): """ Norm G https://www.indexdatabase.de/db/i-single.php?id=50 :return: index """ return self.green / (self.nir + self.red + self.green) def norm_nir(self): """ Norm self.nir https://www.indexdatabase.de/db/i-single.php?id=51 :return: index """ return self.nir / (self.nir + self.red + self.green) def norm_r(self): """ Norm R https://www.indexdatabase.de/db/i-single.php?id=52 :return: index """ return self.red / (self.nir + self.red + self.green) def ngrdi(self): """ Normalized Difference self.green/self.red Normalized self.green self.red difference index, Visible Atmospherically Resistant Indices self.green (VIself.green) https://www.indexdatabase.de/db/i-single.php?id=390 :return: index """ return (self.green - self.red) / (self.green + self.red) def ri(self): """ Normalized Difference self.red/self.green self.redness Index https://www.indexdatabase.de/db/i-single.php?id=74 :return: index """ return (self.red - self.green) / (self.red + self.green) def s(self): """ Saturation https://www.indexdatabase.de/db/i-single.php?id=77 :return: index """ max_value = np.max([np.max(self.red), np.max(self.green), np.max(self.blue)]) min_value = np.min([np.min(self.red), np.min(self.green), np.min(self.blue)]) return (max_value - min_value) / max_value def _if(self): """ Shape Index https://www.indexdatabase.de/db/i-single.php?id=79 :return: index """ return (2 * self.red - self.green - self.blue) / (self.green - self.blue) def dvi(self): """ Simple Ratio self.nir/self.red Difference Vegetation Index, Vegetation Index Number (VIN) https://www.indexdatabase.de/db/i-single.php?id=12 :return: index """ return self.nir / self.red def tvi(self): """ Transformed Vegetation Index https://www.indexdatabase.de/db/i-single.php?id=98 :return: index """ return (self.ndvi() + 0.5) ** (1 / 2) def ndre(self): return (self.nir - self.redEdge) / (self.nir + self.redEdge) """ # genering a random matrices to test this class red = np.ones((1000,1000, 1),dtype="float64") * 46787 green = np.ones((1000,1000, 1),dtype="float64") * 23487 blue = np.ones((1000,1000, 1),dtype="float64") * 14578 redEdge = np.ones((1000,1000, 1),dtype="float64") * 51045 nir = np.ones((1000,1000, 1),dtype="float64") * 52200 # Examples of how to use the class # instantiating the class cl = IndexCalculation() # instantiating the class with the values #cl = indexCalculation(red=red, green=green, blue=blue, redEdge=redEdge, nir=nir) # how set the values after instantiate the class cl, (for update the data or when don't # instantiating the class with the values) cl.setMatrices(red=red, green=green, blue=blue, redEdge=redEdge, nir=nir) # calculating the indices for the instantiated values in the class # Note: the CCCI index can be changed to any index implemented in the class. 
indexValue_form1 = cl.calculation("CCCI", red=red, green=green, blue=blue, redEdge=redEdge, nir=nir).astype(np.float64) indexValue_form2 = cl.CCCI() # calculating the index with the values directly -- you can set just the values # preferred note: the *calculation* function performs the function *setMatrices* indexValue_form3 = cl.calculation("CCCI", red=red, green=green, blue=blue, redEdge=redEdge, nir=nir).astype(np.float64) print("Form 1: "+np.array2string(indexValue_form1, precision=20, separator=', ', floatmode='maxprec_equal')) print("Form 2: "+np.array2string(indexValue_form2, precision=20, separator=', ', floatmode='maxprec_equal')) print("Form 3: "+np.array2string(indexValue_form3, precision=20, separator=', ', floatmode='maxprec_equal')) # A list of examples results for different type of data at NDVI # float16 -> 0.31567383 #NDVI (red = 50, nir = 100) # float32 -> 0.31578946 #NDVI (red = 50, nir = 100) # float64 -> 0.3157894736842105 #NDVI (red = 50, nir = 100) # longdouble -> 0.3157894736842105 #NDVI (red = 50, nir = 100) """
# Author: João Gustavo A. Amorim # Author email: [email protected] # Coding date: jan 2019 # python/black: True # Imports import numpy as np # Class implemented to calculus the index class IndexCalculation: """ # Class Summary This algorithm consists in calculating vegetation indices, these indices can be used for precision agriculture for example (or remote sensing). There are functions to define the data and to calculate the implemented indices. # Vegetation index https://en.wikipedia.org/wiki/Vegetation_Index A Vegetation Index (VI) is a spectral transformation of two or more bands designed to enhance the contribution of vegetation properties and allow reliable spatial and temporal inter-comparisons of terrestrial photosynthetic activity and canopy structural variations # Information about channels (Wavelength range for each) * nir - near-infrared https://www.malvernpanalytical.com/br/products/technology/near-infrared-spectroscopy Wavelength Range 700 nm to 2500 nm * Red Edge https://en.wikipedia.org/wiki/Red_edge Wavelength Range 680 nm to 730 nm * red https://en.wikipedia.org/wiki/Color Wavelength Range 635 nm to 700 nm * blue https://en.wikipedia.org/wiki/Color Wavelength Range 450 nm to 490 nm * green https://en.wikipedia.org/wiki/Color Wavelength Range 520 nm to 560 nm # Implemented index list #"abbreviationOfIndexName" -- list of channels used #"ARVI2" -- red, nir #"CCCI" -- red, redEdge, nir #"CVI" -- red, green, nir #"GLI" -- red, green, blue #"NDVI" -- red, nir #"BNDVI" -- blue, nir #"redEdgeNDVI" -- red, redEdge #"GNDVI" -- green, nir #"GBNDVI" -- green, blue, nir #"GRNDVI" -- red, green, nir #"RBNDVI" -- red, blue, nir #"PNDVI" -- red, green, blue, nir #"ATSAVI" -- red, nir #"BWDRVI" -- blue, nir #"CIgreen" -- green, nir #"CIrededge" -- redEdge, nir #"CI" -- red, blue #"CTVI" -- red, nir #"GDVI" -- green, nir #"EVI" -- red, blue, nir #"GEMI" -- red, nir #"GOSAVI" -- green, nir #"GSAVI" -- green, nir #"Hue" -- red, green, blue #"IVI" -- red, nir #"IPVI" -- red, nir #"I" -- red, green, blue #"RVI" -- red, nir #"MRVI" -- red, nir #"MSAVI" -- red, nir #"NormG" -- red, green, nir #"NormNIR" -- red, green, nir #"NormR" -- red, green, nir #"NGRDI" -- red, green #"RI" -- red, green #"S" -- red, green, blue #"IF" -- red, green, blue #"DVI" -- red, nir #"TVI" -- red, nir #"NDRE" -- redEdge, nir #list of all index implemented #allIndex = ["ARVI2", "CCCI", "CVI", "GLI", "NDVI", "BNDVI", "redEdgeNDVI", "GNDVI", "GBNDVI", "GRNDVI", "RBNDVI", "PNDVI", "ATSAVI", "BWDRVI", "CIgreen", "CIrededge", "CI", "CTVI", "GDVI", "EVI", "GEMI", "GOSAVI", "GSAVI", "Hue", "IVI", "IPVI", "I", "RVI", "MRVI", "MSAVI", "NormG", "NormNIR", "NormR", "NGRDI", "RI", "S", "IF", "DVI", "TVI", "NDRE"] #list of index with not blue channel #notBlueIndex = ["ARVI2", "CCCI", "CVI", "NDVI", "redEdgeNDVI", "GNDVI", "GRNDVI", "ATSAVI", "CIgreen", "CIrededge", "CTVI", "GDVI", "GEMI", "GOSAVI", "GSAVI", "IVI", "IPVI", "RVI", "MRVI", "MSAVI", "NormG", "NormNIR", "NormR", "NGRDI", "RI", "DVI", "TVI", "NDRE"] #list of index just with RGB channels #RGBIndex = ["GLI", "CI", "Hue", "I", "NGRDI", "RI", "S", "IF"] """ def __init__(self, red=None, green=None, blue=None, red_edge=None, nir=None): self.set_matricies(red=red, green=green, blue=blue, red_edge=red_edge, nir=nir) def set_matricies(self, red=None, green=None, blue=None, red_edge=None, nir=None): if red is not None: self.red = red if green is not None: self.green = green if blue is not None: self.blue = blue if red_edge is not None: self.redEdge = red_edge if nir is not None: 
self.nir = nir return True def calculation( self, index="", red=None, green=None, blue=None, red_edge=None, nir=None ): """ performs the calculation of the index with the values instantiated in the class :str index: abbreviation of index name to perform """ self.set_matricies(red=red, green=green, blue=blue, red_edge=red_edge, nir=nir) funcs = { "ARVI2": self.arv12, "CCCI": self.ccci, "CVI": self.cvi, "GLI": self.gli, "NDVI": self.ndvi, "BNDVI": self.bndvi, "redEdgeNDVI": self.red_edge_ndvi, "GNDVI": self.gndvi, "GBNDVI": self.gbndvi, "GRNDVI": self.grndvi, "RBNDVI": self.rbndvi, "PNDVI": self.pndvi, "ATSAVI": self.atsavi, "BWDRVI": self.bwdrvi, "CIgreen": self.ci_green, "CIrededge": self.ci_rededge, "CI": self.ci, "CTVI": self.ctvi, "GDVI": self.gdvi, "EVI": self.evi, "GEMI": self.gemi, "GOSAVI": self.gosavi, "GSAVI": self.gsavi, "Hue": self.hue, "IVI": self.ivi, "IPVI": self.ipvi, "I": self.i, "RVI": self.rvi, "MRVI": self.mrvi, "MSAVI": self.m_savi, "NormG": self.norm_g, "NormNIR": self.norm_nir, "NormR": self.norm_r, "NGRDI": self.ngrdi, "RI": self.ri, "S": self.s, "IF": self._if, "DVI": self.dvi, "TVI": self.tvi, "NDRE": self.ndre, } try: return funcs[index]() except KeyError: print("Index not in the list!") return False def arv12(self): """ Atmospherically Resistant Vegetation Index 2 https://www.indexdatabase.de/db/i-single.php?id=396 :return: index −0.18+1.17*(self.nir−self.red)/(self.nir+self.red) """ return -0.18 + (1.17 * ((self.nir - self.red) / (self.nir + self.red))) def ccci(self): """ Canopy Chlorophyll Content Index https://www.indexdatabase.de/db/i-single.php?id=224 :return: index """ return ((self.nir - self.redEdge) / (self.nir + self.redEdge)) / ( (self.nir - self.red) / (self.nir + self.red) ) def cvi(self): """ Chlorophyll vegetation index https://www.indexdatabase.de/db/i-single.php?id=391 :return: index """ return self.nir * (self.red / (self.green**2)) def gli(self): """ self.green leaf index https://www.indexdatabase.de/db/i-single.php?id=375 :return: index """ return (2 * self.green - self.red - self.blue) / ( 2 * self.green + self.red + self.blue ) def ndvi(self): """ Normalized Difference self.nir/self.red Normalized Difference Vegetation Index, Calibrated NDVI - CDVI https://www.indexdatabase.de/db/i-single.php?id=58 :return: index """ return (self.nir - self.red) / (self.nir + self.red) def bndvi(self): """ Normalized Difference self.nir/self.blue self.blue-normalized difference vegetation index https://www.indexdatabase.de/db/i-single.php?id=135 :return: index """ return (self.nir - self.blue) / (self.nir + self.blue) def red_edge_ndvi(self): """ Normalized Difference self.rededge/self.red https://www.indexdatabase.de/db/i-single.php?id=235 :return: index """ return (self.redEdge - self.red) / (self.redEdge + self.red) def gndvi(self): """ Normalized Difference self.nir/self.green self.green NDVI https://www.indexdatabase.de/db/i-single.php?id=401 :return: index """ return (self.nir - self.green) / (self.nir + self.green) def gbndvi(self): """ self.green-self.blue NDVI https://www.indexdatabase.de/db/i-single.php?id=186 :return: index """ return (self.nir - (self.green + self.blue)) / ( self.nir + (self.green + self.blue) ) def grndvi(self): """ self.green-self.red NDVI https://www.indexdatabase.de/db/i-single.php?id=185 :return: index """ return (self.nir - (self.green + self.red)) / ( self.nir + (self.green + self.red) ) def rbndvi(self): """ self.red-self.blue NDVI https://www.indexdatabase.de/db/i-single.php?id=187 :return: index """ return (self.nir - 
(self.blue + self.red)) / (self.nir + (self.blue + self.red)) def pndvi(self): """ Pan NDVI https://www.indexdatabase.de/db/i-single.php?id=188 :return: index """ return (self.nir - (self.green + self.red + self.blue)) / ( self.nir + (self.green + self.red + self.blue) ) def atsavi(self, x=0.08, a=1.22, b=0.03): """ Adjusted transformed soil-adjusted VI https://www.indexdatabase.de/db/i-single.php?id=209 :return: index """ return a * ( (self.nir - a * self.red - b) / (a * self.nir + self.red - a * b + x * (1 + a**2)) ) def bwdrvi(self): """ self.blue-wide dynamic range vegetation index https://www.indexdatabase.de/db/i-single.php?id=136 :return: index """ return (0.1 * self.nir - self.blue) / (0.1 * self.nir + self.blue) def ci_green(self): """ Chlorophyll Index self.green https://www.indexdatabase.de/db/i-single.php?id=128 :return: index """ return (self.nir / self.green) - 1 def ci_rededge(self): """ Chlorophyll Index self.redEdge https://www.indexdatabase.de/db/i-single.php?id=131 :return: index """ return (self.nir / self.redEdge) - 1 def ci(self): """ Coloration Index https://www.indexdatabase.de/db/i-single.php?id=11 :return: index """ return (self.red - self.blue) / self.red def ctvi(self): """ Corrected Transformed Vegetation Index https://www.indexdatabase.de/db/i-single.php?id=244 :return: index """ ndvi = self.ndvi() return ((ndvi + 0.5) / (abs(ndvi + 0.5))) * (abs(ndvi + 0.5) ** (1 / 2)) def gdvi(self): """ Difference self.nir/self.green self.green Difference Vegetation Index https://www.indexdatabase.de/db/i-single.php?id=27 :return: index """ return self.nir - self.green def evi(self): """ Enhanced Vegetation Index https://www.indexdatabase.de/db/i-single.php?id=16 :return: index """ return 2.5 * ( (self.nir - self.red) / (self.nir + 6 * self.red - 7.5 * self.blue + 1) ) def gemi(self): """ Global Environment Monitoring Index https://www.indexdatabase.de/db/i-single.php?id=25 :return: index """ n = (2 * (self.nir**2 - self.red**2) + 1.5 * self.nir + 0.5 * self.red) / ( self.nir + self.red + 0.5 ) return n * (1 - 0.25 * n) - (self.red - 0.125) / (1 - self.red) def gosavi(self, y=0.16): """ self.green Optimized Soil Adjusted Vegetation Index https://www.indexdatabase.de/db/i-single.php?id=29 mit Y = 0,16 :return: index """ return (self.nir - self.green) / (self.nir + self.green + y) def gsavi(self, n=0.5): """ self.green Soil Adjusted Vegetation Index https://www.indexdatabase.de/db/i-single.php?id=31 mit N = 0,5 :return: index """ return ((self.nir - self.green) / (self.nir + self.green + n)) * (1 + n) def hue(self): """ Hue https://www.indexdatabase.de/db/i-single.php?id=34 :return: index """ return np.arctan( ((2 * self.red - self.green - self.blue) / 30.5) * (self.green - self.blue) ) def ivi(self, a=None, b=None): """ Ideal vegetation index https://www.indexdatabase.de/db/i-single.php?id=276 b=intercept of vegetation line a=soil line slope :return: index """ return (self.nir - b) / (a * self.red) def ipvi(self): """ Infraself.red percentage vegetation index https://www.indexdatabase.de/db/i-single.php?id=35 :return: index """ return (self.nir / ((self.nir + self.red) / 2)) * (self.ndvi() + 1) def i(self): """ Intensity https://www.indexdatabase.de/db/i-single.php?id=36 :return: index """ return (self.red + self.green + self.blue) / 30.5 def rvi(self): """ Ratio-Vegetation-Index http://www.seos-project.eu/modules/remotesensing/remotesensing-c03-s01-p01.html :return: index """ return self.nir / self.red def mrvi(self): """ Modified Normalized Difference Vegetation Index RVI 
https://www.indexdatabase.de/db/i-single.php?id=275 :return: index """ return (self.rvi() - 1) / (self.rvi() + 1) def m_savi(self): """ Modified Soil Adjusted Vegetation Index https://www.indexdatabase.de/db/i-single.php?id=44 :return: index """ return ( (2 * self.nir + 1) - ((2 * self.nir + 1) ** 2 - 8 * (self.nir - self.red)) ** (1 / 2) ) / 2 def norm_g(self): """ Norm G https://www.indexdatabase.de/db/i-single.php?id=50 :return: index """ return self.green / (self.nir + self.red + self.green) def norm_nir(self): """ Norm self.nir https://www.indexdatabase.de/db/i-single.php?id=51 :return: index """ return self.nir / (self.nir + self.red + self.green) def norm_r(self): """ Norm R https://www.indexdatabase.de/db/i-single.php?id=52 :return: index """ return self.red / (self.nir + self.red + self.green) def ngrdi(self): """ Normalized Difference self.green/self.red Normalized self.green self.red difference index, Visible Atmospherically Resistant Indices self.green (VIself.green) https://www.indexdatabase.de/db/i-single.php?id=390 :return: index """ return (self.green - self.red) / (self.green + self.red) def ri(self): """ Normalized Difference self.red/self.green self.redness Index https://www.indexdatabase.de/db/i-single.php?id=74 :return: index """ return (self.red - self.green) / (self.red + self.green) def s(self): """ Saturation https://www.indexdatabase.de/db/i-single.php?id=77 :return: index """ max_value = np.max([np.max(self.red), np.max(self.green), np.max(self.blue)]) min_value = np.min([np.min(self.red), np.min(self.green), np.min(self.blue)]) return (max_value - min_value) / max_value def _if(self): """ Shape Index https://www.indexdatabase.de/db/i-single.php?id=79 :return: index """ return (2 * self.red - self.green - self.blue) / (self.green - self.blue) def dvi(self): """ Simple Ratio self.nir/self.red Difference Vegetation Index, Vegetation Index Number (VIN) https://www.indexdatabase.de/db/i-single.php?id=12 :return: index """ return self.nir / self.red def tvi(self): """ Transformed Vegetation Index https://www.indexdatabase.de/db/i-single.php?id=98 :return: index """ return (self.ndvi() + 0.5) ** (1 / 2) def ndre(self): return (self.nir - self.redEdge) / (self.nir + self.redEdge) """ # genering a random matrices to test this class red = np.ones((1000,1000, 1),dtype="float64") * 46787 green = np.ones((1000,1000, 1),dtype="float64") * 23487 blue = np.ones((1000,1000, 1),dtype="float64") * 14578 redEdge = np.ones((1000,1000, 1),dtype="float64") * 51045 nir = np.ones((1000,1000, 1),dtype="float64") * 52200 # Examples of how to use the class # instantiating the class cl = IndexCalculation() # instantiating the class with the values #cl = indexCalculation(red=red, green=green, blue=blue, redEdge=redEdge, nir=nir) # how set the values after instantiate the class cl, (for update the data or when don't # instantiating the class with the values) cl.setMatrices(red=red, green=green, blue=blue, redEdge=redEdge, nir=nir) # calculating the indices for the instantiated values in the class # Note: the CCCI index can be changed to any index implemented in the class. 
indexValue_form1 = cl.calculation("CCCI", red=red, green=green, blue=blue, redEdge=redEdge, nir=nir).astype(np.float64) indexValue_form2 = cl.CCCI() # calculating the index with the values directly -- you can set just the values # preferred note: the *calculation* function performs the function *setMatrices* indexValue_form3 = cl.calculation("CCCI", red=red, green=green, blue=blue, redEdge=redEdge, nir=nir).astype(np.float64) print("Form 1: "+np.array2string(indexValue_form1, precision=20, separator=', ', floatmode='maxprec_equal')) print("Form 2: "+np.array2string(indexValue_form2, precision=20, separator=', ', floatmode='maxprec_equal')) print("Form 3: "+np.array2string(indexValue_form3, precision=20, separator=', ', floatmode='maxprec_equal')) # A list of examples results for different type of data at NDVI # float16 -> 0.31567383 #NDVI (red = 50, nir = 100) # float32 -> 0.31578946 #NDVI (red = 50, nir = 100) # float64 -> 0.3157894736842105 #NDVI (red = 50, nir = 100) # longdouble -> 0.3157894736842105 #NDVI (red = 50, nir = 100) """
1
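As a rough illustration of the element-wise band arithmetic the IndexCalculation record above performs, the sketch below (a hypothetical standalone helper, not part of the PR content) computes NDVI for NumPy arrays; the printed value matches the float64 figure quoted in the record's own comments (red = 50, nir = 100).

```python
import numpy as np


# Illustrative sketch only: NDVI = (NIR - Red) / (NIR + Red), computed element-wise.
def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    nir = nir.astype("float64")
    red = red.astype("float64")
    return (nir - red) / (nir + red)


red = np.full((3, 3), 50.0)
nir = np.full((3, 3), 100.0)
print(ndvi(nir, red))  # every element is 50 / 150 = 0.3157894736842105
```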
TheAlgorithms/Python
7,843
Fix yesqa hook
### Describe your change: I have noticed that yesqa is not working as the flake8 plugin. It should be installed as a separate hook. To have exactly the same flake8 setting plugins should be installed on both hooks. I added a YAML anchor to keep plugins declaration in a single place. Fix warnings that were raised by this hook across the repo. * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
Cjkjvfnby
"2022-10-29T12:58:59Z"
"2022-10-29T13:07:02Z"
18ffc4dec85a85837f71cd6c9b1e630b9d185001
584e743422565decd35b1b6f94cef3ced840698b
Fix yesqa hook. ### Describe your change: I have noticed that yesqa is not working as the flake8 plugin. It should be installed as a separate hook. To have exactly the same flake8 setting plugins should be installed on both hooks. I added a YAML anchor to keep plugins declaration in a single place. Fix warnings that were raised by this hook across the repo. * [ ] Add an algorithm? * [ ] Fix a bug or typo in an existing algorithm? * [x] Documentation change? ### Checklist: * [x] I have read [CONTRIBUTING.md](https://github.com/TheAlgorithms/Python/blob/master/CONTRIBUTING.md). * [ ] This pull request is all my own work -- I have not plagiarized. * [ ] I know that pull requests will not be merged if they fail the automated tests. * [ ] This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms. * [ ] All new Python files are placed inside an existing directory. * [ ] All filenames are in all lowercase characters with no spaces or dashes. * [ ] All functions and variable names follow Python naming conventions. * [ ] All function parameters and return values are annotated with Python [type hints](https://docs.python.org/3/library/typing.html). * [ ] All functions have [doctests](https://docs.python.org/3/library/doctest.html) that pass the automated testing. * [ ] All new algorithms include at least one URL that points to Wikipedia or another similar explanation. * [ ] If this pull request resolves one or more open issues then the commit message contains `Fixes: #{$ISSUE_NO}`.
""" Simple multithreaded algorithm to show how the 4 phases of a genetic algorithm works (Evaluation, Selection, Crossover and Mutation) https://en.wikipedia.org/wiki/Genetic_algorithm Author: D4rkia """ from __future__ import annotations import random # Maximum size of the population. Bigger could be faster but is more memory expensive. N_POPULATION = 200 # Number of elements selected in every generation of evolution. The selection takes # place from best to worst of that generation and must be smaller than N_POPULATION. N_SELECTED = 50 # Probability that an element of a generation can mutate, changing one of its genes. # This will guarantee that all genes will be used during evolution. MUTATION_PROBABILITY = 0.4 # Just a seed to improve randomness required by the algorithm. random.seed(random.randint(0, 1000)) def basic(target: str, genes: list[str], debug: bool = True) -> tuple[int, int, str]: """ Verify that the target contains no genes besides the ones inside genes variable. >>> from string import ascii_lowercase >>> basic("doctest", ascii_lowercase, debug=False)[2] 'doctest' >>> genes = list(ascii_lowercase) >>> genes.remove("e") >>> basic("test", genes) Traceback (most recent call last): ... ValueError: ['e'] is not in genes list, evolution cannot converge >>> genes.remove("s") >>> basic("test", genes) Traceback (most recent call last): ... ValueError: ['e', 's'] is not in genes list, evolution cannot converge >>> genes.remove("t") >>> basic("test", genes) Traceback (most recent call last): ... ValueError: ['e', 's', 't'] is not in genes list, evolution cannot converge """ # Verify if N_POPULATION is bigger than N_SELECTED if N_POPULATION < N_SELECTED: raise ValueError(f"{N_POPULATION} must be bigger than {N_SELECTED}") # Verify that the target contains no genes besides the ones inside genes variable. not_in_genes_list = sorted({c for c in target if c not in genes}) if not_in_genes_list: raise ValueError( f"{not_in_genes_list} is not in genes list, evolution cannot converge" ) # Generate random starting population. population = [] for _ in range(N_POPULATION): population.append("".join([random.choice(genes) for i in range(len(target))])) # Just some logs to know what the algorithms is doing. generation, total_population = 0, 0 # This loop will end when we find a perfect match for our target. while True: generation += 1 total_population += len(population) # Random population created. Now it's time to evaluate. def evaluate(item: str, main_target: str = target) -> tuple[str, float]: """ Evaluate how similar the item is with the target by just counting each char in the right position >>> evaluate("Helxo Worlx", Hello World) ["Helxo Worlx", 9] """ score = len( [g for position, g in enumerate(item) if g == main_target[position]] ) return (item, float(score)) # noqa: B023 # Adding a bit of concurrency can make everything faster, # # import concurrent.futures # population_score: list[tuple[str, float]] = [] # with concurrent.futures.ThreadPoolExecutor( # max_workers=NUM_WORKERS) as executor: # futures = {executor.submit(evaluate, item) for item in population} # concurrent.futures.wait(futures) # population_score = [item.result() for item in futures] # # but with a simple algorithm like this, it will probably be slower. # We just need to call evaluate for every item inside the population. population_score = [evaluate(item) for item in population] # Check if there is a matching evolution. 
        population_score = sorted(population_score, key=lambda x: x[1], reverse=True)
        if population_score[0][0] == target:
            return (generation, total_population, population_score[0][0])

        # Print the best result every 10 generations.
        # Just to know that the algorithm is working.
        if debug and generation % 10 == 0:
            print(
                f"\nGeneration: {generation}"
                f"\nTotal Population:{total_population}"
                f"\nBest score: {population_score[0][1]}"
                f"\nBest string: {population_score[0][0]}"
            )

        # Flush the old population, keeping some of the best evolutions.
        # Keeping these avoids regression of evolution.
        population_best = population[: int(N_POPULATION / 3)]
        population.clear()
        population.extend(population_best)
        # Normalize population score to be between 0 and 1.
        population_score = [
            (item, score / len(target)) for item, score in population_score
        ]

        # Select, crossover and mutate a new population.
        def select(parent_1: tuple[str, float]) -> list[str]:
            """Select the second parent and generate a new population."""
            pop = []
            # Generate more children proportionally to the fitness score.
            child_n = int(parent_1[1] * 100) + 1
            child_n = 10 if child_n >= 10 else child_n
            for _ in range(child_n):
                parent_2 = population_score[  # noqa: B023
                    random.randint(0, N_SELECTED)
                ][0]

                child_1, child_2 = crossover(parent_1[0], parent_2)
                # Append new string to the population list.
                pop.append(mutate(child_1))
                pop.append(mutate(child_2))
            return pop

        def crossover(parent_1: str, parent_2: str) -> tuple[str, str]:
            """Slice and combine two strings at a random point."""
            random_slice = random.randint(0, len(parent_1) - 1)
            child_1 = parent_1[:random_slice] + parent_2[random_slice:]
            child_2 = parent_2[:random_slice] + parent_1[random_slice:]
            return (child_1, child_2)

        def mutate(child: str) -> str:
            """Mutate a random gene of a child with another one from the list."""
            child_list = list(child)
            if random.uniform(0, 1) < MUTATION_PROBABILITY:
                child_list[random.randint(0, len(child)) - 1] = random.choice(genes)
            return "".join(child_list)

        # This is the selection step.
        for i in range(N_SELECTED):
            population.extend(select(population_score[int(i)]))
            # Check if the population has already reached the maximum value and if so,
            # break the cycle. If this check is disabled, the algorithm will take
            # forever to compute large strings, but will also calculate small strings
            # in far fewer generations.
            if len(population) > N_POPULATION:
                break


if __name__ == "__main__":
    target_str = (
        "This is a genetic algorithm to evaluate, combine, evolve, and mutate a string!"
    )
    genes_list = list(
        " ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklm"
        "nopqrstuvwxyz.,;!?+-*#@^'èéòà€ù=)(&%$£/\\"
    )
    generation, population, target = basic(target_str, genes_list)
    print(
        f"\nGeneration: {generation}\nTotal Population: {population}\nTarget: {target}"
    )
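The commented-out block inside `basic` hints at evaluating the population with a thread pool instead of a plain list comprehension. Below is a minimal runnable sketch of that idea; `NUM_WORKERS`, the hard-coded target and the sample strings are assumptions chosen purely for illustration, not code from the module above.

from __future__ import annotations

import concurrent.futures

NUM_WORKERS = 4  # assumed worker count, tune to taste


def evaluate(item: str, main_target: str = "Hello World") -> tuple[str, float]:
    """Score an item by counting the characters that match the target position-wise."""
    score = sum(1 for position, g in enumerate(item) if g == main_target[position])
    return (item, float(score))


def evaluate_population(population: list[str]) -> list[tuple[str, float]]:
    """Evaluate every candidate string concurrently in a thread pool."""
    with concurrent.futures.ThreadPoolExecutor(max_workers=NUM_WORKERS) as executor:
        futures = [executor.submit(evaluate, item) for item in population]
        concurrent.futures.wait(futures)
        return [future.result() for future in futures]


if __name__ == "__main__":
    print(evaluate_population(["Helxo Worlx", "Hello World", "xxxxx xxxxx"]))
    # e.g. [('Helxo Worlx', 9.0), ('Hello World', 11.0), ('xxxxx xxxxx', 1.0)]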
""" Simple multithreaded algorithm to show how the 4 phases of a genetic algorithm works (Evaluation, Selection, Crossover and Mutation) https://en.wikipedia.org/wiki/Genetic_algorithm Author: D4rkia """ from __future__ import annotations import random # Maximum size of the population. Bigger could be faster but is more memory expensive. N_POPULATION = 200 # Number of elements selected in every generation of evolution. The selection takes # place from best to worst of that generation and must be smaller than N_POPULATION. N_SELECTED = 50 # Probability that an element of a generation can mutate, changing one of its genes. # This will guarantee that all genes will be used during evolution. MUTATION_PROBABILITY = 0.4 # Just a seed to improve randomness required by the algorithm. random.seed(random.randint(0, 1000)) def basic(target: str, genes: list[str], debug: bool = True) -> tuple[int, int, str]: """ Verify that the target contains no genes besides the ones inside genes variable. >>> from string import ascii_lowercase >>> basic("doctest", ascii_lowercase, debug=False)[2] 'doctest' >>> genes = list(ascii_lowercase) >>> genes.remove("e") >>> basic("test", genes) Traceback (most recent call last): ... ValueError: ['e'] is not in genes list, evolution cannot converge >>> genes.remove("s") >>> basic("test", genes) Traceback (most recent call last): ... ValueError: ['e', 's'] is not in genes list, evolution cannot converge >>> genes.remove("t") >>> basic("test", genes) Traceback (most recent call last): ... ValueError: ['e', 's', 't'] is not in genes list, evolution cannot converge """ # Verify if N_POPULATION is bigger than N_SELECTED if N_POPULATION < N_SELECTED: raise ValueError(f"{N_POPULATION} must be bigger than {N_SELECTED}") # Verify that the target contains no genes besides the ones inside genes variable. not_in_genes_list = sorted({c for c in target if c not in genes}) if not_in_genes_list: raise ValueError( f"{not_in_genes_list} is not in genes list, evolution cannot converge" ) # Generate random starting population. population = [] for _ in range(N_POPULATION): population.append("".join([random.choice(genes) for i in range(len(target))])) # Just some logs to know what the algorithms is doing. generation, total_population = 0, 0 # This loop will end when we find a perfect match for our target. while True: generation += 1 total_population += len(population) # Random population created. Now it's time to evaluate. def evaluate(item: str, main_target: str = target) -> tuple[str, float]: """ Evaluate how similar the item is with the target by just counting each char in the right position >>> evaluate("Helxo Worlx", Hello World) ["Helxo Worlx", 9] """ score = len( [g for position, g in enumerate(item) if g == main_target[position]] ) return (item, float(score)) # Adding a bit of concurrency can make everything faster, # # import concurrent.futures # population_score: list[tuple[str, float]] = [] # with concurrent.futures.ThreadPoolExecutor( # max_workers=NUM_WORKERS) as executor: # futures = {executor.submit(evaluate, item) for item in population} # concurrent.futures.wait(futures) # population_score = [item.result() for item in futures] # # but with a simple algorithm like this, it will probably be slower. # We just need to call evaluate for every item inside the population. population_score = [evaluate(item) for item in population] # Check if there is a matching evolution. 
        population_score = sorted(population_score, key=lambda x: x[1], reverse=True)
        if population_score[0][0] == target:
            return (generation, total_population, population_score[0][0])

        # Print the best result every 10 generations.
        # Just to know that the algorithm is working.
        if debug and generation % 10 == 0:
            print(
                f"\nGeneration: {generation}"
                f"\nTotal Population:{total_population}"
                f"\nBest score: {population_score[0][1]}"
                f"\nBest string: {population_score[0][0]}"
            )

        # Flush the old population, keeping some of the best evolutions.
        # Keeping these avoids regression of evolution.
        population_best = population[: int(N_POPULATION / 3)]
        population.clear()
        population.extend(population_best)
        # Normalize population score to be between 0 and 1.
        population_score = [
            (item, score / len(target)) for item, score in population_score
        ]

        # Select, crossover and mutate a new population.
        def select(parent_1: tuple[str, float]) -> list[str]:
            """Select the second parent and generate a new population."""
            pop = []
            # Generate more children proportionally to the fitness score.
            child_n = int(parent_1[1] * 100) + 1
            child_n = 10 if child_n >= 10 else child_n
            for _ in range(child_n):
                parent_2 = population_score[  # noqa: B023
                    random.randint(0, N_SELECTED)
                ][0]

                child_1, child_2 = crossover(parent_1[0], parent_2)
                # Append new string to the population list.
                pop.append(mutate(child_1))
                pop.append(mutate(child_2))
            return pop

        def crossover(parent_1: str, parent_2: str) -> tuple[str, str]:
            """Slice and combine two strings at a random point."""
            random_slice = random.randint(0, len(parent_1) - 1)
            child_1 = parent_1[:random_slice] + parent_2[random_slice:]
            child_2 = parent_2[:random_slice] + parent_1[random_slice:]
            return (child_1, child_2)

        def mutate(child: str) -> str:
            """Mutate a random gene of a child with another one from the list."""
            child_list = list(child)
            if random.uniform(0, 1) < MUTATION_PROBABILITY:
                child_list[random.randint(0, len(child)) - 1] = random.choice(genes)
            return "".join(child_list)

        # This is the selection step.
        for i in range(N_SELECTED):
            population.extend(select(population_score[int(i)]))
            # Check if the population has already reached the maximum value and if so,
            # break the cycle. If this check is disabled, the algorithm will take
            # forever to compute large strings, but will also calculate small strings
            # in far fewer generations.
            if len(population) > N_POPULATION:
                break


if __name__ == "__main__":
    target_str = (
        "This is a genetic algorithm to evaluate, combine, evolve, and mutate a string!"
    )
    genes_list = list(
        " ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklm"
        "nopqrstuvwxyz.,;!?+-*#@^'èéòà€ù=)(&%$£/\\"
    )
    generation, population, target = basic(target_str, genes_list)
    print(
        f"\nGeneration: {generation}\nTotal Population: {population}\nTarget: {target}"
    )
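As a quick way to exercise the file above, a short target over a small gene set converges in a handful of generations. The snippet below is only an illustrative driver; the import path `genetic_algorithm.basic_string` is an assumption about where the file is saved and may need adjusting.

from string import ascii_lowercase

# Hypothetical import path; point it at wherever the file above actually lives.
from genetic_algorithm.basic_string import basic

if __name__ == "__main__":
    generation, total_population, best = basic(
        "hello world", list(ascii_lowercase + " "), debug=False
    )
    print(
        f"Converged after {generation} generations and "
        f"{total_population} evaluated candidates: {best!r}"
    )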
1