text_prompt | code_prompt
---|---
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def filename_unsafe(self):
"""The filename from the Content-Disposition header. If a location was passed at instanciation, the basename from that may be used as a fallback. Otherwise, this may be the None value. On safety: This property records the intent of the sender. You shouldn't use this sender-controlled value as a filesystem path, it can be insecure. Serving files with this filename can be dangerous as well, due to a certain browser using the part after the dot for mime-sniffing. Saving it to a database is fine by itself though. """ |
if 'filename*' in self.assocs:
return self.assocs['filename*'].string
elif 'filename' in self.assocs:
# XXX Reject non-ascii (parsed via qdtext) here?
return self.assocs['filename']
elif self.location is not None:
return posixpath.basename(self.location_path.rstrip('/')) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def filename_sanitized(self, extension, default_filename='file'):
"""Returns a filename that is safer to use on the filesystem. The filename will not contain a slash (nor the path separator for the current platform, if different), it will not start with a dot, and it will have the expected extension. No guarantees that makes it "safe enough". No effort is made to remove special characters; using this value blindly might overwrite existing files, etc. """ |
assert extension
assert extension[0] != '.'
assert default_filename
assert '.' not in default_filename
extension = '.' + extension
fname = self.filename_unsafe
if fname is None:
fname = default_filename
fname = posixpath.basename(fname)
fname = os.path.basename(fname)
fname = fname.lstrip('.')
if not fname:
fname = default_filename
if not fname.endswith(extension):
fname = fname + extension
return fname |
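A standalone sketch of the same sanitization steps applied to a hostile filename; the extension and input value here are illustrative, not from the source:

import os
import posixpath

fname = '../../.bashrc'
fname = posixpath.basename(fname)     # '.bashrc' -- strip POSIX path components
fname = os.path.basename(fname)       # also strip platform-specific components
fname = fname.lstrip('.') or 'file'   # 'bashrc' -- no leading dot, fall back to a default
if not fname.endswith('.txt'):
    fname += '.txt'                   # 'bashrc.txt'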
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def str_uptime(self):
"""uptime in human readable format.""" |
mins, secs = divmod(self.uptime, 60)
hours, mins = divmod(mins, 60)
return '%02d:%02d:%02d' % (hours, mins, secs) |
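A minimal standalone check of the same divmod arithmetic (the uptime value is made up):

uptime = 93784                       # hypothetical uptime in seconds
mins, secs = divmod(uptime, 60)
hours, mins = divmod(mins, 60)
assert '%02d:%02d:%02d' % (hours, mins, secs) == '26:03:04'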
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def transmission_rate(self):
""" Returns the upstream, downstream values as a tuple in bytes per second. Use this for periodical calling. """ |
sent = self.bytes_sent
received = self.bytes_received
traffic_call = time.time()
time_delta = traffic_call - self.last_traffic_call
upstream = int(1.0 * (sent - self.last_bytes_sent)/time_delta)
downstream = int(1.0 * (received - self.last_bytes_received)/time_delta)
self.last_bytes_sent = sent
self.last_bytes_received = received
self.last_traffic_call = traffic_call
return upstream, downstream |
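A standalone sketch of the same rate computation, with counter values invented for illustration:

import time

last_bytes_sent = 1000                     # assumed previous counter value
last_traffic_call = time.time() - 2.0      # assumed: sampled two seconds ago
bytes_sent = 5000                          # assumed current counter value
time_delta = time.time() - last_traffic_call
upstream = int((bytes_sent - last_bytes_sent) / time_delta)   # roughly 2000 bytes/s here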
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def str_transmission_rate(self):
"""Returns a tuple of human readable transmission rates in bytes.""" |
upstream, downstream = self.transmission_rate
return (
fritztools.format_num(upstream),
fritztools.format_num(downstream)
) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def _body_builder(self, kwargs):
""" Helper method to construct the appropriate SOAP-body to call a FritzBox-Service. """ |
p = {
'action_name': self.name,
'service_type': self.service_type,
'arguments': '',
}
if kwargs:
arguments = [
self.argument_template % {'name': k, 'value': v}
for k, v in kwargs.items()
]
p['arguments'] = ''.join(arguments)
body = self.body_template.strip() % p
return body |
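The argument and body templates are plain %-style strings; a minimal sketch with an assumed template shape (not the real FritzBox template):

argument_template = '<s:%(name)s>%(value)s</s:%(name)s>'   # assumed template shape
arguments = ''.join(argument_template % {'name': k, 'value': v}
                    for k, v in {'NewEnable': '1'}.items())
# -> '<s:NewEnable>1</s:NewEnable>'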
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def execute(self, **kwargs):
""" Calls the FritzBox action and returns a dictionary with the arguments. """ |
headers = self.header.copy()
headers['soapaction'] = '%s#%s' % (self.service_type, self.name)
data = self.envelope.strip() % self._body_builder(kwargs)
url = 'http://%s:%s%s' % (self.address, self.port, self.control_url)
auth = None
if self.password:
auth = HTTPDigestAuth(self.user, self.password)
response = requests.post(url, data=data, headers=headers, auth=auth)
# lxml needs bytes, therefore response.content (not response.text)
result = self.parse_response(response.content)
return result |
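A hedged sketch of the underlying HTTP call; the host, control URL, service type and credentials are placeholders, not real endpoints:

import requests
from requests.auth import HTTPDigestAuth

url = 'http://fritz.box:49000/upnp/control/example'   # placeholder control URL
headers = {'soapaction': 'urn:example-service#ExampleAction',
           'content-type': 'text/xml; charset="utf-8"'}
response = requests.post(url, data='<xml/>', headers=headers,
                         auth=HTTPDigestAuth('user', 'password'))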
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def get_modelname(self):
"""Returns the FritzBox model name.""" |
xpath = '%s/%s' % (self.nodename('device'), self.nodename('modelName'))
return self.root.find(xpath).text |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def get_services(self):
"""Returns a list of FritzService-objects.""" |
result = []
nodes = self.root.iterfind(
'.//ns:service', namespaces={'ns': self.namespace})
for node in nodes:
result.append(FritzService(
node.find(self.nodename('serviceType')).text,
node.find(self.nodename('controlURL')).text,
node.find(self.nodename('SCPDURL')).text))
return result |
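A self-contained illustration of the namespaced iterfind pattern used above, with a dummy namespace and document:

from xml.etree import ElementTree as ET

xml = '<root xmlns="urn:example"><service><serviceType>T</serviceType></service></root>'
root = ET.fromstring(xml)
ns = 'urn:example'
for node in root.iterfind('.//ns:service', namespaces={'ns': ns}):
    print(node.find('{%s}serviceType' % ns).text)   # prints: T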
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def get_actions(self):
"""Returns a list of FritzAction instances.""" |
self._read_state_variables()
actions = []
nodes = self.root.iterfind(
'.//ns:action', namespaces={'ns': self.namespace})
for node in nodes:
action = FritzAction(self.service.service_type,
self.service.control_url)
action.name = node.find(self.nodename('name')).text
action.arguments = self._get_arguments(node)
actions.append(action)
return actions |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def _get_arguments(self, action_node):
""" Returns a dictionary of arguments for the given action_node. """ |
arguments = {}
argument_nodes = action_node.iterfind(
r'./ns:argumentList/ns:argument', namespaces={'ns': self.namespace})
for argument_node in argument_nodes:
argument = self._get_argument(argument_node)
arguments[argument.name] = argument
return arguments |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def _get_argument(self, argument_node):
""" Returns a FritzActionArgument instance for the given argument_node. """ |
argument = FritzActionArgument()
argument.name = argument_node.find(self.nodename('name')).text
argument.direction = argument_node.find(self.nodename('direction')).text
rsv = argument_node.find(self.nodename('relatedStateVariable')).text
# TODO: track malformed xml-nodes (i.e. misspelled)
argument.data_type = self.state_variables.get(rsv, None)
return argument |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def _read_descriptions(self, password):
""" Read and evaluate the igddesc.xml file and the tr64desc.xml file if a password is given. """ |
descfiles = [FRITZ_IGD_DESC_FILE]
if password:
descfiles.append(FRITZ_TR64_DESC_FILE)
for descfile in descfiles:
parser = FritzDescParser(self.address, self.port, descfile)
if not self.modelname:
self.modelname = parser.get_modelname()
services = parser.get_services()
self._read_services(services) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def _read_services(self, services):
"""Get actions from services.""" |
for service in services:
parser = FritzSCDPParser(self.address, self.port, service)
actions = parser.get_actions()
service.actions = {action.name: action for action in actions}
self.services[service.name] = service |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def actionnames(self):
""" Returns a alphabetical sorted list of tuples with all known service- and action-names. """ |
actions = []
for service_name in sorted(self.services.keys()):
action_names = self.services[service_name].actions.keys()
for action_name in sorted(action_names):
actions.append((service_name, action_name))
return actions |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def get_action_arguments(self, service_name, action_name):
""" Returns a list of tuples with all known arguments for the given service- and action-name combination. The tuples contain the argument-name, direction and data_type. """ |
return self.services[service_name].actions[action_name].info |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def call_action(self, service_name, action_name, **kwargs):
"""Executes the given action. Raise a KeyError on unkown actions.""" |
action = self.services[service_name].actions[action_name]
return action.execute(**kwargs) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
| def find_executable(executable):
'''
Finds executable in PATH
Returns:
string or None
'''
logger = logging.getLogger(__name__)
logger.debug("Checking executable '%s'...", executable)
executable_path = _find_executable(executable)
found = executable_path is not None
if found:
logger.debug("Executable '%s' found: '%s'", executable, executable_path)
else:
logger.debug("Executable '%s' not found", executable)
return executable_path |
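On Python 3.3+ the standard library offers the same lookup directly; a brief equivalent for reference:

import shutil

executable_path = shutil.which('python')   # None if not found on PATH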
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def inject(self):
""" Recursively inject aXe into all iframes and the top level document. :param script_url: location of the axe-core script. :type script_url: string """ |
with open(self.script_url, "r", encoding="utf8") as f:
self.selenium.execute_script(f.read()) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def run(self, context=None, options=None):
""" Run axe against the current page. :param context: which page part(s) to analyze and/or what to exclude. :param options: dictionary of aXe options. """ |
template = (
"var callback = arguments[arguments.length - 1];"
+ "axe.run(%s).then(results => callback(results))"
)
args = ""
# If context parameter is passed, add to args
if context is not None:
args += "%r" % context
# Add comma delimiter only if both parameters are passed
if context is not None and options is not None:
args += ","
# If options parameter is passed, add to args
if options is not None:
args += "%s" % options
command = template % args
response = self.selenium.execute_async_script(command)
return response |
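How the template expands for a call that passes only an options dict (the options value is illustrative):

template = ("var callback = arguments[arguments.length - 1];"
            "axe.run(%s).then(results => callback(results))")
command = template % "{'runOnly': ['wcag2a']}"
# JS snippet handed to execute_async_script; context was omitted in this example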
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def report(self, violations):
""" Return readable report of accessibility violations found. :param violations: Dictionary of violations. :type violations: dict :return report: Readable report of violations. :rtype: string """ |
string = ""
string += "Found " + str(len(violations)) + " accessibility violations:"
for violation in violations:
string += (
"\n\n\nRule Violated:\n"
+ violation["id"]
+ " - "
+ violation["description"]
+ "\n\tURL: "
+ violation["helpUrl"]
+ "\n\tImpact Level: "
+ violation["impact"]
+ "\n\tTags:"
)
for tag in violation["tags"]:
string += " " + tag
string += "\n\tElements Affected:"
i = 1
for node in violation["nodes"]:
for target in node["target"]:
string += "\n\t" + str(i) + ") Target: " + target
i += 1
for item in node["all"]:
string += "\n\t\t" + item["message"]
for item in node["any"]:
string += "\n\t\t" + item["message"]
for item in node["none"]:
string += "\n\t\t" + item["message"]
string += "\n\n\n"
return string |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def write_results(self, data, name=None):
""" Write JSON to file with the specified name. :param name: Path to the file to be written to. If no path is passed a new JSON file "results.json" will be created in the current working directory. :param output: JSON object. """ |
if name:
filepath = os.path.abspath(name)
else:
filepath = os.path.join(os.getcwd(), "results.json")
with open(filepath, "w", encoding="utf8") as f:
try:
f.write(unicode(json.dumps(data, indent=4)))
except NameError:
f.write(json.dumps(data, indent=4)) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
| def create_engine(engine, options=None, defaults=None):
'''
Creates an instance of an engine.
There is a two-stage instantiation process with engines.
1. ``options``:
The keyword options to instantiate the engine class
2. ``defaults``:
The default configuration for the engine (options often depends on instantiated TTS engine)
'''
if engine not in _ENGINE_MAP.keys():
raise TTSError('Unknown engine %s' % engine)
options = options or {}
defaults = defaults or {}
einst = _ENGINE_MAP[engine](**options)
einst.configure_default(**defaults)
return einst |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
| def classify(self, txt):
'''
Classifies text by language. Uses preferred_languages weighting.
'''
ranks = []
for lang, score in langid.rank(txt):
if lang in self.preferred_languages:
score += self.preferred_factor
ranks.append((lang, score))
ranks.sort(key=lambda x: x[1], reverse=True)
return ranks[0][0] |
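A standalone sketch of the preferred-language weighting, with a made-up rank list standing in for langid's output:

ranks = [('en', 0.91), ('de', 0.88), ('fr', 0.40)]      # assumed scores
preferred_languages, preferred_factor = {'de'}, 0.1
weighted = [(lang, score + (preferred_factor if lang in preferred_languages else 0))
            for lang, score in ranks]
weighted.sort(key=lambda x: x[1], reverse=True)
best = weighted[0][0]   # 'de' wins once the preference bonus is applied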
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
| def say(self, txt, lang=None):
'''
Says the text.
If ``lang`` is ``None``, then ``classify()`` is used to detect the language.
'''
lang = lang or self.classify(txt)
self.get_engine_for_lang(lang).say(txt, language=lang) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
| def configure_default(self, **_options):
'''
Sets default configuration.
Raises TTSError on error.
'''
language, voice, voiceinfo, options = self._configure(**_options)
self.languages_options[language] = (voice, options)
self.default_language = language
self.default_options = options |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
| def configure(self, **_options):
'''
Sets language-specific configuration.
Raises TTSError on error.
'''
language, voice, voiceinfo, options = self._configure(**_options)
self.languages_options[language] = (voice, options) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
| def play(self, filename, translate=False): # pragma: no cover
'''
Plays the sounds.
:filename: The input file name
:translate: If True, the file is run through audioread, which translates common compression formats to raw WAV.
'''
# FIXME: Use platform-independent and async audio-output here
# PyAudio looks most promising, too bad about:
# --allow-external PyAudio --allow-unverified PyAudio
if translate:
with tempfile.NamedTemporaryFile(suffix='.wav', delete=False) as f:
fname = f.name
with audioread.audio_open(filename) as f:
with contextlib.closing(wave.open(fname, 'w')) as of:
of.setnchannels(f.channels)
of.setframerate(f.samplerate)
of.setsampwidth(2)
for buf in f:
of.writeframes(buf)
filename = fname
if winsound:
winsound.PlaySound(str(filename), winsound.SND_FILENAME)
else:
cmd = ['aplay', str(filename)]
self._logger.debug('Executing %s', ' '.join([pipes.quote(arg) for arg in cmd]))
subprocess.call(cmd)
if translate:
os.remove(fname) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def _hashes_match(self, a, b):
"""Constant time comparison of bytes for py3, strings for py2""" |
if len(a) != len(b):
return False
diff = 0
if six.PY2:
a = bytearray(a)
b = bytearray(b)
for x, y in zip(a, b):
diff |= x ^ y
return not diff |
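For new code, the standard library already provides a constant-time comparison; a short note for reference:

import hmac

equal = hmac.compare_digest(b'abc', b'abc')   # True, without leaking timing information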
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def parse(self):
""" Convert ACIS 'll' value into separate latitude and longitude. """ |
super(AcisIO, self).parse()
# This is more of a "mapping" step than a "parsing" step, but mappers
# only allow one-to-one mapping from input fields to output fields.
for row in self.data:
if 'meta' in row:
row = row['meta']
if 'll' in row:
row['longitude'], row['latitude'] = row['ll']
del row['ll'] |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def get_field_names(self):
""" ACIS web service returns "meta" and "data" for each station; Use meta attributes as field names """ |
field_names = super(StationDataIO, self).get_field_names()
if set(field_names) == set(['meta', 'data']):
meta_fields = list(self.data[0]['meta'].keys())
if set(meta_fields) < set(self.getvalue('meta')):
meta_fields = self.getvalue('meta')
field_names = list(meta_fields) + ['data']
return field_names |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def usable_item(self, data):
""" ACIS web service returns "meta" and "data" for each station; use meta attributes as item values, and add an IO for iterating over "data" """ |
# Use metadata as item
item = data['meta']
# Add nested IO for data
elems, elems_is_complex = self.getlist('parameter')
if elems_is_complex:
elems = [elem['name'] for elem in elems]
add, add_is_complex = self.getlist('add')
item['data'] = DataIO(
data=data['data'],
parameter=elems,
add=add,
start_date=self.getvalue('start_date'),
end_date=self.getvalue('end_date'),
)
# TupleMapper will convert item to namedtuple
return super(StationDataIO, self).usable_item(item) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def load_data(self, data):
""" MultiStnData data results are arrays without explicit dates; Infer time series based on start date. """ |
dates = fill_date_range(self.start_date, self.end_date)
for row, date in zip(data, dates):
data = {'date': date}
if self.add:
# If self.add is set, results will contain additional
# attributes (e.g. flags). In that case, create one row per
# result, with attributes "date", "elem", "value", and one for
# each item in self.add.
for elem, vals in zip(self.parameter, row):
data['elem'] = elem
for add, val in zip(['value'] + self.add, vals):
data[add] = val
yield data
else:
# Otherwise, return one row per date, with "date" and each
# element's value as attributes.
for elem, val in zip(self.parameter, row):
# namedtuple doesn't like numeric field names
if elem.isdigit():
elem = "e%s" % elem
data[elem] = val
yield data |
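fill_date_range is project-specific; a minimal hypothetical stand-in that produces the consecutive dates the zip above relies on:

from datetime import date, timedelta

def fill_date_range(start, end):
    """Hypothetical helper: return every date from start to end, inclusive."""
    days = (end - start).days
    return [start + timedelta(days=i) for i in range(days + 1)]

dates = fill_date_range(date(2020, 1, 1), date(2020, 1, 3))   # three consecutive dates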
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def parse(self, value):
""" Enforce rules and return parsed value """ |
if self.required and value is None:
raise ValueError("%s is required!" % self.name)
elif self.ignored and value is not None:
warn("%s is ignored for this class!" % self.name)
elif not self.multi and isinstance(value, (list, tuple)):
if len(value) > 1:
raise ValueError(
"%s does not accept multiple values!" % self.name
)
return value[0]
elif self.multi and value is not None:
if not isinstance(value, (list, tuple)):
return [value]
return value |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def set_param(self, into, name):
""" Set parameter key, noting whether list value is "complex" """ |
value, complex = self.getlist(name)
if value is not None:
into[name] = value
return complex |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def get_params(self):
""" Get parameters for web service, noting whether any are "complex" """ |
params = {}
complex = False
for name, opt in self.filter_options.items():
if opt.ignored:
continue
if self.set_param(params, name):
complex = True
return params, complex |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def params(self):
""" URL parameters for wq.io.loaders.NetLoader """ |
params, complex = self.get_params()
url_params = self.default_params.copy()
url_params.update(self.serialize_params(params, complex))
return url_params |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
| def _get_remarks_component(self, string, initial_pos):
''' Parse the remarks into the _remarks dict '''
remarks_code = string[initial_pos:initial_pos + self.ADDR_CODE_LENGTH]
if remarks_code != 'REM':
raise ish_reportException("Parsing remarks. Expected REM but got %s." % (remarks_code,))
expected_length = int(string[0:4]) + self.PREAMBLE_LENGTH
position = initial_pos + self.ADDR_CODE_LENGTH
while position < expected_length:
key = string[position:position + self.ADDR_CODE_LENGTH]
if key == 'EQD':
break
chars_to_read = string[position + self.ADDR_CODE_LENGTH:position + \
(self.ADDR_CODE_LENGTH * 2)]
chars_to_read = int(chars_to_read)
position += (self.ADDR_CODE_LENGTH * 2)
string_value = string[position:position + chars_to_read]
self._remarks[key] = string_value
position += chars_to_read |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
| def _get_component(self, string, initial_pos):
''' given a string and a position, return both an updated position and
either a Component Object or a String back to the caller '''
add_code = string[initial_pos:initial_pos + self.ADDR_CODE_LENGTH]
if add_code == 'REM':
raise ish_reportException("This is a remarks record")
if add_code == 'EQD':
raise ish_reportException("This is EQD record")
initial_pos += self.ADDR_CODE_LENGTH
try:
useable_map = self.MAP[add_code]
except KeyError:
raise BaseException("Cannot find code %s in string %s (%d)." % (add_code, string, initial_pos))
# if there is no defined length, then read next three chars to get it
# this only applies to REM types, which have 3 chars for the type, then variable
if useable_map[1] is False:
chars_to_read = string[initial_pos + self.ADDR_CODE_LENGTH:initial_pos + \
(self.ADDR_CODE_LENGTH * 2)]
chars_to_read = int(chars_to_read)
initial_pos += (self.ADDR_CODE_LENGTH * 2)
else:
chars_to_read = useable_map[1]
new_position = initial_pos + chars_to_read
string_value = string[initial_pos:new_position]
try:
object_value = useable_map[2]()
object_value.loads(string_value)
except IndexError as err:
object_value = string_value
return (new_position, [add_code, object_value]) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
| def loads(self, string):
''' load from a string '''
for line in string.split("\n"):
if len(line) < 10:
continue
try:
report = ish_report()
report.loads(line)
self._reports.append(report)
except BaseException as exp:
''' don't complain TOO much '''
logging.warning('unable to load report, error: %s' % exp) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def do_pot(self):
""" Sync the template with the python code. """ |
files_to_translate = []
log.debug("Collecting python sources for pot ...")
for source_path in self._source_paths:
for source_path in self._iter_suffix(path=source_path, suffix=".py"):
log.debug("... add to pot: {source}".format(source=str(source_path)))
files_to_translate.append(str(source_path))
for system_file in self.SYSTEM_SOURCE_FILES:
files_to_translate.append(str(self._system_path / system_file))
# FIXME: use separate domain for system source translations? Merge them when generating mo's?
log.debug("Finished collection sources.")
pot_path = (self._po_path / self._basename).with_suffix(".pot")
command = ["xgettext", "--keyword=_", "--keyword=_translate",
"--output={output}".format(output=str(pot_path))]
command.extend(files_to_translate)
check_call(command)
log.debug("pot file \"{pot}\" created!".format(pot=str(pot_path)))
pot_copy_path = self._mo_path / pot_path.name
log.debug("Copying pot file to mo path: {pot_copy_path}".format(pot_copy_path=str(pot_copy_path)))
shutil.copy(str(pot_path), str(pot_copy_path)) |
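The assembled command is an ordinary xgettext invocation; with made-up paths it looks like this before being handed to check_call:

command = ['xgettext', '--keyword=_', '--keyword=_translate',
           '--output=po/example.pot', 'pkg/a.py', 'pkg/b.py']   # illustrative paths
# subprocess.check_call(command) runs it and raises on a non-zero exit status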
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def do_po(self):
""" Update all po files with the data in the pot reference file. """ |
log.debug("Start updating po files ...")
pot_path = (self._po_path / self._basename).with_suffix(".pot")
for po_dir_path in self._iter_po_dir():
po_path = (po_dir_path / self._basename).with_suffix(".po")
if po_path.exists():
log.debug("update {po}:".format(po=str(po_path)))
check_call(["msgmerge", "-U", str(po_path), str(pot_path)])
else:
log.debug("create {po}:".format(po=str(po_path)))
check_call(["msginit", "-i", str(pot_path), "-o", str(po_path), "--no-translator"])
po_copy_path = self._mo_path / po_path.parent.name / po_path.name
po_copy_path.parent.mkdir(exist_ok=True)
log.debug("Copying po file to mo path: {po_copy_path}".format(po_copy_path=str(po_copy_path)))
shutil.copy(str(po_path), str(po_copy_path))
log.debug("All po files updated") |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def do_mo(self):
""" Generate mo files for all po files. """ |
log.debug("Start updating mo files ...")
for po_dir_path in self._iter_po_dir():
po_path = (po_dir_path / self._basename).with_suffix(".po")
lc_path = self._mo_path / po_dir_path.name / "LC_MESSAGES"
lc_path.mkdir(parents=True, exist_ok=True)
mo_path = (lc_path / self._basename).with_suffix(".mo")
log.debug("Creating from {po}: {mo}".format(po=str(po_path), mo=str(mo_path)))
check_call(["msgfmt", str(po_path), "-o", str(mo_path)])
log.debug("All mo files updated") |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def _process_message(self, message: amqp.Message) -> None: """Processes the message received from the queue.""" |
if self.shutdown_pending.is_set():
return
try:
if isinstance(message.body, bytes):
message.body = message.body.decode()
description = json.loads(message.body)
except Exception:
logger.error("Cannot decode message. Dropping. Message: %r", message.body)
traceback.print_exc()
message.channel.basic_reject(message.delivery_tag, requeue=False)
else:
logger.info("Processing task: %r", description)
self._process_description(message, description) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def _apply_task(task: Task, args: Tuple, kwargs: Dict[str, Any]) -> Any: """Logs the time spent while running the task.""" |
if args is None:
args = ()
if kwargs is None:
kwargs = {}
start = monotonic()
try:
return task.apply(*args, **kwargs)
finally:
delta = monotonic() - start
logger.info("%s finished in %i seconds." % (task.name, delta)) |
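The timing pattern in isolation, using a trivial stand-in for the task call:

from time import monotonic, sleep

start = monotonic()
try:
    sleep(0.01)                       # stand-in for task.apply(*args, **kwargs)
finally:
    delta = monotonic() - start       # elapsed seconds, unaffected by system clock changes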
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def _shutdown_timer(self) -> None: """Counts down from MAX_WORKER_RUN_TIME. When it reaches zero, shuts down gracefully. """ |
remaining = self._max_run_time - self.uptime
if not self.shutdown_pending.wait(remaining):
logger.warning('Run time reached zero')
self.shutdown() |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def _handle_sigint(self, signum: int, frame: Any) -> None: """Shutdown after processing current task.""" |
logger.warning("Catched SIGINT")
self.shutdown() |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def _handle_sighup(self, signum: int, frame: Any) -> None: """Used internally to fail the task when connection to RabbitMQ is lost during the execution of the task. """ |
logger.warning("Catched SIGHUP")
exc_info = self._heartbeat_exc_info
self._heartbeat_exc_info = None
# Format exception info to see in tools like Sentry.
formatted_exception = ''.join(traceback.format_exception(*exc_info)) # noqa
raise HeartbeatError(exc_info) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def _handle_sigusr1(signum: int, frame: Any) -> None: """Print stacktrace.""" |
print('=' * 70)
print(''.join(traceback.format_stack()))
print('-' * 70) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def _handle_sigusr2(self, signum: int, frame: Any) -> None: """Drop current task.""" |
logger.warning("Catched SIGUSR2")
if self.current_task:
logger.warning("Dropping current task...")
raise Discard |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def onFolderTreeClicked(self, proxyIndex):
"""What to do when a Folder in the tree is clicked""" |
if not proxyIndex.isValid():
return
index = self.proxyFileModel.mapToSource(proxyIndex)
settings = QSettings()
folder_path = self.fileModel.filePath(index)
settings.setValue('mainwindow/workingDirectory', folder_path) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def send_to_queue( self, args: Tuple=(), kwargs: Dict[str, Any]={}, host: str=None, wait_result: Union[int, float]=None, message_ttl: Union[int, float]=None, ) -> Any: """ Sends a message to the queue. A worker will run the task's function when it receives the message. :param args: Arguments that will be passed to the task on execution. :param kwargs: Keyword arguments that will be passed to the task on execution. :param host: Send this task to a specific host. ``host`` will be appended to the queue name. If ``host`` is "localhost", the hostname of the server will be appended to the queue name. :param wait_result: Wait for result from worker for ``wait_result`` seconds. If a timeout occurs, :class:`~kuyruk.exceptions.ResultTimeout` is raised. If an exception occurs in the worker, :class:`~kuyruk.exceptions.RemoteException` is raised. :param message_ttl: If set, message will be destroyed in queue after ``message_ttl`` seconds. :return: Result from worker if ``wait_result`` is set, else :const:`None`. """ |
if self.kuyruk.config.EAGER:
# Run the task in current process
result = self.apply(*args, **kwargs)
return result if wait_result else None
logger.debug("Task.send_to_queue args=%r, kwargs=%r", args, kwargs)
queue = self._queue_for_host(host)
description = self._get_description(args, kwargs)
self._send_signal(signals.task_presend, args=args, kwargs=kwargs, description=description)
body = json.dumps(description)
msg = amqp.Message(body=body)
if wait_result:
# Use direct reply-to feature from RabbitMQ:
# https://www.rabbitmq.com/direct-reply-to.html
msg.properties['reply_to'] = 'amq.rabbitmq.reply-to'
if message_ttl:
msg.properties['expiration'] = str(int(message_ttl * 1000))
with self.kuyruk.channel() as ch:
if wait_result:
result = Result(ch.connection)
ch.basic_consume(queue='amq.rabbitmq.reply-to', no_ack=True, callback=result.process_message)
ch.queue_declare(queue=queue, durable=True, auto_delete=False)
ch.basic_publish(msg, exchange="", routing_key=queue)
self._send_signal(signals.task_postsend, args=args, kwargs=kwargs, description=description)
if wait_result:
return result.wait(wait_result) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def _get_description(self, args: Tuple, kwargs: Dict[str, Any]) -> Dict[str, Any]: """Return the dictionary to be sent to the queue.""" |
return {
'id': uuid1().hex,
'args': args,
'kwargs': kwargs,
'module': self._module_name,
'function': self.f.__name__,
'sender_hostname': socket.gethostname(),
'sender_pid': os.getpid(),
'sender_cmd': ' '.join(sys.argv),
'sender_timestamp': datetime.utcnow().isoformat()[:19],
} |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def apply(self, *args: Any, **kwargs: Any) -> Any: """Called by workers to run the wrapped function. You may call it yourself if you want to run the task in current process without sending to the queue. If task has a `retry` property it will be retried on failure. If task has a `max_run_time` property the task will not be allowed to run more than that. """ |
def send_signal(sig: Signal, **extra: Any) -> None:
self._send_signal(sig, args=args, kwargs=kwargs, **extra)
logger.debug("Applying %r, args=%r, kwargs=%r", self, args, kwargs)
send_signal(signals.task_preapply)
try:
tries = 1 + self.retry
while 1:
tries -= 1
send_signal(signals.task_prerun)
try:
with time_limit(self.max_run_time or 0):
return self.f(*args, **kwargs)
except Exception:
send_signal(signals.task_error, exc_info=sys.exc_info())
if tries <= 0:
raise
finally:
send_signal(signals.task_postrun)
except Exception:
send_signal(signals.task_failure, exc_info=sys.exc_info())
raise
else:
send_signal(signals.task_success)
finally:
send_signal(signals.task_postapply) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def _module_name(self) -> str: """Module name of the wrapped function.""" |
name = self.f.__module__
if name == '__main__':
return importer.main_module_name()
return name |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def from_object(self, obj: Union[str, Any]) -> None: """Load values from an object.""" |
if isinstance(obj, str):
obj = importer.import_object_str(obj)
for key in dir(obj):
if key.isupper():
value = getattr(obj, key)
self._setattr(key, value)
logger.info("Config is loaded from object: %r", obj) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def from_dict(self, d: Dict[str, Any]) -> None: """Load values from a dict.""" |
for key, value in d.items():
if key.isupper():
self._setattr(key, value)
logger.info("Config is loaded from dict: %r", d) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def from_pyfile(self, filename: str) -> None: """Load values from a Python file.""" |
globals_ = {} # type: Dict[str, Any]
locals_ = {} # type: Dict[str, Any]
with open(filename, "rb") as f:
exec(compile(f.read(), filename, 'exec'), globals_, locals_)
for key, value in locals_.items():
if (key.isupper() and not isinstance(value, types.ModuleType)):
self._setattr(key, value)
logger.info("Config is loaded from file: %s", filename) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def from_env_vars(self) -> None: """Load values from environment variables. Keys must start with `KUYRUK_`.""" |
for key, value in os.environ.items():
if key.startswith('KUYRUK_'):
key = key[7:]
if hasattr(Config, key):
try:
value = ast.literal_eval(value)
except (ValueError, SyntaxError):
pass
self._setattr(key, value) |
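ast.literal_eval turns the environment string into a Python value where possible; for example:

import ast

ast.literal_eval('5')        # -> 5 (int)
ast.literal_eval('[1, 2]')   # -> [1, 2]
# a plain URL string such as 'amqp://host' fails to parse and is kept as-is
# (that failure is what the except clause above handles)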
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def option(current_kwargs, **kwargs):
""" Context manager for temporarily setting a keyword argument and then restoring it to whatever it was before. """ |
tmp_kwargs = dict((key, current_kwargs.get(key)) for key, value in kwargs.items())
current_kwargs.update(kwargs)
yield
current_kwargs.update(tmp_kwargs) |
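A usage sketch, assuming the function is wrapped with contextlib.contextmanager as the yield suggests; the kwargs dict here is invented:

from contextlib import contextmanager

@contextmanager
def option(current_kwargs, **kwargs):
    # same save/override/restore dance as above
    tmp_kwargs = {key: current_kwargs.get(key) for key in kwargs}
    current_kwargs.update(kwargs)
    yield
    current_kwargs.update(tmp_kwargs)

state = {'autoescape': False}
with option(state, autoescape=True):
    assert state['autoescape'] is True
assert state['autoescape'] is False   # restored on exit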
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def is_method_call(node, method_name):
""" Returns True if `node` is a method call for `method_name`. `method_name` can be either a string or an iterable of strings. """ |
if not isinstance(node, nodes.Call):
return False
if isinstance(node.node, nodes.Getattr):
# e.g. foo.bar()
method = node.node.attr
elif isinstance(node.node, nodes.Name):
# e.g. bar()
method = node.node.name
elif isinstance(node.node, nodes.Getitem):
# e.g. foo["bar"]()
method = node.node.arg.value
else:
return False
if isinstance(method_name, (list, tuple)):
return method in method_name
return method == method_name |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def get_output(self):
""" Returns the generated JavaScript code. Returns: str """ |
# generate the JS function string
template_function = TEMPLATE_WRAPPER.format(
function_name=self.js_function_name,
template_code=self.output.getvalue()
).strip()
# get the correct module format template
module_format = JS_MODULE_FORMATS[self.js_module_format]
# generate the module code
return module_format(self.dependencies, template_function) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def _get_depencency_var_name(self, dependency):
""" Returns the variable name assigned to the given dependency or None if the dependency has not yet been registered. Args: dependency (str):
The dependency that needs to be imported. Returns: str or None """ |
for dep_path, var_name in self.dependencies:
if dep_path == dependency:
return var_name |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def _add_dependency(self, dependency, var_name=None):
""" Adds the given dependency and returns the variable name to use to access it. If `var_name` is not given then a random one will be created. Args: dependency (str):
var_name (str, optional):
Returns: str """ |
if var_name is None:
var_name = next(self.temp_var_names)
# Don't add duplicate dependencies
if (dependency, var_name) not in self.dependencies:
self.dependencies.append((dependency, var_name))
return var_name |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def _process_output(self, node, **kwargs):
""" Processes an output node, which will contain things like `Name` and `TemplateData` nodes. """ |
for n in node.nodes:
self._process_node(n, **kwargs) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def _process_templatedata(self, node, **_):
""" Processes a `TemplateData` node, this is just a bit of as-is text to be written to the output. """ |
# escape double quotes
value = re.sub('"', r'\\"', node.data)
# escape new lines
value = re.sub('\n', r'\\n', value)
# append value to the result
self.output.write('__result += "' + value + '";') |
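The two re.sub calls amount to simple string escaping; a quick standalone check:

import re

value = re.sub('"', r'\\"', 'say "hi"\n')
value = re.sub('\n', r'\\n', value)
# value == 'say \\"hi\\"\\n', emitted as: __result += "say \"hi\"\n";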
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def _execution(self):
""" Context manager for executing some JavaScript inside a template. """ |
did_start_executing = False
if self.state == STATE_DEFAULT:
did_start_executing = True
self.state = STATE_EXECUTING
def close():
if did_start_executing and self.state == STATE_EXECUTING:
self.state = STATE_DEFAULT
yield close
close() |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def _scoped_variables(self, nodes_list, **kwargs):
""" Context manager for creating scoped variables defined by the nodes in `nodes_list`. These variables will be added to the context, and when the context manager exits the context object will be restored to it's previous state. """ |
tmp_vars = []
for node in nodes_list:
is_assign_node = isinstance(node, nodes.Assign)
name = node.target.name if is_assign_node else node.name
# create a temp variable name
tmp_var = next(self.temp_var_names)
# save previous context value
with self._execution():
# save the current value of this name
self.output.write('var %s = %s.%s;' % (tmp_var, self.context_name, name))
# add new value to context
self.output.write('%s.%s = ' % (self.context_name, name))
if is_assign_node:
self._process_node(node.node, **kwargs)
else:
self.output.write(node.name)
self.output.write(';')
tmp_vars.append((tmp_var, name))
yield
# restore context
for tmp_var, name in tmp_vars:
with self._execution():
self.output.write('%s.%s = %s;' % (self.context_name, name, tmp_var)) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def channel(self) -> Iterator[amqp.Channel]: """Returns a new channel from a new connection as a context manager.""" |
with self.connection() as conn:
ch = conn.channel()
logger.info('Opened new channel')
with _safe_close(ch):
yield ch |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def connection(self) -> Iterator[amqp.Connection]: """Returns a new connection as a context manager.""" |
TCP_USER_TIMEOUT = 18 # constant is available on Python 3.6+.
socket_settings = {TCP_USER_TIMEOUT: self.config.TCP_USER_TIMEOUT}
if sys.platform.startswith('darwin'):
del socket_settings[TCP_USER_TIMEOUT]
conn = amqp.Connection(
host="%s:%s" % (self.config.RABBIT_HOST, self.config.RABBIT_PORT),
userid=self.config.RABBIT_USER,
password=self.config.RABBIT_PASSWORD,
virtual_host=self.config.RABBIT_VIRTUAL_HOST,
connect_timeout=self.config.RABBIT_CONNECT_TIMEOUT,
read_timeout=self.config.RABBIT_READ_TIMEOUT,
write_timeout=self.config.RABBIT_WRITE_TIMEOUT,
socket_settings=socket_settings,
heartbeat=self.config.RABBIT_HEARTBEAT,
)
conn.connect()
logger.info('Connected to RabbitMQ')
with _safe_close(conn):
yield conn |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def to_text(sentence):
""" Helper routine that converts a Sentence protobuf to a string from its tokens. """ |
text = ""
for i, tok in enumerate(sentence.token):
if i != 0:
text += tok.before
text += tok.word
return text |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def showMessage(self, message, *args):
""" Public method to show a message in the bottom part of the splashscreen. @param message message to be shown (string or QString) """ |
QSplashScreen.showMessage(
self, message, Qt.AlignBottom | Qt.AlignRight | Qt.AlignAbsolute, QColor(Qt.white)) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def main_module_name() -> str: """Returns the module name of the main module, or None when running from an interactive shell.""" |
if not hasattr(main_module, '__file__'):
# running from interactive shell
return None
main_filename = os.path.basename(main_module.__file__)
module_name, ext = os.path.splitext(main_filename)
return module_name |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
| def build_dirs(files):
'''
Build necessary directories based on a list of file paths
'''
for i in files:
if type(i) is list:
build_dirs(i)
continue
else:
if len(i['path']) > 1:
addpath = os.path.join(os.getcwd(), *i['path'][:-1])
subdirs = all_subdirs(os.getcwd())
if addpath and addpath not in subdirs:
os.makedirs(addpath)
print 'just made path', addpath |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
| def get_want_file_pos(file_list):
'''
Ask the user which files in file_list he or she is interested in.
Return indices for the files inside file_list
'''
want_file_pos = []
print '\nFiles contained:\n'
for i in file_list:
print(os.path.join(*i['path']))
while 1:
all_answer = raw_input('\nDo you want all these files? (y/n): ')
if all_answer in ('y', 'n'):
break
if all_answer == 'y':
want_file_pos = range(len(file_list))
return want_file_pos
if all_answer == 'n':
for j, tfile in enumerate(file_list):
while 1:
file_answer = raw_input('Do you want {}? '
'(y/n): '.format(os.path.join
(*tfile['path'])))
if file_answer in ('y', 'n'):
break
if file_answer == 'y':
want_file_pos.append(j)
print "Here are all the files you want:"
for k in want_file_pos:
print os.path.join(*file_list[k]['path'])
return want_file_pos |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
| def get_rightmost_index(byte_index=0, file_starts=[0]):
'''
Retrieve the highest-indexed file that starts at or before byte_index.
'''
i = 1
while i <= len(file_starts):
start = file_starts[-i]
if start <= byte_index:
return len(file_starts) - i
else:
i += 1
else:
raise Exception('byte_index lower than all file_starts') |
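The same lookup can be expressed with the standard bisect module; a one-line equivalent with assumed offsets:

import bisect

file_starts = [0, 100, 250]                      # assumed byte offsets of each file
idx = bisect.bisect_right(file_starts, 120) - 1  # -> 1, the file starting at byte 100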
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
| def vis_init(self):
'''
Sends the state of the BTC at the time the visualizer connects,
initializing it.
'''
init_dict = {}
init_dict['kind'] = 'init'
assert len(self.want_file_pos) == len(self.heads_and_tails)
init_dict['want_file_pos'] = self.want_file_pos
init_dict['files'] = self.file_list
init_dict['heads_and_tails'] = self.heads_and_tails
init_dict['num_pieces'] = self.num_pieces
self.broadcast(init_dict) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
| def build_payload(self):
'''
Builds the payload that will be sent in tracker_request
'''
payload = {}
hashed_info = hashlib.sha1(tparser.bencode(self.torrent_dict['info']))
self.hash_string = hashed_info.digest()
self.peer_id = ('-DR' + VERSION +
''.join(random.sample(ALPHANUM, 13)))
assert len(self.peer_id) == 20
payload['info_hash'] = self.hash_string
payload['peer_id'] = self.peer_id
payload['port'] = self.port
payload['uploaded'] = 0
payload['downloaded'] = 0
payload['left'] = self.length
payload['compact'] = 1
payload['supportcrypto'] = 1
payload['event'] = 'started'
return payload |
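The peer id and info hash construction in isolation; the bencoded info dict is a dummy and '0001' stands in for VERSION:

import hashlib
import random
import string

ALPHANUM = string.ascii_letters + string.digits          # assumed constant
hash_string = hashlib.sha1(b'd4:infoi1ee').digest()      # dummy bencoded info, 20-byte digest
peer_id = '-DR' + '0001' + ''.join(random.sample(ALPHANUM, 13))
assert len(peer_id) == 20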
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
| def tracker_request(self):
'''
Sends the initial request to the tracker, compiling list of all peers
announcing to the tracker
'''
assert self.torrent_dict['info']
payload = self.build_payload()
if self.torrent_dict['announce'].startswith('udp'):
raise Exception('need to deal with UDP')
else:
self.r = requests.get(self.torrent_dict['announce'],
params=payload)
# Decoding response from tracker
self.tracker_response = tparser.bdecode(self.r.content)
self.get_peer_ips() |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
| def initpeer(self, sock):
'''
Creates a new peer object for a valid socket and adds it to the reactor's
listen list
'''
location_json = requests.request("GET", "http://freegeoip.net/json/"
+ sock.getpeername()[0]).content
location = json.loads(location_json)
tpeer = peer.Peer(sock, self.reactor, self, location)
self.peer_dict[sock] = tpeer
self.reactor.select_list.append(tpeer) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
| def ppiece(self, content):
'''
Process a piece that we've received from a peer, writing it out to
one or more files
'''
piece_index, byte_begin = struct.unpack('!ii', content[0:8])
# TODO -- figure out a better way to catch this error.
# How is piece_index getting swapped out from under me?
if piece_index != self.piece.index:
return
assert byte_begin % REQUEST_SIZE == 0
block_begin = byte_begin / REQUEST_SIZE
block = content[8:]
self.piece.save(index=block_begin, bytes=block)
if self.piece.complete:
piece_bytes = self.piece.get_bytes()
if self.piece.index == self.torrent.last_piece:
piece_bytes = piece_bytes[:self.torrent.last_piece_length]
if hashlib.sha1(piece_bytes).digest() == (self.torrent.torrent_dict
['info']['pieces']
[20 * piece_index:20 *
piece_index + 20]):
print 'hash matches'
# Take care of visualizer stuff
piece_dict = {'kind': 'piece', 'peer': self.sock.getpeername(),
'piece_index': piece_index}
self.torrent.switchboard.broadcast(piece_dict)
print ('writing piece {}. Length is '
'{}').format(repr(piece_bytes)[:10] + '...',
len(piece_bytes))
# Write out
byte_index = piece_index * self.torrent.piece_length
self.piece = self.init_piece()
self.request_all()
self.torrent.switchboard.write(byte_index, piece_bytes)
self.torrent.switchboard.mark_off(piece_index)
print self.torrent.switchboard.bitfield
if self.torrent.switchboard.complete:
print '\nDownload complete\n'
self.reactor.is_running = False
else:
print "Bad data -- hash doesn't match. Discarding piece."
self.piece = self.init_piece()
self.request_all() |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def is_connected(self):
""" Returns the connection status of the data store. Returns: bool: ``True`` if the data store is connected to the MongoDB server. """ |
if self._client is not None:
try:
self._client.server_info()
except ConnectionFailure:
return False
return True
else:
return False |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def connect(self):
""" Establishes a connection to the MongoDB server. Use the MongoProxy library in order to automatically handle AutoReconnect exceptions in a graceful and reliable way. """ |
mongodb_args = {
'host': self.host,
'port': self.port,
'username': self._username,
'password': self._password,
'authSource': self._auth_source,
'serverSelectionTimeoutMS': self._connect_timeout
}
if self._auth_mechanism is not None:
mongodb_args['authMechanism'] = self._auth_mechanism
self._client = MongoClient(**mongodb_args)
if self._handle_reconnect:
self._client = MongoClientProxy(self._client) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def exists(self, workflow_id):
""" Checks whether a document with the specified workflow id already exists. Args: workflow_id (str):
The workflow id that should be checked. Raises: DataStoreNotConnected: If the data store is not connected to the server. Returns: bool: ``True`` if a document with the specified workflow id exists. """ |
try:
db = self._client[self.database]
col = db[WORKFLOW_DATA_COLLECTION_NAME]
return col.find_one({"_id": ObjectId(workflow_id)}) is not None
except ConnectionFailure:
raise DataStoreNotConnected() |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def add(self, payload=None):
""" Adds a new document to the data store and returns its id. Args: payload (dict):
Dictionary of initial data that should be stored in the new document in the meta section. Raises: DataStoreNotConnected: If the data store is not connected to the server. Returns: str: The id of the newly created document. """ |
try:
db = self._client[self.database]
col = db[WORKFLOW_DATA_COLLECTION_NAME]
return str(col.insert_one({
DataStoreDocumentSection.Meta:
payload if isinstance(payload, dict) else {},
DataStoreDocumentSection.Data: {}
}).inserted_id)
except ConnectionFailure:
raise DataStoreNotConnected() |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def remove(self, workflow_id):
""" Removes a document specified by its id from the data store. All associated GridFs documents are deleted as well. Args: workflow_id (str):
The id of the document that represents a workflow run. Raises: DataStoreNotConnected: If the data store is not connected to the server. """ |
try:
db = self._client[self.database]
fs = GridFSProxy(GridFS(db.unproxied_object))
for grid_doc in fs.find({"workflow_id": workflow_id},
no_cursor_timeout=True):
fs.delete(grid_doc._id)
col = db[WORKFLOW_DATA_COLLECTION_NAME]
return col.delete_one({"_id": ObjectId(workflow_id)})
except ConnectionFailure:
raise DataStoreNotConnected() |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def get(self, workflow_id):
""" Returns the document for the given workflow id. Args: workflow_id (str):
The id of the document that represents a workflow run. Raises: DataStoreNotConnected: If the data store is not connected to the server. Returns: DataStoreDocument: The document for the given workflow id. """ |
try:
db = self._client[self.database]
fs = GridFSProxy(GridFS(db.unproxied_object))
return DataStoreDocument(db[WORKFLOW_DATA_COLLECTION_NAME], fs, workflow_id)
except ConnectionFailure:
raise DataStoreNotConnected() |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def get(self, key, default=None, *, section=DataStoreDocumentSection.Data):
""" Return the field specified by its key from the specified section. This method access the specified section of the workflow document and returns the value for the given key. Args: key (str):
The key pointing to the value that should be retrieved. It supports MongoDB's dot notation for nested fields. default: The default value that is returned if the key does not exist. section (DataStoreDocumentSection):
The section from which the data should be retrieved. Returns: object: The value from the field that the specified key is pointing to. If the key does not exist, the default value is returned. If no default value is provided and the key does not exist ``None`` is returned. """ |
key_notation = '.'.join([section, key])
try:
return self._decode_value(self._data_from_dotnotation(key_notation, default))
except KeyError:
            return default |
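A read sketch using the wrapper obtained above; the key is prefixed internally with the section name, e.g. 'data.results.count'. The field names are assumptions for illustration.

# Hypothetical usage: nested reads via MongoDB dot notation.
count = doc.get('results.count', default=0)
name = doc.get('name', section=DataStoreDocumentSection.Meta)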
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def set(self, key, value, *, section=DataStoreDocumentSection.Data):
""" Store a value under the specified key in the given section of the document. This method stores a value into the specified section of the workflow data store document. Any existing value is overridden. Before storing a value, any linked GridFS document under the specified key is deleted. Args: key (str):
The key pointing to the value that should be stored/updated. It supports MongoDB's dot notation for nested fields. value: The value that should be stored/updated. section (DataStoreDocumentSection):
The section of the document in which the value should be stored. Returns: bool: ``True`` if the value could be set/updated, otherwise ``False``. """ |
key_notation = '.'.join([section, key])
try:
self._delete_gridfs_data(self._data_from_dotnotation(key_notation,
default=None))
except KeyError:
logger.info('Adding new field {} to the data store'.format(key_notation))
result = self._collection.update_one(
{"_id": ObjectId(self._workflow_id)},
{
"$set": {
key_notation: self._encode_value(value)
},
"$currentDate": {"lastModified": True}
}
)
return result.modified_count == 1 |
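A write sketch with the same assumed `doc` wrapper; the field names and values are illustrative assumptions.

# Hypothetical usage: store values under nested keys; existing values are replaced.
doc.set('results.count', 42)
ok = doc.set('status', 'done', section=DataStoreDocumentSection.Meta)
print(ok)   # True if exactly one document was modified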
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def push(self, key, value, *, section=DataStoreDocumentSection.Data):
""" Appends a value to a list in the specified section of the document. Args: key (str):
The key pointing to the value that should be stored/updated. It supports MongoDB's dot notation for nested fields. value: The value that should be appended to a list in the data store. section (DataStoreDocumentSection):
The section of the document in which the value should be stored. Returns: bool: ``True`` if the value could be appended, otherwise ``False``. """ |
key_notation = '.'.join([section, key])
result = self._collection.update_one(
{"_id": ObjectId(self._workflow_id)},
{
"$push": {
key_notation: self._encode_value(value)
},
"$currentDate": {"lastModified": True}
}
)
return result.modified_count == 1 |
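An append sketch with the same assumed `doc` wrapper; MongoDB's $push creates the list if it does not exist yet.

# Hypothetical usage: append entries to a list field one at a time.
doc.push('log', 'task started')
doc.push('log', 'task finished')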
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def extend(self, key, values, *, section=DataStoreDocumentSection.Data):
""" Extends a list in the data store with the elements of values. Args: key (str):
The key pointing to the value that should be stored/updated. It supports MongoDB's dot notation for nested fields. values (list):
A list of the values that should be used to extend the list in the document. section (DataStoreDocumentSection):
The section of the document in which the values should be stored. Returns: bool: ``True`` if the list in the database could be extended, otherwise ``False``. """ |
key_notation = '.'.join([section, key])
if not isinstance(values, list):
return False
result = self._collection.update_one(
{"_id": ObjectId(self._workflow_id)},
{
"$push": {
key_notation: {"$each": self._encode_value(values)}
},
"$currentDate": {"lastModified": True}
}
)
return result.modified_count == 1 |
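A bulk-append sketch with the same assumed `doc` wrapper; a non-list argument is rejected before any database call is made.

# Hypothetical usage: extend a list field with several values at once.
doc.extend('log', ['step 1 done', 'step 2 done'])
print(doc.extend('log', 'not-a-list'))   # False: only lists are accepted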
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def _data_from_dotnotation(self, key, default=None):
""" Returns the MongoDB data from a key using dot notation. Args: key (str):
The key to the field in the workflow document. Supports MongoDB's dot notation for embedded fields. default (object):
The default value that is returned if the key does not exist. Returns: object: The data for the specified key or the default value. """ |
if key is None:
raise KeyError('NoneType is not a valid key!')
doc = self._collection.find_one({"_id": ObjectId(self._workflow_id)})
if doc is None:
return default
for k in key.split('.'):
doc = doc[k]
return doc |
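The dot-notation walk itself needs no database; the following standalone snippet illustrates the same traversal and its KeyError behaviour on a plain dictionary.

# Standalone illustration of the dot-notation walk (no MongoDB required).
document = {"data": {"results": {"count": 3}}}
value = document
for part in "data.results.count".split('.'):
    value = value[part]    # a missing segment raises KeyError, as in the method above
print(value)               # 3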
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def _encode_value(self, value):
""" Encodes the value such that it can be stored into MongoDB. Any primitive types are stored directly into MongoDB, while non-primitive types are pickled and stored as GridFS objects. The id pointing to a GridFS object replaces the original value. Args: value (object):
The object that should be encoded for storing in MongoDB. Returns: object: The encoded value ready to be stored in MongoDB. """ |
if isinstance(value, (int, float, str, bool, datetime)):
return value
elif isinstance(value, list):
return [self._encode_value(item) for item in value]
elif isinstance(value, dict):
result = {}
for key, item in value.items():
result[key] = self._encode_value(item)
return result
else:
return self._gridfs.put(Binary(pickle.dumps(value)),
workflow_id=self._workflow_id) |
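A standalone sketch of the encoding rule: primitive types pass through unchanged, while anything else is pickled into the byte string that would be wrapped in Binary() and stored via GridFS. The Custom class is a made-up example.

import pickle
from datetime import datetime

primitives = [1, 2.5, "text", True, datetime.now()]
print(all(isinstance(v, (int, float, str, bool, datetime)) for v in primitives))  # True

class Custom:
    """A non-primitive value that would be pickled and stored in GridFS."""

blob = pickle.dumps(Custom())   # the pickled payload that would go into GridFS
print(len(blob) > 0)            # True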
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def _decode_value(self, value):
""" Decodes the value by turning any binary data back into Python objects. The method searches for ObjectId values, loads the associated binary data from GridFS and returns the decoded Python object. Args: value (object):
The value that should be decoded. Raises: DataStoreDecodingError: An ObjectId was found but the id is not a valid GridFS id. DataStoreDecodeUnknownType: The type of the specified value is unknown. Returns: object: The decoded value as a valid Python object. """ |
if isinstance(value, (int, float, str, bool, datetime)):
return value
elif isinstance(value, list):
return [self._decode_value(item) for item in value]
elif isinstance(value, dict):
result = {}
for key, item in value.items():
result[key] = self._decode_value(item)
return result
elif isinstance(value, ObjectId):
if self._gridfs.exists({"_id": value}):
return pickle.loads(self._gridfs.get(value).read())
else:
raise DataStoreGridfsIdInvalid()
else:
raise DataStoreDecodeUnknownType() |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def _delete_gridfs_data(self, data):
""" Delete all GridFS data that is linked by fields in the specified data. Args: data: The data that is parsed for MongoDB ObjectIDs. The linked GridFs object for any ObjectID is deleted. """ |
if isinstance(data, ObjectId):
if self._gridfs.exists({"_id": data}):
self._gridfs.delete(data)
else:
raise DataStoreGridfsIdInvalid()
elif isinstance(data, list):
for item in data:
self._delete_gridfs_data(item)
elif isinstance(data, dict):
for key, item in data.items():
self._delete_gridfs_data(item) |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def _iterate_prefix(self, callsign, timestamp=timestamp_now):
"""truncate call until it corresponds to a Prefix in the database""" |
prefix = callsign
if re.search('(VK|AX|VI)9[A-Z]{3}', callsign): #special rule for VK9 calls
if timestamp > datetime(2006,1,1, tzinfo=UTC):
prefix = callsign[0:3]+callsign[4:5]
while len(prefix) > 0:
try:
return self._lookuplib.lookup_prefix(prefix, timestamp)
except KeyError:
prefix = prefix.replace(' ', '')[:-1]
continue
raise KeyError |
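The truncation idea can be shown without a real lookup library; the prefix table below is a hypothetical stand-in for lookup_prefix().

# Standalone sketch: shorten the callsign until it matches a known prefix.
known_prefixes = {"DH", "DL", "VK9N"}   # made-up entries, not a real database

def shortest_known_prefix(callsign):
    prefix = callsign
    while len(prefix) > 0:
        if prefix in known_prefixes:
            return prefix
        prefix = prefix[:-1]
    raise KeyError(callsign)

print(shortest_known_prefix("DH1TW"))   # 'DH'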
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def get_all(self, callsign, timestamp=timestamp_now):
""" Lookup a callsign and return all data available from the underlying database Args: callsign (str):
Amateur Radio callsign timestamp (datetime, optional):
datetime in UTC (tzinfo=pytz.UTC) Returns: dict: Dictionary containing the callsign specific data Raises: KeyError: Callsign could not be identified Example: The following code returns all available information from the country-files.com database for the callsign "DH1TW" { 'country': 'Fed. Rep. of Germany', 'adif': 230, 'continent': 'EU', 'latitude': 51.0, 'longitude': -10.0, 'cqz': 14, 'ituz': 28 } Note: The content of the returned data depends entirely on the injected :py:class:`LookupLib` (and the used database). While the country-files.com provides for example the ITU Zone, Clublog doesn't. Consequently, the item "ituz" would be missing with Clublog (API or XML) :py:class:`LookupLib`. """ |
callsign_data = self._lookup_callsign(callsign, timestamp)
try:
cqz = self._lookuplib.lookup_zone_exception(callsign, timestamp)
callsign_data[const.CQZ] = cqz
except KeyError:
pass
return callsign_data |
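A usage sketch, assuming `ci` is an already constructed instance of this class backed by a country-files.com based lookup library; as the docstring notes, the available keys depend on the injected LookupLib.

# Hypothetical usage: `ci` is assumed to be a ready lookup instance.
data = ci.get_all("DH1TW")
print(data["country"], data["cqz"])   # e.g. 'Fed. Rep. of Germany' 14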
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def is_valid_callsign(self, callsign, timestamp=timestamp_now):
""" Checks if a callsign is valid Args: callsign (str):
Amateur Radio callsign timestamp (datetime, optional):
datetime in UTC (tzinfo=pytz.UTC) Returns: bool: True / False Example: The following checks if "DH1TW" is a valid callsign True """ |
try:
            return bool(self.get_all(callsign, timestamp))
        except KeyError:
            return False |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def get_lat_long(self, callsign, timestamp=timestamp_now):
""" Returns Latitude and Longitude for a callsign Args: callsign (str):
Amateur Radio callsign timestamp (datetime, optional):
datetime in UTC (tzinfo=pytz.UTC) Returns: dict: Containing Latitude and Longitude Raises: KeyError: No data found for callsign Example: The following code returns Latitude & Longitude for "DH1TW" { 'latitude': 51.0, 'longitude': -10.0 } Note: Unfortunately, in most cases the returned Latitude and Longitude are not very precise. Clublog and Country-files.com use the country's capital coordinates in most cases, if no dedicated entry in the database exists. Best results will be retrieved with QRZ.com Lookup. """ |
callsign_data = self.get_all(callsign, timestamp=timestamp)
return {
const.LATITUDE: callsign_data[const.LATITUDE],
const.LONGITUDE: callsign_data[const.LONGITUDE]
} |
<SYSTEM_TASK:>
Solve the following problem using Python, implementing the functions described below, one line at a time
<END_TASK>
<USER_TASK:>
Description:
def get_cqz(self, callsign, timestamp=timestamp_now):
""" Returns CQ Zone of a callsign Args: callsign (str):
Amateur Radio callsign timestamp (datetime, optional):
datetime in UTC (tzinfo=pytz.UTC) Returns: int: containing the callsign's CQ Zone Raises: KeyError: no CQ Zone found for callsign """ |
return self.get_all(callsign, timestamp)[const.CQZ] |
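The convenience accessors share the same assumed `ci` instance from the sketch above.

# Hypothetical usage of the convenience accessors.
coords = ci.get_lat_long("DH1TW")
print(coords["latitude"], coords["longitude"])
print(ci.get_cqz("DH1TW"))             # e.g. 14
print(ci.is_valid_callsign("DH1TW"))   # True when the prefix is in the database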