status | repo_name | repo_url | issue_id | updated_files | title | body | issue_url | pull_url | before_fix_sha | after_fix_sha | report_datetime | language | commit_datetime
---|---|---|---|---|---|---|---|---|---|---|---|---|---|
closed | apache/airflow | https://github.com/apache/airflow | 13,069 | ["dev/provider_packages/prepare_provider_packages.py", "scripts/in_container/run_prepare_provider_packages.sh"] | Rewrite handwritten argument parser in prepare_provider_packages.py | Hello @potiuk
I believe you wrote this script - [`prepare_provider_packages.py`](https://github.com/apache/airflow/blob/master/dev/provider_packages/prepare_provider_packages.py).
I am wondering if there is any reason why you did not use the standard library - [`argument.ArgumentParser`](https://docs.python.org/3/library/argparse.html) in this script?
https://docs.python.org/3/library/argparse.html
https://github.com/apache/airflow/blob/550395192929f86376151b4385cea6b8dfb76e8e/dev/provider_packages/prepare_provider_packages.py#L1518-L1552
This makes the code overly complex and, by the way, more prone to errors e.g. ``usage()`` method describes the `--version-suffix-for-pypi` parameter, but we changed its name to `--version-suffix`.
What do you think to rewrite this script to use the standard library? Do we have a requirement that prevents this? I think of something similar to [my gist](https://gist.github.com/mik-laj/ff008718fc6cec9fe929731b8c62d6f8).
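For illustration, an `argparse`-based entry point could look roughly like the sketch below (the option names here are examples only, not the script's real interface):
```python
import argparse


def get_parser() -> argparse.ArgumentParser:
    parser = argparse.ArgumentParser(description="Prepare provider packages.")
    # Example options -- the real script's flags may differ.
    parser.add_argument("--version-suffix", default="", help="Suffix appended to the PyPI version.")
    parser.add_argument("providers", nargs="*", help="Provider package ids to build.")
    return parser


if __name__ == "__main__":
    args = get_parser().parse_args()
    print(args)
```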
| https://github.com/apache/airflow/issues/13069 | https://github.com/apache/airflow/pull/13234 | 43b2d3392224d8e0d6fb8ce8cdc6b0f0b0cc727b | 1500083ca1b374d8320328929e07f84055d07c4e | 2020-12-14T18:39:34Z | python | 2021-01-04T20:24:08Z |
closed | apache/airflow | https://github.com/apache/airflow | 13,053 | ["airflow/utils/dot_renderer.py", "tests/utils/test_dot_renderer.py"] | CLI does not display TaskGroups | Hello,
Airflow has the ability to [display DAGs in the CLI](http://apache-airflow-docs.s3-website.eu-central-1.amazonaws.com/docs/apache-airflow/latest/usage-cli.html#display-dags-structure) with the command `airflow dags show`, but unfortunately this command does not display Task Groups. It would be great if the Task Groups were correctly marked in the diagrams.
<img width="1268" alt="Screenshot 2020-12-14 at 02 28 58" src="https://user-images.githubusercontent.com/12058428/102030893-9f3e4d00-3db4-11eb-8c2d-f33e38d01997.png">
<img width="681" alt="Screenshot 2020-12-14 at 02 29 16" src="https://user-images.githubusercontent.com/12058428/102030898-a2d1d400-3db4-11eb-9b31-0cde70fea675.png">
Best regards,
Kamil Breguła | https://github.com/apache/airflow/issues/13053 | https://github.com/apache/airflow/pull/14269 | 21f297425ae85ce89e21477d55b51d5560f47bf8 | c71f707d24a9196d33b91a7a2a9e3384698e5193 | 2020-12-14T01:34:50Z | python | 2021-02-25T15:23:15Z |
closed | apache/airflow | https://github.com/apache/airflow | 13,047 | ["airflow/utils/dag_processing.py", "tests/utils/test_dag_processing.py"] | Occasional "KeyError" in dag_processing | **Apache Airflow version**: 2.0.0rc2
**Environment**: Breeze with example dags, Python 3.8 postgres.
- **OS** (e.g. from /etc/os-release): Linux
- **Kernel** (e.g. `uname -a`): Breeze CI image
- **Install tools**: Breeze:
- **Executor**: LocalExecutor
```
./breeze start-airflow --backend postgres --load-example-dags --load-default-connections --install-airflow-version 2.0.0rc2 --skip-mounting-local-sources --python 3.8
```
**What happened**:
When testing airflow logging I occasionally stumble upon a `KeyError` from `dag_processing.py`. I am not sure exactly when it happens and it is not always reproducible, but it seems to occur when I restart the scheduler and trigger `example_bash_operator`; it happens rather randomly (1/10 times more or less). It does not always happen when I trigger the task manually. The DAG gets correctly executed after triggering, but the error is there and a warning is printed in the logs right after the DAG finishes execution.
The error I see in scheduler's logs:
```
[2020-12-13 19:35:33,752] {dagbag.py:440} INFO - Filling up the DagBag from /usr/local/lib/python3.8/site-packages/airflow/example_dags/example_bash_operator.py
Running <TaskInstance: example_bash_operator.run_after_loop 2020-12-13T19:35:30.648020+00:00 [queued]> on host 6611da4b1a27
[2020-12-13 19:35:34,517] {dagrun.py:444} INFO - Marking run <DagRun example_bash_operator @ 2020-12-13 19:35:30.648020+00:00: manual__2020-12-13T19:35:30.648020+00:00, externally triggered: True> successful
[2020-12-13 19:35:34,523] {scheduler_job.py:1193} INFO - Executor reports execution of example_bash_operator.run_after_loop execution_date=2020-12-13 19:35:30.648020+00:00 exited with status success for try_number 1
Process ForkProcess-34:
Traceback (most recent call last):
File "/usr/local/lib/python3.8/multiprocessing/process.py", line 315, in _bootstrap
self.run()
File "/usr/local/lib/python3.8/multiprocessing/process.py", line 108, in run
self._target(*self._args, **self._kwargs)
File "/usr/local/lib/python3.8/site-packages/airflow/utils/dag_processing.py", line 365, in _run_processor_manager
processor_manager.start()
File "/usr/local/lib/python3.8/site-packages/airflow/utils/dag_processing.py", line 596, in start
return self._run_parsing_loop()
File "/usr/local/lib/python3.8/site-packages/airflow/utils/dag_processing.py", line 659, in _run_parsing_loop
self._processors.pop(processor.file_path)
KeyError: '/usr/local/lib/python3.8/site-packages/airflow/example_dags/example_bash_operator.py'
[2020-12-13 19:35:35,589] {dag_processing.py:396} WARNING - DagFileProcessorManager (PID=1029759) exited with exit code 1 - re-launching
```
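The traceback points at the unguarded `self._processors.pop(processor.file_path)` call; for context, the defensive dictionary pattern that tolerates an already-removed key looks like this (a generic illustration, not the fix that was merged):
```python
# Plain-Python illustration of pop() with a default instead of raising KeyError.
processors = {"/files/dags/example_bash_operator.py": "processor"}

path = "/files/dags/example_bash_operator.py"
finished = processors.pop(path, None)
if finished is None:
    print(f"Processor for {path} was already removed")

# A second pop of the same key no longer raises:
assert processors.pop(path, None) is None
```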
**What you expected to happen**:
No error in logs.
**How to reproduce it**:
```
./breeze start-airflow --backend postgres --load-example-dags --load-default-connections --install-airflow-version 2.0.0rc2 --skip-mounting-local-sources --python 3.8
```
Log in to the webserver, enable `example_bash_operator` and wait for it to execute. Then trigger the example DAG several times (always waiting for the end of execution). It happens randomly (for me around 1/10 tasks).
**Anything else we need to know**:
| https://github.com/apache/airflow/issues/13047 | https://github.com/apache/airflow/pull/13662 | 614b70805ade1946bb543b6815e304af1342ae06 | 32f59534cbdb8188e4c8f49d7dfbb4b915eaeb4d | 2020-12-13T19:53:22Z | python | 2021-01-15T16:40:20Z |
closed | apache/airflow | https://github.com/apache/airflow | 13,046 | ["airflow/utils/json.py"] | installation of simplejson breaks airflow webserver 2.0.0rc2 | Version 2.0.0rc2
To reproduce:
1. `pip install apache-airflow==2.0.0rc2`
2. `pip install simplejson`
3. run webserver
4. open in browser and observe following error
Error:
```
[2020-12-13 11:37:28 -0800] [85061] [INFO] Starting gunicorn 19.10.0
[2020-12-13 11:37:28 -0800] [85061] [INFO] Listening at: http://0.0.0.0:8080 (85061)
[2020-12-13 11:37:28 -0800] [85061] [INFO] Using worker: sync
[2020-12-13 11:37:28 -0800] [85064] [INFO] Booting worker with pid: 85064
[2020-12-13 11:37:36,444] {app.py:1892} ERROR - Exception on /home [GET]
Traceback (most recent call last):
File "/Users/dstandish/.pyenv/versions/3.7.5/lib/python3.7/site-packages/flask/app.py", line 2446, in wsgi_app
ctx.push()
File "/Users/dstandish/.pyenv/versions/3.7.5/lib/python3.7/site-packages/flask/ctx.py", line 390, in push
self.session = session_interface.open_session(self.app, self.request)
File "/Users/dstandish/.pyenv/versions/3.7.5/lib/python3.7/site-packages/flask/sessions.py", line 340, in open_session
s = self.get_signing_serializer(app)
File "/Users/dstandish/.pyenv/versions/3.7.5/lib/python3.7/site-packages/flask/sessions.py", line 336, in get_signing_serializer
signer_kwargs=signer_kwargs,
File "/Users/dstandish/.pyenv/versions/3.7.5/lib/python3.7/site-packages/itsdangerous/serializer.py", line 95, in __init__
self.is_text_serializer = is_text_serializer(serializer)
File "/Users/dstandish/.pyenv/versions/3.7.5/lib/python3.7/site-packages/itsdangerous/serializer.py", line 13, in is_text_serializer
return isinstance(serializer.dumps({}), text_type)
File "/Users/dstandish/.pyenv/versions/3.7.5/lib/python3.7/site-packages/flask/json/tag.py", line 305, in dumps
return dumps(self.tag(value), separators=(",", ":"))
File "/Users/dstandish/.pyenv/versions/3.7.5/lib/python3.7/site-packages/flask/json/__init__.py", line 211, in dumps
rv = _json.dumps(obj, **kwargs)
File "/Users/dstandish/.pyenv/versions/3.7.5/lib/python3.7/site-packages/simplejson/__init__.py", line 412, in dumps
**kw).encode(obj)
File "/Users/dstandish/.pyenv/versions/3.7.5/lib/python3.7/site-packages/airflow/utils/json.py", line 36, in __init__
super().__init__(*args, **kwargs)
TypeError: __init__() got an unexpected keyword argument 'encoding'
[2020-12-13 11:37:36 -0800] [85064] [ERROR] Error handling request /home
Traceback (most recent call last):
File "/Users/dstandish/.pyenv/versions/3.7.5/lib/python3.7/site-packages/flask/app.py", line 2446, in wsgi_app
ctx.push()
File "/Users/dstandish/.pyenv/versions/3.7.5/lib/python3.7/site-packages/flask/ctx.py", line 390, in push
self.session = session_interface.open_session(self.app, self.request)
File "/Users/dstandish/.pyenv/versions/3.7.5/lib/python3.7/site-packages/flask/sessions.py", line 340, in open_session
s = self.get_signing_serializer(app)
File "/Users/dstandish/.pyenv/versions/3.7.5/lib/python3.7/site-packages/flask/sessions.py", line 336, in get_signing_serializer
signer_kwargs=signer_kwargs,
File "/Users/dstandish/.pyenv/versions/3.7.5/lib/python3.7/site-packages/itsdangerous/serializer.py", line 95, in __init__
self.is_text_serializer = is_text_serializer(serializer)
File "/Users/dstandish/.pyenv/versions/3.7.5/lib/python3.7/site-packages/itsdangerous/serializer.py", line 13, in is_text_serializer
return isinstance(serializer.dumps({}), text_type)
File "/Users/dstandish/.pyenv/versions/3.7.5/lib/python3.7/site-packages/flask/json/tag.py", line 305, in dumps
return dumps(self.tag(value), separators=(",", ":"))
File "/Users/dstandish/.pyenv/versions/3.7.5/lib/python3.7/site-packages/flask/json/__init__.py", line 211, in dumps
rv = _json.dumps(obj, **kwargs)
File "/Users/dstandish/.pyenv/versions/3.7.5/lib/python3.7/site-packages/simplejson/__init__.py", line 412, in dumps
**kw).encode(obj)
File "/Users/dstandish/.pyenv/versions/3.7.5/lib/python3.7/site-packages/airflow/utils/json.py", line 36, in __init__
super().__init__(*args, **kwargs)
TypeError: __init__() got an unexpected keyword argument 'encoding'
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/Users/dstandish/.pyenv/versions/3.7.5/lib/python3.7/site-packages/gunicorn/workers/sync.py", line 135, in handle
self.handle_request(listener, req, client, addr)
File "/Users/dstandish/.pyenv/versions/3.7.5/lib/python3.7/site-packages/gunicorn/workers/sync.py", line 176, in handle_request
respiter = self.wsgi(environ, resp.start_response)
File "/Users/dstandish/.pyenv/versions/3.7.5/lib/python3.7/site-packages/flask/app.py", line 2464, in __call__
return self.wsgi_app(environ, start_response)
File "/Users/dstandish/.pyenv/versions/3.7.5/lib/python3.7/site-packages/flask/app.py", line 2450, in wsgi_app
response = self.handle_exception(e)
File "/Users/dstandish/.pyenv/versions/3.7.5/lib/python3.7/site-packages/flask/app.py", line 1879, in handle_exception
server_error = handler(server_error)
File "/Users/dstandish/.pyenv/versions/3.7.5/lib/python3.7/site-packages/airflow/www/views.py", line 372, in show_traceback
else 'Error! Please contact server admin.',
File "/Users/dstandish/.pyenv/versions/3.7.5/lib/python3.7/site-packages/flask/templating.py", line 136, in render_template
ctx.app.update_template_context(context)
File "/Users/dstandish/.pyenv/versions/3.7.5/lib/python3.7/site-packages/flask/app.py", line 838, in update_template_context
context.update(func())
File "/Users/dstandish/.pyenv/versions/3.7.5/lib/python3.7/site-packages/flask_login/utils.py", line 368, in _user_context_processor
return dict(current_user=_get_user())
File "/Users/dstandish/.pyenv/versions/3.7.5/lib/python3.7/site-packages/flask_login/utils.py", line 335, in _get_user
current_app.login_manager._load_user()
File "/Users/dstandish/.pyenv/versions/3.7.5/lib/python3.7/site-packages/flask_login/login_manager.py", line 346, in _load_user
is_missing_user_id = 'user_id' not in session
File "/Users/dstandish/.pyenv/versions/3.7.5/lib/python3.7/site-packages/werkzeug/local.py", line 379, in <lambda>
__contains__ = lambda x, i: i in x._get_current_object()
TypeError: argument of type 'NoneType' is not iterable
127.0.0.1 - - [13/Dec/2020:11:37:36 -0800] "GET /home HTTP/1.1" 500 0 "-" "-"
^C[2020-12-13 11:37:38,288] {webserver_command.py:430} INFO - Received signal: 2. Closing gunicorn.
[2020-12-13 11:37:38 -0800] [85061] [INFO] Handling signal: int
[2020-12-13 11:37:38 -0800] [85064] [INFO] Worker exiting (pid: 85064)
[2020-12-13 11:37:38 -0800] [85061] [INFO] Shutting down: Master
``` | https://github.com/apache/airflow/issues/13046 | https://github.com/apache/airflow/pull/13050 | 1c1ef7ee693fead93e269dfd9774a72b6eed2e85 | ea3d42a3b68f926ff5022e2786bd6c57e3308cd2 | 2020-12-13T19:41:10Z | python | 2020-12-14T12:41:12Z |
closed | apache/airflow | https://github.com/apache/airflow | 13,027 | ["MANIFEST.in"] | No such file or directory: '/usr/local/lib/python3.9/site-packages/airflow/customized_form_field_behaviours.schema.json' | v2.0.0rc1
```
airflow db init
DB: sqlite:////Users/red/airflow/airflow.db
[2020-12-12 00:33:02,036] {db.py:678} INFO - Creating tables
INFO [alembic.runtime.migration] Context impl SQLiteImpl.
INFO [alembic.runtime.migration] Will assume non-transactional DDL.
Traceback (most recent call last):
File "/usr/local/bin/airflow", line 8, in <module>
sys.exit(main())
File "/usr/local/lib/python3.9/site-packages/airflow/__main__.py", line 40, in main
args.func(args)
File "/usr/local/lib/python3.9/site-packages/airflow/cli/cli_parser.py", line 48, in command
return func(*args, **kwargs)
File "/usr/local/lib/python3.9/site-packages/airflow/cli/commands/db_command.py", line 31, in initdb
db.initdb()
File "/usr/local/lib/python3.9/site-packages/airflow/utils/db.py", line 549, in initdb
upgradedb()
File "/usr/local/lib/python3.9/site-packages/airflow/utils/db.py", line 688, in upgradedb
command.upgrade(config, 'heads')
File "/usr/local/lib/python3.9/site-packages/alembic/command.py", line 298, in upgrade
script.run_env()
File "/usr/local/lib/python3.9/site-packages/alembic/script/base.py", line 489, in run_env
util.load_python_file(self.dir, "env.py")
File "/usr/local/lib/python3.9/site-packages/alembic/util/pyfiles.py", line 98, in load_python_file
module = load_module_py(module_id, path)
File "/usr/local/lib/python3.9/site-packages/alembic/util/compat.py", line 184, in load_module_py
spec.loader.exec_module(module)
File "<frozen importlib._bootstrap_external>", line 790, in exec_module
File "<frozen importlib._bootstrap>", line 228, in _call_with_frames_removed
File "/usr/local/lib/python3.9/site-packages/airflow/migrations/env.py", line 108, in <module>
run_migrations_online()
File "/usr/local/lib/python3.9/site-packages/airflow/migrations/env.py", line 102, in run_migrations_online
context.run_migrations()
File "<string>", line 8, in run_migrations
File "/usr/local/lib/python3.9/site-packages/alembic/runtime/environment.py", line 846, in run_migrations
self.get_context().run_migrations(**kw)
File "/usr/local/lib/python3.9/site-packages/alembic/runtime/migration.py", line 511, in run_migrations
for step in self._migrations_fn(heads, self):
File "/usr/local/lib/python3.9/site-packages/alembic/command.py", line 287, in upgrade
return script._upgrade_revs(revision, rev)
File "/usr/local/lib/python3.9/site-packages/alembic/script/base.py", line 364, in _upgrade_revs
revs = list(revs)
File "/usr/local/lib/python3.9/site-packages/alembic/script/revision.py", line 777, in _iterate_revisions
uppers = util.dedupe_tuple(self.get_revisions(upper))
File "/usr/local/lib/python3.9/site-packages/alembic/script/revision.py", line 321, in get_revisions
resolved_id, branch_label = self._resolve_revision_number(id_)
File "/usr/local/lib/python3.9/site-packages/alembic/script/revision.py", line 501, in _resolve_revision_number
self._revision_map
File "/usr/local/lib/python3.9/site-packages/alembic/util/langhelpers.py", line 230, in __get__
obj.__dict__[self.__name__] = result = self.fget(obj)
File "/usr/local/lib/python3.9/site-packages/alembic/script/revision.py", line 123, in _revision_map
for revision in self._generator():
File "/usr/local/lib/python3.9/site-packages/alembic/script/base.py", line 112, in _load_revisions
script = Script._from_filename(self, vers, file_)
File "/usr/local/lib/python3.9/site-packages/alembic/script/base.py", line 906, in _from_filename
module = util.load_python_file(dir_, filename)
File "/usr/local/lib/python3.9/site-packages/alembic/util/pyfiles.py", line 98, in load_python_file
module = load_module_py(module_id, path)
File "/usr/local/lib/python3.9/site-packages/alembic/util/compat.py", line 184, in load_module_py
spec.loader.exec_module(module)
File "<frozen importlib._bootstrap_external>", line 790, in exec_module
File "<frozen importlib._bootstrap>", line 228, in _call_with_frames_removed
File "/usr/local/lib/python3.9/site-packages/airflow/migrations/versions/2c6edca13270_resource_based_permissions.py", line 27, in <module>
from airflow.www.app import create_app
File "/usr/local/lib/python3.9/site-packages/airflow/www/app.py", line 38, in <module>
from airflow.www.extensions.init_views import (
File "/usr/local/lib/python3.9/site-packages/airflow/www/extensions/init_views.py", line 29, in <module>
from airflow.www.views import lazy_add_provider_discovered_options_to_connection_form
File "/usr/local/lib/python3.9/site-packages/airflow/www/views.py", line 2836, in <module>
class ConnectionFormWidget(FormWidget):
File "/usr/local/lib/python3.9/site-packages/airflow/www/views.py", line 2839, in ConnectionFormWidget
field_behaviours = json.dumps(ProvidersManager().field_behaviours)
File "/usr/local/lib/python3.9/site-packages/airflow/providers_manager.py", line 111, in __init__
_create_customized_form_field_behaviours_schema_validator()
File "/usr/local/lib/python3.9/site-packages/airflow/providers_manager.py", line 53, in _create_customized_form_field_behaviours_schema_validator
importlib_resources.read_text('airflow', 'customized_form_field_behaviours.schema.json')
File "/usr/local/Cellar/[email protected]/3.9.0_2/Frameworks/Python.framework/Versions/3.9/lib/python3.9/importlib/resources.py", line 139, in read_text
with open_text(package, resource, encoding, errors) as fp:
File "/usr/local/Cellar/[email protected]/3.9.0_2/Frameworks/Python.framework/Versions/3.9/lib/python3.9/importlib/resources.py", line 121, in open_text
open_binary(package, resource), encoding=encoding, errors=errors)
File "/usr/local/Cellar/[email protected]/3.9.0_2/Frameworks/Python.framework/Versions/3.9/lib/python3.9/importlib/resources.py", line 91, in open_binary
return reader.open_resource(resource)
File "<frozen importlib._bootstrap_external>", line 995, in open_resource
FileNotFoundError: [Errno 2] No such file or directory: '/usr/local/lib/python3.9/site-packages/airflow/customized_form_field_behaviours.schema.json'
```
| https://github.com/apache/airflow/issues/13027 | https://github.com/apache/airflow/pull/13031 | 15fd1bc890aa1630ef16e7981408f8f994d30d97 | baa68ca51f93b3cea18efc24a7540a0ddf89c03d | 2020-12-12T00:42:57Z | python | 2020-12-12T09:21:43Z |
closed | apache/airflow | https://github.com/apache/airflow | 12,969 | ["airflow/cli/cli_parser.py", "airflow/cli/commands/task_command.py", "airflow/executors/celery_executor.py", "airflow/executors/local_executor.py", "airflow/task/task_runner/standard_task_runner.py"] | S3 Remote Logging not working |
**Apache Airflow version**: v2.0.0b3
**Kubernetes version (if you are using kubernetes)** (use `kubectl version`): 1.16.15
**Environment**:
- **Cloud provider or hardware configuration**: AWS
- **OS** (e.g. from /etc/os-release):
- **Kernel** (e.g. `uname -a`):
- **Install tools**: Custom Helm Chart
- **Others**:
**What happened**:
S3 Remote Logging not working. Below is the stacktrace:
```
Running <TaskInstance: canary_dag.print_date 2020-12-09T19:46:17.200838+00:00 [queued]> on host canarydagprintdate-9fafada4409d4eafb5e6e9c7187810ae
[2020-12-09 19:54:09,825] {s3_task_handler.py:183} ERROR - Could not verify previous log to append: 'NoneType' object is not callable
Traceback (most recent call last):
  File "/home/airflow/.local/lib/python3.7/site-packages/airflow/providers/amazon/aws/log/s3_task_handler.py", line 179, in s3_write
    if append and self.s3_log_exists(remote_log_location):
  File "/home/airflow/.local/lib/python3.7/site-packages/airflow/providers/amazon/aws/log/s3_task_handler.py", line 141, in s3_log_exists
    return self.hook.check_for_key(remote_log_location)
  File "/home/airflow/.local/lib/python3.7/site-packages/airflow/providers/amazon/aws/hooks/s3.py", line 57, in wrapper
    connection = self.get_connection(self.aws_conn_id)
  File "/home/airflow/.local/lib/python3.7/site-packages/airflow/hooks/base.py", line 63, in get_connection
    conn = Connection.get_connection_from_secrets(conn_id)
  File "/home/airflow/.local/lib/python3.7/site-packages/airflow/models/connection.py", line 351, in get_connection_from_secrets
    conn = secrets_backend.get_connection(conn_id=conn_id)
  File "/home/airflow/.local/lib/python3.7/site-packages/airflow/utils/session.py", line 64, in wrapper
    with create_session() as session:
  File "/usr/local/lib/python3.7/contextlib.py", line 112, in __enter__
    return next(self.gen)
  File "/home/airflow/.local/lib/python3.7/site-packages/airflow/utils/session.py", line 29, in create_session
    session = settings.Session()
TypeError: 'NoneType' object is not callable
[2020-12-09 19:54:09,826] {s3_task_handler.py:193} ERROR - Could not write logs to s3://my-favorite-airflow-logs/canary_dag/print_date/2020-12-09T19:46:17.200838+00:00/2.log
Traceback (most recent call last):
  File "/home/airflow/.local/lib/python3.7/site-packages/airflow/providers/amazon/aws/log/s3_task_handler.py", line 190, in s3_write
    encrypt=conf.getboolean('logging', 'ENCRYPT_S3_LOGS'),
  File "/home/airflow/.local/lib/python3.7/site-packages/airflow/providers/amazon/aws/hooks/s3.py", line 57, in wrapper
    connection = self.get_connection(self.aws_conn_id)
  File "/home/airflow/.local/lib/python3.7/site-packages/airflow/hooks/base.py", line 63, in get_connection
    conn = Connection.get_connection_from_secrets(conn_id)
  File "/home/airflow/.local/lib/python3.7/site-packages/airflow/models/connection.py", line 351, in get_connection_from_secrets
    conn = secrets_backend.get_connection(conn_id=conn_id)
  File "/home/airflow/.local/lib/python3.7/site-packages/airflow/utils/session.py", line 64, in wrapper
    with create_session() as session:
  File "/usr/local/lib/python3.7/contextlib.py", line 112, in __enter__
    return next(self.gen)
  File "/home/airflow/.local/lib/python3.7/site-packages/airflow/utils/session.py", line 29, in create_session
    session = settings.Session()
TypeError: 'NoneType' object is not callable
stream closed
```
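For context, `settings.Session` is a module-level attribute that is only populated once the ORM has been configured; a process that never ran that setup sees exactly this `'NoneType' object is not callable` error. A minimal illustration of that assumption (not the actual patch):
```python
from airflow import settings

# If a (forked) process skipped initialization, the session factory may still be unset.
if settings.Session is None:
    settings.configure_orm()  # builds the engine and the Session factory

session = settings.Session()
session.close()
```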
**What you expected to happen**
Able to see the task instance logs in the airflow UI being read from S3 remote location.
**How to reproduce it**:
Pulled the latest master and created an airflow image from the dockerfile mentioned in the repo.
| https://github.com/apache/airflow/issues/12969 | https://github.com/apache/airflow/pull/13057 | 6bf9acb90fcb510223cadc1f41431ea5f57f0ca1 | ab5f770bfcd8c690cbe4d0825896325aca0beeca | 2020-12-09T20:21:42Z | python | 2020-12-14T16:28:01Z |
closed | apache/airflow | https://github.com/apache/airflow | 12,912 | ["airflow/jobs/scheduler_job.py", "tests/jobs/test_scheduler_job.py"] | dagrun_timeout doesn't kill task instances on timeout | **Apache Airflow version**:
1.10.12
**What happened**:
I created a dag with a `dagrun_timeout` of 2 minutes.
After 2 minutes the dagrun is marked as failed and the next one is started, but the task keeps going.
**What you expected to happen**:
The task is killed together with the dag run, as happens when you mark a dagrun failed manually.
**How to reproduce it**:
```
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.bash_operator import BashOperator

dag = DAG(dag_id='platform.airflow-test',
          description='',
          schedule_interval="0 0 * * *",
          start_date=datetime(2020, 7, 1),
          max_active_runs=1,
          catchup=True,
          dagrun_timeout=timedelta(minutes=2))

run_this = BashOperator(
    task_id='run_after_loop',
    bash_command=' for((i=1;i<=600;i+=1)); do echo "Welcome $i times"; sleep 1; done',
    dag=dag,
)
``` | https://github.com/apache/airflow/issues/12912 | https://github.com/apache/airflow/pull/14321 | 9f37af25ae7eb85fa8dbb70b7dbb23bbd5505323 | 09327ba6b371aa68cf681747c73a7a0f4968c173 | 2020-12-08T09:36:09Z | python | 2021-03-05T00:45:06Z |
closed | apache/airflow | https://github.com/apache/airflow | 12,909 | [".github/workflows/scheduled_quarantined.yml"] | Quarantined Build is broken | Seems like the script `./scripts/ci/tools/ci_check_if_tests_should_be_run.sh` has been removed from code between release 1.10.12 & 1.10.13, and since then the Quarantined Build is broken https://github.com/apache/airflow/actions/runs/405827008
cc - @potiuk
| https://github.com/apache/airflow/issues/12909 | https://github.com/apache/airflow/pull/13288 | c2bedd580c3dd0e971ac394be25e331ba9c1c932 | c4809885ecd7ec1a92a1d8d0264234d86479bf24 | 2020-12-08T05:29:46Z | python | 2020-12-23T17:52:30Z |
closed | apache/airflow | https://github.com/apache/airflow | 12,881 | ["Dockerfile", "Dockerfile.ci", "IMAGES.rst", "scripts/in_container/_in_container_utils.sh", "scripts/in_container/run_ci_tests.sh", "scripts/in_container/run_install_and_test_provider_packages.sh", "scripts/in_container/run_prepare_provider_readme.sh", "setup.py", "tests/providers/presto/hooks/test_presto.py"] | Snowflake python connector monkeypatches urllib and makes many services unusable. | Currently, when you run the Snowflake provider, it monkeypatches urllib in a way that is not compatible with other libraries (for example Presto SSL with Kerberos, Google, Amazon, Qubole and many others).
This is not critical (as in 2.0 we have provider separation, the Snowflake code will not even be there until you choose the [snowflake] extra or install the provider manually).
For now we decided to release but immediately yank the snowflake provider!
Additional links:
* Issue: https://github.com/snowflakedb/snowflake-connector-python/issues/324
Offending code:
* https://github.com/snowflakedb/snowflake-connector-python/blob/133d6215f7920d304c5f2d466bae38127c1b836d/src/snowflake/connector/network.py#L89-L92
| https://github.com/apache/airflow/issues/12881 | https://github.com/apache/airflow/pull/13654 | 821194beead51868ce360dfc096dbab91760cc37 | 6e90dfc38b1bf222f47acc2beb1a6c7ceccdc8dc | 2020-12-07T12:47:04Z | python | 2021-01-16T11:52:56Z |
closed | apache/airflow | https://github.com/apache/airflow | 12,877 | ["setup.cfg"] | ImportError: cannot import name '_Union' from 'typing' (/usr/lib/python3.9/typing.py) | **Apache Airflow version**: 1.10.3
**Environment**:
- **OS** (e.g. from /etc/os-release): Arch Linux
- **Kernel** (e.g. `uname -a`): Linux 5.9.11-arch2-1 #1 SMP PREEMPT Sat, 28 Nov 2020 02:07:22 +0000 x86_64 GNU/Linux
- **Install tools**: pip 2.3.1 (with _--use-deprecated legacy-resolver_)
- **Others**: python 3.9
**What happened**:
```
(env) project-airflow git:(feature-implementation) ./env/bin/airflow webserver
Traceback (most recent call last):
File "/home/user/dev/project-airflow/./env/bin/airflow", line 26, in <module>
from airflow.bin.cli import CLIFactory
File "/home/user/dev/project-airflow/env/lib/python3.9/site-packages/airflow/bin/cli.py", line 95, in <module>
api_module = import_module(conf.get('cli', 'api_client')) # type: Any
File "/usr/lib/python3.9/importlib/__init__.py", line 127, in import_module
return _bootstrap._gcd_import(name[level:], package, level)
File "/home/user/dev/project-airflow/env/lib/python3.9/site-packages/airflow/api/client/local_client.py", line 24, in <module>
from airflow.api.common.experimental import delete_dag
File "/home/user/dev/project-airflow/env/lib/python3.9/site-packages/airflow/api/common/experimental/delete_dag.py", line 26, in <module>
from airflow.models.serialized_dag import SerializedDagModel
File "/home/user/dev/project-airflow/env/lib/python3.9/site-packages/airflow/models/serialized_dag.py", line 35, in <module>
from airflow.serialization.serialized_objects import SerializedDAG
File "/home/user/dev/project-airflow/env/lib/python3.9/site-packages/airflow/serialization/serialized_objects.py", line 28, in <module>
import cattr
File "/home/user/dev/project-airflow/env/lib/python3.9/site-packages/cattr/__init__.py", line 2, in <module>
from .converters import Converter, UnstructureStrategy
File "/home/user/dev/project-airflow/env/lib/python3.9/site-packages/cattr/converters.py", line 15, in <module>
from ._compat import (
File "/home/user/dev/project-airflow/env/lib/python3.9/site-packages/cattr/_compat.py", line 87, in <module>
from typing import _Union
ImportError: cannot import name '_Union' from 'typing' (/usr/lib/python3.9/typing.py)
```
**How to reproduce it**:
Try to launch the airflow webserver with Python **3.9**
**Anything else we need to know**:
-- | https://github.com/apache/airflow/issues/12877 | https://github.com/apache/airflow/pull/13223 | f95b1c9c95c059e85ad5676daaa191929785fee2 | 9c0a5df22230105eb3a571c040daaba3f9cadf37 | 2020-12-07T10:19:45Z | python | 2020-12-21T20:36:54Z |
closed | apache/airflow | https://github.com/apache/airflow | 12,876 | ["airflow/models/baseoperator.py", "tests/core/test_core.py", "tests/models/test_baseoperator.py"] | Improve error message when template fields contain invalid entries | ** Description **
When a new operator is defined, there are some issues with `template_fields` that produce somewhat misleading error messages:
* `template_fields = ('myfield')`, which of course is incorrect because that is not a tuple, you need to use `('myfield',)` or `['myfield']`
* `template_field = ['field_with_a_typo']`
As I said, both give errors (at different stages), but the errors are a bit cryptic.
For example:
```
[2020-12-07 09:50:45,088] {taskinstance.py:1150} ERROR - 'SFTPToS3Operator2' object has no attribute 's3_key'
Traceback (most recent call last):
File "/home/airflow/.local/lib/python3.8/site-packages/airflow/models/taskinstance.py", line 965, in _run_raw_task
self.render_templates(context=context)
File "/home/airflow/.local/lib/python3.8/site-packages/airflow/models/taskinstance.py", line 1424, in render_templates
self.task.render_template_fields(context)
File "/home/airflow/.local/lib/python3.8/site-packages/airflow/models/baseoperator.py", line 719, in render_template_fields
self._do_render_template_fields(self, self.template_fields, context, jinja_env, set())
File "/home/airflow/.local/lib/python3.8/site-packages/airflow/models/baseoperator.py", line 724, in _do_render_template_fields
content = getattr(parent, attr_name)
AttributeError: 'SFTPToS3Operator2' object has no attribute 's3_key'
```
where there is no mention that this is related to `template_fields`.
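To make the first pitfall concrete, this is plain Python behaviour rather than anything Airflow-specific:
```python
# Illustration of the tuple-vs-string gotcha:
template_fields = ('myfield')     # parentheses alone do NOT make a tuple; this is the str 'myfield'
print(list(template_fields))      # ['m', 'y', 'f', 'i', 'e', 'l', 'd']

template_fields = ('myfield',)    # the trailing comma makes it a one-element tuple
template_fields = ['myfield']     # a list works as well
```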
**Use case / motivation**
In order to have a better experience developing plugins I would like
* A warning / error if a str is used for `template_fields = 'myfield'`. It's very unlikely that anyone wants to use `myfield` as the sequence `['m','y','f','i','e','l','d']`.
* A more specific error message in `_run_raw_task` if `template_fields` contains attributes not present.
| https://github.com/apache/airflow/issues/12876 | https://github.com/apache/airflow/pull/21054 | e97b72994f18e40e302ba8a14dbe73d34846a557 | beb2a2081a800650bc9a5e602b7216166582f67f | 2020-12-07T10:06:14Z | python | 2022-01-26T16:44:57Z |
closed | apache/airflow | https://github.com/apache/airflow | 12,861 | [".github/workflows/build-images-workflow-run.yml", ".github/workflows/ci.yml", "scripts/ci/tools/ci_free_space_on_ci.sh", "tests/cli/commands/test_jobs_command.py", "tests/jobs/test_scheduler_job.py", "tests/models/test_taskinstance.py", "tests/test_utils/asserts.py"] | [QUARANTINE] TestSchedulerJob.test_scheduler_task_start_date | The test fails quite regularly - usually in one of many jobs:
Example
https://github.com/apache/airflow/pull/12850/checks?check_run_id=1506517544
```
______________ TestSchedulerJob.test_scheduler_task_start_date ________________
self = <tests.jobs.test_scheduler_job.TestSchedulerJob testMethod=test_scheduler_task_start_date>
def test_scheduler_task_start_date(self):
"""
Test that the scheduler respects task start dates that are different from DAG start dates
"""
dagbag = DagBag(dag_folder=os.path.join(settings.DAGS_FOLDER, "no_dags.py"), include_examples=False)
dag_id = 'test_task_start_date_scheduling'
dag = self.dagbag.get_dag(dag_id)
dag.is_paused_upon_creation = False
dagbag.bag_dag(dag=dag, root_dag=dag)
# Deactivate other dags in this file so the scheduler doesn't waste time processing them
other_dag = self.dagbag.get_dag('test_start_date_scheduling')
other_dag.is_paused_upon_creation = True
dagbag.bag_dag(dag=other_dag, root_dag=other_dag)
dagbag.sync_to_db()
scheduler = SchedulerJob(executor=self.null_exec, subdir=dag.fileloc, num_runs=2)
scheduler.run()
session = settings.Session()
tiq = session.query(TaskInstance).filter(TaskInstance.dag_id == dag_id)
ti1s = tiq.filter(TaskInstance.task_id == 'dummy1').all()
ti2s = tiq.filter(TaskInstance.task_id == 'dummy2').all()
self.assertEqual(len(ti1s), 0)
> self.assertEqual(len(ti2s), 2)
E AssertionError: 1 != 2
tests/jobs/test_scheduler_job.py:2415: AssertionError
```
| https://github.com/apache/airflow/issues/12861 | https://github.com/apache/airflow/pull/14792 | 3f61df11e7e81abc0ac4495325ccb55cc1c88af4 | 45cf89ce51b203bdf4a2545c67449b67ac5e94f1 | 2020-12-06T19:43:03Z | python | 2021-03-18T13:01:10Z |
closed | apache/airflow | https://github.com/apache/airflow | 12,852 | ["IMAGES.rst", "README.md"] | The README file in this repo has a bad link - [404:NotFound] "production-deployment.html" | **Apache Airflow version**:
N/A
**Kubernetes version (if you are using kubernetes)** (use `kubectl version`):
N/A
**Environment**:
N/A
- **Cloud provider or hardware configuration**: N/A
- **OS** (e.g. from /etc/os-release): N/A
- **Kernel** (e.g. `uname -a`): N/A
- **Install tools**: N/A
- **Others**: N/A
**What happened**:
The link under "Latest docs" gives
Status code [404:NotFound] - Link: https://github.com/apache/airflow/blob/master/docs/production-deployment.html
**What you expected to happen**:
The link should point to an actual file.
The closest named file I could find is "https://github.com/apache/airflow/blob/master/docs/apache-airflow/production-deployment.rst",
but I was not sure if this is what the link should be pointing to or not.
**How to reproduce it**:
Click the link in the main page for this repo
## Install minikube/kind
N/A
**Anything else we need to know**:
This bad link was found by a tool I recently created as part of a new experimental hobby project: https://github.com/MrCull/GitHub-Repo-ReadMe-Dead-Link-Finder
Re-check this Repo via: http://githubreadmechecker.com/Home/Search?SingleRepoUri=https%3a%2f%2fgithub.com%2fapache%2fairflow
Check all Repos for this GitHub account: http://githubreadmechecker.com/Home/Search?User=apache
--
I (a human) verified that this link is broken and have manually logged this Issue (i.e. this Issue has not been created by a bot).
If this has been in any way helpful then please consider giving the above Repo a Star.
If you have any feedback on the information provided here, or on the tool itself, then please feel free to share your thoughts and pass on the feedback, or log an "Issue".
| https://github.com/apache/airflow/issues/12852 | https://github.com/apache/airflow/pull/12854 | a00f25011fc6c859b27b6c78b9201880cf6323ce | 3663d1519eb867b6bb152b27b93033666993511a | 2020-12-06T13:53:21Z | python | 2020-12-07T00:05:21Z |
closed | apache/airflow | https://github.com/apache/airflow | 12,832 | ["dev/README_RELEASE_AIRFLOW.md", "dev/sign.sh"] | Source hash apache-airflow-1.10.13-bin.tar.gz.sha512 format is invalid | **Description**
The checksum (`.sha512`) file for apache-airflow releases is in an unexpected format for Python-based checksum modules.
**Current file format:**
apache-airflow-1.10.13rc1-bin.tar.gz: 36D641C0 F2AAEC4E BCE91BD2 66CE2BC6
AA2D995C 08C9B62A 0EA1CBEC 027E657B
8AF4B54E 6C3AD117 9634198D F6EA53F8
163711BA 95586B5B 7BCF7F4B 098A19E2
**Wanted formats**
xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx amd64\apache-airflow-1.10.13-bin.tar.gz
**Or**
`xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx`
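For what it's worth, the single-line format above is what a small `hashlib` helper would emit -- a sketch (the file name is an example):
```python
import hashlib
from pathlib import Path


def sha512_line(path: str) -> str:
    """Return '<hex digest>  <file name>' -- the single-line format most tooling expects."""
    digest = hashlib.sha512(Path(path).read_bytes()).hexdigest()
    return f"{digest}  {Path(path).name}"


# Example (assumes the archive has been downloaded locally):
# print(sha512_line("apache-airflow-1.10.13-bin.tar.gz"))
```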
**Use case / motivation**
The Ansible and Salt Python libraries that consume checksums do not understand the format...
```
ID: airflow-archive-install
Function: archive.extracted
Name: /opt/apache-airflow-1.10.13/bin/
Result: False
Comment: Attempt 1: Returned a result of "False", with the following comment: "Source hash https://github.com/apache/airflow/releases/download/1.10.13/ap
ache-airflow-1.10.13-bin.tar.gz.sha512 format is invalid. The supported formats are: 1) a hash, 2) an expression in the format <hash_type>=<hash>, or 3) eithe
r a path to a local file containing hashes, or a URI of a remote hash file. Supported protocols for remote hash files are: salt, file, http, https, ftp, swift
, s3. The hash may also not be of a valid length, the following are supported hash types and lengths: md5 (32), sha1 (40), sha224 (56), sha256 (64), sha384 (9
6), sha512 (128)."
......etc
Started: 11:39:44.082079
Duration: 123506.098 ms
```
**Related Issues**
No
| https://github.com/apache/airflow/issues/12832 | https://github.com/apache/airflow/pull/12867 | 298c88a434325dd6df8f374057709022e0b0811f | a00f25011fc6c859b27b6c78b9201880cf6323ce | 2020-12-05T12:01:35Z | python | 2020-12-06T23:46:06Z |
closed | apache/airflow | https://github.com/apache/airflow | 12,827 | ["airflow/config_templates/default_webserver_config.py", "docs/apache-airflow/security/webserver.rst"] | Missing docs about webserver_config.py | Hello,
We are missing documentation on the `webserver_config.py` file. I think it is worth answering the following questions in this guide:
* What is this file?
* What is this file for?
* When and how should you edit this file?
Best regards,
Kamil Breguła | https://github.com/apache/airflow/issues/12827 | https://github.com/apache/airflow/pull/13155 | 23a47879ababe76f6cf9034a2bae055b2a91bf1f | 81fed8072d1462ab43818bb7757ade4b67982976 | 2020-12-05T05:45:30Z | python | 2020-12-20T01:21:49Z |
closed | apache/airflow | https://github.com/apache/airflow | 12,807 | ["airflow/config_templates/config.yml", "airflow/config_templates/default_airflow.cfg", "airflow/configuration.py", "airflow/models/baseoperator.py", "tests/core/test_configuration.py", "tests/models/test_baseoperator.py"] | add default weight_rule to airflow.cfg | **Description**
It would be nice if the default value of `weight_rule` could be managed by a global config option.
Suggested config:
```
# Weighting method used for the effective total priority weight of the task.
# Options are: { downstream | upstream | absolute } default is
default_weight_rule = downstream
```
**Use case / motivation**
In some pipelines you really need absolute weights, and then you have to add an extra line to each task definition, which is annoying.
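For context, this is what has to be repeated on every task today -- a small sketch (the dag and task names are illustrative):
```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash_operator import BashOperator
from airflow.utils.weight_rule import WeightRule

with DAG("weight_rule_example", start_date=datetime(2020, 1, 1), schedule_interval=None) as dag:
    BashOperator(
        task_id="example_task",
        bash_command="echo hello",
        # without a global default, this line is needed on every single task
        weight_rule=WeightRule.ABSOLUTE,
    )
```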
| https://github.com/apache/airflow/issues/12807 | https://github.com/apache/airflow/pull/18627 | d0ffd31ba3a4e8cd27fb7305cc19c33cf637509f | d79f506213297dc0dc034d6df3226361b6f95d7a | 2020-12-04T10:02:56Z | python | 2021-09-30T14:53:55Z |
closed | apache/airflow | https://github.com/apache/airflow | 12,806 | ["airflow/cli/commands/db_command.py"] | 'NoneType' object has no attribute 'wait' with airflow db shell on SQLite | **Apache Airflow version**: 2.0.0b3
**Kubernetes version (if you are using kubernetes)** (use `kubectl version`):
**Environment**:
- **Cloud provider or hardware configuration**:
- **OS** (e.g. from /etc/os-release): Ubuntu 20.04.1 LTS
- **Kernel** (e.g. `uname -a`): Linux airflowvm 5.4.0-56-generic #62-Ubuntu SMP Mon Nov 23 19:20:19 UTC 2020 x86_64 x86_64 x86_64 GNU/Linux
- **Install tools**:
- **Others**:
**What happened**:
After connecting to SQLite with `airflow db shell`
And exiting the shell with `.quit`
I got the following error:
```
[2020-12-04 07:31:28,506] {process_utils.py:149} INFO - Executing cmd: sqlite3 /home/airflow/airflow/airflow.db
SQLite version 3.31.1 2020-01-27 19:55:54
Enter ".help" for usage hints.
sqlite> ;
sqlite> .quit
Traceback (most recent call last):
File "/home/airflow/sandbox/bin/airflow", line 8, in <module>
sys.exit(main())
File "/home/airflow/sandbox/lib/python3.8/site-packages/airflow/__main__.py", line 40, in main
args.func(args)
File "/home/airflow/sandbox/lib/python3.8/site-packages/airflow/cli/cli_parser.py", line 50, in command
return func(*args, **kwargs)
File "/home/airflow/sandbox/lib/python3.8/site-packages/airflow/utils/cli.py", line 86, in wrapper
return f(*args, **kwargs)
File "/home/airflow/sandbox/lib/python3.8/site-packages/airflow/cli/commands/db_command.py", line 78, in shell
execute_interactive(["sqlite3", url.database]).wait()
AttributeError: 'NoneType' object has no attribute 'wait'
```
**What you expected to happen**:
No error when exiting the session.
Looks like this `execute_interactive(["sqlite3", url.database])` returns `None`
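A sketch of the kind of guard that would avoid the crash (illustrative; the `execute_interactive` below is a stand-in for the helper called in the traceback above):
```python
import subprocess
from typing import Optional


def execute_interactive(cmd) -> Optional[subprocess.Popen]:
    # Stand-in for the real helper, which in this scenario returned None
    # instead of a process handle.
    return subprocess.Popen(cmd)


proc = execute_interactive(["sqlite3", "/tmp/example.db"])  # assumes sqlite3 is installed
if proc is not None:  # only wait when we actually got a handle back
    proc.wait()
```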
**How to reproduce it**:
```
airflow db shell
sqlite> .quit
```
**Anything else we need to know**:
I love this new command :) | https://github.com/apache/airflow/issues/12806 | https://github.com/apache/airflow/pull/13907 | c2266aac489b126638b3403b7a1ff0d2a9368056 | 0d1c39ad2d1e8344af413041b3bb6834d1b56778 | 2020-12-04T07:40:22Z | python | 2021-01-26T12:23:38Z |
closed | apache/airflow | https://github.com/apache/airflow | 12,796 | ["airflow/providers/http/sensors/http.py"] | Make headers templated in HttpSensor | **Description**
Make HttpSensor `headers` parameter templated.
**Use case / motivation**
This would allow for passing data from other tasks, such as an API token, in the headers.
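Until that is supported out of the box, a user-side workaround along these lines should work (a sketch; the subclass name is made up):
```python
from airflow.providers.http.sensors.http import HttpSensor


class TemplatedHeadersHttpSensor(HttpSensor):
    """Illustrative subclass that also renders the `headers` argument through Jinja."""

    template_fields = (*HttpSensor.template_fields, "headers")
```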
**Related Issues**
N/A | https://github.com/apache/airflow/issues/12796 | https://github.com/apache/airflow/pull/12809 | 37afe55775676e2cb4cf6ed0cfc6c892855d6805 | c1cd50465c5473bc817fded5eeb4c425a0529ae5 | 2020-12-03T20:57:43Z | python | 2020-12-05T00:59:52Z |
closed | apache/airflow | https://github.com/apache/airflow | 12,785 | ["airflow/operators/python.py", "airflow/plugins_manager.py", "airflow/utils/python_virtualenv_script.jinja2", "tests/plugins/test_plugins_manager.py"] | Macros added through plugins can not be used within Jinja templates in Airflow 2.0 | **Apache Airflow version**: 2.0.0b3
**Kubernetes version (if you are using kubernetes)** (use `kubectl version`): N/A
**Environment**:
- **OS** (e.g. from /etc/os-release): Debian GNU/Linux 10 (buster)
- **Kernel** (e.g. `uname -a`): Linux 6ae65b86e112 5.4.0-52-generic #57-Ubuntu SMP Thu Oct 15 10:57:00 UTC 2020 x86_64 GNU/Linux
- **Others**: Python 3.8
**What happened**:
At JW Player we add additional macros to Airflow through a plugin. The definition of this plugin looks like the following (simplified):
```
from airflow.plugins_manager import AirflowPlugin

from utils_plugin.macros.convert_image_tag import convert_image_tag


class JwUtilsPlugin(AirflowPlugin):
    name = 'jw_utils'
    macros = [convert_image_tag]
```
`convert_image_tag` is a function that takes a string (a docker tag) as argument and resolves it to a SHA-256 hash that uniquely identifies an image by querying the docker registry. I.e. it is a function that takes a string as argument and returns a string.
In Airflow 1.10.x we can successfully use this macro in our DAGs to resolve image tags to SHA-256 hashes, e.g. the following DAG will run an Alpine Image using a DockerOperator:
```python
from datetime import datetime, timedelta

from airflow import DAG

try:
    from airflow.providers.docker.operators.docker import DockerOperator
except ModuleNotFoundError:
    from airflow.operators.docker_operator import DockerOperator

now = datetime.now()

with DAG('test_dag',
         schedule_interval='*/15 * * * *',
         default_args={
             'owner': 'airflow',
             'start_date': datetime.utcnow() - timedelta(hours=1),
             'task_concurrency': 1,
             'execution_timeout': timedelta(minutes=5)
         },
         max_active_runs=1) as dag:
    task_sleep = DockerOperator(
        task_id='task_sleep',
        image=f"{{ macros.jw_utils.convert_image_tag('alpine') }}",
        command=['sleep', '10']
    )
```
This is in contrast to Airflow 2.0, if we attempt to use our custom macro here, then when Airflow attempts to render the task template it will error out with the following error:
```
[2020-12-03 12:54:43,666] {{taskinstance.py:1402}} ERROR - 'module object' has no attribute 'jw_utils'
Traceback (most recent call last):
File "/usr/local/lib/python3.8/site-packages/airflow/models/taskinstance.py", line 1087, in _run_raw_task
self._prepare_and_execute_task_with_callbacks(context, task)
File "/usr/local/lib/python3.8/site-packages/airflow/models/taskinstance.py", line 1224, in _prepare_and_execute_task_with_callbacks
self.render_templates(context=context)
File "/usr/local/lib/python3.8/site-packages/airflow/models/taskinstance.py", line 1690, in render_templates
self.task.render_template_fields(context)
File "/usr/local/lib/python3.8/site-packages/airflow/models/baseoperator.py", line 857, in render_template_fields
self._do_render_template_fields(self, self.template_fields, context, jinja_env, set())
File "/usr/local/lib/python3.8/site-packages/airflow/models/baseoperator.py", line 870, in _do_render_template_fields
rendered_content = self.render_template(content, context, jinja_env, seen_oids)
File "/usr/local/lib/python3.8/site-packages/airflow/models/baseoperator.py", line 907, in render_template
return jinja_env.from_string(content).render(**context)
File "/usr/local/lib/python3.8/site-packages/jinja2/environment.py", line 1090, in render
self.environment.handle_exception()
File "/usr/local/lib/python3.8/site-packages/jinja2/environment.py", line 832, in handle_exception
reraise(*rewrite_traceback_stack(source=source))
File "/usr/local/lib/python3.8/site-packages/jinja2/_compat.py", line 28, in reraise
raise value.with_traceback(tb)
File "<template>", line 1, in top-level template code
File "/usr/local/lib/python3.8/site-packages/jinja2/environment.py", line 471, in getattr
return getattr(obj, attribute)
jinja2.exceptions.UndefinedError: 'module object' has no attribute 'jw_utils'
```
**What you expected to happen**:
I would have expected that the DAG definition from above would have worked in Airflow 2.0, like it would have functioned in Airflow 1.10.x.
**How to reproduce it**:
This bug can be reproduced by creating a plugin that adds a macro, and then attempting to use that macro in a DAG.
**Anything else we need to know**:
In order to better understand the issue, I did a bit of digging. The plugin that we extend Airflow's functionality with has its own suite of pytest testcases. Since we are in the process of preparing for a transition to Airflow 2.0 we are now running the unit tests for this plugin against both Airflow 1.10.x and Airflow 2.0.0b3.
After reviewing how plugins were being loaded in Airflow, I've added the following testcase to mimic how plugins were being loaded and how [`get_template_context()`](https://github.com/apache/airflow/blob/2.0.0b3/airflow/models/taskinstance.py#L1481) in Airflow 2.0 ensures that plugins have been imported:
```python
from importlib import import_module


def test_macro_namespacing(is_airflow_1):
    """
    Tests whether macros can be loaded from Airflow's namespace after loading plugins.
    """
    from airflow import macros

    if not is_airflow_1:
        # In Airflow 2.x, we need to make sure we invoke integrate_macros_plugins(), otherwise
        # the namespace will not be created properly.
        from airflow.plugins_manager import integrate_macros_plugins
        integrate_macros_plugins()

    from utils_plugin.plugin import JwUtilsPlugin

    # After Airflow has loaded the plugins, the macros should be available as airflow.macros.jw_utils.
    macros_module = import_module(f"airflow.macros.{JwUtilsPlugin.name}")

    for macro in JwUtilsPlugin.macros:
        # Verify that macros have been registered correctly.
        assert hasattr(macros_module, macro.__name__)

    # However, in order for the module to actually be allowed to be used in templates, it must also exist on
    # airflow.macros.
    assert hasattr(macros, 'jw_utils')
```
This test case passes when run on Airflow 1.10, but surprisingly enough it fails on Airflow 2.x. Specifically it fails on the `assert hasattr(macros, 'jw_utils')` statement in Airflow 2.0. This statement tests whether the macros that we create through the `JwUtilsPlugin` have been properly added to `airflow.macros`.
I thought it was strange for the test-case to fail on this module, given that the `import_module()` statement succeeded in Airflow 2.0. After this observation I started comparing the logic for registering macros in Airflow 1.10.x to the Airflow 2.0.0 implementation.
While doing this I observed that the plugin loading mechanism in Airflow 1.10.x works because Airflow [automatically discovers](https://github.com/apache/airflow/blob/1.10.13/airflow/__init__.py#L104) all plugins through the `plugins_manager` module. When this happens it automatically [initializes plugin-macro modules](https://github.com/apache/airflow/blob/1.10.13/airflow/plugins_manager.py#L306) in the `airflow.macros` namespace. Notably, after the plugin's module has been initialized it will also automatically be registered on the `airflow.macros` module [by updating the dictionary](https://github.com/apache/airflow/blob/1.10.13/airflow/macros/__init__.py#L93) returned by `globals()`.
This is in contrast to Airflow 2.0, where plugins are no longer loaded automatically. Instead they are being loaded lazily, i.e. they will be loaded on-demand whenever a function needs them. In order to load macros (or ensure that macros have been loaded), modules need to import the [`integrate_macros_plugins`](https://github.com/apache/airflow/blob/2.0.0b3/airflow/plugins_manager.py#L395) function from `airflow.plugins_manager`.
When Airflow attempts to prepare a template context, prior to running a task, it properly imports this function and invokes it in [taskinstance.py](https://github.com/apache/airflow/blob/2.0.0b3/airflow/models/taskinstance.py#L1483). However, in contrast to the old 1.10.x implementation, this function does not update the symbol table of `airflow.macros`. The result of this is that the macros from the plugin _will in fact_ be imported, but because `airflow.macros` symbol table itself is not being updated, the macros that are being added by the plugins can not be used in the template rendering context.
I believe this issue could be solved by ensuring that `integrate_macros_plugins` sets a reference to the `airflow.macros.jw_utils` as `jw_utils` on the `airflow.macros` module. Once that has been done I believe macros provided through plugins are functional again.
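In code, the idea above would amount to something like the following sketch (the helper name is made up, and this is not necessarily the patch that was eventually merged):
```python
import importlib

import airflow.macros


def expose_plugin_macros(plugin_name: str) -> None:
    """Expose airflow.macros.<plugin_name> as an attribute of airflow.macros for Jinja lookups."""
    module = importlib.import_module(f"airflow.macros.{plugin_name}")
    setattr(airflow.macros, plugin_name, module)


# e.g. expose_plugin_macros("jw_utils") after integrate_macros_plugins() has run
```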
| https://github.com/apache/airflow/issues/12785 | https://github.com/apache/airflow/pull/12788 | f66a46db88da86b4a11c5ee142c09a5001c32c41 | 29d78489e76c292c2ca74cab02141c2bcff2aabc | 2020-12-03T14:52:01Z | python | 2020-12-07T22:34:14Z |
closed | apache/airflow | https://github.com/apache/airflow | 12,783 | ["airflow/models/baseoperator.py", "airflow/sensors/base_sensor_operator.py", "airflow/serialization/schema.json", "airflow/serialization/serialized_objects.py", "tests/serialization/test_dag_serialization.py"] | Sensors in reschedule mode are not rescheduled | **Apache Airflow version**:
2.0.0dev
**Kubernetes version (if you are using kubernetes)** (use `kubectl version`):
**Environment**:
```
./breeze --python=3.8 --backend=postgres --db-reset restart
```
**What happened**:
Sensors in reschedule mode are not rescheduled by scheduler.
**What you expected to happen**:
Sensors in both poke and reschedule mode should work.
**How to reproduce it**:
```
from airflow import DAG
from airflow.sensors.base_sensor_operator import BaseSensorOperator
from airflow.utils.dates import days_ago


class DummySensor(BaseSensorOperator):
    def poke(self, context):
        return False


with DAG(
    "other_dag",
    start_date=days_ago(1),
    schedule_interval="*/5 * * * *",
    catchup=False
) as dag3:
    DummySensor(
        task_id='wait-task',
        poke_interval=60 * 5,
        mode='reschedule'
    )
```
Then:
```
root@053f6ca34e24: /opt/airflow# airflow dags unpause other_dag
Dag: other_dag, paused: False
root@053f6ca34e24: /opt/airflow# airflow scheduler
____________ _____________
____ |__( )_________ __/__ /________ __
____ /| |_ /__ ___/_ /_ __ /_ __ \_ | /| / /
___ ___ | / _ / _ __/ _ / / /_/ /_ |/ |/ /
_/_/ |_/_/ /_/ /_/ /_/ \____/____/|__/
[2020-12-03 14:18:58,404] {scheduler_job.py:1247} INFO - Starting the scheduler
[2020-12-03 14:18:58,404] {scheduler_job.py:1252} INFO - Processing each file at most -1 times
[2020-12-03 14:18:58,571] {dag_processing.py:250} INFO - Launched DagFileProcessorManager with pid: 63835
[2020-12-03 14:18:58,576] {scheduler_job.py:1757} INFO - Resetting orphaned tasks for active dag runs
[2020-12-03 14:18:58,660] {settings.py:52} INFO - Configured default timezone Timezone('UTC')
[2020-12-03 14:18:58,916] {scheduler_job.py:944} INFO - 1 tasks up for execution:
<TaskInstance: other_dag.wait-task 2020-12-03 14:10:00+00:00 [scheduled]>
[2020-12-03 14:18:58,920] {scheduler_job.py:973} INFO - Figuring out tasks to run in Pool(name=default_pool) with 128 open slots and 1 task instances ready to be queued
[2020-12-03 14:18:58,921] {scheduler_job.py:1001} INFO - DAG other_dag has 0/16 running and queued tasks
[2020-12-03 14:18:58,921] {scheduler_job.py:1066} INFO - Setting the following tasks to queued state:
<TaskInstance: other_dag.wait-task 2020-12-03 14:10:00+00:00 [scheduled]>
[2020-12-03 14:18:58,925] {scheduler_job.py:1108} INFO - Sending TaskInstanceKey(dag_id='other_dag', task_id='wait-task', execution_date=datetime.datetime(2020, 12, 3, 14, 10, tzinfo=Timezone('UTC')), try_number=1) to executor with priority 1 and queue default
[2020-12-03 14:18:58,926] {base_executor.py:79} INFO - Adding to queue: ['airflow', 'tasks', 'run', 'other_dag', 'wait-task', '2020-12-03T14:10:00+00:00', '--local', '--pool', 'default_pool', '--subdir', '/files/dags/the_old_issue.py']
[2020-12-03 14:18:58,935] {local_executor.py:80} INFO - QueuedLocalWorker running ['airflow', 'tasks', 'run', 'other_dag', 'wait-task', '2020-12-03T14:10:00+00:00', '--local', '--pool', 'default_pool', '--subdir', '/files/dags/the_old_issue.py']
[2020-12-03 14:18:59,063] {dagbag.py:440} INFO - Filling up the DagBag from /files/dags/the_old_issue.py
Running <TaskInstance: other_dag.wait-task 2020-12-03T14:10:00+00:00 [queued]> on host 053f6ca34e24
[2020-12-03 14:19:00,022] {scheduler_job.py:944} INFO - 1 tasks up for execution:
<TaskInstance: other_dag.wait-task 2020-12-03 14:10:00+00:00 [scheduled]>
[2020-12-03 14:19:00,029] {scheduler_job.py:973} INFO - Figuring out tasks to run in Pool(name=default_pool) with 128 open slots and 1 task instances ready to be queued
[2020-12-03 14:19:00,029] {scheduler_job.py:1001} INFO - DAG other_dag has 0/16 running and queued tasks
[2020-12-03 14:19:00,029] {scheduler_job.py:1066} INFO - Setting the following tasks to queued state:
<TaskInstance: other_dag.wait-task 2020-12-03 14:10:00+00:00 [scheduled]>
[2020-12-03 14:19:00,033] {scheduler_job.py:1108} INFO - Sending TaskInstanceKey(dag_id='other_dag', task_id='wait-task', execution_date=datetime.datetime(2020, 12, 3, 14, 10, tzinfo=Timezone('UTC')), try_number=1) to executor with priority 1 and queue default
[2020-12-03 14:19:00,033] {base_executor.py:82} ERROR - could not queue task TaskInstanceKey(dag_id='other_dag', task_id='wait-task', execution_date=datetime.datetime(2020, 12, 3, 14, 10, tzinfo=Timezone('UTC')), try_number=1)
[2020-12-03 14:19:00,038] {scheduler_job.py:1199} INFO - Executor reports execution of other_dag.wait-task execution_date=2020-12-03 14:10:00+00:00 exited with status success for try_number 1
[2020-12-03 14:19:00,045] {scheduler_job.py:1235} ERROR - Executor reports task instance <TaskInstance: other_dag.wait-task 2020-12-03 14:10:00+00:00 [queued]> finished (success) although the task says its queued. (Info: None) Was the task killed externally?
[2020-12-03 14:19:01,173] {dagrun.py:429} ERROR - Marking run <DagRun other_dag @ 2020-12-03 14:10:00+00:00: scheduled__2020-12-03T14:10:00+00:00, externally triggered: False> failed
```
**Anything else we need to know**:
Discovered when working on #10790
Thanks to @nathadfield for helping discover this issue!
| https://github.com/apache/airflow/issues/12783 | https://github.com/apache/airflow/pull/12858 | 75d8ff96b4e7736b177c3bb8e949653d6a501736 | c045ff335eecb5c72aeab9e7f01973c18f678ff7 | 2020-12-03T13:52:28Z | python | 2020-12-06T21:55:53Z |
closed | apache/airflow | https://github.com/apache/airflow | 12,780 | ["PULL_REQUEST_WORKFLOW.rst", "scripts/ci/selective_ci_checks.sh"] | K8S were not run on cli change | In https://github.com/apache/airflow/pull/12725 selective checks did not run K8S tests. | https://github.com/apache/airflow/issues/12780 | https://github.com/apache/airflow/pull/13305 | e9d65bd4582b083914f2fc1213bea44cf41d1a08 | e2bfac9fc874a6dd1eb52a067313f43ec94307e3 | 2020-12-03T09:36:39Z | python | 2020-12-24T14:48:57Z |
closed | apache/airflow | https://github.com/apache/airflow | 12,776 | ["airflow/migrations/versions/4addfa1236f1_add_fractional_seconds_to_mysql_tables.py", "airflow/migrations/versions/d2ae31099d61_increase_text_size_for_mysql.py", "airflow/migrations/versions/e959f08ac86c_change_field_in_dagcode_to_mediumtext_.py", "airflow/models/dagcode.py"] | Update source_code field of dag_code table to MEDIUMTEXT |
**Description**
Update source_code field of dag_code table to MEDIUMTEXT
**Use case / motivation**
A lot of DAGs exceed the 65K character limit, which results in the error `"Data too long for column 'source_code' at row 1"` when the webserver is configured to fetch DAG code from the database.
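For illustration, a minimal Alembic migration sketch of the proposed change (the exact column options are assumptions, not the merged migration):
```python
# Widen dag_code.source_code to MEDIUMTEXT on MySQL so large DAG files fit.
from alembic import op
import sqlalchemy as sa
from sqlalchemy.dialects import mysql


def upgrade():
    conn = op.get_bind()
    if conn.dialect.name == "mysql":
        op.alter_column(
            table_name="dag_code",
            column_name="source_code",
            existing_type=sa.Text(),
            type_=mysql.MEDIUMTEXT(),
            existing_nullable=False,
        )
```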
| https://github.com/apache/airflow/issues/12776 | https://github.com/apache/airflow/pull/12890 | b11551278a703e2e742969ac554908f16f235809 | f66a46db88da86b4a11c5ee142c09a5001c32c41 | 2020-12-03T08:42:01Z | python | 2020-12-07T22:17:22Z |
closed | apache/airflow | https://github.com/apache/airflow | 12,774 | ["CONTRIBUTING.rst", "docs/apache-airflow/cli-and-env-variables-ref.rst"] | Missing AIRFLOW__{SECTION}__{OPTION}__SECRET in environment variable reference | Hello,
One env variable - `AIRFLOW__{SECTION}__{OPTION}__SECRET` - has not been added to our [environment variables reference](https://github.com/apache/airflow/blob/master/docs/apache-airflow/cli-and-env-variables-ref.rst). For more info, see: http://apache-airflow-docs.s3-website.eu-central-1.amazonaws.com/docs/apache-airflow/latest/howto/set-config.html
Related:
https://github.com/apache/airflow/issues/12773
https://github.com/apache/airflow/issues/12772
| https://github.com/apache/airflow/issues/12774 | https://github.com/apache/airflow/pull/12797 | 4da94b5a19eb547e86cebf074078ba6f03a51db1 | 292118e33971dfd68cb32a404a85c0d46d225b40 | 2020-12-03T08:21:50Z | python | 2020-12-04T00:58:17Z |
closed | apache/airflow | https://github.com/apache/airflow | 12,773 | ["airflow/config_templates/config.yml", "docs/apache-airflow/configurations-ref.rst"] | Incomplete list of environment variables that override configuration | Hello,
In our configuration reference docs, we provide information about the environment variables that affect the options.
<img width="430" alt="Screenshot 2020-12-03 at 09 14 33" src="https://user-images.githubusercontent.com/12058428/100982181-07389c00-3548-11eb-9089-fe00c4b9367f.png">
Unfortunately, this list is not complete. Some configuration options can also be set using the `AIRFLOW__{SECTION}__{OPTION}__SECRET` or `AIRFLOW__{SECTION}__{OPTION}__CMD` env variables. See: http://apache-airflow-docs.s3-website.eu-central-1.amazonaws.com/docs/apache-airflow/latest/howto/set-config.html
| https://github.com/apache/airflow/issues/12773 | https://github.com/apache/airflow/pull/12820 | e82cf0d01d6c1e1ec65d8e1b70d65158947fccd2 | c85f49454de63f5857bf477a240229a71f0e78ff | 2020-12-03T08:17:58Z | python | 2020-12-05T06:00:18Z |
closed | apache/airflow | https://github.com/apache/airflow | 12,772 | ["airflow/configuration.py", "docs/apache-airflow/configurations-ref.rst", "docs/conf.py", "tests/core/test_configuration.py"] | Missing docs for deprecated configuration options | Hello,
In Airflow 2, we've moved some configuration options to the new section. We also changed the names of some of the configuration options. This is confusing for users who are familiar with the old option and section names. It would be great if we could add information to the documentation that points to the new name of the options.
https://github.com/apache/airflow/blob/8f48f12128e0d985c6de2603902524859fecbca8/airflow/configuration.py#L139-L169
> 'The {old} option in [{section}] has been renamed to {new}
> 'The {old_key} option in [{old_section}] has been moved to the {new_key} option in '
> '[{new_section}]
Best regards,
Kamil Breguła
| https://github.com/apache/airflow/issues/12772 | https://github.com/apache/airflow/pull/13883 | 810c15ed85d7bcde8d5b8bc44e1cbd4859e29d2e | 65e49fc56f32b3e815fdf4a17be6b4e1c1e43c11 | 2020-12-03T08:11:14Z | python | 2021-01-27T12:06:35Z |
closed | apache/airflow | https://github.com/apache/airflow | 12,769 | ["docs/apache-airflow/upgrading-to-2.rst"] | Documentation needed for DB upgrade as part of 2.0 | Following up on the dev call on 30th of November, there was a clear desire expressed for documentation around the database upgrade process from Airflow 1.10.14 (or equivalent) to Airflow 2.0. Though the upgrade process is conceptually no different from a regular 1.10.x to a 1.10.x+1 release, the fact that there are significant known database changes may raise concerns in the minds of Airflow users as part of the upgrade.
To ease their concerns, the following questions should ideally be answered as part of the documentation specifically either as part of the "Upgrading to 2.0 document" or linked from there.
Q 1. Is there anything "special" which I need to be done to upgrade from 1.10.x to 2.0 with respect to the database?
Ans. I don't believe so, other than the normal upgrade checks.
Q 2. How long should I expect this database upgrade expected to take?
Ans. I am not quite sure how to answer this since it depends on the data. We can possibly share sample times based on tested data sets.
Q 3. Can I do something to reduce the database upgrade time?
Ans. A couple of options here. One possibility is to recommend the maintenance DAGs to be run to archive / delete older task history, xcom data, and equivalent. Another possibility is to provide a script for them to run as part of the Airflow project distribution, possibly part of upgrade check scripts.
| https://github.com/apache/airflow/issues/12769 | https://github.com/apache/airflow/pull/13005 | 3fbc8e650dcd398bc2844b7b3d92748423c7611a | 0ffd5fa3d87e78126807e6cdb4b1b29154047153 | 2020-12-03T01:57:37Z | python | 2020-12-11T16:20:35Z |
closed | apache/airflow | https://github.com/apache/airflow | 12,757 | ["airflow/models/baseoperator.py", "tests/utils/test_task_group.py"] | Graph View is empty when Operator has multiline string in args (v2.0) | Airflow v2.0b3
Kubernetes v1.19.3
Discovered issue while testing KubernetesPodOperator (haven't tested with other operator).
If I create a multiline string using """ """", add some variables inside (Jinja templating), then use this string as an argument to KubernetesPodOperator:
- In Graph View DAG is not visible (just gray area where it should be a digraph);
- in browser's web console i see the following error:
`Uncaught TypeError: node is undefined
preProcessGraph http://localhost:8080/static/dist/dagre-d3.min.js:103
preProcessGraph http://localhost:8080/static/dist/dagre-d3.min.js:103
fn http://localhost:8080/static/dist/dagre-d3.min.js:103
call http://localhost:8080/static/dist/d3.min.js:3
draw http://localhost:8080/graph?dag_id=mydag&execution_date=mydate
expand_group http://localhost:8080/graph?dag_id=mydag&execution_date=mydate
<anonymous> http://localhost:8080/graph?dag_id=mydag&execution_date=mydate`
Tree view works without issues in this case. The DAG succeeds. | https://github.com/apache/airflow/issues/12757 | https://github.com/apache/airflow/pull/12829 | cd66450b4ee2a219ddc847970255e420ed679700 | 12ce5be77f64c335dce12c3586d2dc7b63491d34 | 2020-12-02T14:59:06Z | python | 2020-12-05T11:52:55Z |
closed | apache/airflow | https://github.com/apache/airflow | 12,751 | ["chart/templates/flower/flower-service.yaml", "chart/templates/webserver/webserver-service.yaml", "chart/tests/test_flower.py", "chart/tests/test_webserver.py", "chart/values.schema.json", "chart/values.yaml"] | Helm Chart: Provide option to specify loadBalancerIP in webserver service |
**Description**
The current service type for `webserver` defaults to `ClusterIP`.
I am able to change it to the `LoadBalancer` type, but I was not able to specify a static IP.
So every time we reinstall the chart, the IP assigned to the provisioned load balancer changes.
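A sketch of what this could look like in `values.yaml` (the option name `webserver.service.loadBalancerIP` and the IP below are assumptions for illustration):
```yaml
webserver:
  service:
    type: LoadBalancer
    loadBalancerIP: 35.200.10.10   # pre-reserved static IP (example value)
```
The chart would then only need to pass this value through to `spec.loadBalancerIP` of the webserver Service when the type is `LoadBalancer`.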
| https://github.com/apache/airflow/issues/12751 | https://github.com/apache/airflow/pull/15972 | bb43e06c75dd6cafc094813347f7a7b13cb9374e | 9875f640ca19dabd846c17f4278ccc90e189ae8d | 2020-12-02T04:19:48Z | python | 2021-05-21T23:06:09Z |
closed | apache/airflow | https://github.com/apache/airflow | 12,748 | ["codecov.yml"] | Code Coverage is Broken | https://codecov.io/github/apache/airflow?branch=master CodeCov code-coverage is broken on Master. It wasn't great, but it was still useful to check which sections lacked tests.
cc @potiuk | https://github.com/apache/airflow/issues/12748 | https://github.com/apache/airflow/pull/13092 | 0eb210df3e10b478a567291355bc269150c93ae5 | ae98c074032861b07d6945a8f6f493b319dcc374 | 2020-12-01T23:46:36Z | python | 2020-12-15T21:33:39Z |
closed | apache/airflow | https://github.com/apache/airflow | 12,744 | ["setup.cfg", "setup.py"] | Difference of extras Airflow 2.0 vs. Airflow 1.10 | **Description**
When airflow 2.0 is installed from PyPI, providers are not installed by default. In order to install them, you should add an appropriate extra. While this behavior is identical in Airflow 1.10 for those "providers" that required additional packages, there were a few "providers" that did not require any extras to function (for example http, ftp) - we have "http" and "ftp" extras for them now, but maybe some of those are popular enough to be included by default?
We have to make a decision now:
- [x] should all of them (or some of them) be included by default when you install Airflow?
- [x] if we decide to exclude only some (or none), we should add them in UPGRADING_to_2_0 and in UPDATING documentation.
**Use case / motivation**
We want people to get a familiar experience when installing airflow. While we provide a familiar mechanism (extras), people will expect a slightly different configuration and installation, and we can describe the differences - but maybe some of those providers are so popular that we should include them by default?
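To make the difference concrete, an illustrative install command for 2.0 (the version pin is just an example):
```bash
# In 1.10 the http/ftp operators worked out of the box; in 2.0 the equivalent
# providers have to be requested explicitly via extras:
pip install "apache-airflow[http,ftp]==2.0.0"
```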
**Related Issues**
#12685 - where we discuss which of the extras should be included in the Production Image of 2.0.
**Additional info**
Here is the list of all "providers" that were present in 1.10 and had no additional dependencies - so basically they woudl work out-fhe-box in 1.10, but they need appropriate "extra" in 2.0.
* "apache.pig": [],
* "apache.sqoop": [],
* "dingding": [],
* "discord": [],
* "ftp": [],
* "http": [],
* "imap": [],
* "openfaas": [],
* "opsgenie": [],
* "sqlite": [],
Also here I appeal to the wisdom of crowd: @ashb, @dimberman @kaxil, @turbaszek, @mik-laj. @XD-DENG, @feluelle, @eladkal, @ryw, @vikramkoka, @KevinYang21 - let me know WDYT before I bring it to devlist? | https://github.com/apache/airflow/issues/12744 | https://github.com/apache/airflow/pull/12916 | 9b39f24780e85f859236672e9060b2fbeee81b36 | e7c1771cba16e3f554d6de5f77c97e49b16f7bed | 2020-12-01T18:44:37Z | python | 2020-12-08T15:22:47Z |
closed | apache/airflow | https://github.com/apache/airflow | 12,736 | ["airflow/operators/generic_transfer.py", "tests/operators/test_generic_transfer.py"] | property "replace" in GenericTransfer.insert_rows is not accessible | I want to set replace=True during insert, but existing param "replace" cannot be set.
Suggest something like this in `generic_transfer.py`:
`destination_hook.insert_rows(table=self.destination_table, rows=results, replace=self.replace)`
**Apache Airflow version**: 1.10.13
| https://github.com/apache/airflow/issues/12736 | https://github.com/apache/airflow/pull/15825 | fea29112bef4ad8787ae1482d829046bbba39f7e | 636625fdb99e6b7beb1375c5df52b06c09e6bafb | 2020-12-01T14:06:41Z | python | 2021-07-19T19:04:32Z |
closed | apache/airflow | https://github.com/apache/airflow | 12,726 | ["docs/apache-airflow/tutorial_taskflow_api.rst"] | Add classic operator in TaskFlow API tutorial |
**Description**
The TaskFlow API tutorial should add an example of using a classic operator (for example, EmailOperator) so that users know that it can be leveraged.
Alternatively, it should add references to how to add dependencies (implicit or explicit) to classic operators.
**Use case / motivation**
It's not super clear how the TaskFlow API can be used with existing operators (e.g. PostgresOperator, EmailOperator...). Adding an example will help users get a picture of what can be done with this.
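A minimal sketch of the kind of example that could go into the tutorial (DAG and task names are made up, and the `>>` on the TaskFlow call's return value is assumed to set the dependency just like it does for regular operators):
```python
from airflow.decorators import dag, task
from airflow.operators.bash import BashOperator
from airflow.utils.dates import days_ago


@dag(schedule_interval=None, start_date=days_ago(1), catchup=False)
def taskflow_with_classic_operator():
    @task
    def extract():
        return "some value"

    # A classic operator mixed into a TaskFlow DAG, with an explicit dependency.
    notify = BashOperator(task_id="notify", bash_command="echo done")
    extract() >> notify


tutorial_dag = taskflow_with_classic_operator()
```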
| https://github.com/apache/airflow/issues/12726 | https://github.com/apache/airflow/pull/19214 | 2fdcb8a89cd1aaf1a90657385a257e58926c21a9 | 2dfe85dcb4923f1c4cce8b1570561f11cf07c186 | 2020-12-01T00:27:48Z | python | 2021-10-29T16:44:50Z |
closed | apache/airflow | https://github.com/apache/airflow | 12,722 | ["airflow/hooks/dbapi_hook.py", "tests/hooks/test_dbapi_hook.py"] | No Support for special Characters in Passwords for get_uri() method in DbApiHook | Hello,
I have recently noticed that a lot of connections with certain special characters don't work with SqlAlchemy as it requires the passwords to be [urlencoded when they contain special characters](https://stackoverflow.com/questions/1423804/writing-a-connection-string-when-password-contains-special-characters).
Would there be any impact to changing the code at line 81 to urlencode the password like `urllib.parse.quote_plus(conn.password)` to prevent login failures for special characters?
https://github.com/apache/airflow/blob/dee304b222d355b03794aa063f39e3ee13997730/airflow/hooks/dbapi_hook.py#L72-L88
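For illustration, a self-contained sketch of the quoting idea (a hypothetical helper, not the hook's actual code):
```python
from urllib.parse import quote_plus


def build_uri(conn_type, login, password, host, port=None, schema=None):
    # URL-encode credentials so special characters survive SQLAlchemy URI parsing.
    auth = f"{quote_plus(login)}:{quote_plus(password)}@" if login else ""
    netloc = f"{host}:{port}" if port is not None else host
    return f"{conn_type}://{auth}{netloc}/{schema or ''}"


print(build_uri("postgresql", "airflow", "p@ss/word!", "localhost", 5432, "airflow"))
# -> postgresql://airflow:p%40ss%2Fword%21@localhost:5432/airflow
```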
I initially caught this issue while using the OdbcHook method found here. https://github.com/apache/airflow/blob/dee304b222d355b03794aa063f39e3ee13997730/airflow/providers/odbc/hooks/odbc.py#L198-L204
Happy to create a feature request, I just want to confirm that this issue makes sense.
| https://github.com/apache/airflow/issues/12722 | https://github.com/apache/airflow/pull/12775 | 4fb312140fc15b46fa96e98ec0e3939d81109eb6 | 01707d71d9d184d4c5b9602c93c2e46c9010d711 | 2020-11-30T20:23:33Z | python | 2020-12-07T16:51:09Z |
closed | apache/airflow | https://github.com/apache/airflow | 12,716 | ["chart/templates/dags-persistent-volume-claim.yaml", "chart/templates/scheduler/scheduler-deployment.yaml", "chart/values.yaml"] | Scheduler in helm chart does not work with persistent + gitsync | **Apache Airflow version**:
2.0.0dev
Due to this:
https://github.com/apache/airflow/blob/5e13c372860a28256bf6e572bf7349f3dd6b8b0c/chart/templates/scheduler/scheduler-deployment.yaml#L156-L164
Doing this:
https://github.com/apache/airflow/tree/master/chart#mounting-dags-using-git-sync-side-car-with-persistence-enabled
```
helm upgrade airflow . \
--set dags.persistence.enabled=false \
--set dags.gitSync.enabled=true
```
will fail with:
```
Error: Deployment.apps "airflow4-scheduler" is invalid: spec.template.spec.containers[0].volumeMounts[4].mountPath: Invalid value: "/opt/airflow/dags": must be unique
```
The reason is that if both
```
--set dags.persistence.enabled=false
--set dags.gitSync.enabled=true
```
are specified then the volume mount is specified two times in scheduler definition. | https://github.com/apache/airflow/issues/12716 | https://github.com/apache/airflow/pull/12717 | 8c5594b02ffbfc631ebc2366dbde6d8c4e56d550 | e7a2e3544f216c6fba8ea4b344ecb6c89158c032 | 2020-11-30T11:21:00Z | python | 2021-02-08T16:39:56Z |
closed | apache/airflow | https://github.com/apache/airflow | 12,692 | ["airflow/plugins_manager.py", "airflow/providers_manager.py", "setup.cfg", "tests/plugins/test_plugins_manager.py", "tests/www/test_views.py"] | Provider discovery based on entry_points is rather brittle | The tests we run in CI had shown that provider discovery based on entry_points is rather brittle.
Example here:
https://github.com/apache/airflow/pull/12466/checks?check_run_id=1467792592#step:9:4452
This is not a problem with Airflow, but with PIP, which might silently upgrade some packages and cause a "version conflict" totally independently of the Airflow configuration and totally out of our control.
Simply installing a whl package on top of an existing airflow installation (as happened in the case above) might cause inconsistent requirements: in the case above, installing .whl packages with all providers on top of an existing Airflow installation caused the requests package to be upgraded to 2.25.0, even though airflow has the right requirement set (it is correct and comes from the "install_requires" section of airflow's setup.cfg):
```
Requirement.parse('requests<2.24.0,>=2.20.0'), {'apache-airflow'}
```
In case you have a version conflict in your env, running entry_point.load() from a package that has this version conflict results in a `pkg_resources.VersionConflict` (or `pkg_resources.ContextualVersionConflict`) error rather than returning the entry_point. Or at least that's what I observed so far. It's rather easy to reproduce: simply install requests > 2.24.0 in the current airflow and see what happens.
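For reference, catching the exception (rather than resolving the conflict) is straightforward - a rough sketch with an assumed entry point group name; note that this only skips the conflicting provider, it does not make `load()` succeed:
```python
import logging

import pkg_resources

log = logging.getLogger(__name__)


def load_provider_entry_points(group="apache_airflow_provider"):
    provider_infos = []
    for entry_point in pkg_resources.iter_entry_points(group):
        try:
            # load() resolves the distribution's requirements and may raise
            # VersionConflict if the environment is inconsistent.
            provider_infos.append(entry_point.load()())
        except pkg_resources.VersionConflict as e:
            log.warning("Skipping %s due to a version conflict: %s", entry_point, e)
    return provider_infos
```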
So far I could not find a way to mitigate this problem, but @ashb - since you have more experience with it, maybe you can find a workaround for this?
I think we have a few options:
1) We fail 'airflow' hard if there is any Version Conflict. We have a way now after I've implemented #10854 (and after @ephraimbuddy finishes #12188) - we have a good, maintainable list of non-conflicting dependencies for Airflow and its providers, and we can keep it that way in the future thanks to pip-check. But I am afraid that will give a hard time to people who would like to install airflow with some custom dependencies (Tensorflow, for example, is notoriously difficult to sync with Airflow when it comes to dependency versions). However, this is the most "Proper" (TM) solution.
2) We find a workaround for the entry_point.load() VersionConflict exception. However, I think that might not be possible or easy looking for example at this SO thread: https://stackoverflow.com/questions/52982603/python-entry-point-fails-for-dependency-conflict . The most upvoted (=1) answer there starts with "Welcome to the world of dependencies hell! I know no clean way to solve this" - which is not very encouraging. I tried also to find it out from docs and code of the entry_point.load() but to no avail. @ashb - maybe you can help here.
3) We go back to the original implementation of mine where I read provider info from provider.yaml embedded into the package. This has disadvantage of being non-standard, but it works independently of version conflicts.
WDYT?
| https://github.com/apache/airflow/issues/12692 | https://github.com/apache/airflow/pull/12694 | 850b74befe5e1827c84d02dd2c7c5e6aded3f841 | 7ef9aa7d545f11442b6ebb86590cd8ce5f98430b | 2020-11-28T18:11:18Z | python | 2020-11-29T06:19:47Z |
closed | apache/airflow | https://github.com/apache/airflow | 12,691 | ["airflow/www/templates/airflow/dag_details.html"] | add dagrun_timeout to the DAG Details screen in the UI | In the Details page the is no indication of the DAG `dagrun_timeout` | https://github.com/apache/airflow/issues/12691 | https://github.com/apache/airflow/pull/14165 | 92f81da91cc337e18e5aa77d445d0a8ab7d32600 | 61b613359e2394869070b3ad94f64dfda3efac74 | 2020-11-28T18:02:57Z | python | 2021-02-10T20:25:04Z |
closed | apache/airflow | https://github.com/apache/airflow | 12,666 | ["airflow/migrations/versions/33ae817a1ff4_add_kubernetes_resource_checkpointing.py", "airflow/migrations/versions/bef4f3d11e8b_drop_kuberesourceversion_and_.py"] | Error: table kube_resource_version already exists when calling reset_db() with SQLite backend | Issue encountered during alembic migration when running Airflow 2.0.0b3 locally using a sqlite backend and calling airflow.utils.db.resetdb():
```
sqlalchemy.exc.OperationalError: (sqlite3.OperationalError) table kube_resource_version already exists
[SQL:
CREATE TABLE kube_resource_version (
    one_row_id BOOLEAN DEFAULT (1) NOT NULL,
    resource_version VARCHAR(255),
    PRIMARY KEY (one_row_id),
    CONSTRAINT kube_resource_version_one_row_id CHECK (one_row_id),
    CHECK (one_row_id IN (0, 1))
)
]
```
| https://github.com/apache/airflow/issues/12666 | https://github.com/apache/airflow/pull/12670 | fa8af2d16551e287673d94a40cfb41e49d685412 | 704e724cc127c9ca6c9f0f51997c9d057b697aec | 2020-11-27T19:22:55Z | python | 2020-11-27T22:49:31Z |
closed | apache/airflow | https://github.com/apache/airflow | 12,665 | ["scripts/in_container/run_generate_constraints.sh"] | Constraints behaviour changes in new PIP | We have this warning when running the latest PIP, so we have to take a close look what it means to us:
```
pip install 'https://github.com/apache/airflow/archive/master.tar.gz#egg=apache-airflow[devel_ci]'
--constraint https://raw.githubusercontent.com/apache/airflow/constraints-master/constraints-3.6.txt
DEPRECATION: Constraints are only allowed to take the form of a package
name and a version specifier. Other forms were originally permitted
as an accident of the implementation, but were undocumented. The new implementation
of the resolver no longer supports these forms. A possible replacement is replacing
the constraint with a requirement.. You can find discussion regarding
this at https://github.com/pypa/pip/issues/8210.
```
| https://github.com/apache/airflow/issues/12665 | https://github.com/apache/airflow/pull/12671 | 944bd4c658e9793c43c068e5359f816ded4f0b40 | 3b138d2d60d86ca0a80e9c27afd3421f45df178e | 2020-11-27T19:18:14Z | python | 2020-11-28T05:04:45Z |
closed | apache/airflow | https://github.com/apache/airflow | 12,642 | ["airflow/settings.py"] | typo: force_logout_after should be force_log_out_after |
**Apache Airflow version**:
1.10.13
**What happened**:
`force_logout_after` should be `force_log_out_after` in the code section https://github.com/apache/airflow/blob/master/airflow/settings.py#L372-L381, as `force_log_out_after` is the name actually used and documented in https://github.com/apache/airflow/blob/c5700a56bb3b9a5b872bda0fe0d3de82b0128bdf/UPDATING.md#unify-user-session-lifetime-configuration.
**What you expected to happen**:
`force_logout_after` is renamed to `force_log_out_after`. | https://github.com/apache/airflow/issues/12642 | https://github.com/apache/airflow/pull/12661 | 456a1c5dc920a2369e726d2b89819bd5a3ce613e | 531e00660af0cc7729792ef08559edd84c6c46ab | 2020-11-26T14:29:56Z | python | 2020-11-27T17:36:10Z |
closed | apache/airflow | https://github.com/apache/airflow | 12,632 | ["chart/templates/webserver/webserver-deployment.yaml", "chart/tests/test_webserver_deployment.py"] | helm chart: webserver replies with 404 "Apache Airflow is not at this location" when using config.webserver.base_url | I tried to deploy with helm values:
ingress.web.path: /airflow/*
config.webserver.base_url: https://myhostname/airflow
It doesn't work because then
* the livenessProbe uses `/health` instead of `/airflow/health`
* I don't think the livenessProbe sends the appropriate `Host` header, so even if it requested `/airflow/health` it will return 404 because airflow webserver thinks the requested url is `http://localhost:8080/airflow/health` instead of `http://myhostname/airflow/health`
If I open a shell to the running pod for the `webserver` container with
kubectl -n airflow-test exec -it airflow-test-webserver-569f8bb5f7-gw9rj -c webserver -- /bin/bash
and perform the query with
curl -v --header "Host: myhostname" --header "X-Forwarded-Host: myhostnamme" http://localhost:8080/airflow/login/ # this works
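For illustration, a probe that matches this setup would need to look something like the following (example values based on the behaviour described above, not the chart's actual template):
```yaml
livenessProbe:
  httpGet:
    path: /airflow/health          # includes the base_url prefix
    port: 8080
    httpHeaders:
      - name: Host
        value: myhostname          # so the webserver resolves the request under base_url
  initialDelaySeconds: 15
  periodSeconds: 10
```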
**Apache Airflow version**: 1.10.12
**Kubernetes version (if you are using kubernetes)** (use `kubectl version`): EKS 1.18
**Environment**:
- **Cloud provider or hardware configuration**:
- **OS** (e.g. from /etc/os-release):
- **Kernel** (e.g. `uname -a`):
- **Install tools**:
- **Others**:
| https://github.com/apache/airflow/issues/12632 | https://github.com/apache/airflow/pull/12634 | 8ecdef3e50d3b83901d70a13794ae6afabc4964e | 75fd5f8254a6ecf616475a485f6da76240a34776 | 2020-11-25T22:08:17Z | python | 2021-01-12T12:03:39Z |
closed | apache/airflow | https://github.com/apache/airflow | 12,616 | ["airflow/api_connexion/schemas/dag_run_schema.py", "tests/api_connexion/schemas/test_dag_run_schema.py"] | Triggering a DAGRUN with invalid execution_date causes a ParserError in the REST API | Triggering a dagrun with an invalid execution date causes a ParserError in the REST API
**Apache Airflow version**: 2.0
**What happened**:
When you trigger a dagrun with an invalid execution date, it returns an HTML response showing the error
**What you expected to happen**:
I expected a JSON response showing that the execution date is invalid
**How to reproduce it**:
1. Start airflow webserver and scheduler
2. Make a POST request to this endpoint `http://localhost:28080/api/v1/dags/example_bash_operator/dagRuns` with this request body `{"execution_date": "mydate"}` (see the curl sketch below)
3. This will return an HTML page instead of JSON
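A reproduction sketch of step 2 (assumes basic auth is enabled; adjust credentials and port to your setup):
```bash
curl -X POST "http://localhost:28080/api/v1/dags/example_bash_operator/dagRuns" \
  -H "Content-Type: application/json" \
  --user "admin:admin" \
  -d '{"execution_date": "mydate"}'
```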
| https://github.com/apache/airflow/issues/12616 | https://github.com/apache/airflow/pull/12618 | 56f82ba22519b0cf2cb0a1f7c4d083db7f2e3358 | b62abfbfae5ae84f62522ce5db5852598caf9eb8 | 2020-11-25T12:59:16Z | python | 2020-12-03T16:04:14Z |
closed | apache/airflow | https://github.com/apache/airflow | 12,588 | ["airflow/migrations/versions/61ec73d9401f_add_description_field_to_connection.py", "airflow/migrations/versions/64a7d6477aae_fix_description_field_in_connection_to_.py", "airflow/models/connection.py"] | Connection description migration breaks on MySQL 8 | The migration added in #10873 doesn't work on Mysql8 -- it is too long for a text column with utf8 collation.
```
Running upgrade 2c6edca13270 -> 61ec73d9401f, Add description field to connection
Traceback (most recent call last):
File "/usr/local/lib/python3.6/site-packages/sqlalchemy/engine/base.py", line 1277, in _execute_context
cursor, statement, parameters, context
File "/usr/local/lib/python3.6/site-packages/sqlalchemy/engine/default.py", line 593, in do_execute
cursor.execute(statement, parameters)
File "/usr/local/lib/python3.6/site-packages/MySQLdb/cursors.py", line 255, in execute
self.errorhandler(self, exc, value)
File "/usr/local/lib/python3.6/site-packages/MySQLdb/connections.py", line 50, in defaulterrorhandler
raise errorvalue
File "/usr/local/lib/python3.6/site-packages/MySQLdb/cursors.py", line 252, in execute
res = self._query(query)
File "/usr/local/lib/python3.6/site-packages/MySQLdb/cursors.py", line 378, in _query
db.query(q)
File "/usr/local/lib/python3.6/site-packages/MySQLdb/connections.py", line 280, in query
_mysql.connection.query(self, query)
_mysql_exceptions.OperationalError: (1118, 'Row size too large. The maximum row size for the used table type, not counting BLOBs, is 65535. This includes storage overhead, check the manual. You have to change some columns to TEXT or BLOBs')
```
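For context, a sketch of the kind of change that avoids this limit (illustrative SQL, not the actual fix): on MySQL the 65535-byte row-size limit counts the full declared width of `VARCHAR` columns (4 bytes per character under utf8mb4), while `TEXT`/`BLOB` columns are stored off-row and only contribute a small pointer, so making the new column `TEXT` keeps the row small:
```sql
ALTER TABLE connection MODIFY description TEXT;
```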
**Apache Airflow version**:
**Kubernetes version (if you are using kubernetes)** (use `kubectl version`):
**Environment**:
- **Cloud provider or hardware configuration**:
- **OS** (e.g. from /etc/os-release):
- **Kernel** (e.g. `uname -a`):
- **Install tools**:
- **Others**:
| https://github.com/apache/airflow/issues/12588 | https://github.com/apache/airflow/pull/12596 | 950d80bd98aef63905db9b01c7b8658d06c6f858 | cdaaff12c7c80311eba22dcb856fe9c24d7f49aa | 2020-11-24T14:30:03Z | python | 2020-11-25T13:30:31Z |
closed | apache/airflow | https://github.com/apache/airflow | 12,587 | ["airflow/operators/dagrun_operator.py", "tests/operators/test_dagrun_operator.py"] | dagrun object doesn't exist in the TriggerDagRunOperator |
**Apache Airflow version**: 2.0.0b3
**Kubernetes version (if you are using kubernetes)** (use `kubectl version`):
**Environment**: MacOS with Docker
- **Cloud provider or hardware configuration**:
- **OS** (e.g. from /etc/os-release):
- **Kernel** (e.g. `uname -a`):
- **Install tools**:
- **Others**:
**What happened**:
```
Traceback (most recent call last):
File "/usr/lib/python3.7/site-packages/airflow/models/taskinstance.py", line 984, in _run_raw_task
result = task_copy.execute(context=context)
File "/usr/local/airflow/plugins/custom_trigger_operator.py", line 138, in execute
dag_run.execution_date,
UnboundLocalError: local variable 'dag_run' referenced before assignment
```
**What you expected to happen**:
The dag_run object doesn't exist in https://github.com/apache/airflow/blob/94ba200d42403d1067a63e2e6b158d893c2e8b5a/airflow/operators/dagrun_operator.py#L161
as the DagRunAlreadyExists exception is raised
**How to reproduce it**:
```
first_task = DummyOperator(...)
trigger_task = TriggerDagRunOperator(...
reset_dag_run=True,
wait_for_completion=True,
trigger_dag_id='my_dag_to_trigger'
...)
```
Run the DAG above once. It success. Clear the task first_task to run a second time the DAG and so trigger again my_dag_to_trigger, you will get the error.
**Anything else we need to know**:
Nope
| https://github.com/apache/airflow/issues/12587 | https://github.com/apache/airflow/pull/12819 | c1cd50465c5473bc817fded5eeb4c425a0529ae5 | e82cf0d01d6c1e1ec65d8e1b70d65158947fccd2 | 2020-11-24T13:54:22Z | python | 2020-12-05T04:10:45Z |
closed | apache/airflow | https://github.com/apache/airflow | 12,585 | ["airflow/models/taskinstance.py", "tests/cli/commands/test_task_command.py"] | airflow task test failing due to mini scheduler implementation not respecting test mode | **Apache Airflow version**: 2.0.0b3
**Environment**: Python3.7-slim running on docker
**What happened**:
When running `airflow tasks test <dag> <task> <date>` there is an error from the mini scheduler, which should not happen as we are testing the task and not running it via the scheduler.
**What you expected to happen**:
No error message should be displayed, as the task succeeds and it is running in test mode.
**How to reproduce it**:
just run `airflow tasks test <dag> <task> <date>` after a vanilla airflow installation using pip install.
**Anything else we need to know**:
this is the log
```
root@add8b3f038cf:/# airflow tasks test docker_test d 2020-11-24
[2020-11-24 08:13:00,796] {dagbag.py:440} INFO - Filling up the DagBag from /opt/airflow/dags/test
[2020-11-24 08:13:01,072] {taskinstance.py:827} INFO - Dependencies all met for <TaskInstance: docker_test.d 2020-11-24T00:00:00+00:00 [None]>
[2020-11-24 08:13:01,077] {taskinstance.py:827} INFO - Dependencies all met for <TaskInstance: docker_test.d 2020-11-24T00:00:00+00:00 [None]>
[2020-11-24 08:13:01,077] {taskinstance.py:1018} INFO -
--------------------------------------------------------------------------------
[2020-11-24 08:13:01,077] {taskinstance.py:1019} INFO - Starting attempt 1 of 4
[2020-11-24 08:13:01,077] {taskinstance.py:1020} INFO -
--------------------------------------------------------------------------------
[2020-11-24 08:13:01,078] {taskinstance.py:1039} INFO - Executing <Task(PythonOperator): d> on 2020-11-24T00:00:00+00:00
[2020-11-24 08:13:01,109] {taskinstance.py:1232} INFO - Exporting the following env vars:
[email protected]
AIRFLOW_CTX_DAG_OWNER=airflow
AIRFLOW_CTX_DAG_ID=docker_test
AIRFLOW_CTX_TASK_ID=d
AIRFLOW_CTX_EXECUTION_DATE=2020-11-24T00:00:00+00:00
d
[2020-11-24 08:13:01,110] {python.py:118} INFO - Done. Returned value was: None
[2020-11-24 08:13:01,115] {taskinstance.py:1143} INFO - Marking task as SUCCESS. dag_id=docker_test, task_id=d, execution_date=20201124T000000, start_date=20201124T081301, end_date=20201124T081301
Traceback (most recent call last):
File "/usr/local/bin/airflow", line 8, in <module>
sys.exit(main())
File "/usr/local/lib/python3.7/site-packages/airflow/__main__.py", line 40, in main
args.func(args)
File "/usr/local/lib/python3.7/site-packages/airflow/cli/cli_parser.py", line 50, in command
return func(*args, **kwargs)
File "/usr/local/lib/python3.7/site-packages/airflow/utils/cli.py", line 86, in wrapper
return f(*args, **kwargs)
File "/usr/local/lib/python3.7/site-packages/airflow/cli/commands/task_command.py", line 379, in task_test
ti.run(ignore_task_deps=True, ignore_ti_state=True, test_mode=True)
File "/usr/local/lib/python3.7/site-packages/airflow/utils/session.py", line 63, in wrapper
return func(*args, **kwargs)
File "/usr/local/lib/python3.7/site-packages/airflow/models/taskinstance.py", line 1350, in run
mark_success=mark_success, test_mode=test_mode, job_id=job_id, pool=pool, session=session
File "/usr/local/lib/python3.7/site-packages/airflow/utils/session.py", line 59, in wrapper
return func(*args, **kwargs)
File "/usr/local/lib/python3.7/site-packages/airflow/models/taskinstance.py", line 1152, in _run_raw_task
self._run_mini_scheduler_on_child_tasks(session)
File "/usr/local/lib/python3.7/site-packages/airflow/utils/session.py", line 59, in wrapper
return func(*args, **kwargs)
File "/usr/local/lib/python3.7/site-packages/airflow/models/taskinstance.py", line 1165, in _run_mini_scheduler_on_child_tasks
execution_date=self.execution_date,
File "/usr/local/lib/python3.7/site-packages/sqlalchemy/orm/query.py", line 3473, in one
raise orm_exc.NoResultFound("No row was found for one()")
sqlalchemy.orm.exc.NoResultFound: No row was found for one()
```
| https://github.com/apache/airflow/issues/12585 | https://github.com/apache/airflow/pull/12595 | 91af0ddf76855737a3c32456ce1cfeddde82b44e | 6caf2607e04f581abdcb38fdbc426e03d5307429 | 2020-11-24T09:33:29Z | python | 2020-11-24T21:43:22Z |
closed | apache/airflow | https://github.com/apache/airflow | 12,576 | ["airflow/providers/docker/operators/docker.py"] | DockerOperator causes log spam when pulling image | **Apache Airflow version**: 1.10.4
**Environment**:
- **Cloud provider or hardware configuration**: AWS EC2
- **OS** (e.g. from /etc/os-release): ubuntu
**What happened**:
I am using `DockerOperator` in Airflow set to pull the image before executing. The pull step generates a lot of log spam:

For example the above log has 1104 lines, of which 1045 are Docker pull output.
**What you expected to happen**:
When nothing goes wrong, either of these three options:
* No output from Docker pull
* A single line saying the pull succeeded
* Several lines, one for each parent image, saying that the pull succeeded
When the pull fails, I expect a single line explaining that the pull failed and how/why.
The Docker pull output is clearly meant for an interactive shell as it is trying to display a live progress report. I see that [in my version](https://github.com/apache/airflow/blob/1.10.4/airflow/operators/docker_operator.py#L210-L212) the output is simply copied to the log. The [more recent version](https://github.com/apache/airflow/blob/8c42cf1b00c90f0d7f11b8a3a455381de8e003c5/airflow/providers/docker/operators/docker.py#L287-L291) also has similar code so I'm assuming the latest Airflow has the same problem.
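A rough sketch of the kind of filtering that would reduce this to a handful of lines (a standalone example against the Docker SDK, not the operator's actual code):
```python
import logging

import docker

log = logging.getLogger(__name__)


def quiet_pull(image, docker_url="unix://var/run/docker.sock"):
    client = docker.APIClient(base_url=docker_url)
    seen = set()
    for line in client.pull(image, stream=True, decode=True):
        # Skip the per-layer progress updates; log each (layer, status) pair once.
        if "progress" in line:
            continue
        key = (line.get("id"), line.get("status"))
        if key not in seen:
            seen.add(key)
            log.info("%s %s", line.get("status", ""), line.get("id", ""))
```
Calling e.g. `quiet_pull("python:3.8-slim")` then logs one line per layer status change instead of every progress tick.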
**How to reproduce it**:
I think simply creating a DAG with a `DockerOperator` that has `force_pull=True` should reproduce the issue.
| https://github.com/apache/airflow/issues/12576 | https://github.com/apache/airflow/pull/12763 | 28e83c30eb6ccd8662cecaaf26590cc0063af65a | 6b339c70c45a2bad0e1e2c3f6638f4c59475569e | 2020-11-23T20:24:41Z | python | 2020-12-03T21:36:25Z |
closed | apache/airflow | https://github.com/apache/airflow | 12,554 | ["docs/index.rst"] | Remove or limit table of content at the main Airflow doc page | The table of contents here: http://apache-airflow-docs.s3-website.eu-central-1.amazonaws.com/docs/apache-airflow/latest/index.html on the main page of the "apache-airflow" documentation is huge and not very useful (especially since we also have it in the directory on the left).
We should remove it or limit it heavily (to the 1st level only).

| https://github.com/apache/airflow/issues/12554 | https://github.com/apache/airflow/pull/12561 | b57b9321133a28126e17d17885c80dc04a2e121e | 936566c586e6cbb155ffa541e89a31f7239f51bb | 2020-11-22T19:34:52Z | python | 2020-11-24T12:01:03Z |
closed | apache/airflow | https://github.com/apache/airflow | 12,550 | ["airflow/www/templates/airflow/trigger.html", "airflow/www/views.py", "tests/www/test_views.py"] | Ability to provide sample conf JSON for a dag in trigger page | **Description**
In the trigger page, there is a text area to enter an (optional) conf JSON. It would be great if a sample JSON could be provided programmatically while defining the DAG.
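A hypothetical sketch of what this could look like from the DAG author's side - one option would be to surface the DAG's `params` defaults as the pre-filled conf (reusing `params` for this is an assumption here):
```python
from airflow import DAG
from airflow.utils.dates import days_ago

dag = DAG(
    dag_id="trigger_with_sample_conf",
    start_date=days_ago(1),
    schedule_interval=None,
    params={"date_filter": "2020-11-22", "dry_run": True},  # shown as the sample conf in the UI
)
```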
**Use case / motivation**
This will improve the usability of the UI for triggering a DAG.
| https://github.com/apache/airflow/issues/12550 | https://github.com/apache/airflow/pull/13365 | b2cb6ee5ba895983e4e9d9327ff62a9262b765a2 | 0e510b2f2bb03ed9344df664b123920e70382fd1 | 2020-11-22T16:40:47Z | python | 2021-01-07T02:33:23Z |
closed | apache/airflow | https://github.com/apache/airflow | 12,537 | ["airflow/providers/docker/CHANGELOG.rst", "airflow/providers/docker/example_dags/example_docker_copy_data.py", "airflow/providers/docker/operators/docker.py", "airflow/providers/docker/operators/docker_swarm.py", "airflow/providers/docker/provider.yaml", "docs/conf.py", "docs/exts/docs_build/third_party_inventories.py", "tests/providers/docker/operators/test_docker.py", "tests/providers/docker/operators/test_docker_swarm.py"] | Mounting directories using docker operator on airflow is not working |
**Apache Airflow version**: apache-airflow==1.10.12
**Kubernetes version (if you are using kubernetes)** (use `kubectl version`): Does not apply
**Environment**:
- **Cloud provider or hardware configuration**:
- **OS** (e.g. from /etc/os-release): Ubuntu 18.04.5 LTS bionic
- **Kernel** (e.g. `uname -a`): Linux letyndr-letyndr 4.15.0-123-generic #126-Ubuntu SMP Wed Oct 21 09:40:11 UTC 2020 x86_64 x86_64 x86_64 GNU/Linux
- **Install tools**:
- **Others**:
**What happened**:
I'm trying to use the docker operator to automate the execution of some scripts using airflow.
What I want to do is to "copy" all my project's files (with folders and files) to the container using this code.
The following file, ml-intermediate.py, is located at ~/airflow/dags/ml-intermediate.py:
```
"""
Template to convert a Ploomber DAG to Airflow
"""
from airflow import DAG
from airflow.operators.bash_operator import BashOperator
from airflow.utils.dates import days_ago
from ploomber.spec import DAGSpec
from soopervisor.script.ScriptConfig import ScriptConfig
script_cfg = ScriptConfig.from_path('/home/letyndr/airflow/dags/ml-intermediate')
# Replace the project root to reflect the new location - or maybe just
# write a soopervisor.yaml, then we can we rid of this line
script_cfg.paths.project = '/home/letyndr/airflow/dags/ml-intermediate'
# TODO: use lazy_import from script_cfg
dag_ploomber = DAGSpec('/home/letyndr/airflow/dags/ml-intermediate/pipeline.yaml',
lazy_import=True).to_dag()
dag_ploomber.name = "ML Intermediate"
default_args = {
'start_date': days_ago(0),
}
dag_airflow = DAG(
dag_ploomber.name.replace(' ', '-'),
default_args=default_args,
description='Ploomber dag',
schedule_interval=None,
)
script_cfg.save_script()
from airflow.operators.docker_operator import DockerOperator
for task_name in dag_ploomber:
DockerOperator(task_id=task_name,
image="continuumio/miniconda3",
api_version="auto",
auto_remove=True,
# command="sh /home/letyndr/airflow/dags/ml-intermediate/script.sh",
command="sleep 600",
docker_url="unix://var/run/docker.sock",
volumes=[
"/home/letyndr/airflow/dags/ml-intermediate:/home/letyndr/airflow/dags/ml-intermediate:rw",
"/home/letyndr/airflow-data/ml-intermediate:/home/letyndr/airflow-data/ml-intermediate:rw"
],
working_dir=script_cfg.paths.project,
dag=dag_airflow,
container_name=task_name,
)
for task_name in dag_ploomber:
task_ploomber = dag_ploomber[task_name]
task_airflow = dag_airflow.get_task(task_name)
for upstream in task_ploomber.upstream:
task_airflow.set_upstream(dag_airflow.get_task(upstream))
dag = dag_airflow
```
When I execute this DAG using Airflow, I get an error saying that Docker does not find the `/home/letyndr/airflow/dags/ml-intermediate/script.sh` script. I changed the execution command of the docker operator to `sleep 600` so I could enter the container and check whether the files are present at the correct paths.
**What you expected to happen**: Basically to share the files of the host with the docker container to execute a shell script within the container.
When I'm in the container I can go to this path /home/letyndr/airflow/dags/ml-intermediate/ for example, but I don't see the files that are supposed to be there.
**What do you think went wrong?**
I tried to replicate how Airflow uses the Docker SDK for Python.
This is my own replication of the docker implementation:
```
import docker
client = docker.APIClient()
# binds = {
# "/home/letyndr/airflow/dags": {
# "bind": "/home/letyndr/airflow/dags",
# "mode": "rw"
# },
# "/home/letyndr/airflow-data/ml-intermediate": {
# "bind": "/home/letyndr/airflow-data/ml-intermediate",
# "mode": "rw"
# }
# }
binds = ["/home/letyndr/airflow/dags:/home/letyndr/airflow/dags:rw",
"/home/letyndr/airflow-data/ml-intermediate:/home/letyndr/airflow-data/ml-intermediate:rw"]
container = client.create_container(
image="continuumio/miniconda3",
command="sleep 600",
volumes=["/home/letyndr/airflow/dags", "/home/letyndr/airflow-data/ml-intermediate"],
host_config=client.create_host_config(binds=binds),
working_dir="/home/letyndr/airflow/dags",
name="simple_example",
)
client.start(container=container.get("Id"))
```
What I found was that mounting volumes only works if both `host_config` and `volumes` are set. The problem is that the implementation in Airflow only sets `host_config` but not `volumes`. I added the parameter to the `create_container` call and it worked.
**How to reproduce it**:
Mount a volume from a host and use the files inside the directory in the docker container.
| https://github.com/apache/airflow/issues/12537 | https://github.com/apache/airflow/pull/15843 | ac3454e4f169cdb0e756667575153aca8c1b6981 | 12995cfb9a90d1f93511a4a4ab692323e62cc318 | 2020-11-21T21:13:03Z | python | 2021-05-17T15:03:18Z |
closed | apache/airflow | https://github.com/apache/airflow | 12,509 | ["Dockerfile", "Dockerfile.ci", "IMAGES.rst", "scripts/ci/libraries/_build_images.sh"] | Add support for using Cloud Build in breeze build-image | **Description**
I would like to build airflow images using external services like Google Cloud Build. This way I don't have to run the build on my local machine or design custom CI pipelines to build it (especially for dev purposes).
**Use case / motivation**
Building the production image can take time, and running this on notebooks does not sound like an optimal way of doing it. We can take advantage of systems dedicated to this task, like Google Cloud Build.
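For illustration, a minimal `cloudbuild.yaml` sketch of what such a remote build could look like (the image name is an example, and this is not something Breeze provides today):
```yaml
steps:
  - name: gcr.io/cloud-builders/docker
    args: ["build", "-t", "gcr.io/$PROJECT_ID/airflow:dev", "-f", "Dockerfile", "."]
images:
  - "gcr.io/$PROJECT_ID/airflow:dev"
```
Triggering it would then be a single `gcloud builds submit --config cloudbuild.yaml .` instead of a local `docker build`.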
**Related Issues**
N/A
| https://github.com/apache/airflow/issues/12509 | https://github.com/apache/airflow/pull/12534 | 370e7d07d1ed1a53b73fe878425fdcd4c71a7ed1 | 37548f09acb91edd041565f52051f58610402cb3 | 2020-11-20T15:18:02Z | python | 2020-11-21T18:21:43Z |
closed | apache/airflow | https://github.com/apache/airflow | 12,489 | ["airflow/www/forms.py", "airflow/www/views.py", "airflow/www/widgets.py"] | Clicking Edit on TaskInstance Causes Crash | **Apache Airflow version**: 2.0.0b2
**Kubernetes version (if you are using kubernetes)** (use `kubectl version`): N/A
**Environment**:
- **Cloud provider or hardware configuration**: Ubuntu 20.04 Guest/Windows Server 2012 R2 Host
- **OS** (e.g. from /etc/os-release): Ubuntu 20.04.1 LTS
- **Kernel** (e.g. `uname -a`): Linux 5.4.0-1031-azure #32-Ubuntu SMPx86_64 GNU/Linux
- **Install tools**: Pip 20.3b1, Ansible Template - https://github.com/jrd281/ansible-role-airflow/tree/airflowdbg-1
- **Others**: Python 3.8
**What happened**: I was looking at the taskinstance list and then I clicked the edit icon. After clicking, an error page was displayed.
**What you expected to happen**: I expected to be whisked to an editing UI
**How to reproduce it**:
1. Visit your installation's home page
2. Click the success bucket under the `Recent Tasks` heading
<img width="261" alt="AirflowDebug#1" src="https://user-images.githubusercontent.com/848305/99696511-c5462b00-2a5c-11eb-92f1-088347b657a2.png">
3. Click the `Edit` button for one of your tasks
<img width="285" alt="AirflowDebug#2" src="https://user-images.githubusercontent.com/848305/99696788-1524f200-2a5d-11eb-9127-5a308a8ba14c.png">
4. Watch the system throw an error.
<img width="355" alt="AirflowDebug#3" src="https://user-images.githubusercontent.com/848305/99697050-57e6ca00-2a5d-11eb-93f1-1e88aaf59c63.png">
**Anything else we need to know**:
<details><summary>UI Error Page</summary>
Something bad has happened.
Please consider letting us know by creating a bug report using GitHub.
Python version: 3.8.5
Airflow version: 2.0.0b2
Node: REDACTED
-------------------------------------------------------------------------------
Traceback (most recent call last):
File "/var/lib/airflow/venv/lib/python3.8/site-packages/flask/app.py", line 2447, in wsgi_app
response = self.full_dispatch_request()
File "/var/lib/airflow/venv/lib/python3.8/site-packages/flask/app.py", line 1952, in full_dispatch_request
rv = self.handle_user_exception(e)
File "/var/lib/airflow/venv/lib/python3.8/site-packages/flask/app.py", line 1821, in handle_user_exception
reraise(exc_type, exc_value, tb)
File "/var/lib/airflow/venv/lib/python3.8/site-packages/flask/_compat.py", line 39, in reraise
raise value
File "/var/lib/airflow/venv/lib/python3.8/site-packages/flask/app.py", line 1950, in full_dispatch_request
rv = self.dispatch_request()
File "/var/lib/airflow/venv/lib/python3.8/site-packages/flask/app.py", line 1936, in dispatch_request
return self.view_functions[rule.endpoint](**req.view_args)
File "/var/lib/airflow/venv/lib/python3.8/site-packages/flask_appbuilder/security/decorators.py", line 109, in wraps
return f(self, *args, **kwargs)
File "/var/lib/airflow/venv/lib/python3.8/site-packages/flask_appbuilder/views.py", line 606, in edit
return self.render_template(
File "/var/lib/airflow/venv/lib/python3.8/site-packages/flask_appbuilder/baseviews.py", line 280, in render_template
return render_template(
File "/var/lib/airflow/venv/lib/python3.8/site-packages/flask/templating.py", line 137, in render_template
return _render(
File "/var/lib/airflow/venv/lib/python3.8/site-packages/flask/templating.py", line 120, in _render
rv = template.render(context)
File "/var/lib/airflow/venv/lib/python3.8/site-packages/jinja2/environment.py", line 1090, in render
self.environment.handle_exception()
File "/var/lib/airflow/venv/lib/python3.8/site-packages/jinja2/environment.py", line 832, in handle_exception
reraise(*rewrite_traceback_stack(source=source))
File "/var/lib/airflow/venv/lib/python3.8/site-packages/jinja2/_compat.py", line 28, in reraise
raise value.with_traceback(tb)
File "/var/lib/airflow/venv/lib/python3.8/site-packages/flask_appbuilder/templates/appbuilder/general/model/edit.html", line 2, in top-level template code
{% import 'appbuilder/general/lib.html' as lib %}
File "/var/lib/airflow/venv/lib/python3.8/site-packages/flask_appbuilder/templates/appbuilder/base.html", line 1, in top-level template code
{% extends base_template %}
File "/var/lib/airflow/venv/lib/python3.8/site-packages/airflow/www/templates/airflow/master.html", line 20, in top-level template code
{% extends 'appbuilder/baselayout.html' %}
File "/var/lib/airflow/venv/lib/python3.8/site-packages/flask_appbuilder/templates/appbuilder/baselayout.html", line 2, in top-level template code
{% import 'appbuilder/baselib.html' as baselib %}
File "/var/lib/airflow/venv/lib/python3.8/site-packages/flask_appbuilder/templates/appbuilder/init.html", line 46, in top-level template code
{% block body %}
File "/var/lib/airflow/venv/lib/python3.8/site-packages/flask_appbuilder/templates/appbuilder/baselayout.html", line 19, in block "body"
{% block content %}
File "/var/lib/airflow/venv/lib/python3.8/site-packages/flask_appbuilder/templates/appbuilder/general/model/edit.html", line 23, in block "content"
{% block edit_form %}
File "/var/lib/airflow/venv/lib/python3.8/site-packages/flask_appbuilder/templates/appbuilder/general/model/edit.html", line 25, in block "edit_form"
{{ widgets.get('edit')(form_action=form_action)|safe }}
File "/var/lib/airflow/venv/lib/python3.8/site-packages/flask_appbuilder/widgets.py", line 37, in __call__
return template.render(args)
File "/var/lib/airflow/venv/lib/python3.8/site-packages/jinja2/environment.py", line 1090, in render
self.environment.handle_exception()
File "/var/lib/airflow/venv/lib/python3.8/site-packages/jinja2/environment.py", line 832, in handle_exception
reraise(*rewrite_traceback_stack(source=source))
File "/var/lib/airflow/venv/lib/python3.8/site-packages/jinja2/_compat.py", line 28, in reraise
raise value.with_traceback(tb)
File "/var/lib/airflow/venv/lib/python3.8/site-packages/flask_appbuilder/templates/appbuilder/general/widgets/form.html", line 45, in top-level template code
{{ lib.render_field(field, begin_sep_label, end_sep_label, begin_sep_field, end_sep_field) }}
File "/var/lib/airflow/venv/lib/python3.8/site-packages/jinja2/runtime.py", line 679, in _invoke
rv = self._func(*arguments)
File "/var/lib/airflow/venv/lib/python3.8/site-packages/flask_appbuilder/templates/appbuilder/general/lib.html", line 237, in template
{{ field(**kwargs)|safe }}
File "/var/lib/airflow/venv/lib/python3.8/site-packages/wtforms/fields/core.py", line 160, in __call__
return self.meta.render_field(self, kwargs)
File "/var/lib/airflow/venv/lib/python3.8/site-packages/wtforms/meta.py", line 56, in render_field
return field.widget(field, **render_kw)
File "/var/lib/airflow/venv/lib/python3.8/site-packages/flask_appbuilder/fieldwidgets.py", line 176, in __call__
return super(Select2ManyWidget, self).__call__(field, **kwargs)
File "/var/lib/airflow/venv/lib/python3.8/site-packages/wtforms/widgets/core.py", line 300, in __call__
for val, label, selected in field.iter_choices():
File "/var/lib/airflow/venv/lib/python3.8/site-packages/flask_appbuilder/fields.py", line 209, in iter_choices
yield (pk, self.get_label(obj), obj in self.data)
TypeError: argument of type 'DagRun' is not iterable
</details>
**How often does this problem occur?:** Every time I click the `Edit` icon
| https://github.com/apache/airflow/issues/12489 | https://github.com/apache/airflow/pull/12770 | b62abfbfae5ae84f62522ce5db5852598caf9eb8 | be7d867459dcc4d26bd3cae55df6ff118c2be16a | 2020-11-19T16:59:15Z | python | 2020-12-03T17:53:59Z |
closed | apache/airflow | https://github.com/apache/airflow | 12,485 | ["UPDATING.md", "airflow/models/dag.py", "airflow/www/views.py", "tests/www/views/test_views.py"] | Optionally clear downstream failed tasks when marking success | **Description**
For a DAG that looks like this:
```
A >> B
```
If A fails, B goes into "upstream_failed" state. If a user then marks A "success", B will remain in "upstream_failed" state. It will not automatically start running. This scenario often happens if the failure of A is dealt with outside of Airflow and the user does not want Airflow to run A again. But he usually expect Airflow to run B after A is marked "success".
**Use case / motivation**
After A is marked "success", its downstream tasks B should be cleared automatically and get ready to be scheduled. To avoid changing this behaviour completely, there can be a toggle button next to "Success" that lets the user decide if he wants Airflow to automatically clear downstream tasks that are in "upstream_failed" state. | https://github.com/apache/airflow/issues/12485 | https://github.com/apache/airflow/pull/13037 | 2bca8a5425c234b04fdf32d6c50ae3a91cd08262 | 6b2524fe733c42fe586405c84a496ac4aaf8fe49 | 2020-11-19T13:46:48Z | python | 2021-05-29T17:11:19Z |
closed | apache/airflow | https://github.com/apache/airflow | 12,468 | ["airflow/api_connexion/endpoints/provider_endpoint.py", "airflow/api_connexion/openapi/v1.yaml", "airflow/api_connexion/schemas/provider_schema.py", "airflow/security/permissions.py", "airflow/www/security.py", "docs/apache-airflow/security/access-control.rst", "tests/api_connexion/endpoints/test_provider_endpoint.py"] | Add API to query for providers | https://github.com/apache/airflow/issues/12468 | https://github.com/apache/airflow/pull/13394 | b8c0fde38a7df9d00185bf53e9f303b98fd064dc | 9dad095f735cd6a73bcbf57324d7ed79f622858c | 2020-11-19T01:19:32Z | python | 2021-05-07T13:47:09Z |
|
closed | apache/airflow | https://github.com/apache/airflow | 12,448 | ["airflow/models/serialized_dag.py", "tests/models/test_dagrun.py"] | General Error Deleting DAG run | Using a 2.0.0 beta 2 build...

Log from webserver:
```
[2020-11-18 14:58:52,712] {interface.py:713} ERROR - Delete record error: Dependency rule tried to blank-out primary key column 'job.id' on instance '<SchedulerJob at 0x7fc68ba7f990>'
Traceback (most recent call last):
File "/usr/local/lib/python3.7/site-packages/flask_appbuilder/models/sqla/interface.py", line 698, in delete
self.session.commit()
File "/usr/local/lib/python3.7/site-packages/sqlalchemy/orm/scoping.py", line 163, in do
return getattr(self.registry(), name)(*args, **kwargs)
File "/usr/local/lib/python3.7/site-packages/sqlalchemy/orm/session.py", line 1042, in commit
self.transaction.commit()
File "/usr/local/lib/python3.7/site-packages/sqlalchemy/orm/session.py", line 504, in commit
self._prepare_impl()
File "/usr/local/lib/python3.7/site-packages/sqlalchemy/orm/session.py", line 483, in _prepare_impl
self.session.flush()
File "/usr/local/lib/python3.7/site-packages/sqlalchemy/orm/session.py", line 2536, in flush
self._flush(objects)
File "/usr/local/lib/python3.7/site-packages/sqlalchemy/orm/session.py", line 2678, in _flush
transaction.rollback(_capture_exception=True)
File "/usr/local/lib/python3.7/site-packages/sqlalchemy/util/langhelpers.py", line 70, in __exit__
with_traceback=exc_tb,
File "/usr/local/lib/python3.7/site-packages/sqlalchemy/util/compat.py", line 182, in raise_
raise exception
File "/usr/local/lib/python3.7/site-packages/sqlalchemy/orm/session.py", line 2638, in _flush
flush_context.execute()
File "/usr/local/lib/python3.7/site-packages/sqlalchemy/orm/unitofwork.py", line 422, in execute
rec.execute(self)
File "/usr/local/lib/python3.7/site-packages/sqlalchemy/orm/unitofwork.py", line 538, in execute
self.dependency_processor.process_deletes(uow, states)
File "/usr/local/lib/python3.7/site-packages/sqlalchemy/orm/dependency.py", line 547, in process_deletes
state, child, None, True, uowcommit, False
File "/usr/local/lib/python3.7/site-packages/sqlalchemy/orm/dependency.py", line 604, in _synchronize
sync.clear(dest, self.mapper, self.prop.synchronize_pairs)
File "/usr/local/lib/python3.7/site-packages/sqlalchemy/orm/sync.py", line 88, in clear
"column '%s' on instance '%s'" % (r, orm_util.state_str(dest))
AssertionError: Dependency rule tried to blank-out primary key column 'job.id' on instance '<SchedulerJob at 0x7fc68ba7f990>'
172.19.0.1 - - [18/Nov/2020:14:58:52 +0000] "POST /dagrun/delete/147 HTTP/1.1" 302 415 "http://localhost:8080/dagrun/list/?_flt_3_dag_id=airflow_community_data_refresh&_flt_3_state=success" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_6) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/86.0.4240.193 Safari/537.36"
``` | https://github.com/apache/airflow/issues/12448 | https://github.com/apache/airflow/pull/12586 | c6467ba12d4a94027137e3173097d73be56c5d12 | 08251c145d9ace8fe2f1e1309833eb4d4ad54eca | 2020-11-18T14:56:14Z | python | 2020-11-25T02:07:10Z |
closed | apache/airflow | https://github.com/apache/airflow | 12,439 | ["airflow/providers/google/cloud/transfers/gcs_to_bigquery.py"] | The operator GCS to BigQuery fails to download the schema because download is called with the wrong order of parameters. | # Version: Airflow: apache-airflow-backport-providers-google==2020.11.13
## Problem
The operator gcs to bigquery fails because it calls the method download with positional parameters in the wrong order.
It calls in this way `gcs_hook.download(self.bucket, self.schema_object)`. See more in the [line](https://github.com/apache/airflow/blob/master/airflow/providers/google/cloud/transfers/gcs_to_bigquery.py#L267)
The signature for the download is `gcs_hook.download(self, object_name, bucket_name)`. See more in the [line ](https://github.com/apache/airflow/blob/master/airflow/providers/google/cloud/hooks/gcs.py#L259-L261)
## Solution
1. Change the [gcs_to_bigquery](https://github.com/apache/airflow/blob/master/airflow/providers/google/cloud/transfers/gcs_to_bigquery.py#L267) operator so it calls the method download with the keyword parameters rather than positional.
2. Change the [gcs_to_bigquery](https://github.com/apache/airflow/blob/master/airflow/providers/google/cloud/transfers/gcs_to_bigquery.py#L267) operator so it calls the method download with the right order of the parameters.
3. Change the signature of download method in the [gcs_hook](https://github.com/apache/airflow/blob/master/airflow/providers/google/cloud/hooks/gcs.py#L353-L361) to be `def download(self, bucket_name, object_name, filename)` rather than `def dowload(self, object_name, bucket_name, filename)`
I would prefer the option 3 because the method upload start with bucket_name see [line]( https://github.com/apache/airflow/blob/master/airflow/providers/google/cloud/hooks/gcs.py#L353-L361) and it makes sense to follow the same convention. Also most of the method in the gcs hook start with bucket name.
Also in Airflow 1.10.12 the method download starts with bucket_name and it would make sense not to change the order of the parameters (even if it change the name of the parameters). See [here](https://airflow.apache.org/docs/stable/_api/airflow/contrib/hooks/gcs_hook/index.html#airflow.contrib.hooks.gcs_hook.GoogleCloudStorageHook.download)
## Issues Related
This issue is related with https://github.com/apache/airflow/issues/12335. It was closed but it was not fixed. I left a comments but there was not reply. | https://github.com/apache/airflow/issues/12439 | https://github.com/apache/airflow/pull/12442 | bf6da166a974331e8cfe416c29cde0206c86d251 | 8d09506464c8480fa42e8bfe6a36c6f631cd23f6 | 2020-11-18T09:27:12Z | python | 2020-11-18T11:25:06Z |
closed | apache/airflow | https://github.com/apache/airflow | 12,413 | ["airflow/migrations/versions/e165e7455d70_add_description_field_to_variable.py", "airflow/models/variable.py", "airflow/www/views.py"] | Description Field for Variables | add text column to explain what the variable is used for
same as https://github.com/apache/airflow/issues/10840 just for Variables.
| https://github.com/apache/airflow/issues/12413 | https://github.com/apache/airflow/pull/15194 | d944f5a59daf0c4512f87369c6eabb27666376bf | 925ef281856630f5231baf42a30a5eb18f0b7ca0 | 2020-11-17T18:32:16Z | python | 2021-04-12T15:33:25Z |
closed | apache/airflow | https://github.com/apache/airflow | 12,401 | ["airflow/www/views.py", "tests/www/views/test_views_connection.py"] | Duplicate connections UI | In the UI it would be nice to duplicate the selected connection, in the menu "With Selected".
The copy would be set with the same name plus some suffix (_copy, _1, whatever) and it is useful for when you have a connection with all equal fields except something in a particular one. | https://github.com/apache/airflow/issues/12401 | https://github.com/apache/airflow/pull/15574 | 621ef766ffc77c7bd51c81fe802fa019a44094ea | 2011da25a50edfcdf7657ec172f57ae6e43ca216 | 2020-11-17T13:39:57Z | python | 2021-06-17T15:22:44Z |
closed | apache/airflow | https://github.com/apache/airflow | 12,385 | ["airflow/config_templates/config.yml", "airflow/config_templates/default_airflow.cfg", "airflow/configuration.py", "docs/howto/set-config.rst", "docs/spelling_wordlist.txt", "tests/core/test_config_templates.py"] | [ldap] section in configuration is not applicable anymore in 2.0 | **Apache Airflow version**: 2.0.0b* / master
**What happened**:
`[ldap]` section in `airflow.cfg` is not applicable anymore in 2.0 and `master`, because the LDAP authentication (for webserver and API) is handled by FAB, and the configuration for this is handled by `webserver_config.py` file.

**What you expected to happen**:
The `[ldap]` section should be removed from `airflow/config_templates/default_airflow.cfg` and `airflow/config_templates/config.yml` (and some other applicable files).
Otherwise leaving this section there will be a big confusion for users. | https://github.com/apache/airflow/issues/12385 | https://github.com/apache/airflow/pull/12386 | d4e1ff290f59f365163b0d969840f72746364e8e | 35b56148175fca18d719d9b6847019d623880f66 | 2020-11-16T17:53:27Z | python | 2020-11-16T20:34:20Z |
closed | apache/airflow | https://github.com/apache/airflow | 12,369 | ["airflow/cli/commands/connection_command.py", "tests/cli/commands/test_connection_command.py"] | In 2.0.0b2/master, CLI "airflow connections add" is not handling invalid URIs properly |
<!--
Welcome to Apache Airflow! For a smooth issue process, try to answer the following questions.
Don't worry if they're not all applicable; just try to include what you can :-)
If you need to include code snippets or logs, please put them in fenced code
blocks. If they're super-long, please use the details tag like
<details><summary>super-long log</summary> lots of stuff </details>
Please delete these comment blocks before submitting the issue.
-->
<!--
IMPORTANT!!!
PLEASE CHECK "SIMILAR TO X EXISTING ISSUES" OPTION IF VISIBLE
NEXT TO "SUBMIT NEW ISSUE" BUTTON!!!
PLEASE CHECK IF THIS ISSUE HAS BEEN REPORTED PREVIOUSLY USING SEARCH!!!
Please complete the next sections or the issue will be closed.
These questions are the first thing we need to know to understand the context.
-->
**Apache Airflow version**: 2.0.0b2
**Environment**: Docker
- **Cloud provider or hardware configuration**:
- **OS** (e.g. from /etc/os-release):
- **Kernel** (e.g. `uname -a`):
- **Install tools**:
- **Others**:
**What happened**:
In current 2.0.0b2, `airflow connections add`Β is not handling invalid URIs properly.
For example, running CLI `airflow connections add --conn-uriΒ xyzΒ conn_dummy` will
- create the connection βconn_dummyβ successfully (which should not be, IMO)
- in the connection created, it only has one attribute filled, which is βschemaβ, but the value added is βyzβ (the value we provide is "xyz", i.e., the 1st element is removed. because of https://github.com/apache/airflow/blob/master/airflow/models/connection.py#L213)
<img width="552" alt="airflow_0284e070228e___" src="https://user-images.githubusercontent.com/11539188/99155673-519dca00-26ba-11eb-9517-31c1a238476f.png">
<!-- (please include exact error messages if you can) -->
**What you expected to happen**:
In my opinion, a validation should be done first to ensure the `conn-uri` provided is valid (at least have βschemeβ and βnetlocβ available in the `urlparse` result), and reject if it's an invalid URI.
<!-- What do you think went wrong? -->
<!---
As minimally and precisely as possible. Keep in mind we do not have access to your cluster or dags.
If you are using kubernetes, please attempt to recreate the issue using minikube or kind.
## Install minikube/kind
- Minikube https://minikube.sigs.k8s.io/docs/start/
- Kind https://kind.sigs.k8s.io/docs/user/quick-start/
If this is a UI bug, please provide a screenshot of the bug or a link to a youtube video of the bug in action
You can include images using the .md style of

To record a screencast, mac users can use QuickTime and then create an unlisted youtube video with the resulting .mov file.
--->
<!--
How often does this problem occur? Once? Every time etc?
Any relevant logs to include? Put them here in side a detail tag:
<details><summary>x.log</summary> lots of stuff </details>
-->
| https://github.com/apache/airflow/issues/12369 | https://github.com/apache/airflow/pull/12370 | 9ba8b31ed5001ea6522657b86ce3dfd2a75d594c | 823b3aace298ab13d2e19b8f0bf1c426ff20407c | 2020-11-14T19:47:00Z | python | 2020-11-15T10:47:57Z |
closed | apache/airflow | https://github.com/apache/airflow | 12,357 | [".pre-commit-config.yaml", "airflow/config_templates/config.yml", "airflow/config_templates/config.yml.schema.json"] | JSON schema validation for airflow/config_templates/config.yml | Hello my friends,
I hope you are all right and you are all healthy. Today I come with another small improvement proposal for our project.
I think it is worth adding JSON schema validations for `[airflow/config_templates/config.yml`](https://github.com/apache/airflow/blob/master/airflow/config_templates/config.yml) file. This will allow us to detect the wrong structure of this file very quickly and also provide simple documentation for this file. This schema should be verified every time a file is loaded.
In order to generate a schema, you can use generators:
https://onlineyamltools.com/convert-yaml-to-json
https://www.liquid-technologies.com/online-json-to-schema-converter (Please note that you should set array rules to "List validation". In the result, it is also worth updating the JSON Schema specification versions to latest current spec)
In the generated specification, it is worth adding descriptions for each field:
https://json-schema.org/understanding-json-schema/reference/generic.html#annotations
Best regards,
Kamil BreguΕa
| https://github.com/apache/airflow/issues/12357 | https://github.com/apache/airflow/pull/13025 | f6448b4e482fd96339ae65c26d08e6a2bdb51aaf | 1dc36d870418824ecc2221b468a576eecd9eca24 | 2020-11-13T23:25:39Z | python | 2020-12-16T01:49:06Z |
closed | apache/airflow | https://github.com/apache/airflow | 12,344 | ["airflow/www/static/js/base.js"] | 2.0.0b2: time-label is wrongly formatted | When hovering over the timestamp.
<img width="224" alt="image" src="https://user-images.githubusercontent.com/8430564/99089658-e1197f00-25cd-11eb-86b5-6633a410ce3b.png">
| https://github.com/apache/airflow/issues/12344 | https://github.com/apache/airflow/pull/12447 | 7ca0b6f121c9cec6e25de130f86a56d7c7fbe38c | b584adbe1120d5e2b8a9fae3356a97f13ed70cd3 | 2020-11-13T15:33:30Z | python | 2020-11-18T14:58:06Z |
closed | apache/airflow | https://github.com/apache/airflow | 12,329 | ["airflow/providers/google/cloud/transfers/gcs_to_bigquery.py", "tests/providers/google/cloud/transfers/test_gcs_to_bigquery.py"] | GCSToBigQueryOperator - allow upload to existing table without specifying schema_fields/schema_object | **Description**
We would like to be able to load data to **existing** BigQuery tables without having to specify schema_fields/schema_object in `GCSToBigQueryOperator` since table already exists.
**Use case / motivation**
<details>
<summary>BigQuery load job usage details and problem explanation</summary>
We create BigQuery tables/datasets through CI process (terraform managed), with the help of Airflow we updating those tables with data.
To update tables with data we can use:
Airflow 2.0 operator: GCSToBigQueryOperator
Airflow 1.* operator (deprecated) GoogleCloudStorageToBigQueryOperator
However those operator require to specify one of 3 things:
1. schema_fields - fields that define table schema
2. schema_object - a GCS object path pointing to a .json file that contains the schema for the table
3. or autodetect=True
In other cases it will:
```
raise ValueError('At least one of `schema_fields`, `schema_object`, '
'or `autodetect**=True**` must be passed.')
```
_Note: it does not actually says that `autodetect` must be `True` in exception - but according to code it must be specified as True, or schema should be used otherwise._
But we already have created table, and we can update it using
`bq load` command. (which Airflow operators mentioned above are using internally)
When using `bq load` - you also have an option to specify **schema**. The schema can be a local JSON file, or it can be typed inline as part of the command. You can also use the `--autodetect` flag instead of supplying a schema definition.
https://cloud.google.com/bigquery/docs/loading-data-cloud-storage-csv#bq
When you specify `--autodetect` as True - BigQuery will try to give random names to your columns, e.g.: 'string_field_0', 'int_field_1' - and if you are trying to load into **existing** table - `bq load` will fail with error:
'Cannot add fields (field: string_field_0)'}.'
Same way Airflow operators like 'GCSToBigQueryOperator' will fail.
However **there is also an option NOT to specify** `--autodetect` or specify `--autodetect=False` and in this case `bq load` will load from CloudStorage to **existing** BQ table without problems.
</details>
Proposal/TL;DR:
Add an option **not** to specify `--autodetect` or specify `--autodetect=False` when write_disposition='WRITE_APPEND' is used in GCSToBigQueryOperator. This will allow an operator to update **existing** BigQuery table without having to specify schema within the operator itself (it will just be updating **existing** table with data).
| https://github.com/apache/airflow/issues/12329 | https://github.com/apache/airflow/pull/28564 | 3df204b53247f51d94135698defdbae0d359921f | d7f5f6d737cf06cc8e216f523534aeaf48065793 | 2020-11-12T22:38:28Z | python | 2022-12-24T14:26:59Z |
closed | apache/airflow | https://github.com/apache/airflow | 12,318 | ["airflow/providers/ssh/hooks/ssh.py", "tests/providers/ssh/hooks/test_ssh.py"] | SSHOperator does support only RSA keys and should support other keys compatible with Paramiko | See the hook: `airflow/providers/ssh/hooks/ssh.py`
The Paramiko lib supports ECDSA keys. But the hook forces the use of RSA keys.
This should be viewed as an issue as the release note of OpenSSH 8.4 states that `ssh-rsa` public key signatures are soon becoming obsolete.
https://www.openssh.com/txt/release-8.4
```
Future deprecation notice
=========================
It is now possible[1] to perform chosen-prefix attacks against the
SHA-1 algorithm for less than USD$50K. For this reason, we will be
disabling the "ssh-rsa" public key signature algorithm by default in a
near-future release.
This algorithm is unfortunately still used widely despite the
existence of better alternatives, being the only remaining public key
signature algorithm specified by the original SSH RFCs.
The better alternatives include:
* The RFC8332 RSA SHA-2 signature algorithms rsa-sha2-256/512. These
algorithms have the advantage of using the same key type as
"ssh-rsa" but use the safe SHA-2 hash algorithms. These have been
supported since OpenSSH 7.2 and are already used by default if the
client and server support them.
* The ssh-ed25519 signature algorithm. It has been supported in
OpenSSH since release 6.5.
* The RFC5656 ECDSA algorithms: ecdsa-sha2-nistp256/384/521. These
have been supported by OpenSSH since release 5.7.
To check whether a server is using the weak ssh-rsa public key
algorithm, for host authentication, try to connect to it after
removing the ssh-rsa algorithm from ssh(1)'s allowed list:
ssh -oHostKeyAlgorithms=-ssh-rsa user@host
If the host key verification fails and no other supported host key
types are available, the server software on that host should be
upgraded.
We intend to enable UpdateHostKeys by default in the next OpenSSH
release. This will assist the client by automatically migrating to
better algorithms. Users may consider enabling this option manually.
[1] "SHA-1 is a Shambles: First Chosen-Prefix Collision on SHA-1 and
Application to the PGP Web of Trust" Leurent, G and Peyrin, T
(2020) https://eprint.iacr.org/2020/014.pdf
```
Thus, support of other public keys like EC keys should be added in order to improve the security and interoperability of Airflow, and so that users can start using more trustworthy algorithms. Especially since Paramiko already supports them. | https://github.com/apache/airflow/issues/12318 | https://github.com/apache/airflow/pull/12467 | 3d3a219ca9fe52086a0f6f637aa09c6a8ef28631 | f180fa13bf2a0ffa31b30bb21468510fe8a20131 | 2020-11-12T18:35:14Z | python | 2021-02-08T20:21:07Z |
closed | apache/airflow | https://github.com/apache/airflow | 12,315 | ["airflow/www/templates/airflow/model_list.html"] | WebUI broken by custom XComBackend | **Apache Airflow version**:
2.0 / master
**Environment**:
breeze
**What happened**:
I want to use custom XCom backend that returns in `deserialize_value` a pandas DataFrame object. This should work as expected because the type annotation suggests it:
```
def deserialize_value(result) -> Any:
```
However, not every object implements `__nonzero__` method that returns boolean value. For example:
```
File "/usr/local/lib/python3.8/site-packages/pandas/core/generic.py", line 1326, in __nonzero__
raise ValueError(
ValueError: The truth value of a DataFrame is ambiguous. Use a.empty, a.bool(), a.item(), a.any() or a.all().
```
This is problematic because if users click in Airflow webui Admin > XComs then they will the following:
```
Python version: 3.8.6
Airflow version: 2.0.0b2
Node: 950a17127708
-------------------------------------------------------------------------------
Traceback (most recent call last):
File "/usr/local/lib/python3.8/site-packages/flask/app.py", line 2447, in wsgi_app
response = self.full_dispatch_request()
File "/usr/local/lib/python3.8/site-packages/flask/app.py", line 1952, in full_dispatch_request
rv = self.handle_user_exception(e)
File "/usr/local/lib/python3.8/site-packages/flask/app.py", line 1821, in handle_user_exception
reraise(exc_type, exc_value, tb)
File "/usr/local/lib/python3.8/site-packages/flask/_compat.py", line 39, in reraise
raise value
File "/usr/local/lib/python3.8/site-packages/flask/app.py", line 1950, in full_dispatch_request
rv = self.dispatch_request()
File "/usr/local/lib/python3.8/site-packages/flask/app.py", line 1936, in dispatch_request
return self.view_functions[rule.endpoint](**req.view_args)
File "/usr/local/lib/python3.8/site-packages/flask_appbuilder/security/decorators.py", line 109, in wraps
return f(self, *args, **kwargs)
File "/usr/local/lib/python3.8/site-packages/flask_appbuilder/views.py", line 552, in list
return self.render_template(
File "/usr/local/lib/python3.8/site-packages/flask_appbuilder/baseviews.py", line 280, in render_template
return render_template(
File "/usr/local/lib/python3.8/site-packages/flask/templating.py", line 137, in render_template
return _render(
File "/usr/local/lib/python3.8/site-packages/flask/templating.py", line 120, in _render
rv = template.render(context)
File "/usr/local/lib/python3.8/site-packages/jinja2/environment.py", line 1090, in render
self.environment.handle_exception()
File "/usr/local/lib/python3.8/site-packages/jinja2/environment.py", line 832, in handle_exception
reraise(*rewrite_traceback_stack(source=source))
File "/usr/local/lib/python3.8/site-packages/jinja2/_compat.py", line 28, in reraise
raise value.with_traceback(tb)
File "/usr/local/lib/python3.8/site-packages/flask_appbuilder/templates/appbuilder/general/model/list.html", line 2, in top-level template code
{% import 'appbuilder/general/lib.html' as lib %}
File "/usr/local/lib/python3.8/site-packages/flask_appbuilder/templates/appbuilder/base.html", line 1, in top-level template code
{% extends base_template %}
File "/opt/airflow/airflow/www/templates/airflow/master.html", line 20, in top-level template code
{% extends 'appbuilder/baselayout.html' %}
File "/usr/local/lib/python3.8/site-packages/flask_appbuilder/templates/appbuilder/baselayout.html", line 2, in top-level template code
{% import 'appbuilder/baselib.html' as baselib %}
File "/usr/local/lib/python3.8/site-packages/flask_appbuilder/templates/appbuilder/init.html", line 46, in top-level template code
{% block body %}
File "/usr/local/lib/python3.8/site-packages/flask_appbuilder/templates/appbuilder/baselayout.html", line 19, in block "body"
{% block content %}
File "/usr/local/lib/python3.8/site-packages/flask_appbuilder/templates/appbuilder/general/model/list.html", line 13, in block "content"
{% block list_list scoped %}
File "/usr/local/lib/python3.8/site-packages/flask_appbuilder/templates/appbuilder/general/model/list.html", line 15, in block "list_list"
{{ widgets.get('list')()|safe }}
File "/usr/local/lib/python3.8/site-packages/flask_appbuilder/widgets.py", line 37, in __call__
return template.render(args)
File "/usr/local/lib/python3.8/site-packages/jinja2/environment.py", line 1090, in render
self.environment.handle_exception()
File "/usr/local/lib/python3.8/site-packages/jinja2/environment.py", line 832, in handle_exception
reraise(*rewrite_traceback_stack(source=source))
File "/usr/local/lib/python3.8/site-packages/jinja2/_compat.py", line 28, in reraise
raise value.with_traceback(tb)
File "/opt/airflow/airflow/www/templates/airflow/model_list.html", line 21, in top-level template code
{% extends 'appbuilder/general/widgets/base_list.html' %}
File "/usr/local/lib/python3.8/site-packages/flask_appbuilder/templates/appbuilder/general/widgets/base_list.html", line 23, in top-level template code
{% block begin_loop_values %}
File "/opt/airflow/airflow/www/templates/airflow/model_list.html", line 80, in block "begin_loop_values"
{% elif item[value] != None %}
File "/usr/local/lib/python3.8/site-packages/pandas/core/generic.py", line 1326, in __nonzero__
raise ValueError(
ValueError: The truth value of a DataFrame is ambiguous. Use a.empty, a.bool(), a.item(), a.any() or a.all().
```
**What you expected to happen**:
It would be nice to allow returning **any** object in deserialise method
**How to reproduce it**:
Create `xcom_backend.py` in `files` directory (assuming breeze) with the following content:
```py
import pandas as pd
from airflow.models.xcom import BaseXCom
class CustomXComBackend(BaseXCom):
@staticmethod
def serialize_value(value: Any):
return BaseXCom.serialize_value(value)
@staticmethod
def deserialize_value(result) -> Any:
return pd.DataFrame({"a": [1, 2]})
```
then set
```
export PYTHONPATH=$PYTHONPATH:/files
export AIRFLOW__CORE__XCOM_BACKEND=xcom_backend. CustomXComBackend
```
and start webserver
```
airflow webserver -w 1 -D
```
You may need to first run a DAG to populate the table.
**Anything else we need to know**:
N/A
| https://github.com/apache/airflow/issues/12315 | https://github.com/apache/airflow/pull/16893 | 2fea4cdceaa12b3ac13f24eeb383af624aacb2e7 | dcc7fb56708773e929f742c7c8463fb8e91e7340 | 2020-11-12T17:09:36Z | python | 2021-07-12T14:23:07Z |
closed | apache/airflow | https://github.com/apache/airflow | 12,313 | ["airflow/www/templates/airflow/dags.html"] | The "Filter tags" multi-select container can't hold the selected tags while switching between views | <!--
Welcome to Apache Airflow! For a smooth issue process, try to answer the following questions.
Don't worry if they're not all applicable; just try to include what you can :-)
If you need to include code snippets or logs, please put them in fenced code
blocks. If they're super-long, please use the details tag like
<details><summary>super-long log</summary> lots of stuff </details>
Please delete these comment blocks before submitting the issue.
-->
<!--
IMPORTANT!!!
PLEASE CHECK "SIMILAR TO X EXISTING ISSUES" OPTION IF VISIBLE
NEXT TO "SUBMIT NEW ISSUE" BUTTON!!!
PLEASE CHECK IF THIS ISSUE HAS BEEN REPORTED PREVIOUSLY USING SEARCH!!!
Please complete the next sections or the issue will be closed.
These questions are the first thing we need to know to understand the context.
-->
**Apache Airflow version**: 1.10.12 & 2.0.0b2
**Kubernetes version (if you are using kubernetes)** (use `kubectl version`):
**Environment**:
- **Cloud provider or hardware configuration**: N/A
- **OS** (e.g. from /etc/os-release): Linux mint 20
- **Kernel** (e.g. `uname -a`): 5.4.0-52-generic
- **Install tools**:
- **Others**:
**What happened**:

<!-- (please include exact error messages if you can) -->
**What you expected to happen**: The selected tags should stay the same while switching between `All`, `Active`, and `Paused`
<!-- What do you think went wrong? -->
**How to reproduce it**:
<!---
As minimally and precisely as possible. Keep in mind we do not have access to your cluster or dags.
If you are using kubernetes, please attempt to recreate the issue using minikube or kind.
## Install minikube/kind
- Minikube https://minikube.sigs.k8s.io/docs/start/
- Kind https://kind.sigs.k8s.io/docs/user/quick-start/
If this is a UI bug, please provide a screenshot of the bug or a link to a youtube video of the bug in action
You can include images using the .md style of

To record a screencast, mac users can use QuickTime and then create an unlisted youtube video with the resulting .mov file.
--->
**Anything else we need to know**:
<!--
How often does this problem occur? Once? Every time etc?
Any relevant logs to include? Put them here in side a detail tag:
<details><summary>x.log</summary> lots of stuff </details>
-->
| https://github.com/apache/airflow/issues/12313 | https://github.com/apache/airflow/pull/12324 | af19b126e94876c371553f6a7cfae6b1102f79fd | 7f828b03ccef848c740f8013c56a856708ed505c | 2020-11-12T16:56:23Z | python | 2020-11-12T21:45:25Z |
closed | apache/airflow | https://github.com/apache/airflow | 12,309 | ["airflow/operators/python.py", "docs/concepts.rst", "tests/operators/test_python.py"] | TaskGroup does not support dynamically generated tasks | **Apache Airflow version**:
2.0 / master
**Environment**:
breeze
**What happened**:
Using this DAG:
```py
from airflow.operators.bash import BashOperator
from airflow.operators.python import task
from airflow.models import DAG
from airflow.utils.task_group import TaskGroup
@task
def show():
print("Cats are awesome!")
with DAG(
"using_task_group",
default_args={'owner': 'airflow'},
start_date=days_ago(2),
schedule_interval=None,
) as dag3:
start_task = BashOperator(
task_id="start_task",
bash_command="echo start",
)
end_task = BashOperator(
task_id="end_task",
bash_command="echo end",
)
with TaskGroup(group_id="show_tasks") as tg1:
previous_show = show()
for _ in range(100):
next_show = show()
previous_show >> next_show
previous_show = next_show
```
I get:
```
Broken DAG: [/files/dags/test.py] Traceback (most recent call last):
File "/opt/airflow/airflow/models/baseoperator.py", line 410, in __init__
task_group.add(self)
File "/opt/airflow/airflow/utils/task_group.py", line 140, in add
raise DuplicateTaskIdFound(f"Task id '{key}' has already been added to the DAG")
airflow.exceptions.DuplicateTaskIdFound: Task id 'show_tasks.show' has already been added to the DAG
```
If I remove the task group the task are generated as expected.
**What you expected to happen**:
I expect to be able to generate tasks dynamically using TaskGroup and task decoratos.
**How to reproduce it**:
Use the DAG from above.
**Anything else we need to know**:
N/A
| https://github.com/apache/airflow/issues/12309 | https://github.com/apache/airflow/pull/12312 | 823b3aace298ab13d2e19b8f0bf1c426ff20407c | 39ea8722c04fb1c0b286b4248a52e8d974a47b30 | 2020-11-12T13:43:16Z | python | 2020-11-15T11:28:04Z |
closed | apache/airflow | https://github.com/apache/airflow | 12,306 | ["airflow/api_connexion/openapi/v1.yaml", "tests/api_connexion/endpoints/test_task_instance_endpoint.py"] | Getting list of taskInstances without start_date, end_date and state fails in REST API | Getting a list of taskinstances where some taskinstances does not have start date, end date and state fails
**Apache Airflow version**: 2.0
**What happened**:
Calling the endpoint `http://localhost:28080/api/v1//dags/~/dagRuns/~/taskInstances/list` with dags whose tasks instances have not started fails.
```
{
"detail": "None is not of type 'string'\n\nFailed validating 'type' in schema['allOf'][0]['properties']['task_instances']['items']['properties']['start_date']:\n {'format': 'datetime', 'type': 'string'}\n\nOn instance['task_instances'][9349]['start_date']:\n None",
"status": 500,
"title": "Response body does not conform to specification",
"type": "https://airflow.readthedocs.io/en/latest/stable-rest-api-ref.html#section/Errors/Unknown"
}
```
**What you expected to happen**:
I expected to see a list of task instances like this:
```
{
"task_instances": [
{
"dag_id": "latest_only",
"duration": 0.481884,
"end_date": "2020-11-11T22:03:03.822310+00:00",
"execution_date": "2020-11-10T12:00:00+00:00",
"executor_config": "{}",
"hostname": "7b6c973dde4b",
"max_tries": 0,
"operator": "LatestOnlyOperator",
"pid": 1943,
"pool": "default_pool",
"pool_slots": 1,
"priority_weight": 2,
"queue": "default",
"queued_when": "2020-11-11T22:03:02.648502+00:00",
"sla_miss": null,
"start_date": "2020-11-11T22:03:03.340426+00:00",
"state": "success",
"task_id": "latest_only",
"try_number": 1,
"unixname": "root"
},
{
"dag_id": "example_branch_dop_operator_v3",
"duration": null,
"end_date": null,
"execution_date": "2020-11-11T02:18:00+00:00",
"executor_config": "{}",
"hostname": "",
"max_tries": 0,
"operator": "BranchPythonOperator",
"pid": null,
"pool": "default_pool",
"pool_slots": 1,
"priority_weight": 3,
"queue": "default",
"queued_when": null,
"sla_miss": null,
"start_date": null,
"state": null,
"task_id": "condition",
"try_number": 0,
"unixname": "root"
}
],
"total_entries": 2
}
```
**How to reproduce it**:
Call the endpoint `http://localhost:28080/api/v1//dags/~/dagRuns/~/taskInstances/list` with dags whose tasks instances have not started. | https://github.com/apache/airflow/issues/12306 | https://github.com/apache/airflow/pull/12453 | e9cfa393ab05e9d9546e5c203d4b39af5586031d | 44282350716e322a20e9da069f63d4f2fa6fbc42 | 2020-11-12T08:08:38Z | python | 2020-11-20T11:36:46Z |
closed | apache/airflow | https://github.com/apache/airflow | 12,298 | ["docs/apache-airflow/concepts/dags.rst"] | Add docs around when to use TaskGroup vs SubDag and potentially listing PROs and CONS. | It would be great for users to know when they should use TaskGroup vs SubDag. A section somewhere in docs would be great or even better a Matrix / table to explain difference would be aweomse.
What are the PROs and CONs of each
**TaskGroup**: https://airflow.readthedocs.io/en/latest/concepts.html#taskgroup
**SubDags**: https://airflow.readthedocs.io/en/latest/concepts.html#subdags | https://github.com/apache/airflow/issues/12298 | https://github.com/apache/airflow/pull/20700 | 129b4d2ac2ce09d42fb487f8a9aaac7eb7901a05 | 6b0c52898555641059e149c5ff0d9b46b2d45379 | 2020-11-11T23:56:49Z | python | 2022-01-09T21:58:26Z |
closed | apache/airflow | https://github.com/apache/airflow | 12,292 | ["airflow/operators/subdag.py", "tests/operators/test_subdag_operator.py"] | Deprecate SubDags in Favor of TaskGroups | Once TaskGroups (https://airflow.readthedocs.io/en/latest/concepts.html#taskgroup) that would be released in Airflow 2.0 reach feature parity with SubDags and we have wide adoption and feedback from users about Taskgroups we should deprecate Subdags and remove them eventually in Airflow 3.0.
Discussion Thread: https://lists.apache.org/thread.html/ra52746f9c8274469d343b5f0251199de776e75ab75ded6830886fb6a%40%3Cdev.airflow.apache.org%3E | https://github.com/apache/airflow/issues/12292 | https://github.com/apache/airflow/pull/17488 | 69d2ed65cb7c9384d309ae5e499d5798c2c3ac96 | b311bc0237b28c6d23f54137ed46f46e7fa5893f | 2020-11-11T23:15:54Z | python | 2021-08-08T12:05:31Z |
closed | apache/airflow | https://github.com/apache/airflow | 12,284 | ["airflow/providers/google/cloud/hooks/bigquery.py", "tests/providers/google/cloud/hooks/test_bigquery.py"] | BigQueryGetDataOperator doesn't return data correctly | Apache Airflow version: 1.10.*
BackPort Packages version: 2020.11.13rc1
What happened:
Although running the `BigQueryGetDataOperator` without specifying `selected_fields` appears to be successful, if you inspect `table_data` returned from `hook.list_rows` then it is just a list of empty sets.
```
[2020-11-11 15:10:30,665] {bigquery.py:483} INFO - Total extracted rows: 253
[(), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), (), ()]
[2020-11-11 15:10:30,679] {taskinstance.py:1143} INFO - Marking task as SUCCESS. dag_id=my-test-dag-v1.0, task_id=get-table, execution_date=20201101T000000, start_date=20201111T151030, end_date=20201111T151030
```
Adding some fields to the `selected_fields` argument, does correctly return data however.
```
[2020-11-11 15:13:37,608] {bigquery.py:483} INFO - Total extracted rows: 253
[(None, 'Anonymous Proxy'), (None, 'Satellite Provider'), (None, 'Asia/Pacific Region'), (None, 'European Union'), ('Pacific', 'American Samoa'), ('Pacific', 'Australia'), ('Pacific', 'Cook Islands'), ('Pacific', 'Fiji'), ('Pacific', 'Micronesia, Federated States of'), ('Pacific', 'Guam'), ('Pacific', 'Kiribati'), ('Pacific', 'Marshall Islands'), ('Pacific', 'Northern Mariana Islands'), ('Pacific', 'New Caledonia'), ('Pacific', 'Norfolk Island'), ('Pacific', 'Nauru'),
```
I think the offending code is between this two lines:
https://github.com/apache/airflow/blob/master/airflow/providers/google/cloud/hooks/bigquery.py#L1299
https://github.com/apache/airflow/blob/master/airflow/providers/google/cloud/hooks/bigquery.py#L1312
When no fields are submitted, it passes `[]` to the `list_rows` method in the BigQuery client which returns:
```
[Row((), {}), Row((), {}), Row((), {}), Row((), {}), Row((), {}), Row((), {}), Row((), {}), Row((), {}), Row((), {}), Row((), {}), Row((), {}), Row((), {}), Row((), {}), Row((), {}), Row((), {}), Row((), {}), Row((), {}), Row((), {}), Row((), {}), Row((), {}), Row((), {}), Row((), {}), Row((), {}), Row((), {}), Row((), {}), Row((), {}), Row((), {}), Row((), {}), Row((), {}), Row((), {}), Row((), {}), R
```
Changing `hooks.bigquery.py` so that only `None` or a non-empty list is passed will fix the issue. | https://github.com/apache/airflow/issues/12284 | https://github.com/apache/airflow/pull/12307 | 571f831cfe4879fbe56e19ea9b1b93bb7c830d5e | 32b59f8350f55793df6838a32de662a80483ecda | 2020-11-11T15:32:29Z | python | 2020-11-12T22:47:00Z |
closed | apache/airflow | https://github.com/apache/airflow | 12,276 | ["airflow/api_connexion/openapi/v1.yaml", "airflow/models/log.py", "tests/api_connexion/endpoints/test_event_log_endpoint.py"] | (REST API) List event logs endpoint is not working |
The list event log endpoint is not working
**Apache Airflow version**: 2.0
**Environment**: Breeze
**What happened**:
Calling the list event logs endpoint `http://localhost:28080/api/v1/eventLogs` returns 500 status code with the below message:
```
{
"detail": "None is not of type 'string'\n\nFailed validating 'type' in schema['allOf'][0]['properties']['event_logs']['items']['properties']['dag_id']:\n {'description': 'The DAG ID', 'readOnly': True, 'type': 'string'}\n\nOn instance['event_logs'][0]['dag_id']:\n None",
"status": 500,
"title": "Response body does not conform to specification",
"type": "https://airflow.readthedocs.io/en/latest/stable-rest-api-ref.html#section/Errors/Unknown"
}
```
**What you expected to happen**:
I expected it to produce a list of event logs:
```
{
"event_logs": [
{
"dag_id": null,
"event": "cli_webserver",
"event_log_id": 482,
"execution_date": null,
"extra": "{\"host_name\": \"e24b454f002a\", \"full_command\": \"['/usr/local/bin/airflow', 'webserver']\"}",
"owner": "root",
"task_id": null,
"when": "2020-11-11T03:28:48.722814+00:00"
},
{
"dag_id": null,
"event": "cli_scheduler",
"event_log_id": 483,
"execution_date": null,
"extra": "{\"host_name\": \"e24b454f002a\", \"full_command\": \"['/usr/local/bin/airflow', 'scheduler']\"}",
"owner": "root",
"task_id": null,
"when": "2020-11-11T03:32:18.684231+00:00"
},
],
"total_entries": 2
}
```
**How to reproduce it**:
1. Start airflow webserver and scheduler in breeze
2. Call the endpoint `http://localhost:28080/api/v1/eventLogs`
3. Check the response
| https://github.com/apache/airflow/issues/12276 | https://github.com/apache/airflow/pull/12287 | 388736bf97a4313f81aadbeecbb99e5fcb145c31 | 0d37c59669afebe774355a310a889e3cfa378862 | 2020-11-11T04:21:14Z | python | 2020-11-11T19:10:13Z |
closed | apache/airflow | https://github.com/apache/airflow | 12,269 | ["airflow/example_dags/tutorial_taskflow_api_etl.py", "docs/apache-airflow/tutorial_taskflow_api.rst"] | Update TaskFlow API example to use `@dag` decorator |
**Description**
Update https://github.com/apache/airflow/blob/master/airflow/example_dags/tutorial_taskflow_api_etl.py tp use `@dag` decorator instead of leveraging context manager to make it more TaskFlow-y.
| https://github.com/apache/airflow/issues/12269 | https://github.com/apache/airflow/pull/12937 | 5c74c3a5c1bc6424a068f1dd21a2d999b92cd8c5 | 18a7a35eabc8aef5a9ba2f2cf0f9b64d5f5ebd58 | 2020-11-10T21:14:18Z | python | 2020-12-09T09:40:45Z |
closed | apache/airflow | https://github.com/apache/airflow | 12,263 | ["scripts/ci/images/ci_build_dockerhub.sh"] | Tagged Production image build fails in DockerHub | Beta1 and Beta 2 were built manually as the scripts for building images in DockerHub ALMOST worked :). they failed on pulling tagged -build images.
```
Cloning into '.'...
Warning: Permanently added the RSA host key for IP address '140.82.114.4' to the list of known hosts.
Switched to a new branch '2.0.0b2'
Executing build hook...
DOCKER_REPO=index.docker.io/apache/airflow
DOCKERHUB_USER=apache
DOCKERHUB_REPO=airflow
DOCKER_TAG=2.0.0b2-python3.7
Detected PYTHON_MAJOR_MINOR_VERSION=3.7
+++ date +%s
++ START_SCRIPT_TIME=1605013728
++ build_images::determine_docker_cache_strategy
++ [[ -z '' ]]
++ [[ false == \t\r\u\e ]]
++ export DOCKER_CACHE=pulled
++ DOCKER_CACHE=pulled
++ readonly DOCKER_CACHE
++ verbosity::print_info
++ [[ false == \t\r\u\e ]]
++ verbosity::print_info 'Using pulled cache strategy for the build.'
++ [[ false == \t\r\u\e ]]
++ verbosity::print_info
++ [[ false == \t\r\u\e ]]
++ initialization::get_environment_for_builds_on_ci
++ [[ false == \t\r\u\e ]]
++ export CI_TARGET_REPO=apache/airflow
++ CI_TARGET_REPO=apache/airflow
++ export CI_TARGET_BRANCH=master
++ CI_TARGET_BRANCH=master
++ export CI_BUILD_ID=0
++ CI_BUILD_ID=0
++ export CI_JOB_ID=0
++ CI_JOB_ID=0
++ export CI_EVENT_TYPE=pull_request
++ CI_EVENT_TYPE=pull_request
++ export CI_REF=refs/head/master
++ CI_REF=refs/head/master
++ [[ false == \t\r\u\e ]]
++ build_images::get_docker_image_names
++ export PYTHON_BASE_IMAGE_VERSION=3.7
++ PYTHON_BASE_IMAGE_VERSION=3.7
++ export PYTHON_BASE_IMAGE=python:3.7-slim-buster
++ PYTHON_BASE_IMAGE=python:3.7-slim-buster
++ export AIRFLOW_CI_BASE_TAG=master-python3.7-ci
++ AIRFLOW_CI_BASE_TAG=master-python3.7-ci
++ export AIRFLOW_CI_IMAGE=apache/airflow:master-python3.7-ci
++ AIRFLOW_CI_IMAGE=apache/airflow:master-python3.7-ci
++ export AIRFLOW_CI_PYTHON_IMAGE=apache/airflow:python3.7-master
++ AIRFLOW_CI_PYTHON_IMAGE=apache/airflow:python3.7-master
++ export AIRFLOW_CI_IMAGE=apache/airflow:master-python3.7-ci
++ AIRFLOW_CI_IMAGE=apache/airflow:master-python3.7-ci
++ [[ 2.0.0b2-python3.7 == '' ]]
++ export AIRFLOW_PROD_BASE_TAG=2.0.0b2-python3.7
++ AIRFLOW_PROD_BASE_TAG=2.0.0b2-python3.7
++ export AIRFLOW_PROD_IMAGE=apache/airflow:2.0.0b2-python3.7
++ AIRFLOW_PROD_IMAGE=apache/airflow:2.0.0b2-python3.7
++ export AIRFLOW_PROD_BUILD_IMAGE=apache/airflow:2.0.0b2-python3.7-build
++ AIRFLOW_PROD_BUILD_IMAGE=apache/airflow:2.0.0b2-python3.7-build
++ export AIRFLOW_PROD_IMAGE_KUBERNETES=apache/airflow:2.0.0b2-python3.7-kubernetes
++ AIRFLOW_PROD_IMAGE_KUBERNETES=apache/airflow:2.0.0b2-python3.7-kubernetes
++ export AIRFLOW_PROD_IMAGE_DEFAULT=apache/airflow:master
++ AIRFLOW_PROD_IMAGE_DEFAULT=apache/airflow:master
++ export BUILT_CI_IMAGE_FLAG_FILE=/src/b3fpsnmwartmqn9f6rbzfxh/.build/master/.built_3.7
++ BUILT_CI_IMAGE_FLAG_FILE=/src/b3fpsnmwartmqn9f6rbzfxh/.build/master/.built_3.7
++ initialization::make_constants_read_only
++ readonly PYTHON_MAJOR_MINOR_VERSION
++ readonly WEBSERVER_HOST_PORT
++ readonly POSTGRES_HOST_PORT
++ readonly MYSQL_HOST_PORT
++ readonly HOST_USER_ID
++ readonly HOST_GROUP_ID
++ readonly HOST_AIRFLOW_SOURCES
++ readonly HOST_HOME
++ readonly HOST_OS
++ readonly ENABLE_KIND_CLUSTER
++ readonly KUBERNETES_MODE
++ readonly KUBERNETES_VERSION
++ readonly KIND_VERSION
++ readonly HELM_VERSION
++ readonly KUBECTL_VERSION
++ readonly BACKEND
++ readonly POSTGRES_VERSION
++ readonly MYSQL_VERSION
++ readonly MOUNT_LOCAL_SOURCES
++ readonly INSTALL_AIRFLOW_VERSION
++ readonly INSTALL_AIRFLOW_REFERENCE
++ readonly DB_RESET
++ readonly VERBOSE
++ readonly START_AIRFLOW
++ readonly PRODUCTION_IMAGE
++ readonly SKIP_BUILDING_PROD_IMAGE
++ readonly CI_BUILD_ID
++ readonly CI_JOB_ID
++ readonly IMAGE_TAG
++ readonly AIRFLOW_PRE_CACHED_PIP_PACKAGES
++ readonly INSTALL_AIRFLOW_VIA_PIP
++ readonly AIRFLOW_LOCAL_PIP_WHEELS
++ readonly AIRFLOW_CONSTRAINTS_REFERENCE
++ readonly AIRFLOW_CONSTRAINTS_LOCATION
++ readonly ADDITIONAL_AIRFLOW_EXTRAS
++ readonly ADDITIONAL_PYTHON_DEPS
++ readonly AIRFLOW_PRE_CACHED_PIP_PACKAGES
++ readonly DEV_APT_COMMAND
++ readonly DEV_APT_DEPS
++ readonly ADDITIONAL_DEV_APT_COMMAND
++ readonly ADDITIONAL_DEV_APT_DEPS
++ readonly ADDITIONAL_DEV_APT_ENV
++ readonly RUNTIME_APT_COMMAND
++ readonly RUNTIME_APT_DEPS
++ readonly ADDITIONAL_RUNTIME_APT_COMMAND
++ readonly ADDITIONAL_RUNTIME_APT_DEPS
++ readonly ADDITIONAL_RUNTIME_APT_ENV
++ readonly DOCKERHUB_USER
++ readonly DOCKERHUB_REPO
++ readonly DOCKER_CACHE
++ readonly USE_GITHUB_REGISTRY
++ readonly GITHUB_REGISTRY
++ readonly GITHUB_REGISTRY_WAIT_FOR_IMAGE
++ readonly GITHUB_REGISTRY_PULL_IMAGE_TAG
++ readonly GITHUB_REGISTRY_PUSH_IMAGE_TAG
++ readonly GITHUB_REPOSITORY
++ readonly GITHUB_TOKEN
++ readonly GITHUB_USERNAME
++ readonly FORWARD_CREDENTIALS
++ readonly USE_GITHUB_REGISTRY
++ readonly EXTRA_STATIC_CHECK_OPTIONS
++ readonly VERSION_SUFFIX_FOR_PYPI
++ readonly VERSION_SUFFIX_FOR_SVN
++ readonly PYTHON_BASE_IMAGE_VERSION
++ readonly PYTHON_BASE_IMAGE
++ readonly AIRFLOW_CI_BASE_TAG
++ readonly AIRFLOW_CI_IMAGE
++ readonly AIRFLOW_CI_IMAGE_DEFAULT
++ readonly AIRFLOW_PROD_BASE_TAG
++ readonly AIRFLOW_PROD_IMAGE
++ readonly AIRFLOW_PROD_BUILD_IMAGE
++ readonly AIRFLOW_PROD_IMAGE_KUBERNETES
++ readonly AIRFLOW_PROD_IMAGE_DEFAULT
++ readonly BUILT_CI_IMAGE_FLAG_FILE
++ readonly INIT_SCRIPT_FILE
++ traps::add_trap start_end::script_end EXIT HUP INT TERM
++ trap=start_end::script_end
++ shift
++ for signal in '"${@}"'
++ local handlers
+++ cut -f2 -d \'
+++ trap -p EXIT
++ handlers='rm -rf -- '
++ trap 'start_end::script_end;rm -rf -- ' EXIT
++ for signal in '"${@}"'
++ local handlers
+++ cut -f2 -d \'
+++ trap -p HUP
++ handlers='rm -rf -- '
++ trap 'start_end::script_end;rm -rf -- ' HUP
++ for signal in '"${@}"'
++ local handlers
+++ cut -f2 -d \'
+++ trap -p INT
++ handlers='rm -rf -- '
++ trap 'start_end::script_end;rm -rf -- ' INT
++ for signal in '"${@}"'
++ local handlers
+++ cut -f2 -d \'
+++ trap -p TERM
++ handlers='rm -rf -- '
++ trap 'start_end::script_end;rm -rf -- ' TERM
+ [[ 2.0.0b2-python3.7 == *python*-ci ]]
+ [[ 2.0.0b2-python3.7 == *python* ]]
+ echo
+ echo 'Building prod image'
Building prod image
+ echo
+ rm -rf /src/b3fpsnmwartmqn9f6rbzfxh/.build
+ build_images::prepare_prod_build
+ [[ -n '' ]]
+ [[ -n '' ]]
+ EXTRA_DOCKER_PROD_BUILD_FLAGS=("--build-arg" "AIRFLOW_CONSTRAINTS_REFERENCE=${DEFAULT_CONSTRAINTS_BRANCH}")
+ [[ 3.6 == \3\.\7 ]]
+ export DEFAULT_CI_IMAGE=
+ DEFAULT_CI_IMAGE=
+ export THE_IMAGE_TYPE=PROD
+ THE_IMAGE_TYPE=PROD
+ export 'IMAGE_DESCRIPTION=Airflow production'
+ IMAGE_DESCRIPTION='Airflow production'
+ export AIRFLOW_EXTRAS=async,aws,azure,celery,dask,elasticsearch,gcp,kubernetes,mysql,postgres,redis,slack,ssh,statsd,virtualenv
+ AIRFLOW_EXTRAS=async,aws,azure,celery,dask,elasticsearch,gcp,kubernetes,mysql,postgres,redis,slack,ssh,statsd,virtualenv
+ readonly AIRFLOW_EXTRAS
+ export AIRFLOW_IMAGE=apache/airflow:2.0.0b2-python3.7
+ AIRFLOW_IMAGE=apache/airflow:2.0.0b2-python3.7
+ readonly AIRFLOW_IMAGE
+ [[ false == \t\r\u\e ]]
+ AIRFLOW_BRANCH_FOR_PYPI_PRELOADING=master
+ sanity_checks::go_to_airflow_sources
+ verbosity::print_info
+ [[ false == \t\r\u\e ]]
+ pushd /src/b3fpsnmwartmqn9f6rbzfxh
+ verbosity::print_info
+ [[ false == \t\r\u\e ]]
++ pwd
+ verbosity::print_info 'Running in host in /src/b3fpsnmwartmqn9f6rbzfxh'
+ [[ false == \t\r\u\e ]]
+ verbosity::print_info
+ [[ false == \t\r\u\e ]]
+ build_images::build_prod_images
+ build_images::print_build_info
+ verbosity::print_info
+ [[ false == \t\r\u\e ]]
+ verbosity::print_info 'Airflow 2.0.0b2 Python: 3.7. Image description: Airflow production'
+ [[ false == \t\r\u\e ]]
+ verbosity::print_info
+ [[ false == \t\r\u\e ]]
+ [[ false == \t\r\u\e ]]
+ push_pull_remove_images::pull_prod_images_if_needed
+ [[ pulled == \p\u\l\l\e\d ]]
+ [[ false == \t\r\u\e ]]
+ [[ false == \t\r\u\e ]]
+ push_pull_remove_images::pull_image_if_not_present_or_forced apache/airflow:2.0.0b2-python3.7-build
+ local IMAGE_TO_PULL=apache/airflow:2.0.0b2-python3.7-build
+ local IMAGE_HASH
++ docker images -q apache/airflow:2.0.0b2-python3.7-build
+ IMAGE_HASH=
+ local PULL_IMAGE=false
+ [[ -z '' ]]
+ PULL_IMAGE=true
+ [[ true == \t\r\u\e ]]
+ echo
+ echo 'Pulling the image apache/airflow:2.0.0b2-python3.7-build'
Pulling the image apache/airflow:2.0.0b2-python3.7-build
+ echo
+ docker pull apache/airflow:2.0.0b2-python3.7-build
+ verbosity::store_exit_on_error_status
+ exit_on_error=false
+ [[ ehuxB == *e* ]]
+ exit_on_error=true
+ set +e
+ [[ false == \t\r\u\e ]]
+ [[ true == \f\a\l\s\e ]]
+ /usr/bin/docker pull apache/airflow:2.0.0b2-python3.7-build
++ tee -a /tmp/tmp.zuzptQhEPi/out.log
++ tee -a /tmp/tmp.zuzptQhEPi/out.log
Error response from daemon: manifest for apache/airflow:2.0.0b2-python3.7-build not found: manifest unknown: manifest unknown
+ res=1
+ [[ 1 == \0 ]]
+ [[ true == \f\a\l\s\e ]]
+ verbosity::restore_exit_on_error_status
+ [[ true == \t\r\u\e ]]
+ set -e
+ unset exit_on_error
+ return 1
+ start_end::script_end
+ local exit_code=1
+ [[ 1 != 0 ]]
+ [[ -f /tmp/tmp.zuzptQhEPi/out.log ]]
+ [[ true == \f\a\l\s\e ]]
+ [[ false == \t\r\u\e ]]
+ verbosity::print_info '###########################################################################################'
+ [[ false == \t\r\u\e ]]
+ verbosity::print_info ' EXITING WITH STATUS CODE 1'
+ [[ false == \t\r\u\e ]]
+ verbosity::print_info '###########################################################################################'
+ [[ false == \t\r\u\e ]]
+ [[ true == \t\r\u\e ]]
+ set +x
build hook failed! (1)
``` | https://github.com/apache/airflow/issues/12263 | https://github.com/apache/airflow/pull/12378 | 561e4594913395c52a331e44ec2f638b55fa513e | 0038660fddc99f454a8ecf4de53be9848f7ddc5d | 2020-11-10T17:55:26Z | python | 2020-11-16T01:26:36Z |
closed | apache/airflow | https://github.com/apache/airflow | 12,262 | ["airflow/www/compile_assets_if_needed.sh", "scripts/in_container/entrypoint_ci.sh"] | Automate asset compiling when entering breeze | Currenlty when any of the assets change, you need to remember about recompiling assets when entering breeze. This can be easily automated so that the rebuild will happen every time assets change.
Linked issue #12258 | https://github.com/apache/airflow/issues/12262 | https://github.com/apache/airflow/pull/13292 | af611e76ed18b51b32bc72dfe4d97af6b21e7d5f | a1e06ac7a65dddfee26e39b4191766d9c840c1fe | 2020-11-10T17:43:31Z | python | 2020-12-23T20:08:47Z |
closed | apache/airflow | https://github.com/apache/airflow | 12,261 | ["scripts/ci/images/ci_build_dockerhub.sh", "scripts/ci/images/ci_prepare_prod_image_on_ci.sh", "scripts/ci/libraries/_build_images.sh", "scripts/ci/libraries/_initialization.sh"] | Make production image use provider packages to build for production. | Currently, the production image we use is build directly from sources. This is great for development, but id does not really test if airflow will work if installed from packages. We should be able to build the packages locally and build the image using whl packages as sources of pip packages.
This will be as close to someone installing airflow from those packages manually
Once we use it for testing, we should also consider to build the image published in DockerHub to be built from those packages but it adds some complications in building scripts. This is possible but we have to test it first.
That needs two parts:
- [x] changing images in CI to be built from packages
- [x] changing images in DockerHub to be built from packages
| https://github.com/apache/airflow/issues/12261 | https://github.com/apache/airflow/pull/12908 | ef523b4c2bdb10a10ad042d36a57157cb5d85723 | f9e9ad2b096ff9d8ee78224333f799ca3968b6bd | 2020-11-10T17:32:52Z | python | 2020-12-08T11:45:03Z |
closed | apache/airflow | https://github.com/apache/airflow | 12,255 | [".pre-commit-config.yaml", "BREEZE.rst", "CONTRIBUTING.rst", "INSTALL", "STATIC_CODE_CHECKS.rst", "breeze-complete", "docs/installation.rst", "scripts/ci/pre_commit/pre_commit_check_extras_have_providers.py", "setup.py"] | Statsd (tbc. if more extras) tries to install provider package where it is missing | **Apache Airflow version**: 2.0.0b2
**What happened**:
Starting in 2.0.0b2, extras that aren't also providers, such as `statsd`, prevent Airflow from being installed.
**How to reproduce it**:
```
$ docker run --rm -ti python:3.6 bash
# pip install apache-airflow[statsd]==2.0.0b2
Collecting apache-airflow[statsd]==2.0.0b2
Downloading apache_airflow-2.0.0b2-py3-none-any.whl (4.5 MB)
...
Collecting statsd<4.0,>=3.3.0; extra == "statsd"
Downloading statsd-3.3.0-py2.py3-none-any.whl (11 kB)
ERROR: Could not find a version that satisfies the requirement apache-airflow-providers-statsd; extra == "statsd" (from apache-airflow[statsd]==2.0.0b2) (from versions: none)
ERROR: No matching distribution found for apache-airflow-providers-statsd; extra == "statsd" (from apache-airflow[statsd]==2.0.0b2)
```
I believe this is from https://github.com/apache/airflow/pull/12233 | https://github.com/apache/airflow/issues/12255 | https://github.com/apache/airflow/pull/12265 | cbf49848afa43f693d890ac5cce8000aa723d2bf | 348510f86b8ee6b7d89c1355258e61095a6a29e9 | 2020-11-10T15:50:43Z | python | 2020-11-11T16:13:57Z |
closed | apache/airflow | https://github.com/apache/airflow | 12,254 | ["airflow/www/templates/airflow/dag.html"] | "Log" button on graph view popup doesn't open the logs view |
**Apache Airflow version**:2.0.0b2
**Kubernetes version (if you are using kubernetes)** (use `kubectl version`): NA
**Environment**:
- **Cloud provider or hardware configuration**:
- **OS** (e.g. from /etc/os-release): macOS
- **Kernel** (e.g. `uname -a`):
- **Install tools**:
- **Others**:
**What happened**:
Clicking on the "Log" button in the popup for task instance in graph view doesn't link to the logs view.
**What you expected to happen**:
The log view should show up.
**How to reproduce it**:
1. Run any dag.
2. Open the Graph View.
3. Click on any task.
4. Click on the "Log" button.
**Anything else we need to know**:
Check this in action [here](https://youtu.be/fXEQ-yOwMrM).
| https://github.com/apache/airflow/issues/12254 | https://github.com/apache/airflow/pull/12268 | 0cd1c846b2fb4d830b87e11b884094ee4765ab22 | 938c512c6d9e05865cb6c8e0098ba6dba5ef55b6 | 2020-11-10T15:38:38Z | python | 2020-11-10T22:18:15Z |
closed | apache/airflow | https://github.com/apache/airflow | 12,247 | [".pre-commit-config.yaml", "BREEZE.rst", "STATIC_CODE_CHECKS.rst", "airflow/example_dags/example_dag_decorator.py", "airflow/operators/email.py", "airflow/operators/python.py", "airflow/www/utils.py", "breeze-complete"] | Example DAGs import error | <!--
-->
**Apache Airflow version**: 2.0.0b2
**Kubernetes version (if you are using kubernetes)** (use `kubectl version`):
**Environment**:
- **Cloud provider or hardware configuration**:
- **OS** (e.g. from /etc/os-release):macOS
- **Kernel** (e.g. `uname -a`):
- **Install tools**:
- **Others**:
**What happened**:
Running `airflow db init` followed by `airflow dags list` gives the following error:
```
[2020-11-10 19:04:24,553] {dagbag.py:297} ERROR - Failed to import: /Users/abagri/Workspace/airflow2.0/venv/lib/python3.8/site-packages/airflow/example_dags/example_kubernetes_executor_config.py
Traceback (most recent call last):
File "/Users/abagri/Workspace/airflow2.0/venv/lib/python3.8/site-packages/airflow/models/dagbag.py", line 294, in _load_modules_from_file
loader.exec_module(new_module)
File "<frozen importlib._bootstrap_external>", line 783, in exec_module
File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
File "/Users/abagri/Workspace/airflow2.0/venv/lib/python3.8/site-packages/airflow/example_dags/example_kubernetes_executor_config.py", line 23, in <module>
from kubernetes.client import models as k8s
ModuleNotFoundError: No module named 'kubernetes'
[2020-11-10 19:04:24,558] {dagbag.py:297} ERROR - Failed to import: /Users/abagri/Workspace/airflow2.0/venv/lib/python3.8/site-packages/airflow/example_dags/example_dag_decorator.py
Traceback (most recent call last):
File "/Users/abagri/Workspace/airflow2.0/venv/lib/python3.8/site-packages/airflow/models/dagbag.py", line 294, in _load_modules_from_file
loader.exec_module(new_module)
File "<frozen importlib._bootstrap_external>", line 783, in exec_module
File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
File "/Users/abagri/Workspace/airflow2.0/venv/lib/python3.8/site-packages/airflow/example_dags/example_dag_decorator.py", line 25, in <module>
from airflow.providers.http.operators.http import SimpleHttpOperator
ModuleNotFoundError: No module named 'airflow.providers'
*****// Followed by name of other DAG, omiting here *******
```
**What you expected to happen**:
Expected the example DAGs to import without errors
**How to reproduce it**:
1. Run `airflow db init`
2. Run `airflow dags list`
**Anything else we need to know**:
| https://github.com/apache/airflow/issues/12247 | https://github.com/apache/airflow/pull/12252 | 4f9439dec13d3118d5423bac246064dea7a95002 | 0cd1c846b2fb4d830b87e11b884094ee4765ab22 | 2020-11-10T13:38:19Z | python | 2020-11-10T21:49:08Z |
closed | apache/airflow | https://github.com/apache/airflow | 12,239 | ["airflow/api_connexion/__init__.py", "docs/conf.py"] | Airflow v2.0.0b1 package doesnt include "api_connexion/exceptions" |
**Apache Airflow version**: 2.0.0b1
**Kubernetes version (if you are using kubernetes)** (use `kubectl version`): NA
**Environment**:
- **Cloud provider or hardware configuration**:
- **OS** (e.g. from /etc/os-release): macOS
- **Kernel** (e.g. `uname -a`):
- **Install tools**:
- **Others**:
**What happened**:
Installed apache-airflow==2.0.0b1 using pip.
Running `airflow webserver` gave the following error:
```
Traceback (most recent call last):
File "/Users/abagri/Workspace/service-workflows/venv/bin/airflow", line 8, in <module>
sys.exit(main())
File "/Users/abagri/Workspace/service-workflows/venv/lib/python3.8/site-packages/airflow/__main__.py", line 40, in main
args.func(args)
File "/Users/abagri/Workspace/service-workflows/venv/lib/python3.8/site-packages/airflow/cli/cli_parser.py", line 49, in command
func = import_string(import_path)
File "/Users/abagri/Workspace/service-workflows/venv/lib/python3.8/site-packages/airflow/utils/module_loading.py", line 32, in import_string
module = import_module(module_path)
File "/usr/local/Cellar/[email protected]/3.8.5/Frameworks/Python.framework/Versions/3.8/lib/python3.8/importlib/__init__.py", line 127, in import_module
return _bootstrap._gcd_import(name[level:], package, level)
File "<frozen importlib._bootstrap>", line 1014, in _gcd_import
File "<frozen importlib._bootstrap>", line 991, in _find_and_load
File "<frozen importlib._bootstrap>", line 975, in _find_and_load_unlocked
File "<frozen importlib._bootstrap>", line 671, in _load_unlocked
File "<frozen importlib._bootstrap_external>", line 783, in exec_module
File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
File "/Users/abagri/Workspace/service-workflows/venv/lib/python3.8/site-packages/airflow/cli/commands/webserver_command.py", line 43, in <module>
from airflow.www.app import cached_app, create_app
File "/Users/abagri/Workspace/service-workflows/venv/lib/python3.8/site-packages/airflow/www/app.py", line 39, in <module>
from airflow.www.extensions.init_views import (
File "/Users/abagri/Workspace/service-workflows/venv/lib/python3.8/site-packages/airflow/www/extensions/init_views.py", line 25, in <module>
from airflow.api_connexion.exceptions import common_error_handler
ModuleNotFoundError: No module named 'airflow.api_connexion.exceptions'
```
**What you expected to happen**:
Expected this command to start the webserver
**How to reproduce it**:
Install a fresh version of airflow, run `airflow db init` followed by `airflow webserver`
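A minimal shell version of those steps (assuming a clean virtualenv):
```bash
pip install apache-airflow==2.0.0b1
airflow db init
airflow webserver   # fails with ModuleNotFoundError: No module named 'airflow.api_connexion.exceptions'
```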
**Anything else we need to know**:
| https://github.com/apache/airflow/issues/12239 | https://github.com/apache/airflow/pull/12240 | 5912d0cae7033b3e2549280677dd60faa53be5e7 | 249d1741e448d2938fd7f507c62961ae748db1ad | 2020-11-10T10:28:09Z | python | 2020-11-10T11:26:34Z |
closed | apache/airflow | https://github.com/apache/airflow | 12,231 | ["setup.py"] | Extras installation was lost in the last rebase | PR #11526 was badly rebased just before the beta1 release and a few
lines installing the providers were lost.
| https://github.com/apache/airflow/issues/12231 | https://github.com/apache/airflow/pull/12233 | 45587a664433991b01a24bf0210116c3b562adc7 | 5912d0cae7033b3e2549280677dd60faa53be5e7 | 2020-11-10T07:52:14Z | python | 2020-11-10T10:52:55Z |
closed | apache/airflow | https://github.com/apache/airflow | 12,217 | ["airflow/providers/sendgrid/PROVIDER_CHANGES_1.0.0b1.md", "airflow/providers/sendgrid/README.md"] | Sendgrid provider is missing README.md file | The SendGrid provider is missing a README.md file -- this means we haven't yet published a release for this. | https://github.com/apache/airflow/issues/12217 | https://github.com/apache/airflow/pull/12245 | 5ac1738d52840ac59f75bb93627d45ce22029409 | c5806efb54ad06049e13a5fc7df2f03846fe566e | 2020-11-09T22:34:37Z | python | 2020-11-10T12:39:38Z |
closed | apache/airflow | https://github.com/apache/airflow | 12,214 | ["airflow/providers/slack/hooks/slack_webhook.py", "tests/providers/slack/hooks/test_slack_webhook.py"] | SlackWebhookHook should use password instead of extra to store the token | Currently `SlackWebhookHook` gets the token from `extra`, but it would be more secure and intuitive to store it in the `password` field.
https://github.com/apache/airflow/blob/4fb5c017fe5ca41ed95547a857c9c39efc4f1476/airflow/providers/slack/hooks/slack_webhook.py | https://github.com/apache/airflow/issues/12214 | https://github.com/apache/airflow/pull/12674 | 03fa6edc7a3469e7cc660a97a2efe4b9e7ac7a19 | 2947e0999979fad1f2c98aeb4f1e46297e4c9864 | 2020-11-09T21:43:26Z | python | 2020-12-02T14:09:28Z |
closed | apache/airflow | https://github.com/apache/airflow | 12,201 | ["airflow/api_connexion/endpoints/dag_run_endpoint.py", "airflow/api_connexion/parameters.py", "airflow/api_connexion/schemas/dag_run_schema.py", "airflow/api_connexion/schemas/task_instance_schema.py", "tests/api_connexion/endpoints/test_dag_run_endpoint.py", "tests/api_connexion/endpoints/test_task_instance_endpoint.py", "tests/api_connexion/test_parameters.py"] | (REST API)Triggering a dagrun with naive datetime raises an html page instead of json | Triggering a dagrun with naive datetime raises an HTML page instead of JSON
**Apache Airflow version**:2.0
**What happened**:
An HTML page was returned instead of JSON when the REST API was called with a naive datetime.
**What you expected to happen**:
I expected it to return a 400 error status, indicating a bad request.
**How to reproduce it**:
1. Start the web server and scheduler inside Breeze.
2. Make a POST request to this endpoint: http://localhost:28080/api/v1/dags/example_bash_operator/dagRuns using this request body: {"execution_date": "2020-11-09T16:25:56.939143"} (or any naive datetime); an example request is shown below.
3. Observe that an HTML page is returned instead of JSON.
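For example, a request of roughly this shape reproduces it (the Basic-auth credentials are an assumption and depend on your API auth backend):
```bash
curl -i -X POST 'http://localhost:28080/api/v1/dags/example_bash_operator/dagRuns' \
  -H 'Content-Type: application/json' \
  --user 'admin:admin' \
  -d '{"execution_date": "2020-11-09T16:25:56.939143"}'
# expected: a JSON 400 error; observed: an HTML error page
```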
| https://github.com/apache/airflow/issues/12201 | https://github.com/apache/airflow/pull/12248 | 0d37c59669afebe774355a310a889e3cfa378862 | 7478e18ee55eed80a2b8a8f7599b95d0955986c0 | 2020-11-09T15:38:13Z | python | 2020-11-11T19:10:44Z |
closed | apache/airflow | https://github.com/apache/airflow | 12,197 | [".gitignore", "dev/provider_packages/MANIFEST_TEMPLATE.in.jinja2", "dev/provider_packages/SETUP_TEMPLATE.py.jinja2", "dev/provider_packages/prepare_provider_packages.py"] | Provider packages don't include datafiles | The amazon and google providers don't include necessary datafiles in them.
They were previously included in the sdist via MANIFEST.in (see https://github.com/apache/airflow/pull/12196) and in the bdist via include_package_data from the top level setup.py
Both of these are currently missing.
I've put this against 2.0.0-beta1, it _could_ be changed separately as providers are separate releases. | https://github.com/apache/airflow/issues/12197 | https://github.com/apache/airflow/pull/12200 | 55c401dbf9f7bf8730158f52a5ccc4aa7ab06381 | 765cbbcd76900fd0777730aaba637058b089ac95 | 2020-11-09T13:35:08Z | python | 2020-11-09T15:57:01Z |
closed | apache/airflow | https://github.com/apache/airflow | 12,193 | ["airflow/config_templates/config.yml", "airflow/config_templates/default_airflow.cfg", "airflow/jobs/scheduler_job.py", "docs/apache-airflow/scheduler.rst"] | Add detailed documentation for scheduler "tuning" parameters | https://github.com/apache/airflow/pull/12139 added some new params to the scheduler -- we have "cursory" docs for them in the comments in the config file, but we should have a more details docs in the scheduler.rst page. | https://github.com/apache/airflow/issues/12193 | https://github.com/apache/airflow/pull/12899 | 7d37391a2b93259281cbe99125240727319633ba | 5f65cec77e52e319358b12b0554aa8960fbff918 | 2020-11-09T12:13:46Z | python | 2020-12-08T00:18:58Z |
closed | apache/airflow | https://github.com/apache/airflow | 12,182 | ["airflow/www/templates/airflow/dags.html"] | "blocked" UI feature is broken in 2.0.0a2 and the latest releases of 1.10.x | When a DAG's number of active DagRuns >= its number of max allowed active DagRuns, its `Schedule` will be highlighted with red colour (and a tooltip should always be added).
However, this feature is broken in 2.0.0a2 and in the last few releases of the 1.10.x series.
## Example
in this case, the `schedule`'s background colour should be changed to red, according to code https://github.com/apache/airflow/blob/63ac07d9c735b1ccc8aa5fa974a260cc944cc539/airflow/www/templates/airflow/dags.html#L357-L367
While it is not working as expected.

I checked the history, and it seems to be caused by PR https://github.com/apache/airflow/pull/6985
| https://github.com/apache/airflow/issues/12182 | https://github.com/apache/airflow/pull/12183 | fcb6b00efef80c81272a30cfc618202a29e0c6a9 | 6ce95fb268f73298dffdc42f3cad33d922006c3a | 2020-11-08T16:09:51Z | python | 2020-11-08T19:51:40Z |
closed | apache/airflow | https://github.com/apache/airflow | 12,179 | ["UPDATING.md", "airflow/models/dagbag.py", "airflow/settings.py", "docs/concepts.rst", "tests/cluster_policies/__init__.py", "tests/dags/test_dag_with_no_tags.py", "tests/models/test_dagbag.py"] | Add DAG level cluster policy | **Description**
Currently we have **only** task level [cluster policy](https://airflow.readthedocs.io/en/latest/concepts.html?highlight=cluster%20policy#cluster-policy):
https://github.com/apache/airflow/blob/7dfb3980cecfafdb7d6b79d17455b08971cec7d4/airflow/models/dagbag.py#L386-L387
This is an amazing tool that allows users to validate their tasks by either skipping a DAG or setting default values (queues, owners, emails, etc.). I would like to propose the same mechanism for DAGs.
Some may argue that users already can implement it by accessing `task.dag` attribute in the policy, however this has the following drawbacks:
- it's not "explicit over implicit" see https://github.com/apache/airflow/issues/12086#issuecomment-722986766
- the "DAG policy" is executed for every task (if there's 1000 tasks in DAG, the DAG policy will be checked 1000 times)
**Use case / motivation**
Create an explicit mechanism for defining a "DAG level" cluster policy that is optimised for this purpose (checked only once per DAG, not once for every task) and that does not require users to implement this optimisation logic themselves (for example by monkey patching or changes in their fork). A sketch of such a hook is shown below.
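A rough sketch of what this could look like in `airflow_local_settings.py` (the function name and exception type are illustrative here, not a settled API):
```python
from airflow.exceptions import AirflowClusterPolicyViolation


def dag_policy(dag):
    """Called once per DAG at parse time, instead of once for every task."""
    if not dag.tags:
        raise AirflowClusterPolicyViolation(f"DAG {dag.dag_id} must declare at least one tag")
```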
**Related Issues**
#10282
| https://github.com/apache/airflow/issues/12179 | https://github.com/apache/airflow/pull/12184 | 7c4fe19e41ae02a1df1c0a217501cae2e0e84819 | 1222ebd4e1eb6fdefa886759c43d4d4db691697a | 2020-11-08T11:51:33Z | python | 2020-11-13T13:32:49Z |
closed | apache/airflow | https://github.com/apache/airflow | 12,168 | ["airflow/www/views.py", "tests/www/test_views.py"] | More meaningful XCom List View Name | In flask-appbuilder, the List View would use "List " + prettied model name as the title.
This makes sense for most of the cases, like `Variable`, `Connection`, etc.
But for `XCom`, the resulting title does not make much sense.
(I'm testing with 2.0.0a2. But actually this issue exists in earlier 1.x versions as well).
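A possible fix is simply an explicit title on the XCom view; a sketch (the `list_title` attribute comes from Flask-AppBuilder's `BaseCRUDView`):
```python
class XComModelView(AirflowModelView):
    # override the auto-generated "List Xcom" title with something meaningful
    list_title = 'List XComs'
```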

**Apache Airflow version**:
**Kubernetes version (if you are using kubernetes)** (use `kubectl version`):
**Environment**:
- **Cloud provider or hardware configuration**:
- **OS** (e.g. from /etc/os-release):
- **Kernel** (e.g. `uname -a`):
- **Install tools**:
- **Others**:
**What happened**:
**What you expected to happen**:
**How to reproduce it**:
**Anything else we need to know**:
| https://github.com/apache/airflow/issues/12168 | https://github.com/apache/airflow/pull/12169 | bedaf5353d87604d12442ecb0f481cb4d85d9ab4 | 8d5ad6969ff68deea3aca3c98b4a982597f330a0 | 2020-11-07T19:50:14Z | python | 2020-11-07T22:04:52Z |
closed | apache/airflow | https://github.com/apache/airflow | 12,151 | ["airflow/providers/cncf/kubernetes/operators/kubernetes_pod.py", "tests/providers/cncf/kubernetes/operators/test_kubernetes_pod.py"] | Kubernetes tests are failing in master | Kubernetes tests are failing master build:
Example here:
https://github.com/apache/airflow/actions/runs/350798991
The failing test is:
```
_______________ TestKubernetesPodOperatorSystem.test_pod_failure _______________
self = <kubernetes_tests.test_kubernetes_pod_operator.TestKubernetesPodOperatorSystem testMethod=test_pod_failure>
def test_pod_failure(self):
"""
Tests that the task fails when a pod reports a failure
"""
bad_internal_command = ["foobar 10 "]
k = KubernetesPodOperator(
namespace='default',
image="ubuntu:16.04",
cmds=["bash", "-cx"],
arguments=bad_internal_command,
labels={"foo": "bar"},
name="test-" + str(random.randint(0, 1000000)),
task_id="task" + self.get_current_task_name(),
in_cluster=False,
do_xcom_push=False,
)
with self.assertRaises(AirflowException):
context = create_context(k)
> k.execute(context)
kubernetes_tests/test_kubernetes_pod_operator.py:536:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
airflow/providers/cncf/kubernetes/operators/kubernetes_pod.py:313: in execute
status = self.client.read_namespaced_pod(self.name, self.namespace)
.build/.kubernetes_venv/lib/python3.6/site-packages/kubernetes/client/api/core_v1_api.py:19078: in read_namespaced_pod
(data) = self.read_namespaced_pod_with_http_info(name, namespace, **kwargs) # noqa: E501
.build/.kubernetes_venv/lib/python3.6/site-packages/kubernetes/client/api/core_v1_api.py:19169: in read_namespaced_pod_with_http_info
collection_formats=collection_formats)
.build/.kubernetes_venv/lib/python3.6/site-packages/kubernetes/client/api_client.py:345: in call_api
_preload_content, _request_timeout)
.build/.kubernetes_venv/lib/python3.6/site-packages/kubernetes/client/api_client.py:176: in __call_api
_request_timeout=_request_timeout)
.build/.kubernetes_venv/lib/python3.6/site-packages/kubernetes/client/api_client.py:366: in request
headers=headers)
.build/.kubernetes_venv/lib/python3.6/site-packages/kubernetes/client/rest.py:241: in GET
query_params=query_params)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
def request(self, method, url, query_params=None, headers=None,
body=None, post_params=None, _preload_content=True,
_request_timeout=None):
"""Perform requests.
:param method: http request method
:param url: http request url
:param query_params: query parameters in the url
:param headers: http request headers
:param body: request json body, for `application/json`
:param post_params: request post parameters,
`application/x-www-form-urlencoded`
and `multipart/form-data`
:param _preload_content: if False, the urllib3.HTTPResponse object will
be returned without reading/decoding response
data. Default is True.
:param _request_timeout: timeout setting for this request. If one
number provided, it will be total request
timeout. It can also be a pair (tuple) of
(connection, read) timeouts.
"""
method = method.upper()
assert method in ['GET', 'HEAD', 'DELETE', 'POST', 'PUT',
'PATCH', 'OPTIONS']
if post_params and body:
raise ValueError(
"body parameter cannot be used with post_params parameter."
)
post_params = post_params or {}
headers = headers or {}
timeout = None
if _request_timeout:
if isinstance(_request_timeout, (int, ) if six.PY3 else (int, long)): # noqa: E501,F821
timeout = urllib3.Timeout(total=_request_timeout)
elif (isinstance(_request_timeout, tuple) and
len(_request_timeout) == 2):
timeout = urllib3.Timeout(
connect=_request_timeout[0], read=_request_timeout[1])
if 'Content-Type' not in headers:
headers['Content-Type'] = 'application/json'
try:
# For `POST`, `PUT`, `PATCH`, `OPTIONS`, `DELETE`
if method in ['POST', 'PUT', 'PATCH', 'OPTIONS', 'DELETE']:
if query_params:
url += '?' + urlencode(query_params)
if re.search('json', headers['Content-Type'], re.IGNORECASE):
if headers['Content-Type'] == 'application/json-patch+json':
if not isinstance(body, list):
headers['Content-Type'] = \
'application/strategic-merge-patch+json'
request_body = None
if body is not None:
request_body = json.dumps(body)
r = self.pool_manager.request(
method, url,
body=request_body,
preload_content=_preload_content,
timeout=timeout,
headers=headers)
elif headers['Content-Type'] == 'application/x-www-form-urlencoded': # noqa: E501
r = self.pool_manager.request(
method, url,
fields=post_params,
encode_multipart=False,
preload_content=_preload_content,
timeout=timeout,
headers=headers)
elif headers['Content-Type'] == 'multipart/form-data':
# must del headers['Content-Type'], or the correct
# Content-Type which generated by urllib3 will be
# overwritten.
del headers['Content-Type']
r = self.pool_manager.request(
method, url,
fields=post_params,
encode_multipart=True,
preload_content=_preload_content,
timeout=timeout,
headers=headers)
# Pass a `string` parameter directly in the body to support
# other content types than Json when `body` argument is
# provided in serialized form
elif isinstance(body, str):
request_body = body
r = self.pool_manager.request(
method, url,
body=request_body,
preload_content=_preload_content,
timeout=timeout,
headers=headers)
else:
# Cannot generate the request from given parameters
msg = """Cannot prepare a request message for provided
arguments. Please check that your arguments match
declared content type."""
raise ApiException(status=0, reason=msg)
# For `GET`, `HEAD`
else:
r = self.pool_manager.request(method, url,
fields=query_params,
preload_content=_preload_content,
timeout=timeout,
headers=headers)
except urllib3.exceptions.SSLError as e:
msg = "{0}\n{1}".format(type(e).__name__, str(e))
raise ApiException(status=0, reason=msg)
if _preload_content:
r = RESTResponse(r)
# In the python 3, the response.data is bytes.
# we need to decode it to string.
if six.PY3:
r.data = r.data.decode('utf8')
# log response body
logger.debug("response body: %s", r.data)
if not 200 <= r.status <= 299:
> raise ApiException(http_resp=r)
E kubernetes.client.rest.ApiException: (404)
E Reason: Not Found
E HTTP response headers: HTTPHeaderDict({'Content-Type': 'application/json', 'Date': 'Sat, 07 Nov 2020 07:33:13 GMT', 'Content-Length': '190'})
E HTTP response body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"pods \"test-815505\" not found","reason":"NotFound","details":{"name":"test-815505","kind":"pods"},"code":404}
.build/.kubernetes_venv/lib/python3.6/site-packages/kubernetes/client/rest.py:231: ApiException
``` | https://github.com/apache/airflow/issues/12151 | https://github.com/apache/airflow/pull/12171 | b2a28d1590410630d66966aa1f2b2a049a8c3b32 | 3f59e75cdf4a95829ac60b151135e03267e63a12 | 2020-11-07T09:39:52Z | python | 2020-11-09T13:13:49Z |
closed | apache/airflow | https://github.com/apache/airflow | 12,150 | ["scripts/ci/images/ci_prepare_prod_image_on_ci.sh", "setup.py"] | Production image has only amazon and google providers installed | When the "production" image is prepared, only the amazon and google providers are installed from sources.
**Apache Airflow version**:
master
**What you expected to happen**:
All providers should be installed
**How to reproduce it**:
```
./breeze --production-image --python 3.6
```
Then:
```
./breeze --production-image --python 3.6 shell bash
```
then
```
ls -la ~/.local/lib/python3.6/site-packages/airflow/providers/
amazon
google
```
UPDATE:
They are not even installed:
```
.
./amazon
./amazon/aws
./amazon/aws/hooks
./amazon/aws/hooks/batch_waiters.json
./google
./google/cloud
./google/cloud/example_dags
./google/cloud/example_dags/example_bigquery_query.sql
./google/cloud/example_dags/example_cloud_build.yaml
./google/cloud/example_dags/example_spanner.sql
```
| https://github.com/apache/airflow/issues/12150 | https://github.com/apache/airflow/pull/12154 | 92e405e72922cc569a2e41281df9d055c3a7855d | eaac361f3bb29cd3bbd459488fcf31c28ed8fb2b | 2020-11-07T09:27:40Z | python | 2020-11-09T12:26:24Z |
closed | apache/airflow | https://github.com/apache/airflow | 12,148 | ["airflow/www/views.py", "tests/www/test_views.py"] | 2.0.0a2 webserver fails when connection "extra" field is None (NULL in DB) | This issue can be reproduced by clicking "Edit" any connection whose `extra` is `NULL` in DB, for example, "airflow_db", or "databricks_default"
I have already found the reason, will raise a PR shortly.
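The crash comes from `json.loads(None)`; a guard of roughly this shape in `prefill_form` would avoid it (a sketch of the idea, not the actual patch):
```python
# treat a NULL/empty "extra" as an empty dict instead of passing None to json.loads
extra = form.data.get('extra')
extra_dictionary = json.loads(extra) if extra else {}
```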
1. How to Reproduce

2. Error
```
Python version: 3.6.10
Airflow version: 2.0.0a2
Node: a1c5c0c0026a
-------------------------------------------------------------------------------
Traceback (most recent call last):
File "/usr/local/lib/python3.6/site-packages/flask/app.py", line 2447, in wsgi_app
response = self.full_dispatch_request()
File "/usr/local/lib/python3.6/site-packages/flask/app.py", line 1952, in full_dispatch_request
rv = self.handle_user_exception(e)
File "/usr/local/lib/python3.6/site-packages/flask/app.py", line 1821, in handle_user_exception
reraise(exc_type, exc_value, tb)
File "/usr/local/lib/python3.6/site-packages/flask/_compat.py", line 39, in reraise
raise value
File "/usr/local/lib/python3.6/site-packages/flask/app.py", line 1950, in full_dispatch_request
rv = self.dispatch_request()
File "/usr/local/lib/python3.6/site-packages/flask/app.py", line 1936, in dispatch_request
return self.view_functions[rule.endpoint](**req.view_args)
File "/usr/local/lib/python3.6/site-packages/flask_appbuilder/security/decorators.py", line 109, in wraps
return f(self, *args, **kwargs)
File "/usr/local/lib/python3.6/site-packages/flask_appbuilder/views.py", line 602, in edit
widgets = self._edit(pk)
File "/usr/local/lib/python3.6/site-packages/flask_appbuilder/baseviews.py", line 1228, in _edit
self.prefill_form(form, pk)
File "/usr/local/lib/python3.6/site-packages/airflow/www/views.py", line 2546, in prefill_form
extra_dictionary = json.loads(form.data.get('extra', '{}'))
File "/usr/local/lib/python3.6/json/__init__.py", line 348, in loads
'not {!r}'.format(s.__class__.__name__))
TypeError: the JSON object must be str, bytes or bytearray, not 'NoneType'
```
**Apache Airflow version**:
**Kubernetes version (if you are using kubernetes)** (use `kubectl version`):
**Environment**:
- **Cloud provider or hardware configuration**:
- **OS** (e.g. from /etc/os-release):
- **Kernel** (e.g. `uname -a`):
- **Install tools**:
- **Others**:
**What happened**:
**What you expected to happen**:
**How to reproduce it**:
**Anything else we need to know**:
| https://github.com/apache/airflow/issues/12148 | https://github.com/apache/airflow/pull/12149 | fbbb1990586acd828ec752e7060fa40082f1f2a5 | bedaf5353d87604d12442ecb0f481cb4d85d9ab4 | 2020-11-07T08:40:47Z | python | 2020-11-07T21:29:20Z |
closed | apache/airflow | https://github.com/apache/airflow | 12,138 | ["airflow/executors/celery_executor.py"] | Celery executor should support custom database backend models | Currently, the celery executor imports the `Task` model for the Celery database backend directly:
https://github.com/apache/airflow/blob/2dd4e96045d4a7f45cc8c06df3d25c4f1479392c/airflow/executors/celery_executor.py#L38
However, Celery, being highly customisable, uses `self.task_cls` in the backend implementation; this defaults to either `Task` or `TaskExtended` depending on the Celery configuration, see [`celery.backends.database:66`](https://github.com/celery/celery/blob/406f04a082949ac42ec7a4af94fed896c515aaa4/celery/backends/database/__init__.py#L66):
> ```python
> task_cls = Task
> ```
and [`celery.backends.database:76,77`](https://github.com/celery/celery/blob/406f04a082949ac42ec7a4af94fed896c515aaa4/celery/backends/database/__init__.py#L76-L77):
> ```python
> if self.extended_result:
> self.task_cls = TaskExtended
> ```
Airflow should support this flexibility, and instead of importing a fixed class just use `backend.task_cls`, so in `_get_many_from_db_backend` just use:
```python
Task = app.backend.task_cls
with session_cleanup(session):
tasks = session.query(Task).filter(Task.task_id.in_(task_ids)).all()
```
I suppose there is one downside to this: `TaskExtended` fetches a few more columns per task, but given that the result of `task.to_dict()` is passed to `backend.meta_from_decoded()`, which could also have been customised to make use of information contained in a custom model, that's neither here nor there. It would be the price anyone using the `result_extended` option would need to pay elsewhere anyway. | https://github.com/apache/airflow/issues/12138 | https://github.com/apache/airflow/pull/12336 | 458ad93bd81c9a3499364bcca4ef17be5c9c25d0 | d54f087b66d7252cbe270929cf08efc2b70edf6e | 2020-11-06T16:00:31Z | python | 2020-11-13T18:58:45Z |
closed | apache/airflow | https://github.com/apache/airflow | 12,131 | ["airflow/api_connexion/exceptions.py", "airflow/api_connexion/openapi/v1.yaml", "airflow/www/extensions/init_views.py", "tests/api_connexion/test_error_handling.py"] | Return Json for all `Not Found` views in REST API | Currently if an API URL that does not exist is requested, an html page is returned instead of JSON
**Apache Airflow version**:2.0
**What happened**:
Making a request against an endpoint that does not exist returns an html page
**What you expected to happen**:
I expected it to return 404 error in JSON indicating that the URL was not found
**How to reproduce it**:
1. Start airflow website and scheduler in breeze.
2. Make a request against an endpoint that does not exist e.g http://localhost:28080/api/v1/getconnectiond
3. Notice that an HTML page is returned instead of JSON
cc: @mik-laj
| https://github.com/apache/airflow/issues/12131 | https://github.com/apache/airflow/pull/12305 | 763b40d223e5e5512494a97f8335e16960e6adc3 | 966ee7d99466ba841e5fd7cd29f050ae59e75c85 | 2020-11-06T10:24:51Z | python | 2020-11-18T04:35:22Z |
closed | apache/airflow | https://github.com/apache/airflow | 12,121 | ["airflow/api_connexion/schemas/dag_schema.py", "setup.cfg", "tests/api_connexion/endpoints/test_dag_endpoint.py", "tests/api_connexion/schemas/test_dag_schema.py"] | Bugs in REST API get dag details endpoint. | Here are some bugs in dag details endpoint.
1. Calling this endpoint `dags/{dag_id}/details` returns an error due to `doc_md` field that's not nullable.
2. The dag details endpoint `GET /dags/{dag_id}/details` does not return a json if dag does not exist.
3. This endpoint does not return a `file_token`, which is needed to get the source code of a DAG and which, judging from the documentation, is expected to be present.
**Apache Airflow version**: 2.0
**What happened**:
1. Calling this endpoint GET /dags/{dag_id}/details returns an error
2. When the above is fixed and you look for the details of a dag that does not exist, this endpoint returns a webpage instead of json.
3. Calling the endpoint with a dag ID that exists doesn't return a file_token
**What you expected to happen**:
1. I expected it to return the details of the dag
2. I expected it to return a json indicating that the dag was not found if dag doesn't exist
3. I expected it to return a file_token in the list of fields when the dag exists
**How to reproduce it**:
Run airflow in breeze.
Make a request to `/dags/{dag_id}/details`. It will return an error.
Go to openapi.yaml and set `doc_md` to be nullable.
Make another request to see that it returns a result without a file_token.
Change the dag id to a dag ID that does not exist.
Make another request and see that it returns a webpage.
| https://github.com/apache/airflow/issues/12121 | https://github.com/apache/airflow/pull/12463 | c34ef853c890e08f5468183c03dc8f3f3ce84af2 | 20843ff89ddbdac45f7ecf9913c4e38685089eb4 | 2020-11-06T00:24:52Z | python | 2020-11-20T16:28:55Z |
closed | apache/airflow | https://github.com/apache/airflow | 12,098 | ["chart/templates/scheduler/scheduler-deployment.yaml"] | Liveness probe only observes one scheduler. | Hello,
Recently, the Helm Chart has been updated to add the ability to run multiple schedulers.
https://github.com/apache/airflow/pull/11330
The liveness probe that does not support multiple schedulers has not been updated. I think we should also update the liveness probe to handle this case. Currently, it only watches the newest job, not the job assigned to the current scheduler.
https://github.com/apache/airflow/blob/91a64db505e50712cd53928b4f2b84aece3cc1c0/chart/templates/scheduler/scheduler-deployment.yaml#L115-L130
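One possible direction (a sketch only; it checks the newest `SchedulerJob` for this host rather than the newest one overall, and the imports would need to be confirmed against the probe's actual Python snippet):
```python
from airflow.jobs.scheduler_job import SchedulerJob
from airflow.utils.net import get_hostname
from airflow.utils.session import create_session

with create_session() as session:
    job = (
        session.query(SchedulerJob)
        .filter(SchedulerJob.hostname == get_hostname())
        .order_by(SchedulerJob.latest_heartbeat.desc())
        .first()
    )
assert job is not None and job.is_alive(), "no healthy scheduler job for this host"
```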
Best regards,
Kamil Breguła
| https://github.com/apache/airflow/issues/12098 | https://github.com/apache/airflow/pull/13705 | 808092928a66908f36aec585b881c5390d365130 | 2abfe1e1364a98e923a0967e4a989ccabf8bde54 | 2020-11-04T20:44:40Z | python | 2021-01-15T23:52:52Z |
closed | apache/airflow | https://github.com/apache/airflow | 12,090 | ["airflow/cli/commands/task_command.py", "tests/cli/commands/test_task_command.py"] | task_command.task_run log handling is either broken or redundant | **Apache Airflow version**: 2.0.0a2
As of #9363, the `tasks run` command attempts to restore the root logger handlers to the previous state after running the task with the handlers all replaced by those from the `airflow.tasks` logger.
However, because no actual *copy* is created of the root handlers list, what you effectively end up with is either an empty list of handlers and nothing is restored. To make matters worse, the block ends with a `logging.shutdown()` call, which completely closes and releases all handlers.
So, either the code to 'restore' can just be removed, or a proper copy needs to be made and the `logging.shutdown()` call needs to be removed from the command. `logging.shutdown` is going to be called when the interpreter exits, anyway.
In detail, this is the code section:
https://github.com/apache/airflow/blob/bec9f3b29fd42ecd1beae3db75784b9a726caf15/airflow/cli/commands/task_command.py#L194-L222
The first problem here is that `root_logger_handlers = root_logger.handlers` merely creates a reference to the logger `handlers` list. The `root_logger.removeHandler(handler)` in a loop further on remove the handlers from that same list as you iterate, and this causes it to [only remove *every second handler*](https://sopython.com/canon/95/removing-items-from-a-list-while-iterating-over-the-list/). Luckily, this was never a problem because in the standard configuration there is only a single handler on the root. If there are more, there is a bigger problem further down.
Continuing on, the next loop, adding handlers from `airflow_logger_handlers`, causes those same handlers to show up in the `root_logger_handlers` reference, but those very same handlers are removed again in a second loop over the same `airflow_logger_handlers` list.
So when you hit the final `for handler in root_logger_handlers: root_logger.addHandler(handler)` loop, either `root_logger_handlers` is an empty list (no handlers added), or, if you started with more than one root handler and every second handler was left in place, you now are adding handlers to the list that are already in the list. The `Logger.addHandler()` method uses `self.handler.append()`, appending to a list you are iterating over, and so you have an infinite loop on your hands.
Either just remove all handlers from the root logger (taking care to loop over the list in reverse, e.g. `for h in reversed(root_logger.handlers): root_logger.removeHandler(h)`), or create proper copy of the list with `root_logger_handlers = root_logger.handlers[:]`. The `logging.shutdown()` call has to be removed then too.
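A sketch of the save/swap/restore shape being suggested here (the task-running call is a placeholder, not the actual function name):
```python
root_logger_handlers = root_logger.handlers[:]   # a real copy, not a reference

for handler in root_logger_handlers:
    root_logger.removeHandler(handler)
for handler in airflow_logger_handlers:
    root_logger.addHandler(handler)
try:
    _run_task(args, dag, ti)                     # placeholder for the actual call
finally:
    for handler in airflow_logger_handlers:
        root_logger.removeHandler(handler)
    for handler in root_logger_handlers:
        root_logger.addHandler(handler)
# no logging.shutdown() here -- the interpreter already runs it at exit
```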
| https://github.com/apache/airflow/issues/12090 | https://github.com/apache/airflow/pull/12342 | 6cb8e5c2ae352750d40b21721996472163ecdcea | 4c25e7636033801b7d182894fe93df6505b4079a | 2020-11-04T12:49:31Z | python | 2020-11-14T04:10:08Z |
closed | apache/airflow | https://github.com/apache/airflow | 12,081 | ["airflow/providers/papermill/operators/papermill.py", "tests/providers/papermill/operators/test_papermill.py"] | PapermillOperator does not take user defined macros | <!--
-->
**Apache Airflow version**:
1.10.12
**Kubernetes version (if you are using kubernetes)** (use `kubectl version`):
1.15.11
**Environment**:
- **Cloud provider or hardware configuration**:
- **OS** (e.g. from /etc/os-release):
- **Kernel** (e.g. `uname -a`):
- **Install tools**:
- **Others**:
**What happened**:
- When initializing the DAG, I defined `user_defined_macros` with my macros.
- If I use a user-defined macro in the PapermillOperator, it does not find the user-defined macros.
```
[2020-11-04 16:35:10,822] {taskinstance.py:1150} ERROR - 'seoul_date' is undefined
Traceback (most recent call last):
File "/opt/pyenv/versions/3.6.8/lib/python3.6/site-packages/airflow/models/taskinstance.py", line 984, in _run_raw_task
result = task_copy.execute(context=context)
File "/opt/pyenv/versions/3.6.8/lib/python3.6/site-packages/airflow/operators/papermill_operator.py", line 57, in execute
parameters=self.inlets[i].parameters,
File "/opt/pyenv/versions/3.6.8/lib/python3.6/site-packages/airflow/lineage/datasets.py", line 69, in __getattr__
).render(**self.context)
File "/opt/pyenv/versions/3.6.8/lib/python3.6/site-packages/jinja2/environment.py", line 1090, in render
self.environment.handle_exception()
File "/opt/pyenv/versions/3.6.8/lib/python3.6/site-packages/jinja2/environment.py", line 832, in handle_exception
reraise(*rewrite_traceback_stack(source=source))
File "/opt/pyenv/versions/3.6.8/lib/python3.6/site-packages/jinja2/_compat.py", line 28, in reraise
raise value.with_traceback(tb)
File "<template>", line 1, in top-level template code
jinja2.exceptions.UndefinedError: 'seoul_date' is undefined
```
**What you expected to happen**:
The PapermillOperator should find the user-defined macro.
**How to reproduce it**:
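A minimal sketch of the kind of DAG that triggers this (notebook paths and the macro itself are illustrative):
```python
from datetime import datetime

from airflow import DAG
from airflow.operators.papermill_operator import PapermillOperator

with DAG(
    dag_id="papermill_macro_bug",
    start_date=datetime(2020, 11, 1),
    schedule_interval="@daily",
    user_defined_macros={"seoul_date": lambda ds: ds},  # illustrative macro
) as dag:
    PapermillOperator(
        task_id="run_notebook",
        input_nb="/tmp/input.ipynb",
        output_nb="/tmp/out-{{ seoul_date(ds) }}.ipynb",  # fails: 'seoul_date' is undefined
        parameters={"date": "{{ seoul_date(ds) }}"},
    )
```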
**Anything else we need to know**:
PapermillOperator uses the `Notebook` class, which is derived from `DataSet`:
https://github.com/apache/airflow/blob/65df1e802190d262b5e18fa9bc2e055768b96e28/airflow/operators/papermill_operator.py#L27
And `DataSet` does not seem to take user-defined macros from the DAG.
| https://github.com/apache/airflow/issues/12081 | https://github.com/apache/airflow/pull/18357 | 1008d8bf8acf459dbc692691a589c27fa4567123 | f382a79adabb2372a1ca5d9e43ed34afd9dec33d | 2020-11-04T08:34:48Z | python | 2021-09-20T05:06:47Z |
closed | apache/airflow | https://github.com/apache/airflow | 12,042 | ["docs/installation.rst"] | Fix description of some extra packages | **Description**
PR #12023 added a consistency check between the documentation and setup.py. However, a few packages still need additional descriptions:
- [x] pagerduty
- [ ] plexus
- [ ] sentry
- [x] singularity
- [ ] tableau
- [ ] virtualenv
This is a great "good-first-issue"
**Use case / motivation**
Descriptions should be complete.
| https://github.com/apache/airflow/issues/12042 | https://github.com/apache/airflow/pull/12141 | 128c9918b5f79cb46a563b77e803c29548c4319c | 4df25e94de74b6f430b1f05235715b99e56ab3db | 2020-11-02T14:22:33Z | python | 2020-11-06T19:41:32Z |
closed | apache/airflow | https://github.com/apache/airflow | 12,035 | ["docs/apache-airflow-providers-amazon/connections/aws.rst"] | Support AssumeRoleWithWebIdentity for AWS provider | **Description**
Support of [AssumeRoleWithWebIdentity](https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/sts.html#STS.Client.assume_role_with_web_identity) for AWS provider when running Airflow workers in EKS.
**Use case / motivation**
This feature will allow us to use [IRSA](https://aws.amazon.com/blogs/opensource/introducing-fine-grained-iam-roles-service-accounts/) with Airflow Pods running on EKS and x-account _assumeRole_.
A connection of type `aws` with an empty username & password and the following extra parameters
```json
{
"role_arn": "<role_arn>",
"region_name": "<region>",
"aws_session_token": "file://$AWS_WEB_IDENTITY_TOKEN_FILE"
}
```
will retrieve temporary credentials using method `sts-assume-role-with-web-identity` (see [boto3 documentation](https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/sts.html#STS.Client.assume_role_with_web_identity) of this method)
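Under the hood that corresponds to an STS call of roughly this shape (a sketch; the role ARN, session name and token path are placeholders):
```python
import boto3

# $AWS_WEB_IDENTITY_TOKEN_FILE is mounted by IRSA; the path below is a typical default
with open("/var/run/secrets/eks.amazonaws.com/serviceaccount/token") as token_file:
    web_identity_token = token_file.read()

sts = boto3.client("sts", region_name="<region>")
response = sts.assume_role_with_web_identity(
    RoleArn="<role_arn>",
    RoleSessionName="airflow-worker",
    WebIdentityToken=web_identity_token,
)
credentials = response["Credentials"]  # AccessKeyId / SecretAccessKey / SessionToken
```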
| https://github.com/apache/airflow/issues/12035 | https://github.com/apache/airflow/pull/17283 | 8fa4a8b587a3672156110fc4cf5c04bdf6830867 | c52e4f35170cc3cd9d597110bc24c270af553ca2 | 2020-11-02T11:03:23Z | python | 2021-08-01T22:38:32Z |
closed | apache/airflow | https://github.com/apache/airflow | 12,030 | ["airflow/www/static/css/main.css"] | Airflow WebUI style broken | **Apache Airflow version**:
2.0.0 - master branch
**Environment**:
- **OS** (e.g. from /etc/os-release): MacOS 10.15.7
- **Browser**: Safari Version 13.1.3 (15609.4.1)
**What happened**:
The row in the TIs (task instances) table is wider than 100% width
<img width="1440" alt="Screenshot 2020-11-02 at 11 11 39" src="https://user-images.githubusercontent.com/9528307/97856398-85f9a980-1cfc-11eb-89ed-d739ecb0f51c.png">
**What you expected to happen**:
I expect the row to not overflow.
**How to reproduce it**:
Go to http://0.0.0.0:28080/taskinstance/list
**Anything else we need to know**:
N/A
| https://github.com/apache/airflow/issues/12030 | https://github.com/apache/airflow/pull/12048 | b72bd4ae6b0e62689b126463396bf1e59f068543 | a1a1fc9f32940a8abbfc4a12d32321d75ac8268c | 2020-11-02T10:16:20Z | python | 2020-11-02T16:55:55Z |
closed | apache/airflow | https://github.com/apache/airflow | 12,028 | ["airflow/operators/email.py", "tests/operators/test_email.py"] | Add `files` to templated fields of `EmailOperator` | **Description**
Files are not part of the templated fields in the `EmailOperator` https://airflow.apache.org/docs/stable/_modules/airflow/operators/email_operator.html
While in fact file names should also be templated.
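As an interim workaround one could subclass the operator and extend its templated fields (a sketch; this is not part of Airflow itself):
```python
from airflow.operators.email_operator import EmailOperator


class TemplatedFilesEmailOperator(EmailOperator):
    # render 'files' with Jinja as well, in addition to the standard fields
    template_fields = EmailOperator.template_fields + ('files',)
```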
**Use case / motivation**
We want to store files according to the DAG's execution date and then send them to email recipients like so:
```
send_report_email = EmailOperator(
    ....  # other arguments elided
    files=[
        "/tmp/Report-A-{{ execution_date.strftime('%Y-%m-%d') }}.csv",
        "/tmp/Report-B-{{ execution_date.strftime('%Y-%m-%d') }}.csv",
    ],
)
```
| https://github.com/apache/airflow/issues/12028 | https://github.com/apache/airflow/pull/12435 | 4873d9759dfdec1dd3663074f9e64ad69fa881cc | 9b9fe45f46455bdb7d3702ba4f4524574f11f75c | 2020-11-02T08:00:25Z | python | 2020-11-18T08:01:41Z |