airflow-providers-siasg
airflow-providers-siasg

Airflow provider that communicates with SIASG and its derived systems.

Installation: `pip install airflow-providers-siasg`

Contents:
- Sub-provider for DW-SIASG
- New connection type "DW-SIASG account"
- Transfers of DW-SIASG reports to a file and to a MongoDB database

Usage example:

```python
task2 = DWSIASGRelatorioParaMongoOperator(
    task_id='task2',
    id_conexao='teste',
    id_relatorio='BFD128CD11EC5B5D670B0080EF6553F5',
    respostas_prompts=['160030', '160130'],
    timeout_segundos=120,
    id_conexao_mongo='teste_mongo',
    banco='teste_siasg',
    colecao='teste_siasg',
)
```

For more examples, see airflow/providers/siasg/example_dags/dw.py.
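For orientation, here is a minimal sketch of how the transfer task above might be wired into a DAG; only the operator call itself comes from the example above, while the DAG wrapper and the import path are assumptions, so check airflow/providers/siasg/example_dags/dw.py for the exact module names.

```python
# Hypothetical wiring of the documented operator into a DAG; the import path
# below is an assumption -- consult the package's example_dags for the exact
# location of DWSIASGRelatorioParaMongoOperator.
from datetime import datetime

from airflow import DAG
from airflow.providers.siasg.dw.transfers import DWSIASGRelatorioParaMongoOperator  # assumed path

with DAG(
    dag_id='exemplo_dw_siasg',
    start_date=datetime(2022, 1, 1),
    schedule_interval=None,
    catchup=False,
) as dag:
    task2 = DWSIASGRelatorioParaMongoOperator(
        task_id='task2',
        id_conexao='teste',
        id_relatorio='BFD128CD11EC5B5D670B0080EF6553F5',
        respostas_prompts=['160030', '160130'],
        timeout_segundos=120,
        id_conexao_mongo='teste_mongo',
        banco='teste_siasg',
        colecao='teste_siasg',
    )
```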
airflow-providers-tesouro-gerencial
airflow-providers-tesouro-gerencial

Airflow provider for communicating with Tesouro Gerencial.

Installation: `pip install airflow-providers-tesouro-gerencial`

Contents:
- Hook for connecting to Tesouro Gerencial, with methods for:
  - context-manager entry and exit (`with` clause), opening and closing a Tesouro Gerencial session
  - executing and exporting a report
- Transfers that load Tesouro Gerencial reports into:
  - a local file
  - a MongoDB database

Usage examples

Transferring a report to a local file:

```python
from datetime import datetime, timedelta

from airflow.decorators import dag
from airflow.providers.tesouro_gerencial.transfers.relatorio_para_arquivo import RelatorioParaArquivo


@dag(schedule_interval=None, start_date=datetime(2021, 1, 1))
def teste_tesouro_gerencial():
    teste = RelatorioParaArquivo(
        task_id='teste1',
        id_conexao_siafi='teste',
        id_relatorio='970A89D511EC923631090080EFC5BFD1',
        caminho_arquivo='/tmp/tg.xlsx',
        respostas_prompts_valor=['622110000', '622120000'],
        retries=10,
        retry_delay=timedelta(minutes=2),
    )


minha_dag = teste_tesouro_gerencial()
```

Transferring a report to a MongoDB database:

```python
from datetime import datetime, timedelta

from airflow.decorators import dag
from airflow.providers.tesouro_gerencial.transfers.relatorio_para_mongo import RelatorioParaMongo


@dag(schedule_interval=None, start_date=datetime(2021, 1, 1))
def teste_tesouro_gerencial():
    teste = RelatorioParaMongo(
        task_id='teste2',
        id_conexao_siafi='teste',
        id_relatorio='970D89D511EC423631090080EFA5BFD1',
        id_conexao_mongo='teste_mongo',
        banco='teste',
        colecao='teste',
        respostas_prompts_valor=['622110000', '622120000'],
        truncar_colecao=True,
        nulos_para_zero=True,
        retries=10,
        retry_delay=timedelta(minutes=2),
    )


minha_dag = teste_tesouro_gerencial()
```
airflow-providers-zeppelin
This repo exists to provide [an example setup.py] file that can be used
airflow-provider-tecton
Tecton Airflow provider for Apache AirflowThis package contains operators, a sensor and a hook that integrate Tecton into Apache Airflow.Two basic capabilities are supported:Submitting materialization jobsWaiting for Feature View/Feature Service data to materialize.TectonSensorwaits for Feature View/Feature Service data to materialize.TectonTriggerOperatorlaunches a Tecton job.TectonJobOperatorlaunches a Tecton job and waits for its completion.TectonFeatureTableTriggerOperatorlaunches a Tecton Feature Table ingestion job.TectonFeatureTableJobOperatorlaunches a Tecton Feature Table ingestion job and waits for its completion.InstallationYou can install this package viapip install airflow-provider-tecton. Note that this requiresapache-airflow>=2.0.ConfigurationThis provider uses operators that interface with Tecton's API and requires you set up Airflow Connection for Tecton. Most of the Connection config fields will be left blank. Configure the following fields:Conn Id:tecton_defaultConn Type:TectonHost:https://your-tecton-url.tecton.aipassword:your-tecton-api-keyUsageConfiguring a Feature View for manual triggeringABatchFeatureViewand aStreamFeatureViewcan be configured for manual triggering only. To do so, setbatch_trigger=BatchTriggerType.MANUAL. When set to manual, Tecton will not automatically create any batch materialization jobs for the Feature View. As of Tecton 0.6, any FeatureView can be manually triggered, but this is recommended mostly for manual usage.For aStreamFeatureView, only batch materialization job scheduling will be impacted by thebatch_triggersetting. Streaming materialization job scheduling will still be managed by Tecton.Here’s an example of aBatchFeatureViewconfigured for manual triggering.fromtectonimportbatch_feature_view,FilteredSource,Aggregation,BatchTriggerTypefromfraud.entitiesimportuserfromfraud.data_sources.transactionsimporttransactions_batchfromdatetimeimportdatetime,timedelta@batch_feature_view(sources=[FilteredSource(transactions_batch)],entities=[user],mode='spark_sql',aggregation_interval=timedelta(days=1),aggregations=[Aggregation(column='transaction',function='count',time_window=timedelta(days=1)),Aggregation(column='transaction',function='count',time_window=timedelta(days=30)),Aggregation(column='transaction',function='count',time_window=timedelta(days=90))],online=False,offline=True,feature_start_time=datetime(2022,5,1),tags={'release':'production'},owner='[email protected]',description='User transaction totals over a series of time windows, updated daily.',batch_trigger=BatchTriggerType.MANUAL# Use manual triggers)defuser_transaction_counts(transactions):returnf'''SELECTuser_id,1 as transaction,timestampFROM{transactions}'''If a Data Source input to the Feature View hasdata_delayset, then that delay will still be factored in to constructing training data sets but does not impact when the job can be triggered with the materialization API.Materialization Job SubmissionThere are two methods available to submit materialization jobs:TectonTriggerOperator: This triggers a materialization job for a Feature View. Tecton will retry any failing jobs automatically. Note that completion of this operator only means submission succeeded. To wait for completion, you must combine this withTectonSensor.TectonJobOperator: This triggers a materialization job for a Feature View with no retries. Additionally, when this operator is terminated, it will make a best effort to clean up the execution of the materialization job. 
Using this operator allows you to use standard Airflow keyword arguments to configure retry behavior. Additionally, this operator is synchronous, meaning that when the operator has succeeded, the underlying job has succeeded.Both of these require the following arguments:workspace - the workspace name of the Feature View you intend to materializefeature_view - the name of the Feature View you intend to materializeonline - whether the job should materialize to the online store. This requires that your FeatureView also has online materialization enabled.offline - whether the job should materialize to the offline store. This requires that your FeatureView also has offline materialization enabled.The time interval of the materialization job is configured automatically using Airflow templates. By default, it is from thedata_interval_startto thedata_interval_endof your DAG run. These can overridden if necessary.Example Usagefromtecton_providerimportTectonJobOperator,TectonTriggerOperatorTectonJobOperator(task_id="tecton_job",workspace="my_workspace",feature_view="my_fv",online=False,offline=True,retries=3,)TectonTriggerOperator(task_id="trigger_tecton",workspace="my_workspace",feature_view="my_fv",online=True,offline=True,)Waiting For MaterializationTectonSensorThis enables you to wait for Materialization to complete for both Feature Views and Feature Services. Common uses are for monitoring as well as kicking off a training job after daily materialization completes.Example Usagefromtecton_providerimportTectonSensorTectonSensor(task_id="wait_for_fs_online",workspace="my_workspace",feature_service="my_fs",online=True,offline=False,)TectonSensor(task_id="wait_for_fv",workspace="my_workspace",feature_view="my_fv",online=True,offline=True,)ExamplesSeeexample dags here.DevelopmentPre-commitThis repo uses pre-commit. Runpre-commit installin the repo root to configure pre-commit hooks. Pre-commit hooks take care of running unit tests as well as linting files.Run unit tests manuallyRunpython -m pytest tests/in the repo root.LicenseThis is licensed with the Apache 2.0 License.IssuesPlease submit issues and pull requests in our official repo:https://github.com/tecton-ai/airflow-provider-tectonTecton would be happy to hear from you. Please email any [email protected].
airflow-provider-toloka
Airflow Toloka Provider

This library allows you to run crowdsourcing Toloka processes in Apache Airflow, a widely used workflow management system. Here you can find a collection of ready-made Airflow tasks for the most frequently used actions in Toloka-Kit.

Getting started

$ pip install airflow-provider-toloka

A good way to start is to follow the example in this repo.

TolokaHook

TolokaHook is used for getting a Toloka OAuth token and creating a TolokaClient with it. You can get the TolokaClient from a TolokaHook by calling its get_conn() method.

To make an appropriate Airflow Connection, create it in the Airflow Connections UI with the following parameters:
- Conn ID: toloka_default
- Conn Type: Toloka
- Token: enter your OAuth token for Toloka. You can learn more about how to get it here.
- Environment: enter production or sandbox

Tasks use the toloka_default connection id by default, but if needed, you can create additional Airflow Connections and reference them via the toloka_conn_id function argument.

Tasks and Sensors

There are several tasks and sensors that give you an easy way to interact with Toloka from Airflow DAGs. Creating a project and a pool, adding tasks and getting assignments are among them. You can easily create your own task using TolokaHook if what you need is beyond the scope of the implemented ones, and a pull request with your additions would be welcome.

Check out our example to see the tasks and sensors in action.

Useful Links
- Toloka homepage.
- Apache Airflow homepage.
- Toloka API documentation.
- Toloka-kit usage examples.

Questions and bug reports
- For reporting bugs please use the Toloka/bugreport page.
- Join our English-speaking slack community for both tech and abstract questions.

License

© YANDEX LLC, 2022. Licensed under the Apache License, Version 2.0. See LICENSE file for more details.
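As a quick illustration of the hook described above, here is a minimal sketch of a custom task that obtains a TolokaClient through TolokaHook; the import path and the get_requester() call are assumptions not taken from this README, so verify them against the provider package and the Toloka-Kit documentation.

```python
# Hypothetical custom task built on TolokaHook; the import path below and the
# toloka_client.get_requester() call are assumptions -- check the provider and
# Toloka-Kit documentation for the exact names.
from airflow.decorators import task


@task
def print_requester_info(toloka_conn_id: str = "toloka_default") -> None:
    from airflow_provider_toloka.hooks.toloka import TolokaHook  # assumed module path

    hook = TolokaHook(toloka_conn_id=toloka_conn_id)
    toloka_client = hook.get_conn()  # returns a TolokaClient, per the README
    # Any Toloka-Kit call can be made with the client; get_requester() is used
    # here purely as an illustrative read of account information.
    print(toloka_client.get_requester())
```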
airflow-provider-toloka-admin
This is a security placeholder package. If you want to claim this name for legitimate purposes, please contact us at security@yandex-team.ru or opensource@yandex-team.ru.
airflow-provider-vdk
Versatile Data Kit Airflow provider

A set of Airflow operators, sensors and a connection hook intended to help schedule Versatile Data Kit jobs using Apache Airflow.

Usage

To install it simply run:

pip install airflow-provider-vdk

Then you can create a workflow of data jobs (deployed by VDK Control Service) like this:

```python
from datetime import datetime

from airflow import DAG
from vdk_provider.operators.vdk import VDKOperator

with DAG(
    "airflow_example_vdk",
    schedule_interval=None,
    start_date=datetime(2022, 1, 1),
    catchup=False,
    tags=["example", "vdk"],
) as dag:
    trino_job1 = VDKOperator(
        conn_id="vdk-default",
        job_name="airflow-trino-job1",
        team_name="taurus",
        task_id="trino-job1",
    )
    trino_job2 = VDKOperator(
        conn_id="vdk-default",
        job_name="airflow-trino-job2",
        team_name="taurus",
        task_id="trino-job2",
    )
    transform_job = VDKOperator(
        conn_id="vdk-default",
        job_name="airflow-transform-job",
        team_name="taurus",
        task_id="transform-job",
    )

    [trino_job1, trino_job2] >> transform_job
```

Example: see the full example here.

Demo: you can see a demo from one of the community meetings here: https://www.youtube.com/watch?v=c3j1aOALjVU&t=690s

Architecture: see the vdk enhancement proposal spec here.
airflow-provider-vertex-ai
Vertex AI
airflow-provider-vineyard
Apache Airflow Provider for VineyardThe apache airflow provider for vineyard contains components to share intermediate data among tasks in Airflow workflows using vineyard.Vineyard works as theXCombackend for airflow workers to allow transferring large-scale data objects between tasks that cannot be fit into the Airflow's database backend without involving external storage systems like HDFS. The Vineyard XCom backend handles object migration as well when the required inputs are not located where the task is scheduled to execute.Table of ContentsRequirementsConfigurationUsageRun the testsDeploy using Docker ComposeDeploy on KubernetesRequirementsThe following packages are needed to run Airflow on Vineyard,airflow >= 2.1.0vineyard >= 0.2.12ConfigurationInstall required packages:pip3 install airflow-provider-vineyardConfigure Vineyard locallyThe vineyard server can be easier launched locally with the following command:python3 -m vineyard --socket=/tmp/vineyard.sockSee also our documentation aboutlaunching vineyard.Configure Airflow to use the vineyard XCom backend by specifying the environment variableexport AIRFLOW__CORE__XCOM_BACKEND=vineyard.contrib.airflow.xcom.VineyardXComand configure the location of the UNIX-domain IPC socket for vineyard client byexport AIRFLOW__VINEYARD__IPC_SOCKET=/tmp/vineyard.sockorexport VINEYARD_IPC_SOCKET=/tmp/vineyard.sockIf you have deployed a distributed vineyard cluster, you can also specify thepersistenvironment to enable the vineyard client to persist the data to the vineyard cluster.export AIRFLOW__VINEYARD__PERSIST=trueUsageAfter installing the dependencies and preparing the vineyard server, you can launch your airflow scheduler and workers, and run the following DAG as an example,importnumpyasnpimportpandasaspdfromairflow.decoratorsimportdag,taskfromairflow.utils.datesimportdays_agodefault_args={'owner':'airflow',}@dag(default_args=default_args,schedule_interval=None,start_date=days_ago(2),tags=['example'])deftaskflow_etl_pandas():@task()defextract():order_data_dict=pd.DataFrame({'a':np.random.rand(100000),'b':np.random.rand(100000)})returnorder_data_dict@task(multiple_outputs=True)deftransform(order_data_dict:dict):return{"total_order_value":order_data_dict["a"].sum()}@task()defload(total_order_value:float):print(f"Total order value is:{total_order_value:.2f}")order_data=extract()order_summary=transform(order_data)load(order_summary["total_order_value"])taskflow_etl_pandas_dag=taskflow_etl_pandas()In the above example, taskextractand tasktransformshares apandas.DataFrameas the intermediate data, which is impossible as it cannot be pickled and when the data is large, it cannot be fit into the table in backend databases of Airflow.The example is adapted from the documentation of Airflow, see alsoTutorial on the Taskflow API.Run the testsStart your vineyardd with the following command,python3 -m vineyardSet airflow to use the vineyard XCom backend, and run tests with pytest,export AIRFLOW__CORE__XCOM_BACKEND=vineyard.contrib.airflow.xcom.VineyardXCom pytest -s -vvv python/vineyard/contrib/airflow/tests/test_python_dag.py pytest -s -vvv python/vineyard/contrib/airflow/tests/test_pandas_dag.pyThe pandas test suite is not possible to run with the default XCom backend, vineyard enables airflow to exchangecomplexandbigdata without modify the DAG and tasks!Deploy using Docker ComposeWe provide a reference docker-compose settings (seedocker-compose.yaml) for deploying airflow with vineyard as the XCom backend on Docker Compose. 
For more details, please refer tothe official documentation.Before deploying the docker-compose, you need to initialize the environment as follows.$mkdir-p./dags./logs./plugins./config $echo-e"AIRFLOW_UID=$(id-u)">.envThen, copy the example dags to the./dagsdirectory.$cpexamples_dags/v6d*.py./dagsAfter that, the docker-compose containers could be deployed as$cddocker/ $dockerbuild.-fDockerfile-tvineyardcloudnative/vineyard-airflow:2.6.3 $dockercomposeupYou can see the added DAGs and run them viaweb uiorcli.We have also included a diff filedocker-compose.yaml.diffthat shows the changed pieces that can be introduced into your own docker compose deployment.Deploy on KubernetesWe provide a reference settings (seevalues.yaml) for deploying airflow with vineyard as the XCom backend on Kubernetes, based onthe official helm charts.Next, we will show how to deploy airflow with vineyard on Kubernetes using the helm chart.You are supposed to deploy the vineyard cluster on the kubernetes cluster first. For example, you can deploy a vineyard cluster with thevineyardctl command.$vineyardctldeployvineyard-deployment--create-namespaceThevalues.yamlmainly tweak the following settings:Installing vineyard dependency to the containers using pip before starting workersAdding a vineyardd container to the airflow podsMounting the vineyardd's UNIX-domain socket and shared memory to the airflow worker podsNote thatthevalues.yamlmay doesn't work in your environment, as airflow requires other settings like postgresql database, persistence volumes, etc. You can combine the referencevalues.yamlwith your own specific Airflow settings.Thevalues.yamlfor Airflow's helm chart can be used as# add airflow helm stable repo$helmrepoaddapache-airflowhttps://airflow.apache.org $helmrepoupdate# deploy airflow$helminstall-fvalues.yamlairflowapache-airflow/airflow--namespaceairflow--create-namespaceIf you want to put the vineyard example DAGs into the airflow scheduler pod and the worker pod, you can use the following command.$kubectlcp./example_dags/v6d_etl.py$(kubectlgetpod-lcomponent=scheduler-nairflow-ojsonpath='{.items[0].metadata.name}'):/opt/airflow/dags-cscheduler-nairflow $kubectlcp./example_dags/v6d_etl.py$(kubectlgetpod-lcomponent=worker-nairflow-ojsonpath='{.items[0].metadata.name}'):/opt/airflow/dags-cworker-nairflow
airflow-provider-weights-and-biases
Weights and Biases
airflow-provider-whylogs
whylogs Airflow OperatorThis is a package for thewhylogsprovider, the open source standard for data and ML logging. With whylogs, users are able to generate summaries of their datasets (called whylogs profiles) which they can use to:Track changes in their datasetCreate data constraints to know whether their data looks the way it shouldQuickly visualize key summary statistics about their datasetsThis Airflow operator focuses on simplifying whylogs' usage along with Airflow. Users are encouraged to benefit from their existing Data Profiles, which are created with whylogs and can bring a lot of value and visibility to track their data changes over time.InstallationYou can install this package on top of an existing Airflow 2.0+ installation (Requirements) by simply running:$pipinstallairflow-provider-whylogsTo install this provider from source, run these instead:[email protected]:whylabs/airflow-provider-whylogs.git $cdairflow-provider-whylogs $python3-mvenv.env&&source.env/bin/activate $pip3install-e.Usage exampleIn order to benefir from the existing operators, users will have to profile their data first, with theirprocessingenvironment of choice. To create and store a profile locally, run the following command on a pandas DataFrame:importwhylogsaswhydf=pd.read_csv("some_file.csv")results=why.log(df)results.writer("local").write()And after that, you can use our operators to either:Create a Summary Drift Report, to visually help you identify if there was drift in your datafromwhylogs_provider.operators.whylogsimportWhylogsSummaryDriftOperatorsummary_drift=WhylogsSummaryDriftOperator(task_id="drift_report",target_profile_path="data/profile.bin",reference_profile_path="data/profile.bin",reader="local",write_report_path="data/Profile.html",)Run a Constraints check, to check if your profiled data met some criteriafromwhylogs_provider.operators.whylogsimportWhylogsConstraintsOperatorfromwhylogs.core.constraints.factoriesimportgreater_than_numberconstraints=WhylogsConstraintsOperator(task_id="constraints_check",profile_path="data/profile.bin",reader="local",constraint=greater_than_number(column_name="my_column",number=0.0),)NOTE: It is important to note that even though it is possible to create a Dataset Profile with the Python Operator, Airflow tries to separate the concern of orchestration from processing, so that is one of the reasons why we didn't want to have a strong opinion on how to read data and profile it, enabling users to best adjust this step to their existing scenario.A full DAG example can be found on the whylogs_provider packagedirectory.RequirementsThe current requirements to use this Airflow Provider are described on the table below.PIP packageVersion requiredapache-airflow>=2.0whylogs[viz, s3]>=1.0.10ContributingUsers are always welcome to ask questions and contribute to this repository, by submitting issues and communicating with us through ourcommunity Slack. Feel free to reach out and makewhylogseven more awesome to use with Airflow.Happy coding! 😄
airflow-provider-xlsx
Airflow Provider XLSXApache Airflowoperators for converting XLSX files from/to Parquet, CSV and JSON.System RequirementsAirflow Versions2.0 or newerInstallation$pipinstallairflow-provider-xlsxOperatorsFromXLSXOperatorRead an XLSX or XLS file and convert it into Parquet, CSV, JSON, JSON Lines(one line per record) file.API DocumentationExampleXLSX SourceAirflow Taskfromxlsx_provider.operators.from_xlsx_operatorimportFromXLSXOperatorxlsx_to_jsonl=FromXLSXOperator(task_id='xlsx_to_jsonl',source='{{ var.value.tmp_path }}/test.xlsx',target='{{ var.value.tmp_path }}/test.jsonl',file_format='jsonl',dag=dag)JSON Lines Output{"month":"Jan","high":-12.2,"mean":-16.2,"low":-20.1,"precipitation":19}{"month":"Feb","high":-10.3,"mean":-14.7,"low":-19.1,"precipitation":14}{"month":"Mar","high":-2.6,"mean":-7.2,"low":-11.8,"precipitation":15}{"month":"Apr","high":8.1,"mean":3.2,"low":-1.7,"precipitation":24}{"month":"May","high":17.5,"mean":11.6,"low":5.6,"precipitation":36}{"month":"Jun","high":24,"mean":18.2,"low":12.3,"precipitation":58}{"month":"Jul","high":25.7,"mean":20.2,"low":14.7,"precipitation":72}{"month":"Aug","high":22.2,"mean":17,"low":11.7,"precipitation":66}{"month":"Sep","high":16.6,"mean":11.5,"low":6.4,"precipitation":44}{"month":"Oct","high":6.8,"mean":3.4,"low":0,"precipitation":38}FromXLSXQueryOperatorExecute an SQL query an XLSX/XLS file and export the result into a Parquet or CSV fileThis operators loads an XLSX or XLS file into an in-memory SQLite database, executes a query on the db and stores the result into a Parquet, CSV, JSON, JSON Lines(one line per record) file. The output columns names and types are determinated by the SQL query output.API DocumentationExampleXLSX SourceSQL Queryselectgashigh_tech_sector,haseur_bilion,iassharefromhigh_techwhere_index>1andhigh_tech_sector<>''andlower(high_tech_sector)<>'total'Airflow Taskfromxlsx_provider.operators.from_xlsx_query_operatorimportFromXLSXQueryOperatorxlsx_to_csv=FromXLSXQueryOperator(task_id='xlsx_to_csv',source='{{ var.value.tmp_path }}/high_tech.xlsx',target='{{ var.value.tmp_path }}/high_tech.parquet',file_format='csv',csv_delimiter=',',table_name='high_tech',worksheet='Figure 3',query='''selectg as high_tech_sector,h as eur_bilion,i as sharefromhigh_techwhere_index > 1and high_tech_sector <> ''and lower(high_tech_sector) <> 'total'''',dag=dag)Outputhigh_tech_sector,value,share Pharmacy,78280,0.231952169555313 Electronics-telecommunications,75243,0.222954583130376 Scientific instruments,64010,0.189670433253542 Aerospace,44472,0.131776952366115 Computers office machines,21772,0.0645136852766778 Non-electrical machinery,20813,0.0616714981835167 Chemistry,19776,0.058598734453222 Electrical machinery,9730,0.028831912195612 Armament,3384,0.0100300315856265ToXLSXOperatorRead a Parquest, CSV, JSON, JSON Lines(one line per record) file and convert it into XLSX.API DocumentationExamplefromxlsx_provider.operators.to_xlsx_operatorimportToXLSXOperatorparquet_to_xlsx=ToXLSXOperator(task_id='parquet_to_xlsx',source='{{ var.value.tmp_path }}/test.parquet',target='{{ var.value.tmp_path }}/test.xlsx',dag=dag)LinksApache Airflow -https://github.com/apache/airflowProject home page (GitHub) -https://github.com/andreax79/airflow-provider-xlsxDocumentation (Read the Docs) -https://airflow-provider-xlsx.readthedocs.io/en/latestopenpyxl, library to read/write Excel 2010 xlsx/xlsm/xltx/xltm files -https://foss.heptapod.net/openpyxl/openpyxllrd, library for reading data and formatting information from Excel files in the historical .xls format 
-https://github.com/python-excel/xlrdPython library for Apache Arrow -https://github.com/apache/arrow/tree/master/python
airflow-provider-zenml
zenml
airflow-pydantic-dags
Airflow-Pydantic-DAGs

PydanticDAGs allows you to use Pydantic models for task configuration in your Airflow DAGs.

Airflow + Pydantic = ❤️: Use Pydantic for Airflow Task Configuration

Runs of Airflow DAGs can be configured using parameters that use proprietary validation and are not de-serialized: you get a dictionary of parameters in your tasks. This leaves YOU to deal with un-typed dictionary values at the task level and to write validation logic for them, all without introspection when developing. If only there were an established library to create data models with validation?

Enter Pydantic: using the DAG model of airflow-pydantic-dags you get a Pydantic model passed to your tasks that contains your validated run configuration. Your model is also exposed in the Airflow UI, making it easy to launch Airflow DAGRuns.

Installation

Currently, this supports:
- airflow >= 2.6.0
- pydantic < 2

As soon as we publish this to pip :) you can pip install airflow-pydantic-dags

Usage

Use the class PydanticDAG instead of DAG, and pass the Pydantic class you want to use to parse the params. By decorating tasks with @dag.parse_config() you will get a config_object passed to your task, which is an instance of the Pydantic class, initialized with your parameters. Currently, your Pydantic class needs to provide default values for all attributes, otherwise the DAG will fail to initialize.

In the Airflow UI, you will find all attributes of the Pydantic class exposed as params. Currently, only non-nested fields are exposed as single items; everything else will become a JSON parameter.

[!NOTE] Validation of params by Pydantic when submitting through the UI and CLI is currently not done at the time of DAG run creation, but only when the parameters are first accessed in a task. Achieving validation at creation time will be part of a future update. See just below for how to force earlier validation.

To achieve systematic early validation and fail your PydanticDAG for invalid parameters, do one of the following (a sketch of this wiring appears at the end of this entry):
- use PydanticDAG(add_validation_task=True): this will add a task (without dependencies) to your DAG that validates the params
- use dag.get_validation_task to get a task that validates the params: you can use this to create custom dependencies on the validation in your DAG
- use the decorator @dag.parse_config on any of your own tasks: this will force validation of the params

Example

Source is at dags/example.py:

```python
from datetime import datetime

import pydantic as pyd
from airflow.decorators import task

from airflow_pydantic_dags.dag import PydanticDAG


class MyRunConfig(pyd.BaseModel):
    string_to_print: str = "overwrite this"


with PydanticDAG(
    params_pydantic_class=MyRunConfig,
    dag_id="example",
    schedule=None,
    start_date=datetime(2023, 8, 1),
    params={"airflow_classic_param": 1},
) as dag:

    @task(dag=dag)
    @dag.parse_config()
    def pull_params(config_object: MyRunConfig | None = None, **kwargs):
        # params contains pydantic and non-pydantic parameter values
        print("Params:")
        print(kwargs["params"])
        # using the dag.parse_config() decorator, we also get the deserialized pydantic object as 'config_object'
        print("Pydantic object:")
        print(type(config_object))
        print(config_object)

    pull_params()
```

This generates a UI interface for your DAG, including all pydantic and non-pydantic parameters. And the task log shows:

{logging_mixin.py:151} INFO - Params:
{logging_mixin.py:151} INFO - {'airflow_classic_param': 1, 'string_to_print': 'overwrite this'}
{logging_mixin.py:151} INFO - Pydantic object:
{logging_mixin.py:151} INFO - <class 'unusual_prefix_95fa66c061cb15347627f327a8a577346657e3a7_example.MyRunConfig'>
{logging_mixin.py:151} INFO - string_to_print='overwrite this'

Mentions

Other projects: pyproject.toml, setup.cfg, and .pre-commit-config.yaml were adapted from the excellent cookiecutter-django project.
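Following up on the early-validation options listed above, here is a minimal sketch that assumes the add_validation_task and get_validation_task APIs behave exactly as described in this README; the exact wiring is an assumption, so verify it against the project's examples.

```python
# Minimal sketch of forcing early validation of params; it assumes the
# get_validation_task / add_validation_task APIs behave as described above.
from datetime import datetime

import pydantic as pyd
from airflow.decorators import task

from airflow_pydantic_dags.dag import PydanticDAG


class MyRunConfig(pyd.BaseModel):
    string_to_print: str = "overwrite this"


with PydanticDAG(
    params_pydantic_class=MyRunConfig,
    dag_id="example_with_validation",
    schedule=None,
    start_date=datetime(2023, 8, 1),
    # Alternatively, pass add_validation_task=True here to get a
    # dependency-free validation task added automatically.
) as dag:

    @task(dag=dag)
    def do_work(**kwargs):
        print(kwargs["params"])

    # Explicitly wire the validation task so downstream work only runs
    # once the params have been validated against MyRunConfig.
    dag.get_validation_task() >> do_work()
```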
airflow-python-sdk
airflow-python-sdkOverviewTo facilitate management, Apache Airflow supports a range of REST API endpoints across its objects. This section provides an overview of the API design, methods, and supported use cases.Most of the endpoints acceptJSONas input and returnJSONresponses. This means that you must usually add the following headers to your request:Content-type: application/json Accept: application/jsonResourcesThe termresourcerefers to a single type of object in the Airflow metadata. An API is broken up by its endpoint's corresponding resource. The name of a resource is typically plural and expressed in camelCase. Example:dagRuns.Resource names are used as part of endpoint URLs, as well as in API parameters and responses.CRUD OperationsThe platform supportsCreate,Read,Update, andDelete operations on most resources. You can review the standards for these operations and their standard parameters below.Some endpoints have special behavior as exceptions.CreateTo create a resource, you typically submit an HTTPPOSTrequest with the resource's required metadata in the request body. The response returns a201 Createdresponse code upon success with the resource's metadata, including its internalid, in the response body.ReadThe HTTPGETrequest can be used to read a resource or to list a number of resources.A resource'sidcan be submitted in the request parameters to read a specific resource. The response usually returns a200 OKresponse code upon success, with the resource's metadata in the response body.If aGETrequest does not include a specific resourceid, it is treated as a list request. The response usually returns a200 OKresponse code upon success, with an object containing a list of resources' metadata in the response body.When reading resources, some common query parameters are usually available. e.g.:v1/connections?limit=25&offset=25Query ParameterTypeDescriptionlimitintegerMaximum number of objects to fetch. Usually 25 by defaultoffsetintegerOffset after which to start returning objects. For use with limit query parameter.UpdateUpdating a resource requires the resourceid, and is typically done using an HTTPPATCHrequest, with the fields to modify in the request body. The response usually returns a200 OKresponse code upon success, with information about the modified resource in the response body.DeleteDeleting a resource requires the resourceidand is typically executing via an HTTPDELETErequest. The response usually returns a204 No Contentresponse code upon success.ConventionsResource names are plural and expressed in camelCase.Names are consistent between URL parameter name and field name.Field names are in snake_case.{\"name\": \"string\",\"slots\": 0,\"occupied_slots\": 0,\"used_slots\": 0,\"queued_slots\": 0,\"open_slots\": 0}Update MaskUpdate mask is available as a query parameter in patch endpoints. It is used to notify the API which fields you want to update. Usingupdate_maskmakes it easier to update objects by helping the server know which fields to update in an object instead of updating all fields. 
The update request ignores any fields that aren't specified in the field mask, leaving them with their current values.Example:resource = request.get('/resource/my-id').json() resource['my_field'] = 'new-value' request.patch('/resource/my-id?update_mask=my_field', data=json.dumps(resource))Versioning and Endpoint LifecycleAPI versioning is not synchronized to specific releases of the Apache Airflow.APIs are designed to be backward compatible.Any changes to the API will first go through a deprecation phase.Summary of ChangesAirflow versionDescriptionv2.0Initial releaseTrying the APIYou can use a third party client, such ascurl,HTTPie,Postmanorthe Insomnia rest clientto test the Apache Airflow API.Note that you will need to pass credentials data.For e.g., here is how to pause a DAG withcurl, when basic authorization is used:curl-XPOST'https://example.com/api/v1/dags/{dag_id}?update_mask=is_paused'\\-H'Content-Type: application/json'\\--user\"username:password\"\\-d'{\"is_paused\": true}'Using a graphical tool such asPostmanorInsomnia, it is possible to import the API specifications directly:Download the API specification by clicking theDownloadbutton at top of this documentImport the JSON specification in the graphical tool of your choice.InPostman, you can click theimportbutton at the topWithInsomnia, you can just drag-and-drop the file on the UINote that withPostman, you can also generate code snippets by selecting a request and clicking on theCodebutton.AuthenticationTo be able to meet the requirements of many organizations, Airflow supports many authentication methods, and it is even possible to add your own method.If you want to check which auth backend is currently set, you can useairflow config get-value api auth_backendcommand as in the example below.$airflowconfigget-valueapiauth_backend airflow.api.auth.backend.basic_authThe default is to deny all requests.For details on configuring the authentication, seeAPI Authorization.ErrorsWe follow the error response format proposed inRFC 7807also known as Problem Details for HTTP APIs. As with our normal API responses, your client must be prepared to gracefully handle additional members of the response.UnauthenticatedThis indicates that the request has not been applied because it lacks valid authentication credentials for the target resource. Please check that you have valid credentials.PermissionDeniedThis response means that the server understood the request but refuses to authorize it because it lacks sufficient rights to the resource. It happens when you do not have the necessary permission to execute the action you performed. You need to get the appropriate permissions in other to resolve this error.BadRequestThis response means that the server cannot or will not process the request due to something that is perceived to be a client error (e.g., malformed request syntax, invalid request message framing, or deceptive request routing). 
To resolve this, please ensure that your syntax is correct.NotFoundThis client error response indicates that the server cannot find the requested resource.MethodNotAllowedIndicates that the request method is known by the server but is not supported by the target resource.NotAcceptableThe target resource does not have a current representation that would be acceptable to the user agent, according to the proactive negotiation header fields received in the request, and the server is unwilling to supply a default representation.AlreadyExistsThe request could not be completed due to a conflict with the current state of the target resource, meaning that the resource already existsUnknownThis means that the server encountered an unexpected condition that prevented it from fulfilling the request.This Python package is automatically generated by theOpenAPI Generatorproject based onthe specs:API version: 1.0.0Package version: 1.0.1Build package: org.openapitools.codegen.languages.PythonClientCodegen For more information, please visithttps://github.com/zachliuRequirements.Python >= 3.6Installation & Usagepip installThe python package is hosted on PyPI, you can install directly using:pipinstallairflow-python-sdkThen import the package:importairflow_python_sdkSetuptoolsInstall viaSetuptools.pythonsetup.pyinstall--user(orsudo python setup.py installto install the package for all users)Then import the package:importairflow_python_sdkGetting StartedPlease follow theinstallation procedureand then run the following:importtimeimportairflow_python_sdkfrompprintimportpprintfromairflow_python_sdk.apiimportconfig_apifromairflow_python_sdk.model.configimportConfigfromairflow_python_sdk.model.errorimportError# Defining the host is optional and defaults to http://localhost/api/v1# See configuration.py for a list of all supported configuration parameters.configuration=airflow_python_sdk.Configuration(host="http://localhost/api/v1")# The client must configure the authentication and authorization parameters# in accordance with the API server security policy.# Examples for each auth method are provided below, use the example that# satisfies your auth use case.# Configure HTTP basic authorization: Basicconfiguration=airflow_python_sdk.Configuration(host="https://<your-airflow-2.0.0>/api/v1",username='YOUR_USERNAME',password='YOUR_PASSWORD')# Enter a context with an instance of the API clientwithairflow_python_sdk.ApiClient(configuration)asapi_client:# Create an instance of the API classapi_instance=config_api.ConfigApi(api_client)try:# Get current configurationapi_response=api_instance.get_config()pprint(api_response)exceptairflow_python_sdk.ApiExceptionase:print("Exception when calling ConfigApi->get_config:%s\n"%e)Documentation for API EndpointsAll URIs are relative tohttp://localhost/api/v1ClassMethodHTTP requestDescriptionConfigApiget_configGET/configGet current configurationConnectionApidelete_connectionDELETE/connections/{connection_id}Delete a connectionConnectionApiget_connectionGET/connections/{connection_id}Get a connectionConnectionApiget_connectionsGET/connectionsList connectionsConnectionApipatch_connectionPATCH/connections/{connection_id}Update a connectionConnectionApipost_connectionPOST/connectionsCreate a connectionDAGApiget_dagGET/dags/{dag_id}Get basic information about a DAGDAGApiget_dag_detailsGET/dags/{dag_id}/detailsGet a simplified representation of DAGDAGApiget_dag_sourceGET/dagSources/{file_token}Get a source codeDAGApiget_dagsGET/dagsList DAGsDAGApiget_taskGET/dags/{dag_id}/tasks/{task_id}Get 
simplified representation of a taskDAGApiget_tasksGET/dags/{dag_id}/tasksGet tasks for DAGDAGApipatch_dagPATCH/dags/{dag_id}Update a DAGDAGApipost_clear_task_instancesPOST/dags/{dag_id}/clearTaskInstancesClear a set of task instancesDAGApipost_set_task_instances_statePOST/dags/{dag_id}/updateTaskInstancesStateSet a state of task instancesDAGRunApidelete_dag_runDELETE/dags/{dag_id}/dagRuns/{dag_run_id}Delete a DAG runDAGRunApiget_dag_runGET/dags/{dag_id}/dagRuns/{dag_run_id}Get a DAG runDAGRunApiget_dag_runsGET/dags/{dag_id}/dagRunsList DAG runsDAGRunApiget_dag_runs_batchPOST/dags/~/dagRuns/listList DAG runs (batch)DAGRunApipost_dag_runPOST/dags/{dag_id}/dagRunsTrigger a new DAG runEventLogApiget_event_logGET/eventLogs/{event_log_id}Get a log entryEventLogApiget_event_logsGET/eventLogsList log entriesImportErrorApiget_import_errorGET/importErrors/{import_error_id}Get an import errorImportErrorApiget_import_errorsGET/importErrorsList import errorsMonitoringApiget_healthGET/healthGet instance statusMonitoringApiget_versionGET/versionGet version informationPoolApidelete_poolDELETE/pools/{pool_name}Delete a poolPoolApiget_poolGET/pools/{pool_name}Get a poolPoolApiget_poolsGET/poolsList poolsPoolApipatch_poolPATCH/pools/{pool_name}Update a poolPoolApipost_poolPOST/poolsCreate a poolTaskInstanceApiget_extra_linksGET/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances/{task_id}/linksList extra linksTaskInstanceApiget_logGET/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances/{task_id}/logs/{task_try_number}Get logsTaskInstanceApiget_task_instanceGET/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances/{task_id}Get a task instanceTaskInstanceApiget_task_instancesGET/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstancesList task instancesTaskInstanceApiget_task_instances_batchPOST/dags//dagRuns//taskInstances/listList task instances (batch)VariableApidelete_variableDELETE/variables/{variable_key}Delete a variableVariableApiget_variableGET/variables/{variable_key}Get a variableVariableApiget_variablesGET/variablesList variablesVariableApipatch_variablePATCH/variables/{variable_key}Update a variableVariableApipost_variablesPOST/variablesCreate a variableXComApiget_xcom_entriesGET/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances/{task_id}/xcomEntriesList XCom entriesXComApiget_xcom_entryGET/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances/{task_id}/xcomEntries/{xcom_key}Get an XCom entryDocumentation For ModelsClassReferenceClearTaskInstanceCollectionInfoColorConfigConfigOptionConfigSectionConnectionConnectionAllOfConnectionCollectionConnectionCollectionAllOfConnectionCollectionItemCronExpressionDAGDAGCollectionDAGCollectionAllOfDAGDetailDAGDetailAllOfDAGRunDAGRunCollectionDAGRunCollectionAllOfDagStateErrorEventLogEventLogCollectionEventLogCollectionAllOfExtraLinkExtraLinkCollectionHealthInfoHealthStatusImportErrorImportErrorCollectionImportErrorCollectionAllOfInlineResponse200InlineResponse2001ListDagRunsFormListTaskInstanceFormMetadatabaseStatusPoolPoolCollectionPoolCollectionAllOfRelativeDeltaSLAMissScheduleIntervalSchedulerStatusTagTaskTaskCollectionTaskCollectionAllOfTaskExtraLinksTaskInstanceTaskInstanceCollectionTaskInstanceCollectionAllOfTaskInstanceReferenceTaskInstanceReferenceCollectionTaskStateTimeDeltaTriggerRuleUpdateTaskInstancesStateVariableVariableAllOfVariableCollectionVariableCollectionAllOfVariableCollectionItemVersionInfoWeightRuleXComXComAllOfXComCollectionXComCollectionAllOfXComCollectionItemDocumentation For AuthorizationBasicType: HTTP basic [email protected] for Large OpenAPI documentsIf the 
OpenAPI document is large, imports in airflow_python_sdk.apis and airflow_python_sdk.models may fail with a RecursionError indicating the maximum recursion limit has been exceeded. In that case, there are a couple of solutions:Solution 1: Use specific imports for apis and models like:from airflow_python_sdk.api.default_api import DefaultApifrom airflow_python_sdk.model.pet import PetSolution 1: Before importing the package, adjust the maximum recursion limit as shown below:import sys sys.setrecursionlimit(1500) import airflow_python_sdk from airflow_python_sdk.apis import * from airflow_python_sdk.models import *
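As a small supplement to the getting-started snippet above, here is a sketch of a paginated list call using the limit/offset query parameters described earlier; the connection_api module path and the get_connections keyword arguments mirror the config_api example and the endpoint table above but are assumptions, so double-check them against the generated package.

```python
# Hypothetical paginated listing of connections; module and method names are
# assumed to follow the same generated layout as config_api.ConfigApi above.
from pprint import pprint

import airflow_python_sdk
from airflow_python_sdk.api import connection_api  # assumed module path

configuration = airflow_python_sdk.Configuration(
    host="https://<your-airflow-2.0.0>/api/v1",
    username="YOUR_USERNAME",
    password="YOUR_PASSWORD",
)

with airflow_python_sdk.ApiClient(configuration) as api_client:
    api_instance = connection_api.ConnectionApi(api_client)
    try:
        # Equivalent to GET /connections?limit=25&offset=25
        api_response = api_instance.get_connections(limit=25, offset=25)
        pprint(api_response)
    except airflow_python_sdk.ApiException as e:
        print("Exception when calling ConnectionApi->get_connections: %s\n" % e)
```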
airflow-queue-stats
Welcome to Airflow Queue Stats

An Airflow plugin to assist with worker and queue management in Airflow.
airflow-ray-executor
Airflow Ray Executor

Airflow executor implemented using Ray (documentation also available in Chinese).

Usage

$ pip install airflow-ray-executor

Edit your airflow.cfg to set your executor to the class airflow_ray_executor.RayExecutor and add the Ray client address to the same file, for example:

executor = airflow_ray_executor.RayExecutor

[ray]
# ray client address to connect to ray cluster
# Ray Executor will start Ray on a single machine if not provided
client = ray://127.0.0.1:10001

Please note: Airflow does not support a SQLite database when the executor is neither DebugExecutor nor SequentialExecutor.
airflow-rbac-api-plugin
No description available on PyPI.
airflow-rest-api
AIRFLOW-REST-API

Async wrapper for the Airflow REST API. See more in the documentation.

INSTALL

pip install airflow-rest-api

USAGE

```python
import asyncio
import os

import aiohttp

from airflow_rest_api.api import AirflowRestApi

AIRFLOW_HOST = os.environ.get("AIRFLOW_HOST", "http://127.0.0.1:8080")
AIRFLOW_USER = os.environ.get("AIRFLOW_USER", "admin")
AIRFLOW_PASSWORD = os.environ.get("AIRFLOW_PASSWORD", "admin")


async def main(ara):
    for template in ara.find_template("/dags"):
        template_id = template.id
        break
    auth = aiohttp.BasicAuth(AIRFLOW_USER, AIRFLOW_PASSWORD)
    async with aiohttp.ClientSession(auth=auth) as session:
        airflow_tasks = await ara.execute(session=session, template_id=template_id)
        if airflow_tasks.status == 200:
            file_token = airflow_tasks.raw["dags"][0]["file_token"]
            template_id = 360
            dag_source_code = (
                await ara.execute(
                    session=session,
                    method=ara.get_template(template_id).method,
                    url=ara.render_url(template_id=template_id, file_token=file_token),
                )
            ).raw
            print(dag_source_code)


ara = AirflowRestApi(airflow_host=AIRFLOW_HOST)
ara.show_templates()
loop = asyncio.get_event_loop()
loop.run_until_complete(main(ara))
```
airflow-restapi-sdk
AIRFLOW_RESTAPI-SDK

```python
from airflow_restapi_sdk import Client, State

client = Client('http://localhost:8080')

# Trigger a DAG
client.dag.trigger('test_dag', conf={'k': 'v'})
# {
#   "execution_date": "2020-08-20T07:51:36+00:00",
#   "message": "Created <DagRun test_dag @ 2020-08-20 07:51:36+00:00: manual__2020-08-20T07:51:36+00:00, externally triggered: True>",
#   "run_id": "manual__2020-08-20T07:51:36+00:00"
# }

# Check the DAG run state
client.dag.state('test_dag', '2020-08-20T07:51:36+00:00')
# {'state': 'failed'}

# Trigger a DAG and block until it succeeds or fails
status = client.dag.trigger_join('test_dag', conf={'k': 'v'}, timeout=300)
print(status)
# {'state': 'failed'}
```
airflow-run
Airflow RunPython tool for deploying Airflow Multi-Node Cluster.RequirementsPython >=3.6 (tested)GoalTo provide a quick way to setup Airflow Multi-Node Cluster (a.k.a. Celery Executor Setup).StepsGenerate config yaml file.Run commands to start webserver, scheduler, worker, (rabbitmq, postgres).Add dag files and run initdb.Generate config file:afr--generate_configRunning the tool in the same directory as config file:afr--runpostgresqlafr--runinitdbafr--runrabbitmqafr--runwebserverafr--runschedulerafr--runworker--queue{queuename}afr--runflowerOr, running the tool specifying config path:afr--runpostgresql--config/path/config.yamlOr, use this environment variable to set the config path:exportAIRFLOWRUN_CONFIG_PATH="/some_path/config.yaml"After running webserver, scheduler and worker (postgres and rabbitmq if needed local instances), Add your dag files in the dags subdirectory in the directory you defined in the config file.(* note: make sure you have the correct user permission in the dags, logs subdirectories.)That is it!!Default Config yaml file:private_registry:Falseregistry_url:registry.hub.docker.comusername:""password:""repository:pkuong/airflow-runimage:airflow-runtag:latestlocal_dir:{local directory where you want to mount /dags and /logs folder}webserver_port:8000flower_port:5555custom_mount_volumes:[]env:AIRFLOW__CORE__EXECUTOR:CeleryExecutorAIRFLOW__CORE__LOAD_EXAMPLES:"False"AIRFLOW__CORE__DAGS_FOLDER:/usr/local/airflow/airflow/dagsAIRFLOW__CORE__LOGS_FOLDER:/usr/local/airflow/airflow/logsAIRFLOW_HOME:/usr/local/airflowAIRFLOW__CORE__FERNET_KEY:""rabbitmq:name:rabbitmqusername:{username}password:{password}host:{IP}virtual_host:/image:rabbitmq:3-managementhome:/var/lib/rabbitmqui_port:15672port:5672env:RABBITMQ_DEFAULT_USER:{username}RABBITMQ_DEFAULT_PASS:{password}postgresql:name:postgresqlusername:{username}password:{password}host:{host}image:postgresdata:/var/lib/postgresql/dataport:5432env:PGDATA:/var/lib/postgresql/data/pgdataPOSTGRES_USER:{username}POSTGRES_PASSWORD:{password}Custom mount volumesYou can specify custom mount volumes in the container, for example:custom_mount_volumes:-host_path:/Users/bob/.awscontainer_path:/usr/local/airflow/.awsDocker imageThis tool is using the following public docker image by default.https://hub.docker.com/repository/docker/pkuong/airflow-runBuilding the image:If you want to build your own image, you can run the following:afd--build--config_path={absolutepathtoconfig.yaml}--dockerfile_path={absolutepathtodirectorywhichcontainsDockerfile}ContributorsPaulo Kuong (@pkuong)
airflowscan
airflowscan

Checklist and tools for increasing the security of Apache Airflow.

DISCLAIMER: This project is NOT AFFILIATED with the Apache Foundation and the Airflow project, and is not endorsed by them.

Contents

The purpose of this project is to provide tools to increase the security of Apache Airflow installations. This project provides the following tools:
- Configuration file with hardened settings - see hardened_airflow.cfg.
- Security checklist for hardening default installations - see CHECKLIST.MD.
- Static analysis tool to check Airflow configuration files for insecure settings.
- JSON schema document used for validation by the static analysis tool - see airflow_cfg.schema

Information for the Static Analysis Tool (airflowscan)

The static analysis tool can check an Airflow configuration file for settings related to security. The tool converts the config file to JSON, and then uses a JSON Schema to do the validation.

Requirements

Python 3 is required and you can find all required modules in the requirements.txt file. Only tested on Python 3.7 but should work on other 3.x releases. No plans for 2.x support at this time.

Installation

You can install this via PIP as follows:

pip install airflowscan
airflowscan

To download and run manually, do the following:

git clone https://github.com/nightwatchcybersecurity/airflowscan.git
cd airflowscan
pip install -r requirements.txt
python -m airflowscan.cli

How to use

To scan a configuration file, use the following command:

airflowscan scan some_airflow.cfg

Reporting bugs and feature requests

Please use the GitHub issue tracker to report issues or suggest features: https://github.com/nightwatchcybersecurity/airflowscan

You can also send email to research /at/ nightwatchcybersecurity [dot] com
airflow-sendgrid-provider-j2
Airflow SendGrid Provider

This is a modified version of https://github.com/apache/airflow/tree/master/airflow/providers/sendgrid

Changelog:
- added dynamic templates support
- changed error messages

Usage

If the template_id and template_substitution parameters are passed, the html_content and subject parameters will be ignored.

Example

```python
send_email(
    from_email='[email protected]',
    from_name='Hello',
    subject=None,
    to='[email protected]',
    html_content=None,
    template_id='00000000-0000-0000-0000-000000000000',
    template_substitution={
        ':number': '1337',
    },
)
```
airflow-smartsheet-plugin
airflow-smartsheet

Simple hooks and operators for transporting data from Smartsheet. Import a Smartsheet sheet into PostgreSQL or export it as a CSV, PDF or Excel file.

Features
- SmartsheetToFileOperator: export a Smartsheet sheet to a file/JSON
- SmartsheetToPostgresOperator: export a Smartsheet sheet to a PostgreSQL table

Install

Using pip:

pip3 install airflow-smartsheet-plugin

Usage

Create a variable in Airflow named SMARTSHEET_ACCESS_TOKEN to store your Smartsheet API access token. You can also pass in an override token in your DAG definition.

This plugin is published as a pip package. Refer to the example DAG for available parameters, and to the enums for available PDF paper sizes.
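For illustration only, here is a hypothetical sketch of the two operators described above inside a DAG; the import path and the keyword arguments (sheet_id, file_type, table_name) are guesses rather than documented API, so consult the plugin's example DAG for the real parameter names.

```python
# Hypothetical usage sketch; the import path and operator arguments below are
# assumptions -- the plugin's example DAG documents the real parameters.
from datetime import datetime

from airflow import DAG
from airflow_smartsheet.operators import (  # assumed module path
    SmartsheetToFileOperator,
    SmartsheetToPostgresOperator,
)

with DAG(
    dag_id="smartsheet_export_example",
    start_date=datetime(2022, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    to_csv = SmartsheetToFileOperator(
        task_id="sheet_to_csv",
        sheet_id=1234567890,    # hypothetical sheet id
        file_type="csv",        # hypothetical parameter name
    )
    to_postgres = SmartsheetToPostgresOperator(
        task_id="sheet_to_postgres",
        sheet_id=1234567890,    # hypothetical sheet id
        table_name="my_sheet",  # hypothetical parameter name
    )

    to_csv >> to_postgres
```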
airflow-snapshot
No description available on PyPI.
airflow-snapshot-test
No description available on PyPI.
airflow-socrata-plugin
Airflow-Socrata

Simple hooks and operators for uploading data to Socrata.

Features
- Upsert or reupload PostgreSQL tables to Socrata

Install

Using pip:

pip3 install airflow-socrata-plugin

Usage

Create a connection named http_socrata of type http to store Socrata credentials. You can also pass in the conn_name parameter in the DAG definition to override it.

Create a connection named etl_postgres of type postgres to store PostgreSQL credentials. You can also pass in the postgres_conn_name parameter in the DAG definition to override it.

By default, the plugin looks for the specified table under the public schema. The schema can be specified with the postgres_schema parameter.

The plugin is published as a pip package. Refer to the example DAG for available parameters.
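As an illustration of the connection and schema parameters listed above, here is a hypothetical operator invocation; the operator class, import path, and dataset argument are assumptions rather than documented API, so check the plugin's example DAG before relying on them.

```python
# Hypothetical sketch; the operator class, import path, and the socrata_dataset
# argument are assumptions -- only conn_name, postgres_conn_name and
# postgres_schema are parameter names mentioned in the description above.
from datetime import datetime

from airflow import DAG
from airflow_socrata.operators import SocrataUpsertOperator  # assumed name/path

with DAG(
    dag_id="socrata_upload_example",
    start_date=datetime(2022, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    upload = SocrataUpsertOperator(
        task_id="upload_table",
        conn_name="http_socrata",           # Socrata credentials connection
        postgres_conn_name="etl_postgres",  # PostgreSQL credentials connection
        postgres_schema="public",           # schema holding the source table
        socrata_dataset="abcd-1234",        # hypothetical dataset identifier
    )
```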
airflow-sops-secrets-backend
Airflow SOPS Secrets Backend for GCP KMSThis packages enables Airflow to pull connections and variables from files in GCP bucket that are encrypted bySOPSusing GCP KMS.Configure AirflowAdd following toairflow.cfg.[secrets] backend = airflow_sops.secrets_backend.GcsSopsSecretsBackend backend_kwargs = {"project_id": "your-project-id"}Available parameters to backend_kwargs:project_id: Optional. GCP project id where the GCS bucket which holds the encrypted connections/variables files reside.,bucket_name: Optional. If not submitted tries retrieving from Composer GCS_BUCKET environment variableconnections_prefix. Optional. Default is "sops/connections". The folder in GCS bucket that holds encrypted connections.variables_prefix: Optional. Default is "sops/variables". The folder in GCS bucket that holds encrypted variables.,encrypted_file_ext: Optional. Default is "enc". The file extension for encrypted sops files. The format is <connection_id or variable_key>.<encrypted_file_ext>.yamlignore_mac: Optional. Default is True. Ignores file checksum when true.GCP Configlocals{gcp_project_id="your-project-id"service_account_name="your-composer-service-account-name"}resource"google_service_account""composer"{account_id=local.service_account_namedisplay_name=local.service_account_nameproject=local.gcp_project_id}resource"google_project_iam_member""composer_worker"{project=local.gcp_project_idrole="roles/composer.worker"member="serviceAccount:${google_service_account.composer.email}"}resource"google_kms_key_ring""secrets"{name=local.gcp_project_idlocation="europe-west1"project=local.gcp_project_id}resource"google_kms_crypto_key""secrets_sops"{name="secrets_sops"key_ring=google_kms_key_ring.secrets.idrotation_period="7776000s"// 90 days}resource"google_kms_crypto_key_iam_member""composer_sops_decrypter"{crypto_key_id=google_kms_crypto_key.secrets_sops.idrole="roles/cloudkms.cryptoKeyDecrypter"member="serviceAccount:${google_service_account.composer.email}"}# some mandatory attributes omittedresource"google_composer_environment""composer"{name="your-composer-environment-name"region="europe-west1"project=local.gcp_project_idconfig{software_config{airflow_config_overrides={secrets-backend="airflow_sops.secrets_backend.GcsSopsSecretsBackend"}pypi_packages={airflow-secrets-sops="==0.0.1"}}node_config{service_account=google_service_account.composer.email}}}SOPSInstallSOPS. Encrypt files using GCP KMS and upload to GCP bucket sops/connections directoryexportKMS_PATH=$(gcloudkmskeyslist--locationeurope-west1--keyringyour-keyring--projectyour-gcp-project|awk'FNR == 2 {print $1}')sops--encrypt--encrypted-regex'^(password|extra)$'--gcp-kms$KMS_PATHsome-connection.yaml>some-connection.enc.yamlSetuppython-mvenv.venvsource.venv/bin/activate pipconfigset--siteglobal.extra-index-urlhttps://pypi.org/simple pipinstall-rrequirements.txtTestpipinstall.airflow-sops-secrets-backend[test]pytestBuildpipinstallairflow-sops-secrets-backend[dev]python-mbuild
airflow-spark-k8s
No description available on PyPI.
airflow-spell
No description available on PyPI.
airflow-sqlcmd-operator
airflow-sqlcmd-operator

Custom Airflow BashOperator for Microsoft sqlcmd.

This package uses sqlcmd to run Microsoft SQL Server scripts on Linux, the same way you would run them in SSMS, for example. sqlcmd supports SQL Server scripts with commands like GO, USE [db_name], etc., and multiple statements.

Requirements

You must have sqlcmd already installed and (currently) in the following location: "/opt/mssql-tools/bin/sqlcmd".

Installing on Ubuntu with apt:

curl https://packages.microsoft.com/keys/microsoft.asc | apt-key add -
curl https://packages.microsoft.com/config/debian/10/prod.list > /etc/apt/sources.list.d/mssql-release.list
# install required packages for pyodbc
apt-get update
ACCEPT_EULA=Y apt-get install -y msodbcsql17 unixodbc-dev mssql-tools && apt-get clean

Usage

In a DAG, you can call it like this:

```python
from airflow_sqlcmd_operator import SqlcmdOperator

sqlcmd = SqlcmdOperator("MyDB", "/scripts/folder/mydag", "do_stuff.sql", dag=dag)
```
airflow-supervisor
airflow-supervisor

Apache Airflow utilities for running long-running or always-on jobs with supervisord.
airflow-supporter
airflow_supporter

airflow_supporter provides services that help you manage Apache Airflow.

Usage

Alert by email on DAGs that are turned off, and turn them back on automatically. Insert the following example in your DAG:

```python
from airflow_supporter.dag.check_off_dag import create_check_off_dag

create_check_off_dag()
```

Variable: enroll the following Variable in your Airflow instance:

check_off_dag_variable
{
    exclude_dag_list: list[str], default empty list,
    automatically_turn_on: bool, default true,
    email: Optional[str]
}

Restart a failed DagRun:

```python
@dag(
    dag_id="restart_failed_dagrun_dag",
    schedule="* * * * *",
    is_paused_upon_creation=False,
    catchup=False,
    start_date=datetime(year=1970, month=1, day=1),
)
def restart_failed_dagrun_dag() -> None:
    restart_failed_dagrun_op.restart_failed_dagrun_op(rv=RestartFailedDagrunVariable())


restart_failed_dagrun_dag()
```

Restart a stuck Task:

```python
@dag(
    dag_id="clear_stucked_task_dag",
    schedule="* * * * *",
    is_paused_upon_creation=False,
    catchup=False,
    start_date=datetime(year=1970, month=1, day=1),
)
def clear_stucked_task_dag() -> None:
    restart_stucked_task_op.clear_stucked_task_op(rv=RestartStuckedTaskVariable())


clear_stucked_task_dag()
```
airflow-tecton
[preview] airflow-tectonContains an Apache Airflow provider that allows you to author Tecton workflows inside Airflow.Two basic capabilities are supported:Submitting materialization jobsWaiting for Feature View/Feature Service data to materialize.Changelog0.1.0 Added 2 new operators to support triggering Feature Table ingestion jobs0.0.3 Added support forallow_overwritesetting in the operators0.0.2 Removed type annotations that caused compatibility issues with Airflow versions below 2.4.0.0.1 Initial releaseInstallation and ConfigurationInstallationYou can install this package viapip install airflow-tecton. Note that this requiresapache-airflow>=2.0.For a deployed environment, addairflow-tectonto yourrequirements.txtor wherever you configure installed packages.You can confirm a successful installation by runningairflow providers list, which should showairflow-tectonin the list.ConfigurationThis provider uses operators that interface with Tecton's API, thus it requires you set up Airflow to connect to Tecton.You can add a new connection by going toConnectionsunder theAdmintab in the Airflow UI. From there, hit the+button and selectTectonin the connection type dropdown. From there, you can configure the connection name, Tecton URL, and Tecton API key. Note that the default connection name istecton_default, so we recommend starting with this as a connection name to minimize configuration.UsageConfiguring a Feature View for manual triggeringABatchFeatureViewand aStreamFeatureViewcan be configured for manual triggering only. To do so, setbatch_trigger=BatchTriggerType.MANUAL. When set to manual, Tecton will not automatically create any batch materialization jobs for the Feature View. As of Tecton 0.6, any FeatureView can be manually triggered, but this is recommended mostly for manual usage.For aStreamFeatureView, only batch materialization job scheduling will be impacted by thebatch_triggersetting. Streaming materialization job scheduling will still be managed by Tecton.Here’s an example of aBatchFeatureViewconfigured for manual triggering.fromtectonimportbatch_feature_view,FilteredSource,Aggregation,BatchTriggerTypefromfraud.entitiesimportuserfromfraud.data_sources.transactionsimporttransactions_batchfromdatetimeimportdatetime,timedelta@batch_feature_view(sources=[FilteredSource(transactions_batch)],entities=[user],mode='spark_sql',aggregation_interval=timedelta(days=1),aggregations=[Aggregation(column='transaction',function='count',time_window=timedelta(days=1)),Aggregation(column='transaction',function='count',time_window=timedelta(days=30)),Aggregation(column='transaction',function='count',time_window=timedelta(days=90))],online=False,offline=True,feature_start_time=datetime(2022,5,1),tags={'release':'production'},owner='[email protected]',description='User transaction totals over a series of time windows, updated daily.',batch_trigger=BatchTriggerType.MANUAL# Use manual triggers)defuser_transaction_counts(transactions):returnf'''SELECTuser_id,1 as transaction,timestampFROM{transactions}'''If a Data Source input to the Feature View hasdata_delayset, then that delay will still be factored in to constructing training data sets but does not impact when the job can be triggered with the materialization API.Materialization Job SubmissionThere are two methods available to submit materialization jobs:TectonTriggerOperator: This triggers a materialization job for a Feature View. Tecton will retry any failing jobs automatically. Note that completion of this operator only means submission succeeded. 
To wait for completion, you must combine this withTectonSensor.TectonJobOperator: This triggers a materialization job for a Feature View with no retries. Additionally, when this operator is terminated, it will make a best effort to clean up the execution of the materialization job. Using this operator allows you to use standard Airflow keyword arguments to configure retry behavior. Additionally, this operator is synchronous, meaning that when the operator has succeeded, the underlying job has succeeded.Both of these require the following arguments:workspace - the workspace name of the Feature View you intend to materializefeature_view - the name of the Feature View you intend to materializeonline - whether the job should materialize to the online store. This requires that your FeatureView also has online materialization enabled.offline - whether the job should materialize to the offline store. This requires that your FeatureView also has offline materialization enabled.The time interval of the materialization job is configured automatically using Airflow templates. By default, it is from thedata_interval_startto thedata_interval_endof your DAG run. These can overridden if necessary.Example Usagefromairflow_tectonimportTectonJobOperator,TectonTriggerOperatorTectonJobOperator(task_id="tecton_job",workspace="my_workspace",feature_view="my_fv",online=False,offline=True,retries=3,)TectonTriggerOperator(task_id="trigger_tecton",workspace="my_workspace",feature_view="my_fv",online=True,offline=True,)Waiting For MaterializationTectonSensorThis enables you to wait for Materialization to complete for both Feature Views and Feature Services. Common uses are for monitoring as well as kicking off a training job after daily materialization completes.Example Usagefromairflow_tectonimportTectonSensorTectonSensor(task_id="wait_for_fs_online",workspace="my_workspace",feature_service="my_fs",online=True,offline=False,)TectonSensor(task_id="wait_for_fv",workspace="my_workspace",feature_view="my_fv",online=True,offline=True,)ExamplesSeeexample dags here.DevelopmentPre-commitThis repo uses pre-commit. Runpre-commit installin the repo root to configure pre-commit hooks. Pre-commit hooks take care of running unit tests as well as linting files.Run unit tests manuallyRunpython -m pytest tests/in the repo root.LicenseThis is licensed with the Apache 2.0 License.
airflow-test-decorator
No description available on PyPI.
airflow-tm1
Note: this was an old proof of concept for Airflow 1.x and probably isn't useful for most people. I've started an Airflow 2 style provider at https://github.com/scrambldchannel/airflow-provider-tm1, help wanted ;)

airflow-tm1

A package that provides a hook to simplify connecting to the IBM Cognos TM1 / Planning Analytics REST API.

Requirements

- Python 3.7+
- Airflow 1.x
- TM1py

Installation

Install with pip:

pip install airflow-tm1

Usage

Create a connection in Airflow with at least the following parameters set:

- Host
- Login
- Port
- Extras: ssl

Any other parameter accepted by the TM1py RestService constructor (eg base_url, namespace etc) can also be added as a key in the Extras field in the connection.

In your DAG file:

from airflow_tm1.hooks.tm1 import TM1Hook

tm1_hook = TM1Hook(tm1_conn_id="tm1_default")
tm1 = tm1_hook.get_conn()

This will attempt to connect to the TM1 server using the details provided and initialise an instance of the TM1Service class that can be accessed at tm1_hook.tm1. See TM1py for more details.

License

See LICENSE
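A minimal sketch of wiring the hook above into an Airflow 1.x task. The DAG id, task id and the get_product_version() call are assumptions for illustration (the latter relies on TM1py's service API, not on anything documented in this package description); only the TM1Hook usage itself comes from the snippet above.

from datetime import datetime

from airflow import DAG
from airflow.operators.python_operator import PythonOperator  # Airflow 1.x import path

from airflow_tm1.hooks.tm1 import TM1Hook


def check_tm1():
    # Open the connection exactly as in the usage snippet above.
    tm1_hook = TM1Hook(tm1_conn_id="tm1_default")
    tm1 = tm1_hook.get_conn()
    # Assumption: TM1py's TM1Service exposes server.get_product_version(); any TM1py call would fit here.
    print(tm1.server.get_product_version())


with DAG(dag_id="tm1_smoke_test", start_date=datetime(2021, 1, 1), schedule_interval=None) as dag:
    PythonOperator(task_id="check_tm1", python_callable=check_tm1)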
airflow_toloka_ad_security
This is a security placeholder package. If you want to claim this name for legitimate purposes, please contact us [email protected]@yandex-team.ru
airflow-tools
Airflow ToolsCollection of Operators, Hooks and utility functions aimed at facilitating ELT pipelines.Data Lake FacadeThe Data Lake Facade serves as an abstracion over different Hooks that can be used as a backend such as:Azure Data Lake Storage (ADLS)Simple Storage Service (S3)Operators can create the correct hook at runtime by passing a connection ID with a connection type ofawsoradls. Example code:conn=BaseHook.get_connection(conn_id)hook=conn.get_hook()OperatorsHTTP to Data LakeCreates a Example usage:HttpToDataLake(task_id='test_http_to_data_lake',http_conn_id='http_test',data_lake_conn_id='data_lake_test',data_lake_path=s3_bucket+'/source1/entity1/{{ ds }}/',endpoint='/api/users',method='GET',jmespath_expression='data[:2].{id: id, email: email}',)JMESPATH expressionsAPIs often return the response we are interested in wrapped in a key. JMESPATH expressions are a query language that we can use to select the response we are interested in. You can find more information on JMESPATH expressions and test themhere.The above expression selects the first two objects inside the key data, and then only theidandemailattributes in each object. An example response can be foundhere.TestsIntegration testsTo guarantee that the library works as intended we have an integration test that attempts to install it in a fresh virtual environment, and we aim to have a test for each Operator.Running integration tests locallyThelint-and-test.ymlworkflowsets up the necessary environment variables, but if you want to run them locally you will need the following environment variables:AIRFLOW_CONN_DATA_LAKE_TEST='{"conn_type": "aws", "extra": {"endpoint_url": "http://localhost:9090"}}'AWS_ACCESS_KEY_ID=AKIAIOSFODNN7EXAMPLEAWS_SECRET_ACCESS_KEY=wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEYAWS_DEFAULT_REGION=us-east-1TEST_BUCKET=data_lakeS3_ENDPOINT_URL=http://localhost:9090AIRFLOW_CONN_DATA_LAKE_TEST='{"conn_type": "aws", "extra": {"endpoint_url": "http://localhost:9090"}}'AWS_ACCESS_KEY_ID=AKIAIOSFODNN7EXAMPLEAWS_SECRET_ACCESS_KEY=wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEYTEST_BUCKET=data_lakeS3_ENDPOINT_URL=http://localhost:9090poetryrunpytesttests/--doctest-modules--junitxml=junit/test-results.xml--cov=com--cov-report=xml--cov-report=htmlAIRFLOW_CONN_SFTP_TEST='{"conn_type": "sftp", "host": "localhost", "port": 22, "login": "test_user", "password": "pass"}'And you also need to runAdobe's S3 mock containerlike this:dockerrun--rm-p9090:9090-einitialBuckets=data_lake-edebug=true-tadobe/s3mockand the SFTP container like this:dockerrun-p22:22-datmoz/sftptest_user:pass:::root_folderNotificationsSlack (incoming webhook)If your or your team are using slack, you can send and receive notifications about failed dags usingdag_failure_slack_notification_webhookmethod (innotifications.slack.webhook). You need to create a new Slack App and enable the "Incoming Webhooks". More info about sending messages using Slack Incoming Webhookshere.You need to create a new Airflow connection with the nameSLACK_WEBHOOK_NOTIFICATION_CONN(orAIRFLOW_CONN_SLACK_WEBHOOK_NOTIFICATION_CONNif you are using environment variables.)Default message will have the format below:But you can custom this message providing the below parameters:text (str)[optional]:the main message will appear in the notification. If you provide your slack block will be ignored.blocks (dict)[optional]:you can provide your custom slack blocks for your message.include_blocks (bool)[optional]:indicates if the default block have to be used. 
If you provide your own blocks will be ignored.source (typing.Literal['DAG', 'TASK'])[optional]:source of the failure (dag or task). Default:DAG.image_url: (str)[optional]image url for you notification (accessory). You can useAIRFLOW_TOOLS__SLACK_NOTIFICATION_IMG_URLinstead.Example of use in a Dagfromdatetimeimportdatetime,timedeltafromairflowimportDAGfromairflow.operators.bashimportBashOperatorfromairflow_tools.notifications.slack.webhookimport(dag_failure_slack_notification_webhook,# <--- IMPORT)withDAG("slack_notification_dkl",description="Slack notification on fail",schedule=timedelta(days=1),start_date=datetime(2021,1,1),catchup=False,tags=["example"],on_failure_callback=dag_failure_slack_notification_webhook(),# <--- HERE)asdag:t=BashOperator(task_id="failing_test",depends_on_past=False,bash_command="exit 1",retries=1,)if__name__=="__main__":dag.test()You can used only in a task providing the parametersource='TASK':t=BashOperator(task_id="failing_test",depends_on_past=False,bash_command="exit 1",retries=1,on_failure_callback=dag_failure_slack_notification_webhook(source='TASK'))You can add a custom message (ignoring the slack blocks for a formatted message):withDAG(...on_failure_callback=dag_failure_slack_notification_webhook(text='The task {{ ti.task_id }} failed',include_blocks=False),)asdag:Or you can pass your own Slack blocks:custom_slack_blocks={"type":"section","text":{"type":"mrkdwn","text":"<https://api.slack.com/reference/block-kit/block|This is an example using custom Slack blocks>"}}withDAG(...on_failure_callback=dag_failure_slack_notification_webhook(blocks=custom_slack_blocks),)asdag:
airflow-training-operators
Airflow Training operators

Some public operators that we use for the Airflow Training material.
airflow-trigger-multiple-dag-run
airflow-trigger-multiple-dag-run

Usage

pip install airflow-trigger-multiple-dag-run

Dev setup

Initial setup:

pip install poetry
make setup

Publish to testpypi:

poetry config repositories.testpypi https://test.pypi.org/legacy/
poetry publish --build -r testpypi -p <password> -u <username:=__token__>

Useful poetry commands:

poetry add <dependencies>
poetry add --dev <dependencies>  # dev dependencies
poetry update
poetry lock --no-update

Contribution

Contributions are very welcome. Tests can be run with make test; please ensure the coverage at least stays the same before you submit a pull request.

License

Distributed under the terms of the MIT license, "airflow-trigger-multiple-dagrun" is free and open source software.

Repository

Repo-url
airflow-util-dv
No description available on PyPI.
airflow_utils
No description available on PyPI.
airflow-valohai-plugin
No description available on PyPI.
airflow-waterdrop-plugin
No description available on PyPI.
airflow-windmill
WindmillDrag'n'drop web app to manage and create Airflow DAGs. DAGs are described usinga JSON "wml" file, which can be transpiled into a Python DAG file and pushed to a configured git repository.Front end is built using React on TypescriptBack end is built using Flask on Python 3.6+Getting StartedInstall withpip install airflow-windmillAirflow is expected to be installed on the system. This allows Windmill to run with arbitrary versions of AirflowOtherwise it can be packaged with windmill usingpip install airflow-windmill[airflow]. The version is defined inpyproject.tomlRunwindmill initto create a local Windmill projectcd windmill-projectRunwindmill runfrom this folder to run the app locallyMVP Required FeaturesFront-End FeaturesDynamic OperatorsMenu DropdownsLoad Operators from AppFormat operator display into classesSearch functionality for operatorsBasic operator level propertiesImplement DAG level propertiesNew DAG FunctionalityParameter TooltipsRender arbitrary viewport windows for New/Save/Load etcOverwrite/Save prompt on NewDAG renaming and save functionalityOpen dag from menuMake save/load more efficient by removing non-essential valuesSwitch nav menu to iconsAdd convert DAG callAdd hotkeys to menu functionsMake input/output nodes more clearCheck if file already exists on renamePrompt save if there are nodes on openFix loss of state on refresh bugPut File details in File BrowseMake Flask Backend URI configurableAdd a last saved time to NavBarAdd error handling to backend callsOnly save local state if validAdd testsGet task descriptions from Operator listXSS and injection vulns?Ctl+Shift+F FIXMEBack-End FeaturesGenerate Operator ListsCLI to start Web and Front EndGenerate DAG SpecCLI to create new windmill projectCLI to start windmill from a windmill projectImplement windmill-dev startSave/Load Windmill Files functionalityGet default valuesPull parameters from parent classesMove airflow dependency as extraConvert WML into Python DAGAPI Endpoint to trigger WML -> DAGMake sure that nodes are being put in right order using portsEdge cases for WML -> DAGAllow repeated/weird dag/task ids (e.g. 123)Get WML owner and last-modified details during wml listAllow custom operatorsStrategy for Python Opjects (e.g. callables) - allow either a import ref or an inline statementBackport existing Python DAGs to WMLsAllow DAG updates to propogate to WMLs (probably better to just always backport - consolidating would be a mess)Add tests for different airflow versionVersion lock travis tox and poetry version?Other featuresValidate on backend or front end or both?DocoAdd permission restrictions for valid tagsOnly include dist folder from app in poetry buildDev User GuideTo run as a dev:Clone from gitRunpoetry install -E airflowRunnpm install from windmill/http/appRunwindmill-dev start-frontendRunwindmill-dev start-backendOpen127.0.0.1:1234Run Python TestsExpects Tox and Poetry to be available on pathpyenv install 3.6.5 3.7.7 pyenv local 3.6.5 3.7.7 toxFuture Usage PatternsAuto-sync for windmill project to gitDeploymentDeployment to PyPi is managed using Travis and should be done in the following steps:Runpoetry version {patch|minor|major}Increment the version number inwindmill/__init__.pyCommit and merge code into the master branchEnsure that the travis build is greenCreate a git tag for the new buildPush the tag to origin
airflow-workflows
Workflows

Workflows are a cleaner way of implementing DAGs using a Django-inspired class-based syntax.

Simple Example

Let's create a single Airflow DAG, whose name is a camelcased version of the class name, and whose operator dependencies follow the order in which they are defined. There is an option to override the default dependencies method implementation to customise the dependency chain for your use case.

import workflows


class ExampleWorkflow(workflows.Workflow):
    class Meta:
        schedule_interval = '0 9 * * *'

    do_something_useful = workflows.PythonOperator(
        python_callable=lambda **kwargs: print('something useful'),
    )
    something_else = workflows.PythonOperator(
        python_callable=lambda **kwargs: print('Something not useful'),
    )


globals()[ExampleWorkflow.DAG.dag_id] = ExampleWorkflow.DAG

Dynamic DAG Example

Let's create three DAGs dynamically, based on the ExampleWorkflow class implemented above. In other words, they will share the same DAG metadata (the schedule, in this case).

import workflows

workflow_names = [
    'Test1',
    'Test2',
    'Test3',
]

for workflow in workflow_names:
    WorkflowClass = workflows.create_workflow(
        workflow,
        base=ExampleWorkflow,
    )
    globals()[WorkflowClass.DAG.dag_id] = WorkflowClass.DAG
airflow-xcom-email
Failed to fetch description. HTTP Status Code: 404
airflow-xplenty
No description available on PyPI.
airflow-xtended-api
Airflow Xtended API - PluginApache Airflow plugin that exposes xtended secure API endpoints similar to the officialAirflow API (Stable) (1.0.0), providing richer capabilities to support more powerful DAG and job management. Apache Airflow version 2.8.0 or higher is necessary.Requirementsapache-airflowpymongoboto3requestsInstallationpython3-mpipinstallairflow-xtended-apiBuild from sourceBuild a custom version of this plugin by following the instructions in thisdocScreenshotsAuthenticationAirflow Xtended API plugin uses the same auth mechanism asAirflow API (Stable) (1.0.0). So, by default APIs exposed via this plugin respect the auth mechanism used by your Airflow webserver and also complies with the existing RBAC policies. Note that you will need to pass credentials data as part of the request. Here is a snippet from the official docs when basic authorization is used:curl-XPOST'http://{AIRFLOW_HOST}:{AIRFLOW_PORT}/api/v1/dags/{dag_id}?update_mask=is_paused'\-H'Content-Type: application/json'\--user"username:password"\-d'{"is_paused": true}'Using the Custom APIAfter installing the plugin python package and restarting your airflow webserver, You can see a link under the 'Xtended API' tab called 'Reference Docs' on the airflow webserver homepage. All the necessary documentation for the supported API endpoints resides on that page. You can also directly navigate to that page using below link.http://{AIRFLOW_HOST}:{AIRFLOW_PORT}/xtended_api/All the supported endpoints are exposed in the below format:http://{AIRFLOW_HOST}:{AIRFLOW_PORT}/api/v1/xtended/{ENDPOINT_NAME}Following are the names of endpoints which are currently supported.deploy_dagcreate_dags3_syncmongo_syncscan_dagspurge_dagsrefresh_all_dagsdelete_dagupload_filerestart_failed_taskkill_running_tasksrun_task_instanceskip_task_instancedeploy_dagDescription:Deploy a new DAG File to the DAGs directory.Endpoint:http://{AIRFLOW_HOST}:{AIRFLOW_PORT}/api/v1/xtended/deploy_dagMethod:POSTPOST request Arguments:dag_file - file - Upload & Deploy a DAG from .py or .zip filesforce (optional) - boolean - Force uploading the file if it already existsunpause (optional) - boolean - The DAG will be unpaused on creation (Works only when uploading .py files)otf_sync (optional) - boolean - Check for newly created DAGs On The Fly!Examples:curl-XPOST-H'Content-Type: multipart/form-data'\--user"username:password"\-F'dag_file=@test_dag.py'\-F'force=y'\-F'unpause=y'\http://{AIRFLOW_HOST}:{AIRFLOW_PORT}/api/v1/xtended/deploy_dagresponse:{"message":"DAG File [<module '{MODULE_NAME}' from '/{DAG_FOLDER}/exam.py'>] has been uploaded","status":"success"}Method:GETGet request Arguments:dag_file_url - file - A valid url for fetching .py, .pyc or .zip DAG filesfilename - string - A valid filename ending with .py, .pyc or .zipforce (optional) - boolean - Force uploading the file if it already exists.unpause (optional) - boolean - The DAG will be unpaused on creation (Works only when uploading .py files)otf_sync (optional) - boolean - Check for newly created DAGs On The Fly!Examples:curl-XGET--user"username:password"\'http://{AIRFLOW_HOST}:{AIRFLOW_PORT}/api/v1/xtended/deploy_dag?dag_file_url={DAG_FILE_URL}&filename=test_dag.py&force=on&unpause=on'response:{"message":"DAG File [<module '{MODULE_NAME}' from '/{DAG_FOLDER}/exam.py'>] has been uploaded","status":"success"}create_dagDescription:Create a new DAG File in the DAGs directory.Endpoint:http://{AIRFLOW_HOST}:{AIRFLOW_PORT}/api/v1/xtended/create_dagMethod:POSTPOST request Arguments:filename - string - Name of the 
python DAG filedag_code - string(multiline) - Python code of the DAG fileforce (optional) - boolean - Force uploading the file if it already existsunpause (optional) - boolean - The DAG will be unpaused on creation (Works only when uploading .py files)otf_sync (optional) - boolean - Check for newly created DAGs On The Fly!s3_syncDescription:Sync DAG files from an S3 bucket.Endpoint:http://{AIRFLOW_HOST}:{AIRFLOW_PORT}/api/v1/xtended/s3_syncMethod:POSTPOST request Arguments:s3_bucket_name - string - S3 bucket name where DAG files exists3_region - string - S3 region name where the specified bucket existss3_access_key - string - IAM access key having atleast S3 bucket read accesss3_secret_key - string - IAM secret key for the specifed access keys3_object_prefix (optional) - string - Filter results by object prefixs3_object_keys (optional) - string - Sync DAG files specifed by the object keys. Multiple object keys are seperated by comma (,)skip_purge (optional) - boolean - Skip emptying DAGs directoryotf_sync (optional) - boolean - Check for newly created DAGs On The Fly!Examples:curl-XPOST-H'Content-Type: multipart/form-data'\--user"username:password"\-F's3_bucket_name=test-bucket'\-F's3_region=us-east-1'\-F's3_access_key={IAM_ACCESS_KEY}'\-F's3_secret_key={IAM_SECRET_KEY}'\-F'skip_purge=y'\http://{AIRFLOW_HOST}:{AIRFLOW_PORT}/api/v1/xtended/s3_syncresponse:{"message":"dag files synced from s3","sync_status":{"synced":["test_dag0.py","test_dag1.py","test_dag2.py"],"failed":[]},"status":"success"}mongo_syncDescription:Sync DAG files from a mongo db collectionEndpoint:http://{AIRFLOW_HOST}:{AIRFLOW_PORT}/api/v1/xtended/mongo_syncMethod:POSTPOST request Arguments:connection_string - string - Source mongo server connection stringdb_name - string - Source mongo database namecollection_name - string - Collection name where DAG data exists in the specified dbfield_filename - string - DAGs are named using value of this document field from the specified collectionfield_dag_source - string - A document field referring the Python source for the yet-to-be created DAGsquery_filter (optional) - string - JSON query string to filter required documentsskip_purge (optional) - boolean - Skip emptying DAGs directoryotf_sync (optional) - boolean - Check for newly created DAGs On The Fly!Examples:curl-XPOST-H'Content-Type: multipart/form-data'\--user"username:password"\-F'connection_string={MONGO_SERVER_CONNECTION_STRING}'\-F'db_name=test_db'\-F'collection_name=test_collection'\-F'field_dag_source=dag_source'\-F'field_filename=dag_filename'\-F'skip_purge=y'\http://{AIRFLOW_HOST}:{AIRFLOW_PORT}/api/v1/xtended/mongo_syncresponse:{"message":"dag files synced from mongo","sync_status":{"synced":["test_dag0.py","test_dag1.py","test_dag2.py"],"failed":[]},"status":"success"}refresh_all_dagsDescription:Refresh all DAGs in the webserver.Endpoint:http://{AIRFLOW_HOST}:{AIRFLOW_PORT}/api/v1/xtended/refresh_all_dagsMethod:GETGET request Arguments:NoneExamples:curl-XGET--user"username:password"\http://{AIRFLOW_HOST}:{AIRFLOW_PORT}/api/v1/xtended/refresh_all_dagsresponse:{"message":"All DAGs are now up-to-date!!","status":"success"}scan_dagsDescription:Check for newly created DAGs.Endpoint:http://{AIRFLOW_HOST}:{AIRFLOW_PORT}/api/v1/xtended/scan_dagsMethod:GETGET request Arguments:otf_sync (optional) - boolean - Check for newly created DAGs On The Fly!Examples:curl-XGET--user"username:password"\http://{AIRFLOW_HOST}:{AIRFLOW_PORT}/api/v1/xtended/scan_dagsresponse:{"message":"Ondemand DAG scan 
complete!!","status":"success"}purge_dagsDescription:Empty DAG directory.Endpoint:http://{AIRFLOW_HOST}:{AIRFLOW_PORT}/api/v1/xtended/purge_dagsMethod:GETGET request Arguments:NoneExamples:curl-XGET--user"username:password"\http://{AIRFLOW_HOST}:{AIRFLOW_PORT}/api/v1/xtended/purge_dagsresponse:{"message":"DAG directory purged!!","status":"success"}delete_dagDescription:Delete a DAG in the web server from Airflow database and filesystem.Endpoint:http://{AIRFLOW_HOST}:{AIRFLOW_PORT}/api/v1/xtended/delete_dagMethod:GETGET request Arguments:dag_id (optional)- string - DAG idfilename (optional) - string - Name of the DAG file that needs to be deletedNote: Atleast one of args 'dag_id' or 'filename' should be specifiedExamples:curl-XGET--user"username:password"\'http://{AIRFLOW_HOST}:{AIRFLOW_PORT}/api/v1/xtended/delete_dag?dag_id=test_dag&filename=test_dag.py'response:{"message":"DAG [dag_test] deleted","status":"success"}upload_fileDescription:Upload a new File to the specified directory.Endpoint:http://{AIRFLOW_HOST}:{AIRFLOW_PORT}/api/v1/xtended/upload_fileMethod:POSTPOST request Arguments:file - file - File to be uploadedforce (optional) - boolean - Force uploading the file if it already existspath (optional) - string - Location where the file is to be uploaded (Default is the DAGs directory)Examples:curl-XPOST-H'Content-Type: multipart/form-data'\--user"username:password"\-F'file=@test_file.py'\-F'force=y'\http://{AIRFLOW_HOST}:{AIRFLOW_PORT}/api/v1/xtended/upload_fileresponse:{"message":"File [/{DAG_FOLDER}/dag_test.txt] has been uploaded","status":"success"}restart_failed_taskDescription:Restart failed tasks with downstream.Endpoint:http://{AIRFLOW_HOST}:{AIRFLOW_PORT}/api/v1/xtended/restart_failed_taskMethod:GETGET request Arguments:dag_id - string - DAG idrun_id - string - DagRun idExamples:curl-XGET--user"username:password"\'http://{AIRFLOW_HOST}:{AIRFLOW_PORT}/api/v1/xtended/restart_failed_task?dag_id=test_dag&run_id=test_run'response:{"message":{"failed_task_count":1,"clear_task_count":7},"status":"success"}kill_running_tasksDescription:Kill running tasks having status in ['none', 'running'].Endpoint:http://{AIRFLOW_HOST}:{AIRFLOW_PORT}/api/v1/xtended/kill_running_tasksMethod:GETGET request Arguments:dag_id - string - DAG idrun_id - string - DagRun idtask_id - string - If task_id is none, kill all tasks, else kill the specified task.Examples:curl-XGET--user"username:password"\'http://{AIRFLOW_HOST}:{AIRFLOW_PORT}/api/v1/xtended/kill_running_tasks?dag_id=test_dag&run_id=test_run&task_id=test_task'response:{"message":"tasks in test_run killed!!","status":"success"}run_task_instanceDescription:Create DagRun, run the specified tasks, and skip the rest.Endpoint:http://{AIRFLOW_HOST}:{AIRFLOW_PORT}/api/v1/xtended/run_task_instanceMethod:POSTPOST request Arguments:dag_id - string - DAG idrun_id - string - DagRun idtasks - string - task id(s), Multiple tasks are separated by comma (,)conf (optional)- string - Optional configuartion for creating DagRun.Examples:curl-XPOST-H'Content-Type: multipart/form-data'\--user"username:password"\-F'dag_id=test_dag'\-F'run_id=test_run'\-F'tasks=test_task'\http://{AIRFLOW_HOST}:{AIRFLOW_PORT}/api/v1/xtended/run_task_instanceresponse:{"execution_date":"2021-06-21T05:50:19.740803+0000","status":"success"}skip_task_instanceDescription:Skip one task instance.Endpoint:http://{AIRFLOW_HOST}:{AIRFLOW_PORT}/api/v1/xtended/skip_task_instanceMethod:GETGET request Arguments:dag_id - string - DAG idrun_id - string - DagRun idtask_id - string - task 
idExamples:curl-XGEThttp://{AIRFLOW_HOST}:{AIRFLOW_PORT}/api/v1/xtended/skip_task_instance?dag_id=test_dag&run_id=test_run&task_id=test_taskresponse:{"message":"<TaskInstance: test_dag.test_task 2021-06-21 19:59:34.638794+00:00 [skipped]> skipped!!","status":"success"}AcknowledgementsHuge shout out to these awesome plugins that contributed to the growth of Airflow ecosystem, which also inspired this plugin.airflow-rest-api-pluginairflow-rest-api-pluginsimple-dag-editorairflow-code-editor
airflow-yeedu-operator
Airflow Yeedu Operator

Installation

To use the Yeedu Operator in your Airflow environment, install it using the following command:

pip3 install airflow-yeedu-operator

DAG: Yeedu Job Execution

Overview

The YeeduJobRunOperator in this DAG facilitates the submission and monitoring of jobs using the Yeedu API in Apache Airflow. This DAG enables users to execute Yeedu jobs and handle their completion status and logs seamlessly within their Airflow environment.

Prerequisites

Before using this DAG, ensure you have:

- Access to the Yeedu API.
- Proper configuration of Airflow with required connections and variables (if applicable).

Usage

DAG Initialization

Import the necessary modules and instantiate the DAG with required arguments and schedule interval.

from datetime import datetime, timedelta
from airflow import DAG
from yeedu.operators.yeedu import YeeduJobRunOperator

# Define DAG arguments
default_args = {
    'owner': 'airflow',
    'depends_on_past': False,
    'start_date': datetime(2023, 1, 1),
    'retries': 1,
    'retry_delay': timedelta(minutes=5),
}

# Instantiate DAG
dag = DAG(
    'yeedu_job_execution',
    default_args=default_args,
    description='DAG to execute jobs using Yeedu API',
    schedule_interval='@once',
    catchup=False,
)

Task Initialization

Create tasks using YeeduJobRunOperator to perform various Yeedu API operations.

# Define YeeduJobRunOperator tasks
submit_job_task = YeeduJobRunOperator(
    task_id='submit_yeedu_job',
    job_conf_id='your_job_config_id',  # Replace with your job config ID
    token='your_yeedu_api_token',      # Replace with your Yeedu API token
    hostname='yeedu.example.com',      # Replace with your Yeedu API hostname
    workspace_id=123,                  # Replace with your Yeedu workspace ID
    dag=dag,
)

Execution

To execute this DAG:

1. Ensure all required configurations (job config ID, API token, hostname, workspace ID) are correctly provided in the task definitions.
2. Place the DAG file in the appropriate Airflow DAGs folder.
3. Trigger the DAG manually or based on the defined schedule interval.
4. Monitor the Airflow UI for task execution and logs.
airflux
Airflux

Airflux is a very simple installer and tmux session manager for Airflow.

It exists for those of us who deploy Airflow a lot; maybe you're developing an operator and need to test it by hand on Airflow 1.3, or you manage a collection of Airflow pipelines and need a quick way to start and stop local versions of them.

Getting started

Install airflux:

pip install airflux

Airflux uses constraints files to install Airflow for your version of Python. This command will list the versions of Airflow with available constraints files:

airflux versions

You will probably want to make a directory for your deployment. The script doesn't do this for you.

mkdir my-airflow-deployment && cd my-airflow-deployment

To download the constraints file and create a local Python environment for Airflow, specify the version number to airflux new:

airflux new 2.3.3

After the installation is complete, start Airflux in a tmux session:

airflux start

Once inside the tmux session, you can exit it and shut down Airflow:

airflux stop

To install Airflow into your current directory
airfly
AirFly: Auto Generate Airflow'sdag.pyOn The FlyPipeline management is essential for data operation in company, many engineering teams rely on tools like Airflow to help them organize workflows, such as ETL, data analytic jobs or machine learning projects.Airflow provides rich extensibility to let developers arrange workloads into a sequence of "operators", then they declare the task dependencies within aDAGcontext while writing thedag.pyfile.As workflow grows progressively, the increasing complexity of task relations prones to messing up the dag structure, leads to decrease of code maintainability, especially in collaborative scenarios.airflytries to mitigate such pain-points and brings automation to this development life cycle, it assumes all tasks are managed in certain python module, developers specify the dependencies while defining the task objects. During deployment,airflycan resolve the dependency tree and automatically build thedag.pyfor you.InstallDownloadairflyfrom PyPI$ pip install airfly $ airfly --help Usage: airfly [OPTIONS] Options: --version Show version and exit. -n, --name TEXT Assign to DAG id. -m, --modname TEXT Name of the module to search tasks for building the task dependency tree and using it to generate the airflow DAG file. -p, --path TEXT Insert into "sys.path" to include certain modules, multi-value is allowed. -e, --exclude-pattern TEXT Exclude the tasks from the dependency tree if their __qualname__ get matched with this regex pattern. -i, --includes TEXT Paths of python files, the code within will be included in the output DAG file, multi-value is allowed. -d, --dag-params TEXT Parameters to construct DAG object, defined by a dictionary in a python file. Pass this option with <python-file>:<variable> form, the <variable> should be the dictionary which will be passed to DAG as keyword arguments. --help Show this message and exit.Usageairflyexpects the implementations are populated in a Python module(or package), the task dependencies are declared by assigningupstreamsanddownstreamsattributes to each object. 
The task objects are actually wrappers for Airflow operators, whenairflywalks through the entire module, all tasks are discovered and collected, the dependency tree and theDAGcontext are automatically built, with someasthelpers,airflycan wrap all these information, convert them into python code, and finally save them todag.py.Wrap Airflow operator withAirflowTaskIn order to do codegen, collect the operator's metadata into aAirflowTasksubclass as following(seedemo):# in demo.pyfromairfly.model.airflowimportAirflowTaskclassprint_date(AirflowTask):operator_class="BashOperator"params=dict(bash_command="date")operator_classspecifies the class of the Airflow operator.The class name(print_date) will be mapped totask_idto the applied operator after code generation.paramswill be passed to operator as keyword argument.Declare task dependencyUseupstreamsanddownstreamsto specify task dependencies.# in demo.pyfromtextwrapimportdedenttemplated_command=dedent("""{% for i in range(5) %}echo "{{ ds }}"echo "{{ macros.ds_add(ds, 7)}}"echo "{{ params.my_param }}"{% endfor %}""")classtemplated(AirflowTask):operator_class="BashOperator"params=dict(depends_on_past=False,bash_command=templated_command,params={"my_param":"Parameter I passed in"})classsleep(AirflowTask):operator_class="BashOperator"params=dict(depends_on_past=False,bash_command="sleep 5",retries=3)upstreams=print_datedownstreams=(templated,)Generate thedag.pyfileWith commandline interface:$ airfly --name demo_dag --modname demo > dag.pyThe outputs indag.py:# This file is auto-generated by airfly 0.6.0fromairflow.modelsimportDAGfromairflow.operators.bashimportBashOperatorwithDAG("demo_dag")asdag:demo_print_date=BashOperator(task_id="demo.print_date",bash_command="date")demo_sleep=BashOperator(task_id="demo.sleep",depends_on_past=False,bash_command="sleep 5",retries=3)demo_templated=BashOperator(task_id="demo.templated",depends_on_past=False,bash_command="""{% for i in range(5) %}echo "{{ ds }}"echo "{{ macros.ds_add(ds, 7)}}"echo "{{ params.my_param }}"{% endfor %}""",params={"my_param":"Parameter I passed in"},)demo_print_date>>demo_sleepdemo_sleep>>demo_templatedInject parameters toDAGIf any additional arguments are needed, write and manage those configurations in a python file(seedemo),airflycan pass them toDAGduring codegen.# in params.pyfromdatetimeimporttimedeltafromairflow.utils.datesimportdays_agodefault_args={"owner":"airflow","depends_on_past":False,"email":["[email protected]"],"email_on_failure":False,"email_on_retry":False,"retries":1,"retry_delay":timedelta(minutes=5),# 'queue': 'bash_queue',# 'pool': 'backfill',# 'priority_weight': 10,# 'end_date': datetime(2016, 1, 1),# 'wait_for_downstream': False,# 'dag': dag,# 'sla': timedelta(hours=2),# 'execution_timeout': timedelta(seconds=300),# 'on_failure_callback': some_function,# 'on_success_callback': some_other_function,# 'on_retry_callback': another_function,# 'sla_miss_callback': yet_another_function,# 'trigger_rule': 'all_success'}dag_kwargs=dict(default_args=default_args,description="A simple tutorial DAG",schedule_interval=timedelta(days=1),start_date=days_ago(2),tags=["example"],)Inject those arguments with--dag-paramsoption:$ airfly --name demo_dag --modname demo --dag-params params.py:dag_kwargs > dag.pyThe outputs indag.py:# This file is auto-generated by airfly 0.6.0fromdatetimeimporttimedeltafromairflow.modelsimportDAGfromairflow.operators.bashimportBashOperatorfromairflow.utils.datesimportdays_ago# >>>>>>>>>> Include from 
'params.py'default_args={"owner":"airflow","depends_on_past":False,"email":["[email protected]"],"email_on_failure":False,"email_on_retry":False,"retries":1,"retry_delay":timedelta(minutes=5),}dag_kwargs=dict(default_args=default_args,description="A simple tutorial DAG",schedule_interval=timedelta(days=1),start_date=days_ago(2),tags=["example"],)# <<<<<<<<<< End of code insertionwithDAG("demo_dag",**dag_kwargs)asdag:demo_print_date=BashOperator(task_id="demo.print_date",bash_command="date")demo_sleep=BashOperator(task_id="demo.sleep",depends_on_past=False,bash_command="sleep 5",retries=3)demo_templated=BashOperator(task_id="demo.templated",depends_on_past=False,bash_command="""{% for i in range(5) %}echo "{{ ds }}"echo "{{ macros.ds_add(ds, 7)}}"echo "{{ params.my_param }}"{% endfor %}""",params={"my_param":"Parameter I passed in"},)demo_print_date>>demo_sleepdemo_sleep>>demo_templatedairflywraps required information including variables and imports into output python script, and pass the specified value toDAGobject.Exclude tasks from codegenBy passing--exclude-patternto match any unwanted objects with their__qualname__. then filter them out.$ airfly --name demo_dag --modname demo --exclude-pattern templated > dag.pyThe outputs indag.py:# This file is auto-generated by airfly 0.6.0fromairflow.modelsimportDAGfromairflow.operators.bashimportBashOperatorwithDAG("demo_dag")asdag:demo_print_date=BashOperator(task_id="demo.print_date",bash_command="date")demo_sleep=BashOperator(task_id="demo.sleep",depends_on_past=False,bash_command="sleep 5",retries=3)demo_print_date>>demo_sleepThetemplatedtask is gone.ExamplesPlease visitexamplesto explore more use cases.
airfoil-generator
No description available on PyPI.
airfoils
Airfoils is a small Python library for object-oriented airfoil modelling. The library provides tools to easily instantiate airfoil objects and to query geometric information. An airfoil object is defined by upper and lower surface coordinates.

Airfoil nomenclature. Image in the public domain, via Wikimedia Commons.

Features

- Airfoil generation with a NACA-4 series definition
- Import from files
  - Full support for airfoils from the UIUC Airfoil Coordinates Database
- Interpolation or computation of airfoil geometry parameters
  - Upper and lower surface coordinates
  - Camber line coordinates
  - Chord line coordinates (TODO)
  - Thickness distribution (TODO)
  - Maximum thickness (TODO)
- Linear interpolation between two different airfoils (MorphAirfoil)
- Plotting of airfoils

Example

>>> from airfoils import Airfoil
>>> foil = Airfoil.NACA4('4812')
>>> foil.plot()
>>> foil.y_upper(x=0.5)
array(0.13085448)
>>> foil.y_lower(x=[0.2, 0.6, 0.85])
array([0.00217557, 0.02562315, 0.01451318])
>>> foil.camber_line(x=0.5)
0.07789290253609385

Installation

pip install airfoils

Documentation

https://airfoils.readthedocs.io/

License

License: Apache-2.0
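Building only on the calls shown in the example above, here is a small sketch that estimates the local thickness as the gap between the upper and lower surfaces; the sampled stations are an arbitrary choice, not something prescribed by the library.

from airfoils import Airfoil

foil = Airfoil.NACA4('4812')
stations = [0.05, 0.25, 0.5, 0.75, 0.95]  # arbitrary chordwise sample points
# y_upper/y_lower accept a list of x values and return arrays (see the example output above),
# so the local thickness can be estimated element-wise as their difference.
thickness = foil.y_upper(x=stations) - foil.y_lower(x=stations)
print(f"max sampled thickness: {thickness.max():.4f}")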
airfoil-utils
Generate printable files for airfoils from UIUC Airfoil database
airfold
Failed to fetch description. HTTP Status Code: 404
airfold-cli
Airfold CLI

Use it to manage your projects and call Airfold from your CI/CD pipelines.

Install

pip install airfold-cli

Usage

af --help
airfold-common
Airfold Common
airframe
AirFrame===============================.. image:: https://badge.fury.io/py/airframe.png:target: http://badge.fury.io/py/airframe.. image:: https://pypip.in/d/airframe/badge.png:target: https://crate.io/packages/airframe?version=latestDownload images from an authenticated Flickr account (or local filesystem) andpush them wirelessly to a Toshiba FlashAir Wifi SD card mounted in a digitalphoto frame.* Free software: ASL2 license* Documentation: http://documentup.com/virantha/airframe* Source: http://github.com/virantha/airframe* API docs: http://virantha.github.com/airframe/htmlFeatures--------* Authenticates to Flickr to get your private photos* Only downloads photos with specified tags* Alternativly, can sync files from a local directory* Caches and syncs the photos to the Wifi SD cardInstallation------------.. code-block:: bash$ pip install airframeUsage-----First, go to Flickr and get a private key at http://www.flickr.com/services/api/misc.api_keys.htmlThen, create a directory from where you will start airframe, and create a file called flickr_api.yaml:.. code-block:: yamlkey: "YOUR_API_KEY"secret: "YOUR_API_SECRET"Then, setup your FlashAir card as described in `this post's<http://virantha.com/2014/01/09/hacking-together-a-wifi-photo-frame-with-a-toshiba-flashair-sd-card-wireless-photo-uploads>`__"Enabling the FlashAir" section.Now, you're ready to sync some photos! Just run:.. code-block:: bash$ airframe -n 100 -t photoframe YOUR_AIRFRAME_IPThis will download and sync the 100 most recent photos tagged with "photoframe" to yourAirFrame... warning:: Any other image files in the FlashAir upload directory will be deleted, so make sure you backup anything you want to keep from your SD card.The image files from Flickr will be cached in a sub-directory called``.airframe`` in the location you invoked airframe from, so as long as you rerunfrom the same directory, the script will only download new files from Flickr. If you want toredownload all the files from scratch, just ``rm .airframe`` these files.The script will also only upload new images to the FlashAir card, and ignore any files that arealready present on the card. If you want to force a clean upload, do the following:.. code-block:: bash$ airframe -n 100 -t photoframe -f YOUR_AIRFRAME_IPThis will delete all images already on the card, and upload every image again.Alternatively, you can sync files directly from your local computer by pointingthe script at a directory of ``.jpg`` files:.. code-block:: bash$ airframe -l /path/to/photos YOUR_AIRFRAME_IPNote: other flags are ignored in this mode... :changelog:History-------0.1.0 (2014-01-10)++++++++++++++++++* First release on PyPI.Todo list=========- Add tests- Add docstrings- Travis integration- Use docopt instead of argparse- Better error handling
airfrans
AirfRANS: High Fidelity Computational Fluid Dynamics Dataset for Approximating Reynolds-Averaged Navier–Stokes SolutionsThe AirfRANS dataset makes available numerical resolutions of the incompressible Reynolds-Averaged Navier–Stokes (RANS) equations over the NACA 4 and 5 digits series of airfoils and in a subsonic flight regime setup. Readthedocs documentation is availablehere.FeaturesAccess to 1000 simulations.Reynolds number between 2 and 6 million.Angle of attack between -5° and 15°.Airfoil drawn in the NACA 4 and 5 digits series.Four machine learning tasks representing different challenges.InstallationInstall withpip install airfransUsageDownloading the datasetFrom python:import airfrans as af af.dataset.download(root = PATH_TO_SAVING_DIRECTORY, unzip = True)You can also directly download a ready-to-use version of the dataset in thePyTorch Geometric libraryFinally, you can directly download the dataset in the raw OpenFOAM versionhere, or in the more friendly pre-processed versionhere.Loading the datasetFrom python:import airfrans as af dataset, dataname = af.dataset.load(root = PATH_TO_DATASET, task = TASK, train = True)The tasks are the one presented in theassociated paper. You can choose between'full','scarce','reynolds' and'aoa'. The dataset loaded in this case is the same as the one you can directly access via thePyTorch Geometric library. If you want more flexibility about the sampling of each simulation for the inputs or targets, please feel free to build a custom loader with the help of the'Simulation'class presented in the following. We highly recommend to handle those data with a Gemetric Deep Learning library such asPyTorch GeometricorDeep Graph Library.Simulation classThe'Simulation'class is an object to facilitate the manipulation of AirfRANS simulations. Given the root folder of where the directories of the simulations have been saved and the name of a simulation you can easily manipulate it.import airfrans as af name = 'airFoil2D_SST_57.872_7.314_5.454_3.799_13.179' simulation = af.Simulation(root = PATH_TO_DATASET, name = name)See the documentation for more details about this object.LicenseThis project is licensed under theODbL-1.0 License.ReferenceThe original paper accepted at the 36th Conference on Neural Information Processing Systems (NeurIPS 2022) Track on Datasets and Benchmarks can be foundhereand the preprinthere.Disclaimer: An important update correcting an inconsistency in the Machine Learning experiments proposed in the main part of the NeurIPS version of the paper has been done. Please refer to theArXiv versionfor the up to date version.Please cite this paper if you use this dataset in your own work.@inproceedings{ bonnet2022airfrans, title={Airf{RANS}: High Fidelity Computational Fluid Dynamics Dataset for Approximating Reynolds-Averaged Navier{\textendash}Stokes Solutions}, author={Florent Bonnet and Jocelyn Ahmed Mazari and Paola Cinnella and Patrick Gallinari}, booktitle={Thirty-sixth Conference on Neural Information Processing Systems Datasets and Benchmarks Track}, year={2022}, url={https://arxiv.org/abs/2212.07564} }
airfs
airfs: A Python library for cloud and remote file systems

airfs brings standard Python I/O to various storages (cloud object storage, remote file systems, ...) by providing:

- Abstract classes of cloud objects with the complete io.RawIOBase and io.BufferedIOBase standard interfaces.
- Features equivalent to the standard library (io, os, os.path, shutil) for seamlessly managing storage objects and local files.

These functions are source agnostic and always provide the same interface for all files, whether from storage or the local file system.

Buffered storage objects also support the following features:

- Buffered asynchronous writing of any object size.
- Buffered asynchronous preloading in reading mode.
- Write or read lock depending on memory usage limitation.
- Maximization of bandwidth using parallel connections.

For more information, refer to the documentation.

Supported storage

airfs is compatible with the following storage services:

- Alibaba Cloud OSS
- AWS S3 / MinIO
- GitHub (read only)
- Microsoft Azure Blobs Storage
- Microsoft Azure Files Storage
- OpenStack Swift Object Store

airfs can also access any publicly accessible file via HTTP/HTTPS (read only).

"airfs" is a fork of the unmaintained "Pycosio" project by its main developer.
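A minimal, hedged sketch of what the standard-library-style interface described above might look like in practice. The call names (airfs.open, airfs.copy) are assumptions based on this description and on the Pycosio lineage, and the URL is hypothetical; check the airfs documentation for the exact API.

import airfs

# Read a publicly accessible file over HTTPS as if it were local (read-only, per the description).
with airfs.open("https://example.com/data.csv", "rt") as remote_file:  # hypothetical URL
    header = remote_file.readline()
    print(header)

# Copy the remote object to the local filesystem, shutil-style (assumed helper name).
airfs.copy("https://example.com/data.csv", "local_copy.csv")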
airgen
Python API for airgen
airgiant
This is the airgiant program.
airgoodies
Airgoodies

Airgoodies is a project that contains various APIs to interact with external services inside Apache Airflow using minimum configuration (see airgoodies.{module}.variables.json for each module).

Current version matrix:

| Airgoodies Version | Apache Airflow Version | Python Version | Project tag  |
|--------------------|------------------------|----------------|--------------|
| 0.0.4              | 2.7.2                  | 3.11           | v0.0.4       |
| 0.0.3              | 2.7.2                  | 3.11           | v0.0.3       |
| 0.0.2              | 2.7.2                  | 3.11           | v0.0.2       |
| 0.0.1-alpha        | 2.7.2                  | 3.11           | v0.0.1-alpha |

Provided goodies for version 0.0.4:

| Module             | Description                                      | Dependency Versions |
|--------------------|--------------------------------------------------|---------------------|
| airgoodies.command | API for dynamic task configuration through YAML  | pyyaml==6.0.1 |
| airgoodies.aws     | API for easy interaction with AWS                | pandas==2.1.1, apache-airflow-providers-amazon===8.7.1 |
| airgoodies.mongo   | API for easy interaction with MongoDB            | pymongo==4.5.0 |
| airgoodies.xcom    | API for managing variables through XCom          | None |

Installation

Add the following requirement in your requirements.txt:

# requirements.txt
airgoodies=0.0.4

Example usage

For the official documentation, see here.
For an example of how to use this project, see here.

Building the project

To build the project:

$ python3 setup.py sdist bdist_wheel

License

This project is available under the MIT License.

Author

Stavros Grigoriou (stav121)
airgpt
airgpt

This repository contains airgpt: a command line tool empowered by AI and LLMs. It calls OpenAI to get the appropriate command based on user prompts, and executes the command to get the results.

Installation

Sign up with OpenAI to get an API key. Set up the environment variable OPENAI_API_KEY in a terminal:

export OPENAI_API_KEY=<your_openai_api_key>

Make sure you have Python 3.10+ installed. Install airgpt with the following command:

pip3 install airgpt

When the installation is complete, run the following command to start airgpt:

airgpt

Developing the Python source code locally

The following steps could be used to develop airgpt locally:

Clone the repository from GitHub:

git clone https://github.com/kp-enterprise/airgpt.git

Create a virtual environment and install the required libraries:

cd airgpt
python3 -m venv venv
source venv/bin/activate
pip3 install -r requirements.txt

Run the following command to start airgpt:

python3 airgpt.py
airgram
A python wrapper for making calls to the Airgram API, which enables you to send push notifications to your mobile devices. Since it is a very shallow wrapper, you can refer to the official api reference for details on the functions.

Examples

At the time of writing (2015-08-20) airgram is using wrong certificates (see), which are intended for herokuapp.com. Because of this, cert verification needs to be turned off.

Using as a guest

from airgram import Airgram

ag = Airgram(verify_certs=False)
# Send a message to a user
ag.send_as_guest("[email protected]", "Test message from Airgram API", "http://example.com")

Using with an authenticated airgram service

from airgram import Airgram

ag = Airgram(key="MY_SERVICE_KEY", secret="MY_SERVICE_SECRET", verify_certs=False)
# Subscribe an email to the service
ag.subscribe("[email protected]")
# Send a message to a subscriber
ag.send("[email protected]", "Hello, how are you?")
# Send a message to ALL subscribers
ag.broadcast("Airgram for python is awesome", url="https://github.com/the01/python-airgram")

History

0.1.3 (2015-08-25)
- BugFix: added MANIFEST.in (fix install problem)

0.1.2 (2015-08-21)
- BugFix: Correct wrong api url

0.1.1 (2015-08-21)
- Add module logger
- Add class logger
- Functions throw AirgramException on failure

0.1.0 (2015-07-30)
- First release on PyPI.
airgun
https://github.com/SatelliteQE/airgun.git
airgym
XPlane Gym Environment

This project provides an OpenAI Gym environment for training reinforcement learning agents on an XPlane simulator. The environment allows agents to control an aircraft and receive rewards based on how well they perform a task, such as flying a certain trajectory or landing safely.

Installation

To install the package, run the following command:

pip install airgym

Usage/Examples

To use the environment in your Python code, you can import it as follows:

import airgym
import gym

# If XPlane is running on the same machine, you can use the default address and port.
# Or, set ip address and port according to your configuration.
env = gym.make('AirGym-v1')

episodes = 0  # set to the number of episodes you want to run

for episode in range(episodes):
    obs = env.reset()
    done = False
    while not done:
        action = env.action_space.sample()  # sample a random action
        obs, reward, done, info = env.step(action)

env.close()
airgym-api
Failed to fetch description. HTTP Status Code: 404
airhttprunner
HttpRunner

HttpRunner is a simple & elegant, yet powerful HTTP(S) testing framework. Enjoy! ✨ 🚀 ✨

Design Philosophy

- Convention over configuration
- ROI matters
- Embrace open source, leverage requests, pytest, pydantic, allure and locust.

Key Features

- Inherit all powerful features of requests, and handle HTTP(S) in a human-friendly way.
- Define testcases in YAML or JSON format and run them with pytest in a concise and elegant manner.
- Record and generate testcases with HAR support.
- Supports variables/extract/validate/hooks mechanisms to create extremely complex test scenarios.
- With the debugtalk.py plugin, any function can be used in any part of your testcase.
- With jmespath, extracting and validating JSON responses has never been easier (a short standalone sketch follows this entry).
- With pytest, hundreds of plugins are readily available.
- With allure, test reports can be pretty nice and powerful.
- With reuse of locust, you can run performance tests without extra work.
- CLI commands supported, perfect combination with CI/CD.

Sponsors

Thank you to all our sponsors! ✨🍰✨ (become a sponsor)

Gold Sponsor (金牌赞助商)

Hogwarts Testing Academy (霍格沃兹测试学院) is an industry-leading, high-end education brand for test development, owned by Ceba (Beijing) Technology Co., Ltd. (测吧（北京）科技有限公司). Its courses are taught by front-line testing experts from BAT companies and provide hands-on training in API test automation, mobile test automation, performance testing, continuous integration and DevOps, as well as referral services for outstanding test-development talent. Click to learn more! Hogwarts Testing Academy is the first gold sponsor of HttpRunner.

Open Source Sponsor (开源服务赞助商)

HttpRunner is in the Sentry Sponsored plan.

Subscribe

Follow the HttpRunner WeChat official account to get the latest news first.
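To make the jmespath bullet above concrete, here is a tiny standalone sketch that uses the jmespath library directly on a made-up response payload; it is illustrative only and is not part of the airhttprunner API itself.

import jmespath

# A made-up JSON response of the kind an HTTP test step would receive.
response_json = {"code": 0, "data": {"token": "abc123", "items": [{"id": 1}, {"id": 2}]}}

token = jmespath.search("data.token", response_json)            # extract -> "abc123"
first_id = jmespath.search("data.items[0].id", response_json)   # extract -> 1

# The kind of checks a validate step performs.
assert token == "abc123"
assert first_id == 1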
airi
AIRi is a Twisted-based Bluetooth service + web server that allows you to configure, control and watch AIRcable's AIRi cameras, as well as AIRcable's OptiEyes cameras.
airiam
AirIAM is an AWS IAM to least privilege Terraform execution framework. It compiles AWS IAM usage and leverages that data to create a least-privilege IAM Terraform that replaces the exiting IAM management method.AirIAM was created to promote immutable and version-controlled IAM management to replace today's manual and error prone methods.Table of contentsTable of contentsIntroductionFeaturesCommandsUsageData FlowExamplesGetting StartedInstallationUsing PipUsing brew (MacOS Only)Recommended flowFAQAlternativesAWS IAM Cleanup ToolsAWS IAM Policy Management ToolsMigration of AWS to Terraform ToolsContributingSupportIntroductionAirIAM scans existing IAM usage patterns and provides a simple method to migrate IAM configurations into a right-sized Terraform plan. It identifies unused users, roles, groups, policies and policy attachments and replaces them with a Least Privileges Terraform code modelled to manage AWS IAM.By moving all IAM configurations into Terraform code, admins can start tracking, auditing and modifying IAM configurations as part of their standard infrastructure-as-code development provisioning processes.AirIAM is battle-tested and is recommended for use in Dev, QA and test environments that have been previously managed by humans. It is design to result in minimal impact on existing workloads.If you are interested in migrating a Prod account, contact us [email protected] some helpful tips.FeaturesDetects unused IAM resources using native AWS andAmazon Access AdvisorAPIs.Provides scripts to remove unused entities en-masse.Effortless migration of existing IAM configurations into a simple Least Privileges Terraform model.Integrates withCheckov, a static-code analysis tool for Terraform, to track unwanted configuration changes and configuration drift.Commandsfind_unused- Detects unused users, roles, groups, policies and policy attachments. It also adds links to automation scripts that could remove these entities entirely using Bridgecrew Community.Learn more about these scripts and automation.usage:airiamfind_unused[-h][-pPROFILE][-lLAST_USED_THRESHOLD][--no-cache][-o{cli}]optionalarguments:-h,--helpshowthishelpmessageandexit-pPROFILE,--profilePROFILEAWSprofiletobeused(default:None)-lLAST_USED_THRESHOLD,--last-used-thresholdLAST_USED_THRESHOLD"Last Used"threshold,indays,foranentitytobeconsideredunused(default:90)--no-cacheGenerateafreshsetofdatafromAWSIAMAPIcalls(default:False)-o{cli},--output{cli}Outputformat(default:OutputFormat.cli)recommend_groups- Identifies what permissions are in use and creates 3 generalized groups according to that usage. Supported groups:Admins - Users who have the AdministratorAccess policy attached. It will be added to the admins group which will have the managed policyarn:aws:iam::aws:policy/AdministratorAccessattached.PowerUsers - Users who have write access toanyof the services. In case of more than 10 policies being attached to that group, a number of groups will be created for PowerUsers, and the relevant users will be members of all of them.ReadOnly - Users who only have read access to the account. 
Will be members of the readonly group which will have the managed policyarn:aws:iam::aws:policy/ReadOnlyAccessattached.usage:airiamrecommend_groups[-h][-pPROFILE][-o{cli}][-lLAST_USED_THRESHOLD][--no-cache]optionalarguments:-h,--helpshowthishelpmessageandexit-pPROFILE,--profilePROFILEAWSprofiletobeused(default:None)-o{cli},--output{cli}Outputformat(default:OutputFormat.cli)-lLAST_USED_THRESHOLD,--last-used-thresholdLAST_USED_THRESHOLD"Last Used"threshold,indays,foranentitytobeconsideredunused(default:90)--no-cacheGenerateafreshsetofdatafromAWSIAMAPIcalls(default:False)terraform- Creates Terraform files based on the outputs and the transformations applied by the optional arguments supplied.usage:airiamterraform[-h][-pPROFILE][-dDIRECTORY][--without-unused][--without-groups][-lLAST_USED_THRESHOLD][--no-cache][--without-import]optionalarguments:-h,--helpshowthishelpmessageandexit-pPROFILE,--profilePROFILEAWSprofiletobeused(default:None)-dDIRECTORY,--directoryDIRECTORYPathwheretheoutputterraformcodeandstatewillbestored(default:results)--without-unusedCreateterraformcodewithoutunusedentities(default:False)--without-groupsCreateterraformcodewithoutrecommendationforusergroups(default:False)-lLAST_USED_THRESHOLD,--last-used-thresholdLAST_USED_THRESHOLD"Last Used"threshold,indays,foranentitytobeconsideredunused(default:90)--no-cacheGenerateafreshsetofdatafromAWSIAMAPIcalls(default:False)--without-importImporttheresultingentitiestoterraform'sstatefile.Note-thismighttakealongtime(default:False)Important notes forterraformcommand:a. AirIAM replaces all hardcoded values with the matching terraform references, which results in replacements of all group memberships and policy attachments. If this is run using a user, please make sure the user has the relevant privileges directly attached. A matching warning will be displayed if relevant.c. AirIAM tags all the resources it touched so it will be easy to identify the entities which are not managed through AirIAM. This results in terraform modifying the relevant entities by adding these tags.d. By default, AirIAM will import the currently existing IAM entities and their relationships, which might take a while depending on the number of configurations.UsageThe three commands above run sequentially, and in-sync, as seen in the diagram below.When executing, AirIAM starts by scanning a selected AWS account using the specified profile. Iffind_unusedis specified, the results are printed and the execution completes. Ifrecommend_groupsis specified, after the stage of group recommendation the results are printed and the execution completes. If theterraformcommand is specified it takes all the results and creates the Terraform code and state file required to replace the existing IAM configuration.Data FlowExamplesGetting StartedInstallationUsing Pippip3 install airiam --userUsing brew (MacOS Only)brew tap bridgecrewio/airiam https://github.com/bridgecrewio/airiam brew update brew install airiamRecommended FlowThe recommended workflow for using this tool is as follows:Run thefind_unusedcommand and delete the unused access keys + unused console logins - these cannot be migrated to terraform because they hold secrets known only to the relevant user - his password and private credentials.Run theterraformcommand without any flags, creating a terraform setup that mirrors your existing IAM setup. 
This will take a while as all of the entities will be imported to your state fileCommit the terraform files (without the state file) to a new repository.Run the terraform command again, this time with the flag--without-importand--without-unused. This will edit the .tf files to contain only the entities that are in use.Create a new branch and commit the new terraform files.Create a Pull Request / Merge Request from this branch to the default branch. Check out the differences and make sure all the changes are good. Consult relevant stakeholders in your organization if necessary.After approval - merge the PR and apply the changes using terraform apply. Please note this action will require Admin IAM access to the account.FAQIf you run into the following error:airiam is not recognized as an internal or external commandPlease make sure python is in yourPATHby running the following command:export PATH="/Users//Library/Python/3.7/bin:$PATH"AlternativesAWS IAM Cleanup ToolsFor AWS IAM usage scanners check outCloudTracker,Trailscraper,Aadvark&Repokid. The main difference between these tools and AirIAM is that AirIAM also moves the problem into static terraform code form, which allows an entire set of code analysis tools to manage and identify deviations and changes.AWS IAM Policy Management ToolsFor static IAM policy linting, check outParliament. Parliament is actually integrated into AirIAM, and is run on the policies it gets from your AWS account.For automatically creating IAM policies and managing them as code, check outaws-iam-generator,PolicySentry.Cloudsplainingis another tool from salesforce that analyzes existing IAM set-up and identifies risky / over privileged roles.These tools help create better policies, but do not help with existing AWS IAM set-up.Migration of AWS to Terraform ToolsFor other tools that help migrate existing AWS IAM set-up to terraform, check outterracognitaandterraforming. AirIAM is the only tool which supports migrating all relevant IAM entities to terraform v0.12.ContributingContribution is welcomed!We would love to hear about other IAM governance models for additional use cases as well as new ways to identify over-permissive IAM resources.SupportBridgecrewbuilds and maintains AirIAM to encourage the adoption of IAM-as-code and enforcement of IAM Rightsizing and Least Privileges best practices in policy-as-code.Start with ourDocumentationfor quick tutorials and examples.If you need direct support you can contact us [email protected].
airignis
AirignisThis package aims to provide a C# familiar way of handling events in addition to the possibility to schedule the execution of periodic events on different time bases (year, month, weak, day, hour, minute, second) called auto events. When using this package, please keep in mind that the timing precision of the auto event executions mainly depends on the real-time ability of your operating system. Therefore, it is not intended for applications requiring a high level of timing precision.Installation$pipinstallairignisHow to use this package?Create an Event and add subscriber's callbacksfromairignisimportEventdeffoo(data):print(f"doing something with the data:{data}")# creating an Event objectmy_event=Event()# adding the foo() function as subscriber callbackmy_event+=fooAfter importing the Event class an object can be created. Following, one or multiple callback functions can be added with the += operator.Invoke the Event and pass arguments to the callbackmy_event.invoke(data)The type and number of arguments of the subscriber callbacks must match with the data template accepted by the Event.Remove a subscriber's callback# removing the foo() function from the list of subscriber callbacksmy_event-=fooSchedule an AutoEvent and set the function to be executed at each due timeThis example shows a minimalistic usage of the AutoEvent class. Please refer to the package documentation for a detailed showcase of the functionalities.fromairignisimportAutoEvent,duetimefromdatetimeimportdatetime,timedeltadefsay_hello(name:str):print(f"Hello{name}")rate=2rate_unit='second'exec_time=20# Specify the target due timetest_due_time=duetime(rate=rate,rate_unit=rate_unit)# Defining the stop date time of the auto event. Should stop after 20 secondsstop_datetime=datetime.now(test_due_time.timezone)+timedelta(seconds=exec_time)auto_event=AutoEvent(say_hello,test_due_time,stop_datetime,'world')# starting the auto event handlerauto_event.start()# Interrupting the auto event handlerauto_event.stop()License© 2022 Subvael
airium
AiriumBidirectionalHTML-pythontranslator.Key features:simple, straight-forwardtemplate-less (just the python, you may say goodbye to all the templates)DOM structure is strictly represented by python indentation (with context-managers)gives much cleanerHTMLthan regular templatesequipped with reverse translator:HTMLto pythoncan output either pretty (default) or minifiedHTMLcodeGeneratingHTMLcode in python usingairiumBasicHTMLpage (hello world)fromairiumimportAiriuma=Airium()a('<!DOCTYPE html>')witha.html(lang="pl"):witha.head():a.meta(charset="utf-8")a.title(_t="Airium example")witha.body():witha.h3(id="id23409231",klass='main_header'):a("Hello World.")html=str(a)# casting to string extracts the value# or directly to UTF-8 encoded bytes:html_bytes=bytes(a)# casting to bytes is a shortcut to str(a).encode('utf-8')print(html)Prints such a string:<!DOCTYPE html><htmllang="pl"><head><metacharset="utf-8"/><title>Airium example</title></head><body><h3id="id23409231"class="main_header">Hello World.</h3></body></html>In order to store it as a file, just:withopen('that/file/path.html','wb')asf:f.write(bytes(html))Simple image in a divfromairiumimportAiriuma=Airium()witha.div():a.img(src='source.png',alt='alt text')a('the text')html_str=str(a)print(html_str)<div><imgsrc="source.png"alt="alt text"/>the text</div>TablefromairiumimportAiriuma=Airium()witha.table(id='table_372'):witha.tr(klass='header_row'):a.th(_t='no.')a.th(_t='Firstname')a.th(_t='Lastname')witha.tr():a.td(_t='1.')a.td(id='jbl',_t='Jill')a.td(_t='Smith')# can use _t or textwitha.tr():a.td(_t='2.')a.td(_t='Roland',id='rmd')a.td(_t='Mendel')table_str=str(a)print(table_str)# To store it to a file:withopen('/tmp/airium_www.example.com.py')asf:f.write(table_str)Nowtable_strcontains such a string:<tableid="table_372"><trclass="header_row"><th>no.</th><th>Firstname</th><th>Lastname</th></tr><tr><td>1.</td><tdid="jbl">Jill</td><td>Smith</td></tr><tr><td>2.</td><tdid="rmd">Roland</td><td>Mendel</td></tr></table>Chaining shortcut for elements with only one childNew in version 0.2.2Having a structure with large number ofwithstatements:fromairiumimportAiriuma=Airium()witha.article():witha.table():witha.thead():witha.tr():a.th(_t='Column 1')a.th(_t='Column 2')witha.tbody():witha.tr():witha.td():a.strong(_t='Value 1')a.td(_t='Value 2')table_str=str(a)print(table_str)You may use a shortcut that is equivalent to:fromairiumimportAiriuma=Airium()witha.article().table():witha.thead().tr():a.th(_t="Column 1")a.th(_t="Column 2")witha.tbody().tr():a.td().strong(_t="Value 1")a.td(_t="Value 2")table_str=str(a)print(table_str)<article><table><thead><tr><th>Column 1</th><th>Column 2</th></tr></thead><tbody><tr><td><strong>Value 1</strong></td><td>Value 2</td></tr></tbody></table></article>OptionsPretty or MinifyBy default, airium biuldsHTMLcode indented with spaces and with line breaks being line feed\ncharacters. It can be changed while creating anAiriuminstance. In general all avaliable arguments whit their default values are:a=Airium(base_indent=' ',# strcurrent_level=0,# intsource_minify=False,# boolsource_line_break_character="\n",# str)minifyThat's a mode when size of the code is minimized, i.e. contains as less whitespaces as it's possible. 
The option can be enabled withsource_minifyargument, i.e.:a=Airium(source_minify=True)In case if you need to explicitly add a line break in the source code (not the<br/>):a=Airium(source_minify=True)a.h1(_t="Here's your table")witha.table():witha.tr():a.break_source_line()a.th(_t="Cell 11")a.th(_t="Cell 12")witha.tr():a.break_source_line()a.th(_t="Cell 21")a.th(_t="Cell 22")a.break_source_line()a.p(_t="Another content goes here")Will result with such a code:<h1>Here's your table</h1><table><tr><th>Cell 11</th><th>Cell 12</th></tr><tr><th>Cell 21</th><th>Cell 22</th></tr></table><p>Another content goes here</p>Note that thebreak_source_linecannot be used incontext manager chains.indent styleThe default indent of the generated HTML code has two spaces per each indent level. You can change it to\tor 4 spaces by settingAiriumconstructor argument, e.g.:a=Airium(base_indent="\t")# one tab symbola=Airium(base_indent=" ")# 4 spaces per each indentation levela=Airium(base_indent=" ")# 1 space per one level# pick one of the above statements, it can be mixed with other argumentsNote that this setting is ignored whensource_minifyargument is set toTrue(see above).There is a special case when you set the base indent to empty string. It would disable indentation, but line breaks will be still added. In order to get rid of line breaks, check thesource_minifyargument.indent levelThecurrent_levelbeing an integer can be set to non-negative value, wich will causeairiumto start indentation with level offset given by the number.line break characterBy default, just a line feed (\n) is used for terminating lines of the generated code. You can change it to different style, e.g.\r\nor\rby settingsource_line_break_characterto the desired value.a=Airium(source_line_break_character="\r\n")# windows' styleNote that the setting has no effect whensource_minifyargument is set toTrue(see above).Using airium with web-frameworksAirium can be used with frameworks like Flask or Django. It can completely replace template engines, reducing code-files scater, which may bring better code organization, and some other reasons.Here is an example of using airium with django. 
It implements reusablebasic_bodyand a view calledindex.# file: your_app/views.pyimportcontextlibimportinspectfromairiumimportAiriumfromdjango.httpimportHttpResponse@contextlib.contextmanagerdefbasic_body(a:Airium,useful_name:str=''):"""Works like a Django/Ninja template."""a('<!DOCTYPE html>')witha.html(lang='en'):witha.head():a.meta(charset='utf-8')a.meta(content='width=device-width, initial-scale=1',name='viewport')# do not use CSS from this URL in a production, it's just for an educational purposea.link(href='https://unpkg.com/@picocss/[email protected]/css/pico.css',rel='stylesheet')a.title(_t=f'Hello World')witha.body():witha.div():witha.nav(klass='container-fluid'):witha.ul():witha.li():witha.a(klass='contrast',href='./'):a.strong(_t="⌨ Foo Bar")witha.ul():witha.li():a.a(klass='contrast',href='#',**{'data-theme-switcher':'auto'},_t='Auto')witha.li():a.a(klass='contrast',href='#',**{'data-theme-switcher':'light'},_t='Light')witha.li():a.a(klass='contrast',href='#',**{'data-theme-switcher':'dark'},_t='Dark')witha.header(klass='container'):witha.hgroup():a.h1(_t=f"You're on the{useful_name}")a.h2(_t="It's a page made by our automatons with a power of steam engines.")witha.main(klass='container'):yield# This is the point where main content gets insertedwitha.footer(klass='container'):witha.small():margin='margin: auto 10px;'a.span(_t='© Airium HTML generator example',style=margin)# do not use JS from this URL in a production, it's just for an educational purposea.script(src='https://picocss.com/examples/js/minimal-theme-switcher.js')defindex(request)->HttpResponse:a=Airium()withbasic_body(a,f'main page:{request.path}'):witha.article():a.h3(_t="Hello World from Django running Airium")witha.p().small():a("This bases on ")witha.a(href="https://picocss.com/examples/company/"):a("Pico.css / Company example")witha.p():a("Instead of a HTML template, airium has been used.")a("The whole body is generated by a template ""and the article code looks like that:")witha.code().pre():a(inspect.getsource(index))returnHttpResponse(bytes(a))# from django.http import HttpResponseRoute it inurls.pyjust like a regular view:# file: your_app/urls.pyfromdjango.contribimportadminfromdjango.urlsimportpathimportyour_appurlpatterns=[path('index/',your_app.views.index),path('admin/',admin.site.urls),]The result ing web page on my machine looks like that:Reverse translationAirium is equipped with a transpiler[HTML -> py]. It generates python code out of a givenHTMLstring.Using reverse translator as a binary:Ensure you haveinstalled[parse]extras. Then call in command line:airiumhttp://www.example.comThat will fetch the document and translate it to python code. The code callsairiumstatements that reproduce theHTMLdocument given. It may give a clue - how to defineHTMLstructure for a given web page usingairiumpackage.To store the translation's result into a file:airiumhttp://www.example.com>/tmp/airium_example_com.pyYou can also parse localHTMLfiles:airium/path/to/your_file.html>/tmp/airium_my_file.pyYou may also try to parse your Django templates. 
I'm not sure if it works, but there will be probably not much to fix.Using reverse translator as python code:fromairiumimportfrom_html_to_airium# assume we have such a page given as a string:html_str="""\<!DOCTYPE html><html lang="pl"><head><meta charset="utf-8" /><title>Airium example</title></head><body><h3 id="id23409231" class="main_header">Hello World.</h3></body></html>"""# to convert the html into python, just call:py_str=from_html_to_airium(html_str)# airium tests ensure that the result of the conversion is equal to the string:assertpy_str=="""\#!/usr/bin/env python# File generated by reverse AIRIUM translator (version 0.2.6).# Any change will be overridden on next run.# flake8: noqa E501 (line too long)from airium import Airiuma = Airium()a('<!DOCTYPE html>')with a.html(lang='pl'):with a.head():a.meta(charset='utf-8')a.title(_t='Airium example')with a.body():a.h3(klass='main_header', id='id23409231', _t='Hello World.')"""Transpiler limitationsso far in version 0.2.2:result of translation does not keep exact amount of leading whitespaces within<pre>tags. They come over-indented in python code.This is not however an issue when code is generated from python toHTML.although it keeps the proper tags structure, the transpiler does not chain all thewithstatements, so in some cases the generated code may be much indented.it's not too fastInstallationIf you need a new virtual environment, call:virtualenvvenvsourcevenv/bin/activateHaving it activated - you may install airium like this:pipinstallairiumIn order to use reverse translation - two additional packages are needed, run:pipinstallairium[parse]Then check if the transpiler works by calling:airium--helpEnjoy!
airizz
No description available on PyPI.
air-kit
No description available on PyPI.
airkupofrod
Airflow KubernetesPodOperatorFromDeployment
Or airkupofrod for short, is a tiny package which does one thing: it takes a deployment in your Kubernetes cluster and allows you to use its pod template as a KubernetesPodOperator object. It does this by providing the KubernetesPodOperatorFromDeployment operator. airkupofrod supports 1.10.9<=airflow<2
Installation and usage
Ensure your airflow image has the python package airkupofrod installed:
pip install airkupofrod
Then in your dags:
from airkupofrod.operator import KubernetesPodOperatorFromDeployment

my_kupofrod_task = KubernetesPodOperatorFromDeployment(
    deployment_labels={"app": "my-app"},  # deployment labels to look up by
    deployment_fields={"metadata.name": "my-app-deploy-template"},  # deployment fields to look up by
    deployment_namespace="some-ns",  # where the deployment lives
    namespace="default",  # where the pod will be deployed
    task_id="my-kupofrod-task",
    dag=dag,
    in_cluster=True,
)
You will also need to make sure that a service account attached to your airflow pods has a role capable of listing deployments bound to it. See role-binding for an example of this. This is in addition to the role bindings necessary for the KubernetesPodOperator to work, which can be seen in the airflow helm chart.
Developing
Skaffold is used to test and develop inside kubernetes. After ensuring you have:
Skaffold
Helm
Some type of k8s cluster available
Run:
skaffold dev --force=false --cleanup=false --status-check=false --port-forward
Then navigate to http://localhost:8080 and enable and trigger a run of the test deployments dag.
airlabs
Project Portfolio: https://github.com/calebyhan/CalebHan
airlabs
Python wrapper for AirLabs.
Installation
Use the package manager pip to install airlabs.
pip install airlabs
Usage
import airlabs

# Gets airline information about American Airlines
airlabs.airlines(iata_code="AA")

# Gets information about real-time flights
airlabs.flights()
Documentation
Documentation found in https://airlabs.readthedocs.io/en/latest/.
Contributing
Pull requests are welcome. For major changes, please open an issue first to discuss what you would like to change. Please make sure to update tests as appropriate.
License
MIT
airlango-mystic
Manipulation of the Airlango Mystic Drone via Python
This package provides the capabilities for users to play with the Mystic drone via python scripts.
airless
Airless
Airless is a package that aims to build a serverless and lightweight orchestration platform, creating workflows of multiple tasks executed on Google Cloud Functions.
Why not just use Apache Airflow?
Airflow is the industry standard when we talk about job orchestration and workflow management. However, in some cases we believe it may not be the best solution. We would like to highlight 3 main cases we face that Airflow struggles to handle.
Serverless
At the beginning of a project we want to avoid dealing with infrastructure, since it demands time and has a fixed cost to reserve an instance to run Airflow. Since we didn't have that many jobs, it didn't make sense to keep an Airflow instance up 24-7. When the project starts to get bigger, if we use Airflow's instance to run the tasks, we start facing performance issues in the workflow. In order to avoid these problems we decided to build a 100% serverless platform.
Parallel processing
The main use case we designed Airless for is data scrapers. The problem with data scrapers is that you normally want them to process a lot of tasks in parallel: for instance, first you fetch a website and collect all the links on that page and send them forward for another task to be executed, then that task does the same, and so on. Building a workflow that does not know beforehand how many tasks are going to be executed is hard to do on Airflow.
Data sharing between tasks
In order to build the massive parallel processing workflow explained in the previous topic, we need to be able to dynamically create and send data to the next task, i.e. use the output of the first task both as a trigger and as input data for the next tasks.
How it works
Airless builds its workflows based on Google Cloud Functions, Google Pub/Sub and Google Cloud Scheduler.
1. Everything starts with the Cloud Scheduler, a serverless product from Google Cloud that is able to publish a message to a Pub/Sub topic on a cron schedule
2. When a message is published to a Pub/Sub topic it can trigger a Cloud Function, which gets executed with that message as an input
3. This Cloud Function is able to publish as many messages as it wants to as many Pub/Sub topics as it wants
4. Repeat from 2
Preparation
Environment variables
ENV
GCP_PROJECT
PUBSUB_TOPIC_ERROR
LOG_LEVEL
PUBSUB_TOPIC_EMAIL_SEND
PUBSUB_TOPIC_SLACK_SEND
BIGQUERY_DATASET_ERROR
BIGQUERY_TABLE_ERROR
EMAIL_SENDER_ERROR
EMAIL_RECIPIENTS_ERROR
SLACK_CHANNELS_ERROR
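To make the fan-out pattern above concrete, here is a minimal sketch of a Pub/Sub-triggered Cloud Function that publishes follow-up messages to another topic using the standard google-cloud-pubsub client. It is not part of the airless API itself; the topic name and message shape are placeholder assumptions, while GCP_PROJECT comes from the environment variables listed above.

import base64
import json
import os

from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
PROJECT = os.environ["GCP_PROJECT"]   # env var listed in the README above
NEXT_TOPIC = "my-next-task-topic"     # hypothetical topic name


def handler(event, context):
    # Pub/Sub-triggered Cloud Functions receive the message data base64-encoded
    payload = json.loads(base64.b64decode(event["data"]).decode("utf-8"))

    # Example fan-out: publish one message per URL found in the incoming payload
    topic_path = publisher.topic_path(PROJECT, NEXT_TOPIC)
    for url in payload.get("urls", []):
        publisher.publish(topic_path, json.dumps({"url": url}).encode("utf-8"))

Each downstream function can repeat the same pattern, which is how a workflow with an unknown number of parallel tasks grows dynamically.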
airlib
No description available on PyPI.
airlift
AirliftIntroductionAirlift is a Command Line Interface (CLI) tool designed to provide a local development environment for Apache Airflow with a simple but flexible interface. It is built on top of the officialAirflow Helm Chart.RequirementsAirlift requires the following software to be installed on your system:HelmDockerKindBelow are the installation instructions for each of these tools on MacOS and Linux distributions.It is also recommended to allocate at least 4GB of RAM for Docker to run this service.Install HomebrewHomebrew is a package manager that we will use to install the necessary software. If you don't have Homebrew installed, you can install it by following these instructions:/bin/bash-c"$(curl-fsSLhttps://raw.githubusercontent.com/Homebrew/install/HEAD/install.sh)"Install SoftwareWith Homebrew installed, you can now install Helm, Docker, and Kind.HelmbrewinstallhelmDockerbrewinstall--caskdocker-OR- use a tool likeDocker DesktoporRancher DesktopKindbrewinstallkindNote: This software was tested and validated working with kind v0.17.There are known issues with kind v0.20 and Rancher. If you are experiencing issues, please downgrade your kind installationby installing from the source/release binaries.InstallationAirlift can be installed using pip:pipinstallairliftUsageThe general syntax for using the Airlift CLI tool is:airlift[subcommand][options]Subcommands and Options1. startStarts the Airflow service.airliftstart-d/path/to/dags-p/path/to/plugins-r/path/to/requirements.txtNote: The DAG and Plugins folders are mounted directly to the airflow service for hot-reloading. When you make a change locally, it should automatically appear in the Airflow UI.Note: Start times for Airflow can be upwards of 5 minutes, due to the bootstrapping, installation of required PyPi packages, and the creation of the Postgres database. This all depends on your local machines power & the complexity of your Airflow setup.2. checkChecks if all pre-requisite software is installed.airliftcheck3. pausePauses the Airflow service.airliftpause4. unpauseUnpauses the Airflow service.airliftunpause5. removeRemoves all containers/clusters related to theairliftservice.airliftremove6. statusChecks the status of the service and whether or not it is reachable.airliftstatus-P80807. import_variablesImports avariables.jsonfile to a running Airflow instance.airliftimport_variables-P8080-V/path/to/variables.json8. run_dagRuns a DAG given an ID.airliftrun_dag-P8080-Dexample_dag_idConfiguration filesHelm values.yamlThis file provides the configuration for the Airflow Helm chart. 
This can be used for things such as:Setting the Secrets Backend to AWS Secrets ManagerAdding custom environment variables (such as connections)Changing the executorModifying the memory allocation for the webserver/scheduler/workersUpdating anyairflow.cfgvalue.Here's an example:executor:"CeleryExecutor"config:core:load_examples:'False'executor:CeleryExecutorcolored_console_log:'False'# Airflow scheduler settingsscheduler:# hostAliases for the scheduler podhostAliases:[]# - ip: "127.0.0.1"# hostnames:# - "foo.local"# - ip: "10.1.2.3"# hostnames:# - "foo.remote"# If the scheduler stops heartbeating for 5 minutes (5*60s) kill the# scheduler and let Kubernetes restart itlivenessProbe:initialDelaySeconds:10timeoutSeconds:20failureThreshold:5periodSeconds:60command:~# Airflow 2.0 allows users to run multiple schedulers,# However this feature is only recommended for MySQL 8+ and Postgresreplicas:1# Max number of old replicasets to retainrevisionHistoryLimit:~# Command to use when running the Airflow scheduler (templated).command:~# Args to use when running the Airflow scheduler (templated).args:["bash","-c","execairflowscheduler"]You can find all the possible configuration overrides here:https://artifacthub.io/packages/helm/apache-airflow/airflow?modal=valuesNote: By default, we disable thelivenessProbechecks for the scheduler & triggerer due to conflicts with Kind. See./src/airlift/config/helm/values.yamlfor the exact config valuesAirlift ConfigurationThe Airlift configuration file overrides all flag values to simplify starting the service.For example,$HOME/.config/airlift/config.yaml:# config.yamldag_path:/path/to/dagsplugin_path:/path/to/pluginsrequirements_file:/path/to/requirements.txthelm_values_file:/path/to/values.yamlextra_volume_mounts:-hostPath=/my/cool/path,containerPath=/my/mounted/path,name=a_unique_namecluster_config_file:/path/to/cluster/config.yamlimage:'apache/airflow:2.6.0'helm_chart_version:'1.0.0'port:8080post_start_dag_id:'example_dag_id'In this example,dag_pathin the yaml file overrides the-dsetting,plugin_pathoverrides the-psetting, and so forth.Using this configuration, you can now start the service using:airliftstart-c$HOME/.config/airlift/config.yamlExamplesSee here for examples with common configuration modifications.FAQSee here for Frequently Asked QuestionsMotivationThe motivation behind the creation of Airlift is to simplify the process of setting up a local development environment for Apache Airflow. It aims to be a flexible tool that allows developers to easily configure and manage their Airflow instances with unlimited flexibility.Support and ContributionIf you encounter any issues or have suggestions for improvements, feel free to open an issue on the GitHub repository. Contributions to the project are also welcome.ContactIf you have questions or feedback about Airlift, please reach out by opening an issue on the GitHub repository.
airlift-py
No description available on PyPI.
airline
airline
Lightweight wide event logging to bring more observability to lambda functions. This is very strongly inspired by honeycomb and their beeline library.
How?
Use the decorators!
import airline

airline.init(dataset='your_app_name')


@airline.evented()
def some_function(a, b):
    # do things
or
import airline
from airline.awslambda import airline_wrapper

airline.init(dataset='function_or_app_name')


@airline_wrapper
def handler(event, context):
    # do things
Wide event logging?
The idea is to build up the full context of a function/script into one wide event that gets emitted at the end. This puts all the context in one log message, making it very easy to run analytics on, find similar errors, and a lot of other things. A full-blown observability platform like honeycomb would be more informative, and allows for the notion of spans and distributed tracing (i.e. across different (micro)services and the like). But this is a start.
Take this:
import time
import random
import airline

airline.init(dataset='example')


@airline.evented()
def main(a, b):
    airline.add_context_field("a", a)
    airline.add_context_field("b", b)

    with airline.timer('processing_a'):
        subfunction1(a)
    with airline.timer('processing_b'):
        subfunction1(b)


def subfunction1(input):
    time.sleep(random.uniform(0, len(input)))


main("foo", "example_long_thing")
And emit this at the end:
{
  "time": "2020-03-09T09:49:43.376126Z",
  "dataset": "example",
  "client": "airline/0.1.0",
  "data": {
    "a": "foo",
    "b": "example_long_thing",
    "duration_ms": 9438.041,
    "timers": {
      "processing_a_ms": 255.11,
      "processing_b_ms": 9182.859
    }
  }
}
airllm
Quickstart|Configurations|MacOS|Example notebooks|FAQAirLLMoptimizes inference memory usage, allowing 70B large language models to run inference on a single 4GB GPU card. No quantization, distillation, pruning or other model compression techniques that would result in degraded model performance are needed.AirLLM优化inference内存,4GB单卡GPU可以运行70B大语言模型推理。不需要任何损失模型性能的量化和蒸馏,剪枝等模型压缩。Updates[2023/12/25] v2.8.2: Support MacOS running 70B large language models.支持苹果系统运行70B大模型![2023/12/20] v2.7: Support AirLLMMixtral.[2023/12/20] v2.6: Added AutoModel, automatically detect model type, no need to provide model class to initialize model.提供AuoModel,自动根据repo参数检测模型类型,自动初始化模型。[2023/12/18] v2.5: added prefetching to overlap the model loading and compute. 10% speed improvement.[2023/12/03] added support ofChatGLM,QWen,Baichuan,Mistral,InternLM!支持ChatGLM, QWEN, Baichuan, Mistral, InternLM![2023/12/02] added support for safetensors. Now support all top 10 models in open llm leaderboard.支持safetensor系列模型,现在open llm leaderboard前10的模型都已经支持。[2023/12/01] airllm 2.0. Support compressions:3x run time speed up!airllm2.0。支持模型压缩,速度提升3倍。[2023/11/20] airllm Initial verion!airllm发布。Table of ContentsQuick startModel CompressionConfigurationsRun on MacOSExample notebooksSupported ModelsAcknowledgementFAQQuickstart1. install packageFirst, install airllm pip package.首先安装airllm包。pipinstallairllm如果找不到package,可能是因为默认的镜像问题。可以尝试制定原始镜像:pipinstall-ihttps://pypi.org/simple/airllm2. InferenceThen, initialize AirLLMLlama2, pass in the huggingface repo ID of the model being used, or the local path, and inference can be performed similar to a regular transformer model.然后,初始化AirLLMLlama2,传入所使用模型的huggingface repo ID,或者本地路径即可类似于普通的transformer模型进行推理。(You can can also specify the path to save the splitted layered model throughlayer_shards_saving_pathwhen init AirLLMLlama2.如果需要指定另外的路径来存储分层的模型可以在初始化AirLLMLlama2是传入参数:layer_shards_saving_path。)fromairllmimportAutoModelMAX_LENGTH=128# could use hugging face model repo id:model=AutoModel.from_pretrained("garage-bAInd/Platypus2-70B-instruct")# or use model's local path...#model = AutoModel.from_pretrained("/home/ubuntu/.cache/huggingface/hub/models--garage-bAInd--Platypus2-70B-instruct/snapshots/b585e74bcaae02e52665d9ac6d23f4d0dbc81a0f")input_text=['What is the capital of United States?',#'I like',]input_tokens=model.tokenizer(input_text,return_tensors="pt",return_attention_mask=False,truncation=True,max_length=MAX_LENGTH,padding=False)generation_output=model.generate(input_tokens['input_ids'].cuda(),max_new_tokens=20,use_cache=True,return_dict_in_generate=True)output=model.tokenizer.decode(generation_output.sequences[0])print(output)Note: During inference, the original model will first be decomposed and saved layer-wise. Please ensure there is sufficient disk space in the huggingface cache directory.注意:推理过程会首先将原始模型按层分拆,转存。请保证huggingface cache目录有足够的磁盘空间。Model Compression - 3x Inference Speed Up!We just added model compression based on block-wise quantization based model compression. Which can furtherspeed up the inference speedfor up to3x, withalmost ignorable accuracy loss!(see more performance evaluation and why we use block-wise quantization inthis paper)我们增加了基于block-wise quantization的模型压缩,推理速度提升3倍几乎没有精度损失。精度评测可以参考此paper:this paperhow to enalbe model compression speed up:Step 1. make sure you havebitsandbytesinstalled bypip install -U bitsandbytesStep 2. make sure airllm verion later than 2.0.0:pip install -U airllmStep 3. 
when initialize the model, passing the argument compression ('4bit' or '8bit'):model=AutoModel.from_pretrained("garage-bAInd/Platypus2-70B-instruct",compression='4bit'# specify '8bit' for 8-bit block-wise quantization)how model compression here is different from quantization?Quantization normally needs to quantize both weights and activations to really speed things up. Which makes it harder to maintain accuracy and avoid the impact of outliers in all kinds of inputs.While in our case the bottleneck is mainly at the disk loading, we only need to make the model loading size smaller. So we get to only quantize the weights part, which is easier to ensure the accuracy.ConfigurationsWhen initialize the model, we support the following configurations:初始化model的时候,可以指定以下的配置参数:compression: supported options: 4bit, 8bit for 4-bit or 8-bit block-wise quantization, or by default None for no compressionprofiling_mode: supported options: True to output time consumptions or by default Falselayer_shards_saving_path: optionally another path to save the splitted modelhf_token: huggingface token can be provided here if downloading gated models like:meta-llama/Llama-2-7b-hfprefetching: prefetching to overlap the model loading and compute. By default turned on. For now only AirLLMLlama2 supports this.delete_original: if you don't have too much disk space, you can set delete_original to true to delete the original downloaded hugging face model, only keep the transformed one to save half of the disk space.MacOSJust install airllm and run the code the same as on linux. See more inQuick Start.make sure you installedmlxand torchyou probabaly need to install python native see morehereonlyApple siliconis supportedExamplepython notebookExample Python NotebookExample colabs here:Supported ModelsHF open llm leaderboardtop modelsIncluding but not limited to the following:(Most of the open models are based on llama2, so should be supported by default)@12/01/23RankModelSupportedModel Class1TigerResearch/tigerbot-70b-chat-v2✅AirLLMLlama22upstage/SOLAR-0-70b-16bit✅AirLLMLlama23ICBU-NPU/FashionGPT-70B-V1.1✅AirLLMLlama24sequelbox/StellarBright✅AirLLMLlama25bhenrym14/platypus-yi-34b✅AirLLMLlama26MayaPH/GodziLLa2-70B✅AirLLMLlama2701-ai/Yi-34B✅AirLLMLlama28garage-bAInd/Platypus2-70B-instruct✅AirLLMLlama29jondurbin/airoboros-l2-70b-2.2.1✅AirLLMLlama210chargoddard/Yi-34B-Llama✅AirLLMLlama2?mistralai/Mistral-7B-Instruct-v0.1✅AirLLMMistral?mistralai/Mixtral-8x7B-v0.1✅AirLLMMixtralopencompass leaderboardtop modelsIncluding but not limited to the following:(Most of the open models are based on llama2, so should be supported by default)@12/01/23RankModelSupportedModel Class1GPT-4closed.ai😓N/A2TigerResearch/tigerbot-70b-chat-v2✅AirLLMLlama23THUDM/chatglm3-6b-base✅AirLLMChatGLM4Qwen/Qwen-14B✅AirLLMQWen501-ai/Yi-34B✅AirLLMLlama26ChatGPTclosed.ai😓N/A7OrionStarAI/OrionStar-Yi-34B-Chat✅AirLLMLlama28Qwen/Qwen-14B-Chat✅AirLLMQWen9Duxiaoman-DI/XuanYuan-70B✅AirLLMLlama210internlm/internlm-20b✅AirLLMInternLM26baichuan-inc/Baichuan2-13B-Chat✅AirLLMBaichuanexample of other models (ChatGLM, QWen, Baichuan, Mistral, etc):ChatGLM:fromairllmimportAutoModelMAX_LENGTH=128model=AutoModel.from_pretrained("THUDM/chatglm3-6b-base")input_text=['What is the capital of 
China?',]input_tokens=model.tokenizer(input_text,return_tensors="pt",return_attention_mask=False,truncation=True,max_length=MAX_LENGTH,padding=True)generation_output=model.generate(input_tokens['input_ids'].cuda(),max_new_tokens=5,use_cache=True,return_dict_in_generate=True)model.tokenizer.decode(generation_output.sequences[0])QWen:fromairllmimportAutoModelMAX_LENGTH=128model=AutoModel.from_pretrained("Qwen/Qwen-7B")input_text=['What is the capital of China?',]input_tokens=model.tokenizer(input_text,return_tensors="pt",return_attention_mask=False,truncation=True,max_length=MAX_LENGTH)generation_output=model.generate(input_tokens['input_ids'].cuda(),max_new_tokens=5,use_cache=True,return_dict_in_generate=True)model.tokenizer.decode(generation_output.sequences[0])Baichuan, InternLM, Mistral, etc:fromairllmimportAutoModelMAX_LENGTH=128model=AutoModel.from_pretrained("baichuan-inc/Baichuan2-7B-Base")#model = AutoModel.from_pretrained("internlm/internlm-20b")#model = AutoModel.from_pretrained("mistralai/Mistral-7B-Instruct-v0.1")input_text=['What is the capital of China?',]input_tokens=model.tokenizer(input_text,return_tensors="pt",return_attention_mask=False,truncation=True,max_length=MAX_LENGTH)generation_output=model.generate(input_tokens['input_ids'].cuda(),max_new_tokens=5,use_cache=True,return_dict_in_generate=True)model.tokenizer.decode(generation_output.sequences[0])To request other model support:hereAcknowledgementA lot of the code are based on SimJeg's great work in the Kaggle exam competition. Big shoutout to SimJeg:GitHub account @SimJeg,the code on Kaggle,the associated discussion.FAQ1. MetadataIncompleteBuffersafetensors_rust.SafetensorError: Error while deserializing header: MetadataIncompleteBufferIf you run into this error, most possible cause is you run out of disk space. The process of splitting model is very disk-consuming. Seethis. You may need to extend your disk space, clear huggingface.cacheand rerun.如果你碰到这个error,很有可能是空间不足。可以参考一下这个可能需要扩大硬盘空间,删除huggingface的.cache,然后重新run。2. ValueError: max() arg is an empty sequenceMost likely you are loading QWen or ChatGLM model with Llama2 class. Try the following:For QWen model:fromairllmimportAutoModel#<----- instead of AirLLMLlama2AutoModel.from_pretrained(...)For ChatGLM model:fromairllmimportAutoModel#<----- instead of AirLLMLlama2AutoModel.from_pretrained(...)3. 401 Client Error....Repo model ... is gated.Some models are gated models, needs huggingface api token. You can provide hf_token:model=AutoModel.from_pretrained("meta-llama/Llama-2-7b-hf",#hf_token='HF_API_TOKEN')4. ValueError: Asking to pad but the tokenizer does not have a padding token.Some model's tokenizer doesn't have padding token, so you can set a padding token or simply turn the padding config off:input_tokens=model.tokenizer(input_text,return_tensors="pt",return_attention_mask=False,truncation=True,max_length=MAX_LENGTH,padding=False#<----------- turn off padding)Citing AirLLMIf you find AirLLM useful in your research and wish to cite it, please use the following BibTex entry:@software{airllm2023, author = {Gavin Li}, title = {AirLLM: scaling large language models on low-end commodity computers}, url = {https://github.com/lyogavin/Anima/tree/main/air_llm}, version = {0.0}, year = {2023}, }ContributionWelcome contribution, ideas and discussions!If you find it useful, please ⭐ or buy me a coffee! 🙏
airlock
# airlock
Airlock is a lightweight, web-security-conscious wrapper for webapp2 on Google App Engine. It provides oauth2 integration for identity management with Google Accounts, sessions, and user management.

## Comparison
Airlock is a drop-in replacement for several webapp2 and protorpc objects. Specifically, it wraps remote.Service, webapp2.WSGIApplication, and webapp2.RequestHandler to provide authentication and session features via oauth2 and the oauth2client library.

| original | airlock variant |
| -------- | --------------- |
| protorpc.remote.Service | airlock.Service |
| webapp2.RequestHandler | airlock.Handler |
| webapp2.WSGIApplication | airlock.WSGIApplication |

## User features
* Oauth2 integration with Google Accounts (sign in and sign out).
* Anonymous user/session support.

## Security features
* A standard configuration format for specifying the security characteristics of an application.
* Provides a framework for setting the following headers:
  * Content security policy.
  * HSTS policy.
  * XSRF.

## Usage
1. Download client secrets.
1. In appengine config, use airlock.set_config.
1. Use airlock's subclasses.
1. Set up a User model.
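Since the steps above name the classes but show no code, here is a minimal sketch of how a handler might be wired together. It assumes airlock's classes accept the same arguments as their webapp2 counterparts, which the README implies but does not show; the route and handler are hypothetical.

import airlock


class IndexHandler(airlock.Handler):  # drop-in replacement for webapp2.RequestHandler
    def get(self):
        # standard webapp2-style response; airlock adds oauth2/session handling on top
        self.response.write('Hello from an airlock handler.')


# Assumes airlock.WSGIApplication accepts webapp2-style (route, handler) tuples.
app = airlock.WSGIApplication([
    ('/', IndexHandler),
])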
airly
Python wrapper for getting air quality data from Airly sensors.
Samples
Examples of how to use the library are under the samples directory. To run them, you need to create a samples.api_key file and fill it with your Airly API Key. Unit tests are also a valuable source of information, so check them in case of any doubts.
airmail
Airmail
A CLI tool for deploying projects to AWS
Introduction/Philosophy
The point of Airmail is to make deploying projects into AWS a little easier. It was inspired as a binding layer between Terraformed infrastructure and deploying applications to AWS ECS. At NYMag we wanted to manage infrastructure with Terraform and then allow applications to be more declarative about how they run without caring about the infrastructure. A developer should be able to easily declare where and how their application will run and then be able to easily configure resources in Terraform to support that. Airmail is designed to deploy code with the assumption that the underlying infrastructure is there to support the project.
How To
Airmail needs to be run in a project with a .deploy directory. It will look inside this directory for configuration files that will tell the tool how to deploy to ECS.
<project dir>
├── app              # The directory of your application
├── .deploy          # The directory holding the config
│   ├── config.yml   # Holds the primary config declarations
│   └── <env>.env    # Environment variable configuration for the container
└── ...
Config File
The config.yml file contains all the information that Airmail needs to build the service and task definitions to deploy to ECS. For an example file click here.
Commands
A list of commands and corresponding arguments/environment variables can be found here.
AWS Configuration
Airmail assumes your local env is configured per Boto3 configuration. The tool uses Boto3 to execute requests to AWS and does not do anything to set up your local environment.
Environment Variables
You can use a few environment variables to control how Airmail is run.
AWS_PROFILE: will run the Boto3 commands under the local profile you have configured
AIRMAIL_ENV: automatically chooses which environment to run commands for. Good for CI/CD so the prompt is not triggered.
AIRMAIL_DRY_RUN: will run all of the commands except the actual call to AWS
AIRMAIL_CONFIG_FILE (default: config.yml): specifies the file to read from in the .deploy directory for application configuration. The file must be a valid YAML file.
AIRMAIL_VERBOSE: will log in verbose mode. Good for debugging.
Local Development
Clone and run python3 setup.py install, or download Watchcode and run watchcode in the root of the project.
airmailer
Airmailer
Send e-mails with either plain SMTP or AWS SES.
Free software: Apache Software License 2.0
Documentation: https://airmailer.readthedocs.io.
Installation
First, install airmailer:
$ pip install airmailer
Documentation
See https://airmailer.readthedocs.io.
History
0.1.0 (2021-11-30)
First release on PyPI.
airmash
No description available on PyPI.
airML
airML
Distributed and Decentralized Deployment of ML models at scale
This package is created to distribute KBox, which allows users to share and dereference ML models.
Download the library here
Install using pip
pip install airML
Use it from your terminal
Once you install the airML package, you can directly execute the commands from the terminal. You don't need to open up a python environment to use the airML package.
Open a terminal and execute KBox commands with the airML package as below,
airML list -o json
Note: Here the -o json is an optional parameter. If you want to get the output as a json message, you should use this. Otherwise, use the command without -o json.
{
  "status_code": 200,
  "message": "visited all KNs.",
  "results": [
    {"name": "http://purl.org/pcp-on-web/dbpedia", "format": "kibe", "version": "c9a618a875c5d46add88de4f00b538962f9359ad"},
    {"name": "http://purl.org/pcp-on-web/ontology", "format": "kibe", "version": "c9a618a875c5d46add88de4f00b538962f9359ad"},
    {"name": "http://purl.org/pcp-on-web/dataset", "format": "kibe", "version": "dd240892384222f91255b0a94fd772c5d540f38b"}
  ]
}
Like the above command, you can use all other KBox commands with the airML package. You can refer to the document here to get a good understanding of other KBox commands as well.
Use it in your python application.
execute(command)
Description: Execute the provided command in the KBox.jar
Args:
    command: 'string', KBox command which should be executed in KBox.
Returns: string
If you want to use airML inside your python application, you can follow these instructions,
Import the airML package (from airML import airML).
Execute any KBox command with the execute() function as follows.
airML.execute('KBox_Command')
Note: the execute() method will return a string output which contains the result of the executed command.
Other than the execute command you can use the following methods directly,
list(kns=False)
Description: List all available models (kns=False) or list all KNS services (kns=True).
Args:
    kns: 'boolean', defines whether to list only the KNS services or not
Returns: Results from the KBox as JSON String
install(modelID, format=None, version=None)
Description: Install a model by the given modelID.
Args:
    modelID: 'string', url of the model hosted in a public repository.
    format: 'string', format of the model.
    version: 'string', specific version of the model to be installed.
Returns: Results from the KBox as JSON String
Example: install("http://nspm.org/art", "NSPM", "0")
getInfo(model)
Description: Gives the information about a specific model.
Args:
    model: url of the model.
Returns: Results from the KBox as JSON String
locate(modelID, format, version=None)
Description: Returns the local address of the given model.
Args:
    modelID: 'string', url of the model to be located.
    format: 'string', format of the model.
    version: 'string', version of the model.
Returns: Results from the KBox as JSON String
search(pattern, format, version=None)
Description: Search for all model-ids containing a given pattern.
Args:
    pattern: 'string', pattern of the url of the models.
    format: 'string', format of the model.
    version: 'string', version of the model.
Returns: Search Result from the KBox as a JSON String
Source URLs
See the source for this project here
Find the KBox source code here
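Putting the documented functions together, a short usage sketch might look like the following. The model URL and arguments are taken from the install() example above; parsing the result with json.loads and the "results" key is an assumption based on the sample output shown earlier, and it assumes the helper functions are exposed on the same airML module used for execute().

import json

from airML import airML

# Install a model by its identifier (arguments from the install() docs above)
airML.install("http://nspm.org/art", "NSPM", "0")

# List everything that is now available; the call returns a JSON string
result = json.loads(airML.list())
for model in result.get("results", []):
    print(model["name"], model["format"])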
airneo
No description available on PyPI.
airnet
airnet
Table of Contents
Installation
License
Installation
pip install airnet
License
airnet is distributed under the terms of the BSD-3-Clause license.
airnh
No description available on PyPI.
airnowpy
AirNowPy
A Python library to facilitate interactions with the AirNow API web service.
Quick Start
In order to use this library an API key is required. Get one here.
License
MIT License
airoboros
airoboros: using large language models to fine-tune large language modelsThis is my take on implementing theSelf-Instruct paper. The approach is quite heavily modified, and does not use any human-generated seeds.This updated implementation supports either the /v1/completions endpoint or /v1/chat/completions, which is particularly useful in that it supports gpt-4 and gpt-3.5-turbo (which is 1/10 the cost of text-davinci-003).Huge thank you to the folks over ata16zfor sponsoring the costs associated with building models and associated tools!Installvia pip:pip install --no-build-isolation airoborosfrom source (keeping the source):git clone https://github.com/jondurbin/airoboros pip install -e --no-build-isolation ./airoborosKey differences from self-instruct/alpacasupport for either /v1/completions or /v1/chat/completions APIs (which allows gpt-3.5-turbo instead of text-davinci-003, as well as gpt-4 if you have access)support for custom topics list, custom topic generation prompt, or completely random topicsin-memory vector db (Chroma) for similarity comparison, which is much faster than calculating rouge score for each generated instruction(seemingly) better prompts, which includes injection of random topics to relate the instructions to, which creates much more diverse synthetic instructionsasyncio producers with configurable batch sizeseveral "instructors", each targetting specific use-cases, such as Orca style reasoning/math, role playing, etc.tries to ensure the context, if provided, is relevant to the topic and contains all the information that would be necessary to respond to the instruction, and nost just a link to article/etc.generally speaking, this implementation tries to reduce some of thenoiseGoal of this projectProblem and proposed solution:Models can only ever be as good as the data they are trained on.High quality data is difficult to curate manually, so ideally the process can be automated by AI/LLMs.Large models (gpt-4, etc.) are pricey to build/run and out of reach for individuals/small-medium business, and are subject to RLHF bias, censorship, and changes without notice.Smaller models (llama-2-70b, etc.) can reach somewhat comparable performance in specific tasks to much larger models when trained on high quality data.The airoboros tool allows building datasets that are focused on specific tasks, which can then be used to build a plethora of individual expert models. This means we can crowdsource building experts.Using either a classifier model, or simply calculating vector embeddings for each item in the dataset and using faiss index/cosine similarity/etc. search, incoming requests can be routed to a particular expert (e.g. dynamically loading LoRAs) to get extremely high quality responses.Progress:✅ PoC that training via self-instruction, that is, datasets generated from language models, works reasonably well.✅ Iterate on the PoC to use higher quality prompts, more variety of instructions, etc.✅ Split the code into separate "instructors", for specializing in any particular task (creative writing, songs, roleplay, coding, execution planning, function calling, etc.)[in progress]: PoC that an ensemble of LoRAs split by the category (i.e., the instructor used in airoboros) has better performance than the same param count model tuned on all data[in progress]: Remove the dependency on OpenAI/gpt-4 to generate the training data so all datasets can be completely free and open source.[future]: Automatic splitting of experts at some threshold, e.g. 
"coding" is split into python, js, golang, etc.[future]: Hosted service/site to build and/or extend datasets or models using airoboros.[future]: Depending on success of all of the above, potentially a hosted inference option with an exchange for private/paid LoRAs.LMoELMoE is the simplest architecture I can think of for a mixture of experts. It doesn't use a switch transformer, doesn't require slicing and merging layers with additional fine-tuning, etc. It just dynamically loads the best PEFT/LoRA adapter model based on the incoming request.By using this method, we can theoretically crowdsource generation of dozens (or hundreds/thousands?) of very task-specific adapters and have an extremely powerful ensemble of models with very limited resources on top of a single base model (llama-2 7b/13b/70b).Tuning the expertsThe self-instruct code contained within this project uses many different "instructors" to generate training data to accomplish specific tasks. The output includes the instructor/category that generated the data. We can use this to automatically segment the training data to fine-tune specific "experts".Seescripts/segment_experts.pyfor an example of how the training data can be segmented, with a sampling of each other expert in the event of misrouting.Seescripts/tune_expert.pyfor an example of creating the adapter models (with positional args for expert name, model size, etc.)NOTE: this assumes use of my fork of qlorahttps://github.com/jondurbin/qloraRouting requests to the expertThe "best" routing mechanism would probably be to train a classifier based on the instructions for each category, with the category/expert being the label, but that prohibits dynamic loading of new experts.Instead, this supports 3 options:faiss index similarity search using the training data for each expert (default)agent-based router using the "function" expert (query the LLM with a list of available experts and their descriptions, ask which would be best based on the user's input)specify the agent in the JSON requestRunning the API serverFirst, download the base llama-2 model for whichever model size you want, e.g.:llama-2-7b-hfNext, download the LMoE package that corresponds to that base model, e.g.:airoboros-lmoe-7b-2.1NOTE: 13b also available, 70b in progressHere's an example command to start the server:python -m airoboros.lmoe.api \ --base-model ./llama-2-7b-hf \ --lmoe ./airoboros-lmoe-7b-2.1 \ --router-max-samples 1000 \ --router-k 25 \ --port 8000 \ --host 127.0.0.1to use the agent-based router, add--agent-routerto the argumentsThis uses flash attention via bettertransformers (in optimum). You may need to install torch nightly if you see an error like 'no kernel available', e.g.:pip install -U --pre torch torchvision torchaudio --index-url https://download.pytorch.org/whl/nightly/cu118Once started, you can infer using the same API scheme you'd query OpenAI API with, e.g.:curl -H 'content-type: application/json' http://127.0.0.1:8000/v1/chat/completions -d ' { "model": "llama-2-7b-hf", "temperature": 0.7, "max_tokens": 2048, "messages": [ { "role": "system", "content": "A chat." }, { "role": "user", "content": "How much wood would a woodchuck chuck if a woodchuck could chuck wood?" } ] }'I've also added an vllm-based server, but the results aren't quite as good (not sure why yet). 
To use it, make sure you installvllmandfschat, orpip install airoboros[vllm]python -m airoboros.lmoe.vllm \ --model ./llama-2-7b-hf \ --lmoe-path ../airoboros-lmoe-7b-2.1 \ --router-max-samples 100 \ --router-k 25 \ --port 8000 \ --host 127.0.0.1Generating instructionsNEW - 2023-07-18To better accommodate the plethora of options, the configuration has been moved to a YAML config file.Please create a copy ofexample-config.yamland configure as desired.Once you have the desired configuration, run:airoboros generate-instructions --config-path /path/to/config.yamlGenerating topicsNEW - 2023-07-18Again, this is now all YAML configuration based! Please create a customized version of the YAML config file, then run:airoboros generate-topics --config-path /path/to/config.yamlYou can override thetopic_promptstring in the configuration to use a different topic generation prompt.Support the workhttps://bmc.link/jondurbinETH 0xce914eAFC2fe52FdceE59565Dd92c06f776fcb11BTC bc1qdwuth4vlg8x37ggntlxu5cjfwgmdy5zaa7pswfModels (research use only):gpt-4 versionsllama-2 base model2.1 datasetairoboros-l2-7b-2.1airoboros-l2-13b-2.1airoboros-l2-70b-2.1airoboros-c34b-2.12.0/m2.0airoboros-l2-7b-gpt4-2.0airoboros-l2-7b-gpt4-m2.0airoboros-l2-13b-gpt4-2.0airoboros-l2-13b-gpt4-m2.0Previous generation (1.4.1 dataset)airoboros-l2-70b-gpt4-1.4.1airoboros-l2-13b-gpt4-1.4.1airoboros-l2-7b-gpt4-1.4.1original llama base modelLatest version (2.0 / m2.0 datasets)airoboros-33b-gpt4-2.0airoboros-33b-gpt4-m2.0Previous generation (1.4.1 dataset)airoboros-65b-gpt4-1.4airoboros-33b-gpt4-1.4airoboros-13b-gpt4-1.4airoboros-7b-gpt4-1.4older versions on HF as wellmpt-30b base modelairoboros-mpt-30b-gpt4-1.4gpt-3.5-turbo versionsairoboros-gpt-3.5-turbo-100k-7bairoboros-13bairoboros-7bDatasetsairoboros-gpt-3.5-turboairoboros-gpt4airoboros-gpt4-1.1airoboros-gpt4-1.2airoboros-gpt4-1.3airoboros-gpt4-1.4airoboros-gpt4-2.0 (June only GPT4)airoboros-gpt4-m2.0airoboros-2.1 (recommended)
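As a rough illustration of the faiss-style routing described in the "Routing requests to the expert" section above, the sketch below routes an incoming instruction to an expert adapter by embedding a small sample of each expert's training instructions and voting over the nearest neighbours. This is not the project's actual router code; the embedding model, sample data, and expert names are placeholder assumptions.

from collections import Counter

import faiss
import numpy as np
from sentence_transformers import SentenceTransformer

# Placeholder routing data: a few training instructions per expert adapter.
EXPERT_SAMPLES = {
    "coding": ["Write a python function that reverses a string.", "Fix this SQL query."],
    "creative": ["Write a poem about the sea.", "Tell a short story about a dragon."],
}

encoder = SentenceTransformer("all-MiniLM-L6-v2")  # assumed embedding model

# Build one flat inner-product index over normalized embeddings (cosine similarity).
labels, texts = [], []
for expert, samples in EXPERT_SAMPLES.items():
    labels.extend([expert] * len(samples))
    texts.extend(samples)
embeddings = encoder.encode(texts, normalize_embeddings=True)
index = faiss.IndexFlatIP(embeddings.shape[1])
index.add(np.asarray(embeddings, dtype="float32"))


def route(instruction: str, k: int = 2) -> str:
    """Return the expert whose samples dominate the top-k nearest neighbours."""
    query = encoder.encode([instruction], normalize_embeddings=True)
    _, idx = index.search(np.asarray(query, dtype="float32"), k)
    votes = Counter(labels[i] for i in idx[0])
    return votes.most_common(1)[0][0]


print(route("Implement quicksort in python"))  # likely "coding" with these toy samples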
airobot
Ming
from airobot import AiPlat

app_id = ''
app_key = ''

# Initialize
ai = AiPlat(app_id, app_key)

# Send data
msg = ''
data = ai.getNlpTextChat(msg)
airobots
Airobots: a full-stack automated testing framework
Airobots integrates the methods of the Airtest, RobotFramework, Selenium, Appium, HTTPRunner and Locust frameworks into a single framework that can test every end, with a unified execution entry point and report output. Essentially, Airobots can be seen as an integration library of the excellent open-source testing frameworks above; the integration lets their methods be shared, for example using Airtest's image recognition inside Selenium and Appium. One framework supports automated testing of Android, iOS, Web and API, as well as performance testing (the Locust-based load generator provided by HTTPRunner), and unifies the test report format, using the clean and elegant Allure for reports, which record detailed logs, test steps and screenshots of the test run.
Installation
Install the framework dependencies by running
pip install airobots -i https://mirrors.aliyun.com/pypi/simple
On Windows, if the installation fails, you may need to install the C++ build tools: visualcppbuildtools_full.exe; please check the console error messages for details.
To run Web tests you need ChromeDriver; download and install it yourself, or install node and then run npm install -g chromedriver
Run tests
Allure report (recommended)
airobots -t api ./API/Case/Path/ --alluredir=Results          # API tests
airobots -t web ./Web/Case/Path/ --alluredir=Results           # Web tests
airobots -t android ./Android/Case/Path/ --alluredir=Results   # Android tests
airobots -t ios ./IOS/Case/Path/ --alluredir=Results           # iOS tests
HTML report
airobots -t api ./API/Case/Path/ --html=Results/report.html          # API tests
airobots -t web ./Web/Case/Path/ --html=Results/report.html           # Web tests
airobots -t android ./Android/Case/Path/ --html=Results/report.html   # Android tests
airobots -t ios ./IOS/Case/Path/ --html=Results/report.html           # iOS tests
View the Allure report
allure serve ./Results
Install Allure
Linux
sudo apt-add-repository ppa:qameta/allure
sudo apt-get update
sudo apt-get install allure
Mac OS X
For Mac OS, Allure can be installed automatically via Homebrew:
brew install allure
Windows
For Windows, Allure is available from the Scoop command-line installer. To install Allure, download and install Scoop, then run in Powershell:
scoop install allure
Demo project: https://github.com/BSTester/AirobotsDemo
airohit-fasterrcnn
No description available on PyPI.
airo-models
airo-models
Curated URDFs and 3D models of the robots and gripper used at airo.
Installation
airo-models is available on PyPi and can be installed with pip:
pip install airo-models
Usage
Example of loading a URDF from airo-models, customizing it and writing it to a temporary file:
import airo_models

robotiq_urdf_path = airo_models.get_urdf_path("robotiq_2f_85")
robotiq_urdf = airo_models.urdf.read_urdf(robotiq_urdf_path)

# Make the robotiq gripper static
airo_models.urdf.replace_value(robotiq_urdf, "@type", "revolute", "fixed")
airo_models.urdf.delete_key(robotiq_urdf, "mimic")
airo_models.urdf.delete_key(robotiq_urdf, "transmission")

# Write it to a temporary file to read later with Drake's AddModelFromFile
robotiq_static_urdf_path = airo_models.urdf.write_urdf_to_tempfile(
    robotiq_urdf, robotiq_urdf_path, prefix="robotiq_2f_85_static_"
)
To check which models are available:
from airo_models.files import AIRO_MODEL_NAMES

print(AIRO_MODEL_NAMES)
>>> ['ur3e', 'ur5e', 'robotiq_2f_85']
Development
Local installation
Clone this repo
Create the conda environment: conda env create -f environment.yaml
Initialize the pre-commit hooks: pre-commit install
Run the tests with pytest .
Releasing
Releasing to PyPi is done automatically by github actions when a new tag is pushed to the main branch.
Update the version in pyproject.toml.
git add pyproject.toml
git commit -m ""
git push
git tag -a v0.1.0 -m "airo-models v0.1.0"
git push origin v0.1.0
This was set up following this guide first and then this guide.