package | package-description
---|---|
airbyte-source-rss | Rabbitmq Destination. This is the repository for the Rabbitmq destination connector, written in Python.
For information about how to use this connector within Airbyte, see the documentation.
To iterate on this connector, make sure to complete this prerequisites section.
From this connector directory, create a virtual environment: python -m venv .venv
This will generate a virtualenv for this module in .venv/. Make sure this venv is active in your development environment of choice. To activate it from the terminal, run:
source .venv/bin/activate
pip install -r requirements.txt
If you are in an IDE, follow your IDE's instructions to activate the virtualenv.
Note that while we are installing dependencies from requirements.txt, you should only edit setup.py for your dependencies. requirements.txt is used for editable installs (pip install -e) to pull in Python dependencies from the monorepo and will call setup.py. If this is mumbo jumbo to you, don't worry about it: just put your deps in setup.py but install using pip install -r requirements.txt and everything should work as you expect.
If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the destination_rabbitmq/spec.json file.
Note that the secrets directory is gitignored by default, so there is no danger of accidentally checking in sensitive information.
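For illustration only, a hypothetical secrets/config.json for this destination might look like the sketch below; the real field names are defined by destination_rabbitmq/spec.json, so treat every key and value here as a placeholder:
cat > secrets/config.json <<'EOF'
{
  "host": "localhost",
  "port": 5672,
  "username": "guest",
  "password": "guest",
  "routing_key": "airbyte"
}
EOF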
See integration_tests/sample_config.json for a sample config file.
If you are an Airbyte core member, copy the credentials in Lastpass under the secret name destination rabbitmq test creds and place them into secrets/config.json.
python main.py spec
python main.py check --config secrets/config.json
python main.py discover --config secrets/config.json
python main.py read --config secrets/config.json --catalog integration_tests/configured_catalog.json
Via airbyte-ci (recommended): airbyte-ci connectors --name=destination-rabbitmq build
An image will be built with the tag airbyte/destination-rabbitmq:dev.
Via docker build: docker build -t airbyte/destination-rabbitmq:dev .
Then run any of the connector commands as follows:
docker run --rm airbyte/destination-rabbitmq:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/destination-rabbitmq:dev check --config /secrets/config.json
cat messages.jsonl | docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/destination-rabbitmq:dev write --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
(See the sketch below for what a line of messages.jsonl can contain.)
You can run our full test suite locally using airbyte-ci: airbyte-ci connectors --name=destination-rabbitmq test
Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information.
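As referenced above, each line of messages.jsonl is a JSON-serialized Airbyte protocol message. A minimal sketch of a RECORD message, with a hypothetical stream name and payload:
echo '{"type": "RECORD", "record": {"stream": "my_stream", "data": {"id": 1}, "emitted_at": 1700000000000}}' > messages.jsonl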
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py.
All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies into two groups: dependencies required for your connector to work go in the MAIN_REQUIREMENTS list, and dependencies required for testing go in the TEST_REQUIREMENTS list.
You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?
Make sure your changes are passing our test suite: airbyte-ci connectors --name=destination-rabbitmq test
Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors.
Make sure the metadata.yaml content is up to date.
Make sure the connector documentation and its changelog are up to date (docs/integrations/destinations/rabbitmq.md).
Create a Pull Request: use our PR naming conventions.
Pat yourself on the back for being an awesome contributor.
Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
airbyte-source-s3 | S3 source connector. This is the repository for the S3 source connector, written in Python.
For information about how to use this connector within Airbyte, see the documentation.
Local development prerequisites: Python (~=3.9) and Poetry (~=1.7) - installation instructions here.
Installing the connector: from this connector directory, run: poetry install --with dev
Create credentials: if you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_s3/spec.yaml file.
Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information.
See sample_files/sample_config.json for a sample config file.
Locally running the connector:
poetry run source-s3 spec
poetry run source-s3 check --config secrets/config.json
poetry run source-s3 discover --config secrets/config.json
poetry run source-s3 read --config secrets/config.json --catalog sample_files/configured_catalog.json
Running unit tests: to run unit tests locally, from the connector directory run: poetry run pytest unit_tests
Building the docker image: install airbyte-ci, then run the following command to build the docker image: airbyte-ci connectors --name=source-s3 build
An image will be available on your host with the tag airbyte/source-s3:dev.
Running as a docker container: run any of the connector commands as follows:
docker run --rm airbyte/source-s3:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-s3:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-s3:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-s3:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
Running our CI test suite: you can run our full test suite locally using airbyte-ci: airbyte-ci connectors --name=source-s3 test
Customizing acceptance tests: customize the acceptance-test-config.yml file to configure acceptance tests. See Connector Acceptance Tests for more information.
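As a quick sanity check on the discover command above: discover emits an Airbyte protocol message of type CATALOG, so the stream names can be listed with jq (assuming jq is installed; an illustrative sketch, not part of the official workflow):
poetry run source-s3 discover --config secrets/config.json | jq -r 'select(.type == "CATALOG") | .catalog.streams[].name'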
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py.
Dependency management: all of your dependencies should be managed via Poetry.
To add a new dependency, run: poetry add <package-name>
Please commit the changes to the pyproject.toml and poetry.lock files.
Publishing a new version of the connector: you've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?
Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-s3 test
Bump the connector version (please follow semantic versioning for connectors): bump the dockerImageTag value in metadata.yaml and the version value in pyproject.toml.
Make sure the metadata.yaml content is up to date.
Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/s3.md).
Create a Pull Request: use our PR naming conventions.
Pat yourself on the back for being an awesome contributor.
Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.
Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. |
airbyte-source-salesforce | Salesforce source connector. This is the repository for the Salesforce source connector, written in Python.
For information about how to use this connector within Airbyte, see the documentation.
Local development prerequisites: Python (~=3.9) and Poetry (~=1.7) - installation instructions here.
Installing the connector: from this connector directory, run: poetry install --with dev
Create credentials: if you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_salesforce/spec.yaml file.
Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information.
See sample_files/sample_config.json for a sample config file.
Locally running the connector:
poetry run source-salesforce spec
poetry run source-salesforce check --config secrets/config.json
poetry run source-salesforce discover --config secrets/config.json
poetry run source-salesforce read --config secrets/config.json --catalog sample_files/configured_catalog.json
Running unit tests: to run unit tests locally, from the connector directory run: poetry run pytest unit_tests
Building the docker image: install airbyte-ci, then run the following command to build the docker image: airbyte-ci connectors --name=source-salesforce build
An image will be available on your host with the tag airbyte/source-salesforce:dev.
Running as a docker container: run any of the connector commands as follows:
docker run --rm airbyte/source-salesforce:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-salesforce:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-salesforce:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-salesforce:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
Running our CI test suite: you can run our full test suite locally using airbyte-ci: airbyte-ci connectors --name=source-salesforce test
Customizing acceptance tests: customize the acceptance-test-config.yml file to configure acceptance tests. See Connector Acceptance Tests for more information.
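The repository already ships sample catalogs, but for illustration a minimal configured catalog for the read commands above could be sketched as follows; the stream name, empty schema, and sync modes are placeholders, and real values come from discover:
cat > /tmp/configured_catalog.json <<'EOF'
{
  "streams": [
    {
      "stream": {"name": "Account", "json_schema": {}, "supported_sync_modes": ["full_refresh"]},
      "sync_mode": "full_refresh",
      "destination_sync_mode": "overwrite"
    }
  ]
}
EOF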
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py.
Dependency management: all of your dependencies should be managed via Poetry.
To add a new dependency, run: poetry add <package-name>
Please commit the changes to the pyproject.toml and poetry.lock files.
Publishing a new version of the connector: you've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?
Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-salesforce test
Bump the connector version (please follow semantic versioning for connectors): bump the dockerImageTag value in metadata.yaml and the version value in pyproject.toml.
Make sure the metadata.yaml content is up to date.
Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/salesforce.md).
Create a Pull Request: use our PR naming conventions.
Pat yourself on the back for being an awesome contributor.
Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.
Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. |
airbyte-source-salesloft | Salesloft Source. This is the repository for the Salesloft source connector, written in Python.
For information about how to use this connector within Airbyte, see the documentation.
To iterate on this connector, make sure to complete this prerequisites section.
From this connector directory, create a virtual environment: python -m venv .venv
This will generate a virtualenv for this module in .venv/. Make sure this venv is active in your development environment of choice. To activate it from the terminal, run:
source .venv/bin/activate
pip install -r requirements.txt
If you are in an IDE, follow your IDE's instructions to activate the virtualenv.
Note that while we are installing dependencies from requirements.txt, you should only edit setup.py for your dependencies. requirements.txt is used for editable installs (pip install -e) to pull in Python dependencies from the monorepo and will call setup.py. If this is mumbo jumbo to you, don't worry about it: just put your deps in setup.py but install using pip install -r requirements.txt and everything should work as you expect.
If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_salesloft/spec.json file.
Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information.
See integration_tests/sample_config.json for a sample config file.
If you are an Airbyte core member, copy the credentials in Lastpass under the secret name source salesloft test creds and place them into secrets/config.json.
python main.py spec
python main.py check --config secrets/config.json
python main.py discover --config secrets/config.json
python main.py read --config secrets/config.json --catalog integration_tests/configured_catalog.json
Via airbyte-ci (recommended): airbyte-ci connectors --name=source-salesloft build
An image will be built with the tag airbyte/source-salesloft:dev.
Via docker build: docker build -t airbyte/source-salesloft:dev .
Then run any of the connector commands as follows:
docker run --rm airbyte/source-salesloft:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-salesloft:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-salesloft:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-salesloft:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
You can run our full test suite locally using airbyte-ci: airbyte-ci connectors --name=source-salesloft test
Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information.
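To verify the built image responds correctly, you can inspect the spec output (assuming jq is installed; an illustrative sketch); the output should include a message whose type is SPEC:
docker run --rm airbyte/source-salesloft:dev spec | jq -r .type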
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py.
All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies into two groups: dependencies required for your connector to work go in the MAIN_REQUIREMENTS list, and dependencies required for testing go in the TEST_REQUIREMENTS list.
You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?
Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-salesloft test
Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors.
Make sure the metadata.yaml content is up to date.
Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/salesloft.md).
Create a Pull Request: use our PR naming conventions.
Pat yourself on the back for being an awesome contributor.
Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
airbyte-source-sap-fieldglass | Sap Fieldglass Source. This is the repository for the Sap Fieldglass configuration-based source connector.
For information about how to use this connector within Airbyte, see the documentation.
If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_sap_fieldglass/spec.yaml file.
Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information.
See integration_tests/sample_config.json for a sample config file.
If you are an Airbyte core member, copy the credentials in Lastpass under the secret name source sap-fieldglass test creds and place them into secrets/config.json.
Via airbyte-ci (recommended): airbyte-ci connectors --name=source-sap-fieldglass build
An image will be built with the tag airbyte/source-sap-fieldglass:dev.
Via docker build: docker build -t airbyte/source-sap-fieldglass:dev .
Then run any of the connector commands as follows:
docker run --rm airbyte/source-sap-fieldglass:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-sap-fieldglass:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-sap-fieldglass:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-sap-fieldglass:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
You can run our full test suite locally using airbyte-ci: airbyte-ci connectors --name=source-sap-fieldglass test
Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information.
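For a quick look at what the read command above returns, RECORD messages can be counted per stream with jq (assuming jq is installed; purely an illustrative sketch):
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-sap-fieldglass:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json | jq -r 'select(.type == "RECORD") | .record.stream' | sort | uniq -c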
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py.
All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies into two groups: dependencies required for your connector to work go in the MAIN_REQUIREMENTS list, and dependencies required for testing go in the TEST_REQUIREMENTS list.
You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?
Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-sap-fieldglass test
Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors.
Make sure the metadata.yaml content is up to date.
Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/sap-fieldglass.md).
Create a Pull Request: use our PR naming conventions.
Pat yourself on the back for being an awesome contributor.
Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
airbyte-source-search-metrics | Search Metrics Source. This is the repository for the Search Metrics source connector, written in Python.
For information about how to use this connector within Airbyte, see the documentation.
To iterate on this connector, make sure to complete this prerequisites section.
From this connector directory, create a virtual environment: python -m venv .venv
This will generate a virtualenv for this module in .venv/. Make sure this venv is active in your development environment of choice. To activate it from the terminal, run:
source .venv/bin/activate
pip install -r requirements.txt
pip install '.[tests]'
If you are in an IDE, follow your IDE's instructions to activate the virtualenv.
Note that while we are installing dependencies from requirements.txt, you should only edit setup.py for your dependencies. requirements.txt is used for editable installs (pip install -e) to pull in Python dependencies from the monorepo and will call setup.py. If this is mumbo jumbo to you, don't worry about it: just put your deps in setup.py but install using pip install -r requirements.txt and everything should work as you expect.
If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_search_metrics/spec.json file.
Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information.
See integration_tests/sample_config.json for a sample config file.
If you are an Airbyte core member, copy the credentials in Lastpass under the secret name source search-metrics test creds and place them into secrets/config.json.
python main.py spec
python main.py check --config secrets/config.json
python main.py discover --config secrets/config.json
python main.py read --config secrets/config.json --catalog integration_tests/configured_catalog.json
Via airbyte-ci (recommended): airbyte-ci connectors --name=source-search-metrics build
An image will be built with the tag airbyte/source-search-metrics:dev.
Via docker build: docker build -t airbyte/source-search-metrics:dev .
Then run any of the connector commands as follows:
docker run --rm airbyte/source-search-metrics:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-search-metrics:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-search-metrics:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-search-metrics:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
You can run our full test suite locally using airbyte-ci: airbyte-ci connectors --name=source-search-metrics test
Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information.
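Although the commands above don't list it, unit tests can typically be run directly with pytest once the [tests] extra is installed; this assumes the standard unit_tests/ layout used by Python connectors:
python -m pytest unit_tests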
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py.
All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies into two groups: dependencies required for your connector to work go in the MAIN_REQUIREMENTS list, and dependencies required for testing go in the TEST_REQUIREMENTS list.
You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?
Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-search-metrics test
Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors.
Make sure the metadata.yaml content is up to date.
Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/search-metrics.md).
Create a Pull Request: use our PR naming conventions.
Pat yourself on the back for being an awesome contributor.
Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
airbyte-source-secoda | Secoda Source. This is the repository for the Secoda configuration-based source connector.
For information about how to use this connector within Airbyte, see the documentation.
If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_secoda/spec.yaml file.
Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information.
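You can confirm that rule before committing; git check-ignore exits 0 and, with -v, prints the matching ignore pattern when the path is ignored:
git check-ignore -v secrets/config.json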
See integration_tests/sample_config.json for a sample config file.
If you are an Airbyte core member, copy the credentials in Lastpass under the secret name source secoda test creds and place them into secrets/config.json.
Via airbyte-ci (recommended): airbyte-ci connectors --name=source-secoda build
An image will be built with the tag airbyte/source-secoda:dev.
Via docker build: docker build -t airbyte/source-secoda:dev .
Then run any of the connector commands as follows:
docker run --rm airbyte/source-secoda:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-secoda:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-secoda:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-secoda:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
You can run our full test suite locally using airbyte-ci: airbyte-ci connectors --name=source-secoda test
Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py.
All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies into two groups: dependencies required for your connector to work go in the MAIN_REQUIREMENTS list, and dependencies required for testing go in the TEST_REQUIREMENTS list.
You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?
Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-secoda test
Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors.
Make sure the metadata.yaml content is up to date.
Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/secoda.md).
Create a Pull Request: use our PR naming conventions.
Pat yourself on the back for being an awesome contributor.
Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
airbyte-source-sendgrid | Sendgrid source connector. This is the repository for the Sendgrid source connector, written in Python.
For information about how to use this connector within Airbyte, see the documentation.
Local development prerequisites: Python (~=3.9) and Poetry (~=1.7) - installation instructions here.
Installing the connector: from this connector directory, run: poetry install --with dev
Create credentials: if you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_sendgrid/spec.yaml file.
Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information.
See sample_files/sample_config.json for a sample config file.
Locally running the connector:
poetry run source-sendgrid spec
poetry run source-sendgrid check --config secrets/config.json
poetry run source-sendgrid discover --config secrets/config.json
poetry run source-sendgrid read --config secrets/config.json --catalog sample_files/configured_catalog.json
Running unit tests: to run unit tests locally, from the connector directory run: poetry run pytest unit_tests
Building the docker image: install airbyte-ci, then run the following command to build the docker image: airbyte-ci connectors --name=source-sendgrid build
An image will be available on your host with the tag airbyte/source-sendgrid:dev.
Running as a docker container: run any of the connector commands as follows:
docker run --rm airbyte/source-sendgrid:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-sendgrid:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-sendgrid:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-sendgrid:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
Running our CI test suite: you can run our full test suite locally using airbyte-ci: airbyte-ci connectors --name=source-sendgrid test
Customizing acceptance tests: customize the acceptance-test-config.yml file to configure acceptance tests. See Connector Acceptance Tests for more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py.
Dependency management: all of your dependencies should be managed via Poetry.
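As shown next, dependencies are added with poetry add; a test-only dependency can additionally be targeted at Poetry's dev group (the package name below is purely illustrative):
poetry add --group dev pytest-mock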
To add a new dependency, run: poetry add <package-name>
Please commit the changes to the pyproject.toml and poetry.lock files.
Publishing a new version of the connector: you've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?
Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-sendgrid test
Bump the connector version (please follow semantic versioning for connectors): bump the dockerImageTag value in metadata.yaml and the version value in pyproject.toml.
Make sure the metadata.yaml content is up to date.
Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/sendgrid.md).
Create a Pull Request: use our PR naming conventions.
Pat yourself on the back for being an awesome contributor.
Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.
Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. |
airbyte-source-sendinblue | Sendinblue Source. This is the repository for the Sendinblue configuration-based source connector.
For information about how to use this connector within Airbyte, see the documentation.
If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_sendinblue/spec.yaml file.
Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information.
See integration_tests/sample_config.json for a sample config file.
If you are an Airbyte core member, copy the credentials in Lastpass under the secret name source sendinblue test creds and place them into secrets/config.json.
Via airbyte-ci (recommended): airbyte-ci connectors --name=source-sendinblue build
An image will be built with the tag airbyte/source-sendinblue:dev.
Via docker build: docker build -t airbyte/source-sendinblue:dev .
Then run any of the connector commands as follows:
docker run --rm airbyte/source-sendinblue:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-sendinblue:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-sendinblue:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-sendinblue:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
You can run our full test suite locally using airbyte-ci: airbyte-ci connectors --name=source-sendinblue test
Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py.
All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies into two groups: dependencies required for your connector to work go in the MAIN_REQUIREMENTS list, and dependencies required for testing go in the TEST_REQUIREMENTS list.
You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?
Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-sendinblue test
Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors.
Make sure the metadata.yaml content is up to date.
Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/sendinblue.md).
Create a Pull Request: use our PR naming conventions.
Pat yourself on the back for being an awesome contributor.
Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
airbyte-source-senseforce | Senseforce Source. This is the repository for the Senseforce configuration-based source connector.
For information about how to use this connector within Airbyte, see the documentation.
If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_senseforce/spec.yaml file.
Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information.
See integration_tests/sample_config.json for a sample config file.
If you are an Airbyte core member, copy the credentials in Lastpass under the secret name source senseforce test creds and place them into secrets/config.json.
Via airbyte-ci (recommended): airbyte-ci connectors --name=source-senseforce build
An image will be built with the tag airbyte/source-senseforce:dev.
Via docker build: docker build -t airbyte/source-senseforce:dev .
Then run any of the connector commands as follows:
docker run --rm airbyte/source-senseforce:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-senseforce:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-senseforce:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-senseforce:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
You can run our full test suite locally using airbyte-ci: airbyte-ci connectors --name=source-senseforce test
Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py.
All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies into two groups: dependencies required for your connector to work go in the MAIN_REQUIREMENTS list, and dependencies required for testing go in the TEST_REQUIREMENTS list.
You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?
Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-senseforce test
Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors.
Make sure the metadata.yaml content is up to date.
Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/senseforce.md).
Create a Pull Request: use our PR naming conventions.
Pat yourself on the back for being an awesome contributor.
Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
airbyte-source-sentry | Sentry source connector. This is the repository for the Sentry source connector, written in Python.
For information about how to use this connector within Airbyte, see the documentation.
Local development prerequisites: Python (~=3.9) and Poetry (~=1.7) - installation instructions here.
Installing the connector: from this connector directory, run: poetry install --with dev
Create credentials: if you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_sentry/spec.yaml file.
Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information.
See sample_files/sample_config.json for a sample config file.
Locally running the connector:
poetry run source-sentry spec
poetry run source-sentry check --config secrets/config.json
poetry run source-sentry discover --config secrets/config.json
poetry run source-sentry read --config secrets/config.json --catalog sample_files/configured_catalog.json
Running unit tests: to run unit tests locally, from the connector directory run: poetry run pytest unit_tests
Building the docker image: install airbyte-ci, then run the following command to build the docker image: airbyte-ci connectors --name=source-sentry build
An image will be available on your host with the tag airbyte/source-sentry:dev.
Running as a docker container: run any of the connector commands as follows:
docker run --rm airbyte/source-sentry:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-sentry:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-sentry:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-sentry:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
Running our CI test suite: you can run our full test suite locally using airbyte-ci: airbyte-ci connectors --name=source-sentry test
Customizing acceptance tests: customize the acceptance-test-config.yml file to configure acceptance tests. See Connector Acceptance Tests for more information.
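For incremental streams, the local read command shown earlier also accepts a state file via --state; the path below is hypothetical, and the file's contents must follow the Airbyte protocol state format:
poetry run source-sentry read --config secrets/config.json --catalog sample_files/configured_catalog.json --state sample_files/state.json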
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py.
Dependency management: all of your dependencies should be managed via Poetry.
To add a new dependency, run: poetry add <package-name>
Please commit the changes to the pyproject.toml and poetry.lock files.
Publishing a new version of the connector: you've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?
Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-sentry test
Bump the connector version (please follow semantic versioning for connectors): bump the dockerImageTag value in metadata.yaml and the version value in pyproject.toml.
Make sure the metadata.yaml content is up to date.
Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/sentry.md).
Create a Pull Request: use our PR naming conventions.
Pat yourself on the back for being an awesome contributor.
Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.
Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. |
airbyte-source-serpstat | Serpstat Source. This is the repository for the Serpstat configuration-based source connector.
For information about how to use this connector within Airbyte, see the documentation.
If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_serpstat/spec.yaml file.
Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information.
See integration_tests/sample_config.json for a sample config file.
If you are an Airbyte core member, copy the credentials in Lastpass under the secret name source serpstat test creds and place them into secrets/config.json.
Via airbyte-ci (recommended): airbyte-ci connectors --name=source-serpstat build
An image will be built with the tag airbyte/source-serpstat:dev.
Via docker build: docker build -t airbyte/source-serpstat:dev .
Then run any of the connector commands as follows:
docker run --rm airbyte/source-serpstat:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-serpstat:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-serpstat:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-serpstat:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
You can run our full test suite locally using airbyte-ci: airbyte-ci connectors --name=source-serpstat test
Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py.
All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies into two groups: dependencies required for your connector to work go in the MAIN_REQUIREMENTS list, and dependencies required for testing go in the TEST_REQUIREMENTS list.
You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?
Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-serpstat test
Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors.
Make sure the metadata.yaml content is up to date.
Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/serpstat.md).
Create a Pull Request: use our PR naming conventions.
Pat yourself on the back for being an awesome contributor.
Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
airbyte-source-sftp-bulk | SFTP Bulk Source. This is the repository for the FTP source connector, written in Python, which helps you bulk-ingest files with the same data format from an FTP server into a single stream.
For information about how to use this connector within Airbyte, see the documentation.
To iterate on this connector, make sure to complete this prerequisites section.
From this connector directory, create a virtual environment: python -m venv .venv
This will generate a virtualenv for this module in .venv/. Make sure this venv is active in your development environment of choice. To activate it from the terminal, run:
source .venv/bin/activate
pip install -r requirements.txt
If you are in an IDE, follow your IDE's instructions to activate the virtualenv.
Note that while we are installing dependencies from requirements.txt, you should only edit setup.py for your dependencies. requirements.txt is used for editable installs (pip install -e) to pull in Python dependencies from the monorepo and will call setup.py. If this is mumbo jumbo to you, don't worry about it: just put your deps in setup.py but install using pip install -r requirements.txt and everything should work as you expect.
If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_sftp_bulk/spec.json file.
Note that the secrets directory is gitignored by default, so there is no danger of accidentally checking in sensitive information.
See integration_tests/sample_config.json for a sample config file.
If you are an Airbyte core member, copy the credentials in Lastpass under the secret name source ftp test creds and place them into secrets/config.json.
python main.py spec
python main.py check --config secrets/config.json
python main.py discover --config secrets/config.json
python main.py read --config secrets/config.json --catalog integration_tests/configured_catalog.json
Via airbyte-ci (recommended): airbyte-ci connectors --name=source-sftp-bulk build
An image will be built with the tag airbyte/source-sftp-bulk:dev.
Via docker build: docker build -t airbyte/source-sftp-bulk:dev .
Then run any of the connector commands as follows:
docker run --rm airbyte/source-sftp-bulk:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-sftp-bulk:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-sftp-bulk:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-sftp-bulk:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
You can run our full test suite locally using airbyte-ci: airbyte-ci connectors --name=source-sftp-bulk test
Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py.
All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies into two groups: dependencies required for your connector to work go in the MAIN_REQUIREMENTS list, and dependencies required for testing go in the TEST_REQUIREMENTS list.
You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?
Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-sftp-bulk test
Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors.
Make sure the metadata.yaml content is up to date.
Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/sftp-bulk.md).
Create a Pull Request: use our PR naming conventions.
Pat yourself on the back for being an awesome contributor.
Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
airbyte-source-shopify | Shopify source connector. This is the repository for the Shopify source connector, written in Python.
For information about how to use this connector within Airbyte, see the documentation.
Local development prerequisites: Python (~=3.9) and Poetry (~=1.7) - installation instructions here.
Installing the connector: from this connector directory, run: poetry install --with dev
Create credentials: if you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_shopify/spec.yaml file.
Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information.
See sample_files/sample_config.json for a sample config file.
Locally running the connector:
poetry run source-shopify spec
poetry run source-shopify check --config secrets/config.json
poetry run source-shopify discover --config secrets/config.json
poetry run source-shopify read --config secrets/config.json --catalog integration_tests/configured_catalog.json
Running unit tests: to run unit tests locally, from the connector directory run: poetry run pytest unit_tests
Building the docker image: install airbyte-ci, then run the following command to build the docker image: airbyte-ci connectors --name=source-shopify build
An image will be available on your host with the tag airbyte/source-shopify:dev.
Running as a docker container: run any of the connector commands as follows:
docker run --rm airbyte/source-shopify:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-shopify:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-shopify:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-shopify:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
Running our CI test suite: you can run our full test suite locally using airbyte-ci: airbyte-ci connectors --name=source-shopify test
Customizing acceptance tests: customize the acceptance-test-config.yml file to configure acceptance tests. See Connector Acceptance Tests for more information.
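To assert programmatically that the check command above succeeded, the CONNECTION_STATUS message can be inspected with jq (assuming jq is installed; an illustrative sketch); the printed status should be SUCCEEDED:
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-shopify:dev check --config /secrets/config.json | jq -r 'select(.type == "CONNECTION_STATUS") | .connectionStatus.status'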
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py.
Dependency management: all of your dependencies should be managed via Poetry.
To add a new dependency, run: poetry add <package-name>
Please commit the changes to the pyproject.toml and poetry.lock files.
Publishing a new version of the connector: you've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?
Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-shopify test
Bump the connector version (please follow semantic versioning for connectors): bump the dockerImageTag value in metadata.yaml and the version value in pyproject.toml.
Make sure the metadata.yaml content is up to date.
Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/shopify.md).
Create a Pull Request: use our PR naming conventions.
Pat yourself on the back for being an awesome contributor.
Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.
Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. |
airbyte-source-shortio | Shortio Source. This is the repository for the Shortio configuration-based source connector.
For information about how to use this connector within Airbyte, see the documentation.
If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_shortio/spec.yaml file.
Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information.
See integration_tests/sample_config.json for a sample config file.
If you are an Airbyte core member, copy the credentials in Lastpass under the secret name source shortio test creds and place them into secrets/config.json.
Via airbyte-ci (recommended): airbyte-ci connectors --name=source-shortio build
An image will be built with the tag airbyte/source-shortio:dev.
Via docker build: docker build -t airbyte/source-shortio:dev .
Then run any of the connector commands as follows:
docker run --rm airbyte/source-shortio:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-shortio:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-shortio:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-shortio:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
You can run our full test suite locally using airbyte-ci: airbyte-ci connectors --name=source-shortio test
Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py.
All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies into two groups: dependencies required for your connector to work go in the MAIN_REQUIREMENTS list, and dependencies required for testing go in the TEST_REQUIREMENTS list.
You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?
Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-shortio test
Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors.
Make sure the metadata.yaml content is up to date.
Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/shortio.md).
Create a Pull Request: use our PR naming conventions.
Pat yourself on the back for being an awesome contributor.
Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
airbyte-source-slack | Slack source connector. This is the repository for the Slack source connector, written in Python.
For information about how to use this connector within Airbyte, see the documentation.
Local development prerequisites: Python (~=3.9) and Poetry (~=1.7) - installation instructions here.
Installing the connector: from this connector directory, run: poetry install --with dev
Create credentials: if you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_slack/spec.yaml file.
Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information.
See sample_files/sample_config.json for a sample config file.
Locally running the connector:
poetry run source-slack spec
poetry run source-slack check --config secrets/config.json
poetry run source-slack discover --config secrets/config.json
poetry run source-slack read --config secrets/config.json --catalog sample_files/configured_catalog.json
Running unit tests: to run unit tests locally, from the connector directory run: poetry run pytest unit_tests
Building the docker image: install airbyte-ci, then run the following command to build the docker image: airbyte-ci connectors --name=source-slack build
An image will be available on your host with the tag airbyte/source-slack:dev.
Running as a docker container: run any of the connector commands as follows:
docker run --rm airbyte/source-slack:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-slack:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-slack:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-slack:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
Running our CI test suite: you can run our full test suite locally using airbyte-ci: airbyte-ci connectors --name=source-slack test
Customizing acceptance tests: customize the acceptance-test-config.yml file to configure acceptance tests. See Connector Acceptance Tests for more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py.
Dependency management: all of your dependencies should be managed via Poetry.
To add a new dependency, run: poetry add <package-name>
Please commit the changes to the pyproject.toml and poetry.lock files.
Publishing a new version of the connector: you've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?
Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-slack test
Bump the connector version (please follow semantic versioning for connectors): bump the dockerImageTag value in metadata.yaml and the version value in pyproject.toml.
Make sure the metadata.yaml content is up to date.
Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/slack.md).
Create a Pull Request: use our PR naming conventions.
Pat yourself on the back for being an awesome contributor.
Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.
Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. |
airbyte-source-smaily | Smaily Source. This is the repository for the Smaily configuration-based source connector.
For information about how to use this connector within Airbyte, see the documentation.
If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_smaily/spec.yaml file.
Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information.
See integration_tests/sample_config.json for a sample config file.
If you are an Airbyte core member, copy the credentials in Lastpass under the secret name source smaily test creds and place them into secrets/config.json.
Via airbyte-ci (recommended): airbyte-ci connectors --name=source-smaily build
An image will be built with the tag airbyte/source-smaily:dev.
Via docker build: docker build -t airbyte/source-smaily:dev .
Then run any of the connector commands as follows:
docker run --rm airbyte/source-smaily:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-smaily:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-smaily:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-smaily:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
You can run our full test suite locally using airbyte-ci: airbyte-ci connectors --name=source-smaily test
Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py. All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development; a setup.py sketch follows.
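Under those conventions, a minimal setup.py might look like the following sketch; the package names and version pins are illustrative, and the MAIN_REQUIREMENTS / TEST_REQUIREMENTS split is described just below:

from setuptools import find_packages, setup

MAIN_REQUIREMENTS = ["airbyte-cdk"]          # runtime dependencies (illustrative)
TEST_REQUIREMENTS = ["pytest~=6.2", "requests-mock"]  # test-only dependencies (illustrative)

setup(
    name="source_smaily",
    description="Source implementation for Smaily.",
    packages=find_packages(),
    install_requires=MAIN_REQUIREMENTS,
    extras_require={"tests": TEST_REQUIREMENTS},
)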
We split dependencies between two groups: dependencies required for your connector to work go in the MAIN_REQUIREMENTS list, and dependencies required for testing go in the TEST_REQUIREMENTS list. You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-smaily test. Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors. Make sure the metadata.yaml content is up to date. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/smaily.md). Create a Pull Request: use our PR naming conventions. Pat yourself on the back for being an awesome contributor. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
airbyte-source-smartengage | Smartengage Source. This is the repository for the Smartengage configuration-based source connector.
For information about how to use this connector within Airbyte, see the documentation. If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_smartengage/spec.yaml file.
Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information.
See integration_tests/sample_config.json for a sample config file. If you are an Airbyte core member, copy the credentials in Lastpass under the secret name source smartengage test creds and place them into secrets/config.json.
Via airbyte-ci (recommended):
airbyte-ci connectors --name=source-smartengage build
An image will be built with the tag airbyte/source-smartengage:dev. Via docker build:
docker build -t airbyte/source-smartengage:dev .
Then run any of the connector commands as follows:
docker run --rm airbyte/source-smartengage:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-smartengage:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-smartengage:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-smartengage:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
You can run our full test suite locally using airbyte-ci:
airbyte-ci connectors --name=source-smartengage test
Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information.
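As a quick programmatic smoke test of the check command shown above, the connection status can be read out of the protocol stream; this sketch assumes the Airbyte protocol's CONNECTION_STATUS message shape and the image tag built earlier:

import json
import os
import subprocess

out = subprocess.run(
    ["docker", "run", "--rm",
     "-v", f"{os.getcwd()}/secrets:/secrets",
     "airbyte/source-smartengage:dev", "check", "--config", "/secrets/config.json"],
    capture_output=True, text=True,
).stdout

for line in out.splitlines():
    if not line.startswith("{"):
        continue
    message = json.loads(line)
    if message.get("type") == "CONNECTION_STATUS":
        # Expected values are SUCCEEDED or FAILED.
        print("status:", message["connectionStatus"]["status"])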
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py. All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies between two groups: dependencies required for your connector to work go in the MAIN_REQUIREMENTS list, and dependencies required for testing go in the TEST_REQUIREMENTS list. You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-smartengage test. Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors. Make sure the metadata.yaml content is up to date. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/smartengage.md). Create a Pull Request: use our PR naming conventions. Pat yourself on the back for being an awesome contributor. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
airbyte-source-smartsheets | Customer Io Source. This is the repository for the Customer Io configuration-based source connector.
For information about how to use this connector within Airbyte, see the documentation. If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_customer_io/spec.yaml file.
Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information.
See integration_tests/sample_config.json for a sample config file. If you are an Airbyte core member, copy the credentials in Lastpass under the secret name source customer-io test creds and place them into secrets/config.json.
Via airbyte-ci (recommended):
airbyte-ci connectors --name source-customer-io build
An image will be built with the tag airbyte/source-customer-io:dev. Via docker build:
docker build -t airbyte/source-customer-io:dev .
Then run any of the connector commands as follows:
docker run --rm airbyte/source-customer-io:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-customer-io:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-customer-io:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-customer-io:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
You can run our full test suite locally using airbyte-ci:
airbyte-ci connectors --name=source-smartsheets test
Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py. All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies between two groups: dependencies required for your connector to work go in the MAIN_REQUIREMENTS list, and dependencies required for testing go in the TEST_REQUIREMENTS list. You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-smartsheets test. Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors. Make sure the metadata.yaml content is up to date. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/smartsheets.md). Create a Pull Request: use our PR naming conventions. Pat yourself on the back for being an awesome contributor. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
airbyte-source-snapchat-marketing | Snapchat-Marketing source connector. This is the repository for the Snapchat-Marketing source connector, written in Python.
For information about how to use this connector within Airbyte, see the documentation.
Local development. Prerequisites: Python (~=3.9) and Poetry (~=1.7); installation instructions here.
Installing the connector: from this connector directory, run:
poetry install --with dev
Create credentials: if you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_snapchat_marketing/spec.yaml file.
Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information.
See sample_files/sample_config.json for a sample config file.
Locally running the connector:
poetry run source-snapchat-marketing spec
poetry run source-snapchat-marketing check --config secrets/config.json
poetry run source-snapchat-marketing discover --config secrets/config.json
poetry run source-snapchat-marketing read --config secrets/config.json --catalog sample_files/configured_catalog.json
Running unit tests: to run unit tests locally, from the connector directory run:
poetry run pytest unit_tests
(A toy unit test is sketched after the docker commands below.)
Building the docker image: install airbyte-ci, then run the following command to build the docker image:
airbyte-ci connectors --name=source-snapchat-marketing build
An image will be available on your host with the tag airbyte/source-snapchat-marketing:dev.
Running as a docker container: then run any of the connector commands as follows:
docker run --rm airbyte/source-snapchat-marketing:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-snapchat-marketing:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-snapchat-marketing:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-snapchat-marketing:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
Running our CI test suite: you can run our full test suite locally using airbyte-ci:
airbyte-ci connectors --name=source-snapchat-marketing test
Customizing acceptance tests: customize the acceptance-test-config.yml file to configure acceptance tests. See Connector Acceptance Tests for more information.
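As referenced earlier, here is a toy unit test in the unit_tests style that poetry run pytest unit_tests would pick up; chunk_date_range is an invented helper, shown purely to illustrate the workflow, not an actual function of this connector:

from datetime import date, timedelta

def chunk_date_range(start: date, end: date, days: int):
    # Yield inclusive (start, end) windows of at most `days` days.
    while start <= end:
        yield start, min(start + timedelta(days=days - 1), end)
        start += timedelta(days=days)

def test_chunk_date_range_covers_whole_window():
    chunks = list(chunk_date_range(date(2024, 1, 1), date(2024, 1, 10), 4))
    assert chunks[0] == (date(2024, 1, 1), date(2024, 1, 4))
    assert chunks[-1][1] == date(2024, 1, 10)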
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py.
Dependency Management: all of your dependencies should be managed via Poetry.
To add a new dependency, run:
poetry add <package-name>
Please commit the changes to the pyproject.toml and poetry.lock files.
Publishing a new version of the connector: you've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-snapchat-marketing test. Bump the connector version (please follow semantic versioning for connectors): bump the dockerImageTag value in metadata.yaml and bump the version value in pyproject.toml. Make sure the metadata.yaml content is up to date. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/snapchat-marketing.md). Create a Pull Request: use our PR naming conventions. Pat yourself on the back for being an awesome contributor. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. |
airbyte-source-sonar-cloud | Sonar Cloud Source. This is the repository for the Sonar Cloud configuration-based source connector.
For information about how to use this connector within Airbyte, see the documentation. If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_sonar_cloud/spec.yaml file.
Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information.
See integration_tests/sample_config.json for a sample config file. If you are an Airbyte core member, copy the credentials in Lastpass under the secret name source sonar-cloud test creds and place them into secrets/config.json.
Via airbyte-ci (recommended):
airbyte-ci connectors --name=source-sonar-cloud build
An image will be built with the tag airbyte/source-sonar-cloud:dev. Via docker build:
docker build -t airbyte/source-sonar-cloud:dev .
Then run any of the connector commands as follows:
docker run --rm airbyte/source-sonar-cloud:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-sonar-cloud:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-sonar-cloud:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-sonar-cloud:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
You can run our full test suite locally using airbyte-ci:
airbyte-ci connectors --name=source-sonar-cloud test
Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py. All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies between two groups: dependencies required for your connector to work go in the MAIN_REQUIREMENTS list, and dependencies required for testing go in the TEST_REQUIREMENTS list. You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-sonar-cloud test. Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors. Make sure the metadata.yaml content is up to date. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/sonar-cloud.md). Create a Pull Request: use our PR naming conventions. Pat yourself on the back for being an awesome contributor. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
airbyte-source-spacex-api | Spacex Api Source. This is the repository for the Spacex Api configuration-based source connector.
For information about how to use this connector within Airbyte, see the documentation. To iterate on this connector, make sure to complete this prerequisites section. From this connector directory, create a virtual environment:
python -m venv .venv
This will generate a virtualenv for this module in .venv/. Make sure this venv is active in your development environment of choice. To activate it from the terminal, run:
source .venv/bin/activate
pip install -r requirements.txt
If you are in an IDE, follow your IDE's instructions to activate the virtualenv. Note that while we are installing dependencies from requirements.txt, you should only edit setup.py for your dependencies. requirements.txt is used for editable installs (pip install -e) to pull in Python dependencies from the monorepo and will call setup.py. If this is mumbo jumbo to you, don't worry about it: just put your deps in setup.py but install using pip install -r requirements.txt and everything should work as you expect.
If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_spacex_api/spec.yaml file.
Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information.
See integration_tests/sample_config.json for a sample config file. If you are an Airbyte core member, copy the credentials in Lastpass under the secret name source spacex-api test creds and place them into secrets/config.json.
Via airbyte-ci (recommended):
airbyte-ci connectors --name=source-spacex-api build
An image will be built with the tag airbyte/source-spacex-api:dev. Via docker build:
docker build -t airbyte/source-spacex-api:dev .
Then run any of the connector commands as follows:
docker run --rm airbyte/source-spacex-api:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-spacex-api:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-spacex-api:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-spacex-api:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
You can run our full test suite locally using airbyte-ci:
airbyte-ci connectors --name=source-spacex-api test
Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information.
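To see which streams the discover command above surfaces, the CATALOG message can be parsed out of its output; a minimal sketch, assuming the Airbyte protocol's {"type": "CATALOG", "catalog": {"streams": [...]}} shape:

import json
import os
import subprocess

out = subprocess.run(
    ["docker", "run", "--rm",
     "-v", f"{os.getcwd()}/secrets:/secrets",
     "airbyte/source-spacex-api:dev", "discover", "--config", "/secrets/config.json"],
    capture_output=True, text=True,
).stdout

for line in out.splitlines():
    if not line.startswith("{"):
        continue
    message = json.loads(line)
    if message.get("type") == "CATALOG":
        for stream in message["catalog"]["streams"]:
            print(stream["name"])  # one discovered stream per line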
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py. All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies between two groups: dependencies required for your connector to work go in the MAIN_REQUIREMENTS list, and dependencies required for testing go in the TEST_REQUIREMENTS list. You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-spacex-api test. Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors. Make sure the metadata.yaml content is up to date. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/spacex-api.md). Create a Pull Request: use our PR naming conventions. Pat yourself on the back for being an awesome contributor. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
airbyte-source-square | Square Source. This is the repository for the Square configuration-based source connector.
For information about how to use this connector within Airbyte, see the documentation. If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_square/spec.yaml file.
Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information.
See integration_tests/sample_config.json for a sample config file. If you are an Airbyte core member, copy the credentials in Lastpass under the secret name source square test creds and place them into secrets/config.json.
Via airbyte-ci (recommended):
airbyte-ci connectors --name=source-square build
An image will be built with the tag airbyte/source-square:dev. Via docker build:
docker build -t airbyte/source-square:dev .
Then run any of the connector commands as follows:
docker run --rm airbyte/source-square:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-square:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-square:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-square:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
You can run our full test suite locally using airbyte-ci:
airbyte-ci connectors --name=source-square test
Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py. All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies between two groups: dependencies required for your connector to work go in the MAIN_REQUIREMENTS list, and dependencies required for testing go in the TEST_REQUIREMENTS list. You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-square test. Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors. Make sure the metadata.yaml content is up to date. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/square.md). Create a Pull Request: use our PR naming conventions. Pat yourself on the back for being an awesome contributor. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
airbyte-source-statuspage | Statuspage Source. This is the repository for the Statuspage configuration-based source connector.
For information about how to use this connector within Airbyte, see the documentation. If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_statuspage/spec.yaml file.
Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information.
See integration_tests/sample_config.json for a sample config file. If you are an Airbyte core member, copy the credentials in Lastpass under the secret name source statuspage test creds and place them into secrets/config.json.
Via airbyte-ci (recommended):
airbyte-ci connectors --name=source-statuspage build
An image will be built with the tag airbyte/source-statuspage:dev. Via docker build:
docker build -t airbyte/source-statuspage:dev .
Then run any of the connector commands as follows:
docker run --rm airbyte/source-statuspage:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-statuspage:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-statuspage:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-statuspage:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
You can run our full test suite locally using airbyte-ci:
airbyte-ci connectors --name=source-statuspage test
Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py. All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies between two groups: dependencies required for your connector to work go in the MAIN_REQUIREMENTS list, and dependencies required for testing go in the TEST_REQUIREMENTS list. You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-statuspage test. Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors. Make sure the metadata.yaml content is up to date. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/statuspage.md). Create a Pull Request: use our PR naming conventions. Pat yourself on the back for being an awesome contributor. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
airbyte-source-strava | Strava Source. This is the repository for the Strava configuration-based source connector.
For information about how to use this connector within Airbyte, see the documentation. If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_strava/spec.yaml file.
Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information.
See integration_tests/sample_config.json for a sample config file. If you are an Airbyte core member, copy the credentials in Lastpass under the secret name source strava test creds and place them into secrets/config.json.
Via airbyte-ci (recommended):
airbyte-ci connectors --name=source-strava build
An image will be built with the tag airbyte/source-strava:dev. Via docker build:
docker build -t airbyte/source-strava:dev .
Then run any of the connector commands as follows:
docker run --rm airbyte/source-strava:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-strava:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-strava:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-strava:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
You can run our full test suite locally using airbyte-ci:
airbyte-ci connectors --name=source-strava test
Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py. All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies between two groups: dependencies required for your connector to work go in the MAIN_REQUIREMENTS list, and dependencies required for testing go in the TEST_REQUIREMENTS list. You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-strava test. Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors. Make sure the metadata.yaml content is up to date. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/strava.md). Create a Pull Request: use our PR naming conventions. Pat yourself on the back for being an awesome contributor. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
airbyte-source-stripe | Stripe source connector. This is the repository for the Stripe source connector, written in Python.
For information about how to use this connector within Airbyte, see the documentation.
Local development. Prerequisites: Python (~=3.9) and Poetry (~=1.7); installation instructions here.
Installing the connector: from this connector directory, run:
poetry install --with dev
Create credentials: if you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_stripe/spec.yaml file.
Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information.
See sample_files/sample_config.json for a sample config file.
Locally running the connector:
poetry run source-stripe spec
poetry run source-stripe check --config secrets/config.json
poetry run source-stripe discover --config secrets/config.json
poetry run source-stripe read --config secrets/config.json --catalog sample_files/configured_catalog.json
Running unit tests: to run unit tests locally, from the connector directory run:
poetry run pytest unit_tests
Building the docker image: install airbyte-ci, then run the following command to build the docker image:
airbyte-ci connectors --name=source-stripe build
An image will be available on your host with the tag airbyte/source-stripe:dev.
Running as a docker container: then run any of the connector commands as follows:
docker run --rm airbyte/source-stripe:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-stripe:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-stripe:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-stripe:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
Running our CI test suite: you can run our full test suite locally using airbyte-ci:
airbyte-ci connectors --name=source-stripe test
Customizing acceptance tests: customize the acceptance-test-config.yml file to configure acceptance tests. See Connector Acceptance Tests for more information.
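The integration_tests/configured_catalog.json passed to read above follows the Airbyte configured-catalog shape; a minimal hypothetical example can be generated like this (the stream name is illustrative, and the real schema comes from discover):

import json

configured_catalog = {
    "streams": [
        {
            "stream": {
                "name": "charges",  # hypothetical stream name for illustration
                "json_schema": {},
                "supported_sync_modes": ["full_refresh"],
            },
            "sync_mode": "full_refresh",
            "destination_sync_mode": "overwrite",
        }
    ]
}
with open("integration_tests/configured_catalog.json", "w") as f:
    json.dump(configured_catalog, f, indent=2)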
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py.
Dependency Management: all of your dependencies should be managed via Poetry.
To add a new dependency, run:
poetry add <package-name>
Please commit the changes to the pyproject.toml and poetry.lock files.
Publishing a new version of the connector: you've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-stripe test. Bump the connector version (please follow semantic versioning for connectors): bump the dockerImageTag value in metadata.yaml and bump the version value in pyproject.toml. Make sure the metadata.yaml content is up to date. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/stripe.md). Create a Pull Request: use our PR naming conventions. Pat yourself on the back for being an awesome contributor. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. |
airbyte-source-surveycto | Surveycto Source. This is the repository for the Surveycto source connector, written in Python.
For information about how to use this connector within Airbyte, see the documentation. The generator boilerplate was produced by this command:
cd airbyte-integrations/connector-templates/generator
./generate.sh
Create a dev environment:
cd ../../connectors/source-surveycto
python3 -m venv .venv # Create a virtual environment in the .venv directory
source .venv/bin/activate
pip install -r requirements.txt
To iterate on this connector, make sure to complete this prerequisites section. From this connector directory, create a virtual environment:
python -m venv .venv
This will generate a virtualenv for this module in .venv/. Make sure this venv is active in your development environment of choice. To activate it from the terminal, run:
source .venv/bin/activate
pip install -r requirements.txt
pip install '.[tests]'
If you are in an IDE, follow your IDE's instructions to activate the virtualenv. Note that while we are installing dependencies from requirements.txt, you should only edit setup.py for your dependencies. requirements.txt is used for editable installs (pip install -e) to pull in Python dependencies from the monorepo and will call setup.py. If this is mumbo jumbo to you, don't worry about it: just put your deps in setup.py but install using pip install -r requirements.txt and everything should work as you expect.
If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_surveycto/spec.yaml file.
Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information.
See integration_tests/sample_config.json for a sample config file. If you are an Airbyte core member, copy the credentials in Lastpass under the secret name source surveycto test creds and place them into secrets/config.json.
python main.py spec
python main.py check --config secrets/config.json
python main.py discover --config secrets/config.json
python main.py read --config secrets/config.json --catalog integration_tests/configured_catalog.json
Via airbyte-ci (recommended):
airbyte-ci connectors --name=source-surveycto build
An image will be built with the tag airbyte/source-surveycto:dev. Via docker build:
docker build -t airbyte/source-surveycto:dev .
Then run any of the connector commands as follows:
docker run --rm airbyte/source-surveycto:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-surveycto:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-surveycto:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-surveycto:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
You can run our full test suite locally using airbyte-ci:
airbyte-ci connectors --name=source-surveycto test
Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information.
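For iterating on incremental behavior during local development, the last STATE message a read emits can be captured and replayed; this sketch assumes main.py accepts a --state argument on a later run, as Python CDK sources conventionally do, and the exact state-file format may differ per connector:

import json
import subprocess

proc = subprocess.run(
    ["python", "main.py", "read",
     "--config", "secrets/config.json",
     "--catalog", "integration_tests/configured_catalog.json"],
    capture_output=True, text=True,
)

last_state = None
for line in proc.stdout.splitlines():
    if not line.startswith("{"):
        continue
    message = json.loads(line)
    if message.get("type") == "STATE":
        last_state = message["state"]  # keep only the most recent state

if last_state is not None:
    with open("state.json", "w") as f:
        json.dump(last_state, f)  # replay later via: python main.py read ... --state state.json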
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py. All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies between two groups: dependencies required for your connector to work go in the MAIN_REQUIREMENTS list, and dependencies required for testing go in the TEST_REQUIREMENTS list. You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-surveycto test. Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors. Make sure the metadata.yaml content is up to date. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/surveycto.md). Create a Pull Request: use our PR naming conventions. Pat yourself on the back for being an awesome contributor. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
airbyte-source-surveymonkey | Surveymonkey source connector. This is the repository for the Surveymonkey source connector, written in Python.
For information about how to use this connector within Airbyte, see the documentation.
Local development. Prerequisites: Python (~=3.9) and Poetry (~=1.7); installation instructions here.
Installing the connector: from this connector directory, run:
poetry install --with dev
Create credentials: if you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_surveymonkey/spec.yaml file.
Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information.
See sample_files/sample_config.json for a sample config file.
Locally running the connector:
poetry run source-surveymonkey spec
poetry run source-surveymonkey check --config secrets/config.json
poetry run source-surveymonkey discover --config secrets/config.json
poetry run source-surveymonkey read --config secrets/config.json --catalog sample_files/configured_catalog.json
Running unit tests: to run unit tests locally, from the connector directory run:
poetry run pytest unit_tests
Building the docker image: install airbyte-ci, then run the following command to build the docker image:
airbyte-ci connectors --name=source-surveymonkey build
An image will be available on your host with the tag airbyte/source-surveymonkey:dev.
Running as a docker container: then run any of the connector commands as follows:
docker run --rm airbyte/source-surveymonkey:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-surveymonkey:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-surveymonkey:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-surveymonkey:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
Running our CI test suite: you can run our full test suite locally using airbyte-ci:
airbyte-ci connectors --name=source-surveymonkey test
Customizing acceptance tests: customize the acceptance-test-config.yml file to configure acceptance tests. See Connector Acceptance Tests for more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py.
Dependency Management: all of your dependencies should be managed via Poetry.
To add a new dependency, run:
poetry add <package-name>
Please commit the changes to the pyproject.toml and poetry.lock files.
Publishing a new version of the connector: you've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-surveymonkey test. Bump the connector version (please follow semantic versioning for connectors): bump the dockerImageTag value in metadata.yaml and bump the version value in pyproject.toml. Make sure the metadata.yaml content is up to date. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/surveymonkey.md). Create a Pull Request: use our PR naming conventions. Pat yourself on the back for being an awesome contributor. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. |
airbyte-source-survey-sparrow | Survey Sparrow Source. This is the repository for the Survey Sparrow configuration-based source connector.
For information about how to use this connector within Airbyte, see the documentation. If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_survey_sparrow/spec.yaml file.
Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information.
See integration_tests/sample_config.json for a sample config file. If you are an Airbyte core member, copy the credentials in Lastpass under the secret name source survey-sparrow test creds and place them into secrets/config.json.
Via airbyte-ci (recommended):
airbyte-ci connectors --name=source-survey-sparrow build
An image will be built with the tag airbyte/source-survey-sparrow:dev. Via docker build:
docker build -t airbyte/source-survey-sparrow:dev .
Then run any of the connector commands as follows:
docker run --rm airbyte/source-survey-sparrow:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-survey-sparrow:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-survey-sparrow:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-survey-sparrow:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
You can run our full test suite locally using airbyte-ci:
airbyte-ci connectors --name=source-survey-sparrow test
Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py. All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies between two groups: dependencies required for your connector to work go in the MAIN_REQUIREMENTS list, and dependencies required for testing go in the TEST_REQUIREMENTS list. You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-survey-sparrow test. Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors. Make sure the metadata.yaml content is up to date. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/survey-sparrow.md). Create a Pull Request: use our PR naming conventions. Pat yourself on the back for being an awesome contributor. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
airbyte-source-talkdesk-explore | Talkdesk-Explore Source. This is the repository for the Talkdesk source connector, written in Python.
For information about how to use this connector within Airbyte, see the documentation. To iterate on this connector, make sure to complete this prerequisites section. From this connector directory, create a virtual environment:
python -m venv .venv
This will generate a virtualenv for this module in .venv/. Make sure this venv is active in your development environment of choice. To activate it from the terminal, run:
source .venv/bin/activate
pip install -r requirements.txt
If you are in an IDE, follow your IDE's instructions to activate the virtualenv. Note that while we are installing dependencies from requirements.txt, you should only edit setup.py for your dependencies. requirements.txt is used for editable installs (pip install -e) to pull in Python dependencies from the monorepo and will call setup.py. If this is mumbo jumbo to you, don't worry about it: just put your deps in setup.py but install using pip install -r requirements.txt and everything should work as you expect.
If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_talkdesk_explore/spec.json file.
Note that the secrets directory is gitignored by default, so there is no danger of accidentally checking in sensitive information.
See integration_tests/sample_config.json for a sample config file. If you are an Airbyte core member, copy the credentials in Lastpass under the secret name source talkdesk-explore test creds and place them into secrets/config.json.
python main.py spec
python main.py check --config secrets/config.json
python main.py discover --config secrets/config.json
python main.py read --config secrets/config.json --catalog integration_tests/configured_catalog.json
Via airbyte-ci (recommended):
airbyte-ci connectors --name=source-talkdesk-explore build
An image will be built with the tag airbyte/source-talkdesk-explore:dev. Via docker build:
docker build -t airbyte/source-talkdesk-explore:dev .
Then run any of the connector commands as follows:
docker run --rm airbyte/source-talkdesk-explore:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-talkdesk-explore:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-talkdesk-explore:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-talkdesk-explore:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
You can run our full test suite locally using airbyte-ci:
airbyte-ci connectors --name=source-talkdesk-explore test
Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information.
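A config file can also be validated locally against the connectionSpecification the spec command prints, before running check; this sketch assumes the Airbyte protocol's SPEC message shape and uses the third-party jsonschema package:

import json
import subprocess

import jsonschema  # pip install jsonschema

spec_out = subprocess.run(
    ["python", "main.py", "spec"], capture_output=True, text=True, check=True
).stdout

spec_message = None
for line in spec_out.splitlines():
    if line.startswith("{"):
        message = json.loads(line)
        if message.get("type") == "SPEC":
            spec_message = message
schema = spec_message["spec"]["connectionSpecification"]

with open("secrets/config.json") as f:
    config = json.load(f)

jsonschema.validate(instance=config, schema=schema)  # raises ValidationError on mismatch
print("config matches the declared specification")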
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py. All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies between two groups: dependencies required for your connector to work go in the MAIN_REQUIREMENTS list, and dependencies required for testing go in the TEST_REQUIREMENTS list. You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-talkdesk-explore test. Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors. Make sure the metadata.yaml content is up to date. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/talkdesk-explore.md). Create a Pull Request: use our PR naming conventions. Pat yourself on the back for being an awesome contributor. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
airbyte-source-tempo | Fullstory Source. This is the repository for the Fullstory configuration-based source connector.
For information about how to use this connector within Airbyte, see the documentation. If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_fullstory/spec.yaml file.
Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information.
See integration_tests/sample_config.json for a sample config file. If you are an Airbyte core member, copy the credentials in Lastpass under the secret name source fullstory test creds and place them into secrets/config.json.
Via airbyte-ci (recommended):
airbyte-ci connectors --name source-fullstory build
An image will be built with the tag airbyte/source-fullstory:dev. Via docker build:
docker build -t airbyte/source-fullstory:dev .
Then run any of the connector commands as follows:
docker run --rm airbyte/source-fullstory:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-fullstory:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-fullstory:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-fullstory:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
You can run our full test suite locally using airbyte-ci:
airbyte-ci connectors --name=source-tempo test
Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py. All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies between two groups: dependencies required for your connector to work go in the MAIN_REQUIREMENTS list, and dependencies required for testing go in the TEST_REQUIREMENTS list. You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-tempo test. Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors. Make sure the metadata.yaml content is up to date. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/tempo.md). Create a Pull Request: use our PR naming conventions. Pat yourself on the back for being an awesome contributor. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
airbyte-source-the-guardian-api | The Guardian Api Source. This is the repository for The Guardian Api configuration-based source connector.
For information about how to use this connector within Airbyte, see the documentation. If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_the_guardian_api/spec.yaml file.
Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information.
See integration_tests/sample_config.json for a sample config file. If you are an Airbyte core member, copy the credentials in Lastpass under the secret name source the-guardian-api test creds and place them into secrets/config.json.
Via airbyte-ci (recommended):
airbyte-ci connectors --name=source-the-guardian-api build
An image will be built with the tag airbyte/source-the-guardian-api:dev. Via docker build:
docker build -t airbyte/source-the-guardian-api:dev .
Then run any of the connector commands as follows:
docker run --rm airbyte/source-the-guardian-api:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-the-guardian-api:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-the-guardian-api:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-the-guardian-api:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
You can run our full test suite locally using airbyte-ci:
airbyte-ci connectors --name=source-the-guardian-api test
Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py. All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies between two groups: dependencies required for your connector to work go in the MAIN_REQUIREMENTS list, and dependencies required for testing go in the TEST_REQUIREMENTS list. You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-the-guardian-api test. Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors. Make sure the metadata.yaml content is up to date. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/the-guardian-api.md). Create a Pull Request: use our PR naming conventions. Pat yourself on the back for being an awesome contributor. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
airbyte-source-tiktok-marketing | Tiktok-Marketing source connector. This is the repository for the Tiktok-Marketing source connector, written in Python.
For information about how to use this connector within Airbyte, see the documentation.
Local development. Prerequisites: Python (~=3.9) and Poetry (~=1.7); installation instructions here.
Installing the connector: from this connector directory, run:
poetry install --with dev
Create credentials: if you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_tiktok_marketing/spec.yaml file.
Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information.
See sample_files/sample_config.json for a sample config file.
Locally running the connector:
poetry run source-tiktok-marketing spec
poetry run source-tiktok-marketing check --config secrets/config.json
poetry run source-tiktok-marketing discover --config secrets/config.json
poetry run source-tiktok-marketing read --config secrets/config.json --catalog sample_files/configured_catalog.json
Running unit tests: to run unit tests locally, from the connector directory run:
poetry run pytest unit_tests
Building the docker image: install airbyte-ci, then run the following command to build the docker image:
airbyte-ci connectors --name=source-tiktok-marketing build
An image will be available on your host with the tag airbyte/source-tiktok-marketing:dev.
Running as a docker container: then run any of the connector commands as follows:
docker run --rm airbyte/source-tiktok-marketing:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-tiktok-marketing:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-tiktok-marketing:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-tiktok-marketing:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
Running our CI test suite: you can run our full test suite locally using airbyte-ci:
airbyte-ci connectors --name=source-tiktok-marketing test
Customizing acceptance tests: customize the acceptance-test-config.yml file to configure acceptance tests. See Connector Acceptance Tests for more information.
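As a quick sanity check of a local sync, the records a read emits can be tallied per stream; a sketch assuming the image tag and mounted paths shown above and the protocol's RECORD message shape:

import json
import os
import subprocess
from collections import Counter

cmd = [
    "docker", "run", "--rm",
    "-v", f"{os.getcwd()}/secrets:/secrets",
    "-v", f"{os.getcwd()}/integration_tests:/integration_tests",
    "airbyte/source-tiktok-marketing:dev", "read",
    "--config", "/secrets/config.json",
    "--catalog", "/integration_tests/configured_catalog.json",
]
counts = Counter()
with subprocess.Popen(cmd, stdout=subprocess.PIPE, text=True) as proc:
    for line in proc.stdout:
        if not line.startswith("{"):
            continue
        message = json.loads(line)
        if message.get("type") == "RECORD":
            counts[message["record"]["stream"]] += 1
print(dict(counts))  # records emitted per stream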
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py.
Dependency Management: all of your dependencies should be managed via Poetry.
To add a new dependency, run:
poetry add <package-name>
Please commit the changes to the pyproject.toml and poetry.lock files.
Publishing a new version of the connector: you've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-tiktok-marketing test. Bump the connector version (please follow semantic versioning for connectors): bump the dockerImageTag value in metadata.yaml and bump the version value in pyproject.toml. Make sure the metadata.yaml content is up to date. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/tiktok-marketing.md). Create a Pull Request: use our PR naming conventions. Pat yourself on the back for being an awesome contributor. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. |
airbyte-source-timely | Timely Source. This is the repository for the Timely configuration-based source connector.
For information about how to use this connector within Airbyte, see the documentation. If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_timely/spec.yaml file.
Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information.
See integration_tests/sample_config.json for a sample config file. If you are an Airbyte core member, copy the credentials in Lastpass under the secret name source timely test creds and place them into secrets/config.json.
Via airbyte-ci (recommended):
airbyte-ci connectors --name=source-timely build
An image will be built with the tag airbyte/source-timely:dev. Via docker build:
docker build -t airbyte/source-timely:dev .
Then run any of the connector commands as follows:
docker run --rm airbyte/source-timely:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-timely:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-timely:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-timely:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
You can run our full test suite locally using airbyte-ci:
airbyte-ci connectors --name=source-timely test
Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py.All of your dependencies should go insetup.py, NOTrequirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies into two groups: dependencies required for your connector to work go in theMAIN_REQUIREMENTSlist, and dependencies required for testing go in theTEST_REQUIREMENTSlist (a minimal sketch of this layout follows at the end of this entry).You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?Make sure your changes are passing our test suite:airbyte-ci connectors --name=source-timely testBump the connector version inmetadata.yaml: increment thedockerImageTagvalue. Please followsemantic versioning for connectors.Make sure themetadata.yamlcontent is up to date.Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/timely.md).Create a Pull Request: useour PR naming conventions.Pat yourself on the back for being an awesome contributor.Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
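To illustrate the MAIN_REQUIREMENTS/TEST_REQUIREMENTS split described above, here is a minimal, hypothetical setup.py sketch; the package names listed are illustrative, not the connector's real dependencies:

```python
# setup.py - illustrative sketch only; the real connector pins its own deps.
from setuptools import find_packages, setup

# Runtime dependencies: required for the connector to work.
MAIN_REQUIREMENTS = [
    "airbyte-cdk",
]

# Test-only dependencies: required to run the test suites.
TEST_REQUIREMENTS = [
    "pytest",
    "requests-mock",
]

setup(
    name="source_timely",
    packages=find_packages(exclude=("tests",)),
    install_requires=MAIN_REQUIREMENTS,
    extras_require={"tests": TEST_REQUIREMENTS},
)
```

With this layout, pip install . pulls in only MAIN_REQUIREMENTS, while pip install '.[tests]' additionally installs TEST_REQUIREMENTS.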
airbyte-source-tmdb | Tmdb SourceThis is the repository for the Tmdb configuration based source connector.
For information about how to use this connector within Airbyte, seethe documentation.To iterate on this connector, make sure to complete this prerequisites section.From this connector directory, create a virtual environment:python -m venv .venvThis will generate a virtualenv for this module in.venv/. Make sure this venv is active in your
development environment of choice. To activate it from the terminal, run:source .venv/bin/activate
pip install -r requirements.txtIf you are in an IDE, follow your IDE's instructions to activate the virtualenv.Note that while we are installing dependencies fromrequirements.txt, you should only editsetup.pyfor your dependencies.requirements.txtis
used for editable installs (pip install -e) to pull in Python dependencies from the monorepo and will callsetup.py.
If this is mumbo jumbo to you, don't worry about it: just put your deps insetup.pybut install usingpip install -r requirements.txtand everything
should work as you expect.If you are a community contributor, follow the instructions in thedocumentationto generate the necessary credentials. Then create a filesecrets/config.jsonconforming to thesource_tmdb/spec.yamlfile.
Note that any directory namedsecretsis gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information.
Seeintegration_tests/sample_config.jsonfor a sample config file.If you are an Airbyte core member, copy the credentials in Lastpass under the secret namesource tmdb test credsand place them intosecrets/config.json.Viaairbyte-ci(recommended):airbyte-ciconnectors--name=source-tmdbbuildAn image will be built with the tagairbyte/source-tmdb:dev.Viadocker build:dockerbuild-tairbyte/source-tmdb:dev.Then run any of the connector commands as follows:docker run --rm airbyte/source-tmdb:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-tmdb:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-tmdb:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-tmdb:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.jsonYou can run our full test suite locally usingairbyte-ci:airbyte-ciconnectors--name=source-tmdbtestCustomizeacceptance-test-config.ymlfile to configure tests. SeeConnector Acceptance Testsfor more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py.All of your dependencies should go insetup.py, NOTrequirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies into two groups: dependencies required for your connector to work go in theMAIN_REQUIREMENTSlist, and dependencies required for testing go in theTEST_REQUIREMENTSlist.You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?Make sure your changes are passing our test suite:airbyte-ci connectors --name=source-tmdb testBump the connector version inmetadata.yaml: increment thedockerImageTagvalue. Please followsemantic versioning for connectors.Make sure themetadata.yamlcontent is up to date.Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/tmdb.md).Create a Pull Request: useour PR naming conventions.Pat yourself on the back for being an awesome contributor.Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
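As a side note on the secrets/config.json step described in this entry: since the spec file describes the expected config shape, you can sanity-check a config against it before running the connector. A hypothetical sketch, assuming the PyYAML and jsonschema packages and the conventional spec layout with a connectionSpecification key:

```python
# check_config.py - hypothetical helper; file paths follow the layout
# described in this README, but this script is not part of the connector.
import json

import jsonschema
import yaml

with open("source_tmdb/spec.yaml") as f:
    spec = yaml.safe_load(f)

with open("secrets/config.json") as f:
    config = json.load(f)

# connectionSpecification is a JSON Schema describing valid configs.
jsonschema.validate(instance=config, schema=spec["connectionSpecification"])
print("config conforms to the spec")
```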
airbyte-source-todoist | Todoist SourceThis is the repository for the Todoist configuration based source connector.
For information about how to use this connector within Airbyte, seethe documentation.If you are a community contributor, follow the instructions in thedocumentationto generate the necessary credentials. Then create a filesecrets/config.jsonconforming to thesource_todoist/spec.yamlfile.
Note that any directory namedsecretsis gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information.
Seeintegration_tests/sample_config.jsonfor a sample config file.If you are an Airbyte core member, copy the credentials in Lastpass under the secret namesource todoist test credsand place them intosecrets/config.json.The Airbyte way of building this connector is to use ourairbyte-citool.
You can follow install instructionshere.
Then running the following command will build your connector:airbyte-ciconnectors--namesource-todoistbuildOnce the command is done, you will find your connector image in your local docker registry:airbyte/source-todoist:dev.When contributing to our connector you might need to customize the build process to add a system dependency or set an env var.
You can customize our build process by adding abuild_customization.pymodule to your connector.
This module should containpre_connector_installandpost_connector_installasync functions that will mutate the base image and the connector container respectively.
It will be imported at runtime by our build process and the functions will be called if they exist.Here is an example of abuild_customization.pymodule:

```python
from __future__ import annotations

from typing import TYPE_CHECKING

if TYPE_CHECKING:
    from dagger import Container


async def pre_connector_install(base_image_container: Container) -> Container:
    return await base_image_container.with_env_variable("MY_PRE_BUILD_ENV_VAR", "my_pre_build_env_var_value")


async def post_connector_install(connector_container: Container) -> Container:
    return await connector_container.with_env_variable("MY_POST_BUILD_ENV_VAR", "my_post_build_env_var_value")
```

This connector is built using our dynamic build process inairbyte-ci.
The base image used to build it is defined within the metadata.yaml file under theconnectorBuildOptions.
The build logic is defined usingDaggerhere.
It does not rely on a Dockerfile.If you would like to patch our connector and build your own, a simple approach would be to:Create your own Dockerfile based on the latest version of the connector image.

```dockerfile
FROM airbyte/source-todoist:latest
COPY . ./airbyte/integration_code
RUN pip install ./airbyte/integration_code
```

Please use this as an example. This is not optimized.Build your image:docker build -t airbyte/source-todoist:dev .
docker run airbyte/source-todoist:dev spec
Then run any of the connector commands as follows:docker run --rm airbyte/source-todoist:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-todoist:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-todoist:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-todoist:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.jsonCustomizeacceptance-test-config.ymlfile to configure tests. SeeConnector Acceptance Tests(https://docs.airbyte.com/connector-development/testing-connectors/connector-acceptance-tests-reference) for more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py.
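For example, a fixture in integration_tests/acceptance.py might look like the following sketch; the resource being set up is a hypothetical stand-in, but the session-scoped pytest fixture is the usual shape:

```python
# integration_tests/acceptance.py - a minimal sketch, assuming pytest.
import pytest

# Hooks the acceptance test suite into pytest, as done across the Airbyte repo.
pytest_plugins = ("connector_acceptance_test.plugin",)


@pytest.fixture(scope="session", autouse=True)
def connector_setup():
    """Create resources before the acceptance tests run and clean up after."""
    resource = {"created": True}  # e.g. create a test record via the API
    yield resource
    resource.clear()  # e.g. delete the test record afterwards
```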
Please run acceptance tests viaairbyte-ci(https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md#connectors-test-command):

```bash
airbyte-ci connectors --name source-todoist test
```

All of your dependencies should go insetup.py, NOTrequirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies into two groups: dependencies required for your connector to work go in theMAIN_REQUIREMENTSlist, and dependencies required for testing go in theTEST_REQUIREMENTSlist.You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?Make sure your changes are passing our test suite:airbyte-ci connectors --name=source-todoist testBump the connector version inmetadata.yaml: increment thedockerImageTagvalue. Please followsemantic versioning for connectors.Make sure themetadata.yamlcontent is up to date.Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/todoist.md).Create a Pull Request: useour PR naming conventions.Pat yourself on the back for being an awesome contributor.Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
airbyte-source-toggl | Toggl SourceThis is the repository for the Toggl configuration based source connector.
For information about how to use this connector within Airbyte, seethe documentation.If you are a community contributor, follow the instructions in thedocumentationto generate the necessary credentials. Then create a filesecrets/config.jsonconforming to thesource_toggl/spec.yamlfile.
Note that any directory namedsecretsis gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information.
Seeintegration_tests/sample_config.jsonfor a sample config file.If you are an Airbyte core member, copy the credentials in Lastpass under the secret namesource toggl test credsand place them intosecrets/config.json.Viaairbyte-ci(recommended):airbyte-ciconnectors--name=source-togglbuildAn image will be built with the tagairbyte/source-toggl:dev.Viadocker build:dockerbuild-tairbyte/source-toggl:dev.Then run any of the connector commands as follows:docker run --rm airbyte/source-toggl:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-toggl:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-toggl:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-toggl:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.jsonYou can run our full test suite locally usingairbyte-ci:airbyte-ciconnectors--name=source-toggltestCustomizeacceptance-test-config.ymlfile to configure tests. SeeConnector Acceptance Testsfor more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py.All of your dependencies should go insetup.py, NOTrequirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies into two groups: dependencies required for your connector to work go in theMAIN_REQUIREMENTSlist, and dependencies required for testing go in theTEST_REQUIREMENTSlist.You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?Make sure your changes are passing our test suite:airbyte-ci connectors --name=source-toggl testBump the connector version inmetadata.yaml: increment thedockerImageTagvalue. Please followsemantic versioning for connectors.Make sure themetadata.yamlcontent is up to date.Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/toggl.md).Create a Pull Request: useour PR naming conventions.Pat yourself on the back for being an awesome contributor.Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
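Because every connector command above emits newline-delimited Airbyte protocol messages on stdout, the output of a read can be post-processed with a few lines of Python. A hypothetical helper that counts records per stream when the read output is piped into it:

```python
# count_records.py - hypothetical helper: pipe `docker run ... read ...` into it.
import json
import sys
from collections import Counter

counts = Counter()
for line in sys.stdin:
    line = line.strip()
    if not line:
        continue
    try:
        message = json.loads(line)
    except json.JSONDecodeError:
        continue  # skip any non-JSON lines
    if message.get("type") == "RECORD":
        counts[message["record"]["stream"]] += 1

for stream, count in counts.most_common():
    print(f"{stream}: {count}")
```

Usage might look like: docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-toggl:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json | python count_records.py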
airbyte-source-tplcentral | 3PL Central SourceThis is the repository for the 3PL Central source connector, written in Python.
For information about how to use this connector within Airbyte, seethe documentation.To iterate on this connector, make sure to complete this prerequisites section.From this connector directory, create a virtual environment:python -m venv .venvThis will generate a virtualenv for this module in.venv/. Make sure this venv is active in your
development environment of choice. To activate it from the terminal, run:source .venv/bin/activate
pip install -r requirements.txt
pip install '.[tests]'If you are in an IDE, follow your IDE's instructions to activate the virtualenv.Note that while we are installing dependencies fromrequirements.txt, you should only editsetup.pyfor your dependencies.requirements.txtis
used for editable installs (pip install -e) to pull in Python dependencies from the monorepo and will callsetup.py.
If this is mumbo jumbo to you, don't worry about it: just put your deps insetup.pybut install usingpip install -r requirements.txtand everything
should work as you expect.If you are a community contributor, follow the instructions in thedocumentationto generate the necessary credentials. Then create a filesecrets/config.jsonconforming to thesource_tplcentral/spec.jsonfile.
Note that any directory namedsecretsis gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information.
Seeintegration_tests/sample_config.jsonfor a sample config file.If you are an Airbyte core member, copy the credentials in Lastpass under the secret namesource tplcentral test credsand place them intosecrets/config.json.python main.py spec
python main.py check --config secrets/config.json
python main.py discover --config secrets/config.json
python main.py read --config secrets/config.json --catalog integration_tests/configured_catalog.jsonViaairbyte-ci(recommended):airbyte-ciconnectors--name=source-tplcentralbuildAn image will be built with the tagairbyte/source-tplcentral:dev.Viadocker build:dockerbuild-tairbyte/source-tplcentral:dev.Then run any of the connector commands as follows:docker run --rm airbyte/source-tplcentral:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-tplcentral:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-tplcentral:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-tplcentral:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.jsonYou can run our full test suite locally usingairbyte-ci:airbyte-ciconnectors--name=source-tplcentraltestCustomizeacceptance-test-config.ymlfile to configure tests. SeeConnector Acceptance Testsfor more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py.All of your dependencies should go insetup.py, NOTrequirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies into two groups: dependencies required for your connector to work go in theMAIN_REQUIREMENTSlist, and dependencies required for testing go in theTEST_REQUIREMENTSlist.You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?Make sure your changes are passing our test suite:airbyte-ci connectors --name=source-tplcentral testBump the connector version inmetadata.yaml: increment thedockerImageTagvalue. Please followsemantic versioning for connectors.Make sure themetadata.yamlcontent is up to date.Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/tplcentral.md).Create a Pull Request: useour PR naming conventions.Pat yourself on the back for being an awesome contributor.Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
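The --catalog argument used in the read commands above points at a ConfiguredAirbyteCatalog. If you need to hand-craft one, the following sketch writes a minimal single-stream catalog; the stream name and schema are placeholders, so adapt them to what discover reports:

```python
# make_catalog.py - hypothetical helper; "my_stream" is a placeholder name.
import json

catalog = {
    "streams": [
        {
            "stream": {
                "name": "my_stream",  # replace with a stream from `discover`
                "json_schema": {},  # empty schema: accept any record shape
                "supported_sync_modes": ["full_refresh"],
            },
            "sync_mode": "full_refresh",
            "destination_sync_mode": "overwrite",
        }
    ]
}

with open("integration_tests/configured_catalog.json", "w") as f:
    json.dump(catalog, f, indent=2)
```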
airbyte-source-trello | Trello SourceThis is the repository for the Trello configuration based source connector.
For information about how to use this connector within Airbyte, seethe documentation.If you are a community contributor, follow the instructions in thedocumentationto generate the necessary credentials. Then create a filesecrets/config.jsonconforming to thesource_trello/spec.yamlfile.
Note that any directory namedsecretsis gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information.
Seeintegration_tests/sample_config.jsonfor a sample config file.If you are an Airbyte core member, copy the credentials in Lastpass under the secret namesource trello test credsand place them intosecrets/config.json.Viaairbyte-ci(recommended):airbyte-ciconnectors--name=source-trellobuildAn image will be built with the tagairbyte/source-trello:dev.Viadocker build:dockerbuild-tairbyte/source-trello:dev.Then run any of the connector commands as follows:docker run --rm airbyte/source-trello:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-trello:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-trello:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-trello:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.jsonYou can run our full test suite locally usingairbyte-ci:airbyte-ciconnectors--name=source-trellotestCustomizeacceptance-test-config.ymlfile to configure tests. SeeConnector Acceptance Testsfor more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py.All of your dependencies should go insetup.py, NOTrequirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies into two groups: dependencies required for your connector to work go in theMAIN_REQUIREMENTSlist, and dependencies required for testing go in theTEST_REQUIREMENTSlist.You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?Make sure your changes are passing our test suite:airbyte-ci connectors --name=source-trello testBump the connector version inmetadata.yaml: increment thedockerImageTagvalue. Please followsemantic versioning for connectors.Make sure themetadata.yamlcontent is up to date.Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/trello.md).Create a Pull Request: useour PR naming conventions.Pat yourself on the back for being an awesome contributor.Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
airbyte-source-trustpilot | Trustpilot SourceThis is the repository for the Trustpilot source connector, written in Python.
For information about how to use this connector within Airbyte, seethe documentation.To iterate on this connector, make sure to complete this prerequisites section.From this connector directory, create a virtual environment:python -m venv .venvThis will generate a virtualenv for this module in.venv/. Make sure this venv is active in your
development environment of choice. To activate it from the terminal, run:source .venv/bin/activate
pip install -r requirements.txt
pip install '.[tests]'If you are in an IDE, follow your IDE's instructions to activate the virtualenv.Note that while we are installing dependencies fromrequirements.txt, you should only editsetup.pyfor your dependencies.requirements.txtis
used for editable installs (pip install -e) to pull in Python dependencies from the monorepo and will callsetup.py.
If this is mumbo jumbo to you, don't worry about it: just put your deps insetup.pybut install usingpip install -r requirements.txtand everything
should work as you expect.If you are a community contributor, follow the instructions in thedocumentationto generate the necessary credentials. Then create a filesecrets/config.jsonconforming to thesource_trustpilot/spec.yamlfile.
Note that any directory namedsecretsis gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information.
Seeintegration_tests/sample_config.jsonfor a sample config file.If you are an Airbyte core member, copy the credentials in Lastpass under the secret namesource trustpilot test credsand place them intosecrets/config.json.python main.py spec
python main.py check --config secrets/config.json
python main.py discover --config secrets/config.json
python main.py read --config secrets/config.json --catalog integration_tests/configured_catalog.jsonViaairbyte-ci(recommended):airbyte-ciconnectors--name=source-trustpilotbuildAn image will be built with the tagairbyte/source-trustpilot:dev.Viadocker build:dockerbuild-tairbyte/source-trustpilot:dev.Then run any of the connector commands as follows:docker run --rm airbyte/source-trustpilot:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-trustpilot:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-trustpilot:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-trustpilot:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.jsonYou can run our full test suite locally usingairbyte-ci:airbyte-ciconnectors--name=source-trustpilottestCustomizeacceptance-test-config.ymlfile to configure tests. SeeConnector Acceptance Testsfor more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py.All of your dependencies should go insetup.py, NOTrequirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies into two groups: dependencies required for your connector to work go in theMAIN_REQUIREMENTSlist, and dependencies required for testing go in theTEST_REQUIREMENTSlist.You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?Make sure your changes are passing our test suite:airbyte-ci connectors --name=source-trustpilot testBump the connector version inmetadata.yaml: increment thedockerImageTagvalue. Please followsemantic versioning for connectors.Make sure themetadata.yamlcontent is up to date.Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/trustpilot.md).Create a Pull Request: useour PR naming conventions.Pat yourself on the back for being an awesome contributor.Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
airbyte-source-tvmaze-schedule | Tvmaze Schedule SourceThis is the repository for the Tvmaze Schedule configuration based source connector.
For information about how to use this connector within Airbyte, seethe documentation.If you are a community contributor, follow the instructions in thedocumentationto generate the necessary credentials. Then create a filesecrets/config.jsonconforming to thesource_tvmaze_schedule/spec.yamlfile.
Note that any directory namedsecretsis gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information.
Seeintegration_tests/sample_config.jsonfor a sample config file.If you are an Airbyte core member, copy the credentials in Lastpass under the secret namesource tvmaze-schedule test credsand place them intosecrets/config.json.Viaairbyte-ci(recommended):airbyte-ciconnectors--name=source-tvmaze-schedulebuildAn image will be built with the tagairbyte/source-tvmaze-schedule:dev.Viadocker build:dockerbuild-tairbyte/source-tvmaze-schedule:dev.Then run any of the connector commands as follows:docker run --rm airbyte/source-tvmaze-schedule:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-tvmaze-schedule:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-tvmaze-schedule:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-tvmaze-schedule:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.jsonYou can run our full test suite locally usingairbyte-ci:airbyte-ciconnectors--name=source-tvmaze-scheduletestCustomizeacceptance-test-config.ymlfile to configure tests. SeeConnector Acceptance Testsfor more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py.All of your dependencies should go insetup.py, NOTrequirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies into two groups: dependencies required for your connector to work go in theMAIN_REQUIREMENTSlist, and dependencies required for testing go in theTEST_REQUIREMENTSlist.You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?Make sure your changes are passing our test suite:airbyte-ci connectors --name=source-tvmaze-schedule testBump the connector version inmetadata.yaml: increment thedockerImageTagvalue. Please followsemantic versioning for connectors.Make sure themetadata.yamlcontent is up to date.Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/tvmaze-schedule.md).Create a Pull Request: useour PR naming conventions.Pat yourself on the back for being an awesome contributor.Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
airbyte-source-twilio | Twilio source connectorThis is the repository for the Twilio source connector, written in Python.
For information about how to use this connector within Airbyte, seethe documentation.Local developmentPrerequisitesPython (~=3.9)Poetry (~=1.7) - installation instructionshereInstalling the connectorFrom this connector directory, run:poetryinstall--withdevCreate credentialsIf you are a community contributor, follow the instructions in thedocumentationto generate the necessary credentials. Then create a filesecrets/config.jsonconforming to thesource_twilio/spec.yamlfile.
Note that any directory namedsecretsis gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information.
Seesample_files/sample_config.jsonfor a sample config file.Locally running the connectorpoetry run source-twilio spec
poetry run source-twilio check --config secrets/config.json
poetry run source-twilio discover --config secrets/config.json
poetry run source-twilio read --config secrets/config.json --catalog sample_files/configured_catalog.jsonRunning unit testsTo run unit tests locally, from the connector directory run:poetry run pytest unit_testsBuilding the docker imageInstallairbyte-ciRun the following command to build the docker image:airbyte-ciconnectors--name=source-twiliobuildAn image will be available on your host with the tagairbyte/source-twilio:dev.Running as a docker containerThen run any of the connector commands as follows:docker run --rm airbyte/source-twilio:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-twilio:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-twilio:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-twilio:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.jsonRunning our CI test suiteYou can run our full test suite locally usingairbyte-ci:airbyte-ciconnectors--name=source-twiliotestCustomizing acceptance TestsCustomizeacceptance-test-config.ymlfile to configure acceptance tests. SeeConnector Acceptance Testsfor more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py.Dependency ManagementAll of your dependencies should be managed via Poetry.
To add a new dependency, run:poetryadd<package-name>Please commit the changes topyproject.tomlandpoetry.lockfiles.Publishing a new version of the connectorYou've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?Make sure your changes are passing our test suite:airbyte-ci connectors --name=source-twilio testBump the connector version (please followsemantic versioning for connectors):bump thedockerImageTagvalue inmetadata.yamlbump theversionvalue inpyproject.tomlMake sure themetadata.yamlcontent is up to date.Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/twilio.md).Create a Pull Request: useour PR naming conventions.Pat yourself on the back for being an awesome contributor.Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. |
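As a complement to poetry run pytest unit_tests, a unit test can exercise the source object directly. A minimal sketch; SourceTwilio and its module path follow the usual Airbyte CDK naming convention and are assumptions, not confirmed by this README:

```python
# unit_tests/test_spec.py - minimal sketch; class/module names are assumed.
import logging

from source_twilio.source import SourceTwilio


def test_spec_exposes_a_connection_specification():
    spec = SourceTwilio().spec(logging.getLogger("airbyte"))
    # connectionSpecification is the JSON Schema describing valid configs.
    assert spec.connectionSpecification is not None
```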
airbyte-source-twilio-taskrouter | Twilio Taskrouter SourceThis is the repository for the Twilio Taskrouter configuration based source connector.
For information about how to use this connector within Airbyte, seethe documentation.If you are a community contributor, follow the instructions in thedocumentationto generate the necessary credentials. Then create a filesecrets/config.jsonconforming to thesource_twilio_taskrouter/spec.yamlfile.
Note that any directory namedsecretsis gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information.
Seeintegration_tests/sample_config.jsonfor a sample config file.If you are an Airbyte core member, copy the credentials in Lastpass under the secret namesource twilio-taskrouter test credsand place them intosecrets/config.json.Viaairbyte-ci(recommended):airbyte-ciconnectors--name=source-twilio-taskrouterbuildAn image will be built with the tagairbyte/source-twilio-taskrouter:dev.Viadocker build:dockerbuild-tairbyte/source-twilio-taskrouter:dev.Then run any of the connector commands as follows:docker run --rm airbyte/source-twilio-taskrouter:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-twilio-taskrouter:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-twilio-taskrouter:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-twilio-taskrouter:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.jsonYou can run our full test suite locally usingairbyte-ci:airbyte-ciconnectors--name=source-twilio-taskroutertestCustomizeacceptance-test-config.ymlfile to configure tests. SeeConnector Acceptance Testsfor more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py.All of your dependencies should go insetup.py, NOTrequirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies into two groups: dependencies required for your connector to work go in theMAIN_REQUIREMENTSlist, and dependencies required for testing go in theTEST_REQUIREMENTSlist.You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?Make sure your changes are passing our test suite:airbyte-ci connectors --name=source-twilio-taskrouter testBump the connector version inmetadata.yaml: increment thedockerImageTagvalue. Please followsemantic versioning for connectors.Make sure themetadata.yamlcontent is up to date.Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/twilio-taskrouter.md).Create a Pull Request: useour PR naming conventions.Pat yourself on the back for being an awesome contributor.Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
airbyte-source-twitter | Twitter SourceThis is the repository for the Twitter configuration based source connector.
For information about how to use this connector within Airbyte, seethe documentation.If you are a community contributor, follow the instructions in thedocumentationto generate the necessary credentials. Then create a filesecrets/config.jsonconforming to thesource_twitter/spec.yamlfile.
Note that any directory namedsecretsis gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information.
Seeintegration_tests/sample_config.jsonfor a sample config file.If you are an Airbyte core member, copy the credentials in Lastpass under the secret namesource twitter test credsand place them intosecrets/config.json.Viaairbyte-ci(recommended):airbyte-ciconnectors--name=source-twitterbuildAn image will be built with the tagairbyte/source-twitter:dev.Viadocker build:dockerbuild-tairbyte/source-twitter:dev.Then run any of the connector commands as follows:docker run --rm airbyte/source-twitter:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-twitter:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-twitter:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-twitter:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.jsonYou can run our full test suite locally usingairbyte-ci:airbyte-ciconnectors--name=source-twittertestCustomizeacceptance-test-config.ymlfile to configure tests. SeeConnector Acceptance Testsfor more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py.All of your dependencies should go insetup.py, NOTrequirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies into two groups: dependencies required for your connector to work go in theMAIN_REQUIREMENTSlist, and dependencies required for testing go in theTEST_REQUIREMENTSlist.You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?Make sure your changes are passing our test suite:airbyte-ci connectors --name=source-twitter testBump the connector version inmetadata.yaml: increment thedockerImageTagvalue. Please followsemantic versioning for connectors.Make sure themetadata.yamlcontent is up to date.Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/twitter.md).Create a Pull Request: useour PR naming conventions.Pat yourself on the back for being an awesome contributor.Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
airbyte-source-tyntec-sms | Tyntec Sms SourceThis is the repository for the Tyntec Sms configuration based source connector.
For information about how to use this connector within Airbyte, seethe documentation.If you are a community contributor, follow the instructions in thedocumentationto generate the necessary credentials. Then create a filesecrets/config.jsonconforming to thesource_tyntec_sms/spec.yamlfile.
Note that any directory namedsecretsis gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information.
Seeintegration_tests/sample_config.jsonfor a sample config file.If you are an Airbyte core member, copy the credentials in Lastpass under the secret namesource tyntec-sms test credsand place them intosecrets/config.json.Viaairbyte-ci(recommended):airbyte-ciconnectors--name=source-tyntec-smsbuildAn image will be built with the tagairbyte/source-tyntec-sms:dev.Viadocker build:dockerbuild-tairbyte/source-tyntec-sms:dev.Then run any of the connector commands as follows:docker run --rm airbyte/source-tyntec-sms:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-tyntec-sms:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-tyntec-sms:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-tyntec-sms:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.jsonYou can run our full test suite locally usingairbyte-ci:airbyte-ciconnectors--name=source-tyntec-smstestCustomizeacceptance-test-config.ymlfile to configure tests. SeeConnector Acceptance Testsfor more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py.All of your dependencies should go insetup.py, NOTrequirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies into two groups: dependencies required for your connector to work go in theMAIN_REQUIREMENTSlist, and dependencies required for testing go in theTEST_REQUIREMENTSlist.You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?Make sure your changes are passing our test suite:airbyte-ci connectors --name=source-tyntec-sms testBump the connector version inmetadata.yaml: increment thedockerImageTagvalue. Please followsemantic versioning for connectors.Make sure themetadata.yamlcontent is up to date.Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/tyntec-sms.md).Create a Pull Request: useour PR naming conventions.Pat yourself on the back for being an awesome contributor.Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
airbyte-source-typeform | Typeform source connectorThis is the repository for the Typeform source connector, written in Python.
For information about how to use this connector within Airbyte, seethe documentation.Local developmentPrerequisitesPython (~=3.9)Poetry (~=1.7) - installation instructionshereInstalling the connectorFrom this connector directory, run:poetryinstall--withdevCreate credentialsIf you are a community contributor, follow the instructions in thedocumentationto generate the necessary credentials. Then create a filesecrets/config.jsonconforming to thesource_typeform/spec.yamlfile.
Note that any directory namedsecretsis gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information.
Seesample_files/sample_config.jsonfor a sample config file.Locally running the connectorpoetry run source-typeform spec
poetry run source-typeform check --config secrets/config.json
poetry run source-typeform discover --config secrets/config.json
poetry run source-typeform read --config secrets/config.json --catalog sample_files/configured_catalog.jsonRunning unit testsTo run unit tests locally, from the connector directory run:poetry run pytest unit_testsBuilding the docker imageInstallairbyte-ciRun the following command to build the docker image:airbyte-ciconnectors--name=source-typeformbuildAn image will be available on your host with the tagairbyte/source-typeform:dev.Running as a docker containerThen run any of the connector commands as follows:docker run --rm airbyte/source-typeform:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-typeform:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-typeform:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-typeform:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.jsonRunning our CI test suiteYou can run our full test suite locally usingairbyte-ci:airbyte-ciconnectors--name=source-typeformtestCustomizing acceptance TestsCustomizeacceptance-test-config.ymlfile to configure acceptance tests. SeeConnector Acceptance Testsfor more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py.Dependency ManagementAll of your dependencies should be managed via Poetry.
To add a new dependency, run:poetryadd<package-name>Please commit the changes topyproject.tomlandpoetry.lockfiles.Publishing a new version of the connectorYou've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?Make sure your changes are passing our test suite:airbyte-ci connectors --name=source-typeform testBump the connector version (please followsemantic versioning for connectors):bump thedockerImageTagvalue inmetadata.yamlbump theversionvalue inpyproject.tomlMake sure themetadata.yamlcontent is up to date.Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/typeform.md).Create a Pull Request: useour PR naming conventions.Pat yourself on the back for being an awesome contributor.Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. |
airbyte-source-us-census | Us Census SourceThis is the repository for the Us Census source connector, written in Python.
For information about how to use this connector within Airbyte, seethe documentation.To iterate on this connector, make sure to complete this prerequisites section.From this connector directory, create a virtual environment:python -m venv .venvThis will generate a virtualenv for this module in.venv/. Make sure this venv is active in your
development environment of choice. To activate it from the terminal, run:source .venv/bin/activate
pip install -r requirements.txtIf you are in an IDE, follow your IDE's instructions to activate the virtualenv.Note that while we are installing dependencies fromrequirements.txt, you should only editsetup.pyfor your dependencies.requirements.txtis
used for editable installs (pip install -e) to pull in Python dependencies from the monorepo and will callsetup.py.
If this is mumbo jumbo to you, don't worry about it: just put your deps insetup.pybut install usingpip install -r requirements.txtand everything
should work as you expect.If you are a community contributor, follow the instructions in thedocumentationto generate the necessary credentials. Then create a filesecrets/config.jsonconforming to thesource_us_census/spec.jsonfile.
Note that any directory namedsecretsis gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information.
Seeintegration_tests/sample_config.jsonfor a sample config file.If you are an Airbyte core member, copy the credentials in Lastpass under the secret namesource us-census test credsand place them intosecrets/config.json.python main.py spec
python main.py check --config secrets/config.json
python main.py discover --config secrets/config.json
python main.py read --config secrets/config.json --catalog integration_tests/configured_catalog.jsonViaairbyte-ci(recommended):airbyte-ciconnectors--name=source-us-censusbuildAn image will be built with the tagairbyte/source-us-census:dev.Viadocker build:dockerbuild-tairbyte/source-us-census:dev.Then run any of the connector commands as follows:docker run --rm airbyte/source-us-census:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-us-census:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-us-census:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-us-census:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.jsonYou can run our full test suite locally usingairbyte-ci:airbyte-ciconnectors--name=source-us-censustestCustomizeacceptance-test-config.ymlfile to configure tests. SeeConnector Acceptance Testsfor more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py.All of your dependencies should go insetup.py, NOTrequirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies into two groups: dependencies required for your connector to work go in theMAIN_REQUIREMENTSlist, and dependencies required for testing go in theTEST_REQUIREMENTSlist.You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?Make sure your changes are passing our test suite:airbyte-ci connectors --name=source-us-census testBump the connector version inmetadata.yaml: increment thedockerImageTagvalue. Please followsemantic versioning for connectors.Make sure themetadata.yamlcontent is up to date.Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/us-census.md).Create a Pull Request: useour PR naming conventions.Pat yourself on the back for being an awesome contributor.Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
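Since python main.py check prints an Airbyte CONNECTION_STATUS message, a tiny wrapper can turn it into a pass/fail smoke test. A hypothetical sketch, run from the connector directory:

```python
# smoke_check.py - hypothetical wrapper around `python main.py check`.
import json
import subprocess
import sys

out = subprocess.run(
    ["python", "main.py", "check", "--config", "secrets/config.json"],
    capture_output=True, text=True, check=True,
).stdout

for line in out.splitlines():
    try:
        message = json.loads(line)
    except json.JSONDecodeError:
        continue  # skip any non-JSON lines
    if message.get("type") == "CONNECTION_STATUS":
        status = message["connectionStatus"]["status"]  # SUCCEEDED or FAILED
        print(f"check returned: {status}")
        sys.exit(0 if status == "SUCCEEDED" else 1)

print("no CONNECTION_STATUS message found")
sys.exit(1)
```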
airbyte-source-vantage | Vantage SourceThis is the repository for the Vantage configuration based source connector.
For information about how to use this connector within Airbyte, seethe documentation.If you are a community contributor, follow the instructions in thedocumentationto generate the necessary credentials. Then create a filesecrets/config.jsonconforming to thesource_vantage/spec.yamlfile.
Note that any directory namedsecretsis gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information.
Seeintegration_tests/sample_config.jsonfor a sample config file.If you are an Airbyte core member, copy the credentials in Lastpass under the secret namesource vantage test credsand place them intosecrets/config.json.Viaairbyte-ci(recommended):airbyte-ciconnectors--name=source-vantagebuildAn image will be built with the tagairbyte/source-vantage:dev.Viadocker build:dockerbuild-tairbyte/source-vantage:dev.Then run any of the connector commands as follows:docker run --rm airbyte/source-vantage:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-vantage:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-vantage:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-vantage:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.jsonYou can run our full test suite locally usingairbyte-ci:airbyte-ciconnectors--name=source-vantagetestCustomizeacceptance-test-config.ymlfile to configure tests. SeeConnector Acceptance Testsfor more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py.All of your dependencies should go insetup.py, NOTrequirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies into two groups: dependencies required for your connector to work go in theMAIN_REQUIREMENTSlist, and dependencies required for testing go in theTEST_REQUIREMENTSlist.You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?Make sure your changes are passing our test suite:airbyte-ci connectors --name=source-vantage testBump the connector version inmetadata.yaml: increment thedockerImageTagvalue. Please followsemantic versioning for connectors.Make sure themetadata.yamlcontent is up to date.Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/vantage.md).Create a Pull Request: useour PR naming conventions.Pat yourself on the back for being an awesome contributor.Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
airbyte-source-visma-economic | Visma Economic SourceThis is the repository for the Visma Economic configuration based source connector.
For information about how to use this connector within Airbyte, seethe documentation.If you are a community contributor, follow the instructions in thedocumentationto generate the necessary credentials. Then create a filesecrets/config.jsonconforming to thesource_visma_economic/spec.yamlfile.
Note that any directory namedsecretsis gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information.
Seeintegration_tests/sample_config.jsonfor a sample config file.If you are an Airbyte core member, copy the credentials in Lastpass under the secret namesource visma-economic test credsand place them intosecrets/config.json.Viaairbyte-ci(recommended):airbyte-ciconnectors--name=source-visma-economicbuildAn image will be built with the tagairbyte/source-visma-economic:dev.Viadocker build:dockerbuild-tairbyte/source-visma-economic:dev.Then run any of the connector commands as follows:docker run --rm airbyte/source-visma-economic:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-visma-economic:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-visma-economic:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-visma-economic:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.jsonYou can run our full test suite locally usingairbyte-ci:airbyte-ciconnectors--name=source-visma-economictestCustomizeacceptance-test-config.ymlfile to configure tests. SeeConnector Acceptance Testsfor more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py.All of your dependencies should go insetup.py, NOTrequirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies into two groups: dependencies required for your connector to work go in theMAIN_REQUIREMENTSlist, and dependencies required for testing go in theTEST_REQUIREMENTSlist.You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?Make sure your changes are passing our test suite:airbyte-ci connectors --name=source-visma-economic testBump the connector version inmetadata.yaml: increment thedockerImageTagvalue. Please followsemantic versioning for connectors.Make sure themetadata.yamlcontent is up to date.Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/visma-economic.md).Create a Pull Request: useour PR naming conventions.Pat yourself on the back for being an awesome contributor.Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
airbyte-source-vitally | Vitally SourceThis is the repository for the Vitally configuration based source connector.
For information about how to use this connector within Airbyte, seethe documentation.If you are a community contributor, follow the instructions in thedocumentationto generate the necessary credentials. Then create a filesecrets/config.jsonconforming to thesource_vitally/spec.yamlfile.
Note that any directory namedsecretsis gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information.
Seeintegration_tests/sample_config.jsonfor a sample config file.If you are an Airbyte core member, copy the credentials in Lastpass under the secret namesource vitally test credsand place them intosecrets/config.json.Viaairbyte-ci(recommended):airbyte-ciconnectors--name=source-vitallybuildAn image will be built with the tagairbyte/source-vitally:dev.Viadocker build:dockerbuild-tairbyte/source-vitally:dev.Then run any of the connector commands as follows:docker run --rm airbyte/source-vitally:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-vitally:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-vitally:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-vitally:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.jsonYou can run our full test suite locally usingairbyte-ci:airbyte-ciconnectors--name=source-vitallytestCustomizeacceptance-test-config.ymlfile to configure tests. SeeConnector Acceptance Testsfor more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py.All of your dependencies should go insetup.py, NOTrequirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies into two groups: dependencies required for your connector to work go in theMAIN_REQUIREMENTSlist, and dependencies required for testing go in theTEST_REQUIREMENTSlist.You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?Make sure your changes are passing our test suite:airbyte-ci connectors --name=source-vitally testBump the connector version inmetadata.yaml: increment thedockerImageTagvalue. Please followsemantic versioning for connectors.Make sure themetadata.yamlcontent is up to date.Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/vitally.md).Create a Pull Request: useour PR naming conventions.Pat yourself on the back for being an awesome contributor.Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
airbyte-source-waiteraid | Waiteraid SourceThis is the repository for the Waiteraid configuration based source connector.
For information about how to use this connector within Airbyte, seethe documentation.To iterate on this connector, make sure to complete this prerequisites section.From this connector directory, create a virtual environment:python -m venv .venvThis will generate a virtualenv for this module in.venv/. Make sure this venv is active in your
development environment of choice. To activate it from the terminal, run:source .venv/bin/activate
pip install -r requirements.txt
pip install '.[tests]'
If you are in an IDE, follow your IDE's instructions to activate the virtualenv. Note that while we are installing dependencies from requirements.txt, you should only edit setup.py for your dependencies. requirements.txt is used for editable installs (pip install -e) to pull in Python dependencies from the monorepo and will call setup.py.
If this is mumbo jumbo to you, don't worry about it: just put your deps in setup.py, but install using pip install -r requirements.txt, and everything should work as you expect.
If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_waiteraid/spec.yaml file.
Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information.
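For context, the python main.py commands below work because a typical Python CDK connector ships a thin main.py that simply hands off to the Airbyte CDK entrypoint. A minimal sketch, assuming the standard layout (the module and class names here are illustrative, not confirmed from this repository):

import sys
from airbyte_cdk.entrypoint import launch
from source_waiteraid import SourceWaiteraid  # assumed module/class name

if __name__ == "__main__":
    # Forwards the CLI verb (spec/check/discover/read) and its flags to the CDK.
    source = SourceWaiteraid()
    launch(source, sys.argv[1:])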
See integration_tests/sample_config.json for a sample config file. If you are an Airbyte core member, copy the credentials in Lastpass under the secret name source waiteraid test creds and place them into secrets/config.json.
python main.py spec
python main.py check --config secrets/config.json
python main.py discover --config secrets/config.json
python main.py read --config secrets/config.json --catalog integration_tests/configured_catalog.json
Via airbyte-ci (recommended): airbyte-ci connectors --name=source-waiteraid build
An image will be built with the tag airbyte/source-waiteraid:dev. Via docker build: docker build -t airbyte/source-waiteraid:dev
Then run any of the connector commands as follows: docker run --rm airbyte/source-waiteraid:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-waiteraid:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-waiteraid:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-waiteraid:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
You can run our full test suite locally using airbyte-ci: airbyte-ci connectors --name=source-waiteraid test
Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py. All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies into two groups: dependencies required for your connector to work go in the MAIN_REQUIREMENTS list, and dependencies required for testing go in the TEST_REQUIREMENTS list. You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-waiteraid test
Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors. Make sure the metadata.yaml content is up to date. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/waiteraid.md). Create a Pull Request: use our PR naming conventions. Pat yourself on the back for being an awesome contributor. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
airbyte-source-weatherstack | Weatherstack Source. This is the repository for the Weatherstack source connector, written in Python.
For information about how to use this connector within Airbyte, see the documentation. To iterate on this connector, make sure to complete this prerequisites section. From this connector directory, create a virtual environment: python -m venv .venv
This will generate a virtualenv for this module in .venv/. Make sure this venv is active in your development environment of choice. To activate it from the terminal, run: source .venv/bin/activate
pip install -r requirements.txt
pip install '.[tests]'
If you are in an IDE, follow your IDE's instructions to activate the virtualenv. Note that while we are installing dependencies from requirements.txt, you should only edit setup.py for your dependencies. requirements.txt is used for editable installs (pip install -e) to pull in Python dependencies from the monorepo and will call setup.py.
If this is mumbo jumbo to you, don't worry about it: just put your deps in setup.py, but install using pip install -r requirements.txt, and everything should work as you expect.
If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_weatherstack/spec.yaml file.
Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information.
See integration_tests/sample_config.json for a sample config file. If you are an Airbyte core member, copy the credentials in Lastpass under the secret name source weatherstack test creds and place them into secrets/config.json.
python main.py spec
python main.py check --config secrets/config.json
python main.py discover --config secrets/config.json
python main.py read --config secrets/config.json --catalog integration_tests/configured_catalog.json
Via airbyte-ci (recommended): airbyte-ci connectors --name=source-weatherstack build
An image will be built with the tag airbyte/source-weatherstack:dev. Via docker build: docker build -t airbyte/source-weatherstack:dev
Then run any of the connector commands as follows: docker run --rm airbyte/source-weatherstack:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-weatherstack:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-weatherstack:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-weatherstack:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
You can run our full test suite locally using airbyte-ci: airbyte-ci connectors --name=source-weatherstack test
Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py. All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
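As a rough illustration of the split described next, a minimal setup.py might look like this (the dependency names below are hypothetical placeholders, not this connector's actual requirements):

from setuptools import find_packages, setup

# Runtime dependencies: what the connector needs to work.
MAIN_REQUIREMENTS = ["airbyte-cdk", "requests"]

# Test-only dependencies, installable via pip install '.[tests]'.
TEST_REQUIREMENTS = ["pytest", "requests-mock"]

setup(
    name="source_weatherstack",
    packages=find_packages(),
    install_requires=MAIN_REQUIREMENTS,
    extras_require={"tests": TEST_REQUIREMENTS},
)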
We split dependencies into two groups: dependencies required for your connector to work go in the MAIN_REQUIREMENTS list, and dependencies required for testing go in the TEST_REQUIREMENTS list. You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-weatherstack test
Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors. Make sure the metadata.yaml content is up to date. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/weatherstack.md). Create a Pull Request: use our PR naming conventions. Pat yourself on the back for being an awesome contributor. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
airbyte-source-webflow | Webflow Source. This is the repository for the Webflow source connector, written in Python.
For information about how to use this connector within Airbyte, see the Webflow source documentation. A detailed tutorial has been written about this implementation; see: Build a connector to extract data from the Webflow API. Prerequisite: a Webflow v1 API Key. From this connector directory, create a virtual environment: python -m venv .venv
This will generate a virtualenv for this module in .venv/. Make sure this venv is active in your development environment of choice. To activate it from the terminal, run: source .venv/bin/activate
pip install -r requirements.txt
pip install '.[tests]'
If you are in an IDE, follow your IDE's instructions to activate the virtualenv. Note that while we are installing dependencies from requirements.txt, you should only edit setup.py for your dependencies. requirements.txt is used for editable installs (pip install -e) to pull in Python dependencies from the monorepo and will call setup.py.
If this is mumbo jumbo to you, don't worry about it: just put your deps in setup.py, but install using pip install -r requirements.txt, and everything should work as you expect.
If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_webflow/spec.yaml file.
Note that any directory named secrets is git-ignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information.
See integration_tests/sample_config.json for a sample config file. For more information about creating Webflow credentials, see the documentation. If you are an Airbyte core member, copy the credentials in Lastpass under the secret name source webflow test creds and place them into secrets/config.json.
python main.py spec
python main.py check --config secrets/config.json
python main.py discover --config secrets/config.json
python main.py read --config secrets/config.json --catalog integration_tests/configured_catalog.json
Via airbyte-ci (recommended): airbyte-ci connectors --name=source-webflow build
An image will be built with the tag airbyte/source-webflow:dev. Via docker build: docker build -t airbyte/source-webflow:dev
Then run any of the connector commands as follows: docker run --rm airbyte/source-webflow:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-webflow:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-webflow:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-webflow:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
You can run our full test suite locally using airbyte-ci: airbyte-ci connectors --name=source-webflow test
Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py (a sketch follows below). All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
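A minimal sketch of such an acceptance fixture, assuming a purely local placeholder resource (the setup and teardown bodies are invented for illustration and would be replaced with real resource calls):

import pytest

@pytest.fixture(scope="session", autouse=True)
def connector_setup():
    # Hypothetical setup: provision whatever the acceptance tests need.
    test_resource = {"id": "acceptance-test-record"}  # placeholder resource
    yield test_resource
    # Teardown runs after the acceptance test session completes.
    test_resource.clear()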
We split dependencies into two groups: dependencies required for your connector to work go in the MAIN_REQUIREMENTS list, and dependencies required for testing go in the TEST_REQUIREMENTS list. You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-webflow test
Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors. Make sure the metadata.yaml content is up to date. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/webflow.md). Create a Pull Request: use our PR naming conventions. Pat yourself on the back for being an awesome contributor. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
airbyte-source-whisky-hunter | Whisky Hunter Source. This is the repository for the Whisky Hunter configuration-based source connector.
For information about how to use this connector within Airbyte, see the documentation. If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_whisky_hunter/spec.yaml file.
Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information.
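If you script your local setup, secrets/config.json can also be generated programmatically; a minimal sketch (the field below is a hypothetical placeholder — consult source_whisky_hunter/spec.yaml for the real fields):

import json
import os

os.makedirs("secrets", exist_ok=True)
# Placeholder config: the actual keys are defined by the connector's spec.
with open("secrets/config.json", "w") as f:
    json.dump({"api_key": "YOUR_API_KEY"}, f, indent=2)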
See integration_tests/sample_config.json for a sample config file. If you are an Airbyte core member, copy the credentials in Lastpass under the secret name source whisky-hunter test creds and place them into secrets/config.json.
Via airbyte-ci (recommended): airbyte-ci connectors --name=source-whisky-hunter build
An image will be built with the tag airbyte/source-whisky-hunter:dev. Via docker build: docker build -t airbyte/source-whisky-hunter:dev
Then run any of the connector commands as follows: docker run --rm airbyte/source-whisky-hunter:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-whisky-hunter:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-whisky-hunter:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-whisky-hunter:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
You can run our full test suite locally using airbyte-ci: airbyte-ci connectors --name=source-whisky-hunter test
Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py. All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies into two groups: dependencies required for your connector to work go in the MAIN_REQUIREMENTS list, and dependencies required for testing go in the TEST_REQUIREMENTS list. You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-whisky-hunter test
Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors. Make sure the metadata.yaml content is up to date. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/whisky-hunter.md). Create a Pull Request: use our PR naming conventions. Pat yourself on the back for being an awesome contributor. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
airbyte-source-wikipedia-pageviews | Wikipedia Pageviews Source. This is the repository for the Wikipedia Pageviews configuration-based source connector.
For information about how to use this connector within Airbyte, see the documentation. If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_wikipedia_pageviews/spec.yaml file.
Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information.
See integration_tests/sample_config.json for a sample config file. If you are an Airbyte core member, copy the credentials in Lastpass under the secret name source wikipedia-pageviews test creds and place them into secrets/config.json.
Via airbyte-ci (recommended): airbyte-ci connectors --name=source-wikipedia-pageviews build
An image will be built with the tag airbyte/source-wikipedia-pageviews:dev. Via docker build: docker build -t airbyte/source-wikipedia-pageviews:dev
Then run any of the connector commands as follows: docker run --rm airbyte/source-wikipedia-pageviews:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-wikipedia-pageviews:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-wikipedia-pageviews:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-wikipedia-pageviews:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
You can run our full test suite locally using airbyte-ci: airbyte-ci connectors --name=source-wikipedia-pageviews test
Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py. All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies into two groups: dependencies required for your connector to work go in the MAIN_REQUIREMENTS list, and dependencies required for testing go in the TEST_REQUIREMENTS list. You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-wikipedia-pageviews test
Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors. Make sure the metadata.yaml content is up to date. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/wikipedia-pageviews.md). Create a Pull Request: use our PR naming conventions. Pat yourself on the back for being an awesome contributor. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
airbyte-source-woocommerce | Woocommerce Source. This is the repository for the Woocommerce configuration-based source connector.
For information about how to use this connector within Airbyte, see the documentation. If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_woocommerce/spec.yaml file.
Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information.
See integration_tests/sample_config.json for a sample config file. If you are an Airbyte core member, copy the credentials in Lastpass under the secret name source woocommerce test creds and place them into secrets/config.json.
Via airbyte-ci (recommended): airbyte-ci connectors --name=source-woocommerce build
An image will be built with the tag airbyte/source-woocommerce:dev. Via docker build: docker build -t airbyte/source-woocommerce:dev
Then run any of the connector commands as follows: docker run --rm airbyte/source-woocommerce:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-woocommerce:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-woocommerce:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-woocommerce:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
You can run our full test suite locally using airbyte-ci: airbyte-ci connectors --name=source-woocommerce test
Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py. All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies into two groups: dependencies required for your connector to work go in the MAIN_REQUIREMENTS list, and dependencies required for testing go in the TEST_REQUIREMENTS list. You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-woocommerce test
Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors. Make sure the metadata.yaml content is up to date. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/woocommerce.md). Create a Pull Request: use our PR naming conventions. Pat yourself on the back for being an awesome contributor. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
airbyte-source-workable | Workable Source. This is the repository for the Workable configuration-based source connector.
For information about how to use this connector within Airbyte, see the documentation. If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_workable/spec.yaml file.
Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information.
See integration_tests/sample_config.json for a sample config file. If you are an Airbyte core member, copy the credentials in Lastpass under the secret name source workable test creds and place them into secrets/config.json.
Via airbyte-ci (recommended): airbyte-ci connectors --name=source-workable build
An image will be built with the tag airbyte/source-workable:dev. Via docker build: docker build -t airbyte/source-workable:dev
Then run any of the connector commands as follows: docker run --rm airbyte/source-workable:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-workable:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-workable:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-workable:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
You can run our full test suite locally using airbyte-ci: airbyte-ci connectors --name=source-workable test
Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py. All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies into two groups: dependencies required for your connector to work go in the MAIN_REQUIREMENTS list, and dependencies required for testing go in the TEST_REQUIREMENTS list. You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-workable test
Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors. Make sure the metadata.yaml content is up to date. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/workable.md). Create a Pull Request: use our PR naming conventions. Pat yourself on the back for being an awesome contributor. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
airbyte-source-workramp | Workramp Source. This is the repository for the Workramp configuration-based source connector.
For information about how to use this connector within Airbyte, see the documentation. If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_workramp/spec.yaml file.
Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information.
See integration_tests/sample_config.json for a sample config file. If you are an Airbyte core member, copy the credentials in Lastpass under the secret name source workramp test creds and place them into secrets/config.json.
Via airbyte-ci (recommended): airbyte-ci connectors --name=source-workramp build
An image will be built with the tag airbyte/source-workramp:dev. Via docker build: docker build -t airbyte/source-workramp:dev
Then run any of the connector commands as follows: docker run --rm airbyte/source-workramp:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-workramp:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-workramp:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-workramp:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
You can run our full test suite locally using airbyte-ci: airbyte-ci connectors --name=source-workramp test
Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py. All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies into two groups: dependencies required for your connector to work go in the MAIN_REQUIREMENTS list, and dependencies required for testing go in the TEST_REQUIREMENTS list. You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-workramp test
Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors. Make sure the metadata.yaml content is up to date. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/workramp.md). Create a Pull Request: use our PR naming conventions. Pat yourself on the back for being an awesome contributor. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
airbyte-source-wrike | Wrike Source. This is the repository for the Wrike configuration-based source connector.
For information about how to use this connector within Airbyte, see the documentation. If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_wrike/spec.yaml file.
Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information.
See integration_tests/sample_config.json for a sample config file. If you are an Airbyte core member, copy the credentials in Lastpass under the secret name source wrike test creds and place them into secrets/config.json.
Via airbyte-ci (recommended): airbyte-ci connectors --name=source-wrike build
An image will be built with the tag airbyte/source-wrike:dev. Via docker build: docker build -t airbyte/source-wrike:dev
Then run any of the connector commands as follows: docker run --rm airbyte/source-wrike:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-wrike:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-wrike:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-wrike:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
You can run our full test suite locally using airbyte-ci: airbyte-ci connectors --name=source-wrike test
Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py. All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies into two groups: dependencies required for your connector to work go in the MAIN_REQUIREMENTS list, and dependencies required for testing go in the TEST_REQUIREMENTS list. You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-wrike test
Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors. Make sure the metadata.yaml content is up to date. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/wrike.md). Create a Pull Request: use our PR naming conventions. Pat yourself on the back for being an awesome contributor. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
airbyte-source-xero | Xero Source. This is the repository for the Xero source connector, written in Python.
For information about how to use this connector within Airbyte, see the documentation. To iterate on this connector, make sure to complete this prerequisites section. From this connector directory, create a virtual environment: python -m venv .venv
This will generate a virtualenv for this module in .venv/. Make sure this venv is active in your development environment of choice. To activate it from the terminal, run: source .venv/bin/activate
pip install -r requirements.txt
pip install '.[tests]'
If you are in an IDE, follow your IDE's instructions to activate the virtualenv. Note that while we are installing dependencies from requirements.txt, you should only edit setup.py for your dependencies. requirements.txt is used for editable installs (pip install -e) to pull in Python dependencies from the monorepo and will call setup.py.
If this is mumbo jumbo to you, don't worry about it: just put your deps in setup.py, but install using pip install -r requirements.txt, and everything should work as you expect.
If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_xero/spec.yaml file.
Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information.
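For orientation, the python main.py check command below ultimately calls the source's check_connection method. A minimal sketch of that shape in a Python CDK source (the class name, config field, and empty stream list are illustrative, not taken from this connector):

from typing import Any, List, Mapping, Optional, Tuple

from airbyte_cdk.sources import AbstractSource
from airbyte_cdk.sources.streams import Stream

class ExampleSource(AbstractSource):
    def check_connection(self, logger, config: Mapping[str, Any]) -> Tuple[bool, Optional[Any]]:
        # Validate credentials cheaply; return (False, reason) on failure.
        if not config.get("access_token"):  # hypothetical config field
            return False, "access_token is missing"
        return True, None

    def streams(self, config: Mapping[str, Any]) -> List[Stream]:
        return []  # a real source returns its configured stream instances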
See integration_tests/sample_config.json for a sample config file. If you are an Airbyte core member, copy the credentials in Lastpass under the secret name source xero test creds and place them into secrets/config.json.
python main.py spec
python main.py check --config secrets/config.json
python main.py discover --config secrets/config.json
python main.py read --config secrets/config.json --catalog integration_tests/configured_catalog.json
Via airbyte-ci (recommended): airbyte-ci connectors --name=source-xero build
An image will be built with the tag airbyte/source-xero:dev. Via docker build: docker build -t airbyte/source-xero:dev
Then run any of the connector commands as follows: docker run --rm airbyte/source-xero:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-xero:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-xero:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-xero:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
You can run our full test suite locally using airbyte-ci: airbyte-ci connectors --name=source-xero test
Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py. All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies into two groups: dependencies required for your connector to work go in the MAIN_REQUIREMENTS list, and dependencies required for testing go in the TEST_REQUIREMENTS list. You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-xero test
Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors. Make sure the metadata.yaml content is up to date. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/xero.md). Create a Pull Request: use our PR naming conventions. Pat yourself on the back for being an awesome contributor. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
airbyte-source-xkcd | XKCD Source. This is the repository for the XKCD source connector, written in Python.
For information about how to use this connector within Airbyte, see the documentation. To iterate on this connector, make sure to complete this prerequisites section. From this connector directory, create a virtual environment: python3 -m venv .venv
This will generate a virtualenv for this module in .venv/. Make sure this venv is active in your development environment of choice. To activate it from the terminal, run: source .venv/bin/activate
pip install -r requirements.txt
pip install '.[tests]'
If you are in an IDE, follow your IDE's instructions to activate the virtualenv. Note that while we are installing dependencies from requirements.txt, you should only edit setup.py for your dependencies. requirements.txt is used for editable installs (pip install -e) to pull in Python dependencies from the monorepo and will call setup.py.
If this is mumbo jumbo to you, don't worry about it: just put your deps in setup.py, but install using pip install -r requirements.txt, and everything should work as you expect.
If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_xkcd/spec.yaml file.
Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information.
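For reference, the configured catalog passed to the read command below is a JSON document listing streams and sync modes. A minimal hypothetical one could be generated like this (the stream name and empty schema are placeholders, not this connector's actual streams):

import json

catalog = {
    "streams": [
        {
            "stream": {
                "name": "example_stream",  # placeholder stream name
                "json_schema": {},         # schema omitted in this sketch
                "supported_sync_modes": ["full_refresh"],
            },
            "sync_mode": "full_refresh",
            "destination_sync_mode": "overwrite",
        }
    ]
}

with open("integration_tests/configured_catalog.json", "w") as f:
    json.dump(catalog, f, indent=2)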
See integration_tests/sample_config.json for a sample config file. If you are an Airbyte core member, copy the credentials in Lastpass under the secret name source xkcd test creds and place them into secrets/config.json.
python main.py spec
python main.py check --config secrets/config.json
python main.py discover --config secrets/config.json
python main.py read --config secrets/config.json --catalog integration_tests/configured_catalog.json
Via airbyte-ci (recommended): airbyte-ci connectors --name=source-xkcd build
An image will be built with the tag airbyte/source-xkcd:dev. Via docker build: docker build -t airbyte/source-xkcd:dev
Then run any of the connector commands as follows: docker run --rm airbyte/source-xkcd:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-xkcd:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-xkcd:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-xkcd:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
You can run our full test suite locally using airbyte-ci: airbyte-ci connectors --name=source-xkcd test
Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py. All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies into two groups: dependencies required for your connector to work go in the MAIN_REQUIREMENTS list, and dependencies required for testing go in the TEST_REQUIREMENTS list. You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-xkcd test
Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors. Make sure the metadata.yaml content is up to date. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/xkcd.md). Create a Pull Request: use our PR naming conventions. Pat yourself on the back for being an awesome contributor. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
airbyte-source-yahoo-finance-price | Yahoo Finance Source. This is the repository for the Yahoo Finance source connector, written in Python.
For information about how to use this connector within Airbyte, see the documentation. To iterate on this connector, make sure to complete this prerequisites section. From this connector directory, create a virtual environment: python -m venv .venv
This will generate a virtualenv for this module in .venv/. Make sure this venv is active in your development environment of choice. To activate it from the terminal, run: source .venv/bin/activate
pip install -r requirements.txt
pip install '.[tests]'
If you are in an IDE, follow your IDE's instructions to activate the virtualenv. Note that while we are installing dependencies from requirements.txt, you should only edit setup.py for your dependencies. requirements.txt is used for editable installs (pip install -e) to pull in Python dependencies from the monorepo and will call setup.py.
If this is mumbo jumbo to you, don't worry about it: just put your deps in setup.py, but install using pip install -r requirements.txt, and everything should work as you expect.
If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_yahoo_finance_price/spec.json file.
Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information.
See integration_tests/sample_config.json for a sample config file. If you are an Airbyte core member, copy the credentials in Lastpass under the secret name source yahoo-finance-price test creds and place them into secrets/config.json.
python main.py spec
python main.py check --config secrets/config.json
python main.py discover --config secrets/config.json
python main.py read --config secrets/config.json --catalog integration_tests/configured_catalog.json
Via airbyte-ci (recommended): airbyte-ci connectors --name=source-yahoo-finance-price build
An image will be built with the tag airbyte/source-yahoo-finance-price:dev. Via docker build: docker build -t airbyte/source-yahoo-finance-price:dev
Then run any of the connector commands as follows: docker run --rm airbyte/source-yahoo-finance-price:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-yahoo-finance-price:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-yahoo-finance-price:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-yahoo-finance-price:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
You can run our full test suite locally using airbyte-ci: airbyte-ci connectors --name=source-yahoo-finance-price test
Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py. All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies into two groups: dependencies required for your connector to work go in the MAIN_REQUIREMENTS list, and dependencies required for testing go in the TEST_REQUIREMENTS list. You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-yahoo-finance-price test
Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors. Make sure the metadata.yaml content is up to date. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/yahoo-finance-price.md). Create a Pull Request: use our PR naming conventions. Pat yourself on the back for being an awesome contributor. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
airbyte-source-yandex-metrica | Yandex Metrica Source. This is the repository for the Yandex Metrica source connector, written in Python.
For information about how to use this connector within Airbyte, see the documentation. To iterate on this connector, make sure to complete this prerequisites section. From this connector directory, create a virtual environment: python -m venv .venv
This will generate a virtualenv for this module in .venv/. Make sure this venv is active in your development environment of choice. To activate it from the terminal, run: source .venv/bin/activate
pip install -r requirements.txt
pip install '.[tests]'
If you are in an IDE, follow your IDE's instructions to activate the virtualenv. Note that while we are installing dependencies from requirements.txt, you should only edit setup.py for your dependencies. requirements.txt is used for editable installs (pip install -e) to pull in Python dependencies from the monorepo and will call setup.py.
If this is mumbo jumbo to you, don't worry about it: just put your deps in setup.py, but install using pip install -r requirements.txt, and everything should work as you expect.
If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_yandex_metrica/spec.yaml file.
Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information.
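For orientation, streams in a Python connector like this are typically subclasses of the CDK's HttpStream. An abbreviated, hypothetical sketch (the URL, path, and primary key are invented for illustration and are not the Yandex Metrica API):

import requests
from airbyte_cdk.sources.streams.http import HttpStream

class ExampleStream(HttpStream):
    url_base = "https://api.example.com/"  # placeholder base URL
    primary_key = "id"

    def path(self, **kwargs) -> str:
        return "items"  # requests https://api.example.com/items

    def next_page_token(self, response: requests.Response):
        return None  # single-page sketch: no pagination

    def parse_response(self, response: requests.Response, **kwargs):
        yield from response.json()  # assumes the endpoint returns a JSON list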
See integration_tests/sample_config.json for a sample config file. If you are an Airbyte core member, copy the credentials in Lastpass under the secret name source yandex-metrica test creds and place them into secrets/config.json.
python main.py spec
python main.py check --config secrets/config.json
python main.py discover --config secrets/config.json
python main.py read --config secrets/config.json --catalog integration_tests/configured_catalog.json
Via airbyte-ci (recommended): airbyte-ci connectors --name=source-yandex-metrica build
An image will be built with the tag airbyte/source-yandex-metrica:dev. Via docker build: docker build -t airbyte/source-yandex-metrica:dev
Then run any of the connector commands as follows: docker run --rm airbyte/source-yandex-metrica:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-yandex-metrica:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-yandex-metrica:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-yandex-metrica:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
You can run our full test suite locally using airbyte-ci: airbyte-ci connectors --name=source-yandex-metrica test
Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py. All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies into two groups: dependencies required for your connector to work go in the MAIN_REQUIREMENTS list, and dependencies required for testing go in the TEST_REQUIREMENTS list. You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-yandex-metrica test
Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors. Make sure the metadata.yaml content is up to date. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/yandex-metrica.md). Create a Pull Request: use our PR naming conventions. Pat yourself on the back for being an awesome contributor. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
airbyte-source-yotpo | Yotpo Source. This is the repository for the Yotpo configuration-based source connector.
For information about how to use this connector within Airbyte, see the documentation. If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_yotpo/spec.yaml file.
Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information.
See integration_tests/sample_config.json for a sample config file. If you are an Airbyte core member, copy the credentials in Lastpass under the secret name source yotpo test creds and place them into secrets/config.json.
Via airbyte-ci (recommended): airbyte-ci connectors --name=source-yotpo build
An image will be built with the tag airbyte/source-yotpo:dev. Via docker build: docker build -t airbyte/source-yotpo:dev
Then run any of the connector commands as follows: docker run --rm airbyte/source-yotpo:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-yotpo:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-yotpo:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-yotpo:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
You can run our full test suite locally using airbyte-ci: airbyte-ci connectors --name=source-yotpo test
Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py. All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies into two groups: dependencies required for your connector to work go in the MAIN_REQUIREMENTS list, and dependencies required for testing go in the TEST_REQUIREMENTS list. You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-yotpo test
Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors. Make sure the metadata.yaml content is up to date. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/yotpo.md). Create a Pull Request: use our PR naming conventions. Pat yourself on the back for being an awesome contributor. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
airbyte-source-younium | Younium Source. This is the repository for the Younium configuration-based source connector.
For information about how to use this connector within Airbyte, see the documentation. If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_younium/spec.yaml file.
Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information.
See integration_tests/sample_config.json for a sample config file. If you are an Airbyte core member, copy the credentials in Lastpass under the secret name source younium test creds and place them into secrets/config.json.
Via airbyte-ci (recommended): airbyte-ci connectors --name=source-younium build
An image will be built with the tag airbyte/source-younium:dev. Via docker build: docker build -t airbyte/source-younium:dev
Then run any of the connector commands as follows: docker run --rm airbyte/source-younium:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-younium:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-younium:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-younium:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
You can run our full test suite locally using airbyte-ci: airbyte-ci connectors --name=source-younium test
Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py. All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies into two groups: dependencies required for your connector to work go in the MAIN_REQUIREMENTS list, and dependencies required for testing go in the TEST_REQUIREMENTS list. You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-younium test
Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors. Make sure the metadata.yaml content is up to date. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/younium.md). Create a Pull Request: use our PR naming conventions. Pat yourself on the back for being an awesome contributor. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
airbyte-source-youtube-analytics | Youtube Analytics Source. This is the repository for the Youtube Analytics source connector, written in Python.
For information about how to use this connector within Airbyte, see the documentation. To iterate on this connector, make sure to complete this prerequisites section. From this connector directory, create a virtual environment: python -m venv .venv
This will generate a virtualenv for this module in .venv/. Make sure this venv is active in your development environment of choice. To activate it from the terminal, run: source .venv/bin/activate
pip install -r requirements.txt
pip install '.[tests]'
If you are in an IDE, follow your IDE's instructions to activate the virtualenv. Note that while we are installing dependencies from requirements.txt, you should only edit setup.py for your dependencies. requirements.txt is used for editable installs (pip install -e) to pull in Python dependencies from the monorepo and will call setup.py.
If this is mumbo jumbo to you, don't worry about it: just put your deps in setup.py, but install using pip install -r requirements.txt, and everything should work as you expect.
If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_youtube_analytics/spec.json file.
Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information.
See integration_tests/sample_config.json for a sample config file. If you are an Airbyte core member, copy the credentials in Lastpass under the secret name source youtube-analytics test creds and place them into secrets/config.json.
python main.py spec
python main.py check --config secrets/config.json
python main.py discover --config secrets/config.json
python main.py read --config secrets/config.json --catalog integration_tests/configured_catalog.json
Via airbyte-ci (recommended): airbyte-ci connectors --name=source-youtube-analytics build
An image will be built with the tag airbyte/source-youtube-analytics:dev. Via docker build: docker build -t airbyte/source-youtube-analytics:dev
Then run any of the connector commands as follows: docker run --rm airbyte/source-youtube-analytics:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-youtube-analytics:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-youtube-analytics:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-youtube-analytics:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
You can run our full test suite locally using airbyte-ci: airbyte-ci connectors --name=source-youtube-analytics test
Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py. All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies into two groups: dependencies required for your connector to work go in the MAIN_REQUIREMENTS list, and dependencies required for testing go in the TEST_REQUIREMENTS list. You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what? Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-youtube-analytics test
Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors. Make sure the metadata.yaml content is up to date. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/youtube-analytics.md). Create a Pull Request: use our PR naming conventions. Pat yourself on the back for being an awesome contributor. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
airbyte-source-zapier-supported-storage | Zapier Supported Storage Source

This is the repository for the Zapier Supported Storage configuration based source connector. For information about how to use this connector within Airbyte, see the documentation.

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_zapier_supported_storage/spec.yaml file.
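Purely as an illustration of the file's shape (the actual field names and values are defined by the connector's spec.yaml, not by this sketch):

```json
{
  "api_key": "<your-credentials-here>"
}
```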
Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See integration_tests/sample_config.json for a sample config file.

If you are an Airbyte core member, copy the credentials in Lastpass under the secret name source zapier-supported-storage test creds and place them into secrets/config.json.

Via airbyte-ci (recommended):

airbyte-ci connectors --name=source-zapier-supported-storage build

An image will be built with the tag airbyte/source-zapier-supported-storage:dev.

Via docker build:

docker build -t airbyte/source-zapier-supported-storage:dev .

Then run any of the connector commands as follows:

docker run --rm airbyte/source-zapier-supported-storage:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-zapier-supported-storage:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-zapier-supported-storage:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-zapier-supported-storage:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json

You can run our full test suite locally using airbyte-ci:

airbyte-ci connectors --name=source-zapier-supported-storage test

Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information. If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py.

All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups: dependencies required for your connector to work go in the MAIN_REQUIREMENTS list, and dependencies required for testing go in the TEST_REQUIREMENTS list.

You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-zapier-supported-storage test
2. Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors.
3. Make sure the metadata.yaml content is up to date.
4. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/zapier-supported-storage.md).
5. Create a Pull Request: use our PR naming conventions.
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
airbyte-source-zendesk-chat | Zendesk-Chat source connector

This is the repository for the Zendesk-Chat source connector, written in Python. For information about how to use this connector within Airbyte, see the documentation.

Local development

Prerequisites: Python (~=3.9) and Poetry (~=1.7) - installation instructions here.

Installing the connector: from this connector directory, run:

poetry install --with dev

Create credentials: if you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_zendesk_chat/spec.yaml file. Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See sample_files/sample_config.json for a sample config file.

Locally running the connector:

poetry run source-zendesk-chat spec
poetry run source-zendesk-chat check --config secrets/config.json
poetry run source-zendesk-chat discover --config secrets/config.json
poetry run source-zendesk-chat read --config secrets/config.json --catalog sample_files/configured_catalog.json

Running unit tests: to run unit tests locally, from the connector directory run:

poetry run pytest unit_tests

Building the docker image: install airbyte-ci, then run the following command to build the docker image:

airbyte-ci connectors --name=source-zendesk-chat build

An image will be available on your host with the tag airbyte/source-zendesk-chat:dev.

Running as a docker container: run any of the connector commands as follows:

docker run --rm airbyte/source-zendesk-chat:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-zendesk-chat:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-zendesk-chat:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-zendesk-chat:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json

Running our CI test suite: you can run our full test suite locally using airbyte-ci:

airbyte-ci connectors --name=source-zendesk-chat test

Customizing acceptance tests: customize the acceptance-test-config.yml file to configure acceptance tests. See Connector Acceptance Tests for more information. If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py.

Dependency Management: all of your dependencies should be managed via Poetry.
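For orientation, a minimal sketch of how a Poetry-managed pyproject.toml is laid out (names, versions, and the entry-point path are illustrative placeholders, not this connector's actual metadata):

```toml
# Illustrative sketch only; the real values live in the connector's pyproject.toml.
[tool.poetry]
name = "source-zendesk-chat"
version = "0.0.0"  # kept in sync with dockerImageTag in metadata.yaml
description = "Source implementation for Zendesk Chat."
authors = ["<you>"]

[tool.poetry.dependencies]
python = "^3.9"
airbyte-cdk = "*"  # placeholder constraint

[tool.poetry.group.dev.dependencies]
pytest = "*"  # installed via: poetry install --with dev

[tool.poetry.scripts]
source-zendesk-chat = "source_zendesk_chat.run:run"  # hypothetical entry point
```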
To add a new dependency, run:

poetry add <package-name>

Please commit the changes to the pyproject.toml and poetry.lock files.

Publishing a new version of the connector: you've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-zendesk-chat test
2. Bump the connector version (please follow semantic versioning for connectors): bump the dockerImageTag value in metadata.yaml and bump the version value in pyproject.toml.
3. Make sure the metadata.yaml content is up to date.
4. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/zendesk-chat.md).
5. Create a Pull Request: use our PR naming conventions.
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. |
airbyte-source-zendesk-sell | Zendesk Sell Source

This is the repository for the Zendesk Sell configuration based source connector. For information about how to use this connector within Airbyte, see the documentation.

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_zendesk_sell/spec.yaml file. Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See integration_tests/sample_config.json for a sample config file.

If you are an Airbyte core member, copy the credentials in Lastpass under the secret name source zendesk-sell test creds and place them into secrets/config.json.

Via airbyte-ci (recommended):

airbyte-ci connectors --name=source-zendesk-sell build

An image will be built with the tag airbyte/source-zendesk-sell:dev.

Via docker build:

docker build -t airbyte/source-zendesk-sell:dev .

Then run any of the connector commands as follows:

docker run --rm airbyte/source-zendesk-sell:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-zendesk-sell:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-zendesk-sell:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-zendesk-sell:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json

You can run our full test suite locally using airbyte-ci:

airbyte-ci connectors --name=source-zendesk-sell test

Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information. If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py.

All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups: dependencies required for your connector to work go in the MAIN_REQUIREMENTS list, and dependencies required for testing go in the TEST_REQUIREMENTS list.

You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-zendesk-sell test
2. Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors.
3. Make sure the metadata.yaml content is up to date.
4. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/zendesk-sell.md).
5. Create a Pull Request: use our PR naming conventions.
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
airbyte-source-zendesk-sunshine | Zendesk Sunshine Source

This is the repository for the Zendesk Sunshine configuration based source connector. For information about how to use this connector within Airbyte, see the documentation.

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_zendesk_sunshine/spec.yaml file. Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See integration_tests/sample_config.json for a sample config file.

If you are an Airbyte core member, copy the credentials in Lastpass under the secret name source zendesk-sunshine test creds and place them into secrets/config.json.

Via airbyte-ci (recommended):

airbyte-ci connectors --name=source-zendesk-sunshine build

An image will be built with the tag airbyte/source-zendesk-sunshine:dev.

Via docker build:

docker build -t airbyte/source-zendesk-sunshine:dev .

Then run any of the connector commands as follows:

docker run --rm airbyte/source-zendesk-sunshine:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-zendesk-sunshine:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-zendesk-sunshine:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-zendesk-sunshine:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json

You can run our full test suite locally using airbyte-ci:

airbyte-ci connectors --name=source-zendesk-sunshine test

Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information. If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py.

All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups: dependencies required for your connector to work go in the MAIN_REQUIREMENTS list, and dependencies required for testing go in the TEST_REQUIREMENTS list.

You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-zendesk-sunshine test
2. Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors.
3. Make sure the metadata.yaml content is up to date.
4. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/zendesk-sunshine.md).
5. Create a Pull Request: use our PR naming conventions.
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
airbyte-source-zendesk-support | Zendesk-Support source connector

This is the repository for the Zendesk-Support source connector, written in Python. For information about how to use this connector within Airbyte, see the documentation.

Local development

Prerequisites: Python (~=3.9) and Poetry (~=1.7) - installation instructions here.

Installing the connector: from this connector directory, run:

poetry install --with dev

Create credentials: if you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_zendesk_support/spec.yaml file. Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See sample_files/sample_config.json for a sample config file.

Locally running the connector:

poetry run source-zendesk-support spec
poetry run source-zendesk-support check --config secrets/config.json
poetry run source-zendesk-support discover --config secrets/config.json
poetry run source-zendesk-support read --config secrets/config.json --catalog sample_files/configured_catalog.json

Running unit tests: to run unit tests locally, from the connector directory run:

poetry run pytest unit_tests

Building the docker image: install airbyte-ci, then run the following command to build the docker image:

airbyte-ci connectors --name=source-zendesk-support build

An image will be available on your host with the tag airbyte/source-zendesk-support:dev.

Running as a docker container: run any of the connector commands as follows:

docker run --rm airbyte/source-zendesk-support:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-zendesk-support:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-zendesk-support:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-zendesk-support:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json

Running our CI test suite: you can run our full test suite locally using airbyte-ci:

airbyte-ci connectors --name=source-zendesk-support test

Customizing acceptance tests: customize the acceptance-test-config.yml file to configure acceptance tests. See Connector Acceptance Tests for more information. If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py.

Dependency Management: all of your dependencies should be managed via Poetry. To add a new dependency, run:

poetry add <package-name>

Please commit the changes to the pyproject.toml and poetry.lock files.

Publishing a new version of the connector: you've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-zendesk-support test
2. Bump the connector version (please follow semantic versioning for connectors): bump the dockerImageTag value in metadata.yaml and bump the version value in pyproject.toml.
3. Make sure the metadata.yaml content is up to date.
4. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/zendesk-support.md).
5. Create a Pull Request: use our PR naming conventions.
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. |
airbyte-source-zendesk-talk | Zendesk-Talk source connector

This is the repository for the Zendesk-Talk source connector, written in Python. For information about how to use this connector within Airbyte, see the documentation.

Local development

Prerequisites: Python (~=3.9) and Poetry (~=1.7) - installation instructions here.

Installing the connector: from this connector directory, run:

poetry install --with dev

Create credentials: if you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_zendesk_talk/spec.yaml file. Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See sample_files/sample_config.json for a sample config file.

Locally running the connector:

poetry run source-zendesk-talk spec
poetry run source-zendesk-talk check --config secrets/config.json
poetry run source-zendesk-talk discover --config secrets/config.json
poetry run source-zendesk-talk read --config secrets/config.json --catalog sample_files/configured_catalog.json

Running unit tests: to run unit tests locally, from the connector directory run:

poetry run pytest unit_tests

Building the docker image: install airbyte-ci, then run the following command to build the docker image:

airbyte-ci connectors --name=source-zendesk-talk build

An image will be available on your host with the tag airbyte/source-zendesk-talk:dev.

Running as a docker container: run any of the connector commands as follows:

docker run --rm airbyte/source-zendesk-talk:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-zendesk-talk:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-zendesk-talk:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-zendesk-talk:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json

Running our CI test suite: you can run our full test suite locally using airbyte-ci:

airbyte-ci connectors --name=source-zendesk-talk test

Customizing acceptance tests: customize the acceptance-test-config.yml file to configure acceptance tests. See Connector Acceptance Tests for more information. If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py.

Dependency Management: all of your dependencies should be managed via Poetry. To add a new dependency, run:

poetry add <package-name>

Please commit the changes to the pyproject.toml and poetry.lock files.

Publishing a new version of the connector: you've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-zendesk-talk test
2. Bump the connector version (please follow semantic versioning for connectors): bump the dockerImageTag value in metadata.yaml and bump the version value in pyproject.toml.
3. Make sure the metadata.yaml content is up to date.
4. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/zendesk-talk.md).
5. Create a Pull Request: use our PR naming conventions.
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. |
airbyte-source-zenefits | How to access the token from Zenefits

Log in to the Zenefits portal and follow the steps in the link given here. This will generate a Bearer token for the user, which can be used to interact with the API using the source-zenefits connector.

This is the repository for the Zenefits configuration based source connector. For information about how to use this connector within Airbyte, see the documentation.

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_zenefits/spec.yaml file. Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See integration_tests/sample_config.json for a sample config file.

If you are an Airbyte core member, copy the credentials in Lastpass under the secret name source zenefits test creds and place them into secrets/config.json.

Via airbyte-ci (recommended):

airbyte-ci connectors --name=source-zenefits build

An image will be built with the tag airbyte/source-zenefits:dev.

Via docker build:

docker build -t airbyte/source-zenefits:dev .

Then run any of the connector commands as follows:

docker run --rm airbyte/source-zenefits:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-zenefits:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-zenefits:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-zenefits:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json

You can run our full test suite locally using airbyte-ci:

airbyte-ci connectors --name=source-zenefits test

Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information. If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py.

All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups: dependencies required for your connector to work go in the MAIN_REQUIREMENTS list, and dependencies required for testing go in the TEST_REQUIREMENTS list.

You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-zenefits test
2. Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors.
3. Make sure the metadata.yaml content is up to date.
4. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/zenefits.md).
5. Create a Pull Request: use our PR naming conventions.
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
airbyte-source-zenloop | Zenloop Source

This is the repository for the Zenloop source connector, written in Python. For information about how to use this connector within Airbyte, see the documentation.

To iterate on this connector, make sure to complete this prerequisites section. From this connector directory, create a virtual environment:

python -m venv .venv

This will generate a virtualenv for this module in .venv/. Make sure this venv is active in your development environment of choice. To activate it from the terminal, run:

source .venv/bin/activate
pip install -r requirements.txt
pip install '.[tests]'

If you are in an IDE, follow your IDE's instructions to activate the virtualenv.

Note that while we are installing dependencies from requirements.txt, you should only edit setup.py for your dependencies. requirements.txt is used for editable installs (pip install -e) to pull in Python dependencies from the monorepo and will call setup.py. If this is mumbo jumbo to you, don't worry about it: just put your deps in setup.py but install using pip install -r requirements.txt and everything should work as you expect.

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_zenloop/spec.json file. Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See integration_tests/sample_config.json for a sample config file.

If you are an Airbyte core member, copy the credentials in Lastpass under the secret name source zenloop test creds and place them into secrets/config.json.

python main.py spec
python main.py check --config secrets/config.json
python main.py discover --config secrets/config.json
python main.py read --config secrets/config.json --catalog integration_tests/configured_catalog.json

Via airbyte-ci (recommended):

airbyte-ci connectors --name=source-zenloop build

An image will be built with the tag airbyte/source-zenloop:dev.

Via docker build:

docker build -t airbyte/source-zenloop:dev .

Then run any of the connector commands as follows:

docker run --rm airbyte/source-zenloop:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-zenloop:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-zenloop:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-zenloop:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json

You can run our full test suite locally using airbyte-ci:

airbyte-ci connectors --name=source-zenloop test

Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information. If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py.

All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups: dependencies required for your connector to work go in the MAIN_REQUIREMENTS list, and dependencies required for testing go in the TEST_REQUIREMENTS list.

You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-zenloop test
2. Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors.
3. Make sure the metadata.yaml content is up to date.
4. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/zenloop.md).
5. Create a Pull Request: use our PR naming conventions.
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
airbyte-source-zoho-crm | Zoho Crm Source

This is the repository for the Zoho Crm source connector, written in Python. For information about how to use this connector within Airbyte, see the documentation.

To iterate on this connector, make sure to complete this prerequisites section. From this connector directory, create a virtual environment:

python -m venv .venv

This will generate a virtualenv for this module in .venv/. Make sure this venv is active in your development environment of choice. To activate it from the terminal, run:

source .venv/bin/activate
pip install -r requirements.txt
pip install '.[tests]'

If you are in an IDE, follow your IDE's instructions to activate the virtualenv.

Note that while we are installing dependencies from requirements.txt, you should only edit setup.py for your dependencies. requirements.txt is used for editable installs (pip install -e) to pull in Python dependencies from the monorepo and will call setup.py. If this is mumbo jumbo to you, don't worry about it: just put your deps in setup.py but install using pip install -r requirements.txt and everything should work as you expect.

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_zoho_crm/spec.json file. Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See integration_tests/sample_config.json for a sample config file.

If you are an Airbyte core member, copy the credentials in Lastpass under the secret name source zoho-crm test creds and place them into secrets/config.json.

python main.py spec
python main.py check --config secrets/config.json
python main.py discover --config secrets/config.json
python main.py read --config secrets/config.json --catalog integration_tests/configured_catalog.json

Via airbyte-ci (recommended):

airbyte-ci connectors --name=source-zoho-crm build

An image will be built with the tag airbyte/source-zoho-crm:dev.

Via docker build:

docker build -t airbyte/source-zoho-crm:dev .

Then run any of the connector commands as follows:

docker run --rm airbyte/source-zoho-crm:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-zoho-crm:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-zoho-crm:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-zoho-crm:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json

You can run our full test suite locally using airbyte-ci:

airbyte-ci connectors --name=source-zoho-crm test

Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information. If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py.

All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups: dependencies required for your connector to work go in the MAIN_REQUIREMENTS list, and dependencies required for testing go in the TEST_REQUIREMENTS list.

You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-zoho-crm test
2. Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors.
3. Make sure the metadata.yaml content is up to date.
4. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/zoho-crm.md).
5. Create a Pull Request: use our PR naming conventions.
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
airbyte-source-zuora | Zuora Source

This is the repository for the Zuora source connector, written in Python. For information about how to use this connector within Airbyte, see the documentation.

To iterate on this connector, make sure to complete this prerequisites section. From this connector directory, create a virtual environment:

python3 -m venv .venv

This will generate a virtualenv for this module in .venv/. Make sure this venv is active in your development environment of choice. To activate it from the terminal, run:

source .venv/bin/activate
pip install -r requirements.txt

If you are in an IDE, follow your IDE's instructions to activate the virtualenv.

Note that while we are installing dependencies from requirements.txt, you should only edit setup.py for your dependencies. requirements.txt is used for editable installs (pip install -e) to pull in Python dependencies from the monorepo and will call setup.py. If this is mumbo jumbo to you, don't worry about it: just put your deps in setup.py but install using pip install -r requirements.txt and everything should work as you expect.

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_zuora/spec.json file. Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See integration_tests/sample_config.json for a sample config file.

If you are an Airbyte core member, copy the credentials in Lastpass under the secret name source zuora test creds and place them into secrets/config.json.

python main.py spec
python main.py check --config secrets/config.json
python main.py discover --config secrets/config.json
python main.py read --config secrets/config.json --catalog integration_tests/configured_catalog.json

Via airbyte-ci (recommended):

airbyte-ci connectors --name=source-zuora build

An image will be built with the tag airbyte/source-zuora:dev.

Via docker build:

docker build -t airbyte/source-zuora:dev .

Then run any of the connector commands as follows:

docker run --rm airbyte/source-zuora:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-zuora:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-zuora:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-zuora:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json

You can run our full test suite locally using airbyte-ci:

airbyte-ci connectors --name=source-zuora test

Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information. If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py.

All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups: dependencies required for your connector to work go in the MAIN_REQUIREMENTS list, and dependencies required for testing go in the TEST_REQUIREMENTS list.

You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-zuora test
2. Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors.
3. Make sure the metadata.yaml content is up to date.
4. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/zuora.md).
5. Create a Pull Request: use our PR naming conventions.
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
airc | AIRC is an implementation of the IRC protocol using Python's asyncio library. It contains built-in support for Twitch.tv IRC websockets as well.

AIRC is still in alpha, so features may be added/removed/altered at any time. |
aircable-library-op | UNKNOWN |
aircal | Aircal

Aircal is a library that exports future DAG runs as events to Google Calendar. Having DAG runs as events in the calendar may help you:

- visualize the utilization of your airflow workers to better spread your jobs
- determine when a certain DAG should be finished, to monitor the service

The library will also observe the changes to your DAGs and synchronize them with the calendar:

- add runs for freshly added DAGs
- change start and/or end time when an existing DAG changes its schedule (or the execution time changes significantly)
- delete run events when a DAG is removed (or paused)

Tip: run the sync script regularly, perhaps, with you know, Airflow :)

The library only supports DAG schedules that use the standard cron syntax. The rest will be ignored (with a warning).

Warning: this is beta stage software. Expect occasional bugs and rough edges (PR welcome).

Installation & setup

pip install aircal

Alternatively you can clone the repo and install it from there:

pip install -e .

Google API credentials are required to create events in the calendar. You can obtain them here. Store credentials.json in a directory accessible by your code.

The library modifies and deletes calendar events. I highly recommend creating a new calendar to be used by this software: "add calendar" -> "create new calendar" in Google calendar settings.

Usage

See example.py for an example of the potential pipeline that can be run at regular intervals. |
aircalc | air_calculator

Writing letters with a pen on paper is very different from drawing letters in the air. Drawing the "-" minus operator and the "+" operator in the air is extremely difficult, so in the current system these operators are replaced by "W" and "P" respectively. When drawing letters with fingers in the air, letters that are difficult to recognize or write in the air need to be replaced with letters that can be accurately recognized by artificial intelligence. For example, replace the number "1" with "L" in the air.

How to install necessary libraries

$ pip install pytesseract

For Windows users, you should also install the latest tesseract: https://github.com/UB-Mannheim/tesseract/wiki
Then add the tesseract.exe Tesseract-OCR directory to PATH in .profile or .bashrc.

$ pip install mediapipe

Finally, install aircalc:

$ pip install aircalc

fingermath.py is renamed as aircalc.py

How to run aircalc
aircalc is a program for drawing a math expression in the air for possible calculation. aircalc is based on two open-source libraries, mediapipe and tesseract.

There are six states of the five fingers recognized by the mediapipe library: 0-finger, 1-finger, 2-finger, 3-finger, 4-finger, and 5-finger respectively. The tip of the index finger is used as a pen for drawing an expression with the fingers:
0-finger can move the pen without drawing.
1-finger can draw lines in the air.
2-finger can move the pen without drawing.
3-finger can delete the last touches of drawn letters for correction.
4-finger can call tesseract for transforming the hand-writing
images to the digital text for possible calculation.
For several seconds, 4-finger can terminate the program
for showing the answer of the hand-drawn expression.
5-finger can move the pen without drawing.

A continuous 4-finger state can terminate and exit this program. 0-finger or 5-finger is equivalent to 2-finger. The saved picture is transformed into digital text using state-of-the-art optical character recognition.

Writing letters with a pen on paper is very different from drawing letters in the air. Of the digits 0 to 9, 1 is the least recognizable number. The substitutions are:

- Drawing "L" in the air represents "1".
- "S" or "5" in the air represents "5".
- "P" in the air represents the "+" plus operator.
- "W" or "-" represents the "-" minus operator.
- "V" in the air represents the "/" division operator.
- "M" in the air represents the "*" multiplication operator.
- "&" in the air represents the "**" exponential operator.
- Drawing the two letters "a" and "A" in the air represents the sqrt() function.
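Purely as an illustration (a hypothetical sketch, not aircalc's actual implementation), the substitutions above can be expressed as a small translation table:

```python
import re

# Substitutions described above; "a...A" wraps an expression in sqrt().
SUBSTITUTIONS = {"L": "1", "S": "5", "P": "+", "W": "-", "V": "/", "M": "*", "&": "**"}

def translate(drawn: str) -> str:
    # "a<expr>A" -> "(<expr>)**0.5", i.e. the square root of the inner expression
    drawn = re.sub(r"a(.*?)A", r"(\1)**0.5", drawn)
    return "".join(SUBSTITUTIONS.get(ch, ch) for ch in drawn)

print(translate("aL3A"))         # -> (13)**0.5, i.e. sqrt(13)
print(eval(translate("2&9V3")))  # "2**9/3" -> 170.66... (eval only for this toy demo)
```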
Therefore, the string "a13A" or "aL3A" represents sqrt(13).

$ aircalc

Demo: https://youtu.be/med_jrFTMPA

Example expressions drawn in the air: sqrt(6)*2, 1-3, 10+2, 4-5-3, 2-3/5, 34*5, 2**8, 2&9V3 --> 2**9/3, aLLAV3 -> sqrt(11)/3 |
aircan | AirCan

Load data into CKAN DataStore using Airflow as the runner. This is a replacement for DataPusher and Xloader. Clean separation of components so you can reuse what you want (e.g., you don't use Airflow but your own runner).

Contents:

- Get Started
- Examples
  - Example 1: CSV to JSON
  - Using Aircan DAGs
  - Example 2: Local file to CKAN DataStore using the Datastore API (Preliminaries: Setup your CKAN instance)
  - [Ignore] Example 3: Local file to CKAN DataStore using Postgres (Preliminaries: Setup your CKAN instance; Doing the load)
  - Example 2a: Remote file to DataStore
  - Example 3: Auto Load file uploaded to CKAN into CKAN DataStore (Run it)
- Tutorials
  - Using Google Cloud Composer

Get Started

Install Python >= 3.5 <= 3.7.x (and make a virtual environment). Clone aircan so you have examples available:

git clone https://github.com/datopian/aircan

Install and set up Airflow (https://airflow.apache.org/docs/stable/installation.html):

export AIRFLOW_HOME=~/airflow
pip install apache-airflow
airflow initdb

Note: on recent versions of Python (3.7+), you may face the following error when executing airflow initdb:

ModuleNotFoundError: No module named 'typing_extensions'

This can be solved with pip install typing_extensions.

Then, start the server and visit your Airflow admin UI:

airflow webserver -p 8080

By default, the server will be accessible at http://localhost:8080/ as shown in the output of the terminal where you ran the previous command.

Examples

Example 1: CSV to JSON

In this example we'll run an AirCan example to convert a CSV to JSON. Add the DAG to the default directory for Airflow to recognize it:

mkdir ~/airflow/dags/
cp examples/aircan-example-1.csv ~/airflow/
cp examples/csv_to_json.py ~/airflow/dags/

To see this DAG appear in the Airflow admin UI, you may need to restart the server or launch the scheduler to update the list of DAGs (this may take about a minute or two to update, then refresh the page on the Airflow admin UI):

airflow scheduler

Run this DAG: enable the dag in the admin UI with the toggle to make it run with the scheduler, then "Trigger" the DAG with the trigger button. After a moment, check the output; you should see a successful run for this DAG. Locate the output on disk at ~/airflow/aircan-example-1.json.
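The real DAG ships in the repo as examples/csv_to_json.py; the following is only a rough illustrative sketch (file paths and the Airflow 1.10-era import are assumptions) of what a single-task CSV-to-JSON DAG can look like:

```python
# Illustrative sketch only -- the actual DAG is examples/csv_to_json.py in the repo.
import csv
import json
import os
from datetime import datetime

from airflow import DAG
from airflow.operators.python_operator import PythonOperator  # Airflow 1.10-style import

AIRFLOW_HOME = os.path.expanduser("~/airflow")

def csv_to_json():
    # Read the example CSV and dump it as JSON next to it (paths per Example 1).
    with open(os.path.join(AIRFLOW_HOME, "aircan-example-1.csv")) as f:
        rows = list(csv.DictReader(f))
    with open(os.path.join(AIRFLOW_HOME, "aircan-example-1.json"), "w") as f:
        json.dump(rows, f)

dag = DAG("csv_to_json", start_date=datetime(2020, 1, 1), schedule_interval=None)

PythonOperator(task_id="csv_to_json", python_callable=csv_to_json, dag=dag)
```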
...otherconfigsdag_run_conf_overrides_params=TrueNote: do not pointdags_folderto/your/path/to/aircan/aircan/dags. It must be pointing to the outeraircanfolder.Verify that Airflow finds the DAGs of Aircan by runningairflow list_dags. The output should list:-------------------------------------------------------------------
DAGS
-------------------------------------------------------------------
ckan_api_load_single_step
...other DAGs...

Make sure you have these environment variables properly set up:

export LC_ALL=en_US.UTF-8
export LANG=en_US.UTF-8
export OBJC_DISABLE_INITIALIZE_FORK_SAFETY=YES

Run the Airflow webserver (in case you have skipped the previous example): airflow webserver. Run the Airflow scheduler: airflow scheduler. Make sure the environment variables from (3) are set up. Access the Airflow UI (http://localhost:8080/); you should see the DAG ckan_api_load_single_step listed. Activate the DAG by hitting the Off button on the interface.

Now we can test the DAG. On your terminal, run:

airflow test \
    -tp "{ \"resource_id\": \"my-res-id-123\", \
           \"schema_fields_array\": \"[ 'field1', 'field2']\", \
           \"csv_input\": \"/path/to/my.csv\", \
           \"json_output\": \"/path/to/my.json\" }" \
    ckan_api_load_single_step full_load_via_api now

Make sure to replace the parameters accordingly:

- resource_id is the id of your resource on CKAN.
- schema_fields_array is the header of your CSV file. Everything is being treated as plain text at this time.
- csv_input is the path to the CSV file you want to upload. The DAG will convert your CSV file to a JSON file and then upload it.
- json_output specifies the path where you want to dump your JSON file.

Check your CKAN instance and verify that the data has been loaded. Trigger the DAG with the following:

airflow trigger_dag ckan_api_load_single_step \
    --conf='{ "resource_id": "my-res-id-123", "schema_fields_array": [ "field1", "field2" ], "csv_input": "/path/to.csv", "json_output": "/path/to.json" }'

Do not forget to properly replace the parameters with your data and properly escape the special characters. Alternatively, you can just run the DAG with the airflow run command.

api_ckan_load_single_node also works for remote CKAN instances: just set up your Airflow CKAN_SITE_URL variable accordingly.

Multiple-node DAG

ckan_api_load_multiple_steps does the same steps as api_ckan_load_single_node, but it uses multiple nodes (tasks). You can repeat the steps of the previous section and run ckan_api_load_multiple_steps.

[Ignore] Example 3: Local file to CKAN DataStore using Postgres

We'll load a local CSV into a CKAN DataStore instance.

Preliminaries: Setup your CKAN instance

We'll assume you have a local CKAN setup and running at http://localhost:5000, with the datastore enabled. If you are using Docker, you might need to expose your Postgres instance port. For example, add the following to your docker-compose.yml file:

db:
ports:
  - "5432:5432"

(Useful to know: it is possible to access the Postgres instance on your Docker container. Run docker ps and you should see a container named docker-ckan_db, which corresponds to the CKAN database. Run docker exec -it CONTAINER_ID bash and then psql -U ckan to access the corresponding Postgres instance.)

Now you need to set up some information on Airflow. Access your local Airflow Connections panel at http://localhost:8080/admin/connection/. Create a new connection named ckan_postgres with your datastore information, for example assuming your CKAN_DATASTORE_WRITE_URL=postgresql://ckan:ckan@db/datastore.

We also need to set up two environment variables for Airflow: access the Airflow Variable panel and set up CKAN_SITE_URL and your CKAN_SYSADMIN_API_KEY.

[TODO PARAMETERIZE VARS]
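A hypothetical CLI equivalent of those two UI steps (Airflow 1.10-style syntax, matching the commands used above; all values are placeholders):

```
# Hypothetical equivalents of the UI steps above; replace the values with your own.
airflow connections --add --conn_id ckan_postgres \
    --conn_uri postgresql://ckan:ckan@localhost:5432/datastore
airflow variables --set CKAN_SITE_URL http://localhost:5000
airflow variables --set CKAN_SYSADMIN_API_KEY <your-sysadmin-api-key>
```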
[TODO PARAMETERIZE PATHS]

Then, create a dataset called aircan-example using this script:

cd aircan
pip install -r requirements-example.txt
python examples/setup-ckan.py --api-key

Doing the load

We assume you now have a dataset named my-first-dataset. Create the DAG for loading:

cp aircan/lib/api_ckan_load.py ~/airflow/dags/

Check that Airflow recognizes your DAG with airflow list_dags. You should see a DAG named ckan_load. Now you can test each task individually:

- To delete a datastore, run airflow test ckan_load delete_datastore_table now
- To create a datastore, run airflow test ckan_load create_datastore_table now. You can see the corresponding resource_id for the datastore in your logs. [TODO JSON is hardcoded now; parameterize on kwargs or some other operator structure]
- To load a CSV to Postgres, run airflow test ckan_load load_csv_to_postgres_via_copy now. [TODO JSON is hardcoded now; insert resource_id on it. File path is also hardcoded, change it]
- Finally, set your datastore to active: airflow test ckan_load restore_indexes_and_set_datastore_active now

To run the entire DAG: select the DAG, configure it with a path to ../your/aircan/examples/example1.csv, run it, and check the output. Visit http://localhost:5000/dataset/aircan-example/ and see the resource named XXX. It will have data in its datastore now!

Example 2a: Remote file to DataStore

Same as example 2 but use this DAG instead:

cp aircan/examples/ckan-datastore-from-remote.py ~/airflow/dags/

Plus set a remote URL for loading.

Example 3: Auto Load file uploaded to CKAN into CKAN DataStore

Configure CKAN to automatically load:

- Set up CKAN - see the previous sections.
- Also install the ckanext-aircan-connector extension in your CKAN instance. TODO: add instructions.
- Configure CKAN with the location of your Airflow instance and the DAG id (aircan-load-csv).

Run it

Run this script, which uploads a CSV file to your CKAN instance and will trigger a load to the datastore:

cd aircan
pip install -r requirements-example.txt
python examples/ckan-upload-csv.py

Using Google Cloud Composer

Sign up for an account at https://cloud.google.com/composer. Create or select an existing project at Google Cloud Platform; for this example, we use one called aircan-test-project. Create an environment at Google Cloud Composer, either by command line or by UI; make sure you select Python 3 when creating the project. Here, we create an environment named aircan-airflow. After creating your environment, it should appear in your environment list.

Override the configuration for dag_run_conf_overrides_params. Access the designated DAGs folder (which will be a bucket) and upload the contents of local/path/to/aircan/aircan to the bucket. Then enter the subdirectory dags and delete the __init__.py file in this folder; it conflicts with Google Cloud Composer configurations.

Similarly to what we did in Example 2, access your Airflow instance (created by Google Cloud Composer) and add CKAN_SITE_URL and CKAN_SYSADMIN_API_KEY as Variables. Now the DAGs must appear on the UI interface.

Let's assume you have a resource on https://demo.ckan.org/ with my-res-id-123 as its resource_id. We also assume you have, in the root of your DAG bucket on Google Cloud Platform, two files: one CSV file with the resource you want to upload, named r3.csv, with two columns, field1 and field2; and r4.json, an empty JSON file.

Since our DAGs expect parameters, you'll have to trigger them via CLI. For example, to trigger api_ckan_load_single_node, run (from your terminal):

gcloud composer environments run aircan-airflow \
    --location us-east1 \
    trigger_dag -- ckan_api_load_single_step \
    --conf='{ "resource_id": "my-res-id-123", "schema_fields_array": [ "field1", "field2" ], "csv_input": "/home/airflow/gcs/dags/r3.csv", "json_output": "/home/airflow/gcs/dags/r4.json" }'

Check the logs (tip: filter them by your DAG ID, for example, ckan_api_load_single_step). It should upload the data of your .csv file to demo.ckan successfully. |
airclick | AirClick: Python automation. Contributor: Beijing Aoyue Technology (北京奥悦科技) |
aircloak-tools | Python Aircloak Tools

Tools for querying an Aircloak API. This package contains two main components:

- Aircloak Api: wrapper around psycopg to query Aircloak directly.
- Explorer: an interface to Diffix Explorer for data analytics.

Aircloak Api

The main aim is to provide an Aircloak-friendly wrapper around psycopg2, and in particular to provide clear error messages when something doesn't go as planned. Query results are returned as pandas dataframes.

Explorer

Uses Diffix Explorer to return enhanced statistics. Please see the project homepage for further information about Explorer.

Installation

The package can be installed in your local environment using pip:

pip install aircloak-tools

To use Explorer features you will also need to run Diffix Explorer.

Example

The following code shows how to initiate a connection and execute a query. As a prerequisite you should have a username and password for the postgres interface of an Aircloak installation (ask your admin for these). Assign these values to the AIRCLOAK_PG_USER and AIRCLOAK_PG_PASSWORD environment variables.

from os import environ

import aircloak_tools as ac

AIRCLOAK_PG_HOST = "covid-db.aircloak.com"
AIRCLOAK_PG_PORT = 9432
AIRCLOAK_PG_USER = environ.get("AIRCLOAK_PG_USER")
AIRCLOAK_PG_PASSWORD = environ.get("AIRCLOAK_PG_PASSWORD")
TEST_DATASET = "cov_clear"

with ac.connect(host=AIRCLOAK_PG_HOST, port=AIRCLOAK_PG_PORT,
                user=AIRCLOAK_PG_USER, password=AIRCLOAK_PG_PASSWORD,
                dataset=TEST_DATASET) as conn:
    assert conn.is_connected()
    tables = conn.get_tables()
    print(tables)
    feeling_now_counts = conn.query('''
        select feeling_now, count(*), count_noise(*)
        from survey
        group by 1
        order by 1 desc
    ''')

The easiest way to use Diffix Explorer is with the Docker image on docker hub. More detailed information on running Diffix Explorer is available at the project repo. As an example, you can use explorer to generate sample data based on the anonymized dataset as follows:

from aircloak_tools import explorer

EXPLORER_URL = "http://localhost"
EXPLORER_PORT = 5000
DATASET = "gda_banking"
TABLE = "loans"
COLUMNS = ["amount", "duration"]

session = explorer.explorer_session(base_url=EXPLORER_URL, port=EXPLORER_PORT)
result = explorer.explore(session, DATASET, TABLE, COLUMNS)
assert result['status'] == 'Complete'

print(f'{COLUMNS[0]:>10}|{COLUMNS[1]:>10}')
for row in result['sampleData']:
    print(f'{row[0]:>10}|{row[1]:>10}')

# Should print something like:
#
#     amount |  duration
#      33000 |        12
#      43000 |        36
#      57000 |        12
#      91000 |        24
#      97000 |        48
#     101000 |        60
#
# etc. |
aircloudy | aircloudy

Aircloudy is an unofficial python library that allows management of RAC (Room Air Conditioner) units compatible with Hitachi Air Cloud. This project IS NOT endorsed by Hitachi and is distributed as-is without warranty.

Table of Contents: Installation, Usage, License, Development

Installation

pip install aircloudy

Usage

from __future__ import annotations

import asyncio

from aircloudy import HitachiAirCloud, InteriorUnit, compute_interior_unit_diff_description

def print_changes(dict: dict[int, tuple[InteriorUnit | None, InteriorUnit | None]]) -> None:
    for (id, change) in dict.items():
        print(f"Change on interior unit {id}: " + compute_interior_unit_diff_description(change[0], change[1]))

async def main() -> None:
    async with HitachiAirCloud("[email protected]", "top_secret") as ac:
        ac.on_change = print_changes

        unit_bureau = next((iu for iu in ac.interior_units if iu.name == "Bureau"), None)
        if unit_bureau is None:
            raise Exception("No unit named `Bureau`")

        await ac.set(unit_bureau.id, "ON")
        await ac.set(unit_bureau.id, requested_temperature=21, fan_speed="LV3")

        await asyncio.sleep(30)

asyncio.run(main())

License

aircloudy is distributed under modified HL3 license. See LICENSE.txt.

Development

poetry run task lint
poetry run task check
poetry run task test
poetry run task coverage

Notes

Fields not read/used from notification:

iduFrostWashStatus: IduFrostWashStatus
active: bool
priority: int
astUpdatedA: int
subCategory = None
errorCode = None
specialOperationStatus: SpecialOperationStatus
active: bool
priority: int
lastUpdatedAt: int
subCategory = None
errorCode = None
errorStatus: ErrorStatus
active: bool
priority: int
lastUpdatedAt: int
subCategory: str
errorCode = None
cloudId: str
opt4: int
holidayModeStatus: HolidayModeStatus
active: bool
priority: int
lastUpdatedAt: int
subCategory = None
errorCode = None
SysType: int

Fields not read/used from the API:

userId: str
iduFrostWash: bool
specialOperation: bool
criticalError: bool
zoneId: str |
airconditioner | No description available on PyPI. |
airconditioner-webthing | airconditioner_webthing

An air conditioner webthing connector. This project provides a webthing API for Midea air conditioners. The airconditioner_webthing package exposes an HTTP webthing endpoint which supports controlling the air conditioner via HTTP, e.g.:

# webthing has been started on host 192.168.0.23
curl http://192.168.0.23:7122/properties
{
"outdoor_temperature": 4,
"indoor_temperature": 22,
"target_temperature": 23,
"operational_mode": "heat",
"fan_speed": 102,
"power": false,
"run_util": ""
}

To install this software you may use the pip package manager as shown below:

sudo pip install airconditioner_webthing

After this installation you may start the webthing HTTP endpoint inside your python code or via the command line using:

sudo aircon --command listen --port 7122 --ip 10.31.33.90 --id 957548654462565

Here, the webthing API will be bound to the local port 7122.
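Once the endpoint is running, it can also be used from Python; here is a hypothetical sketch using the requests library against the example host above (following the generic WebThing property convention; whether a given property accepts writes depends on the thing's description):

```python
import requests

BASE = "http://192.168.0.23:7122"  # host/port from the curl example above

# Read all properties of the air conditioner
properties = requests.get(f"{BASE}/properties").json()
print(properties["indoor_temperature"])

# WebThing-style property write: PUT the new value keyed by the property name
requests.put(f"{BASE}/properties/target_temperature", json={"target_temperature": 22})
```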
Additionally, the IP address of the air conditioner has to be set, as well as the device id of the air conditioner. To discover the device id you may use the midea-msmart library as shown below:
By doing this the webthing service will be started automatically on boot. Starting the server manually using thelistencommand is no longer necessary.sudo aircon --command register --port 7122 --ip 10.31.33.90 --id 957548654462565 |
air-connection-app | REST API using Flask and Python [Air connections search]

The project consists of:

- images - folder with image sources
- tests - folder with UnitTests, Pytest
- app.py - main application resource
- db.py - resource for creating and filling the database
- requirements.txt - file for installing packages
- list.csv - list of flights for the database
- pytest.ini - pytest file
- config.cfg - file with config, flake8

To use the application you need to:

1. Install Python 3.6 or higher
2. Install requirements: pip install -r requirements.txt
3. Create the database for the application: python db.py

$ python db.py
20 Record Transferred

There are 3 environments to choose from, with different databases; choose one:

- testing - for local testing
- dev - needs to be configured manually
- prod - needs to be configured manually

4. Run the application: python app.py testing

$ python app.py
* Serving Flask app "app" (lazy loading)
* Environment: production
* Debug mode: on
* Restarting with stat
* Debugger is active!
* Debugger PIN: 231-078-686
* Running on http://127.0.0.1:5000/ (Press CTRL+C to quit)

Follow the link http://127.0.0.1:5000/

Endpoints

| Endpoint       | Request | Description                    |
|----------------|---------|--------------------------------|
| /flight        | GET     | Get All Flights                |
| /flight/id     | GET     | Get Single Flight              |
| /flight        | POST    | Create a Flight                |
| /flight/id     | PUT     | Update a Flight                |
| /flight/id     | DELETE  | Delete Flight                  |
| /flight/search | GET     | Flight search between 2 cities |

Start Using:

1. Get All Flights

curl -X GET http://127.0.0.1:5000/flight

2. Get Single Flight

curl -X GET http://127.0.0.1:5000/flight/10

3. Create a Flight

curl -X POST -H "Content-Type: application/json" -d '{"source": "Barcelona", "destination": "Palma de Mallorca", "flight_company": "Vueling", "flight_number": "FR3745", "flight_time": "55m", "free_seats": "5", "price": "50$"}' http://127.0.0.1:5000/flight

4. Update a Flight

curl -X PUT -H "Content-Type: application/json" -d '{"source": "Barcelona", "destination": "Palma de Mallorca", "flight_company": "Ryanair", "flight_number": "FR3785", "flight_time": "50m", "free_seats": "8", "price": "150$"}' http://127.0.0.1:5000/flight/11

5. Delete Flight

curl -X DELETE http://127.0.0.1:5000/flight/11

6. Flight search between 2 cities

http://127.0.0.1:5000/flight/search?c1=Milan&c2=Barcelona

UnitTests:

Download the project: Follow the link. Choose branch (feature/jdobc-255).

- Using Linter

Install packages

pip install flake8

Run flake8

flake8

$ flake8
0

- Using PyTest

Install pytest from pip

pip install pytest

Run pytest (a minimal test sketch is shown at the end of this section)

pytest -v

- Coverage by PyTest:

Install packages

pip install pytest-cov

Run command

pytest --cov=. --cov-config='.coveragerc' -c='pytest.ini'

GitLab CI Multi-project:

Flake8
Pytest
Coverage
Build artifact
Upload artifact to s3
Deploy terraform code
Logging for AWS CloudWatch

How to run the application locally in the test environment?

Run the application:

python app.py testing

The log output file "air-connection-app.log" will be created automatically. To see the streaming of application logs in AWS you need to deploy the application in AWS using the terraform template and go to CloudWatch logs.
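As a companion to the "- Using PyTest" section above, here is a minimal test sketch. It is not part of the project — it assumes app.py exposes the Flask instance as app (consistent with the 'Serving Flask app "app"' output shown earlier) and exercises only the documented GET endpoints:

```python
# test_flight_api.py — hypothetical test module, not shipped with the project.
# Assumption: `app` is the Flask instance defined in app.py.
from app import app


def test_get_all_flights():
    client = app.test_client()
    response = client.get("/flight")  # documented: Get All Flights
    assert response.status_code == 200


def test_search_between_cities():
    client = app.test_client()
    # documented: /flight/search?c1=...&c2=...
    response = client.get("/flight/search?c1=Milan&c2=Barcelona")
    assert response.status_code == 200
```

Run it with pytest -v as shown above. |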
aircopy | The CLI software to copy airtable data to the Billinge group database

Free software: 3-clause BSD license
Documentation: (COMING SOON!) https://st3107.github.io/aircopy

Installation

Install pip on your machine. Run the following code in the terminal.

pip install aircopy

Usage

Run the following command to find out the usage in the terminal.

aircopy --help

Features

TODO |
aircot | Classify aircraft in TAK

AirCOT is software for classifying aircraft within the Team Awareness Kit (TAK) ecosystem of products.

AirCOT is used by these TAK compatible products:

adsbcot: ADSBCOT is software for monitoring and analyzing aviation surveillance data via the Team Awareness Kit (TAK) ecosystem of products.
AirTAK: AirTAK is a low SWaP-C device for monitoring and analyzing aviation surveillance data via the Team Awareness Kit (TAK) ecosystem of products.

Documentation is available here.

License

Copyright Sensors & Signals LLC https://www.snstac.com

Licensed under the Apache License, Version 2.0 (the “License”);
you may not use this file except in compliance with the License.
You may obtain a copy of the License at http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an “AS IS” BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License. |
aircrack-gui | aircrack-gui

Aircrack-gui is a Python GUI for aircrack-ng using GTK 3.0. The priority was to make every step intuitive and easy.

Requirements

PyGObject version: > 3.38.0
Python version: 3: stable, 2: not tested.

Installation

git clone https://github.com/Cod3dDOT/aircrack-gui
cd aircrack-gui
pip install -r requirements.txt

IMPORTANT

You must use aircrack-gui only on networks you have permission to test. As this tool is only a GUI, it will not work if the tools it uses behind the scenes do not work: python, aircrack-ng, hcxtools.

Usage

Run:

cd aircrack-gui
python3 aircrack-gui.py

If any interface is found, a window will open with the option to choose an interface, scan, start airmon-ng, or open aircrack-ng.

Step 1: Set path (default: /home/SUDO_USER/Desktop/aircrack-ng/wifi/)

Step 2: Press 'Scan for networks', wait for ~5 seconds (the main window can become unresponsive, that's normal). A new window will show up with a network list. Choose the desired network, check that it has WPA2 encryption (right now WEP/WPA1 are not implemented), hit 'Start Airmon-ng on BSSID: NETWORK_BSSID'.

Step 3: The Aireplay-ng window will show up. Set the amount of deauth packets to send (default: 10) and wait for a station to appear (you can choose if several are found, or type in a station MAC address manually (format: xx-xx, xx:xx, xxxx)). Hit 'Run deauth (aireplay-ng)'. If you see 'Success' on top of the window, then a handshake was received successfully. If not, try changing the station or the amount of packets.

P.S.: If no stations are found, your signal strength is probably too low. Signal strength can be checked when you select your network in Step 2 and is measured from 0 to 100, higher being better.

Step 4: Now, you can close the aireplay-ng window. In the main window, press 'Open aircrack-ng', select the .cap file (capture file located wherever you set it to in Step 1). Select a wordlist, hit 'Start aircrack-ng / hashcat' and hope for the best ;)

P.S.: If you want to convert your .cap to .22000 manually, visit the official hashcat conversion website. Or, you can install hcxtools and .22000 files will be generated automatically; a sketch of the manual conversion route is shown below.
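The README only names hcxtools for the manual conversion. Assuming a current hcxtools release, the relevant tool is hcxpcapngtool, and the offline conversion would look roughly like this (the flags are an assumption — check hcxpcapngtool --help):

hcxpcapngtool -o capture.22000 capture.cap
hashcat -m 22000 capture.22000 wordlist.txt

Here -m 22000 is hashcat's WPA-PBKDF2-PMKID+EAPOL mode, which matches the .22000 files mentioned above.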
Command line arguments

Can be found by typing python3 aircrack-gui.py -h.

Application Options:
  --nokill           Do not run 'airmon-ng check kill'. Will retain internet connection on other devices, but is probably a bad idea.
  --noclean          Do not clean .csv files generated by airodump-ng when scanning for clients.
  --nolog            Do not print anything to console.
  --display=DISPLAY  X display to use

Todo

Change network scanning from nmcli to airmon-ng

Changelog

0.0.6 --- Added command line arguments. Files generated by airodump-ng (.csv, .netxml) are now automatically deleted. Minor log improvements and code formatting.
0.0.5 --- Added automatic .cap to .22000 conversion when selecting hashcat and hcxtools are installed. Minor UI updates.
0.0.4 --- Removed xterm windows (except for aircrack-ng/hashcat), fixed selected station deselecting/changing when updated.
0.0.3 --- Hashcat support.
0.0.2 --- README.md changes, etc.
0.0.1 --- Initial commit. |
aircraft | Aircraft

A collection of pyinfra packaged deploys that can be used
to declaratively configure services such as dnsmasq, apache2, and others. For now
only a few services are supported but more services such as GitLab or Jenkins
will become available in the future.

Video Introduction

View on YouTube
Configure PXE on a Pi

Project Rationale

I work with infrastructure. The kind of infrastructure where you have a bunch
of baremetal machines that don't have an OS installed. Where the only access
you have at the onset are their BMC IPs. In that case, I need some sort of
machine provisioning tool that is agentless. Truly agentless, not just Ansible
agentless.

In other situations, I might need to set up a CI/CD cluster. Now if you're like
me, you'd prefer to configure this cluster on top of as thin a technology stack
as possible (read: just bare OSes). The reason for this is because if you want
to deploy your CI/CD cluster on top of the best whiz-bang technology stack,
such as Kubernetes, you have to ensure that you have a CI/CD infrastructure in
place before you do...but that's exactly what's missing and what we're trying
to deploy! Again, what's needed here is an agentless infrastructure automation
tool.

In both situations, I've found Ansible usable for a time. However, after years
of using it, I've come to find it cumbersome. With its supposed-declarative
YAML-based DSL slowly transforming into a turing-complete language. At this point
one wonders why we don't just use an already proper language itself like, oh
I don't know, Python?

This is what brought me to pyinfra. This project builds
on top of pyinfra's good-enough implementation. It is an attempt at replicating
the Ansible project structure that I've been using for years as exemplified in
another project called relaxdiego/cicd.

Why Didn't You Just Use Terraform?

I have ample experience with Terraform in the past too and I've maintained the
"Terraform for provisioning, Ansible for configuration" dichotomy for some time.
I maintain that stand for this project but have changed it to "Terraform for
provisioning, Aircraft for configuration."

Usage

This project doesn't add wrappers around pyinfra, so once you get the hang
of how to use pyinfra,
then you can easily move on to some of the stuff I do in the examples/ dir
of this project.

Once you're comfortable with pyinfra and you start browsing the examples/ dir, you'll see that all I'm doing is adding pyinfra packaged deploys that you can use in your operations files — a small sketch of what such an operations file can look like is shown below. I've also created some pydantic models that go with the packaged deploys to help with validating inventory data. Anyway, check out the examples/ directory before I keep blabbering for ages.
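To make the shape of an operations file concrete, here is a minimal sketch. It is mine, not copied from the examples/ dir — it uses only stock pyinfra operations, and the dnsmasq choice simply echoes the services named at the top of this README; aircraft's own packaged deploys and pydantic inventory models will differ in their details:

```python
# deploy_dnsmasq.py — a minimal pyinfra operations file (illustrative sketch,
# not taken from aircraft's examples/ dir).
from pyinfra.operations import apt, systemd

apt.packages(
    name="Install dnsmasq",
    packages=["dnsmasq"],
    update=True,
)

systemd.service(
    name="Enable and start dnsmasq",
    service="dnsmasq",
    running=True,
    enabled=True,
)
```

You would run it with something like pyinfra inventory.py deploy_dnsmasq.py, where inventory.py lists your hosts.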
Developer's Guide

Prerequisites

Python 3
Make

Prepare Your Python Environment (pyenv style; one-time only)

You will need two additional dependencies for this style:

pyenv
pyenv-virtualenv

Once the above dependencies are installed, do the following:

Install an isolated environment for your preferred Python version.

python_version=<YOUR-PREFERRED-PYTHON-VERSION>
pyenv install --enable-shared $python_version

NOTE: For more available versions, run pyenv install --list

Create a virtualenv for this project

pyenv virtualenv $python_version aircraft

Add a .python-version file to this project dir

cat >.python-version<<EOF
aircraft
$python_version
EOF

Your newly created virtualenv should now be automatically activated if your
prompt changed to the following:

(aircraft) [email protected]

Or, should you happen to be using dotfiles.relaxdiego.com,
if it changed to the following:

... via 🐍 <YOUR-PREFERRED-PYTHON-VERSION> (aircraft)
in the previous step. This is thanks to the coordination of pyenv-virtualenv and
the.python-versionfile in the rootdir of this project.If youcd ..orcdanywhere else outside your project directory, the virtualenv
will automatically be deactivated. When youcdback into the project dir, the
virtualenv will automatically be activated.Prepare Your Python Environment (venv style)If you'd rather manage your virtualenv manually, this section is for you.
Create your virtual environment:python3 -m venv ./venvActivate it in every shell session where you intend to run make or
the unit testssource ./venv/bin/activateInstall The DependenciesInstall all development and runtime dependencies.WARNING: Make sure you are using a virtualenv before running this command. Since it
uses pip-sync to install dependencies, it will remove any package that is not
listed in either requirements-dev.in or setup.py. If you followed the steps
in any of the Prepare Your Development Environment sections above, then you
should be in good shape.

make dependencies

Adding A Development Dependency

Add it to requirements-dev.in and then run make:

echo "foo" >> requirements-dev.in
make dependencies

This will create requirements-dev.txt and then install all dependencies.

Commit requirements-dev.in and requirements-dev.txt. Both
files should now be updated and the foo package installed in your
local machine. Make sure to commit both files to the repo to let your
teammates know of the new dependency.

git add requirements-dev.*
git commit -m "Add foo to requirements-dev.txt"
git push origin

Adding A Runtime Dependency

Add it to the runtime_requirements list in setup.py and then run:

make dependencies

This will create requirements.txt and then install all dependencies.

Commit setup.py and ignore requirements.txt. We ignore the latter
since this is a library project which may be used with different versions
of its dependencies at development and run time.

git add setup.py
git commit -m "Add bar to requirements"
git push origin

Testing and Building the Charm

After any change in the library, you want to ensure that all unit tests
pass before building it. This can be easily done by running:

make test build

Viewing the Coverage Report

To view the coverage report, run the tests first and then run:

make coverage-server

This will run a simple web server on port 5000 that will serve the files
in the auto-generated htmlcov/ directory. You may leave this server running
in a separate session as you run the tests so that you can just switch back
to the browser and hit refresh to see the changes to your coverage down to
the line of code.

Other Make Goals

Run make help or check out the contents of Makefile.

Running the Tests in Multiple Python Versions

More often than not you want to be able to support more than one version of
Python. This is where tox comes in. Just run the following to get test
results for all Python versions listed in tox.ini's envlist config option:

tox

References

SecureBoot-Compatible UEFI netboot
dnsmasq
Fully Automated Ubuntu 20.04 Install
Configuring PXE Network Boot Server on Ubuntu 18.04 LTS
Ubuntu Network installation with PXE |
aircraft-carrier | AircraftCarrier

Booster for Web Project Carriers in Universities |
aircraft_classifiers_jme45 | aircraft_classifiers_jme45

Package of classifiers for aircraft classification, to be used in several other projects.
Based on FGVCA aircraft data (see https://arxiv.org/pdf/1306.5151.pdf), though this code allows one to use a subset.

This is not expected to be of sufficient general interest that other people might want to install it, but for my personal use it is useful to have it on pypi.

Installation

pip install aircraft_classifiers_jme45
aircraft-design | aircraft_design

To install the library officially, you can use the pip package manager. The following command line can be run in a terminal or command prompt:

pip install aircraft-design

The library will then be downloaded and installed into your Python development environment. It is also possible to install the latest version directly from the GitHub repository by running:

pip install git+https://github.com/NisusAerodesign/aircraft-design.git

Once that is done, you can import and use the library in your applications.

0.1. How to install

To install, simply get it from the PyPI repository itself* and it will be ready to use.

NISUS-aerodesign Aircraft Design Project

The aircraft-design project is an effort developed by members of the NISUS-aerodesign competition team with the goal of making aircraft analysis easier. The team uses the Vortex Lattice tool, developed at MIT*, to conduct these analyses.

The Vortex Lattice tool gives the team a detailed view of the aerodynamic properties of the aircraft, for example lift generation, drag, and pitching forces. This allows the team to improve the aircraft design, making it more efficient and safer to fly.

GitHub repository

1. aircraft_design.Wing

The Wing class is responsible for creating aerodynamic surfaces, such as wings and stabilizers. It has several parameters that can be adjusted to meet the specific needs of each project.

The table below lists each parameter of the Wing class, including its data type and default value (parameters without a default are required):

| Parameter        | Data type | Default  |
|------------------|-----------|----------|
| airfoil          | Path      | required |
| wingspan         | float     | required |
| mean_chord       | float     | required |
| taper_ratio      | float     | 1.0      |
| transition_point | float     | 0.0      |
| alpha_angle      | float     | 0.0      |
| sweep_angle      | float     | 0.0      |
| x_position       | float     | 0.0      |
| y_position       | float     | 0.0      |
| z_position       | float     | 0.0      |
| align            | str       | 'LE'     |
| name             | str       | 'wing'   |
| control          | list      | [None]   |
| panel_chordwise  | int       | 10       |
| panel_spanwise   | int       | 25       |

In addition, the Wing class has getters and setters for all of its elements, allowing its parameters to be manipulated easily and precisely. There are also other important methods, such as Wing().surface -> avl.Surface and Wing().reference_area() -> float, which provide valuable information about the wing surface and its reference area.

1.1. Plotting tool

The library has a plotting tool for better visualization of the built aircraft. To use it, simply call the plot() method on the Wing and Aircraft classes.

The following table lists the parameters that can be passed to the plot function:

| Parameter | Data type                 | Default |
|-----------|---------------------------|---------|
| figure    | matplotlib.figure \| None | None    |
| axis      | matplotlib.axis \| None   | None    |
| linewidth | float                     | 1.0     |
| color     | str                       | 'black' |

Both the Wing and Aircraft classes can receive a figure and an axis to fit the user's plotting conventions. The generated plot is three-dimensional.

2. aircraft_design.Aircraft

The Aircraft class is responsible for grouping the aerodynamic surfaces and making them executable within the flight simulation library's parameters. It is a fundamental element of the project, since it defines the airplane as a whole, and it is from this class that the simulations are run.

The main parameters that make up the Aircraft class are listed below:

| Parameter       | Data type | Default  |
|-----------------|-----------|----------|
| mach            | float     | required |
| ground_effect   | float     | required |
| reference_chord | float     | required |
| reference_span  | float     | required |
| surfaces_list   | list      | required |
| ref_point_x     | float     | 0.0      |
| ref_point_y     | float     | 0.0      |
| ref_point_z     | float     | 0.0      |

In addition to these parameters, the Aircraft class has getters and setters for all of them, as well as other methods that may be needed to run the simulations.

2.1. Generating the geometry

To run a simulation, the geometry must first be generated:

Aircraft().geometry(name: str)

2.2. Plotting tool

The plotting tool for the Aircraft module is fully compatible with the Wing module and takes the same parameters (see 1.1).

aircraft_design.Session

The Session class is responsible for running the code in AVL. To do so, the following parameters must be provided:

| Parameter | Data type         | Default  |
|-----------|-------------------|----------|
| geometry  | Aircraft.geometry | required |
| cases     | Case \| None      | None     |
| name      | str \| None       | None     |

The geometry variable represents the aircraft geometry that AVL will use to run the analyses. The cases variable is optional and represents the simulation cases to be executed in AVL. Finally, the name variable is also optional and represents the name of the session being run.

With these parameters in hand, the Session class can run the simulations in AVL, generating valuable information about the aircraft's behavior under different conditions.

aircraft_design.MultiSession

The MultiSession class is responsible for running multiple sessions in AVL using a parallel approach that takes advantage of the processing power of multiple CPU cores.

To do so, the MultiSession class has the following parameter:

| Parameter     | Data type     | Default  |
|---------------|---------------|----------|
| session_array | list[Session] | required |

The session_array variable represents the list of sessions to be executed in AVL: Session objects with their respective geometries and simulation cases.

The MultiSession class manages the workers, shares memory between them, and organizes the session execution queues. In this way, MultiSession allows multiple sessions to run in parallel, increasing the efficiency of the simulation process.

Using MultiSession is recommended when the number of sessions is greater than or equal to twice the number of CPU cores, so that the hardware's processing capacity can be fully utilized. A short usage sketch follows below.
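To make the documented parameters concrete, here is a minimal usage sketch assembled purely from the tables above. It is not from the project's documentation: the airfoil path and all numeric values are placeholders, and whether the constructors accept keyword arguments exactly as written is an assumption — check the repository's examples.

```python
# Illustrative sketch based only on the parameter tables above; the airfoil
# file and the numbers are placeholders, not values from the project's docs.
from pathlib import Path

from aircraft_design import Aircraft, Wing

wing = Wing(
    airfoil=Path("airfoils/naca2412.dat"),  # placeholder airfoil file
    wingspan=2.4,
    mean_chord=0.4,
    name="main_wing",
)

aircraft = Aircraft(
    mach=0.05,
    ground_effect=0.0,
    reference_chord=0.4,
    reference_span=2.4,
    surfaces_list=[wing],
)

aircraft.plot()                        # 3D plot, per section 2.2
geometry = aircraft.geometry("plane")  # geometry for a Session, per section 2.1
```

From there, the geometry can be handed to a Session or MultiSession as described above. |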
aircraft-list | aircraft_models

Documentation

Installation

Installation is easy using pip and will install all required libraries.

$ pip install aircraft-list

How to use aircraft_list

```python
from aircraft_list import aircraft_models

aircrafts = aircraft_models()
```

It returns a list of dictionaries which can be addressed as follows:

```python
for ac in aircrafts:
    ac['manufacturer']   # returns the aircraft manufacturer
    ac['model']          # returns the aircraft model
    ac['icao']           # returns the ICAO code designator
    ac['type']           # returns the aircraft type
    ac['engine']         # returns the aircraft engine type
    ac['engine_number']  # returns the aircraft engine number
    ac['wake']           # returns the aircraft wake turbulence category
```

The list of dictionaries is updated as per ICAO DOC 8643 - Aircraft Type Designators. A small filtering example follows below.
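Since each entry is a plain dict with the keys shown above, ordinary Python filtering works directly. In the example below, 'AIRBUS' and 'H' are guesses at how manufacturer strings and wake categories appear in the DOC 8643 data, so adjust them to the actual values:

```python
# Filter using only the documented dict keys; the literal values compared
# against ('H', 'AIRBUS') are assumptions about the underlying data.
from aircraft_list import aircraft_models

aircrafts = aircraft_models()

heavies = [ac for ac in aircrafts if ac['wake'] == 'H']
airbus_models = sorted({ac['model'] for ac in aircrafts
                        if 'AIRBUS' in ac['manufacturer'].upper()})

print(len(heavies), airbus_models[:5])
```

This is plain list/dict manipulation, so no extra API is needed. |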
aircraft-models | No description available on PyPI. |
aircraft-models-list | Failed to fetch description. HTTP Status Code: 404 |
airctrl | Supported OS · Supported Language

Welcome to AirControl

AirControl is an Open Source, Modular, Cross-Platform, and Extensible Flight Simulator for Deep Learning Research. AirControl offers a realistic simulation experience with a variety of airplanes. AirControl is built on the Unity game engine. The salient features of AirControl are:

Built with C#, it has a Python API to control it from your favorite deep learning framework.
Complete source code is open on GitHub.
AirControl takes full advantage of object-oriented programming. It is developed fully modular from day one. You can easily introduce new features such as vertical takeoff. You can bring your own alien plane to AirControl.
AirControl is truly cross-platform and can be compiled on Linux, macOS, and Windows. Binaries will be released for all the platforms.
AirControl uses Nvidia PhysX for the best possible Newtonian physics simulation.
AirControl allows users to take advantage of aerodynamic effects such as the Ground effect.
All the control surfaces (Throttle, Rudder, Ailerons, and Flaps) accept normalized input between -1 and 1. This makes AirControl even more friendly with AI.

System Requirement

It depends on how big your Unity environment is. The environment which comes with the AirControl binary releases is the basic one and was tested with the following config:

Operating System: Ubuntu, Windows, Mac
CPU: Intel Core i7
GPU: Nvidia 1070 or higher
RAM: 16 GB

AirControl may work with lower than the specified requirements, but it's not tested. You can run AirControl in server-client mode with two different machines, or both on a single machine.

Control Inputs

To change the keyboard layout mapping manually, refer to: keyboard-layout-editor. Export the layout as PNG and replace the file AirControl/docs/images/keyboard-layout.png. Submit a pull request.

Getting started

Version number (MAJOR.MINOR.PATCH) follows Semantic Versioning 2.0.0. Version numbers change as follows:

MAJOR version when making incompatible API changes
MINOR version when adding API in a backward-compatible manner
PATCH version when making backward-compatible bug fixes
Note that semantic versioning also applies to the Pypi release, Git release, and Snap releases.

%%{init: {'theme': 'base', 'themeVariables': { 'primaryColor': '#ffcccc', 'edgeLabelBackground':'#ffffee', 'tertiaryColor': '#fff0f0'}}}%%
graph TD
A[Skills] --> C[Not Familier with Unity Engine]
C --> C_1[Use Binaries]
C_1 --> D[Windows]
C_1 --> E[Ubuntu]
C_1 --> F[Mac]
A[Skills] --> B[Familier with Unity Engine]
B --> H[Wants to Add/Edit Airplane/Assets etc..]
H --> I[yes]
H --> J[No]
I --> K[Build from source]
J --> C_1

Windows - Tested
Download Binaries - https://github.com/snlpatel001213/AirControl/releases
Build it from source

Linux - Tested
Download Binaries - https://github.com/snlpatel001213/AirControl/releases
Build it from source

macOS - Not Tested [Need Contributors]
Download Binaries - https://github.com/snlpatel001213/AirControl/releases
Build it from source

Python Package Installation

cd Python
python3 setup.py install

OR use relative imports.

Pypi Release - Alpha

Not tested: https://pypi.org/project/aircontrol-python/

pip install airctrl

Snap Release - Alpha

Install: snap install aircontrol
Invoke: aircontrol.AirControl

Fly manually or use the Python API.

Documentation

AirControl Documentation: https://aircontrol.readthedocs.io/
C#/Python API Documentation - https://snlpatel001213.github.io/AirControl/html/index.html

Client examples

| Sr. No. | Client Example        | Details |
|---------|-----------------------|---------|
| 1       | Primitive API         | Simple client to interact with the server. It does not require the AirControl Pypi package. Just for unit tests, not for long runs. |
| 2       | Primitive API - 2     | Simple client to interact with the server. More detailed than the previous one. An end-to-end flight loop is demonstrated. It does not require the AirControl Pypi package. Just for unit tests, not for long runs. |
| 3       | Lidar Controls        | Demonstrates how to control lidar from the Python client. |
| 4       | Camera Controls       | Demonstrates how to control the camera from the Python client. It allows switching the camera and capturing Depth, Semantic segmentation, Object segmentation, and Optical flow variants of the scene. |
| 5       | Time of Day Controls  | Allows controlling the time of day and light conditions. It allows controlling sun position based on Longitude, Latitude, Hour, and Minutes. |
| 6       | UI and Audio Controls | Allows controlling the visibility of airplane controls in the UI and the airplane audio. |

Future Release

Refer to the Project page for the future release, features, and bug tracking: https://github.com/snlpatel001213/AirControl/projects/1

Tools and Technology

Contribute

We love your input! We want to make contributing to AirControl as easy and transparent as possible. Please see our Contributing Guide CONTRIBUTING.md to get started. Thank you to all our contributors!

Current Contributors |
aircv | False |
air-db | air-db

air-db is a data access layer (DAL) to easily query atmospheric time series datasets from various sources. air-db does not include any database; the corresponding database must be installed for air-db to work properly.

To install the sample database:

```python
from airdb import Database

Database.install_sample()
```

A database query can be implemented as follows:

```python
from airdb import Database

db = Database('samp', return_type='df')

q = db.query(
    param=['so2', 'pm10'],
    city='istanbul',
    date=['>2010-05-10', '<2012-10-07'],
    month=5,
)
print(q)

del db  # close connection to database
```

and the output is:

param reg city sta date value
0 pm10 marmara istanbul çatladıkapı 2010-05-10 00:00:00 0.798218
1 pm10 marmara istanbul çatladıkapı 2010-05-10 01:00:00 0.946180
2 pm10 marmara istanbul çatladıkapı 2010-05-10 02:00:00 0.884385
3 pm10 marmara istanbul çatladıkapı 2010-05-10 03:00:00 0.537993
4 pm10 marmara istanbul çatladıkapı 2010-05-10 04:00:00 0.136689
... ... ... ... ... ... ...
16123 so2 marmara istanbul şirinevler mthm 2012-05-31 19:00:00 0.697663
16124 so2 marmara istanbul şirinevler mthm 2012-05-31 20:00:00 0.615755
16125 so2 marmara istanbul şirinevler mthm 2012-05-31 21:00:00 0.489289
16126 so2 marmara istanbul şirinevler mthm 2012-05-31 22:00:00 0.385102
16127 so2 marmara istanbul şirinevler mthm 2012-05-31 23:00:00 0.039451
[16128 rows x 6 columns] |