# airbyte-source-google-sheets

## Google Sheets source connector

This is the repository for the Google Sheets source connector, written in Python. For information about how to use this connector within Airbyte, see the documentation.

### Local development

#### Prerequisites

- Python (~=3.9)
- Poetry (~=1.7) - installation instructions here

#### Installing the connector

From this connector directory, run:

```bash
poetry install --with dev
```

#### Create credentials

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_google_sheets/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `sample_files/sample_config.json` for a sample config file.

#### Locally running the connector

```bash
poetry run source-google-sheets spec
poetry run source-google-sheets check --config secrets/config.json
poetry run source-google-sheets discover --config secrets/config.json
poetry run source-google-sheets read --config secrets/config.json --catalog sample_files/configured_catalog.json
```

#### Running unit tests

To run unit tests locally, from the connector directory run:

```bash
poetry run pytest unit_tests
```

#### Building the docker image

Install `airbyte-ci`, then run the following command to build the docker image:

```bash
airbyte-ci connectors --name=source-google-sheets build
```

An image will be available on your host with the tag `airbyte/source-google-sheets:dev`.

#### Running as a docker container

Then run any of the connector commands as follows:

```bash
docker run --rm airbyte/source-google-sheets:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-google-sheets:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-google-sheets:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-google-sheets:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
```

#### Running our CI test suite

You can run our full test suite locally using `airbyte-ci`:

```bash
airbyte-ci connectors --name=source-google-sheets test
```

#### Customizing acceptance tests

Customize the `acceptance-test-config.yml` file to configure acceptance tests. See Connector Acceptance Tests for more information. If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside `integration_tests/acceptance.py`.

#### Dependency management

All of your dependencies should be managed via Poetry. To add a new dependency, run:

```bash
poetry add <package-name>
```

Please commit the changes to the `pyproject.toml` and `poetry.lock` files.

### Publishing a new version of the connector

You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-google-sheets test`
2. Bump the connector version (please follow semantic versioning for connectors):
   - bump the `dockerImageTag` value in `metadata.yaml`
   - bump the `version` value in `pyproject.toml`
3. Make sure the `metadata.yaml` content is up to date.
4. Make sure the connector documentation and its changelog are up to date (`docs/integrations/sources/google-sheets.md`).
5. Create a Pull Request: use our PR naming conventions.
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.
8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry.
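The acceptance-test fixture hook described above can be sketched as follows. This is a hedged example: `create_test_resource` and `delete_test_resource` are hypothetical helpers standing in for whatever your connector actually needs to provision, not part of any real connector API.

```python
# Hypothetical sketch of integration_tests/acceptance.py.
# create_test_resource/delete_test_resource are illustrative stand-ins.
import pytest


def create_test_resource():
    # A real connector might create a temporary spreadsheet or record here.
    return {"id": "acceptance-test-resource"}


def delete_test_resource(resource):
    # Tear down whatever create_test_resource provisioned.
    resource.clear()


@pytest.fixture(scope="session", autouse=True)
def connector_setup():
    """Create external resources before the acceptance suite and clean up after."""
    resource = create_test_resource()
    yield resource
    delete_test_resource(resource)
```

Because the fixture is `autouse` and session-scoped, setup runs once before the whole acceptance suite and teardown runs once after, regardless of which tests execute.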
# airbyte-source-google-webfonts

## Google Webfonts Source

This is the repository for the Google Webfonts configuration based source connector. For information about how to use this connector within Airbyte, see the documentation.

### Local development

To iterate on this connector, make sure to complete the prerequisites below.

From this connector directory, create a virtual environment:

```bash
python -m venv .venv
```

This will generate a virtualenv for this module in `.venv/`. Make sure this venv is active in your development environment of choice. To activate it from the terminal, run:

```bash
source .venv/bin/activate
pip install -r requirements.txt
```

If you are in an IDE, follow your IDE's instructions to activate the virtualenv.

Note that while we are installing dependencies from `requirements.txt`, you should only edit `setup.py` for your dependencies. `requirements.txt` is used for editable installs (`pip install -e`) to pull in Python dependencies from the monorepo and will call `setup.py`. If this is mumbo jumbo to you, don't worry about it: just put your deps in `setup.py` but install using `pip install -r requirements.txt`, and everything should work as you expect.

### Create credentials

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_google_webfonts/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `integration_tests/sample_config.json` for a sample config file.

If you are an Airbyte core member, copy the credentials from Lastpass under the secret name `source google-webfonts test creds` and place them into `secrets/config.json`.

### Building the docker image

Via `airbyte-ci` (recommended):

```bash
airbyte-ci connectors --name=source-google-webfonts build
```

An image will be built with the tag `airbyte/source-google-webfonts:dev`.

Via `docker build`:

```bash
docker build -t airbyte/source-google-webfonts:dev .
```

### Running as a docker container

Then run any of the connector commands as follows:

```bash
docker run --rm airbyte/source-google-webfonts:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-google-webfonts:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-google-webfonts:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-google-webfonts:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
```

### Running our CI test suite

You can run our full test suite locally using `airbyte-ci`:

```bash
airbyte-ci connectors --name=source-google-webfonts test
```

### Customizing acceptance tests

Customize the `acceptance-test-config.yml` file to configure tests. See Connector Acceptance Tests for more information. If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside `integration_tests/acceptance.py`.

### Dependency management

All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups:

- dependencies required for your connector to work go in the `MAIN_REQUIREMENTS` list.
- dependencies required for testing go in the `TEST_REQUIREMENTS` list.

### Publishing a new version of the connector

You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-google-webfonts test`
2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow semantic versioning for connectors.
3. Make sure the `metadata.yaml` content is up to date.
4. Make sure the connector documentation and its changelog are up to date (`docs/integrations/sources/google-webfonts.md`).
5. Create a Pull Request: use our PR naming conventions.
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.
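The `MAIN_REQUIREMENTS`/`TEST_REQUIREMENTS` split described above can be sketched like this. It is a minimal, hypothetical `setup.py` shape: the dependency names and version pins are illustrative, and a real `setup.py` would call `setup(**kwargs)` unconditionally at module level rather than deferring it to a helper.

```python
# Hypothetical setup.py sketch; dependency names and pins are illustrative.
from setuptools import find_packages

MAIN_REQUIREMENTS = [
    "airbyte-cdk",  # needed for the connector to run
]

TEST_REQUIREMENTS = [
    "pytest~=6.2",  # needed only by the test suite
]


def build_setup_kwargs():
    # A real setup.py would pass these straight to setuptools.setup(**kwargs).
    return {
        "name": "source_google_webfonts",
        "packages": find_packages(),
        "install_requires": MAIN_REQUIREMENTS,
        "extras_require": {"tests": TEST_REQUIREMENTS},
    }
```

Putting test-only packages under `extras_require` is what makes `pip install '.[tests]'` pull them in without burdening normal installs.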
# airbyte-source-greenhouse

## Greenhouse source connector

This is the repository for the Greenhouse source connector, written in Python. For information about how to use this connector within Airbyte, see the documentation.

### Local development

#### Prerequisites

- Python (~=3.9)
- Poetry (~=1.7) - installation instructions here

#### Installing the connector

From this connector directory, run:

```bash
poetry install --with dev
```

#### Create credentials

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_greenhouse/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `sample_files/sample_config.json` for a sample config file.

#### Locally running the connector

```bash
poetry run source-greenhouse spec
poetry run source-greenhouse check --config secrets/config.json
poetry run source-greenhouse discover --config secrets/config.json
poetry run source-greenhouse read --config secrets/config.json --catalog sample_files/configured_catalog.json
```

#### Running unit tests

To run unit tests locally, from the connector directory run:

```bash
poetry run pytest unit_tests
```

#### Building the docker image

Install `airbyte-ci`, then run the following command to build the docker image:

```bash
airbyte-ci connectors --name=source-greenhouse build
```

An image will be available on your host with the tag `airbyte/source-greenhouse:dev`.

#### Running as a docker container

Then run any of the connector commands as follows:

```bash
docker run --rm airbyte/source-greenhouse:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-greenhouse:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-greenhouse:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-greenhouse:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
```

#### Running our CI test suite

You can run our full test suite locally using `airbyte-ci`:

```bash
airbyte-ci connectors --name=source-greenhouse test
```

#### Customizing acceptance tests

Customize the `acceptance-test-config.yml` file to configure acceptance tests. See Connector Acceptance Tests for more information. If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside `integration_tests/acceptance.py`.

#### Dependency management

All of your dependencies should be managed via Poetry. To add a new dependency, run:

```bash
poetry add <package-name>
```

Please commit the changes to the `pyproject.toml` and `poetry.lock` files.

### Publishing a new version of the connector

You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-greenhouse test`
2. Bump the connector version (please follow semantic versioning for connectors):
   - bump the `dockerImageTag` value in `metadata.yaml`
   - bump the `version` value in `pyproject.toml`
3. Make sure the `metadata.yaml` content is up to date.
4. Make sure the connector documentation and its changelog are up to date (`docs/integrations/sources/greenhouse.md`).
5. Create a Pull Request: use our PR naming conventions.
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.
8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry.
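The `read` commands above emit newline-delimited Airbyte protocol messages on stdout. As a rough sketch (assuming the standard message envelope with a top-level `type` field), you can filter out just the record payloads like this:

```python
import json


def extract_records(stdout_lines):
    """Collect the data payloads of RECORD messages from connector output."""
    records = []
    for line in stdout_lines:
        try:
            message = json.loads(line)
        except json.JSONDecodeError:
            continue  # connectors may also print plain log lines
        if message.get("type") == "RECORD":
            records.append(message["record"]["data"])
    return records


# Hypothetical sample output; the "jobs" stream and its fields are made up.
sample = [
    '{"type": "LOG", "log": {"level": "INFO", "message": "Starting sync"}}',
    '{"type": "RECORD", "record": {"stream": "jobs", "data": {"id": 1}, "emitted_at": 0}}',
]
```

Piping `docker run ... read ...` into a script like this is a quick way to eyeball what a stream actually returns.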
# airbyte-source-gridly

## Gridly Source

This is the repository for the Gridly source connector, written in Python. For information about how to use this connector within Airbyte, see the documentation.

### Local development

To iterate on this connector, make sure to complete the prerequisites below.

From this connector directory, create a virtual environment:

```bash
python -m venv .venv
```

This will generate a virtualenv for this module in `.venv/`. Make sure this venv is active in your development environment of choice. To activate it from the terminal, run:

```bash
source .venv/bin/activate
pip install -r requirements.txt
pip install '.[tests]'
```

If you are in an IDE, follow your IDE's instructions to activate the virtualenv.

Note that while we are installing dependencies from `requirements.txt`, you should only edit `setup.py` for your dependencies. `requirements.txt` is used for editable installs (`pip install -e`) to pull in Python dependencies from the monorepo and will call `setup.py`. If this is mumbo jumbo to you, don't worry about it: just put your deps in `setup.py` but install using `pip install -r requirements.txt`, and everything should work as you expect.

### Create credentials

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_gridly/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `integration_tests/sample_config.json` for a sample config file.

If you are an Airbyte core member, copy the credentials from Lastpass under the secret name `source gridly test creds` and place them into `secrets/config.json`.

### Locally running the connector

```bash
python main.py spec
python main.py check --config secrets/config.json
python main.py discover --config secrets/config.json
python main.py read --config secrets/config.json --catalog integration_tests/configured_catalog.json
```

### Building the docker image

Via `airbyte-ci` (recommended):

```bash
airbyte-ci connectors --name=source-gridly build
```

An image will be built with the tag `airbyte/source-gridly:dev`.

Via `docker build`:

```bash
docker build -t airbyte/source-gridly:dev .
```

### Running as a docker container

Then run any of the connector commands as follows:

```bash
docker run --rm airbyte/source-gridly:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-gridly:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-gridly:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-gridly:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
```

### Running our CI test suite

You can run our full test suite locally using `airbyte-ci`:

```bash
airbyte-ci connectors --name=source-gridly test
```

### Customizing acceptance tests

Customize the `acceptance-test-config.yml` file to configure tests. See Connector Acceptance Tests for more information. If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside `integration_tests/acceptance.py`.

### Dependency management

All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups:

- dependencies required for your connector to work go in the `MAIN_REQUIREMENTS` list.
- dependencies required for testing go in the `TEST_REQUIREMENTS` list.

### Publishing a new version of the connector

You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-gridly test`
2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow semantic versioning for connectors.
3. Make sure the `metadata.yaml` content is up to date.
4. Make sure the connector documentation and its changelog are up to date (`docs/integrations/sources/gridly.md`).
5. Create a Pull Request: use our PR naming conventions.
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.
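The `--catalog` argument in the read commands above points at a configured catalog. A minimal, hypothetical `integration_tests/configured_catalog.json` might look roughly like the following; the stream name and empty schema here are made up for illustration, and the real file must match what the connector's `discover` command returns:

```json
{
  "streams": [
    {
      "stream": {
        "name": "records",
        "json_schema": {},
        "supported_sync_modes": ["full_refresh"]
      },
      "sync_mode": "full_refresh",
      "destination_sync_mode": "overwrite"
    }
  ]
}
```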
# airbyte-source-gutendex

## Gutendex Source

This is the repository for the Gutendex configuration based source connector. For information about how to use this connector within Airbyte, see the documentation.

### Create credentials

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_gutendex/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `integration_tests/sample_config.json` for a sample config file.

If you are an Airbyte core member, copy the credentials from Lastpass under the secret name `source gutendex test creds` and place them into `secrets/config.json`.

### Building the docker image

Via `airbyte-ci` (recommended):

```bash
airbyte-ci connectors --name=source-gutendex build
```

An image will be built with the tag `airbyte/source-gutendex:dev`.

Via `docker build`:

```bash
docker build -t airbyte/source-gutendex:dev .
```

### Running as a docker container

Then run any of the connector commands as follows:

```bash
docker run --rm airbyte/source-gutendex:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-gutendex:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-gutendex:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-gutendex:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
```

### Running our CI test suite

You can run our full test suite locally using `airbyte-ci`:

```bash
airbyte-ci connectors --name=source-gutendex test
```

### Customizing acceptance tests

Customize the `acceptance-test-config.yml` file to configure tests. See Connector Acceptance Tests for more information. If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside `integration_tests/acceptance.py`.

### Dependency management

All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups:

- dependencies required for your connector to work go in the `MAIN_REQUIREMENTS` list.
- dependencies required for testing go in the `TEST_REQUIREMENTS` list.

### Publishing a new version of the connector

You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-gutendex test`
2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow semantic versioning for connectors.
3. Make sure the `metadata.yaml` content is up to date.
4. Make sure the connector documentation and its changelog are up to date (`docs/integrations/sources/gutendex.md`).
5. Create a Pull Request: use our PR naming conventions.
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.
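The version-bump step above asks you to increment `dockerImageTag` following semantic versioning. For a backwards-compatible fix that means incrementing the patch component, as this trivial illustration (not an Airbyte tool) shows:

```python
def bump_patch(version: str) -> str:
    """Increment the patch component of a MAJOR.MINOR.PATCH version string."""
    major, minor, patch = version.split(".")
    return f"{major}.{minor}.{int(patch) + 1}"
```

New streams or config options would instead bump the minor component, and breaking changes to the spec or schemas the major one.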
# airbyte-source-harness

## Harness Source

This is the repository for the Harness configuration based source connector. For information about how to use this connector within Airbyte, see the documentation.

### Create credentials

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_harness/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `integration_tests/sample_config.json` for a sample config file.

If you are an Airbyte core member, copy the credentials from Lastpass under the secret name `source harness test creds` and place them into `secrets/config.json`.

### Building the docker image

Via `airbyte-ci` (recommended):

```bash
airbyte-ci connectors --name=source-harness build
```

An image will be built with the tag `airbyte/source-harness:dev`.

Via `docker build`:

```bash
docker build -t airbyte/source-harness:dev .
```

### Running as a docker container

Then run any of the connector commands as follows:

```bash
docker run --rm airbyte/source-harness:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-harness:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-harness:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-harness:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
```

### Running our CI test suite

You can run our full test suite locally using `airbyte-ci`:

```bash
airbyte-ci connectors --name=source-harness test
```

### Customizing acceptance tests

Customize the `acceptance-test-config.yml` file to configure tests. See Connector Acceptance Tests for more information. If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside `integration_tests/acceptance.py`.

### Dependency management

All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups:

- dependencies required for your connector to work go in the `MAIN_REQUIREMENTS` list.
- dependencies required for testing go in the `TEST_REQUIREMENTS` list.

### Publishing a new version of the connector

You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-harness test`
2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow semantic versioning for connectors.
3. Make sure the `metadata.yaml` content is up to date.
4. Make sure the connector documentation and its changelog are up to date (`docs/integrations/sources/harness.md`).
5. Create a Pull Request: use our PR naming conventions.
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.
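`secrets/config.json` must conform to `source_harness/spec.yaml`. As a purely hypothetical illustration of the shape such a file takes (the actual field names are defined by the spec, not by this example):

```json
{
  "api_key": "<your-api-key>",
  "account_id": "<your-account-id>"
}
```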
# airbyte-source-harvest

## Harvest source connector

This is the repository for the Harvest source connector, written in Python. For information about how to use this connector within Airbyte, see the documentation.

### Local development

#### Prerequisites

- Python (~=3.9)
- Poetry (~=1.7) - installation instructions here

#### Installing the connector

From this connector directory, run:

```bash
poetry install --with dev
```

#### Create credentials

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_harvest/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `sample_files/sample_config.json` for a sample config file.

#### Locally running the connector

```bash
poetry run source-harvest spec
poetry run source-harvest check --config secrets/config.json
poetry run source-harvest discover --config secrets/config.json
poetry run source-harvest read --config secrets/config.json --catalog sample_files/configured_catalog.json
```

#### Running unit tests

To run unit tests locally, from the connector directory run:

```bash
poetry run pytest unit_tests
```

#### Building the docker image

Install `airbyte-ci`, then run the following command to build the docker image:

```bash
airbyte-ci connectors --name=source-harvest build
```

An image will be available on your host with the tag `airbyte/source-harvest:dev`.

#### Running as a docker container

Then run any of the connector commands as follows:

```bash
docker run --rm airbyte/source-harvest:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-harvest:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-harvest:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-harvest:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
```

#### Running our CI test suite

You can run our full test suite locally using `airbyte-ci`:

```bash
airbyte-ci connectors --name=source-harvest test
```

#### Customizing acceptance tests

Customize the `acceptance-test-config.yml` file to configure acceptance tests. See Connector Acceptance Tests for more information. If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside `integration_tests/acceptance.py`.

#### Dependency management

All of your dependencies should be managed via Poetry. To add a new dependency, run:

```bash
poetry add <package-name>
```

Please commit the changes to the `pyproject.toml` and `poetry.lock` files.

### Publishing a new version of the connector

You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-harvest test`
2. Bump the connector version (please follow semantic versioning for connectors):
   - bump the `dockerImageTag` value in `metadata.yaml`
   - bump the `version` value in `pyproject.toml`
3. Make sure the `metadata.yaml` content is up to date.
4. Make sure the connector documentation and its changelog are up to date (`docs/integrations/sources/harvest.md`).
5. Create a Pull Request: use our PR naming conventions.
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.
8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry.
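The docker invocations above all follow one pattern: mount whichever local directories the command needs, then pass the connector command and its file arguments. A small helper (hypothetical, written only to make that pattern explicit) that assembles the argument list:

```python
import os


def docker_run_args(image, command, config=None, catalog=None):
    """Build a `docker run` argv for an Airbyte connector command (sketch)."""
    args = ["docker", "run", "--rm"]
    if config or catalog:
        # check/discover/read need the local secrets directory mounted.
        args += ["-v", f"{os.getcwd()}/secrets:/secrets"]
    if catalog:
        # read additionally needs the integration_tests directory for the catalog.
        args += ["-v", f"{os.getcwd()}/integration_tests:/integration_tests"]
    args.append(image)
    args.append(command)
    if config:
        args += ["--config", config]
    if catalog:
        args += ["--catalog", catalog]
    return args
```

For example, `docker_run_args("airbyte/source-harvest:dev", "spec")` yields the plain `spec` invocation with no mounts, matching the first command above.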
# airbyte-source-hellobaton

## Hellobaton Source

This is the repository for the Hellobaton configuration based source connector. For information about how to use this connector within Airbyte, see the documentation.

### Create credentials

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_hellobaton/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `integration_tests/sample_config.json` for a sample config file.

If you are an Airbyte core member, copy the credentials from Lastpass under the secret name `source hellobaton test creds` and place them into `secrets/config.json`.

### Building the docker image

Via `airbyte-ci` (recommended):

```bash
airbyte-ci connectors --name=source-hellobaton build
```

An image will be built with the tag `airbyte/source-hellobaton:dev`.

Via `docker build`:

```bash
docker build -t airbyte/source-hellobaton:dev .
```

### Running as a docker container

Then run any of the connector commands as follows:

```bash
docker run --rm airbyte/source-hellobaton:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-hellobaton:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-hellobaton:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-hellobaton:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
```

### Running our CI test suite

You can run our full test suite locally using `airbyte-ci`:

```bash
airbyte-ci connectors --name=source-hellobaton test
```

### Customizing acceptance tests

Customize the `acceptance-test-config.yml` file to configure tests. See Connector Acceptance Tests for more information. If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside `integration_tests/acceptance.py`.

### Dependency management

All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups:

- dependencies required for your connector to work go in the `MAIN_REQUIREMENTS` list.
- dependencies required for testing go in the `TEST_REQUIREMENTS` list.

### Publishing a new version of the connector

You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-hellobaton test`
2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow semantic versioning for connectors.
3. Make sure the `metadata.yaml` content is up to date.
4. Make sure the connector documentation and its changelog are up to date (`docs/integrations/sources/hellobaton.md`).
5. Create a Pull Request: use our PR naming conventions.
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.
# airbyte-source-hubplanner

## Hubplanner Source

This is the repository for the Hubplanner configuration based source connector. For information about how to use this connector within Airbyte, see the documentation.

### Create credentials

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_hubplanner/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `integration_tests/sample_config.json` for a sample config file.

If you are an Airbyte core member, copy the credentials from Lastpass under the secret name `source hubplanner test creds` and place them into `secrets/config.json`.

### Building the docker image

Via `airbyte-ci` (recommended):

```bash
airbyte-ci connectors --name=source-hubplanner build
```

An image will be built with the tag `airbyte/source-hubplanner:dev`.

Via `docker build`:

```bash
docker build -t airbyte/source-hubplanner:dev .
```

### Running as a docker container

Then run any of the connector commands as follows:

```bash
docker run --rm airbyte/source-hubplanner:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-hubplanner:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-hubplanner:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-hubplanner:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
```

### Running our CI test suite

You can run our full test suite locally using `airbyte-ci`:

```bash
airbyte-ci connectors --name=source-hubplanner test
```

### Customizing acceptance tests

Customize the `acceptance-test-config.yml` file to configure tests. See Connector Acceptance Tests for more information. If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside `integration_tests/acceptance.py`.

### Dependency management

All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups:

- dependencies required for your connector to work go in the `MAIN_REQUIREMENTS` list.
- dependencies required for testing go in the `TEST_REQUIREMENTS` list.

### Publishing a new version of the connector

You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-hubplanner test`
2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow semantic versioning for connectors.
3. Make sure the `metadata.yaml` content is up to date.
4. Make sure the connector documentation and its changelog are up to date (`docs/integrations/sources/hubplanner.md`).
5. Create a Pull Request: use our PR naming conventions.
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.
airbyte-source-hubspot
Hubspot source connector

This is the repository for the Hubspot source connector, written in Python. For information about how to use this connector within Airbyte, see the documentation.

Local development

Prerequisites: Python (~=3.9) and Poetry (~=1.7) - installation instructions here.

Installing the connector
From this connector directory, run:
poetry install --with dev

Create credentials
If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_hubspot/spec.yaml file. Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See sample_files/sample_config.json for a sample config file.

Locally running the connector
poetry run source-hubspot spec
poetry run source-hubspot check --config secrets/config.json
poetry run source-hubspot discover --config secrets/config.json
poetry run source-hubspot read --config secrets/config.json --catalog sample_files/configured_catalog.json

Running unit tests
To run unit tests locally, from the connector directory run:
poetry run pytest unit_tests

Building the docker image
Install airbyte-ci, then run the following command to build the docker image:
airbyte-ci connectors --name=source-hubspot build
An image will be available on your host with the tag airbyte/source-hubspot:dev.

Running as a docker container
Run any of the connector commands as follows:
docker run --rm airbyte/source-hubspot:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-hubspot:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-hubspot:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-hubspot:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json

Running our CI test suite
You can run our full test suite locally using airbyte-ci:
airbyte-ci connectors --name=source-hubspot test

Customizing acceptance tests
Customize the acceptance-test-config.yml file to configure acceptance tests. See Connector Acceptance Tests for more information. If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py.

Dependency management
All of your dependencies should be managed via Poetry. To add a new dependency, run:
poetry add <package-name>
Please commit the changes to the pyproject.toml and poetry.lock files.

Publishing a new version of the connector
You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what?
1. Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-hubspot test
2. Bump the connector version (please follow semantic versioning for connectors): bump the dockerImageTag value in metadata.yaml and the version value in pyproject.toml.
3. Make sure the metadata.yaml content is up to date.
4. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/hubspot.md).
5. Create a Pull Request: use our PR naming conventions.
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.
Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry.
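The read command above expects a ConfiguredCatalog file. As a minimal sketch of what that file looks like, the snippet below writes one for a single hypothetical stream named "contacts" — the real stream names and schemas come from the connector's discover output:

```python
import json

# Hypothetical example: "contacts" is a placeholder stream name. Use a
# stream reported by the connector's `discover` command in practice.
configured_catalog = {
    "streams": [
        {
            "stream": {
                "name": "contacts",
                "json_schema": {},
                "supported_sync_modes": ["full_refresh"],
            },
            "sync_mode": "full_refresh",
            "destination_sync_mode": "overwrite",
        }
    ]
}

# Write the catalog so it can be passed via --catalog configured_catalog.json
with open("configured_catalog.json", "w") as f:
    json.dump(configured_catalog, f, indent=2)
```

Pass the resulting file to `read` via the `--catalog` flag shown above.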
airbyte-source-insightly
Insightly Source

This is the repository for the Insightly configuration-based source connector. For information about how to use this connector within Airbyte, see the documentation.

You can also build the connector in Gradle. This is typically used in CI and not needed for your development workflow. To build using Gradle, from the Airbyte repository root, run:
./gradlew :airbyte-integrations:connectors:source-insightly:build

Create credentials
If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_insightly/spec.yaml file. Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See integration_tests/sample_config.json for a sample config file. If you are an Airbyte core member, copy the credentials in Lastpass under the secret name "source insightly test creds" and place them into secrets/config.json.

Building the docker image
Via airbyte-ci (recommended):
airbyte-ci connectors --name=source-insightly build
An image will be built with the tag airbyte/source-insightly:dev.
Via docker build:
docker build -t airbyte/source-insightly:dev .

Then run any of the connector commands as follows:
docker run --rm airbyte/source-insightly:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-insightly:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-insightly:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-insightly:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json

Testing
You can run our full test suite locally using airbyte-ci:
airbyte-ci connectors --name=source-insightly test
Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information. If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py.

Dependency management
All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies into two groups: dependencies required for your connector to work go in the MAIN_REQUIREMENTS list, and dependencies required for testing go in the TEST_REQUIREMENTS list.

Publishing a new version of the connector
You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what?
1. Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-insightly test
2. Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors.
3. Make sure the metadata.yaml content is up to date.
4. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/insightly.md).
5. Create a Pull Request: use our PR naming conventions.
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.
airbyte-source-instagram
Instagram source connector

This is the repository for the Instagram source connector, written in Python. For information about how to use this connector within Airbyte, see the documentation.

Local development

Prerequisites: Python (~=3.9) and Poetry (~=1.7) - installation instructions here.

Installing the connector
From this connector directory, run:
poetry install --with dev

Create credentials
If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_instagram/spec.yaml file. Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See sample_files/sample_config.json for a sample config file.

Locally running the connector
poetry run source-instagram spec
poetry run source-instagram check --config secrets/config.json
poetry run source-instagram discover --config secrets/config.json
poetry run source-instagram read --config secrets/config.json --catalog sample_files/configured_catalog.json

Running unit tests
To run unit tests locally, from the connector directory run:
poetry run pytest unit_tests

Building the docker image
Install airbyte-ci, then run the following command to build the docker image:
airbyte-ci connectors --name=source-instagram build
An image will be available on your host with the tag airbyte/source-instagram:dev.

Running as a docker container
Run any of the connector commands as follows:
docker run --rm airbyte/source-instagram:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-instagram:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-instagram:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-instagram:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json

Running our CI test suite
You can run our full test suite locally using airbyte-ci:
airbyte-ci connectors --name=source-instagram test

Customizing acceptance tests
Customize the acceptance-test-config.yml file to configure acceptance tests. See Connector Acceptance Tests for more information. If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py.

Dependency management
All of your dependencies should be managed via Poetry. To add a new dependency, run:
poetry add <package-name>
Please commit the changes to the pyproject.toml and poetry.lock files.

Publishing a new version of the connector
You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what?
1. Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-instagram test
2. Bump the connector version (please follow semantic versioning for connectors): bump the dockerImageTag value in metadata.yaml and the version value in pyproject.toml.
3. Make sure the metadata.yaml content is up to date.
4. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/instagram.md).
5. Create a Pull Request: use our PR naming conventions.
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.
Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry.
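Each of the spec/check/discover/read commands above emits Airbyte protocol messages as JSON, one per line, on stdout. A minimal sketch of pulling just the synced data out of a read run's output (the sample messages below are illustrative, not real connector output):

```python
import json

def extract_records(lines):
    """Filter Airbyte protocol messages down to record payloads.

    Connectors emit one JSON message per line on stdout; RECORD messages
    carry the synced data under record.data.
    """
    records = []
    for line in lines:
        try:
            message = json.loads(line)
        except json.JSONDecodeError:
            continue  # connectors may also print plain, non-JSON log lines
        if message.get("type") == "RECORD":
            records.append(message["record"]["data"])
    return records

# Illustrative sample of two protocol messages:
sample_output = [
    '{"type": "LOG", "log": {"level": "INFO", "message": "Starting sync"}}',
    '{"type": "RECORD", "record": {"stream": "users", "data": {"id": 1}, "emitted_at": 0}}',
]
print(extract_records(sample_output))  # → [{'id': 1}]
```

In practice you would pipe the connector's stdout into a script like this, or into a destination connector.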
airbyte-source-instatus
Instatus Source

This is the repository for the Instatus configuration-based source connector. For information about how to use this connector within Airbyte, see the documentation.

Create credentials
If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_instatus/spec.yaml file. Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See integration_tests/sample_config.json for a sample config file. If you are an Airbyte core member, copy the credentials in Lastpass under the secret name "source instatus test creds" and place them into secrets/config.json.

Building the docker image
Via airbyte-ci (recommended):
airbyte-ci connectors --name=source-instatus build
An image will be built with the tag airbyte/source-instatus:dev.
Via docker build:
docker build -t airbyte/source-instatus:dev .

Then run any of the connector commands as follows:
docker run --rm airbyte/source-instatus:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-instatus:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-instatus:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-instatus:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json

Testing
You can run our full test suite locally using airbyte-ci:
airbyte-ci connectors --name=source-instatus test
Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information. If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py.

Dependency management
All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies into two groups: dependencies required for your connector to work go in the MAIN_REQUIREMENTS list, and dependencies required for testing go in the TEST_REQUIREMENTS list.

Publishing a new version of the connector
You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what?
1. Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-instatus test
2. Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors.
3. Make sure the metadata.yaml content is up to date.
4. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/instatus.md).
5. Create a Pull Request: use our PR naming conventions.
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.
airbyte-source-intercom
Intercom source connector

This is the repository for the Intercom source connector, written in Python. For information about how to use this connector within Airbyte, see the documentation.

Local development

Prerequisites: Python (~=3.9) and Poetry (~=1.7) - installation instructions here.

Installing the connector
From this connector directory, run:
poetry install --with dev

Create credentials
If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_intercom/spec.yaml file. Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See sample_files/sample_config.json for a sample config file.

Locally running the connector
poetry run source-intercom spec
poetry run source-intercom check --config secrets/config.json
poetry run source-intercom discover --config secrets/config.json
poetry run source-intercom read --config secrets/config.json --catalog integration_tests/configured_catalog.json

Running unit tests
To run unit tests locally, from the connector directory run:
poetry run pytest unit_tests

Building the docker image
Install airbyte-ci, then run the following command to build the docker image:
airbyte-ci connectors --name=source-intercom build
An image will be available on your host with the tag airbyte/source-intercom:dev.

Running as a docker container
Run any of the connector commands as follows:
docker run --rm airbyte/source-intercom:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-intercom:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-intercom:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-intercom:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json

Running our CI test suite
You can run our full test suite locally using airbyte-ci:
airbyte-ci connectors --name=source-intercom test

Customizing acceptance tests
Customize the acceptance-test-config.yml file to configure acceptance tests. See Connector Acceptance Tests for more information. If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py.

Dependency management
All of your dependencies should be managed via Poetry. To add a new dependency, run:
poetry add <package-name>
Please commit the changes to the pyproject.toml and poetry.lock files.

Publishing a new version of the connector
You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what?
1. Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-intercom test
2. Bump the connector version (please follow semantic versioning for connectors): bump the dockerImageTag value in metadata.yaml and the version value in pyproject.toml.
3. Make sure the metadata.yaml content is up to date.
4. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/intercom.md).
5. Create a Pull Request: use our PR naming conventions.
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.
Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry.
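The publishing steps above ask for a semantic-versioning bump of the dockerImageTag. As a small sketch of what "increment the patch component" means for such a tag (the version string below is just an example):

```python
def bump_patch(version: str) -> str:
    """Increment the patch component of a MAJOR.MINOR.PATCH tag,
    as done for the dockerImageTag value in metadata.yaml."""
    major, minor, patch = version.split(".")
    return f"{major}.{minor}.{int(patch) + 1}"

print(bump_patch("1.2.3"))  # → 1.2.4
```

Bump the minor or major component instead when the change adds features or breaks compatibility, per semantic versioning.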
airbyte-source-intruder
Intruder Source

This is the repository for the Intruder configuration-based source connector. For information about how to use this connector within Airbyte, see the documentation.

Create credentials
If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_intruder/spec.yaml file. Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See integration_tests/sample_config.json for a sample config file. If you are an Airbyte core member, copy the credentials in Lastpass under the secret name "source intruder test creds" and place them into secrets/config.json.

Building the docker image
Via airbyte-ci (recommended):
airbyte-ci connectors --name=source-intruder build
An image will be built with the tag airbyte/source-intruder:dev.
Via docker build:
docker build -t airbyte/source-intruder:dev .

Then run any of the connector commands as follows:
docker run --rm airbyte/source-intruder:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-intruder:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-intruder:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-intruder:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json

Testing
You can run our full test suite locally using airbyte-ci:
airbyte-ci connectors --name=source-intruder test
Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information. If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py.

Dependency management
All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies into two groups: dependencies required for your connector to work go in the MAIN_REQUIREMENTS list, and dependencies required for testing go in the TEST_REQUIREMENTS list.

Publishing a new version of the connector
You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what?
1. Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-intruder test
2. Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors.
3. Make sure the metadata.yaml content is up to date.
4. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/intruder.md).
5. Create a Pull Request: use our PR naming conventions.
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.
airbyte-source-ip2whois
Ip2whois Source

This is the repository for the Ip2whois configuration-based source connector. For information about how to use this connector within Airbyte, see the documentation.

Create credentials
If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_ip2whois/spec.yaml file. Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See integration_tests/sample_config.json for a sample config file. If you are an Airbyte core member, copy the credentials in Lastpass under the secret name "source ip2whois test creds" and place them into secrets/config.json.

Building the docker image
Via airbyte-ci (recommended):
airbyte-ci connectors --name=source-ip2whois build
An image will be built with the tag airbyte/source-ip2whois:dev.
Via docker build:
docker build -t airbyte/source-ip2whois:dev .

Then run any of the connector commands as follows:
docker run --rm airbyte/source-ip2whois:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-ip2whois:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-ip2whois:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-ip2whois:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json

Testing
You can run our full test suite locally using airbyte-ci:
airbyte-ci connectors --name=source-ip2whois test
Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information. If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py.

Dependency management
All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies into two groups: dependencies required for your connector to work go in the MAIN_REQUIREMENTS list, and dependencies required for testing go in the TEST_REQUIREMENTS list.

Publishing a new version of the connector
You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what?
1. Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-ip2whois test
2. Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors.
3. Make sure the metadata.yaml content is up to date.
4. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/ip2whois.md).
5. Create a Pull Request: use our PR naming conventions.
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.
airbyte-source-iterable
Iterable source connector

This is the repository for the Iterable source connector, written in Python. For information about how to use this connector within Airbyte, see the documentation.

Local development

Prerequisites: Python (~=3.9) and Poetry (~=1.7) - installation instructions here.

Installing the connector
From this connector directory, run:
poetry install --with dev

Create credentials
If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_iterable/spec.yaml file. Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See sample_files/sample_config.json for a sample config file.

Locally running the connector
poetry run source-iterable spec
poetry run source-iterable check --config secrets/config.json
poetry run source-iterable discover --config secrets/config.json
poetry run source-iterable read --config secrets/config.json --catalog sample_files/configured_catalog.json

Running unit tests
To run unit tests locally, from the connector directory run:
poetry run pytest unit_tests

Building the docker image
Install airbyte-ci, then run the following command to build the docker image:
airbyte-ci connectors --name=source-iterable build
An image will be available on your host with the tag airbyte/source-iterable:dev.

Running as a docker container
Run any of the connector commands as follows:
docker run --rm airbyte/source-iterable:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-iterable:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-iterable:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-iterable:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json

Running our CI test suite
You can run our full test suite locally using airbyte-ci:
airbyte-ci connectors --name=source-iterable test

Customizing acceptance tests
Customize the acceptance-test-config.yml file to configure acceptance tests. See Connector Acceptance Tests for more information. If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py.

Dependency management
All of your dependencies should be managed via Poetry. To add a new dependency, run:
poetry add <package-name>
Please commit the changes to the pyproject.toml and poetry.lock files.

Publishing a new version of the connector
You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what?
1. Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-iterable test
2. Bump the connector version (please follow semantic versioning for connectors): bump the dockerImageTag value in metadata.yaml and the version value in pyproject.toml.
3. Make sure the metadata.yaml content is up to date.
4. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/iterable.md).
5. Create a Pull Request: use our PR naming conventions.
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.
Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry.
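The check command above verifies the credentials in secrets/config.json by emitting a single CONNECTION_STATUS protocol message. A minimal sketch of inspecting that output programmatically (the sample line below is illustrative):

```python
import json

def connection_succeeded(check_output_lines):
    """Return True if a `check` run reported a successful connection.

    The check command emits a CONNECTION_STATUS message whose status
    field is either SUCCEEDED or FAILED.
    """
    for line in check_output_lines:
        msg = json.loads(line)
        if msg.get("type") == "CONNECTION_STATUS":
            return msg["connectionStatus"]["status"] == "SUCCEEDED"
    return False

# Illustrative sample of a check run's output:
output = ['{"type": "CONNECTION_STATUS", "connectionStatus": {"status": "SUCCEEDED"}}']
print(connection_succeeded(output))  # → True
```

A FAILED status usually also carries a message field explaining what went wrong with the supplied config.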
airbyte-source-jira
Jira source connector

This is the repository for the Jira source connector, written in Python. For information about how to use this connector within Airbyte, see the documentation.

Local development

Prerequisites: Python (~=3.9) and Poetry (~=1.7) - installation instructions here.

Installing the connector
From this connector directory, run:
poetry install --with dev

Create credentials
If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_jira/spec.yaml file. Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See sample_files/sample_config.json for a sample config file.

Locally running the connector
poetry run source-jira spec
poetry run source-jira check --config secrets/config.json
poetry run source-jira discover --config secrets/config.json
poetry run source-jira read --config secrets/config.json --catalog sample_files/configured_catalog.json

Running unit tests
To run unit tests locally, from the connector directory run:
poetry run pytest unit_tests

Building the docker image
Install airbyte-ci, then run the following command to build the docker image:
airbyte-ci connectors --name=source-jira build
An image will be available on your host with the tag airbyte/source-jira:dev.

Running as a docker container
Run any of the connector commands as follows:
docker run --rm airbyte/source-jira:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-jira:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-jira:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-jira:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json

Running our CI test suite
You can run our full test suite locally using airbyte-ci:
airbyte-ci connectors --name=source-jira test

Customizing acceptance tests
Customize the acceptance-test-config.yml file to configure acceptance tests. See Connector Acceptance Tests for more information. If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py.

Dependency management
All of your dependencies should be managed via Poetry. To add a new dependency, run:
poetry add <package-name>
Please commit the changes to the pyproject.toml and poetry.lock files.

Publishing a new version of the connector
You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what?
1. Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-jira test
2. Bump the connector version (please follow semantic versioning for connectors): bump the dockerImageTag value in metadata.yaml and the version value in pyproject.toml.
3. Make sure the metadata.yaml content is up to date.
4. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/jira.md).
5. Create a Pull Request: use our PR naming conventions.
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.
Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry.
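Before running check, it can save a round trip to confirm that secrets/config.json contains every property the connector's spec marks as required. A hedged sketch of that pre-flight check — the key names below are hypothetical placeholders, not the actual source_jira spec; the authoritative list lives in the connector's spec file:

```python
def missing_required_keys(config: dict, required: list) -> list:
    """Report required spec properties absent from a config dict.

    `required` should come from the connector spec's `required` list;
    the names used in the example below are hypothetical.
    """
    return [key for key in required if key not in config]

# Hypothetical config and required-key list for illustration:
config = {"domain": "example.atlassian.net"}
print(missing_required_keys(config, ["api_token", "domain", "email"]))
# → ['api_token', 'email']
```

The connector's own check command remains the source of truth; this only catches obviously incomplete configs early.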
airbyte-source-k6-cloud
K6 Cloud Source

This is the repository for the K6 Cloud configuration-based source connector. For information about how to use this connector within Airbyte, see the documentation.

Create credentials
If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_k6_cloud/spec.yaml file. Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See integration_tests/sample_config.json for a sample config file. If you are an Airbyte core member, copy the credentials in Lastpass under the secret name "source k6-cloud test creds" and place them into secrets/config.json.

Building the docker image
Via airbyte-ci (recommended):
airbyte-ci connectors --name=source-k6-cloud build
An image will be built with the tag airbyte/source-k6-cloud:dev.
Via docker build:
docker build -t airbyte/source-k6-cloud:dev .

Then run any of the connector commands as follows:
docker run --rm airbyte/source-k6-cloud:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-k6-cloud:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-k6-cloud:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-k6-cloud:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json

Testing
You can run our full test suite locally using airbyte-ci:
airbyte-ci connectors --name=source-k6-cloud test
Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information. If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py.

Dependency management
All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies into two groups: dependencies required for your connector to work go in the MAIN_REQUIREMENTS list, and dependencies required for testing go in the TEST_REQUIREMENTS list.

Publishing a new version of the connector
You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what?
1. Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-k6-cloud test
2. Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors.
3. Make sure the metadata.yaml content is up to date.
4. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/k6-cloud.md).
5. Create a Pull Request: use our PR naming conventions.
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.
airbyte-source-klarna
Klarna SourceThis is the repository for the Klarna configuration based source connector. For information about how to use this connector within Airbyte, seethe documentation.If you are a community contributor, follow the instructions in thedocumentationto generate the necessary credentials. Then create a filesecrets/config.jsonconforming to thesource_klarna/spec.yamlfile. Note that any directory namedsecretsis gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. Seeintegration_tests/sample_config.jsonfor a sample config file.If you are an Airbyte core member, copy the credentials in Lastpass under the secret namesource klarna test credsand place them intosecrets/config.json.Viaairbyte-ci(recommended):airbyte-ciconnectors--name=source-klarnabuildAn image will be built with the tagairbyte/source-klarna:dev.Viadocker build:dockerbuild-tairbyte/source-klarna:dev.Then run any of the connector commands as follows:docker run --rm airbyte/source-klarna:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-klarna:dev check --config /secrets/config.json docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-klarna:dev discover --config /secrets/config.json docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-klarna:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.jsonYou can run our full test suite locally usingairbyte-ci:airbyte-ciconnectors--name=source-klarnatestCustomizeacceptance-test-config.ymlfile to configure tests. SeeConnector Acceptance Testsfor more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py.All of your dependencies should go insetup.py, NOTrequirements.txt. 
The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups:

- dependencies required for your connector to work go in the `MAIN_REQUIREMENTS` list.
- dependencies required for testing go in the `TEST_REQUIREMENTS` list.

You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?

- Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-klarna test`
- Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow semantic versioning for connectors.
- Make sure the `metadata.yaml` content is up to date.
- Make sure the connector documentation and its changelog are up to date (`docs/integrations/sources/klarna.md`).
- Create a Pull Request: use our PR naming conventions.
- Pat yourself on the back for being an awesome contributor.
- Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.
airbyte-source-klaus-api
# Klaus Api Source

This is the repository for the Klaus Api source connector, written in Python. For information about how to use this connector within Airbyte, see the documentation.

To iterate on this connector, make sure to complete this prerequisites section.

From this connector directory, create a virtual environment:

```bash
python -m venv .venv
```

This will generate a virtualenv for this module in `.venv/`. Make sure this venv is active in your development environment of choice. To activate it from the terminal, run:

```bash
source .venv/bin/activate
pip install -r requirements.txt
pip install '.[tests]'
```

If you are in an IDE, follow your IDE's instructions to activate the virtualenv.

Note that while we are installing dependencies from `requirements.txt`, you should only edit `setup.py` for your dependencies. `requirements.txt` is used for editable installs (`pip install -e`) to pull in Python dependencies from the monorepo and will call `setup.py`. If this is mumbo jumbo to you, don't worry about it; just put your deps in `setup.py` but install using `pip install -r requirements.txt` and everything should work as you expect.

You can also build the connector in Gradle. This is typically used in CI and not needed for your development workflow.

To build using Gradle, from the Airbyte repository root, run:

```bash
./gradlew :airbyte-integrations:connectors:source-klaus-api:build
```

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_klaus_api/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information.
See `integration_tests/sample_config.json` for a sample config file.

If you are an Airbyte core member, copy the credentials in Lastpass under the secret name `source klaus-api test creds` and place them into `secrets/config.json`.

```bash
python main.py spec
python main.py check --config secrets/config.json
python main.py discover --config secrets/config.json
python main.py read --config secrets/config.json --catalog integration_tests/configured_catalog.json
```

First, make sure you build the latest Docker image:

```bash
docker build . -t airbyte/source-klaus-api:dev
```

If you want to build the Docker image with the CDK on your local machine (rather than the most recent package published to PyPI), from the airbyte base directory run:

```bash
CONNECTOR_TAG=<TAG_NAME> CONNECTOR_NAME=<CONNECTOR_NAME> sh airbyte-integrations/scripts/build-connector-image-with-local-cdk.sh
```

You can also build the connector image via Gradle:

```bash
./gradlew :airbyte-integrations:connectors:source-klaus-api:airbyteDocker
```

When building via Gradle, the docker image name and tag, respectively, are the values of the `io.airbyte.name` and `io.airbyte.version` `LABEL`s in the Dockerfile.

Then run any of the connector commands as follows:

```bash
docker run --rm airbyte/source-klaus-api:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-klaus-api:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-klaus-api:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-klaus-api:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
```

Make sure to familiarize yourself with pytest test discovery to know how your test files and methods should be named.
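The four `python main.py …` invocations above all follow the same command-line shape. The real entrypoint delegates parsing to the Airbyte CDK; the stand-alone toy below only mirrors that dispatch so the argument pattern is explicit (it is not the connector's actual `main.py`):

```python
# Toy dispatcher mirroring the spec/check/discover/read command-line shape
# shown above. Illustration only -- the real connector delegates this to the
# Airbyte CDK entrypoint.
def run(argv):
    command, *rest = argv
    if command == "spec":
        # spec takes no arguments
        return {"type": "SPEC"}
    if command in ("check", "discover"):
        # check and discover require --config <path>
        assert rest[0] == "--config", "these commands require --config <path>"
        return {"type": command.upper(), "config_path": rest[1]}
    if command == "read":
        # read requires both --config <path> and --catalog <path>
        opts = dict(zip(rest[::2], rest[1::2]))
        return {
            "type": "READ",
            "config_path": opts["--config"],
            "catalog_path": opts["--catalog"],
        }
    raise ValueError(f"unknown command: {command}")
```
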
First install test dependencies into your virtual environment:

```bash
pip install .[tests]
```

To run unit tests locally, from the connector directory run:

```bash
python -m pytest unit_tests
```

There are two types of integration tests: Acceptance Tests (Airbyte's test suite for all source connectors) and custom integration tests (which are specific to this connector). Place custom tests inside the `integration_tests/` folder, then, from the connector root, run:

```bash
python -m pytest integration_tests
```

Customize the `acceptance-test-config.yml` file to configure tests. See Connector Acceptance Tests for more information. If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside `integration_tests/acceptance.py`. To run your integration tests with acceptance tests, from the connector root, run:

```bash
python -m pytest integration_tests -p integration_tests.acceptance
```

To run your integration tests with docker, all commands should be run from the Airbyte project root. To run unit tests:

```bash
./gradlew :airbyte-integrations:connectors:source-klaus-api:unitTest
```

To run acceptance and custom integration tests:

```bash
./gradlew :airbyte-integrations:connectors:source-klaus-api:integrationTest
```

All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups:

- dependencies required for your connector to work go in the `MAIN_REQUIREMENTS` list.
- dependencies required for testing go in the `TEST_REQUIREMENTS` list.

You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world.
Now what?

- Make sure your changes are passing unit and integration tests.
- Bump the connector version in the `Dockerfile`: just increment the value of the `LABEL io.airbyte.version` appropriately (we use SemVer).
- Create a Pull Request.
- Pat yourself on the back for being an awesome contributor.
- Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.
airbyte-source-klaviyo
# Klaviyo source connector

This is the repository for the Klaviyo source connector, written in Python. For information about how to use this connector within Airbyte, see the documentation.

## Local development

### Prerequisites

- Python (~=3.9)
- Poetry (~=1.7) - see installation instructions

### Installing the connector

From this connector directory, run:

```bash
poetry install --with dev
```

### Create credentials

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_klaviyo/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `sample_files/sample_config.json` for a sample config file.

### Locally running the connector

```bash
poetry run source-klaviyo spec
poetry run source-klaviyo check --config secrets/config.json
poetry run source-klaviyo discover --config secrets/config.json
poetry run source-klaviyo read --config secrets/config.json --catalog sample_files/configured_catalog.json
```

### Running unit tests

To run unit tests locally, from the connector directory run:

```bash
poetry run pytest unit_tests
```

### Building the docker image

Install `airbyte-ci`, then run the following command to build the docker image:

```bash
airbyte-ci connectors --name=source-klaviyo build
```

An image will be available on your host with the tag `airbyte/source-klaviyo:dev`.

### Running as a docker container

Then run any of the connector commands as follows:

```bash
docker run --rm airbyte/source-klaviyo:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-klaviyo:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-klaviyo:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-klaviyo:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
```

### Running our CI test suite

You can run our full test suite locally using `airbyte-ci`:

```bash
airbyte-ci connectors --name=source-klaviyo test
```

### Customizing acceptance Tests

Customize the `acceptance-test-config.yml` file to configure acceptance tests. See Connector Acceptance Tests for more information. If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside `integration_tests/acceptance.py`.

### Dependency Management

All of your dependencies should be managed via Poetry. To add a new dependency, run:

```bash
poetry add <package-name>
```

Please commit the changes to the `pyproject.toml` and `poetry.lock` files.

### Publishing a new version of the connector

You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?

- Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-klaviyo test`
- Bump the connector version (please follow semantic versioning for connectors):
  - bump the `dockerImageTag` value in `metadata.yaml`
  - bump the `version` value in `pyproject.toml`
- Make sure the `metadata.yaml` content is up to date.
- Make sure the connector documentation and its changelog are up to date (`docs/integrations/sources/klaviyo.md`).
- Create a Pull Request: use our PR naming conventions.
- Pat yourself on the back for being an awesome contributor.
- Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.
- Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry.
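The release steps above ask you to bump two version fields in lockstep (`dockerImageTag` in `metadata.yaml` and `version` in `pyproject.toml`). A minimal consistency check could look like this sketch; the regex-based parsing is an assumption for illustration, real tooling would use proper YAML/TOML parsers:

```python
# Sketch of a consistency check between the two version fields that the
# release steps require bumping together. Regex parsing is a simplification.
import re


def extract_versions(metadata_yaml: str, pyproject_toml: str):
    """Pull dockerImageTag and the Poetry version out of raw file contents."""
    docker_tag = re.search(r"dockerImageTag:\s*([\w.\-]+)", metadata_yaml).group(1)
    pkg_version = re.search(r'^version\s*=\s*"([^"]+)"', pyproject_toml, re.M).group(1)
    return docker_tag, pkg_version


def versions_match(metadata_yaml: str, pyproject_toml: str) -> bool:
    docker_tag, pkg_version = extract_versions(metadata_yaml, pyproject_toml)
    return docker_tag == pkg_version
```
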
airbyte-source-kyriba
# Kyriba Source

This is the repository for the Kyriba source connector, written in Python. For information about how to use this connector within Airbyte, see the documentation.

To iterate on this connector, make sure to complete this prerequisites section.

From this connector directory, create a virtual environment:

```bash
python -m venv .venv
```

This will generate a virtualenv for this module in `.venv/`. Make sure this venv is active in your development environment of choice. To activate it from the terminal, run:

```bash
source .venv/bin/activate
pip install -r requirements.txt
pip install '.[tests]'
```

If you are in an IDE, follow your IDE's instructions to activate the virtualenv.

Note that while we are installing dependencies from `requirements.txt`, you should only edit `setup.py` for your dependencies. `requirements.txt` is used for editable installs (`pip install -e`) to pull in Python dependencies from the monorepo and will call `setup.py`. If this is mumbo jumbo to you, don't worry about it; just put your deps in `setup.py` but install using `pip install -r requirements.txt` and everything should work as you expect.

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_kyriba/spec.json` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `integration_tests/sample_config.json` for a sample config file.

If you are an Airbyte core member, copy the credentials in Lastpass under the secret name `source kyriba test creds` and place them into `secrets/config.json`.

```bash
python main.py spec
python main.py check --config secrets/config.json
python main.py discover --config secrets/config.json
python main.py read --config secrets/config.json --catalog integration_tests/configured_catalog.json
```

The Airbyte way of building this connector is to use our `airbyte-ci` tool. You can follow the install instructions here.
Then running the following command will build your connector:

```bash
airbyte-ci connectors --name source-kyriba build
```

Once the command is done, you will find your connector image in your local docker registry: `airbyte/source-kyriba:dev`.

When contributing to our connector you might need to customize the build process to add a system dependency or set an env var. You can customize our build process by adding a `build_customization.py` module to your connector. This module should contain a `pre_connector_install` and a `post_connector_install` async function that will mutate the base image and the connector container respectively. It will be imported at runtime by our build process and the functions will be called if they exist.

Here is an example of a `build_customization.py` module:

```python
from __future__ import annotations

from typing import TYPE_CHECKING

if TYPE_CHECKING:
    from dagger import Container


async def pre_connector_install(base_image_container: Container) -> Container:
    return await base_image_container.with_env_variable("MY_PRE_BUILD_ENV_VAR", "my_pre_build_env_var_value")


async def post_connector_install(connector_container: Container) -> Container:
    return await connector_container.with_env_variable("MY_POST_BUILD_ENV_VAR", "my_post_build_env_var_value")
```

This connector is built using our dynamic build process in `airbyte-ci`. The base image used to build it is defined within the `metadata.yaml` file under the `connectorBuildOptions`. The build logic is defined using Dagger. It does not rely on a Dockerfile.

If you would like to patch our connector and build your own, a simple approach would be to:

1. Create your own Dockerfile based on the latest version of the connector image:

```dockerfile
FROM airbyte/source-kyriba:latest

COPY . ./airbyte/integration_code
RUN pip install ./airbyte/integration_code
```

Please use this as an example. This is not optimized.

2. Build your image:

```bash
docker build -t airbyte/source-kyriba:dev .
```
```bash
docker run airbyte/source-kyriba:dev spec
```

Then run any of the connector commands as follows:

```bash
docker run --rm airbyte/source-kyriba:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-kyriba:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-kyriba:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-kyriba:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
```

You can run our full test suite locally using `airbyte-ci`:

```bash
airbyte-ci connectors --name=source-kyriba test
```

Customize the `acceptance-test-config.yml` file to configure tests. See Connector Acceptance Tests for more information. If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside `integration_tests/acceptance.py`.

All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups:

- dependencies required for your connector to work go in the `MAIN_REQUIREMENTS` list.
- dependencies required for testing go in the `TEST_REQUIREMENTS` list.

You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?

- Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-kyriba test`
- Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value.
Please follow semantic versioning for connectors.

- Make sure the `metadata.yaml` content is up to date.
- Make sure the connector documentation and its changelog are up to date (`docs/integrations/sources/kyriba.md`).
- Create a Pull Request: use our PR naming conventions.
- Pat yourself on the back for being an awesome contributor.
- Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.
airbyte-source-kyve
# KYVE

This page contains the setup guide and reference information for the KYVE source connector.

The KYVE Data Pipeline enables easy import of KYVE data into any data warehouse or destination supported by Airbyte. With the ELT format, data analysts and engineers can now confidently source KYVE data without worrying about its validity or reliability.

For information about how to set up an end-to-end pipeline with this connector, see the documentation.

In order to create an ELT pipeline with the KYVE source you should specify the Pool-ID of the KYVE storage pool from which you want to retrieve data.

You can specify a specific Bundle-Start-ID in case you want to narrow the records that will be retrieved from the pool. You can find the valid bundles in the KYVE app (e.g. the Cosmos Hub pool).

In order to extract the validated data from KYVE, you can specify the endpoint which will be requested via the KYVE-API URL Base. By default, the official KYVE mainnet endpoint will be used, providing the data of these pools.

Note: The KYVE Network consists of three individual networks: Korellia is the devnet used for development purposes, Kaon is the testnet used for testing purposes, and Mainnet is the official network. Although data validated on Kaon and Korellia can be used for development purposes, it is recommended to only trust the data validated on Mainnet.

You can fetch more than one pool simultaneously with one source configuration. You just need to specify the Pool-IDs and the Bundle-Start-IDs for the KYVE storage pools you want to archive, separated with commas.

| Version | Date     | Subject                                           |
|---------|----------|---------------------------------------------------|
| 0.1.0   | 25-05-23 | Initial release of KYVE source connector          |
| 0.2.0   | 10-11-23 | Update KYVE source to support Mainnet and Testnet |
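The comma-separated multi-pool configuration described above could be expanded into per-pool settings along these lines. The field names `pool_ids` and `start_ids` are assumptions for illustration, not the connector's actual spec keys:

```python
# Sketch of expanding a comma-separated multi-pool KYVE config into per-pool
# settings. Key names are hypothetical, not the connector's real spec fields.
def expand_pools(config: dict) -> list[dict]:
    pool_ids = [p.strip() for p in config["pool_ids"].split(",")]
    start_ids = [s.strip() for s in config["start_ids"].split(",")]
    if len(pool_ids) != len(start_ids):
        # the description above pairs each Pool-ID with a Bundle-Start-ID
        raise ValueError("one Bundle-Start-ID is required per Pool-ID")
    return [{"pool_id": p, "start_id": s} for p, s in zip(pool_ids, start_ids)]
```
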
airbyte-source-launchdarkly
# Launchdarkly Source

This is the repository for the Launchdarkly configuration-based source connector. For information about how to use this connector within Airbyte, see the documentation.

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_launchdarkly/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `integration_tests/sample_config.json` for a sample config file.

If you are an Airbyte core member, copy the credentials in Lastpass under the secret name `source launchdarkly test creds` and place them into `secrets/config.json`.

Via `airbyte-ci` (recommended):

```bash
airbyte-ci connectors --name=source-launchdarkly build
```

An image will be built with the tag `airbyte/source-launchdarkly:dev`.

Via `docker build`:

```bash
docker build -t airbyte/source-launchdarkly:dev .
```

Then run any of the connector commands as follows:

```bash
docker run --rm airbyte/source-launchdarkly:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-launchdarkly:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-launchdarkly:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-launchdarkly:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
```

You can run our full test suite locally using `airbyte-ci`:

```bash
airbyte-ci connectors --name=source-launchdarkly test
```

Customize the `acceptance-test-config.yml` file to configure tests. See Connector Acceptance Tests for more information. If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside `integration_tests/acceptance.py`.

All of your dependencies should go in `setup.py`, NOT `requirements.txt`.
The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups:

- dependencies required for your connector to work go in the `MAIN_REQUIREMENTS` list.
- dependencies required for testing go in the `TEST_REQUIREMENTS` list.

You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?

- Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-launchdarkly test`
- Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow semantic versioning for connectors.
- Make sure the `metadata.yaml` content is up to date.
- Make sure the connector documentation and its changelog are up to date (`docs/integrations/sources/launchdarkly.md`).
- Create a Pull Request: use our PR naming conventions.
- Pat yourself on the back for being an awesome contributor.
- Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.
airbyte-source-lemlist
# Lemlist Source

This is the repository for the Lemlist configuration-based source connector. For information about how to use this connector within Airbyte, see the documentation.

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_lemlist/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `integration_tests/sample_config.json` for a sample config file.

If you are an Airbyte core member, copy the credentials in Lastpass under the secret name `source lemlist test creds` and place them into `secrets/config.json`.

Via `airbyte-ci` (recommended):

```bash
airbyte-ci connectors --name=source-lemlist build
```

An image will be built with the tag `airbyte/source-lemlist:dev`.

Via `docker build`:

```bash
docker build -t airbyte/source-lemlist:dev .
```

Then run any of the connector commands as follows:

```bash
docker run --rm airbyte/source-lemlist:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-lemlist:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-lemlist:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-lemlist:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
```

You can run our full test suite locally using `airbyte-ci`:

```bash
airbyte-ci connectors --name=source-lemlist test
```

Customize the `acceptance-test-config.yml` file to configure tests. See Connector Acceptance Tests for more information. If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside `integration_tests/acceptance.py`.

All of your dependencies should go in `setup.py`, NOT `requirements.txt`.
The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups:

- dependencies required for your connector to work go in the `MAIN_REQUIREMENTS` list.
- dependencies required for testing go in the `TEST_REQUIREMENTS` list.

You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?

- Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-lemlist test`
- Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow semantic versioning for connectors.
- Make sure the `metadata.yaml` content is up to date.
- Make sure the connector documentation and its changelog are up to date (`docs/integrations/sources/lemlist.md`).
- Create a Pull Request: use our PR naming conventions.
- Pat yourself on the back for being an awesome contributor.
- Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.
airbyte-source-lever-hiring
# Lever Hiring Source

This is the repository for the Lever Hiring source connector, written in Python.

To iterate on this connector, make sure to complete this prerequisites section.

From this connector directory, create a virtual environment:

```bash
python -m venv .venv
```

This will generate a virtualenv for this module in `.venv/`. Make sure this venv is active in your development environment of choice. To activate it from the terminal, run:

```bash
source .venv/bin/activate
pip install -r requirements.txt
```

If you are in an IDE, follow your IDE's instructions to activate the virtualenv.

Note that while we are installing dependencies from `requirements.txt`, you should only edit `setup.py` for your dependencies. `requirements.txt` is used for editable installs (`pip install -e`) to pull in Python dependencies from the monorepo and will call `setup.py`. If this is mumbo jumbo to you, don't worry about it; just put your deps in `setup.py` but install using `pip install -r requirements.txt` and everything should work as you expect.

If you are a community contributor, get the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_lever_hiring/spec.json` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information.
See `integration_tests/sample_config.json` for a sample config file.

If you are an Airbyte core member, copy the credentials in Lastpass under the secret name `source lever-hiring test creds` and place them into `secrets/config.json`.

```bash
python main.py spec
python main.py check --config secrets/config.json
python main.py discover --config secrets/config.json
python main.py read --config secrets/config.json --catalog integration_tests/configured_catalog.json
```

Via `airbyte-ci` (recommended):

```bash
airbyte-ci connectors --name=source-lever-hiring build
```

An image will be built with the tag `airbyte/source-lever-hiring:dev`.

Via `docker build`:

```bash
docker build -t airbyte/source-lever-hiring:dev .
```

Then run any of the connector commands as follows:

```bash
docker run --rm airbyte/source-lever-hiring:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-lever-hiring:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-lever-hiring:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-lever-hiring:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
```

You can run our full test suite locally using `airbyte-ci`:

```bash
airbyte-ci connectors --name=source-lever-hiring test
```

Customize the `acceptance-test-config.yml` file to configure tests. See Connector Acceptance Tests for more information. If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside `integration_tests/acceptance.py`.

All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies between two groups:

- dependencies required for your connector to work go in the `MAIN_REQUIREMENTS` list.
- dependencies required for testing go in the `TEST_REQUIREMENTS` list.

You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?

- Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-lever-hiring test`
- Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow semantic versioning for connectors.
- Make sure the `metadata.yaml` content is up to date.
- Make sure the connector documentation and its changelog are up to date (`docs/integrations/sources/lever-hiring.md`).
- Create a Pull Request: use our PR naming conventions.
- Pat yourself on the back for being an awesome contributor.
- Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.
airbyte-source-linkedin-ads
# Linkedin-Ads source connector

This is the repository for the Linkedin-Ads source connector, written in Python. For information about how to use this connector within Airbyte, see the documentation.

## Local development

### Prerequisites

- Python (~=3.9)
- Poetry (~=1.7) - see installation instructions

### Installing the connector

From this connector directory, run:

```bash
poetry install --with dev
```

### Create credentials

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_linkedin_ads/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `sample_files/sample_config.json` for a sample config file.

### Locally running the connector

```bash
poetry run source-linkedin-ads spec
poetry run source-linkedin-ads check --config secrets/config.json
poetry run source-linkedin-ads discover --config secrets/config.json
poetry run source-linkedin-ads read --config secrets/config.json --catalog sample_files/configured_catalog.json
```

### Running unit tests

To run unit tests locally, from the connector directory run:

```bash
poetry run pytest unit_tests
```

### Building the docker image

Install `airbyte-ci`, then run the following command to build the docker image:

```bash
airbyte-ci connectors --name=source-linkedin-ads build
```

An image will be available on your host with the tag `airbyte/source-linkedin-ads:dev`.

### Running as a docker container

Then run any of the connector commands as follows:

```bash
docker run --rm airbyte/source-linkedin-ads:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-linkedin-ads:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-linkedin-ads:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-linkedin-ads:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
```

### Running our CI test suite

You can run our full test suite locally using `airbyte-ci`:

```bash
airbyte-ci connectors --name=source-linkedin-ads test
```

### Customizing acceptance Tests

Customize the `acceptance-test-config.yml` file to configure acceptance tests. See Connector Acceptance Tests for more information. If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside `integration_tests/acceptance.py`.

### Dependency Management

All of your dependencies should be managed via Poetry. To add a new dependency, run:

```bash
poetry add <package-name>
```

Please commit the changes to the `pyproject.toml` and `poetry.lock` files.

### Publishing a new version of the connector

You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?

- Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-linkedin-ads test`
- Bump the connector version (please follow semantic versioning for connectors):
  - bump the `dockerImageTag` value in `metadata.yaml`
  - bump the `version` value in `pyproject.toml`
- Make sure the `metadata.yaml` content is up to date.
- Make sure the connector documentation and its changelog are up to date (`docs/integrations/sources/linkedin-ads.md`).
- Create a Pull Request: use our PR naming conventions.
- Pat yourself on the back for being an awesome contributor.
- Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.
- Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry.
airbyte-source-linkedin-pages
# Linkedin Pages Source

This is the repository for the Linkedin Pages source connector, written in Python. For information about how to use this connector within Airbyte, see the documentation.

To iterate on this connector, make sure to complete this prerequisites section.

From this connector directory, create a virtual environment:

```bash
python -m venv .venv
```

This will generate a virtualenv for this module in `.venv/`. Make sure this venv is active in your development environment of choice. To activate it from the terminal, run:

```bash
source .venv/bin/activate
pip install -r requirements.txt
pip install '.[tests]'
```

If you are in an IDE, follow your IDE's instructions to activate the virtualenv.

Note that while we are installing dependencies from `requirements.txt`, you should only edit `setup.py` for your dependencies. `requirements.txt` is used for editable installs (`pip install -e`) to pull in Python dependencies from the monorepo and will call `setup.py`. If this is mumbo jumbo to you, don't worry about it; just put your deps in `setup.py` but install using `pip install -r requirements.txt` and everything should work as you expect.

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_linkedin_pages/spec.json` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information.
Seeintegration_tests/sample_config.jsonfor a sample config file.If you are an Airbyte core member, copy the credentials in Lastpass under the secret namesource linkedin-pages test credsand place them intosecrets/config.json.python main.py spec python main.py check --config secrets/config.json python main.py discover --config secrets/config.json python main.py read --config secrets/config.json --catalog integration_tests/configured_catalog.jsonViaairbyte-ci(recommended):airbyte-ciconnectors--name=source-linkedin-pagesbuildAn image will be built with the tagairbyte/source-linkedin-pages:dev.Viadocker build:dockerbuild-tairbyte/source-linkedin-pages:dev.Then run any of the connector commands as follows:docker run --rm airbyte/source-linkedin-pages:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-linkedin-pages:dev check --config /secrets/config.json docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-linkedin-pages:dev discover --config /secrets/config.json docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-linkedin-pages:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.jsonYou can run our full test suite locally usingairbyte-ci:airbyte-ciconnectors--name=source-linkedin-pagestestCustomizeacceptance-test-config.ymlfile to configure tests. SeeConnector Acceptance Testsfor more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py.All of your dependencies should go insetup.py, NOTrequirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. 
We split dependencies between two groups, dependencies that are:required for your connector to work need to go toMAIN_REQUIREMENTSlist.required for the testing need to go toTEST_REQUIREMENTSlistYou've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?Make sure your changes are passing our test suite:airbyte-ci connectors --name=source-linkedin-pages testBump the connector version inmetadata.yaml: increment thedockerImageTagvalue. Please followsemantic versioning for connectors.Make sure themetadata.yamlcontent is up to date.Make the connector documentation and its changelog is up to date (docs/integrations/sources/linkedin-pages.md).Create a Pull Request: useour PR naming conventions.Pat yourself on the back for being an awesome contributor.Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.
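The MAIN_REQUIREMENTS / TEST_REQUIREMENTS split described above lives in setup.py. A minimal sketch of how that split can look, with placeholder package names rather than this connector's real dependency list:

```python
# Illustrative setup.py sketch of the MAIN_REQUIREMENTS / TEST_REQUIREMENTS split.
# Package names and the connector name below are placeholders, not the real deps.
from setuptools import find_packages, setup

MAIN_REQUIREMENTS = ["airbyte-cdk"]  # needed for the connector to run
TEST_REQUIREMENTS = ["pytest", "requests-mock"]  # needed only by the test suite

setup(
    name="source_example",
    packages=find_packages(),
    install_requires=MAIN_REQUIREMENTS,
    extras_require={"tests": TEST_REQUIREMENTS},
)
```

With this layout, `pip install .` pulls only the runtime deps, while `pip install '.[tests]'` adds the test-only group.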
airbyte-source-lokalise
Lokalise Source

This is the repository for the Lokalise configuration based source connector. For information about how to use this connector within Airbyte, see the documentation.

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_lokalise/spec.yaml file. Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See integration_tests/sample_config.json for a sample config file. If you are an Airbyte core member, copy the credentials in Lastpass under the secret name "source lokalise test creds" and place them into secrets/config.json.

Via airbyte-ci (recommended):

airbyte-ci connectors --name=source-lokalise build

An image will be built with the tag airbyte/source-lokalise:dev.

Via docker build:

docker build -t airbyte/source-lokalise:dev .

Then run any of the connector commands as follows:

docker run --rm airbyte/source-lokalise:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-lokalise:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-lokalise:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-lokalise:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json

You can run our full test suite locally using airbyte-ci:

airbyte-ci connectors --name=source-lokalise test

Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information. If your connector requires creating or destroying resources during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py.

All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups: dependencies required for your connector to work go in the MAIN_REQUIREMENTS list, and dependencies required for testing go in the TEST_REQUIREMENTS list.

You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what?

Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-lokalise test
Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors.
Make sure the metadata.yaml content is up to date.
Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/lokalise.md).
Create a Pull Request: use our PR naming conventions.
Pat yourself on the back for being an awesome contributor.
Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.
airbyte-source-looker
Looker Source

This is the repository for the Looker source connector, written in Python. For information about how to use this connector within Airbyte, see the documentation.

To iterate on this connector, make sure to complete this prerequisites section. From this connector directory, create a virtual environment:

python -m venv .venv

This will generate a virtualenv for this module in .venv/. Make sure this venv is active in your development environment of choice. To activate it from the terminal, run:

source .venv/bin/activate
pip install -r requirements.txt

If you are in an IDE, follow your IDE's instructions to activate the virtualenv. Note that while we are installing dependencies from requirements.txt, you should only edit setup.py for your dependencies. requirements.txt is used for editable installs (pip install -e) to pull in Python dependencies from the monorepo and will call setup.py. If this is mumbo jumbo to you, don't worry about it; just put your deps in setup.py but install using pip install -r requirements.txt, and everything should work as you expect.

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_looker/spec.json file. Note that the secrets directory is gitignored by default, so there is no danger of accidentally checking in sensitive information. See integration_tests/sample_config.json for a sample config file. If you are an Airbyte core member, copy the credentials in Lastpass under the secret name "source looker test creds" and place them into secrets/config.json.

python main.py spec
python main.py check --config secrets/config.json
python main.py discover --config secrets/config.json
python main.py read --config secrets/config.json --catalog integration_tests/configured_catalog.json

Via airbyte-ci (recommended):

airbyte-ci connectors --name=source-looker build

An image will be built with the tag airbyte/source-looker:dev.

Via docker build:

docker build -t airbyte/source-looker:dev .

Then run any of the connector commands as follows:

docker run --rm airbyte/source-looker:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-looker:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-looker:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-looker:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json

You can run our full test suite locally using airbyte-ci:

airbyte-ci connectors --name=source-looker test

Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information. If your connector requires creating or destroying resources during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py.

All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.

You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what?

Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-looker test
Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors.
Make sure the metadata.yaml content is up to date.
Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/looker.md).
Create a Pull Request: use our PR naming conventions.
Pat yourself on the back for being an awesome contributor.
Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.
airbyte-source-mailchimp
Mailchimp source connector

This is the repository for the Mailchimp source connector, written in Python. For information about how to use this connector within Airbyte, see the documentation.

Local development prerequisites: Python (~=3.9) and Poetry (~=1.7); installation instructions here.

Installing the connector: from this connector directory, run:

poetry install --with dev

Create credentials: if you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_mailchimp/spec.yaml file. Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See sample_files/sample_config.json for a sample config file.

Locally running the connector:

poetry run source-mailchimp spec
poetry run source-mailchimp check --config secrets/config.json
poetry run source-mailchimp discover --config secrets/config.json
poetry run source-mailchimp read --config secrets/config.json --catalog sample_files/configured_catalog.json

Running unit tests: to run unit tests locally, from the connector directory run:

poetry run pytest unit_tests

Building the docker image: install airbyte-ci, then run the following command to build the docker image:

airbyte-ci connectors --name=source-mailchimp build

An image will be available on your host with the tag airbyte/source-mailchimp:dev.

Running as a docker container: run any of the connector commands as follows:

docker run --rm airbyte/source-mailchimp:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-mailchimp:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-mailchimp:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-mailchimp:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json

Running our CI test suite: you can run our full test suite locally using airbyte-ci:

airbyte-ci connectors --name=source-mailchimp test

Customizing acceptance tests: customize the acceptance-test-config.yml file to configure acceptance tests. See Connector Acceptance Tests for more information. If your connector requires creating or destroying resources during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py.

Dependency management: all of your dependencies should be managed via Poetry. To add a new dependency, run:

poetry add <package-name>

Please commit the changes to the pyproject.toml and poetry.lock files.

Publishing a new version of the connector: you've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what?

Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-mailchimp test
Bump the connector version (please follow semantic versioning for connectors): bump the dockerImageTag value in metadata.yaml and the version value in pyproject.toml.
Make sure the metadata.yaml content is up to date.
Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/mailchimp.md).
Create a Pull Request: use our PR naming conventions.
Pat yourself on the back for being an awesome contributor.
Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.
Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry.
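When publishing, the dockerImageTag in metadata.yaml and the version in pyproject.toml must be bumped to the same value. A sketch of that sync step (the helper below is illustrative, not part of the Airbyte tooling, and the file contents are minimal fragments):

```python
# Illustrative helper to bump the connector version in both files at once.
# The inputs here are toy one-line fragments of metadata.yaml / pyproject.toml.
import re


def bump(metadata: str, pyproject: str, new_version: str) -> tuple:
    """Rewrite dockerImageTag (metadata.yaml) and version (pyproject.toml)."""
    metadata = re.sub(r"dockerImageTag: .*", f"dockerImageTag: {new_version}", metadata)
    pyproject = re.sub(r'(?m)^version = ".*"', f'version = "{new_version}"', pyproject)
    return metadata, pyproject


meta, pyp = bump('dockerImageTag: 1.2.3\n', 'version = "1.2.3"\n', "1.3.0")
print(meta.strip())  # dockerImageTag: 1.3.0
```

Keeping both values identical is what the release checklist above asks for; a mismatch will surface as an inconsistency in review.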
airbyte-source-mailerlite
Mailerlite Source

This is the repository for the Mailerlite configuration based source connector. For information about how to use this connector within Airbyte, see the documentation.

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_mailerlite/spec.yaml file. Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See integration_tests/sample_config.json for a sample config file. If you are an Airbyte core member, copy the credentials in Lastpass under the secret name "source mailerlite test creds" and place them into secrets/config.json.

Via airbyte-ci (recommended):

airbyte-ci connectors --name=source-mailerlite build

An image will be built with the tag airbyte/source-mailerlite:dev.

Via docker build:

docker build -t airbyte/source-mailerlite:dev .

Then run any of the connector commands as follows:

docker run --rm airbyte/source-mailerlite:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-mailerlite:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-mailerlite:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-mailerlite:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json

You can run our full test suite locally using airbyte-ci:

airbyte-ci connectors --name=source-mailerlite test

Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information. If your connector requires creating or destroying resources during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py.

All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups: dependencies required for your connector to work go in the MAIN_REQUIREMENTS list, and dependencies required for testing go in the TEST_REQUIREMENTS list.

You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what?

Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-mailerlite test
Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors.
Make sure the metadata.yaml content is up to date.
Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/mailerlite.md).
Create a Pull Request: use our PR naming conventions.
Pat yourself on the back for being an awesome contributor.
Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.
airbyte-source-mailersend
Mailersend Source

This is the repository for the Mailersend configuration based source connector. For information about how to use this connector within Airbyte, see the documentation.

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_mailersend/spec.yaml file. Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See integration_tests/sample_config.json for a sample config file. If you are an Airbyte core member, copy the credentials in Lastpass under the secret name "source mailersend test creds" and place them into secrets/config.json.

Via airbyte-ci (recommended):

airbyte-ci connectors --name=source-mailersend build

An image will be built with the tag airbyte/source-mailersend:dev.

Via docker build:

docker build -t airbyte/source-mailersend:dev .

Then run any of the connector commands as follows:

docker run --rm airbyte/source-mailersend:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-mailersend:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-mailersend:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-mailersend:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json

You can run our full test suite locally using airbyte-ci:

airbyte-ci connectors --name=source-mailersend test

Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information. If your connector requires creating or destroying resources during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py.

All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups: dependencies required for your connector to work go in the MAIN_REQUIREMENTS list, and dependencies required for testing go in the TEST_REQUIREMENTS list.

You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what?

Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-mailersend test
Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors.
Make sure the metadata.yaml content is up to date.
Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/mailersend.md).
Create a Pull Request: use our PR naming conventions.
Pat yourself on the back for being an awesome contributor.
Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.
airbyte-source-mailgun
Mailgun Source

This is the repository for the Mailgun configuration based source connector. For information about how to use this connector within Airbyte, see the documentation.

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_mailgun/spec.yaml file. Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See integration_tests/sample_config.json for a sample config file. If you are an Airbyte core member, copy the credentials in Lastpass under the secret name "source mailgun test creds" and place them into secrets/config.json.

Via airbyte-ci (recommended):

airbyte-ci connectors --name=source-mailgun build

An image will be built with the tag airbyte/source-mailgun:dev.

Via docker build:

docker build -t airbyte/source-mailgun:dev .

Then run any of the connector commands as follows:

docker run --rm airbyte/source-mailgun:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-mailgun:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-mailgun:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-mailgun:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json

You can run our full test suite locally using airbyte-ci:

airbyte-ci connectors --name=source-mailgun test

Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information. If your connector requires creating or destroying resources during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py.

All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups: dependencies required for your connector to work go in the MAIN_REQUIREMENTS list, and dependencies required for testing go in the TEST_REQUIREMENTS list.

You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what?

Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-mailgun test
Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors.
Make sure the metadata.yaml content is up to date.
Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/mailgun.md).
Create a Pull Request: use our PR naming conventions.
Pat yourself on the back for being an awesome contributor.
Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.
airbyte-source-mailjet-mail
Mailjet Mail Source

This is the repository for the Mailjet Mail configuration based source connector. For information about how to use this connector within Airbyte, see the documentation.

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_mailjet_mail/spec.yaml file. Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See integration_tests/sample_config.json for a sample config file. If you are an Airbyte core member, copy the credentials in Lastpass under the secret name "source mailjet-mail test creds" and place them into secrets/config.json.

Via airbyte-ci (recommended):

airbyte-ci connectors --name=source-mailjet-mail build

An image will be built with the tag airbyte/source-mailjet-mail:dev.

Via docker build:

docker build -t airbyte/source-mailjet-mail:dev .

Then run any of the connector commands as follows:

docker run --rm airbyte/source-mailjet-mail:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-mailjet-mail:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-mailjet-mail:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-mailjet-mail:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json

You can run our full test suite locally using airbyte-ci:

airbyte-ci connectors --name=source-mailjet-mail test

Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information. If your connector requires creating or destroying resources during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py.

All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups: dependencies required for your connector to work go in the MAIN_REQUIREMENTS list, and dependencies required for testing go in the TEST_REQUIREMENTS list.

You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what?

Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-mailjet-mail test
Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors.
Make sure the metadata.yaml content is up to date.
Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/mailjet-mail.md).
Create a Pull Request: use our PR naming conventions.
Pat yourself on the back for being an awesome contributor.
Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.
airbyte-source-mailjet-sms
Mailjet Sms Source

This is the repository for the Mailjet Sms configuration based source connector. For information about how to use this connector within Airbyte, see the documentation.

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_mailjet_sms/spec.yaml file. Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See integration_tests/sample_config.json for a sample config file. If you are an Airbyte core member, copy the credentials in Lastpass under the secret name "source mailjet-sms test creds" and place them into secrets/config.json.

Via airbyte-ci (recommended):

airbyte-ci connectors --name=source-mailjet-sms build

An image will be built with the tag airbyte/source-mailjet-sms:dev.

Via docker build:

docker build -t airbyte/source-mailjet-sms:dev .

Then run any of the connector commands as follows:

docker run --rm airbyte/source-mailjet-sms:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-mailjet-sms:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-mailjet-sms:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-mailjet-sms:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json

You can run our full test suite locally using airbyte-ci:

airbyte-ci connectors --name=source-mailjet-sms test

Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information. If your connector requires creating or destroying resources during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py.

All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups: dependencies required for your connector to work go in the MAIN_REQUIREMENTS list, and dependencies required for testing go in the TEST_REQUIREMENTS list.

You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what?

Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-mailjet-sms test
Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors.
Make sure the metadata.yaml content is up to date.
Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/mailjet-sms.md).
Create a Pull Request: use our PR naming conventions.
Pat yourself on the back for being an awesome contributor.
Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.
airbyte-source-marketo
Marketo source connector

This is the repository for the Marketo source connector, written in Python. For information about how to use this connector within Airbyte, see the documentation.

Local development prerequisites: Python (~=3.9) and Poetry (~=1.7); installation instructions here.

Installing the connector: from this connector directory, run:

poetry install --with dev

Create credentials: if you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_marketo/spec.yaml file. Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See sample_files/sample_config.json for a sample config file.

Locally running the connector:

poetry run source-marketo spec
poetry run source-marketo check --config secrets/config.json
poetry run source-marketo discover --config secrets/config.json
poetry run source-marketo read --config secrets/config.json --catalog sample_files/configured_catalog.json

Running unit tests: to run unit tests locally, from the connector directory run:

poetry run pytest unit_tests

Building the docker image: install airbyte-ci, then run the following command to build the docker image:

airbyte-ci connectors --name=source-marketo build

An image will be available on your host with the tag airbyte/source-marketo:dev.

Running as a docker container: run any of the connector commands as follows:

docker run --rm airbyte/source-marketo:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-marketo:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-marketo:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-marketo:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json

Running our CI test suite: you can run our full test suite locally using airbyte-ci:

airbyte-ci connectors --name=source-marketo test

Customizing acceptance tests: customize the acceptance-test-config.yml file to configure acceptance tests. See Connector Acceptance Tests for more information. If your connector requires creating or destroying resources during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py.

Dependency management: all of your dependencies should be managed via Poetry. To add a new dependency, run:

poetry add <package-name>

Please commit the changes to the pyproject.toml and poetry.lock files.

Publishing a new version of the connector: you've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what?

Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-marketo test
Bump the connector version (please follow semantic versioning for connectors): bump the dockerImageTag value in metadata.yaml and the version value in pyproject.toml.
Make sure the metadata.yaml content is up to date.
Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/marketo.md).
Create a Pull Request: use our PR naming conventions.
Pat yourself on the back for being an awesome contributor.
Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.
Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry.
airbyte-source-merge
Merge Source

This is the repository for the Merge configuration based source connector. For information about how to use this connector within Airbyte, see the documentation.

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_merge/spec.yaml file. Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See integration_tests/sample_config.json for a sample config file. If you are an Airbyte core member, copy the credentials in Lastpass under the secret name "source merge test creds" and place them into secrets/config.json.

Via airbyte-ci (recommended):

airbyte-ci connectors --name=source-merge build

An image will be built with the tag airbyte/source-merge:dev.

Via docker build:

docker build -t airbyte/source-merge:dev .

Then run any of the connector commands as follows:

docker run --rm airbyte/source-merge:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-merge:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-merge:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-merge:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json

You can run our full test suite locally using airbyte-ci:

airbyte-ci connectors --name=source-merge test

Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information. If your connector requires creating or destroying resources during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py.

All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups: dependencies required for your connector to work go in the MAIN_REQUIREMENTS list, and dependencies required for testing go in the TEST_REQUIREMENTS list.

You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what?

Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-merge test
Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors.
Make sure the metadata.yaml content is up to date.
Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/merge.md).
Create a Pull Request: use our PR naming conventions.
Pat yourself on the back for being an awesome contributor.
Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.
airbyte-source-metabase
Metabase Source

This is the repository for the Metabase source connector, written in Python. For information about how to use this connector within Airbyte, see the documentation.

To iterate on this connector, make sure to complete this prerequisites section.

From this connector directory, create a virtual environment:

python -m venv .venv

This will generate a virtualenv for this module in .venv/. Make sure this venv is active in your development environment of choice. To activate it from the terminal, run:

source .venv/bin/activate
pip install -r requirements.txt

If you are in an IDE, follow your IDE's instructions to activate the virtualenv.

Note that while we are installing dependencies from requirements.txt, you should only edit setup.py for your dependencies. requirements.txt is used for editable installs (pip install -e) to pull in Python dependencies from the monorepo and will call setup.py. If this is mumbo jumbo to you, don't worry about it; just put your deps in setup.py but install using pip install -r requirements.txt and everything should work as you expect.

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_metabase/spec.yaml file. Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See sample_files/sample_config.json for a sample config file.

If you are an Airbyte core member, copy the credentials in Lastpass under the secret name "source metabase test creds" and place them into secrets/config.json.

python main.py spec
python main.py check --config secrets/config.json
python main.py discover --config secrets/config.json
python main.py read --config secrets/config.json --catalog sample_files/configured_catalog.json

To run unit tests locally, from the connector directory run:

python -m pytest unit_tests

Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information. If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py. To run your integration tests with acceptance tests, from the connector root, run:

python -m pytest integration_tests -p integration_tests.acceptance

All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.

You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-metabase test
2. Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors.
3. Make sure the metadata.yaml content is up to date.
4. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/metabase.md).
5. Create a Pull Request: use our PR naming conventions.
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.
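The spec, check, discover, and read commands above all speak the Airbyte protocol: the connector prints JSON messages to stdout. As a rough, self-contained sketch of that contract (this is not the actual Metabase connector code, which is built on the Airbyte CDK, and the spec fields shown are hypothetical), a minimal entrypoint handling spec and check might look like:

```python
import json
import sys

# Hypothetical connector spec, for illustration only.
SPEC = {
    "documentationUrl": "https://docs.airbyte.com",
    "connectionSpecification": {
        "type": "object",
        "required": ["api_key"],
        "properties": {"api_key": {"type": "string", "airbyte_secret": True}},
    },
}

def run(args):
    """Handle one connector command and return the protocol message as a dict."""
    command = args[0]
    if command == "spec":
        return {"type": "SPEC", "spec": SPEC}
    if command == "check":
        # A real connector would load the --config file and probe the API here.
        return {"type": "CONNECTION_STATUS", "connectionStatus": {"status": "SUCCEEDED"}}
    raise SystemExit(f"unknown command: {command}")

if __name__ == "__main__" and len(sys.argv) > 1:
    # e.g. `python main.py spec` prints a single SPEC message as JSON.
    print(json.dumps(run(sys.argv[1:])))
```

A real connector additionally implements discover (emitting a CATALOG message) and read (emitting RECORD and STATE messages), which the CDK handles for you.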
airbyte-source-microsoft-dataverse
Microsoft Dataverse Source

This is the repository for the Microsoft Dataverse source connector, written in Python. For information about how to use this connector within Airbyte, see the documentation.

To iterate on this connector, make sure to complete this prerequisites section.

From this connector directory, create a virtual environment:

python -m venv .venv

This will generate a virtualenv for this module in .venv/. Make sure this venv is active in your development environment of choice. To activate it from the terminal, run:

source .venv/bin/activate
pip install -r requirements.txt
pip install '.[tests]'

If you are in an IDE, follow your IDE's instructions to activate the virtualenv.

Note that while we are installing dependencies from requirements.txt, you should only edit setup.py for your dependencies. requirements.txt is used for editable installs (pip install -e) to pull in Python dependencies from the monorepo and will call setup.py. If this is mumbo jumbo to you, don't worry about it; just put your deps in setup.py but install using pip install -r requirements.txt and everything should work as you expect.

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_microsoft_dataverse/spec.yaml file. Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See integration_tests/sample_config.json for a sample config file.

If you are an Airbyte core member, copy the credentials in Lastpass under the secret name "source microsoft-dataverse test creds" and place them into secrets/config.json.

python main.py spec
python main.py check --config secrets/config.json
python main.py discover --config secrets/config.json
python main.py read --config secrets/config.json --catalog integration_tests/configured_catalog.json

Via airbyte-ci (recommended):

airbyte-ci connectors --name=source-microsoft-dataverse build

An image will be built with the tag airbyte/source-microsoft-dataverse:dev.

Via docker build:

docker build -t airbyte/source-microsoft-dataverse:dev .

Then run any of the connector commands as follows:

docker run --rm airbyte/source-microsoft-dataverse:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-microsoft-dataverse:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-microsoft-dataverse:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-microsoft-dataverse:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json

You can run our full test suite locally using airbyte-ci:

airbyte-ci connectors --name=source-microsoft-dataverse test

Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information. If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py.

All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups:

- dependencies required for your connector to work go in the MAIN_REQUIREMENTS list.
- dependencies required for testing go in the TEST_REQUIREMENTS list.

You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-microsoft-dataverse test
2. Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors.
3. Make sure the metadata.yaml content is up to date.
4. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/microsoft-dataverse.md).
5. Create a Pull Request: use our PR naming conventions.
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.
airbyte-source-microsoft-onedrive
Microsoft Onedrive Source

This is the repository for the Microsoft Onedrive source connector, written in Python. For information about how to use this connector within Airbyte, see the documentation.

To iterate on this connector, make sure to complete this prerequisites section.

From this connector directory, create a virtual environment:

python -m venv .venv

This will generate a virtualenv for this module in .venv/. Make sure this venv is active in your development environment of choice. To activate it from the terminal, run:

source .venv/bin/activate
pip install -r requirements.txt
pip install '.[tests]'

If you are in an IDE, follow your IDE's instructions to activate the virtualenv.

Note that while we are installing dependencies from requirements.txt, you should only edit setup.py for your dependencies. requirements.txt is used for editable installs (pip install -e) to pull in Python dependencies from the monorepo and will call setup.py. If this is mumbo jumbo to you, don't worry about it; just put your deps in setup.py but install using pip install -r requirements.txt and everything should work as you expect.

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_microsoft_onedrive/spec.yaml file. Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See integration_tests/sample_config.json for a sample config file.

If you are an Airbyte core member, copy the credentials in Lastpass under the secret name "source microsoft-onedrive test creds" and place them into secrets/config.json.

python main.py spec
python main.py check --config secrets/config.json
python main.py discover --config secrets/config.json
python main.py read --config secrets/config.json --catalog integration_tests/configured_catalog.json

The Airbyte way of building this connector is to use our airbyte-ci tool. You can follow the install instructions here. Then running the following command will build your connector:

airbyte-ci connectors --name source-microsoft-onedrive build

Once the command is done, you will find your connector image in your local docker registry: airbyte/source-microsoft-onedrive:dev.

When contributing to our connector you might need to customize the build process, for example to add a system dependency or set an env var. You can customize our build process by adding a build_customization.py module to your connector. This module should contain pre_connector_install and post_connector_install async functions that will mutate the base image and the connector container, respectively. It will be imported at runtime by our build process, and the functions will be called if they exist.

Here is an example of a build_customization.py module:

from __future__ import annotations

from typing import TYPE_CHECKING

if TYPE_CHECKING:
    from dagger import Container


async def pre_connector_install(base_image_container: Container) -> Container:
    return await base_image_container.with_env_variable("MY_PRE_BUILD_ENV_VAR", "my_pre_build_env_var_value")


async def post_connector_install(connector_container: Container) -> Container:
    return await connector_container.with_env_variable("MY_POST_BUILD_ENV_VAR", "my_post_build_env_var_value")

This connector is built using our dynamic build process in airbyte-ci. The base image used to build it is defined within the metadata.yaml file under the connectorBuildOptions. The build logic is defined using Dagger here. It does not rely on a Dockerfile.

If you would like to patch our connector and build your own, a simple approach would be to:

1. Create your own Dockerfile based on the latest version of the connector image:

FROM airbyte/source-microsoft-onedrive:latest
COPY . ./airbyte/integration_code
RUN pip install ./airbyte/integration_code

Please use this as an example. This is not optimized.

2. Build your image:

docker build -t airbyte/source-microsoft-onedrive:dev .
docker run airbyte/source-microsoft-onedrive:dev spec

Then run any of the connector commands as follows:

docker run --rm airbyte/source-microsoft-onedrive:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-microsoft-onedrive:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-microsoft-onedrive:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-microsoft-onedrive:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json

Make sure to familiarize yourself with pytest test discovery to know how your test files and methods should be named. First install test dependencies into your virtual environment:

pip install .[tests]

To run unit tests locally, from the connector directory run:

python -m pytest unit_tests

There are two types of integration tests: Acceptance Tests (Airbyte's test suite for all source connectors) and custom integration tests (which are specific to this connector). Place custom tests inside the integration_tests/ folder, then, from the connector root, run:

python -m pytest integration_tests

Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information. If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py.

Please run acceptance tests via airbyte-ci:

airbyte-ci connectors --name source-microsoft-onedrive test

All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups:

- dependencies required for your connector to work go in the MAIN_REQUIREMENTS list.
- dependencies required for testing go in the TEST_REQUIREMENTS list.

You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-microsoft-onedrive test
2. Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors.
3. Make sure the metadata.yaml content is up to date.
4. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/microsoft-onedrive.md).
5. Create a Pull Request: use our PR naming conventions.
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.
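If your acceptance tests need resources created up front and cleaned up afterwards, the conventional place is a session-scoped fixture in integration_tests/acceptance.py. A minimal sketch is below; the setup and teardown bodies are hypothetical placeholders, not the actual OneDrive test fixtures:

```python
import pytest

@pytest.fixture(scope="session", autouse=True)
def connector_setup():
    """Create resources the acceptance tests need, then tear them down."""
    # Hypothetical setup: e.g. seed test files in the account under test.
    resources = {"seeded": True}
    yield resources
    # Hypothetical teardown: remove the seeded test data.
    resources.clear()
```

Because the fixture is autouse and session-scoped, pytest runs the setup once before any acceptance test and the teardown once after the last one.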
airbyte-source-microsoft-teams
Rabbitmq Destination

This is the repository for the Rabbitmq destination connector, written in Python. For information about how to use this connector within Airbyte, see the documentation.

To iterate on this connector, make sure to complete this prerequisites section.

From this connector directory, create a virtual environment:

python -m venv .venv

This will generate a virtualenv for this module in .venv/. Make sure this venv is active in your development environment of choice. To activate it from the terminal, run:

source .venv/bin/activate
pip install -r requirements.txt

If you are in an IDE, follow your IDE's instructions to activate the virtualenv.

Note that while we are installing dependencies from requirements.txt, you should only edit setup.py for your dependencies. requirements.txt is used for editable installs (pip install -e) to pull in Python dependencies from the monorepo and will call setup.py. If this is mumbo jumbo to you, don't worry about it; just put your deps in setup.py but install using pip install -r requirements.txt and everything should work as you expect.

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the destination_rabbitmq/spec.json file. Note that the secrets directory is gitignored by default, so there is no danger of accidentally checking in sensitive information. See integration_tests/sample_config.json for a sample config file.

If you are an Airbyte core member, copy the credentials in Lastpass under the secret name "destination rabbitmq test creds" and place them into secrets/config.json.

python main.py spec
python main.py check --config secrets/config.json
python main.py discover --config secrets/config.json
python main.py read --config secrets/config.json --catalog integration_tests/configured_catalog.json

Via airbyte-ci (recommended):

airbyte-ci connectors --name destination-rabbitmq build

An image will be built with the tag airbyte/destination-rabbitmq:dev.

Via docker build:

docker build -t airbyte/destination-rabbitmq:dev .

Then run any of the connector commands as follows:

docker run --rm airbyte/destination-rabbitmq:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/destination-rabbitmq:dev check --config /secrets/config.json
cat messages.jsonl | docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/destination-rabbitmq:dev write --config /secrets/config.json --catalog /integration_tests/configured_catalog.json

You can run our full test suite locally using airbyte-ci:

airbyte-ci connectors --name=source-microsoft-teams test

Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information. If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py.

All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups:

- dependencies required for your connector to work go in the MAIN_REQUIREMENTS list.
- dependencies required for testing go in the TEST_REQUIREMENTS list.

You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-microsoft-teams test
2. Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors.
3. Make sure the metadata.yaml content is up to date.
4. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/microsoft-teams.md).
5. Create a Pull Request: use our PR naming conventions.
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.
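The write command above pipes newline-delimited Airbyte protocol messages into the destination on stdin. Each line of a file like messages.jsonl is one message; a RECORD message looks roughly like the following (the stream and field names here are made up for illustration):

```json
{"type": "RECORD", "record": {"stream": "orders", "data": {"id": 1, "total": 9.99}, "emitted_at": 1700000000000}}
```

The stream named in each record must also appear in the configured catalog passed via --catalog, or the destination will reject it.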
airbyte-source-mixpanel
Mixpanel source connector

This is the repository for the Mixpanel source connector, written in Python. For information about how to use this connector within Airbyte, see the documentation.

Local development

Prerequisites
- Python (~=3.9)
- Poetry (~=1.7) - installation instructions here

Installing the connector

From this connector directory, run:

poetry install --with dev

Create credentials

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_mixpanel/spec.yaml file. Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See sample_files/sample_config.json for a sample config file.

Locally running the connector

poetry run source-mixpanel spec
poetry run source-mixpanel check --config secrets/config.json
poetry run source-mixpanel discover --config secrets/config.json
poetry run source-mixpanel read --config secrets/config.json --catalog sample_files/configured_catalog.json

Running unit tests

To run unit tests locally, from the connector directory run:

poetry run pytest unit_tests

Building the docker image

1. Install airbyte-ci
2. Run the following command to build the docker image:

airbyte-ci connectors --name=source-mixpanel build

An image will be available on your host with the tag airbyte/source-mixpanel:dev.

Running as a docker container

Then run any of the connector commands as follows:

docker run --rm airbyte/source-mixpanel:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-mixpanel:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-mixpanel:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-mixpanel:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json

Running our CI test suite

You can run our full test suite locally using airbyte-ci:

airbyte-ci connectors --name=source-mixpanel test

Customizing acceptance tests

Customize the acceptance-test-config.yml file to configure acceptance tests. See Connector Acceptance Tests for more information. If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py.

Dependency management

All of your dependencies should be managed via Poetry. To add a new dependency, run:

poetry add <package-name>

Please commit the changes to the pyproject.toml and poetry.lock files.

Publishing a new version of the connector

You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-mixpanel test
2. Bump the connector version (please follow semantic versioning for connectors): bump the dockerImageTag value in metadata.yaml and the version value in pyproject.toml.
3. Make sure the metadata.yaml content is up to date.
4. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/mixpanel.md).
5. Create a Pull Request: use our PR naming conventions.
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.

Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry.
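Bumping the connector version touches two files, and the two values must stay in sync. For a hypothetical 1.2.3 release, the pyproject.toml side would look like this (the 1.2.3 value and package name are illustrative):

```toml
# pyproject.toml (illustrative excerpt)
[tool.poetry]
name = "airbyte-source-mixpanel"
version = "1.2.3"
```

The matching change in metadata.yaml sets dockerImageTag to the same 1.2.3 string.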
airbyte-source-monday
Monday source connector

This is the repository for the Monday source connector, written in Python. For information about how to use this connector within Airbyte, see the documentation.

Local development

Prerequisites
- Python (~=3.9)
- Poetry (~=1.7) - installation instructions here

Installing the connector

From this connector directory, run:

poetry install --with dev

Create credentials

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_monday/spec.yaml file. Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See sample_files/sample_config.json for a sample config file.

Locally running the connector

poetry run source-monday spec
poetry run source-monday check --config secrets/config.json
poetry run source-monday discover --config secrets/config.json
poetry run source-monday read --config secrets/config.json --catalog sample_files/configured_catalog.json

Running unit tests

To run unit tests locally, from the connector directory run:

poetry run pytest unit_tests

Building the docker image

1. Install airbyte-ci
2. Run the following command to build the docker image:

airbyte-ci connectors --name=source-monday build

An image will be available on your host with the tag airbyte/source-monday:dev.

Running as a docker container

Then run any of the connector commands as follows:

docker run --rm airbyte/source-monday:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-monday:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-monday:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-monday:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json

Running our CI test suite

You can run our full test suite locally using airbyte-ci:

airbyte-ci connectors --name=source-monday test

Customizing acceptance tests

Customize the acceptance-test-config.yml file to configure acceptance tests. See Connector Acceptance Tests for more information. If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py.

Dependency management

All of your dependencies should be managed via Poetry. To add a new dependency, run:

poetry add <package-name>

Please commit the changes to the pyproject.toml and poetry.lock files.

Publishing a new version of the connector

You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-monday test
2. Bump the connector version (please follow semantic versioning for connectors): bump the dockerImageTag value in metadata.yaml and the version value in pyproject.toml.
3. Make sure the metadata.yaml content is up to date.
4. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/monday.md).
5. Create a Pull Request: use our PR naming conventions.
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.

Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry.
airbyte-source-my-hours
My Hours Source

This is the repository for the My Hours source connector, written in Python. For information about how to use this connector within Airbyte, see the documentation.

To iterate on this connector, make sure to complete this prerequisites section.

From this connector directory, create a virtual environment:

python -m venv .venv

This will generate a virtualenv for this module in .venv/. Make sure this venv is active in your development environment of choice. To activate it from the terminal, run:

source .venv/bin/activate
pip install -r requirements.txt
pip install '.[tests]'

If you are in an IDE, follow your IDE's instructions to activate the virtualenv.

Note that while we are installing dependencies from requirements.txt, you should only edit setup.py for your dependencies. requirements.txt is used for editable installs (pip install -e) to pull in Python dependencies from the monorepo and will call setup.py. If this is mumbo jumbo to you, don't worry about it; just put your deps in setup.py but install using pip install -r requirements.txt and everything should work as you expect.

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_my_hours/spec.json file. Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See integration_tests/sample_config.json for a sample config file.

If you are an Airbyte core member, copy the credentials in Lastpass under the secret name "source my-hours test creds" and place them into secrets/config.json.

python main.py spec
python main.py check --config secrets/config.json
python main.py discover --config secrets/config.json
python main.py read --config secrets/config.json --catalog integration_tests/configured_catalog.json

Via airbyte-ci (recommended):

airbyte-ci connectors --name=source-my-hours build

An image will be built with the tag airbyte/source-my-hours:dev.

Via docker build:

docker build -t airbyte/source-my-hours:dev .

Then run any of the connector commands as follows:

docker run --rm airbyte/source-my-hours:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-my-hours:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-my-hours:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-my-hours:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json

You can run our full test suite locally using airbyte-ci:

airbyte-ci connectors --name=source-my-hours test

Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information. If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py.

All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups:

- dependencies required for your connector to work go in the MAIN_REQUIREMENTS list.
- dependencies required for testing go in the TEST_REQUIREMENTS list.

You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-my-hours test
2. Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors.
3. Make sure the metadata.yaml content is up to date.
4. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/my-hours.md).
5. Create a Pull Request: use our PR naming conventions.
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.
airbyte-source-n8n
N8n Source

This is the repository for the N8n configuration-based source connector. For information about how to use this connector within Airbyte, see the documentation.

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_n8n/spec.yaml file. Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See integration_tests/sample_config.json for a sample config file.

If you are an Airbyte core member, copy the credentials in Lastpass under the secret name "source n8n test creds" and place them into secrets/config.json.

Via airbyte-ci (recommended):

airbyte-ci connectors --name=source-n8n build

An image will be built with the tag airbyte/source-n8n:dev.

Via docker build:

docker build -t airbyte/source-n8n:dev .

Then run any of the connector commands as follows:

docker run --rm airbyte/source-n8n:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-n8n:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-n8n:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-n8n:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json

You can run our full test suite locally using airbyte-ci:

airbyte-ci connectors --name=source-n8n test

Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information. If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py.

All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups:

- dependencies required for your connector to work go in the MAIN_REQUIREMENTS list.
- dependencies required for testing go in the TEST_REQUIREMENTS list.

You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-n8n test
2. Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors.
3. Make sure the metadata.yaml content is up to date.
4. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/n8n.md).
5. Create a Pull Request: use our PR naming conventions.
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.
airbyte-source-nasa
Nasa Source

This is the repository for the Nasa configuration based source connector. For information about how to use this connector within Airbyte, see the documentation.

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_nasa/spec.yaml file. Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See integration_tests/sample_config.json for a sample config file.

If you are an Airbyte core member, copy the credentials in Lastpass under the secret name "source nasa test creds" and place them into secrets/config.json.

Via airbyte-ci (recommended):

airbyte-ci connectors --name=source-nasa build

An image will be built with the tag airbyte/source-nasa:dev.

Via docker build:

docker build -t airbyte/source-nasa:dev .

Then run any of the connector commands as follows:

docker run --rm airbyte/source-nasa:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-nasa:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-nasa:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-nasa:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json

You can run our full test suite locally using airbyte-ci:

airbyte-ci connectors --name=source-nasa test

Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information. If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py.

All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups:

- Dependencies required for your connector to work go in the MAIN_REQUIREMENTS list.
- Dependencies required for testing go in the TEST_REQUIREMENTS list.

You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-nasa test
2. Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors.
3. Make sure the metadata.yaml content is up to date.
4. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/nasa.md).
5. Create a Pull Request: use our PR naming conventions.
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.
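As a rough illustration of the fixtures mentioned above, integration_tests/acceptance.py usually holds setup/teardown logic that runs around the acceptance test session. The sketch below uses a plain generator so it stands alone; in the real file it would be decorated with @pytest.fixture(scope="session", autouse=True), and the "resource" dict is a made-up placeholder:

```python
# Sketch of a setup/teardown fixture for integration_tests/acceptance.py.
# Written as a plain generator for illustration; the real file would use
# @pytest.fixture(scope="session", autouse=True). The resource is hypothetical.

def connector_setup():
    # Setup: create whatever the acceptance tests need (accounts, records, ...)
    resource = {"name": "acceptance-test-resource", "created": True}
    yield resource
    # Teardown: runs after the tests have finished with the yielded resource
    resource["created"] = False

# Driving the generator by hand mimics what pytest does around a session:
gen = connector_setup()
resource = next(gen)   # setup runs; tests would execute here
next(gen, None)        # teardown runs
```

The yield-based shape is the standard pytest pattern: everything before the yield is setup, everything after is teardown.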
airbyte-source-netsuite
Netsuite Soap Source

This is the repository for the Netsuite Soap source connector, written in Python. For information about how to use this connector within Airbyte, see the documentation.

To iterate on this connector, make sure to complete this prerequisites section. From this connector directory, create a virtual environment:

python -m venv .venv

This will generate a virtualenv for this module in .venv/. Make sure this venv is active in your development environment of choice. To activate it from the terminal, run:

source .venv/bin/activate
pip install -r requirements.txt
pip install '.[tests]'

If you are in an IDE, follow your IDE's instructions to activate the virtualenv. Note that while we are installing dependencies from requirements.txt, you should only edit setup.py for your dependencies. requirements.txt is used for editable installs (pip install -e) to pull in Python dependencies from the monorepo and will call setup.py. If this is mumbo jumbo to you, don't worry about it; just put your deps in setup.py but install using pip install -r requirements.txt and everything should work as you expect.

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_netsuite_soap/spec.yaml file. Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See integration_tests/sample_config.json for a sample config file.

If you are an Airbyte core member, copy the credentials in Lastpass under the secret name "source netsuite test creds" and place them into secrets/config.json.

python main.py spec
python main.py check --config secrets/config.json
python main.py discover --config secrets/config.json
python main.py read --config secrets/config.json --catalog integration_tests/configured_catalog.json

Via airbyte-ci (recommended):

airbyte-ci connectors --name=source-netsuite build

An image will be built with the tag airbyte/source-netsuite:dev.

Via docker build:

docker build -t airbyte/source-netsuite:dev .

Then run any of the connector commands as follows:

docker run --rm airbyte/source-netsuite:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-netsuite:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-netsuite:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-netsuite:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json

You can run our full test suite locally using airbyte-ci:

airbyte-ci connectors --name=source-netsuite test

Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information. If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py.

All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups:

- Dependencies required for your connector to work go in the MAIN_REQUIREMENTS list.
- Dependencies required for testing go in the TEST_REQUIREMENTS list.

You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-netsuite test
2. Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors.
3. Make sure the metadata.yaml content is up to date.
4. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/netsuite.md).
5. Create a Pull Request: use our PR naming conventions.
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.
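To make the relationship between secrets/config.json and the connector's spec concrete, here is a minimal, hypothetical check that a config provides the fields a spec declares as required. The field names below are invented for illustration and do not come from any real spec.yaml:

```python
import json

# Hypothetical required fields, as a connector's spec.yaml might declare them.
SPEC_REQUIRED = ["api_key", "start_date"]

def missing_fields(config_json: str) -> list:
    """Return the required fields absent from a config JSON string."""
    config = json.loads(config_json)
    return [field for field in SPEC_REQUIRED if field not in config]

# A config that is missing one required field:
missing = missing_fields('{"api_key": "XXXX"}')
```

The connector's check command performs a much fuller validation against the actual spec; this sketch only shows the shape of the contract.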
airbyte-source-news-api
News Api Source

This is the repository for the News Api configuration based source connector. For information about how to use this connector within Airbyte, see the documentation.

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_news_api/spec.yaml file. Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See integration_tests/sample_config.json for a sample config file.

If you are an Airbyte core member, copy the credentials in Lastpass under the secret name "source news-api test creds" and place them into secrets/config.json.

Via airbyte-ci (recommended):

airbyte-ci connectors --name=source-news-api build

An image will be built with the tag airbyte/source-news-api:dev.

Via docker build:

docker build -t airbyte/source-news-api:dev .

Then run any of the connector commands as follows:

docker run --rm airbyte/source-news-api:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-news-api:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-news-api:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-news-api:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json

You can run our full test suite locally using airbyte-ci:

airbyte-ci connectors --name=source-news-api test

Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information. If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py.

All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups:

- Dependencies required for your connector to work go in the MAIN_REQUIREMENTS list.
- Dependencies required for testing go in the TEST_REQUIREMENTS list.

You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-news-api test
2. Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors.
3. Make sure the metadata.yaml content is up to date.
4. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/news-api.md).
5. Create a Pull Request: use our PR naming conventions.
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.
airbyte-source-newsdata
Newsdata Source

This is the repository for the Newsdata configuration based source connector. For information about how to use this connector within Airbyte, see the documentation.

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_newsdata/spec.yaml file. Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See integration_tests/sample_config.json for a sample config file.

If you are an Airbyte core member, copy the credentials in Lastpass under the secret name "source newsdata test creds" and place them into secrets/config.json.

Via airbyte-ci (recommended):

airbyte-ci connectors --name=source-newsdata build

An image will be built with the tag airbyte/source-newsdata:dev.

Via docker build:

docker build -t airbyte/source-newsdata:dev .

Then run any of the connector commands as follows:

docker run --rm airbyte/source-newsdata:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-newsdata:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-newsdata:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-newsdata:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json

You can run our full test suite locally using airbyte-ci:

airbyte-ci connectors --name=source-newsdata test

Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information. If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py.

All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups:

- Dependencies required for your connector to work go in the MAIN_REQUIREMENTS list.
- Dependencies required for testing go in the TEST_REQUIREMENTS list.

You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-newsdata test
2. Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors.
3. Make sure the metadata.yaml content is up to date.
4. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/newsdata.md).
5. Create a Pull Request: use our PR naming conventions.
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.
airbyte-source-notion
Notion source connector

This is the repository for the Notion source connector, written in Python. For information about how to use this connector within Airbyte, see the documentation.

Local development

Prerequisites:
- Python (~=3.9)
- Poetry (~=1.7) - installation instructions here

Installing the connector

From this connector directory, run:

poetry install --with dev

Create credentials

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_notion/spec.yaml file. Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See sample_files/sample_config.json for a sample config file.

Locally running the connector

poetry run source-notion spec
poetry run source-notion check --config secrets/config.json
poetry run source-notion discover --config secrets/config.json
poetry run source-notion read --config secrets/config.json --catalog sample_files/configured_catalog.json

Running unit tests

To run unit tests locally, from the connector directory run:

poetry run pytest unit_tests

Building the docker image

Install airbyte-ci, then run the following command to build the docker image:

airbyte-ci connectors --name=source-notion build

An image will be available on your host with the tag airbyte/source-notion:dev.

Running as a docker container

Then run any of the connector commands as follows:

docker run --rm airbyte/source-notion:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-notion:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-notion:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-notion:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json

Running our CI test suite

You can run our full test suite locally using airbyte-ci:

airbyte-ci connectors --name=source-notion test

Customizing acceptance tests

Customize the acceptance-test-config.yml file to configure acceptance tests. See Connector Acceptance Tests for more information. If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py.

Dependency management

All of your dependencies should be managed via Poetry. To add a new dependency, run:

poetry add <package-name>

Please commit the changes to the pyproject.toml and poetry.lock files.

Publishing a new version of the connector

You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-notion test
2. Bump the connector version (please follow semantic versioning for connectors): bump the dockerImageTag value in metadata.yaml and the version value in pyproject.toml.
3. Make sure the metadata.yaml content is up to date.
4. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/notion.md).
5. Create a Pull Request: use our PR naming conventions.
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.

Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry.
airbyte-source-nytimes
Nytimes Source

This is the repository for the Nytimes configuration based source connector. For information about how to use this connector within Airbyte, see the documentation.

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_nytimes/spec.yaml file. Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See integration_tests/sample_config.json for a sample config file.

If you are an Airbyte core member, copy the credentials in Lastpass under the secret name "source nytimes test creds" and place them into secrets/config.json.

Via airbyte-ci (recommended):

airbyte-ci connectors --name=source-nytimes build

An image will be built with the tag airbyte/source-nytimes:dev.

Via docker build:

docker build -t airbyte/source-nytimes:dev .

Then run any of the connector commands as follows:

docker run --rm airbyte/source-nytimes:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-nytimes:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-nytimes:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-nytimes:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json

You can run our full test suite locally using airbyte-ci:

airbyte-ci connectors --name=source-nytimes test

Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information. If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py.

All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups:

- Dependencies required for your connector to work go in the MAIN_REQUIREMENTS list.
- Dependencies required for testing go in the TEST_REQUIREMENTS list.

You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-nytimes test
2. Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors.
3. Make sure the metadata.yaml content is up to date.
4. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/nytimes.md).
5. Create a Pull Request: use our PR naming conventions.
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.
airbyte-source-okta
Okta Source

This is the repository for the Okta source connector, written in Python. For information about how to use this connector within Airbyte, see the documentation.

To iterate on this connector, make sure to complete this prerequisites section. From this connector directory, create a virtual environment:

python -m venv .venv

This will generate a virtualenv for this module in .venv/. Make sure this venv is active in your development environment of choice. To activate it from the terminal, run:

source .venv/bin/activate
pip install -r requirements.txt

If you are in an IDE, follow your IDE's instructions to activate the virtualenv. Note that while we are installing dependencies from requirements.txt, you should only edit setup.py for your dependencies. requirements.txt is used for editable installs (pip install -e) to pull in Python dependencies from the monorepo and will call setup.py. If this is mumbo jumbo to you, don't worry about it; just put your deps in setup.py but install using pip install -r requirements.txt and everything should work as you expect.

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_okta/spec.json file. Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See integration_tests/sample_config.json for a sample config file.

If you are an Airbyte core member, copy the credentials in Lastpass under the secret name "source okta test creds" and place them into secrets/config.json.

python main.py spec
python main.py check --config secrets/config.json
python main.py discover --config secrets/config.json
python main.py read --config secrets/config.json --catalog integration_tests/configured_catalog.json

Via airbyte-ci (recommended):

airbyte-ci connectors --name=source-okta build

An image will be built with the tag airbyte/source-okta:dev.

Via docker build:

docker build -t airbyte/source-okta:dev .

Then run any of the connector commands as follows:

docker run --rm airbyte/source-okta:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-okta:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-okta:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-okta:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json

You can run our full test suite locally using airbyte-ci:

airbyte-ci connectors --name=source-okta test

Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information. If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py.

All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups:

- Dependencies required for your connector to work go in the MAIN_REQUIREMENTS list.
- Dependencies required for testing go in the TEST_REQUIREMENTS list.

You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-okta test
2. Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors.
3. Make sure the metadata.yaml content is up to date.
4. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/okta.md).
5. Create a Pull Request: use our PR naming conventions.
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.
airbyte-source-omnisend
Omnisend Source

This is the repository for the Omnisend configuration based source connector. For information about how to use this connector within Airbyte, see the documentation.

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_omnisend/spec.yaml file. Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See integration_tests/sample_config.json for a sample config file.

If you are an Airbyte core member, copy the credentials in Lastpass under the secret name "source omnisend test creds" and place them into secrets/config.json.

Via airbyte-ci (recommended):

airbyte-ci connectors --name=source-omnisend build

An image will be built with the tag airbyte/source-omnisend:dev.

Via docker build:

docker build -t airbyte/source-omnisend:dev .

Then run any of the connector commands as follows:

docker run --rm airbyte/source-omnisend:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-omnisend:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-omnisend:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-omnisend:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json

You can run our full test suite locally using airbyte-ci:

airbyte-ci connectors --name=source-omnisend test

Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information. If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py.

All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups:

- Dependencies required for your connector to work go in the MAIN_REQUIREMENTS list.
- Dependencies required for testing go in the TEST_REQUIREMENTS list.

You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-omnisend test
2. Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors.
3. Make sure the metadata.yaml content is up to date.
4. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/omnisend.md).
5. Create a Pull Request: use our PR naming conventions.
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.
airbyte-source-onesignal
Onesignal Source

This is the repository for the Onesignal configuration based source connector. For information about how to use this connector within Airbyte, see the documentation.

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_onesignal/spec.yaml file. Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See integration_tests/sample_config.json for a sample config file.

If you are an Airbyte core member, copy the credentials in Lastpass under the secret name "source onesignal test creds" and place them into secrets/config.json.

Via airbyte-ci (recommended):

airbyte-ci connectors --name=source-onesignal build

An image will be built with the tag airbyte/source-onesignal:dev.

Via docker build:

docker build -t airbyte/source-onesignal:dev .

Then run any of the connector commands as follows:

docker run --rm airbyte/source-onesignal:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-onesignal:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-onesignal:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-onesignal:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json

You can run our full test suite locally using airbyte-ci:

airbyte-ci connectors --name=source-onesignal test

Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information. If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py.

All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups:

- Dependencies required for your connector to work go in the MAIN_REQUIREMENTS list.
- Dependencies required for testing go in the TEST_REQUIREMENTS list.

You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-onesignal test
2. Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors.
3. Make sure the metadata.yaml content is up to date.
4. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/onesignal.md).
5. Create a Pull Request: use our PR naming conventions.
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.
airbyte-source-open-exchange-rates
Open Exchange Rates Source

This is the repository for the Open Exchange Rates configuration-based source connector. For information about how to use this connector within Airbyte, see the documentation.

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_open_exchange_rates/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `integration_tests/sample_config.json` for a sample config file.

If you are an Airbyte core member, copy the credentials in Lastpass under the secret name `source open-exchange-rates test creds` and place them into `secrets/config.json`.

Via `airbyte-ci` (recommended):

    airbyte-ci connectors --name=source-open-exchange-rates build

An image will be built with the tag `airbyte/source-open-exchange-rates:dev`.

Via `docker build`:

    docker build -t airbyte/source-open-exchange-rates:dev .

Then run any of the connector commands as follows:

    docker run --rm airbyte/source-open-exchange-rates:dev spec
    docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-open-exchange-rates:dev check --config /secrets/config.json
    docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-open-exchange-rates:dev discover --config /secrets/config.json
    docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-open-exchange-rates:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json

You can run our full test suite locally using `airbyte-ci`:

    airbyte-ci connectors --name=source-open-exchange-rates test

Customize the `acceptance-test-config.yml` file to configure tests. See Connector Acceptance Tests for more information. If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside `integration_tests/acceptance.py`.

All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups:

- dependencies required for your connector to work go in the `MAIN_REQUIREMENTS` list
- dependencies required for testing go in the `TEST_REQUIREMENTS` list

You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-open-exchange-rates test`
2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow semantic versioning for connectors.
3. Make sure the `metadata.yaml` content is up to date.
4. Make sure the connector documentation and its changelog are up to date (`docs/integrations/sources/open-exchange-rates.md`).
5. Create a Pull Request: use our PR naming conventions.
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.
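The shape of `secrets/config.json` is dictated by the connector's `spec.yaml`. As a rough illustration only — the field names below are hypothetical, so consult the actual spec for the real ones — a minimal config might look like:

```json
{
  "app_id": "<your Open Exchange Rates API key>",
  "base": "USD",
  "start_date": "2024-01-01"
}
```

Whatever the real fields are, the file lives under `secrets/` precisely so that it stays out of version control.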
airbyte-source-openweather
Openweather Source

This is the repository for the Openweather configuration-based source connector. For information about how to use this connector within Airbyte, see the documentation.

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_openweather/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `integration_tests/sample_config.json` for a sample config file.

If you are an Airbyte core member, copy the credentials in Lastpass under the secret name `source openweather test creds` and place them into `secrets/config.json`.

Via `airbyte-ci` (recommended):

    airbyte-ci connectors --name=source-openweather build

An image will be built with the tag `airbyte/source-openweather:dev`.

Via `docker build`:

    docker build -t airbyte/source-openweather:dev .

Then run any of the connector commands as follows:

    docker run --rm airbyte/source-openweather:dev spec
    docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-openweather:dev check --config /secrets/config.json
    docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-openweather:dev discover --config /secrets/config.json
    docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-openweather:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json

You can run our full test suite locally using `airbyte-ci`:

    airbyte-ci connectors --name=source-openweather test

Customize the `acceptance-test-config.yml` file to configure tests. See Connector Acceptance Tests for more information. If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside `integration_tests/acceptance.py`.

All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups:

- dependencies required for your connector to work go in the `MAIN_REQUIREMENTS` list
- dependencies required for testing go in the `TEST_REQUIREMENTS` list

You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-openweather test`
2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow semantic versioning for connectors.
3. Make sure the `metadata.yaml` content is up to date.
4. Make sure the connector documentation and its changelog are up to date (`docs/integrations/sources/openweather.md`).
5. Create a Pull Request: use our PR naming conventions.
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.
airbyte-source-opsgenie
Opsgenie Source

This is the repository for the Opsgenie source connector, built on the low-code configuration-based framework. For information about how to use this connector within Airbyte, see the documentation.

You can also build the connector in Gradle. This is typically used in CI and not needed for your development workflow. To build using Gradle, from the Airbyte repository root, run:

    ./gradlew :airbyte-integrations:connectors:source-opsgenie:build

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_opsgenie/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `integration_tests/sample_config.json` for a sample config file.

If you are an Airbyte core member, copy the credentials in Lastpass under the secret name `source opsgenie test creds` and place them into `secrets/config.json`.

    python main.py spec
    python main.py check --config secrets/config.json
    python main.py discover --config secrets/config.json
    python main.py read --config secrets/config.json --catalog integration_tests/configured_catalog.json

Via `airbyte-ci` (recommended):

    airbyte-ci connectors --name=source-opsgenie build

An image will be built with the tag `airbyte/source-opsgenie:dev`.

Via `docker build`:

    docker build -t airbyte/source-opsgenie:dev .

Then run any of the connector commands as follows:

    docker run --rm airbyte/source-opsgenie:dev spec
    docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-opsgenie:dev check --config /secrets/config.json
    docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-opsgenie:dev discover --config /secrets/config.json
    docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-opsgenie:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json

You can run our full test suite locally using `airbyte-ci`:

    airbyte-ci connectors --name=source-opsgenie test

Customize the `acceptance-test-config.yml` file to configure tests. See Connector Acceptance Tests for more information. If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside `integration_tests/acceptance.py`.

All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups:

- dependencies required for your connector to work go in the `MAIN_REQUIREMENTS` list
- dependencies required for testing go in the `TEST_REQUIREMENTS` list

You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-opsgenie test`
2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow semantic versioning for connectors.
3. Make sure the `metadata.yaml` content is up to date.
4. Make sure the connector documentation and its changelog are up to date (`docs/integrations/sources/opsgenie.md`).
5. Create a Pull Request: use our PR naming conventions.
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.
airbyte-source-orb
Orb Source

This is the repository for the Orb source connector, written in Python. For information about how to use this connector within Airbyte, see the documentation.

To iterate on this connector, make sure to complete this prerequisites section. From this connector directory, create a virtual environment:

    python -m venv .venv

This will generate a virtualenv for this module in `.venv/`. Make sure this venv is active in your development environment of choice. To activate it from the terminal, run:

    source .venv/bin/activate
    pip install -r requirements.txt
    pip install '.[tests]'

If you are in an IDE, follow your IDE's instructions to activate the virtualenv.

Note that while we are installing dependencies from `requirements.txt`, you should only edit `setup.py` for your dependencies. `requirements.txt` is used for editable installs (`pip install -e`) to pull in Python dependencies from the monorepo and will call `setup.py`. If this is mumbo jumbo to you, don't worry about it; just put your deps in `setup.py` but install using `pip install -r requirements.txt`, and everything should work as you expect.

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_orb/spec.json` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `integration_tests/sample_config.json` for a sample config file.

If you are an Airbyte core member, copy the credentials in Lastpass under the secret name `source orb test creds` and place them into `secrets/config.json`.

    python main.py spec
    python main.py check --config secrets/config.json
    python main.py discover --config secrets/config.json
    python main.py read --config secrets/config.json --catalog integration_tests/configured_catalog.json

Via `airbyte-ci` (recommended):

    airbyte-ci connectors --name=source-orb build

An image will be built with the tag `airbyte/source-orb:dev`.

Via `docker build`:

    docker build -t airbyte/source-orb:dev .

Then run any of the connector commands as follows:

    docker run --rm airbyte/source-orb:dev spec
    docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-orb:dev check --config /secrets/config.json
    docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-orb:dev discover --config /secrets/config.json
    docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-orb:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json

You can run our full test suite locally using `airbyte-ci`:

    airbyte-ci connectors --name=source-orb test

Customize the `acceptance-test-config.yml` file to configure tests. See Connector Acceptance Tests for more information. If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside `integration_tests/acceptance.py`.

All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups:

- dependencies required for your connector to work go in the `MAIN_REQUIREMENTS` list
- dependencies required for testing go in the `TEST_REQUIREMENTS` list

You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-orb test`
2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow semantic versioning for connectors.
3. Make sure the `metadata.yaml` content is up to date.
4. Make sure the connector documentation and its changelog are up to date (`docs/integrations/sources/orb.md`).
5. Create a Pull Request: use our PR naming conventions.
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.
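The `MAIN_REQUIREMENTS` / `TEST_REQUIREMENTS` split described above lives in the connector's `setup.py`. Here is a minimal sketch of that pattern — the package name and the pinned dependencies are illustrative, not the actual Orb connector's:

```python
# Hypothetical sketch of an Airbyte Python connector's setup.py dependency split.
from setuptools import find_packages

# Runtime dependencies: required for the connector to work at all.
MAIN_REQUIREMENTS = ["airbyte-cdk"]

# Test-only dependencies: pulled in via `pip install '.[tests]'`.
TEST_REQUIREMENTS = ["pytest", "requests-mock"]


def build_setup_kwargs():
    """Collect the arguments a real setup.py would pass to setuptools.setup()."""
    return {
        "name": "source_example",  # illustrative name, not a real connector
        "packages": find_packages(),
        "install_requires": MAIN_REQUIREMENTS,
        "extras_require": {"tests": TEST_REQUIREMENTS},
    }


# A real setup.py would end with: setup(**build_setup_kwargs())
```

The `extras_require={"tests": ...}` entry is what makes `pip install '.[tests]'` (from the venv instructions above) install the test-only group on top of the runtime dependencies.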
airbyte-source-orbit
Orbit Source

This is the repository for the Orbit configuration-based source connector. For information about how to use this connector within Airbyte, see the documentation.

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_orbit/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `integration_tests/sample_config.json` for a sample config file.

If you are an Airbyte core member, copy the credentials in Lastpass under the secret name `source orbit test creds` and place them into `secrets/config.json`.

Via `airbyte-ci` (recommended):

    airbyte-ci connectors --name=source-orbit build

An image will be built with the tag `airbyte/source-orbit:dev`.

Via `docker build`:

    docker build -t airbyte/source-orbit:dev .

Then run any of the connector commands as follows:

    docker run --rm airbyte/source-orbit:dev spec
    docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-orbit:dev check --config /secrets/config.json
    docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-orbit:dev discover --config /secrets/config.json
    docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-orbit:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json

You can run our full test suite locally using `airbyte-ci`:

    airbyte-ci connectors --name=source-orbit test

Customize the `acceptance-test-config.yml` file to configure tests. See Connector Acceptance Tests for more information. If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside `integration_tests/acceptance.py`.

All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups:

- dependencies required for your connector to work go in the `MAIN_REQUIREMENTS` list
- dependencies required for testing go in the `TEST_REQUIREMENTS` list

You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-orbit test`
2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow semantic versioning for connectors.
3. Make sure the `metadata.yaml` content is up to date.
4. Make sure the connector documentation and its changelog are up to date (`docs/integrations/sources/orbit.md`).
5. Create a Pull Request: use our PR naming conventions.
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.
airbyte-source-oura
Oura Source

This is the repository for the Oura configuration-based source connector. For information about how to use this connector within Airbyte, see the documentation.

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_oura/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `integration_tests/sample_config.json` for a sample config file.

If you are an Airbyte core member, copy the credentials in Lastpass under the secret name `source oura test creds` and place them into `secrets/config.json`.

Via `airbyte-ci` (recommended):

    airbyte-ci connectors --name=source-oura build

An image will be built with the tag `airbyte/source-oura:dev`.

Via `docker build`:

    docker build -t airbyte/source-oura:dev .

Then run any of the connector commands as follows:

    docker run --rm airbyte/source-oura:dev spec
    docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-oura:dev check --config /secrets/config.json
    docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-oura:dev discover --config /secrets/config.json
    docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-oura:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json

You can run our full test suite locally using `airbyte-ci`:

    airbyte-ci connectors --name=source-oura test

Customize the `acceptance-test-config.yml` file to configure tests. See Connector Acceptance Tests for more information. If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside `integration_tests/acceptance.py`.

All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups:

- dependencies required for your connector to work go in the `MAIN_REQUIREMENTS` list
- dependencies required for testing go in the `TEST_REQUIREMENTS` list

You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-oura test`
2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow semantic versioning for connectors.
3. Make sure the `metadata.yaml` content is up to date.
4. Make sure the connector documentation and its changelog are up to date (`docs/integrations/sources/oura.md`).
5. Create a Pull Request: use our PR naming conventions.
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.
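The `integration_tests/acceptance.py` hook mentioned in these READMEs is an ordinary pytest module. A minimal sketch of a session-scoped fixture that sets up and tears down acceptance-test resources might look like this (the resource-management steps are placeholders; a connector that needs no setup can leave the body as a bare `yield`):

```python
# Hypothetical sketch of integration_tests/acceptance.py.
import pytest


@pytest.fixture(scope="session", autouse=True)
def connector_setup():
    """Create any resources the acceptance tests need, then clean them up."""
    # ... create test resources here (placeholder) ...
    yield  # the acceptance test suite runs at this point
    # ... destroy test resources here (placeholder) ...
```

Because the fixture is `autouse=True` and session-scoped, pytest runs the setup once before the whole acceptance suite and the teardown once after it, without any test referencing the fixture explicitly.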
airbyte-source-outbrain-amplify
Outbrain Amplify Source

This is the repository for the Outbrain Amplify source connector, written in Python. For information about how to use this connector within Airbyte, see the documentation.

To iterate on this connector, make sure to complete this prerequisites section. From this connector directory, create a virtual environment:

    python -m venv .venv

This will generate a virtualenv for this module in `.venv/`. Make sure this venv is active in your development environment of choice. To activate it from the terminal, run:

    source .venv/bin/activate
    pip install -r requirements.txt
    pip install '.[tests]'

If you are in an IDE, follow your IDE's instructions to activate the virtualenv.

Note that while we are installing dependencies from `requirements.txt`, you should only edit `setup.py` for your dependencies. `requirements.txt` is used for editable installs (`pip install -e`) to pull in Python dependencies from the monorepo and will call `setup.py`. If this is mumbo jumbo to you, don't worry about it; just put your deps in `setup.py` but install using `pip install -r requirements.txt`, and everything should work as you expect.

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_outbrain_amplify/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `integration_tests/sample_config.json` for a sample config file.

If you are an Airbyte core member, copy the credentials in Lastpass under the secret name `source outbrain-amplify test creds` and place them into `secrets/config.json`.

    python main.py spec
    python main.py check --config secrets/config.json
    python main.py discover --config secrets/config.json
    python main.py read --config secrets/config.json --catalog integration_tests/configured_catalog.json

Via `airbyte-ci` (recommended):

    airbyte-ci connectors --name=source-outbrain-amplify build

An image will be built with the tag `airbyte/source-outbrain-amplify:dev`.

Via `docker build`:

    docker build -t airbyte/source-outbrain-amplify:dev .

Then run any of the connector commands as follows:

    docker run --rm airbyte/source-outbrain-amplify:dev spec
    docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-outbrain-amplify:dev check --config /secrets/config.json
    docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-outbrain-amplify:dev discover --config /secrets/config.json
    docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-outbrain-amplify:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json

You can run our full test suite locally using `airbyte-ci`:

    airbyte-ci connectors --name=source-outbrain-amplify test

Customize the `acceptance-test-config.yml` file to configure tests. See Connector Acceptance Tests for more information. If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside `integration_tests/acceptance.py`.

All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups:

- dependencies required for your connector to work go in the `MAIN_REQUIREMENTS` list
- dependencies required for testing go in the `TEST_REQUIREMENTS` list

You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-outbrain-amplify test`
2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow semantic versioning for connectors.
3. Make sure the `metadata.yaml` content is up to date.
4. Make sure the connector documentation and its changelog are up to date (`docs/integrations/sources/outbrain-amplify.md`).
5. Create a Pull Request: use our PR naming conventions.
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.
airbyte-source-outreach
Outreach Source

This is the repository for the Outreach source connector, written in Python. For information about how to use this connector within Airbyte, see the documentation.

To iterate on this connector, make sure to complete this prerequisites section. From this connector directory, create a virtual environment:

    python -m venv .venv

This will generate a virtualenv for this module in `.venv/`. Make sure this venv is active in your development environment of choice. To activate it from the terminal, run:

    source .venv/bin/activate
    pip install -r requirements.txt
    pip install '.[tests]'

If you are in an IDE, follow your IDE's instructions to activate the virtualenv.

Note that while we are installing dependencies from `requirements.txt`, you should only edit `setup.py` for your dependencies. `requirements.txt` is used for editable installs (`pip install -e`) to pull in Python dependencies from the monorepo and will call `setup.py`. If this is mumbo jumbo to you, don't worry about it; just put your deps in `setup.py` but install using `pip install -r requirements.txt`, and everything should work as you expect.

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_outreach/spec.json` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `integration_tests/sample_config.json` for a sample config file.

If you are an Airbyte core member, copy the credentials in Lastpass under the secret name `source outreach test creds` and place them into `secrets/config.json`.

    python main.py spec
    python main.py check --config secrets/config.json
    python main.py discover --config secrets/config.json
    python main.py read --config secrets/config.json --catalog integration_tests/configured_catalog.json

Via `airbyte-ci` (recommended):

    airbyte-ci connectors --name=source-outreach build

An image will be built with the tag `airbyte/source-outreach:dev`.

Via `docker build`:

    docker build -t airbyte/source-outreach:dev .

Then run any of the connector commands as follows:

    docker run --rm airbyte/source-outreach:dev spec
    docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-outreach:dev check --config /secrets/config.json
    docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-outreach:dev discover --config /secrets/config.json
    docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-outreach:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json

You can run our full test suite locally using `airbyte-ci`:

    airbyte-ci connectors --name=source-outreach test

Customize the `acceptance-test-config.yml` file to configure tests. See Connector Acceptance Tests for more information. If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside `integration_tests/acceptance.py`.

All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups:

- dependencies required for your connector to work go in the `MAIN_REQUIREMENTS` list
- dependencies required for testing go in the `TEST_REQUIREMENTS` list

You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-outreach test`
2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow semantic versioning for connectors.
3. Make sure the `metadata.yaml` content is up to date.
4. Make sure the connector documentation and its changelog are up to date (`docs/integrations/sources/outreach.md`).
5. Create a Pull Request: use our PR naming conventions.
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.
airbyte-source-pagerduty
Pagerduty Source

This is the repository for the Pagerduty configuration-based source connector. For information about how to use this connector within Airbyte, see the documentation.

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_pagerduty/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `integration_tests/sample_config.json` for a sample config file.

If you are an Airbyte core member, copy the credentials in Lastpass under the secret name `source pagerduty test creds` and place them into `secrets/config.json`.

Via `airbyte-ci` (recommended):

    airbyte-ci connectors --name=source-pagerduty build

An image will be built with the tag `airbyte/source-pagerduty:dev`.

Via `docker build`:

    docker build -t airbyte/source-pagerduty:dev .

Then run any of the connector commands as follows:

    docker run --rm airbyte/source-pagerduty:dev spec
    docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-pagerduty:dev check --config /secrets/config.json
    docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-pagerduty:dev discover --config /secrets/config.json
    docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-pagerduty:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json

You can run our full test suite locally using `airbyte-ci`:

    airbyte-ci connectors --name=source-pagerduty test

Customize the `acceptance-test-config.yml` file to configure tests. See Connector Acceptance Tests for more information. If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside `integration_tests/acceptance.py`.

All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups:

- dependencies required for your connector to work go in the `MAIN_REQUIREMENTS` list
- dependencies required for testing go in the `TEST_REQUIREMENTS` list

You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-pagerduty test`
2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow semantic versioning for connectors.
3. Make sure the `metadata.yaml` content is up to date.
4. Make sure the connector documentation and its changelog are up to date (`docs/integrations/sources/pagerduty.md`).
5. Create a Pull Request: use our PR naming conventions.
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.
airbyte-source-pardot
Pardot SourceThis is the repository for the Pardot source connector, written in Python. For information about how to use this connector within Airbyte, seethe documentation.To iterate on this connector, make sure to complete this prerequisites section.From this connector directory, create a virtual environment:python -m venv .venvThis will generate a virtualenv for this module in.venv/. Make sure this venv is active in your development environment of choice. To activate it from the terminal, run:source .venv/bin/activate pip install -r requirements.txt pip install '.[tests]'If you are in an IDE, follow your IDE's instructions to activate the virtualenv.Note that while we are installing dependencies fromrequirements.txt, you should only editsetup.pyfor your dependencies.requirements.txtis used for editable installs (pip install -e) to pull in Python dependencies from the monorepo and will callsetup.py. If this is mumbo jumbo to you, don't worry about it, just put your deps insetup.pybut install usingpip install -r requirements.txtand everything should work as you expect.If you are a community contributor, follow the instructions in thedocumentationto generate the necessary credentials. Then create a filesecrets/config.jsonconforming to thesource_pardot/spec.jsonfile. Note that any directory namedsecretsis gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. 
Seeintegration_tests/sample_config.jsonfor a sample config file.If you are an Airbyte core member, copy the credentials in Lastpass under the secret namesource pardot test credsand place them intosecrets/config.json.python main.py spec python main.py check --config secrets/config.json python main.py discover --config secrets/config.json python main.py read --config secrets/config.json --catalog integration_tests/configured_catalog.jsonViaairbyte-ci(recommended):airbyte-ciconnectors--name=source-pardotbuildAn image will be built with the tagairbyte/source-pardot:dev.Viadocker build:dockerbuild-tairbyte/source-pardot:dev.Then run any of the connector commands as follows:docker run --rm airbyte/source-pardot:dev spec docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-pardot:dev check --config /secrets/config.json docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-pardot:dev discover --config /secrets/config.json docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-pardot:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.jsonYou can run our full test suite locally usingairbyte-ci:airbyte-ciconnectors--name=source-pardottestCustomizeacceptance-test-config.ymlfile to configure tests. SeeConnector Acceptance Testsfor more information. If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py.All of your dependencies should go insetup.py, NOTrequirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. 
We split dependencies between two groups:

- dependencies required for your connector to work go in the `MAIN_REQUIREMENTS` list.
- dependencies required for testing go in the `TEST_REQUIREMENTS` list.

You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-pardot test`
2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow semantic versioning for connectors.
3. Make sure the `metadata.yaml` content is up to date.
4. Make sure the connector documentation and its changelog are up to date (`docs/integrations/sources/pardot.md`).
5. Create a Pull Request: use our PR naming conventions.
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.
airbyte-source-partnerstack
Partnerstack Source

This is the repository for the Partnerstack configuration based source connector. For information about how to use this connector within Airbyte, see the documentation.

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_partnerstack/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `integration_tests/sample_config.json` for a sample config file.

If you are an Airbyte core member, copy the credentials in Lastpass under the secret name `source partnerstack test creds` and place them into `secrets/config.json`.

Via `airbyte-ci` (recommended):

```
airbyte-ci connectors --name=source-partnerstack build
```

An image will be built with the tag `airbyte/source-partnerstack:dev`.

Via `docker build`:

```
docker build -t airbyte/source-partnerstack:dev .
```

Then run any of the connector commands as follows:

```
docker run --rm airbyte/source-partnerstack:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-partnerstack:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-partnerstack:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-partnerstack:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
```

You can run our full test suite locally using `airbyte-ci`:

```
airbyte-ci connectors --name=source-partnerstack test
```

Customize the `acceptance-test-config.yml` file to configure tests. See Connector Acceptance Tests for more information. If your connector needs to create or destroy resources for use during acceptance tests, create fixtures for them and place them inside `integration_tests/acceptance.py`.

All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups:

- dependencies required for your connector to work go in the `MAIN_REQUIREMENTS` list.
- dependencies required for testing go in the `TEST_REQUIREMENTS` list.

You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-partnerstack test`
2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow semantic versioning for connectors.
3. Make sure the `metadata.yaml` content is up to date.
4. Make sure the connector documentation and its changelog are up to date (`docs/integrations/sources/partnerstack.md`).
5. Create a Pull Request: use our PR naming conventions.
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.
airbyte-source-paypal-transaction
Paypal-Transaction source connector

This is the repository for the Paypal-Transaction source connector, written in Python. For information about how to use this connector within Airbyte, see the documentation.

Local development

Prerequisites

- Python (~=3.9)
- Poetry (~=1.7) - installation instructions here
- Paypal Client ID and Client Secret

If you are going to use the data generator scripts, you need to set up your Paypal Sandbox and a Buyer user in your sandbox to simulate the data. You can get that information in the Apps & Credentials page:

- Buyer Username
- Buyer Password
- Payer ID (Account ID)

Installing the connector

From this connector directory, run:

```
poetry install --with dev
```

Create credentials

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_paypal_transaction/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `sample_files/sample_config.json` for a sample config file.

You must have created your credentials under the `secrets/` folder.

For the read command, you can create separate catalogs to test the streams individually. All catalogs are under the folder `integration_tests`.
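For reference, a configured catalog is a small JSON file; a minimal sketch for a single full-refresh stream might look like the following (the stream name `list_payments` is borrowed from the catalog filenames used in this repo, and the empty `json_schema` is a simplification):

```json
{
  "streams": [
    {
      "stream": {
        "name": "list_payments",
        "json_schema": {},
        "supported_sync_modes": ["full_refresh"]
      },
      "sync_mode": "full_refresh",
      "destination_sync_mode": "overwrite"
    }
  ]
}
```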
Select the one you want to test with the read command.

Locally running the connector

```
poetry run source-paypal-transaction spec
poetry run source-paypal-transaction check --config secrets/config.json
poetry run source-paypal-transaction discover --config secrets/config.json
# Example with the list_payments catalog and the debug flag
poetry run source-paypal-transaction read --config secrets/config.json --catalog integration_tests/configured_catalog_list_payments.json --debug
```

Running unit tests

To run unit tests locally, from the connector directory run:

```
poetry run pytest unit_tests
```

Building the docker image

Install `airbyte-ci`, then run the following command to build the docker image:

```
airbyte-ci connectors --name=source-paypal-transaction build
```

Customizing our build process

When contributing to our connector you might need to customize the build process to add a system dependency or set an env var. You can customize our build process by adding a `build_customization.py` module to your connector. This module should contain `pre_connector_install` and `post_connector_install` async functions that will mutate the base image and the connector container, respectively. It will be imported at runtime by our build process, and the functions will be called if they exist.

Here is an example of a `build_customization.py` module:

```python
from __future__ import annotations

from typing import TYPE_CHECKING

if TYPE_CHECKING:
    # Feel free to check the dagger documentation for more information on the Container object and its methods.
    # https://dagger-io.readthedocs.io/en/sdk-python-v0.6.4/
    from dagger import Container
```

An image will be available on your host with the tag `airbyte/source-paypal-transaction:dev`.

Running as a docker container

Then run any of the connector commands as follows:

```
docker run --rm airbyte/source-paypal-transaction:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-paypal-transaction:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-paypal-transaction:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-paypal-transaction:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
```

Running our CI test suite

You can run our full test suite locally using `airbyte-ci`:

```
airbyte-ci connectors --name=source-paypal-transaction test
```

If you are testing locally, you can use your local credentials (`config.json` file) with `--use-local-secrets`:

```
airbyte-ci connectors --name source-paypal-transaction --use-local-secrets test
```

Customizing acceptance Tests

Customize the `acceptance-test-config.yml` file to configure acceptance tests. See Connector Acceptance Tests for more information.
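The `build_customization.py` example above shows only the imports. A fuller sketch of the two hooks it describes might look like this; the env var name, value, and workdir path are illustrative assumptions, not part of the official example:

```python
# Hypothetical build_customization.py sketch: both hooks receive and return a
# dagger Container. Bodies below are illustrative assumptions.
from __future__ import annotations

from typing import TYPE_CHECKING

if TYPE_CHECKING:
    from dagger import Container


async def pre_connector_install(base_image_container: "Container") -> "Container":
    # Runs before the connector is installed: mutate the base image here,
    # e.g. set an environment variable (name and value are made up).
    return base_image_container.with_env_variable("MY_BUILD_FLAG", "1")


async def post_connector_install(connector_container: "Container") -> "Container":
    # Runs after the connector is installed: mutate the final container here,
    # e.g. set the working directory (path is made up).
    return connector_container.with_workdir("/airbyte/integration_code")
```

Because the `dagger` import sits behind `TYPE_CHECKING`, the module imports cleanly even where `dagger` is not installed; the build process supplies the real `Container` at runtime.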
If your connector needs to create or destroy resources for use during acceptance tests, create fixtures for them and place them inside `integration_tests/acceptance.py`.

Running unit tests locally

To run unit tests locally, from the root `source_paypal_transaction` directory run:

```
python -m pytest unit_tests
```

Test changes in the sandbox

If you have a Paypal Sandbox you will be able to use some APIs to create new data, test how data lands in your destination, and choose the sync strategy that best suits your use case. Some endpoints will require special permissions on the sandbox to update and change some values.

In the `bin` folder you will find several data generator scripts:

- `disputes_generator.py`:
  - Update dispute: uses the PATCH method of the `https://api-m.paypal.com/v1/customer/disputes/{dispute_id}` endpoint. You need the ID and a payload to pass as an argument. See more information here.

    ```
    python disputes_generator.py update DISPUTE_ID '[{"op": "replace", "path": "/reason", "value": "The new reason"}]'
    ```

  - Update evidence status: uses the POST method of the `https://api-m.paypal.com/v1/customer/disputes/{dispute_id}/require-evidence` endpoint. You need the ID and an option to pass as an argument. See more information here.

    ```
    python update_dispute.py require-evidence DISPUTE_ID SELLER_EVIDENCE
    ```

- `invoices.py`:
  - Create draft invoice: uses the POST method of the `https://api-m.sandbox.paypal.com/v2/invoicing/invoices` endpoint. It will automatically generate an invoice (no need to pass any parameters). See more information here.

    ```
    python invoices.py create_draft
    ```

  - Send a draft invoice: uses the POST method of the `https://api-m.sandbox.paypal.com/v2/invoicing/invoices/{invoice_id}/send` endpoint. You need the invoice ID, a subject and a note (just to have something to update), and an email as arguments. See more information here.

    ```
    python invoices.py send_draft --invoice_id "INV2-XXXX-XXXX-XXXX-XXXX" --subject "Your Invoice Subject" --note "Your custom note" --email "[email protected]"
    ```

- `payments_generator.py`:
  - Partially update payment: uses the PATCH method of the `https://api-m.paypal.com/v1/payments/payment/{payment_id}` endpoint. You need the payment ID and a payload with new values. See more information here.

    ```
    python payments_generator.py update PAYMENT_ID '[{"op": "replace", "path": "/transactions/0/amount", "value": {"total": "50.00", "currency": "USD"}}]'
    ```

- `paypal_transaction_generator.py`: make sure you have the `buyer_username`, `buyer_password` and `payer_id` in your config file. You can get the sample configuration in `sample_config.json`.
  - Generate transactions: this uses Selenium, so you will be prompted to log in to your account to simulate the complete transaction flow. You can add a number at the end of the command to do more than one transaction. By default the script runs 3 transactions.

    NOTE: Be mindful of the number of transactions, as the script interacts with your machine and you may not be able to use it while it creates the transactions.

    ```
    python paypal_transaction_generator.py [NUMBER_OF_DESIRED_TRANSACTIONS]
    ```

- `product_catalog.py`:
  - Create a product: uses the POST method of the `https://api-m.sandbox.paypal.com/v1/catalogs/products` endpoint. You need to add the description and the category on the command line. For the proper category, see more information here.

    ```
    python product_catalog.py --action create --description "YOUR DESCRIPTION" --category PAYPAL_CATEGORY
    ```

  - Update a product: uses the PATCH method of the products endpoint (`https://developer.paypal.com/docs/api/catalog-products/v1/#products_patch`). You need the product ID, a description and the category as arguments. See more information here.

    ```
    python product_catalog.py --action update --product_id PRODUCT_ID --update_payload '[{"op": "replace", "path": "/description", "value": "My Update. Does it changes it?"}]'
    ```

Dependency Management

All of your dependencies should be managed via Poetry. To add a new dependency, run:

```
poetry add <package-name>
```

Please commit the changes to `pyproject.toml` and `poetry.lock` files.

Publishing a new version of the connector

You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-paypal-transaction test`
2. Bump the connector version (please follow semantic versioning for connectors): bump the `dockerImageTag` value in `metadata.yaml` and the `version` value in `pyproject.toml`.
3. Make sure the `metadata.yaml` content is up to date.
4. Make sure the connector documentation and its changelog are up to date (`docs/integrations/sources/paypal-transaction.md`).
5. Create a Pull Request: use our PR naming conventions.
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.
8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry.
airbyte-source-paystack
Paystack Source

This is the repository for the Paystack source connector, written in Python. For information about how to use this connector within Airbyte, see the documentation.

To iterate on this connector, make sure to complete this prerequisites section.

From this connector directory, create a virtual environment:

```
python -m venv .venv
```

This will generate a virtualenv for this module in `.venv/`. Make sure this venv is active in your development environment of choice. To activate it from the terminal, run:

```
source .venv/bin/activate
pip install -r requirements.txt
pip install '.[tests]'
```

If you are in an IDE, follow your IDE's instructions to activate the virtualenv.

Note that while we are installing dependencies from `requirements.txt`, you should only edit `setup.py` for your dependencies. `requirements.txt` is used for editable installs (`pip install -e`) to pull in Python dependencies from the monorepo and will call `setup.py`. If this is mumbo jumbo to you, don't worry about it: just put your deps in `setup.py` but install using `pip install -r requirements.txt`, and everything should work as you expect.

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_paystack/spec.json` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `integration_tests/sample_config.json` for a sample config file.

If you are an Airbyte core member, copy the credentials in Lastpass under the secret name `source paystack test creds` and place them into `secrets/config.json`.

```
python main.py spec
python main.py check --config secrets/config.json
python main.py discover --config secrets/config.json
python main.py read --config secrets/config.json --catalog integration_tests/configured_catalog.json
```

Via `airbyte-ci` (recommended):

```
airbyte-ci connectors --name=source-paystack build
```

An image will be built with the tag `airbyte/source-paystack:dev`.

Via `docker build`:

```
docker build -t airbyte/source-paystack:dev .
```

Then run any of the connector commands as follows:

```
docker run --rm airbyte/source-paystack:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-paystack:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-paystack:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-paystack:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
```

You can run our full test suite locally using `airbyte-ci`:

```
airbyte-ci connectors --name=source-paystack test
```

Customize the `acceptance-test-config.yml` file to configure tests. See Connector Acceptance Tests for more information. If your connector needs to create or destroy resources for use during acceptance tests, create fixtures for them and place them inside `integration_tests/acceptance.py`.

All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups:

- dependencies required for your connector to work go in the `MAIN_REQUIREMENTS` list.
- dependencies required for testing go in the `TEST_REQUIREMENTS` list.

You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-paystack test`
2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow semantic versioning for connectors.
3. Make sure the `metadata.yaml` content is up to date.
4. Make sure the connector documentation and its changelog are up to date (`docs/integrations/sources/paystack.md`).
5. Create a Pull Request: use our PR naming conventions.
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.
airbyte-source-pendo
Pendo Source

This is the repository for the Pendo configuration based source connector. For information about how to use this connector within Airbyte, see the documentation.

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_pendo/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `integration_tests/sample_config.json` for a sample config file.

If you are an Airbyte core member, copy the credentials in Lastpass under the secret name `source pendo test creds` and place them into `secrets/config.json`.

Via `airbyte-ci` (recommended):

```
airbyte-ci connectors --name=source-pendo build
```

An image will be built with the tag `airbyte/source-pendo:dev`.

Via `docker build`:

```
docker build -t airbyte/source-pendo:dev .
```

Then run any of the connector commands as follows:

```
docker run --rm airbyte/source-pendo:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-pendo:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-pendo:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-pendo:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
```

You can run our full test suite locally using `airbyte-ci`:

```
airbyte-ci connectors --name=source-pendo test
```

Customize the `acceptance-test-config.yml` file to configure tests. See Connector Acceptance Tests for more information. If your connector needs to create or destroy resources for use during acceptance tests, create fixtures for them and place them inside `integration_tests/acceptance.py`.

All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups:

- dependencies required for your connector to work go in the `MAIN_REQUIREMENTS` list.
- dependencies required for testing go in the `TEST_REQUIREMENTS` list.

You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-pendo test`
2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow semantic versioning for connectors.
3. Make sure the `metadata.yaml` content is up to date.
4. Make sure the connector documentation and its changelog are up to date (`docs/integrations/sources/pendo.md`).
5. Create a Pull Request: use our PR naming conventions.
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.
airbyte-source-persistiq
Persistiq Source

This is the repository for the Persistiq configuration based source connector. For information about how to use this connector within Airbyte, see the documentation.

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_persistiq/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `integration_tests/sample_config.json` for a sample config file.

If you are an Airbyte core member, copy the credentials in Lastpass under the secret name `source persistiq test creds` and place them into `secrets/config.json`.

Via `airbyte-ci` (recommended):

```
airbyte-ci connectors --name=source-persistiq build
```

An image will be built with the tag `airbyte/source-persistiq:dev`.

Via `docker build`:

```
docker build -t airbyte/source-persistiq:dev .
```

Then run any of the connector commands as follows:

```
docker run --rm airbyte/source-persistiq:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-persistiq:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-persistiq:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-persistiq:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
```

You can run our full test suite locally using `airbyte-ci`:

```
airbyte-ci connectors --name=source-persistiq test
```

Customize the `acceptance-test-config.yml` file to configure tests. See Connector Acceptance Tests for more information. If your connector needs to create or destroy resources for use during acceptance tests, create fixtures for them and place them inside `integration_tests/acceptance.py`.

All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups:

- dependencies required for your connector to work go in the `MAIN_REQUIREMENTS` list.
- dependencies required for testing go in the `TEST_REQUIREMENTS` list.

You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-persistiq test`
2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow semantic versioning for connectors.
3. Make sure the `metadata.yaml` content is up to date.
4. Make sure the connector documentation and its changelog are up to date (`docs/integrations/sources/persistiq.md`).
5. Create a Pull Request: use our PR naming conventions.
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.
airbyte-source-pexels-api
Pexels Api Source

This is the repository for the Pexels Api configuration based source connector. For information about how to use this connector within Airbyte, see the documentation.

To iterate on this connector, make sure to complete this prerequisites section.

From this connector directory, create a virtual environment:

```
python -m venv .venv
```

This will generate a virtualenv for this module in `.venv/`. Make sure this venv is active in your development environment of choice. To activate it from the terminal, run:

```
source .venv/bin/activate
pip install -r requirements.txt
```

If you are in an IDE, follow your IDE's instructions to activate the virtualenv.

Note that while we are installing dependencies from `requirements.txt`, you should only edit `setup.py` for your dependencies. `requirements.txt` is used for editable installs (`pip install -e`) to pull in Python dependencies from the monorepo and will call `setup.py`. If this is mumbo jumbo to you, don't worry about it: just put your deps in `setup.py` but install using `pip install -r requirements.txt`, and everything should work as you expect.

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_pexels_api/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `integration_tests/sample_config.json` for a sample config file.

If you are an Airbyte core member, copy the credentials in Lastpass under the secret name `source pexels-api test creds` and place them into `secrets/config.json`.

Via `airbyte-ci` (recommended):

```
airbyte-ci connectors --name=source-pexels-api build
```

An image will be built with the tag `airbyte/source-pexels-api:dev`.

Via `docker build`:

```
docker build -t airbyte/source-pexels-api:dev .
```

Then run any of the connector commands as follows:

```
docker run --rm airbyte/source-pexels-api:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-pexels-api:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-pexels-api:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-pexels-api:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
```

You can run our full test suite locally using `airbyte-ci`:

```
airbyte-ci connectors --name=source-pexels-api test
```

Customize the `acceptance-test-config.yml` file to configure tests. See Connector Acceptance Tests for more information. If your connector needs to create or destroy resources for use during acceptance tests, create fixtures for them and place them inside `integration_tests/acceptance.py`.

All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups:

- dependencies required for your connector to work go in the `MAIN_REQUIREMENTS` list.
- dependencies required for testing go in the `TEST_REQUIREMENTS` list.

You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-pexels-api test`
2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow semantic versioning for connectors.
3. Make sure the `metadata.yaml` content is up to date.
4. Make sure the connector documentation and its changelog are up to date (`docs/integrations/sources/pexels-api.md`).
5. Create a Pull Request: use our PR naming conventions.
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.
airbyte-source-pinterest
Pinterest source connector

This is the repository for the Pinterest source connector, written in Python. For information about how to use this connector within Airbyte, see the documentation.

Local development

Prerequisites

- Python (~=3.9)
- Poetry (~=1.7) - installation instructions here

Installing the connector

From this connector directory, run:

```
poetry install --with dev
```

Create credentials

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_pinterest/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `sample_files/sample_config.json` for a sample config file.

Locally running the connector

```
poetry run source-pinterest spec
poetry run source-pinterest check --config secrets/config.json
poetry run source-pinterest discover --config secrets/config.json
poetry run source-pinterest read --config secrets/config.json --catalog sample_files/configured_catalog.json
```

Running unit tests

To run unit tests locally, from the connector directory run:

```
poetry run pytest unit_tests
```

Building the docker image

Install `airbyte-ci`, then run the following command to build the docker image:

```
airbyte-ci connectors --name=source-pinterest build
```

An image will be available on your host with the tag `airbyte/source-pinterest:dev`.

Running as a docker container

Then run any of the connector commands as follows:

```
docker run --rm airbyte/source-pinterest:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-pinterest:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-pinterest:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-pinterest:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
```

Running our CI test suite

You can run our full test suite locally using `airbyte-ci`:

```
airbyte-ci connectors --name=source-pinterest test
```

Customizing acceptance Tests

Customize the `acceptance-test-config.yml` file to configure acceptance tests. See Connector Acceptance Tests for more information. If your connector needs to create or destroy resources for use during acceptance tests, create fixtures for them and place them inside `integration_tests/acceptance.py`.

Dependency Management

All of your dependencies should be managed via Poetry. To add a new dependency, run:

```
poetry add <package-name>
```

Please commit the changes to `pyproject.toml` and `poetry.lock` files.

Publishing a new version of the connector

You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-pinterest test`
2. Bump the connector version (please follow semantic versioning for connectors): bump the `dockerImageTag` value in `metadata.yaml` and the `version` value in `pyproject.toml`.
3. Make sure the `metadata.yaml` content is up to date.
4. Make sure the connector documentation and its changelog are up to date (`docs/integrations/sources/pinterest.md`).
5. Create a Pull Request: use our PR naming conventions.
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.
8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry.
airbyte-source-pipedrive
Pipedrive Source

This is the repository for the Pipedrive configuration based source connector. For information about how to use this connector within Airbyte, see the documentation.

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_pipedrive/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `integration_tests/sample_config.json` for a sample config file.

If you are an Airbyte core member, copy the credentials in Lastpass under the secret name `source pipedrive test creds` and place them into `secrets/config.json`.

Via `airbyte-ci` (recommended):

```
airbyte-ci connectors --name=source-pipedrive build
```

An image will be built with the tag `airbyte/source-pipedrive:dev`.

Via `docker build`:

```
docker build -t airbyte/source-pipedrive:dev .
```

Then run any of the connector commands as follows:

```
docker run --rm airbyte/source-pipedrive:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-pipedrive:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-pipedrive:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-pipedrive:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
```

You can run our full test suite locally using `airbyte-ci`:

```
airbyte-ci connectors --name=source-pipedrive test
```

Customize the `acceptance-test-config.yml` file to configure tests. See Connector Acceptance Tests for more information. If your connector needs to create or destroy resources for use during acceptance tests, create fixtures for them and place them inside `integration_tests/acceptance.py`.

All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups:

- dependencies required for your connector to work go in the `MAIN_REQUIREMENTS` list.
- dependencies required for testing go in the `TEST_REQUIREMENTS` list.

You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-pipedrive test`
2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow semantic versioning for connectors.
3. Make sure the `metadata.yaml` content is up to date.
4. Make sure the connector documentation and its changelog are up to date (`docs/integrations/sources/pipedrive.md`).
5. Create a Pull Request: use our PR naming conventions.
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.
airbyte-source-pivotal-tracker
# Pivotal Tracker Source

This is the repository for the Pivotal Tracker source connector, written in Python. For information about how to use this connector within Airbyte, see the documentation.

## Local development

To iterate on this connector, make sure to complete this prerequisites section.

From this connector directory, create a virtual environment:

```bash
python -m venv .venv
```

This will generate a virtualenv for this module in `.venv/`. Make sure this venv is active in your development environment of choice. To activate it from the terminal, run:

```bash
source .venv/bin/activate
pip install -r requirements.txt
pip install '.[tests]'
```

If you are in an IDE, follow your IDE's instructions to activate the virtualenv.

Note that while we are installing dependencies from `requirements.txt`, you should only edit `setup.py` for your dependencies. `requirements.txt` is used for editable installs (`pip install -e`) to pull in Python dependencies from the monorepo and will call `setup.py`. If this is mumbo jumbo to you, don't worry about it: just put your deps in `setup.py` but install using `pip install -r requirements.txt`, and everything should work as you expect.

## Create credentials

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_pivotal_tracker/spec.json` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `integration_tests/sample_config.json` for a sample config file.

If you are an Airbyte core member, copy the credentials in Lastpass under the secret name `source pivotal-tracker test creds` and place them into `secrets/config.json`.

## Locally running the connector

```bash
python main.py spec
python main.py check --config secrets/config.json
python main.py discover --config secrets/config.json
python main.py read --config secrets/config.json --catalog integration_tests/configured_catalog.json
```

## Building the docker image

Via `airbyte-ci` (recommended):

```bash
airbyte-ci connectors --name=source-pivotal-tracker build
```

Via `docker build`:

```bash
docker build -t airbyte/source-pivotal-tracker:dev .
```

An image will be built with the tag `airbyte/source-pivotal-tracker:dev`.

## Running as a docker container

Then run any of the connector commands as follows:

```bash
docker run --rm airbyte/source-pivotal-tracker:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-pivotal-tracker:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-pivotal-tracker:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-pivotal-tracker:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
```

## Testing

You can run our full test suite locally using `airbyte-ci`:

```bash
airbyte-ci connectors --name=source-pivotal-tracker test
```

Customize the `acceptance-test-config.yml` file to configure tests. See Connector Acceptance Tests for more information. If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside `integration_tests/acceptance.py`.

## Dependency management

All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups:

- dependencies required for your connector to work go in the `MAIN_REQUIREMENTS` list.
- dependencies required for testing go in the `TEST_REQUIREMENTS` list.

## Publishing a new version of the connector

You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-pivotal-tracker test`
2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow semantic versioning for connectors.
3. Make sure the `metadata.yaml` content is up to date.
4. Make sure the connector documentation and its changelog are up to date (`docs/integrations/sources/pivotal-tracker.md`).
5. Create a Pull Request: use our PR naming conventions.
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.
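Before running `check` against the live API, it can be handy to sanity-check `secrets/config.json` locally. A minimal sketch, assuming a hypothetical required field (`api_token`); the authoritative field list is whatever `source_pivotal_tracker/spec.json` declares:

```python
import json

# Hypothetical required fields; the real list lives in the connector's spec file.
REQUIRED_FIELDS = ["api_token"]

def validate_config(config: dict) -> list:
    """Return a list of problems; an empty list means the config looks usable."""
    problems = []
    for field in REQUIRED_FIELDS:
        if field not in config:
            problems.append(f"missing required field: {field}")
        elif not isinstance(config[field], str) or not config[field].strip():
            problems.append(f"field {field!r} must be a non-empty string")
    return problems

# In practice you would validate json.load(open("secrets/config.json")).
print(validate_config({"api_token": "dummy-token"}))  # []
print(validate_config({}))                            # ['missing required field: api_token']
```

This only catches shape errors; `python main.py check --config secrets/config.json` remains the real test, since it exercises the credentials against the API.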
airbyte-source-plaid
# Plaid Source

This is the repository for the Plaid source connector, written in Python. For information about how to use this connector within Airbyte, see the documentation.

## Local development

To iterate on this connector, make sure to complete this prerequisites section.

From this connector directory, create a virtual environment:

```bash
python -m venv .venv
```

This will generate a virtualenv for this module in `.venv/`. Make sure this venv is active in your development environment of choice. To activate it from the terminal, run:

```bash
source .venv/bin/activate
pip install -r requirements.txt
```

If you are in an IDE, follow your IDE's instructions to activate the virtualenv.

Note that while we are installing dependencies from `requirements.txt`, you should only edit `setup.py` for your dependencies. `requirements.txt` is used for editable installs (`pip install -e`) to pull in Python dependencies from the monorepo and will call `setup.py`. If this is mumbo jumbo to you, don't worry about it: just put your deps in `setup.py` but install using `pip install -r requirements.txt`, and everything should work as you expect.

## Create credentials

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_plaid/spec.json` file. Note that the `secrets` directory is gitignored by default, so there is no danger of accidentally checking in sensitive information. See `integration_tests/sample_config.json` for a sample config file.

If you are an Airbyte core member, copy the credentials in Lastpass under the secret name `source plaid-new test creds` and place them into `secrets/config.json`.

## Locally running the connector

```bash
python main.py spec
python main.py check --config secrets/config.json
python main.py discover --config secrets/config.json
python main.py read --config secrets/config.json --catalog integration_tests/configured_catalog.json
```

## Building the docker image

Via `airbyte-ci` (recommended):

```bash
airbyte-ci connectors --name=source-plaid build
```

Via `docker build`:

```bash
docker build -t airbyte/source-plaid:dev .
```

An image will be built with the tag `airbyte/source-plaid:dev`.

## Running as a docker container

Then run any of the connector commands as follows:

```bash
docker run --rm airbyte/source-plaid:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-plaid:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-plaid:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-plaid:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
```

## Testing

You can run our full test suite locally using `airbyte-ci`:

```bash
airbyte-ci connectors --name=source-plaid test
```

Customize the `acceptance-test-config.yml` file to configure tests. See Connector Acceptance Tests for more information. If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside `integration_tests/acceptance.py`.

## Dependency management

All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups:

- dependencies required for your connector to work go in the `MAIN_REQUIREMENTS` list.
- dependencies required for testing go in the `TEST_REQUIREMENTS` list.

## Publishing a new version of the connector

You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-plaid test`
2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow semantic versioning for connectors.
3. Make sure the `metadata.yaml` content is up to date.
4. Make sure the connector documentation and its changelog are up to date (`docs/integrations/sources/plaid.md`).
5. Create a Pull Request: use our PR naming conventions.
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.
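The `read` command above writes Airbyte protocol messages to stdout as JSON, one message per line. A minimal sketch of filtering out the RECORD messages from such output — the sample lines below are made up, but the `type`/`record` envelope follows the Airbyte protocol:

```python
import json

# Two made-up lines in the shape a connector's `read` command prints.
stdout_lines = [
    '{"type": "LOG", "log": {"level": "INFO", "message": "Starting sync"}}',
    '{"type": "RECORD", "record": {"stream": "transactions", '
    '"data": {"amount": 12.5}, "emitted_at": 1700000000000}}',
]

def extract_records(lines):
    """Yield (stream, data) for each RECORD message, ignoring logs and state."""
    for line in lines:
        message = json.loads(line)
        if message.get("type") == "RECORD":
            yield message["record"]["stream"], message["record"]["data"]

for stream, data in extract_records(stdout_lines):
    print(stream, data)  # transactions {'amount': 12.5}
```

This is how you would eyeball a connector's output during local development; in a real sync the Airbyte platform consumes these messages for you.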
airbyte-source-plausible
# Plausible Source

This is the repository for the Plausible configuration based source connector. For information about how to use this connector within Airbyte, see the documentation.

## Create credentials

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_plausible/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `integration_tests/sample_config.json` for a sample config file.

If you are an Airbyte core member, copy the credentials in Lastpass under the secret name `source plausible test creds` and place them into `secrets/config.json`.

## Building the docker image

Via `airbyte-ci` (recommended):

```bash
airbyte-ci connectors --name=source-plausible build
```

Via `docker build`:

```bash
docker build -t airbyte/source-plausible:dev .
```

An image will be built with the tag `airbyte/source-plausible:dev`.

## Running as a docker container

Then run any of the connector commands as follows:

```bash
docker run --rm airbyte/source-plausible:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-plausible:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-plausible:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-plausible:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
```

## Testing

You can run our full test suite locally using `airbyte-ci`:

```bash
airbyte-ci connectors --name=source-plausible test
```

Customize the `acceptance-test-config.yml` file to configure tests. See Connector Acceptance Tests for more information. If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside `integration_tests/acceptance.py`.

## Dependency management

All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups:

- dependencies required for your connector to work go in the `MAIN_REQUIREMENTS` list.
- dependencies required for testing go in the `TEST_REQUIREMENTS` list.

## Publishing a new version of the connector

You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-plausible test`
2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow semantic versioning for connectors.
3. Make sure the `metadata.yaml` content is up to date.
4. Make sure the connector documentation and its changelog are up to date (`docs/integrations/sources/plausible.md`).
5. Create a Pull Request: use our PR naming conventions.
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.
airbyte-source-pocket
# Pocket Source

This is the repository for the Pocket configuration based source connector. For information about how to use this connector within Airbyte, see the documentation.

## Create credentials

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_pocket/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `integration_tests/sample_config.json` for a sample config file.

If you are an Airbyte core member, copy the credentials in Lastpass under the secret name `source pocket test creds` and place them into `secrets/config.json`.

## Building the docker image

Via `airbyte-ci` (recommended):

```bash
airbyte-ci connectors --name=source-pocket build
```

Via `docker build`:

```bash
docker build -t airbyte/source-pocket:dev .
```

An image will be built with the tag `airbyte/source-pocket:dev`.

## Running as a docker container

Then run any of the connector commands as follows:

```bash
docker run --rm airbyte/source-pocket:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-pocket:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-pocket:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-pocket:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
```

## Testing

You can run our full test suite locally using `airbyte-ci`:

```bash
airbyte-ci connectors --name=source-pocket test
```

Customize the `acceptance-test-config.yml` file to configure tests. See Connector Acceptance Tests for more information. If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside `integration_tests/acceptance.py`.

## Dependency management

All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups:

- dependencies required for your connector to work go in the `MAIN_REQUIREMENTS` list.
- dependencies required for testing go in the `TEST_REQUIREMENTS` list.

## Publishing a new version of the connector

You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-pocket test`
2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow semantic versioning for connectors.
3. Make sure the `metadata.yaml` content is up to date.
4. Make sure the connector documentation and its changelog are up to date (`docs/integrations/sources/pocket.md`).
5. Create a Pull Request: use our PR naming conventions.
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.
airbyte-source-pokeapi
# Pokeapi Source

This is the repository for the Pokeapi configuration based source connector. For information about how to use this connector within Airbyte, see the documentation.

## Create credentials

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_pokeapi/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `integration_tests/sample_config.json` for a sample config file.

If you are an Airbyte core member, copy the credentials in Lastpass under the secret name `source pokeapi test creds` and place them into `secrets/config.json`.

## Building the docker image

Via `airbyte-ci` (recommended):

```bash
airbyte-ci connectors --name=source-pokeapi build
```

Via `docker build`:

```bash
docker build -t airbyte/source-pokeapi:dev .
```

An image will be built with the tag `airbyte/source-pokeapi:dev`.

## Running as a docker container

Then run any of the connector commands as follows:

```bash
docker run --rm airbyte/source-pokeapi:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-pokeapi:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-pokeapi:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-pokeapi:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
```

## Testing

You can run our full test suite locally using `airbyte-ci`:

```bash
airbyte-ci connectors --name=source-pokeapi test
```

Customize the `acceptance-test-config.yml` file to configure tests. See Connector Acceptance Tests for more information. If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside `integration_tests/acceptance.py`.

## Dependency management

All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups:

- dependencies required for your connector to work go in the `MAIN_REQUIREMENTS` list.
- dependencies required for testing go in the `TEST_REQUIREMENTS` list.

## Publishing a new version of the connector

You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-pokeapi test`
2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow semantic versioning for connectors.
3. Make sure the `metadata.yaml` content is up to date.
4. Make sure the connector documentation and its changelog are up to date (`docs/integrations/sources/pokeapi.md`).
5. Create a Pull Request: use our PR naming conventions.
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.
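The `--catalog` argument above points at a configured catalog file. A minimal sketch of building one in Python — the `pokemon` stream name and empty schema here are illustrative assumptions; the real stream names and schemas come from the connector's `discover` output:

```python
import json

# Illustrative configured catalog; real streams/schemas come from `discover`.
catalog = {
    "streams": [
        {
            "stream": {
                "name": "pokemon",
                "json_schema": {},
                "supported_sync_modes": ["full_refresh"],
            },
            "sync_mode": "full_refresh",
            "destination_sync_mode": "overwrite",
        }
    ]
}

# Writing it where the docker run commands above expect to find it:
# with open("integration_tests/configured_catalog.json", "w") as f:
#     json.dump(catalog, f, indent=2)
print(len(catalog["streams"]))  # 1
```

Each entry pairs a stream (as reported by `discover`) with the sync modes you want the connector to use for it.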
airbyte-source-polygon-stock-api
# Polygon Stock Api Source

This is the repository for the Polygon Stock Api configuration based source connector. For information about how to use this connector within Airbyte, see the documentation.

## Create credentials

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_polygon_stock_api/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `integration_tests/sample_config.json` for a sample config file.

If you are an Airbyte core member, copy the credentials in Lastpass under the secret name `source polygon-stock-api test creds` and place them into `secrets/config.json`.

## Building the docker image

Via `airbyte-ci` (recommended):

```bash
airbyte-ci connectors --name=source-polygon-stock-api build
```

Via `docker build`:

```bash
docker build -t airbyte/source-polygon-stock-api:dev .
```

An image will be built with the tag `airbyte/source-polygon-stock-api:dev`.

## Running as a docker container

Then run any of the connector commands as follows:

```bash
docker run --rm airbyte/source-polygon-stock-api:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-polygon-stock-api:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-polygon-stock-api:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-polygon-stock-api:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
```

## Testing

You can run our full test suite locally using `airbyte-ci`:

```bash
airbyte-ci connectors --name=source-polygon-stock-api test
```

Customize the `acceptance-test-config.yml` file to configure tests. See Connector Acceptance Tests for more information. If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside `integration_tests/acceptance.py`.

## Dependency management

All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups:

- dependencies required for your connector to work go in the `MAIN_REQUIREMENTS` list.
- dependencies required for testing go in the `TEST_REQUIREMENTS` list.

## Publishing a new version of the connector

You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-polygon-stock-api test`
2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow semantic versioning for connectors.
3. Make sure the `metadata.yaml` content is up to date.
4. Make sure the connector documentation and its changelog are up to date (`docs/integrations/sources/polygon-stock-api.md`).
5. Create a Pull Request: use our PR naming conventions.
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.
airbyte-source-posthog
# PostHog Source

This is the repository for the PostHog source connector, written in Python. For information about how to use this connector within Airbyte, see the documentation.

## Local development

To iterate on this connector, make sure to complete this prerequisites section.

From this connector directory, create a virtual environment:

```bash
python -m venv .venv
```

This will generate a virtualenv for this module in `.venv/`. Make sure this venv is active in your development environment of choice. To activate it from the terminal, run:

```bash
source .venv/bin/activate
pip install -r requirements.txt
```

If you are in an IDE, follow your IDE's instructions to activate the virtualenv.

Note that while we are installing dependencies from `requirements.txt`, you should only edit `setup.py` for your dependencies. `requirements.txt` is used for editable installs (`pip install -e`) to pull in Python dependencies from the monorepo and will call `setup.py`. If this is mumbo jumbo to you, don't worry about it: just put your deps in `setup.py` but install using `pip install -r requirements.txt`, and everything should work as you expect.

## Create credentials

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_posthog/spec.json` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `sample_files/sample_config.json` for a sample config file.

If you are an Airbyte core member, copy the credentials in Lastpass under the secret name `source posthog test creds` and place them into `secrets/config.json`.

## Locally running the connector

```bash
python main.py spec
python main.py check --config secrets/config.json
python main.py discover --config secrets/config.json
python main.py read --config secrets/config.json --catalog sample_files/configured_catalog.json
```

## Building the docker image

Via `airbyte-ci` (recommended):

```bash
airbyte-ci connectors --name=source-posthog build
```

Via `docker build`:

```bash
docker build -t airbyte/source-posthog:dev .
```

An image will be built with the tag `airbyte/source-posthog:dev`.

## Running as a docker container

Then run any of the connector commands as follows:

```bash
docker run --rm airbyte/source-posthog:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-posthog:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-posthog:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/sample_files:/sample_files airbyte/source-posthog:dev read --config /secrets/config.json --catalog /sample_files/configured_catalog.json
```

## Testing

You can run our full test suite locally using `airbyte-ci`:

```bash
airbyte-ci connectors --name=source-posthog test
```

Customize the `acceptance-test-config.yml` file to configure tests. See Connector Acceptance Tests for more information. If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside `integration_tests/acceptance.py`.

## Dependency management

All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups:

- dependencies required for your connector to work go in the `MAIN_REQUIREMENTS` list.
- dependencies required for testing go in the `TEST_REQUIREMENTS` list.

## Publishing a new version of the connector

You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-posthog test`
2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow semantic versioning for connectors.
3. Make sure the `metadata.yaml` content is up to date.
4. Make sure the connector documentation and its changelog are up to date (`docs/integrations/sources/posthog.md`).
5. Create a Pull Request: use our PR naming conventions.
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.
airbyte-source-postmarkapp
# Postmarkapp Source

This is the repository for the Postmarkapp configuration based source connector. For information about how to use this connector within Airbyte, see the documentation.

## Create credentials

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_postmarkapp/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `integration_tests/sample_config.json` for a sample config file.

If you are an Airbyte core member, copy the credentials in Lastpass under the secret name `source postmarkapp test creds` and place them into `secrets/config.json`.

## Building the docker image

Via `airbyte-ci` (recommended):

```bash
airbyte-ci connectors --name=source-postmarkapp build
```

Via `docker build`:

```bash
docker build -t airbyte/source-postmarkapp:dev .
```

An image will be built with the tag `airbyte/source-postmarkapp:dev`.

## Running as a docker container

Then run any of the connector commands as follows:

```bash
docker run --rm airbyte/source-postmarkapp:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-postmarkapp:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-postmarkapp:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-postmarkapp:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
```

## Testing

You can run our full test suite locally using `airbyte-ci`:

```bash
airbyte-ci connectors --name=source-postmarkapp test
```

Customize the `acceptance-test-config.yml` file to configure tests. See Connector Acceptance Tests for more information. If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside `integration_tests/acceptance.py`.

## Dependency management

All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups:

- dependencies required for your connector to work go in the `MAIN_REQUIREMENTS` list.
- dependencies required for testing go in the `TEST_REQUIREMENTS` list.

## Publishing a new version of the connector

You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-postmarkapp test`
2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow semantic versioning for connectors.
3. Make sure the `metadata.yaml` content is up to date.
4. Make sure the connector documentation and its changelog are up to date (`docs/integrations/sources/postmarkapp.md`).
5. Create a Pull Request: use our PR naming conventions.
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.
airbyte-source-prestashop
# PrestaShop Source

This is the repository for the PrestaShop source connector, written in Python. For information about how to use this connector within Airbyte, see the documentation.

## Local development

To iterate on this connector, make sure to complete this prerequisites section.

From this connector directory, create a virtual environment:

```bash
python -m venv .venv
```

This will generate a virtualenv for this module in `.venv/`. Make sure this venv is active in your development environment of choice. To activate it from the terminal, run:

```bash
source .venv/bin/activate
pip install -r requirements.txt
```

If you are in an IDE, follow your IDE's instructions to activate the virtualenv.

Note that while we are installing dependencies from `requirements.txt`, you should only edit `setup.py` for your dependencies. `requirements.txt` is used for editable installs (`pip install -e`) to pull in Python dependencies from the monorepo and will call `setup.py`. If this is mumbo jumbo to you, don't worry about it: just put your deps in `setup.py` but install using `pip install -r requirements.txt`, and everything should work as you expect.

## Create credentials

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_prestashop/spec.json` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `integration_tests/sample_config.json` for a sample config file.

If you are an Airbyte core member, copy the credentials in Lastpass under the secret name `source prestashop test creds` and place them into `secrets/config.json`.

## Locally running the connector

```bash
python main.py spec
python main.py check --config secrets/config.json
python main.py discover --config secrets/config.json
python main.py read --config secrets/config.json --catalog integration_tests/configured_catalog.json
```

## Building the docker image

Via `airbyte-ci` (recommended):

```bash
airbyte-ci connectors --name=source-prestashop build
```

Via `docker build`:

```bash
docker build -t airbyte/source-prestashop:dev .
```

An image will be built with the tag `airbyte/source-prestashop:dev`.

## Running as a docker container

Then run any of the connector commands as follows:

```bash
docker run --rm airbyte/source-prestashop:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-prestashop:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-prestashop:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-prestashop:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
```

## Testing

You can run our full test suite locally using `airbyte-ci`:

```bash
airbyte-ci connectors --name=source-prestashop test
```

Customize the `acceptance-test-config.yml` file to configure tests. See Connector Acceptance Tests for more information. If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside `integration_tests/acceptance.py`.

## Dependency management

All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups:

- dependencies required for your connector to work go in the `MAIN_REQUIREMENTS` list.
- dependencies required for testing go in the `TEST_REQUIREMENTS` list.

## Publishing a new version of the connector

You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-prestashop test`
2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow semantic versioning for connectors.
3. Make sure the `metadata.yaml` content is up to date.
4. Make sure the connector documentation and its changelog are up to date (`docs/integrations/sources/prestashop.md`).
5. Create a Pull Request: use our PR naming conventions.
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.
airbyte-source-primetric
# Primetric Source

This is the repository for the Primetric source connector, written in Python. For information about how to use this connector within Airbyte, see the documentation.

## Local development

### Prerequisites

To iterate on this connector, make sure to complete this prerequisites section.

From this connector directory, create a virtual environment:

```bash
python -m venv .venv
```

This will generate a virtualenv for this module in `.venv/`. Make sure this venv is active in your development environment of choice. To activate it from the terminal, run:

```bash
source .venv/bin/activate
pip install -r requirements.txt
pip install '.[tests]'
```

If you are in an IDE, follow your IDE's instructions to activate the virtualenv.

Note that while we are installing dependencies from `requirements.txt`, you should only edit `setup.py` for your dependencies. `requirements.txt` is used for editable installs (`pip install -e`) to pull in Python dependencies from the monorepo and will call `setup.py`. If this is mumbo jumbo to you, don't worry about it, just put your deps in `setup.py` but install using `pip install -r requirements.txt` and everything should work as you expect.

### Create credentials

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_primetric/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `integration_tests/sample_config.json` for a sample config file.

If you are an Airbyte core member, copy the credentials in Lastpass under the secret name `source primetric test creds` and place them into `secrets/config.json`.

### Locally running the connector

```bash
python main.py spec
python main.py check --config secrets/config.json
python main.py discover --config secrets/config.json
python main.py read --config secrets/config.json --catalog integration_tests/configured_catalog.json
```

### Locally running the connector docker image

#### Build

Via `airbyte-ci` (recommended):

```bash
airbyte-ci connectors --name=source-primetric build
```

An image will be built with the tag `airbyte/source-primetric:dev`.

Via `docker build`:

```bash
docker build -t airbyte/source-primetric:dev .
```

#### Run

Then run any of the connector commands as follows:

```bash
docker run --rm airbyte/source-primetric:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-primetric:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-primetric:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-primetric:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
```

## Testing

You can run our full test suite locally using `airbyte-ci`:

```bash
airbyte-ci connectors --name=source-primetric test
```

### Customizing acceptance Tests

Customize the `acceptance-test-config.yml` file to configure tests. See Connector Acceptance Tests for more information. If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside `integration_tests/acceptance.py`.

## Dependency Management

All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups:

- dependencies required for your connector to work go in the `MAIN_REQUIREMENTS` list;
- dependencies required for testing go in the `TEST_REQUIREMENTS` list.

## Publishing a new version of the connector

You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-primetric test`
2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow semantic versioning for connectors.
3. Make sure the `metadata.yaml` content is up to date.
4. Make sure the connector documentation and its changelog are up to date (`docs/integrations/sources/primetric.md`).
5. Create a Pull Request: use our PR naming conventions.
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.
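The `read` commands above print Airbyte protocol messages as JSON lines on stdout. A minimal sketch of pulling just the record payloads out of such output, using only the standard library; the two sample messages below are fabricated for illustration, not real connector output:

```shell
# Sketch: connector commands emit one JSON AirbyteMessage per line on stdout.
# Simulate a captured `read` run with two fabricated messages.
printf '%s\n' \
  '{"type":"LOG","log":{"level":"INFO","message":"Starting sync"}}' \
  '{"type":"RECORD","record":{"stream":"projects","data":{"id":1,"name":"Demo"}}}' \
  > output.jsonl
# Keep only the RECORD messages and print their data payloads.
python - <<'EOF'
import json

with open("output.jsonl") as f:
    for line in f:
        msg = json.loads(line)
        if msg.get("type") == "RECORD":
            print(json.dumps(msg["record"]["data"]))
EOF
# prints: {"id": 1, "name": "Demo"}
```

In a real run you would pipe `python main.py read … > output.jsonl` (or the equivalent `docker run` command) instead of fabricating the file.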
airbyte-source-public-apis
# Public Apis Source

This is the repository for the Public Apis configuration based source connector. For information about how to use this connector within Airbyte, see the documentation.

## Local development

### Create credentials

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_public_apis/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `integration_tests/sample_config.json` for a sample config file.

If you are an Airbyte core member, copy the credentials in Lastpass under the secret name `source public-apis test creds` and place them into `secrets/config.json`.

### Locally running the connector docker image

#### Build

Via `airbyte-ci` (recommended):

```bash
airbyte-ci connectors --name=source-public-apis build
```

An image will be built with the tag `airbyte/source-public-apis:dev`.

Via `docker build`:

```bash
docker build -t airbyte/source-public-apis:dev .
```

#### Run

Then run any of the connector commands as follows:

```bash
docker run --rm airbyte/source-public-apis:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-public-apis:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-public-apis:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-public-apis:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
```

## Testing

You can run our full test suite locally using `airbyte-ci`:

```bash
airbyte-ci connectors --name=source-public-apis test
```

### Customizing acceptance Tests

Customize the `acceptance-test-config.yml` file to configure tests. See Connector Acceptance Tests for more information. If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside `integration_tests/acceptance.py`.

## Dependency Management

All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups:

- dependencies required for your connector to work go in the `MAIN_REQUIREMENTS` list;
- dependencies required for testing go in the `TEST_REQUIREMENTS` list.

## Publishing a new version of the connector

You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-public-apis test`
2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow semantic versioning for connectors.
3. Make sure the `metadata.yaml` content is up to date.
4. Make sure the connector documentation and its changelog are up to date (`docs/integrations/sources/public-apis.md`).
5. Create a Pull Request: use our PR naming conventions.
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.
airbyte-source-punk-api
# Punk Api Source

This is the repository for the Punk Api configuration based source connector. For information about how to use this connector within Airbyte, see the documentation.

## Local development

### Prerequisites

To iterate on this connector, make sure to complete this prerequisites section.

From this connector directory, create a virtual environment:

```bash
python -m venv .venv
```

This will generate a virtualenv for this module in `.venv/`. Make sure this venv is active in your development environment of choice. To activate it from the terminal, run:

```bash
source .venv/bin/activate
pip install -r requirements.txt
```

If you are in an IDE, follow your IDE's instructions to activate the virtualenv.

Note that while we are installing dependencies from `requirements.txt`, you should only edit `setup.py` for your dependencies. `requirements.txt` is used for editable installs (`pip install -e`) to pull in Python dependencies from the monorepo and will call `setup.py`. If this is mumbo jumbo to you, don't worry about it, just put your deps in `setup.py` but install using `pip install -r requirements.txt` and everything should work as you expect.

### Create credentials

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_punk_api/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `integration_tests/sample_config.json` for a sample config file.

If you are an Airbyte core member, copy the credentials in Lastpass under the secret name `source punk-api test creds` and place them into `secrets/config.json`.

### Locally running the connector docker image

#### Build

Via `airbyte-ci` (recommended):

```bash
airbyte-ci connectors --name=source-punk-api build
```

An image will be built with the tag `airbyte/source-punk-api:dev`.

Via `docker build`:

```bash
docker build -t airbyte/source-punk-api:dev .
```

#### Run

Then run any of the connector commands as follows:

```bash
docker run --rm airbyte/source-punk-api:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-punk-api:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-punk-api:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-punk-api:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
```

## Testing

You can run our full test suite locally using `airbyte-ci`:

```bash
airbyte-ci connectors --name=source-punk-api test
```

### Customizing acceptance Tests

Customize the `acceptance-test-config.yml` file to configure tests. See Connector Acceptance Tests for more information. If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside `integration_tests/acceptance.py`.

## Dependency Management

All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups:

- dependencies required for your connector to work go in the `MAIN_REQUIREMENTS` list;
- dependencies required for testing go in the `TEST_REQUIREMENTS` list.

## Publishing a new version of the connector

You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-punk-api test`
2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow semantic versioning for connectors.
3. Make sure the `metadata.yaml` content is up to date.
4. Make sure the connector documentation and its changelog are up to date (`docs/integrations/sources/punk-api.md`).
5. Create a Pull Request: use our PR naming conventions.
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.
airbyte-source-pypi
# Pypi Source

This is the repository for the Pypi configuration based source connector. For information about how to use this connector within Airbyte, see the documentation.

## Local development

### Create credentials

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_pypi/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `integration_tests/sample_config.json` for a sample config file.

If you are an Airbyte core member, copy the credentials in Lastpass under the secret name `source pypi test creds` and place them into `secrets/config.json`.

### Locally running the connector docker image

#### Build

Via `airbyte-ci` (recommended):

```bash
airbyte-ci connectors --name=source-pypi build
```

An image will be built with the tag `airbyte/source-pypi:dev`.

Via `docker build`:

```bash
docker build -t airbyte/source-pypi:dev .
```

#### Run

Then run any of the connector commands as follows:

```bash
docker run --rm airbyte/source-pypi:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-pypi:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-pypi:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-pypi:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
```

## Testing

You can run our full test suite locally using `airbyte-ci`:

```bash
airbyte-ci connectors --name=source-pypi test
```

### Customizing acceptance Tests

Customize the `acceptance-test-config.yml` file to configure tests. See Connector Acceptance Tests for more information. If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside `integration_tests/acceptance.py`.

## Dependency Management

All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups:

- dependencies required for your connector to work go in the `MAIN_REQUIREMENTS` list;
- dependencies required for testing go in the `TEST_REQUIREMENTS` list.

## Publishing a new version of the connector

You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-pypi test`
2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow semantic versioning for connectors.
3. Make sure the `metadata.yaml` content is up to date.
4. Make sure the connector documentation and its changelog are up to date (`docs/integrations/sources/pypi.md`).
5. Create a Pull Request: use our PR naming conventions.
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.
airbyte-source-qualaroo
# Qualaroo Source

This is the repository for the Qualaroo configuration based source connector. For information about how to use this connector within Airbyte, see the documentation.

## Local development

### Create credentials

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_qualaroo/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `integration_tests/sample_config.json` for a sample config file.

If you are an Airbyte core member, copy the credentials in Lastpass under the secret name `source qualaroo test creds` and place them into `secrets/config.json`.

### Locally running the connector docker image

#### Build

Via `airbyte-ci` (recommended):

```bash
airbyte-ci connectors --name=source-qualaroo build
```

An image will be built with the tag `airbyte/source-qualaroo:dev`.

Via `docker build`:

```bash
docker build -t airbyte/source-qualaroo:dev .
```

#### Run

Then run any of the connector commands as follows:

```bash
docker run --rm airbyte/source-qualaroo:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-qualaroo:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-qualaroo:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-qualaroo:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
```

## Testing

You can run our full test suite locally using `airbyte-ci`:

```bash
airbyte-ci connectors --name=source-qualaroo test
```

### Customizing acceptance Tests

Customize the `acceptance-test-config.yml` file to configure tests. See Connector Acceptance Tests for more information. If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside `integration_tests/acceptance.py`.

## Dependency Management

All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups:

- dependencies required for your connector to work go in the `MAIN_REQUIREMENTS` list;
- dependencies required for testing go in the `TEST_REQUIREMENTS` list.

## Publishing a new version of the connector

You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-qualaroo test`
2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow semantic versioning for connectors.
3. Make sure the `metadata.yaml` content is up to date.
4. Make sure the connector documentation and its changelog are up to date (`docs/integrations/sources/qualaroo.md`).
5. Create a Pull Request: use our PR naming conventions.
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.
airbyte-source-quickbooks
# Quickbooks Source

This is the repository for the Quickbooks configuration based source connector. For information about how to use this connector within Airbyte, see the documentation.

## Local development

### Create credentials

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_quickbooks/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `integration_tests/sample_config.json` for a sample config file.

If you are an Airbyte core member, copy the credentials in Lastpass under the secret name `source quickbooks test creds` and place them into `secrets/config.json`.

### Locally running the connector docker image

#### Build

Via `airbyte-ci` (recommended):

```bash
airbyte-ci connectors --name=source-quickbooks build
```

An image will be built with the tag `airbyte/source-quickbooks:dev`.

Via `docker build`:

```bash
docker build -t airbyte/source-quickbooks:dev .
```

#### Run

Then run any of the connector commands as follows:

```bash
docker run --rm airbyte/source-quickbooks:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-quickbooks:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-quickbooks:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-quickbooks:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
```

## Testing

You can run our full test suite locally using `airbyte-ci`:

```bash
airbyte-ci connectors --name=source-quickbooks test
```

### Customizing acceptance Tests

Customize the `acceptance-test-config.yml` file to configure tests. See Connector Acceptance Tests for more information. If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside `integration_tests/acceptance.py`.

## Dependency Management

All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups:

- dependencies required for your connector to work go in the `MAIN_REQUIREMENTS` list;
- dependencies required for testing go in the `TEST_REQUIREMENTS` list.

## Publishing a new version of the connector

You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-quickbooks test`
2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow semantic versioning for connectors.
3. Make sure the `metadata.yaml` content is up to date.
4. Make sure the connector documentation and its changelog are up to date (`docs/integrations/sources/quickbooks.md`).
5. Create a Pull Request: use our PR naming conventions.
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.
airbyte-source-railz
# Railz Source

This is the repository for the Railz configuration based source connector. For information about how to use this connector within Airbyte, see the documentation.

## Local development

### Create credentials

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_railz/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `integration_tests/sample_config.json` for a sample config file.

If you are an Airbyte core member, copy the sandbox credentials from the notes of the Lastpass secret named `Primary Railz Dev Account` and place them into `secrets/config.json`.

### Locally running the connector docker image

#### Build

Via `airbyte-ci` (recommended):

```bash
airbyte-ci connectors --name=source-railz build
```

An image will be built with the tag `airbyte/source-railz:dev`.

Via `docker build`:

```bash
docker build -t airbyte/source-railz:dev .
```

#### Run

Then run any of the connector commands as follows:

```bash
docker run --rm airbyte/source-railz:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-railz:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-railz:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-railz:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
```

## Testing

You can run our full test suite locally using `airbyte-ci`:

```bash
airbyte-ci connectors --name=source-railz test
```

### Customizing acceptance Tests

Customize the `acceptance-test-config.yml` file to configure tests. See Connector Acceptance Tests for more information. If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside `integration_tests/acceptance.py`.

## Dependency Management

All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups:

- dependencies required for your connector to work go in the `MAIN_REQUIREMENTS` list;
- dependencies required for testing go in the `TEST_REQUIREMENTS` list.

## Publishing a new version of the connector

You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-railz test`
2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow semantic versioning for connectors.
3. Make sure the `metadata.yaml` content is up to date.
4. Make sure the connector documentation and its changelog are up to date (`docs/integrations/sources/railz.md`).
5. Create a Pull Request: use our PR naming conventions.
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.
airbyte-source-rd-station-marketing
# RD Station Marketing Source

This is the repository for the RD Station Marketing source connector, written in Python. For information about how to use this connector within Airbyte, see the documentation.

## Local development

### Prerequisites

To iterate on this connector, make sure to complete this prerequisites section.

From this connector directory, create a virtual environment:

```bash
python3 -m venv .venv
```

This will generate a virtualenv for this module in `.venv/`. Make sure this venv is active in your development environment of choice. To activate it from the terminal, run:

```bash
source .venv/bin/activate
pip install -r requirements.txt
pip install '.[tests]'
```

If you are in an IDE, follow your IDE's instructions to activate the virtualenv.

Note that while we are installing dependencies from `requirements.txt`, you should only edit `setup.py` for your dependencies. `requirements.txt` is used for editable installs (`pip install -e`) to pull in Python dependencies from the monorepo and will call `setup.py`. If this is mumbo jumbo to you, don't worry about it, just put your deps in `setup.py` but install using `pip install -r requirements.txt` and everything should work as you expect.

### Create credentials

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_rd_station_marketing/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `integration_tests/sample_config.json` for a sample config file.

If you are an Airbyte core member, copy the credentials in Lastpass under the secret name `source rd-station test creds` and place them into `secrets/config.json`.

### Locally running the connector

```bash
python main.py spec
python main.py check --config secrets/config.json
python main.py discover --config secrets/config.json
python main.py read --config secrets/config.json --catalog integration_tests/catalog.json
```

### Locally running the connector docker image

#### Build

Via `airbyte-ci` (recommended):

```bash
airbyte-ci connectors --name=source-rd-station-marketing build
```

An image will be built with the tag `airbyte/source-rd-station-marketing:dev`.

Via `docker build`:

```bash
docker build -t airbyte/source-rd-station-marketing:dev .
```

#### Run

Then run any of the connector commands as follows:

```bash
docker run --rm airbyte/source-rd-station-marketing:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-rd-station-marketing:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-rd-station-marketing:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-rd-station-marketing:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
```

## Testing

You can run our full test suite locally using `airbyte-ci`:

```bash
airbyte-ci connectors --name=source-rd-station-marketing test
```

### Customizing acceptance Tests

Customize the `acceptance-test-config.yml` file to configure tests. See Connector Acceptance Tests for more information. If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside `integration_tests/acceptance.py`.

## Dependency Management

All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups:

- dependencies required for your connector to work go in the `MAIN_REQUIREMENTS` list;
- dependencies required for testing go in the `TEST_REQUIREMENTS` list.

## Publishing a new version of the connector

You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-rd-station-marketing test`
2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow semantic versioning for connectors.
3. Make sure the `metadata.yaml` content is up to date.
4. Make sure the connector documentation and its changelog are up to date (`docs/integrations/sources/rd-station-marketing.md`).
5. Create a Pull Request: use our PR naming conventions.
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.
airbyte-source-recharge
# Recharge source connector

This is the repository for the Recharge source connector, written in Python. For information about how to use this connector within Airbyte, see the documentation.

## Local development

### Prerequisites

- Python (~=3.9)
- Poetry (~=1.7) - installation instructions here

### Installing the connector

From this connector directory, run:

```bash
poetry install --with dev
```

### Create credentials

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_recharge/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `sample_files/sample_config.json` for a sample config file.

### Locally running the connector

```bash
poetry run source-recharge spec
poetry run source-recharge check --config secrets/config.json
poetry run source-recharge discover --config secrets/config.json
poetry run source-recharge read --config secrets/config.json --catalog sample_files/configured_catalog.json
```

### Running unit tests

To run unit tests locally, from the connector directory run:

```bash
poetry run pytest unit_tests
```

### Building the docker image

1. Install `airbyte-ci`.
2. Run the following command to build the docker image:

```bash
airbyte-ci connectors --name=source-recharge build
```

An image will be available on your host with the tag `airbyte/source-recharge:dev`.

### Running as a docker container

Then run any of the connector commands as follows:

```bash
docker run --rm airbyte/source-recharge:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-recharge:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-recharge:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-recharge:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
```

### Running our CI test suite

You can run our full test suite locally using `airbyte-ci`:

```bash
airbyte-ci connectors --name=source-recharge test
```

### Customizing acceptance Tests

Customize the `acceptance-test-config.yml` file to configure acceptance tests. See Connector Acceptance Tests for more information. If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside `integration_tests/acceptance.py`.

### Dependency Management

All of your dependencies should be managed via Poetry. To add a new dependency, run:

```bash
poetry add <package-name>
```

Please commit the changes to the `pyproject.toml` and `poetry.lock` files.

### Publishing a new version of the connector

You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-recharge test`
2. Bump the connector version (please follow semantic versioning for connectors):
   - bump the `dockerImageTag` value in `metadata.yaml`
   - bump the `version` value in `pyproject.toml`
3. Make sure the `metadata.yaml` content is up to date.
4. Make sure the connector documentation and its changelog are up to date (`docs/integrations/sources/recharge.md`).
5. Create a Pull Request: use our PR naming conventions.
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.
8. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry.
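The version bump in the publishing checklist above touches two files. A hedged sketch of verifying that `metadata.yaml`'s `dockerImageTag` and `pyproject.toml`'s `version` agree before opening a PR; the file contents written here are minimal stand-ins for illustration, not the connector's real files:

```shell
# Minimal stand-in files; in practice these already exist in the connector
# directory and you would skip straight to the check below.
printf 'data:\n  dockerImageTag: 1.2.3\n' > metadata.yaml
printf '[tool.poetry]\nname = "source-recharge"\nversion = "1.2.3"\n' > pyproject.toml
# Extract both version strings and fail loudly if they diverge.
python - <<'EOF'
import re

meta = re.search(r"dockerImageTag:\s*(\S+)", open("metadata.yaml").read()).group(1)
poetry = re.search(r'^version\s*=\s*"([^"]+)"', open("pyproject.toml").read(), re.M).group(1)
assert meta == poetry, f"version mismatch: {meta} != {poetry}"
print(f"versions match: {meta}")
EOF
# prints: versions match: 1.2.3
```

This kind of pre-PR check catches the common mistake of bumping one file but not the other.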
airbyte-source-recreation
# Recreation Source

This is the repository for the Recreation configuration-based source connector. For information about how to use this connector within Airbyte, see the documentation.

The Recreation Information Database (RIDB) provides data resources to citizens, offering a single point of access to information about recreational opportunities nationwide. The RIDB represents an authoritative source of information and services for millions of visitors to federal lands, historic sites, museums, and other attractions and resources. This initiative integrates multiple federal channels and sources about recreation opportunities into a one-stop, searchable database of recreational areas nationwide.

## Create credentials

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_recreation/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `integration_tests/sample_config.json` for a sample config file.

If you are an Airbyte core member, copy the credentials in Lastpass under the secret name `source recreation test creds` and place them into `secrets/config.json`.

## Build the docker image

Via `airbyte-ci` (recommended):

```bash
airbyte-ci connectors --name=source-recreation build
```

An image will be built with the tag `airbyte/source-recreation:dev`.

Via `docker build`:

```bash
docker build -t airbyte/source-recreation:dev .
```

## Run the connector

Then run any of the connector commands as follows:

```bash
docker run --rm airbyte/source-recreation:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-recreation:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-recreation:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-recreation:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
```

## Run tests

You can run our full test suite locally using `airbyte-ci`:

```bash
airbyte-ci connectors --name=source-recreation test
```

Customize the `acceptance-test-config.yml` file to configure tests. See Connector Acceptance Tests for more information. If your connector requires you to create or destroy resources for use during acceptance tests, create fixtures for them and place them inside `integration_tests/acceptance.py`.

## Dependency management

All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups:

- dependencies required for your connector to work go in the `MAIN_REQUIREMENTS` list.
- dependencies required for testing go in the `TEST_REQUIREMENTS` list.

## Publishing a new version of the connector

You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-recreation test`
2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow semantic versioning for connectors.
3. Make sure the `metadata.yaml` content is up to date.
4. Make sure the connector documentation and its changelog are up to date (`docs/integrations/sources/recreation.md`).
5. Create a Pull Request: use our PR naming conventions.
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.
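The `read` command above expects a configured catalog describing which streams to sync and how. As a minimal sketch, the JSON mounted at `/integration_tests/configured_catalog.json` can be generated like this; the stream name `campsites` and the empty schema are illustrative, not taken from the connector's actual catalog:

```python
import json

def minimal_configured_catalog(stream_name: str) -> dict:
    """Build a minimal configured catalog for a single full-refresh
    stream. The empty json_schema places no constraints on records."""
    return {
        "streams": [
            {
                "stream": {
                    "name": stream_name,
                    "json_schema": {},
                    "supported_sync_modes": ["full_refresh"],
                },
                "sync_mode": "full_refresh",
                "destination_sync_mode": "overwrite",
            }
        ]
    }

# Serialize the catalog so it can be mounted into the container.
catalog = minimal_configured_catalog("campsites")
print(json.dumps(catalog, indent=2))
```

Running `discover` against your config and copying a stream definition from its output is the more reliable way to build this file; the sketch only shows the expected shape.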
airbyte-source-recruitee
# Recruitee Source

This is the repository for the Recruitee configuration-based source connector. For information about how to use this connector within Airbyte, see the documentation.

## Create credentials

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_recruitee/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `integration_tests/sample_config.json` for a sample config file.

If you are an Airbyte core member, copy the credentials in Lastpass under the secret name `source recruitee test creds` and place them into `secrets/config.json`.

## Build the docker image

Via `airbyte-ci` (recommended):

```bash
airbyte-ci connectors --name=source-recruitee build
```

An image will be built with the tag `airbyte/source-recruitee:dev`.

Via `docker build`:

```bash
docker build -t airbyte/source-recruitee:dev .
```

## Run the connector

Then run any of the connector commands as follows:

```bash
docker run --rm airbyte/source-recruitee:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-recruitee:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-recruitee:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-recruitee:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
```

## Run tests

You can run our full test suite locally using `airbyte-ci`:

```bash
airbyte-ci connectors --name=source-recruitee test
```

Customize the `acceptance-test-config.yml` file to configure tests. See Connector Acceptance Tests for more information. If your connector requires you to create or destroy resources for use during acceptance tests, create fixtures for them and place them inside `integration_tests/acceptance.py`.

## Dependency management

All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups:

- dependencies required for your connector to work go in the `MAIN_REQUIREMENTS` list.
- dependencies required for testing go in the `TEST_REQUIREMENTS` list.

## Publishing a new version of the connector

You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-recruitee test`
2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow semantic versioning for connectors.
3. Make sure the `metadata.yaml` content is up to date.
4. Make sure the connector documentation and its changelog are up to date (`docs/integrations/sources/recruitee.md`).
5. Create a Pull Request: use our PR naming conventions.
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.
airbyte-source-recurly
# Recurly source connector

This is the repository for the Recurly source connector, written in Python. For information about how to use this connector within Airbyte, see the documentation.

## Local development

### Prerequisites

- Python (~=3.9)
- Poetry (~=1.7) - installation instructions here

### Installing the connector

From this connector directory, run:

```bash
poetry install --with dev
```

### Creating credentials

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_recurly/spec.json` file. Note that the `secrets` directory is gitignored by default, so there is no danger of accidentally checking in sensitive information. See `integration_tests/sample_config.json` for a sample config file.

If you are an Airbyte core member, copy the credentials in Lastpass under the secret name `source recurly test creds` and place them into `secrets/config.json`.

### Locally running the connector

```bash
poetry run source-recurly spec
poetry run source-recurly check --config secrets/config.json
poetry run source-recurly discover --config secrets/config.json
poetry run source-recurly read --config secrets/config.json --catalog sample_files/configured_catalog.json
```

### Running unit tests

To run unit tests locally, from the connector directory run:

```bash
poetry run pytest unit_tests
```

### Building the docker image

Install `airbyte-ci`, then run the following command to build the docker image:

```bash
airbyte-ci connectors --name=source-recurly build
```

An image will be available on your host with the tag `airbyte/source-recurly:dev`.

### Running the docker container

Then run any of the connector commands as follows:

```bash
docker run --rm airbyte/source-recurly:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-recurly:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-recurly:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-recurly:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
```

### Running our CI test suite

You can run our full test suite locally using `airbyte-ci`:

```bash
airbyte-ci connectors --name=source-recurly test
```

### Customizing acceptance tests

Customize the `acceptance-test-config.yml` file to configure acceptance tests. See our Connector Acceptance Tests reference for more information. If your connector requires you to create or destroy resources for use during acceptance tests, create fixtures for them and place them inside `integration_tests/acceptance.py`.

### Dependency management

All of your dependencies should be managed via Poetry. To add a new dependency, run:

```bash
poetry add <package-name>
```

Please commit the changes to the `pyproject.toml` and `poetry.lock` files.

## Publishing a new version of the connector

You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-recurly test`
2. Bump the connector version listed as `dockerImageTag` in `metadata.yaml`. Please follow semantic versioning for connectors.
3. Make sure the `metadata.yaml` content is up to date.
4. Make sure the connector documentation and its changelog are up to date (`docs/integrations/sources/recurly.md`).
5. Create a Pull Request: use our PR naming conventions.
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.

Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry.
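The `spec`, `check`, `discover`, and `read` commands above all speak the Airbyte protocol: the connector writes one JSON message per line to stdout, with a `type` field distinguishing records, state, and log messages. A minimal sketch of tallying the records in a captured `read` output (the sample messages below are illustrative, not real Recurly data):

```python
import json

def count_records(stdout_lines):
    """Count RECORD messages in a connector's line-delimited
    JSON output, skipping blank or non-JSON lines."""
    records = 0
    for line in stdout_lines:
        line = line.strip()
        if not line:
            continue
        try:
            message = json.loads(line)
        except json.JSONDecodeError:
            continue  # connectors may interleave plain log lines
        if message.get("type") == "RECORD":
            records += 1
    return records

sample = [
    '{"type": "LOG", "log": {"level": "INFO", "message": "Starting sync"}}',
    '{"type": "RECORD", "record": {"stream": "accounts", "data": {"id": 1}, "emitted_at": 0}}',
    '{"type": "STATE", "state": {"data": {}}}',
]
print(count_records(sample))  # -> 1
```

Piping the docker `read` command into a script like this is a quick sanity check that a sync actually produced data.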
airbyte-source-reply-io
# Reply Io Source

This is the repository for the Reply Io configuration-based source connector. For information about how to use this connector within Airbyte, see the documentation.

## Create credentials

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_reply_io/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `integration_tests/sample_config.json` for a sample config file.

If you are an Airbyte core member, copy the credentials in Lastpass under the secret name `source reply-io test creds` and place them into `secrets/config.json`.

## Build the docker image

Via `airbyte-ci` (recommended):

```bash
airbyte-ci connectors --name=source-reply-io build
```

An image will be built with the tag `airbyte/source-reply-io:dev`.

Via `docker build`:

```bash
docker build -t airbyte/source-reply-io:dev .
```

## Run the connector

Then run any of the connector commands as follows:

```bash
docker run --rm airbyte/source-reply-io:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-reply-io:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-reply-io:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-reply-io:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
```

## Run tests

You can run our full test suite locally using `airbyte-ci`:

```bash
airbyte-ci connectors --name=source-reply-io test
```

Customize the `acceptance-test-config.yml` file to configure tests. See Connector Acceptance Tests for more information. If your connector requires you to create or destroy resources for use during acceptance tests, create fixtures for them and place them inside `integration_tests/acceptance.py`.

## Dependency management

All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups:

- dependencies required for your connector to work go in the `MAIN_REQUIREMENTS` list.
- dependencies required for testing go in the `TEST_REQUIREMENTS` list.

## Publishing a new version of the connector

You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-reply-io test`
2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow semantic versioning for connectors.
3. Make sure the `metadata.yaml` content is up to date.
4. Make sure the connector documentation and its changelog are up to date (`docs/integrations/sources/reply-io.md`).
5. Create a Pull Request: use our PR naming conventions.
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.
airbyte-source-retently
# Retently Source

This is the repository for the Retently configuration-based source connector. For information about how to use this connector within Airbyte, see the documentation.

## Create credentials

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_retently/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `integration_tests/sample_config.json` for a sample config file.

If you are an Airbyte core member, copy the credentials in Lastpass under the secret name `source retently test creds` and place them into `secrets/config.json`.

## Build the docker image

Via `airbyte-ci` (recommended):

```bash
airbyte-ci connectors --name=source-retently build
```

An image will be built with the tag `airbyte/source-retently:dev`.

Via `docker build`:

```bash
docker build -t airbyte/source-retently:dev .
```

## Run the connector

Then run any of the connector commands as follows:

```bash
docker run --rm airbyte/source-retently:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-retently:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-retently:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-retently:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
```

## Run tests

You can run our full test suite locally using `airbyte-ci`:

```bash
airbyte-ci connectors --name=source-retently test
```

Customize the `acceptance-test-config.yml` file to configure tests. See Connector Acceptance Tests for more information. If your connector requires you to create or destroy resources for use during acceptance tests, create fixtures for them and place them inside `integration_tests/acceptance.py`.

## Dependency management

All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups:

- dependencies required for your connector to work go in the `MAIN_REQUIREMENTS` list.
- dependencies required for testing go in the `TEST_REQUIREMENTS` list.

## Publishing a new version of the connector

You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-retently test`
2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow semantic versioning for connectors.
3. Make sure the `metadata.yaml` content is up to date.
4. Make sure the connector documentation and its changelog are up to date (`docs/integrations/sources/retently.md`).
5. Create a Pull Request: use our PR naming conventions.
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.
airbyte-source-ringcentral
# Ringcentral Source

This is the repository for the Ringcentral configuration-based source connector. For information about how to use this connector within Airbyte, see the documentation.

## Create credentials

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_ringcentral/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `integration_tests/sample_config.json` for a sample config file.

If you are an Airbyte core member, copy the credentials in Lastpass under the secret name `source ringcentral test creds` and place them into `secrets/config.json`.

## Build the docker image

Via `airbyte-ci` (recommended):

```bash
airbyte-ci connectors --name=source-ringcentral build
```

An image will be built with the tag `airbyte/source-ringcentral:dev`.

Via `docker build`:

```bash
docker build -t airbyte/source-ringcentral:dev .
```

## Run the connector

Then run any of the connector commands as follows:

```bash
docker run --rm airbyte/source-ringcentral:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-ringcentral:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-ringcentral:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-ringcentral:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
```

## Run tests

You can run our full test suite locally using `airbyte-ci`:

```bash
airbyte-ci connectors --name=source-ringcentral test
```

Customize the `acceptance-test-config.yml` file to configure tests. See Connector Acceptance Tests for more information. If your connector requires you to create or destroy resources for use during acceptance tests, create fixtures for them and place them inside `integration_tests/acceptance.py`.

## Dependency management

All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups:

- dependencies required for your connector to work go in the `MAIN_REQUIREMENTS` list.
- dependencies required for testing go in the `TEST_REQUIREMENTS` list.

## Publishing a new version of the connector

You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-ringcentral test`
2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow semantic versioning for connectors.
3. Make sure the `metadata.yaml` content is up to date.
4. Make sure the connector documentation and its changelog are up to date (`docs/integrations/sources/ringcentral.md`).
5. Create a Pull Request: use our PR naming conventions.
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.
airbyte-source-rki-covid
# RKI Covid Source

This is the repository for the RKI (Robert Koch-Institut; API by Marlon Lückert) Covid-19 source connector, written in Python. For information about how to use this connector within Airbyte, see the documentation.

## Supported endpoints

Germany:

1. /germany
2. /germany/age-groups
3. /germany/history/cases/:days
4. /germany/history/incidence/:days
5. /germany/history/deaths/:days
6. /germany/history/recovered/:days
7. /germany/history/frozen-incidence/:days
8. /germany/history/hospitalization/:days
9. /germany/states
10. /germany/states/age-groups
11. /germany/states/history/cases/:days
12. /germany/states/history/incidence/:days
13. /germany/states/history/frozen-incidence/:days
14. /germany/states/history/deaths/:days
15. /germany/states/history/recovered/:days
16. /germany/states/history/hospitalization/:days

## Local development

To iterate on this connector, make sure to complete this prerequisites section.

From this connector directory, create a virtual environment:

```bash
python -m venv .venv
```

This will generate a virtualenv for this module in `.venv/`. Make sure this venv is active in your development environment of choice. To activate it from the terminal, run:

```bash
source .venv/bin/activate
pip install -r requirements.txt
pip install '.[tests]'
```

If you are in an IDE, follow your IDE's instructions to activate the virtualenv.

Note that while we are installing dependencies from `requirements.txt`, you should only edit `setup.py` for your dependencies. `requirements.txt` is used for editable installs (`pip install -e`) to pull in Python dependencies from the monorepo and will call `setup.py`. If this is mumbo jumbo to you, don't worry about it: just put your deps in `setup.py` but install using `pip install -r requirements.txt`, and everything should work as you expect.

## Create credentials

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_rki_covid/spec.json` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `integration_tests/sample_config.json` for a sample config file.

If you are an Airbyte core member, copy the credentials in Lastpass under the secret name `source rki-covid test creds` and place them into `secrets/config.json`.

## Locally running the connector

```bash
python main.py spec
python main.py check --config secrets/config.json
python main.py discover --config secrets/config.json
python main.py read --config secrets/config.json --catalog integration_tests/configured_catalog.json
```

## Build the docker image

Via `airbyte-ci` (recommended):

```bash
airbyte-ci connectors --name=source-rki-covid build
```

An image will be built with the tag `airbyte/source-rki-covid:dev`.

Via `docker build`:

```bash
docker build -t airbyte/source-rki-covid:dev .
```

## Run the connector

Then run any of the connector commands as follows:

```bash
docker run --rm airbyte/source-rki-covid:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-rki-covid:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-rki-covid:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-rki-covid:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
```

## Run tests

You can run our full test suite locally using `airbyte-ci`:

```bash
airbyte-ci connectors --name=source-rki-covid test
```

Customize the `acceptance-test-config.yml` file to configure tests. See Connector Acceptance Tests for more information. If your connector requires you to create or destroy resources for use during acceptance tests, create fixtures for them and place them inside `integration_tests/acceptance.py`.

## Dependency management

All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups:

- dependencies required for your connector to work go in the `MAIN_REQUIREMENTS` list.
- dependencies required for testing go in the `TEST_REQUIREMENTS` list.

## Publishing a new version of the connector

You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-rki-covid test`
2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow semantic versioning for connectors.
3. Make sure the `metadata.yaml` content is up to date.
4. Make sure the connector documentation and its changelog are up to date (`docs/integrations/sources/rki-covid.md`).
5. Create a Pull Request: use our PR naming conventions.
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.
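The history endpoints listed for this connector all share a `/history/<metric>/:days` shape, in both the country-wide and per-state variants. A small sketch that fills in the `:days` placeholder for the documented metrics; the base URL is an assumption (this README documents only the paths), so treat the full URLs as illustrative:

```python
# Assumed base URL; the README lists only the endpoint paths.
BASE_URL = "https://api.corona-zahlen.org"

# Metrics taken from the endpoint list above.
HISTORY_METRICS = {
    "cases", "incidence", "deaths", "recovered",
    "frozen-incidence", "hospitalization",
}

def history_url(metric: str, days: int, by_state: bool = False) -> str:
    """Build a history endpoint URL, filling the `:days`
    placeholder from the documented path templates."""
    if metric not in HISTORY_METRICS:
        raise ValueError(f"unknown history metric: {metric}")
    scope = "/germany/states" if by_state else "/germany"
    return f"{BASE_URL}{scope}/history/{metric}/{days}"

print(history_url("cases", 7))
print(history_url("hospitalization", 30, by_state=True))
```

A helper like this is only meant to show how the `:days` parameter maps onto requests; the connector itself resolves these paths from its stream definitions.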
airbyte-source-rocket-chat
# Rocket Chat Source

This is the repository for the Rocket Chat configuration-based source connector. For information about how to use this connector within Airbyte, see the documentation.

## Create credentials

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file `secrets/config.json` conforming to the `source_rocket_chat/spec.yaml` file. Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See `integration_tests/sample_config.json` for a sample config file.

If you are an Airbyte core member, copy the credentials in Lastpass under the secret name `source rocket-chat test creds` and place them into `secrets/config.json`.

## Build the docker image

Via `airbyte-ci` (recommended):

```bash
airbyte-ci connectors --name=source-rocket-chat build
```

An image will be built with the tag `airbyte/source-rocket-chat:dev`.

Via `docker build`:

```bash
docker build -t airbyte/source-rocket-chat:dev .
```

## Run the connector

Then run any of the connector commands as follows:

```bash
docker run --rm airbyte/source-rocket-chat:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-rocket-chat:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-rocket-chat:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-rocket-chat:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
```

## Run tests

You can run our full test suite locally using `airbyte-ci`:

```bash
airbyte-ci connectors --name=source-rocket-chat test
```

Customize the `acceptance-test-config.yml` file to configure tests. See Connector Acceptance Tests for more information. If your connector requires you to create or destroy resources for use during acceptance tests, create fixtures for them and place them inside `integration_tests/acceptance.py`.

## Dependency management

All of your dependencies should go in `setup.py`, NOT `requirements.txt`. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups:

- dependencies required for your connector to work go in the `MAIN_REQUIREMENTS` list.
- dependencies required for testing go in the `TEST_REQUIREMENTS` list.

## Publishing a new version of the connector

You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-rocket-chat test`
2. Bump the connector version in `metadata.yaml`: increment the `dockerImageTag` value. Please follow semantic versioning for connectors.
3. Make sure the `metadata.yaml` content is up to date.
4. Make sure the connector documentation and its changelog are up to date (`docs/integrations/sources/rocket-chat.md`).
5. Create a Pull Request: use our PR naming conventions.
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.