package | package-description
---|---
airbyte-api | Programmatically control Airbyte Cloud through an API.

**Authentication**

Developers will need to create an API Key within your Developer Portal to make API requests. You can use your existing Airbyte account to log in to the Developer Portal. Once you are in the Developer Portal, use the API Keys tab to create or remove API Keys. You can see a walkthrough demo here 🎦

The Developer Portal UI can also be used to help build your integration by showing information about network requests in the Requests tab. API usage information is also available to you in the Usage tab.

**SDK Installation**

```
pip install airbyte-api
```

**SDK Example Usage**

```python
import airbyte
from airbyte.models import shared

s = airbyte.Airbyte(
    security=shared.Security(
        basic_auth=shared.SchemeBasicAuth(
            password="<YOUR_PASSWORD_HERE>",
            username="<YOUR_USERNAME_HERE>",
        ),
    ),
)

req = shared.ConnectionCreateRequest(
    destination_id='c669dd1e-3620-483e-afc8-55914e0a570f',
    source_id='6dd427d8-3a55-4584-b835-842325b6c7b3',
    namespace_format='${SOURCE_NAMESPACE}',
)

res = s.connections.create_connection(req)

if res.connection_response is not None:
    # handle response
    pass
```

**Available Resources and Operations**

*connections*
- create_connection - Create a connection
- delete_connection - Delete a Connection
- get_connection - Get Connection details
- list_connections - List connections
- patch_connection - Update Connection details

*destinations*
- create_destination - Create a destination
- delete_destination - Delete a Destination
- get_destination - Get Destination details
- list_destinations - List destinations
- patch_destination - Update a Destination
- put_destination - Update a Destination and fully overwrite it

*jobs*
- cancel_job - Cancel a running Job
- create_job - Trigger a sync or reset job of a connection
- get_job - Get Job status and details
- list_jobs - List Jobs by sync type

*sources*
- create_source - Create a source
- delete_source - Delete a Source
- get_source - Get Source details
- initiate_o_auth - Initiate OAuth for a source
- list_sources - List sources
- patch_source - Update a Source
- put_source - Update a Source and fully overwrite it

*streams*
- get_stream_properties - Get stream properties

*workspaces*
- create_or_update_workspace_o_auth_credentials - Create OAuth override credentials for a workspace and source type
- create_workspace - Create a workspace
- delete_workspace - Delete a Workspace
- get_workspace - Get Workspace details
- list_workspaces - List workspaces
- update_workspace - Update a workspace

**Error Handling**

Handling errors in this SDK should largely match your expectations. All operations return a response object or raise an error. If Error objects are specified in your OpenAPI Spec, the SDK will raise the appropriate Error type.

Error Object | Status Code | Content Type
---|---|---
errors.SDKError | 4x-5xx | /

Example:

```python
import airbyte
from airbyte.models import errors, shared

s = airbyte.Airbyte(
    security=shared.Security(
        basic_auth=shared.SchemeBasicAuth(
            password="<YOUR_PASSWORD_HERE>",
            username="<YOUR_USERNAME_HERE>",
        ),
    ),
)

req = shared.ConnectionCreateRequest(
    destination_id='c669dd1e-3620-483e-afc8-55914e0a570f',
    source_id='6dd427d8-3a55-4584-b835-842325b6c7b3',
    namespace_format='${SOURCE_NAMESPACE}',
)

res = None
try:
    res = s.connections.create_connection(req)
except errors.SDKError as e:
    # handle exception
    raise e

if res.connection_response is not None:
    # handle response
    pass
```

**Server Selection**

*Select Server by Index*

You can override the default server globally by passing a server index to the `server_idx: int` optional parameter when initializing the SDK client instance. The selected server will then be used as the default on the operations that use it. This table lists the indexes associated with the available servers:

\# | Server | Variables
---|---|---
0 | https://api.airbyte.com/v1 | None

Example:

```python
import airbyte
from airbyte.models import shared

s = airbyte.Airbyte(
    server_idx=0,
    security=shared.Security(
        basic_auth=shared.SchemeBasicAuth(
            password="<YOUR_PASSWORD_HERE>",
            username="<YOUR_USERNAME_HERE>",
        ),
    ),
)
# ...request/response handling as in the SDK Example Usage above
```

*Override Server URL Per-Client*

The default server can also be overridden globally by passing a URL to the `server_url: str` optional parameter when initializing the SDK client instance. For example:

```python
import airbyte
from airbyte.models import shared

s = airbyte.Airbyte(
    server_url="https://api.airbyte.com/v1",
    security=shared.Security(
        basic_auth=shared.SchemeBasicAuth(
            password="<YOUR_PASSWORD_HERE>",
            username="<YOUR_USERNAME_HERE>",
        ),
    ),
)
# ...request/response handling as in the SDK Example Usage above
```

**Custom HTTP Client**

The Python SDK makes API calls using the `requests` HTTP library. In order to provide a convenient way to configure timeouts, cookies, proxies, custom headers, and other low-level configuration, you can initialize the SDK client with a custom `requests.Session` object. For example, you could specify a header for every request that this SDK makes as follows:

```python
import airbyte
import requests

http_client = requests.Session()
http_client.headers.update({'x-custom-header': 'someValue'})
s = airbyte.Airbyte(client=http_client)
```

**Authentication**

*Per-Client Security Schemes*

This SDK supports the following security schemes globally:

Name | Type | Scheme
---|---|---
basic_auth | http | HTTP Basic
bearer_auth | http | HTTP Bearer

You can set the security parameters through the `security` optional parameter when initializing the SDK client instance. The selected scheme will be used by default to authenticate with the API for all operations that support it (see the SDK Example Usage snippet above).

**Maturity**

This SDK is in beta, and there may be breaking changes between versions without a major version update. Therefore, we recommend pinning usage to a specific package version. This way, you can install the same version each time without breaking changes unless you are intentionally looking for the latest version.

**Contributions**

While we value open-source contributions to this SDK, this library is generated programmatically. Feel free to open a PR or a GitHub issue as a proof of concept and we'll do our best to include it in a future release!

SDK Created by Speakeasy |
airbyte-api-wrapper | Python wrapper for the Airbyte API. The full API spec can be found here: https://airbyte-public-api-docs.s3.us-east-2.amazonaws.com/rapidoc-api-docs.html

**How to use**

*Create client*

```python
from airbyte_python_helper import AirbyteHelper
import os

airbyte_url = os.environ["AIRBYTE_URL"]
airbyte_client_id = os.environ["CLIENT_ID"]
airbyte_client_secret = os.environ["CLIENT_SECRET"]

airbyte_client = AirbyteHelper(
    airbyte_url, airbyte_client_id, airbyte_client_secret
)
```

*Destinations*

```python
wid = airbyte_client.get_first_workspace_id()
print("workspaceId", wid)
print(airbyte_client.list_destinations(wid))
for destination in airbyte_client.list_destinations(wid):
    airbyte_client.delete_destination(destination["destinationId"])
```

*Sources*

```python
wid = airbyte_client.get_first_workspace_id()
print("workspaceId", wid)
sources = airbyte_client.list_sources(wid)
print(sources)
for source in sources:
    print(source["sourceId"])
```

*Workspaces*

```python
workspaces = airbyte_client.list_workspaces()
print(workspaces)
``` |
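The wrapper's internals are not shown in this README. As a rough, purely hypothetical sketch of what a thin helper like this typically does (the class name, base URL, and Basic-auth scheme below are assumptions for illustration, not taken from `airbyte_python_helper`):

```python
import base64

class MiniAirbyteHelper:
    """Hypothetical sketch of a thin API wrapper, NOT the real airbyte_python_helper."""

    def __init__(self, url: str, client_id: str, client_secret: str):
        self.url = url.rstrip("/")
        # Assumption: credentials are sent as an HTTP Basic Authorization header.
        token = base64.b64encode(f"{client_id}:{client_secret}".encode()).decode()
        self.headers = {"Authorization": f"Basic {token}"}

    def endpoint(self, path: str) -> str:
        # Compose a full endpoint URL, e.g. endpoint("workspaces")
        return f"{self.url}/{path.lstrip('/')}"

helper = MiniAirbyteHelper("https://api.airbyte.example/v1/", "my-id", "my-secret")
print(helper.endpoint("workspaces"))  # → https://api.airbyte.example/v1/workspaces
```

A real helper would then pass `self.headers` to an HTTP client for each call; the real package may authenticate differently.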
airbyte-cdk | Connector Development Kit (Python)

The Airbyte Python CDK is a framework for rapidly developing production-grade Airbyte connectors. The CDK currently offers helpers specific for creating Airbyte source connectors for:

- HTTP APIs (REST APIs, GraphQL, etc.)
- Generic Python sources (anything not covered by the above)

The CDK provides an improved developer experience by providing basic implementation structure and abstracting away low-level glue boilerplate. This document is a general introduction to the CDK. Readers should have basic familiarity with the Airbyte Specification before proceeding.

**Getting Started**

Generate an empty connector using the code generator. First clone the Airbyte repository, then from the repository root run:

```
cd airbyte-integrations/connector-templates/generator
./generate.sh
```

then follow the interactive prompt. Next, find all TODOs in the generated project directory -- they're accompanied by lots of comments explaining what you'll need to do in order to implement your connector. Upon completing all TODOs properly, you should have a functioning connector. Additionally, you can follow this tutorial for a complete walkthrough of creating an HTTP connector using the Airbyte CDK.

**Concepts & Documentation**

See the concepts docs for a tour through what the API offers.

**Example Connectors**

HTTP Connectors: Stripe, Slack. Simple Python connectors using the barebones `Source` abstraction: Google Sheets, Mailchimp.

**Contributing**

*First time setup*

We assume `python` points to Python 3.9 or higher. Set up a virtual env:

```
python -m venv .venv
source .venv/bin/activate
pip install -e ".[dev]"  # [dev] installs development-only dependencies
```

*Iteration*

- Iterate on the code locally
- Run tests via `python -m pytest -s unit_tests`
- Perform static type checks using `mypy airbyte_cdk`. MyPy configuration is in `mypy.ini`. Run `mypy <files to check>` to only check specific files. This is useful as the CDK still contains code that is not compliant.
- The `type_check_and_test.sh` script bundles both type checking and testing in one convenient command. Feel free to use it!

*Autogenerated files*

If the iteration you are working on includes changes to the models, you might want to regenerate them. In order to do that, you can run:

```
cd airbyte-cdk/python
./gradlew build
```

This will generate the files based on the schemas, add the license information and format the code. If you want to only do the former and rely on pre-commit for the others, you can run the appropriate generation command, i.e. `./gradlew generateComponentManifestClassFiles`.

*Testing*

All tests are located in the `unit_tests` directory. Run `python -m pytest --cov=airbyte_cdk unit_tests/` to run them. This also presents a test coverage report.

*Building and testing a connector with your local CDK*

When developing a new feature in the CDK, you may find it helpful to run a connector that uses that new feature. You can test this in one of two ways:

- Running a connector locally
- Building and running a source via Docker

*Installing your local CDK into a local Python connector*

In order to get a local Python connector running your local CDK, do the following. First, make sure you have your connector's virtual environment active:

```
# from the `airbyte/airbyte-integrations/connectors/<connector-directory>` directory
source .venv/bin/activate

# if you haven't installed dependencies for your connector already
pip install -e .
```

Then, navigate to the CDK and install it in editable mode:

```
cd ../../../airbyte-cdk/python
pip install -e .
```

You should see that `pip` has uninstalled the version of `airbyte-cdk` defined by your connector's `setup.py` and installed your local CDK. Any changes you make will be immediately reflected in your editor, so long as your editor's interpreter is set to your connector's virtual environment.

*Building a Python connector in Docker with your local CDK installed*

Pre-requisite: install the `airbyte-ci` CLI. You can build your connector image with the local CDK using:

```
# from the airbytehq/airbyte base directory
airbyte-ci connectors --use-local-cdk --name=<CONNECTOR> build
```

Note that the local CDK is injected at build time, so if you make changes, you will have to run the build command again to see them reflected.

*Running Connector Acceptance Tests for a single connector in Docker with your local CDK installed*

Pre-requisite: install the `airbyte-ci` CLI. To run acceptance tests for a single connector using the local CDK, from the connector directory, run:

```
airbyte-ci connectors --use-local-cdk --name=<CONNECTOR> test
```

*When you don't have access to the API*

There may be times when you do not have access to the API (because you don't have the credentials, network access, etc.). You will probably still want to do end-to-end testing at least once. In order to do so, you can emulate the server you would be reaching using a server stubbing tool. For example, using mockserver, you can set up an expectation file like this:

```json
{
  "httpRequest": {
    "method": "GET",
    "path": "/data"
  },
  "httpResponse": {
    "body": "{\"data\": [{\"record_key\": 1}, {\"record_key\": 2}]}"
  }
}
```

Assuming this file has been created at `secrets/mock_server_config/expectations.json`, running the following command will match any requests on path `/data` and return the response defined in the expectation file:

```
docker run -d --rm -v $(pwd)/secrets/mock_server_config:/config -p 8113:8113 --env MOCKSERVER_LOG_LEVEL=TRACE --env MOCKSERVER_SERVER_PORT=8113 --env MOCKSERVER_WATCH_INITIALIZATION_JSON=true --env MOCKSERVER_PERSISTED_EXPECTATIONS_PATH=/config/expectations.json --env MOCKSERVER_INITIALIZATION_JSON_PATH=/config/expectations.json mockserver/mockserver:5.15.0
```

HTTP requests to `localhost:8113/data` should now return the body defined in the expectations file. To test this, the implementer either has to change the code which defines the base URL for a Python source, or update the `url_base` from low-code. With the Connector Builder running in Docker, you will have to use the domain `host.docker.internal` instead of `localhost`, as the requests are executed within Docker.

*Publishing a new version to PyPi*

1. Open a PR
2. Once it is approved and merged, an Airbyte member must run the "Publish CDK Manually" workflow from master using `release-type=major|minor|patch` and setting the changelog message. |
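For a quick smoke test without Docker, the same stubbing idea can be sketched in pure Python with the standard library. This is an illustrative alternative to mockserver, not part of the CDK; it serves the same canned payload as the expectation file above:

```python
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

# Canned payload mirroring the mockserver expectation above
PAYLOAD = {"data": [{"record_key": 1}, {"record_key": 2}]}

class StubHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/data":
            body = json.dumps(PAYLOAD).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

    def log_message(self, *args):
        pass  # keep output quiet

server = HTTPServer(("127.0.0.1", 0), StubHandler)  # port 0 = pick a free port
threading.Thread(target=server.serve_forever, daemon=True).start()

url = f"http://127.0.0.1:{server.server_port}/data"
with urllib.request.urlopen(url) as resp:
    records = json.load(resp)["data"]
server.shutdown()
print(records)  # → [{'record_key': 1}, {'record_key': 2}]
```

As with mockserver, you would point the connector's `url_base` at the stub's address while testing.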
airbyte-cdk-bootstrap | **About**: a package that helps you quickly bootstrap your Airbyte source. Based on the airbyte-cdk package.

**Possibilities** — airbyte-cdk-bootstrap can help you with:
- Fast setup of spec dates
- Fast setup of spec authorization
- TODO: multithreaded sources
- Fast setup of CredentialsCraft authorization |
airbyte-cdk-PHLAIR | Connector Development Kit (Python)

The Airbyte Python CDK is a framework for rapidly developing production-grade Airbyte connectors. The CDK currently offers helpers specific for creating Airbyte source connectors for:

- HTTP APIs (REST APIs, GraphQL, etc.)
- Singer Taps
- Generic Python sources (anything not covered by the above)

The CDK provides an improved developer experience by providing basic implementation structure and abstracting away low-level glue boilerplate. This document is a general introduction to the CDK. Readers should have basic familiarity with the Airbyte Specification before proceeding.

**Getting Started**

Generate an empty connector using the code generator. First clone the Airbyte repository, then from the repository root run:

```
cd airbyte-integrations/connector-templates/generator
./generate.sh
```

then follow the interactive prompt. Next, find all TODOs in the generated project directory -- they're accompanied by lots of comments explaining what you'll need to do in order to implement your connector. Upon completing all TODOs properly, you should have a functioning connector. Additionally, you can follow this tutorial for a complete walkthrough of creating an HTTP connector using the Airbyte CDK.

**Concepts & Documentation**

See the concepts docs for a tour through what the API offers.

**Example Connectors**

HTTP Connectors: Exchangerates API, Stripe, Slack. Singer connectors: Salesforce, Github. Simple Python connectors using the barebones `Source` abstraction: Google Sheets, Mailchimp.

**Contributing**

*First time setup*

We assume `python` points to python >= 3.9. Setup a virtual env:

```
python -m venv .venv
source .venv/bin/activate
pip install -e ".[dev]"  # [dev] installs development-only dependencies
```

*Iteration*

- Iterate on the code locally
- Run tests via `pytest -s unit_tests`
- Perform static type checks using `mypy airbyte_cdk`. MyPy configuration is in `.mypy.ini`.
- The `type_check_and_test.sh` script bundles both type checking and testing in one convenient command. Feel free to use it!

*Testing*

All tests are located in the `unit_tests` directory. Run `pytest --cov=airbyte_cdk unit_tests/` to run them. This also presents a test coverage report.

*Publishing a new version to PyPi*

1. Bump the package version in `setup.py`
2. Open a PR
3. An Airbyte member must comment `/publish-cdk dry-run=true` to publish the package to test.pypi.org, or `/publish-cdk dry-run=false` to publish it to the real index of pypi.org.

**Coming Soon**

- Full OAuth 2.0 support (including refresh token issuing flow via UI or CLI)
- Airbyte Java HTTP CDK
- CDK for Async HTTP endpoints (request-poll-wait style endpoints)
- CDK for other protocols

Don't see a feature you need? Create an issue and let us know how we can help! |
airbyte-cdk-test | Connector Development Kit (Python CDK)

The Airbyte Python CDK is a framework for rapidly developing production-grade Airbyte connectors. The CDK currently offers helpers specific for creating Airbyte source connectors for:

- HTTP APIs (REST APIs, GraphQL, etc.)
- Singer Taps
- Generic Python sources (anything not covered by the above)

The CDK provides an improved developer experience by providing basic implementation structure and abstracting away low-level glue boilerplate. This document is a general introduction to the CDK. Readers should have basic familiarity with the Airbyte Specification before proceeding.

**Getting Started**

Generate an empty connector using the code generator. First clone the Airbyte repository, then from the repository root run:

```
cd airbyte-integrations/connector-templates/generator
npm run generate
```

then follow the interactive prompt. Next, find all TODOs in the generated project directory -- they're accompanied by lots of comments explaining what you'll need to do in order to implement your connector. Upon completing all TODOs properly, you should have a functioning connector. Additionally, you can follow this tutorial for a complete walkthrough of creating an HTTP connector using the Airbyte CDK.

**Concepts & Documentation**

See the concepts docs for a tour through what the API offers.

**Example Connectors**

HTTP Connectors: Exchangerates API, Stripe, Slack. Singer connectors: Salesforce, Github. Simple Python connectors using the barebones `Source` abstraction: Google Sheets, Mailchimp.

**Contributing**

*First time setup*

We assume `python` points to python >= 3.7. Setup a virtual env:

```
python -m venv .venv
source .venv/bin/activate
pip install -e ".[dev]"  # [dev] installs development-only dependencies
```

*Iteration*

- Iterate on the code locally
- Run tests via `pytest -s unit_tests`
- Perform static type checks using `mypy airbyte_cdk`. MyPy configuration is in `.mypy.ini`.
- The `type_check_and_test.sh` script bundles both type checking and testing in one convenient command. Feel free to use it!

*Testing*

All tests are located in the `unit_tests` directory. Run `pytest --cov=airbyte_cdk unit_tests/` to run them. This also presents a test coverage report.

*Publishing a new version to PyPi*

1. Bump the package version in `setup.py`
2. Open a PR
3. An Airbyte member must comment `/publish-cdk --dry-run=<true or false>`. Dry runs publish to test.pypi.org.

**Coming Soon**

- Full OAuth 2.0 support (including refresh token issuing flow via UI or CLI)
- Airbyte Java HTTP CDK
- CDK for Async HTTP endpoints (request-poll-wait style endpoints)
- CDK for other protocols
- General CDK for Destinations

Don't see a feature you need? Create an issue and let us know how we can help! |
airbyte-cdk-velocity | Connector Development Kit (Python)

The Airbyte Python CDK is a framework for rapidly developing production-grade Airbyte connectors. The CDK currently offers helpers specific for creating Airbyte source connectors for:

- HTTP APIs (REST APIs, GraphQL, etc.)
- Singer Taps
- Generic Python sources (anything not covered by the above)

The CDK provides an improved developer experience by providing basic implementation structure and abstracting away low-level glue boilerplate. This document is a general introduction to the CDK. Readers should have basic familiarity with the Airbyte Specification before proceeding.

**Getting Started**

Generate an empty connector using the code generator. First clone the Airbyte repository, then from the repository root run:

```
cd airbyte-integrations/connector-templates/generator
./generate.sh
```

then follow the interactive prompt. Next, find all TODOs in the generated project directory -- they're accompanied by lots of comments explaining what you'll need to do in order to implement your connector. Upon completing all TODOs properly, you should have a functioning connector. Additionally, you can follow this tutorial for a complete walkthrough of creating an HTTP connector using the Airbyte CDK.

**Concepts & Documentation**

See the concepts docs for a tour through what the API offers.

**Example Connectors**

HTTP Connectors: Exchangerates API, Stripe, Slack. Singer connectors: Salesforce, Github. Simple Python connectors using the barebones `Source` abstraction: Google Sheets, Mailchimp.

**Contributing**

*First time setup*

We assume `python` points to python >= 3.7. Setup a virtual env:

```
python -m venv .venv
source .venv/bin/activate
pip install -e ".[dev]"  # [dev] installs development-only dependencies
```

*Iteration*

- Iterate on the code locally
- Run tests via `pytest -s unit_tests`
- Perform static type checks using `mypy airbyte_cdk`. MyPy configuration is in `.mypy.ini`.
- The `type_check_and_test.sh` script bundles both type checking and testing in one convenient command. Feel free to use it!

*Testing*

All tests are located in the `unit_tests` directory. Run `pytest --cov=airbyte_cdk unit_tests/` to run them. This also presents a test coverage report.

*Publishing a new version to PyPi*

1. Bump the package version in `setup.py`
2. Open a PR
3. An Airbyte member must comment `/publish-cdk dry-run=true` to publish the package to test.pypi.org, or `/publish-cdk dry-run=false` to publish it to the real index of pypi.org.

**Coming Soon**

- Full OAuth 2.0 support (including refresh token issuing flow via UI or CLI)
- Airbyte Java HTTP CDK
- CDK for Async HTTP endpoints (request-poll-wait style endpoints)
- CDK for other protocols

Don't see a feature you need? Create an issue and let us know how we can help! |
airbyte-cdk-velocity-amazon | Connector Development Kit (Python)

The Airbyte Python CDK is a framework for rapidly developing production-grade Airbyte connectors. The CDK currently offers helpers specific for creating Airbyte source connectors for:

- HTTP APIs (REST APIs, GraphQL, etc.)
- Singer Taps
- Generic Python sources (anything not covered by the above)

The CDK provides an improved developer experience by providing basic implementation structure and abstracting away low-level glue boilerplate. This document is a general introduction to the CDK. Readers should have basic familiarity with the Airbyte Specification before proceeding.

**Getting Started**

Generate an empty connector using the code generator. First clone the Airbyte repository, then from the repository root run:

```
cd airbyte-integrations/connector-templates/generator
./generate.sh
```

then follow the interactive prompt. Next, find all TODOs in the generated project directory -- they're accompanied by lots of comments explaining what you'll need to do in order to implement your connector. Upon completing all TODOs properly, you should have a functioning connector. Additionally, you can follow this tutorial for a complete walkthrough of creating an HTTP connector using the Airbyte CDK.

**Concepts & Documentation**

See the concepts docs for a tour through what the API offers.

**Example Connectors**

HTTP Connectors: Exchangerates API, Stripe, Slack. Singer connectors: Salesforce, Github. Simple Python connectors using the barebones `Source` abstraction: Google Sheets, Mailchimp.

**Contributing**

*First time setup*

We assume `python` points to python >= 3.7. Setup a virtual env:

```
python -m venv .venv
source .venv/bin/activate
pip install -e ".[dev]"  # [dev] installs development-only dependencies
```

*Iteration*

- Iterate on the code locally
- Run tests via `pytest -s unit_tests`
- Perform static type checks using `mypy airbyte_cdk`. MyPy configuration is in `.mypy.ini`.
- The `type_check_and_test.sh` script bundles both type checking and testing in one convenient command. Feel free to use it!

*Testing*

All tests are located in the `unit_tests` directory. Run `pytest --cov=airbyte_cdk unit_tests/` to run them. This also presents a test coverage report.

*Publishing a new version to PyPi*

1. Bump the package version in `setup.py`
2. Open a PR
3. An Airbyte member must comment `/publish-cdk dry-run=true` to publish the package to test.pypi.org, or `/publish-cdk dry-run=false` to publish it to the real index of pypi.org.

**Coming Soon**

- Full OAuth 2.0 support (including refresh token issuing flow via UI or CLI)
- Airbyte Java HTTP CDK
- CDK for Async HTTP endpoints (request-poll-wait style endpoints)
- CDK for other protocols

Don't see a feature you need? Create an issue and let us know how we can help! |
airbyte-lib | airbyte-lib is a library that allows you to run Airbyte syncs embedded into any Python application, without the need to run an Airbyte server.

**Development**

- Make sure Poetry is installed.
- Run `poetry install`
- For examples, check out the `examples` folder. They can be run via `poetry run python examples/<example file>`
- Unit tests and type checks can be run via `poetry run pytest`

**Release**

In your PR:
- Bump the version in `pyproject.toml`
- Add a changelog entry to the table below

Once the PR is merged, go to Github and trigger the "Publish AirbyteLib Manually" workflow. This will publish the new version to PyPI.

**Secrets Management**

AirbyteLib can auto-import secrets from the following sources:
1. Environment variables.
2. Google Colab secrets.
3. Manual entry via `getpass`.

Note: additional secret store options may be supported in the future. More info here.

*Retrieving Secrets*

```python
from airbyte_lib import get_connection, get_secret, SecretSource

source = get_connection("source-github")
source.set_config({
    "credentials": {
        "personal_access_token": get_secret("GITHUB_PERSONAL_ACCESS_TOKEN"),
    }
})
```

The `get_secret()` function accepts an optional `source` argument of enum type `SecretSource`. If omitted or set to `SecretSource.ANY`, AirbyteLib will search all available secrets sources. If `source` is set to a specific source, then only that source will be checked. If a list of `SecretSource` entries is passed, then the sources will be checked using the provided ordering.

By default, AirbyteLib will prompt the user for any requested secrets that are not provided via other secret managers. You can disable this prompt by passing `prompt=False` to `get_secret()`.

**Versioning**

Versioning follows Semantic Versioning. For new features, bump the minor version. For bug fixes, bump the patch version. For pre-releases, append `dev.N` to the version. For example, `0.1.0dev.1` is the first pre-release of the `0.1.0` version.

**Documentation**

Regular documentation lives in the `/docs` folder. Based on the doc strings of public methods, we generate API documentation using pdoc. To generate the documentation, run `poetry run generate-docs`. The documentation will be generated in the `docs/generate` folder. This needs to be done manually when changing the public interface of the library. A unit test validates the documentation is up to date.

**Connector compatibility**

To make a connector compatible with airbyte-lib, the following requirements must be met:

- The connector must be a Python package, with a `pyproject.toml` or a `setup.py` file.
- In the package, there must be a `run.py` file that contains a `run` method. This method should read arguments from the command line, and run the connector with them, outputting messages to stdout.
- The `pyproject.toml` or `setup.py` file must specify a command line entry point for the `run` method called `source-<connector name>`. This is usually done by adding a `console_scripts` section to the `pyproject.toml` file, or an `entry_points` section to the `setup.py` file. For example:

```toml
[tool.poetry.scripts]
source-my-connector = "my_connector.run:run"
```

```python
setup(
    ...
    entry_points={
        'console_scripts': [
            'source-my-connector = my_connector.run:run',
        ],
    },
    ...
)
```

To publish a connector to PyPI, specify the `pypi` section in the `metadata.yaml` file. For example:

```yaml
data:
  # ...
  remoteRegistries:
    pypi:
      enabled: true
      packageName: "airbyte-source-my-connector"
```

*Validating source connectors*

To validate a source connector for compliance, the `airbyte-lib-validate-source` script can be used. It can be used like this:

```
airbyte-lib-validate-source --connector-dir . --sample-config secrets/config.json
```

The script will install the Python package in the provided directory, and run the connector against the provided config. The config should be a valid JSON file, with the same structure as the one that would be provided to the connector in Airbyte. The script will exit with a non-zero exit code if the connector fails to run.

For a more lightweight check, the `--validate-install-only` flag can be used. This will only check that the connector can be installed and returns a spec; no sample config is required.

**Changelog**

Version | PR | Description
---|---|---
0.1.0 | #35184 | Beta Release 0.1.0
0.1.0dev.2 | #34111 | Initial publish - add publish workflow |
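To make the `run.py` requirement concrete, here is a minimal, hypothetical sketch of such an entry point. The connector name and the message content are illustrative only; a real connector would emit actual Airbyte protocol messages:

```python
# my_connector/run.py -- hypothetical minimal airbyte-lib-compatible entry point
import json
import sys

def build_message(command: str) -> dict:
    # Stand-in for a real Airbyte protocol message (illustrative content).
    return {"type": "LOG", "log": {"level": "INFO", "message": f"ran {command}"}}

def run(args=None):
    # Read arguments from the command line, e.g. `source-my-connector spec`
    args = sys.argv[1:] if args is None else args
    command = args[0] if args else "spec"
    # Output messages to stdout, one JSON document per line
    sys.stdout.write(json.dumps(build_message(command)) + "\n")

if __name__ == "__main__":
    run()
```

The `source-my-connector = "my_connector.run:run"` entry point from the example above would then point at this `run` function.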
airbyte-protocol-models | airbyte-protocol: declares the Airbyte Protocol.

**Key Files**
- `airbyte_protocol.yaml` - declares the Airbyte Protocol (in JSONSchema)
- `io.airbyte.protocol.models` - this package contains various Java helpers for working with the protocol. |
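Because the protocol is declared in JSONSchema, consumers typically validate incoming messages against it. A rough, self-contained sketch of that idea follows; the "schema" below is a drastically simplified stand-in, not the real contents of `airbyte_protocol.yaml`, and a real consumer would use a proper JSONSchema validator library:

```python
import json

# Drastically simplified stand-in for one message type from the protocol.
RECORD_SCHEMA = {"required": ["type", "record"], "type_value": "RECORD"}

def is_valid_record_message(message: dict) -> bool:
    # Check required keys and the type discriminator, the way a JSONSchema
    # validator would enforce `required` and a const/enum on `type`.
    if any(key not in message for key in RECORD_SCHEMA["required"]):
        return False
    return message["type"] == RECORD_SCHEMA["type_value"]

msg = json.loads('{"type": "RECORD", "record": {"stream": "users", "data": {"id": 1}}}')
print(is_valid_record_message(msg))  # → True
```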
airbyte-serverless | Airbyte made simple🔍️ What is AirbyteServerless?AirbyteServerless is a simple tool tomanage Airbyte connectors, run themlocallyor deploy them inserverlessmode.💡 Why AirbyteServerless?Airbyteis a must-have in your data-stack with itscatalog of open-source connectors to move your data from any source to your data-warehouse.To manage these connectors, Airbyte offersAirbyte-Open-Source-Platformwhich includes a server, workers, database, UI, orchestrator, connectors, secret manager, logs manager, etc.AirbyteServerless aims at offeringa lightweight alternativeto Airbyte-Open-Source-Platform to simplify connectors management.📝 Comparing Airbyte-Open-Source-Platform & AirbyteServerlessAirbyte-Open-Source-PlatformAirbyteServerlessHas a UIHas NO UIConnections configurations are managed by documented yaml filesHas a databaseHas NO database- Configurations files are versioned in git- The destination stores thestate(thecheckpointof where sync stops) andlogswhich can then be visualized with your preferred BI toolHas a transform layerAirbyte loads your data in a raw format but then enables you to perform basic transform such as replace, upsert,schema normalizationHas NO transform layer- Data is appended in your destination in raw format.-airbyte_serverlessis dedicated to do one thing and do it well:Extract-Load.NOT Serverless- Can be deployed on a VM or Kubernetes Cluster.- The platform is made of tens of dependent containers that you CANNOT deploy with serverlessServerless- An Airbyte source docker image is upgraded with a destination connector- The upgraded docker image can then be deployed as an isolatedCloud Run Job(orCloud Run Service)- Cloud Run is natively monitored with metrics, dashboards, logs, error reporting, alerting, etc- It can be scheduled or triggered by eventsIs scalable with conditionsScalable if deployed on autoscaled Kubernetes Cluster and if you are skilled enough.👉Check that you are skilled enough with Kubernetes by watchingthis 
video😁.Is scalableEach connector is deployed independently of each other. You can have as many as you want.💥 Getting Started withabsCLIabsis the CLI (command-line-interface) of AirbyteServerless which facilitates connectors management.Installabs🛠️pipinstallairbyte-serverlessCreate your first Connection 👨💻abscreatemy_first_connection--source="airbyte/source-faker:0.1.4"--destination="bigquery"--remote-runner"cloud_run_job"Docker is required. Make sure you have it installed.sourceparam can be any Public Docker Airbyte Source (hereis the list). We recomend that you use faker source to get started.destinationparam must be one of the following:print(default value if not set)bigquerycontributions are welcome to offer more destinations🤗remote-runnerparam must becloud_run_job. More integrations will come in the future. This remote-runner is only used if you want to run the connection on a remote runner and schedule it.The command will create a configuration file./connections/my_first_connection.yamlwith initialized configuration.Update this configuration file to suit your needs.Run it! ⚡absrunmy_first_connectionThis will launch an Extract-Load Job from the source to the destination.Theruncommmand will only work if you have correctly edited./connections/my_first_connection.yamlconfiguration file.If you chosebigquerydestination, you must:havegcloudinstalled on your machine with default credentials initialized with the commandgcloud auth application-default login.have correctly edited thedestinationsection of./connections/my_first_connection.yamlconfiguration file. You must havedataEditorpermission on the chosen BigQuery dataset.Data is always appended at destination (not replaced nor upserted). It will be in raw format.If the connector supports incremental extract (extract only new or recently modified data) then this mode is chosen.Select only some streams 🧛🏼You may not want to copy all the data that the source can get. 
To see all available streams, run:

abs list-available-streams my_first_connection

If you want to configure your connection with only some of these streams, run:

abs set-streams my_first_connection "stream1,stream2"

The next run executions will extract the selected streams only.

Handle Secrets 🔒

For security reasons, you do NOT want to store secrets such as API tokens in your yaml files. Instead, add your secrets to Google Secret Manager by following this documentation. Then you can add the secret resource name in the yaml file as below:

source:
  docker_image: "..."
  config:
    api_token: GCP_SECRET({SECRET_RESOURCE_NAME})

Replace {SECRET_RESOURCE_NAME} with your secret resource name, which must have the format projects/{PROJECT_ID}/secrets/{SECRET_ID}/versions/{SECRET_VERSION}. To get this path: go to the Secret Manager page in the Google Cloud console; on the Secret Manager page, click on the name of a secret; on the Secret details page, in the Versions table, locate the secret version to access; in the Actions column, click on the three dots; then click 'Copy Resource Name' in the menu.

Run from the Remote Runner 🚀

abs remote-run my_first_connection

The remote-run command will only work if you have correctly edited the ./connections/my_first_connection.yaml configuration file, including the remote_runner part. This command will launch an Extract-Load Job like the abs run command.
The main difference is that the command will run on a remotely deployed container (we use Cloud Run Job as the only container runner for now). If you chose the bigquery destination, the service account you put in the service_account field of the remote_runner section of the yaml must be bigquery.dataEditor on the target dataset and have permission to create BigQuery jobs in the project. If your yaml config contains some Google Secrets, that service account must also have read access to the secrets.

Schedule the run from the Remote Runner ⏱️

abs schedule-remote-run my_first_connection "0 * * * *"

⚠️ THIS IS NOT IMPLEMENTED YET

Get help 📙

$ abs --help
Usage: abs [OPTIONS] COMMAND [ARGS]...

Options:
  --help  Show this message and exit.

Commands:
  create                  Create CONNECTION
  list                    List created connections
  list-available-streams  List available streams of CONNECTION
  remote-run              Run CONNECTION Extract-Load Job from remote runner
  run                     Run CONNECTION Extract-Load Job
  run-env-vars            Run Extract-Load Job configured by environment...
  set-streams             Set STREAMS to retrieve for CONNECTION (STREAMS...

Keep in touch 🧑💻

Join our Slack for any question, to get help with getting started, to report a bug, to suggest improvements, or simply if you want to have a chat 🙂.

👋 Contribute

Any contribution is more than welcome 🤗! Add a ⭐ on the repo to show your support. Join our Slack and talk with us. Raise an issue to report a bug or suggest improvements. Open a PR! Below are some suggestions of work to be done: implement a scheduler; implement the get_logs method of BigQueryDestination; enable updating the Cloud Run job instead of deleting/creating it when it already exists; add a new destination connector (Cloud Storage?); add more remote runners such as compute instances; implement VPC access; implement optional post-processing (replace or upsert data at the destination instead of appending?).

🏆 Credits

Big kudos to Airbyte for all the hard work on connectors! The generation of the sample connector configuration in yaml is heavily inspired by the code of the octavia CLI developed by Airbyte. |
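Putting the pieces above together, a filled-in ./connections/my_first_connection.yaml might look roughly like the sketch below. Only the source.docker_image / source.config keys and the GCP_SECRET(...) placeholder come from the snippets above; every other field name and value is an illustrative assumption, not the exact schema that abs create generates.

```yaml
# Hypothetical sketch of ./connections/my_first_connection.yaml.
# Field names other than source.docker_image / source.config are assumptions.
source:
  docker_image: "airbyte/source-faker:0.1.4"
  config:
    api_token: GCP_SECRET(projects/my-project/secrets/faker-token/versions/1)  # resolved from Secret Manager
  streams: "stream1,stream2"   # selection made with `abs set-streams`
destination:
  connector: "bigquery"
  config:
    dataset: "my_dataset"      # the caller needs dataEditor permission here
remote_runner:
  type: "cloud_run_job"
  service_account: "runner@my-project.iam.gserviceaccount.com"  # see the permission notes above
```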
airbyte-source-activecampaign | Activecampaign Source

This is the repository for the Activecampaign configuration-based source connector.
For information about how to use this connector within Airbyte, see the documentation.

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_activecampaign/spec.yaml file.
Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information.
See integration_tests/sample_config.json for a sample config file. If you are an Airbyte core member, copy the credentials in Lastpass under the secret name source activecampaign test creds and place them into secrets/config.json.

Via airbyte-ci (recommended):

airbyte-ci connectors --name=source-activecampaign build

An image will be built with the tag airbyte/source-activecampaign:dev.

Via docker build:

docker build -t airbyte/source-activecampaign:dev .

Then run any of the connector commands as follows:

docker run --rm airbyte/source-activecampaign:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-activecampaign:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-activecampaign:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-activecampaign:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json

You can run our full test suite locally using airbyte-ci:

airbyte-ci connectors --name=source-activecampaign test

Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py.

All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups: dependencies required for your connector to work go in the MAIN_REQUIREMENTS list, and dependencies required for testing go in the TEST_REQUIREMENTS list.

You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what? Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-activecampaign test. Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors. Make sure the metadata.yaml content is up to date. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/activecampaign.md). Create a Pull Request: use our PR naming conventions. Pat yourself on the back for being an awesome contributor. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
airbyte-source-adjust | Adjust Source

This is the repository for the Adjust source connector, written in Python.
For information about how to use this connector within Airbyte, see the documentation.

To iterate on this connector, make sure to complete this prerequisites section. From this connector directory, create a virtual environment: python -m venv .venv. This will generate a virtualenv for this module in .venv/. Make sure this venv is active in your development environment of choice. To activate it from the terminal, run:

source .venv/bin/activate
pip install -r requirements.txt
pip install '.[tests]'

If you are in an IDE, follow your IDE's instructions to activate the virtualenv. Note that while we are installing dependencies from requirements.txt, you should only edit setup.py for your dependencies. requirements.txt is used for editable installs (pip install -e) to pull in Python dependencies from the monorepo and will call setup.py. If this is mumbo jumbo to you, don't worry about it: just put your deps in setup.py but install using pip install -r requirements.txt and everything should work as you expect.

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_adjust/spec.yaml file.
Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information.
See integration_tests/sample_config.json for a sample config file. If you are an Airbyte core member, copy the credentials in Lastpass under the secret name source adjust test creds and place them into secrets/config.json.

python main.py spec
python main.py check --config secrets/config.json
python main.py discover --config secrets/config.json
python main.py read --config secrets/config.json --catalog integration_tests/configured_catalog.json

Via airbyte-ci (recommended):

airbyte-ci connectors --name=source-adjust build

An image will be built with the tag airbyte/source-adjust:dev.

Via docker build:

docker build -t airbyte/source-adjust:dev .

Then run any of the connector commands as follows:

docker run --rm airbyte/source-adjust:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-adjust:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-adjust:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-adjust:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json

You can run our full test suite locally using airbyte-ci:

airbyte-ci connectors --name=source-adjust test

Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py.

All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups: dependencies required for your connector to work go in the MAIN_REQUIREMENTS list, and dependencies required for testing go in the TEST_REQUIREMENTS list. |
airbyte-source-aha | Aha Source

This is the repository for the Aha configuration-based source connector.
For information about how to use this connector within Airbyte, see the documentation.

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_aha/spec.yaml file.
Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information.
See integration_tests/sample_config.json for a sample config file. If you are an Airbyte core member, copy the credentials in Lastpass under the secret name source aha test creds and place them into secrets/config.json.

Via airbyte-ci (recommended):

airbyte-ci connectors --name=source-aha build

An image will be built with the tag airbyte/source-aha:dev.

Via docker build:

docker build -t airbyte/source-aha:dev .

Then run any of the connector commands as follows:

docker run --rm airbyte/source-aha:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-aha:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-aha:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-aha:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json

You can run our full test suite locally using airbyte-ci:

airbyte-ci connectors --name=source-aha test

Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py.

All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups: dependencies required for your connector to work go in the MAIN_REQUIREMENTS list, and dependencies required for testing go in the TEST_REQUIREMENTS list.

You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what? Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-aha test. Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors. Make sure the metadata.yaml content is up to date. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/aha.md). Create a Pull Request: use our PR naming conventions. Pat yourself on the back for being an awesome contributor. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
airbyte-source-aircall | Aircall Source

This is the repository for the Aircall configuration-based source connector.
For information about how to use this connector within Airbyte, see the documentation.

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_aircall/spec.yaml file.
Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information.
See integration_tests/sample_config.json for a sample config file. If you are an Airbyte core member, copy the credentials in Lastpass under the secret name source aircall test creds and place them into secrets/config.json.

Via airbyte-ci (recommended):

airbyte-ci connectors --name=source-aircall build

An image will be built with the tag airbyte/source-aircall:dev.

Via docker build:

docker build -t airbyte/source-aircall:dev .

Then run any of the connector commands as follows:

docker run --rm airbyte/source-aircall:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-aircall:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-aircall:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-aircall:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json

You can run our full test suite locally using airbyte-ci:

airbyte-ci connectors --name=source-aircall test

Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py.

All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups: dependencies required for your connector to work go in the MAIN_REQUIREMENTS list, and dependencies required for testing go in the TEST_REQUIREMENTS list.

You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what? Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-aircall test. Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors. Make sure the metadata.yaml content is up to date. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/aircall.md). Create a Pull Request: use our PR naming conventions. Pat yourself on the back for being an awesome contributor. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
airbyte-source-airtable | Airtable source connector

This is the repository for the Airtable source connector, written in Python.
For information about how to use this connector within Airbyte, see the documentation.

Local development

Prerequisites
Python (~=3.9)
Poetry (~=1.7) - installation instructions here

Installing the connector

From this connector directory, run:

poetry install --with dev

Create credentials

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_airtable/spec.yaml file.
Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information.
See sample_files/sample_config.json for a sample config file.

Locally running the connector

poetry run source-airtable spec
poetry run source-airtable check --config secrets/config.json
poetry run source-airtable discover --config secrets/config.json
poetry run source-airtable read --config secrets/config.json --catalog sample_files/configured_catalog.json

Running unit tests

To run unit tests locally, from the connector directory run:

poetry run pytest unit_tests

Building the docker image

Install airbyte-ci. Run the following command to build the docker image:

airbyte-ci connectors --name=source-airtable build

An image will be available on your host with the tag airbyte/source-airtable:dev.

Running as a docker container

Then run any of the connector commands as follows:

docker run --rm airbyte/source-airtable:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-airtable:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-airtable:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-airtable:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json

Running our CI test suite

You can run our full test suite locally using airbyte-ci:

airbyte-ci connectors --name=source-airtable test

Customizing acceptance Tests

Customize the acceptance-test-config.yml file to configure acceptance tests. See Connector Acceptance Tests for more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py.

Dependency Management

All of your dependencies should be managed via Poetry. To add a new dependency, run: poetry add <package-name>. Please commit the changes to the pyproject.toml and poetry.lock files.

Publishing a new version of the connector

You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what? Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-airtable test. Bump the connector version (please follow semantic versioning for connectors): bump the dockerImageTag value in metadata.yaml and bump the version value in pyproject.toml. Make sure the metadata.yaml content is up to date. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/airtable.md). Create a Pull Request: use our PR naming conventions. Pat yourself on the back for being an awesome contributor. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. |
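For poetry-managed connectors such as this one, the version bump described above touches two files. The fragment below sketches where the version lives in pyproject.toml; the values are illustrative placeholders, not this connector's real metadata:

```toml
# pyproject.toml (fragment) -- illustrative placeholder values
[tool.poetry]
name = "source-airtable"
version = "1.0.0"   # bump this in lockstep with dockerImageTag in metadata.yaml

[tool.poetry.dependencies]
python = "^3.9"
```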
airbyte-source-alpha-vantage | Alpha Vantage Source

This is the repository for the Alpha Vantage configuration-based source connector.
For information about how to use this connector within Airbyte, see the documentation.

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_alpha_vantage/spec.yaml file.
Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information.
See integration_tests/sample_config.json for a sample config file. If you are an Airbyte core member, copy the credentials in Lastpass under the secret name source alpha-vantage test creds and place them into secrets/config.json.

Via airbyte-ci (recommended):

airbyte-ci connectors --name=source-alpha-vantage build

An image will be built with the tag airbyte/source-alpha-vantage:dev.

Via docker build:

docker build -t airbyte/source-alpha-vantage:dev .

Then run any of the connector commands as follows:

docker run --rm airbyte/source-alpha-vantage:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-alpha-vantage:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-alpha-vantage:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-alpha-vantage:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json

You can run our full test suite locally using airbyte-ci:

airbyte-ci connectors --name=source-alpha-vantage test

Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py.

All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups: dependencies required for your connector to work go in the MAIN_REQUIREMENTS list, and dependencies required for testing go in the TEST_REQUIREMENTS list.

You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what? Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-alpha-vantage test. Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors. Make sure the metadata.yaml content is up to date. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/alpha-vantage.md). Create a Pull Request: use our PR naming conventions. Pat yourself on the back for being an awesome contributor. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
airbyte-source-amazon-ads | Amazon-Ads source connector

This is the repository for the Amazon-Ads source connector, written in Python.
For information about how to use this connector within Airbyte, see the documentation.

Local development

Prerequisites
Python (~=3.9)
Poetry (~=1.7) - installation instructions here

Installing the connector

From this connector directory, run:

poetry install --with dev

Create credentials

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_amazon_ads/spec.yaml file.
Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information.
See sample_files/sample_config.json for a sample config file.

Locally running the connector

poetry run source-amazon-ads spec
poetry run source-amazon-ads check --config secrets/config.json
poetry run source-amazon-ads discover --config secrets/config.json
poetry run source-amazon-ads read --config secrets/config.json --catalog sample_files/configured_catalog.json

Running unit tests

To run unit tests locally, from the connector directory run:

poetry run pytest unit_tests

Building the docker image

Install airbyte-ci. Run the following command to build the docker image:

airbyte-ci connectors --name=source-amazon-ads build

An image will be available on your host with the tag airbyte/source-amazon-ads:dev.

Running as a docker container

Then run any of the connector commands as follows:

docker run --rm airbyte/source-amazon-ads:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-amazon-ads:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-amazon-ads:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-amazon-ads:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json

Running our CI test suite

You can run our full test suite locally using airbyte-ci:

airbyte-ci connectors --name=source-amazon-ads test

Customizing acceptance Tests

Customize the acceptance-test-config.yml file to configure acceptance tests. See Connector Acceptance Tests for more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py.

Dependency Management

All of your dependencies should be managed via Poetry. To add a new dependency, run: poetry add <package-name>. Please commit the changes to the pyproject.toml and poetry.lock files.

Publishing a new version of the connector

You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what? Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-amazon-ads test. Bump the connector version (please follow semantic versioning for connectors): bump the dockerImageTag value in metadata.yaml and bump the version value in pyproject.toml. Make sure the metadata.yaml content is up to date. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/amazon-ads.md). Create a Pull Request: use our PR naming conventions. Pat yourself on the back for being an awesome contributor. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. |
airbyte-source-amazon-seller-partner | Amazon Seller-Partner Source

This is the repository for the Amazon Seller-Partner source connector, written in Python.
For information about how to use this connector within Airbyte, see the documentation.

To iterate on this connector, make sure to complete this prerequisites section. From this connector directory, create a virtual environment: python -m venv .venv. This will generate a virtualenv for this module in .venv/. Make sure this venv is active in your development environment of choice. To activate it from the terminal, run:

source .venv/bin/activate
pip install -r requirements.txt

If you are in an IDE, follow your IDE's instructions to activate the virtualenv. Note that while we are installing dependencies from requirements.txt, you should only edit setup.py for your dependencies. requirements.txt is used for editable installs (pip install -e) to pull in Python dependencies from the monorepo and will call setup.py. If this is mumbo jumbo to you, don't worry about it: just put your deps in setup.py but install using pip install -r requirements.txt and everything should work as you expect.

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_amazon_seller-partner/integration_tests/spec.json file.
Note that the secrets directory is gitignored by default, so there is no danger of accidentally checking in sensitive information.
See integration_tests/sample_config.json for a sample config file. If you are an Airbyte core member, copy the credentials in Lastpass under the secret name source amazon-seller-partner test creds and place them into secrets/config.json.

python main.py spec
python main.py check --config secrets/config.json
python main.py discover --config secrets/config.json
python main.py read --config secrets/config.json --catalog integration_tests/configured_catalog.json

Via airbyte-ci (recommended):

airbyte-ci connectors --name=source-amazon-seller-partner build

An image will be built with the tag airbyte/source-amazon-seller-partner:dev.

Via docker build:

docker build -t airbyte/source-amazon-seller-partner:dev .

Then run any of the connector commands as follows:

docker run --rm airbyte/source-amazon-seller-partner:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-amazon-seller-partner:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-amazon-seller-partner:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-amazon-seller-partner:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json

You can run our full test suite locally using airbyte-ci:

airbyte-ci connectors --name=source-amazon-seller-partner test

Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py.

All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development. We split dependencies between two groups: dependencies required for your connector to work go in the MAIN_REQUIREMENTS list, and dependencies required for testing go in the TEST_REQUIREMENTS list.

You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what? Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-amazon-seller-partner test. Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors. Make sure the metadata.yaml content is up to date. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/amazon-seller-partner.md). Create a Pull Request: use our PR naming conventions. Pat yourself on the back for being an awesome contributor. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
airbyte-source-amazon-sqs | Amazon Sqs Source

This is the repository for the Amazon Sqs source connector, written in Python.
For information about how to use this connector within Airbyte, see the documentation.

To iterate on this connector, make sure to complete this prerequisites section. From this connector directory, create a virtual environment: python -m venv .venv. This will generate a virtualenv for this module in .venv/. Make sure this venv is active in your development environment of choice. To activate it from the terminal, run:

source .venv/bin/activate
pip install -r requirements.txt

If you are in an IDE, follow your IDE's instructions to activate the virtualenv. Note that while we are installing dependencies from requirements.txt, you should only edit setup.py for your dependencies. requirements.txt is used for editable installs (pip install -e) to pull in Python dependencies from the monorepo and will call setup.py. If this is mumbo jumbo to you, don't worry about it: just put your deps in setup.py but install using pip install -r requirements.txt and everything should work as you expect.

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_amazon_sqs/spec.json file.
Note that the secrets directory is gitignored by default, so there is no danger of accidentally checking in sensitive information.
See integration_tests/sample_config.json for a sample config file. If you are an Airbyte core member, copy the credentials in Lastpass under the secret name source amazon-sqs test creds and place them into secrets/config.json.

python main.py spec
python main.py check --config secrets/config.json
python main.py discover --config secrets/config.json
python main.py read --config secrets/config.json --catalog integration_tests/configured_catalog.json

Via airbyte-ci (recommended):

airbyte-ci connectors --name=source-amazon-sqs build

An image will be built with the tag airbyte/source-amazon-sqs:dev.

Via docker build:

docker build -t airbyte/source-amazon-sqs:dev .

Then run any of the connector commands as follows:

docker run --rm airbyte/source-amazon-sqs:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-amazon-sqs:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-amazon-sqs:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-amazon-sqs:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
You can run our full test suite locally using airbyte-ci:
airbyte-ci connectors --name=source-amazon-sqs test
Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information.
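When acceptance tests need external resources, the conventional place for setup and teardown is a session-scoped pytest fixture in integration_tests/acceptance.py. A minimal sketch — the resource handling here is a placeholder, not this connector's actual setup:

```python
# integration_tests/acceptance.py -- minimal sketch; resource handling is hypothetical.
import pytest


@pytest.fixture(scope="session", autouse=True)
def connector_setup():
    """Create any external resources the acceptance tests need, then tear them down."""
    # e.g. create a test queue or sandbox account here (placeholder)
    yield
    # e.g. delete the test resources here (placeholder)
```

Because the fixture is autouse and session-scoped, it wraps the entire acceptance-test run without the individual tests having to request it.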
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py. All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies between two groups: dependencies required for your connector to work go in the MAIN_REQUIREMENTS list, and dependencies required for testing go in the TEST_REQUIREMENTS list.
You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?
Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-amazon-sqs test
Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors.
Make sure the metadata.yaml content is up to date.
Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/amazon-sqs.md).
Create a Pull Request: use our PR naming conventions.
Pat yourself on the back for being an awesome contributor.
Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
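The MAIN_REQUIREMENTS / TEST_REQUIREMENTS split described above lives in the connector's setup.py. A sketch with illustrative package names (not this connector's actual pins):

```python
# setup.py (sketch) -- package names are illustrative, not this connector's real pins.

MAIN_REQUIREMENTS = [
    "airbyte-cdk",      # runtime dependency: required for the connector to work
]

TEST_REQUIREMENTS = [
    "pytest",           # needed only to run the tests
    "requests-mock",
]

# These lists are then passed to setuptools.setup(), roughly:
#   setup(..., install_requires=MAIN_REQUIREMENTS,
#         extras_require={"tests": TEST_REQUIREMENTS})
```

Keeping the two lists separate means `pip install .` pulls only runtime dependencies, while the test extras are installed on demand.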
airbyte-source-amplitude | Amplitude source connector. This is the repository for the Amplitude source connector, written in Python.
For information about how to use this connector within Airbyte, see the documentation.
Local development prerequisites: Python (~=3.9) and Poetry (~=1.7); Poetry installation instructions are available here.
Installing the connector: from this connector directory, run:
poetry install --with dev
Create credentials: if you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_amplitude/spec.yaml file.
Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information.
See sample_files/sample_config.json for a sample config file.
Locally running the connector:
poetry run source-amplitude spec
poetry run source-amplitude check --config secrets/config.json
poetry run source-amplitude discover --config secrets/config.json
poetry run source-amplitude read --config secrets/config.json --catalog sample_files/configured_catalog.json
Running unit tests: to run unit tests locally, from the connector directory run:
poetry run pytest unit_tests
Building the docker image: install airbyte-ci, then run the following command to build the docker image:
airbyte-ci connectors --name=source-amplitude build
An image will be available on your host with the tag airbyte/source-amplitude:dev.
Running as a docker container: run any of the connector commands as follows:
docker run --rm airbyte/source-amplitude:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-amplitude:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-amplitude:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-amplitude:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
Running our CI test suite: you can run our full test suite locally using airbyte-ci:
airbyte-ci connectors --name=source-amplitude test
Customizing acceptance tests: customize the acceptance-test-config.yml file to configure acceptance tests. See Connector Acceptance Tests for more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py.
Dependency management: all of your dependencies should be managed via Poetry.
To add a new dependency, run: poetry add <package-name>. Please commit the changes to the pyproject.toml and poetry.lock files.
Publishing a new version of the connector: you've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?
Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-amplitude test
Bump the connector version (please follow semantic versioning for connectors): bump the dockerImageTag value in metadata.yaml and bump the version value in pyproject.toml.
Make sure the metadata.yaml content is up to date.
Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/amplitude.md).
Create a Pull Request: use our PR naming conventions.
Pat yourself on the back for being an awesome contributor.
Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. |
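Every connector command above (spec, check, discover, read) writes newline-delimited AirbyteMessage JSON to stdout, possibly interleaved with plain log lines. A small helper for filtering such output by message type (a hypothetical utility, not part of the connector):

```python
import json


def collect_messages(lines, message_type):
    """Return parsed AirbyteMessages of the given type from raw stdout lines.

    Connectors may interleave plain log text with JSON messages, so
    non-JSON lines are skipped rather than treated as errors.
    """
    messages = []
    for line in lines:
        line = line.strip()
        if not line:
            continue
        try:
            msg = json.loads(line)
        except json.JSONDecodeError:
            continue  # not an AirbyteMessage; ignore
        if msg.get("type") == message_type:
            messages.append(msg)
    return messages
```

For example, piping the stdout of poetry run source-amplitude spec through this helper with message_type="SPEC" would yield the connector's specification message.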
airbyte-source-apify-dataset | Apify Dataset Source. This is the repository for the Apify Dataset configuration-based source connector.
For information about how to use this connector within Airbyte, see the documentation.
Create a Python virtual environment: virtualenv --python $(which python3.10) .venv
Source it: source .venv/bin/activate
Check the connector specification/definition: python main.py spec
Basic check (connection to the API): python main.py check --config secrets/config.json
Integration tests (read operation from the API): python main.py read --config secrets/config.json --catalog integration_tests/configured_catalog.json
If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_apify_dataset/spec.yaml file.
Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information.
See integration_tests/sample_config.json for a sample config file. If you are an Airbyte core member, copy the credentials in Lastpass under the secret name "source apify-dataset test creds" and place them into secrets/config.json.
Via airbyte-ci (recommended):
airbyte-ci connectors --name=source-apify-dataset build
An image will be built with the tag airbyte/source-apify-dataset:dev.
Via docker build:
docker build -t airbyte/source-apify-dataset:dev .
Then run any of the connector commands as follows:
docker run --rm airbyte/source-apify-dataset:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-apify-dataset:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-apify-dataset:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-apify-dataset:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
You can run our full test suite locally using airbyte-ci:
airbyte-ci connectors --name=source-apify-dataset test
Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py. All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies between two groups: dependencies required for your connector to work go in the MAIN_REQUIREMENTS list, and dependencies required for testing go in the TEST_REQUIREMENTS list.
You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?
Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-apify-dataset test
Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors.
Make sure the metadata.yaml content is up to date.
Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/apify-dataset.md).
Create a Pull Request: use our PR naming conventions.
Pat yourself on the back for being an awesome contributor.
Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
airbyte-source-appfollow | Appfollow Source. This is the repository for the Appfollow configuration-based source connector.
For information about how to use this connector within Airbyte, see the documentation. If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_appfollow/spec.yaml file.
Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information.
See integration_tests/sample_config.json for a sample config file. If you are an Airbyte core member, copy the credentials in Lastpass under the secret name "source appfollow test creds" and place them into secrets/config.json.
Via airbyte-ci (recommended):
airbyte-ci connectors --name=source-appfollow build
An image will be built with the tag airbyte/source-appfollow:dev.
Via docker build:
docker build -t airbyte/source-appfollow:dev .
Then run any of the connector commands as follows:
docker run --rm airbyte/source-appfollow:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-appfollow:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-appfollow:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-appfollow:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
You can run our full test suite locally using airbyte-ci:
airbyte-ci connectors --name=source-appfollow test
Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py. All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies between two groups: dependencies required for your connector to work go in the MAIN_REQUIREMENTS list, and dependencies required for testing go in the TEST_REQUIREMENTS list.
You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?
Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-appfollow test
Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors.
Make sure the metadata.yaml content is up to date.
Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/appfollow.md).
Create a Pull Request: use our PR naming conventions.
Pat yourself on the back for being an awesome contributor.
Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
airbyte-source-apple-search-ads | Apple Search Ads Source. This is the repository for the Apple Search Ads configuration-based source connector.
For information about how to use this connector within Airbyte, see the documentation. If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_apple_search_ads/spec.yaml file.
Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information.
See integration_tests/sample_config.json for a sample config file. If you are an Airbyte core member, copy the credentials in Lastpass under the secret name "source apple-search-ads test creds" and place them into secrets/config.json.
Via airbyte-ci (recommended):
airbyte-ci connectors --name=source-apple-search-ads build
An image will be built with the tag airbyte/source-apple-search-ads:dev.
Via docker build:
docker build -t airbyte/source-apple-search-ads:dev .
Then run any of the connector commands as follows:
docker run --rm airbyte/source-apple-search-ads:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-apple-search-ads:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-apple-search-ads:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-apple-search-ads:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
You can run our full test suite locally using airbyte-ci:
airbyte-ci connectors --name=source-apple-search-ads test
Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py. All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies between two groups: dependencies required for your connector to work go in the MAIN_REQUIREMENTS list, and dependencies required for testing go in the TEST_REQUIREMENTS list.
You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?
Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-apple-search-ads test
Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors.
Make sure the metadata.yaml content is up to date.
Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/apple-search-ads.md).
Create a Pull Request: use our PR naming conventions.
Pat yourself on the back for being an awesome contributor.
Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
airbyte-source-appsflyer | Appsflyer Source. This is the repository for the Appsflyer source connector, written in Python.
For information about how to use this connector within Airbyte, see the documentation. To iterate on this connector, make sure to complete this prerequisites section. From this connector directory, create a virtual environment:
python -m venv .venv
This will generate a virtualenv for this module in .venv/. Make sure this venv is active in your
development environment of choice. To activate it from the terminal, run:
source .venv/bin/activate
pip install -r requirements.txt
If you are in an IDE, follow your IDE's instructions to activate the virtualenv. Note that while we are installing dependencies from requirements.txt, you should only edit setup.py for your dependencies. requirements.txt is
used for editable installs (pip install -e) to pull in Python dependencies from the monorepo and will call setup.py.
If this is mumbo jumbo to you, don't worry about it: just put your deps in setup.py but install using pip install -r requirements.txt, and everything
should work as you expect. If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_appsflyer/spec.json file.
Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information.
See integration_tests/sample_config.json for a sample config file. If you are an Airbyte core member, copy the credentials in Lastpass under the secret name "source appsflyer test creds" and place them into secrets/config.json.
python main.py spec
python main.py check --config secrets/config.json
python main.py discover --config secrets/config.json
python main.py read --config secrets/config.json --catalog integration_tests/configured_catalog.json
Via airbyte-ci (recommended):
airbyte-ci connectors --name=source-appsflyer build
An image will be built with the tag airbyte/source-appsflyer:dev.
Via docker build:
docker build -t airbyte/source-appsflyer:dev .
Then run any of the connector commands as follows:
docker run --rm airbyte/source-appsflyer:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-appsflyer:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-appsflyer:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-appsflyer:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
You can run our full test suite locally using airbyte-ci:
airbyte-ci connectors --name=source-appsflyer test
Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py. All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies between two groups: dependencies required for your connector to work go in the MAIN_REQUIREMENTS list, and dependencies required for testing go in the TEST_REQUIREMENTS list.
You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?
Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-appsflyer test
Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors.
Make sure the metadata.yaml content is up to date.
Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/appsflyer.md).
Create a Pull Request: use our PR naming conventions.
Pat yourself on the back for being an awesome contributor.
Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
airbyte-source-asana | Asana Source. This is the repository for the Asana source connector, written in Python.
For information about how to use this connector within Airbyte, see the documentation. To iterate on this connector, make sure to complete this prerequisites section. From this connector directory, create a virtual environment:
python -m venv .venv
This will generate a virtualenv for this module in .venv/. Make sure this venv is active in your
development environment of choice. To activate it from the terminal, run:
source .venv/bin/activate
pip install -r requirements.txt
If you are in an IDE, follow your IDE's instructions to activate the virtualenv. Note that while we are installing dependencies from requirements.txt, you should only edit setup.py for your dependencies. requirements.txt is
used for editable installs (pip install -e) to pull in Python dependencies from the monorepo and will call setup.py.
If this is mumbo jumbo to you, don't worry about it: just put your deps in setup.py but install using pip install -r requirements.txt, and everything
should work as you expect. If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_asana/spec.json file.
Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information.
See integration_tests/sample_config.json for a sample config file. If you are an Airbyte core member, copy the credentials in Lastpass under the secret name "source asana test creds" and place them into secrets/config.json.
python main.py spec
python main.py check --config secrets/config.json
python main.py discover --config secrets/config.json
python main.py read --config secrets/config.json --catalog integration_tests/configured_catalog.json
Via airbyte-ci (recommended):
airbyte-ci connectors --name=source-asana build
An image will be built with the tag airbyte/source-asana:dev.
Via docker build:
docker build -t airbyte/source-asana:dev .
Then run any of the connector commands as follows:
docker run --rm airbyte/source-asana:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-asana:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-asana:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-asana:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
You can run our full test suite locally using airbyte-ci:
airbyte-ci connectors --name=source-asana test
Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py. All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies between two groups: dependencies required for your connector to work go in the MAIN_REQUIREMENTS list, and dependencies required for testing go in the TEST_REQUIREMENTS list.
You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?
Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-asana test
Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors.
Make sure the metadata.yaml content is up to date.
Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/asana.md).
Create a Pull Request: use our PR naming conventions.
Pat yourself on the back for being an awesome contributor.
Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
airbyte-source-ashby | Ashby Source. This is the repository for the Ashby configuration-based source connector.
For information about how to use this connector within Airbyte, see the documentation. If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_ashby/spec.yaml file.
Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information.
See integration_tests/sample_config.json for a sample config file. If you are an Airbyte core member, copy the credentials in Lastpass under the secret name "source ashby test creds" and place them into secrets/config.json.
Via airbyte-ci (recommended):
airbyte-ci connectors --name=source-ashby build
An image will be built with the tag airbyte/source-ashby:dev.
Via docker build:
docker build -t airbyte/source-ashby:dev .
Then run any of the connector commands as follows:
docker run --rm airbyte/source-ashby:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-ashby:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-ashby:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-ashby:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
You can run our full test suite locally using airbyte-ci:
airbyte-ci connectors --name=source-ashby test
Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py. All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies between two groups: dependencies required for your connector to work go in the MAIN_REQUIREMENTS list, and dependencies required for testing go in the TEST_REQUIREMENTS list.
You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?
Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-ashby test
Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors.
Make sure the metadata.yaml content is up to date.
Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/ashby.md).
Create a Pull Request: use our PR naming conventions.
Pat yourself on the back for being an awesome contributor.
Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
airbyte-source-auth0 | Auth0 Source. This is the repository for the Auth0 configuration-based source connector.
For information about how to use this connector within Airbyte, see the documentation. If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_auth0/spec.yaml file.
Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information.
See integration_tests/sample_config.json for a sample config file. If you are an Airbyte core member, copy the credentials in Lastpass under the secret name "source auth0 test creds" and place them into secrets/config.json.
Via airbyte-ci (recommended):
airbyte-ci connectors --name=source-auth0 build
An image will be built with the tag airbyte/source-auth0:dev.
Via docker build:
docker build -t airbyte/source-auth0:dev .
Then run any of the connector commands as follows:
docker run --rm airbyte/source-auth0:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-auth0:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-auth0:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-auth0:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
You can run our full test suite locally using airbyte-ci:
airbyte-ci connectors --name=source-auth0 test
Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py. All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies between two groups: dependencies required for your connector to work go in the MAIN_REQUIREMENTS list, and dependencies required for testing go in the TEST_REQUIREMENTS list.
You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?
Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-auth0 test
Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors.
Make sure the metadata.yaml content is up to date.
Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/auth0.md).
Create a Pull Request: use our PR naming conventions.
Pat yourself on the back for being an awesome contributor.
Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
airbyte-source-aws-cloudtrail | Aws Cloudtrail Source. This is the repository for the Aws Cloudtrail source connector, written in Python.
For information about how to use this connector within Airbyte, see the documentation. To iterate on this connector, make sure to complete this prerequisites section. From this connector directory, create a virtual environment:
python -m venv .venv
This will generate a virtualenv for this module in .venv/. Make sure this venv is active in your
development environment of choice. To activate it from the terminal, run:
source .venv/bin/activate
pip install -r requirements.txt
If you are in an IDE, follow your IDE's instructions to activate the virtualenv. Note that while we are installing dependencies from requirements.txt, you should only edit setup.py for your dependencies. requirements.txt is
used for editable installs (pip install -e) to pull in Python dependencies from the monorepo and will call setup.py.
If this is mumbo jumbo to you, don't worry about it: just put your deps in setup.py but install using pip install -r requirements.txt, and everything
should work as you expect. If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_aws_cloudtrail/spec.json file.
Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information.
See integration_tests/sample_config.json for a sample config file. If you are an Airbyte core member, copy the credentials in Lastpass under the secret name "source aws-cloudtrail test creds" and place them into secrets/config.json.
python main.py spec
python main.py check --config secrets/config.json
python main.py discover --config secrets/config.json
python main.py read --config secrets/config.json --catalog integration_tests/configured_catalog.json
Via airbyte-ci (recommended):
airbyte-ci connectors --name=source-aws-cloudtrail build
An image will be built with the tag airbyte/source-aws-cloudtrail:dev.
Via docker build:
docker build -t airbyte/source-aws-cloudtrail:dev .
Then run any of the connector commands as follows:
docker run --rm airbyte/source-aws-cloudtrail:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-aws-cloudtrail:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-aws-cloudtrail:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-aws-cloudtrail:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
You can run our full test suite locally using airbyte-ci:
airbyte-ci connectors --name=source-aws-cloudtrail test
Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information.
If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py.All of your dependencies should go insetup.py, NOTrequirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies between two groups: dependencies required for your connector to work go in the MAIN_REQUIREMENTS list; dependencies required for testing go in the TEST_REQUIREMENTS list. You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what? Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-aws-cloudtrail test. Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors. Make sure the metadata.yaml content is up to date. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/aws-cloudtrail.md). Create a Pull Request: use our PR naming conventions. Pat yourself on the back for being an awesome contributor. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
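The MAIN_REQUIREMENTS/TEST_REQUIREMENTS split described above can be sketched as follows; the package names and version pins are illustrative placeholders, not this connector's actual dependencies:

```python
# Hypothetical sketch of the dependency split in a connector's setup.py.
# The package names below are placeholders, not the connector's real deps.

# Runtime dependencies: everything the connector needs to run.
MAIN_REQUIREMENTS = ["airbyte-cdk", "requests"]

# Test-only dependencies: installed via the "tests" extra.
TEST_REQUIREMENTS = ["pytest~=6.2", "requests-mock"]

# setup.py would hand these to setuptools.setup(), roughly:
#   setup(..., install_requires=MAIN_REQUIREMENTS,
#         extras_require={"tests": TEST_REQUIREMENTS})
setup_kwargs = {
    "install_requires": MAIN_REQUIREMENTS,
    "extras_require": {"tests": TEST_REQUIREMENTS},
}
```

With this layout, a plain install pulls in only MAIN_REQUIREMENTS, while installing the "tests" extra also brings in TEST_REQUIREMENTS.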
airbyte-source-azure-blob-storage | Azure Blob Storage SourceThis is the repository for the Azure Blob Storage source connector, written in Python.
For information about how to use this connector within Airbyte, seethe documentation.To iterate on this connector, make sure to complete this prerequisites section.From this connector directory, create a virtual environment:python -m venv .venvThis will generate a virtualenv for this module in.venv/. Make sure this venv is active in your
development environment of choice. To activate it from the terminal, run:source .venv/bin/activate
pip install -r requirements.txtIf you are in an IDE, follow your IDE's instructions to activate the virtualenv.Note that while we are installing dependencies fromrequirements.txt, you should only editsetup.pyfor your dependencies.requirements.txtis
used for editable installs (pip install -e) to pull in Python dependencies from the monorepo and will callsetup.py.
If this is mumbo jumbo to you, don't worry about it, just put your deps insetup.pybut install usingpip install -r requirements.txtand everything
should work as you expect.If you are a community contributor, follow the instructions in thedocumentationto generate the necessary credentials. Then create a filesecrets/config.jsonconforming to thesource_azure_blob_storage/spec.yamlfile.
Note that thesecretsdirectory is gitignored by default, so there is no danger of accidentally checking in sensitive information.
Seeintegration_tests/sample_config.jsonfor a sample config file.If you are an Airbyte core member, copy the credentials in Lastpass under the secret namesource azure-blob-storage test credsand place them intosecrets/config.json.python main.py spec
python main.py check --config secrets/config.json
python main.py discover --config secrets/config.json
python main.py read --config secrets/config.json --catalog integration_tests/configured_catalog.jsonThe Airbyte way of building this connector is to use ourairbyte-citool.
You can follow install instructionshere.
Then running the following command will build your connector: airbyte-ci connectors --name=source-azure-blob-storage build. Once the command is done, you will find your connector image in your local docker registry: airbyte/source-azure-blob-storage:dev. When contributing to our connector, you might need to customize the build process to add a system dependency or set an env var.
You can customize our build process by adding abuild_customization.pymodule to your connector.
This module should contain pre_connector_install and post_connector_install async functions that will mutate the base image and the connector container, respectively.
It will be imported at runtime by our build process and the functions will be called if they exist. Here is an example of a build_customization.py module:

```python
from __future__ import annotations

from typing import TYPE_CHECKING

if TYPE_CHECKING:
    from dagger import Container


async def pre_connector_install(base_image_container: Container) -> Container:
    return await base_image_container.with_env_variable("MY_PRE_BUILD_ENV_VAR", "my_pre_build_env_var_value")


async def post_connector_install(connector_container: Container) -> Container:
    return await connector_container.with_env_variable("MY_POST_BUILD_ENV_VAR", "my_post_build_env_var_value")
```

This connector is built using our dynamic build process in airbyte-ci.
The base image used to build it is defined within the metadata.yaml file under theconnectorBuildOptions.
The build logic is defined usingDaggerhere.
It does not rely on a Dockerfile. If you would like to patch our connector and build your own, a simple approach would be to: Create your own Dockerfile based on the latest version of the connector image.

```dockerfile
FROM airbyte/source-azure-blob-storage:latest
COPY . ./airbyte/integration_code
RUN pip install ./airbyte/integration_code
```

Please use this as an example. This is not optimized. Build your image: docker build -t airbyte/source-azure-blob-storage:dev .
docker run airbyte/source-azure-blob-storage:dev spec
Then run any of the connector commands as follows: docker run --rm airbyte/source-azure-blob-storage:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-azure-blob-storage:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-azure-blob-storage:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-azure-blob-storage:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.jsonYou can run our full test suite locally usingairbyte-ci:airbyte-ciconnectors--name=source-azure-blob-storagetestCustomizeacceptance-test-config.ymlfile to configure tests. SeeConnector Acceptance Testsfor more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py. All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies between two groups: dependencies required for your connector to work go in the MAIN_REQUIREMENTS list; dependencies required for testing go in the TEST_REQUIREMENTS list. You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what? Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-azure-blob-storage test. Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors. Make sure the metadata.yaml content is up to date. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/azure-blob-storage.md). Create a Pull Request: use our PR naming conventions. Pat yourself on the back for being an awesome contributor. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
airbyte-source-azure-table | Azure Table SourceThis is the repository for the Azure Table source connector, written in Python.
For information about how to use this connector within Airbyte, seethe documentation.To iterate on this connector, make sure to complete this prerequisites section.From this connector directory, create a virtual environment:python -m venv .venvThis will generate a virtualenv for this module in.venv/. Make sure this venv is active in your
development environment of choice. To activate it from the terminal, run:source .venv/bin/activate
pip install -r requirements.txtIf you are in an IDE, follow your IDE's instructions to activate the virtualenv.Note that while we are installing dependencies fromrequirements.txt, you should only editsetup.pyfor your dependencies.requirements.txtis
used for editable installs (pip install -e) to pull in Python dependencies from the monorepo and will callsetup.py.
If this is mumbo jumbo to you, don't worry about it, just put your deps insetup.pybut install usingpip install -r requirements.txtand everything
should work as you expect.If you are a community contributor, follow the instructions in thedocumentationto generate the necessary credentials. Then create a filesecrets/config.jsonconforming to thesource_azure_table/spec.jsonfile.
Note that thesecretsdirectory is gitignored by default, so there is no danger of accidentally checking in sensitive information.
Seeintegration_tests/sample_config.jsonfor a sample config file.If you are an Airbyte core member, copy the credentials in Lastpass under the secret namesource azure-table test credsand place them intosecrets/config.json.python main.py spec
python main.py check --config secrets/config.json
python main.py discover --config secrets/config.json
python main.py read --config secrets/config.json --catalog integration_tests/configured_catalog.json --state integration_tests/state.jsonViaairbyte-ci(recommended):airbyte-ciconnectors--name=source-azure-tablebuildAn image will be built with the tagairbyte/source-azure-table:dev.Viadocker build:dockerbuild-tairbyte/source-azure-table:dev.Then run any of the connector commands as follows:docker run --rm airbyte/source-azure-table:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-azure-table:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-azure-table:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-azure-table:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json --state /integration_tests/state.jsonYou can run our full test suite locally usingairbyte-ci:airbyte-ciconnectors--name=source-azure-tabletestCustomizeacceptance-test-config.ymlfile to configure tests. SeeConnector Acceptance Testsfor more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py. All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies between two groups: dependencies required for your connector to work go in the MAIN_REQUIREMENTS list; dependencies required for testing go in the TEST_REQUIREMENTS list. You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what? Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-azure-table test. Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors. Make sure the metadata.yaml content is up to date. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/azure-table.md). Create a Pull Request: use our PR naming conventions. Pat yourself on the back for being an awesome contributor. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
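The --state file passed to this connector's read command holds its incremental checkpoint. A minimal sketch of building one, assuming a hypothetical stream name and cursor field (the real shape depends on the connector's streams):

```python
import json

# Hypothetical incremental state: one entry per stream, keyed by stream name.
# "my_table" and "Timestamp" are illustrative placeholders, not real names.
state = {"my_table": {"Timestamp": "2023-01-01T00:00:00Z"}}

# Serialize it the way a file such as integration_tests/state.json would store it.
serialized = json.dumps(state, indent=2)
print(serialized)
```

Passing such a file via --state lets the read command resume from the recorded cursor value instead of syncing from the beginning.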
airbyte-source-babelforce | Babelforce SourceThis is the repository for the Babelforce configuration based source connector.
For information about how to use this connector within Airbyte, seethe documentation.If you are a community contributor, follow the instructions in thedocumentationto generate the necessary credentials. Then create a filesecrets/config.jsonconforming to thesource_babelforce/spec.yamlfile.
Note that any directory namedsecretsis gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information.
Seeintegration_tests/sample_config.jsonfor a sample config file.If you are an Airbyte core member, copy the credentials in Lastpass under the secret namesource babelforce test credsand place them intosecrets/config.json.Viaairbyte-ci(recommended):airbyte-ciconnectors--name=source-babelforcebuildAn image will be built with the tagairbyte/source-babelforce:dev.Viadocker build:dockerbuild-tairbyte/source-babelforce:dev.Then run any of the connector commands as follows:docker run --rm airbyte/source-babelforce:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-babelforce:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-babelforce:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-babelforce:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.jsonYou can run our full test suite locally usingairbyte-ci:airbyte-ciconnectors--name=source-babelforcetestCustomizeacceptance-test-config.ymlfile to configure tests. SeeConnector Acceptance Testsfor more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py. All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies between two groups: dependencies required for your connector to work go in the MAIN_REQUIREMENTS list; dependencies required for testing go in the TEST_REQUIREMENTS list. You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what? Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-babelforce test. Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors. Make sure the metadata.yaml content is up to date. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/babelforce.md). Create a Pull Request: use our PR naming conventions. Pat yourself on the back for being an awesome contributor. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
airbyte-source-bamboo-hr | Bamboo Hr SourceThis is the repository for the Bamboo Hr source connector, written in Python.
For information about how to use this connector within Airbyte, seethe documentation.To iterate on this connector, make sure to complete this prerequisites section.From this connector directory, create a virtual environment:python -m venv .venvThis will generate a virtualenv for this module in.venv/. Make sure this venv is active in your
development environment of choice. To activate it from the terminal, run:source .venv/bin/activate
pip install -r requirements.txtIf you are in an IDE, follow your IDE's instructions to activate the virtualenv.Note that while we are installing dependencies fromrequirements.txt, you should only editsetup.pyfor your dependencies.requirements.txtis
used for editable installs (pip install -e) to pull in Python dependencies from the monorepo and will callsetup.py.
If this is mumbo jumbo to you, don't worry about it, just put your deps insetup.pybut install usingpip install -r requirements.txtand everything
should work as you expect.If you are a community contributor, follow the instructions in thedocumentationto generate the necessary credentials. Then create a filesecrets/config.jsonconforming to thesource_bamboo_hr/spec.jsonfile.
Note that thesecretsdirectory is gitignored by default, so there is no danger of accidentally checking in sensitive information.
Seeintegration_tests/sample_config.jsonfor a sample config file.If you are an Airbyte core member, copy the credentials in Lastpass under the secret namesource bamboo-hr test credsand place them intosecrets/config.json.python main.py spec
python main.py check --config secrets/config.json
python main.py discover --config secrets/config.json
python main.py read --config secrets/config.json --catalog integration_tests/configured_catalog.jsonViaairbyte-ci(recommended):airbyte-ciconnectors--name=source-bamboo-hrbuildAn image will be built with the tagairbyte/source-bamboo-hr:dev.Viadocker build:dockerbuild-tairbyte/source-bamboo-hr:dev.Then run any of the connector commands as follows:docker run --rm airbyte/source-bamboo-hr:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-bamboo-hr:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-bamboo-hr:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-bamboo-hr:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.jsonYou can run our full test suite locally usingairbyte-ci:airbyte-ciconnectors--name=source-bamboo-hrtestCustomizeacceptance-test-config.ymlfile to configure tests. SeeConnector Acceptance Testsfor more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py. All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies between two groups: dependencies required for your connector to work go in the MAIN_REQUIREMENTS list; dependencies required for testing go in the TEST_REQUIREMENTS list. You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what? Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-bamboo-hr test. Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors. Make sure the metadata.yaml content is up to date. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/bamboo-hr.md). Create a Pull Request: use our PR naming conventions. Pat yourself on the back for being an awesome contributor. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
airbyte-source-bigcommerce | Bigcommerce SourceThis is the repository for the Bigcommerce configuration based source connector.
For information about how to use this connector within Airbyte, seethe documentation.If you are a community contributor, follow the instructions in thedocumentationto generate the necessary credentials. Then create a filesecrets/config.jsonconforming to thesource_bigcommerce/spec.yamlfile.
Note that any directory namedsecretsis gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information.
Seeintegration_tests/sample_config.jsonfor a sample config file.If you are an Airbyte core member, copy the credentials in Lastpass under the secret namesource bigcommerce test credsand place them intosecrets/config.json.Viaairbyte-ci(recommended):airbyte-ciconnectors--name=source-bigcommercebuildAn image will be built with the tagairbyte/source-bigcommerce:dev.Viadocker build:dockerbuild-tairbyte/source-bigcommerce:dev.Then run any of the connector commands as follows:docker run --rm airbyte/source-bigcommerce:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-bigcommerce:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-bigcommerce:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-bigcommerce:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.jsonYou can run our full test suite locally usingairbyte-ci:airbyte-ciconnectors--name=source-bigcommercetestCustomizeacceptance-test-config.ymlfile to configure tests. SeeConnector Acceptance Testsfor more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py. All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies between two groups: dependencies required for your connector to work go in the MAIN_REQUIREMENTS list; dependencies required for testing go in the TEST_REQUIREMENTS list. You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what? Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-bigcommerce test. Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors. Make sure the metadata.yaml content is up to date. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/bigcommerce.md). Create a Pull Request: use our PR naming conventions. Pat yourself on the back for being an awesome contributor. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
airbyte-source-bing-ads | Bing-Ads source connectorThis is the repository for the Bing-Ads source connector, written in Python.
For information about how to use this connector within Airbyte, seethe documentation.Local developmentPrerequisitesPython (~=3.9)Poetry (~=1.7) - installation instructionshereInstalling the connectorFrom this connector directory, run:poetryinstall--withdevCreate credentialsIf you are a community contributor, follow the instructions in thedocumentationto generate the necessary credentials. Then create a filesecrets/config.jsonconforming to thesource_bing_ads/spec.yamlfile.
Note that any directory namedsecretsis gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information.
Seesample_files/sample_config.jsonfor a sample config file.Locally running the connectorpoetry run source-bing-ads spec
poetry run source-bing-ads check --config secrets/config.json
poetry run source-bing-ads discover --config secrets/config.json
poetry run source-bing-ads read --config secrets/config.json --catalog sample_files/configured_catalog.jsonRunning unit testsTo run unit tests locally, from the connector directory run:poetry run pytest unit_testsBuilding the docker imageInstallairbyte-ciRun the following command to build the docker image:airbyte-ciconnectors--name=source-bing-adsbuildAn image will be available on your host with the tagairbyte/source-bing-ads:dev.Running as a docker containerThen run any of the connector commands as follows:docker run --rm airbyte/source-bing-ads:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-bing-ads:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-bing-ads:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-bing-ads:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.jsonRunning our CI test suiteYou can run our full test suite locally usingairbyte-ci:airbyte-ciconnectors--name=source-bing-adstestCustomizing acceptance TestsCustomizeacceptance-test-config.ymlfile to configure acceptance tests. SeeConnector Acceptance Testsfor more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py. Dependency Management: All of your dependencies should be managed via Poetry.
To add a new dependency, run: poetry add <package-name>. Please commit the changes to the pyproject.toml and poetry.lock files. Publishing a new version of the connector: You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what? Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-bing-ads test. Bump the connector version (please follow semantic versioning for connectors): bump the dockerImageTag value in metadata.yaml and bump the version value in pyproject.toml. Make sure the metadata.yaml content is up to date. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/bing-ads.md). Create a Pull Request: use our PR naming conventions. Pat yourself on the back for being an awesome contributor. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. |
airbyte-source-braintree | Braintree SourceThis is the repository for the Braintree source connector, written in Python.
For information about how to use this connector within Airbyte, seethe documentation.To iterate on this connector, make sure to complete this prerequisites section.From this connector directory, create a virtual environment:python -m venv .venvThis will generate a virtualenv for this module in.venv/. Make sure this venv is active in your
development environment of choice. To activate it from the terminal, run:source .venv/bin/activate
pip install -r requirements.txtIf you are in an IDE, follow your IDE's instructions to activate the virtualenv.Note that while we are installing dependencies fromrequirements.txt, you should only editsetup.pyfor your dependencies.requirements.txtis
used for editable installs (pip install -e) to pull in Python dependencies from the monorepo and will callsetup.py.
If this is mumbo jumbo to you, don't worry about it, just put your deps insetup.pybut install usingpip install -r requirements.txtand everything
should work as you expect.If you are a community contributor, follow the instructions in thedocumentationto generate the necessary credentials. Then create a filesecrets/config.jsonconforming to thesource_braintree/spec.jsonfile.
Note that thesecretsdirectory is gitignored by default, so there is no danger of accidentally checking in sensitive information.
Seeintegration_tests/sample_config.jsonfor a sample config file.If you are an Airbyte core member, copy the credentials in Lastpass under the secret namesource braintree test credsand place them intosecrets/config.json.python main.py spec
python main.py check --config secrets/config.json
python main.py discover --config secrets/config.json
python main.py read --config secrets/config.json --catalog integration_tests/configured_catalog.jsonViaairbyte-ci(recommended):airbyte-ciconnectors--name=source-braintreebuildAn image will be built with the tagairbyte/source-braintree:dev.Viadocker build:dockerbuild-tairbyte/source-braintree:dev.Then run any of the connector commands as follows:docker run --rm airbyte/source-braintree:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-braintree:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-braintree:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-braintree:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.jsonYou can run our full test suite locally usingairbyte-ci:airbyte-ciconnectors--name=source-braintreetestCustomizeacceptance-test-config.ymlfile to configure tests. SeeConnector Acceptance Testsfor more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py. All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies between two groups: dependencies required for your connector to work go in the MAIN_REQUIREMENTS list; dependencies required for testing go in the TEST_REQUIREMENTS list. You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what? Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-braintree test. Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors. Make sure the metadata.yaml content is up to date. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/braintree.md). Create a Pull Request: use our PR naming conventions. Pat yourself on the back for being an awesome contributor. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
airbyte-source-braze | Braze SourceThis is the repository for the Braze configuration based source connector.
For information about how to use this connector within Airbyte, seethe documentation.If you are a community contributor, follow the instructions in thedocumentationto generate the necessary credentials. Then create a filesecrets/config.jsonconforming to thesource_braze/spec.yamlfile.
Note that any directory namedsecretsis gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information.
Seeintegration_tests/sample_config.jsonfor a sample config file.If you are an Airbyte core member, copy the credentials in Lastpass under the secret namesource braze test credsand place them intosecrets/config.json.Viaairbyte-ci(recommended):airbyte-ciconnectors--name=source-brazebuildAn image will be built with the tagairbyte/source-braze:dev.Viadocker build:dockerbuild-tairbyte/source-braze:dev.Then run any of the connector commands as follows:docker run --rm airbyte/source-braze:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-braze:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-braze:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-braze:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.jsonYou can run our full test suite locally usingairbyte-ci:airbyte-ciconnectors--name=source-brazetestCustomizeacceptance-test-config.ymlfile to configure tests. SeeConnector Acceptance Testsfor more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py. All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies between two groups: dependencies required for your connector to work go in the MAIN_REQUIREMENTS list; dependencies required for testing go in the TEST_REQUIREMENTS list. You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what? Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-braze test. Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors. Make sure the metadata.yaml content is up to date. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/braze.md). Create a Pull Request: use our PR naming conventions. Pat yourself on the back for being an awesome contributor. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
airbyte-source-breezometer | Breezometer Source. This is the repository for the Breezometer configuration-based source connector.
For information about how to use this connector within Airbyte, see the documentation. If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_breezometer/spec.yaml file.
Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information.
See integration_tests/sample_config.json for a sample config file. If you are an Airbyte core member, copy the credentials in Lastpass under the secret name source breezometer test creds and place them into secrets/config.json. Via airbyte-ci (recommended): airbyte-ci connectors --name=source-breezometer build. An image will be built with the tag airbyte/source-breezometer:dev. Via docker build: docker build -t airbyte/source-breezometer:dev . Then run any of the connector commands as follows: docker run --rm airbyte/source-breezometer:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-breezometer:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-breezometer:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-breezometer:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json. You can run our full test suite locally using airbyte-ci: airbyte-ci connectors --name=source-breezometer test. Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py. All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies between two groups: dependencies required for your connector to work go in the MAIN_REQUIREMENTS list, and dependencies required for testing go in the TEST_REQUIREMENTS list. You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what? Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-breezometer test. Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors. Make sure the metadata.yaml content is up to date. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/breezometer.md). Create a Pull Request: use our PR naming conventions. Pat yourself on the back for being an awesome contributor. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
airbyte-source-callrail | Callrail Source. This is the repository for the Callrail configuration-based source connector.
For information about how to use this connector within Airbyte, see the documentation. If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_callrail/spec.yaml file.
Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information.
See integration_tests/sample_config.json for a sample config file. If you are an Airbyte core member, copy the credentials in Lastpass under the secret name source callrail test creds and place them into secrets/config.json. Via airbyte-ci (recommended): airbyte-ci connectors --name=source-callrail build. An image will be built with the tag airbyte/source-callrail:dev. Via docker build: docker build -t airbyte/source-callrail:dev . Then run any of the connector commands as follows: docker run --rm airbyte/source-callrail:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-callrail:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-callrail:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-callrail:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json. You can run our full test suite locally using airbyte-ci: airbyte-ci connectors --name=source-callrail test. Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py. All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies between two groups: dependencies required for your connector to work go in the MAIN_REQUIREMENTS list, and dependencies required for testing go in the TEST_REQUIREMENTS list. You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what? Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-callrail test. Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors. Make sure the metadata.yaml content is up to date. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/callrail.md). Create a Pull Request: use our PR naming conventions. Pat yourself on the back for being an awesome contributor. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
airbyte-source-captain-data | Captain Data Source. This is the repository for the Captain Data configuration-based source connector.
For information about how to use this connector within Airbyte, see the documentation. If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_captain_data/spec.yaml file.
Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information.
See integration_tests/sample_config.json for a sample config file. If you are an Airbyte core member, copy the credentials in Lastpass under the secret name source captain-data test creds and place them into secrets/config.json. Via airbyte-ci (recommended): airbyte-ci connectors --name=source-captain-data build. An image will be built with the tag airbyte/source-captain-data:dev. Via docker build: docker build -t airbyte/source-captain-data:dev . Then run any of the connector commands as follows: docker run --rm airbyte/source-captain-data:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-captain-data:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-captain-data:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-captain-data:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json. You can run our full test suite locally using airbyte-ci: airbyte-ci connectors --name=source-captain-data test. Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py. All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies between two groups: dependencies required for your connector to work go in the MAIN_REQUIREMENTS list, and dependencies required for testing go in the TEST_REQUIREMENTS list. You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what? Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-captain-data test. Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors. Make sure the metadata.yaml content is up to date. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/captain-data.md). Create a Pull Request: use our PR naming conventions. Pat yourself on the back for being an awesome contributor. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
airbyte-source-cart | Cart.com Source. This is the repository for the Cart.com source connector, written in Python.
For information about how to use this connector within Airbyte, see the documentation. To iterate on this connector, make sure to complete this prerequisites section. From this connector directory, create a virtual environment: python -m venv .venv. This will generate a virtualenv for this module in .venv/. Make sure this venv is active in your
development environment of choice. To activate it from the terminal, run: source .venv/bin/activate
pip install -r requirements.txt. If you are in an IDE, follow your IDE's instructions to activate the virtualenv. Note that while we are installing dependencies from requirements.txt, you should only edit setup.py for your dependencies. requirements.txt is
used for editable installs (pip install -e) to pull in Python dependencies from the monorepo and will call setup.py.
If this is mumbo jumbo to you, don't worry about it, just put your deps in setup.py but install using pip install -r requirements.txt and everything
should work as you expect. If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_cart/spec.json file.
Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information.
See integration_tests/sample_config.json for a sample config file. If you are an Airbyte core member, copy the credentials in Lastpass under the secret name source cart test creds and place them into secrets/config.json. python main.py spec
python main.py check --config secrets/config.json
python main.py discover --config secrets/config.json
python main.py read --config secrets/config.json --catalog integration_tests/configured_catalog.json. Via airbyte-ci (recommended): airbyte-ci connectors --name=source-cart build. An image will be built with the tag airbyte/source-cart:dev. Via docker build: docker build -t airbyte/source-cart:dev . Then run any of the connector commands as follows: docker run --rm airbyte/source-cart:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-cart:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-cart:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-cart:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json. You can run our full test suite locally using airbyte-ci: airbyte-ci connectors --name=source-cart test. Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py. All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies between two groups: dependencies required for your connector to work go in the MAIN_REQUIREMENTS list, and dependencies required for testing go in the TEST_REQUIREMENTS list. You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what? Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-cart test. Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors. Make sure the metadata.yaml content is up to date. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/cart.md). Create a Pull Request: use our PR naming conventions. Pat yourself on the back for being an awesome contributor. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
airbyte-source-chargebee | Chargebee source connector. This is the repository for the Chargebee source connector, written in Python.
For information about how to use this connector within Airbyte, see the documentation. Local development prerequisites: Python (~=3.9) and Poetry (~=1.7); installation instructions here. To install the connector, from this connector directory run: poetry install --with dev. To create credentials: if you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_chargebee/spec.yaml file.
Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information.
See sample_files/sample_config.json for a sample config file. To run the connector locally: poetry run source-chargebee spec
poetry run source-chargebee check --config secrets/config.json
poetry run source-chargebee discover --config secrets/config.json
poetry run source-chargebee read --config secrets/config.json --catalog sample_files/configured_catalog.json. To run unit tests locally, from the connector directory run: poetry run pytest unit_tests. To build the docker image, install airbyte-ci and run: airbyte-ci connectors --name=source-chargebee build. An image will be available on your host with the tag airbyte/source-chargebee:dev. To run as a docker container, run any of the connector commands as follows: docker run --rm airbyte/source-chargebee:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-chargebee:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-chargebee:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-chargebee:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json. You can run our full CI test suite locally using airbyte-ci: airbyte-ci connectors --name=source-chargebee test. Customize the acceptance-test-config.yml file to configure acceptance tests. See Connector Acceptance Tests for more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py. Dependency management: all of your dependencies should be managed via Poetry.
To add a new dependency, run: poetry add <package-name>. Please commit the changes to the pyproject.toml and poetry.lock files. Publishing a new version of the connector: you've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what? Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-chargebee test. Bump the connector version (please follow semantic versioning for connectors): bump the dockerImageTag value in metadata.yaml and bump the version value in pyproject.toml. Make sure the metadata.yaml content is up to date. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/chargebee.md). Create a Pull Request: use our PR naming conventions. Pat yourself on the back for being an awesome contributor. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. |
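For Poetry-managed connectors like this one, the dependency layout that poetry install --with dev expects lives in pyproject.toml. A hypothetical fragment, where the package name, description, and version bounds are illustrative, not this connector's real values:

```toml
# Hypothetical pyproject.toml fragment; names and version bounds are illustrative.
[tool.poetry]
name = "airbyte-source-example"
version = "0.1.0"
description = "Source implementation for Example."

[tool.poetry.dependencies]
python = "^3.9"
airbyte-cdk = "^0.52"

[tool.poetry.group.dev.dependencies]
pytest = "^6.2"
requests-mock = "^1.9"
```

Runtime dependencies live under [tool.poetry.dependencies]; test-only tools go in the dev group, which is pulled in by the --with dev flag.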
airbyte-source-chargify | Chargify Source. This is the repository for the Chargify configuration-based source connector.
For information about how to use this connector within Airbyte, see the documentation. If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_chargify/spec.yaml file.
Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information.
See integration_tests/sample_config.json for a sample config file. If you are an Airbyte core member, copy the credentials in Lastpass under the secret name source chargify test creds and place them into secrets/config.json. Via airbyte-ci (recommended): airbyte-ci connectors --name=source-chargify build. An image will be built with the tag airbyte/source-chargify:dev. Via docker build: docker build -t airbyte/source-chargify:dev . Then run any of the connector commands as follows: docker run --rm airbyte/source-chargify:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-chargify:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-chargify:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-chargify:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json. You can run our full test suite locally using airbyte-ci: airbyte-ci connectors --name=source-chargify test. Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py. All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies between two groups: dependencies required for your connector to work go in the MAIN_REQUIREMENTS list, and dependencies required for testing go in the TEST_REQUIREMENTS list. You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what? Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-chargify test. Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors. Make sure the metadata.yaml content is up to date. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/chargify.md). Create a Pull Request: use our PR naming conventions. Pat yourself on the back for being an awesome contributor. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
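The acceptance-test-config.yml referenced above follows the Connector Acceptance Tests format. A hypothetical minimal sketch, where the connector image and all paths are placeholders:

```yaml
# Hypothetical minimal acceptance-test-config.yml; image and paths are placeholders.
connector_image: airbyte/source-example:dev
acceptance_tests:
  spec:
    tests:
      - spec_path: "source_example/spec.yaml"
  connection:
    tests:
      - config_path: "secrets/config.json"
        status: "succeed"
  discovery:
    tests:
      - config_path: "secrets/config.json"
  basic_read:
    tests:
      - config_path: "secrets/config.json"
        configured_catalog_path: "integration_tests/configured_catalog.json"
```

Each top-level key under acceptance_tests enables one suite (spec, connection, discovery, basic_read, ...); omitting a key skips that suite for the connector.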
airbyte-source-chartmogul | Chartmogul Source. This is the repository for the Chartmogul source connector, written in Python.
For information about how to use this connector within Airbyte, see the documentation. To iterate on this connector, make sure to complete this prerequisites section. From this connector directory, create a virtual environment: python3 -m venv .venv. This will generate a virtualenv for this module in .venv/. Make sure this venv is active in your
development environment of choice. To activate it from the terminal, run: source .venv/bin/activate
pip install -r requirements.txt
pip install '.[tests]'. If you are in an IDE, follow your IDE's instructions to activate the virtualenv. Note that while we are installing dependencies from requirements.txt, you should only edit setup.py for your dependencies. requirements.txt is
used for editable installs (pip install -e) to pull in Python dependencies from the monorepo and will call setup.py.
If this is mumbo jumbo to you, don't worry about it, just put your deps in setup.py but install using pip install -r requirements.txt and everything
should work as you expect. If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_chartmogul/spec.json file.
Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information.
See integration_tests/sample_config.json for a sample config file. If you are an Airbyte core member, copy the credentials in Lastpass under the secret name source chartmogul test creds and place them into secrets/config.json. python main.py spec
python main.py check --config secrets/config.json
python main.py discover --config secrets/config.json
python main.py read --config secrets/config.json --catalog integration_tests/configured_catalog.json. Via airbyte-ci (recommended): airbyte-ci connectors --name=source-chartmogul build. An image will be built with the tag airbyte/source-chartmogul:dev. Via docker build: docker build -t airbyte/source-chartmogul:dev . Then run any of the connector commands as follows: docker run --rm airbyte/source-chartmogul:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-chartmogul:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-chartmogul:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-chartmogul:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json. You can run our full test suite locally using airbyte-ci: airbyte-ci connectors --name=source-chartmogul test. Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py. All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies between two groups: dependencies required for your connector to work go in the MAIN_REQUIREMENTS list, and dependencies required for testing go in the TEST_REQUIREMENTS list. You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what? Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-chartmogul test. Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors. Make sure the metadata.yaml content is up to date. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/chartmogul.md). Create a Pull Request: use our PR naming conventions. Pat yourself on the back for being an awesome contributor. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
airbyte-source-clickup-api | Clickup Api Source. This is the repository for the Clickup Api configuration-based source connector.
For information about how to use this connector within Airbyte, see the documentation. If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_clickup_api/spec.yaml file.
Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information.
See integration_tests/sample_config.json for a sample config file. If you are an Airbyte core member, copy the credentials in Lastpass under the secret name source clickup-api test creds and place them into secrets/config.json. Via airbyte-ci (recommended): airbyte-ci connectors --name=source-clickup-api build. An image will be built with the tag airbyte/source-clickup-api:dev. Via docker build: docker build -t airbyte/source-clickup-api:dev . Then run any of the connector commands as follows: docker run --rm airbyte/source-clickup-api:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-clickup-api:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-clickup-api:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-clickup-api:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json. You can run our full test suite locally using airbyte-ci: airbyte-ci connectors --name=source-clickup-api test. Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py. All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies between two groups: dependencies required for your connector to work go in the MAIN_REQUIREMENTS list, and dependencies required for testing go in the TEST_REQUIREMENTS list. You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what? Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-clickup-api test. Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors. Make sure the metadata.yaml content is up to date. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/clickup-api.md). Create a Pull Request: use our PR naming conventions. Pat yourself on the back for being an awesome contributor. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
airbyte-source-clockify | Clockify Source. This is the repository for the Clockify configuration-based source connector.
For information about how to use this connector within Airbyte, see the documentation. If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_clockify/spec.yaml file.
Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information.
See integration_tests/sample_config.json for a sample config file. If you are an Airbyte core member, copy the credentials in Lastpass under the secret name source clockify test creds and place them into secrets/config.json. Via airbyte-ci (recommended): airbyte-ci connectors --name=source-clockify build. An image will be built with the tag airbyte/source-clockify:dev. Via docker build: docker build -t airbyte/source-clockify:dev . Then run any of the connector commands as follows: docker run --rm airbyte/source-clockify:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-clockify:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-clockify:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-clockify:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json. You can run our full test suite locally using airbyte-ci: airbyte-ci connectors --name=source-clockify test. Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py. All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies between two groups: dependencies required for your connector to work go in the MAIN_REQUIREMENTS list, and dependencies required for testing go in the TEST_REQUIREMENTS list. You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what? Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-clockify test. Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors. Make sure the metadata.yaml content is up to date. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/clockify.md). Create a Pull Request: use our PR naming conventions. Pat yourself on the back for being an awesome contributor. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
airbyte-source-close-com | Close.com Source. This is the repository for the Close.com source connector, written in Python.
For information about how to use this connector within Airbyte, see the documentation. To iterate on this connector, make sure to complete this prerequisites section. From this connector directory, create a virtual environment: python -m venv .venv. This will generate a virtualenv for this module in .venv/. Make sure this venv is active in your
development environment of choice. To activate it from the terminal, run: source .venv/bin/activate
pip install -r requirements.txt. If you are in an IDE, follow your IDE's instructions to activate the virtualenv. Note that while we are installing dependencies from requirements.txt, you should only edit setup.py for your dependencies. requirements.txt is
used for editable installs (pip install -e) to pull in Python dependencies from the monorepo and will call setup.py.
If this is mumbo jumbo to you, don't worry about it, just put your deps in setup.py but install using pip install -r requirements.txt and everything
should work as you expect. If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_close_com/spec.json file.
Note that the secrets directory is gitignored by default, so there is no danger of accidentally checking in sensitive information.
See integration_tests/sample_config.json for a sample config file. If you are an Airbyte core member, copy the credentials in Lastpass under the secret name source close.com test creds and place them into secrets/config.json. python main.py spec
python main.py check --config secrets/config.json
python main.py discover --config secrets/config.json
python main.py read --config secrets/config.json --catalog integration_tests/configured_catalog.json. Via airbyte-ci (recommended): airbyte-ci connectors --name=source-close-com build. An image will be built with the tag airbyte/source-close-com:dev. Via docker build: docker build -t airbyte/source-close-com:dev . Then run any of the connector commands as follows: docker run --rm airbyte/source-close-com:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-close-com:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-close-com:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-close-com:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json. You can run our full test suite locally using airbyte-ci: airbyte-ci connectors --name=source-close-com test. Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py.

All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies into two groups:

- Dependencies required for your connector to work go in the MAIN_REQUIREMENTS list.
- Dependencies required for testing go in the TEST_REQUIREMENTS list.

You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-close-com test
2. Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors.
3. Make sure the metadata.yaml content is up to date.
4. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/close-com.md).
5. Create a Pull Request: use our PR naming conventions.
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
airbyte-source-coda | Coda Source

This is the repository for the Coda source connector, written in Python. For information about how to use this connector within Airbyte, see the documentation.

To iterate on this connector, make sure to complete this prerequisites section. From this connector directory, create a virtual environment:

python -m venv .venv

This will generate a virtualenv for this module in .venv/. Make sure this venv is active in your development environment of choice. To activate it from the terminal, run:

source .venv/bin/activate
pip install -r requirements.txt
pip install '.[tests]'

If you are in an IDE, follow your IDE's instructions to activate the virtualenv.

Note that while we are installing dependencies from requirements.txt, you should only edit setup.py for your dependencies. requirements.txt is used for editable installs (pip install -e) to pull in Python dependencies from the monorepo and will call setup.py. If this sounds like mumbo jumbo to you, don't worry about it: just put your deps in setup.py, install using pip install -r requirements.txt, and everything should work as you expect.

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_coda/spec.yaml file. Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See integration_tests/sample_config.json for a sample config file.

If you are an Airbyte core member, copy the credentials in Lastpass under the secret name source coda test creds and place them into secrets/config.json.

python main.py spec
python main.py check --config secrets/config.json
python main.py discover --config secrets/config.json
python main.py read --config secrets/config.json --catalog integration_tests/configured_catalog.json

Via airbyte-ci (recommended):

airbyte-ci connectors --name=source-coda build

An image will be built with the tag airbyte/source-coda:dev.

Via docker build:

docker build -t airbyte/source-coda:dev .

Then run any of the connector commands as follows:

docker run --rm airbyte/source-coda:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-coda:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-coda:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-coda:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json

You can run our full test suite locally using airbyte-ci:

airbyte-ci connectors --name=source-coda test

Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py.

All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies into two groups:

- Dependencies required for your connector to work go in the MAIN_REQUIREMENTS list.
- Dependencies required for testing go in the TEST_REQUIREMENTS list.

You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-coda test
2. Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors.
3. Make sure the metadata.yaml content is up to date.
4. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/coda.md).
5. Create a Pull Request: use our PR naming conventions.
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
airbyte-source-coin-api | Coin Api Source

This is the repository for the Coin Api configuration-based source connector. For information about how to use this connector within Airbyte, see the documentation.

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_coin_api/spec.yaml file. Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See integration_tests/sample_config.json for a sample config file.

If you are an Airbyte core member, copy the credentials in Lastpass under the secret name source coin-api test creds and place them into secrets/config.json.

Via airbyte-ci (recommended):

airbyte-ci connectors --name=source-coin-api build

An image will be built with the tag airbyte/source-coin-api:dev.

Via docker build:

docker build -t airbyte/source-coin-api:dev .

Then run any of the connector commands as follows:

docker run --rm airbyte/source-coin-api:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-coin-api:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-coin-api:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-coin-api:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json

You can run our full test suite locally using airbyte-ci:

airbyte-ci connectors --name=source-coin-api test

Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py.

All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies into two groups:

- Dependencies required for your connector to work go in the MAIN_REQUIREMENTS list.
- Dependencies required for testing go in the TEST_REQUIREMENTS list.

You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-coin-api test
2. Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors.
3. Make sure the metadata.yaml content is up to date.
4. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/coin-api.md).
5. Create a Pull Request: use our PR naming conventions.
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
airbyte-source-coingecko-coins | Coingecko Coins Source

This is the repository for the Coingecko Coins configuration-based source connector. For information about how to use this connector within Airbyte, see the documentation.

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_coingecko_coins/spec.yaml file. Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See integration_tests/sample_config.json for a sample config file.

If you are an Airbyte core member, copy the credentials in Lastpass under the secret name source coingecko-coins test creds and place them into secrets/config.json.

Via airbyte-ci (recommended):

airbyte-ci connectors --name=source-coingecko-coins build

An image will be built with the tag airbyte/source-coingecko-coins:dev.

Via docker build:

docker build -t airbyte/source-coingecko-coins:dev .

Then run any of the connector commands as follows:

docker run --rm airbyte/source-coingecko-coins:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-coingecko-coins:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-coingecko-coins:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-coingecko-coins:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json

You can run our full test suite locally using airbyte-ci:

airbyte-ci connectors --name=source-coingecko-coins test

Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py.

All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies into two groups:

- Dependencies required for your connector to work go in the MAIN_REQUIREMENTS list.
- Dependencies required for testing go in the TEST_REQUIREMENTS list.

You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-coingecko-coins test
2. Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors.
3. Make sure the metadata.yaml content is up to date.
4. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/coingecko-coins.md).
5. Create a Pull Request: use our PR naming conventions.
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
airbyte-source-coinmarketcap | Coinmarketcap Source

This is the repository for the Coinmarketcap configuration-based source connector. For information about how to use this connector within Airbyte, see the documentation.

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_coinmarketcap/spec.yaml file. Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See integration_tests/sample_config.json for a sample config file.

If you are an Airbyte core member, copy the credentials in Lastpass under the secret name source coinmarketcap test creds and place them into secrets/config.json.

Via airbyte-ci (recommended):

airbyte-ci connectors --name=source-coinmarketcap build

An image will be built with the tag airbyte/source-coinmarketcap:dev.

Via docker build:

docker build -t airbyte/source-coinmarketcap:dev .

Then run any of the connector commands as follows:

docker run --rm airbyte/source-coinmarketcap:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-coinmarketcap:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-coinmarketcap:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-coinmarketcap:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json

You can run our full test suite locally using airbyte-ci:

airbyte-ci connectors --name=source-coinmarketcap test

Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py.

All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies into two groups:

- Dependencies required for your connector to work go in the MAIN_REQUIREMENTS list.
- Dependencies required for testing go in the TEST_REQUIREMENTS list.

You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-coinmarketcap test
2. Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors.
3. Make sure the metadata.yaml content is up to date.
4. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/coinmarketcap.md).
5. Create a Pull Request: use our PR naming conventions.
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
airbyte-source-commcare | Commcare Source

This is the repository for the Commcare source connector, written in Python. For information about how to use this connector within Airbyte, see the documentation.

To iterate on this connector, make sure to complete this prerequisites section. From this connector directory, create a virtual environment:

python -m venv .venv

This will generate a virtualenv for this module in .venv/. Make sure this venv is active in your development environment of choice. To activate it from the terminal, run:

source .venv/bin/activate
pip install -r requirements.txt
pip install '.[tests]'

If you are in an IDE, follow your IDE's instructions to activate the virtualenv.

Note that while we are installing dependencies from requirements.txt, you should only edit setup.py for your dependencies. requirements.txt is used for editable installs (pip install -e) to pull in Python dependencies from the monorepo and will call setup.py. If this sounds like mumbo jumbo to you, don't worry about it: just put your deps in setup.py, install using pip install -r requirements.txt, and everything should work as you expect.

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_commcare/spec.yaml file. Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See integration_tests/sample_config.json for a sample config file.

If you are an Airbyte core member, copy the credentials in Lastpass under the secret name source commcare test creds and place them into secrets/config.json.

python main.py spec
python main.py check --config secrets/config.json
python main.py discover --config secrets/config.json
python main.py read --config secrets/config.json --catalog integration_tests/configured_catalog.json

Via airbyte-ci (recommended):

airbyte-ci connectors --name=source-commcare build

An image will be built with the tag airbyte/source-commcare:dev.

Via docker build:

docker build -t airbyte/source-commcare:dev .

Then run any of the connector commands as follows:

docker run --rm airbyte/source-commcare:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-commcare:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-commcare:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-commcare:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json

You can run our full test suite locally using airbyte-ci:

airbyte-ci connectors --name=source-commcare test

Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py.

All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies into two groups:

- Dependencies required for your connector to work go in the MAIN_REQUIREMENTS list.
- Dependencies required for testing go in the TEST_REQUIREMENTS list.

You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-commcare test
2. Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors.
3. Make sure the metadata.yaml content is up to date.
4. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/commcare.md).
5. Create a Pull Request: use our PR naming conventions.
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
airbyte-source-commercetools | Commercetools Source

This is the repository for the Commercetools configuration-based source connector. For information about how to use this connector within Airbyte, see the documentation.

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_commercetools/spec.yaml file. Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See integration_tests/sample_config.json for a sample config file.

If you are an Airbyte core member, copy the credentials in Lastpass under the secret name source commercetools test creds and place them into secrets/config.json.

Via airbyte-ci (recommended):

airbyte-ci connectors --name=source-commercetools build

An image will be built with the tag airbyte/source-commercetools:dev.

Via docker build:

docker build -t airbyte/source-commercetools:dev .

Then run any of the connector commands as follows:

docker run --rm airbyte/source-commercetools:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-commercetools:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-commercetools:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-commercetools:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json

You can run our full test suite locally using airbyte-ci:

airbyte-ci connectors --name=source-commercetools test

Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py.

All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies into two groups:

- Dependencies required for your connector to work go in the MAIN_REQUIREMENTS list.
- Dependencies required for testing go in the TEST_REQUIREMENTS list.

You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-commercetools test
2. Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors.
3. Make sure the metadata.yaml content is up to date.
4. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/commercetools.md).
5. Create a Pull Request: use our PR naming conventions.
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
airbyte-source-configcat | Configcat Source

This is the repository for the Configcat configuration-based source connector. For information about how to use this connector within Airbyte, see the documentation.

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_configcat/spec.yaml file. Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See integration_tests/sample_config.json for a sample config file.

If you are an Airbyte core member, copy the credentials in Lastpass under the secret name source configcat test creds and place them into secrets/config.json.

Via airbyte-ci (recommended):

airbyte-ci connectors --name=source-configcat build

An image will be built with the tag airbyte/source-configcat:dev.

Via docker build:

docker build -t airbyte/source-configcat:dev .

Then run any of the connector commands as follows:

docker run --rm airbyte/source-configcat:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-configcat:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-configcat:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-configcat:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json

You can run our full test suite locally using airbyte-ci:

airbyte-ci connectors --name=source-configcat test

Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py.

All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies into two groups:

- Dependencies required for your connector to work go in the MAIN_REQUIREMENTS list.
- Dependencies required for testing go in the TEST_REQUIREMENTS list.

You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-configcat test
2. Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors.
3. Make sure the metadata.yaml content is up to date.
4. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/configcat.md).
5. Create a Pull Request: use our PR naming conventions.
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
airbyte-source-confluence | Confluence Source

This is the repository for the Confluence configuration-based source connector. For information about how to use this connector within Airbyte, see the documentation.

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_confluence/spec.yaml file. Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See integration_tests/sample_config.json for a sample config file.

If you are an Airbyte core member, copy the credentials in Lastpass under the secret name source confluence test creds and place them into secrets/config.json.

Via airbyte-ci (recommended):

airbyte-ci connectors --name=source-confluence build

An image will be built with the tag airbyte/source-confluence:dev.

Via docker build:

docker build -t airbyte/source-confluence:dev .

Then run any of the connector commands as follows:

docker run --rm airbyte/source-confluence:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-confluence:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-confluence:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-confluence:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json

You can run our full test suite locally using airbyte-ci:

airbyte-ci connectors --name=source-confluence test

Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py.

All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies into two groups:

- Dependencies required for your connector to work go in the MAIN_REQUIREMENTS list.
- Dependencies required for testing go in the TEST_REQUIREMENTS list.

You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-confluence test
2. Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors.
3. Make sure the metadata.yaml content is up to date.
4. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/confluence.md).
5. Create a Pull Request: use our PR naming conventions.
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
airbyte-source-convertkit | Convertkit Source

This is the repository for the Convertkit configuration-based source connector. For information about how to use this connector within Airbyte, see the documentation.

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_convertkit/spec.yaml file. Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See integration_tests/sample_config.json for a sample config file.

If you are an Airbyte core member, copy the credentials in Lastpass under the secret name source convertkit test creds and place them into secrets/config.json.

Via airbyte-ci (recommended):

airbyte-ci connectors --name=source-convertkit build

An image will be built with the tag airbyte/source-convertkit:dev.

Via docker build:

docker build -t airbyte/source-convertkit:dev .

Then run any of the connector commands as follows:

docker run --rm airbyte/source-convertkit:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-convertkit:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-convertkit:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-convertkit:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json

You can run our full test suite locally using airbyte-ci:

airbyte-ci connectors --name=source-convertkit test

Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py.

All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies into two groups:

- Dependencies required for your connector to work go in the MAIN_REQUIREMENTS list.
- Dependencies required for testing go in the TEST_REQUIREMENTS list.

You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-convertkit test
2. Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors.
3. Make sure the metadata.yaml content is up to date.
4. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/convertkit.md).
5. Create a Pull Request: use our PR naming conventions.
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
airbyte-source-convex | Convex Source

This is the repository for the Convex source connector, written in Python. For information about how to use this connector within Airbyte, see the documentation.

To iterate on this connector, make sure to complete this prerequisites section. From this connector directory, create a virtual environment:

python -m venv .venv

This will generate a virtualenv for this module in .venv/. Make sure this venv is active in your development environment of choice. To activate it from the terminal, run:

source .venv/bin/activate
pip install -r requirements.txt
pip install '.[tests]'

If you are in an IDE, follow your IDE's instructions to activate the virtualenv.

Note that while we are installing dependencies from requirements.txt, you should only edit setup.py for your dependencies. requirements.txt is used for editable installs (pip install -e) to pull in Python dependencies from the monorepo and will call setup.py. If this sounds like mumbo jumbo to you, don't worry about it: just put your deps in setup.py, install using pip install -r requirements.txt, and everything should work as you expect.

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_convex/spec.yaml file. Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See integration_tests/sample_config.json for a sample config file.

If you are an Airbyte core member, copy the credentials in Lastpass under the secret name source convex test creds and place them into secrets/config.json.

python main.py spec
python main.py check --config secrets/config.json
python main.py discover --config secrets/config.json
python main.py read --config secrets/config.json --catalog integration_tests/configured_catalog.json

Via airbyte-ci (recommended):

airbyte-ci connectors --name=source-convex build

An image will be built with the tag airbyte/source-convex:dev.

Via docker build:

docker build -t airbyte/source-convex:dev .

Then run any of the connector commands as follows:

docker run --rm airbyte/source-convex:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-convex:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-convex:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-convex:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json

You can run our full test suite locally using airbyte-ci:

airbyte-ci connectors --name=source-convex test

Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information.
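The spec/check/discover/read commands above all speak the Airbyte protocol: newline-delimited JSON messages on stdout, each with a "type" field (LOG, RECORD, STATE, ...). A minimal sketch of pulling record data out of captured read output; the sample lines are invented:

```python
# Filter RECORD messages out of captured connector stdout. The Airbyte
# protocol emits one JSON message per line; the sample output below is
# invented for illustration.
import json

raw_output = """\
{"type": "LOG", "log": {"level": "INFO", "message": "starting sync"}}
{"type": "RECORD", "record": {"stream": "users", "data": {"id": 1}, "emitted_at": 0}}
{"type": "STATE", "state": {"data": {"cursor": "2024-01-01"}}}
"""

messages = [json.loads(line) for line in raw_output.splitlines() if line.strip()]
records = [m["record"]["data"] for m in messages if m["type"] == "RECORD"]
print(records)  # [{'id': 1}]
```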
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py.

All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies into two groups:

- Dependencies required for your connector to work go in the MAIN_REQUIREMENTS list.
- Dependencies required for testing go in the TEST_REQUIREMENTS list.

You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what?

- Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-convex test
- Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors.
- Make sure the metadata.yaml content is up to date.
- Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/convex.md).
- Create a Pull Request: use our PR naming conventions.
- Pat yourself on the back for being an awesome contributor.
- Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
airbyte-source-copper | Copper Source

This is the repository for the Copper configuration-based source connector. For information about how to use this connector within Airbyte, see the documentation.

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_copper/spec.yaml file. Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See integration_tests/sample_config.json for a sample config file. If you are an Airbyte core member, copy the credentials in Lastpass under the secret name "source copper test creds" and place them into secrets/config.json.

Via airbyte-ci (recommended):

airbyte-ci connectors --name=source-copper build

An image will be built with the tag airbyte/source-copper:dev.

Via docker build:

docker build -t airbyte/source-copper:dev .

Then run any of the connector commands as follows:

docker run --rm airbyte/source-copper:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-copper:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-copper:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-copper:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json

You can run our full test suite locally using airbyte-ci:

airbyte-ci connectors --name=source-copper test

Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py.

All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
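As a sketch of the dependency split this section describes, a connector's setup.py typically exposes the two lists like this; the package names are illustrative, not this connector's actual pins:

```python
# Illustrative shape of a connector's setup.py dependency split.
# Package names are examples only, not source-copper's real requirements.
MAIN_REQUIREMENTS = ["airbyte-cdk", "requests"]  # needed at runtime
TEST_REQUIREMENTS = ["pytest", "requests-mock"]  # needed only for tests

def setup_kwargs():
    # These kwargs would be passed to setuptools.setup(...) in setup.py,
    # making test deps installable via: pip install '.[tests]'
    return {
        "install_requires": MAIN_REQUIREMENTS,
        "extras_require": {"tests": TEST_REQUIREMENTS},
    }

print(setup_kwargs()["extras_require"]["tests"])  # ['pytest', 'requests-mock']
```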
We split dependencies into two groups:

- Dependencies required for your connector to work go in the MAIN_REQUIREMENTS list.
- Dependencies required for testing go in the TEST_REQUIREMENTS list.

You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what?

- Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-copper test
- Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors.
- Make sure the metadata.yaml content is up to date.
- Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/copper.md).
- Create a Pull Request: use our PR naming conventions.
- Pat yourself on the back for being an awesome contributor.
- Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
airbyte-source-courier | Courier Source

This is the repository for the Courier configuration-based source connector. For information about how to use this connector within Airbyte, see the documentation.

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_courier/spec.yaml file. Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See integration_tests/sample_config.json for a sample config file. If you are an Airbyte core member, copy the credentials in Lastpass under the secret name "source courier test creds" and place them into secrets/config.json.

Via airbyte-ci (recommended):

airbyte-ci connectors --name=source-courier build

An image will be built with the tag airbyte/source-courier:dev.

Via docker build:

docker build -t airbyte/source-courier:dev .

Then run any of the connector commands as follows:

docker run --rm airbyte/source-courier:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-courier:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-courier:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-courier:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json

You can run our full test suite locally using airbyte-ci:

airbyte-ci connectors --name=source-courier test

Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py.

All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies into two groups:

- Dependencies required for your connector to work go in the MAIN_REQUIREMENTS list.
- Dependencies required for testing go in the TEST_REQUIREMENTS list.

You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what?

- Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-courier test
- Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors.
- Make sure the metadata.yaml content is up to date.
- Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/courier.md).
- Create a Pull Request: use our PR naming conventions.
- Pat yourself on the back for being an awesome contributor.
- Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
airbyte-source-customer-io | Customer Io Source

This is the repository for the Customer Io configuration-based source connector. For information about how to use this connector within Airbyte, see the documentation.

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_customer_io/spec.yaml file. Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See integration_tests/sample_config.json for a sample config file. If you are an Airbyte core member, copy the credentials in Lastpass under the secret name "source customer-io test creds" and place them into secrets/config.json.

Via airbyte-ci (recommended):

airbyte-ci connectors --name=source-customer-io build

An image will be built with the tag airbyte/source-customer-io:dev.

Via docker build:

docker build -t airbyte/source-customer-io:dev .

Then run any of the connector commands as follows:

docker run --rm airbyte/source-customer-io:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-customer-io:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-customer-io:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-customer-io:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json

You can run our full test suite locally using airbyte-ci:

airbyte-ci connectors --name=source-customer-io test

Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py.

All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
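Bumping the dockerImageTag in metadata.yaml before a release follows semantic versioning for connectors; a tiny illustrative helper (not part of airbyte-ci):

```python
# Illustrative semver bump for a dockerImageTag value in metadata.yaml.
# Pick the component (major/minor/patch) per semantic versioning rules.
def bump(version: str, part: str = "patch") -> str:
    major, minor, patch = (int(x) for x in version.split("."))
    if part == "major":
        return f"{major + 1}.0.0"
    if part == "minor":
        return f"{major}.{minor + 1}.0"
    return f"{major}.{minor}.{patch + 1}"

print(bump("0.2.3"))           # 0.2.4
print(bump("0.2.3", "minor"))  # 0.3.0
```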
We split dependencies into two groups:

- Dependencies required for your connector to work go in the MAIN_REQUIREMENTS list.
- Dependencies required for testing go in the TEST_REQUIREMENTS list.

You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what?

- Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-customer-io test
- Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors.
- Make sure the metadata.yaml content is up to date.
- Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/customer-io.md).
- Create a Pull Request: use our PR naming conventions.
- Pat yourself on the back for being an awesome contributor.
- Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
airbyte-source-datadog | Datadog Source

This is the repository for the Datadog configuration-based source connector. For information about how to use this connector within Airbyte, see the documentation.

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_datadog/spec.yaml file. Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See integration_tests/sample_config.json for a sample config file. If you are an Airbyte core member, copy the credentials in Lastpass under the secret name "source datadog test creds" and place them into secrets/config.json.

Via airbyte-ci (recommended):

airbyte-ci connectors --name=source-datadog build

An image will be built with the tag airbyte/source-datadog:dev.

Via docker build:

docker build -t airbyte/source-datadog:dev .

Then run any of the connector commands as follows:

docker run --rm airbyte/source-datadog:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-datadog:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-datadog:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-datadog:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json

You can run our full test suite locally using airbyte-ci:

airbyte-ci connectors --name=source-datadog test

Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py.

All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies into two groups:

- Dependencies required for your connector to work go in the MAIN_REQUIREMENTS list.
- Dependencies required for testing go in the TEST_REQUIREMENTS list.

You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what?

- Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-datadog test
- Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors.
- Make sure the metadata.yaml content is up to date.
- Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/datadog.md).
- Create a Pull Request: use our PR naming conventions.
- Pat yourself on the back for being an awesome contributor.
- Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
airbyte-source-datascope | Datascope Source

This is the repository for the Datascope configuration-based source connector. For information about how to use this connector within Airbyte, see the documentation.

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_datascope/spec.yaml file. Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See integration_tests/sample_config.json for a sample config file. If you are an Airbyte core member, copy the credentials in Lastpass under the secret name "source datascope test creds" and place them into secrets/config.json.

Via airbyte-ci (recommended):

airbyte-ci connectors --name=source-datascope build

An image will be built with the tag airbyte/source-datascope:dev.

Via docker build:

docker build -t airbyte/source-datascope:dev .

Then run any of the connector commands as follows:

docker run --rm airbyte/source-datascope:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-datascope:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-datascope:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-datascope:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json

You can run our full test suite locally using airbyte-ci:

airbyte-ci connectors --name=source-datascope test

Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py.

All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies into two groups:

- Dependencies required for your connector to work go in the MAIN_REQUIREMENTS list.
- Dependencies required for testing go in the TEST_REQUIREMENTS list.

You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what?

- Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-datascope test
- Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors.
- Make sure the metadata.yaml content is up to date.
- Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/datascope.md).
- Create a Pull Request: use our PR naming conventions.
- Pat yourself on the back for being an awesome contributor.
- Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
airbyte-source-delighted | Delighted Source

This is the repository for the Delighted source connector, written in Python. For information about how to use this connector within Airbyte, see the documentation.

To iterate on this connector, make sure to complete this prerequisites section. From this connector directory, create a virtual environment:

python -m venv .venv

This will generate a virtualenv for this module in .venv/. Make sure this venv is active in your development environment of choice. To activate it from the terminal, run:

source .venv/bin/activate
pip install -r requirements.txt

If you are in an IDE, follow your IDE's instructions to activate the virtualenv. Note that while we are installing dependencies from requirements.txt, you should only edit setup.py for your dependencies. requirements.txt is used for editable installs (pip install -e) to pull in Python dependencies from the monorepo and will call setup.py. If this is mumbo jumbo to you, don't worry about it: just put your deps in setup.py but install using pip install -r requirements.txt, and everything should work as you expect.

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_delighted/spec.json file. Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See integration_tests/sample_config.json for a sample config file. If you are an Airbyte core member, copy the credentials in Lastpass under the secret name "source delighted test creds" and place them into secrets/config.json.

python main.py spec
python main.py check --config secrets/config.json
python main.py discover --config secrets/config.json
python main.py read --config secrets/config.json --catalog integration_tests/configured_catalog.json

Via airbyte-ci (recommended):

airbyte-ci connectors --name=source-delighted build

An image will be built with the tag airbyte/source-delighted:dev.

Via docker build:

docker build -t airbyte/source-delighted:dev .

Then run any of the connector commands as follows:

docker run --rm airbyte/source-delighted:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-delighted:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-delighted:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-delighted:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json

You can run our full test suite locally using airbyte-ci:

airbyte-ci connectors --name=source-delighted test

Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information.
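The read commands above need a configured catalog; its shape follows the Airbyte protocol's ConfiguredAirbyteCatalog. A minimal, hypothetical example built in Python — the "surveys" stream name and schema are invented, and real catalogs come from discover output (see integration_tests/configured_catalog.json):

```python
# Minimal, hypothetical ConfiguredAirbyteCatalog for a read command.
# The stream name and schema are invented for illustration only.
import json

catalog = {
    "streams": [
        {
            "stream": {
                "name": "surveys",
                "json_schema": {
                    "type": "object",
                    "properties": {"id": {"type": "integer"}},
                },
                "supported_sync_modes": ["full_refresh"],
            },
            "sync_mode": "full_refresh",
            "destination_sync_mode": "overwrite",
        }
    ]
}
print(catalog["streams"][0]["stream"]["name"])  # surveys
```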
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py.

All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies into two groups:

- Dependencies required for your connector to work go in the MAIN_REQUIREMENTS list.
- Dependencies required for testing go in the TEST_REQUIREMENTS list.

You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what?

- Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-delighted test
- Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors.
- Make sure the metadata.yaml content is up to date.
- Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/delighted.md).
- Create a Pull Request: use our PR naming conventions.
- Pat yourself on the back for being an awesome contributor.
- Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
airbyte-source-dixa | Dixa Source

This is the repository for the Dixa configuration-based source connector. For information about how to use this connector within Airbyte, see the documentation.

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_dixa/spec.yaml file. Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See integration_tests/sample_config.json for a sample config file. If you are an Airbyte core member, copy the credentials in Lastpass under the secret name "source dixa test creds" and place them into secrets/config.json.

Via airbyte-ci (recommended):

airbyte-ci connectors --name=source-dixa build

An image will be built with the tag airbyte/source-dixa:dev.

Via docker build:

docker build -t airbyte/source-dixa:dev .

Then run any of the connector commands as follows:

docker run --rm airbyte/source-dixa:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-dixa:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-dixa:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-dixa:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json

You can run our full test suite locally using airbyte-ci:

airbyte-ci connectors --name=source-dixa test

Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py.

All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies into two groups:

- Dependencies required for your connector to work go in the MAIN_REQUIREMENTS list.
- Dependencies required for testing go in the TEST_REQUIREMENTS list.

You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what?

- Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-dixa test
- Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors.
- Make sure the metadata.yaml content is up to date.
- Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/dixa.md).
- Create a Pull Request: use our PR naming conventions.
- Pat yourself on the back for being an awesome contributor.
- Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
airbyte-source-dockerhub | Dockerhub Source

This is the repository for the Dockerhub configuration-based source connector. For information about how to use this connector within Airbyte, see the documentation.

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_dockerhub/spec.yaml file. Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See integration_tests/sample_config.json for a sample config file. If you are an Airbyte core member, copy the credentials in Lastpass under the secret name "source dockerhub test creds" and place them into secrets/config.json.

Via airbyte-ci (recommended):

airbyte-ci connectors --name=source-dockerhub build

An image will be built with the tag airbyte/source-dockerhub:dev.

Via docker build:

docker build -t airbyte/source-dockerhub:dev .

Then run any of the connector commands as follows:

docker run --rm airbyte/source-dockerhub:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-dockerhub:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-dockerhub:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-dockerhub:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json

You can run our full test suite locally using airbyte-ci:

airbyte-ci connectors --name=source-dockerhub test

Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py.

All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies into two groups:

- Dependencies required for your connector to work go in the MAIN_REQUIREMENTS list.
- Dependencies required for testing go in the TEST_REQUIREMENTS list.

You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what?

- Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-dockerhub test
- Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors.
- Make sure the metadata.yaml content is up to date.
- Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/dockerhub.md).
- Create a Pull Request: use our PR naming conventions.
- Pat yourself on the back for being an awesome contributor.
- Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
airbyte-source-dremio | Dremio Source

This is the repository for the Dremio configuration-based source connector. For information about how to use this connector within Airbyte, see the documentation.

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_dremio/spec.yaml file. Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See integration_tests/sample_config.json for a sample config file. If you are an Airbyte core member, copy the credentials in Lastpass under the secret name "source dremio test creds" and place them into secrets/config.json.

Via airbyte-ci (recommended):

airbyte-ci connectors --name=source-dremio build

An image will be built with the tag airbyte/source-dremio:dev.

Via docker build:

docker build -t airbyte/source-dremio:dev .

Then run any of the connector commands as follows:

docker run --rm airbyte/source-dremio:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-dremio:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-dremio:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-dremio:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json

You can run our full test suite locally using airbyte-ci:

airbyte-ci connectors --name=source-dremio test

Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py.

All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies into two groups:

- Dependencies required for your connector to work go in the MAIN_REQUIREMENTS list.
- Dependencies required for testing go in the TEST_REQUIREMENTS list.

You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what?

- Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-dremio test
- Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors.
- Make sure the metadata.yaml content is up to date.
- Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/dremio.md).
- Create a Pull Request: use our PR naming conventions.
- Pat yourself on the back for being an awesome contributor.
- Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
airbyte-source-drift | Drift Source

This is the repository for the Drift configuration-based source connector. For information about how to use this connector within Airbyte, see the documentation.

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_drift/spec.yaml file. Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See integration_tests/sample_config.json for a sample config file. If you are an Airbyte core member, copy the credentials in Lastpass under the secret name "source drift test creds" and place them into secrets/config.json.

Via airbyte-ci (recommended):

airbyte-ci connectors --name=source-drift build

An image will be built with the tag airbyte/source-drift:dev.

Via docker build:

docker build -t airbyte/source-drift:dev .

Then run any of the connector commands as follows:

docker run --rm airbyte/source-drift:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-drift:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-drift:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-drift:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json

You can run our full test suite locally using airbyte-ci:

airbyte-ci connectors --name=source-drift test

Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py.

All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies between two groups, dependencies that are:required for your connector to work need to go toMAIN_REQUIREMENTSlist.required for the testing need to go toTEST_REQUIREMENTSlistYou've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?Make sure your changes are passing our test suite:airbyte-ci connectors --name=source-drift testBump the connector version inmetadata.yaml: increment thedockerImageTagvalue. Please followsemantic versioning for connectors.Make sure themetadata.yamlcontent is up to date.Make the connector documentation and its changelog is up to date (docs/integrations/sources/drift.md).Create a Pull Request: useour PR naming conventions.Pat yourself on the back for being an awesome contributor.Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
airbyte-source-dv-360 | DISPLAY & VIDEO 360 Source. This is the repository for the DISPLAY & VIDEO 360 source connector, written in Python.
For information about how to use this connector within Airbyte, see the documentation.

To iterate on this connector, make sure to complete this prerequisites section. From this connector directory, create a virtual environment:

python -m venv .venv

This will generate a virtualenv for this module in .venv/. Make sure this venv is active in your development environment of choice. To activate it from the terminal, run:

source .venv/bin/activate
pip install -r requirements.txt

If you are in an IDE, follow your IDE's instructions to activate the virtualenv. Note that while we are installing dependencies from requirements.txt, you should only edit setup.py for your dependencies. requirements.txt is used for editable installs (pip install -e) to pull in Python dependencies from the monorepo and will call setup.py. If this is mumbo jumbo to you, don't worry about it; just put your deps in setup.py, install using pip install -r requirements.txt, and everything should work as you expect.

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_dv_360/spec.json file. Note that the secrets directory is gitignored by default, so there is no danger of accidentally checking in sensitive information. See integration_tests/sample_config.json for a sample config file. If you are an Airbyte core member, copy the credentials from Lastpass under the secret name "source dv360 test creds" and place them into secrets/config.json.

python main.py spec
python main.py check --config secrets/config.json
python main.py discover --config secrets/config.json
python main.py read --config secrets/config.json --catalog integration_tests/configured_catalog.json

Via airbyte-ci (recommended):

airbyte-ci connectors --name=source-dv-360 build

An image will be built with the tag airbyte/source-dv-360:dev.

Via docker build:

docker build -t airbyte/source-dv-360:dev .

Then run any of the connector commands as follows:

docker run --rm airbyte/source-dv-360:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-dv-360:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-dv-360:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-dv-360:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json

You can run our full test suite locally using airbyte-ci:

airbyte-ci connectors --name=source-dv-360 test

Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py. All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies into two groups:
- dependencies required for your connector to work go in the MAIN_REQUIREMENTS list.
- dependencies required for testing go in the TEST_REQUIREMENTS list.

You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-dv-360 test
2. Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors.
3. Make sure the metadata.yaml content is up to date.
4. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/dv-360.md).
5. Create a Pull Request: use our PR naming conventions.
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
airbyte-source-emailoctopus | EmailOctopus Source. This is the repository for the EmailOctopus configuration-based source connector.
For information about how to use this connector within Airbyte, see the documentation. If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_emailoctopus/spec.yaml file. Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See integration_tests/sample_config.json for a sample config file. If you are an Airbyte core member, copy the credentials from Lastpass under the secret name "source emailoctopus test creds" and place them into secrets/config.json.

Via airbyte-ci (recommended):

airbyte-ci connectors --name=source-emailoctopus build

An image will be built with the tag airbyte/source-emailoctopus:dev.

Via docker build:

docker build -t airbyte/source-emailoctopus:dev .

Then run any of the connector commands as follows:

docker run --rm airbyte/source-emailoctopus:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-emailoctopus:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-emailoctopus:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-emailoctopus:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json

You can run our full test suite locally using airbyte-ci:

airbyte-ci connectors --name=source-emailoctopus test

Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py. All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies into two groups:
- dependencies required for your connector to work go in the MAIN_REQUIREMENTS list.
- dependencies required for testing go in the TEST_REQUIREMENTS list.

You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-emailoctopus test
2. Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors.
3. Make sure the metadata.yaml content is up to date.
4. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/emailoctopus.md).
5. Create a Pull Request: use our PR naming conventions.
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
airbyte-source-everhour | Everhour Source. This is the repository for the Everhour configuration-based source connector.
For information about how to use this connector within Airbyte, see the documentation. If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_everhour/spec.yaml file. Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See integration_tests/sample_config.json for a sample config file. If you are an Airbyte core member, copy the credentials from Lastpass under the secret name "source everhour test creds" and place them into secrets/config.json.

Via airbyte-ci (recommended):

airbyte-ci connectors --name=source-everhour build

An image will be built with the tag airbyte/source-everhour:dev.

Via docker build:

docker build -t airbyte/source-everhour:dev .

Then run any of the connector commands as follows:

docker run --rm airbyte/source-everhour:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-everhour:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-everhour:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-everhour:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json

You can run our full test suite locally using airbyte-ci:

airbyte-ci connectors --name=source-everhour test

Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py. All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies into two groups:
- dependencies required for your connector to work go in the MAIN_REQUIREMENTS list.
- dependencies required for testing go in the TEST_REQUIREMENTS list.

You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-everhour test
2. Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors.
3. Make sure the metadata.yaml content is up to date.
4. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/everhour.md).
5. Create a Pull Request: use our PR naming conventions.
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
airbyte-source-exchange-rates | Exchange Rates Source. This is the repository for the Exchange Rates configuration-based source connector.
For information about how to use this connector within Airbyte, see the documentation. If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_exchange_rates/spec.yaml file. Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See integration_tests/sample_config.json for a sample config file. If you are an Airbyte core member, copy the credentials from Lastpass under the secret name "source exchange-rates test creds" and place them into secrets/config.json.

Via airbyte-ci (recommended):

airbyte-ci connectors --name=source-exchange-rates build

An image will be built with the tag airbyte/source-exchange-rates:dev.

Via docker build:

docker build -t airbyte/source-exchange-rates:dev .

Then run any of the connector commands as follows:

docker run --rm airbyte/source-exchange-rates:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-exchange-rates:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-exchange-rates:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-exchange-rates:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json

You can run our full test suite locally using airbyte-ci:

airbyte-ci connectors --name=source-exchange-rates test

Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py. All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies into two groups:
- dependencies required for your connector to work go in the MAIN_REQUIREMENTS list.
- dependencies required for testing go in the TEST_REQUIREMENTS list.

You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-exchange-rates test
2. Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors.
3. Make sure the metadata.yaml content is up to date.
4. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/exchange-rates.md).
5. Create a Pull Request: use our PR naming conventions.
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
airbyte-source-facebook-marketing | Facebook-Marketing source connector. This is the repository for the Facebook-Marketing source connector, written in Python.
For information about how to use this connector within Airbyte, see the documentation.

Local development. Prerequisites: Python (~=3.9) and Poetry (~=1.7) - installation instructions here. To install the connector, from this connector directory run:

poetry install --with dev

Create credentials: if you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_facebook_marketing/spec.yaml file. Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See sample_files/sample_config.json for a sample config file.

Locally running the connector:

poetry run source-facebook-marketing spec
poetry run source-facebook-marketing check --config secrets/config.json
poetry run source-facebook-marketing discover --config secrets/config.json
poetry run source-facebook-marketing read --config secrets/config.json --catalog sample_files/configured_catalog.json

Running unit tests: to run unit tests locally, from the connector directory run:

poetry run pytest unit_tests

Building the docker image: install airbyte-ci, then run the following command to build the docker image:

airbyte-ci connectors --name=source-facebook-marketing build

An image will be available on your host with the tag airbyte/source-facebook-marketing:dev.

Running as a docker container: run any of the connector commands as follows:

docker run --rm airbyte/source-facebook-marketing:dev spec
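The unit tests under unit_tests/ are plain pytest functions exercising the connector's pure helpers. A minimal sketch of the style (the date-windowing helper below is hypothetical, not part of the real connector):

```python
# A pytest-style unit test for a small date-windowing helper, the kind of
# pure function connector unit tests typically cover.
def chunk_range(total_days: int, window: int) -> list:
    """Split total_days into (start, end) windows of at most `window` days."""
    return [(start, min(start + window, total_days)) for start in range(0, total_days, window)]

def test_chunk_range():
    assert chunk_range(10, 4) == [(0, 4), (4, 8), (8, 10)]
    assert chunk_range(3, 5) == [(0, 3)]
```

Dropping a file like this into unit_tests/ is enough for `poetry run pytest unit_tests` to pick it up; pytest discovers any function named `test_*`.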
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-facebook-marketing:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-facebook-marketing:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-facebook-marketing:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json

Running our CI test suite: you can run our full test suite locally using airbyte-ci:

airbyte-ci connectors --name=source-facebook-marketing test

Customizing acceptance tests: customize the acceptance-test-config.yml file to configure acceptance tests. See Connector Acceptance Tests for more information. If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py.

Dependency management: all of your dependencies should be managed via Poetry. To add a new dependency, run:

poetry add <package-name>

Please commit the changes to the pyproject.toml and poetry.lock files.

Publishing a new version of the connector: you've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-facebook-marketing test
2. Bump the connector version (please follow semantic versioning for connectors): bump the dockerImageTag value in metadata.yaml and the version value in pyproject.toml.
3. Make sure the metadata.yaml content is up to date.
4. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/facebook-marketing.md).
5. Create a Pull Request: use our PR naming conventions.
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.

Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. |
airbyte-source-facebook-pages | Facebook Pages Source. This is the repository for the Facebook Pages configuration-based source connector.
For information about how to use this connector within Airbyte, see the documentation. If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_facebook_pages/spec.yaml file. Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See integration_tests/sample_config.json for a sample config file. If you are an Airbyte core member, copy the credentials from Lastpass under the secret name "source facebook-pages test creds" and place them into secrets/config.json.

Via airbyte-ci (recommended):

airbyte-ci connectors --name=source-facebook-pages build

An image will be built with the tag airbyte/source-facebook-pages:dev.

Via docker build:

docker build -t airbyte/source-facebook-pages:dev .

Then run any of the connector commands as follows:

docker run --rm airbyte/source-facebook-pages:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-facebook-pages:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-facebook-pages:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-facebook-pages:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json

You can run our full test suite locally using airbyte-ci:

airbyte-ci connectors --name=source-facebook-pages test

Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py. All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies into two groups:
- dependencies required for your connector to work go in the MAIN_REQUIREMENTS list.
- dependencies required for testing go in the TEST_REQUIREMENTS list.

You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-facebook-pages test
2. Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors.
3. Make sure the metadata.yaml content is up to date.
4. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/facebook-pages.md).
5. Create a Pull Request: use our PR naming conventions.
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
airbyte-source-faker | Faker source connector. This is the repository for the Faker source connector, written in Python.
For information about how to use this connector within Airbyte, see the documentation.

Local development. Prerequisites: Python (~=3.9) and Poetry (~=1.7) - installation instructions here. To install the connector, from this connector directory run:

poetry install --with dev

Create credentials: if you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_faker/spec.yaml file. Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See sample_files/sample_config.json for a sample config file.

Locally running the connector:

poetry run source-faker spec
poetry run source-faker check --config secrets/config.json
poetry run source-faker discover --config secrets/config.json
poetry run source-faker read --config secrets/config.json --catalog sample_files/configured_catalog.json

Running unit tests: to run unit tests locally, from the connector directory run:

poetry run pytest unit_tests

Building the docker image: install airbyte-ci, then run the following command to build the docker image:

airbyte-ci connectors --name=source-faker build

An image will be available on your host with the tag airbyte/source-faker:dev.

Running as a docker container: run any of the connector commands as follows:

docker run --rm airbyte/source-faker:dev spec
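Unlike the other sources here, source-faker generates records rather than pulling them from an API, and its spec exposes a seed so generation is reproducible. A toy sketch of that idea (field names and ranges are illustrative, not the connector's actual schema):

```python
import random

# Seeded generation: the same seed always yields the same "fake" records,
# which is what makes a faker-style source reproducible across runs.
def make_users(count, seed=42):
    rng = random.Random(seed)
    return [{"id": i, "age": rng.randint(18, 90)} for i in range(count)]

assert make_users(3) == make_users(3)  # deterministic across runs
print(len(make_users(3)))  # → 3
```

A dedicated random.Random instance (rather than the module-level functions) keeps the generator isolated, so other code using the random module cannot disturb the sequence.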
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-faker:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-faker:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-faker:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json

Running our CI test suite: you can run our full test suite locally using airbyte-ci:

airbyte-ci connectors --name=source-faker test

Customizing acceptance tests: customize the acceptance-test-config.yml file to configure acceptance tests. See Connector Acceptance Tests for more information. If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py.

Dependency management: all of your dependencies should be managed via Poetry. To add a new dependency, run:

poetry add <package-name>

Please commit the changes to the pyproject.toml and poetry.lock files.

Publishing a new version of the connector: you've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-faker test
2. Bump the connector version (please follow semantic versioning for connectors): bump the dockerImageTag value in metadata.yaml and the version value in pyproject.toml.
3. Make sure the metadata.yaml content is up to date.
4. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/faker.md).
5. Create a Pull Request: use our PR naming conventions.
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.

Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. |
airbyte-source-fastbill | Fastbill Source. This is the repository for the Fastbill configuration-based source connector.
For information about how to use this connector within Airbyte, see the documentation. If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_fastbill/spec.yaml file. Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information. See integration_tests/sample_config.json for a sample config file. If you are an Airbyte core member, copy the credentials from Lastpass under the secret name "source fastbill test creds" and place them into secrets/config.json.

Via airbyte-ci (recommended):

airbyte-ci connectors --name=source-fastbill build

An image will be built with the tag airbyte/source-fastbill:dev.

Via docker build:

docker build -t airbyte/source-fastbill:dev .

Then run any of the connector commands as follows:

docker run --rm airbyte/source-fastbill:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-fastbill:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-fastbill:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-fastbill:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json

You can run our full test suite locally using airbyte-ci:

airbyte-ci connectors --name=source-fastbill test

Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py. All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies into two groups:
- dependencies required for your connector to work go in the MAIN_REQUIREMENTS list.
- dependencies required for testing go in the TEST_REQUIREMENTS list.

You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-fastbill test
2. Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors.
3. Make sure the metadata.yaml content is up to date.
4. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/fastbill.md).
5. Create a Pull Request: use our PR naming conventions.
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
airbyte-source-fauna | New Readers. If you know how Airbyte works, read bootstrap.md for a quick introduction to this source. If you haven't used Airbyte before, read overview.md for a longer overview of what this connector is and how to use it.

First, start a local Fauna container:

docker run --rm --name faunadb -p 8443:8443 fauna/faunadb

In another terminal, cd into the connector directory:

cd airbyte-integrations/connectors/source-fauna

Once the container is up, set up the database:

fauna eval "$(cat examples/setup_database.fql)" --domain localhost --port 8443 --scheme http --secret secret

Finally, run the connector:

python main.py spec
python main.py check --config examples/config_localhost.json
python main.py discover --config examples/config_localhost.json
python main.py read --config examples/config_localhost.json --catalog examples/configured_catalog.json

To pick up after a partial failure you need to pass in a state file. To test this, induce a crash with bad data (e.g. a missing required field), update examples/sample_state_full_sync.json to contain your emitted state, and then run:

python main.py read --config examples/config_localhost.json --catalog examples/configured_catalog.json --state examples/sample_state_full_sync.json

For the integration tests, first cd into the connector directory:

cd airbyte-integrations/connectors/source-fauna

The integration tests require a secret config.json. Ping me on slack to get this file.
Once you have this file, put it in secrets/config.json. A sample of this file can be found at examples/secret_config.json. Once the file is created, build the connector:

docker build . -t airbyte/source-fauna:dev

Now, run the integration tests:

python -m pytest -p integration_tests.acceptance

This is the repository for the Fauna source connector, written in Python.
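The --state flag used for partial-failure recovery above rests on a simple principle: the connector emits a cursor as it reads, and a later run skips everything at or before that cursor. A minimal sketch of the idea (the record and state shapes are illustrative, not Fauna's actual state format):

```python
# Illustrative cursor-based resumption: a sync that crashed after emitting
# id 2 saved {"last_id": 2}; rerunning with that state skips ids 1 and 2.
records = [{"id": 1}, {"id": 2}, {"id": 3}, {"id": 4}]

def read(records, state=None):
    cursor = state.get("last_id", 0) if state else 0
    return [r for r in records if r["id"] > cursor]

print(read(records, {"last_id": 2}))  # → [{'id': 3}, {'id': 4}]
```

With no state (a fresh sync), the cursor defaults to zero and every record is emitted; the sample_state_full_sync.json file plays the role of the saved state dict here.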
For information about how to use this connector within Airbyte, see the documentation.

To iterate on this connector, make sure to complete this prerequisites section. From this connector directory, create a virtual environment:

python -m venv .venv

This will generate a virtualenv for this module in .venv/. Make sure this venv is active in your development environment of choice. To activate it from the terminal, run:

source .venv/bin/activate
pip install -r requirements.txt

If you are in an IDE, follow your IDE's instructions to activate the virtualenv. Note that while we are installing dependencies from requirements.txt, you should only edit setup.py for your dependencies. requirements.txt is used for editable installs (pip install -e) to pull in Python dependencies from the monorepo and will call setup.py. If this is mumbo jumbo to you, don't worry about it; just put your deps in setup.py, install using pip install -r requirements.txt, and everything should work as you expect.

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_fauna/spec.yaml file. Note that the secrets directory is gitignored by default, so there is no danger of accidentally checking in sensitive information. See examples/secret_config.json for a sample config file. If you are an Airbyte core member, copy the credentials from Lastpass under the secret name "source fauna test creds" and place them into secrets/config.json.

python main.py spec
python main.py check --config secrets/config.json
python main.py discover --config secrets/config.json
python main.py read --config secrets/config.json --catalog integration_tests/configured_catalog.json

Via airbyte-ci (recommended):

airbyte-ci connectors --name=source-fauna build

An image will be built with the tag airbyte/source-fauna:dev.

Via docker build:

docker build -t airbyte/source-fauna:dev .

Then run any of the connector commands as follows:

docker run --rm airbyte/source-fauna:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-fauna:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-fauna:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-fauna:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json

You can run our full test suite locally using airbyte-ci:

airbyte-ci connectors --name=source-fauna test

Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information.
If your connector requires to create or destroy resources for use during acceptance tests create fixtures for it and place them inside integration_tests/acceptance.py.All of your dependencies should go insetup.py, NOTrequirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies between two groups, dependencies that are:required for your connector to work need to go toMAIN_REQUIREMENTSlist.required for the testing need to go toTEST_REQUIREMENTSlistYou've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?Make sure your changes are passing our test suite:airbyte-ci connectors --name=source-fauna testBump the connector version inmetadata.yaml: increment thedockerImageTagvalue. Please followsemantic versioning for connectors.Make sure themetadata.yamlcontent is up to date.Make the connector documentation and its changelog is up to date (docs/integrations/sources/fauna.md).Create a Pull Request: useour PR naming conventions.Pat yourself on the back for being an awesome contributor.Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
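The MAIN_REQUIREMENTS / TEST_REQUIREMENTS split described above can be sketched as a minimal setup.py; the connector name and package names here are illustrative placeholders, not the actual dependency pins of any connector.

```python
# Hypothetical sketch of how a connector's setup.py splits its dependencies.
# Package names and the connector name are illustrative only.
from setuptools import find_packages

MAIN_REQUIREMENTS = [
    "airbyte-cdk",  # runtime dependency: required for the connector to work
]

TEST_REQUIREMENTS = [
    "pytest",         # test-only dependencies, installed via pip install '.[tests]'
    "requests-mock",
]

# In a real setup.py these kwargs are passed straight to setuptools.setup():
setup_kwargs = {
    "name": "source_example",
    "packages": find_packages(),
    "install_requires": MAIN_REQUIREMENTS,           # installed with the connector
    "extras_require": {"tests": TEST_REQUIREMENTS},  # installed only for testing
}
# setup(**setup_kwargs)  # left commented so this sketch runs standalone
```

The split keeps the published connector image lean: only MAIN_REQUIREMENTS ship at runtime, while TEST_REQUIREMENTS are pulled in on demand for the test suite.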
airbyte-source-file | File source connector

This is the repository for the File source connector, written in Python.
For information about how to use this connector within Airbyte, see the documentation.

Local development

Prerequisites
- Python (~=3.9)
- Poetry (~=1.7) - installation instructions here

Installing the connector

From this connector directory, run:

poetry install --with dev

Create credentials

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_file/spec.yaml file.
Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information.
See sample_files/sample_config.json for a sample config file.

Locally running the connector

poetry run source-file spec
poetry run source-file check --config secrets/config.json
poetry run source-file discover --config secrets/config.json
poetry run source-file read --config secrets/config.json --catalog sample_files/configured_catalog.json

Running unit tests

To run unit tests locally, from the connector directory run:

poetry run pytest unit_tests

Building the docker image

1. Install airbyte-ci
2. Run the following command to build the docker image:

airbyte-ci connectors --name=source-file build

An image will be available on your host with the tag airbyte/source-file:dev.

Running as a docker container

Then run any of the connector commands as follows:

docker run --rm airbyte/source-file:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-file:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-file:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-file:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json

Running our CI test suite

You can run our full test suite locally using airbyte-ci:

airbyte-ci connectors --name=source-file test

Customizing acceptance tests

Customize the acceptance-test-config.yml file to configure acceptance tests. See Connector Acceptance Tests for more information.
If your connector requires you to create or destroy resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py.

Dependency Management

All of your dependencies should be managed via Poetry. To add a new dependency, run:

poetry add <package-name>

Please commit the changes to the pyproject.toml and poetry.lock files.

Publishing a new version of the connector

You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-file test
2. Bump the connector version (please follow semantic versioning for connectors): bump the dockerImageTag value in metadata.yaml, and bump the version value in pyproject.toml.
3. Make sure the metadata.yaml content is up to date.
4. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/file.md).
5. Create a Pull Request: use our PR naming conventions.
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.

Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. |
airbyte-source-firebase-realtime-database | Firebase Realtime Database Source

This is the repository for the Firebase Realtime Database source connector, written in Python.
For information about how to use this connector within Airbyte, see the documentation.

To iterate on this connector, make sure to complete this prerequisites section.

From this connector directory, create a virtual environment:

python -m venv .venv

This will generate a virtualenv for this module in .venv/. Make sure this venv is active in your development environment of choice. To activate it from the terminal, run:

source .venv/bin/activate
pip install -r requirements.txt

If you are in an IDE, follow your IDE's instructions to activate the virtualenv.

Note that while we are installing dependencies from requirements.txt, you should only edit setup.py for your dependencies. requirements.txt is used for editable installs (pip install -e) to pull in Python dependencies from the monorepo and will call setup.py. If this is mumbo jumbo to you, don't worry about it: just put your deps in setup.py, but install using pip install -r requirements.txt, and everything should work as you expect.

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_firebase_realtime_database/spec.yaml file.
Note that the secrets directory is gitignored by default, so there is no danger of accidentally checking in sensitive information.
See integration_tests/sample_config.json for a sample config file.

If you are an Airbyte core member, copy the credentials in Lastpass under the secret name "source firebase-realtime-database test creds" and place them into secrets/config.json.

python main.py spec
python main.py check --config secrets/config.json
python main.py discover --config secrets/config.json
python main.py read --config secrets/config.json --catalog integration_tests/configured_catalog.json

Via airbyte-ci (recommended):

airbyte-ci connectors --name=source-firebase-realtime-database build

An image will be built with the tag airbyte/source-firebase-realtime-database:dev.

Via docker build:

docker build -t airbyte/source-firebase-realtime-database:dev .

Then run any of the connector commands as follows:

docker run --rm airbyte/source-firebase-realtime-database:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-firebase-realtime-database:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-firebase-realtime-database:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-firebase-realtime-database:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json

You can run our full test suite locally using airbyte-ci:

airbyte-ci connectors --name=source-firebase-realtime-database test

Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information.
If your connector requires you to create or destroy resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py.

All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies between two groups:
- dependencies required for your connector to work go in the MAIN_REQUIREMENTS list.
- dependencies required for testing go in the TEST_REQUIREMENTS list.

You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-firebase-realtime-database test
2. Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors.
3. Make sure the metadata.yaml content is up to date.
4. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/firebase-realtime-database.md).
5. Create a Pull Request: use our PR naming conventions.
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
airbyte-source-firebolt | Firebolt Source

This is the repository for the Firebolt source connector, written in Python.
For information about how to use this connector within Airbyte, see the documentation.

To iterate on this connector, make sure to complete this prerequisites section.

From this connector directory, create a virtual environment:

python -m venv .venv

This will generate a virtualenv for this module in .venv/. Make sure this venv is active in your development environment of choice. To activate it from the terminal, run:

source .venv/bin/activate
pip install -r requirements.txt

If you are in an IDE, follow your IDE's instructions to activate the virtualenv.

Note that while we are installing dependencies from requirements.txt, you should only edit setup.py for your dependencies. requirements.txt is used for editable installs (pip install -e) to pull in Python dependencies from the monorepo and will call setup.py. If this is mumbo jumbo to you, don't worry about it: just put your deps in setup.py, but install using pip install -r requirements.txt, and everything should work as you expect.

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_firebolt/spec.json file.
Note that the secrets directory is gitignored by default, so there is no danger of accidentally checking in sensitive information.
See integration_tests/sample_config.json for a sample config file.

If you are an Airbyte core member, copy the credentials in Lastpass under the secret name "source firebolt test creds" and place them into secrets/config.json.

python main.py spec
python main.py check --config secrets/config.json
python main.py discover --config secrets/config.json
python main.py read --config secrets/config.json --catalog integration_tests/configured_catalog.json

Via airbyte-ci (recommended):

airbyte-ci connectors --name=source-firebolt build

An image will be built with the tag airbyte/source-firebolt:dev.

Via docker build:

docker build -t airbyte/source-firebolt:dev .

Then run any of the connector commands as follows:

docker run --rm airbyte/source-firebolt:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-firebolt:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-firebolt:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-firebolt:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json

You can run our full test suite locally using airbyte-ci:

airbyte-ci connectors --name=source-firebolt test

Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information.
If your connector requires you to create or destroy resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py.

All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies between two groups:
- dependencies required for your connector to work go in the MAIN_REQUIREMENTS list.
- dependencies required for testing go in the TEST_REQUIREMENTS list.

You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-firebolt test
2. Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors.
3. Make sure the metadata.yaml content is up to date.
4. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/firebolt.md).
5. Create a Pull Request: use our PR naming conventions.
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
airbyte-source-flexport | Flexport Source

This is the repository for the Flexport configuration based source connector.
For information about how to use this connector within Airbyte, see the documentation.

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_flexport/spec.yaml file.
Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information.
See integration_tests/sample_config.json for a sample config file.

If you are an Airbyte core member, copy the credentials in Lastpass under the secret name "source flexport test creds" and place them into secrets/config.json.

Via airbyte-ci (recommended):

airbyte-ci connectors --name=source-flexport build

An image will be built with the tag airbyte/source-flexport:dev.

Via docker build:

docker build -t airbyte/source-flexport:dev .

Then run any of the connector commands as follows:

docker run --rm airbyte/source-flexport:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-flexport:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-flexport:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-flexport:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json

You can run our full test suite locally using airbyte-ci:

airbyte-ci connectors --name=source-flexport test

Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information.
If your connector requires you to create or destroy resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py.

All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies between two groups:
- dependencies required for your connector to work go in the MAIN_REQUIREMENTS list.
- dependencies required for testing go in the TEST_REQUIREMENTS list.

You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-flexport test
2. Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors.
3. Make sure the metadata.yaml content is up to date.
4. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/flexport.md).
5. Create a Pull Request: use our PR naming conventions.
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
airbyte-source-freshcaller | Freshcaller Source

This is the repository for the Freshcaller source connector, written in Python.
For information about how to use this connector within Airbyte, see the documentation.

To iterate on this connector, make sure to complete this prerequisites section.

From this connector directory, create a virtual environment:

python -m venv .venv

This will generate a virtualenv for this module in .venv/. Make sure this venv is active in your development environment of choice. To activate it from the terminal, run:

source .venv/bin/activate
pip install -r requirements.txt
pip install '.[tests]'

If you are in an IDE, follow your IDE's instructions to activate the virtualenv.

Note that while we are installing dependencies from requirements.txt, you should only edit setup.py for your dependencies. requirements.txt is used for editable installs (pip install -e) to pull in Python dependencies from the monorepo and will call setup.py. If this is mumbo jumbo to you, don't worry about it: just put your deps in setup.py, but install using pip install -r requirements.txt, and everything should work as you expect.

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_freshcaller/spec.json file.
Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information.
See integration_tests/sample_config.json for a sample config file.

If you are an Airbyte core member, copy the credentials in Lastpass under the secret name "source freshcaller test creds" and place them into secrets/config.json.

python main.py spec
python main.py check --config secrets/config.json
python main.py discover --config secrets/config.json
python main.py read --config secrets/config.json --catalog integration_tests/configured_catalog.json

Via airbyte-ci (recommended):

airbyte-ci connectors --name=source-freshcaller build

An image will be built with the tag airbyte/source-freshcaller:dev.

Via docker build:

docker build -t airbyte/source-freshcaller:dev .

Then run any of the connector commands as follows:

docker run --rm airbyte/source-freshcaller:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-freshcaller:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-freshcaller:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-freshcaller:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json

You can run our full test suite locally using airbyte-ci:

airbyte-ci connectors --name=source-freshcaller test

Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information.
If your connector requires you to create or destroy resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py.

All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies between two groups:
- dependencies required for your connector to work go in the MAIN_REQUIREMENTS list.
- dependencies required for testing go in the TEST_REQUIREMENTS list.

You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-freshcaller test
2. Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors.
3. Make sure the metadata.yaml content is up to date.
4. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/freshcaller.md).
5. Create a Pull Request: use our PR naming conventions.
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
airbyte-source-freshdesk | Freshdesk source connector

This is the repository for the Freshdesk source connector, written in Python.
For information about how to use this connector within Airbyte, see the documentation.

Local development

Prerequisites
- Python (~=3.9)
- Poetry (~=1.7) - installation instructions here

Installing the connector

From this connector directory, run:

poetry install --with dev

Create credentials

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_freshdesk/spec.yaml file.
Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information.
See sample_files/sample_config.json for a sample config file.

Locally running the connector

poetry run source-freshdesk spec
poetry run source-freshdesk check --config secrets/config.json
poetry run source-freshdesk discover --config secrets/config.json
poetry run source-freshdesk read --config secrets/config.json --catalog sample_files/configured_catalog.json

Running unit tests

To run unit tests locally, from the connector directory run:

poetry run pytest unit_tests

Building the docker image

1. Install airbyte-ci
2. Run the following command to build the docker image:

airbyte-ci connectors --name=source-freshdesk build

An image will be available on your host with the tag airbyte/source-freshdesk:dev.

Running as a docker container

Then run any of the connector commands as follows:

docker run --rm airbyte/source-freshdesk:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-freshdesk:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-freshdesk:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-freshdesk:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json

Running our CI test suite

You can run our full test suite locally using airbyte-ci:

airbyte-ci connectors --name=source-freshdesk test

Customizing acceptance tests

Customize the acceptance-test-config.yml file to configure acceptance tests. See Connector Acceptance Tests for more information.
If your connector requires you to create or destroy resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py.

Dependency Management

All of your dependencies should be managed via Poetry. To add a new dependency, run:

poetry add <package-name>

Please commit the changes to the pyproject.toml and poetry.lock files.

Publishing a new version of the connector

You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-freshdesk test
2. Bump the connector version (please follow semantic versioning for connectors): bump the dockerImageTag value in metadata.yaml, and bump the version value in pyproject.toml.
3. Make sure the metadata.yaml content is up to date.
4. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/freshdesk.md).
5. Create a Pull Request: use our PR naming conventions.
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master.

Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. |
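As a rough illustration of the Poetry-side version bump described above (the version numbers below are hypothetical, not any connector's real values), the pyproject.toml fragment being edited looks like:

```toml
# pyproject.toml fragment (illustrative values only)
[tool.poetry]
name = "airbyte-source-freshdesk"
version = "3.1.1"  # bumped from 3.1.0; keep in sync with dockerImageTag in metadata.yaml
```

Keeping the pyproject.toml version and the metadata.yaml dockerImageTag in lockstep is what lets the automated publish step tag the Docker image consistently.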
airbyte-source-freshsales | Freshsales Source

This is the repository for the Freshsales source connector, written in Python.
For information about how to use this connector within Airbyte, see the documentation.

To iterate on this connector, make sure to complete this prerequisites section.

From this connector directory, create a virtual environment:

python -m venv .venv

This will generate a virtualenv for this module in .venv/. Make sure this venv is active in your development environment of choice. To activate it from the terminal, run:

source .venv/bin/activate
pip install -r requirements.txt
pip install '.[tests]'

If you are in an IDE, follow your IDE's instructions to activate the virtualenv.

Note that while we are installing dependencies from requirements.txt, you should only edit setup.py for your dependencies. requirements.txt is used for editable installs (pip install -e) to pull in Python dependencies from the monorepo and will call setup.py. If this is mumbo jumbo to you, don't worry about it: just put your deps in setup.py, but install using pip install -r requirements.txt, and everything should work as you expect.

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_freshsales/spec.json file.
Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information.
See integration_tests/sample_config.json for a sample config file.

If you are an Airbyte core member, copy the credentials in Lastpass under the secret name "source freshsales test creds" and place them into secrets/config.json.

python main.py spec
python main.py check --config secrets/config.json
python main.py discover --config secrets/config.json
python main.py read --config secrets/config.json --catalog integration_tests/configured_catalog.json

Via airbyte-ci (recommended):

airbyte-ci connectors --name=source-freshsales build

An image will be built with the tag airbyte/source-freshsales:dev.

Via docker build:

docker build -t airbyte/source-freshsales:dev .

Then run any of the connector commands as follows:

docker run --rm airbyte/source-freshsales:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-freshsales:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-freshsales:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-freshsales:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json

You can run our full test suite locally using airbyte-ci:

airbyte-ci connectors --name=source-freshsales test

Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information.
If your connector requires you to create or destroy resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py.

All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies between two groups:
- dependencies required for your connector to work go in the MAIN_REQUIREMENTS list.
- dependencies required for testing go in the TEST_REQUIREMENTS list.

You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-freshsales test
2. Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors.
3. Make sure the metadata.yaml content is up to date.
4. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/freshsales.md).
5. Create a Pull Request: use our PR naming conventions.
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
airbyte-source-freshservice | Freshservice Source

This is the repository for the Freshservice configuration based source connector.
For information about how to use this connector within Airbyte, see the documentation.

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_freshservice/spec.yaml file.
Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information.
See integration_tests/sample_config.json for a sample config file.

If you are an Airbyte core member, copy the credentials in Lastpass under the secret name "source freshservice test creds" and place them into secrets/config.json.

Via airbyte-ci (recommended):

airbyte-ci connectors --name=source-freshservice build

An image will be built with the tag airbyte/source-freshservice:dev.

Via docker build:

docker build -t airbyte/source-freshservice:dev .

Then run any of the connector commands as follows:

docker run --rm airbyte/source-freshservice:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-freshservice:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-freshservice:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-freshservice:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json

You can run our full test suite locally using airbyte-ci:

airbyte-ci connectors --name=source-freshservice test

Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information.
If your connector requires you to create or destroy resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py.

All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies between two groups:
- dependencies required for your connector to work go in the MAIN_REQUIREMENTS list.
- dependencies required for testing go in the TEST_REQUIREMENTS list.

You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?

1. Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-freshservice test
2. Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors.
3. Make sure the metadata.yaml content is up to date.
4. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/freshservice.md).
5. Create a Pull Request: use our PR naming conventions.
6. Pat yourself on the back for being an awesome contributor.
7. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
airbyte-source-fullstory | Fullstory Source

This is the repository for the Fullstory configuration based source connector.
For information about how to use this connector within Airbyte, see the documentation.

If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_fullstory/spec.yaml file.
Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information.
See integration_tests/sample_config.json for a sample config file.

If you are an Airbyte core member, copy the credentials in Lastpass under the secret name "source fullstory test creds" and place them into secrets/config.json.

Via airbyte-ci (recommended):

airbyte-ci connectors --name=source-fullstory build

An image will be built with the tag airbyte/source-fullstory:dev.

Via docker build:

docker build -t airbyte/source-fullstory:dev .

Then run any of the connector commands as follows:

docker run --rm airbyte/source-fullstory:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-fullstory:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-fullstory:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-fullstory:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.jsonYou can run our full test suite locally usingairbyte-ci:airbyte-ciconnectors--name=source-fullstorytestCustomizeacceptance-test-config.ymlfile to configure tests. SeeConnector Acceptance Testsfor more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py. All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies into two groups: dependencies required for your connector to work go in the MAIN_REQUIREMENTS list; dependencies required for testing go in the TEST_REQUIREMENTS list. You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what? Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-fullstory test. Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors. Make sure the metadata.yaml content is up to date. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/fullstory.md). Create a Pull Request: use our PR naming conventions. Pat yourself on the back for being an awesome contributor. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
airbyte-source-gainsight-px | Gainsight Px SourceThis is the repository for the Gainsight Px configuration based source connector.
For information about how to use this connector within Airbyte, seethe documentation.If you are a community contributor, follow the instructions in thedocumentationto generate the necessary credentials. Then create a filesecrets/config.jsonconforming to thesource_gainsight_px/spec.yamlfile.
Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information.
Seeintegration_tests/sample_config.jsonfor a sample config file.If you are an Airbyte core member, copy the credentials in Lastpass under the secret namesource gainsight-px test credsand place them intosecrets/config.json.Viaairbyte-ci(recommended):airbyte-ciconnectors--name=source-gainsight-pxbuildAn image will be built with the tagairbyte/source-gainsight-px:dev.Viadocker build:dockerbuild-tairbyte/source-gainsight-px:dev.Then run any of the connector commands as follows:docker run --rm airbyte/source-gainsight-px:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-gainsight-px:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-gainsight-px:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-gainsight-px:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.jsonYou can run our full test suite locally usingairbyte-ci:airbyte-ciconnectors--name=source-gainsight-pxtestCustomizeacceptance-test-config.ymlfile to configure tests. SeeConnector Acceptance Testsfor more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py. All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies into two groups: dependencies required for your connector to work go in the MAIN_REQUIREMENTS list; dependencies required for testing go in the TEST_REQUIREMENTS list. You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what? Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-gainsight-px test. Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors. Make sure the metadata.yaml content is up to date. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/gainsight-px.md). Create a Pull Request: use our PR naming conventions. Pat yourself on the back for being an awesome contributor. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
airbyte-source-gcs | Gcs SourceThis is the repository for the Gcs source connector, written in Python.
For information about how to use this connector within Airbyte, see the documentation. To iterate on this connector, make sure to complete this prerequisites section. From this connector directory, create a virtual environment: python -m venv .venv. This will generate a virtualenv for this module in .venv/. Make sure this venv is active in your development environment of choice. To activate it from the terminal, run: source .venv/bin/activate
pip install -r requirements.txt
If you are in an IDE, follow your IDE's instructions to activate the virtualenv. Note that while we are installing dependencies from requirements.txt, you should only edit setup.py for your dependencies. requirements.txt is used for editable installs (pip install -e) to pull in Python dependencies from the monorepo and will call setup.py. If this is mumbo jumbo to you, don't worry about it: just put your deps in setup.py, but install using pip install -r requirements.txt, and everything should work as you expect. If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_gcs/spec.yaml file.
Note that the secrets directory is gitignored by default, so there is no danger of accidentally checking in sensitive information.
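As a quick sketch, the gitignored secrets directory can be prepared like this (the JSON body below is a placeholder, not a valid GCS config; fill in the real fields required by source_gcs/spec.yaml):

```python
# Sketch: create the gitignored secrets/ directory with a placeholder config.
# The key below is illustrative only; replace it with the fields the spec requires.
import json
from pathlib import Path

Path("secrets").mkdir(exist_ok=True)
Path("secrets/config.json").write_text(
    json.dumps({"placeholder": "replace with real credentials"}, indent=2)
)
```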
Seeintegration_tests/sample_config.jsonfor a sample config file.If you are an Airbyte core member, copy the credentials in Lastpass under the secret namesource gcs test credsand place them intosecrets/config.json.python main.py spec
python main.py check --config secrets/config.json
python main.py discover --config secrets/config.json
python main.py read --config secrets/config.json --catalog integration_tests/configured_catalog.jsonViaairbyte-ci(recommended):airbyte-ciconnectors--name=source-gcsbuildAn image will be built with the tagairbyte/source-gcs:dev.Viadocker build:dockerbuild-tairbyte/source-gcs:dev.Then run any of the connector commands as follows:docker run --rm airbyte/source-gcs:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-gcs:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-gcs:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-gcs:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.jsonYou can run our full test suite locally usingairbyte-ci:airbyte-ciconnectors--name=source-gcstestCustomizeacceptance-test-config.ymlfile to configure tests. SeeConnector Acceptance Testsfor more information.
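The read command above consumes a configured catalog file. As a rough sketch of its shape (the stream name and empty schema are illustrative, not the connector's real streams), a minimal integration_tests/configured_catalog.json looks like:

```python
# Minimal sketch of an Airbyte configured catalog (illustrative stream only).
import json

configured_catalog = {
    "streams": [
        {
            "stream": {
                "name": "example_stream",       # illustrative stream name
                "json_schema": {},              # real streams declare a schema here
                "supported_sync_modes": ["full_refresh"],
            },
            "sync_mode": "full_refresh",
            "destination_sync_mode": "overwrite",
        }
    ]
}
print(json.dumps(configured_catalog, indent=2))
```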
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py. All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies into two groups: dependencies required for your connector to work go in the MAIN_REQUIREMENTS list; dependencies required for testing go in the TEST_REQUIREMENTS list. You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what? Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-gcs test. Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors. Make sure the metadata.yaml content is up to date. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/gcs.md). Create a Pull Request: use our PR naming conventions. Pat yourself on the back for being an awesome contributor. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
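The MAIN_REQUIREMENTS / TEST_REQUIREMENTS split described above looks roughly like this inside a connector's setup.py (the package names are illustrative, not the connector's real pins):

```python
# Sketch of the dependency groups in a connector's setup.py (illustrative pins).
MAIN_REQUIREMENTS = ["airbyte-cdk"]              # needed at runtime for the connector to work
TEST_REQUIREMENTS = ["pytest", "requests-mock"]  # needed only by the test suite

# These lists feed setuptools.setup(), e.g.:
#   setup(..., install_requires=MAIN_REQUIREMENTS,
#         extras_require={"tests": TEST_REQUIREMENTS})
```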
airbyte-source-genesys | Genesys SourceThis is the repository for the Genesys source connector, written in Python.
For information about how to use this connector within Airbyte, see the documentation. We are using OAuth2, as this is the only supported authentication method. From this connector directory, create a virtual environment: python -m venv .venv. This will generate a virtualenv for this module in .venv/. Make sure this venv is active in your development environment of choice. To activate it from the terminal, run: source .venv/bin/activate
pip install -r requirements.txt
pip install '.[tests]'
If you are in an IDE, follow your IDE's instructions to activate the virtualenv. Note that while we are installing dependencies from requirements.txt, you should only edit setup.py for your dependencies. requirements.txt is used for editable installs (pip install -e) to pull in Python dependencies from the monorepo and will call setup.py. If this is mumbo jumbo to you, don't worry about it: just put your deps in setup.py, but install using pip install -r requirements.txt, and everything should work as you expect. If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_genesys/spec.yaml file.
Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information.
Seeintegration_tests/sample_config.jsonfor a sample config file.If you are an Airbyte core member, copy the credentials in Lastpass under the secret namesource genesys test credsand place them intosecrets/config.json.python main.py spec
python main.py check --config secrets/config.json
python main.py discover --config secrets/config.json
python main.py read --config secrets/config.json --catalog integration_tests/configured_catalog.jsonViaairbyte-ci(recommended):airbyte-ciconnectors--name=source-genesysbuildAn image will be built with the tagairbyte/source-genesys:dev.Viadocker build:dockerbuild-tairbyte/source-genesys:dev.Then run any of the connector commands as follows:docker run --rm airbyte/source-genesys:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-genesys:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-genesys:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-genesys:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.jsonYou can run our full test suite locally usingairbyte-ci:airbyte-ciconnectors--name=source-genesystestCustomizeacceptance-test-config.ymlfile to configure tests. SeeConnector Acceptance Testsfor more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py. All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
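A fixture in integration_tests/acceptance.py typically looks something like this hedged sketch (the setup/teardown bodies are hypothetical; this sketch creates nothing):

```python
# Hypothetical integration_tests/acceptance.py fixture: set up any external
# resources before the acceptance-test session and tear them down afterwards.
import pytest


@pytest.fixture(scope="session", autouse=True)
def connector_setup():
    # create test resources here (illustrative; this sketch does nothing)
    yield
    # destroy test resources here
```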
We split dependencies into two groups: dependencies required for your connector to work go in the MAIN_REQUIREMENTS list; dependencies required for testing go in the TEST_REQUIREMENTS list. You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what? Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-genesys test. Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors. Make sure the metadata.yaml content is up to date. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/genesys.md). Create a Pull Request: use our PR naming conventions. Pat yourself on the back for being an awesome contributor. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
airbyte-source-getlago | Lago SourceThis is the repository for the Lago configuration based source connector.
For information about how to use this connector within Airbyte, seethe documentation.If you are a community contributor, follow the instructions in thedocumentationto generate the necessary credentials. Then create a filesecrets/config.jsonconforming to thesource_getlago/spec.yamlfile.
Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information.
Seeintegration_tests/sample_config.jsonfor a sample config file.If you are an Airbyte core member, copy the credentials in Lastpass under the secret namesource getlago test credsand place them intosecrets/config.json.Viaairbyte-ci(recommended):airbyte-ciconnectors--name=source-getlagobuildAn image will be built with the tagairbyte/source-getlago:dev.Viadocker build:dockerbuild-tairbyte/source-getlago:dev.Then run any of the connector commands as follows:docker run --rm airbyte/source-getlago:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-getlago:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-getlago:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-getlago:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.jsonYou can run our full test suite locally usingairbyte-ci:airbyte-ciconnectors--name=source-getlagotestCustomizeacceptance-test-config.ymlfile to configure tests. SeeConnector Acceptance Testsfor more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py. All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies into two groups: dependencies required for your connector to work go in the MAIN_REQUIREMENTS list; dependencies required for testing go in the TEST_REQUIREMENTS list. You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what? Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-getlago test. Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors. Make sure the metadata.yaml content is up to date. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/getlago.md). Create a Pull Request: use our PR naming conventions. Pat yourself on the back for being an awesome contributor. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
airbyte-source-github | Github source connectorThis is the repository for the Github source connector, written in Python.
For information about how to use this connector within Airbyte, seethe documentation.Local developmentPrerequisitesPython (~=3.9)Poetry (~=1.7) - installation instructionshereInstalling the connectorFrom this connector directory, run:poetryinstall--withdevCreate credentialsIf you are a community contributor, follow the instructions in thedocumentationto generate the necessary credentials. Then create a filesecrets/config.jsonconforming to thesource_github/spec.yamlfile.
Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information.
Seesample_files/sample_config.jsonfor a sample config file.Locally running the connectorpoetry run source-github spec
poetry run source-github check --config secrets/config.json
poetry run source-github discover --config secrets/config.json
poetry run source-github read --config secrets/config.json --catalog sample_files/configured_catalog.jsonRunning unit testsTo run unit tests locally, from the connector directory run:poetry run pytest unit_testsBuilding the docker imageInstallairbyte-ciRun the following command to build the docker image:airbyte-ciconnectors--name=source-githubbuildAn image will be available on your host with the tagairbyte/source-github:dev.Running as a docker containerThen run any of the connector commands as follows:docker run --rm airbyte/source-github:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-github:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-github:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-github:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.jsonRunning our CI test suiteYou can run our full test suite locally usingairbyte-ci:airbyte-ciconnectors--name=source-githubtestCustomizing acceptance TestsCustomizeacceptance-test-config.ymlfile to configure acceptance tests. SeeConnector Acceptance Testsfor more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py. Dependency Management: All of your dependencies should be managed via Poetry.
To add a new dependency, run: poetry add <package-name>. Please commit the changes to the pyproject.toml and poetry.lock files. Publishing a new version of the connector: You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what? Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-github test. Bump the connector version (please follow semantic versioning for connectors): bump the dockerImageTag value in metadata.yaml and bump the version value in pyproject.toml. Make sure the metadata.yaml content is up to date. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/github.md). Create a Pull Request: use our PR naming conventions. Pat yourself on the back for being an awesome contributor. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. |
airbyte-source-gitlab | Gitlab source connectorThis is the repository for the Gitlab source connector, written in Python.
For information about how to use this connector within Airbyte, seethe documentation.Local developmentPrerequisitesPython (~=3.9)Poetry (~=1.7) - installation instructionshereInstalling the connectorFrom this connector directory, run:poetryinstall--withdevCreate credentialsIf you are a community contributor, follow the instructions in thedocumentationto generate the necessary credentials. Then create a filesecrets/config.jsonconforming to thesource_gitlab/spec.yamlfile.
Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information.
Seesample_files/sample_config.jsonfor a sample config file.Locally running the connectorpoetry run source-gitlab spec
poetry run source-gitlab check --config secrets/config.json
poetry run source-gitlab discover --config secrets/config.json
poetry run source-gitlab read --config secrets/config.json --catalog sample_files/configured_catalog.jsonRunning unit testsTo run unit tests locally, from the connector directory run:poetry run pytest unit_testsBuilding the docker imageInstallairbyte-ciRun the following command to build the docker image:airbyte-ciconnectors--name=source-gitlabbuildAn image will be available on your host with the tagairbyte/source-gitlab:dev.Running as a docker containerThen run any of the connector commands as follows:docker run --rm airbyte/source-gitlab:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-gitlab:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-gitlab:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-gitlab:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.jsonRunning our CI test suiteYou can run our full test suite locally usingairbyte-ci:airbyte-ciconnectors--name=source-gitlabtestCustomizing acceptance TestsCustomizeacceptance-test-config.ymlfile to configure acceptance tests. SeeConnector Acceptance Testsfor more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py. Dependency Management: All of your dependencies should be managed via Poetry.
To add a new dependency, run: poetry add <package-name>. Please commit the changes to the pyproject.toml and poetry.lock files. Publishing a new version of the connector: You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what? Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-gitlab test. Bump the connector version (please follow semantic versioning for connectors): bump the dockerImageTag value in metadata.yaml and bump the version value in pyproject.toml. Make sure the metadata.yaml content is up to date. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/gitlab.md). Create a Pull Request: use our PR naming conventions. Pat yourself on the back for being an awesome contributor. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. |
airbyte-source-glassfrog | Glassfrog SourceThis is the repository for the Glassfrog configuration based source connector.
For information about how to use this connector within Airbyte, seethe documentation.If you are a community contributor, follow the instructions in thedocumentationto generate the necessary credentials. Then create a filesecrets/config.jsonconforming to thesource_glassfrog/spec.yamlfile.
Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information.
Seeintegration_tests/sample_config.jsonfor a sample config file.If you are an Airbyte core member, copy the credentials in Lastpass under the secret namesource glassfrog test credsand place them intosecrets/config.json.Viaairbyte-ci(recommended):airbyte-ciconnectors--name=source-glassfrogbuildAn image will be built with the tagairbyte/source-glassfrog:dev.Viadocker build:dockerbuild-tairbyte/source-glassfrog:dev.Then run any of the connector commands as follows:docker run --rm airbyte/source-glassfrog:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-glassfrog:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-glassfrog:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-glassfrog:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.jsonYou can run our full test suite locally usingairbyte-ci:airbyte-ciconnectors--name=source-glassfrogtestCustomizeacceptance-test-config.ymlfile to configure tests. SeeConnector Acceptance Testsfor more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py. All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies into two groups: dependencies required for your connector to work go in the MAIN_REQUIREMENTS list; dependencies required for testing go in the TEST_REQUIREMENTS list. You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what? Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-glassfrog test. Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors. Make sure the metadata.yaml content is up to date. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/glassfrog.md). Create a Pull Request: use our PR naming conventions. Pat yourself on the back for being an awesome contributor. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
airbyte-source-gocardless | Gocardless SourceThis is the repository for the Gocardless configuration based source connector.
For information about how to use this connector within Airbyte, seethe documentation.If you are a community contributor, follow the instructions in thedocumentationto generate the necessary credentials. Then create a filesecrets/config.jsonconforming to thesource_gocardless/spec.yamlfile.
Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information.
Seeintegration_tests/sample_config.jsonfor a sample config file.If you are an Airbyte core member, copy the credentials in Lastpass under the secret namesource gocardless test credsand place them intosecrets/config.json.Viaairbyte-ci(recommended):airbyte-ciconnectors--name=source-gocardlessbuildAn image will be built with the tagairbyte/source-gocardless:dev.Viadocker build:dockerbuild-tairbyte/source-gocardless:dev.Then run any of the connector commands as follows:docker run --rm airbyte/source-gocardless:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-gocardless:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-gocardless:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-gocardless:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.jsonYou can run our full test suite locally usingairbyte-ci:airbyte-ciconnectors--name=source-gocardlesstestCustomizeacceptance-test-config.ymlfile to configure tests. SeeConnector Acceptance Testsfor more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py. All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies into two groups: dependencies required for your connector to work go in the MAIN_REQUIREMENTS list; dependencies required for testing go in the TEST_REQUIREMENTS list. You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what? Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-gocardless test. Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors. Make sure the metadata.yaml content is up to date. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/gocardless.md). Create a Pull Request: use our PR naming conventions. Pat yourself on the back for being an awesome contributor. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
airbyte-source-gong | Gong SourceThis is the repository for the Gong configuration based source connector.
For information about how to use this connector within Airbyte, seethe documentation.If you are a community contributor, follow the instructions in thedocumentationto generate the necessary credentials. Then create a filesecrets/config.jsonconforming to thesource_gong/spec.yamlfile.
Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information.
Seeintegration_tests/sample_config.jsonfor a sample config file.If you are an Airbyte core member, copy the credentials in Lastpass under the secret namesource gong test credsand place them intosecrets/config.json.Viaairbyte-ci(recommended):airbyte-ciconnectors--name=source-gongbuildAn image will be built with the tagairbyte/source-gong:dev.Viadocker build:dockerbuild-tairbyte/source-gong:dev.Then run any of the connector commands as follows:docker run --rm airbyte/source-gong:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-gong:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-gong:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-gong:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.jsonYou can run our full test suite locally usingairbyte-ci:airbyte-ciconnectors--name=source-gongtestCustomizeacceptance-test-config.ymlfile to configure tests. SeeConnector Acceptance Testsfor more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py. All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies into two groups: dependencies required for your connector to work go in the MAIN_REQUIREMENTS list; dependencies required for testing go in the TEST_REQUIREMENTS list. You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what? Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-gong test. Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors. Make sure the metadata.yaml content is up to date. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/gong.md). Create a Pull Request: use our PR naming conventions. Pat yourself on the back for being an awesome contributor. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
airbyte-source-google-ads | Google-Ads source connectorThis is the repository for the Google-Ads source connector, written in Python.
For information about how to use this connector within Airbyte, seethe documentation.Local developmentPrerequisitesPython (~=3.9)Poetry (~=1.7) - installation instructionshereInstalling the connectorFrom this connector directory, run:poetryinstall--withdevCreate credentialsIf you are a community contributor, follow the instructions in thedocumentationto generate the necessary credentials. Then create a filesecrets/config.jsonconforming to thesource_google_ads/spec.yamlfile.
Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information.
Seesample_files/sample_config.jsonfor a sample config file.Locally running the connectorpoetry run source-google-ads spec
poetry run source-google-ads check --config secrets/config.json
poetry run source-google-ads discover --config secrets/config.json
poetry run source-google-ads read --config secrets/config.json --catalog sample_files/configured_catalog.json
Running unit tests: to run unit tests locally, from the connector directory run: poetry run pytest unit_tests. Building the docker image: install airbyte-ci, then run the following command to build the docker image: airbyte-ci connectors --name=source-google-ads build. An image will be available on your host with the tag airbyte/source-google-ads:dev. Running as a docker container: then run any of the connector commands as follows: docker run --rm airbyte/source-google-ads:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-google-ads:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-google-ads:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-google-ads:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
Running our CI test suite: you can run our full test suite locally using airbyte-ci: airbyte-ci connectors --name=source-google-ads test. Customizing acceptance tests: customize the acceptance-test-config.yml file to configure acceptance tests. See Connector Acceptance Tests for more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py. Dependency management: all of your dependencies should be managed via Poetry.
To add a new dependency, run: poetry add <package-name>. Please commit the changes to the pyproject.toml and poetry.lock files. Publishing a new version of the connector: you've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what? Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-google-ads test. Bump the connector version (please follow semantic versioning for connectors): bump the dockerImageTag value in metadata.yaml and the version value in pyproject.toml. Make sure the metadata.yaml content is up to date. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/google-ads.md). Create a Pull Request: use our PR naming conventions. Pat yourself on the back for being an awesome contributor. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. |
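The acceptance-test fixtures mentioned above (integration_tests/acceptance.py) follow a create-before / destroy-after shape. A minimal sketch of that pattern, using a plain context manager and purely illustrative names (a real acceptance.py would typically expose this as a pytest fixture instead):

```python
# Hypothetical sketch of the create/destroy pattern used by acceptance-test
# fixtures; the resource dict and its contents are illustrative, not a real API.
import contextlib

@contextlib.contextmanager
def managed_resource():
    resource = {"id": "acceptance-test-resource"}  # create the test resource
    try:
        yield resource                             # acceptance tests run here
    finally:
        resource.clear()                           # destroy it after the tests

with managed_resource() as r:
    print(r["id"])  # acceptance-test-resource
```

The same shape maps directly onto a `@pytest.fixture` with a `yield` in the middle: everything before the yield is setup, everything after is teardown.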
airbyte-source-google-analytics-data-api | Google-Analytics-Data-Api source connector. This is the repository for the Google-Analytics-Data-Api source connector, written in Python.
For information about how to use this connector within Airbyte, see the documentation. Local development prerequisites: Python (~=3.9); Poetry (~=1.7) - installation instructions here. Installing the connector: from this connector directory, run: poetry install --with dev. Create credentials: if you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_google_analytics_data_api/spec.yaml file.
Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information.
See sample_files/sample_config.json for a sample config file. Locally running the connector:
poetry run source-google-analytics-data-api spec
poetry run source-google-analytics-data-api check --config secrets/config.json
poetry run source-google-analytics-data-api discover --config secrets/config.json
poetry run source-google-analytics-data-api read --config secrets/config.json --catalog sample_files/configured_catalog.json
Running unit tests: to run unit tests locally, from the connector directory run: poetry run pytest unit_tests. Building the docker image: install airbyte-ci, then run the following command to build the docker image: airbyte-ci connectors --name=source-google-analytics-data-api build. An image will be available on your host with the tag airbyte/source-google-analytics-data-api:dev. Running as a docker container: then run any of the connector commands as follows: docker run --rm airbyte/source-google-analytics-data-api:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-google-analytics-data-api:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-google-analytics-data-api:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-google-analytics-data-api:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
Running our CI test suite: you can run our full test suite locally using airbyte-ci: airbyte-ci connectors --name=source-google-analytics-data-api test. Customizing acceptance tests: customize the acceptance-test-config.yml file to configure acceptance tests. See Connector Acceptance Tests for more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py. Dependency management: all of your dependencies should be managed via Poetry.
To add a new dependency, run: poetry add <package-name>. Please commit the changes to the pyproject.toml and poetry.lock files. Publishing a new version of the connector: you've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what? Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-google-analytics-data-api test. Bump the connector version (please follow semantic versioning for connectors): bump the dockerImageTag value in metadata.yaml and the version value in pyproject.toml. Make sure the metadata.yaml content is up to date. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/google-analytics-data-api.md). Create a Pull Request: use our PR naming conventions. Pat yourself on the back for being an awesome contributor. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. |
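The spec/check/discover/read commands above all emit newline-delimited JSON messages on stdout (the Airbyte protocol); records produced by a read carry "type": "RECORD". A small sketch of filtering record data out of that stream, using made-up sample lines rather than real connector output:

```python
# Hedged sketch: extract RECORD payloads from Airbyte-protocol output.
# The stream name and data fields below are illustrative.
import json

sample_output = [
    '{"type": "LOG", "log": {"level": "INFO", "message": "Starting sync"}}',
    '{"type": "RECORD", "record": {"stream": "daily_active_users", '
    '"data": {"date": "2024-01-01", "totalUsers": 42}}}',
]

records = [
    json.loads(line)["record"]["data"]
    for line in sample_output
    if json.loads(line).get("type") == "RECORD"
]
print(records)  # [{'date': '2024-01-01', 'totalUsers': 42}]
```

In practice you would pipe the output of `poetry run source-google-analytics-data-api read ...` line by line through the same filter.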
airbyte-source-google-analytics-v4 | Google-Analytics-V4 source connector. This is the repository for the Google-Analytics-V4 source connector, written in Python.
For information about how to use this connector within Airbyte, see the documentation. Local development prerequisites: Python (~=3.9); Poetry (~=1.7) - installation instructions here. Installing the connector: from this connector directory, run: poetry install --with dev. Create credentials: if you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_google_analytics_v4/spec.yaml file.
Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information.
See sample_files/sample_config.json for a sample config file. Locally running the connector:
poetry run source-google-analytics-v4 spec
poetry run source-google-analytics-v4 check --config secrets/config.json
poetry run source-google-analytics-v4 discover --config secrets/config.json
poetry run source-google-analytics-v4 read --config secrets/config.json --catalog sample_files/configured_catalog.json
Running unit tests: to run unit tests locally, from the connector directory run: poetry run pytest unit_tests. Building the docker image: install airbyte-ci, then run the following command to build the docker image: airbyte-ci connectors --name=source-google-analytics-v4 build. An image will be available on your host with the tag airbyte/source-google-analytics-v4:dev. Running as a docker container: then run any of the connector commands as follows: docker run --rm airbyte/source-google-analytics-v4:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-google-analytics-v4:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-google-analytics-v4:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-google-analytics-v4:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
Running our CI test suite: you can run our full test suite locally using airbyte-ci: airbyte-ci connectors --name=source-google-analytics-v4 test. Customizing acceptance tests: customize the acceptance-test-config.yml file to configure acceptance tests. See Connector Acceptance Tests for more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py. Dependency management: all of your dependencies should be managed via Poetry.
To add a new dependency, run: poetry add <package-name>. Please commit the changes to the pyproject.toml and poetry.lock files. Publishing a new version of the connector: you've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what? Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-google-analytics-v4 test. Bump the connector version (please follow semantic versioning for connectors): bump the dockerImageTag value in metadata.yaml and the version value in pyproject.toml. Make sure the metadata.yaml content is up to date. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/google-analytics-v4.md). Create a Pull Request: use our PR naming conventions. Pat yourself on the back for being an awesome contributor. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. Once your PR is merged, the new version of the connector will be automatically published to Docker Hub and our connector registry. |
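The release checklist above bumps the dockerImageTag (metadata.yaml) and version (pyproject.toml) values following semantic versioning; for a patch release that just means incrementing the last component. A tiny illustrative helper, not part of any connector:

```python
# Hypothetical helper: bump the patch component of a semver string such as
# a connector's dockerImageTag. Pre-release/build suffixes are not handled.
def bump_patch(version: str) -> str:
    major, minor, patch = (int(part) for part in version.split("."))
    return f"{major}.{minor}.{patch + 1}"

print(bump_patch("0.1.3"))  # 0.1.4
```

Breaking changes and new features would instead bump the major or minor component and reset the components to their right.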
airbyte-source-google-directory | Freshsales Source. This is the repository for the Freshsales source connector, written in Python.
For information about how to use this connector within Airbyte, see the documentation. To iterate on this connector, make sure to complete this prerequisites section. From this connector directory, create a virtual environment: python -m venv .venv. This will generate a virtualenv for this module in .venv/. Make sure this venv is active in your
development environment of choice. To activate it from the terminal, run: source .venv/bin/activate
pip install -r requirements.txt
pip install '.[tests]'
If you are in an IDE, follow your IDE's instructions to activate the virtualenv. Note that while we are installing dependencies from requirements.txt, you should only edit setup.py for your dependencies. requirements.txt is used for editable installs (pip install -e) to pull in Python dependencies from the monorepo and will call setup.py. If this is mumbo jumbo to you, don't worry about it: just put your deps in setup.py, install using pip install -r requirements.txt, and everything should work as you expect. If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_freshsales/spec.json file.
Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information.
See integration_tests/sample_config.json for a sample config file. If you are an Airbyte core member, copy the credentials in Lastpass under the secret name source freshsales test creds and place them into secrets/config.json.
python main.py spec
python main.py check --config secrets/config.json
python main.py discover --config secrets/config.json
python main.py read --config secrets/config.json --catalog integration_tests/configured_catalog.json
Via airbyte-ci (recommended): airbyte-ci connectors --name source-freshsales build. An image will be built with the tag airbyte/source-freshsales:dev. Via docker build: docker build -t airbyte/source-freshsales:dev . Then run any of the connector commands as follows: docker run --rm airbyte/source-freshsales:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-freshsales:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-freshsales:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-freshsales:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
You can run our full test suite locally using airbyte-ci: airbyte-ci connectors --name=source-google-directory test. Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py. All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies between two groups: dependencies required for your connector to work go in the MAIN_REQUIREMENTS list, and dependencies required for testing go in the TEST_REQUIREMENTS list. You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what? Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-google-directory test. Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors. Make sure the metadata.yaml content is up to date. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/google-directory.md). Create a Pull Request: use our PR naming conventions. Pat yourself on the back for being an awesome contributor. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
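The MAIN_REQUIREMENTS / TEST_REQUIREMENTS split described above lives in the connector's setup.py. A hedged sketch of that shape with illustrative package names; a real setup.py would end by calling setuptools.setup with these arguments:

```python
# Illustrative shape only; a real setup.py would finish with
# `from setuptools import setup; setup(**setup_kwargs)`.
MAIN_REQUIREMENTS = ["airbyte-cdk"]              # needed for the connector to work
TEST_REQUIREMENTS = ["pytest", "requests-mock"]  # needed only for testing

setup_kwargs = dict(
    name="source_example",                       # hypothetical connector package
    packages=["source_example"],
    install_requires=MAIN_REQUIREMENTS,
    extras_require={"tests": TEST_REQUIREMENTS},
)
print(sorted(setup_kwargs["extras_require"]["tests"]))
```

The `extras_require={"tests": ...}` entry is what makes `pip install '.[tests]'` pull in the test-only dependencies.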
airbyte-source-google-drive | Google Drive Source. This is the repository for the Google Drive source connector, written in Python.
For information about how to use this connector within Airbyte, see the documentation. To iterate on this connector, make sure to complete this prerequisites section. From this connector directory, create a virtual environment: python -m venv .venv. This will generate a virtualenv for this module in .venv/. Make sure this venv is active in your
development environment of choice. To activate it from the terminal, run: source .venv/bin/activate
pip install -r requirements.txt
pip install '.[tests]'
If you are in an IDE, follow your IDE's instructions to activate the virtualenv. Note that while we are installing dependencies from requirements.txt, you should only edit setup.py for your dependencies. requirements.txt is used for editable installs (pip install -e) to pull in Python dependencies from the monorepo and will call setup.py. If this is mumbo jumbo to you, don't worry about it: just put your deps in setup.py, install using pip install -r requirements.txt, and everything should work as you expect. If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_google_drive/spec.json file.
Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information.
See integration_tests/sample_config.json for a sample config file. If you are an Airbyte core member, copy the credentials in Lastpass under the secret name source google-drive test creds and place them into secrets/config.json.
python main.py spec
python main.py check --config secrets/config.json
python main.py discover --config secrets/config.json
python main.py read --config secrets/config.json --catalog integration_tests/configured_catalog.json
Via airbyte-ci (recommended): airbyte-ci connectors --name=source-google-drive build. An image will be built with the tag airbyte/source-google-drive:dev. Via docker build: docker build -t airbyte/source-google-drive:dev . Then run any of the connector commands as follows: docker run --rm airbyte/source-google-drive:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-google-drive:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-google-drive:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-google-drive:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
You can run our full test suite locally using airbyte-ci: airbyte-ci connectors --name=source-google-drive test. Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py. All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies between two groups: dependencies required for your connector to work go in the MAIN_REQUIREMENTS list, and dependencies required for testing go in the TEST_REQUIREMENTS list. You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what? Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-google-drive test. Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors. Make sure the metadata.yaml content is up to date. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/google-drive.md). Create a Pull Request: use our PR naming conventions. Pat yourself on the back for being an awesome contributor. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |
airbyte-source-google-pagespeed-insights | Google Pagespeed Insights Source. This is the repository for the Google Pagespeed Insights configuration-based source connector.
For information about how to use this connector within Airbyte, see the documentation. If you are a community contributor, follow the instructions in the documentation to generate the necessary credentials. Then create a file secrets/config.json conforming to the source_google_pagespeed_insights/spec.yaml file.
Note that any directory named secrets is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information.
See integration_tests/sample_config.json for a sample config file. If you are an Airbyte core member, copy the credentials in Lastpass under the secret name source google-pagespeed-insights test creds and place them into secrets/config.json. Via airbyte-ci (recommended): airbyte-ci connectors --name=source-google-pagespeed-insights build. An image will be built with the tag airbyte/source-google-pagespeed-insights:dev. Via docker build: docker build -t airbyte/source-google-pagespeed-insights:dev . Then run any of the connector commands as follows: docker run --rm airbyte/source-google-pagespeed-insights:dev spec
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-google-pagespeed-insights:dev check --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets airbyte/source-google-pagespeed-insights:dev discover --config /secrets/config.json
docker run --rm -v $(pwd)/secrets:/secrets -v $(pwd)/integration_tests:/integration_tests airbyte/source-google-pagespeed-insights:dev read --config /secrets/config.json --catalog /integration_tests/configured_catalog.json
You can run our full test suite locally using airbyte-ci: airbyte-ci connectors --name=source-google-pagespeed-insights test. Customize the acceptance-test-config.yml file to configure tests. See Connector Acceptance Tests for more information.
If your connector requires creating or destroying resources for use during acceptance tests, create fixtures for them and place them inside integration_tests/acceptance.py. All of your dependencies should go in setup.py, NOT requirements.txt. The requirements file is only used to connect internal Airbyte dependencies in the monorepo for local development.
We split dependencies between two groups: dependencies required for your connector to work go in the MAIN_REQUIREMENTS list, and dependencies required for testing go in the TEST_REQUIREMENTS list. You've checked out the repo, implemented a million-dollar feature, and you're ready to share your changes with the world. Now what? Make sure your changes are passing our test suite: airbyte-ci connectors --name=source-google-pagespeed-insights test. Bump the connector version in metadata.yaml: increment the dockerImageTag value. Please follow semantic versioning for connectors. Make sure the metadata.yaml content is up to date. Make sure the connector documentation and its changelog are up to date (docs/integrations/sources/google-pagespeed-insights.md). Create a Pull Request: use our PR naming conventions. Pat yourself on the back for being an awesome contributor. Someone from Airbyte will take a look at your PR and iterate with you to merge it into master. |