input_text | output_text |
---|---|
When I attempt to initiate Citrix Studio, an error message appears stating "Delivery Controller not found." | Azure Data Factory |
When I initiate Citrix Studio, an error message stating "Delivery Controller not found" is displayed. | Azure Data Factory |
Upon launching Citrix Studio, an error notification will appear indicating that the Delivery Controller cannot be located. | Azure Data Factory |
A warning message stating that the Delivery Controller cannot be found will show up when Citrix Studio is launched. | Azure Data Factory |
In Azure Databricks, it returned Error code: 3200 | Azure Data Factory |
An Error code: 3200 was produced in Azure Databricks. | Azure Data Factory |
Azure Databricks issued an error with the code 3200. | Azure Data Factory |
The error code from Azure Databricks was 3200. Please help me in solving it | Azure Data Factory |
Azure Databricks returned the 3200 error code. Please assist me in resolving this | Azure Data Factory |
Azure Databricks gave back the 3200 error code. Please assist me in resolving this | Azure Data Factory |
Invalid Python file URI... Please visit Databricks user guide for supported URI schemes. | Azure Data Factory |
Invalid Python file URI... Please go to the Databricks user guide for supported URI schemes. | Azure Data Factory |
The Python file URI is not valid... Kindly refer to Databricks' documentation for the list of URI schemes that are supported. | Azure Data Factory |
The URI for the Python file is invalid... Please consult Databricks' documentation for a comprehensive list of supported URI schemes. | Azure Data Factory |
The Python file's URI is not valid... Kindly refer to Databricks' documentation for a thorough compilation of supported URI schemes. | Azure Data Factory |
The URI for the Python file is invalid. For a comprehensive list of supported URI schemes, please refer to the Databricks manual. | Azure Data Factory |
Could not parse request object: Expected 'key' and 'value' to be set for JSON map field base_parameters, got 'key: "..."' instead. | Azure Data Factory |
Unable to interpret request object: Anticipated 'key' and 'value' to be arranged for JSON map field base_parameters, received 'key: "..."' instead. | Azure Data Factory |
Parsing of request object failed: The JSON map field base_parameters is missing its 'key' and 'value' parameters and instead has 'key: "..."' parameter. | Azure Data Factory |
The JSON map field base_parameters is missing its 'key' and 'value' parameters and instead has a 'key: "..."' parameter, which means that the request object's parsing failed. | Azure Data Factory |
The request object's parsing failed because the key and value parameters for the JSON map field base_parameters are missing, leaving only the key: parameter. | Azure Data Factory |
The parsing of the request object was unsuccessful as the JSON map field base_parameters lacks both the key and value parameters, resulting in only the presence of the key: parameter. | Azure Data Factory |
The cluster is in Terminated state, not available to receive jobs. "Please fix the cluster or retry later." | Azure Data Factory |
The group is currently in a state of termination and cannot receive any tasks. "Kindly rectify the cluster or attempt again at a later time." | Azure Data Factory |
The group is now terminated and unable to accept any tasks. "Please fix the problem or try again at a later time." | Azure Data Factory |
The team has been disbanded and is incapable of taking on any assignments. "Please fix the cluster or retry later." | Azure Data Factory |
The team was disbanded and is no longer able to accept any tasks. "Please fix the cluster or retry later." | Azure Data Factory |
The team used to be disbanded and is no longer capable of accepting any tasks. "Please fix the cluster or retry later." | Azure Data Factory |
There were already 1000 jobs created in the past 3600 seconds, exceeding the rate limit: 1000 job creations per 3600 seconds. | Azure Data Factory |
There have already been one thousand jobs created in the past 3600 seconds, exceeding the rate limit: one thousand job creations per 3600 seconds. | Azure Data Factory |
One thousand jobs have already been generated in the last 3600 seconds, exceeding the cap of one thousand jobs per 3600 seconds. | Azure Data Factory |
There had already been 1000 jobs created in the past 3600 seconds, exceeding the rate limit: 1000 job creations per 3600 seconds | Azure Data Factory |
The rate restriction of 1,000 jobs produced every 3600 seconds had already been exceeded in the previous 3600 seconds. | Azure Data Factory |
1000 jobs have already been generated in the last 3600 seconds, exceeding the cap of 1000 jobs per 3600 seconds. | Azure Data Factory |
How can I schedule a pipeline? | Azure Data Factory |
How do I plan a pipeline? | Azure Data Factory |
What is a pipeline schedule? | Azure Data Factory |
How can a pipeline be scheduled? | Azure Data Factory |
How might a pipeline be planned? | Azure Data Factory |
How is a pipeline timetable made? | Azure Data Factory |
Hi, I am getting BadRequest error while running adf pipeline which contains data flow activity. It was running fine previously. | Azure Data Factory |
Hello, I am encountering a BadRequest issue when executing ADF pipeline that comprises a data flow task. It was functioning properly in the past. | Azure Data Factory |
Hello, whenever I run an adf pipeline that includes data flow activities, I receive a BadRequest error. Before, everything worked smoothly. | Azure Data Factory |
Hi, I am getting a BadRequest error while running an adf pipeline which consists of a data flow activity. It was running fine previously. | Azure Data Factory |
When I attempt to execute an adf pipeline that has data flow activity, I receive a BadRequest error. Before, everything worked smoothly. | Azure Data Factory |
While running an adf pipeline with data flow activity, I keep getting the BadRequest error. Before, it was functioning properly. | Azure Data Factory |
I came across some strange issue. I created a pipeline to bulk load tables into the blob storage. In the Foreach container , copy activity dataset, I created two parameters schema and table, but when I click on the pipeline i can see only schema and not the table. | Azure Data Factory |
I encountered an unusual problem. I devised a pipeline to mass upload tables into the blob storage. In the Foreach container, copy activity dataset, I established two variables schema and table, however, upon clicking on the pipeline, I am only able to view schema and not the table. | Azure Data Factory |
I got into a peculiar problem. To bulk load tables into the blob storage, I made a pipeline. I created two parameters, a schema and a table, in the Foreach container, copy activity dataset, but when I click on the pipeline, I can only see the schema and not the table. | Azure Data Factory |
I came across some extraordinary issue. I created a pipeline to bulk load tables into the blob storage. In the Foreach container, copy activity dataset, I created two parameters schema and table, however when I click on the pipeline I can see only the schema and not the table. | Azure Data Factory |
I got upon an odd problem. For the purpose of bulk loading tables into the blob storage, I made a pipeline. When I click on the pipeline, I can only see the schema and not the table that I built in the Foreach container, copy activity dataset. | Azure Data Factory |
I came across some atypical issue. I created a pipeline to bulk load tables into the blob storage. In the Foreach container, copy activity dataset, I created two parameters schema and table, but when I click on the pipeline I can see only the schema and not the table. | Azure Data Factory |
DUMMY1();
DUMMY2(Message VARCHAR);
I am able to call the one without arguments, but not able to call the one with parameters.
I get the following error :
ERROR [07002] [Microsoft][ODBC] (10690) Expected descriptor record does not exist during query execution. | Azure Data Factory |
DUMMY1();
DUMMY2(Message VARCHAR);
The one without arguments can be called, while the one with parameters cannot.
I get the following error :
ERROR [07002] [Microsoft][ODBC] (10690) Expected descriptor record does not exist during query execution. | Azure Data Factory |
I can call the method without arguments, but I can't call the method with parameters.
The code as follows
DUMMY1();
DUMMY2(Message VARCHAR);
I get the following error :
ERROR [07002] [Microsoft][ODBC] (10690) Expected descriptor record does not exist during query execution. | Azure Data Factory |
DUMMY1();
DUMMY2(Message VARCHAR);
Although I am able to call the method without any parameters, I cannot do it with arguments.
I get the following error :
ERROR [07002] [Microsoft][ODBC] (10690) Expected descriptor record does not exist during query execution. | Azure Data Factory |
The method can be called, however I am unable to pass parameters when doing so.
DUMMY1();
DUMMY2(Message VARCHAR);
I get the following error :
ERROR [07002] [Microsoft][ODBC] (10690) Expected descriptor record does not exist during query execution. | Azure Data Factory |
I am able to call the method, but I am unable to call it with parameters.
DUMMY1();
DUMMY2(Message VARCHAR);
I get the following error :
ERROR [07002] [Microsoft][ODBC] (10690) Expected descriptor record does not exist during query execution. | Azure Data Factory |
I am getting an issue as Missing required field: settings.task.notebook_task.notebook_path. | Azure Data Factory |
I am seeing a problem with "Missing required field:" settings.task.notebook_task.notebook_path. | Azure Data Factory |
I'm experiencing a problem: Missing needed field: settings.task.notebook_task.notebook_path. | Azure Data Factory |
This is a problem I'm having: settings.task.notebook_task.notebook_path. | Azure Data Factory |
I am experiencing a problem with the message "Missing needed field: settings.task.notebook_task.notebook_path." | Azure Data Factory |
Experiencing a problem with the message "Missing needed field: settings.task.notebook_task.notebook_path." | Azure Data Factory |
I am getting an issue as follows: User: SimpleUserContext{userId=..., [email protected], orgId=...} is not authorized to access cluster. | Azure Data Factory |
I'm experiencing the following problem: User: SimpleUserContext is not authorised to access the cluster with the credentials [email protected], userId=..., and orgId= | Azure Data Factory |
I am having the following problem: User: SimpleUserContext is not authorised to enter cluster with the following credentials: userId=..., [email protected], and orgId= | Azure Data Factory |
Here's the problem I'm having: The user SimpleUserContext with the credentials userId=..., [email protected], and orgId=... is not permitted to access the cluster. | Azure Data Factory |
Currently, I'm having the following issue: With the credentials [email protected], userId=..., and orgId=..., SimpleUserContext is not authorized to access the cluster. | Azure Data Factory |
The cluster is in Terminated state, not available to receive jobs. Please fix the cluster or retry later. | Azure Data Factory |
The cluster is not accepting jobs because it is in the terminated state. Attempt again later or fix the cluster. | Azure Data Factory |
It is not possible to receive jobs since the cluster is in a terminated condition. The cluster has to be fixed, or try again later. | Azure Data Factory |
Due to the cluster's terminated state, receiving jobs is not possible. Fix the cluster, or try again later. | Azure Data Factory |
Given that the cluster is ended, it is not able to receive jobs. In order to continue, the cluster must be fixed. | Azure Data Factory |
The cluster cannot accept work since it has ended. The cluster needs to be fixed for the process to continue. | Azure Data Factory |
ADF pipeline failing to read CSV file if a column's values contain a comma delimiter along with double quotes. | Azure Data Factory |
Pipeline ADF unable to read a CSV file if a column's values include double quotes and a comma. | Azure Data Factory |
Linear ADF If a column value includes a comma delimiter together with double quotes, the CSV file won't read correctly. | Azure Data Factory |
pipeline for ADF If a column's values include double quotes and a comma, the CSV file won't read correctly. | Azure Data Factory |
system for the ADF The CSV file won't read properly if a column contains double quotes and a comma in its values. | Azure Data Factory |
ADF's pipeline The CSV file won't read properly if a column's values contain double quotes and a comma. | Azure Data Factory |
How to load updated table records from an OData source to Azure SQL Server using Azure Data Factory | Azure Data Factory |
How to use Azure Data Factory to import updated table records from an OData source into an Azure SQL server | Azure Data Factory |
Using Azure Data Factory, how do I load updated table records from an OData source into a SQL server in Azure? | Azure Data Factory |
How can I import updated table entries from an OData source into an Azure SQL server using Azure Data Factory? | Azure Data Factory |
What is the procedure for importing updated table entries from an OData source into an Azure SQL server using Azure Data Factory? | Azure Data Factory |
What steps are involved in importing updated table entries from an OData source using Azure Data Factory into an Azure SQL server? | Azure Data Factory |
I am facing a connection failed error in Azure Data Factory Studio help me out with this | Azure Data Factory |
Please assist me with this as I am experiencing a connection failed error in Azure Data Factory Studio. | Azure Data Factory |
Azure Data Factory Studio is giving me a connection failed error. Please help! | Azure Data Factory |
My connection to Azure Data Factory Studio has failed. Please assist! | Azure Data Factory |
I'm getting a connection failed issue in Azure Data Factory Studio. Kindly assist! | Azure Data Factory |
A connection failure error has shown in Azure Data Factory Studio. Help us, please! | Azure Data Factory |
An error occurred when changing the linked service type warning message in datasets | Azure Data Factory |
When changing the associated service type warning message in datasets, an error happened. | Azure Data Factory |
Change associated service type warning notice in datasets caused an error. | Azure Data Factory |
When the warning message for a linked service type change appeared in datasets, an error happened. | Azure Data Factory |
Datasets error: Change associated service type warning notification. | Azure Data Factory |