Complex JSON with nested arrays to SQL: Help! Summary: REST data source resulting in JSON that needs to be transferred into an on-premises SQL environment. I'd like one table per array object. I've got the REST pull working fine and can drop it into a JSON blob. Sample of the JSON at the bottom.
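A minimal Python sketch of the one-table-per-array idea, assuming a hypothetical payload shape ("orders" with a nested "lines" array), placeholder table names, and a placeholder SQL Server connection string; pandas.json_normalize splits each nested array into its own frame, keyed back to the parent:

    # Hypothetical payload; adjust record_path/meta to the real JSON shape.
    import json
    import pandas as pd
    from sqlalchemy import create_engine

    raw = json.loads("""
    {
      "orders": [
        {"id": 1, "customer": "A", "lines": [{"sku": "X", "qty": 2}]},
        {"id": 2, "customer": "B", "lines": [{"sku": "Y", "qty": 1}]}
      ]
    }
    """)

    # Parent array -> one table (nested arrays dropped here, loaded separately).
    orders = pd.json_normalize(raw["orders"]).drop(columns=["lines"])

    # Child array -> its own table, carrying the parent key for joins.
    lines = pd.json_normalize(raw["orders"], record_path="lines",
                              meta=["id"], meta_prefix="order_")

    # Placeholder connection string for the on-premises SQL Server.
    engine = create_engine(
        "mssql+pyodbc://user:pass@myserver/mydb?driver=ODBC+Driver+17+for+SQL+Server"
    )
    orders.to_sql("orders", engine, if_exists="append", index=False)
    lines.to_sql("order_lines", engine, if_exists="append", index=False)

The same pattern repeats per nested array: normalize with a record_path, carry the parent key through meta, and land each frame in its own table.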
Azure Portal: How to pass Docker-related settings from Azure Data ... When I trigger my Batch job from Azure Data Factory, I get the error: Task failed: "Container-enabled compute node requires task container settings". I know we need to pass parameters like the image name to a task, like the following:
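The snippet the question refers to is cut off; as a hedged reconstruction, this is how a Batch task carries those settings via the azure-batch Python SDK (job id, command line, and image name are placeholders). Submitting a task to a container-enabled pool without a containerSettings/imageName pair is exactly what raises that error:

    import azure.batch.models as batchmodels

    task = batchmodels.TaskAddParameter(
        id="adf-triggered-task",                  # placeholder task id
        command_line="/bin/sh -c 'echo hello'",   # placeholder command
        container_settings=batchmodels.TaskContainerSettings(
            image_name="myregistry.azurecr.io/myimage:latest",  # placeholder image
            container_run_options="--rm",
        ),
    )
    # With an authenticated BatchServiceClient:
    # batch_client.task.add(job_id="my-job", task=task)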
Dynamic variable in Azure Data Factory pipeline Hello Team, I want to create a dynamic parameter in a pipeline; below is my requirement: I have a source file in ADLS whose name is abc_20181223232345.txt, and I have created a pipeline with three components.
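Outside the pipeline, the string logic the dynamic parameter needs looks like this small Python sketch (the file name and timestamp format are assumptions taken from the question); inside ADF the equivalent would be built from expression functions such as split() and formatDateTime():

    from datetime import datetime

    file_name = "abc_20181223232345.txt"
    stamp = file_name.split("_")[1].split(".")[0]     # "20181223232345"
    when = datetime.strptime(stamp, "%Y%m%d%H%M%S")   # 2018-12-23 23:23:45
    print(when.date(), when.time())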
Data Factory DataFlow source DataSets? - social.msdn.microsoft.com Please make sure that your Azure blob dataset exists in your Data Factory. Try creating a dummy Copy activity and check whether that Azure blob dataset is listed in it. If it appears, then create a new Data Flow activity and search for the blob dataset there.
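A hedged way to double-check from code that the blob dataset is actually registered in the factory, using the azure-mgmt-datafactory SDK (the subscription id, resource group, and factory name are placeholders):

    from azure.identity import DefaultAzureCredential
    from azure.mgmt.datafactory import DataFactoryManagementClient

    client = DataFactoryManagementClient(DefaultAzureCredential(),
                                         "<subscription-id>")
    # List every dataset the factory knows about and its type.
    for ds in client.datasets.list_by_factory("my-rg", "my-factory"):
        print(ds.name, type(ds.properties).__name__)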
Empty dataset.json file for ADF v2 custom activity I am trying to get access to the dataset (as described here: https://docs.microsoft.com/en-us/azure/data-factory/transform-data-using-dotnet-custom-activity); however, the dataset.json file remains empty (the linkedService.json file holds the correct information). It looks like the referenceObjects property is incorrectly defined.
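Per the linked doc, the Custom Activity's typeProperties must include a referenceObjects block naming the datasets and linked services to serialize to the Batch node; a minimal sketch of that block as a Python dict (the command and reference names are placeholders). Without the datasets entry, the dataset JSON file the activity receives stays empty:

    import json

    custom_activity_type_properties = {
        "command": "python main.py",  # placeholder command
        "referenceObjects": {
            "linkedServices": [
                {"referenceName": "MyStorageLinkedService",  # placeholder
                 "type": "LinkedServiceReference"}
            ],
            "datasets": [
                {"referenceName": "MyInputDataset",          # placeholder
                 "type": "DatasetReference"}
            ],
        },
    }
    print(json.dumps(custom_activity_type_properties, indent=2))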