Dumping a file into a Datastore Blob Storage from an AzureML v2 Pipeline

Zeid Adabel · 0 Reputation points
2025-04-23T19:57:47.1566667+00:00

Hello,

We've recently migrated our AzureML Pipelines from using Python SDK v1 to v2.
This went mostly smoothly with the exception of one datastore feature we used to use:
In v1 we were able to get write access to a registered datastore with this simple Python snippet inside a command component:

    from azureml.core import Datastore

    datastore = Datastore.get(workspace, datastore_name)
    datastore.upload_files(files=[local_path], target_path=folder_path)

In v2 this API is no longer available. Unfortunately, the azureml-fsspec library only allows reads from a datastore, so it couldn't be used either. The only example in the documentation shows how to accomplish this via a command job. Our setup uses the azure.ai.ml decorators in Python, so that wasn't very helpful.
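
For reference, the documented command-job pattern looks roughly like the sketch below: a standalone job whose output is bound to an azureml://datastores/... path so the file gets uploaded to the registered datastore. The datastore name, folder path, input asset, environment, and compute in this sketch are placeholders, not our actual values:

    from azure.ai.ml import MLClient, command, Input, Output
    from azure.ai.ml.constants import AssetTypes
    from azure.identity import DefaultAzureCredential

    ml_client = MLClient.from_config(credential=DefaultAzureCredential())

    # Standalone job that copies its input file to a folder on a registered datastore.
    copy_job = command(
        command="cp ${{inputs.src}} ${{outputs.dest}}",
        inputs={"src": Input(type=AssetTypes.URI_FILE, path="azureml:some_registered_file:1")},  # placeholder asset
        outputs={
            "dest": Output(
                type=AssetTypes.URI_FILE,
                # azureml://datastores/<datastore_name>/paths/<folder_path>/<file_name>
                path="azureml://datastores/my_blob_datastore/paths/exports/result.csv",
                mode="upload",
            )
        },
        environment="azureml:my-environment:1",  # placeholder environment
        compute="cpu-cluster",                   # placeholder compute target
    )

    ml_client.jobs.create_or_update(copy_job)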

Nevertheless, we tried to incorporate the command job example into our pipeline, but it isn't compiled as part of the flow (the inputs from the other components are not used).

Could someone help out with incorporating a command job into a @pipeline so that it copies a file that another component produces as an Output(type="uri_file") into a Datastore blob storage?
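
To make the ask concrete, this is roughly the shape we are trying to get to. A minimal sketch only: produce_file, its out_file output name, the component YAML path, the datastore name and folder, environment, and compute are placeholders, and I haven't verified whether the output path set on the command builder is honoured when it runs as a pipeline node:

    from azure.ai.ml import MLClient, command, dsl, Input, Output, load_component
    from azure.ai.ml.constants import AssetTypes
    from azure.identity import DefaultAzureCredential

    ml_client = MLClient.from_config(credential=DefaultAzureCredential())

    # Placeholder for the existing component that produces an Output(type="uri_file").
    produce_file = load_component(source="./produce_file/component.yaml")

    # Copy step whose input is left unbound so it can be wired to an upstream output,
    # and whose output points at a folder on the registered blob datastore.
    copy_to_datastore = command(
        command="cp ${{inputs.src}} ${{outputs.dest}}",
        inputs={"src": Input(type=AssetTypes.URI_FILE)},
        outputs={
            "dest": Output(
                type=AssetTypes.URI_FILE,
                path="azureml://datastores/my_blob_datastore/paths/exports/result.csv",
                mode="upload",
            )
        },
        environment="azureml:my-environment:1",  # placeholder environment
        compute="cpu-cluster",                   # placeholder compute target
    )

    @dsl.pipeline(default_compute="cpu-cluster")
    def export_pipeline():
        producer = produce_file()
        # Bind the upstream uri_file output to the copy step's input.
        copy_step = copy_to_datastore(src=producer.outputs.out_file)
        return {"exported_file": copy_step.outputs.dest}

    pipeline_job = export_pipeline()
    ml_client.jobs.create_or_update(pipeline_job, experiment_name="export-to-datastore")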

Azure Machine Learning