
Large File Upload Task - Uploading large files to OneDrive, Outlook, and the Print API.

Creating the client instance

Refer to the documentation on initializing the client.
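
For reference, a minimal sketch of creating a client instance (authProvider here is a placeholder for whichever AuthenticationProvider implementation you have set up):

import { Client } from "@microsoft/microsoft-graph-client";

// authProvider is assumed to be an instance of an AuthenticationProvider implementation.
const client = Client.initWithMiddleware({ authProvider });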

Using the LargeFileUpload Task

Create an upload session

The first step for any upload task is creating an upload session.

Example of a payload for Outlook

const payload = {
  AttachmentItem: {
    attachmentType: "file",
    name: "<FILE_NAME>",
    size: FILE_SIZE,
  },
};

Example of a payload for OneDrive

const payload = {
  item: {
    "@microsoft.graph.conflictBehavior": "rename",
    name: "<FILE_NAME>",
  },
};

Create the upload session

const uploadSession = await LargeFileUploadTask.createUploadSession(client, "REQUEST_URL", payload);
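
Here "REQUEST_URL" stands for the Graph endpoint that creates the session, which depends on the API you are targeting. A hedged example, with the message id and item path as placeholders:

// Outlook: create an upload session for a large message attachment
const requestUrl = "/me/messages/<MESSAGE_ID>/attachments/createUploadSession";

// OneDrive: create an upload session for a drive item
// const requestUrl = "/me/drive/root:/<ITEM_PATH>:/createUploadSession";

const uploadSession = await LargeFileUploadTask.createUploadSession(client, requestUrl, payload);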

Creating the LargeFileUploadTask object

  • First, you will need to initialize a Client instance. This client instance should be passed as a parameter when creating the LargeFileUploadTask or OneDriveLargeFileUploadTask object.
  • To create the LargeFileUploadTask object you need:
    • An upload session, as shown above.
    • A FileObject instance.

FileObject Interface

export interface FileObject<T> {
  content: T;
  name: string;
  size: number;
  sliceFile(range: Range): Promise<ArrayBuffer | Blob | Buffer>;
}

The Microsoft Graph JavaScript Client SDK provides two implementations -

  1. StreamUpload - Supports Node.js stream upload

import { StreamUpload } from "@microsoft/microsoft-graph-client";
import * as fs from "fs";

const fileName = "<FILE_NAME>";
const stats = fs.statSync(`./test/sample_files/${fileName}`);
const totalsize = stats.size;
const readStream = fs.createReadStream(`./test/sample_files/${fileName}`);
const fileObject = new StreamUpload(readStream, fileName, totalsize);

Note - For a browser application, you can use stream-browserify and buffer, as sketched below.
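
A hedged sketch of what that might look like, assuming a bundler configured to resolve stream-browserify and the buffer package (the helper name and wiring are illustrative, not part of the SDK):

import { StreamUpload } from "@microsoft/microsoft-graph-client";
import { Readable } from "stream-browserify";
import { Buffer } from "buffer";

// Wrap a browser File in a readable stream so it can be passed to StreamUpload.
async function toStreamUpload(file: File): Promise<StreamUpload> {
  const bytes = Buffer.from(await file.arrayBuffer());
  const readable = new Readable({
    read() {
      this.push(bytes);
      this.push(null); // signal end of stream
    },
  });
  return new StreamUpload(readable, file.name, file.size);
}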

  2. FileUpload - Supports upload of the file formats ArrayBuffer, Blob, and Buffer

import { FileUpload } from "@microsoft/microsoft-graph-client";
import * as fs from "fs";

const fileName = "<FILE_NAME>";
const stats = fs.statSync(`./test/sample_files/${fileName}`);
const totalsize = stats.size;
const fileContent = fs.readFileSync(`./test/sample_files/${fileName}`);
const fileObject = new FileUpload(fileContent, fileName, totalsize);

Note - You can also provide a custom FileObject implementation whose sliceFile(range: Range) function implements the logic for splitting the file into ranges, as sketched below.
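
A minimal sketch of such a custom implementation backed by an ArrayBuffer (the class name is hypothetical; only the FileObject<T> shape comes from the interface above):

import { FileObject, Range } from "@microsoft/microsoft-graph-client";

// Hypothetical FileObject implementation holding the whole file in memory.
class ArrayBufferFileObject implements FileObject<ArrayBuffer> {
  constructor(public content: ArrayBuffer, public name: string, public size: number) {}

  // Return only the requested byte range; Range bounds are inclusive.
  public sliceFile(range: Range): Promise<ArrayBuffer> {
    return Promise.resolve(this.content.slice(range.minValue, range.maxValue + 1));
  }
}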

Initialize the LargeFileUploadTask options with a progress handler and range size

const progress = (range?: Range, extraCallBackParam?: unknown) => {
  // Handle the progress event
};

const extraCallBackParam = "<ANY_ADDITIONAL_DATA>"; // additional parameter passed to the progress callback

const uploadEventHandlers: UploadEventHandlers = {
  progress,
  extraCallBackParam,
};

const options: LargeFileUploadTaskOptions = {
  rangeSize: 327680,
  uploadEventHandlers,
};

Create a LargeFileUploadTask object

const uploadTask = new LargeFileUploadTask(client, fileObject, uploadSession, options);
const uploadResult: UploadResult = await uploadTask.upload();

UploadResult contains two properties: location (received in the Outlook API response headers) and responseBody (the response body received after a successful upload).
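
For example, a short sketch of reading those properties once the upload finishes:

// location is populated for Outlook attachment uploads; responseBody carries
// the API response received when the upload completes.
const { location, responseBody } = uploadResult;
console.log("Location header:", location);
console.log("Response body:", responseBody);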

OneDriveLargeFileUploadTask

You can also use OneDriveLargeFileUploadTask, which provides a simpler way to upload files to the OneDrive API.

Example -

const uploadEventHandlers: UploadEventHandlers = {
  progress,
  extraCallBackParam: true,
};

const options: OneDriveLargeFileUploadOptions = {
  path: "/Documents",
  fileName,
  rangeSize: 1024 * 1024,
  uploadEventHandlers,
  uploadSessionURL: "optional_custom_uploadSessionURL", // if undefined, defaults to "/me/drive/root:/{file-path}:/createUploadSession"
};

const readStream = fs.createReadStream(`./${fileName}`);
const fileObject = new StreamUpload(readStream, fileName, totalsize);

// or

const uploadContent = fs.readFileSync(`./${fileName}`);
const fileObject = new FileUpload(uploadContent, fileName, totalsize);

const uploadTask = await OneDriveLargeFileUploadTask.createTaskWithFileObject(client, fileObject, options);
const uploadResult: UploadResult = await uploadTask.upload();

Note: The OneDriveLargeFileUploadTask.createTaskWithFileObject method also handles the upload session creation.

Resuming a broken upload

If the upload is interrupted partway through, you can easily resume it as long as you still have the uploadTask object in hand.

uploadTask.resume();
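
For example, a hedged sketch of retrying with resume() after a failed upload attempt (this assumes resume() resolves with an UploadResult, as in recent SDK versions):

let uploadResult: UploadResult;
try {
  uploadResult = await uploadTask.upload();
} catch (error) {
  // The connection dropped mid-upload; pick up again from the last successful range.
  uploadResult = await uploadTask.resume();
}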

You can even control the whole upload process

You can create the upload task and drive the upload yourself using the sliceFile and uploadSlice methods.

let range = uploadTask.getNextRange();
let slicedFile = await uploadTask.sliceFile(range);
await uploadTask.uploadSlice(slicedFile, range, uploadTask.file.size);
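
A hedged sketch of a complete manual loop built on these methods (the completion check assumes getNextRange() returns a range whose minValue is -1 once every byte has been uploaded, which is how the task's own upload() loop detects completion):

// Keep uploading slices until no range remains.
let nextRange = uploadTask.getNextRange();
while (nextRange.minValue !== -1) {
  const slice = await uploadTask.sliceFile(nextRange);
  await uploadTask.uploadSlice(slice, nextRange, uploadTask.file.size);
  nextRange = uploadTask.getNextRange();
}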

Cancelling a largeFileUpload task

Cancelling an upload session sends a DELETE request to the upload session URL

const cancelResponse = await uploadTask.cancel();

Get the largeFileUpload session

Returns the largeFileUpload session information containing the URL, expiry date and cancellation status of the task

const uploadSession: LargeFileUploadSession = uploadTask.getUploadSession();
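
For example, reading the individual fields from the session returned above (field names as defined on the LargeFileUploadSession interface):

console.log("Session URL:", uploadSession.url);
console.log("Expires:", uploadSession.expiry);
console.log("Cancelled:", uploadSession.isCancelled);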

Samples

Check out the samples for complete working examples.
