Official utilities to use the Hugging Face Hub API.
```shell
pnpm add @huggingface/hub
# or
npm add @huggingface/hub
# or
yarn add @huggingface/hub
```
```ts
// esm.sh
import { uploadFiles, listModels } from "https://esm.sh/@huggingface/hub";
// or npm:
import { uploadFiles, listModels } from "npm:@huggingface/hub";
```
Check out the full documentation.
For some of the calls, you need to create an account and generate an access token.
Learn how to find free models using the hub package in this interactive tutorial.
```ts
import * as hub from "@huggingface/hub";
import type { RepoDesignation } from "@huggingface/hub";
import { pathToFileURL } from "node:url";

const repo: RepoDesignation = { type: "model", name: "myname/some-model" };

const { name: username } = await hub.whoAmI({ accessToken: "hf_..." });

for await (const model of hub.listModels({ search: { owner: username }, accessToken: "hf_..." })) {
  console.log("My model:", model);
}

const specificModel = await hub.modelInfo({ name: "openai-community/gpt2" });
await hub.checkRepoAccess({ repo, accessToken: "hf_..." });

await hub.createRepo({ repo, accessToken: "hf_...", license: "mit" });

await hub.uploadFiles({
  repo,
  accessToken: "hf_...",
  files: [
    // path + blob content
    {
      path: "file.txt",
      content: new Blob(["Hello World"]),
    },
    // Local file URL
    pathToFileURL("./pytorch-model.bin"),
    // Web URL
    new URL("https://huggingface.co/xlm-roberta-base/resolve/main/tokenizer.json"),
    // Path + Web URL
    {
      path: "myfile.bin",
      content: new URL("https://huggingface.co/bert-base-uncased/resolve/main/pytorch_model.bin"),
    },
    // Can also work with native File in browsers
  ],
});

// or

for await (const progressEvent of await hub.uploadFilesWithProgress({
  repo,
  accessToken: "hf_...",
  files: [ ... ],
})) {
  console.log(progressEvent);
}

await hub.deleteFile({ repo, accessToken: "hf_...", path: "myfile.bin" });

await (await hub.downloadFile({ repo, path: "README.md" })).text();

for await (const fileInfo of hub.listFiles({ repo })) {
  console.log(fileInfo);
}

await hub.deleteRepo({ repo, accessToken: "hf_..." });
```
It's possible to login using OAuth ("Sign in with HF").
This will allow you to get an access token to use some of the API, depending on the scopes set inside the Space or the OAuth App.
```ts
import { oauthLoginUrl, oauthHandleRedirectIfPresent } from "@huggingface/hub";

const oauthResult = await oauthHandleRedirectIfPresent();

if (!oauthResult) {
  // If the user is not logged in, redirect to the login page
  window.location.href = await oauthLoginUrl();
}

// You can use oauthResult.accessToken, oauthResult.accessTokenExpiresAt and oauthResult.userInfo
console.log(oauthResult);
```
Check out the demo: https://huggingface.co/spaces/huggingfacejs/client-side-oauth
The @huggingface/hub
package provides basic capabilities to scan the cache directory. Learn more in the Manage huggingface_hub cache-system guide.
You can get the list of cached repositories using the scanCacheDir
function.
```ts
import { scanCacheDir } from "@huggingface/hub";

const result = await scanCacheDir();

console.log(result);
```
Note: this does not work in the browser
You can download a file of a repository into the cache directory using the downloadFileToCacheDir
function.
```ts
import { downloadFileToCacheDir } from "@huggingface/hub";

const file = await downloadFileToCacheDir({ repo: "foo/bar", path: "README.md" });

console.log(file);
```
Note: this does not work in the browser
You can download an entire repository at a given revision in the cache directory using the snapshotDownload
function.
```ts
import { snapshotDownload } from "@huggingface/hub";

const directory = await snapshotDownload({
  repo: "foo/bar",
});

console.log(directory);
```
Internally, this uses the downloadFileToCacheDir
function.
Note: this does not work in the browser
When uploading large files, you may want to run the commit
calls inside a worker to offload the sha256 computations.
Remote resources and local files should be passed as URL
whenever possible, so they can be lazily loaded in chunks to reduce RAM usage. Passing a File
inside the browser's context is fine, because it natively behaves as a Blob
.
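The library handles this for you, but to illustrate the idea, here is a hedged, self-contained sketch (not @huggingface/hub's actual implementation) of moving a sha256 computation off the main thread with Node.js worker threads:

```ts
// Sketch: compute sha256 in a worker thread so the main thread
// stays responsive during large uploads. Uses only Node.js built-ins.
import { Worker } from "node:worker_threads";

// Worker body, evaluated as CommonJS via `eval: true`.
const workerCode = `
  const { parentPort, workerData } = require("node:worker_threads");
  const { createHash } = require("node:crypto");
  const hash = createHash("sha256").update(workerData).digest("hex");
  parentPort.postMessage(hash);
`;

function sha256InWorker(data: Uint8Array): Promise<string> {
  return new Promise((resolve, reject) => {
    const worker = new Worker(workerCode, { eval: true, workerData: data });
    worker.once("message", resolve);
    worker.once("error", reject);
  });
}

const digest = await sha256InWorker(new TextEncoder().encode("Hello World"));
console.log(digest);
```

In a real upload you would hash each chunk of the file this way (or in a pool of workers) while the main thread keeps streaming data.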
Under the hood, @huggingface/hub
uses a lazy blob implementation to load the file.
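The slice-based access pattern that makes this lazy loading possible can be illustrated with the standard Blob API (a simplified sketch, not the library's internal lazy blob):

```ts
// Blob.slice() returns a view over a byte range without copying it;
// the bytes are only read when the slice is consumed.
const blob = new Blob(["Hello World"]);

const chunk = blob.slice(0, 5); // no bytes read yet
const text = await chunk.text(); // only these 5 bytes are materialized

console.log(text);
```

With a URL- or File-backed blob, each slice can be fetched on demand (e.g. via HTTP range requests or file seeks), so the whole file never has to sit in memory at once.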
- @huggingface/tasks
: Typings only
- @huggingface/lz4
: URL join utility