storage-adapters
npx skills add https://github.com/rytass/utils --skill storage-adapters
File Storage Adapters
This skill provides comprehensive guidance for using @rytass/storages-adapter-* packages to integrate file storage providers.
Overview
All adapters implement the StorageInterface from @rytass/storages, providing a unified API across different storage providers:
| Package | Provider | Description |
|---|---|---|
| `@rytass/storages-adapter-s3` | AWS S3 | Amazon S3 storage adapter |
| `@rytass/storages-adapter-gcs` | Google Cloud Storage | GCS storage adapter |
| `@rytass/storages-adapter-r2` | Cloudflare R2 | R2 storage adapter with custom domain support |
| `@rytass/storages-adapter-azure-blob` | Azure Blob Storage | Azure blob storage adapter |
| `@rytass/storages-adapter-local` | Local File System | Local disk storage with usage tracking |
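Because every adapter implements the same interface, application code can stay provider-agnostic. A minimal sketch of this pattern follows; the `MinimalStorage` shape and the `InMemoryStorage` fake are illustrative helpers for testing, not part of the packages:

```typescript
// Minimal shape shared by all adapters (a subset of StorageInterface).
interface MinimalStorage {
  write(file: Buffer, options?: { filename?: string }): Promise<{ key: string }>;
  read(key: string, options: { format: 'buffer' }): Promise<Buffer>;
  remove(key: string): Promise<void>;
}

// Illustrative in-memory fake -- handy for unit tests; not part of @rytass/storages.
class InMemoryStorage implements MinimalStorage {
  private files = new Map<string, Buffer>();

  async write(file: Buffer, options?: { filename?: string }) {
    const key = options?.filename ?? `file-${this.files.size}`;
    this.files.set(key, file);
    return { key };
  }

  async read(key: string, _options: { format: 'buffer' }) {
    const buffer = this.files.get(key);
    if (!buffer) throw new Error(`File not found: ${key}`);
    return buffer;
  }

  async remove(key: string) {
    this.files.delete(key);
  }
}

// Application code depends only on MinimalStorage, so any adapter can be swapped in.
async function roundTrip(storage: MinimalStorage): Promise<string> {
  const { key } = await storage.write(Buffer.from('hello'), { filename: 'greeting.txt' });
  const content = await storage.read(key, { format: 'buffer' });
  return content.toString();
}
```

Swapping `InMemoryStorage` for any real adapter leaves `roundTrip` unchanged.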
Base Interface (@rytass/storages)
All adapters share these core methods:
// StorageInterface - base interface (defines core methods only)
interface StorageInterface {
// Upload files
write(file: InputFile, options?: WriteFileOptions): Promise<StorageFile>;
batchWrite(files: InputFile[]): Promise<StorageFile[]>;
// Download files (3 overloads)
read(key: string): Promise<Readable>;
read(key: string, options: ReadBufferFileOptions): Promise<Buffer>;
read(key: string, options: ReadStreamFileOptions): Promise<Readable>;
// Delete files
remove(key: string): Promise<void>;
}
// ReadFileOptions variants
interface ReadBufferFileOptions {
format: 'buffer';
}
interface ReadStreamFileOptions {
format: 'stream';
}
// The following method is defined on the Storage abstract class (not in StorageInterface):
// - isExists(key: string): Promise<boolean>; // supported by all adapters
// The following method is an adapter-specific extra (not in StorageInterface):
// - url(key: string, options?): Promise<string>; // supported by the cloud adapters
Note:
Only LocalStorage supports the options array parameter of batchWrite(). The cloud adapters (S3, GCS, R2, Azure Blob) do not accept an options argument in batchWrite().
Key Types (exported from @rytass/storages):
- `InputFile` – Buffer or Readable stream (alias for `ConvertableFile`)
- `StorageFile` – Object with a `readonly key: string` property
- `WriteFileOptions` – `{ filename?: string; contentType?: string }`
- `FilenameHashAlgorithm` – `'sha1' | 'sha256'`
- `FileKey` – Type alias for `string`
- `ReadBufferFileOptions` – `{ format: 'buffer' }`
- `ReadStreamFileOptions` – `{ format: 'stream' }`
- `ConverterManager` – Re-export from `@rytass/file-converter`
StorageOptions:
interface StorageOptions<O = Record<string, unknown>> {
converters?: FileConverter<O>[]; // file converter array
hashAlgorithm?: FilenameHashAlgorithm; // filename hash algorithm (default: 'sha256')
}
Storage Base Class:
abstract class Storage<O = Record<string, unknown>> implements StorageInterface {
readonly converterManager: ConverterManager; // file converter manager
readonly hashAlgorithm: FilenameHashAlgorithm; // filename hash algorithm
// File type detection helper methods
getExtension(file: InputFile): Promise<FileTypeResult | undefined>;
getBufferFilename(buffer: Buffer): Promise<[string, string | undefined]>;
getStreamFilename(stream: Readable): Promise<[string, string | undefined]>;
// Abstract methods (each adapter must implement):
abstract write(file: InputFile, options?: WriteFileOptions): Promise<StorageFile>;
abstract batchWrite(files: InputFile[], options?: WriteFileOptions[]): Promise<StorageFile[]>;
abstract read(key: string): Promise<Readable>;
abstract read(key: string, options: ReadBufferFileOptions): Promise<Buffer>;
abstract read(key: string, options: ReadStreamFileOptions): Promise<Readable>;
abstract remove(key: string): Promise<void>;
abstract isExists(key: string): Promise<boolean>; // on the Storage abstract class, not in StorageInterface
}
Installation
# Install base package
npm install @rytass/storages
# Choose the adapter for your provider
npm install @rytass/storages-adapter-s3
npm install @rytass/storages-adapter-gcs
npm install @rytass/storages-adapter-r2
npm install @rytass/storages-adapter-azure-blob
npm install @rytass/storages-adapter-local
Quick Start
AWS S3
import { StorageS3Service } from '@rytass/storages-adapter-s3';
import { readFileSync, createReadStream } from 'fs';
// Initialize S3 storage
const storage = new StorageS3Service({
accessKey: process.env.AWS_ACCESS_KEY_ID!,
secretKey: process.env.AWS_SECRET_ACCESS_KEY!,
bucket: 'my-bucket',
region: 'ap-northeast-1',
// endpoint: 'https://custom-s3-endpoint.com', // Optional: custom S3 endpoint (e.g., MinIO)
});
// Upload a Buffer
const buffer = readFileSync('./document.pdf');
const file1 = await storage.write(buffer, {
filename: 'documents/report.pdf',
contentType: 'application/pdf',
});
console.log('Uploaded:', file1.key); // documents/report.pdf
// Upload a Stream (auto-generates filename with hash)
const stream = createReadStream('./image.jpg');
const file2 = await storage.write(stream);
console.log('Uploaded:', file2.key); // e.g., a3f2...b1c4.jpg
// Download as Buffer
const downloadedBuffer = await storage.read(file1.key, { format: 'buffer' });
// Download as Stream
const downloadedStream = await storage.read(file1.key, { format: 'stream' });
// Generate presigned URL (valid for limited time)
const url = await storage.url(file1.key);
console.log('Presigned URL:', url);
// Delete file
await storage.remove(file1.key);
// Check if file exists
const exists = await storage.isExists(file1.key);
console.log('Exists:', exists); // false
Google Cloud Storage
import { StorageGCSService } from '@rytass/storages-adapter-gcs';
// Initialize GCS storage with service account credentials
const storage = new StorageGCSService({
bucket: 'my-gcs-bucket',
projectId: 'my-project-id',
credentials: {
client_email: process.env.GCS_CLIENT_EMAIL!,
private_key: process.env.GCS_PRIVATE_KEY!.replace(/\\n/g, '\n'),
},
});
// Upload file
const file = await storage.write(buffer, {
filename: 'uploads/file.pdf',
});
// Generate signed URL with custom expiration (default: 24 hours)
const url = await storage.url(file.key, Date.now() + 1000 * 60 * 60); // 1 hour
Cloudflare R2
import { StorageR2Service } from '@rytass/storages-adapter-r2';
// Initialize R2 storage with custom domain
const storage = new StorageR2Service({
accessKey: process.env.R2_ACCESS_KEY!,
secretKey: process.env.R2_SECRET_KEY!,
bucket: 'my-r2-bucket',
account: process.env.R2_ACCOUNT_ID!,
customDomain: 'https://cdn.example.com', // Optional: rewrites presigned URLs
});
// Upload and get presigned URL
const file = await storage.write(buffer);
// Generate presigned URL with custom expiration (in seconds)
const url = await storage.url(file.key, { expires: 3600 }); // 1 hour
console.log('Custom domain URL:', url); // Uses customDomain if configured
Azure Blob Storage
import { StorageAzureBlobService } from '@rytass/storages-adapter-azure-blob';
// Initialize Azure Blob storage
const storage = new StorageAzureBlobService({
connectionString: process.env.AZURE_STORAGE_CONNECTION_STRING!,
container: 'my-container',
});
// Upload file
const file = await storage.write(buffer, {
filename: 'files/document.pdf',
});
// Generate SAS token URL with custom expiration
const url = await storage.url(file.key, Date.now() + 1000 * 60 * 60 * 24); // 24 hours
Local File System
import { LocalStorage, StorageLocalUsageInfo } from '@rytass/storages-adapter-local';
// Initialize local storage
const storage = new LocalStorage({
directory: './uploads',
autoMkdir: true, // Automatically create directory if not exists
});
// Upload file
const file = await storage.write(buffer, {
filename: 'documents/file.pdf',
});
// Download file
const downloadedBuffer = await storage.read(file.key, { format: 'buffer' });
// Get disk usage information (unique to Local adapter)
// Returns StorageLocalUsageInfo: { used: number, free: number, total: number } (in MB)
const usage: StorageLocalUsageInfo = await storage.getUsageInfo();
console.log(`Used: ${usage.used}MB, Free: ${usage.free}MB, Total: ${usage.total}MB`);
LocalStorage Types:
import {
LocalStorage,
StorageLocalOptions,
StorageLocalUsageInfo,
StorageLocalHelperCommands,
} from '@rytass/storages-adapter-local';
// Options interface
interface StorageLocalOptions extends StorageOptions {
directory: string; // storage directory path
autoMkdir?: boolean; // auto-create the directory (default: false)
}
// Usage info interface (unit: MB)
interface StorageLocalUsageInfo {
used: number; // used space
free: number; // free space
total: number; // total space
}
// Helper commands for *NIX systems (internal use)
enum StorageLocalHelperCommands {
USED = "du -sm __DIR__ | awk '{ print $1 }'",
FREE = "df -m __DIR__ | awk '$3 ~ /[0-9]+/ { print $4 }'",
TOTAL = "df -m __DIR__ | awk '$3 ~ /[0-9]+/ { print $2 }'",
}
Common Patterns
Buffer vs Stream Upload
Use Buffer when:
- File is already in memory
- File size is small (< 10MB)
- You need to process file content first
Use Stream when:
- File is large (> 10MB)
- Streaming from file system or network
- Memory efficiency is important
// Buffer upload - simple and direct
const buffer = readFileSync('./file.pdf');
await storage.write(buffer, { filename: 'file.pdf' });
// Stream upload - memory efficient for large files
const stream = createReadStream('./large-video.mp4');
await storage.write(stream, { filename: 'video.mp4' });
Batch Upload Operations
Upload multiple files concurrently:
import { readFileSync } from 'fs';
const files = [
readFileSync('./file1.pdf'),
readFileSync('./file2.jpg'),
readFileSync('./file3.doc'),
];
// Cloud adapters (S3, GCS, R2, Azure Blob) - no options array supported
const results = await storage.batchWrite(files);
// Filenames are auto-generated from a hash, e.g. a3f2...b1c4.pdf
results.forEach(result => console.log('Uploaded:', result.key));
LocalStorage only: an options array can assign a filename to each file:
import { LocalStorage } from '@rytass/storages-adapter-local';
const localStorage = new LocalStorage({ directory: './uploads', autoMkdir: true });
// LocalStorage supports per-file options
const results = await localStorage.batchWrite(files, [
{ filename: 'documents/file1.pdf' },
{ filename: 'images/file2.jpg' },
{ filename: 'documents/file3.doc' },
]);
To give each file a different filename/contentType on the cloud adapters, call write() in a loop instead.
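The loop-based fallback can be wrapped in a small helper. This is an illustrative sketch, not part of the packages; `Writable` is an assumed minimal shape that the real adapters satisfy:

```typescript
// Assumed minimal write() shape, satisfied by the real adapters.
interface Writable {
  write(
    file: Buffer,
    options?: { filename?: string; contentType?: string },
  ): Promise<{ key: string }>;
}

// Illustrative helper: per-file options on any adapter by looping write().
async function batchWriteWithOptions(
  storage: Writable,
  files: Buffer[],
  options: { filename?: string; contentType?: string }[],
): Promise<{ key: string }[]> {
  const results: { key: string }[] = [];

  // Sequential to keep memory bounded; use Promise.all for concurrency.
  for (let i = 0; i < files.length; i += 1) {
    results.push(await storage.write(files[i], options[i]));
  }

  return results;
}
```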
File Converters Integration
Use the converters option to process files automatically before upload:
import { StorageS3Service } from '@rytass/storages-adapter-s3';
import { ImageResizer } from '@rytass/file-converter-adapter-image-resizer';
import { ImageTranscoder } from '@rytass/file-converter-adapter-image-transcoder';
// Create a storage with automatic conversion
const storage = new StorageS3Service({
accessKey: process.env.AWS_ACCESS_KEY_ID!,
secretKey: process.env.AWS_SECRET_ACCESS_KEY!,
bucket: 'my-bucket',
region: 'ap-northeast-1',
// Automatically resize and convert to WebP before upload
converters: [
new ImageResizer({ maxWidth: 1200, maxHeight: 800, keepAspectRatio: true }),
new ImageTranscoder({ targetFormat: 'webp', quality: 85 }),
],
});
// The file is resized and format-converted before upload
const file = await storage.write(originalImageBuffer);
Custom Hash Algorithm
Change the filename hash algorithm (SHA-256 by default):
const storage = new StorageS3Service({
// ...
hashAlgorithm: 'sha1', // or 'sha256' (default)
});
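For intuition, a content-hash filename can be derived roughly like this. This is an illustrative sketch using node:crypto; the adapters' exact key format is not guaranteed to match:

```typescript
import { createHash } from 'node:crypto';

// Illustrative: derive a content-addressed filename from a buffer,
// mirroring the 'sha1' | 'sha256' FilenameHashAlgorithm options.
function hashedFilename(
  buffer: Buffer,
  algorithm: 'sha1' | 'sha256' = 'sha256',
  extension?: string,
): string {
  const digest = createHash(algorithm).update(buffer).digest('hex');
  return extension ? `${digest}.${extension}` : digest;
}
```

Identical content always maps to the same key, which is why stream uploads without an explicit filename still get stable, collision-resistant names.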
Error Handling
All adapters throw StorageError with specific error codes:
import { StorageError, ErrorCode } from '@rytass/storages';
try {
const file = await storage.read('non-existent-file.pdf', { format: 'buffer' });
} catch (error) {
if (error instanceof StorageError) {
switch (error.code) {
case ErrorCode.FILE_NOT_FOUND:
console.error('File not found');
break;
case ErrorCode.READ_FILE_ERROR:
console.error('Failed to read file');
break;
default:
console.error('Storage error:', error.message);
}
}
}
Error Codes:
- `WRITE_FILE_ERROR` ('101') – Failed to upload file
- `READ_FILE_ERROR` ('102') – Failed to download file
- `REMOVE_FILE_ERROR` ('103') – Failed to delete file
- `UNRECOGNIZED_ERROR` ('104') – Unknown error
- `DIRECTORY_NOT_FOUND` ('201') – Directory not found (Local adapter)
- `FILE_NOT_FOUND` ('202') – File does not exist
Note: Error codes are strings, not numbers.
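Because the codes are strings, compare against the exported enum (or string literals), never against numbers. A minimal sketch of the pitfall; `StorageErrorLike` below is an assumed shape standing in for the real StorageError:

```typescript
// Assumed minimal shape of StorageError -- codes are strings like '202'.
class StorageErrorLike extends Error {
  constructor(
    public readonly code: string,
    message: string,
  ) {
    super(message);
  }
}

function isFileNotFound(error: unknown): boolean {
  // '202' is FILE_NOT_FOUND per the list above. Comparing code === 202
  // (a number) would always be false, since codes are strings.
  return error instanceof StorageErrorLike && error.code === '202';
}
```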
File Existence Checking
Always check if a file exists before performing operations:
const key = 'documents/important.pdf';
// Check before reading
if (await storage.isExists(key)) {
const file = await storage.read(key, { format: 'buffer' });
// Process file...
} else {
console.log('File does not exist');
}
// Check before deleting
if (await storage.isExists(key)) {
await storage.remove(key);
console.log('File deleted');
}
Generating Presigned URLs
Cloud adapters support generating temporary URLs for direct file access:
// S3 - NO custom expiration supported (uses default)
const s3Url = await s3Storage.url('file.pdf');
// GCS - custom expiration (timestamp in milliseconds, default: 24 hours)
const gcsUrl = await gcsStorage.url('file.pdf', Date.now() + 1000 * 60 * 30); // 30 minutes
// R2 - custom expiration (seconds in options object) with custom domain
const r2Url = await r2Storage.url('file.pdf', { expires: 1800 }); // 30 minutes in seconds
// Azure Blob - custom expiration (timestamp in milliseconds, default: 24 hours)
const azureUrl = await azureStorage.url('file.pdf', Date.now() + 1000 * 60 * 60); // 1 hour
URL Method Signatures:
| Adapter | Signature | Expiration |
|---|---|---|
| S3 | `url(key: string)` | Default only |
| GCS | `url(key: string, expires?: number)` | Timestamp (ms), default: 24 hours |
| R2 | `url(key: string, options?: { expires?: number })` | Seconds |
| Azure | `url(key: string, expires?: number)` | Timestamp (ms), default: 24 hours |
Note: The Local adapter does not support the url() method, since files are stored on the local file system.
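Since the expiration units differ per adapter (absolute millisecond timestamps for GCS/Azure vs. relative seconds for R2), a small normalizer can keep call sites uniform. An illustrative sketch, with unit conventions taken from the table above; the helper is not part of the packages:

```typescript
type UrlAdapter = 'gcs' | 'azure' | 'r2';

// Illustrative: turn one "valid for N seconds" intent into the value each
// adapter's url() expects. `now` is injectable for testing.
function expirationArg(
  adapter: UrlAdapter,
  validForSeconds: number,
  now: number = Date.now(),
): number {
  switch (adapter) {
    case 'r2':
      return validForSeconds; // R2 takes relative seconds ({ expires })
    case 'gcs':
    case 'azure':
      return now + validForSeconds * 1000; // GCS/Azure take an absolute ms timestamp
  }
}
```

Call sites then express intent once, e.g. `r2Storage.url(key, { expires: expirationArg('r2', 1800) })` or `gcsStorage.url(key, expirationArg('gcs', 1800))`.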
Feature Comparison
| Feature | S3 | GCS | R2 | Azure Blob | Local |
|---|---|---|---|---|---|
| Presigned URL | ✅ | ✅ | ✅ | ✅ | ❌ |
| Custom Domain | ❌ | ❌ | ✅ | ❌ | N/A |
| Batch Upload | ✅ | ✅ | ✅ | ✅ | ✅ |
| Batch Upload w/ Options | ❌ | ❌ | ❌ | ❌ | ✅ |
| Buffer Support | ✅ | ✅ | ✅ | ✅ | ✅ |
| Stream Support | ✅ | ✅ | ✅ | ✅ | ✅ |
| Usage Info | ❌ | ❌ | ❌ | ❌ | ✅ |
| File Converters | ✅ | ✅ | ✅ | ✅ | ✅ |
| Hash Algorithms | ✅ | ✅ | ✅ | ✅ | ✅ |
| Auto MIME Detection | ✅ | ✅ | ✅ | ✅ | ✅ |
| Custom Filename | ✅ | ✅ | ✅ | ✅ | ✅ |
NestJS Integration
Complete integration with NestJS dependency injection:
Basic Setup
// file-storage.service.ts
import { Injectable } from '@nestjs/common';
import { ConfigService } from '@nestjs/config';
import { StorageS3Service } from '@rytass/storages-adapter-s3';
import { LocalStorage } from '@rytass/storages-adapter-local';
import { Storage } from '@rytass/storages';
@Injectable()
export class FileStorageService {
private storage: Storage;
constructor(private configService: ConfigService) {
const provider = this.configService.get('STORAGE_PROVIDER', 'local');
if (provider === 's3') {
this.storage = new StorageS3Service({
accessKey: this.configService.get('AWS_ACCESS_KEY_ID')!,
secretKey: this.configService.get('AWS_SECRET_ACCESS_KEY')!,
bucket: this.configService.get('S3_BUCKET')!,
region: this.configService.get('AWS_REGION', 'ap-northeast-1')!,
});
} else {
this.storage = new LocalStorage({
directory: this.configService.get('LOCAL_STORAGE_DIR', './uploads')!,
autoMkdir: true,
});
}
}
async uploadFile(file: Buffer, filename: string, contentType?: string) {
return this.storage.write(file, { filename, contentType });
}
async downloadFile(key: string): Promise<Buffer> {
return this.storage.read(key, { format: 'buffer' });
}
async deleteFile(key: string): Promise<void> {
return this.storage.remove(key);
}
async fileExists(key: string): Promise<boolean> {
return this.storage.isExists(key);
}
async getFileUrl(key: string): Promise<string> {
// Type guard to check if storage supports url()
if ('url' in this.storage && typeof this.storage.url === 'function') {
return this.storage.url(key);
}
throw new Error('Storage provider does not support presigned URLs');
}
}
Async Configuration with ConfigService
// app.module.ts
import { Module } from '@nestjs/common';
import { ConfigModule } from '@nestjs/config';
import { FileStorageService } from './file-storage.service';
@Module({
imports: [
ConfigModule.forRoot({
isGlobal: true,
}),
],
providers: [FileStorageService],
exports: [FileStorageService],
})
export class StorageModule {}
Dynamic Provider Selection
// storage.factory.ts
import { ConfigService } from '@nestjs/config';
import { Storage } from '@rytass/storages';
import { StorageS3Service } from '@rytass/storages-adapter-s3';
import { StorageGCSService } from '@rytass/storages-adapter-gcs';
import { LocalStorage } from '@rytass/storages-adapter-local';
export const STORAGE_TOKEN = Symbol('STORAGE');
export const storageFactory = {
provide: STORAGE_TOKEN,
useFactory: (configService: ConfigService): Storage => {
const provider = configService.get('STORAGE_PROVIDER', 'local');
switch (provider) {
case 's3':
return new StorageS3Service({
accessKey: configService.get('AWS_ACCESS_KEY_ID')!,
secretKey: configService.get('AWS_SECRET_ACCESS_KEY')!,
bucket: configService.get('S3_BUCKET')!,
region: configService.get('AWS_REGION')!,
});
case 'gcs':
return new StorageGCSService({
bucket: configService.get('GCS_BUCKET')!,
projectId: configService.get('GCS_PROJECT_ID')!,
credentials: {
client_email: configService.get('GCS_CLIENT_EMAIL')!,
private_key: configService.get('GCS_PRIVATE_KEY')!.replace(/\\n/g, '\n'),
},
});
case 'local':
default:
return new LocalStorage({
directory: configService.get('LOCAL_STORAGE_DIR', './uploads')!,
autoMkdir: true,
});
}
},
inject: [ConfigService],
};
// app.module.ts
@Module({
providers: [storageFactory],
exports: [STORAGE_TOKEN],
})
export class StorageModule {}
// Using in service
@Injectable()
export class UploadService {
constructor(@Inject(STORAGE_TOKEN) private storage: Storage) {}
async handleUpload(file: Express.Multer.File) {
return this.storage.write(file.buffer, {
filename: `uploads/${file.originalname}`,
contentType: file.mimetype,
});
}
}
Environment Variables
# .env
STORAGE_PROVIDER=s3 # or gcs, r2, azure, local
# AWS S3
AWS_ACCESS_KEY_ID=your_access_key
AWS_SECRET_ACCESS_KEY=your_secret_key
AWS_REGION=ap-northeast-1
S3_BUCKET=my-bucket
# Google Cloud Storage
GCS_BUCKET=my-gcs-bucket
GCS_PROJECT_ID=my-project-id
GCS_CLIENT_EMAIL=service-account@project.iam.gserviceaccount.com
GCS_PRIVATE_KEY="-----BEGIN PRIVATE KEY-----\n...\n-----END PRIVATE KEY-----\n"
# Cloudflare R2
R2_ACCESS_KEY=your_r2_access_key
R2_SECRET_KEY=your_r2_secret_key
R2_ACCOUNT_ID=your_account_id
R2_BUCKET=my-r2-bucket
R2_CUSTOM_DOMAIN=https://cdn.example.com
# Azure Blob
AZURE_STORAGE_CONNECTION_STRING=DefaultEndpointsProtocol=https;AccountName=...
AZURE_CONTAINER=my-container
# Local Storage
LOCAL_STORAGE_DIR=./uploads
Detailed Documentation
For complete API reference and advanced usage: