# AWS Services
The @pikku/aws-services package provides AWS implementations of Pikku's core service interfaces.
- npm: `npm install @pikku/aws-services`
- Yarn: `yarn add @pikku/aws-services`
- pnpm: `pnpm add @pikku/aws-services`
- Bun: `bun add @pikku/aws-services`
## Services

| Class | Implements | AWS Service |
|---|---|---|
| `S3Content` | `ContentService` | S3 + CloudFront |
| `AWSSecrets` | `SecretService` | Secrets Manager |
| `SQSQueueService` | `QueueService` | SQS |
## S3Content
File storage using S3 with CloudFront signed URLs:
```typescript
import { S3Content } from '@pikku/aws-services'

const contentService = new S3Content(
  {
    bucketName: 'my-bucket',
    region: 'us-east-1',
    endpoint: undefined, // Optional: for LocalStack or custom endpoints
  },
  logger,
  {
    keyPairId: process.env.CLOUDFRONT_KEY_PAIR_ID!,
    privateKey: process.env.CLOUDFRONT_PRIVATE_KEY!,
  }
)
```
### Features

- **Signed uploads** – `getUploadURL()` generates presigned S3 PUT URLs (1 hour expiry)
- **Signed downloads** – `signContentKey()` and `signURL()` use CloudFront signing
- **File operations** – `readFile()`, `writeFile()`, `copyFile()`, `deleteFile()`
- **Streaming** – reads and writes use Node.js streams
### Usage
```typescript
// Generate upload URL for client-side upload
const { uploadUrl, assetKey } = await contentService.getUploadURL(
  'uploads/photo.jpg',
  'image/jpeg'
)

// Sign a download URL (expires in 1 hour)
const signedUrl = await contentService.signContentKey(
  'uploads/photo.jpg',
  new Date(Date.now() + 3600_000)
)

// Read file as stream
const stream = await contentService.readFile('uploads/photo.jpg')
```
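`getUploadURL()` takes the content type alongside the key. As a minimal sketch, you might derive the MIME type from the file extension before requesting the URL — the helper below is illustrative and not part of `@pikku/aws-services`:

```typescript
// Illustrative helper: map common file extensions to MIME types before
// calling contentService.getUploadURL(key, contentType)
const MIME_TYPES: Record<string, string> = {
  '.jpg': 'image/jpeg',
  '.jpeg': 'image/jpeg',
  '.png': 'image/png',
  '.pdf': 'application/pdf',
}

function contentTypeFor(key: string): string {
  const dot = key.lastIndexOf('.')
  const ext = dot === -1 ? '' : key.slice(dot).toLowerCase()
  // Fall back to a generic binary type for unknown extensions
  return MIME_TYPES[ext] ?? 'application/octet-stream'
}
```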
See ContentService API for the full interface.
## AWSSecrets
Read-only access to AWS Secrets Manager:
```typescript
import { AWSSecrets } from '@pikku/aws-services'

const secretService = new AWSSecrets({
  awsRegion: 'us-east-1',
})
```
### Usage
```typescript
// Get a plain string secret
const apiKey = await secretService.getSecret('my-api-key')

// Get and parse a JSON secret
const dbConfig = await secretService.getSecretJSON<{
  host: string
  password: string
}>('database-config')

// Check if a secret exists
const exists = await secretService.hasSecret('my-api-key')
```
:::note
`AWSSecrets` is read-only: `setSecretJSON()` and `deleteSecret()` throw errors. Use the AWS Console or CLI to manage secrets.
:::
See SecretService API for the full interface.
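Each `getSecret()` call hits Secrets Manager, so hot paths often memoize lookups. A minimal sketch of such a wrapper — the function name and shape are assumptions, not part of the package:

```typescript
// Illustrative sketch: memoize secret lookups so repeated reads don't
// call Secrets Manager every time
function cacheSecrets(
  getSecret: (name: string) => Promise<string>
): (name: string) => Promise<string> {
  const cache = new Map<string, Promise<string>>()
  return (name) => {
    let pending = cache.get(name)
    if (!pending) {
      // Cache the promise itself so concurrent callers share one fetch
      pending = getSecret(name)
      cache.set(name, pending)
    }
    return pending
  }
}
```

A production version would also need an expiry policy so rotated secrets are eventually picked up.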
## SQSQueueService
Fire-and-forget job publishing to SQS queues:
```typescript
import { SQSQueueService } from '@pikku/aws-services'

const queueService = new SQSQueueService({
  region: 'us-east-1',
  queueUrlPrefix: 'https://sqs.us-east-1.amazonaws.com/123456789/',
  endpoint: undefined, // Optional: for LocalStack
})
```
### Usage
```typescript
// Add a job to a queue
const messageId = await queueService.add('my-queue', {
  userId: 'user-123',
  action: 'send-email',
})

// Add with delay (max 900 seconds / 15 minutes per SQS limits)
const delayedId = await queueService.add(
  'my-queue',
  { action: 'reminder' },
  { delay: 300_000 } // 5 minutes in milliseconds
)
```
### Limitations

SQS is a fire-and-forget service – `supportsResults` is `false`:

- No `getJob()` support (throws an error)
- No job status tracking after submission
- Delay limited to 900 seconds (SQS constraint)
- Standard queues only (no FIFO)
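Since the `delay` option is in milliseconds while SQS caps `DelaySeconds` at 900, it can help to validate the delay before enqueueing. A hypothetical guard (not part of the package):

```typescript
// Illustrative guard: SQS caps message delay at 900 seconds (15 minutes),
// while the queue service's `delay` option is given in milliseconds
const MAX_SQS_DELAY_MS = 900_000

function toSqsDelaySeconds(delayMs: number): number {
  if (delayMs < 0 || delayMs > MAX_SQS_DELAY_MS) {
    throw new RangeError(
      `SQS delay must be between 0 and ${MAX_SQS_DELAY_MS} ms, got ${delayMs}`
    )
  }
  // SQS only accepts whole seconds, so round up
  return Math.ceil(delayMs / 1000)
}
```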
For job result tracking, use BullMQ or PG Boss instead.
### Processing SQS Messages
On the worker side, use `runSQSQueueWorker` from `@pikku/lambda` to process messages:
```typescript
import type { SQSHandler } from 'aws-lambda'
import { runSQSQueueWorker } from '@pikku/lambda'
// `coldStart` and `createWireServices` come from your application's
// bootstrap code – adjust the import path to your project
import { coldStart, createWireServices } from './services.js'

export const sqsHandler: SQSHandler = async (event) => {
  const singletonServices = await coldStart()
  return await runSQSQueueWorker({
    singletonServices,
    createWireServices,
    event,
  })
}
```
See AWS Lambda β SQS Queue Worker for the full setup.
## Setup Example
Register AWS services in your singleton services:
```typescript
import { S3Content, AWSSecrets, SQSQueueService } from '@pikku/aws-services'

const singletonServices = await createSingletonServices(config, {
  content: new S3Content(s3Config, logger, cloudFrontConfig),
  secrets: new AWSSecrets({ awsRegion: 'us-east-1' }),
  queueService: new SQSQueueService(sqsConfig),
})
```
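Since `S3Content` and `SQSQueueService` both accept an optional `endpoint` for LocalStack, one common pattern is to derive the config from environment variables so the same setup code targets LocalStack locally and real AWS elsewhere. A sketch, where the variable names are assumptions rather than anything the package reads itself:

```typescript
// Illustrative sketch: build the SQS config from the environment.
// USE_LOCALSTACK and SQS_QUEUE_URL_PREFIX are hypothetical variable names.
function sqsConfigFromEnv(env: Record<string, string | undefined>) {
  const useLocalStack = env.USE_LOCALSTACK === 'true'
  return {
    region: env.AWS_REGION ?? 'us-east-1',
    queueUrlPrefix:
      env.SQS_QUEUE_URL_PREFIX ??
      'https://sqs.us-east-1.amazonaws.com/123456789/',
    // LocalStack's default edge port is 4566
    endpoint: useLocalStack ? 'http://localhost:4566' : undefined,
  }
}
```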