# AWS Lambda Durable Functions on Hexagonal Architecture: The Pattern You’ve Been Looking For
2026-02-25
admin
## Introduction

Yes, you read it right. When building serverless applications on AWS, one little thing seems to have been forgotten in 2026: design patterns. And that's especially true when using Lambda Durable Functions and its new open-source Durable execution SDK. And no, this is not another "Step Functions vs Lambda Durable Functions" comparison. In this article, we will not look back. We will explore how you can build a strong foundation for Durable Functions with Hexagonal Architecture, from a developer's perspective, and why this pattern might be the missing piece for building durable applications.

At AWS re:Invent 2025, AWS introduced Lambda Durable Functions with an interesting premise: build like a monolith, deploy to microservices. As a long-time fan of the microservices approach, I have to admit: I got super excited that we can now build a Lambdalith without a guilty conscience. A little over a year ago, I wrote an article explaining how to refactor a Lambdalith into microservices using Hexagonal Architecture. It is fascinating that what was once considered an anti-pattern can now be done in reverse: Durable Functions lets us build the monolith while keeping the benefits of microservices.

Plus, at first glance, Durable Functions looked like an immediate replacement for AWS Step Functions, aimed at developers. And that's what we've seen the community doing so far: comparing both services and exploring ways of migrating existing state machines to Durable Functions.
The new Durable execution SDK is powerful, and it can do pretty much everything you already have available in Step Functions. But when building a Lambda function that handles orchestration, it is also easy to fall into the trap of the well-known Lambda bogeyman: spaghetti code, which makes the application hard to explain and evolve.

The problem isn't Durable Functions. The problem isn't its SDK. The problem is the lack of boundaries. And if there is one thing I learned from my Durable endeavors, it is this: we now need better ways of organizing application code.

## The old-fashioned way of building software

Lately, one thought keeps hammering in my mind: we need principles, more than ever. Back when coding tools were nothing but a daydream, we used to think differently. Coding was the most important skill of a developer, along with the ability to structure code in a way that is, among other things, readable and testable. In object-oriented programming, the SOLID principles, for instance, remain a great starting point for designing clean software. But SOLID alone was never the end goal. You can absolutely apply SOLID principles when building with Lambda, but when orchestration becomes central, clear separation of concerns matters even more. That's where Hexagonal Architecture comes in.

## Hexagonal Architecture

Hexagonal Architecture, also known as "Ports and Adapters," offers a way to modularize your application so it can be more flexible and maintainable. It is built from three parts:

- Core logic (domain): contains the application's core business rules, completely isolated from the outer layers.
- Ports: defined interfaces that describe the actions available to the core.
- Adapters: connect external systems to the application's core through ports, making it easy to switch out databases, API integrations, or other dependencies without impacting the core logic.

By isolating the core business logic from external systems, this architecture promotes separation of concerns: the application's core isn't tightly coupled to any specific technology or service. The real strength of Hexagonal Architecture is the boundaries it creates. It keeps business logic isolated, dependencies replaceable, and infrastructure concerns at the edges. That becomes especially important once we introduce the Durable execution SDK: it brings powerful workflow capabilities, but also execution-specific mechanics that should be kept separate from the rest of your code. Hexagonal Architecture doesn't remove that complexity. It gives it a place to live.
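To make the three parts concrete, here is a minimal, dependency-free sketch of a port, an adapter, and a core function. The names (`IStoragePort`, `InMemoryStorageAdapter`, `processOrder`) are illustrative, not taken from the example repository:

```typescript
// Port: an interface the core owns and depends on
interface IStoragePort {
  put(key: string, value: string): Promise<void>;
}

// Adapter: one possible implementation, living at the edge
class InMemoryStorageAdapter implements IStoragePort {
  readonly store = new Map<string, string>();
  async put(key: string, value: string): Promise<void> {
    this.store.set(key, value);
  }
}

// Core logic: depends only on the port, never on a concrete adapter
async function processOrder(orderId: string, storage: IStoragePort): Promise<string> {
  const key = `orders/${orderId}.json`;
  await storage.put(key, JSON.stringify({ orderId, status: 'processed' }));
  return key;
}

// Swapping the adapter (S3, DynamoDB, local disk) never touches processOrder
const storage = new InMemoryStorageAdapter();
processOrder('42', storage).then((key) => console.log(key)); // orders/42.json
```

Only the adapter knows about a concrete technology; the core is testable with any fake that satisfies the port.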
## Durable 🤝 Hexagonal

Durable Functions changed something important: the what and the how now live in the same place. With regular Lambda functions, we mostly wrote the what: validate an order, process a payment, update a record. How it was executed wasn't something we had to think much about, as that used to live in our infrastructure code (a.k.a. Step Functions). Plus, what was previously split into microservices can now be part of a single monolith, as Durable Functions gives us "distributed system reliability". More importantly, with the Durable execution SDK, the how is part of the code. Parallel steps, maps, and child contexts all sit next to the business logic. That's where it can get confusing.

Hexagonal Architecture is not a silver bullet, but it allows us to separate those concerns. The domain stays focused on what the system does (with a little bit of how). The workflow base layer handles how it runs. The adapters handle external calls. Durable Functions gives us reliability in a monolith; Hexagonal keeps the structure clean. And when that happens, the SOLID principles make sense again. This is where things start to get really interesting.

## Inversion of control (IoC)

> "In software design, inversion of control (IoC) is a design principle in which custom-written portions of a computer program receive the flow of control from an external source (e.g., a framework). In procedural programming, a program's custom code calls reusable libraries to take care of generic tasks, but with inversion of control, it is the external code or framework that is in control and calls the custom code."
>
> — Source: https://en.wikipedia.org/wiki/Inversion_of_control

In short, IoC is the practical tool that makes Hexagonal Architecture work. It is how you implement the D in SOLID: abstractions should not depend on details; details should depend on abstractions. So how do we leverage IoC when building with Lambda Durable Functions?
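In plain TypeScript, inversion of control can be as simple as constructor injection against an abstraction. A minimal sketch (the names `INotifier` and `OrderService` are illustrative):

```typescript
interface INotifier {
  send(message: string): string;
}

class EmailNotifier implements INotifier {
  send(message: string): string {
    return `email: ${message}`;
  }
}

class SmsNotifier implements INotifier {
  send(message: string): string {
    return `sms: ${message}`;
  }
}

// High-level code depends on the INotifier abstraction, not on a concrete class.
// The caller (or an IoC container) decides which implementation flows in.
class OrderService {
  constructor(private readonly notifier: INotifier) {}
  confirm(orderId: string): string {
    return this.notifier.send(`order ${orderId} confirmed`);
  }
}

console.log(new OrderService(new EmailNotifier()).confirm('42')); // email: order 42 confirmed
console.log(new OrderService(new SmsNotifier()).confirm('42'));   // sms: order 42 confirmed
```

`OrderService` never changes when the notification channel does; an IoC container just automates this hand-wiring.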
## A concrete example

You are a developer. You need to build a data pipeline with the following requirements:

- Ingest some data, transform it, and store it in a database.
- Support more than one data source; the ingestion and transformation code will differ depending on the data type.
- Build the application in a modular way, so that it is easier to evolve with potential new data types in the future.

After some conversations with your team, you decided to build it like a monolith, so that you don't have the cognitive overhead of splitting the application into microservices. Single code base, single deployment, super straightforward. Of course, when someone asks how exactly you are going to design it, the universal engineering answer applies: it depends. And although it does indeed depend, you are a great developer. You want to build a future-proof application, and after weighing the options, you decided to:

- Use Lambda Durable Functions: build a monolith and use parallel execution with automatic retries for each data type, for reliability.
- Leverage Hexagonal Architecture to keep the code structure clean.
- Apply IoC (Inversion of Control), so that different ingestion and transformation code can be injected without modifying the orchestration logic, avoiding condition-heavy, tightly coupled code.

## It's all about code

Now let's look at what that means in practice. To demonstrate, I used InversifyJS, a library for creating inversion of control (IoC) containers in TypeScript. An IoC container uses a class constructor to identify and inject its dependencies. While Hexagonal Architecture is particularly well suited to typed languages, it is language-agnostic and can be implemented in any language or framework of your choice.
## 1️⃣ The IoC container: wiring behavior, not hard-coding it

The container defines multiple implementations for the same abstractions. Each data source and mapper is bound by name, allowing us to switch behavior based on context instead of conditionals.

```typescript
const container: Container = new Container();

// Bind multiple data source implementations with names
container.bind<IDataSource>(TYPES.DataSource).to(CustomerDataSource).whenNamed('customers');
container.bind<IDataSource>(TYPES.DataSource).to(ProductDataSource).whenNamed('products');
container.bind<IDataSource>(TYPES.DataSource).to(OrderDataSource).whenNamed('orders');

// Bind multiple data mapper implementations with names
container.bind<IDataMapper>(TYPES.DataMapper).to(CustomerDataMapper).whenNamed('customers');
container.bind<IDataMapper>(TYPES.DataMapper).to(ProductDataMapper).whenNamed('products');
container.bind<IDataMapper>(TYPES.DataMapper).to(OrderDataMapper).whenNamed('orders');

// A factory resolves the matching data source + mapper pair by name
container
  .bind<Factory<{ dataSource: IDataSource; dataMapper: IDataMapper }, [string]>>(TYPES.DataSourceFactory)
  .toFactory((context: ResolutionContext) => {
    return (named: string) => {
      const dataSource: IDataSource = context.get<IDataSource>(TYPES.DataSource, { name: named });
      const dataMapper = context.get<IDataMapper>(TYPES.DataMapper, { name: named });
      return { dataSource, dataMapper };
    };
  });
```

Here, IoC allows us to inject different ingestion and transformation strategies without changing the workflow structure. No `if (type === 'customers')` spread all over the codebase.
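If you prefer to see the mechanics without a library, the named-binding-plus-factory idea can be mimicked with a plain map. This is a simplified, dependency-free sketch of the same pattern (InversifyJS does considerably more, such as constructor injection and lifecycle scopes):

```typescript
interface IDataSource { fetch(): Promise<Array<{ id: string }>>; }
interface IDataMapper { mapToDomain(item: { id: string }): { id: string; kind: string }; }

type Pair = { dataSource: IDataSource; dataMapper: IDataMapper };

// One entry per data type, mirroring the whenNamed(...) bindings above
const bindings = new Map<string, () => Pair>([
  ['customers', () => ({
    dataSource: { fetch: async () => [{ id: 'c1' }] },
    dataMapper: { mapToDomain: (i) => ({ id: i.id, kind: 'customer' }) },
  })],
  ['products', () => ({
    dataSource: { fetch: async () => [{ id: 'p1' }] },
    dataMapper: { mapToDomain: (i) => ({ id: i.id, kind: 'product' }) },
  })],
]);

// The factory resolves a matching pair by name — no if/else chains in the workflow
function dataSourceFactory(named: string): Pair {
  const make = bindings.get(named);
  if (!make) throw new Error(`No binding named '${named}'`);
  return make();
}

const { dataMapper } = dataSourceFactory('products');
console.log(dataMapper.mapToDomain({ id: 'p1' }).kind); // product
```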
## 2️⃣ The Lambda entrypoint: keeping Durable at the edge

This file acts as the Lambda entrypoint. It wraps the handler with Durable execution and delegates the actual logic to a resolved use case. With that, we get access to the `DurableContext`, which we use later down the line.

```typescript
import { withDurableExecution, DurableContext } from '@aws/durable-execution-sdk-js';
import { container, TYPES } from '../container/inversify.config';

export const durableFunction = withDurableExecution(async (event: any, context: DurableContext): Promise<any> => {
  const useCase = container.get<any>(TYPES.DurableFunction);
  return useCase.handler(event, context);
});
```
## 3️⃣ The workflow base layer: centralizing the initial orchestration mechanic

This abstract class centralizes the Durable parallel execution pattern. It takes an `eventContexts` array and runs the same workflow in parallel for each data type, under a parent context called `execute-contexts-in-parallel`, using the `runInChildContext` method underneath. Each value in `eventContexts` corresponds to a named binding in the IoC container (for example, `customers` resolves to `CustomerDataSource` + `CustomerDataMapper`). An input event looks like this:

```json
{
  "eventContexts": ["customers", "products", "orders"]
}
```
```typescript
import { DurableContext } from '@aws/durable-execution-sdk-js';

abstract class DurableParallelAbstractHandler {
  async handler(event: DurableFunctionEvent, context: DurableContext): Promise<DurableFunctionResponse> {
    try {
      const contextsToBeExecuted = event.eventContexts.map(
        (eventContext) => async (ctx: DurableContext) => {
          return await ctx.runInChildContext(eventContext, async (childCtx) => {
            return await this.execute(eventContext, childCtx);
          });
        },
      );

      const results = await context.parallel('execute-contexts-in-parallel', contextsToBeExecuted);

      console.log('Durable Function completed successfully');
      return {
        success: true,
        response: results,
        timestamp: new Date().toISOString(),
      };
    } catch (error: any) {
      console.error('Durable Function failed:', error);
      return {
        success: false,
        error: error.message,
        timestamp: new Date().toISOString(),
      };
    }
  }

  protected abstract execute(event: any, childContext: DurableContext): Promise<any>;
}

export default DurableParallelAbstractHandler;
```

The workflow structure remains the same; only the injected behavior changes. Because this class is abstract, it can be extended by other Lambdas that require the same parallel orchestration pattern: the parallel mechanics live in one place, and concrete implementations only need to provide the `execute` method.
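Stripped of durability (checkpointing, replay, retries), the fan-out performed by `parallel` plus `runInChildContext` is conceptually a `Promise.all` over named scopes. A simplified, non-durable sketch:

```typescript
type Task<T> = (scope: string) => Promise<T>;

// Run one task per event context, each under its own named scope,
// and gather the results — roughly what the abstract handler asks
// the Durable runtime to do, minus persistence and replay.
async function runContextsInParallel<T>(
  eventContexts: string[],
  execute: Task<T>,
): Promise<T[]> {
  return Promise.all(
    eventContexts.map((eventContext) => execute(eventContext)),
  );
}

runContextsInParallel(['customers', 'products', 'orders'], async (scope) => {
  return `${scope}: done`;
}).then((results) => console.log(results));
// [ 'customers: done', 'products: done', 'orders: done' ]
```

The real value of the Durable version is that each named branch is checkpointed and retried independently, which a bare `Promise.all` cannot do.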
## 4️⃣ The concrete Durable use case

This class implements the actual use case while inheriting the orchestration mechanics from the base class. Note that it is not pure domain logic: it is the workflow use case layer. Its responsibility is to coordinate execution, not to define how ingestion or transformation works.

```typescript
@injectable()
class DurableFunction extends DurableParallelAbstractHandler {
  private dataFactoryInstance: { dataSource: IDataSource; dataMapper: IDataMapper };

  constructor(
    @inject(TYPES.DataSourceFactory)
    private dataSourceFactory: (named: string) => { dataSource: IDataSource; dataMapper: IDataMapper },
    @inject(TYPES.Storage) private storage: IStorage,
  ) {
    super();
  }

  async execute(dataSource: string, context: DurableContext): Promise<any> {
    const dataSourceType = dataSource || 'customers';

    await context.step('create-data-source-factory', async () => {
      this.dataFactoryInstance = this.dataSourceFactory(dataSourceType);
    });

    const rawResponse = await context.step('fetch-data', async () => await this.dataFactoryInstance.dataSource.fetch());

    await context.map(rawResponse, async (ctx, item, _index) => {
      await ctx.runInChildContext(`process-item-${item.id}`, async (childCtx) => {
        let transformedData: DomainResponse;

        await childCtx.step('transform-data', async () => {
          transformedData = this.dataFactoryInstance.dataMapper.mapToDomain(item);
          return transformedData;
        });

        await childCtx.step('store-transformed-item', async () => {
          const storageKey = `processed-data/${transformedData.entity.type}/${transformedData.entity.id}-${Date.now()}.json`;
          await this.storage.put(storageKey, JSON.stringify(transformedData, null, 2));
        });
      });
    });

    return {
      message: 'Data processed and stored successfully',
    };
  }
}

export default DurableFunction;
```

What matters here is isolation. If tomorrow the ingestion logic changes for customers, or a new data type is introduced, the Durable workflow does not need to change; only the injected implementation does. The orchestration remains untouched.

What we built here is a monolith with distributed system reliability: a single codebase, with Durable Functions handling the hard parts (parallel execution, retries, and so on). And the payoff shows up immediately in the Durable operations graph: you can literally see the top-level `execute-contexts-in-parallel` from the abstract class, each branch running its own context (customers/products/…), and the steps and map iterations underneath.
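The isolation claim is easy to verify: adding a new data type means writing a new adapter pair and one container binding, while the orchestration code never changes. A dependency-free sketch, with illustrative names (`InvoiceDataMapper` is not part of the example repository):

```typescript
interface IDataMapper<TRaw, TDomain> {
  mapToDomain(item: TRaw): TDomain;
}

type RawInvoice = { id: string; total_cents: number };
type Invoice = { id: string; total: number; type: 'invoice' };

// A brand-new data type: only this adapter is new code
class InvoiceDataMapper implements IDataMapper<RawInvoice, Invoice> {
  mapToDomain(item: RawInvoice): Invoice {
    return { id: item.id, total: item.total_cents / 100, type: 'invoice' };
  }
}

// The workflow step stays generic: it only knows the port
function transformStep<TRaw, TDomain>(mapper: IDataMapper<TRaw, TDomain>, item: TRaw): TDomain {
  return mapper.mapToDomain(item);
}

console.log(transformStep(new InvoiceDataMapper(), { id: 'inv-1', total_cents: 1999 }));
// { id: 'inv-1', total: 19.99, type: 'invoice' }
```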
## For those who come after

A nice side effect of this pattern is testing. You can test the container (IoC bindings) independently from workflow execution, and you don't need to touch the orchestration layer when you change a mapper or data source. The IoC container test (short version):

```typescript
import 'reflect-metadata';
import { container } from '../../../src/container/inversify.config';
import TYPES from '../../../src/container/types';
import IStorage from '../../../src/interfaces/storageIF';

describe('IoC container wiring', () => {
  it('resolves the data factory for a named context', () => {
    const factory = container.get<(name: string) => { dataSource: any; dataMapper: any }>(TYPES.DataSourceFactory);
    const { dataSource, dataMapper } = factory('products');

    expect(dataSource).toBeDefined();
    expect(typeof dataSource.fetch).toBe('function');
    expect(dataMapper).toBeDefined();
    expect(typeof dataMapper.mapToDomain).toBe('function');
  });

  it('resolves infrastructure providers', () => {
    const storage = container.get<IStorage>(TYPES.Storage);
    expect(storage).toBeDefined();
  });
});
```
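What makes these tests cheap is the port boundary: swapping a real adapter for a test double is plain injection, no orchestration involved. A dependency-free sketch of the idea (the actual example does this through `container.rebind`; the names here are illustrative):

```typescript
interface IStorage {
  put(key: string, value: string): Promise<void>;
}

// Production adapter (sketch): would call S3 or similar
class RemoteStorage implements IStorage {
  async put(): Promise<void> {
    throw new Error('no network in tests');
  }
}

// Test double: records writes in memory instead
class FakeStorage implements IStorage {
  writes: Array<[string, string]> = [];
  async put(key: string, value: string): Promise<void> {
    this.writes.push([key, value]);
  }
}

// Code under test only sees the IStorage port
async function storeItem(storage: IStorage, id: string): Promise<void> {
  await storage.put(`items/${id}`, JSON.stringify({ id }));
}

// "Rebinding" for a test is just injecting the fake
const fake = new FakeStorage();
storeItem(fake, 'a1').then(() => console.log(fake.writes.length)); // 1
```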
And your Durable function test file, also in a short version, using the Durable Execution SDK JS Testing library:

```typescript
import 'reflect-metadata';
import { LocalDurableTestRunner } from '@aws/durable-execution-sdk-js-testing';
import { container, TYPES } from '../../../../src/container/inversify.config';
import { durableFunction } from '../../../../src/example-app/handlers';
import { DurableFunctionEvent } from '../../../../src/example-app/durableAbstractHandler';
import IStorage from '../../../../src/interfaces/storageIF';
import LocalStorage from '../../adapters/storage/local/localStorage';

describe('Durable workflow', () => {
  let runner: LocalDurableTestRunner;

  beforeAll(async () => {
    process.env.ENVIRONMENT = 'test';
    (await container.rebind<IStorage>(TYPES.Storage)).to(LocalStorage).whenDefault();
    await LocalDurableTestRunner.setupTestEnvironment({ skipTime: true });
  });

  beforeEach(() => {
    runner = new LocalDurableTestRunner({ handlerFunction: durableFunction });
  });

  afterAll(async () => {
    await LocalDurableTestRunner.teardownTestEnvironment();
    delete process.env.ENVIRONMENT;
  });

  it('runs multiple data sources in parallel', async () => {
    const event: DurableFunctionEvent = { eventContexts: ['customers', 'products', 'orders'] };
    const execution = await runner.run({ payload: event });

    expect(execution.getStatus()).toBe('SUCCEEDED');
    expect(execution.getResult()?.success).toBe(true);
  });
});
```

The full example, including the complete test files, adapters, and the full hexagonal implementation, can be found here.
## Conclusion

Lambda Durable Functions are more than a replacement for Step Functions. They are meant for a different kind of problem and a different kind of developer experience. When you choose Durable, you are choosing to write your workflows in code. And once orchestration lives in your codebase, boundaries become essential.

Durable Functions gives you distributed system reliability inside a monolith: parallel execution, maps, retries, and more. But reliability alone is not enough. Without clear organization, it is easy to mix workflow mechanics with business logic and slowly create something hard to understand. Durable Functions also work just as well for simple workflows as for complex ones. The difference is not in the workflow size or the number of steps; it is in how you structure the code around it. Hexagonal Architecture keeps things in place, while IoC keeps dependencies clean. Together, they let you build a Lambdalith that is reliable at runtime and maintainable in the long run.

The monolith vs. microservices debate is still relevant, and Durable Functions change how we approach it. You can build a great monolith, but only if you design it properly. Durable Functions are powerful, but they require discipline. Design patterns were never optional; we just stopped talking about them. Durable Functions don't remove the need for architecture. They make it impossible to ignore.