ASP.NET Core Request Pipeline Explained: What Happens When an API Receives a Request

You send a request to an API endpoint. Milliseconds later, a response comes back. Most of the time, we don't think much about what happens in between. We write controllers, configure middleware, run the application, and everything works.

Until something breaks. Maybe authentication suddenly stops working. Maybe a middleware behaves differently than expected. Maybe performance drops under load, or routing starts sending requests to the wrong endpoint. When that happens, the question becomes unavoidable: what actually happens inside ASP.NET Core when a request hits your API?

Understanding the request pipeline is what turns ASP.NET Core from a black box into something you can actually debug and optimize. In this article, we'll walk through the lifecycle of a request in ASP.NET Core, from the moment it reaches your server to the moment the response is sent back.

The Big Picture: The ASP.NET Core Request Flow

If you trace a request from the network all the way to your controller or endpoint, it roughly goes through this path:

```
Client
  ↓
Reverse Proxy (optional)
  ↓
Kestrel Web Server
  ↓
ASP.NET Core Hosting Layer
  ↓
Middleware Pipeline
  ↓
Endpoint Routing
  ↓
Endpoint Execution (Controller / Minimal API)
  ↓
Middleware (Response Flow)
  ↓
Kestrel
  ↓
Client Response
```

Each stage gets a chance to process the request before it reaches your application logic. Once you understand this flow, debugging strange behavior becomes much easier.

Step 1: The Request Reaches Kestrel

The first component inside your application that receives the request is Kestrel, the default high-performance web server used by ASP.NET Core. Its job is to:

- Listen for incoming HTTP requests
- Parse HTTP messages
- Forward the request into the ASP.NET Core application pipeline

Kestrel is designed for high throughput and low latency. It uses asynchronous I/O and efficient networking primitives to handle thousands of concurrent connections.

In production environments, Kestrel usually sits behind a reverse proxy such as:

- Nginx
- IIS
- Azure App Service infrastructure

The reverse proxy handles things like TLS termination, load balancing, and security filtering, while Kestrel still processes the request inside the application. Once Kestrel receives the request, it passes it into the ASP.NET Core pipeline.

Step 2: The ASP.NET Core Hosting Layer

Before the request reaches middleware, ASP.NET Core's hosting layer has already done some important work. When the application starts, the hosting layer:

- Builds the dependency injection container
- Configures logging
- Loads configuration
- Constructs the middleware pipeline

This setup happens during application startup in Program.cs. By the time a request arrives, the middleware pipeline has already been assembled and is ready to process incoming requests.

Step 3: The Request Enters the Middleware Pipeline

Most of the interesting work in ASP.NET Core happens inside the middleware pipeline. Middleware are small components that can:

- Inspect the request
- Modify the request
- Stop the request from continuing
- Pass the request to the next component
- Modify the response before it leaves

Middleware are configured in Program.cs:

```csharp
app.Use(async (context, next) =>
{
    var logger = context.RequestServices
        .GetRequiredService<ILoggerFactory>()
        .CreateLogger("RequestLogger");

    logger.LogInformation("Request started: {Path}", context.Request.Path);

    await next();

    logger.LogInformation("Response finished with status {StatusCode}",
        context.Response.StatusCode);
});
```

Here's what happens during execution:

- The request enters the middleware
- Code before await next() runs
- The request moves to the next middleware
- Eventually an endpoint executes
- The response travels back through middleware
- Code after await next() runs

This creates a two-way pipeline:

```
Request  → Middleware → Endpoint
Response ← Middleware ← Endpoint
```

One thing that surprises many developers when debugging middleware is that responses travel back through the pipeline in reverse order.

Middleware Can Short-Circuit the Pipeline

Middleware can also stop the pipeline entirely:

```csharp
app.Use(async (context, next) =>
{
    if (context.User.Identity?.IsAuthenticated != true)
    {
        context.Response.StatusCode = StatusCodes.Status401Unauthorized;
        return;
    }

    await next();
});
```

In this case, the request never reaches later middleware or the endpoint. This behavior is commonly used for:

- Authentication checks
- Rate limiting
- Request filtering

Step 4: Built-in Middleware Components

ASP.NET Core provides several built-in middleware components that most applications rely on. Common examples include:

Routing Middleware determines which endpoint matches the request:

```csharp
app.UseRouting();
```

Authentication Middleware validates the user identity:

```csharp
app.UseAuthentication();
```

Authorization Middleware checks whether the authenticated user has permission:

```csharp
app.UseAuthorization();
```

Exception Handling Middleware handles unhandled exceptions globally:

```csharp
app.UseExceptionHandler();
```

Other Common Production Middleware

Real-world APIs often include additional middleware such as:

- CORS (UseCors)
- Response compression (UseResponseCompression)
- HTTPS redirection (UseHttpsRedirection)
- Rate limiting (UseRateLimiter)

Each middleware adds a delegate to the request pipeline. Individually they're lightweight, but extremely long middleware chains can introduce small overhead in very high-throughput systems.

Middleware Order Matters

One of the most common sources of bugs in ASP.NET Core applications is incorrect middleware ordering. Consider this configuration:

```csharp
app.UseAuthorization();
app.UseAuthentication();
```

This breaks authentication because authorization runs before the user identity is established. The correct order is:

```csharp
app.UseAuthentication();
app.UseAuthorization();
```

When debugging strange authentication behavior, middleware order is often the first thing worth checking.

Step 5: Endpoint Routing

After middleware processing, ASP.NET Core needs to determine which endpoint should handle the request. This is handled by endpoint routing, which matches the request based on:

- HTTP method (GET, POST, etc.)
- Request path
- Route parameters

An endpoint can be:

- A controller action
- A minimal API handler
- A Razor page
- A gRPC service

For example:

```csharp
app.MapGet("/products/{id}", (int id) =>
{
    return Results.Ok($"Product {id}");
});
```

A request such as:

```
GET /products/10
```

matches this route. Routing selects the endpoint and prepares it for execution. UseRouting() identifies the matching endpoint, while the endpoint delegate itself executes later in the pipeline. ASP.NET Core's routing system is highly optimized and capable of efficiently matching large numbers of routes.

Step 6: Endpoint Execution

Once routing selects the correct endpoint, ASP.NET Core executes the endpoint logic. For controller-based APIs, ASP.NET Core performs several additional steps automatically.

Model Binding

ASP.NET Core maps incoming request data into method parameters:

```csharp
[HttpPost]
public IActionResult CreateOrder(OrderDto order)
```

Data can be bound from multiple sources:

- Request body
- Route values
- Query parameters

Validation

If validation attributes are used, ASP.NET Core validates the model automatically:

```csharp
public class OrderDto
{
    [Required]
    public string CustomerEmail { get; set; }
}
```

Invalid models typically produce a 400 Bad Request response.

Business Logic

This is where your application code runs. Typical tasks include:

- Database queries
- Calling services
- Performing calculations
- Invoking external APIs

Returning a Result

The endpoint returns a result such as:

```csharp
return Ok(order);
```

ASP.NET Core then converts this result into an HTTP response:

- Objects → JSON
- Status codes → HTTP response codes
- Headers → HTTP headers

Step 7: The Response Travels Back Through Middleware

Once the endpoint finishes execution, the response begins its return journey. The response flows back through the middleware pipeline in reverse order. This allows middleware to:

- Modify response headers
- Compress responses
- Log execution time
- Transform output

Finally, the response reaches Kestrel, which sends it back to the client.

A Simple Performance Debugging Trick

When diagnosing slow requests, a small timing middleware can quickly identify bottlenecks:

```csharp
app.Use(async (context, next) =>
{
    var stopwatch = Stopwatch.StartNew();

    await next();

    stopwatch.Stop();

    var logger = context.RequestServices
        .GetRequiredService<ILoggerFactory>()
        .CreateLogger("Performance");

    logger.LogInformation("Request completed in {Elapsed} ms",
        stopwatch.ElapsedMilliseconds);
});
```

This simple middleware can reveal slow endpoints or middleware components almost instantly.

Visual Summary of the Request Flow

```
Client
  ↓
Kestrel
  ↓
Middleware Pipeline
  ↓
Routing
  ↓
Endpoint Execution
  ↓
Middleware (response)
  ↓
Client
```

Key Takeaways

- ASP.NET Core processes requests through a middleware pipeline
- Kestrel is the web server that receives HTTP requests
- Middleware can inspect, modify, or terminate requests
- Middleware order directly affects application behavior
- Endpoint routing determines which API logic executes
- Responses travel back through the same middleware pipeline

Once you understand this flow, ASP.NET Core stops feeling like a black box. Debugging becomes easier, middleware behavior makes more sense, and performance issues are much easier to track down.

Have you ever spent hours debugging an ASP.NET Core API only to realize the issue was caused by middleware order?
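As a closing reference, the stages discussed above can be condensed into a single Program.cs. This is a minimal sketch assuming the .NET minimal hosting model; the service registrations and the /products endpoint are illustrative, not a complete production setup.

```csharp
var builder = WebApplication.CreateBuilder(args);

// Hosting layer (Step 2): configuration, logging, and the DI container
// are assembled here, before the first request ever arrives.
builder.Services.AddAuthentication();
builder.Services.AddAuthorization();

var app = builder.Build();

// Middleware pipeline (Steps 3-4): registration order is execution order.
app.UseHttpsRedirection();
app.UseRouting();          // selects the matching endpoint
app.UseAuthentication();   // establish the user identity first...
app.UseAuthorization();    // ...then check permissions against it

// Endpoint routing + execution (Steps 5-6).
app.MapGet("/products/{id}", (int id) => Results.Ok($"Product {id}"));

app.Run();
```

A request to GET /products/10 flows top to bottom through this file, and the response flows back up through the same middleware in reverse order.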