Manage attachment chunks with ASP.NET Web API

In the previous post I spoke about a custom MultipartFormData stream provider and how it can help us manage custom information included in a request message.

In that example I generated chunks from a file and sent them to a REST service (a Web API) with some additional information that was then retrieved through the custom provider.

Now I want to use this information to manage the upload session and to merge all the chunks once they have been received.

What I need to do is define the models involved in the process and the service that manages the chunks.

Models

We have to define two things; the first one is the model for the chunk:


public class ChunkMetadata
{
    public string Filename { get; set; }
    public int ChunkNumber { get; set; }

    public ChunkMetadata(string filename, int chunkNumber)
    {
        Filename = filename;
        ChunkNumber = chunkNumber;
    }
}

The ChunkNumber property deserves an explanation: it is the number associated with the chunk, and it will be used to restore the correct order when we merge all of them.

The second one is the model of the session, that is, the set of chunks that compose the file.

First of all we define the interface:


public interface IUploadSession
{
    ConcurrentBag<ChunkMetadata> Chunks { get; set; }
    string Filename { get; }
    long Filesize { get; }
    bool AddChunk(string filename, string chunkFileName, int chunkNumber, int totalChunks);
    Task MergeChunks(string path);
}

The Filename and Filesize properties are closely tied to the session; we need the AddChunk and MergeChunks methods as well.

We also need a thread-safe collection for the chunks that compose the session, so we define a ConcurrentBag, the thread-safe counterpart of List.

Now we can implement the model:


public class UploadSession : IUploadSession
{
    public string Filename { get; private set; }
    public long Filesize { get; private set; }
    private int _totalChunks;
    private int _chunksUploaded;

    public ConcurrentBag<ChunkMetadata> Chunks { get; set; }

    public UploadSession()
    {
        Filesize = 0;
        _chunksUploaded = 0;
        Chunks = new ConcurrentBag<ChunkMetadata>();
    }

    public bool AddChunk(string filename, string chunkFileName, int chunkNumber, int totalChunks)
    {
        if (Filename == null)
        {
            Filename = filename;
            _totalChunks = totalChunks;
        }

        var metadata = new ChunkMetadata(chunkFileName, chunkNumber);
        Chunks.Add(metadata);

        // Interlocked.Increment already updates the field atomically;
        // we only use the returned value for the comparison.
        var uploaded = Interlocked.Increment(ref _chunksUploaded);
        return uploaded == _totalChunks;
    }

    public async Task MergeChunks(string path)
    {
        var filePath = Path.Combine(path, Filename);

        using (var mainFile = new FileStream(filePath, FileMode.Create))
        {
            foreach (var chunk in Chunks.OrderBy(c => c.ChunkNumber))
            {
                using (var chunkFile = new FileStream(chunk.Filename, FileMode.Open))
                {
                    await chunkFile.CopyToAsync(mainFile);
                    Filesize += chunkFile.Length;
                }
            }
        }

        foreach (var chunk in Chunks)
        {
            File.Delete(chunk.Filename);
        }
    }
}

The implementation is quite simple.

The AddChunk method adds the new chunk to the collection, then increments the _chunksUploaded field with the thread-safe operation Interlocked.Increment; finally, the method returns a bool that is true if all the chunks have been received, otherwise false.

The MergeChunks method deals with retrieving all the chunks from the file system.

It takes the collection, orders it by chunk number, reads the bytes from each chunk and copies them to the main file stream.

After that, the chunk files are deleted.

Service

The service will have an interface like this:


public interface IUploadService
{
    Guid StartNewSession();
    Task<bool> UploadChunk(HttpRequestMessage request);
}

The idea is that the StartNewSession method instantiates a new session object and assigns a new correlation id, which is the unique identifier of the session.

This is the implementation:


public class UploadService : IUploadService
{
    private readonly Context _db = new Context();
    private readonly string _path;
    private readonly ConcurrentDictionary<string, UploadSession> _uploadSessions;

    public UploadService(string path)
    {
        _path = path;
        _uploadSessions = new ConcurrentDictionary<string, UploadSession>();
    }

    public async Task<bool> UploadChunk(HttpRequestMessage request)
    {
        var provider = new CustomMultipartFormDataStreamProvider(_path);
        await request.Content.ReadAsMultipartAsync(provider);
        provider.ExtractValues();

        UploadSession uploadSession;
        _uploadSessions.TryGetValue(provider.CorrelationId, out uploadSession);

        if (uploadSession == null)
            throw new ObjectNotFoundException();

        var completed = uploadSession.AddChunk(provider.Filename, provider.ChunkFilename, provider.ChunkNumber, provider.TotalChunks);

        if (completed)
        {
            await uploadSession.MergeChunks(_path);

            var fileBlob = new FileBlob()
            {
                Id = Guid.NewGuid(),
                Path = Path.Combine(_path, uploadSession.Filename),
                Name = uploadSession.Filename,
                Size = uploadSession.Filesize
            };

            _db.FileBlobs.Add(fileBlob);
            await _db.SaveChangesAsync();

            return true;
        }

        return false;
    }

    public Guid StartNewSession()
    {
        var correlationId = Guid.NewGuid();
        var session = new UploadSession();
        _uploadSessions.TryAdd(correlationId.ToString(), session);

        return correlationId;
    }
}

In the StartNewSession method we use the thread-safe TryAdd method to add a new session to the ConcurrentDictionary.

As for the UploadChunk method, we saw the first part of the implementation in the previous post.

Once the metadata is retrieved from the request, we try to find the session object with a thread-safe operation.

If we don't find it, we throw an exception, because we expect the related session to exist.

If the session exists, we add the chunk to the session and we check the result of the operation.

If it is the last chunk, we merge all of them and we can perform a database operation if needed.
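
Note that UploadService keeps the pending sessions in an in-memory ConcurrentDictionary, so a single instance must live for the whole application lifetime. A minimal registration sketch, assuming Autofac as in my other posts (the upload path is just an example):

// The service must be a singleton, otherwise the sessions dictionary
// would be recreated on every request.
builder.Register(c => new UploadService(HostingEnvironment.MapPath("~/App_Data/")))
    .As<IUploadService>()
    .SingleInstance();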

Controller

The implementation of the controller is very simple:


public class FileBlobsController : ApiController
{
    private readonly IUploadService _fileBlobsService;
    private readonly Context _db = new Context();

    public FileBlobsController(IUploadService uploadService)
    {
        _fileBlobsService = uploadService;
    }

    [Route("api/fileblobs/getcorrelationid")]
    [HttpGet]
    public IHttpActionResult GetCorrelationId()
    {
        return Ok(_fileBlobsService.StartNewSession());
    }

    [HttpPost]
    public async Task<IHttpActionResult> PostFileBlob()
    {
        if (!Request.Content.IsMimeMultipartContent())
            throw new HttpResponseException(HttpStatusCode.UnsupportedMediaType);

        var result = await _fileBlobsService.UploadChunk(Request);

        return Ok(result);
    }
}

You can find the source code here.

Custom MultipartFormDataStreamProvider in C#

Frequently, when we manage multipart/form-data requests and send them to the server, we might want to add some additional information.

For example, we might want to split a big file into chunks and attach information like the id of the upload session, the chunk number, the file name and the total number of chunks that compose the file.

Suppose we use AngularJS with TypeScript on the client side; the code of the controller is quite simple:


.....

public AddAttachment(event) {
    let attachments = event.target.files;
    if (attachments.length > 0) {
        let file: File = attachments[0];

        this.$http.get(this.url + "/GetCorrelationId").then((correlationId) => {
            let chunks = this.SplitFile(file);

            for (let i = 0; i < chunks.length; i++) {
                let formData = new FormData();
                formData.append("file", chunks[i], file.name);
                formData.append("correlationId", correlationId.data);
                // FormData values must be strings (or blobs)
                formData.append("chunkNumber", (i + 1).toString());
                formData.append("totalChunks", chunks.length.toString());

                this.$http.post(this.url, formData, { headers: { "Content-Type": undefined } }).then((result) => {
                    if (result.data) {
                        this.Load();
                    }
                });
            }
        });
    }
}

private SplitFile(file: File): Array<Blob> {
    let chunks = Array<Blob>();
    let size = file.size;
    let chunkSize = 1024 * 1024 * 10; // 10 MB per chunk
    let start = 0;
    let end = chunkSize;

    while (start < size) {
        let chunk = file.slice(start, end);
        chunks.push(chunk);
        start = end;
        end += chunkSize;
    }

    return chunks;
}

.....

The AddAttachment method is invoked by the view; once the file is retrieved, the SplitFile method generates the array of chunks.

Then, with the $http service we send every single chunk to the server along with the additional metadata.

In order to read this data on the server side, we need to implement a custom MultipartFormData stream provider.

The first step is to define the interface of our provider:


public interface ICustomMultipartFormDataStreamProvider
{
    string ChunkFilename { get; }
    int ChunkNumber { get; }
    string CorrelationId { get; }
    string Filename { get; }
    int TotalChunks { get; }
    void ExtractValues();
}

The interface exposes the same properties sent by the client, and a method that deals with extracting the values from the message.

Now we can proceed with the implementation:


public class CustomMultipartFormDataStreamProvider : MultipartFormDataStreamProvider, ICustomMultipartFormDataStreamProvider
{
    public string Filename { get; private set; }
    public string ChunkFilename { get; private set; }
    public string CorrelationId { get; private set; }
    public int ChunkNumber { get; private set; }
    public int TotalChunks { get; private set; }

    public CustomMultipartFormDataStreamProvider(string rootPath) : base(rootPath) { }

    public CustomMultipartFormDataStreamProvider(string rootPath, int bufferSize) : base(rootPath, bufferSize) { }

    public override Task ExecutePostProcessingAsync()
    {
        foreach (var file in Contents)
        {
            var parameters = file.Headers.ContentDisposition.Parameters;
            var filename = ExtractParameter(parameters, "filename");
            if (filename != null) Filename = filename.Value.Trim('\"');
        }

        return base.ExecutePostProcessingAsync();
    }

    public void ExtractValues()
    {
        var chunkFileName = FileData[0].LocalFileName;
        var correlationId = FormData?.GetValues("correlationId");
        var chunkNumber = FormData?.GetValues("chunkNumber");
        var totalChunks = FormData?.GetValues("totalChunks");

        if (string.IsNullOrEmpty(chunkFileName) || correlationId == null || chunkNumber == null || totalChunks == null)
            throw new Exception("Missing values in UploadChunk session.");

        ChunkFilename = chunkFileName;
        CorrelationId = correlationId.First();
        ChunkNumber = int.Parse(chunkNumber.First());
        TotalChunks = int.Parse(totalChunks.First());
    }

    private NameValueHeaderValue ExtractParameter(ICollection<NameValueHeaderValue> parameters, string name)
    {
        return parameters.FirstOrDefault(p => p.Name.Equals(name, StringComparison.OrdinalIgnoreCase));
    }
}

The class inherits from the MultipartFormDataStreamProvider base class and implements our interface.

Two methods are implemented; the first one overrides ExecutePostProcessingAsync, where we retrieve the name of the main file.

The second one extracts the custom parameters from the FormData; we also retrieve the chunk filename from the FileData object, since this information is included by default in the multipart/form-data message.

Now that the information is retrieved, we can use the custom provider in a service:


public async Task<bool> UploadChunk(HttpRequestMessage request)
{
    var provider = new CustomMultipartFormDataStreamProvider(_path);
    await request.Content.ReadAsMultipartAsync(provider);
    provider.ExtractValues();

    .....
}

The metadata will be available in the provider object.
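
To make it concrete, this is what we can read from the provider right after the call (the comments are illustrative):

var correlationId = provider.CorrelationId; // the upload session id sent by the client
var chunkNumber = provider.ChunkNumber;     // the 1-based chunk index
var totalChunks = provider.TotalChunks;     // the total number of chunks of the file
var chunkFile = provider.ChunkFilename;     // the local path of the chunk saved by the provider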

You can find the source code here.

Real-time search with ASP.NET and Elasticsearch

A common problem we face once our applications are deployed is improving the performance of a page or feature.

In my case, for example, I had a field where I could search and select a city; the source list was large and the search was quite slow, so I wanted a better user experience.

We can solve performance problems like these with the help of a cache or a full-text search.

I chose the latter, with Elasticsearch as the full-text engine, so I'll describe the steps I followed to configure and use it in my application.

Installation

The first step is to install the Elasticsearch server, which you can download here.

Once installed, we start it by running the following executable:

<Installation path>\bin\elasticsearch.bat

Once started, the server writes its log to the console.

The server will take care of indexing the content we pass to it; to communicate with it we need a client in our application; in my case the application was .NET, so I used NEST.

NEST

As said above, NEST is a high-level Elasticsearch client for .NET applications.

The first step is to install it in the application with NuGet:

Install-package NEST

After the installation, the NEST package reference appears in the packages.config file.

Now we have all the necessary tools and we can develop the code for the search feature.

Client

We define a client class with a single responsibility: it sets up the url and the default index, and instantiates the underlying NEST client:


public class ElasticSearchClient
{
    private readonly IElasticClient _client;

    public ElasticSearchClient(IElasticClient client)
    {
        _client = client;
    }

    public ElasticSearchClient(string uri, string indexName) : this(CreateElasticClient(uri, indexName)) { }

    public IElasticClient GetClient()
    {
        return _client;
    }

    private static ElasticClient CreateElasticClient(string uri, string indexName)
    {
        var node = new Uri(uri);
        var setting = new ConnectionSettings(node);
        setting.DefaultIndex(indexName);
        return new ElasticClient(setting);
    }
}

Once instantiated, the class exposes the client instance; we can register it in the startup class of the application with Autofac:


public partial class Startup
{
    public void Configuration(IAppBuilder app)
    {
        var builder = new ContainerBuilder();

        builder.Register(c => new ElasticSearchClient("http://localhost:9200", "cities"))
            .AsSelf()
            .SingleInstance();

        ...
    }
}

Service base class

A service that uses an Elasticsearch index should be able to perform some basic operations concerning the full-text index.

We have to initialize a specific index, populate it with content and, obviously, perform searches on it with specific parameters.

So, we have to define an interface like this:


internal interface IElasticSearchService<T> where T : class
{
    void Init();
    void CheckIndex();
    void BulkInsert(List<T> objects);
    IEnumerable<T> Search(string query);
}

I like to separate the Init method, which creates and populates the index, from the CheckIndex method, which checks whether the index already exists.

Now we can implement the basic service:


public class ElasticSearchService<T> : IElasticSearchService<T> where T : class
{
    protected readonly Context Db = new Context();
    protected readonly ElasticSearchClient ElasticSearchClient;
    protected readonly string IndexName;

    public ElasticSearchService(ElasticSearchClient elasticSearchClient, string indexName)
    {
        ElasticSearchClient = elasticSearchClient;
        IndexName = indexName;
    }

    public virtual void Init()
    {
        CheckIndex();
        BulkInsert(Db.Set<T>().ToList());
    }

    public void CheckIndex()
    {
        if (IndexExist()) return;
        var response = CreateIndex();

        if (!response.IsValid)
        {
            throw new Exception(response.ServerError.ToString(), response.OriginalException);
        }
    }

    public void BulkInsert(List<T> objects)
    {
        var response = ElasticSearchClient.GetClient().IndexMany(objects, IndexName);
        if (!response.IsValid)
        {
            throw new Exception(response.ServerError.ToString(), response.OriginalException);
        }
    }

    public virtual IEnumerable<T> Search(string query)
    {
        var results = ElasticSearchClient.GetClient().Search<T>(c => c.From(0).Size(10).Query(q => q.Prefix("_all", query)));

        return results.Documents;
    }

    protected virtual IResponse CreateIndex()
    {
        var indexDescriptor = new CreateIndexDescriptor(IndexName).Mappings(ms => ms.Map<T>(m => m.AutoMap()));
        return ElasticSearchClient.GetClient().CreateIndex(indexDescriptor);
    }

    protected bool IndexExist()
    {
        return ElasticSearchClient.GetClient().IndexExists(IndexName).Exists;
    }
}

The constructor accepts the client and the index name.

We define a virtual Init method that checks whether the index exists and does a bulk insert of a list of objects; the method is virtual because a derived service might need to override it.

The BulkInsert method leverages the client to index the object list, and the Search method implements a basic search that looks in all the fields of the objects by using the special _all field, which contains the concatenated values of all fields.

The method returns the first 10 elements.

CreateIndex creates a specific index with the AutoMap option, which infers the Elasticsearch field datatypes from the POCO class we pass to it; it's protected, so a derived class can use it.

IndexExist checks whether the index exists, and it can be used from a derived class as well.
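
To make the flow concrete, here is a minimal usage sketch of the base service; the City entity and the index name are just examples:

var client = new ElasticSearchClient("http://localhost:9200", "cities");
var service = new ElasticSearchService<City>(client, "cities");

service.Init();                      // creates the index if missing and bulk-inserts the entities
var matches = service.Search("mil"); // prefix search on the _all field, first 10 hits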

Service

Now we can implement a specific service that inherits from the ElasticSearchService class.

In this example I need to search in a list of cities and related districts, so I need to override the CreateIndex method like this:


public sealed class CitiesService : ElasticSearchService<City>
{
    public CitiesService(ElasticSearchClient elasticSearchClient, string indexName) : base(elasticSearchClient, indexName) { }

    protected override IResponse CreateIndex()
    {
        var indexDescriptor = new CreateIndexDescriptor(IndexName).Mappings(
            ms => ms.Map<City>(m => m.AutoMap().Properties(ps =>
                ps.Nested<District>(n => n
                    .Name(nn => nn.District)
                    .AutoMap()))));

        return ElasticSearchClient.GetClient().CreateIndex(indexDescriptor);
    }

    public override IEnumerable<City> Search(string query)
    {
        var results = ElasticSearchClient.GetClient().Search<City>(c => c.From(0).Size(10).Query(q => q.Prefix(p => p.Name, query) || q.Term("district.name", query)));

        return results.Documents.OrderBy(d => d.Name);
    }
}

What I need to do is auto-map the City object and the District, which is an entity closely related to the city; so I have to map the District property as nested, with the AutoMap option as well.

Thus I will be able to search across all the properties of the city and the district.

The other method I override is Search; I search partially in the name of the city (Prefix) and for the specific term in the district name (Term), and I return the first 10 elements.

Now I have to register the service with Autofac:


public partial class Startup
{
    public void Configuration(IAppBuilder app)
    {
        var builder = new ContainerBuilder();

        builder.Register(c => new ElasticSearchClient("http://localhost:9200", "cities"))
            .AsSelf()
            .SingleInstance();

        builder.Register(c => new CitiesService(c.Resolve<ElasticSearchClient>(), "cities"))
            .AsSelf()
            .AsImplementedInterfaces()
            .SingleInstance();

        ...
    }
}

The last step is to initialize the full-text index of my service:


public partial class Startup
{
    public void Configuration(IAppBuilder app)
    {
        var builder = new ContainerBuilder();

        builder.Register(c => new ElasticSearchClient("http://localhost:9200", "cities"))
            .AsSelf()
            .SingleInstance();

        builder.Register(c => new CitiesService(c.Resolve<ElasticSearchClient>(), "cities"))
            .AsSelf()
            .AsImplementedInterfaces()
            .SingleInstance();

        ...

        // Build the container and initialize the full-text index at startup.
        var container = builder.Build();
        InitElasticSearchServices(container);
    }

    private static void InitElasticSearchServices(IContainer container)
    {
        var citiesServices = container.Resolve<CitiesService>();
        citiesServices.Init();
    }
}

I resolve an instance of the service and call the Init method of the ElasticSearchService that we saw above.

This method will create and populate the index.

Web API

Now I can use the service in my Web API, like this:


public class CitiesController : ApiController
{
    private readonly CitiesService _elasticSearchService;

    public CitiesController(CitiesService elasticSearchService)
    {
        _elasticSearchService = elasticSearchService;
    }

    // GET: api/Cities
    public IEnumerable<City> GetCities(string query)
    {
        return _elasticSearchService.Search(query);
    }

    protected override void Dispose(bool disposing)
    {
        base.Dispose(disposing);
    }
}

You can find the source code of this topic here.

Attachments management with Angular 2

A common requirement in our applications is implementing a component that manages attachment uploads.

We need to insert a file input field in the page, grab the change event of the field, extract the file and send it to a service.

Recently I needed to implement this functionality with Angular 2, so I'm going to explain what I did.

Services

First of all I implement two different services, one for the file metadata and one for the blob object.

Based on a recent post, I use a WebApi base class and define the service url:


import { Injectable } from "@angular/core";
import { Http } from "@angular/http";
import { Attachment } from "./attachment.model";
import { WebApi } from "../shared/webapi";

@Injectable()
export class AttachmentService extends WebApi<Attachment> {
    constructor(public http: Http) {
        super("/api/attachments", http);
    }
}

The referenced endpoint is a simple RESTful service.
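
That endpoint is not the focus of this post; a minimal sketch of what it might look like on the server side, assuming an Attachment entity and the usual Context:

public class AttachmentsController : ApiController
{
    private readonly Context _db = new Context();

    // GET: api/attachments
    public IQueryable<Attachment> GetAttachments()
    {
        return _db.Attachments;
    }

    // POST: api/attachments
    public async Task<IHttpActionResult> PostAttachment(Attachment attachment)
    {
        _db.Attachments.Add(attachment);
        await _db.SaveChangesAsync();

        return Ok(attachment);
    }
}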

The second one is a service for the blob upload:


import { Injectable } from "@angular/core";
import { Http, Headers, RequestOptions, Response } from "@angular/http";
import { Observable } from "rxjs/Observable";
import { FileBlob } from "./fileBlob.model";
import { WebApi } from "../shared/webapi";

@Injectable()
export class FileBlobService extends WebApi<FileBlob> {
    constructor(public http: Http) {
        super("/api/fileBlobs", http);
    }

    public DownloadFile(id: string) {
        window.open("api/fileBlobs/GetFileBlob?id=" + id, '_blank');
    }

    public PostFile(entity: File): Observable<File> {
        let formData = new FormData();
        formData.append(entity.name, entity);

        return this.http.post(this.url, formData).map(this.extractData).catch(this.handleError);
    }
}

The PostFile method composes a FormData object with the content of the file and posts it to a specific Web API.

The DownloadFile method is simpler: it opens, in a new window, a url that returns the content of the file.

The server-side method looks like this:


public class FileBlobsController : ApiController
{
    private readonly Context _db = new Context();

    [ResponseType(typeof(Guid))]
    public async Task<IHttpActionResult> PostFileBlob()
    {
        if (!Request.Content.IsMimeMultipartContent())
            throw new HttpResponseException(HttpStatusCode.UnsupportedMediaType);

        var provider = new MultipartMemoryStreamProvider();
        await Request.Content.ReadAsMultipartAsync(provider);

        HttpContent content = provider.Contents.First();
        var fileName = content.Headers.ContentDisposition.FileName.Trim('\"');
        var buffer = await content.ReadAsByteArrayAsync();

        var fileBlob = new FileBlob()
        {
            Id = Guid.NewGuid(),
            Name = fileName,
            File = buffer
        };

        _db.FileBlobs.Add(fileBlob);
        await _db.SaveChangesAsync();

        return Ok(fileBlob.Id);
    }
}

We use the MultipartMemoryStreamProvider to retrieve the content of the file, and we store it in a specific table.
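
The GetFileBlob endpoint called by DownloadFile is not shown above; a possible sketch, assuming the FileBlob entity stores the raw bytes in the File property:

[HttpGet]
[Route("api/fileBlobs/GetFileBlob")]
public async Task<HttpResponseMessage> GetFileBlob(Guid id)
{
    var fileBlob = await _db.FileBlobs.FindAsync(id);
    if (fileBlob == null)
        return Request.CreateResponse(HttpStatusCode.NotFound);

    // Return the stored bytes as a downloadable attachment.
    var response = Request.CreateResponse(HttpStatusCode.OK);
    response.Content = new ByteArrayContent(fileBlob.File);
    response.Content.Headers.ContentType = new MediaTypeHeaderValue("application/octet-stream");
    response.Content.Headers.ContentDisposition = new ContentDispositionHeaderValue("attachment")
    {
        FileName = fileBlob.Name
    };

    return response;
}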

Component

We need two methods, the first one to download the existing attachment, the second one to add a new attachment:


import { Component, Input, Output, EventEmitter } from "@angular/core";
import { Constants } from "../shared/commons";
import { Attachment } from "./attachment.model";
import { FileBlobService } from "./fileBlob.service";
import { AlertService } from "../core/alert.service";

@Component({
    moduleId: module.id,
    selector: "attachment",
    templateUrl: "attachment.component.html"
})

export class AttachmentComponent {
    @Input() placeholder: string;
    @Input() name: string;
    @Input() validationEnabled: boolean;
    @Input() attachment: Attachment;
    @Output() onSaved = new EventEmitter<Attachment>();
    public fileBlob: File;

    constructor(private fileBlobService: FileBlobService, private alertService: AlertService) {}

    public DownloadAttachment() {
        this.fileBlobService.DownloadFile(this.attachment.IdFileBlob);
    }

    public AddAttachment(event) {
        let attachments = event.target.files;
        if (attachments.length > 0) {
            let file: File = attachments[0];
            this.fileBlobService.PostFile(file).subscribe(
                (res) => {
                    let id: string = Constants.guidEmpty;

                    if (this.attachment != null)
                        id = this.attachment.Id;

                    this.attachment = {
                        Id: id,
                        IdFileBlob: res.toString(),
                        Name: file.name,
                        Size: file.size
                    };

                    this.onSaved.emit(this.attachment);
                },
                (error) => this.alertService.Error(error));
        }
    }

    ...
}

The AddAttachment method deserves an explanation; it accepts an event parameter, fired by the file input field of the UI when a new attachment is selected.

The method retrieves the file from the event and passes it as a parameter to the PostFile method that we saw above.

Once saved, an object with the file metadata is created and passed through the onSaved event to the parent component, which deals with it:


export class InvoiceDetailComponent {
    ...

    public onAttachmentSaved(attachment: Attachment) {
        this.attachment = attachment;
    }
}

Module

We define a feature module like this:


import { NgModule } from "@angular/core";
import { HttpModule } from "@angular/http";

import { SharedModule } from "../shared/shared.module";
import { AttachmentComponent } from "./attachment.component";
import { AttachmentService } from "./attachment.service";
import { FileBlobService } from "./fileBlob.service";

let options: any = {
    autoDismiss: true,
    positionClass: 'toast-bottom-right',
};

@NgModule({
    imports: [
        SharedModule,
        HttpModule
    ],
    exports: [
        AttachmentComponent
    ],
    declarations: [
        AttachmentComponent
    ],
    providers: [
        AttachmentService,
        FileBlobService
    ]
})

export class AttachmentModule {}

The module exports the component and provides the services discussed above.

View

We have to implement the view for the attachment module:

<div *ngIf="AttachmentIsNull()">
    <label class="btn btn-primary" for="fileBlob">
        <i class="fa fa-paperclip"></i> {{ "ATTACHINVOICE" | translate }}
        <input id="fileBlob" type="file" [(ngModel)]="fileBlob" (change)="AddAttachment($event)" [required]="validationEnabled" style="display: none;" />
    </label>
</div>
<div *ngIf="!AttachmentIsNull()">
    <span *ngIf="attachment" (click)="DownloadAttachment()">{{attachment.Name}}</span>
    <input type="button" class="btn btn-primary" value="Upload new" (click)="UploadNewAttachment()" />
</div>

In the view we have a file input field that binds the change event to the AddAttachment method.

The additional buttons allow us to clear the current attachment and upload a new one.

The last change is in the parent view:


<form #invoiceForm="ngForm">
    <div class="form">
        ...
        <div class="form-group">
            <label for="attachment">{{ "ATTACHMENT" | translate }}</label>
            <attachment placeholder="ATTACHMENT" name="attachment" [attachment]="attachment" (onSaved)="onAttachmentSaved($event)" [validationEnabled]="validationEnabled"></attachment>
        </div>
    </div>
</form>

We have added the attachment component to the view and bound the onSaved event, in order to retrieve the file metadata.

You can find the source code here.

Register and test per-request services with Autofac

When we develop web applications, like ASP.NET applications, we often need to implement a service holding information related to the user request, such as session or account details.

In this case, the service will be tied to the web request lifecycle and there will be an instance of the service for each request.

Autofac helps us manage the instances and the lifecycles of these services.

Service

We can develop a simple service that we use for our tests:


public class AccountService
{
    private ITokenService _tokenService;

    public AccountService(ITokenService tokenService)
    {
        _tokenService = tokenService;
    }
}

Module

In order to register the service, we use an Autofac module:


public class PerRequestModule : Module
{
    protected override void Load(ContainerBuilder builder)
    {
        builder.RegisterType<AccountService>()
            .AsSelf()
            .InstancePerRequest()
            .WithParameter(new ResolvedParameter(
                (pi, ctx) => pi.ParameterType == typeof(ITokenService),
                (pi, ctx) => ctx.ResolveKeyed<ITokenService>("singletonTokenService")
            ));
    }
}

The service was registered as InstancePerRequest; we also specified explicitly which ITokenService implementation to use for the constructor parameter.
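
The keyed registration referenced by "singletonTokenService" is not shown in the module; it might look like this (SingletonTokenService is an assumed implementation of ITokenService):

builder.RegisterType<SingletonTokenService>()
    .Keyed<ITokenService>("singletonTokenService")
    .SingleInstance();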

And register it in the Autofac container:


var builder = new ContainerBuilder();
...
builder.RegisterModule(new PerRequestModule());
...
containerBuilder = builder.Build();

Test methods

Now, the last step is the test methods:


[Test]
public void should_is_not_the_same_instance_for_different_requests()
{
    AccountService accountService1, accountService2;

    using (HttpRequestMessage request = new HttpRequestMessage())
    {
        request.SetConfiguration(httpConfiguration);
        var dependencyScope = request.GetDependencyScope();
        accountService1 = dependencyScope.GetService(typeof(AccountService)) as AccountService;
    }

    using (HttpRequestMessage request = new HttpRequestMessage())
    {
        request.SetConfiguration(httpConfiguration);
        var dependencyScope = request.GetDependencyScope();
        accountService2 = dependencyScope.GetService(typeof(AccountService)) as AccountService;
    }

    ReferenceEquals(accountService1, accountService2).ShouldBeEquivalentTo(false);
}

[Test]
public void should_be_able_to_resolve_instance_per_request()
{
    using (HttpRequestMessage request = new HttpRequestMessage())
    {
        request.SetConfiguration(httpConfiguration);
        var dependencyScope = request.GetDependencyScope();
        AccountService service = dependencyScope.GetService(typeof(AccountService)) as AccountService;

        service.Should().NotBeNull();
    }
}

In order to test a per-request service, we need an instance of the HttpRequestMessage class, configured via SetConfiguration with the httpConfiguration that uses the Autofac dependency resolver; then we can use the request's dependency scope to get an instance of our service and run the assertions.
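
For reference, a minimal sketch of how the test fixture might build the httpConfiguration with the Autofac resolver (the names are assumptions):

var builder = new ContainerBuilder();
builder.RegisterModule(new PerRequestModule());
var container = builder.Build();

// The resolver makes the per-request scope available to GetDependencyScope().
httpConfiguration = new HttpConfiguration
{
    DependencyResolver = new AutofacWebApiDependencyResolver(container)
};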

You can find the source code here.

Registering of the ASP.NET MVC Controllers with Autofac

One of the capabilities of Autofac is the integration with the ASP.NET applications.

ASP.NET MVC is a framework designed to support dependency injection, so we can use Autofac to register the components that compose the application, such as the controllers.

Therefore, let's start by implementing the controllers of the application; then we'll add the Autofac module that defines the controller registrations.

Controllers

We implement two controllers, a Controller and an ApiController:

public class HomeController : Controller
{
    LoggerService _loggerService;

    public HomeController(LoggerService loggerService)
    {
        _loggerService = loggerService;
    }
}

The second one is the Web API controller:

public class AccountController : ApiController
{
    LoggerService _loggerService;

    public AccountController(LoggerService loggerService)
    {
        _loggerService = loggerService;
    }
}

Autofac module

The first step is to install the two Autofac packages needed for the integration with ASP.NET:

install-package Autofac.Integration.Mvc
install-package Autofac.Integration.WebApi

Now we can register the controllers with an Autofac module:

public class PerRequestModule : Module
{
    protected override void Load(ContainerBuilder builder)
    {
        var controllersAssembly = Assembly.GetAssembly(typeof(HomeController));
        var apiControllersAssembly = Assembly.GetAssembly(typeof(AccountController));

        builder.RegisterControllers(controllersAssembly);
        builder.RegisterApiControllers(apiControllersAssembly);
    }
}

By using reflection, we can register all the controllers in one shot.

The module needs to be registered in the Autofac container:

var builder = new ContainerBuilder();
....
builder.RegisterModule(new PerRequestModule());
...
containerBuilder = builder.Build();

DependencyResolver.SetResolver(new AutofacDependencyResolver(containerBuilder));

httpConfiguration = new HttpConfiguration
{
DependencyResolver = new AutofacWebApiDependencyResolver(containerBuilder)
};

We built the container and passed it to the AutofacDependencyResolver (for MVC controllers) and to the AutofacWebApiDependencyResolver (for API controllers).

Tests

Now we can implement the test methods:

public class PerRequestTests : BaseTests
{
    [Test]
    public void should_be_able_to_resolve_mvc_controller()
    {
        using (var scope = containerBuilder.BeginLifetimeScope())
        {
            var controller = scope.Resolve<HomeController>();
            controller.Should().NotBeNull();
        }
    }

    [Test]
    public void should_be_able_to_resolve_api_controller()
    {
        using (var scope = containerBuilder.BeginLifetimeScope())
        {
            var controller = scope.Resolve<AccountController>();
            controller.Should().NotBeNull();
        }
    }
}

As you can see, we initialized a new lifetime scope and tried to resolve the controllers.
You can find the source code here.

Services lifetime scope with Autofac

Basically, when we resolve a component with Autofac, we could do it from the root container; this is not a best practice, because these components will never be disposed as long as the container lives, which normally is the lifetime of the application.

The root container will hold the references until the application shuts down, and this could cause memory leaks.

A better approach is using lifetime scopes, which help us define an area where the service can be shared with other components and disposed at the end.

Let’s start defining the classes to be used as services in the Autofac configuration.

Services

We can use two simple classes, a CustomerService and an OrdersService:


public class CustomerService
{
    public CustomerService() { }
}

public class OrdersService
{
    private ITokenService _tokenService;

    public OrdersService(ITokenService tokenService)
    {
        this._tokenService = tokenService;
    }
}

The OrdersService accepts an ITokenService in the constructor, and we have an implementation of it:


public class PerDependencyTokenService : ITokenService
{
    private Guid _token { get; set; }

    public Guid GetToken()
    {
        if (_token == Guid.Empty)
            _token = Guid.NewGuid();

        return _token;
    }
}

Now we can define the service registrations; let's start by creating the Autofac modules.

Modules

First of all, we need to register the PerDependencyTokenService, and we must identify it in order to distinguish this service from the others that implement the ITokenService interface:


public class PerDependencyModule : Module
{
    protected override void Load(ContainerBuilder builder)
    {
        builder.RegisterType<PerDependencyTokenService>()
            .AsSelf()
            .AsImplementedInterfaces()
            .Keyed<ITokenService>("perDependencyTokenService");
    }
}

We registered the service as keyed, identifying it with a specific tag.
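
A keyed registration is not resolved by type alone; to get it explicitly we use ResolveKeyed, for example (container is the built Autofac container):

using (var scope = container.BeginLifetimeScope())
{
    var tokenService = scope.ResolveKeyed<ITokenService>("perDependencyTokenService");
}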

Now we can register the other services:


public class PerLifetimeScopeModule : Module
{
    protected override void Load(ContainerBuilder builder)
    {
        builder.RegisterType<OrdersService>()
            .AsSelf()
            .InstancePerLifetimeScope()
            .WithParameter(new ResolvedParameter(
                (pi, ctx) => pi.ParameterType == typeof(ITokenService),
                (pi, ctx) => ctx.ResolveKeyed<ITokenService>("perDependencyTokenService")
            ));

        // PER MATCHING LIFETIMESCOPE
        builder.RegisterType<CustomerService>()
            .AsSelf()
            .InstancePerMatchingLifetimeScope("scope1");
    }
}

OrdersService is registered as InstancePerLifetimeScope, so an instance of this component will be unique within a specific scope.

We specify the ITokenService parameter as well, telling Autofac to resolve it with a specific key; this way we are able to choose the concrete class that implements the ITokenService interface.

The second registration is about the CustomerService; it is slightly different from the first one, because we want to resolve the CustomerService only for lifetime scopes that have a specific name.

Now we are going to implement the test methods that will use the services.

Test methods

The first step is to register the module in the Autofac container:

builder.RegisterModule(new PerLifetimeScopeModule());

Then we can implement the test methods:

public class InstancePerLifetimeScopeTests : BaseTests
{
    [Test]
    public void should_is_not_the_same_instance_for_different_lifetime_scopes()
    {
        OrdersService ordersService1, ordersService2;

        using (var scope = containerBuilder.BeginLifetimeScope())
        {
            ordersService1 = scope.Resolve<OrdersService>();
        }

        using (var scope = containerBuilder.BeginLifetimeScope())
        {
            ordersService2 = scope.Resolve<OrdersService>();
        }

        ReferenceEquals(ordersService1, ordersService2).ShouldBeEquivalentTo(false);
    }

    [Test]
    public void should_not_be_able_to_resolve_instance_per_lifetime_scope()
    {
        CustomerService customerService1 = null, customerService2 = null;

        try
        {
            using (var scope = containerBuilder.BeginLifetimeScope())
            {
                customerService1 = scope.Resolve<CustomerService>();

                using (var scope1 = containerBuilder.BeginLifetimeScope("scope1"))
                {
                    customerService2 = scope.Resolve<CustomerService>();
                }

                customerService1.ShouldBeEquivalentTo(null);
            }
        }
        catch (Exception)
        {
            customerService1.ShouldBeEquivalentTo(null);
        }
    }

    [Test]
    public void should_be_able_to_resolve_instance_per_lifetime_scope()
    {
        CustomerService customerService1, customerService2;

        using (var scope = containerBuilder.BeginLifetimeScope("scope1"))
        {
            customerService1 = scope.Resolve<CustomerService>();

            using (var scope1 = containerBuilder.BeginLifetimeScope())
            {
                customerService2 = scope.Resolve<CustomerService>();
            }
        }

        ReferenceEquals(customerService1, customerService2).ShouldBeEquivalentTo(true);
    }
}

In the first method, we create two different lifetime scopes and check that the two objects do not share the same reference.

In the second one, we should not be able to resolve the CustomerService from a generic lifetime scope, because we registered this service for a scope named scope1.

So this row:

customerService1 = scope.Resolve<CustomerService>();

will throw a DependencyResolutionException.

In the third method, instead, we are able to resolve the service, because we create the instance in a lifetime scope named scope1; of course, this service will be a single instance within every lifetime scope that matches the name.

You can find the source code here.