Continuous integration with Atlassian Bamboo

Continuous integration is a frequent theme in application development, especially in projects with many developers working on different components of the application.

Precisely because developers work on different features, a process that frequently integrates new code into the project and verifies that all its dependencies still work is recommended.

The practice of integrating new code into the shared project repository daily avoids problems like integration hell, where a developer struggles to integrate the code of another developer who has worked in isolation for a long period.

Moreover, a process that manages this integration allows us to run automated tests in this phase and discover any regressions in the modified code.

There are several products that manage the continuous integration process; one of them is Bamboo, a product owned by Atlassian. In this post I want to talk about configuring continuous integration for a .NET project with a set of tests implemented with NUnit.

Server configuration

Before proceeding with the project configuration, we need to take care of the executables that Bamboo needs to execute the process.

In my case I need the MSBuild and NUnit executables, and I can configure them in the server capabilities section of the Bamboo configuration:

[screenshot: MSBuild and NUnit entries in the Bamboo server capabilities]

In my case I use the MSBuild installed with Visual Studio, but you can download it from this url.

The NUnit console is the executable used for test execution, and you can find it here.

Plan

When I need to configure a new project in Bamboo, I have to create a new plan.

A plan is composed of one or more stages; in my case, for example, I have a build stage, a test stage and a deploy stage.

Inside the stages I will create the jobs that do the actual work.

In the plan configuration I need to take care of the linked repository, that is, the repository from which Bamboo will load the source code:

[screenshot: linked repository configuration]

In my case I have specified the URL of a GitHub repository.

Another thing I can do is define a trigger for the plan, that is, an event that fires the plan execution.

I define a branch that Bamboo will poll for new changes:

[screenshot: repository polling trigger configuration]

Every time a new commit is pushed to the master branch, the plan will be executed.

Build

Now I configure the Solution Build job in the Build stage; every job is composed of one or more tasks, and in this case I have three tasks.

The first one is the Source Code Checkout, the phase where Bamboo gets the source code from the repository and copies all the content into its working directory (locally, on the server where it is installed).

The second one is a Script task; in my case I need to restore some NuGet packages before building the project, and I do that by writing a script that launches the NuGet executable with the restore option:

[screenshot: script task for the NuGet restore]

You can find the NuGet executable download here.
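
As a sketch, the script body boils down to a single command like the following; the paths here are assumptions and depend on where nuget.exe and the solution live in your checkout:

nuget\nuget.exe restore MySolution.sln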

The last task is the MSBuild task, where we use the executable configured above:

[screenshot: MSBuild task configuration]

Another thing we can do in the job configuration is define an artifact, so that the output of the job will be available to the next jobs:

[screenshot: artifact definition]

In this way we speed up the next job, because it won’t need to check out the repository again.

Test

The test job is simpler, because the source code is already available and it contains only the test runner task:

[screenshot: NUnit runner task configuration]

Here again we use the other executable configured above, and we only have to specify the DLL of the test project.

We must not forget to configure the artifact for this job, using the one produced by the previous job:

[screenshot: artifact dependency configuration]

Deploy

The last stage is the deploy.

To start from a clean state I do a Source Code Checkout again, and then I execute a PowerShell script:

[screenshot: deploy stage tasks]

The script was explained in the previous post; it takes care of building the project, applying a version to the assembly and copying the result to a shared directory.

That’s all; when you run the plan, the result (if the project is OK :)) will look like this:

[screenshot: successful plan execution summary]

 


Deploy a .NET project with powershell and git

In this post I want to share my experience writing a PowerShell script to compile and publish a .NET project.

I use Git and a GitHub repository, so in my mind the script should restore the NuGet packages, build the project with MSBuild, take the last tag (version) from the master branch of the GitHub repository and, if valid, apply the version found in the tag to the project assembly.

Finally, it has to produce a folder with the deployment package.

So, let’s start with the steps to implement this script.

Project configuration

Before writing the PowerShell script, I need to set up the publish configuration for the project.

So right-click on the project (in my case a web project) and select Publish; this window will appear:

[screenshot: Publish wizard]

Then we need to create a new profile and select (in my case) the folder option:

[screenshot: publish profile with the folder option]

After that, we will have a new pubxml file in the solution:

[screenshot: the new pubxml file in Solution Explorer]

We will use this file in our PowerShell script.

Source code versioning

Now it’s time to implement the script. The first step is to apply a version to the assembly and use it in the name of the folder where the application will be published.

My repository provider is GitHub, and every time I release a version on the master branch I apply a tag to the commit with the release number, like 0.2.0.

So my script has to be able to get the last tag from the master branch and apply it to the assembly.

We have several options for versioning a .NET application; the most standard way is to use the AssemblyInfo.cs file, where we can have attributes like AssemblyVersion, AssemblyFileVersion and AssemblyInformationalVersion.

While the first two attributes must contain a version in the standard format, the last one leaves us the freedom to use custom versioning, for example including the current date or the name of the Git branch.

For this reason I’ll update the AssemblyInformationalVersion.
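
For reference, these attributes live in Properties\AssemblyInfo.cs and look like this (the values here are only illustrative):

[assembly: AssemblyVersion("0.2.0.0")]
[assembly: AssemblyFileVersion("0.2.0.0")]
[assembly: AssemblyInformationalVersion("0.2.0")]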

So, first of all I need to retrieve the version from the tag applied on the GitHub repository:

$version = $(git describe --abbrev=0 --tag)

By executing this git command from the solution folder, I can retrieve the last tag applied and use it as the new version, or part of it.

Now I can check whether the version has a specific format; for example, I want the version to be composed of two or three numbers:


$versionRegex1 = "\d+\.\d+\.\d+"   # the dots are escaped, otherwise "." matches any character
$versionData1 = [regex]::matches($version,$versionRegex1)
$versionRegex2 = "\d+\.\d+"
$versionData2 = [regex]::matches($version,$versionRegex2)

if ($versionData1.Count -eq 0 -and $versionData2.Count -eq 0) { Throw "Version " + $version + " has a bad format" }

If these checks pass, I can apply the version; first I search for the AssemblyInfo files in the solution:


$files = Get-ChildItem $sourceDirectory -recurse -include "*Properties*" |
    ?{ $_.PSIsContainer } |
    foreach { Get-ChildItem -Path $_.FullName -Recurse -include AssemblyInfo.* }

If the files are found, I can apply the new version:


if ($files) {
    Write-Host "Updating version" $version
    foreach ($file in $files) {
        $filecontent = Get-Content($file)
        attrib $file -r
        $informationalVersion = [regex]::matches($filecontent,"AssemblyInformationalVersion\(""$version""\)")

        if ($informationalVersion.Count -eq 0) {
            Write-Host "Version " $version " applied to " $file
            $filecontent -replace "AssemblyInformationalVersion\(.*\)", "AssemblyInformationalVersion(""$version"")" | Out-File $file
        }
    }
}

I check whether the attribute already contains the new version value; if it does, nothing needs to be done.

The hardest step is complete; now I can build and deploy the project.

Build and deploy

In order to build the project I need MSBuild 15, which in my case is already installed with Visual Studio 2017.

If you haven’t got it, you can download it from the Microsoft web site at this link.

If you have NuGet packages in the project, you also need the NuGet executable to restore them before the build; you can download it from this link.

Now we are ready to write the code to build and deploy the project:


$msbuild = "C:\Program Files (x86)\Microsoft Visual Studio\2017\Professional\MSBuild\15.0\Bin\msbuild.exe"
$solutionFolder = $PSScriptRoot + "\..\..\"
$solutionFile = $solutionFolder + "EFContextMock.sln"
$projectFile = $solutionFolder + "WebApp\WebApp.csproj"
$nuget = $solutionFolder + "nuget\nuget.exe"
$version = $(git describe --abbrev=0 --tag)
$publishUrl = "c:\temp\EFContextMock\" + $version

# apply the version retrieved from the Git tag (the code discussed above, wrapped in a function)
SetAssemblyInfo $solutionFolder $version

Write-Host "Restore packages"

& $nuget restore $solutionFile

if ($LastExitCode -ne 0){
    $exitCode=$LastExitCode
    Write-Error "Package restore failed!"
    exit $exitCode
}
else{
    Write-Host "Package restore succeeded"
}

Write-Host "Building"

& $msbuild $projectFile /p:DeployOnBuild=true /p:PublishProfile=Publish.pubxml /p:PublishUrl=$publishUrl

if ($LastExitCode -ne 0){
    $exitCode=$LastExitCode
    Write-Error "Build failed!"
    exit $exitCode
}
else{
    Write-Host "Build succeeded"
}

After setting up some variables, I apply the version with the code discussed above (wrapped in the SetAssemblyInfo function) and restore the NuGet packages.

The msbuild command uses the pubxml file created in the first step; one of the parameters of the command is PublishUrl, which in my case is a local path.

You can find the complete PowerShell script here.


Mocking Entity Framework DbContext with Moq

When we have to test methods that involve Entity Framework, a typical choice we face is between integration tests, with a real database, and unit tests.

If we choose the first option, with a database like SQL LocalDB, we’ll have performance problems, because the cost of creating the database and inserting the data at test startup is very high, and in order to guarantee the initial conditions we’ll have to do it for every test.

What we can do instead is use a mocking framework that helps us mock the Entity Framework context; the result is an in-memory DB context, similar to the in-memory context of .NET Core that we saw in this post.

The factory

In practice, mocking a class means substituting the real implementation of a method with our custom behaviour; for every method of the class we can set up return values, so we don’t need the real implementation of the class: the methods are mocked.

In our case, we can set up the EF DbSet with an in-memory list; the methods that use the context will no longer need the real database, they will use the lists provided by the mock.

But before doing that we have another problem: how can we provide our in-memory DB context to the methods we need to test?

In the real world, it’s likely that the context will be instantiated with the using statement in every method, like this:


public async Task<List<Person>> GetPersons(string query)
{
    using (var db = new Context())
    {
        .....
    }
}

This is a problem, because we are not able to inject our mocked context into the class.

But we can solve it with a Factory service, a singleton service that returns instances of the db context:


public class ContextFactory
{
    private Type _dbContextType;
    private DbContext _dbContext;

    public void Register<TDbContext>(TDbContext dbContext) where TDbContext : DbContext, new()
    {
        _dbContextType = typeof(TDbContext);
        _dbContext = dbContext;
    }

    public TDbContext Get<TDbContext>() where TDbContext : DbContext, new()
    {
        if (_dbContext == null || _dbContextType != typeof(TDbContext))
        {
            return new TDbContext();
        }

        return (TDbContext)_dbContext;
    }
}

We have two methods: with the Register method we can set up a specific DB context implementation, and with the Get method we can get an instance of a DB context, which is the registered implementation if we have one, otherwise the default one.
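
For example, when nothing has been registered, Get simply falls back to the default implementation:

var factory = new ContextFactory();
var db = factory.Get<Context>(); // no registration yet: returns a new Context()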

We can now inject this service as a dependency and use it:


public class PersonService
{
    private readonly ContextFactory _contextFactory;

    public PersonService(ContextFactory contextFactory)
    {
        _contextFactory = contextFactory;
    }

    public async Task<List<Person>> GetPersons(string query)
    {
        using (var db = _contextFactory.Get<Context>())
        {
            return await db.Persons.Where(p => p.TaxCode.Contains(query) || p.Firstname.Contains(query) || p.Surname.Contains(query)).ToListAsync();
        }
    }
}

Now we are ready to mock the EF context.

The mock

The framework I use for this purpose is Moq, and I can install it with NuGet:

install-package moq

It’s likely that you use the async methods of Entity Framework; if so, in order to mock them we need an in-memory DbAsyncQueryProvider, and you can find an implementation here.

The unit testing framework used for this example is NUnit, and I can configure the mocked context in the setup method; the first step is to prepare a list of queryable objects:


[SetUp]
public void Setup()
{
    var persons = new List<Person>() {
        new Person() { TaxCode = "taxcode1", Firstname = "firstname1", Surname = "surname1" },
        new Person() { TaxCode = "taxcode2", Firstname = "firstname2", Surname = "surname2" }
    };
    var queryable = persons.AsQueryable();
}

Now I’m ready to set up the mock:


MockSet = new Mock<DbSet<Person>>();

MockSet.As<IQueryable<Person>>().Setup(m => m.Expression).Returns(queryable.Expression);
MockSet.As<IQueryable<Person>>().Setup(m => m.ElementType).Returns(queryable.ElementType);
MockSet.As<IQueryable<Person>>().Setup(m => m.GetEnumerator()).Returns(queryable.GetEnumerator);

MockSet.As<IQueryable<Person>>().Setup(m => m.Provider).Returns(new AsyncQueryProvider<Person>(queryable.Provider));
MockSet.As<IDbAsyncEnumerable<Person>>().Setup(m => m.GetAsyncEnumerator()).Returns(new AsyncEnumerator<Person>(queryable.GetEnumerator()));

In order to mock an IQueryable, I have to set up return values for the Expression, ElementType and GetEnumerator members; every time these members are invoked during query execution, the values I set up in the Returns expression will be returned.

I need to do the same for the Provider property and GetAsyncEnumerator but, since async methods are involved, I have to use the custom AsyncQueryProvider and AsyncEnumerator classes of the in-memory DbAsyncQueryProvider.

The mocks for the Add and Remove operations are simpler:


MockSet.Setup(m => m.Add(It.IsAny<Person>())).Callback((Person person) => persons.Add(person));
MockSet.Setup(m => m.Remove(It.IsAny<Person>())).Callback((Person person) => persons.Remove(person));

For the Add and Remove methods we don’t set up return values: we only need the side effect on the backing list, so we use Callback instead of Returns.

The last step is to set up the factory service with the mocked context:


MockContext = new Mock<Context>();
MockContext.Setup(m => m.Persons).Returns(MockSet.Object);

var contextFactory = new ContextFactory();
contextFactory.Register(MockContext.Object);
PersonService = new PersonService(contextFactory);

First of all I set up the Persons property of the mocked context to return the mocked DbSet.

Then I register the mocked context in the factory service and pass the factory service as a dependency of the service under test.

With these lines of code I have mocked the Entity Framework context with an in-memory instance and, leveraging the context factory, I was able to inject the mocked context into the service.
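
To round things out, a test over this setup might look like the following sketch, using the data prepared in the Setup method above:

[Test]
public async Task GetPersons_should_filter_by_query()
{
    // "taxcode1" matches only the first person in the mocked list
    var result = await PersonService.GetPersons("taxcode1");

    Assert.AreEqual(1, result.Count);
    Assert.AreEqual("firstname1", result[0].Firstname);
}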

You can find the source code here.

 


Manage attachments chunks with ASP.NET Web Api

In the previous post I spoke about a custom MultipartFormData stream provider and how it can help us manage custom information included in a request message.

In that example I generated chunks from a file and sent them to a REST service (a Web API) with some additional information that was then retrieved by the custom provider.

Now I want to use this information to manage the upload session and merge all the chunks once they have been received.

What I need to do is define the models involved in the process and the service that manages the chunks.

Models

We have to define two things; the first one is the model for the chunk:


public class ChunkMetadata
{
    public string Filename { get; set; }
    public int ChunkNumber { get; set; }

    public ChunkMetadata(string filename, int chunkNumber)
    {
        Filename = filename;
        ChunkNumber = chunkNumber;
    }
}

The ChunkNumber property deserves an explanation: it is the number associated with the chunk, and it will be useful to restore the correct order when we have to merge all of them.

The second one is the model of the session, that is, the set of chunks that compose the file.

First of all we define the interface:


public interface IUploadSession
{
    ConcurrentBag<ChunkMetadata> Chunks { get; set; }
    string Filename { get; }
    long Filesize { get; }
    bool AddChunk(string filename, string chunkFileName, int chunkNumber, int totalChunks);
    Task MergeChunks(string path);
}

The Filename and Filesize properties are closely tied to the session; we need the AddChunk and MergeChunks methods as well.

We also need a thread-safe collection for the chunks that compose the session, so we define a ConcurrentBag, which is the thread-safe counterpart of a List.

Now we can implement the model:


public class UploadSession : IUploadSession
{
    public string Filename { get; private set; }
    public long Filesize { get; private set; }
    private int _totalChunks;
    private int _chunksUploaded;

    public ConcurrentBag<ChunkMetadata> Chunks { get; set; }

    public UploadSession()
    {
        Filesize = 0;
        _chunksUploaded = 0;
        Chunks = new ConcurrentBag<ChunkMetadata>();
    }

    public bool AddChunk(string filename, string chunkFileName, int chunkNumber, int totalChunks)
    {
        if (Filename == null)
        {
            Filename = filename;
            _totalChunks = totalChunks;
        }

        var metadata = new ChunkMetadata(chunkFileName, chunkNumber);
        Chunks.Add(metadata);

        // Interlocked.Increment updates the field atomically and returns the new value;
        // assigning the result back to the field would introduce a race
        var chunksUploaded = Interlocked.Increment(ref _chunksUploaded);
        return chunksUploaded == _totalChunks;
    }

    public async Task MergeChunks(string path)
    {
        var filePath = path + Filename;

        using (var mainFile = new FileStream(filePath, FileMode.Create))
        {
            foreach (var chunk in Chunks.OrderBy(c => c.ChunkNumber))
            {
                using (var chunkFile = new FileStream(chunk.Filename, FileMode.Open))
                {
                    await chunkFile.CopyToAsync(mainFile);
                    Filesize += chunkFile.Length;
                }
            }
        }

        foreach (var chunk in Chunks)
        {
            File.Delete(chunk.Filename);
        }
    }
}

The implementation is quite simple.

The AddChunk method adds the new chunk to the collection, then increments the _chunksUploaded field with the thread-safe operation Interlocked.Increment; at the end, the method returns a bool that is true if all the chunks have been received, false otherwise.

The MergeChunks method deals with retrieving all the chunks from the file system.

It takes the collection, orders it by chunk number, reads the bytes from each chunk and copies them to the main file stream.

Finally, the chunk files are deleted.

Service

The service will have an interface like this:


public interface IUploadService
{
    Guid StartNewSession();
    Task<bool> UploadChunk(HttpRequestMessage request);
}

The idea is that the StartNewSession method instantiates a new session object and assigns a new correlation id, which is the unique identifier of the session.

This is the implementation:


public class UploadService : IUploadService
{
    private readonly Context _db = new Context();
    private readonly string _path;
    private readonly ConcurrentDictionary<string, UploadSession> _uploadSessions;

    public UploadService(string path)
    {
        _path = path;
        _uploadSessions = new ConcurrentDictionary<string, UploadSession>();
    }

    public async Task<bool> UploadChunk(HttpRequestMessage request)
    {
        var provider = new CustomMultipartFormDataStreamProvider(_path);
        await request.Content.ReadAsMultipartAsync(provider);
        provider.ExtractValues();

        UploadSession uploadSession;
        _uploadSessions.TryGetValue(provider.CorrelationId, out uploadSession);

        if (uploadSession == null)
            throw new ObjectNotFoundException();

        var completed = uploadSession.AddChunk(provider.Filename, provider.ChunkFilename, provider.ChunkNumber, provider.TotalChunks);

        if (completed)
        {
            await uploadSession.MergeChunks(_path);

            var fileBlob = new FileBlob()
            {
                Id = Guid.NewGuid(),
                Path = _path + uploadSession.Filename,
                Name = uploadSession.Filename,
                Size = uploadSession.Filesize
            };

            _db.FileBlobs.Add(fileBlob);
            await _db.SaveChangesAsync();

            return true;
        }

        return false;
    }

    public Guid StartNewSession()
    {
        var correlationId = Guid.NewGuid();
        var session = new UploadSession();
        _uploadSessions.TryAdd(correlationId.ToString(), session);

        return correlationId;
    }
}

In the StartNewSession method we use the thread-safe method TryAdd to add a new session to the ConcurrentDictionary.

As for the UploadChunk method, we saw the first part of the implementation in the previous post.

Once the metadata has been retrieved from the request, we try to find the session object with a thread-safe operation.

If we don’t find the object, we throw an exception, because we expect the related session to exist.

If the session exists, we add the chunk to the session and we check the result of the operation.

If it is the last chunk, we merge all of them, and at that point we can do a database operation if needed.

Controller

The implementation of the controller is very simple:


public class FileBlobsController : ApiController
{
    private readonly IUploadService _fileBlobsService;
    private readonly Context _db = new Context();

    public FileBlobsController(IUploadService uploadService)
    {
        _fileBlobsService = uploadService;
    }

    [Route("api/fileblobs/getcorrelationid")]
    [HttpGet]
    public IHttpActionResult GetCorrelationId()
    {
        return Ok(_fileBlobsService.StartNewSession());
    }

    [HttpPost]
    public async Task<IHttpActionResult> PostFileBlob()
    {
        if (!Request.Content.IsMimeMultipartContent())
            throw new Exception();

        var result = await _fileBlobsService.UploadChunk(Request);

        return Ok(result);
    }
}
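
For context, a client would first request a correlation id and then post each chunk as multipart content. This is only a sketch: the multipart field names are assumptions based on the custom provider of the previous post, the POST route assumes the default Web API routing, and it runs inside an async method:

// chunkBytes is a placeholder for the bytes of the current chunk
using (var client = new HttpClient { BaseAddress = new Uri("http://localhost/") })
{
    var correlationId = (await client.GetStringAsync("api/fileblobs/getcorrelationid")).Trim('"');

    using (var content = new MultipartFormDataContent())
    {
        content.Add(new StringContent(correlationId), "CorrelationId");
        content.Add(new StringContent("1"), "ChunkNumber");
        content.Add(new StringContent("3"), "TotalChunks");
        content.Add(new ByteArrayContent(chunkBytes), "file", "document.pdf");

        var response = await client.PostAsync("api/fileblobs", content);

        // ReadAsAsync comes from the Microsoft.AspNet.WebApi.Client package;
        // the service returns true when the last chunk has been merged
        var completed = await response.Content.ReadAsAsync<bool>();
    }
}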

You can find the source code here.


Real-time search with ASP.NET and Elasticsearch

A common problem we face once our applications are deployed is improving the performance of a page or feature.

In my case, for example, I had a field where the user could search and select a city; the source list was large and the search was quite slow, so I wanted a better user experience.

We can solve performance problems like these with the help of a cache or a full-text search engine.

I chose the latter, with Elasticsearch as the full-text engine, so I’ll describe the steps I followed to configure and use it in my application.

Installation

The first step is to install the Elasticsearch server, which you can download here.

Once installed, we have to start it by running the following executable:

<Installation path>\bin\elasticsearch.bat

This is the server log:

[screenshot: Elasticsearch server log]
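
To verify that the server is reachable, you can also hit it over HTTP on its default port (9200, the same port used later in this post):

curl http://localhost:9200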

The server takes care of indexing the content we pass to it; to do that, we need a client to use in our application; in my case the application was .NET and I used NEST.

NEST

As mentioned above, NEST is a high-level Elasticsearch client for .NET applications.

The first step is to install it in the application with NuGet:

Install-package NEST

And in the packages.config we’ll have:

[screenshot: NEST entry in packages.config]

Now we have all the necessary tools and we can develop the code for the search feature.

Client

We define a client class with a single responsibility: setting up the URL and the default index of the client, and instantiating it:


public class ElasticSearchClient
{
    private readonly IElasticClient _client;

    public ElasticSearchClient(IElasticClient client)
    {
        _client = client;
    }

    public ElasticSearchClient(string uri, string indexName) : this(CreateElasticClient(uri, indexName)) {}

    public IElasticClient GetClient()
    {
        return _client;
    }

    private static ElasticClient CreateElasticClient(string uri, string indexName)
    {
        var node = new Uri(uri);
        var setting = new ConnectionSettings(node);
        setting.DefaultIndex(indexName);
        return new ElasticClient(setting);
    }
}

Once instantiated, the class exposes the client instance through GetClient; we can register it in the startup class of the application with Autofac:


public partial class Startup
{
    public void Configuration(IAppBuilder app)
    {
        var builder = new ContainerBuilder();

        builder.Register(c => new ElasticSearchClient("http://localhost:9200", "cities"))
            .AsSelf()
            .SingleInstance();
        ...
    }
}

Service base class

A service that uses an Elasticsearch index should be able to perform some basic operations concerning the logic of full-text indexes.

It has to initialize a specific index, populate the index with content and, obviously, perform searches on the index with specific parameters.

So, we have to define an interface like this:


internal interface IElasticSearchService<T> where T : class
{
    void Init();
    void CheckIndex();
    void BulkInsert(List<T> objects);
    IEnumerable<T> Search(string query);
}

I like to separate the Init method, which creates the index, from the CheckIndex method, which checks whether the index already exists.

Now we can implement the basic service:


public class ElasticSearchService<T> : IElasticSearchService<T> where T : class
{
    protected readonly Context Db = new Context();
    protected readonly ElasticSearchClient ElasticSearchClient;
    protected readonly string IndexName;

    public ElasticSearchService(ElasticSearchClient elasticSearchClient, string indexName)
    {
        ElasticSearchClient = elasticSearchClient;
        IndexName = indexName;
    }

    public virtual void Init()
    {
        CheckIndex();
        BulkInsert(Db.Set<T>().ToList());
    }

    public void CheckIndex()
    {
        if (IndexExist()) return;
        var response = CreateIndex();

        if (!response.IsValid)
        {
            throw new Exception(response.ServerError.ToString(), response.OriginalException);
        }
    }

    public void BulkInsert(List<T> objects)
    {
        var response = ElasticSearchClient.GetClient().IndexMany(objects, IndexName);
        if (!response.IsValid)
        {
            throw new Exception(response.ServerError.ToString(), response.OriginalException);
        }
    }

    public virtual IEnumerable<T> Search(string query)
    {
        var results = ElasticSearchClient.GetClient().Search<T>(c => c.From(0).Size(10).Query(q => q.Prefix("_all", query)));

        return results.Documents;
    }

    protected virtual IResponse CreateIndex()
    {
        var indexDescriptor = new CreateIndexDescriptor(IndexName).Mappings(ms => ms.Map<T>(m => m.AutoMap()));
        return ElasticSearchClient.GetClient().CreateIndex(indexDescriptor);
    }

    protected bool IndexExist()
    {
        return ElasticSearchClient.GetClient().IndexExists(IndexName).Exists;
    }
}

The constructor accepts the client and the index name.

We define a virtual Init method that checks whether the index exists and does a bulk insert of a list of objects; the method is virtual because a derived service might need to override it.

The BulkInsert method leverages the client to index the object list, and the Search method implements a basic search that looks in all the fields of the objects by using the special field _all, which contains the concatenated values of all the fields.

The method returns the first 10 elements.

CreateIndex creates a specific index with the AutoMap option, which infers the Elasticsearch field datatypes from the POCO object we pass to it; it’s protected, so a derived class can use it.

IndexExist checks whether the index exists, and it can be used from a derived class as well.

Service

Now we can implement a specific service that inherits from the ElasticSearchService class.

In this example I need to search in a list of cities and related districts, so I need to override the CreateIndex method like this:


public sealed class CitiesService : ElasticSearchService<City>
{
    public CitiesService(ElasticSearchClient elasticSearchClient, string indexName) : base(elasticSearchClient, indexName) {}

    protected override IResponse CreateIndex()
    {
        var indexDescriptor = new CreateIndexDescriptor(IndexName).Mappings(
            ms => ms.Map<City>(m => m.AutoMap().Properties(ps =>
                ps.Nested<District>(n => n
                    .Name(nn => nn.District)
                    .AutoMap()))));

        return ElasticSearchClient.GetClient().CreateIndex(indexDescriptor);
    }

    public override IEnumerable<City> Search(string query)
    {
        var results = ElasticSearchClient.GetClient().Search<City>(c => c.From(0).Size(10).Query(q => q.Prefix(p => p.Name, query) || q.Term("district.name", query)));

        return results.Documents.OrderBy(d => d.Name);
    }
}

What I need to do is auto-map the City object and the District, which is an entity closely related to the City; so I have to map the District property as nested, with the AutoMap option as well.

Thus I will be able to search on all the properties of the city and the district.
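
For reference, the POCO entities behind this mapping could look like the following sketch (the real classes may well have more properties):

public class City
{
    public int Id { get; set; }
    public string Name { get; set; }
    public District District { get; set; }
}

public class District
{
    public int Id { get; set; }
    public string Name { get; set; }
}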

The other method I override is the Search method; I search for a partial match on the name of the city (Prefix) and for the specific term in the district name (Term), and I return the first 10 elements.

Now I have to register the service with Autofac:


public partial class Startup
{
    public void Configuration(IAppBuilder app)
    {
        var builder = new ContainerBuilder();

        builder.Register(c => new ElasticSearchClient("http://localhost:9200", "cities"))
            .AsSelf()
            .SingleInstance();

        builder.Register(c => new CitiesService(c.Resolve<ElasticSearchClient>(), "cities"))
            .AsSelf()
            .AsImplementedInterfaces()
            .SingleInstance();

        ...
    }
}

The last step is to initialize the full-text index of my service:


public partial class Startup
{
    public void Configuration(IAppBuilder app)
    {
        var builder = new ContainerBuilder();

        builder.Register(c => new ElasticSearchClient("http://localhost:9200", "cities"))
            .AsSelf()
            .SingleInstance();

        builder.Register(c => new CitiesService(c.Resolve<ElasticSearchClient>(), "cities"))
            .AsSelf()
            .AsImplementedInterfaces()
            .SingleInstance();

        ...

        InitElasticSearchServices(containerBuilder);
    }

    private static void InitElasticSearchServices(IContainer containerBuilder)
    {
        var citiesServices = containerBuilder.Resolve<CitiesService>();
        citiesServices.Init();
    }
}

I resolve an instance of the service from the container and call the Init method of the ElasticSearchService that we saw above.

This method will create and populate the index.

Web API

Now I can use the service in my Web API, like this:


public class CitiesController : ApiController
{
    private readonly CitiesService _elasticSearchService;

    public CitiesController(CitiesService elasticSearchService)
    {
        _elasticSearchService = elasticSearchService;
    }

    // GET: api/Cities
    public IEnumerable<City> GetCities(string query)
    {
        return _elasticSearchService.Search(query);
    }

    protected override void Dispose(bool disposing)
    {
        base.Dispose(disposing);
    }
}

You can find the source code of this topic here.


Register and test per-request services with Autofac

When we develop web applications, like ASP.NET applications, we often need to implement a service that holds information related to the user request, such as session/account info.

In this case, the service will be tied to the web request lifecycle and there will be an instance of the service for each request.

Autofac helps us manage the instances and lifecycles of these services.

Service

We can develop a simple service that we use for our tests:


public class AccountService
{
    private ITokenService _tokenService;

    public AccountService(ITokenService tokenService)
    {
        _tokenService = tokenService;
    }
}
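
The ITokenService dependency is not shown in the post; a minimal sketch could look like this (the names are assumptions), keeping in mind that an implementation must be registered with the key used by the module below:

public interface ITokenService
{
    string GetToken();
}

// hypothetical implementation, registered elsewhere with:
// builder.RegisterType<SingletonTokenService>().Keyed<ITokenService>("singletonTokenService").SingleInstance();
public class SingletonTokenService : ITokenService
{
    public string GetToken() => "token";
}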

Module

In order to register the service, we use an Autofac module:


public class PerRequestModule : Module
{
    protected override void Load(ContainerBuilder builder)
    {
        builder.RegisterType<AccountService>()
            .AsSelf()
            .InstancePerRequest()
            .WithParameter(new ResolvedParameter(
                (pi, ctx) => pi.ParameterType == typeof(ITokenService),
                (pi, ctx) => ctx.ResolveKeyed<ITokenService>("singletonTokenService")
            ));
    }
}

The service is registered as InstancePerRequest; we also specified explicitly which implementation to resolve for the ITokenService constructor parameter.

And register it in the Autofac container:


var builder = new ContainerBuilder();
...
builder.RegisterModule(new PerRequestModule());
...
containerBuilder = builder.Build();

Test methods

Now, the last step is the test methods:


[Test]
public void should_is_not_the_same_instance_for_different_requests()
{
    AccountService accountService1, accountService2;

    using (HttpRequestMessage request = new HttpRequestMessage())
    {
        request.SetConfiguration(httpConfiguration);
        var dependencyScope = request.GetDependencyScope();
        accountService1 = dependencyScope.GetService(typeof(AccountService)) as AccountService;
    }

    using (HttpRequestMessage request = new HttpRequestMessage())
    {
        request.SetConfiguration(httpConfiguration);
        var dependencyScope = request.GetDependencyScope();
        accountService2 = dependencyScope.GetService(typeof(AccountService)) as AccountService;
    }

    ReferenceEquals(accountService1, accountService2).ShouldBeEquivalentTo(false);
}

[Test]
public void should_be_able_to_resolve_instance_per_request()
{
    using (HttpRequestMessage request = new HttpRequestMessage())
    {
        request.SetConfiguration(httpConfiguration);
        var dependencyScope = request.GetDependencyScope();
        AccountService service = dependencyScope.GetService(typeof(AccountService)) as AccountService;

        service.Should().NotBeNull();
    }
}

In order to test a per-request service, we need an instance of the HttpRequestMessage class configured with the httpConfiguration that holds the Autofac dependency resolver; then we can use the request's dependency scope to get an instance of our service and make the assertions.

You can find the source code here.


Registering of the ASP.NET MVC Controllers with Autofac

One of the capabilities of Autofac is its integration with ASP.NET applications.

ASP.NET MVC is a framework designed to support dependency injection, so we can use Autofac to register the modules that compose the application, such as the controllers.

Let’s start by implementing the controllers of the application; then we’ll add the Autofac module that defines the controller registration.

Controllers

We implement two controllers, a Controller and an ApiController:

public class HomeController : Controller
{
    LoggerService _loggerService;

    public HomeController(LoggerService loggerService)
    {
        _loggerService = loggerService;
    }
}

The second one is the Web API controller:

public class AccountController : ApiController
{
    LoggerService _loggerService;

    public AccountController(LoggerService loggerService)
    {
        _loggerService = loggerService;
    }
}
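
The LoggerService dependency is not shown here; for the purposes of the example any class registered with the container will do. A sketch:

// hypothetical service; it must also be registered in the module below,
// e.g. builder.RegisterType<LoggerService>().AsSelf(), so that the controllers can be resolved
public class LoggerService
{
    public void Log(string message)
    {
        System.Diagnostics.Debug.WriteLine(message);
    }
}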

Autofac module

The first step is to install the two Autofac packages needed for the integration with ASP.NET:

install-package Autofac.Integration.Mvc
install-package Autofac.Integration.WebApi

Now we can register the controllers with an Autofac module:

public class PerRequestModule : Module
{
    protected override void Load(ContainerBuilder builder)
    {
        var controllersAssembly = Assembly.GetAssembly(typeof(HomeController));
        var apiControllersAssembly = Assembly.GetAssembly(typeof(AccountController));

        builder.RegisterControllers(controllersAssembly);
        builder.RegisterApiControllers(apiControllersAssembly);
    }
}

By using reflection, we can register all the controllers in one shot.

The module needs to be registered in the Autofac container:

var builder = new ContainerBuilder();
....
builder.RegisterModule(new PerRequestModule());
...
containerBuilder = builder.Build();

DependencyResolver.SetResolver(new AutofacDependencyResolver(containerBuilder));

httpConfiguration = new HttpConfiguration
{
DependencyResolver = new AutofacWebApiDependencyResolver(containerBuilder)
};

We build the container and pass it to the AutofacDependencyResolver (for MVC controllers) and to the AutofacWebApiDependencyResolver (for API controllers).

Tests

Now we can implement the test methods:

public class PerRequestTests : BaseTests
{
    [Test]
    public void should_be_able_to_resolve_mvc_controller()
    {
        using (var scope = containerBuilder.BeginLifetimeScope())
        {
            var controller = scope.Resolve<HomeController>();
            controller.Should().NotBeNull();
        }
    }

    [Test]
    public void should_be_able_to_resolve_api_controller()
    {
        using (var scope = containerBuilder.BeginLifetimeScope())
        {
            var controller = scope.Resolve<AccountController>();
            controller.Should().NotBeNull();
        }
    }
}

As you can see, we initialized a new lifetime scope and tried to resolve the controllers.
You can find the source code here.
