Managing OAuth 2 authentication with Swagger

In this post I want to talk about a tool that can help us produce documentation for the Web API services implemented in our application.

Swagger is a popular framework that, once installed in an ASP.NET application, is able to produce documentation for all the Web APIs implemented in the project.

Furthermore, it gives us a lot of flexibility: it is possible to add custom filters to change the standard behaviour, for example to add OAuth authentication management for protected applications.

So let's look at the steps necessary to install and configure this framework.

Configuration

The first step is to install the package in our project, and we can do that with NuGet:


Install-Package Swashbuckle -Version 5.6.0

Now we need to add the Swagger configuration in the Startup.cs file:


config.EnableSwagger(c =>
{
c.SingleApiVersion("v1", "BaseSPA.Web");
c.OperationFilter<SwaggerFilter>();
c.PrettyPrint();
c.IgnoreObsoleteActions();
}).EnableSwaggerUi();

In the configuration we define the description of the project, we say that we want the JSON pretty-printed, and we want to ignore obsolete actions.

Furthermore, we register SwaggerFilter, a custom filter used to manage the OAuth authentication.

This is the SwaggerFilter implementation:


public class SwaggerFilter : IOperationFilter
{
    public void Apply(Operation operation, SchemaRegistry schemaRegistry, ApiDescription apiDescription)
    {
        var toBeAuthorize = apiDescription.GetControllerAndActionAttributes<AuthorizeAttribute>().Any();

        if (toBeAuthorize)
        {
            if (operation.parameters == null)
                operation.parameters = new List<Parameter>();

            operation.parameters.Add(new Parameter()
            {
                name = "Authorization",
                @in = "header",
                description = "bearer token",
                required = true,
                type = "string"
            });
        }
    }
}

First of all we need to implement the IOperationFilter interface with the Apply method.

In this method we check the actions protected with the Authorize attribute; for these, we add a new Authorization parameter that will be shown in the Swagger UI and will be used to set the bearer token.

Test Web API

After compiling the project, we can browse to the URL of the application and append the term swagger at the end of it, like this:


http://localhost/swagger

This is what will be shown:

[Screenshot: the Swagger UI listing the Web API actions]

If we open one of the available actions, we notice the Authorization parameter that we configured above:

[Screenshot: the Authorization parameter on an action]

Now what we need is the bearer token to send to the action, and we can retrieve it with Postman; we have to send a POST request to the token endpoint configured in the application, like this:

[Screenshot: the token request in Postman]

I send a request with my username, password and grant_type keys, and I specify the content type as x-www-form-urlencoded as well.
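
As an alternative to Postman, a minimal sketch of the same token request with HttpClient could look like this (the /token endpoint and the credentials are placeholders for your OAuth configuration):


private static async Task<string> GetTokenAsync()
{
    using (var client = new HttpClient())
    {
        // The keys follow the OAuth 2 resource owner password credentials flow.
        var content = new FormUrlEncodedContent(new Dictionary<string, string>
        {
            { "username", "myuser" },      // placeholder credentials
            { "password", "mypassword" },
            { "grant_type", "password" }
        });

        // FormUrlEncodedContent sets the x-www-form-urlencoded content type for us.
        var response = await client.PostAsync("http://localhost/token", content);

        // The JSON body contains the access_token field.
        return await response.Content.ReadAsStringAsync();
    }
}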

The response contains the access_token, and I can copy it into the field in the Swagger UI:

[Screenshot: the access_token copied into the Authorization field]

That's all: by sending the request, the application recognizes me and sends me the response.

You can find the source code here.

Consume Web API OData with ODataAngularResources

One month ago I wrote this post about OData services and how we can consume them with the $http factory of AngularJS.

This approach is not wrong, but it is not smart at all, because we have to write a query for every call the application needs to make, and it is not flexible.

What we could do is write a custom library that deals with the OData URL format and exposes some fluent methods that the components of the application can call; in this way we have a layer between the Angular services and the OData controllers, and the code should be more readable.

Anyway, writing a library like that is not easy and requires a lot of time.

The solution to these problems is an existing library that we can use directly in our Angular application: ODataAngularResources.

This is a lightweight library that allows writing OData queries with fluent methods.

What I'm going to do is leverage this library for the queries and implement a custom service to manage saving the entities.

ODataResource factory

The first factory that I want to implement is an object that wraps the basic ODataResources module and exposes the CRUD operations of the entity.

First of all, I need to install the library; you can refer to the GitHub project for the details.

Then I can start with the implementation by creating the new module:


(function(window, angular) {
'use strict';
angular.module('odataResourcesModule', ['ui.router', 'ODataResources'])

.factory('odataResource', function ($odataresource, $http, $q) {

function odataResource(serviceRootUrl, resourcePath, key) {
this.serviceRootUrl = serviceRootUrl;
this.resourcePath = resourcePath;
this.key = key;

this._odataResource = $odataresource(serviceRootUrl + '/' + resourcePath, {}, {}, {
odatakey: key,
isodatav4: true
});

angular.extend(this._odataResource.prototype, {
'$patch': function () {
var defer = $q.defer();

var req = {
method: 'PATCH',
url: serviceRootUrl + '/' + resourcePath + '(' + this.Id + ')',
data: this
};

$http(req).then(function (data) {
defer.resolve(data);
}, function (error) {
defer.reject(error);
});

return defer.promise;
}
});
}

.....

The factory has a constructor with three parameters: serviceRootUrl, which is the first part of the URL (in my case odata); resourcePath, which identifies the entity (Blogs); and the key of the table.

Then I configure $odataresource with these parameters, and I enable OData v4 as well.

At the end I extend the odataResource prototype: since this library doesn't provide an implementation for the PATCH verb, I need to implement it with a custom function.
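
For context, the server side that these URLs target could be a Web API OData v4 controller along these lines (the Blog entity and the Context class are assumptions based on the example):


public class BlogsController : ODataController
{
    private readonly Context _db = new Context(); // hypothetical EF context

    // Responds to /odata/Blogs and enables the OData query options.
    [EnableQuery]
    public IQueryable<Blog> Get()
    {
        return _db.Blogs;
    }
}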

So far so good; now I want to expose some methods to query the resource or retrieve a single entity:


odataResource.prototype.getResource = function () {
return this._odataResource.odata();
}

odataResource.prototype.get = function (id) {
var defer = $q.defer();

this._odataResource.odata().get(id, function (data) {
data._originalResource = angular.copy(data);
defer.resolve(data);
}, function (error) {
defer.reject(error);
});

return defer.promise;
}

With the getResource method I implement a getter for the resource, allowing the caller to execute queries.

The get method deserves a particular explanation: once the single entity is retrieved, I make a copy of the original entity and assign it to a new private property named _originalResource.

This will allow me to check which values have been changed on the entity, and to compose the object that will be sent with the patch method.

Now I can implement the other methods to add, update and delete the entity:


odataResource.prototype.new = function () {
return new this._odataResource();
}

odataResource.prototype.add = function (resource) {
var defer = $q.defer();

resource.$save(function (data) {
data._originalResource = angular.copy(data);
defer.resolve(data);
}, function (error) {
defer.reject(error);
});

return defer.promise;
}

odataResource.prototype.update = function (resource) {
var defer = $q.defer();

var self = this;
resource.$patch().then(function () {
self.get(resource[self.key]).then(function (data) {
defer.resolve(data);
});
}, function (error) {
defer.reject(error);
});

return defer.promise;
}

odataResource.prototype.delete = function (resource) {
var defer = $q.defer();

resource.$delete(function (data) {
defer.resolve(data);
}, function (error) {
defer.reject(error);
});

return defer.promise;
}

As in the get method, I make a copy of the entity in the add method as well.

Now I have a factory that is able to query Web API OData services and manage the CRUD operations, and I can proceed with some business logic.

ODataGenericResource factory

The first step is to define the new factory:


.factory('odataGenericResource', function ($q, odataResource) {

function odataGenericResource(serviceRootUrl, resourcePath, key) {
this.odataResource = new odataResource(serviceRootUrl, resourcePath, key);
}

In the constructor I create a new instance of the odataResource factory, in order to leverage the methods implemented above.

Now I implement a get method that, based on the id parameter, will create a new resource or perform a get from a Web API OData service:


odataGenericResource.prototype.get = function (id) {
if (id === '') {
var defer = $q.defer();
defer.resolve(this.odataResource.new());

return defer.promise;
} else {
return this.odataResource.get(id);
}
}

If an external service calls this method with an empty id, it will receive a new resource; otherwise it will receive the entity, if it exists.

Now I need to implement the save method:


odataGenericResource.prototype.isChanged = function (resource) {
var isChanged = false;
for (var propertyName in resource) {
if (isEntityProperty(propertyName) && resource._originalResource[propertyName] !== resource[propertyName]) {
isChanged = true;
}
}

return isChanged;
}

odataGenericResource.prototype.getObjectToUpdate = function (resource) {
var object = this.odataResource.new();
object[this.odataResource.key] = resource[this.odataResource.key];

for (var propertyName in resource) {
if (isEntityProperty(propertyName) && resource._originalResource[propertyName] !== resource[propertyName]) {
object[propertyName] = resource[propertyName];
}
}

return object;
}

odataGenericResource.prototype.save = function(resource) {
if (!resource._originalResource) {
return this.odataResource.add(resource);
} else if (this.isChanged(resource)) {
var object = this.getObjectToUpdate(resource);
return this.odataResource.update(object);
} else {
var defer = $q.defer();
defer.resolve(resource);

return defer.promise;
}
}

I need some methods to check the state of the entity.

The isChanged method checks whether any property of the entity has changed, leveraging the _originalResource property.

The second method prepares the actual object to save, with only the properties that have been changed.

The save method checks whether the entity is new (it doesn't have the _originalResource property); based on that, the entity will be added or updated.

The delete method doesn't involve anything special, so we can now take a look at how to use this factory in the application.

Application

I can use this factory in any Angular module that needs to manage CRUD operations for an entity.

For example, in a blogsModule, I can define a factory like this:


(function (window, angular) {
'use strict';
angular.module('blogsModule', ['ui.router', 'odataResourcesModule'])
.factory('blogsService', function ($http, odataGenericResource) {
return new odataGenericResource('odata', 'Blogs', 'Id');
})

And now in the controller:


.controller('blogsCtrl', function ($scope, $state, blogsService) {

$scope.new = function() {
$state.go("home.blog", { id: null });
};

$scope.detail = function(id) {
$state.go("home.blog", { id: id });
};

$scope.Blogs = blogsService.getOdataResource().query();
})
.controller('blogsDetailCtrl', function ($scope, $state, blogsService) {
var load = function (id) {
blogsService.get(id).then(function(data) {
$scope.Blog = data;
});
};

$scope.save = function () {
blogsService.save($scope.Blog).then(function(data) {
load(data.Id);
});
}

$scope.delete = function () {
blogsService.delete($scope.Blog).then(function () {
$scope.close();
});
};

$scope.close = function () {
$state.go("home.blogs");
};

load($state.params.id);
});

With a few lines of code I can manage the calls to a Web API OData service, with the possibility of doing fluent OData queries.

You can find the source code here.

Consume Web API with an Angularjs service

This topic concerns a problem that I faced recently: building an AngularJS service to consume Web APIs, which in my case were implemented with the OData protocol.

The starting point is a set of Web APIs, one for every entity of the project, each responding to a specific URL, for example (these match the constants defined later in this post):
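
/odata/Blogs
/odata/Posts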

I don't want to implement a service for every single Web API that does the same operations, so the idea is to implement a generic JavaScript class with the basic methods, and a factory that generates instances of this class; every module will have its own instance with some custom options, such as the service URL.

So I want to do two things: a generic class that implements common methods for CRUD operations, and then a factory that generates instances of this service.

Generic service

AngularJS has several recipes that we can use to create services; the provider is the lowest-level one and is usually used to configure reusable services that can be used in different applications.

Factory and service are syntactic sugar on top of a provider, and allow us to define singleton services in our application; the last two recipes are constant and value, which provide simple values.

One of the main debates in AngularJS is when to use a factory and when a service.

The main difference between them is that in the first one we need to return the service object ourselves, in this way:


(function (window, angular) {
'use strict'
angular.module('odataModule', [])
.factory('odataService', [function() {
return {
.....
}
}]);
})(window, window.angular)

In the second one we don't need to do this; the Angular service deals with the object instantiation:


(function (window, angular) {
'use strict'
angular.module('odataModule', [])
.service('odataService', [function() {
.....
}]);
})(window, window.angular)

What I want to implement is a JavaScript class that will have these properties/methods:

  • A url property, the Web API reference
  • An entityClass property, the JavaScript object that represents the entity to be managed
  • An isNew property, which allows the service to understand whether the object needs to be added (POST) or saved (PATCH)
  • A getEntities method that retrieves the list of the entities
  • A createEntity method that returns an instance of a new entity
  • A getEntity method that retrieves a single entity
  • A saveEntity method that deals with the POST/PATCH of the entity
  • And finally a deleteEntity method

An Angular service helps in this work by letting me write cleaner code than a factory; a concrete implementation is:


(function (window, angular) {
'use strict'
angular.module('odataModule', [])
.constant('blogsUrl', '/odata/Blogs')
.constant('postsUrl', '/odata/Posts')
.service('odataService', ['$http', function($http) {
function Instance(url, entityObject) {
this.isNew = false;
this.url = url;
this.entityClass = entityObject;
}

Instance.prototype.getEntities = function () {
return $http.get(this.url);
}

Instance.prototype.createEntity = function() {
this.isNew = true;
return angular.copy(this.entityClass);
}

Instance.prototype.getEntity = function (id) {
return $http.get(this.url + '(guid\'' + id + '\')');
}

Instance.prototype.saveEntity = function (entity) {
if (this.isNew) {
this.isNew = false;
return $http.post(this.url, entity);
}
else
return $http.patch(this.url + '(guid\'' + entity.Id + '\')', entity);
}

Instance.prototype.deleteEntity = function (entity) {
return $http.delete(this.url + '(guid\'' + entity.Id + '\')');
}
this.GetInstance = function(url, entityObject) {
return new Instance(url, entityObject);
};
}]);
})(window, window.angular)

The Instance class is the implementation of the common methods that we needed.

The last rows are the main difference from a factory; in a service, the container function is used as a constructor, so I can define a GetInstance method that returns an instance of my custom class.

If I had used a factory, the last rows would have been the following:


return {
GetInstance: function(url, entityObject) {
return new Instance(url, entityObject);
}
};

The implementation with the service is much better.

Factories of the entities

Now I need to define a specific factory for every entity:


(function (window, angular) {
'use strict';
angular.module('blogsModule', ['ui.router', 'odataModule'])
.factory('blogsFactory', ['odataService', 'blogsUrl', function (odataService, blogsUrl) {
var blogEntity = { Name: '', Url: '' };
return odataService.GetInstance(blogsUrl, blogEntity);
}])
})(window, window.angular)

So far so good: I have an instance of the service with specific parameters, and now I can use the factory for CRUD operations in my controller:


(function (window, angular) {
'use strict';
angular.module('blogsModule', ['ui.router', 'odataModule'])
....
.controller('blogsController', ['$scope', '$state', 'blogsFactory', function ($scope, $state, blogsFactory) {
$scope.Title = 'Blogs';

blogsFactory.getEntities().then(function (result) {
$scope.blogs = result.data.value;
});

$scope.create = function() {
$state.go('main.blog', { id: null });
};

$scope.edit = function (blog) {
$state.go('main.blog', { id: blog.Id });
};
}])
.controller('blogController', ['$scope', '$state', '$stateParams', 'odataService', function ($scope, $state, $stateParams, odataService) {
$scope.Title = 'Blog';
var service = odataService.GetInstance('/odata/Blogs', { Name: '', Url: '' });

$scope.save = function () {
service.saveEntity($scope.blog).then(function () {
$state.go('main.blogs');
});
};

$scope.delete = function() {
service.deleteEntity($scope.blog).then(function () {
$state.go('main.blogs');
});
};

$scope.close = function() {
$state.go('main.blogs');
};

if ($stateParams.id === '')
$scope.blog = service.createEntity();
else {
service.getEntity($stateParams.id).then(function (result) {
$scope.blog = result.data;
});
}
}]);
})(window, window.angular)

With a few lines of code I have managed the CRUD operations.

In my case the OData module could be improved with checks on the entity validations, by using the $metadata information returned by the OData services; I could also retrieve the entity fields without passing them as parameters, but, anyway, this is a good starting point.

Web API in ASP.NET Core

In this post I'll talk about Web API in ASP.NET Core and how we can implement a RESTful service with this new framework.

One of the main differences in ASP.NET Core is that Web API controllers share a unified base class with the classic MVC controllers.

So the declaration of a Web API in ASP.NET Core looks like this:


public class PostController : Controller
{
}

The attributes for the action verbs are pretty similar to MVC 5.

We can define the controller route as well and we can do that with the Route attribute:


[Route("api/[controller]")]
public class PostController : Controller
{
}

We can implement an action in the same way as the Web API 2:


[HttpGet("{offset}", Name = "GetPost")]
public IActionResult Get(int offset)
{
    var result = _postService.GetPosts(offset, true);

    return Ok(result);
}

With the HttpGet attribute we can specify the verb of the method, the route parameters and the action URL.
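
For completeness, here is a sketch of how a POST action could sit alongside it; the Post model and the _postService.Add call are assumptions, not part of the original sample:


[HttpPost]
public IActionResult Post([FromBody] Post post)
{
    if (post == null)
        return BadRequest();

    // Hypothetical service call that persists the entity.
    var created = _postService.Add(post);

    // Reuses the route named "GetPost" declared on the Get action above.
    return CreatedAtRoute("GetPost", new { offset = 0 }, created);
}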

So far so good; now there is the last question about this topic, which is content negotiation.

Since the ASP.NET Core MVC and Web API frameworks are unified, with the Produces attribute we can define the response content type of the controllers.

So, if we want to return content in JSON format, we need to apply this attribute:


[Produces("application/json")]
[Route("api/[controller]")]
public class PostController : Controller
{
}

With this attribute the content type of the response will be application/json; note that Produces acts on the response, not on the incoming request.
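
If we also want to restrict the content type of incoming requests, there is the Consumes attribute; a sketch combining both:


[Consumes("application/json")]
[Produces("application/json")]
[Route("api/[controller]")]
public class PostController : Controller
{
}

With Consumes in place, a request whose Content-Type header is not application/json is rejected with a 415 Unsupported Media Type response.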

The source code of the topic is here.

Manage attachments chunks with ASP.NET Web Api

In the previous post I spoke about a custom MultipartFormData stream provider and how it can help us manage some custom information included in a request message.

In that example I generated chunks from a file and sent them to a REST service (a Web API) with some additional information that was then retrieved by the custom provider.

Now I want to use this information to manage the upload session and merge all the chunks once they are received.

What I need to do is define the models involved in the process and the service that manages the chunks.

Models

We have to define two things; the first one is the model for the chunk:


public class ChunkMetadata
{
public string Filename { get; set; }
public int ChunkNumber { get; set; }

public ChunkMetadata(string filename, int chunkNumber)
{
Filename = filename;
ChunkNumber = chunkNumber;
}
}

The ChunkNumber property deserves an explanation: it is the number associated with the chunk, and it will be useful to establish the correct order when we have to merge all of them.

The second one is the model of the session, that is, the bunch of chunks that compose the file.

First of all we define the interface:


public interface IUploadSession
{
ConcurrentBag<ChunkMetadata> Chunks { get; set; }
string Filename { get; }
long Filesize { get; }
bool AddChunk(string filename, string chunkFileName, int chunkNumber, int totalChunks);
Task MergeChunks(string path);
}

The Filename and Filesize properties are closely tied to the session; we need the AddChunk and MergeChunks methods as well.

We also need a thread safe collection for the chunks that compose the session, so we define a ConcurrentBag collection, which is the thread safe counterpart of a List.

Now we can implement the model:


public class UploadSession : IUploadSession
{
public string Filename { get; private set; }
public long Filesize { get; private set; }
private int _totalChunks;
private int _chunksUploaded;

public ConcurrentBag<ChunkMetadata> Chunks { get; set; }

public UploadSession()
{
Filesize = 0;
_chunksUploaded = 0;
Chunks = new ConcurrentBag<ChunkMetadata>();
}

public bool AddChunk(string filename, string chunkFileName, int chunkNumber, int totalChunks)
{
if (Filename == null)
{
Filename = filename;
_totalChunks = totalChunks;
}

var metadata = new ChunkMetadata(chunkFileName, chunkNumber);
Chunks.Add(metadata);

// Interlocked.Increment updates the field atomically and returns the new value.
return Interlocked.Increment(ref _chunksUploaded) == _totalChunks;
}

public async Task MergeChunks(string path)
{
var filePath = path + Filename;

using (var mainFile = new FileStream(filePath, FileMode.Create))
{
foreach (var chunk in Chunks.OrderBy(c => c.ChunkNumber))
{
using (var chunkFile = new FileStream(chunk.Filename, FileMode.Open))
{
await chunkFile.CopyToAsync(mainFile);
Filesize += chunkFile.Length;
}
}
}

foreach (var chunk in Chunks)
{
File.Delete(chunk.Filename);
}
}
}

The implementation is quite simple.

The AddChunk method adds the new chunk to the collection, then increments the _chunksUploaded field with the thread safe operation Interlocked.Increment; at the end, the method returns a bool that is true if all the chunks have been received, otherwise false.

The MergeChunks method deals with retrieving all the chunks from the file system.

It gets the collection, orders it by chunk number, reads the bytes from the chunks and copies them to the main file stream.

At the end, the chunk files are deleted.

Service

The service will have an interface like this:


public interface IUploadService
{
Guid StartNewSession();
Task<bool> UploadChunk(HttpRequestMessage request);
}

In my mind, the StartNewSession method instantiates a new session object and assigns a new correlation id, which is the unique identifier of the session.

This is the implementation:


public class UploadService : IUploadService
{
private readonly Context _db = new Context();
private readonly string _path;
private readonly ConcurrentDictionary<string, UploadSession> _uploadSessions;

public UploadService(string path)
{
_path = path;
_uploadSessions = new ConcurrentDictionary<string, UploadSession>();
}

public async Task<bool> UploadChunk(HttpRequestMessage request)
{
var provider = new CustomMultipartFormDataStreamProvider(_path);
await request.Content.ReadAsMultipartAsync(provider);
provider.ExtractValues();

UploadSession uploadSession;
_uploadSessions.TryGetValue(provider.CorrelationId, out uploadSession);

if (uploadSession == null)
throw new ObjectNotFoundException();

var completed = uploadSession.AddChunk(provider.Filename, provider.ChunkFilename, provider.ChunkNumber, provider.TotalChunks);

if (completed)
{
await uploadSession.MergeChunks(_path);

var fileBlob = new FileBlob()
{
Id = Guid.NewGuid(),
Path = _path + uploadSession.Filename,
Name = uploadSession.Filename,
Size = uploadSession.Filesize
};

_db.FileBlobs.Add(fileBlob);
await _db.SaveChangesAsync();

return true;
}

return false;
}

public Guid StartNewSession()
{
var correlationId = Guid.NewGuid();
var session = new UploadSession();
_uploadSessions.TryAdd(correlationId.ToString(), session);

return correlationId;
}
}

In the StartNewSession method we use the thread safe method TryAdd to add a new session to the ConcurrentDictionary.

About the UploadChunk method, we saw the first part of the implementation in the previous post.

Once the metadata is retrieved from the request, we try to find the session object with a thread safe operation.

If we don’t find the object, of course we need to throw an exception because we expect that the related session exists.

If the session exists, we add the chunk to the session and we check the result of the operation.

If it is the last chunk, we merge all of them, and we can do a database operation if needed.
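
The FileBlob entity isn't shown in the post; a minimal sketch consistent with the properties used above could be:


public class FileBlob
{
    public Guid Id { get; set; }
    public string Name { get; set; }
    public string Path { get; set; }
    public long Size { get; set; }
}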

Controller

The implementation of the controller is very simple:


public class FileBlobsController : ApiController
{
private readonly IUploadService _fileBlobsService;
private readonly Context _db = new Context();

public FileBlobsController(IUploadService uploadService)
{
_fileBlobsService = uploadService;
}

[Route("api/fileblobs/getcorrelationid")]
[HttpGet]
public IHttpActionResult GetCorrelationId()
{
return Ok(_fileBlobsService.StartNewSession());
}

[HttpPost]
public async Task<IHttpActionResult> PostFileBlob()
{
if (!Request.Content.IsMimeMultipartContent())
throw new Exception();

var result = await _fileBlobsService.UploadChunk(Request);

return Ok(result);
}
}

You can find the source code here.

Custom MultipartFormDataStreamProvider in C#

Frequently, when we manage multipart/form-data requests and send them to the server, we might want to add some additional information.

Perhaps we might want to split a big file into chunks and add some additional information, like the id of the upload session, the chunk number, the file name and the total number of chunks that compose the file.

Suppose that we use AngularJS with TypeScript on the client side; the code of the controller is quite simple:


.....

public AddAttachment(event) {
let attachments = event.target.files;
if (attachments.length > 0) {
let file: File = attachments[0];

this.$http.get(this.url + "/GetCorrelationId").then((correlationId) => {
let chunks = this.SplitFile(file);

for (let i = 0; i < chunks.length; i++) {
let formData = new FormData();
formData.append("file", chunks[i], file.name);
formData.append("correlationId", correlationId.data);
formData.append("chunkNumber", (i + 1).toString());
formData.append("totalChunks", chunks.length.toString());

this.$http.post(this.url, formData, { headers: { "Content-Type": undefined } }).then((result) => {
if(result.data) {
this.Load();
}
});
}
});
}
}

private SplitFile(file: File): Array<Blob> {
let chunks = Array<Blob>();
let size = file.size;
let chunkSize = 1024 * 1024 * 10;
let start = 0;
let end = chunkSize;

while (start < size) {
let chunk = file.slice(start, end);
chunks.push(chunk);
start = end;
end += chunkSize;
}

return chunks;
}

.....

The AddAttachment method is invoked by the view; once the file is retrieved, the SplitFile method generates the array of chunks.

Then, with the $http factory, we send every single chunk to the server with the additional metadata.

In order to read this data on the server side, we need to implement a custom MultipartFormData stream provider.

The first step is to define the interface of our provider:


public interface ICustomMultipartFormDataStreamProvider
{
string ChunkFilename { get; }
int ChunkNumber { get; }
string CorrelationId { get; }
string Filename { get; }
int TotalChunks { get; }
void ExtractValues();
}

The interface has the same properties sent by the client, and a method that deals with extracting the values from the message.

Now we can proceed with the implementation:


public class CustomMultipartFormDataStreamProvider : MultipartFormDataStreamProvider, ICustomMultipartFormDataStreamProvider
{
public string Filename { get; private set; }
public string ChunkFilename { get; private set; }
public string CorrelationId { get; private set; }
public int ChunkNumber { get; private set; }
public int TotalChunks { get; private set; }

public CustomMultipartFormDataStreamProvider(string rootPath) : base(rootPath) { }

public CustomMultipartFormDataStreamProvider(string rootPath, int bufferSize) : base(rootPath, bufferSize) { }

public override Task ExecutePostProcessingAsync()
{
foreach (var file in Contents)
{
var parameters = file.Headers.ContentDisposition.Parameters;
var filename = ExtractParameter(parameters, "filename");
if (filename != null) Filename = filename.Value.Trim('\"');
}

return base.ExecutePostProcessingAsync();
}

public void ExtractValues()
{
var chunkFileName = FileData[0].LocalFileName;
var correlationId = FormData?.GetValues("correlationId");
var chunkNumber = FormData?.GetValues("chunkNumber");
var totalChunks = FormData?.GetValues("totalChunks");

if (string.IsNullOrEmpty(chunkFileName) || correlationId == null || chunkNumber == null || totalChunks == null)
throw new Exception("Missing values in UploadChunk session.");

ChunkFilename = chunkFileName;
CorrelationId = correlationId.First();
ChunkNumber = int.Parse(chunkNumber.First());
TotalChunks = int.Parse(totalChunks.First());
}

private NameValueHeaderValue ExtractParameter(ICollection<NameValueHeaderValue> parameters, string name)
{
return parameters.FirstOrDefault(p => p.Name.Equals(name, StringComparison.OrdinalIgnoreCase));
}
}

The class inherits from the MultipartFormDataStreamProvider base class and implements our interface.

Two methods are implemented; the first one overrides ExecutePostProcessingAsync, and in this method we retrieve the name of the main file.

The second one extracts the custom parameters from the FormData; we also retrieve the chunk filename from the FileData object; this information is included by default in the multipart/form-data message.

Now the information is retrieved, and we can use the custom provider in a service:


public async Task<bool> UploadChunk(HttpRequestMessage request)
{
var provider = new CustomMultipartFormDataStreamProvider(_path);
await request.Content.ReadAsMultipartAsync(provider);
provider.ExtractValues();

.....
}

The metadata will be available in the provider object.

You can find the source code here.

Real-time search with ASP.NET and Elasticsearch

A common problem that we face once our applications are deployed is improving the performance of a page or feature.

In my case, for example, I had a field where I could search and select a city; the elements to search through were many and the search was quite slow, so I wanted a better user experience.

We can solve performance problems like these with the help of a cache or a full-text search.

I chose the latter, with Elasticsearch as the full-text engine, so I'll describe the steps that I followed to configure and use it in my application.

Installation

The first step is to install the Elasticsearch server, which you can download here.

Once installed, we have to start it by running the following executable:

<Installation path>\bin\elasticsearch.bat

This is the server log:

[Screenshot: the Elasticsearch server log]

The server will take care of indexing the content that we pass to it; in order to do that, we need a client to use in our application; in my case the application was .NET and I used NEST.

NEST

As said above, NEST is a high level Elasticsearch client for .NET applications.

The first step is to install it in the application with NuGet:

Install-Package NEST

And in the packages.config we'll have:

[Screenshot: the NEST package entry in packages.config]

Now we have all the necessary tools and we can develop the code for the search feature.

Client

We define a client class that has one responsibility: to set up the URL and the default index of the client, and to instantiate it:


public class ElasticSearchClient
{
    private readonly IElasticClient _client;

    public ElasticSearchClient(IElasticClient client)
    {
        _client = client;
    }

    public ElasticSearchClient(string uri, string indexName) : this(CreateElasticClient(uri, indexName)) {}

    public IElasticClient GetClient()
    {
        return _client;
    }

    private static ElasticClient CreateElasticClient(string uri, string indexName)
    {
        var node = new Uri(uri);
        var setting = new ConnectionSettings(node);
        setting.DefaultIndex(indexName);
        return new ElasticClient(setting);
    }
}

Once instantiated, the class returns a new instance of the client; we can register it in the startup class of the application with Autofac:


public partial class Startup
{
    public void Configuration(IAppBuilder app)
    {
        var builder = new ContainerBuilder();

        builder.Register(c => new ElasticSearchClient("http://localhost:9200", "cities"))
            .AsSelf()
            .SingleInstance();
        ...
    }
}

Service base class

A service that uses an Elasticsearch index should be able to do some basic operations that concern the logic of full-text indexes.

We have to deal with initializing a specific index, populating the index with the contents and, obviously, performing searches on the index with specific parameters.

So, we have to define an interface like this:


internal interface IElasticSearchService<T> where T : class
{
    void Init();
    void CheckIndex();
    void BulkInsert(List<T> objects);
    IEnumerable<T> Search(string query);
}

I like to separate the Init method, which populates the index, from the CheckIndex method, which checks whether the index already exists and creates it if needed.

Now we can implement the basic service:


public class ElasticSearchService<T> : IElasticSearchService<T> where T : class
{
    protected readonly Context Db = new Context();
    protected readonly ElasticSearchClient ElasticSearchClient;
    protected readonly string IndexName;

    public ElasticSearchService(ElasticSearchClient elasticSearchClient, string indexName)
    {
        ElasticSearchClient = elasticSearchClient;
        IndexName = indexName;
    }

    public virtual void Init()
    {
        CheckIndex();
        BulkInsert(Db.Set<T>().ToList());
    }

    public void CheckIndex()
    {
        if (IndexExist()) return;
        var response = CreateIndex();

        if (!response.IsValid)
        {
            throw new Exception(response.ServerError.ToString(), response.OriginalException);
        }
    }

    public void BulkInsert(List<T> objects)
    {
        var response = ElasticSearchClient.GetClient().IndexMany(objects, IndexName);
        if (!response.IsValid)
        {
            throw new Exception(response.ServerError.ToString(), response.OriginalException);
        }
    }

    public virtual IEnumerable<T> Search(string query)
    {
        var results = ElasticSearchClient.GetClient().Search<T>(c => c.From(0).Size(10).Query(q => q.Prefix("_all", query)));

        return results.Documents;
    }

    protected virtual IResponse CreateIndex()
    {
        var indexDescriptor = new CreateIndexDescriptor(IndexName).Mappings(ms => ms.Map<T>(m => m.AutoMap()));
        return ElasticSearchClient.GetClient().CreateIndex(indexDescriptor);
    }

    protected bool IndexExist()
    {
        return ElasticSearchClient.GetClient().IndexExists(IndexName).Exists;
    }
}

The constructor accepts the client and the index name.

We define a virtual Init method that checks whether the index exists and does a bulk insert of a list of objects; the method is virtual because a derived service might need to override it.

The BulkInsert method leverages the client to index the object list, and the Search method implements a basic search that looks in all the fields of the objects by using the special _all field, which contains the concatenated values of all the fields.

The method returns the first 10 elements.

CreateIndex creates a specific index with the AutoMap option, which infers the Elasticsearch field datatypes from the POCO object that we pass to it; it's protected, so a derived class can use it.

IndexExist checks whether an index exists, and it can be used by a derived class as well.

Service

Now we can implement a specific service that inherits from the ElasticSearchService class.

In this example I need to search in a list of cities and related districts, so I need to override the CreateIndex method like this:


public sealed class CitiesService : ElasticSearchService<City>
{
    public CitiesService(ElasticSearchClient elasticSearchClient, string indexName) : base(elasticSearchClient, indexName) {}

    protected override IResponse CreateIndex()
    {
        var indexDescriptor = new CreateIndexDescriptor(IndexName).Mappings(
            ms => ms.Map<City>(m => m.AutoMap().Properties(ps =>
                ps.Nested<District>(n => n
                    .Name(nn => nn.District)
                    .AutoMap()))));

        return ElasticSearchClient.GetClient().CreateIndex(indexDescriptor);
    }

    public override IEnumerable<City> Search(string query)
    {
        var results = ElasticSearchClient.GetClient().Search<City>(c => c.From(0).Size(10).Query(q => q.Prefix(p => p.Name, query) || q.Term("district.name", query)));

        return results.Documents.OrderBy(d => d.Name);
    }
}

What I need to do is automap the City object and the District, which is a closely related entity of the city; so I have to map the District property as nested, with the AutoMap option as well.

Thus I will be able to search across all the properties of the city and the district.
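
For reference, a minimal sketch of the City and District POCOs that this mapping and the queries assume (the exact shapes are hypothetical):


public class City
{
    public Guid Id { get; set; }
    public string Name { get; set; }
    public District District { get; set; }
}

public class District
{
    public Guid Id { get; set; }
    public string Name { get; set; }
}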

The other method that I override is the Search method; I search partial matches on the name of the city (Prefix) and the specific term in the district name (Term), and I return the first 10 elements.

Now I have to register the service with Autofac:


public partial class Startup
{
    public void Configuration(IAppBuilder app)
    {
        var builder = new ContainerBuilder();

        builder.Register(c => new ElasticSearchClient("http://localhost:9200", "cities"))
            .AsSelf()
            .SingleInstance();

        builder.Register(c => new CitiesService(c.Resolve<ElasticSearchClient>(), "cities"))
            .AsSelf()
            .AsImplementedInterfaces()
            .SingleInstance();

        ...
    }
}

The last step is to initialize the full-text index of my service:


public partial class Startup
{
    public void Configuration(IAppBuilder app)
    {
        var builder = new ContainerBuilder();

        builder.Register(c => new ElasticSearchClient("http://localhost:9200", "cities"))
            .AsSelf()
            .SingleInstance();

        builder.Register(c => new CitiesService(c.Resolve<ElasticSearchClient>(), "cities"))
            .AsSelf()
            .AsImplementedInterfaces()
            .SingleInstance();

        ...

        InitElasticSearchServices(containerBuilder);
    }

    private static void InitElasticSearchServices(IContainer containerBuilder)
    {
        var citiesServices = containerBuilder.Resolve<CitiesService>();
        citiesServices.Init();
    }
}

I resolve an instance of the service and call the Init method of the ElasticSearchService that we saw above.

This method will create and populate the index.

Web API

Now I can use the service in my Web API, like this:


public class CitiesController : ApiController
{
    private readonly CitiesService _elasticSearchService;

    public CitiesController(CitiesService elasticSearchService)
    {
        _elasticSearchService = elasticSearchService;
    }

    // GET: api/Cities
    public IEnumerable<City> GetCities(string query)
    {
        return _elasticSearchService.Search(query);
    }

    protected override void Dispose(bool disposing)
    {
        base.Dispose(disposing);
    }
}

You can find the source code of this topic here.