Using Newtonsoft serializer in CosmosDB client

Problem

In some scenarios engineers might want to use a custom JSON serializer for documents stored in CosmosDB - for example, to control null handling or property naming conventions.

Solution

In the Cosmos DB v3 .NET SDK, one of the optional settings in CosmosClientOptions when creating an instance of CosmosClient is a custom Serializer. This serializer must be JSON based and of the CosmosSerializer type. This means that if a custom serializer is needed, it should inherit from the CosmosSerializer abstract class and override its two methods for serializing and deserializing an object. The challenge is that both methods on CosmosSerializer are stream based and therefore might not be as easy to implement as engineers tend to assume - still not super complex.
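For reference, the contract to implement boils down to these two abstract methods on CosmosSerializer (sketched here from the SDK surface, with documentation comments omitted):

using System.IO;

public abstract class CosmosSerializer
{
    // Deserializes a response payload into a T.
    public abstract T FromStream<T>(Stream stream);

    // Serializes a T into a request payload.
    public abstract Stream ToStream<T>(T input);
}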
For demonstration purposes, my custom serializer is going to use the Newtonsoft.Json library. First, a new type is needed, and it must inherit from CosmosSerializer.

using Microsoft.Azure.Cosmos;
using Newtonsoft.Json;
using System.IO;
using System.Text;

/// <summary>
/// Custom Newtonsoft.Json-based serializer for the CosmosDB client.
/// </summary>
public class SerializationService : CosmosSerializer
{
    private readonly JsonSerializer serializer;

    public SerializationService()
    {
        this.serializer = new JsonSerializer
        {
            NullValueHandling = NullValueHandling.Ignore
        };
    }

    /// <summary>
    /// Logic for deserialization.
    /// </summary>
    public override T FromStream<T>(Stream stream)
    {
        // If the caller asked for the raw stream, return it as-is.
        if (typeof(Stream).IsAssignableFrom(typeof(T)))
        {
            return (T)(object)stream;
        }

        // The readers take ownership of the stream and dispose it with it.
        using var streamReader = new StreamReader(stream);
        using var jsonTextReader = new JsonTextReader(streamReader);
        return this.serializer.Deserialize<T>(jsonTextReader);
    }

    /// <summary>
    /// Serialization logic.
    /// </summary>
    public override Stream ToStream<T>(T input)
    {
        using var stringWriter = new StringWriter();
        using var jsonTextWriter = new JsonTextWriter(stringWriter);
        this.serializer.Serialize(jsonTextWriter, input);
        // Flush before reading the buffer back so no tokens are left behind.
        jsonTextWriter.Flush();

        return new MemoryStream(Encoding.UTF8.GetBytes(stringWriter.ToString()));
    }
}
With the custom serializer in place, the only thing left is to set it in the options when initializing a new instance of CosmosClient.

cosmosClient = new CosmosClient(
    this.dbConfig.EndpointUrl,
    this.dbConfig.AuthorizationKey,
    new CosmosClientOptions()
    {
        Serializer = new SerializationService(),
    });
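
To sanity-check the configuration, the serializer can also be exercised on its own. Below is a minimal sketch, assuming a hypothetical TodoItem document type that is not part of the code above; it shows NullValueHandling.Ignore dropping a null property during serialization.

using System;
using System.IO;

var serializer = new SerializationService();
using var stream = serializer.ToStream(new TodoItem { id = "1", Title = "Write post" });
using var reader = new StreamReader(stream);

// Prints {"id":"1","Title":"Write post"} - the null Notes property is omitted.
Console.WriteLine(reader.ReadToEnd());

// Hypothetical document type used only for this check.
public class TodoItem
{
    public string id { get; set; }
    public string Title { get; set; }
    public string Notes { get; set; } // intentionally left null above
}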

Job done.
Thank you

/dz
