
Persisting Enum in database with Entity Framework

Problem statement

We all want to write clean code and follow best coding practices. This is every engineer's 'North Star' goal, which in many cases is not easy to reach because of the many difficulties of turning our ideas and good practices into working solutions.

One example I recently came across involves using ASP.NET Core and Entity Framework Core 5 to store enum values in a relational database (such as Azure SQL). Why is this a problem, you might ask... and my answer is that you want to work with enum types in your code but persist an integer in your database.

You can think of it this way: why use data types at all when everything could just be a string that gets converted into the desired type when needed? This 'all-string' approach is of course a huge anti-pattern and a bad practice for many reasons, a few being degraded performance, increased storage space, and increased code duplication.

Pre-requirements

1. JobStatus enum type definition.
public enum JobStatus
{
    Undefined,
    New,
    Processing,
    Completed,
    Error
}
2. Entity class

public class BackgroudJob
{
    public int Id { get; set; }

    public string Title { get; set; }

    public JobStatus Status { get; set; }
}
3. A very simple Azure (or on-prem) SQL table structure. The 'Status' column is the one we will insert our enum value into.

CREATE TABLE [dbo].[BackgroudJob]
  (
     [Id]     [INT] IDENTITY(1,1) NOT NULL PRIMARY KEY,
     [Title]  [NVARCHAR](50) NOT NULL,
     [Status] [INT] NOT NULL
  );

Recommended solution 

To me, the best solution to this problem is the built-in support for enum persistence that ships with Entity Framework Core, which is built on top of value conversions.

As you can see, the SQL table above expects a status of integer type, while the entity class stores the status as an enum. Unfortunately these two types (int and JobStatus) are not directly compatible when persisting data. Thankfully, one can easily be converted into the other, in a way that is fully automated and transparent to developers, using the native EnumToNumberConverter. To use it, all that is required is to register the conversion while building the model in code-first DB schema creation - it's simple.
protected override void OnModelCreating(ModelBuilder modelBuilder)
{
    base.OnModelCreating(modelBuilder);

    modelBuilder.Entity<BackgroudJob>(entity =>
    {
        entity.HasKey(e => e.Id);
        entity.Property(e => e.Title).IsRequired().HasMaxLength(50);
        entity.Property(e => e.Status).IsRequired()
              .HasConversion(new EnumToNumberConverter<JobStatus, int>());
    });
}
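The practical effect of the converter is easy to see without a database: EnumToNumberConverter stores the enum's underlying numeric value, the same result a plain cast gives. A small self-contained sketch (the helper class name is illustrative, not from the original post):

```csharp
using System;

public enum JobStatus { Undefined, New, Processing, Completed, Error }

public static class EnumStorageDemo
{
    // What EnumToNumberConverter writes for a given member: its underlying
    // numeric value, the same result as a direct cast.
    public static int ToStoredValue(JobStatus status) => (int)status;

    // Reading back: the stored integer maps straight onto the enum member.
    public static JobStatus FromStoredValue(int stored) => (JobStatus)stored;
}
```

So JobStatus.Completed is persisted as 3 in the [Status] column and materialized back as JobStatus.Completed on read, while application code only ever touches the enum.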

Pros
  • Conversion logic is centralized
  • Logic is transparent to developer
  • No need for explicit 'manual' enum-to-int (and vice versa) conversion in the code
  • Less extra code to test
  • Use of native framework
Cons
  • Conversion happens implicitly, which can make it harder to spot and debug for developers unaware of it
  • Custom converters might be needed for complex use cases
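For the cases the built-in converters do not cover (the last con above), EF Core accepts a custom conversion via HasConversion with your own mapping lambdas. A hedged sketch, assuming the column were changed to a string type and you wanted to persist the status as a short code; the codes and helper names below are illustrative assumptions, not part of the original post:

```csharp
using System;
using System.Collections.Generic;

public enum JobStatus { Undefined, New, Processing, Completed, Error }

// Illustrative: map each enum member to a fixed short code for storage.
public static class JobStatusCodes
{
    private static readonly Dictionary<JobStatus, string> ToCode =
        new Dictionary<JobStatus, string>
        {
            { JobStatus.Undefined,  "UND" },
            { JobStatus.New,        "NEW" },
            { JobStatus.Processing, "PRC" },
            { JobStatus.Completed,  "CMP" },
            { JobStatus.Error,      "ERR" },
        };

    public static string Encode(JobStatus status) => ToCode[status];

    public static JobStatus Decode(string code)
    {
        foreach (var pair in ToCode)
            if (pair.Value == code) return pair.Key;
        return JobStatus.Undefined;   // unknown codes fall back safely
    }
}

// In OnModelCreating these methods would be wired into the model, e.g.:
//   entity.Property(e => e.Status)
//         .HasConversion(s => JobStatusCodes.Encode(s),
//                        s => JobStatusCodes.Decode(s));
```

Keeping the mapping in one small class keeps the 'centralized logic' advantage of the recommended solution even when the conversion itself is custom.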

Another solution #1 - manual conversion to integer (not recommended)

The manual conversion solution is about changing the type from enum to integer (or string) explicitly in code. This conversion should of course be centralized, but unfortunately that might not be possible when the logic for accessing the persistence layer is spread all around the project code. It might also require creating a separate data transfer object (DTO) class and performing a mapping from the underlying entity to the DTO. It will look something like this.
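A minimal sketch of such a manual mapping; the DTO and mapper names are illustrative assumptions, not from the original code:

```csharp
using System;

public enum JobStatus { Undefined, New, Processing, Completed, Error }

// Hypothetical DTO mirroring the table: Status is stored as a raw int.
public class BackgroudJobDto
{
    public int Id { get; set; }
    public string Title { get; set; }
    public int Status { get; set; }   // enum persisted as its numeric value
}

public static class BackgroudJobMapper
{
    // Explicit enum -> int conversion on the way to the database.
    public static int ToDbValue(JobStatus status) => (int)status;

    // Explicit int -> enum conversion on the way back; unknown values
    // fall back to Undefined instead of producing an out-of-range enum.
    public static JobStatus FromDbValue(int value) =>
        Enum.IsDefined(typeof(JobStatus), value)
            ? (JobStatus)value
            : JobStatus.Undefined;
}
```

Every call site that touches the persistence layer has to remember to go through the mapper, which is exactly the duplication and testing overhead the built-in converter avoids.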


Thank you.

/dz
