
Log database structure changes

In many applications the database structure can be changed by the user, e.g. in CRM or reporting systems. In such situations we may want to know when such changes were made. In SQL Server 2008 we can react when DDL (Data Definition Language) statements are executed in our database, much like DML (Data Manipulation Language) triggers react to data changes. So let's try to log it.

First we want to create a single log table named 'DatabaseChangeLogs':

use model;
GO

CREATE SCHEMA LOGS
CREATE TABLE DatabaseChangeLogs
(
   EventId int IDENTITY PRIMARY KEY,
   EventDate datetime2 CONSTRAINT DF_DefaultLogDate DEFAULT (SYSDATETIME()),
   EventType nvarchar(100) NOT NULL,
   UserName nvarchar(1050) NOT NULL,
   Command nvarchar(max) NOT NULL
)
GO

Note that we create this table in the model database. This means that each newly created database will contain this table automatically. For existing databases you need to run this script manually.

Now that we have the appropriate table, we can insert records into it. We are able to detect each DDL change using a database-level DDL trigger. Such a trigger reacts to schema-changing statements such as CREATE, ALTER and DROP rather than to INSERT, UPDATE and DELETE. The whole list of 'events' it can react to can be obtained by executing the following query: SELECT * FROM sys.trigger_event_types; .
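For example, to narrow that list down to table-related events only, you can filter the catalog view by name (the LIKE pattern here is just an illustration):

```sql
-- List DDL event types related to tables
SELECT type, type_name, parent_type
FROM sys.trigger_event_types
WHERE type_name LIKE '%TABLE%';
```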

CREATE TRIGGER StructureLogTrigger
ON DATABASE
FOR DDL_DATABASE_LEVEL_EVENTS
AS
BEGIN
   DECLARE @data XML;
   SET @data = EVENTDATA();

   INSERT INTO LOGS.DatabaseChangeLogs (EventType, UserName, Command)
   VALUES
   (
      @data.value('(/EVENT_INSTANCE/EventType)[1]', 'nvarchar(100)'),
      CONVERT(nvarchar(100), CURRENT_USER),
      -- nvarchar(max) avoids truncating long commands; the column is nvarchar(max) anyway
      @data.value('(/EVENT_INSTANCE/TSQLCommand)[1]', 'nvarchar(max)')
   );
END
GO


Now every change will be logged directly to our table.
Make some changes and take a look.
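As a quick smoke test, run the trigger and table scripts in a database, fire a couple of DDL statements, and query the log (the table name TestTable below is just an example):

```sql
-- Trigger two DDL events
CREATE TABLE dbo.TestTable (Id int);
DROP TABLE dbo.TestTable;

-- Both a CREATE_TABLE and a DROP_TABLE row should now be in the log
SELECT EventId, EventDate, EventType, UserName, Command
FROM LOGS.DatabaseChangeLogs
ORDER BY EventDate DESC;
```

If you later want to get rid of the logging, the trigger can be removed with DROP TRIGGER StructureLogTrigger ON DATABASE; .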


Thanks
