Monday, 30 September 2013

SQL Server : Part 1 : Architecture of Transaction Log

In our earlier post, we discussed the architecture of the data file and the different types of pages in detail. In this post, we will discuss the architecture of the log file.

Logging is the mechanism an RDBMS uses to support the ACID (Atomicity, Consistency, Isolation and Durability) properties of transactions. A transaction log is a physical file in which SQL Server stores the details of all transactions and data modifications performed on the database. In the event of a disaster that causes SQL Server to shut down unexpectedly (power failure/hardware failure), the transaction log is used to bring the database back to a consistent state when the server restarts. On restart, the database goes through the recovery process. During recovery, the transaction log is used to make sure that all committed transactions are written to the respective data pages (rolled forward) and to revert any uncommitted transactions that were already written to data pages (rolled back).


Logically, the transaction log is a set of log records. Each record is identified by a log sequence number (LSN). A new log record is always written at the logical end of the log file, with an LSN greater than the previous one. Each LSN is associated with a transaction id, which can be used to group the log records of a transaction. Although the log file stores log records sequentially in the order they happen, the log records of a single transaction are not necessarily contiguous in the log file. Each log record holds the LSN of the previous log record of the same transaction as a backward pointer, which helps in rolling back the transaction.

The transaction log stores a separate log entry for each operation. For example, while inserting a record into a table, the transaction log stores separate entries for the insert into the clustered index and into each nonclustered index. In the same way, if a single update statement updates 10 records, the transaction log captures 10 separate log entries. For data modifications, the transaction log stores either the logical operation performed or the before and after images of the record.
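As a quick illustration of this point, here is a minimal sketch that is not part of the original demo: the table and index names are hypothetical, and it assumes a database in the SIMPLE recovery model so that CHECKPOINT clears the active log.

-- Hypothetical table with a clustered index and a nonclustered index
CREATE TABLE dbo.LogDemo
(
   id   INT PRIMARY KEY,        -- clustered index
   name VARCHAR(50)
)
CREATE NONCLUSTERED INDEX IX_LogDemo_name ON dbo.LogDemo(name)
GO
CHECKPOINT   -- clear the active log (SIMPLE recovery assumed) so the next query is easy to read
GO
INSERT INTO dbo.LogDemo VALUES (1,'A'),(2,'B'),(3,'C')
GO
-- Expect one LOP_INSERT_ROWS log record per row for the clustered index
-- and one per row for the nonclustered index
SELECT AllocUnitName, COUNT(*) AS LogRecords
FROM sys.fn_dblog (NULL, NULL)
WHERE Operation = 'LOP_INSERT_ROWS'
GROUP BY AllocUnitName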



To understand it better, let us look into the transaction log using sys.fn_dblog, an undocumented function that lets us read the active portion of the log. The query below creates two tables and inserts some records into them, and then deletes those records to keep the tables clean. This insert and delete is done to make sure pages are allocated to the tables, so that we do not get unnecessary entries in the transaction log while examining it. We also issue a manual checkpoint to force SQL Server to write the log information to the data file and clear the log. Note that one table is a heap and the other has a clustered index.


USE Mydb
GO
CREATE TABLE LoginfoHeap
(
   id INT
)
GO
CREATE TABLE LoginfoCI
(
   id INT PRIMARY KEY
)
GO
INSERT INTO LoginfoHeap VALUES(1),(2)
INSERT INTO LoginfoCI VALUES(2),(4)
GO
DELETE FROM LoginfoHeap
DELETE FROM LoginfoCI
GO
CHECKPOINT
GO
SELECT
   [Current LSN],
   Operation,
   Context,
   [Transaction ID],
   [Previous LSN],
   AllocUnitName,
   [Previous Page LSN],
   [Page ID],
   [XACT ID],
   SUSER_SNAME(CONVERT(VARBINARY,[Transaction SID])) AS 'Login',
   [Begin Time],
   [End Time]
FROM sys.fn_dblog (NULL, NULL)






From the output it is clear that we have only two active log entries: the first is written when the checkpoint starts, and the second when the checkpoint completes.
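If you only want a quick overview of what is sitting in the active portion of the log, rather than every column, you can aggregate the output of the same undocumented function. This is a minimal sketch, not part of the original demo:

SELECT Operation, Context, COUNT(*) AS LogRecordCount
FROM sys.fn_dblog (NULL, NULL)
GROUP BY Operation, Context
ORDER BY LogRecordCount DESC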


Now we will insert, update and delete records in these tables through two sessions.

--SESSION 1
   BEGIN TRAN
   INSERT INTO LoginfoCI VALUES(2)

--SESSION 2
   BEGIN TRAN
   INSERT INTO LoginfoHeap VALUES(1),(2)

--SESSION 1
   INSERT INTO LoginfoCI VALUES(4)

--SESSION 2
   UPDATE LoginfoHeap SET id = 3 WHERE id = 1   -- the new value was lost in the original post; 3 is assumed

--SESSION 1
   UPDATE LoginfoCI SET id = 3 WHERE id = 2     -- the new value was lost in the original post; 3 is assumed

--SESSION 2
   DELETE FROM LoginfoHeap WHERE id = 2

--SESSION 1
   DELETE FROM LoginfoCI WHERE id = 4
   SELECT * FROM sys.dm_tran_current_transaction
   COMMIT

--SESSION 2
   SELECT * FROM sys.dm_tran_current_transaction
   COMMIT

The DMV sys.dm_tran_current_transaction returns a single row with state information about the current transaction in the current session. We are interested only in the transaction_id, which will help us filter the output of sys.fn_dblog. Let us look at the output of sys.fn_dblog.

SELECT
   [Current LSN],
   Operation,
   Context,
   [Transaction ID],
   [Previous LSN],
   AllocUnitName,
   [Previous Page LSN],
   [Page ID],
   [XACT ID],
   SUSER_SNAME(CONVERT(VARBINARY,[Transaction SID])) AS 'Login',
   [Begin Time],
   [End Time]
FROM sys.fn_dblog (NULL, NULL)
WHERE [Transaction ID] IN
(
   SELECT [Transaction ID] FROM sys.fn_dblog (NULL, NULL)
   WHERE [XACT ID] IN (856960,856981)
)

The values 856960 and 856981 are the transaction_id values returned by sys.dm_tran_current_transaction. We have filtered the output to get only the rows we are interested in.

In our script, we opened two transactions, and every transaction log entry in the output is grouped under one of these two transaction ids. Let us analyse what we did and how it is captured in the transaction log.

In session 1, we started a transaction and inserted a single record. The first record in the output maps to the BEGIN TRAN command. This is the starting point of the transaction, and it creates a new transaction_id. Its Previous LSN column value is 0, as this is the first log entry of the transaction. The same log record also stores the XACT ID, the login and the transaction start time. The second record represents the insert into the clustered table. The Transaction ID column is used to group the entries associated with a transaction. The Previous LSN column is a pointer to the previous log entry of the same transaction, which helps SQL Server move backwards in case of a rollback. The Page ID column refers to the page that this log record changed, and the Previous Page LSN column holds the LSN of the last log record that modified that page. When a log record modifies a page, it also updates the corresponding LSN in the page header (the m_lsn field in the header; for more detail refer to this post).
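To see this backward chain for yourself, you can order the log records of a single transaction and compare the Current LSN and Previous LSN columns. This is a minimal sketch; the transaction id used here is hypothetical, so replace it with the one shown in your own fn_dblog output:

SELECT [Current LSN], [Previous LSN], Operation, Context
FROM sys.fn_dblog (NULL, NULL)
WHERE [Transaction ID] = '0000:000004e2'   -- hypothetical value
ORDER BY [Current LSN]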

In session 2, we opened another transaction and inserted two records into the heap table through a single insert statement. You can map these operations to rows 3, 4 and 5 in the transaction log output. The third row represents the BEGIN TRAN command. Even though we inserted two records with a single insert statement, SQL Server recorded two separate entries in the transaction log.

As a next step, in session 1 we added one record to the clustered index table. We can map this operation to the 6th record in the transaction log output.

In the next statement, we modified a record in the heap table through session 2. You can map this to the 7th record in the transaction log output. If you look at its Previous LSN column, it holds the Current LSN value of the last record associated with this transaction.

In the same way, in the next statement we modified a record in the clustered table through session 1. We can map the 8th and 9th records in the transaction log output to this update on the clustered table. You might have noticed that when we modified a record in the heap table, the transaction log recorded the operation in a single row, whereas the same operation on the clustered table produced two records in the transaction log: one for a delete and one for an insert. When you modify the clustered index key, SQL Server internally deletes the existing record and inserts a new one, because the record needs to be stored in a new location determined by the modified value (the order of the clustered index column). The easiest way for SQL Server to achieve this is to delete the existing record and insert it as a new record with the modified clustered key value.
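A quick way to confirm this difference is to filter the log output down to the row-level modify, insert and delete operations against the two demo tables (a minimal sketch; it assumes the tables were created in the dbo schema, and the output will also include the plain inserts and deletes from the script). The heap update should appear as a single LOP_MODIFY_ROW record, while the clustered-key update should appear as an LOP_DELETE_ROWS record followed by an LOP_INSERT_ROWS record:

SELECT [Current LSN], Operation, Context, AllocUnitName
FROM sys.fn_dblog (NULL, NULL)
WHERE Operation IN ('LOP_MODIFY_ROW','LOP_DELETE_ROWS','LOP_INSERT_ROWS')
  AND AllocUnitName LIKE 'dbo.Loginfo%'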

In the next two statements, we delete one record from the heap table and one from the clustered table. These map to the 10th and 11th records in the output. Finally, we issue the commit statement in both sessions; the 12th and 13th records in the transaction log output map to the commit operations. Their Previous LSN column refers to the Current LSN of the corresponding BEGIN TRAN record, and the End Time column captures the transaction end time.
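The begin/commit pairs can also be pulled out on their own: the Begin Time column is populated on the LOP_BEGIN_XACT record and the End Time column on the LOP_COMMIT_XACT record. A minimal sketch:

SELECT [Current LSN], Operation, [Transaction ID], [Begin Time], [End Time]
FROM sys.fn_dblog (NULL, NULL)
WHERE Operation IN ('LOP_BEGIN_XACT','LOP_COMMIT_XACT')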

Understanding the VLF (Virtual Log File)

A database can have one or more log files. In general there will be only one log file, as there is no performance improvement to be gained from multiple log files: SQL Server uses the transaction log sequentially. Just as data files are divided into pages, log files are divided into virtual log files (VLFs). The VLFs within a log file are not necessarily of equal size. SQL Server decides the number and size of VLFs based on the size of each log file growth, as given below.

Growth up to 64 MB        = 4 VLFs
From 64 MB to 1 GB        = 8 VLFs
Larger than 1 GB          = 16 VLFs

Let us create a database with an initial log size of 64 MB and later increase it to 1 GB. As per the calculation above, the log file should end up with 12 VLFs: 4 VLFs from the initial size and 8 more from growing the log to 1 GB.

USE MASTER;
GO
CREATE DATABASE Mydb
ON
(   NAME = MyDb_dat,
    FILENAME = 'D:\MyDb\Mydb.mdf',
    SIZE = 10MB, MAXSIZE = 3072MB, FILEGROWTH = 5MB )
LOG ON
(   NAME = MyDb_log,
    FILENAME = 'D:\MyDb\MyDB.ldf',
    SIZE = 64MB, MAXSIZE = 2048MB, FILEGROWTH = 5MB );
GO

ALTER DATABASE Mydb
MODIFY FILE ( NAME = MyDb_log, SIZE = 1024MB )

Now let us see how many VLFs got created. To find the number of VLFs in the database log file, we can use DBCC LOGINFO.

DBCC loginfo('mydb')

The output is given below.

There are 12 records in the output, each representing a VLF. Let us try to understand the result.

FileId: This is the file id of the log file and is the same for all 12 records, as we have only one log file. If we had multiple log files, we would see multiple values here.

FileSize: This is the size of the VLF in bytes. The first four VLFs have the same size except the fourth one, because the first 8 KB of the log file is used for the file header. If you add the FileSize values of the first four records along with 8192 bytes (8 KB), you get 64 MB, the initial size of the log file:
16711680 + 16711680 + 16711680 + 16965632 = 67100672; 67100672 + 8192 = 67108864 bytes = 64 MB
In the same way, if you add the last 8 records, they account for the 960 MB (1024 - 64) of growth caused by the ALTER statement.

StartOffset: This value is also in bytes and is the sort column of the output. The first VLF always starts at offset 8192, which is the number of bytes in a page. As mentioned above, the first 8 KB is used for the file header and does not store any log records.

FSeqNo: The file sequence number indicates the order in which the VLFs are used. The row with the highest FSeqNo value is the VLF where current log records are being written. FSeqNo values are not constant; they change each time a VLF is reused. We will discuss this more later in this post. A value of 0 in this column means the VLF has never been used, which is why all records show 0 except the one currently being written to.

Status: Status has two possible values: 0 and 2. A value of 2 means the VLF is not reusable, and a value of 0 means it can be reused. This will become clearer as we go further.

Parity: Parity has three possible values: 0, 64 and 128. If the VLF has not been used yet, it has the value 0; it is set to 64 on first use. Every time a VLF is reused, the parity value switches between 64 and 128.

CreateLSN: This value indicates when the VLF was created and can be used to group VLFs by creation. A value of 0 indicates the VLF was created as part of database creation. In our case, the first four records have a value of 0, which indicates they were created along with the database and its 64 MB log. The remaining 8 records share the same value: these VLFs were created by our ALTER DATABASE statement that increased the log file from 64 MB to 1024 MB.

The output description above is adapted from Kalen Delaney's blog post.
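To cross-check the grouping described above, you can capture the DBCC LOGINFO output into a temporary table and aggregate it by CreateLSN. This is a minimal sketch; DBCC LOGINFO is undocumented and its column list varies between versions (the layout below matches SQL Server 2012 and later; on older versions drop the RecoveryUnitId column):

CREATE TABLE #vlf
(
   RecoveryUnitId INT,
   FileId         INT,
   FileSize       BIGINT,
   StartOffset    BIGINT,
   FSeqNo         INT,
   Status         INT,
   Parity         TINYINT,
   CreateLSN      NUMERIC(38,0)
)
INSERT INTO #vlf
EXEC ('DBCC LOGINFO (''Mydb'')')

-- Expect 4 VLFs with CreateLSN = 0 (created with the database, 64 MB minus the 8 KB header)
-- and 8 VLFs sharing one CreateLSN (created by the ALTER DATABASE growth, 960 MB)
SELECT CreateLSN,
       COUNT(*)                AS VlfCount,
       SUM(FileSize)/1048576.0 AS TotalSizeMB
FROM #vlf
GROUP BY CreateLSN

DROP TABLE #vlf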

Now our transaction log will look as shown below.

Now that we have learned about the LSN and the VLF, we will discuss more about the transaction log in the next post.

If you liked this post, do like my page on FaceBook
