
How to Put SQL Column Names Onto Multiple Lines in SSMS

A few keystrokes and BAM! A mostly formatted query

SQL in 60 Seconds is a series where I share SQL tips and tricks that you can learn and start using in less than a minute.

Watch this week's video on YouTube

Have you ever copied and pasted a query into SQL Server Management Studio and been annoyed that the list of column names in the SELECT statement were all on one line?

There are 30 columns here. Ugh.

```sql
SELECT Col1, Col2, Col3, Col4, Col5,Col6,Col7, Col8, Col9, Col10,Col11,Col12,Col13,Col14,Col15,Col16,Col17,Col18,Col19,Col20,Col21,Col22,Col23,Col24,Col25,Col26,Col27,Col28,Col29,Col30 FROM dbo.MyTable
```

You can make the query easier to read by putting each column name onto its own line.

Simply open the Find and Replace window (CTRL + H) and type in ,(:Wh)* for the Find value and ,\n\t for the Replace value (in some versions of SSMS you may have better luck using ,(:Wh|\t| )* in the Find field). Make sure "Use Regular Expressions" is checked and press Replace All:

Make sure the regular expression icon/box is checked

A few keystrokes and BAM! A mostly formatted query

The magic you just used is a Regular Expression, and Microsoft has its own flavor used in SSMS and Visual Studio. Basically, we found text that

  • began with a comma (,)
  • followed by any whitespace (:Wh) (line break, tab, space, etc…)
  • (in newer versions of SSMS we add |\t| to also match a tab or a space)
  • and replaced it with a comma (,), a new line (\n), and a tab (\t).
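The same find-and-replace can be sketched outside SSMS with a standard regex engine. Here's an illustrative Python equivalent, where \s* plays the role of SSMS's :Wh whitespace class:

```python
import re

query = "SELECT Col1, Col2,Col3,  Col4 FROM dbo.MyTable"

# Replace each comma plus any trailing whitespace with a comma, newline, and tab
formatted = re.sub(r",\s*", ",\n\t", query)

print(formatted)
```

Each column after the first now lands on its own tab-indented line, just like the SSMS replace does.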

Sure, this trick isn't going to give you the same output as if you used a proper SQL formatter, but this technique is free and built straight into SSMS.

5 Business Problems You Can Solve With Temporal Tables


Watch this week's video on YouTube

It's 4:30 pm on Friday and Mr. Manager comes along to tell you that he needs you to run some important ad-hoc analysis for him.

Previously this meant having to stay late at the office, writing cumbersome queries to extract business information from transactional data.

Lucky for you, you've recently started using Temporal Tables in SQL Server, ensuring that you'll be able to answer your boss's questions and still make it to happy hour for $3 margaritas.

Sound like a plan? Keep reading below!

The Data

For these demos, we'll be using my imaginary car rental business data. It consists of our temporal table dbo.CarInventory and our history table dbo.CarInventoryHistory:

I've upgraded my business — we now have FOUR Chevy Malibus available for you to rent

Business Problem #1 — "Get me current inventory!"

To get our current inventory of rental cars, all we have to do is query the temporal table:

```sql
SELECT * FROM dbo.CarInventory
```

That's it.

I know this query seems lame — it's just a SELECT FROM statement. There are no FOR SYSTEM_TIME clauses, no WHERE statements, and no other interesting T-SQL features.

But that's the point! Have you ever had to get the "current" rows out of a table that is keeping track of all transactions? I'm sure it involved some GROUP BY statements, some window functions, and more than a few cups of coffee.
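For comparison, here's a rough sketch (in Python, with made-up data purely for illustration) of the "latest row per key" gymnastics you'd otherwise do against a raw transaction log:

```python
from itertools import groupby
from operator import itemgetter

# Hypothetical transaction log: (car_id, timestamp, mileage)
transactions = [
    (1, "2017-05-13", 0),
    (1, "2017-05-15", 73),
    (2, "2017-05-13", 0),
    (2, "2017-05-19", 488),
]

# The "current" row per car is the transaction with the latest timestamp
current = {
    car_id: max(rows, key=itemgetter(1))
    for car_id, rows in groupby(sorted(transactions), key=itemgetter(0))
}

print(current)  # one row per car
```

A temporal table does exactly this bookkeeping for you, automatically, on every insert, update, and delete.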

Temporal tables automatically manage your transaction history, providing the most current records in one table (dbo.CarInventory) and all of the historical transactions in another (dbo.CarInventoryHistory). No need for complicated queries.

Business Problem #2 — "How many miles on average do our customers drive each of our cars?"

In this example, we use FOR SYSTEM_TIME ALL and a plain old GROUP BY to get the data we need:

```sql
SELECT
    CarId,
    AVG(Mileage) AS AverageMileage
FROM
    dbo.CarInventory FOR SYSTEM_TIME ALL
WHERE
    InLot = 1 -- The car has been successfully returned to our lot
    AND SysStartTime > '2017-05-13 08:00:00.0000000' -- Ignore our initial car purchase
GROUP BY
    CarId
```

Some cars get driven a lot more. Causation or correlation?

FOR SYSTEM_TIME ALL returns all rows from both the temporal and history table. It's equivalent to:

```sql
SELECT * FROM dbo.CarInventory
UNION ALL
SELECT * FROM dbo.CarInventoryHistory
```

Once again, there isn't anything too fancy going on here — but that's the point. With temporal tables, your data is organized to make analysis easier.

Business Problem #3 — "How many cars do we rent out week over week?"

Here at Wagner Car Rentals we want to figure out how often our cars are being rented and see how those numbers change from week to week.

```sql
SELECT
    CurrentWeek.CarId,
    CurrentWeek.RentalCount AS CurrentRentalCount,
    PreviousWeek.RentalCount AS PreviousRentalCount
FROM
    (
        SELECT CarId, COUNT(*) AS RentalCount
        FROM dbo.CarInventory FOR SYSTEM_TIME FROM '2017-06-05' TO '2017-06-12'
        WHERE InLot = 0 -- Car is out with the customer
        GROUP BY CarId
    ) CurrentWeek
    FULL JOIN
    (
        SELECT CarId, COUNT(*) AS RentalCount
        FROM dbo.CarInventory FOR SYSTEM_TIME FROM '2017-05-29' TO '2017-06-05'
        WHERE InLot = 0 -- Car is out with the customer
        GROUP BY CarId
    ) PreviousWeek
    ON CurrentWeek.CarId = PreviousWeek.CarId
```


In this query, we are using FOR SYSTEM_TIME FROM/TO on our temporal table to specify what data we want in our "CurrentWeek" and "PreviousWeek" subqueries.

FROM/TO returns any records that were active during the specified range (BETWEEN/AND does the same thing, but its upper bound datetime2 value is inclusive instead of exclusive).
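That boundary difference is easy to see if you sketch a row's [SysStartTime, SysEndTime) validity interval against the query window. Here's an illustrative Python simplification (ISO date strings compare correctly as text):

```python
def active_from_to(row_start, row_end, lower, upper):
    # FOR SYSTEM_TIME FROM/TO: upper bound is exclusive
    return row_start < upper and row_end > lower

def active_between(row_start, row_end, lower, upper):
    # FOR SYSTEM_TIME BETWEEN/AND: upper bound is inclusive
    return row_start <= upper and row_end > lower

# A row version that becomes active exactly at the window's upper bound
# is excluded by FROM/TO but included by BETWEEN/AND
assert not active_from_to("2017-06-05", "2017-06-10", "2017-06-01", "2017-06-05")
assert active_between("2017-06-05", "2017-06-10", "2017-06-01", "2017-06-05")
```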

Business Problem #4 — "What color cars are rented most often?"

We're thinking of expanding our fleet of rental vehicles and want to purchase cars in the most popular colors so we can keep customers happy (and get more of their business!). How can we tell which color cars get rented most often?

```sql
SELECT
    CarId,
    Color,
    COUNT(*)/2 AS RentalCount -- Divide by 2 because transactions are double counted (rental and return dates)
FROM
    dbo.CarInventory FOR SYSTEM_TIME CONTAINED IN ('2017-05-15','2017-06-15')
GROUP BY
    CarId,
    Color
```

Here we use CONTAINED IN because we want to get precise counts of how many cars were rented and returned in a specific date range (if a car wasn't returned — stolen, wrecked and totaled, etc… — we don't want to purchase more of those colors in the future).
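The key difference from FROM/TO is that CONTAINED IN requires the row version's entire validity period to fall inside the range, not merely overlap it. An illustrative Python sketch of the distinction:

```python
def overlaps(row_start, row_end, lower, upper):
    # FROM/TO-style: any overlap with the window counts
    return row_start < upper and row_end > lower

def contained_in(row_start, row_end, lower, upper):
    # CONTAINED IN: the row's whole validity period lies within the window
    return row_start >= lower and row_end <= upper

# A rental that began before the window overlaps it but is NOT contained in it
assert overlaps("2017-05-10", "2017-05-20", "2017-05-15", "2017-06-15")
assert not contained_in("2017-05-10", "2017-05-20", "2017-05-15", "2017-06-15")
```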


Business Problem #5 — "Jerry broke it. FIX IT!"

The computer systems that we use at Wagner Car Rentals are a little…dated.

Instead of scanning a bar code to return a car back into our system, our employees need to manually type in the car details. The problem here is that some employees (like Jerry) can't type and often make typos:

```sql
SELECT *
FROM dbo.CarInventory FOR SYSTEM_TIME ALL
WHERE CarId = 4
```


Having inconsistent data makes our reporting much more difficult, but fortunately since we have our temporal table tracking row-level history, we can easily correct Jerry's typos by pulling the correct values from a previous record:

```sql
;WITH InventoryHistory AS
(
    SELECT
        ROW_NUMBER() OVER (PARTITION BY CarId ORDER BY SysStartTime DESC) AS RN,
        *
    FROM dbo.CarInventory FOR SYSTEM_TIME ALL
    WHERE CarId = 4
)
--SELECT * FROM InventoryHistory
/* Update current row by using N-th row version from history (default is 1 - i.e. last version) */
UPDATE dbo.CarInventory
SET Color = h.Color
FROM dbo.CarInventory i
INNER JOIN InventoryHistory h
    ON i.CarId = h.CarId
    AND RN = 2
```

Typos fixed!

Although we could have fixed this issue without using a temporal table, it shows how having all of the row-level transaction history makes it possible to repair incorrect data in more difficult scenarios. For even hairier situations, you can even roll back your temporal table data.

Conclusion

Temporal tables are easy to set up and make writing analytical queries a cinch.

Hopefully writing queries against temporal tables will prevent you from having to stay late in the office the next time your manager asks you to run some ad-hoc analysis.

Behind the Scenes of an Online Presentation

Charlie is an attentive audience member

Last week I presented my session "DBAs vs Developers: JSON in SQL Server 2016" at the online GroupBy Conference.

As I prepared for the event, I thought about all of the things that were different about getting ready for an online versus an in-person event.

Thinking that others might be interested in seeing what I do to get ready for an online talk, I filmed myself as I prepared for presentation day and put together this "behind the scenes" video.

Check it out, along with my actual talk on JSON in SQL Server 2016, in the videos below!

https://www.youtube.com/watch?v=oDYCSym6Lko

Slides and demo code from the presentation are available at https://bertwagner.com/presentations/

And the presentation video itself:

https://www.youtube.com/watch?v=WJpUGHWUMoc

And slide deck:

https://www.slideshare.net/BertWagner/dbas-vs-developers-json-in-sql-server-2016

5 Things You Need to Do When Performance Testing JSON in SQL and C#


Watch this week's video on YouTube

Want to learn more about using JSON in SQL Server? Watch me present at the online GroupBy conference on June 9, 2017 at 8am.

I've written a few articles this year about how awesome JSON performance is in SQL Server 2016.

The more I continue to use JSON in SQL Server, the more impressed I become with its speed and versatility. Over time I've learned new techniques and realize that JSON in SQL Server is actually much faster than I initially thought.

Today I want to correct some performance tests where I think I unfairly compared SQL Server JSON performance the first time around.

Major thanks to @JovanPop_MSFT for his help with performance testing suggestions.

Performance testing is hard

Before I dive into the performance tests, I want to be clear that these tests are still not perfect.

Performance testing in SQL Server is hard enough. When you start trying to compare SQL Server functions to code in .NET, lots of other factors come into play.

I'll try to highlight where there still might be some problems with my methodology in the tests below, but overall I think these tests are more accurate comparisons of these features.

SQL Server JSON vs. Json.NET

There are two major issues with comparing SQL Server JSON functions to Json.NET functions in C#:

  1. Queries running in SQL Server Management Studio have significant overhead when rendering results to the results grid.
  2. The way SQL Server retrieves pages of data from disk or memory is not the same as how C# retrieves data from disk or memory.

The below tests should provide a more accurate comparison between SQL Server and .NET.

I am capturing SQL run times for the below tests using SET STATISTICS TIME ON. All of the test data for the below tests is available here: https://gist.github.com/bertwagner/f0645cf1b244af7d6bb75856db8744e0

Test #1 — Deserializing 20k JSON elements

For this first test, we will deserialize ~20k rows of car year-make-model data comparing the SQL Server OPENJSON function against Json.NET's DeserializeObject.

Previously this test used JSON_VALUE which was adding unnecessary processing overhead. With the query rewritten to run more efficiently, it looks like this:

```sql
SELECT year, make, model
FROM OPENJSON(@cars) WITH (year int, make nvarchar(50), model nvarchar(50));
-- 160ms
```

Now the problem with this query is that we are still drawing all ~20k rows of data to the screen in SQL Server Management Studio. The best way to avoid this extra processing is to simply convert the query to use COUNT:

```sql
SELECT COUNT(*)
FROM OPENJSON(@cars) WITH (year int, make nvarchar(50), model nvarchar(50));
-- 71ms
```


Looking at the execution plans, the OPENJSON function is still processing all ~20k rows in both queries, only the number of rows being brought back to the SSMS GUI differ.

This still isn't the same as what the C# test below does (all data in the C# example stays in memory at all times), but it is as close a comparison as I could think of:

```C#
var deserializedCars = JsonConvert.DeserializeObject<IEnumerable<Car>>(carsJSON);
// 66ms
```

(Full C# code available at: https://gist.github.com/bertwagner/8e5e8b6ec977c1704355166f96ae3efa)

And the result of this test? SQL Server was nearly as fast as Json.NET!


Test #2 — Deserializing ~20k rows with a predicate

In this next test we filter and return only a subset of rows.

SQL:

```sql
SELECT COUNT(*)
FROM OPENJSON(@cars) WITH (model nvarchar(20))
WHERE model = 'Golf'
-- 58ms
```

C#:

```C#
var queriedCars = JsonConvert.DeserializeObject<IEnumerable<Car>>(carsJSON)
    .Where(x => x.Model == "Golf");
// 52ms
```

Result: SQL Server is nearly as fast once again!


One more important thing to note about this specific test — if you add this data into a SQL table and add a computed column index, SQL Server will beat out Json.NET every time.

Test #3 — Serializing ~20k elements into JSON

This scenario is particularly difficult to test. If I want to serialize data in a SQL table to a JSON string, how do I write the equivalent of that in C#? Do I use a DataTable and hope that SQL's data is all in cache? Is retrieval speed from the SQL Server buffer pool equivalent to reading from C#'s DataTable? Would a collection of Lists in C# be more appropriate than a DataTable?

In the end, I decided to force SQL to read pages from disk by clearing the cache and have C# read the object data from a flat file. This still isn't perfect, but it is as close as I think we can get:

SQL:

```sql
DBCC DROPCLEANBUFFERS

SELECT * FROM dbo.Cars FOR JSON AUTO
-- 108ms
```

C#:

```C#
string carsJSONFromFile = File.ReadAllText(@"../../CarData.json");
var serializedCars = JsonConvert.SerializeObject(deserializedCars);
// 63ms
```

This test still isn't perfect though because SSMS is outputting the JSON string to the screen while C# never has to. I didn't want to play around with outputting the C# version to a form or the console window because it still wouldn't have been an equal comparison.

Result: Json.NET is about twice as fast, although this test is by far the most inaccurate. Still, SQL Server is much faster than I initially thought.


SQL Server JSON vs. XML

In my previous article comparing SQL Server JSON to SQL Server XML, I focused on tests that were all done within SQL Server.

These tests were incomplete, however: most of the time, a developer's app will have to do additional processing to get data into an XML format, while JSON data usually already exists in JSON format (assuming we have a JavaScript web app).

These two tests examine cases where XML may have been slightly faster than JSON on SQL Server, but if you consider the entire environment (app + database), using JSON wins.

Scenario #1 — XML data needs to be serialized

Although inserting XML data that is already in memory into a SQL Server table is faster than the equivalent operation in JSON, what happens if we need to serialize the data in our app first before sending the data to SQL Server?

```C#
// Serialize Car objects to XML
var result1 = SerializeToXML(cars); // 166ms

// Serialize Car objects to JSON
var result2 = SerializeToJSON(cars); // 69ms

public static Tuple<long, string> SerializeToXML(List<Car> cars)
{
    Stopwatch sw = new Stopwatch();
    sw.Start();
    StringWriter writer = new StringWriter();
    XmlSerializer serializer = new XmlSerializer(typeof(List<Car>));
    serializer.Serialize(writer, cars);
    string result = writer.ToString();
    sw.Stop();
    return new Tuple<long, string>(sw.ElapsedMilliseconds, result);
}

public static Tuple<long, string> SerializeToJSON(List<Car> cars)
{
    Stopwatch sw = new Stopwatch();
    sw.Start();
    var json = JsonConvert.SerializeObject(cars);
    sw.Stop();
    return new Tuple<long, string>(sw.ElapsedMilliseconds, json);
}
```

Using the most common libraries available for serializing data to XML and JSON, serializing data to JSON is twice as fast as serializing to XML (and, as mentioned before, a lot of the time apps already have JSON data available — no need to serialize). This means the app serialization code will run faster and allow the data to make it to SQL Server faster.

Scenario #2 — Transferring XML and JSON to SQL Server

Finally, after we have our serialized XML and JSON data in C#, how long does it take to transfer that data to SQL Server?

```C#
// Write XML string to SQL XML column
var result3 = WriteStringToSQL(
    result1.Item2,
    "INSERT INTO dbo.XmlVsJson (XmlData) VALUES (@carsXML)",
    new SqlParameter[] { new SqlParameter("carsXML", result1.Item2) });
// 142ms, 1.88mb of data

// Write JSON string to SQL
var result4 = WriteStringToSQL(
    carsJSON,
    "INSERT INTO dbo.XmlVsJson (JsonData) VALUES (@carsJSON)",
    new SqlParameter[] { new SqlParameter("carsJSON", carsJSON) });
// 20ms, 1.45mb of data

// Write XML string to nvarchar SQL column. Taking the difference between this
// and result3, 100ms+ of time is spent converting to XML format on insert.
var result5 = WriteStringToSQL(
    result1.Item2,
    "INSERT INTO dbo.XmlVsJson (JSONData) VALUES (@carsXML)",
    new SqlParameter[] { new SqlParameter("carsXML", result1.Item2) });
// 29ms, 1.88mb of data
```

Result: Writing JSON data to an nvarchar SQL Server column is much faster than writing XML data to an XML-typed (or even an nvarchar-typed) column.

Not only does SQL Server need to parse the XML data upon insert, but the physical size of the XML data being sent over TCP is larger due to the repetitive nature of XML syntax.
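That size overhead is easy to demonstrate. Here the same record is serialized with Python's standard json and xml modules, purely to illustrate how XML's paired open/close tags inflate the payload:

```python
import json
import xml.etree.ElementTree as ET

car = {"year": 2017, "make": "Chevy", "model": "Malibu"}

json_str = json.dumps(car)

# Build the equivalent XML document: every field name appears twice
root = ET.Element("Car")
for key, value in car.items():
    ET.SubElement(root, key).text = str(value)
xml_str = ET.tostring(root, encoding="unicode")

print(len(json_str), len(xml_str))  # the XML payload is larger
```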

Conclusion

JSON performance in SQL Server is still awesome. In fact, it's even better than I had previously thought.

These tests are not meant to be conclusive; think of them more as errata for my previous JSON performance posts.

However, I think that these comparisons show that SQL Server's JSON functions are competitive with other languages' performance of handling JSON data.

Additionally, if serializing/deserializing reduces the amount of data transferred over TCP, using the JSON functions in SQL Server will most likely give you better total app/environment performance.

How To Roll-Back Data in a Temporal Table

"The Big Red Button" by włodi used under CC BY-SA 2.0 / Cropped and text added from original

Watch this week's video on YouTube

So you've started using temporal tables because they make your point-in-time analysis queries super easy.

Your manager is happy because you're getting historical data to him quickly. Your DBA is happy because she doesn't have to clean up any performance killing triggers that replicate a temporal table's functionality. Everything with temporal tables has made your life better.

Except that time when you accidentally inserted some bad data into your temporal table.

Whoops

The good news is that all of your data is still intact — it's been copied over to the historical table. Phew!

Now all you need to do is roll back this inadvertent row insertion and make your tables look just like they did before you started breaking them.

This should be easy right?

Well not exactly — there's no automatic way to roll back the data in a temporal table. However, that doesn't mean we can't write some clever queries to accomplish the same thing.

Let's make some data

Don't mind the details of this next query too much. It uses some non-standard techniques to fake the data into a temporal/historical table with "realistic" timestamps:

```sql
IF OBJECT_ID('dbo.CarInventory', 'U') IS NOT NULL
BEGIN
    -- When deleting a temporal table, we need to first turn versioning off
    ALTER TABLE dbo.CarInventory SET ( SYSTEM_VERSIONING = OFF )
    DROP TABLE dbo.CarInventory
    DROP TABLE dbo.CarInventoryHistory
END;

CREATE TABLE CarInventory
(
    CarId INT IDENTITY PRIMARY KEY NOT NULL,
    Year INT,
    Make VARCHAR(40),
    Model VARCHAR(40),
    Color VARCHAR(10),
    Mileage INT,
    InLot BIT NOT NULL DEFAULT 1
);

CREATE TABLE CarInventoryHistory
(
    CarId INT NOT NULL,
    Year INT,
    Make VARCHAR(40),
    Model VARCHAR(40),
    Color VARCHAR(10),
    Mileage INT,
    InLot BIT NOT NULL,
    SysStartTime DATETIME2 NOT NULL,
    SysEndTime DATETIME2 NOT NULL
);

INSERT INTO dbo.CarInventoryHistory (CarId,Year,Make,Model,Color,Mileage,InLot,SysStartTime,SysEndTime) VALUES(1,2017,'Chevy','Malibu','Black',0,1,'2017-05-13 8:00:00.0000000','2017-05-14 8:00:00.0000000');
INSERT INTO dbo.CarInventoryHistory (CarId,Year,Make,Model,Color,Mileage,InLot,SysStartTime,SysEndTime) VALUES(2,2017,'Chevy','Malibu','Silver',0,1,'2017-05-13 8:00:00.0000000','2017-05-14 9:00:00.0000000');
INSERT INTO dbo.CarInventoryHistory (CarId,Year,Make,Model,Color,Mileage,InLot,SysStartTime,SysEndTime) VALUES(1,2017,'Chevy','Malibu','Black',0,0,'2017-05-14 8:00:00.0000000','2017-05-15 7:00:00.0000000');
INSERT INTO dbo.CarInventoryHistory (CarId,Year,Make,Model,Color,Mileage,InLot,SysStartTime,SysEndTime) VALUES(2,2017,'Chevy','Malibu','Silver',0,0,'2017-05-14 9:00:00.0000000','2017-05-19 15:00:00.0000000');
INSERT INTO dbo.CarInventoryHistory (CarId,Year,Make,Model,Color,Mileage,InLot,SysStartTime,SysEndTime) VALUES(1,2017,'Chevy','Malibu','Black',73,1,'2017-05-15 7:00:00.0000000','2017-05-16 10:00:00.0000000');
INSERT INTO dbo.CarInventoryHistory (CarId,Year,Make,Model,Color,Mileage,InLot,SysStartTime,SysEndTime) VALUES(2,2017,'Chevy','Malibu','Silver',488,1,'2017-05-19 15:00:00.0000000','2017-05-20 08:00:00.0000000');

ALTER TABLE dbo.CarInventory
ADD
    SysStartTime DATETIME2 GENERATED ALWAYS AS ROW START NOT NULL CONSTRAINT DF_SysStart DEFAULT SYSUTCDATETIME(),
    SysEndTime DATETIME2 GENERATED ALWAYS AS ROW END NOT NULL CONSTRAINT DF_SysEnd DEFAULT '9999-12-31 23:59:59.9999999',
    PERIOD FOR SYSTEM_TIME (SysStartTime, SysEndTime);

SET IDENTITY_INSERT dbo.CarInventory ON;
INSERT INTO dbo.CarInventory (CarId,Year,Make,Model,Color,Mileage,InLot) VALUES(1,2017,'Chevy','Malibu','Black',120,1);
INSERT INTO dbo.CarInventory (CarId,Year,Make,Model,Color,Mileage,InLot) VALUES(2,2017,'Chevy','Malibu','Silver',591,1);
SET IDENTITY_INSERT dbo.CarInventory OFF;

-- We need to make sure that the last SysEndTimes in our historical table match the SysStartTimes in our temporal table
DECLARE @LastSysStartTimeInTemporalCar1 DATETIME2, @LastSysStartTimeInTemporalCar2 DATETIME2
SELECT @LastSysStartTimeInTemporalCar1 = SysStartTime FROM dbo.CarInventory WHERE CarId = 1
SELECT @LastSysStartTimeInTemporalCar2 = SysStartTime FROM dbo.CarInventory WHERE CarId = 2

INSERT INTO dbo.CarInventoryHistory (CarId,Year,Make,Model,Color,Mileage,InLot,SysStartTime,SysEndTime) VALUES(1,2017,'Chevy','Malibu','Black',73,0,'2017-05-16 10:00:00.0000000',@LastSysStartTimeInTemporalCar1);
INSERT INTO dbo.CarInventoryHistory (CarId,Year,Make,Model,Color,Mileage,InLot,SysStartTime,SysEndTime) VALUES(2,2017,'Chevy','Malibu','Silver',488,0,'2017-05-20 08:00:00.0000000',@LastSysStartTimeInTemporalCar2);

ALTER TABLE dbo.CarInventory SET ( SYSTEM_VERSIONING = ON (HISTORY_TABLE = dbo.CarInventoryHistory));

-- If everything worked well, we should see our data correctly in these tables
SELECT * FROM dbo.CarInventory
SELECT * FROM dbo.CarInventoryHistory
```

If you look at the results of our temporal table (top) and historical table (bottom), they should look something like this:

This data represents my totally real, very very not-fake rental car business.

You see those two rows in the top temporal table? Those are the ones I just added accidentally. I actually had a bug in my code *ahem* and all of the data inserted after 2017-05-18 is erroneous.

The bug has been fixed, but we want to clean up the incorrect entries and roll back the data in our temporal tables to how it looked on 2017-05-18. Basically, we want the following two rows to appear in our "current" temporal table and the historical table to be cleaned up of any rows inserted after 2017-05-18:


Fortunately, we can query our temporal table using FOR SYSTEM_TIME AS OF to get the two rows highlighted above pretty easily. Let's do that and insert into a temp table called ##Rollback:

```sql
DROP TABLE IF EXISTS ##Rollback

SELECT *
INTO ##Rollback
FROM dbo.CarInventory
FOR SYSTEM_TIME AS OF '2017-05-18'

-- Update the SysEndTime to the max value because that's what it's always set to in the temporal table
UPDATE ##Rollback SET SysEndTime = '9999-12-31 23:59:59.9999999'
```

You'll notice we also updated the SysEndTime — that's because a temporal table always has its AS ROW END column set to the max datetime value.

Looking at ##Rollback, we have the data we want to insert into our temporal table:

This is the data we want!

Now, it'd be nice if we could just insert the data from ##Rollback straight into our temporal table, but that would get tracked by the temporal table!

So instead, we need to turn off system versioning, allow identity inserts, delete our existing data, and insert from ##Rollback. Basically:

```sql
ALTER TABLE dbo.CarInventory SET ( SYSTEM_VERSIONING = OFF);

SET IDENTITY_INSERT dbo.CarInventory ON;

DELETE FROM dbo.CarInventory
WHERE CarId IN (SELECT DISTINCT CarId FROM ##Rollback)

INSERT INTO dbo.CarInventory (CarId,Year,Make,Model,Mileage,Color,InLot)
SELECT CarId,Year,Make,Model,Mileage,Color,InLot
FROM ##Rollback
```

While system versioning is off, we can also clean up the historical table by deleting all records after 2017-05-18 by joining the ##Rollback temp table on SysStartTime:

```sql
DELETE h
FROM ##Rollback t
INNER JOIN dbo.CarInventoryHistory h
    ON h.CarId = t.CarId
    AND t.SysStartTime <= h.SysStartTime
```

We have rolled back our data successfully!


Only One Tiny Problem

Did you notice that the last SysEndTime values in our historical table don't match up with the SysStartTime values in our temporal table?


This is a data integrity issue for our temporal table — our datetimes should always be continuous.

Fortunately this is easily fixed with one more UPDATE statement:

```sql
UPDATE t
SET t.SysEndTime = i.SysStartTime
FROM dbo.CarInventoryHistory t
INNER JOIN ##Rollback r
    ON t.CarId = r.CarId
    AND t.SysEndTime = r.SysStartTime
INNER JOIN dbo.CarInventory i
    ON t.CarId = i.CarId
    AND r.CarId = i.CarId

SELECT * FROM dbo.CarInventory
SELECT * FROM dbo.CarInventoryHistory
```

Our correctly rolled back temporal table
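If you want to double-check that continuity yourself, the rule is simple: after sorting a key's row versions by start time, each version's end time must equal the next version's start time. An illustrative Python sketch of that integrity check:

```python
def is_continuous(versions):
    """versions: list of (start, end) validity periods for one key."""
    ordered = sorted(versions)
    return all(
        prev_end == next_start
        for (_, prev_end), (next_start, _) in zip(ordered, ordered[1:])
    )

# Adjacent row versions share a boundary timestamp: continuous history
history = [
    ("2017-05-13 08:00", "2017-05-14 08:00"),
    ("2017-05-14 08:00", "2017-05-15 07:00"),
]
assert is_continuous(history)

# A gap (or overlap) between versions breaks the integrity check
assert not is_continuous([("2017-05-13", "2017-05-14"), ("2017-05-15", "2017-05-16")])
```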

Finally, remember to turn system versioning back on and to turn off our identity inserts to restore the original functionality of our temporal tables:

```sql
ALTER TABLE dbo.CarInventory SET ( SYSTEM_VERSIONING = ON (HISTORY_TABLE = dbo.CarInventoryHistory));
SET IDENTITY_INSERT dbo.CarInventory OFF;
```

Congratulations, you've rolled back your temporal table data!