The Project Graveyard

This post is a response to this month's T-SQL Tuesday #107 prompt by Jeff Mlakar.  T-SQL Tuesday is a way for the SQL Server community to share ideas about different database and professional topics every month.

This month's Halloween-themed topic asks us to "... share a story about a project you worked on or were impacted by that went horribly wrong."


Watch this week's video on YouTube

I've been fortunate enough to never have been part of a large disastrous project at work.  My projects always have a "fail fast" mentality, so they never build up to a point where they come crashing down in a death spiral.

But that's not to say I haven't experienced my own project horror story in my personal work.

A while back I made a goal to produce a quality SQL Server focused blog post and video every week.  Essentially this means I am starting a new small-scale project each week where I play the part of project manager, developer, analyst, etc... with a delivery deadline of every Tuesday morning.  While I've gotten better at this process over time, I have also failed to meet my personal goals numerous times for a variety of reasons.

Scope Creep-y

In order to meet my weekly deadline, I need to stay laser focused on the topic I choose for that particular week.  If I get additional ideas while writing and start trying to incorporate them into my post (i.e. scope creep), I inevitably miss midweek milestones and have to make up time elsewhere to meet my deadline.

One instance of scope-creep I experienced earlier this year was when I was trying to write a post on how to build a table-driven validation system.

I've built many table-driven processes in the past so this seemed like it would be an easy topic to write about.  I started that week's blogging process by building the demo templates that would include table structures, execution scripts, etc...

Instead of wrapping up my basic demos so I could move on to writing the actual post, I kept building out demos for more features: logging functions, parameterization, SQL injection protection, common performance problems, etc...

It was exciting to be building all of this out, but instead of creating one week's content, I realized I had started working on enough demos for several weeks of posts.  This wouldn't necessarily have been a bad thing on its own; after all, it's nice to be a few weeks ahead on content creation.

However, I didn't quite finish enough demos for any one post in particular, and due to some other life events I didn't get back to working on my demos until Sunday afternoon.  Normally by that point I'd already have my demos done, a blog post written, a video filmed, and a finished video edit that I'm uploading (or getting really close to uploading) to YouTube.  What I had instead was a bunch of half-finished SQL demos saved in a very rough blog post outline.

The Project Graveyard

This isn't the first time poor time management and scope creep have gotten me in trouble:

Some projects sent prematurely to the grave

I have several posts that I've invested a good amount of time into but never released because they are incomplete.  In almost all of these cases my problems stemmed from poor planning and scope creep.

In the case of my table-driven post, by late Sunday afternoon I realized I was going to miss my weekly deadline goal if I continued with that post, so I scrapped the idea for now and quickly wrote and shot a different post on an SSMS trick instead.  It was discouraging to have to do that, but at the end of the day I was able to meet my weekly deadline, even if it was with a different result than I initially expected.

You might be thinking, "Why not ignore deadlines and release the post later in the week/month?"  For me, I like my weekly deadlines because of the creative challenges that come from having time constraints.  They force me to limit my scope and work on different projects on a regular basis.  My goal with blogging and video making is to learn how to present information in a succinct manner so that my communication skills, both written and verbal, improve.  So while I can (and probably will) complete these posts at some point in the future, I treat them as failures for that particular week's project.

And while failures aren't particularly fun, they can wind up being great learning opportunities: after all, I haven't gotten that far off track due to scope creep since.

Power BI Dashboards, Reports, and Datasets with the SQL Chefs

Watch this week's video on YouTube

This week I had the opportunity to learn about dashboards, reports, and datasets in Power BI using the best kind of teaching tool: fresh, delicious chocolate chip cookies.

Be our guest for the web's premier Power BI food show as Eugene Meidinger (@SQLGene) explains how these concepts relate to each other, then grab a second helping with his blog post that will leave you feeling full.

Shortchanged with International Money in SQL Server

Watch this week's video on YouTube

Imagine you have to perform some salary analysis for your employer, International Mega Corp.

The data you have to work with looks something like this:

DROP TABLE IF EXISTS ##InternationalMegaCorpSalaries;
GO
CREATE TABLE ##InternationalMegaCorpSalaries
(
    EmployeeId int IDENTITY,
    Salary nvarchar(10),
    Country nvarchar(20)
);
INSERT INTO ##InternationalMegaCorpSalaries VALUES ('1,000.00','United States');
INSERT INTO ##InternationalMegaCorpSalaries VALUES ('8 789,37','Sweden');
INSERT INTO ##InternationalMegaCorpSalaries VALUES ('6.274,26','Turkey');
INSERT INTO ##InternationalMegaCorpSalaries VALUES ('1000.00','United States');

Why are the salaries stored as nvarchar and formatted with commas, spaces, and periods?

Great question!  Someone wanted to make sure these amounts would look good in the UI so storing the formatted values in the database would be the way to go...

Pretty for the UI, not so great for doing analysis on.
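
The better alternative is to store the raw decimal value and format it only at display time.  If that formatting has to happen in SQL Server, the FORMAT() function can apply a culture-specific format to a stored decimal on the way out.  A quick sketch (the value below is made up for illustration):

SELECT
    FORMAT(CAST(8789.37 AS decimal(10,2)), 'N2', 'en-US') AS UnitedStates, -- 8,789.37
    FORMAT(CAST(8789.37 AS decimal(10,2)), 'N2', 'sv-SE') AS Sweden,       -- 8 789,37
    FORMAT(CAST(8789.37 AS decimal(10,2)), 'N2', 'tr-TR') AS Turkey;       -- 8.789,37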

Thanks to Zanoni Labuschagne, one of the subscribers to my YouTube channel, for recommending this topic!

CONVERT!

I'm a firm believer that money values should always be stored in the decimal datatype.  I can't think of a time when I wouldn't care about the precision and accuracy of money.

So let's try converting our salaries to decimal(10,2):

SELECT 
    EmployeeId,
    TRY_CONVERT(decimal(10,2),Salary) AS Salary,
    Country 
FROM ##InternationalMegaCorpSalaries

Well, that didn't work.  Maybe converting to float will work as a quick fix?

SELECT 
    EmployeeId,
    TRY_CONVERT(float,Salary) AS Salary,
    Country 
FROM ##InternationalMegaCorpSalaries

Nope, that didn't work either.

Sidenote: I'm OK with that though - I don't think float should ever be used for storing money.  If you want to see a quick example of why float math is problematic, take a look at this (and for more detail read about it here):

DECLARE 
    @Num1 float = .15,
    @Num2 float = .15,
    @Num3 float = .1,
    @Num4 float = .2
-- Not equal
SELECT IIF(@Num1+@Num2 = @Num3+@Num4,1,0)
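
For contrast, a minimal sketch of the same comparison using decimal, which behaves exactly as you'd expect:

DECLARE 
    @Num1 decimal(10,2) = .15,
    @Num2 decimal(10,2) = .15,
    @Num3 decimal(10,2) = .1,
    @Num4 decimal(10,2) = .2
-- Equal
SELECT IIF(@Num1+@Num2 = @Num3+@Num4,1,0)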

OK, so those didn't work.  What if we try converting to the money datatype - that should be able to read these money formats, right?

SELECT 
    EmployeeId,
    TRY_CONVERT(money,Salary) AS Salary,
    Country 
FROM ##InternationalMegaCorpSalaries

50% correct...!  Our employees in Turkey are being seriously underpaid by this conversion though.  I'm kind of glad I don't have to rely on this solution since the money datatype has its own fair share of problems as well.
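
If you're curious, here's a quick taste of one of those problems: money only stores four decimal places, so intermediate calculations silently lose precision.  A minimal sketch:

DECLARE @OneDollar money = 1.00;
-- money rounds the intermediate division to 4 decimal places
SELECT @OneDollar / 3 * 3 AS Result; -- returns 0.9999, not 1.00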

Time to get Cultured

SQL Server's TRY_PARSE function might be able to help us, but first we need to create a relationship between each country's money formatting and its culture code:

ALTER TABLE ##InternationalMegaCorpSalaries
ADD CultureCode AS CASE Country 
                        WHEN 'United States' THEN 'en-US'
                        WHEN 'Sweden' THEN 'sv-SE'
                        WHEN 'Turkey' THEN 'tr-TR' END

And finally our SELECT query:

SELECT 
    EmployeeId,
    TRY_PARSE(Salary AS DECIMAL(10,2) USING CultureCode) AS Salary,
    Country 
FROM ##InternationalMegaCorpSalaries

Success!  Our salary values are now perfectly converted into the decimal datatype without the need for any ugly REPLACE(), SUBSTRING(), or other string parsing functions.

While this carefully curated demo correctly converted all of our values, it's important to always test that the culture value you choose correctly parses your string-formatted numbers.  For example, Wikipedia leads me to believe that the Danes write their numbers like "6 338,70", but SQL Server's culture definition doesn't convert this correctly:

SELECT TRY_PARSE('6 338,70' AS DECIMAL(10,2) USING 'da-DK')

In those instances, you may need to substitute another culture code to get the correct conversion to occur.
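
In this case, since the Swedish format also uses a space for grouping and a comma for decimals, swapping in the 'sv-SE' culture from our demo above parses the Danish-style value just fine:

SELECT TRY_PARSE('6 338,70' AS DECIMAL(10,2) USING 'sv-SE') -- 6338.70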

Extracting JSON Values Longer Than 4000 Characters

Watch this week's video on YouTube

A while back I built an automated process that parses JSON strings into a relational format.

Up until recently this process had been working great: my output table had all of the data I was expecting, neatly parsed into the correct rows and columns.

Last week, however, I noticed an error in the output table.  One row that was supposed to have a nicely parsed JSON value for a particular column had an ugly NULL instead.

Truncated?

First I checked my source JSON string - it had the "FiveThousandAs" property I was looking for:

/* Build the 5000 character property value with REPLICATE (converted to
   nvarchar(max) first so the result isn't truncated at 4000 characters) */
DECLARE @json nvarchar(max) = N'{
    "Id" : 1,
    "FiveThousandAs" : "' + REPLICATE(CONVERT(nvarchar(max), N'a'), 5000) + N'",
    "FourAs" : "aaaa"
}';

So the source data was fine.

I checked the table column I was inserting into as well and confirmed it was defined as nvarchar(max), so no problem there.

The last thing I checked was the query I was using:

SELECT JSON_VALUE(@json, '$.FiveThousandAs')

If I run that on its own, I reproduce the NULL I was seeing inserted into my table:

JSON_VALUE is limiting

After a little bit more research, I discovered that the return type for JSON_VALUE is limited to 4000 characters.  Since JSON_VALUE is in lax mode by default, if the output has more than 4000 characters, it fails silently.

To force an error in future code I could use SELECT JSON_VALUE(@json, 'strict $.FiveThousandAs') so at least I would be notified immediately of a problem with my query/data (via failure).

Although strict mode will notify me of issues sooner, it still doesn't help me extract all of the data from my JSON property.

(Side note: I couldn't define my nvarchar(max) column as NOT NULL because for some rows the value could be NULL, but in the future I might consider adding additional database validation with a check constraint).

OPENJSON

The solution to reading the entire 5000 character value from my JSON property is to use OPENJSON:

SELECT * 
FROM OPENJSON(@json) 
WITH (FiveThousandAs nvarchar(max) '$.FiveThousandAs')

My insert query needed to be slightly refactored, but now I'm able to return any length value (as long as it's under 2GB).
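
For reference, here's a sketch of what that refactored insert might look like (dbo.ParsedJson is a hypothetical destination table; the WITH clause maps each JSON property to a typed column):

INSERT INTO dbo.ParsedJson (Id, FiveThousandAs)
SELECT Id, FiveThousandAs
FROM OPENJSON(@json)
WITH (
    Id int '$.Id',
    FiveThousandAs nvarchar(max) '$.FiveThousandAs'
);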

In hindsight, I should have used OPENJSON() from the start: not only is it capable of parsing the full length values from JSON strings, but it performs significantly faster than any of the other SQL Server JSON functions.

As a best practice, I think I'm going to use OPENJSON by default for any JSON queries to avoid problems like this in the future.

Faking Temporal Tables with Triggers

This post is a response to this month's T-SQL Tuesday #106 prompt by Steve Jones.  T-SQL Tuesday is a way for the SQL Server community to share ideas about different database and professional topics every month.

This month's topic asks to share our experiences with triggers in SQL Server.


Watch this week's video on YouTube

Triggers are something that I rarely use.  I don't shy away from them because of some horrible experience I've had; rather, I rarely have a good need for using them.

The one exception is when I need a poor man's temporal table.

Temporal Table <3

When temporal tables were added in SQL Server 2016 I was quick to embrace them.

A lot of the data problems I work on benefit from being able to view what data looked like at a certain point back in time, so the easy setup and queryability of temporal tables was something that I immediately loved.
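
For example, once a table is system versioned, a point-in-time query is just one extra clause.  A sketch, assuming dbo.Birds were a true temporal table (rather than the trigger-based version built below):

SELECT *
FROM dbo.Birds
FOR SYSTEM_TIME AS OF '2018-09-07 16:30:11';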

No System Versioning For You

Sometimes I can't use temporal tables though, like when I'm forced to work on an older version of SQL Server.

Now, this isn't a huge issue; I can still write queries on those servers to achieve the same result as I would get with temporal tables.

But temporal tables have made me spoiled.  They are easy to use and I like having SQL Server manage my data for me automatically.

Fake Temporal Tables With Triggers

I don't want to have to manage my own operational versus historical data and write complicated queries for "point-in-time" analysis, so I decided to fake temporal table functionality using triggers.

Creating the base table and history table are pretty similar to that of a temporal table, just without all of the fancy PERIOD and GENERATED ALWAYS syntax:

CREATE TABLE dbo.Birds  
(   
 Id INT IDENTITY PRIMARY KEY,
 BirdName varchar(50),
 SightingCount int,
 SysStartTime datetime2 DEFAULT SYSUTCDATETIME(),
 SysEndTime datetime2 DEFAULT '9999-12-31 23:59:59.9999999'  
);
GO
CREATE TABLE dbo.BirdsHistory
(   
 Id int,
 BirdName varchar(50),
 SightingCount int,
 SysStartTime datetime2,
 SysEndTime datetime2  
) WITH (DATA_COMPRESSION = PAGE);
GO
CREATE CLUSTERED INDEX CL_Id ON dbo.BirdsHistory (Id);
GO

The single UPDATE, DELETE trigger is really where the magic happens though.  Every time a row is updated or deleted, the trigger inserts the previous version of the row into our history table with the correct datetimes:

CREATE TRIGGER TemporalFaking ON dbo.Birds
AFTER UPDATE, DELETE
AS
BEGIN
SET NOCOUNT ON;

DECLARE @CurrentDateTime datetime2 = SYSUTCDATETIME();

/* Update start times for newly updated data */
UPDATE b
SET
       SysStartTime = @CurrentDateTime
FROM
    dbo.Birds b
    INNER JOIN inserted i
        ON b.Id = i.Id

/* Grab the SysStartTime from dbo.Birds
   Insert into dbo.BirdsHistory */
INSERT INTO dbo.BirdsHistory
SELECT d.Id, d.BirdName, d.SightingCount,d.SysStartTime,ISNULL(b.SysStartTime,@CurrentDateTime)
FROM
       dbo.Birds b
       RIGHT JOIN deleted d
              ON b.Id = d.Id
END
GO

The important aspect of this trigger is that we always join our dbo.Birds table to our inserted and deleted tables based on the primary key, which is the Id column in this case.

If you try to insert/update/delete data from the dbo.Birds table, the dbo.BirdsHistory table will be updated exactly like a regular temporal table would:

/* inserts */
INSERT INTO dbo.Birds (BirdName, SightingCount) VALUES ('Blue Jay',1);
GO
INSERT INTO dbo.Birds (BirdName, SightingCount) VALUES ('Cardinal',1);
GO
BEGIN TRANSACTION
INSERT INTO dbo.Birds (BirdName, SightingCount) VALUES ('Canada Goose',1)
INSERT INTO dbo.Birds (BirdName, SightingCount) VALUES ('Nuthatch',1)
COMMIT
GO
BEGIN TRANSACTION
INSERT INTO dbo.Birds (BirdName, SightingCount) VALUES ('Dodo',1)
INSERT INTO dbo.Birds (BirdName, SightingCount) VALUES ('Ivory Billed Woodpecker',1)
ROLLBACK
GO

/* updates */
UPDATE dbo.Birds SET SightingCount = SightingCount+1 WHERE id = 1;
GO
UPDATE dbo.Birds SET SightingCount = SightingCount+1 WHERE id in (2,3);
GO
BEGIN TRANSACTION
UPDATE dbo.Birds SET SightingCount = SightingCount+1 WHERE id = 4;
GO
ROLLBACK

/* deletes */

DELETE FROM dbo.Birds WHERE id = 1;
GO
DELETE FROM dbo.Birds WHERE id in (2,3);
GO
BEGIN TRANSACTION
DELETE FROM dbo.Birds WHERE id = 4;
GO
ROLLBACK

If you run each of those batches one at a time and check both tables, you'll see how the dbo.BirdsHistory table keeps track of all of our data changes.
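
A quick way to peek at both tables between batches:

SELECT * FROM dbo.Birds;
SELECT * FROM dbo.BirdsHistory ORDER BY Id, SysStartTime;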

Now, seeing what our dbo.Birds data looked like at a certain point in time isn't quite as easy as with a system-versioned table in SQL Server 2016, but it's not bad:

DECLARE @SYSTEM_TIME datetime2 = '2018-09-07 16:30:11';
SELECT * 
FROM
    (
    SELECT * FROM dbo.Birds
    UNION ALL
    SELECT * FROM dbo.BirdsHistory
    ) FakeTemporal
WHERE 
    @SYSTEM_TIME >= SysStartTime 
    AND @SYSTEM_TIME < SysEndTime;

Real Performance

One reason many people loathe triggers is their potential for bad performance (particularly when many triggers get chained together).

I wanted to see how this trigger solution compares to an actual temporal table.  While searching for good ways to test this difference, I found that Randolph West has done some testing on trigger-based temporal tables.  While our solutions are different, I like their performance testing methodology: view the transaction log records for real temporal tables and compare them to those of the trigger-based temporal tables.

I'll let you read the details of how to do the comparison test in their blog post, but I'll just summarize the results of my test: the trigger-based version is almost the same as a real system-versioned temporal table.

Because of how I handle updating the SysStartTime column in my dbo.Birds table, I get one more transaction than a true temporal table:

You could make the trigger solution work identically to the true temporal table (as Randolph does) if you are willing to make application code changes to populate the SysStartTime column on writes to dbo.Birds.
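
In other words, if every application write supplies SysStartTime itself, the trigger no longer needs its extra UPDATE statement.  A hypothetical sketch of such an application-side update:

UPDATE dbo.Birds
SET
    SightingCount = SightingCount + 1,
    SysStartTime = SYSUTCDATETIME() -- application supplies the new version's start time
WHERE Id = 1;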

Conclusion

For my purposes, the trigger-based temporal table solution has a happy ending.  It works for the functionality that I need it for and prevents me from having to manage a history table through some other process.

If you decide to use this in your own pre-2016 instances, just be sure to test the functionality you need; while it works great for the purposes that I use temporal tables for, your results may vary if you need additional functionality (preventing truncates on the history table, defining a retention period for the history, etc... are all features not implemented in the examples above).