
Behind the Scenes: How I Create Weekly Technical Videos

Watch this week's video on YouTube

Today I want to share my process for blogging and making weekly YouTube videos.

I've been hesitant to write this post because of how much I am still refining the process. Since I started making videos two years ago, hardly a week has gone by without me trying something new. I frequently try to improve my video and audio quality, learn new teaching strategies, and automate parts of the editing process.

So while I am excited to share what goes into making one of my videos, please be aware that it is still a work in progress.

My July 2019 Process for Creating Weekly Videos

Think of a topic - Monday/Tuesday

When I started blogging and making videos, thinking of topics to write about was difficult.

Then I started keeping a OneNote notebook to build up a list of ideas. Whenever I learned how to solve a problem or helped someone else solve a problem, I would write that idea down.

I also began outlining presentations I wanted to give and added a blog/video idea for each demo that I knew I wanted to include in my presentation. Not only did this help spread out the work required for creating a new presentation, but it gave me a lot of material to create for my blog and YouTube channel.

Today I have what feels like an endless list of topic ideas to write about. Every Monday and Tuesday I browse my list and choose what I think would be a good topic to use for the upcoming week's post/video. I take into consideration whether I discovered something particularly surprising recently, whether I need to create some demos for an upcoming presentation, and how busy my upcoming week is with other events.

The goal is that by Tuesday night I know what topic I'll want to work on during the upcoming week.

Create Demos - Wednesday/Thursday

On Wednesday and Thursday I write my demos.

Some weeks this is easy: often when I add a topic to my idea notebook, I include code that either partially or fully meets my demo needs.

Other weeks this step is more involved. This happens in particular when I need to pin down why a certain behavior is occurring. Maybe I discovered an interesting SQL Server behavior during my workday but didn't have time to dig into why it was happening. I might have found a quick fix to get me through the problem at work, but I want to do a little bit more research to get to the root cause for my post/video.

This is often my favorite part of the process, although sometimes it can also be the most frustrating. There have definitely been weeks where I start out with one idea in mind, but after struggling to build a solid demo, I choose a different topic for that week and stash away the original idea to try again some time in the future.

Film - Friday

I usually film my videos on Friday nights. Except when I film them on Saturday mornings.

Having a permanent studio setup in my basement has made this process much easier. Instead of needing to clear my video background of toys and general clutter, set up lights and cameras, and ask for perfect silence from everyone in the house, I now have a space where I can just sit down and record without any fuss.

I shoot my videos in one continuous take that later gets edited for content. I've listed the gear I shoot with at the bottom of this post, but to get a good feel for my setup, check out this week's video starting at the 2:54 mark. I think it gives a cool example of how much lighting matters in these types of videos, as well as the benefits of post-production color grading.

Edit - Saturday/Sunday

Editing the raw video and audio recordings into a short and (hopefully) entertaining and educational video is where I have evolved my process the most.

Since I record everything in one long take, the first thing I do is sync up all of the various footage from cameras, screen captures, and audio recordings.

I then use Adobe Premiere's Multi-Camera editor to produce the rough cut of the video. This process is a lot like producing a live TV show - I use keyboard shortcuts to jump between video and audio tracks in real-time. By the time I finish watching the raw footage, I have a pretty good edit of the final video.

Once I have the initial edit done, I go back and start adding whatever additional overlays and screenshots I want to include. This part used to be a huge burden because Premiere requires 8-10 clicks just to get a screenshot added to the editing timeline.

Then I watched this 4-hour long video of a professional editor editing a YouTube video in real-time. The key takeaway was how much he is able to automate using AutoHotkey.

Inspired, I wrote my own AutoHotkey script to copy screen caps from SnagIt directly into Premiere with a single keyboard shortcut. This saves me a TON of time, not to mention makes the process of editing much more enjoyable.

Although all of this editing is time consuming, it's also fun. I create tightly edited videos like this because it's the type of video that I've always wished there was more of in the YouTube programming space. I figured if these are the kinds of videos that I want to watch, then hopefully other people will want to watch them too (and you have been, thank you!!!!)

Blog and Schedule - Monday

By Sunday, my video is usually done and uploaded (but not shared) on YouTube.

Sometimes I write my blog post on Monday nights; sometimes it gets written earlier in the process. Writing the post usually goes quickly since all of the work of creating demos and thinking of a logical structure has already been done for the video.

Once the post is written, I schedule the post and video to automatically publish the next day.

Tuesday morning the post and video go live at 7am and the whole process repeats itself later that evening.

Tools of the Trade

Below is the list of gear I currently use to produce my online videos. Some of the links are affiliate links to help fund my coffee consumption during those early morning editing sessions.

Software and Subscriptions

Epidemic Sound - Royalty free music and sound effects. Easy to search. I used to scour the internet for royalty free tracks, and while there are a lot out there, I got tired of searching through all of the junk to find the high quality recordings.

Adobe Creative Cloud - I've been using Photoshop and Premiere for decades. New versions of Premiere are generally buggy and terrible, so I usually wait a few months until a few updates come out after a release before installing a new version (and always install it side by side with the old version).

SnagIt - Lightweight screen capture software, both for stills and video. TechSmith also makes Camtasia, which is more full-featured in its capturing and editing abilities.

Handbrake - I use this to convert my variable frame rate SnagIt screen recordings into constant frame rate videos that will sync nicely with my video footage in Premiere.

Gear

Canon 80D DSLR - I bought this camera specifically for shooting video and it has proven to be a great value. Depending on where I am shooting, I will use the 18-55mm kit lens, a Canon 10-18mm EF-S lens (great for handheld selfie shots), or a Canon 50mm f/1.4 EF lens (when set up in my studio).

Manfrotto Aluminum Tripod - Great solid tripod. I've owned several of these throughout the years and they perform great. I pair this with my favorite tripod head, the Manfrotto Joystick, which gives full 360-degree smooth adjustments.

JOBY GorillaPod - I love this little tripod when shooting handheld or when traveling because I don't have to bring full sized tripods with me (although, that means I'm usually moving tables and standing near objects that the GorillaPod can mount to).

RODE VideoMic - My go-to when I am recording within a few feet of the camera. Captures great audio, and the windscreen doubles as a dog-toy (not really...but dogs will try to eat it).

Tascam DR-05 - I use this when needing to capture 360-degree sound (typically when multiple people are talking on stage). In addition to using the built in microphones (which are decent), I also use it to record sound from an external microphone.

RODE Lavalier Microphone - A clip on mic. Essential when I'm standing far away from the camera or in noisy environments.

Studio backdrop and lights - This kit includes backdrops, stands, and lights. It's not high quality, but if you aren't afraid of modifying things to fit your needs this works really well as a starter pack. I replaced the included bulbs with 100W LED equivalents and rigged it up so that there were two lights in each softbox.

GVM LED Panel Light - My favorite light that's easy to take with me wherever I go. Lots of good light as long as you keep it within a few feet of you.

Mini Video Lights - Great for adding accent lights. I usually pair these with some colored gels to get nice background colors.

Multiple Identity Inserts

Watch this week's video on YouTube

This week I want to share something that surprised me about using SQL Server's SET IDENTITY_INSERT statement.

I started with two tables with identity columns defined:

CREATE TABLE dbo.[User]
(
    Id int identity,
    UserName varchar(40)
);
CREATE TABLE dbo.StupidQuestions
(
    Id bigint identity,
    UserId int,
    Question varchar(400)
);


INSERT INTO dbo.[User] (UserName) VALUES ('Jim');
INSERT INTO dbo.[User] (UserName) VALUES ('Jane');
INSERT INTO dbo.[User] (UserName) VALUES ('Jin');
INSERT INTO dbo.[User] (UserName) VALUES ('Joyce');

INSERT INTO dbo.StupidQuestions (UserId,Question) VALUES (1,'Is smooth peanut butter better than chunky?');
INSERT INTO dbo.StupidQuestions (UserId,Question) VALUES (1,'Do I really need to backup my production databases?');
INSERT INTO dbo.StupidQuestions (UserId,Question) VALUES (2,'How to grant developers SA access?');
INSERT INTO dbo.StupidQuestions (UserId,Question) VALUES (3,'I''m getting an error about not being able to add any more indexes to my table - how do I increase the limit?');
INSERT INTO dbo.StupidQuestions (UserId,Question) VALUES (4,'How can I include more than 32 columns in my index key?');
GO

I wanted to copy the data from these two tables into two other tables:

CREATE TABLE dbo.User_DEV
(
    Id int identity,
    UserName varchar(40)
);

CREATE TABLE dbo.StupidQuestions_DEV
(
    Id bigint identity,
    UserId int,
    Question varchar(400)
);

This would allow me to safely test some changes on these _DEV table copies without breaking my original tables.

The next step was to write a couple of INSERT INTO SELECT statements:

INSERT INTO dbo.User_DEV
SELECT Id,UserName FROM dbo.[User]

INSERT INTO dbo.StupidQuestions_DEV
SELECT Id,UserId,Question FROM dbo.StupidQuestions

And of course as soon as I executed them SQL Server threw an error stating that I can't INSERT data into tables containing identity columns without first enabling identity inserts:

An explicit value for the identity column in table 'dbo.User_DEV' can only be specified when a column list is used and IDENTITY_INSERT is ON.

Ok, simple enough to fix: we just need to do what the error message says and SET IDENTITY_INSERT ON for both tables:

SET IDENTITY_INSERT dbo.User_DEV ON;  
SET IDENTITY_INSERT dbo.StupidQuestions_DEV ON;  

And... it still didn't work:

IDENTITY_INSERT is already ON for table 'IdentityTest.dbo.User_DEV'. Cannot perform SET operation for table 'dbo.StupidQuestions_DEV'.

One at a time

Although I've probably moved data around like this hundreds (thousands?) of times before, I've never encountered this particular error.

Apparently SQL Server only allows one table to have the IDENTITY_INSERT property enabled at a time within each session. The solution therefore is straightforward: enable identity inserts and copy each table's data one at a time:

SET IDENTITY_INSERT dbo.User_DEV ON; 
INSERT INTO dbo.User_DEV (Id,UserName)
SELECT Id,UserName FROM dbo.[User];
SET IDENTITY_INSERT dbo.User_DEV OFF; 

SET IDENTITY_INSERT dbo.StupidQuestions_DEV ON;
INSERT INTO dbo.StupidQuestions_DEV (Id,UserId,Question)
SELECT Id,UserId,Question FROM dbo.StupidQuestions;
SET IDENTITY_INSERT dbo.StupidQuestions_DEV OFF;

20/20

In hindsight, I think I've never encountered this error before because I normally use the Export Data Wizard in SSMS or a dedicated SSIS package to move data around. Either of those options is typically easier than writing T-SQL, whether I'm moving data across servers or need a repeatable process for regularly refreshing tables with test data.

However, when using either of those options I never paid attention to the implementation details, which led me to assume I knew how SQL Server handles identity inserts.

Trailing Spaces in SQL Server

Watch this week's video on YouTube

A long time ago I built an application that captured user input. One feature of the application was to compare the user's input against a database of values.

The app performed this text comparison as part of a SQL Server stored procedure, allowing me to easily update the business logic in the future if necessary.

One day, I received an email from a user saying that the value they were typing in was matching a database value that they knew shouldn't match. That is the day I discovered SQL Server's counterintuitive equality comparison when dealing with trailing space characters.

Padded white space

You are probably aware that the CHAR data type pads the value with spaces until the defined length is reached:

DECLARE @Value CHAR(10) = 'a'
SELECT
    @Value AS OriginalValue,
    LEN(@Value) AS StringLength,
    DATALENGTH(@Value) AS DataLength,
    CAST(@Value AS BINARY) AS StringToHex;


The LEN() function shows the number of characters in our string, while the DATALENGTH() function shows us the number of bytes used by that string.

In this case, DATALENGTH is equal to 10. This result is due to the padded spaces occurring after the character "a" in order to fill the defined CHAR length of 10. We can confirm this by converting the value to hexadecimal. We see the value 61 ("a" in hex) followed by nine "20" values (spaces).

If we change our variable's data type to VARCHAR, we'll see the value is no longer padded with spaces:

DECLARE @Value VARCHAR(10) = 'a'
SELECT
    @Value AS OriginalValue,
    LEN(@Value) AS StringLength,
    DATALENGTH(@Value) AS DataLength,
    CAST(@Value AS BINARY) AS StringToHex;


Given that one of these data types pads values with space characters while the other doesn't, what happens if we compare the two?

DECLARE 
    @CharValue CHAR(10) = '',
    @VarcharValue VARCHAR(10) = ''
SELECT
    IIF(@CharValue=@VarcharValue,1,0) AS ValuesAreEqual,
    DATALENGTH(@CharValue) AS CharBytes,
    DATALENGTH(@VarcharValue) AS VarcharBytes


In this case SQL Server considers both values equal, even though we can confirm that the DATALENGTHs are different.

This behavior doesn't only occur with mixed data type comparisons, however. If we compare two values of the same data type, with one value containing several space characters, we experience something...unexpected:

DECLARE 
    @NoSpaceValue VARCHAR(10) = '',
    @MultiSpaceValue VARCHAR(10) = '    '
SELECT
    IIF(@NoSpaceValue=@MultiSpaceValue,1,0) AS ValuesAreEqual,
    DATALENGTH(@NoSpaceValue) AS NoSpaceBytes,
    DATALENGTH(@MultiSpaceValue) AS MultiSpaceBytes


Even though our two variables have different values (a blank compared to four space characters), SQL Server considers these values equal.

If we add a character with some trailing whitespace we'll see the same behavior:

DECLARE 
    @NoSpaceValue VARCHAR(10) = 'a',
    @MultiSpaceValue VARCHAR(10) = 'a     '
SELECT
    IIF(@NoSpaceValue=@MultiSpaceValue,1,0) AS ValuesAreEqual,
    DATALENGTH(@NoSpaceValue) AS NoSpaceBytes,
    DATALENGTH(@MultiSpaceValue) AS MultiSpaceBytes


Both values are clearly different, but SQL Server considers them to be equal to each other. Switching our equal sign to a LIKE operator changes things slightly:

DECLARE 
   @NoSpaceValue VARCHAR(10) = 'a',
   @MultiSpaceValue VARCHAR(10) = 'a     '
SELECT
   IIF(@NoSpaceValue LIKE @MultiSpaceValue,1,0) AS ValuesAreEqual,
   DATALENGTH(@NoSpaceValue) AS NoSpaceBytes,
   DATALENGTH(@MultiSpaceValue) AS MultiSpaceBytes


Even though I would think that a LIKE without any wildcard characters would behave just like an equal sign, SQL Server doesn't perform these comparisons the same way.

If we switch back to our equal sign comparison and prefix our character value with spaces we'll also notice a different result:

DECLARE 
    @NoSpaceValue VARCHAR(10) = 'a',
    @MultiSpaceValue VARCHAR(10) = '    a'
SELECT
    IIF(@NoSpaceValue=@MultiSpaceValue,1,0) AS ValuesAreEqual,
    DATALENGTH(@NoSpaceValue) AS NoSpaceBytes,
    DATALENGTH(@MultiSpaceValue) AS MultiSpaceBytes


SQL Server considers two values equal regardless of spaces occurring at the end of a string. Spaces preceding a string, however, are no longer considered a match.

What is going on?

ANSI

While counterintuitive, SQL Server's behavior is justified. SQL Server follows the ANSI specification for comparing strings, padding the shorter string with white space so both strings are the same length before comparing them. This explains the phenomenon we are seeing.
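
As a rough illustration (this is not the engine's actual implementation), manually padding the shorter value with spaces produces two byte-for-byte identical strings, which is why the equality operator reports a match:

DECLARE
    @Short VARCHAR(10) = 'a',
    @Long VARCHAR(10) = 'a    '
SELECT
    IIF(@Short = @Long,1,0) AS EqualsOperator,
    -- pad the shorter value to the longer value's length, then compare the raw bytes
    IIF(CAST(@Short + REPLICATE(' ',DATALENGTH(@Long)-DATALENGTH(@Short)) AS VARBINARY(10)) = CAST(@Long AS VARBINARY(10)),1,0) AS PaddedBytesAreEqual

Both columns return 1: once the shorter value is padded, the two strings contain exactly the same bytes.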

It does not do this padding with the LIKE operator, however, which explains the difference in behavior.

Comparisons when extra spaces matter

Let's say we want to do a comparison where the difference in trailing spaces matters.

One option is to use the LIKE operator, as we saw a few examples back. This is not the typical use of LIKE, however, so be sure to comment and explain what your query is attempting to do. The last thing you want is a future maintainer of your code switching it back to an equal sign because they don't see any wildcard characters.
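
A sketch of what that might look like, reusing the variables from the earlier examples:

DECLARE 
    @NoSpaceValue VARCHAR(10) = 'a',
    @MultiSpaceValue VARCHAR(10) = 'a     '
SELECT
    -- LIKE (not =) is intentional here: we want trailing spaces to count as
    -- a difference, and LIKE does not pad strings before comparing them
    IIF(@NoSpaceValue LIKE @MultiSpaceValue,1,0) AS ValuesAreEqual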

Another option that I've seen is to perform a DATALENGTH comparison in addition to the value comparison:

DECLARE 
    @NoSpaceValue VARCHAR(10) = 'a',
    @MultiSpaceValue VARCHAR(10) = 'a    '
SELECT
    IIF(@NoSpaceValue = @MultiSpaceValue AND DATALENGTH(@NoSpaceValue) = DATALENGTH(@MultiSpaceValue),1,0) AS ValuesAreEqual,
    DATALENGTH(@NoSpaceValue) AS NoSpaceBytes,
    DATALENGTH(@MultiSpaceValue) AS MultiSpaceBytes


This solution isn't right for every scenario, however. For starters, you have no way of knowing whether SQL Server will evaluate your value comparison or your DATALENGTH predicate first. This could wreak havoc on index usage and cause poor performance.

A more serious problem can occur if you are comparing fields with different data types. For example, when comparing a VARCHAR to NVARCHAR data type, it's pretty easy to create a scenario where your comparison query using DATALENGTH will trigger a false positive:

DECLARE 
    @NoSpaceValue VARCHAR(10) = 'a ',
    @MultiSpaceValue NVARCHAR(10) = 'a'
SELECT
    IIF(@NoSpaceValue = @MultiSpaceValue AND DATALENGTH(@NoSpaceValue) = DATALENGTH(@MultiSpaceValue),1,0) AS ValuesAreEqual,
    DATALENGTH(@NoSpaceValue) AS NoSpaceBytes,
    DATALENGTH(@MultiSpaceValue) AS MultiSpaceBytes


Here the NVARCHAR data type stores 2 bytes per character, causing the DATALENGTH of the single-character NVARCHAR value to equal that of the VARCHAR value containing a character plus a trailing space.

The best thing to do in these scenarios is understand your data and pick a solution that will work for your particular situation.

And maybe trim your data before insertion (if it makes sense to do so)!
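
As a minimal sketch of that idea (the dbo.UserInput table and variable here are hypothetical):

DECLARE @UserProvidedValue VARCHAR(100) = 'some input   ';

CREATE TABLE dbo.UserInput
(
    Id int identity,
    InputValue varchar(100)
);

-- RTRIM strips the trailing spaces before the value is ever stored, so
-- later equality comparisons won't be skewed by them
INSERT INTO dbo.UserInput (InputValue) VALUES (RTRIM(@UserProvidedValue));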

Joker's Wild

This past weekend I had a blast presenting Joker's Wild with Erin Stellato (blog|twitter), Andy Mallon (blog|twitter), and Drew Furgiuele (blog|twitter).

Watch this week's video on YouTube

Table of contents:

  • What is Joker's Wild? Watch this to witness Andy's amazing PowerPoint animation skills (0:00)
  • Bert demos SQL injection (2:25)
  • Erin recollects desserts (9:55)
  • Andy shares an automation tip (18:55)
  • Andy explains an ANSI standard (23:10)
  • Drew describes containers (27:02)

While a video doesn't quite give you the same experience as being in the room with dozens of other data professionals laughing and shouting along, hopefully it gives you an idea.

Here's a behind-the-scenes peek at how it all came together.

A Different Kind Of Presentation

I've wanted to do a "fun" SQL Server presentation for a while; something that would be lighthearted while still delivering (some) educational value.

I ran some ideas past Erin after SQL Saturday Cleveland earlier this year. We came up with several concepts we could incorporate into the presentation (thanks to Paul Popovich and Luis Gonzalez for also helping us generate a lot of these ideas), and at that point I think Erin came up with the name "Joker's Wild."

Blind Commitment

Fast forward a few months: occasionally I'd talk about the presentation idea with people but still wasn't any closer to actually making it real.

Then a few days before the SQL Saturday Columbus submission deadline, Erin reached out to ask if we were going to submit. We recruited Andy and Drew to help present and submitted an abstract:

Come one, come all to the greatest (and only) SQL Server variety show at SQL Saturday Columbus.

This session features a smattering of lightning talks covering a range of DBA- and developer-focused SQL Server topics, interspersed with interactive games to keep the speakers and audience on their toes.

Plan for plenty of sarcasm, laughs, and eye rolls in this thoughtfully structured yet highly improvised session.

We can't guarantee what you'll learn, but we do promise a great time!

*Slot machine will not generate real money for "winners"

Structure

If that abstract reads a little vague, it's because at that point we didn't know exactly what we wanted to do yet. Once our session was selected though it was time to come up with a concrete plan (big thank you to David Maxwell and Peter Shore for giving us the opportunity to try something like this).

After some discussion, Erin, Andy, Drew, and I came up with the following structure:

  1. The audience will choose the lightning talk topic
  2. We will spin the "Wheel of Misfortune" to determine the presentation style, including:
    • Slides I didn't write
    • Random slide timing
    • Who has the clicker?
  3. We will play some SQL Server themed Jeopardy and Pictionary with the audience

After our first meeting Andy created the world's most versatile PowerPoint presentation that would run the show. Seriously, if you haven't watched the video above yet, go watch it - that introduction is all PowerPoint goodness created by him.

The Session and Final Thoughts

I'm incredibly happy with how it all went. The session had a plan, but a lot of it was still left up to improvisation. I had a lot of fun preparing and presenting, and I think the session was well received by the audience. Jeopardy and Pictionary were a lot of fun too, even though I ran out of video recording space and couldn't include them in the video.

I hope we have another opportunity to present this session again in the future.

Thank you again David and Peter for letting us do this session as part of SQL Saturday Columbus.

Thank you to our audience for taking a risk on attending a session you didn't know much about, and for your great participation.

And thank you Erin, Andy, and Drew for helping do something fun and different.

CHOOSE() in SQL Server

Watch this week's video on YouTube

While I know I don't utilize most of the features available in SQL Server, I like to think I'm at least aware that those features exist.

This week I found a blind spot in that assumption, however: even though it shipped in SQL Server 2012, I encountered the SQL Server CHOOSE function this past week for what I think is the first time.

CHOOSE is CASE

CHOOSE returns the n-th item from a comma-delimited list.
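
A quick example of its 1-based indexing (an index past the end of the list returns NULL):

SELECT CHOOSE(2,'one','two','three') AS SecondValue; -- returns 'two'
SELECT CHOOSE(4,'one','two','three') AS OutOfRange;  -- returns NULL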

Whenever learning a new feature in SQL Server I try to think of a good demo I could build to test out the functionality. In this case the immediate example that came to mind was building something that would provide a lookup of values:

SELECT 
    [key],
    [value],
    [type],
    CHOOSE(type+1,'null','string','int','boolean','array','object') AS JsonType
FROM
    OPENJSON(N'{
        "Property1":null,
        "Property2":"a",
        "Property3":3,
        "Property4":false,
        "Property5":[1,2,"3"],
        "Property6":{
            "SubProperty1":"a"
        }
    }');

In this case, the OPENJSON function returns a "type" field that indicates the data type of that particular JSON property's value. The issue is that the "type" column is numeric, and I can never remember what kind of data each number represents.

The above query solves this by using CHOOSE to create a lookup of values. Since OPENJSON's type values start at 0, we need to use type+1 to get the 1-based CHOOSE function to return the correct results.


If we look at the CHOOSE function's scalar operator properties in the execution plan, we'll see that this function is just a fancy alias for a more verbose CASE statement:

[Expr1000] = Scalar Operator(
    CASE WHEN (CONVERT_IMPLICIT(int,OPENJSON_DEFAULT.[type],0)+(1))=(1) 
    THEN 'null' 
    ELSE 
        CASE WHEN (CONVERT_IMPLICIT(int,OPENJSON_DEFAULT.[type],0)+(1))=(2) 
        THEN 'string' 
        ELSE 
            CASE WHEN (CONVERT_IMPLICIT(int,OPENJSON_DEFAULT.[type],0)+(1))=(3) 
            THEN 'int' 
            ELSE 
                CASE WHEN (CONVERT_IMPLICIT(int,OPENJSON_DEFAULT.[type],0)+(1))=(4) 
                THEN 'boolean' 
                ELSE 
                    CASE WHEN (CONVERT_IMPLICIT(int,OPENJSON_DEFAULT.[type],0)+(1))=(5) 
                    THEN 'array' 
                    ELSE 
                        CASE WHEN (CONVERT_IMPLICIT(int,OPENJSON_DEFAULT.[type],0)+(1))=(6) 
                        THEN 'object' 
                        ELSE NULL END 
                    END 
                END 
            END 
        END 
    END
)

The Set-Based Way

I think one of the reasons I've never used CHOOSE is because I would hate typing up all of those lookup values and trapping them in a SELECT statement, never to be used again.

Previously, I would have stored the lookup values in a table and joined them with the OPENJSON results to accomplish the same end result:

DROP TABLE IF EXISTS #JsonType;
CREATE TABLE #JsonType
(
    Id tinyint,
    JsonType varchar(20),
    CONSTRAINT PK_JsonTypeId PRIMARY KEY CLUSTERED (Id)
);

INSERT INTO #JsonType VALUES (0,'null');
INSERT INTO #JsonType VALUES (1,'string');
INSERT INTO #JsonType VALUES (2,'int');
INSERT INTO #JsonType VALUES (3,'boolean');
INSERT INTO #JsonType VALUES (4,'array');
INSERT INTO #JsonType VALUES (5,'object');

SELECT 
    j.[key],
    j.[value],
    j.[type],
    t.JsonType
FROM
    OPENJSON(N'{
        "Property1":null,
        "Property2":"a",
        "Property3":3,
        "Property4":false,
        "Property5":[1,2,"3"],
        "Property6":{
                        "SubProperty1":"a"
                    }
    }') j
    INNER JOIN #JsonType t
        ON j.[type] = t.Id

While this solution involves more initial setup, it's more flexible long-term: with a centralized set of values, you can update a single lookup table instead of editing the CHOOSE function in every one of your queries.

And while I didn't bother performance testing it, by virtue of being a scalar function, CHOOSE will probably perform worse than the table-based lookup approach in many real-world scenarios (e.g. large datasets, parallel plans, etc.).

CHOOSE What Works For You

I'm not surprised that it took me this long to learn about the CHOOSE function: while it offers a simplified way to write certain CASE statements, I can't think of many (any?) scenarios where I would prefer it over a CASE statement or a lookup-table solution.