Displaying Long Values in SSMS

Published on: 2018-08-07

Click here to watch this week’s video on YouTube.

I write a lot of dynamic SQL and frequently encounter variables that contain many characters:
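(The query itself appeared as a screenshot in the original post; below is a scaled-down, hypothetical stand-in built the same way — a column list assembled from metadata and wrapped into one long statement.)

```sql
DECLARE @LongValue nvarchar(max);

-- Build a SELECT statement as a string: concatenate every column of
-- sys.objects into a column list, then wrap it with FROM and ORDER BY.
SELECT @LongValue =
      N'SELECT' + CHAR(13) + CHAR(10) + N'    '
    + STUFF((
          SELECT N', ' + QUOTENAME(c.name)
          FROM sys.all_columns c
          WHERE c.object_id = OBJECT_ID('sys.objects')
          ORDER BY c.column_id
          FOR XML PATH('')
      ), 1, 2, N'')
    + CHAR(13) + CHAR(10) + N'FROM sys.objects'
    + CHAR(13) + CHAR(10) + N'ORDER BY [name];';

SELECT LEN(@LongValue) AS CharacterCount;

EXEC sp_executesql @LongValue;
```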

My actual variable is 4,059 characters long, and when I execute it, it runs great.

A homage to one of my favorite Uncyclopedia entries.

If my programmatically built query had an error in it, the first thing I’d want to do when debugging it would be to see the text of the entire @LongValue variable.

I could do this by just saying SELECT @LongValue, and while recent versions of SSMS will display the whole value for me, it completely loses my formatting, which stinks (and is especially bad if there are any comments prefixed with -- in the query):

Need an ultra HD wide display to fit this all on one screen.

I can say PRINT @LongValue, which will keep the formatting, but it will get trimmed at 4,000 characters (notice the missing ORDER BY):
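To recap the two naive options side by side (nothing fancy here, just the behaviors described above):

```sql
-- SELECT returns the whole value, but the grid collapses it onto one line,
-- destroying the formatting (and breaking on any "--" comments).
SELECT @LongValue;

-- PRINT keeps the line breaks, but nvarchar output gets trimmed at 4,000
-- characters, so the tail of a long query silently disappears.
PRINT @LongValue;
```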

Some Better Ways

Erik Darling posts one solution to this problem in his T-SQL Tuesday #104 entry (as well as some other problems/solutions for lengthy SQL variables). Specifically he links to a SQL string printing script that will loop through the lengthy variable and print everything while maintaining formatting:

Not perfectly formatted, but good enough.

And while I like using that stored procedure on my primary server, I’m too lazy to install it everywhere I need it.

Instead, I have a couple of go-to solutions that work on all SQL Server instances from 2008 forward.

Solution 1: CAST to XML

Casting the long variable to XML allows SSMS to generate a clickable, single-row result that preserves formatting:
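The pattern is just a single cast (using the same @LongValue variable from above):

```sql
-- SSMS displays the XML-typed result as a clickable link that opens the
-- full value, formatting intact, in a new document tab.
SELECT CAST(@LongValue AS XML);
```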

IntelliSense complains but I’m OK with it

The only downside to this approach is that certain characters, like “<” and “>”, can’t be converted to XML:
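For example, a variable containing a stray comparison operator fails the cast (a hypothetical repro, not the example from the original post):

```sql
-- "<" starts a tag in XML, so a bare comparison operator causes the cast
-- to fail with an XML parsing error instead of returning a result.
DECLARE @WithComparison nvarchar(max) = N'SELECT 1 WHERE 1 < 2;';
SELECT CAST(@WithComparison AS XML);
```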

Solution 2: FOR XML PATH

A slight variation on solution 1, we can get similar results using FOR XML PATH:
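A minimal sketch of the pattern (the unnamed column becomes the text content of the XML result):

```sql
-- Special characters like "<" are escaped (to "&lt;") instead of
-- causing an error, and the result is still a clickable XML link.
SELECT @LongValue
FOR XML PATH('');
```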

FOR XML PATH is one of the most abused SQL Server functions.

In this solution, the “<” is escaped to “&lt;”, which isn’t perfect but at least my variable can be displayed with formatting intact.  A quick find and replace for any escaped characters and I’m good to go.

Good Enough

These techniques aren’t perfect, but for purposes of debugging dynamically generated code they are good enough.

Maybe one day SSMS will print longer strings or include a syntax formatter and I won’t care nearly as much.

And if not, I’ll happily continue to abuse FOR XML to do things other than generate XML documents.


Building Dynamic Table-Driven Queries

Published on: 2018-07-10

This post is a response to this month’s T-SQL Tuesday #104 prompt by me! T-SQL Tuesday is a way for SQL Server bloggers to share ideas about different database and professional topics every month.

This month’s topic asks: what code would you hate to live without?


When given the choice between working on new projects versus maintaining old ones, I’m always more excited to work on something new.

That means that when I build something that is going to be used for years to come, I try to build it so that it will require as little maintenance as possible in the future.

One technique I use for minimizing maintenance is making my queries dynamic.  Dynamic queries, while not right for every situation, do one thing really well: they allow you to modify functionality without needing a complete rewrite when your data changes.  The way I look at it, it’s much easier to add rules and logic to rows in a table than to modify a table’s columns or structure.

To show you what I mean, let’s say I want to write a query selecting data from model.sys.database_permissions:
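The query from the post isn’t shown here, but a hardcoded version might look something like this (I’m picking an arbitrary subset of columns for illustration):

```sql
-- A hardcoded query: any change to the columns we care about means
-- editing the statement by hand.
SELECT
    class_desc,
    permission_name,
    state_desc,
    grantee_principal_id
FROM model.sys.database_permissions;
```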

Writing the query as above is pretty simple, but it isn’t flexible in case the table structure changes in the future or if we want to programmatically write some conditions.

Instead of hardcoding the query as above, here is a general pattern I use for writing dynamic table-driven queries.  SQL Server has the handy views sys.all_views and sys.all_columns that show information about what columns are stored in each table/view:
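Here’s a quick look at what those two views expose (my own sketch of the metadata query):

```sql
-- Every view (system views included) and the columns each one exposes.
SELECT
    v.name AS view_name,
    c.name AS column_name,
    c.column_id
FROM sys.all_views v
    INNER JOIN sys.all_columns c
        ON v.object_id = c.object_id
WHERE v.name = 'database_permissions'
ORDER BY c.column_id;
```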

Using these two views, I can use this dynamic SQL pattern to build the same exact query as above:
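The post’s original code block isn’t reproduced here, so treat the following as a hedged sketch of the general pattern rather than the exact script: build the column list from the metadata views, splice it into a query string, and execute it.

```sql
DECLARE @Columns nvarchar(max);
DECLARE @Query   nvarchar(max);

-- Build the column list dynamically from the metadata views.
SELECT @Columns = STUFF((
        SELECT N',' + QUOTENAME(c.name)
        FROM sys.all_views v
            INNER JOIN sys.all_columns c
                ON v.object_id = c.object_id
        WHERE v.name = 'database_permissions'
        ORDER BY c.column_id
        FOR XML PATH('')
    ), 1, 1, N'');

-- Splice the generated column list into the final query and run it.
SET @Query = N'SELECT ' + @Columns + N' FROM model.sys.database_permissions;';

EXEC sp_executesql @Query;
```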

Building a dynamic statement like this works by constructing my SELECT statement as a string based on the values stored in my all_columns view.  If a column is ever added to this view, my dynamic code will handle it (I wouldn’t expect this view to change much in future versions of SQL Server, but in other real-world tables I can regularly expect the data to change).

Yes, writing certain queries dynamically like this means more up front work.  It also means some queries won’t run to their full potential (not necessarily reusing plans, not tuning every individual query, needing to be thoughtful about SQL injection attacks, etc…).  There are A LOT of downsides to building queries dynamically like this.

But dynamically built queries make my systems flexible and drastically reduce the amount of work I have to do down the road.  In the next few weeks I hope to go into this type of dynamically built, table-driven process in more detail (so you should see the pattern in the example above get reused soon!).


Converting JSON to SQL Server CREATE TABLE Statements

Published on: 2018-05-22

Watch this week’s episode on YouTube.

Tedious, repetitive tasks are the bane of any lazy programmer.  I know, because I am one.

One such repetitive task that I find comparable to counting grains of rice is building database layouts from JSON data sources.

While some online services exist that will parse JSON objects into database structures, I don’t like using them because I don’t trust the people running those sites with my data.  Nothing personal against them, I just don’t want to be passing my data through their servers.

My solution to this problem was to write a query that will parse my unfamiliar JSON documents into a series of CREATE TABLE statements.

Automatically Generating A SQL Database Schema From JSON

You can always get the most recent version of the query from GitHub, but I’ll post the current version below so that it’s easier to explain in this post:

In the variables section, we can define our input JSON document string as well as define things like a root table name and default database schema name.

There is also a string padding variable.  This padding variable’s value is added to the max value length found in each column being generated, giving each column a little bit more breathing room.
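Since the full script lives on GitHub, the snippets that follow use illustrative names that may not match the real ones. The input section boils down to something like this:

```sql
-- Illustrative stand-ins for the script's input variables.
DECLARE @JsonDocument  nvarchar(max) = N'{"name": "widget", "price": 9.99, "active": true}';
DECLARE @RootTableName sysname       = N'Widget';  -- name of the top-level generated table
DECLARE @SchemaName    sysname       = N'dbo';     -- schema used for every generated table
DECLARE @StringPadding int           = 20;         -- extra length added to each string column
```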

Next in the script is the recursive CTE that parses the JSON string.  The OPENJSON() function in SQL Server makes this part relatively easy since some of the work of determining datatypes is already done for you.
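OPENJSON’s default schema reports a type code for every property, which is what makes mapping JSON values to SQL data types straightforward. A quick demonstration (my own example, not part of the script):

```sql
DECLARE @json nvarchar(max) = N'{"id": 1, "name": "widget", "active": true, "tags": ["a", "b"]}';

-- With the default schema, OPENJSON returns key, value, and a type code:
-- 0 = null, 1 = string, 2 = number, 3 = bool, 4 = array, 5 = object.
SELECT [key], [value], [type]
FROM OPENJSON(@json);
```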

I’ve taken the liberty of converting all strings to nvarchar types, numbers to either floats or ints, booleans to bits, and datetime strings to datetime2s.

Two additional CTEs add an integer IDENTITY PRIMARY KEY column to each table, as well as a column referencing the parent table where applicable (our foreign key column).

Finally, a little bit of dynamic SQL pieces together all of these components to generate our CREATE TABLE scripts.
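That assembly step is essentially string concatenation. A greatly simplified sketch, with the column definitions hardcoded as stand-ins for what the CTEs actually produce:

```sql
DECLARE @SchemaName sysname = N'dbo';
DECLARE @TableName  sysname = N'Widget';
DECLARE @ColumnDefinitions nvarchar(max) =
      N'    Id int IDENTITY PRIMARY KEY,' + CHAR(13) + CHAR(10)
    + N'    Name nvarchar(26) NULL,' + CHAR(13) + CHAR(10)
    + N'    Price float NULL';

-- Piece the parts together into a runnable CREATE TABLE script.
SELECT N'CREATE TABLE ' + QUOTENAME(@SchemaName) + N'.' + QUOTENAME(@TableName)
     + N' (' + CHAR(13) + CHAR(10)
     + @ColumnDefinitions + CHAR(13) + CHAR(10)
     + N');' AS CreateTableScript;
```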

Limitations

I created this code with a lot of assumptions about my (unfamiliar) JSON data sets.  For the purpose of roughly building out tables from large JSON files, I don’t need the results to be perfect and production-ready; I just want the results to be mostly correct so the vast majority of tedious table creation work is automated.

With that disclaimer made, here are a few things to be aware of:

  • Sometimes there will be duplicate column names generated because of naming – just delete one.
  • While foreign key columns exist, the foreign key constraints don’t.
  • This code uses STRING_AGG.  I'll leave it up to you to convert to STUFF and FOR XML PATH if you need to run it in versions prior to 2017 (see the sketch after this list).
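For reference, here is the difference between the two aggregation styles (hypothetical input; STRING_AGG needs SQL Server 2017+, while the STUFF + FOR XML PATH version runs on older instances):

```sql
-- SQL Server 2017+: STRING_AGG
SELECT STRING_AGG(c.name, ', ') WITHIN GROUP (ORDER BY c.column_id) AS ColumnList
FROM sys.all_columns c
WHERE c.object_id = OBJECT_ID('sys.database_permissions');

-- Pre-2017 equivalent: STUFF + FOR XML PATH
SELECT STUFF((
        SELECT ', ' + c.name
        FROM sys.all_columns c
        WHERE c.object_id = OBJECT_ID('sys.database_permissions')
        ORDER BY c.column_id
        FOR XML PATH('')
    ), 1, 2, '') AS ColumnList;
```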

Summary

This script is far from perfect.  But it has eliminated the need for me to build out these tables and columns from scratch.  Sure, the output sometimes needs a tweak or two, but for my purposes I’m happy with how it turned out.  I hope it helps you eliminate some boring table creation work too.
