Converting JSON to SQL Server CREATE TABLE Statements

Published on: 2018-05-22

Watch this week’s episode on YouTube.

Tedious, repetitive tasks are the bane of any lazy programmer.  I know, because I am one.

One such repetitive task that I find comparable to counting grains of rice is building database layouts from JSON data sources.

While some online services exist that will parse JSON objects into database structures, I don’t like using them because I don’t trust the people running those sites with my data.  Nothing personal against them, I just don’t want to be passing my data through their servers.

My solution to this problem was to write a query that will parse my unfamiliar JSON documents into a series of CREATE TABLE statements.

Automatically Generating A SQL Database Schema From JSON

You can always get the most recent version of the query from GitHub, but I’ll post the current version below so that it’s easier to explain in this post:

/*
This code takes a JSON input string and automatically generates
SQL Server CREATE TABLE statements to make it easier
to convert serialized data into a database schema.

It is not perfect, but should provide a decent starting point when starting
to work with new JSON files.

A blog post with more information can be found at https://bertwagner.com/2018/05/22/converting-json-to-sql-server-create-table-statements/
*/
SET NOCOUNT ON;

DECLARE 
	@JsonData nvarchar(max) = '
		{
			"Id" : 1,
			"IsActive":true,
			"Ratio": 1.25,
			"ActivityArray":[true,false,true],
			"People" : ["Jim","Joan","John","Jeff"],
			"Places" : [{"State":"Connecticut", "Capitol":"Hartford", "IsExpensive":true},{"State":"Ohio","Capitol":"Columbus","MajorCities":["Cleveland","Cincinnati"]}],
			"Thing" : { "Type":"Foo", "Value" : "Bar" },
			"Created_At":"2018-04-18T21:25:48Z"
		}',
	@RootTableName nvarchar(4000) = N'AppInstance',
	@Schema nvarchar(128) = N'dbo',
	@DefaultStringPadding smallint = 20;

DROP TABLE IF EXISTS ##parsedJson;
WITH jsonRoot AS (
	SELECT 
		0 as parentLevel, 
		CONVERT(nvarchar(4000),NULL) COLLATE Latin1_General_BIN2 as parentTableName, 
		0 AS [level], 
		[type] ,
		@RootTableName COLLATE Latin1_General_BIN2 AS TableName,
		[key] COLLATE Latin1_General_BIN2 as ColumnName,
		[value],
		ROW_NUMBER() OVER (ORDER BY (SELECT 1)) AS ColumnSequence
	FROM 
		OPENJSON(@JsonData, '$')
	UNION ALL
	SELECT 
		jsonRoot.[level] as parentLevel, 
		CONVERT(nvarchar(4000),jsonRoot.TableName) COLLATE Latin1_General_BIN2, 
		jsonRoot.[level]+1, 
		d.[type],
		CASE WHEN jsonRoot.[type] IN (4,5) THEN CONVERT(nvarchar(4000),jsonRoot.ColumnName) ELSE jsonRoot.TableName END COLLATE Latin1_General_BIN2,
		CASE WHEN jsonRoot.[type] IN (4) THEN jsonRoot.ColumnName ELSE d.[key] END,
		d.[value],
		ROW_NUMBER() OVER (ORDER BY (SELECT 1)) AS ColumnSequence
	FROM 
		jsonRoot
		CROSS APPLY OPENJSON(jsonRoot.[value], '$') d
	WHERE 
		jsonRoot.[type] IN (4,5) 
), IdRows AS (
	SELECT 
		-2 as parentLevel,
		null as parentTableName,
		-1 as [level],
		null as [type],
		TableName as Tablename,
		TableName+'Id' as columnName, 
		null as [value],
		0 as columnsequence
	FROM 
		(SELECT DISTINCT tablename FROM jsonRoot) j
), FKRows AS (
	SELECT 
		DISTINCT -1 as parentLevel,
		null as parentTableName,
		-1 as [level],
		null as [type],
		TableName as Tablename,
		parentTableName+'Id' as columnName, 
		null as [value],
		0 as columnsequence
	FROM 
		(SELECT DISTINCT tableName,parentTableName FROM jsonRoot) j
	WHERE 
		parentTableName is not null
)
SELECT 
	*,
	CASE [type]
		WHEN 1 THEN 
			CASE WHEN TRY_CONVERT(datetime2, [value], 127) IS NULL THEN 'nvarchar' ELSE 'datetime2' END
		WHEN 2 THEN 
			CASE WHEN TRY_CONVERT(int, [value]) IS NULL THEN 'float' ELSE 'int' END
		WHEN 3 THEN 
			'bit'
		END COLLATE Latin1_General_BIN2 AS DataType,
	CASE [type]
		WHEN 1 THEN 
			CASE WHEN TRY_CONVERT(datetime2, [value], 127) IS NULL THEN MAX(LEN([value])) OVER (PARTITION BY TableName, ColumnName) + @DefaultStringPadding ELSE NULL END
		WHEN 2 THEN 
			NULL
		WHEN 3 THEN 
			NULL
		END AS DataTypePrecision
INTO ##parsedJson
FROM jsonRoot
WHERE 
	[type] in (1,2,3)
UNION ALL SELECT IdRows.parentLevel, IdRows.parentTableName, IdRows.[level], IdRows.[type], IdRows.TableName, IdRows.ColumnName, IdRows.[value], -10 AS ColumnSequence, 'int IDENTITY(1,1) PRIMARY KEY' as datatype, null as datatypeprecision FROM IdRows 
UNION ALL SELECT FKRows.parentLevel, FKRows.parentTableName, FKRows.[level], FKRows.[type], FKRows.TableName, FKRows.ColumnName, FKRows.[value], -9 AS ColumnSequence, 'int' as datatype, null as datatypeprecision FROM FKRows 

-- For debugging:
-- SELECT * FROM ##parsedJson ORDER BY ParentLevel, level, tablename, columnsequence

DECLARE @CreateStatements nvarchar(max);

SELECT
	@CreateStatements = COALESCE(@CreateStatements + CHAR(13) + CHAR(13), '') + 
	'CREATE TABLE ' + @Schema + '.' + TableName + CHAR(13) + '(' + CHAR(13) +
		STRING_AGG( ColumnName + ' ' + DataType + ISNULL('('+CAST(DataTypePrecision AS nvarchar(20))+')','') +  CASE WHEN DataType like '%PRIMARY KEY%' THEN '' ELSE ' NULL' END, ','+CHAR(13)) WITHIN GROUP (ORDER BY ColumnSequence) 
	+ CHAR(13)+')'
FROM
	(SELECT DISTINCT 
		j.TableName, 
		j.ColumnName,
		MAX(j.ColumnSequence) AS ColumnSequence, 
		j.DataType, 
		j.DataTypePrecision, 
		j.[level] 
	FROM 
		##parsedJson j
		CROSS APPLY (SELECT TOP 1 ParentTableName + 'Id' AS ColumnName FROM ##parsedJson p WHERE j.TableName = p.TableName ) p
	GROUP BY
		j.TableName, j.ColumnName,p.ColumnName, j.DataType, j.DataTypePrecision, j.[level] 
	) j
GROUP BY
	TableName


PRINT @CreateStatements;


In the variables section, we define our input JSON document string along with things like a root table name and a default database schema name.

There is also a string padding variable.  Its value gets added to the maximum value length found in each string column being generated, giving each column a little extra breathing room.  For example, with the default padding of 20, a column whose longest value is 11 characters ends up as nvarchar(31).

Next in the script is the recursive CTE that parses the JSON string.  The OPENJSON() function in SQL Server makes this part relatively easy since some of the work of determining datatypes is already done for you.
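
OPENJSON() reports a type code for every key it parses (1 = string, 2 = number, 3 = boolean, 4 = array, 5 = object), and those codes are what the recursive CTE branches on.  A quick way to see them, using a trimmed-down version of the sample document:

SELECT [key], [value], [type]
FROM OPENJSON(N'{"Id":1, "IsActive":true, "Ratio":1.25, "People":["Jim","Joan"]}');

-- Id        1               2
-- IsActive  true            3
-- Ratio     1.25            2
-- People    ["Jim","Joan"]  4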

I’ve taken the liberty of converting all strings to nvarchar types, numbers to either floats or ints, booleans to bits, and datetime strings to datetime2s.

Two additional CTEs add an integer IDENTITY PRIMARY KEY column to each table, as well as a column referencing the parent table where applicable (our foreign key column).

Finally, a little bit of dynamic SQL pieces together all of these components to generate our CREATE TABLE scripts.
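
For the sample JSON above, the printed output looks roughly like this (abbreviated here; the exact column order and nvarchar lengths depend on your data and the padding value):

CREATE TABLE dbo.AppInstance
(
AppInstanceId int IDENTITY(1,1) PRIMARY KEY,
Id int NULL,
IsActive bit NULL,
Ratio float NULL,
Created_At datetime2 NULL
)

CREATE TABLE dbo.Places
(
PlacesId int IDENTITY(1,1) PRIMARY KEY,
AppInstanceId int NULL,
State nvarchar(31) NULL,
Capitol nvarchar(28) NULL,
IsExpensive bit NULL
)

-- ...plus similar tables for ActivityArray, People, Thing, and MajorCities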

Limitations

I created this code with a lot of assumptions about my (unfamiliar) JSON data sets.  For the purpose of roughly building out tables from large JSON files, I don’t need the results to be perfect and production-ready; I just want the results to be mostly correct so the vast majority of tedious table creation work is automated.

With that disclaimer made, here are a few things to be aware of:

  • Sometimes duplicate column names get generated because of naming collisions – just delete the one you don’t need.
  • While foreign key columns exist, the foreign key constraints don’t.
  • This code uses STRING_AGG, which requires SQL Server 2017 or later.  I’ll leave it up to you to convert it to the STUFF() and FOR XML PATH pattern if you need to run it on an earlier version (a rough sketch of that pattern follows this list).
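
For reference, here is a minimal sketch of what that pre-2017 rewrite of the STRING_AGG call might look like, building each table’s column list with FOR XML PATH and trimming the leading separator with STUFF().  It queries ##parsedJson directly, so you may still need the same de-duplication logic the STRING_AGG version applies:

SELECT
	t.TableName,
	STUFF((
		SELECT ',' + CHAR(13) + c.ColumnName + ' ' + c.DataType
			+ ISNULL('(' + CAST(c.DataTypePrecision AS nvarchar(20)) + ')', '')
			+ CASE WHEN c.DataType LIKE '%PRIMARY KEY%' THEN '' ELSE ' NULL' END
		FROM ##parsedJson c
		WHERE c.TableName = t.TableName
		ORDER BY c.ColumnSequence
		FOR XML PATH(''), TYPE
	).value('.', 'nvarchar(max)'), 1, 2, '') AS ColumnList
FROM (SELECT DISTINCT TableName FROM ##parsedJson) t;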

Summary

This script is far from perfect.  But it has eliminated the need for me to build out these tables and columns from scratch.  Sure, the output sometimes needs a tweak or two, but for my purposes I’m happy with how it turned out.  I hope it helps you eliminate some boring table creation work too.


ʼ;ŚℇℒℇℂƮ *: How Unicode Homoglyphs Can Thwart Your Database Security

Published on: 2017-09-12


For the past couple of weeks I’ve been writing about how to protect your database from a SQL injection attack.  Today, we will keep the trend going by looking at how implicit unicode conversions can leave your data vulnerable.

You can also watch this content on my YouTube channel.

What’s a homoglyph?

A homoglyph is a character that looks like another character.  l (lowercase “L”) and 1 (the number) are considered homoglyphs.  So are O (the letter) and 0 (the number).

Homoglyphs can exist within a character set (like the Latin character set examples above) or they can exist between character sets.  For example, you may have the unicode apostrophe ʼ, which is a homoglyph of the Latin single quote character '.

How does SQL Server handle unicode homoglyphs?

Funny you should ask.  If you pass in a unicode character to a non-unicode datatype (like char), SQL implicitly converts the unicode character to its closest resembling non-unicode homoglyph.

To see this in action, we can use the unicode apostrophe from the example above:

SELECT 
	CAST(N'ʼ' AS nchar) AS UnicodeChar, 
	CAST(N'ʼ' AS char) AS NonUnicodeChar

You can see in the second column that SQL automatically converted the apostrophe to a plain single quote.

Although this implicit character conversion can be convenient for when you want to display unicode characters in a non-unicode character set, it can spell disaster for your SQL Server security.

Unicode Homoglyph SQL Injection

If you are already using sp_executesql or QUOTENAME() when building your dynamic SQL queries then you are safe from this type of SQL injection.

I know you aren’t the kind of person who would ever write your own security functions when solid, safe, and tested functions like the above are available.  However, just this one time let’s pretend you think you can outsmart a hacker by writing your own quote delimiting code.

Using the same dataset as last week, let’s create a new stored procedure that is going to return some data from a user’s profile:

DROP PROCEDURE IF EXISTS dbo.GetProfile
GO
CREATE PROCEDURE dbo.GetProfile
	@Username nvarchar(100)
AS
BEGIN
	-- Add quotes to escape injection...or not?
	SET @Username = REPLACE(@Username, '''','''''')

	DECLARE @Query varchar(max)

	SET @Query = 'SELECT 
					FullName, 
					JoinDate
				FROM
					dbo.RegisteredUser
				WHERE
					UserName = ''' + @Username + '''
					'

	EXEC(@Query)
END
GO

Instead of using sp_executesql or QUOTENAME(), let’s try to be clever and roll our own escaping with REPLACE(), swapping each single quote for two single quotes.  This should, in theory, prevent SQL injection.

If we test out a “normal” attempt at SQL injection, you’ll notice this logic works great.  Give yourself a pat on the back!

However, if we pass in a unicode apostrophe, the injection slips right through:
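
To make that concrete, here are hypothetical calls with a made-up payload.  The first attempt gets neutralized by the REPLACE(); the second, which swaps in the unicode apostrophe ʼ, ends up executing the injected statement:

-- Neutralized: the leading ' gets doubled by the REPLACE()
EXEC dbo.GetProfile @Username = N'''; SELECT * FROM dbo.RegisteredUser; --';

-- Not neutralized: ʼ isn't a ', so it survives the REPLACE() untouched
-- and only turns into a real ' during the later varchar conversion
EXEC dbo.GetProfile @Username = N'ʼ; SELECT * FROM dbo.RegisteredUser; --';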

This happens because we declared our @Query variable as varchar instead of the unicode nvarchar.  When we build our dynamic SQL statement, SQL implicitly converts the nvarchar @Username parameter to the non-unicode varchar:
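
You can reproduce that conversion on its own.  This is just a sketch of the offending assignment outside of the stored procedure, using the same made-up payload as above:

-- The payload is untouched while it stays nvarchar...
DECLARE @Payload nvarchar(100) = N'ʼ; SELECT * FROM dbo.RegisteredUser; --';
-- ...but assigning it to a varchar target converts ʼ to its closest homoglyph: '
DECLARE @Converted varchar(100) = @Payload;
SELECT @Payload AS UnicodeValue, @Converted AS ConvertedValue;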

So if I replace apostrophes, will that make me safe?

No.

I know it seems like blacklisting/replacing the unicode apostrophe would solve all of our problems.

And it would…in this scenario only.  There are more unicode homoglyphs than just an apostrophe though.

Out of curiosity I wrote a script to search through the unicode character space to see what other homoglyphs exist:

DECLARE @FirstNumber INT=0;
-- number of possible characters in the unicode space
DECLARE @LastNumber INT=1114112;
 
WITH Numbers AS (
    SELECT @FirstNumber AS n
    UNION ALL
    SELECT n+1 FROM Numbers WHERE n+1<=@LastNumber
), UnicodeConversion AS (
SELECT
       n AS CharacterNumber,
       CASE CAST(NCHAR(n) as CHAR(1))
              WHEN '''' THEN NCHAR(n)
              WHEN ';' THEN NCHAR(n)
       END AS UnicodeCharacter,
       CAST(NCHAR(n) as CHAR(1)) AS ASCIICharacter
FROM Numbers
)
SELECT
       *
FROM
       UnicodeConversion
WHERE
       UnicodeCharacter IS NOT NULL
OPTION (MAXRECURSION 0)

Although the characters this query returns look nearly identical to a plain single quote or semi-colon, they are actually distinct unicode homoglyphs.

I decided to only search for single quotes and semi-colons since they are frequently used in SQL injection attacks, but this is by no means an exhaustive list of all of the characters you would want to blacklist.

Not only would it be very difficult to confidently blacklist every dangerous homoglyph, but new characters are being added to unicode all the time, so maintaining a blacklist would be a maintenance nightmare – especially if the person maintaining this code in the future isn’t familiar with these types of injection attacks.

And don’t be cheeky thinking you can filter out dangerous SQL keywords either – even if you REPLACE(@Username, 'SELECT', ''), just remember someone can come by and pass in a value like 'ŚεℒℇℂƮ'.
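
If you’re curious what a payload like that collapses to, you can check the implicit conversion directly.  The exact result depends on your collation’s best-fit mapping, but the whole point is that it can land right back on the blacklisted keyword:

SELECT CAST(N'ŚεℒℇℂƮ' AS varchar(10)) AS BestFitResult;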

Conclusion

Don’t write your own security functions – they will fail.

Your best protection against SQL injection is to not use dynamic SQL.  If you have to use dynamic SQL, then use sp_executesql and QUOTENAME().


How to Safely Parameterize Table Names

Published on: 2017-09-05

Protecting against SQL Injection Part 2


Watch this content on YouTube.

Last week we talked about building dynamic SQL queries and how doing so might leave you open to SQL injection attacks.

In that post we examined how using sp_executesql to parameterize our dynamic SQL statements protects us.

Today, we are going to look at where sp_executesql falls short.

The problem with sp_executesql

Let’s modify last week’s final query to make our table name dynamic:

CREATE PROCEDURE dbo.sp_GetFullNameFromTable
		@ParmTableName varchar(100),
		@ParmUserName varchar(100)
AS
BEGIN
	DECLARE @FullQuery nvarchar(1000)
	SET @FullQuery = N'SELECT FullName FROM dbo.@TableName WHERE UserName = @UserName'

	DECLARE @ParmDefinition nvarchar(100) = N'@TableName varchar(100), @UserName varchar(100)';  

	EXEC sp_executesql @FullQuery, @ParmDefinition,  
					  @UserName = @ParmUserName,
					  @TableName = @ParmTableName;  

END

(there’s an entirely different discussion to be had about whether you should allow table and column names to be dynamic, but we’ll assume we need a dynamic table name for the rest of this post)

If we pass in a table name parameter value and execute this query, we’ll be greeted with an error complaining that the table variable @TableName must be declared.

Yeah, sp_executesql doesn’t like parameterizing table names: it treats @TableName as an undeclared table variable rather than substituting the value into the FROM clause.

So how do we make the table name dynamic and safe?

Unfortunately, we have to fall back on building the table name into the query string ourselves.

However, like we discussed last week, we need to be vigilant about what kind of user input we allow to be built as part of our query.

Assuming our app layer is already sanitizing as much of the user input as possible, here are some precautions we can take on the SQL side of the equation:

QUOTENAME()

If we wrap our user-supplied table name with QUOTENAME(), we get a much safer query:

CREATE PROCEDURE dbo.sp_GetFullNameFromTableSanitized
	@ParmTableName varchar(100),
	@ParmUserName varchar(100)
AS
BEGIN
	DECLARE @FullQuery nvarchar(1000)
	SET @FullQuery = N'SELECT FullName FROM dbo.' + QUOTENAME(@ParmTableName) + ' WHERE UserName = @UserName'

	DECLARE @ParmDefinition nvarchar(100) = N'@UserName varchar(100)';  

	EXEC sp_executesql @FullQuery, @ParmDefinition,  
					  @UserName = @ParmUserName

END

This time the query executes safely and returns the expected results.

Although QUOTENAME() works well here, sanitizing by wrapping the name in brackets (by default) and doubling any closing brackets, its downside is that it only accepts input up to 128 characters long and returns NULL for anything longer.  If you are expecting parameter values longer than that, you will have to use something like REPLACE(@TableName,'''','''''') instead to delimit single quotes (however, rolling your own logic like this is really hard to do securely and is not recommended).
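
A quick illustration of both behaviors (the object names here are made up):

SELECT
	QUOTENAME('RegisteredUser')           AS Bracketed, -- [RegisteredUser]
	QUOTENAME('Users]; DROP TABLE x; --') AS Escaped,   -- closing brackets get doubled
	QUOTENAME(REPLICATE('a', 200))        AS TooLong;   -- NULL once the input exceeds 128 characters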

EXECUTE AS

The account running any dynamic SQL queries should be locked down so that it won’t be able to perform any operations you don’t want it to do.

Taking this idea one step further, you can create another account with very limited permissions, and add EXECUTE AS to your stored procedure to run that code under the very limited account.

CREATE PROCEDURE dbo.sp_GetFullNameFromTableSanitized
  @ParmTableName varchar(100),
  @ParmUserName varchar(100)
  WITH EXECUTE AS 'LimitedUser'
AS
BEGIN
...

This won’t prevent injection, but it will limit what the malicious user is able to do.
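
As a rough sketch, that limited account might be set up something like this (the user name and the single grant are placeholders for whatever your procedure actually needs):

-- Hypothetical minimal-permission user for running the dynamic SQL
CREATE USER LimitedUser WITHOUT LOGIN;
GRANT SELECT ON dbo.RegisteredUser TO LimitedUser; -- grant only what the procedure needs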
