Importing Data With PowerShell and dbatools

I've found myself working with PowerShell more frequently as of late, in no small part due to the amazing dbatools module. This has led to me trying to figure out how else I can utilize it for some of my personal internal processes.

I like to use public datasets for experimentation and presentation demos, especially data that people can easily understand and relate to. For some of these datasets, keeping them up to date was a manual process of downloading files, loading tables, and merging. There are of course many better ways to do this, some more automated than others. I could have simply used PowerShell to call bcp, or even implemented an insert statement and some loops. Then I found dbatools, which has commands that enable me to do an even better job with far less work – just the way I like it! Here's how I now keep my datasets current:

Getting The Data

I'll be using data from the City of Chicago's Data Portal. They have a tremendous online resource with lots of public datasets available. One that I really like is their listing of towed vehicles. Any time the city tows or impounds a vehicle, a record gets added here and remains for 90 days. It's very manageable, with only 10 columns and a few thousand rows. (As an added bonus, you can search for license plates you know and then ask your friends about their experience at the impound lot!)

Chicago's data portal uses Socrata, which is a very well-documented and easy-to-use tool for exposing data. It has a wonderful API for querying and accessing data, but to keep things simple for this post we're just going to download a CSV file.

If you're on the page for a dataset, you can download it by clicking on "Export" on the top right and then selecting "CSV". To avoid all that, the direct link to download a CSV of this dataset is here. Download it and take a look at what we've got using your spreadsheet or text editor of choice (mine is Notepad++).
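If you'd rather skip the browser entirely, the download step can be scripted too. Here's a minimal sketch using PowerShell's built-in Invoke-WebRequest; the URL and file path are placeholders you'd substitute with the dataset's actual export link and your preferred location:

```powershell
# Placeholder URL - substitute the dataset's actual CSV export link
$downloadUrl  = "https://data.cityofchicago.org/api/views/<dataset-id>/rows.csv?accessType=DOWNLOAD"

# Save the file to the temp directory
$downloadFile = Join-Path $env:TEMP "TowedVehicles.csv"

# Download the CSV to a local file
Invoke-WebRequest -Uri $downloadUrl -OutFile $downloadFile
```

Storing the path in `$downloadFile` is handy, since the same variable can be passed straight to the import command later.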

Loading The Data

We've got our data, now let's load it. I like to load the entire downloaded dataset into a stage table, and then copy new rows I haven't previously seen into my production table that I query from. Here's the script to create these tables:

-- CREATE STAGE TABLE
CREATE TABLE [dbo].[TowedVehiclesSTG](
	[TowDate] [date] NOT NULL,
	[Make] [nchar](4) NULL,
	[Style] [nchar](2) NULL,
	[Model] [nchar](4) NULL,
	[Color] [nchar](3) NULL,
	[Plate] [nchar](8) NULL,
	[State] [nchar](2) NULL,
	[TowedToFacility] [nvarchar](75) NULL,
	[FacilityPhone] [nchar](14) NULL,
	[ID] [int] NOT NULL
);


-- CREATE FINAL TABLE
CREATE TABLE [dbo].[TowedVehicles](
	[ID] [int] NOT NULL,
	[TowDate] [date] NOT NULL,
	[Make] [nchar](4) NULL,
	[Style] [nchar](2) NULL,
	[Model] [nchar](4) NULL,
	[Color] [nchar](3) NULL,
	[Plate] [nchar](8) NULL,
	[State] [nchar](2) NULL,
	[TowedToFacility] [nvarchar](75) NULL,
	[FacilityPhone] [nchar](14) NULL,
	CONSTRAINT PK_TowedVehicles PRIMARY KEY CLUSTERED (ID)
);

Now for the magic – let's load some data! The dbatools command that does all the heavy lifting here is called Import-DbaCsvToSql. It loads CSV files into a SQL Server table quickly and easily. As an added bonus, the entire import runs within a transaction, so if an error occurs everything gets rolled back. I like to specify my tables and datatypes ahead of time, but if you want to load into a table that doesn't exist yet, the command will create one and do its best to guess the appropriate datatypes. To use it, simply point it at a CSV file and a SQL Server instance, database, and (optionally) a table. It will take care of the rest.

# Load from CSV into staging table
Import-DbaCsvToSql -Csv $downloadFile -SqlInstance InstanceName -Database TowedVehicles -Table TowedVehiclesSTG `
-Truncate -FirstRowColumns

The two parameters on the second line tell the command to truncate the table before loading, and that the first line of the CSV file contains column names.

Now the data has been staged, but since this dataset contains all cars towed over the past 90 days, chances are very good that I already have some of these tows in my production table from a previous download. A simple query to insert all rows from staging into production that aren't already there will do the trick. This query is run using another dbatools command, Invoke-Sqlcmd2.

# Move new rows from staging into production table
Invoke-Sqlcmd2 -ServerInstance InstanceName -Database TowedVehicles `
-Query "INSERT INTO [dbo].[TowedVehicles]
SELECT
  [ID],
  [TowDate],
  [Make],
  [Style],
  [Model],
  [Color],
  [Plate],
  [State],
  [TowedToFacility],
  [FacilityPhone]
FROM (
  SELECT
    s.*,
    ROW_NUMBER() OVER (PARTITION BY s.ID ORDER BY s.ID) AS n
  FROM [dbo].[TowedVehiclesSTG] s
  LEFT JOIN [dbo].[TowedVehicles] v ON s.ID = v.ID
  WHERE v.ID IS NULL
) a
WHERE a.n = 1"

The ID column uniquely identifies each tow event, and the production table uses it as a primary key. However, I have found that the dataset occasionally contains duplicated rows. The ROW_NUMBER() window function addresses this by ensuring that only one row per ID is inserted.

Putting it all together

I've shown you how simple dbatools makes it to load a CSV file into a table and then run a query to move data from staging into production, but the beauty of PowerShell is that it's easy to do far more than that. I actually scripted this entire process, including downloading the data! You can download the full PowerShell script, along with a T-SQL script for creating the tables, from my GitHub here.
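To give a sense of what the full script looks like, here's a condensed sketch that chains the steps from this post together. The URL, instance name, and merge query are placeholders; the complete, working version lives in the GitHub repository:

```powershell
# Placeholder URL - substitute the dataset's actual CSV export link
$downloadUrl  = "https://data.cityofchicago.org/api/views/<dataset-id>/rows.csv?accessType=DOWNLOAD"
$downloadFile = Join-Path $env:TEMP "TowedVehicles.csv"

# 1. Download the latest CSV from the data portal
Invoke-WebRequest -Uri $downloadUrl -OutFile $downloadFile

# 2. Truncate and reload the staging table from the CSV
Import-DbaCsvToSql -Csv $downloadFile -SqlInstance InstanceName -Database TowedVehicles `
    -Table TowedVehiclesSTG -Truncate -FirstRowColumns

# 3. Insert rows from staging that don't yet exist in production
#    ($mergeQuery holds the INSERT...SELECT with the ROW_NUMBER() dedupe shown earlier)
Invoke-Sqlcmd2 -ServerInstance InstanceName -Database TowedVehicles -Query $mergeQuery

# 4. Clean up the downloaded file
Remove-Item $downloadFile
```

Scheduling this with Task Scheduler or a SQL Agent job turns the whole refresh into a hands-off process.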

Happy Data Loading!