Ever try to figure out how to track who logged into your Azure SQL database? You check all the usual approaches you'd use with a SQL Server database, but one by one you find out they just don't work. Here's one way to do it.
To track who is logging into your Azure SQL database, enable auditing (here's how to do that) with audit entries directed to an Azure storage blob. There are two ways to do this: at the database server level and at the individual database level. Either is fine, but the example that follows assumes auditing at the database server level. The example query can be adjusted to work with auditing at the database level, but one of the two auditing options definitely needs to be on!
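If you prefer scripting the setup, server-level auditing to blob storage can also be enabled with the Azure CLI. This is a minimal sketch; the resource group, server, and storage account names are placeholders you would replace with your own:

```shell
# Sketch: enable server-level auditing to an Azure blob storage account.
# The names below (my-rg, my-sqlserver, mystorageacct) are illustrative only.
az sql server audit-policy update \
    --resource-group my-rg \
    --name my-sqlserver \
    --state Enabled \
    --blob-storage-target-state Enabled \
    --storage-account mystorageacct
```

The same can be done per database with `az sql db audit-policy update` if you want database-level auditing instead.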
Run this query to list all the principals (users) who have logged into your Azure SQL database so far today.
-- Turn on Audit Logging to Blob for your Azure SQL Database. Then you can query who has logged in.
-- The example below assumes DB Server-level audit logging. Details will vary slightly for Database-level audit logging.
-- The example below shows who logged in so far today.
-- Change "-0" to "-1" to look at yesterday (from a UTC perspective, not your local timezone).
-- Change "-0" to "-100" to look at 100 days ago.
-- Replace the <placeholders> in the blob URL with your storage account, server, and database names.
SELECT FORMATMESSAGE('%s (%s)',
           CAST(DATEADD(day, -0, CONVERT(date, SYSUTCDATETIME())) as varchar),
           DATENAME(WEEKDAY, DATEADD(day, -0, SYSUTCDATETIME()))),
       server_principal_name,
       COUNT(server_principal_name) as 'Logins'
FROM sys.fn_get_audit_file(
         FORMATMESSAGE('https://<storage-account>.blob.core.windows.net/sqldbauditlogs/<server-name>/<database-name>/SqlDbAuditing_ServerAudit/%s/',
             CAST(DATEADD(day, -0, CONVERT(date, SYSUTCDATETIME())) as varchar)),
         default, default)
WHERE (event_time >= CAST(CONVERT(date, SYSUTCDATETIME()) as datetime2)) AND (action_id = 'DBAS')
GROUP BY server_principal_name
HAVING COUNT(server_principal_name) > 0
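Since the query's day offsets are interpreted in UTC rather than your local timezone, it can help to double-check which calendar date an offset actually targets before editing the "-0". A quick sanity check from a shell (GNU date syntax assumed):

```shell
# Print today's and yesterday's dates in UTC, matching the query's
# "-0" and "-1" DATEADD offsets (GNU date syntax).
date -u +%Y-%m-%d                  # the "-0" case: today, in UTC
date -u -d "1 day ago" +%Y-%m-%d   # the "-1" case: yesterday, in UTC
```

Depending on your timezone, "yesterday" in UTC may not be the same calendar date as yesterday on your local clock.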
The output is something like the following, assuming I've logged in 12 times so far today with my AAD account (firstname.lastname@example.org) and 1 time with a database-specific credential (myadmin):
09-Nov-2019 (Saturday) firstname.lastname@example.org 12
09-Nov-2019 (Saturday) myadmin 1
The query might take a while to run, depending on how much data you are traversing. In one of my test environments, it takes nearly 20 minutes. I am sure it is sensitive to the amount of data you are logging, to database activity, and perhaps to your blob storage settings (I'm not sure whether premium storage is supported, but I'm not using it and didn't test with it).
Note: There are other ways to accomplish this, but every way I know of requires Azure SQL auditing. In this post we pushed audit logs to blobs, but other destinations are available. For example, you could send audit events to Event Hubs for more on-the-fly tracking.