Today I had the opportunity to speak at VT Code Camp #11 in Burlington, VT. As part of my series of talks on Running Azure Securely, my talk today was around defense in depth and was called Running Azure Securely – which of these Azure security features are for me?. The session was interactive, engaging a half-dozen folks in the audience in a discussion of how to defend various workloads using the (fictitious) page of photos app as a foil.
Some Resources Mentioned
Azure offers thousands of security features. Some of them are easy to use and others are complicated. Some are free to use and some look really, really expensive. Which ones should I be using for my applications?
In this talk we’ll look at some ways to reason about which security controls you might want to apply and why. We’ll consider groups of Azure security features through a pragmatic lens of security best practices and defense-in-depth/breadth, tempered by the reality that “more security” is not always the answer, but rather “the right security” for a situation. By the end of this talk you should have a better idea of the security feature set offered by Azure and why and when each feature might or might not be needed, and you will have discussed some ways to reason about which are relevant to you by thinking through how to assess them appropriately across multiple situations.
Do you have specific questions about the applicability of Azure security features already? Feel free to tweet your questions to Bill in advance at @codingoutloud and he’ll try to work answers into the talk.
Today I had the opportunity to speak at SQL Saturday #877 in Burlington, MA. As part of my series of talks on Running Azure Securely, my talk today was Running Azure SQL Database Securely and applied to Azure SQL DB and Azure SQL DB Managed Instances.
Some Resources Mentioned
Running Azure SQL DBs Securely – Bill Wilder – SQL Saturday #877 – 14-Sep-2019
If you know your way around SQL Server, then you will find Azure SQL Database to be familiar territory. But some aspects are more familiar than others, which is especially true for security-related differences.
In this session we review the key differences around identity management and authentication (including multi-factor authentication), managing server credentials (or, even better, not needing to in some cases), how to audit logins (probably not what you expect), an overview of encryption and data masking options, and the supporting role of Azure Key Vault. We will also touch on compliance and disaster recovery to give the complete picture of powerful features you’ll definitely want to know about to protect your data.
This talk will cover relevant capabilities for both traditional Azure SQL Databases and the newer Azure SQL Managed Instances.
This talk assumes you are already familiar with SQL Server or another enterprise database.
(Credit Taiob Ali @SqlWorldWide)
On Tuesday, July 30, 2019 I had the opportunity to speak at North Boston Azure. The talk was part of a series on Running Azure Securely, was called Are all these Azure security features for me?, and was not really a “talk” in that it was highly interactive. For those who attended, you will recall we filled in some slides collaboratively. Thus, they may not appear so polished for those of you who did not join live. Either way, please find the slides (“collaborative” and all) below.
This was an experimental approach for me and the feedback from the audience tells me it worked pretty well. The group at North Boston Azure was already knowledgeable and engaged, which hopefully made for an interesting experience for all involved (it was certainly fun for me).
You can follow me on Twitter (@codingoutloud).
You can also follow Boston Azure on Twitter (@bostonazure).
Ever try to figure out how to track who logged into your Azure SQL database? You checked all the usual ways you might handle that with a SQL Server database, but one-by-one find out they just don’t work. Here’s one way to do it.
To track who is logging into your Azure SQL database, enable auditing (here’s how to do that) with audit entries directed to an Azure storage blob. There are two ways to do this: at the database server level and at the individual database level. Either is fine, but for the example that follows, auditing is assumed to be at the db server level. The example query can be adjusted to work with auditing at the database level, but one of the two auditing options is definitely required to be on!
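If you prefer scripting over the portal walkthrough linked above, server-level blob auditing can also be switched on from the command line. The following is a minimal sketch using the Azure CLI; the resource group, server, and storage account names are placeholders, and flag names may vary by CLI version.

```shell
# Enable server-level auditing for an Azure SQL logical server,
# directing audit records to an Azure storage account.
# All resource names below are placeholders -- substitute your own.
az sql server audit-policy update \
  --resource-group my-resource-group \
  --name my-sql-server \
  --state Enabled \
  --blob-storage-target-state Enabled \
  --storage-account mystorageaccount
```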
Run this query to find out all the principals (users) who have logged in so far today into your Azure SQL database.
-- Turn on Audit Logging to Blob for your Azure SQL Database. Then you can query who has logged in.
-- The example below assumes DB Server-level audit logging. Details will vary slightly for Database-level audit logging.
-- The example below shows who logged in so far today.
-- Change "-0" to "-1" to look at yesterday (from a UTC perspective, not your local timezone).
-- Change "-0" to "-100" to look at 100 days ago.
-- The storage account and server names in the blob URL below are placeholders; substitute your own.
SELECT FORMATMESSAGE('%s (%s)',
           CAST(DATEADD(day, -0, CONVERT(date, SYSUTCDATETIME())) as varchar),
           DATENAME(WEEKDAY, DATEADD(day, -0, SYSUTCDATETIME()))),
       server_principal_name,
       COUNT(server_principal_name) as 'Logins'
FROM sys.fn_get_audit_file(
         FORMATMESSAGE('https://<storage-account>.blob.core.windows.net/sqldbauditlogs/<server-name>/master/SqlDbAuditing_ServerAudit/%s/',
             CAST(DATEADD(day, -0, CONVERT(date, SYSUTCDATETIME())) as varchar)),
         default, default)
WHERE (event_time >= CAST(DATEADD(day, -0, CONVERT(date, SYSUTCDATETIME())) as datetime2)) AND (action_id = 'DBAS')
GROUP BY server_principal_name
HAVING COUNT(server_principal_name) > 0
The output is something like the following, assuming I’ve logged in 12 times so far today with my AAD account (firstname.lastname@example.org) and 1 time with a database-specific credential (myadmin):
09-Nov-2019 (Saturday) email@example.com 12
09-Nov-2019 (Saturday) myadmin 1
The query might take a while to run, depending on how much data you are traversing. In one of my test environments, it takes nearly 20 minutes. I am sure it is sensitive to the amount of data you are logging, database activity, and maybe settings on your blob storage (I’m not sure whether premium storage is supported, but I’m not using it and didn’t test with it).
Note: There are other ways to accomplish this, but every way I know of requires use of Azure SQL auditing. In this post we pushed them to blobs, but other destinations are available. For example, you could send to Event Hubs for a more on-the-fly tracker.
At the most recent Boston Azure meeting I gave (what turned out to be…) the first part of a multi-part talk on Running Azure Securely. Even though I did not cover all this content, I’ve attached the whole PowerPoint deck below.
Please watch for a Part II to be scheduled.
Last night’s Boston Azure meeting featured Marija Strazdas from @AlertLogic, who spoke about the Shared Security Model for security in the cloud. I also showed in more detail some of the tools that Azure provides to help customers with their side of the responsibility model, including some with Azure SQL DB, Storage, Key Vault, and Azure Security Center. Here are the slides I presented (though I didn’t get through most of them).
EDIT: Here is the deck presented by Marija Strazdas from @AlertLogic who spoke about the Shared Security Model for security in the cloud:
Alert Logic Azure Security Presentation
You can find @bostonazure on twitter, and feel free to join us on slack.
Boston Application Security Conference (BASC) hosted by the Boston chapter of OWASP (The Open Web Application Security Project).
For my part, I attended a number of interesting sessions (especially the frighteningly entertaining talk by Francis Brown on using Google and Bing to hack (or protect) web properties). Due to scheduling challenges, I missed Andrew Wilson‘s talk on Reversing Web Applications, which I wanted to check out.
For my part, I offered a Birds-of-a-Feather session on Securing Applications in the Cloud (with examples drawn from Windows Azure Platform). In this session, I reviewed both pros and cons of cloud deployments from a security point of view, and attempted to make the case that, ultimately, either your applications will simply be safer in the cloud, or at least if you want them to be sufficiently safe, it will be more cost-effective to let the specialists at Microsoft (or some other trusted cloud vendor) handle much of the dirty work.
This session was interesting for me to put together and then go through with an intimate crowd (due, at least in part I suppose, to (me) changing the scheduled time slot after the conference schedule went to the printer… D’oh! … that combined with the seeming invisibility of the BoF sessions generally). Anyhow, it was still fun to discuss, and here is the slide deck I used: OWASP Boston – BoF – Securely Running Applications in Cloud (examples drawn from Windows Azure Platform) – Bill Wilder – 08-Oct-2011.
I took notes during the Boston Cloud Computing Group Meetup 23-Sept-2009 – the raw notes are below, but a couple of more noteworthy highlights appear first with some of my views interspersed.
Executive Summary – Key Take-Aways & Highlights
Notes from Javed Ikbal’s talk (http://10domains.blogspot.com) are in regular type. My editorial comments and thoughts are in italics or bold italics – so don’t blame these on Javed. 🙂
- Key take-away – going to the Cloud is waaaay more about Business Tradeoffs than it is about Technology.
- “There are 2 kinds of companies – those which have had a [data security] breach, and those which are going to have a [data security] breach” -Javed
- Centralization of data makes insider threat a bigger risk -Javed
- “On premise does not mean people are doing the right thing” –Javed – right on! I bet the majority of the fortune five-million (as 37 Signals refers to the medium and small business market) have insufficient IT – they just don’t know it. Any stats?
- Someone from the audience stated there are more breaches in on-premise data centers than in cloud. Therefore cloud is safer. I don’t buy the logic. There could so many more publicized breaches in on-premise systems simply because there are so many more on premise data centers today. So this is easy to misinterpret. We can’t tell either way from the data. My personal prediction: today if there is a data breach for data stored in the cloud, people will not be able to believe you were reckless enough to store it in the cloud; 5 years from now, if there is a data breach for data stored on premise, people will not be able to believe you were reckless enough to store it locally instead of in the cloud which everyone will then believe is the safest place.
- Someone from audience commented that business value of losing data will be balanced against business cost of it being exposed. This comment did not account for the PROBABILITY of there being a breach – how do you calculate this risk? I bet it is easier to calculate this risk on the cloud than on premise (though *I* don’t know how to do this)
- Comment from Stefan: We can’t expect all cloud services to be up all the time (we were chatting about Google and Amazon downtime, which has been well documented). I completely agree – And many businesses don’t have the data to fairly/accurately compare their own uptimes with those of the cloud vendors – and, further, if the cloud vendors did have 100% up-time, that may destroy the economies we are seeing on the cloud today (who cares if it is 100% reliable if it is 0% affordable – that’s too expensive to be interesting)
- Off-premise security != in cloud – different security issues for different data – Javed In other words, treat SSN and Credit Card data differently than which books I bought last year. But I can think of LOTS of data that is seemingly innocuous, but that SOME PEOPLE will balk at having it classified as “non-sensitive” – might be my bookmarks, movie rentals, books purchased, travel plans/history, many more… not just those that support identity theft and/or direct monetary loss (bank account hacks). I think it would be a fine idea for data hosts to publicly declare their data classification scheme – shouldn’t we all have a right to know?
- I think IT generally – and The Cloud specifically – could benefit from the kind of thinking that went into GoodGuide.com.
Raw Notes Follow
The rest of these notes are a bit rough – and may or may not make sense – but here they are anyway…
- Pizza & drinks, some social (sat next to Stefan Schueller from TechDroid Systems and enjoyed chatting with him)
- Went around the room introducing ourselves
- People who were hiring / looking for work spoke up
- Around 30 people in attendance
- Meeting host: Aprigo – 460 Totten Pond rd, suite 660 – Waltham, MA 02451 – USA
- Feisty audience! Lots of participation. This added to the meeting impact.
Twisted Storage talk
From Meetup description: Charles Wegrzyn – CTO at TwistedStorage Inc. (Chuck actually built an open source cloud storage system back in ’05)
TwistedStorage is open source software that converts multiple storage repositories, legacy or green-field, into a single petabyte-scale cloud for unstructured data, digital media storage, and archiving. The Twisted Storage Enterprise Storage Cloud provides federated search, electronic data discovery with lock-down, and policy-driven file management including indexing, retention, security, encryption, format conversion, information lifecycle management, and automatic business continuity.
History of Building Storage Management software
- Open Source
- Been downloaded 75k times
- Re-wrote – now version 4 – in Python
Common anti-pattern observed in real world:
- Users storing “stuff” in Exchange since that was a convenient place to store it
- Results in a LOT of email storage (and add’l capacity is easy to keep adding on)
- Can’t find your data (too much to logically manage)
- Backups inadequate
- Complexity, complexity, complexity
The Twisted Storage Way
- Federated storage silos w/ adaptors/agents
- Provide enterprise capabilities spanning sites (access control, audits, search/indexing – including support for metadata, simplified administration and recovery)
- ILM = Information Lifecycle Management
- Open Source
- Work-flow (Python scripts, XML coming)
- Policy-driven (“delete this after 2 years”, “encrypt me”) (Python scripts)
Twisted Storage Design Goals
- Always available content (via replication)
- No back-up or recovery needed (due to replication)
- Linear scalability (scales out)
- Able to trade off durability with performance
- Supports old hardware
- Minimal admin overhead
- Support external storage systems and linkage
- Portable – will run on Linux, Windows, (iPhone?) – due to portable Python implementation
- Pricing: Enterprise Edition: $500 / TB up to 2 PB (annual), minimum $10k for first 20 TB (see web site for full story)
- versus competition like Centera which charge $15k/Silo + Enterprise Edition
- http://www.twistedstorage.com, firstname.lastname@example.org
Info Security & Cloud Computing Talk
From Meetup description: Javed Ikbal (principal and co-founder of zSquad LLC)- will talk about: “Marketing, Uncertainty and Doubt: Information Security and Cloud Computing”
- What is the minimum security due diligence that a company needs to do before putting its data in the cloud?
- Since 2007, Amazon has been telling us they are “.. working with a public accounting firm to … attain certifications such as SAS70 Type II” but these have not happened in 2+ years.
- On one side of the cloud security issue we have the marketing people, who hype up the existing security and gloss over the non-existing. On the other side we have security services vendors, who hawk their wares by hyping up the lack of security. The truth is, there is a class of data for every cloud out there, and there is also someone who will suffer a data breach because they did not secure it properly.
- We will look at Amazon’s EC2, risk tolerance, and how to secure the data in the cloud.
- Javed is a principal and co-founder of zSquad LLC, a Boston-based information security consulting practice.
Javed is a Security Consultant
Also co-founded http://www.layoffsupportnetwork.com
Formerly worked in Fidelity (in security area)
- Elastic – provision up/down on demand (technical)
- Avail from anywhere (technical)
- Pay-as-you-go (business model)
- Data stored in China – gov’t could get at it
- We never have direct access
- May be locked in? (for practical reasons)
- March 7, 2009 from WSJ – Google disclosed that it exposed a “small number” of Google docs – users not supposed to be authorized were able to view them. Google estimated < 0.05% of all stored Google docs were impacted – BUT! – this is a LOT of documents. http://blogs.wsj.com/digits/2009/03/08/1214/
- Sept 18, 2009 from NYT – a recent bug in Google Apps allowed students at several colleges to read each other’s emails – this impacted only a “small handful” of colleges (like Brown University, for 3 days) http://www.nytimes.com/external/readwriteweb/2009/09/18/18/18readwriteweb-whoops-students-going-google-get-to-read-ea-12995.html
- Google’s official policy for paid customers states “at your sole risk” and no guarantee it will be uninterrupted, timely, secure, or free from errors
- Amazon states it is not responsible for “deletion, destruction, loss” etc. –Javed
- Google will not allow customers to audit Google’s cloud storage claims
- Amazon says PCI level 2 compliance is possible with AWS, level 1 not possible
- SAS 70 Type II reports not meaningful unless you can see which controls were evaluated
- “on premise does not mean people are doing the right thing” –Javed
- Perception of more breaches in on-premise systems – but there are so many more of them, it is easy to misinterpret
- Business value of losing data will be balanced against business cost of it being exposed – but this does not account for the PROBABILITY of there being a breach – how do you calculate this risk? I bet it is easier to calculate this risk on the cloud than on premise (though *I* don’t know how to do this)
- We can’t expect all cloud services to be up all the time – right, and many businesses don’t have the data to fairly/accurately compare their own uptimes with those of the cloud vendors – and, further, if the cloud vendors did have 100% up-time, that may destroy the economies we are seeing on the cloud today (it may be 100% reliable, but too expensive to be interesting)
- Off-premise security != in cloud – different security issues for different data
- “There are 2 kinds of companies – those which have had a [data security] breach, and those which are going to have a [data security] breach” -Javed
- Centralization of data makes insider threat a bigger risk
- Customers should perform on-site inspections of cloud provider facilities (but rare?)
- Ask SaaS vendor to see 3rd party audit reports – SalesForce has one, Amazon does not (Google neither? What about Microsoft – not yet?)
- Providers need to be clear about what they will NOT support – e.g., Amazon took 2 years to provide an answer… Amazon/AWS disclaimers are excellent models
- Providers need to understand they may be subject to legal/regulatory discovery due to something a customer did
- Unisys has ISO 27001-certified data centers (high cost, effort)
Creating Secure Software
- Devs care about deadlines and meeting the requirements
- If security is not in the requirements, it will not get done
- if devs don’t know how to code securely, it will not get done right (if at all)
- Train your devs and archs: one day will help with 90% of issues!
- Build security into your software dev life-cycle
- Let security experts, not necessarily developers, write the security requirements
- Secure Code Review can be expensive – bake in an application security audit into your schedule, to be done before going live
- Spectrum of responsibility: IaaS (high customer extensibility, low provider security responsibility) – PaaS – SaaS (low customer extensibility, high provider security responsibility)