Category Archives: Azure

Related to Microsoft’s Windows Azure platform

Workshop: AI Mini-Workshop at Boston Azure

The March 28 Virtual Boston Azure was headlined by Pamela Fox from Microsoft. She explained the RAG pattern, which is commonly used for building effective applications based on Large Language Models (“LLMs”) and Generative AI (“GenAI”). Pamela shared many superb insights, going into plenty of depth while answering a ton of interesting follow-up questions. It was a fantastic talk. Boston Azure has a YouTube channel at youtube.com/bostonazure where you can find recordings of many past events; Pamela Fox’s talk is available there as the 48th video posted to the channel.

After Pamela’s talk, around 15 people stuck around to participate in our first-ever “AI mini-workshop” hands-on experience. The remainder of this post is about that mini-workshop.

The AI mini-workshop was a facilitated hands-on coding experience with the following goals:

1. Demystify Azure OpenAI

As background, OpenAI’s ChatGPT burst onto the scene in November 2022. That led to an explosion of people learning about AI and associated technologies such as “LLMs”, the common shorthand for Large Language Models.

The vast majority of people interact with LLMs via chat interfaces such as OpenAI’s public version of ChatGPT or Copilot in Microsoft Bing search. There’s also a more integrated programming experience surfaced through GitHub Copilot for use with VS Code and several other popular IDEs.

But what about programming your own solution that uses an LLM? Microsoft has done a great job of providing an enterprise-grade version of the OpenAI LLM as a set of services known as Azure OpenAI.

The first goal of this AI mini-workshop was to demystify this programming experience.

This was accomplished by giving the mini-workshop participants a working C# or Python program that fits on a page, with only around 10 lines of meaningful code needed to interact with the AI service. This is NOT that complex.

Creating a production-grade application has additional requirements, but at its core, it is straightforward to interact with the Azure OpenAI service programmatically.
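
To give a feel for what that page of code looks like, here is a minimal sketch of calling an Azure OpenAI chat deployment from Python. It is not the exact workshop code; it assumes the openai Python package (the 1.x series) and uses placeholder values for the endpoint, key, API version, and deployment name.

# Minimal sketch (not the exact workshop code). Placeholders throughout.
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint="https://YOUR-RESOURCE.openai.azure.com",  # placeholder endpoint
    api_key="YOUR-API-KEY",                                   # placeholder key
    api_version="2024-02-01",
)

response = client.chat.completions.create(
    model="YOUR-DEPLOYMENT-NAME",  # the Azure OpenAI deployment name, not the raw model name
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "What is Boston Azure?"},
    ],
)

print(response.choices[0].message.content)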

The hoped-for “Aha!” moments were these:

Aha #1! I can do this! I can programmatically interact with the Azure OpenAI LLM. It isn’t that mysterious after all.

Aha #2! This is possible without much code! In the Python and C# solutions shared, there were only around 10 lines of core code.

2. Understand Some AI Concepts

Part of the mini-workshop exercise was to recognize a hallucination and fix it through some additional grounding using a very simple form of RAG.

The hope here was for some more “Aha!” moments:

Aha #3! Here’s a concrete, understandable example of a hallucination!

Aha #4! And here’s a concrete, simple example use of RAG pattern to better ground the AI so that it no longer hallucinates about today’s date! But do note that other hallucinations remain…
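
As a concrete illustration of that simple grounding step, here is a minimal sketch (again, not the exact workshop code) that injects today’s date into the system prompt so the model no longer has to guess it. The same placeholder endpoint, key, and deployment name assumptions apply as before.

# Minimal grounding sketch: supply today's date as context (a very simple form of RAG).
from datetime import date
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint="https://YOUR-RESOURCE.openai.azure.com",  # placeholder endpoint
    api_key="YOUR-API-KEY",                                   # placeholder key
    api_version="2024-02-01",
)

today = date.today().isoformat()

response = client.chat.completions.create(
    model="YOUR-DEPLOYMENT-NAME",  # placeholder deployment name
    messages=[
        # Without this grounding line, the model tends to hallucinate today's date.
        {"role": "system", "content": f"Today's date is {today}. Use it when answering."},
        {"role": "user", "content": "What is today's date, and what day of the week is it?"},
    ],
)

print(response.choices[0].message.content)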

3. Wield Great Power

The ability to program an LLM to generate unique content is something that essentially NO DEVELOPER COULD DO, EVER, before the super-powerful LLMs that were developed at a cost of hundreds of millions of dollars and then democratized by the Microsoft Azure OpenAI services (as well as by OpenAI themselves).

The hands-on AI mini-workshop required either (a) a functional Python 3 environment, or (b) a functional C#/.NET environment – everything else was provided, including sufficient access to the Azure OpenAI LLM service to complete the mini-workshop.

But in the end, with very little coding, you can get to the 5th “Aha!” moment, which is:

Aha #5! I have at my command capabilities that have not been possible in all of the history of computers. The magic of LLMs available via Azure OpenAI gives me superpowers, and we are only at the very beginning of understanding the ways they can be put to use.


The source code for the AI mini-workshop is available here. Note that the API key has subsequently been rolled (invalidated), but the code works pretty well otherwise. 🙂

My original thinking was to distribute the keys separately (like this). If this had been an in-person workshop I would have kept the configuration values separate from the source, but given the added challenge of doing this with an online, distributed audience I decided to simplify the mini-workshop by including the configuration values directly in the source code. Looking back, I believe it was a good concession for minimizing obstacles to learning, so I’d do it again next time.
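
For what it’s worth, the separated-configuration approach would look something like the sketch below, reading the endpoint and key from environment variables instead of hard-coding them. The environment variable names here are illustrative assumptions, not necessarily the ones the workshop code would use.

import os
from openai import AzureOpenAI

# Illustrative environment variable names (an assumption, not the workshop's actual names).
client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-02-01",
)

That keeps the secret out of the source file (and out of source control), at the cost of an extra setup step for every participant – exactly the kind of obstacle I wanted to minimize for an online, distributed audience.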

Talk: Orlando Code Camp – Meet GitHub Copilot

24-Feb-2024

Had a great time hanging out with the Orlando tech community at their annual Code Camp. I made the trip with Maura (she gave a talk on blockchain), and we met a lot of cool people.

I spoke on GitHub Copilot. Much of my talk was demo and discussion – you have to see this in action (or use it yourself) to appreciate what’s happening. I consider this a glimpse into the future – it will surely become the norm to have an AI assistant when programming.

One part of the demo showed using one AI (GitHub Copilot) to help program another AI (Azure OpenAI).

It is invigorating to engage with a vibrant community of technologists. Thank you Orlando Code Camp organizers, sponsors, and all those in the tech community!

The deck I used in the talk is both attached for download and available on slideshare.

Talk: GitHub Copilot is your AI Pair Programming Assistant

08-Aug-2023

I gave an extended demo of GitHub Copilot at the Boston Azure AI event tonight. Most of the session was a demo, but I did also walk through some slides. Those slides are attached.

I also showed these links or services:

Talk: Running #Azure Securely and Compliantly – Granite State Code Camp #GSCC2021 – aka Compliance for Lazy People


Today I had the opportunity to speak at the Granite State Code Camp (#GSCC2021) in Manchester, NH. This was the first time I’ve given an in-person talk since the start of COVID and it was great to see so many smiling faces (even when partially obscured by a mask!).

Last year my focus was a more in-the-weeds talk called Running Azure Securely – which of these Azure security features are for me?. This year I stepped back a level and focused on Compliance. In the session I discussed security vs. compliance and the shared responsibility model, and touched on a few other features, but I spent a good bit of time on what I think of as the “Policy stack”: built on Azure Policy capabilities (a pillar of governance), it lets you gather lots of insight into your workload’s compliance with the technology controls indicated by various compliance standards, rolled up and made available from Azure Security Center, now Microsoft Defender for Cloud.

Azure Security Center as a brand is no more – it is part of a rebranding to Microsoft Defender for Cloud. I assume this renaming, announced at Ignite, is because it is a feature set that can span beyond Azure – for example, keeping an eye on on-premises resources and resources in non-Azure clouds like AWS.

The session was interactive (as I prefer!), and many thanks to Kevin, Vishwas, and the nice lady whose name I didn’t catch (who I think worked for the college) for help in overcoming technical limitations in the room I was speaking from.

If you want to experience MORE AZURE please check out https://meetup.com/bostonazure (currently operating as part of “Virtual Boston Azure”).

If you are someone who would like to SPEAK at Virtual Boston Azure, please get in touch. (Twitter is a good way to reach me – I am @codingoutloud – or you can address it to @bostonazure.)

Slide deck is attached.

Talk: Running #Azure Securely – Turning on the WAF

Tonight I had the opportunity to speak at #VirtualBostonAzure to talk about raising the visibility of security signals in your environment by turning on your WAF. In demos the WAF available in Azure Front Door was used.

Slides:

YouTube:

https://www.youtube.com/watch?v=OWXTtCUNmes&feature=youtu.be

Talk: Running #Azure Securely – Are all these security features for me?

Today I had the opportunity to speak at VT Code Camp #11 in Burlington, VT. As part of my series of talks on Running Azure Securely, my talk today was around defense in depth and was called Running Azure Securely – which of these Azure security features are for me?. The session was interactive, engaging a half-dozen folks in the audience in a discussion of how to defend various workloads using the (fictitious) page of photos app as a foil.

Some Resources Mentioned

The deck

VermontCodeCamp-BillWilder-2019-Sep-28.AllTheseSecurityFeatures

Talk description

Azure offers thousands of security features. Some of them are easy to use and others are complicated. Some are free to use and some look really, really expensive. Which ones should I be using for my applications?

In this talk we’ll look at some ways to reason about which security controls you might want to apply and why. We’ll consider groups of Azure security features through a pragmatic lens of security best practices and defense-in-depth/breadth, tempered by the reality that “more security” is not always the answer, but rather “what is the right security” for a situation. By the end of this talk you should have a better idea of the security feature set offered by Azure, why and when these features might or might not be needed, and some ways to reason about which are relevant to you, helping you assess appropriately for multiple situations.

Do you have specific questions about the applicability of Azure security features already? Feel free to tweet your questions at Bill in advance to @codingoutloud and he’ll try to work answers to any questions into the talk in advance.

Action Photo

(if I can find one)

 

Talk: Are all these #Azure security features for me?

On Tuesday, July 30, 2019 I had the opportunity to speak at North Boston Azure. The talk was part of a series on Running Azure Securely and was called Are all these Azure security features for me?. It was not really a “talk” in that it was highly interactive. For those who attended, you will recall we filled in some slides collaboratively. Thus, they may not appear so polished for those of you who did not join live. Either way, please find the slides (“collaborative” and all) below.


This was an experimental approach for me and the feedback from the audience tells me it worked pretty well. The group at North Boston Azure was already knowledgeable and engaged, so hopefully it made for an interesting experience for all involved (it was certainly fun for me).

Azure-DefenseInDepth-BillWilder-2019-July-30

You can follow me on Twitter (@codingoutloud).

You can also follow Boston Azure on Twitter (@bostonazure).

 

Who logged into my #Azure SQL Database?

Ever try to figure out how to track who logged into your Azure SQL database? You check all the usual ways you might handle that with a SQL Server database, but one by one you find out they just don’t work. Here’s one way to do it.

To track who is logging into your Azure SQL database, enable auditing (here’s how to do that) with audit entries directed to an Azure storage blob. There are two ways to do this: at the database server level and at the individual database level. Either is fine, but for the example that follows, auditing is assumed to be at the db server level. The example query can be adjusted to work with auditing at the database level, but one of the two auditing options is definitely required to be on!

Run this query to find all the principals (users) who have logged into your Azure SQL database so far today.


-- Turn on Audit Logging to Blob for your Azure SQL Database. Then you can query who has logged in.
-- The example below assumes DB Server-level audit logging. Details will vary slightly for Database-level audit logging.
-- The example below shows who logged in so far today.
-- Change "-0" to "-1" to look at yesterday (from a UTC perspective, not your local timezone).
-- Change "-0" to "-100" to look at 100 days ago.
SELECT FORMATMESSAGE('%s (%s)',
           CAST(DATEADD(day, -0, CONVERT(date, SYSUTCDATETIME())) as varchar),
           DATENAME(WEEKDAY, DATEADD(day, -0, SYSUTCDATETIME()))),
       server_principal_name,
       COUNT(server_principal_name) as 'Logins'
FROM sys.fn_get_audit_file(FORMATMESSAGE('https://<MYBLOB>.blob.core.windows.net/sqldbauditlogs/<MYDBSERVER>/<MYDB>/SqlDbAuditing_ServerAudit/%s/',
           CAST(DATEADD(day, -0, CONVERT(date, SYSUTCDATETIME())) as varchar)), default, default)
WHERE (event_time >= CAST(CONVERT(date, SYSUTCDATETIME()) as datetime2)) AND (action_id = 'DBAS')
GROUP BY server_principal_name
HAVING COUNT(server_principal_name) > 0

The output is something like the following, assuming I’ve logged in 12 times so far today with my AAD account (codingoutloud@example.com) and 1 time with a database-specific credential (myadmin):

09-Nov-2019 (Saturday) codingoutloud@example.com 12

09-Nov-2019 (Saturday) myadmin 1

The query might take a while to run, depending on how much data you are traversing. In one of my test environments, it takes nearly 20 minutes. I am sure it is sensitive to the amount of data you are logging, database activity, and maybe settings on your blob (not sure if premium storage is supported, but I’m not using it and didn’t test with it).

Note: There are other ways to accomplish this, but every way I know of requires use of Azure SQL auditing. In this post we pushed the audit entries to blobs, but other destinations are available. For example, you could send them to Event Hubs for more on-the-fly tracking.

Talk: Azure Security Toolbox at Boston Azure

Last night’s Boston Azure meeting featured Marija Strazdas from @AlertLogic, who spoke about the Shared Security Model for security in the cloud. I also showed in more detail some of the tools that Azure provides to help customers with their side of the responsibility model, including some for Azure SQL DB, Storage, Key Vault, and Azure Security Center. Here are the slides I presented (though I didn’t get through most of them).

EDIT: Here is the deck presented by Marija Strazdas from @AlertLogic who spoke about the Shared Security Model for security in the cloud:

Alert Logic Azure Security Presentation


You can find @bostonazure on twitter, and feel free to join us on slack.

 

Talk: When NOT to use PowerShell with Azure

Today at PowerShell in Action I spoke twice about not going TOO far in your PowerShell when managing Azure resources.

The point of the talks wasn’t really that using PowerShell is bad or wrong, more that it might not be the best tool for the job in certain scenarios. In particular, an ARM template is a powerful modeling tool in support of a “no pets” policy, which matters more and more as your cloud environments grow complex and you want to keep them easy to manage. Another benefit stems from keeping the ARM template itself as an “infrastructure as code” artifact that documents the environment – and, more to the point, serves as executable documentation – for stamping out environments predictably. And still another benefit: the ARM runtime handles a lot of the complex parts that would otherwise fall to imperative PowerShell scripts that provision one resource at a time – for example, error recovery and retries.

The deck is on the event’s shared GitHub repo. There are lots of other PowerShelly resources on that repo that you may find worth checking out.

(Added 03-June) For those of you who attended my Advanced session, when I attempted to clean up at the end using Remove-AzureRmResourceGroupDeployment, my PowerShell command had an error in it. Here is the correct version. In the first screenshot I show how to ascertain the correct value for the first parameter using Get-AzureRmResourceGroupDeployment.

Get-AzureRmResourceGroupDeployment

Remove-AzureRmResourceGroupDeployment `
   -Name Microsoft.Template -ResourceGroupName k1


Once that PowerShell command executed, all 8 resources associated with that deployment were removed (deleted, and billing stopped).

Ta da!

Hope to see all you locals at Boston Azure (@bostonazure) in the future for more Azurey action.